Everything posted by LucShep

  1. lol what? ...okey dokey, bye
  2. .....What I see is that you don't know what you're talking about. The RTX4080S is actually the worst example you could pick to defend your argument, because that model came out later precisely to disguise the fact that the original RTX4080 was a monumental failure for Nvidia. It was left abandoned and ignored on the shelves while the RTX4090 immediately sold out (scalping and hoarding galore, its price still overly inflated today!), and even the RTX4070 was outselling it by some margin. That's why the RTX4080S launched later with a considerably lower MSRP, just to entice sales. But by then the novelty interest in the 4000 series had already run out. While it sold "decently", it was never a huge success; more like the usual "oh well, can't reach the 4090, I guess I'll settle for that" pick. Hence it kept a reasonably stable price, never inflating to the levels you see now, for instance, with the 5080. Though the RTX4080S price did actually increase in recent months, before and during the RTX5000 series launch.

And, FWIW... It has NOTHING to do with common sense; it has everything to do with the bottom line for manufacturers, whatever gives them the most profit for the least cost. You're oversimplifying and underestimating the measures and costs needed to take GDDR7 to levels that make GDDR6 obsolete for gaming GPUs (not for AI computing / data centers!!), which it simply doesn't reach at the moment. And those 3GB modules will be GDDR7. Here are some points you're not taking into account:
- Hardware costs: GDDR7 memory modules are more expensive than GDDR6/X modules, primarily due to the higher bandwidth and power-efficiency requirements, which leads to even higher-priced gaming GPUs than we already had.
- Power consumption: while GDDR7 is designed to be more power-efficient, the initial investment in GDDR7-based hardware is higher due to the cost of the more advanced memory technology, as is already the case for the Nvidia 5000 series.
- System design: the system architecture also impacts the cost of using GDDR7 memory. For example, the need for more advanced cooling on GPUs, or higher-speed interconnects, adds to the already higher overall cost, in a market already cursed with inflated prices.
- Higher working temperatures: while bandwidth is a bit higher, GDDR7 speeds are (for now) necessarily capped due to higher temperatures, ending up in practice at speeds equivalent to what GDDR6/X already delivers. And since that bandwidth was already sufficient for gaming, GDDR6/X remains competitive enough while being more cost-effective for this end use (gaming).

RTX5000 already uses it, but how much of its silly high price is due to GDDR7 adoption? It's not a "let's just slap some memory modules on the PCB and go home" situation, and not something to rave about for higher-VRAM gaming GPUs either. It increases production cost and final price for gaming GPUs, even more, for very little benefit (none for now), when GPUs were already at used-car prices.

No, because it simply is not true. LOL, "okay bro"... You must live on another planet then? If you don't see increased prices in GPUs from 2019/2020 to 2024/2025, gen to gen, I think everyone on this forum board will be surprised. Anyway, you like to look for charts and numbers, so go do it, from one Nvidia generation to another: 2000, to 3000, to 4000, to 5000 series. Go research, for example, how much the street price (not the MSRP bullsh!t!) of an RTX2060, RTX2070 or RTX2080 (Super or not) was before the RTX3000 series replacements came out, and how much those then went for. And then, even after the pandemic and mining craze ended (the excuse for crazy prices back then), how the RTX4000 series and now the RTX5000 series kept such prices up. Same for AMD with the RX5700/XT versus the RX6700/XT, RX7700/XT and upcoming RX9060/XT.

I agree there. The problem is when prices are so much higher (for whatever reasons we may discuss) but performance uplifts are not significant enough. It's discouraging, both for users/buyers to invest in new hardware and for PC gaming developers to do better. PC gaming has always had its highs and lows, but it's not in a happy/healthy place right now. Since the economy and the geopolitical scene are at a weird spot, it'll depend on the most clever, innovative and risk-taking people to push things further, for the better and cheaper.
  3. Let's see if I can make sense (not one of my qualities, I admit, and sorry for the wall of text)... It was all in context, but you didn't notice. You first replied to a quote from a previous post of mine, where I said "the immediate reality is, the days of ≥20GB GPUs for DCS users are pretty much gone and over". You disagreed and mentioned the lack of adoption of 3GB VRAM modules (instead of the current 2GB) hindering the RTX 5080, which would likely get a 24GB version later; which (as I said), if it happens, will again be a sh!tshow of availability and ludicrous street prices. And in your next post you illustrated your optimism towards higher VRAM on upcoming GPUs, "where the industry is going" (your words, not mine), with a Micron roadmap "vision". I countered that argument in my reply by saying "sorry for not positively fantasizing on future possibilities, speculation and road plans, which may or may not be viable or actually better solutions in the end", and gave an example: the GDDR7 case, something the memory manufacturers hyped (like 3GB modules are being hyped now) but which, in the end, is proving (at the moment) to be a sidegrade, not an upgrade, over GDDR6X for gaming on the latest Nvidia RTX 5090, 5080 and 5070/Ti (because it has its own issues, such as higher working temperatures hindering higher clocks, which could otherwise make a difference but simply don't right now). The same may, or may not, happen with 3GB VRAM modules; that's what I meant. You don't know, and I don't know. Even basing it on a roadmap illustration from a memory manufacturer doesn't mean anything substantial will happen, VRAM-wise, for gaming-market GPUs. It's Nvidia, AMD and Intel who decide how (and how many of) those VRAM modules are used for each model and market segment, not the memory makers. More than likely, what we'll end up seeing is 18GB VRAM (6x 3GB modules) instead of 16GB VRAM (8x 2GB modules).

Think about 1) how gaming development in general is NOT advancing (it is stagnant) and 2) budget constraints: how would very-large-VRAM GPUs (24GB and over) work, already very expensive as GPUs are, in the current world economy, which itself isn't showing great prospects for the coming years (rather the contrary)? If Nvidia and AMD wanted to sell you 20GB and 24GB RTX5070s and RX9070s, they would have already; the tech to do it has existed for plenty of years with 2GB modules. That will only happen years from now with following generations, by natural progression, just as it always has. Not because of bigger-capacity memory modules, although those will be a factor later on (bigger-capacity modules have also been natural progression). The general consensus is that there is no need for more than 16GB VRAM for PC gaming, not within the expected lifetime of this generation (as in, for at least two years). More than that would be for GPUs aimed at content creators, and we now know those would come at (even more) prohibitive prices. So, and no less important, bigger-VRAM GPUs, in the current and near-future PC gaming market and economy, would mean even more expensive GPUs than today...

...and hence my mention of "the rich buyers" (as you said, "people who are willing to spend big on gaming", whom you consider a factor), who in reality are not nearly enough to sustain the PC gaming market (nor who ED should be focusing on for DCS). As I said, and in my opinion, "those (at max) are not even 5% of the total userbase/market, like what hypercar millionaires are to the automobile market". What has really been happening is FOMO pushing people to "buy higher" than they should, a worldwide endemic in PC hardware, which any PC hardware vendor/store will tell you clearly exists and influences the market, and which is exploited by the manufacturers. PC tech influencers (glorified salesmen) also share the blame. Which then also influences PC gaming developers (less optimization is a big, growing problem), and so on, all in a vicious circle. It's not balanced like you say; it's actually unbalanced.

Have you noticed how much GPUs have increased in price in just five years? This isn't "almost certainly temporary" (your words, not mine). Prices are ridiculous because of simple market-pricing exploitation by the GPU makers; prices are not really based on tech advancements. There is the problem of scalping and hoarding at releases, but that's only one moment. Prices barely decrease during the whole lifespan of a GPU generation (with rare exceptions for some Intel and AMD GPUs that simply don't sell); the opposite actually happened this last generation. GPU manufacturers ARE betting on PC gamers paying ridiculous amounts of money, because it worked during and after the pandemic. It's working for them; it's here to stay. Yes, it'll reach a point (more than today) when people simply can't keep up, and many (I'd say most?) will pass and be left with obsolete GPUs, or resign themselves to buying a lower segment than they would have years before. It's then, of course, that GPUs will have to get lower price tags to sell. "It will get worse before it gets better", like I said.
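The module-count arithmetic in the post above (2GB vs 3GB modules) is easy to sketch. This is just back-of-the-envelope math, assuming the usual layout of one GDDR module per 32-bit channel of the memory bus; the helper function is my own illustration, not anything from a vendor spec:

```python
# Back-of-the-envelope VRAM capacity from bus width and module density.
# Assumption: one GDDR6/GDDR7 module per 32-bit channel (the common layout).
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    modules = bus_width_bits // 32   # one module per 32-bit channel
    return modules * module_gb

print(vram_gb(256, 2))   # 8 x 2GB modules -> 16 GB (a typical 256-bit card)
print(vram_gb(192, 3))   # 6 x 3GB modules -> 18 GB (a 192-bit card)
print(vram_gb(256, 3))   # 8 x 3GB modules -> 24 GB
```

This is why 3GB modules mostly enable odd in-between capacities (18GB, 24GB) rather than automatically meaning more VRAM everywhere: the GPU vendor still picks the bus width.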
  4. Yeah, I'd say keep the RTX5070Ti. The fountain has dried up now anyway, and it looks like prices on the 9070XT and 9070 are going to shoot sky high very soon. At least it went decently for a few hours this time (unlike Nvidia's launch), but it seems it's another GPU launch and release going bad, again. Welp, congrats to those who managed to get one anywhere near MSRP...
  5. Bugger, you really took the last one, didn't you? LOL The 7900XT 20GB are all above the regular 790€ price now. (So better get the new RX9070XT 16GB instead.)
  6. Alright, now that is a really good price for a 7900XT 20GB, for sure! And it does make a case for a better purchase than anything with similar performance (even the new 9070/XT). The problem is... where are brand new 7900XT 20GB being sold for 530€? In the current market, and for someone wanting to upgrade to a new high-end GPU, yes, I'd agree that the AMD 7900XTX is definitely a better-value purchase than the Nvidia RTX 5090 (near unobtainium) or the RTX 4090 stock leftovers (at over double its price). But the counter to the 7900XTX argument is that, for less than 2/3 of its price, you now have the new RX 9070XT 16GB. With basically similar raster performance, better driver support from here on (knowing AMD, they will immediately focus on the RX9000 series almost exclusively), not to mention better Ray Tracing performance and the much superior FSR4 upscaling tech (coming to more games), which is very important if your gaming life extends beyond DCS. Even with less VRAM, I'd probably accept it and consider 16GB plenty, even for DCS. And BTW.... (conclusions at 16:05 if you just want the gist of it)
  7. The 9070XT are selling like hotcakes, any and all models it seems. Of the four main stores in my area that I see listing them, only two have a few units available, with most models already out of stock (and no restock date in sight). And that's just hours after they went online. Not sure if it's the same in your area but, if you managed to get that RTX5070Ti for a decent price, and want a new GPU very soon, then perhaps it may be better to keep it.
  8. I'm dramatizing? "Merely a narrative"! Have a look then... That's a search engine showing what is in stock and at the lowest price, across the whole peninsular and insular territory of good old Portugal. Please don't tell me it's a country-specific thing, because I'm seeing the exact same thing in Spanish, French and Italian stores. Even on the other side of the ocean, at US ones (Newegg, etc).

And I'm sorry if I'm not able to positively fantasize on future possibilities, speculation and road plans, which may or may not be viable or actually better solutions in the end, when the present indicates unstable predictions. If anything, it'll get worse before it gets any better. I'm sorry if it makes for an unpleasant read. Like the previously hyped GDDR7, for example: it runs so hot that (at the moment) they can't push it enough to make a real practical difference for gaming compared to GDDR6X, so there's barely any difference, yet it's even more expensive now (not what an upgrade is supposed to be).

Also, I don't count on the richest userbase, which doesn't even register as the PC gaming market, because they'll always buy what they want, regardless of price, the global economy, or even a likely recession. Those (at max) are not even 5% of the total userbase/market, like what hypercar millionaires are to the automobile market. Not even worth placing them in the global picture, IMO.

I prefer to take every positive that can be taken from the current bad moment, one that we still hope (months if not years later?) is temporary. For example, the AMD RX9070/XT right now: they are undoubtedly the best-value performance GPUs one can buy (brand new) at the moment (how long that'll last is unknown), and likely as good as it gets for many months. Even if they're just the "least of all evils" in a market absolutely corrupted by very poor releases over the last few years.
  9. Okay, let's consider a supposed launch of an RTX5080S/Ti 24GB sometime in the near future. How much will its street price be, and that's if it's ever available in stock? ...2.500,00+ EUR/USD? ...3.000,00+ EUR/USD? Because that's the reality one should now count on, if hoping for that GPU to come. (That, and also burning connectors, broken ROPs, etc.)

Desire has nothing to do with feasibility. The idea is simple and has been discussed in other forums, and it comes from the lack of availability and impractical prices of the highest-end GPUs for PC gamers. Check how it is right now. It was expensive before, but now it has become impossible, well beyond the "enthusiast" buyer level. How many RTX5090 users are you seeing in these forums? My bet is they're no more than a handful (probably not even that). Are people really hell-bent on buying from the remaining overpriced old stock of AMD 7900XT/XTX? I don't think so. How many here are buying RTX4090s now priced at nearly double the MSRP? None? Are people really lining up to buy four- and five-year-old used RTX3090s? I really don't think so.

If we're talking content creators, they aren't part of the gaming market and won't make such GPUs feasible for a market that includes PC gamers, let alone DCS users. GPUs made for them will be near unattainable for gamers. The AI market is its own thing and doesn't really fit in the same puzzle: 1) those purchases are usually for profit, and 2) buyers are willing to buy GPUs in bulk or go for professional models (Nvidia H200, etc.), maybe even willing to smuggle them, if rumours are to be believed. It's the "Mining Craze, Part II". They won't positively push the market for gaming GPUs as we know it.

Again, desire has nothing to do with feasibility. The reality now is that people buying a brand new GPU will go, at most, for a 5080, 5070Ti or RX9070XT, all of which are 16GB VRAM. People willing to pay for new GPUs have their limits; the economy dictates it. For any game developer to ignore today's reality is to disregard the current and near-future userbase.
  10. We won't see a 9070XTX 20GB or 24GB, because the previous iteration was a costly failure in sales numbers for AMD: more expensive to produce and even harder to sell, only to be left forgotten on the shelves, as the piles of older 7900XT and 7900XTX still available and not selling clearly show. Maybe if the RX9070 series is a resounding success, they will re-evaluate the rumoured (and abandoned) 9070XTX 32GB prototype aimed at content creators.

People need to understand that GPU manufacturers don't consider anything over 16GB VRAM necessary for any sort of PC gaming, regardless of resolution, 2D or VR. It's not the GPU manufacturers' fault, or ours as paying customers, that one single game uses over 16GB of VRAM and over 40GB of RAM with certain modules and maps in MP. The fact is, DCS is a total anomaly in the gaming world, even within the niche simming area, with its silly high hardware demands. To pay absurd amounts of money for a particular feature (abnormal amounts of VRAM) in an overpriced GPU market, all for just one game title, makes no sense. The issue comes from the game itself, and the game developers (ED in this case) have to readjust their priorities and their product to worldwide market tendencies, also affected by rises in inflation and taxes in most countries. The immediate reality is, the days of ≥20GB GPUs for DCS users are pretty much gone and over.
  11. Even good old reviewer Leo Waldock from KitGuru jumped on the AMD RDNA4 bandwagon and dumped Nvidia...
  12. For a moment, me too! Oh well... A small trip down memory lane, by retro flight simmer damsonn:
  13. LOL that should be the next spin by the crocodile jacket man, along with "the more you buy, the more you save!". And the sad thing about the RTX5070 (or rather, their gullible buyers) is that they're all sold out!
  14. LMAO! the friggin Nvidia RTX5070 series are nothing more than a meme now.
  15. Video review recap based on 6 different reviews, by Paul's Hardware:
  16. On a different note... For the RX9070XT models that use the 12VHPWR power connector, some AIBs such as Sapphire added a few neat details, doing it better than what Nvidia (and its AIBs) do on their own models. It's not that it's so much more reliable, but it makes the card easier and cheaper to repair in case of a short circuit. Which is to be praised: even considering that this GPU is well within the theoretical 375W safety margin of the 12VHPWR connector (no RX9070XT draws over 350W), it might help with a repair years later if something occurs out of warranty. That said, I think I'd still recommend models with the classic 8-pin power connectors, which remain better than this nonsense 12VHPWR solution. Anyway, if you like to nerd out on such details, here's Buildzoid dissecting it on the Sapphire Nitro model:
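As a rough sanity check on those wattage figures: a 12VHPWR connector carries power over six 12V pins, so a card drawing 350W sits well under the roughly 9.5A per-pin rating, assuming even current sharing (an assumption real connectors famously violate when badly seated). The helper below is my own illustration, not anything from Buildzoid's video:

```python
# Rough per-pin current estimate for a 12VHPWR / 12V-2x6 connector.
# Assumptions: 12V rail, 6 current-carrying power pins, current shared
# evenly across pins (melting-connector incidents are precisely cases
# where this sharing is NOT even).
def amps_per_pin(watts: float, volts: float = 12.0, pins: int = 6) -> float:
    return watts / volts / pins

print(round(amps_per_pin(350), 2))   # ~4.86 A per pin at 350W
print(round(amps_per_pin(600), 2))   # ~8.33 A per pin at the 600W spec limit
```

Which is why a 350W card like the RX9070XT has comfortable headroom on this connector, while 575W-class cards run much closer to the limit.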
  17. Heh... it had better be (and also make you coffee in the process), looking at the outrageous prices they still go for (in Europe at least). Can't find any 7900XTX below 1.000,00 EUR anywhere, and the most popular places are actually listing them at over 1.200,00 EUR! That's at least a third more, and closer to double, the expected street price of the RX9070XT and RX9070 16GB. Now that's a lot of money!

16GB VRAM is not considered dead on arrival for DCS. The problem (IMHO) is this weak DCS userbase not putting enough pressure on ED to fix the VRAM consumption problem once and for all (it's been how many years??). People seem more concerned and afflicted by FOMO (over utterly expensive hardware) than by standing their ground as customers. Honestly, to conclude, based on their large VRAM buffer alone, that there are only five GPUs (RTX5090, RTX4090, RTX3090, RX7900XTX and RX7900XT) worth true consideration for DCS would make the developer itself look pretty darn stupid, wouldn't you agree?
  18. Color me impressed! Some things are exactly as the leaks suggested, others are better. Some points:
- Power consumption on the RX9070XT could perhaps be better (compared to the Nvidia 5070Ti and 4070Ti/S), but it's not egregious; still acceptable.
- It's curious how the RX9070XT places against the competition: it does better at 4K than it does at 1440P.
- Ray Tracing performance on the RX9070XT and RX9070 is a noticeable improvement. For example, Cyberpunk2077 is extremely demanding in RT (one of the worst offenders) and, at 4K resolution + RT + Quality upscaling, the RX9070XT performs as well as an RTX4070Ti (though not as well as an RTX5070Ti), which really surprised me. In the past there was a huge gap to Nvidia; not anymore.
- The RX9070 16GB (non-XT) is as fast, sometimes faster, than the RTX5070 12GB and RTX4070S 12GB; for similar price and more VRAM, it is a better choice.
- The RX9070XT 16GB is nearly as fast, sometimes as fast, as the RTX5070Ti 16GB and RTX4080S 16GB, which sell at higher prices; it is then a great alternative.
- So AMD with RDNA4 is now not only consistently competitive in rasterization (no upscaling, no RT) but also competitive in RT (close to Nvidia's RTX4000 series here).
- FSR4 is a massive improvement (now as good as DLSS3/4, image-quality wise; see the video right below), but the list of supported games needs to grow. The previous FSR3/3.1 (currently used in most games) is nowhere near as good; game developers really need to adopt/update to the latest FSR4 in current and future games (hello ED??).

All in all, at these prices (if not inflated), it's a no-brainer: for 2025, it seems the AMD RX9000 series will be the most recommended for people upgrading from slower/older GPUs.

FSR4 Review - Compared to FSR3.1, DLSS3 and DLSS4 - at 1080p, 1440p and 4K

EDIT: more reviews
  19. TBH, what I'm concerned about are the real prices, if AMD RDNA4 turns out to be as competitive as some hint it is. The recent launch of the Intel B580 turned into a pricing disaster due to scalping. The AIB custom models of the new Nvidia RTX5000 series all carry super inflated prices. There are plenty of good AIB custom models of the RX 9070 and 9070XT, but it's not clear how much they'll go for, given the $549 and $599 MSRPs AMD gives its reference models. And if scalpers have their way... I just hope that, for once, a GPU launch finally goes well, performance- and price-wise, and isn't yet another blow to PC gaming and tech. Especially for those who have been patiently waiting to upgrade to a new GPU, it's about time to get it right.
  20. Yep. All across the board, it's really bad. Basically the same thing as the RTX4070 Super it replaces; not even the slightest upgrade over the previous generation. (But hey, at least you can get the fake frames of MFG... with a huge latency penalty!) I honestly don't remember a generation of GPUs as bad as the RTX5000 series, probably not even the maligned FX series back in 2003 (curiously, also a 5000 series). It's really ironic that the lower-budget 5060 and 5060Ti are now becoming the last hope of this generation, the saving grace for Nvidia. And I'm not sure they'll hold up.
  21. Yeah, for those considering the RTX5070... just don't. It's not worth it. Tomorrow's reviews will tell if AMD finally got it right and has the solution for this price point (with the RX 9070 and 9070XT), or if it's another disappointment.
  22. I completely forgot, there's the Sony PSVR2, though it only works through OpenXR via SteamVR. While not "universally praised" like the Reverb G1/G2 have been, there are users who are happy with it. https://www.youtube.com/results?search_query=psvr2+pcvr
  23. You and me both (...I'm allergic to social media ). I'm still hopeful that someone will find a way to bypass WMR on the latest Windows, with a free open-source alternative, though I suspect it'll only happen soon on Linux. Not only the HP Reverb G1/G2 but also the old Samsung Odyssey+ are still popular WMR headsets that a lot of people keep using.
  24. I think the problem you'll run into is that finding a "better" headset across all parameters combined (image, comfort, sound, I/O tracking, hardware resource consumption, and price) is hard. Honestly, the closest you'll get (in my experience) as a direct replacement for the HP Reverb G1 or G2 is the Oculus/Meta Quest 3, or maybe the Quest Pro. But you'll need a WiFi 6E router (as it works best wireless), a better headstrap (look at the BoboVR and KIWI design ones for the Quest 3) and, of course, a Facebook account...
  25. Can't help with the comparison to the Crystal, as I haven't tried one yet. I did try the Pimax 8KX, and the wider FOV does impress right away, somewhat fixing that downside of the HP Reverb G1 and G2 (which feel like looking through "diver's goggles" or "binoculars" in comparison), but the image is not as crisp, it was more demanding on resources, and comfort was appalling by comparison (the HP Reverb G1 and G2 are still better). Reading instruments is not as easy or immediate on the Pimax 8KX. It's not that it's bad (it isn't), and I guess one could get used to it, but I personally couldn't get over it. I think it's a matter of what you're looking for and prefer.

The "sweet spot", like the FOV, on both the HP Reverb G1 and G2 may be a bit small, but it's still so good, as are the colors, that I find even the newest headsets barely improving in these aspects while being quite a bit more demanding on resources (and far more expensive). Same for the comfort, and the sound. I haven't tested many but, of all the HMDs I've tried with DCS (HP Reverb G1 v1 and v2, HP Reverb G2 v1 and v2, Oculus/Meta Rift-S, Quest 2, Quest 3, Pico 4, and Pimax 8KX), both the HP Reverb G1 and G2 proved to be by far the best (for me). Also, and perhaps controversially, I even found the 2nd version of the HP Reverb G1 (aka Reverb G1 Pro) as good as, if not better than, both the Reverb G2 v1 and v2, with only the sound and comfort being better for me on the G2 v2. Could be down to the units themselves and not representative of all in the market, but... there is that.

There's the issue of cables on both HP Reverb models, which may break with time and harsh use. But there are sellers on eBay selling replacements at lower prices than before (as are the G1 and G2 headsets themselves, with plenty of mint ones available on the second-hand market), as VR users have abandoned them due to the WMR software breaking with the latest Win11 24H2 update (now also affecting Win10 with the latest updates). There are also mods that attempt cable fixes, but they may be complicated for most users. I think the downsides of Windows updates breaking WMR, and HP no longer producing them, are really the only main problems with the HP Reverb G1 and G2. Otherwise, they'd still dominate as the unbeatable balanced choice of VR headset for people who are into simulators today, DCS included. Great image, comfort, sound, and inside-out tracking, all provided right out of the box, without outrageous prices or brutal hardware requirements.