Everything posted by LucShep
-
I concur. 16GB of VRAM is getting to be on the short side for DCS VR, which also wants very high raster performance and plenty of memory-bus bandwidth. But, to be fair, the Nvidia RTX5080 and 5070Ti are also 16GB on a 256-bit memory bus, and those are considered the upcoming "top gaming models" (the RTX 5090 is enthusiast / prosumer level). AMD did state last year that they're no longer focusing on the higher-segment models, so the 9070XT still makes for an interesting proposition if it lands close to the RTX5070 but with 4GB more VRAM (plus the promising FSR4), and at a lower price. With all their faults, I think the RX 7900XT 20GB and 7900XTX 24GB should have been at least "polished" (and made FSR4-capable), with prices readjusted to stay competitive. They could still gather some attention if so. Instead, AMD dropped them altogether.
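Just for reference, since I keep going on about mem-bus and bandwidth: peak memory bandwidth is simply bus width times the per-pin data rate. A minimal sketch (the data rates below are illustrative round numbers for GDDR6/GDDR7 class memory, not official specs):

```python
# Rough GPU memory bandwidth estimate:
# peak GB/s = bus width (bits) x per-pin data rate (Gbps) / 8
def mem_bandwidth_gbs(bus_width_bits: float, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

# Illustrative round numbers (roughly GDDR6 ~20 Gbps vs GDDR7 ~30 Gbps class memory):
print(mem_bandwidth_gbs(256, 20))  # 640.0 GB/s  (a 256-bit GDDR6 card)
print(mem_bandwidth_gbs(384, 20))  # 960.0 GB/s  (a 384-bit GDDR6 card, 7900XTX class)
print(mem_bandwidth_gbs(256, 30))  # 960.0 GB/s  (a 256-bit GDDR7 card, RTX5080 class)
```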
-
Sure, and that makes sense. But let's see. AFAIK, the MSRP of the 4070 Super is $600 (582.23€). It sure doesn't seem like we got close to it in practice, even with the cheapest bottom-of-the-barrel dual-fan models like you say, looking at the search engine/app for the very best prices in my country (it covers over 20 retailers, and also includes Amazon Spain). So, as for the "False"... ? If the gap is that big, my calculations say it has to do with taxes (23% in my country), plus the retailer margin/fee - so, in line with what you say. Yet you can see prices fluctuate immensely (those you see there are the lowest of the low, and most stores go much, much higher than that!). That's why MSRP is basically "fictional" to me and to those in my country (Portugal), which belongs to the EU, because it hasn't corresponded to reality since 2020 (it did before!). ...and please, don't even get me started on the 4080s and 4090s (close to a comical horror movie)...
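To put that "MSRP vs. reality" point into numbers, here's a quick sketch of how a US MSRP turns into an EU shelf price once VAT (23% here in Portugal) and a retailer margin are added; the margin figure is just an assumption for illustration, not a real quote from any retailer:

```python
# Rough sketch: from a pre-tax MSRP (converted to EUR) to an EU shelf price.
# VAT is 23% (Portugal); the retailer margin is an assumed figure, for illustration only.
def eu_shelf_price(msrp_eur: float, vat: float = 0.23, retailer_margin: float = 0.10) -> float:
    return msrp_eur * (1 + retailer_margin) * (1 + vat)

msrp_4070_super_eur = 582.23  # ~600 USD converted, as mentioned above
print(round(eu_shelf_price(msrp_4070_super_eur), 2))  # ~787.76 -- already close to real local prices
```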
-
Nope. After the pandemic, MSRP became synonymous with "nothing", because that baseline became fictitious -- 25% to 50% over MSRP for the "real price" is normal these days for GPUs. That was not the case up to that point: GPUs were still selling at MSRP, even in Europe.

GTX670 "real price at retailers" was 300€~400€ in 2012 (depending on model version), which is 380€ to 510€ in today's money.
GTX770 "real price at retailers" was 300€~400€ in 2013 (depending on model version), which is 380€ to 510€ in today's money.
GTX970 "real price at retailers" was 300€~400€ in 2014/2015 (depending on model version), which is 370€ to 500€ in today's money.
GTX1070 "real price at retailers" was 350€~450€ in 2016/2017 (depending on model version), which is 440€ to 550€ in today's money.
RTX2070/S "real price at retailers" was 375€~475€ in 2018/2019 (depending on model version, and after the AMD RX5700XT came out), which is 450€ to 570€ in today's money.

Notice a pattern here? The 70 series had always been a reference point in each Nvidia generation, because it represented a sort of "sweet spot": fast enough and not outrageously expensive. Where has that been since the pandemic? Or, more precisely, since the RTX3070, which was always sold at ridiculous prices? (often seen at over 750€!)

You see, the pandemic and the mining craze were a justification for the outrageous prices back then, but all of that has been gone for years. Yet that pricing practice was, and is, maintained by the manufacturers - we've been duped. Do your own quick research: at least in Europe, the RTX4070 and RTX4070 Super still sell for 700€ to 850€, to this very day. And now, does anyone really believe the upcoming RTX5070 will sell at lower prices than that? "MSRP, where art thou?"

PS: you can also apply that to the 60, the 80 and, to some extent, the 90 series as well.
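For those wondering how I get the "in today's money" figures above, it's a simple cumulative inflation adjustment; a rough sketch, with factors backed out of my own numbers above rather than taken from official statistics:

```python
# Adjust a historical retail price to "today's money" using a cumulative inflation factor.
# The factors below are rough approximations implied by the figures above, not official CPI data.
def in_todays_money(price_eur: float, cumulative_inflation: float) -> float:
    return price_eur * cumulative_inflation

approx_factor_since = {2012: 1.27, 2014: 1.25, 2016: 1.23, 2019: 1.20}  # illustrative only

# e.g. a 300-400 EUR GTX670 bought in 2012 works out to roughly 380-510 EUR today
print(in_todays_money(300, approx_factor_since[2012]))  # ~381
print(in_todays_money(400, approx_factor_since[2012]))  # ~508
```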
-
Some great points in the discussion, can't disagree with any of it. I think what we can all agree on is that the GPU industry/market has changed since the pandemic. And, so far, it has definitely not changed for the better.

We can also see, going through nearly two decades, that the raw performance jump between Nvidia GPU generations has decreased (especially in the mid to high range models). The performance difference of the 5000 series over the 4000 series is expected to be as low as (if not lower than) the 2000 series over the 1000 series, and that was pretty mediocre. Perhaps we've hit the silicon limits. Perhaps this software-based AI rubbish is the (very) unfortunate solution for the foreseeable future.

Also, you should all know by now that, for GPUs, the MSRP ends up being almost fictitious (it's the same story every release). As said before in this topic, it's pretty certain that real prices will be considerably higher than that (for sure in Europe, confirmed through contacts in the retail business), and comparatively higher than the previous generation if you take into account the expected raw performance benefit.

Right now, Nvidia dominates and "it is what it is". AMD hasn't been able to keep up (and I'm not hopeful about the RX9070/XT myself). Intel is slowly getting there (B580/B570), but not yet. And, more than the silicon limits or AI demands, it's this Nvidia "total dominance" that drives the silly prices. Which, unfortunately, many will still gladly sustain (and here I agree with both @The_Nephilim and @kksnowbear). Sometimes I'm dumbfounded at how both PC gamers and HW enthusiasts can be so sheepish, and unable to react to any of it.

Yes, it's a bit of a conundrum... the world economy is not that great, yet thousands upon thousands spend "second-hand car money" on a single overpriced GPU which is usually relevant for only three years, or four with luck. The price and worth of GPUs have been discussed often and, as with everything hardware, are many times in "the eye of the beholder". But, this time around, I suspect it may change.

Personally, by this time last year I was expecting to upgrade my RTX3090 to at least an RTX5080 and, actually, I have now decided against it. To be fair, my RTX3090 still runs absolutely great, plays everything I throw at it in 4K the way I like, with the sole exception of one single title in VR (and we all know which one it is...). I'll hold onto it for as long as I can, calmly wait and see. Not going to spend my hard-earned money this time around, purely as a matter of principle.

Regardless, I'll still say to anyone using an older/slower GPU and looking to upgrade: the used and outlet market is always a valid option (and it has never failed me). There will still be the occasional worthy deal on second-hand or refurbished units of certain models from the previous GPU generation(s), which will still do almost as much and last almost as long as the newest ones, for far, far less. As in, "don't donate another stupid leather jacket to that guy"...
-
Hunting for the stutter-free VR experience.
LucShep replied to Panzerlang's topic in Virtual Reality
No, no. It goes beyond workload spread across CPU cores; that on its own won't solve the problems. What matters is how it all works as a package for all the demanding content that keeps being added, right from the start, and how it manages all the resources (CPU, GPU, RAM, storage, and basically all the I/O load). DCS has many of its roots going back over 20 years, historically hogging resources like a pig since its release back then, and you know what they say about "lipstick on a pig"....
-
Hunting for the stutter-free VR experience.
LucShep replied to Panzerlang's topic in Virtual Reality
Yep. You see, fixing things and adopting a more advanced game engine is inevitable for ED. It's no longer an "it'd be nice" as we thought many years ago; it has become a necessity. With the amount of much higher polycount maps and objects (aircraft and more), a gazillion overkill 32-bit 4K/8K textures (on everything) now being added to the game, plus ever more complex scripts for avionics, systems, weapons, AI, weather and environment effects (and more complex missions too), etc., it can only be delayed for so long before the game becomes unusable as is. Because it has got to a point where you spend absurd money on the strongest hardware but get only a small fraction of its benefit, due to game code/engine constraints. If you've upgraded your system recently, try other games and, in comparison, you'll see a monster jump in performance...
-
Be aware - careful when buying a used RTX4090
LucShep replied to LucShep's topic in PC Hardware and Related Software
With the "AI fake frames" discussion so prevalent around the upcoming RTX5000 series, the RTX4090 will still be relevant and hold its value. People looking into an RTX4090 from the second-hand market (and why shouldn't you, it's a fantastic GPU) really need to be extra careful. This stuff is still happening, and more than ever, it seems...
-
The problem is, we hear that with every new GPU release. (Heard it all before when the 780/Ti got out, then the 980/Ti, and the 1080/Ti, then the 2080/Ti, then the 3080/3090, and 4080/4090, and now 5080/5090...) Then DCS moves along with its own "evolution" (but is it? ...plenty of times it looks like the opposite), and the goalposts are moved further and further away from the capabilities of what you paid so dearly for. And so there we go, rinse and repeat, every two or three years.... I know this is a completely different subject for another topic but, seriously, sometimes you have to wonder if it makes sense.

^^ this right here. That video may sound a bit bitter but, really, it is true. How out of touch can that Jensen-leather-jacket guy be?

ED should also do something more, IMO. Optimizations are within reach and have been ignored for years. It'd also save people some money, because it could mean similar performance while sitting one GPU tier down, and for longer. ....unless ED, and the VR HMD manufacturers, all have a dinner party every year with the Jensen-leather-jacket guy? hmmm
-
LOL Yes, be careful with tricksters selling 2nd-hand high-end GPUs at unrealistically tempting prices. There are lots of damaged cards, and even apparently mint ones with no core or memory left on the PCB, which you can only find out after taking the whole thing apart (so, a major scam). Some go even further with the scam....
-
PCIe 5.0 devices are backwards compatible with older PCIe slots (4.0 and 3.0) on motherboards. You're actually not losing much (if anything at all; my guess is nothing at all) by running any of those PCIe 5.0 GPUs on the motherboard for that 5800X3D (which is PCIe 4.0), so long as the PCIe slot in the motherboard is set to x16 in the BIOS, to get the full speed/bandwidth the slot can give (I hope that makes sense?). If unsure whether the graphics PCIe slot is running at x16, run GPU-Z, click the "?" symbol, and run the small test for a few seconds (you can stop it after a short while). It'll then show whether it's at x16, x8 or x4.

As for a recommendation, and based on personal experience, for VR I'd suggest Nvidia.... "it just works". LOL IMO, getting a used or refurbished RTX4070Ti Super 16GB (check Amazon and Newegg, there's usually a few at a discount) would be a good purchase for that system. Those should see prices drop right when the new 5070/5070Ti come out, and I suspect those won't be much faster (10% or 15% difference?) than the 4070/4070Ti Super.
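If you'd rather not use GPU-Z, the same check can be done from the command line with nvidia-smi (Nvidia cards only); a minimal sketch, assuming the Nvidia driver is installed:

```python
# Query the current/max PCIe link generation and width of an Nvidia GPU via nvidia-smi.
# Works on Windows and Linux; run it while the GPU is under some load, since the link
# can drop to a lower generation when idle.
import subprocess

fields = "pcie.link.gen.current,pcie.link.gen.max,pcie.link.width.current,pcie.link.width.max"
result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # e.g. "4, 4, 16, 16" -> PCIe 4.0 at x16, which is what you want
```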
-
LOL We've reached such a silly point with GPU prices that even retailers can't resist a bit of trolling!
-
Yeah, an i9 13900K is more processor than should be required, even for an RTX4090. But then, "this is DCS", said Sparta-like as in the movie 300. LOL I'd still keep HAGS set to OFF there (and FWIW, see the thread here). PS: Off-topic, and this will sound disruptive or like I'm selling something but, if DCS VR is getting on your nerves and you've got the disk space, have a go with DCS 2.56 (see my sig). Nothing to lose with it other than time.
-
HAGS is recommended only if your CPU is considerably weaker than your GPU; otherwise it's always best left OFF, to avoid stuttering/hitching in games. One of the problems with DCS has to do with its own code. Even with the addition of multi-threading you still end up somewhat CPU limited, even with the fastest processors you can throw at it. You notice this in-game when you look at the side scenery from inside the cockpit.... everything inside it and in front looks quite smooth, but glancing out at your 9 or 3 o'clock you still always get that rhythmic stutter. Nothing I've attempted has solved it so far (only going back to the much older 2.56 version did, oddly enough). Maybe when Vulkan gets out, things will improve considerably, as we've wished for many years now. But I'm not holding my breath for that...
-
Actually, it really doesn't create trouble there. Both the Upscaling and the Frame Generation tech from Nvidia work very well resource-wise; they've been a total success in that regard. The problem is in the execution, because it's all about the AI working in the background predicting what it doesn't know - hence all the artifacts, which can only be mitigated so far. No matter how good the tech ever gets, the AI cannot predict the future, like where the user is moving and what he/she is doing, so the tech is a "best-guess estimation" magic trick.

The "fake frames" of Frame Generation are exactly that. It interpolates the image just like the "soap opera" effect you can turn on in your TV, or that some broadcast channels use. It's really just that, in a more elaborate AI-driven way. So, let's say this new Multi-Frame Generation on the RTX5000 gives 4x the performance.... you get an image-motion "soap-opera effect" that is actually running four times slower underneath, because three of the four frames are "guess-estimated", not real frames (just one is). As in, imagine the game at 120FPS, but the main player object in the game (aircraft / car / character, etc.) always feels like 30FPS (this is why it always feels strange: what you see and what you feel differ). And that's what they meant in the presentation with "RTX5070 as fast as RTX4090" (LOL), when in reality it isn't, at all.

The same can be said for DLSS: it lowers the resolution and then re-upscales it (à la DLDSR) through an AI algorithm. Hence the very soft look in its final result, for which you always need to add a bit of sharpness (which then tends to add a bit of shimmering and aliasing as a side-effect). Some users resort to the DLSS-Tweaks tool in an attempt to counter its issues. This one is actually good for very high resolutions, and it'll surely have a place once 4K 240FPS and 8K 120FPS become the norm (we'll get there in the coming years), but then we're talking 2D monitors, not VR, where DLSS usually looks like vaseline has been plastered on your lenses.

None of these solutions work great for VR, because every little rendering flaw is hyper-magnified in VR, so raw power is absolutely necessary. ...which then has to do with your next point: if you have an RTX4090 then, no, I don't think the upgrade to the RTX5090 makes sense (unless you've got money to burn and "have to have the bestest", that is). But if you're coming from, say, an RTX3080 or 2080Ti (and the like), and DCS VR is your main hobby, then it starts to make sense - if you have the budget.

The point is, DCS is an odd beast in the world of gaming. It's probably the worst and most demanding sim/gaming title regarding VRAM consumption and inconsistent heavy rendering, with occasional stuttering and hitching always creeping in - always amplified in VR. Especially in VR, with the latest headsets, you need as much raw power (core and memory clocks, speed and bandwidth) and VRAM (memory capacity) as possible to counter the issues, and that is available at its fullest only on the RTX x090 GPUs. I shiver just imagining ED deciding to adopt Ray-Tracing and Path-Tracing.... LOL
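To put that "120FPS that feels like 30FPS" point into numbers, here's a quick sketch (simplified: it ignores the FG pipeline's own overhead and any Reflex-style latency reduction):

```python
# Simplified frame generation arithmetic: with Nx frame generation, only 1 in N displayed
# frames is actually rendered by the game, so the game state only "responds" at that rate.
def fg_breakdown(displayed_fps: float, fg_factor: int):
    rendered_fps = displayed_fps / fg_factor
    rendered_frame_time_ms = 1000.0 / rendered_fps    # how often the real game state advances
    displayed_frame_time_ms = 1000.0 / displayed_fps  # how often a frame (real or fake) is shown
    return rendered_fps, rendered_frame_time_ms, displayed_frame_time_ms

# 4x Multi-Frame Generation showing "120 FPS": only 30 real frames per second,
# so the aircraft you're actually flying still only updates every ~33 ms.
print(fg_breakdown(120, 4))  # (30.0, 33.33..., 8.33...)
```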
-
Yep, it's more a case of "be careful what you wish for". The pretty numbers in the corner of the screen get higher, sure, but the user experience does not. Frame Generation adds input latency, as well as the artifacts introduced by the fake frames. And if used in conjunction with DLSS Upscaling, it all gets a whole lot(!) worse. Especially for VR users, these AI techniques are meaningless and are -in my experience- horrible when used there.

There's a growing sense of disillusionment with these newly announced GPUs; people are now starting to see the problem with "where we are going" with these AI solutions. But then, as soon as the first RTX generation of Nvidia GPUs was announced in 2018 (the RTX2000 series), we knew it'd come down to this at some point, exactly as we feared back then. We've seen ludicrous price increases with every new GPU generation since, while they slowly become less and less about raw performance (rasterization). Instead, the focus now goes to these Upscaling and Interpolation AI techniques, and each new iteration gets locked out of previous-gen GPUs, to generate sales of the newest models.

The problem is, FG and DLSS aren't a proper solution for most cases; raw power and proper optimization of games is. But game devs have now become lazy and use this AI tech as a crutch. So here we go on a new vicious loop, which has become another business exploitation by the corpos...
-
Yes, prices will reflect that again. Don't expect the RTX5090 in the coming months to sell for the same price that RTX4090 models go for now. I wouldn't be surprised to see regular RTX5090 prices at 3000€ and over. Among other things, 32GB of GDDR7 on a faster RTX5090 was never going to come at the same price as 24GB of GDDR6X (a 50% increase in capacity, and faster VRAM). I think the same story that happened with the RTX3090 over two years ago (when the 4000 series came out) will happen again with the RTX4090: second-hand and refurbished RTX4090 models will still be highly sought after and won't see prices decrease any time soon. It's a shame that ED and the 3rd parties are so stubborn about keeping such overkill (ludicrous, really) sizes and formats for the textures of every module. No one should need more than 16GB of VRAM today, but alas. Otherwise, an RTX5080 would be all that anyone needs for DCS, even in VR.
-
As said, they will not compete in the higher-end segment this time around, betting on the mid and lower segments. But indeed, AMD is looking like the biggest loser so far, unfortunately. The latest speculation is that AMD decided to drop any mention of the RX 9070 / 9070XT at CES at the last minute, after seeing what Nvidia showed for the RTX5070 (performance, new tech, and price) that it will compete with. As in, either AMD reassesses (and hence postpones) its presentation and also lowers the MSRP (to undercut the competing RTX5070), or they may be dead in the water. FSR4 will now feature their own AI tech and be exclusive to the AMD 9000 series GPUs (that has been announced). But then, if that market ends up being about 10% (if that much?) of the global one, how many developers are going to spend time and resources adopting such exclusive tech in their PC games?
-
The new RTX 5000 series is expected to be 20%~25% faster (rasterization) depending on the game title, and that's about it. Their prices (real ones, not MSRP) should reflect this as well. It's looking like a repeat of what we saw with the RTX 2000 series back in late 2018, where the performance lift was incremental (not that big compared to the previous generation) and the focus was on proprietary feature advancements, and that's where those big performance-increase numbers are coming from: DLSS4 (with FG).

Frame Generation (FG), aka "fake frames", is a mediocre solution because it always introduces a huge amount of input lag and ghosting, and no AI solution can fix that. It's a miserable experience, unless you drop your standards so low that you accept it in defeat, as a desperate attempt to increase framerate. That said, it's only once the reviews are out that we'll find out how it really is.

Now that's good to know, but those are the Founders Edition models from Nvidia (which, as usual, will be out of stock very quickly once they're made available). So that's still "MSRP", and not the final retail prices of the AIB models that most users end up getting from retailers. People who think they'll buy those at such prices are in for quite the disappointment.
-
It's been a while and demand will be high; lots of people have been waiting for these to upgrade, and it'll be another crazy race (scalpers warning) as soon as they're out. My guess is that prices will be noticeably higher than that. The announced USA prices are good, but those are just MSRP ("Manufacturer's Suggested Retail Price", emphasis on suggested). For Europeans, better to expect prices at least 25% higher than that (they always are), so something like this:
RTX5090 ---> ~2500€
RTX5080 ---> ~1250€
RTX5070Ti ---> ~950€
RTX5070 ---> ~700€
Anyway, let's hope the performance increase is worth the long wait (GDDR7 should be great), and that prices don't get stupidly inflated.
-
I only trialed those modules, and it was quite some time ago. I understand the appeal and place of jet trainers, and I'd bet they are grossly overlooked by most users (but shouldn't be). The thing is, after trialing them, I really think it's one of those "to each their own" cases, a bit of a personal/subjective preference or taste that some will enjoy and others not so much. I tend to prefer simpler and older aircraft systems (as in, less modern and computerized) and don't enjoy FBW stuff as much, but the problem I had with those jet trainers when trialing them was that, although they feel great (IMO), once you try to do something more "serious" (combat wise), or reach for more "ooomph", they (obviously) fall short. So I just tend to go for Cold War jets (50s, 60s and 70s), perhaps because I also enjoy their quirks. Something about a MiG-21Bis or an F-5E, even an F-86F and MiG-15Bis, just feels more interesting and makes you want to come back to them. That said, I totally get why such jet trainers were picked in the first place, and I'm glad they exist as modules for DCS. Actually, I often wonder why a couple of those modules weren't made the "free" aircraft included in DCS World, instead of the Su-25T or TF-51D.....