
Everything posted by LucShep

  1. The problem is, we hear that with every new GPU release. (heard it all before when the 780/Ti came out, then the 980/Ti, and the 1080/Ti, then the 2080/Ti, and then the 3080/3090, and 4080/4090, and now 5080/5090...) Then DCS moves along with its own "evolution" (but is it? ...plenty of times it looks like the opposite), and the goal posts move further and further away from the capabilities you paid so dearly to get. And so there we go, rinse and repeat, every two or three years.... I know this is a completely different subject for another topic but, seriously, sometimes you gotta wonder if it makes sense. ^^ this right here. That video may sound a bit bitter but, really, it is true. How out of touch can that Jensen-leather-jacket guy be? ED should also do something more, IMO. Optimizations are within reach and have been ignored for years. It'd also save people some money, because it could mean similar performance with a GPU one tier down, and for longer. ....unless ED, and the VR HMD manufacturers, all do a dinner party every year with Jensen-leather-jacket guy? hmmm
  2. LOL Yes, be careful with tricksters selling 2nd hand high-end GPUs at unreal, tempting prices. There are lots of damaged and even apparently mint cards that have no core or memory on the GPU's PCB, something you can only find out after dismounting the whole thing (so, a major scam). Some even go further in the scam....
  3. PCIe 5.0 devices are backward compatible with older PCIe slots (4.0 and 3.0) in motherboards. You're actually not losing much (if anything at all, my guess is nothing at all) by running any of those PCIe 5.0 GPUs on the motherboard for that 5800X3D (which is PCIe 4.0), as long as the PCIe slot in the motherboard is set at x16 in the BIOS, to get the full speed/bandwidth that the slot can give (I hope that makes sense?). If unsure whether the PCIe slot for the graphics card is at x16, then run GPU-Z and click the "?" symbol, and run the small test for a few seconds (you can stop it after a short while). It'll then show if it's @ x16, x8 or x4 (see image right below, and the small command-line check right after this post). As for a recommendation, and based on personal experience, for VR I'd suggest Nvidia.... "it just works". LOL IMO, getting a used or refurbished RTX4070Ti Super 16GB (check Amazon and Newegg, usually there are a few at a discount) would be a good purchase for that system. Their prices should drop right when the new 5070/5070Ti come out, and I suspect those won't be much faster (10% or 15% difference?) than the 4070/4070Ti Super.
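     If you'd rather not run GPU-Z, a minimal command-line sketch of the same check is below - assuming an Nvidia card with the nvidia-smi tool available on PATH (the query field names can vary slightly between driver versions):

     # Minimal sketch: report the GPU's current PCIe link generation and width,
     # as an alternative to the GPU-Z "?" render test mentioned above.
     import subprocess

     fields = "pcie.link.gen.current,pcie.link.gen.max,pcie.link.width.current,pcie.link.width.max"
     out = subprocess.run(
         ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
         capture_output=True, text=True, check=True,
     ).stdout.strip()

     gen_cur, gen_max, width_cur, width_max = [v.strip() for v in out.split(",")]
     print(f"PCIe link: Gen{gen_cur} x{width_cur} (card supports up to Gen{gen_max} x{width_max})")
     # Note: the link can idle at a lower state; put the GPU under some load first
     # (like GPU-Z's render test does) before trusting the "current" values.

     Same idea as the GPU-Z check: if it reports x16 on that PCIe 4.0 board, you're getting the full bandwidth the slot can provide.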
  4. LOL We got to such a silly point with GPU prices that even retailers can't resist a bit of trolling!
  5. Yeah, an i9 13900K is more processor than should be required, even for an RTX4090. But then "this is DCS", said Sparta-like as in the 300 movie. LOL I'd still keep HAGS set to OFF in there (and FWIW, see thread here). PS: Off-topic, and this will sound disruptive or like I'm selling something but, if DCS VR is getting on your nerves and you've got the disk space, have a go with DCS 2.5.6 (see my sig). Nothing to lose with it other than time.
  6. HAGS is recommended only if your CPU is considerably weaker than your GPU; otherwise it's always best left OFF, to avoid stuttering/hitching in games. One of the problems with DCS has to do with its own coding. Even with the addition of Multi-Threading you still get somewhat CPU limited, even with the fastest processors that you throw at it. Notice this in-game, when you look at the side scenery from inside the cockpit.... everything inside it and in front looks quite smooth, but glancing outside at your 9 or 3 o'clock you still always get that rhythmic stutter. Nothing I've attempted has solved it so far (only going back to the much older 2.5.6 version did, oddly enough). Maybe when Vulkan gets out, things will improve considerably, as we've wished for many years now. But then I'm not holding my breath for that...
  7. Actually, it really doesn't create trouble there. Both the Upscaling and the Frame Generation tech from Nvidia work very well resource-wise; it's been a total success in that regard. The problem is in the execution, because it's all about the AI working in the background predicting what it doesn't know - hence all the artifacts, which can only be mitigated so far. No matter how good the tech ever gets, the AI cannot predict the future, like where the user is moving and what he/she is doing, so the tech is all "best guess estimation" magic-trick work.

     The "Fake Frames" of Frame Generation are exactly that. It interpolates the image just like the "soap opera" effect that you can turn on in your TV, or that some broadcasting channels use. It's really just that, in a more elaborate, complex AI way. So, let's say this new Multi-Frame Generation of the RTX5000 gives 4x the performance.... you'll get an image-motion "soap opera effect" that is actually working four times slower in the background, because three of the four frames are "guess estimated", not real frames (just one is). As in, imagine the game at 120FPS, but the main player object in game (aircraft / car / persona, etc.) always feels like 30FPS (this is why it always feels strange - what you see and what you feel differ). And that's what they meant in the presentation with "RTX5070 as fast as RTX4090" (LOL), when in reality it isn't, at all.

     The same can be said for DLSS: it lowers the resolution and then re-upscales it (a-la DLDSR) through an AI algorithm. Hence the very soft look of its final result, for which you always need to add a bit of sharpness (which then tends to add a bit of shimmering and aliasing as a side effect). Some users resort to the DLSS-Tweaks tool in an attempt to counter its issues. This one is actually good for very high resolutions, and it'll surely have a place once 4K 240FPS and 8K 120FPS become the norm (we'll get there in the coming years), but then we're talking 2D monitors, not VR, where DLSS usually looks like vaseline has been plastered on your lenses.

     None of these solutions work great for VR, because every little flaw of anything rendering related is hyper-magnified in VR, and so raw power is absolutely necessary. ...which then has to do with your next point: if you have an RTX4090 then, no, I don't think the upgrade to the RTX5090 makes sense (unless you've got money to burn and "have to have the bestest", that is). But if you come from, say, an RTX3080 or 2080Ti (and the like), and DCS VR is your main hobby thing, then it starts to make sense - if you have the budget.

     The point is, DCS is an odd beast in the world of gaming. It's probably the worst and most demanding sim/gaming title regarding VRAM consumption and inconsistent heavy rendering, with the occasional stuttering and hitching always creeping in - always amplified in VR. Especially in VR, with the latest headsets, you need as much raw power (core and mem clocks, speed and bandwidth) and VRAM (mem capacity) as possible to counter the issues, and that is available at its fullest on the RTX X090 GPUs only. I shiver just imagining if ED decides to adopt Ray-Tracing and Path-Tracing.... LOL
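     To put that "working four times slower in the background" point into numbers, here's a tiny back-of-the-envelope sketch (my own illustration of the arithmetic, nothing official from Nvidia):

     # Illustration of the "fake frames" arithmetic described above.
     def frame_gen_breakdown(displayed_fps: float, multiplier: int) -> None:
         # multiplier=4 models 4x Multi-Frame Generation: of every 4 displayed
         # frames, 1 is actually rendered and 3 are AI-interpolated.
         rendered_fps = displayed_fps / multiplier
         print(f"{displayed_fps:.0f} FPS displayed -> {rendered_fps:.0f} FPS actually rendered "
               f"({1000 / rendered_fps:.1f} ms between real frames, vs "
               f"{1000 / displayed_fps:.1f} ms between displayed ones)")

     frame_gen_breakdown(120, 4)  # the "120FPS that feels like 30FPS" example above
     frame_gen_breakdown(120, 2)  # regular 2x Frame Generation, for comparison

     The game's input sampling and simulation still follow the rendered cadence, which is why what you see on screen and what you feel in the controls don't match.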
  8. Yep, it's more a case of "be careful what you wish for". The pretty numbers in the corner of the screen get higher, sure, but the user experience does not. Frame Generation adds input latency, as well as the artifacts introduced by the fake frames. And if used in conjunction with DLSS Upscaling, then it all gets a whole lot(!) worse. Especially for VR users these AI techniques are meaningless and are -in my experience- horrible when used there.

     There's a growing sense of disillusionment with these newly announced GPUs; people are now starting to see the problem with "where we are going" with these AI solutions. But then, as soon as the first RTX gen of Nvidia GPUs was announced in 2018 (RTX2000 series), we knew it'd come down to this at some point, exactly as we feared back then. We've seen ludicrous price increases with every new gen of GPUs since then, and they now slowly become less and less developed for raw performance (rasterization). Instead, the focus goes onto these upscaling and interpolation AI techniques, and each new iteration gets blocked on previous-gen GPUs, to generate sales of the newest models.

     The problem is, FG and DLSS aren't a proper solution for most cases; raw power and proper optimization of games is. But game devs have now become lazy and use this AI tech as a crutch. So here we go on a new vicious loop, which has become another business exploitation by corpos...
  9. Yes, prices will reflect that again. Do not expect the RTX5090 in the coming months to sell for the same price that RTX4090 models go for now. I wouldn't be surprised to see regular RTX5090 prices at 3000,00€ and over. Among other things, 32GB of GDDR7 on a faster RTX5090 would never be sold at the same price as 24GB of GDDR6X (a 50% increase in capacity, and faster VRAM). I think the same story that happened with the RTX3090 over two years ago (when the 4000 series came out) will happen again with the RTX4090. Second hand and refurbished RTX4090 models will still be highly sought after and won't see prices decreasing any time soon. It's a shame that ED and 3rd parties are so stubborn about keeping such overkill (ludicrous, really) sizes and formats for the textures of every module. No one should ever need more than 16GB of VRAM today, but alas. Otherwise, an RTX5080 should be all that anyone needs for DCS, even in VR.
  10. As said, they will not compete in the higher-end segment this time around, betting on the mid and lower segments instead. But indeed, AMD is looking like the biggest loser so far, unfortunately. The latest speculation is that AMD decided to drop any mention of the RX 9070 / 9070XT at CES at the last minute, after seeing what Nvidia showed for the RTX5070 (performance, new tech, and price) that it will compete with. As in, either AMD will have to reassess (and hence postpone) its presentation and also lower the MSRP (to undercut the RTX5070 it competes with), or they may be dead in the water. FSR4 will now feature their own AI tech and be exclusive to the AMD 9000 series GPUs (that much has been announced). But then, if that market will be about 10% (if that much?) of the global one, how many developers are going to waste time and resources adopting such exclusive tech into their PC games?
  11. The new RTX 5000 series is expected to be 20%~25% faster (rasterization) depending on the game title, and that's about it. Their prices (real ones, not MSRP) should reflect this as well. It's looking like a repeat of what we saw with the RTX 2000 series back in late 2018, where the generational performance lift was incremental (not that big compared to the previous generation) and the focus was on proprietary feature advancements, and we can see that's where those big performance-increase numbers are coming from: DLSS4 (with FG). Frame Generation (FG), aka "fake frames", is a mediocre solution because it always introduces a huge amount of input lag and ghosting, and no AI solution can fix that. It's a miserable experience, unless you drop your standards so low that you accept it in defeat, as a desperate attempt to increase framerate. That said, it's only once the reviews are out that we'll find out how it really is. Now that's good to know, but then those are the Founders Edition models from Nvidia (which, as usual, will be out of stock very quickly once they're made available). So that's still "MSRP", and not the final retail prices of the AIB models that most users always end up getting from retailers. People who think they will buy those at such prices are in for quite the disappointment.
  12. It's been a while and demand will be high; lots of people have been waiting for them to upgrade, and it'll be another crazy race (scalpers warning) as soon as they get out. My guess is that prices will be noticeably higher than that. The announced USA prices are good, but then those are just MSRP ("Manufacturer's Suggested Retail Price", emphasis on suggested). Now for Europeans, better expect prices to be at least 25% higher than that (they always are), so something like this:

     RTX5090 ---> ~2500€
     RTX5080 ---> ~1250€
     RTX5070Ti ---> ~950€
     RTX5070 ---> ~700€

     Anyways, let's hope the performance increase is worth the long wait (GDDR7 should be great), and that prices don't get stupidly inflated.
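     For what it's worth, here's the rough arithmetic behind those estimates as a small sketch - assuming the announced US MSRPs of $1999 / $999 / $749 / $549, a roughly 1:1 €/$ conversion before taxes, and ~25% on top for VAT plus the usual European retailer markup:

     # Rough EU street-price estimate, matching the figures in the post above.
     us_msrp = {"RTX5090": 1999, "RTX5080": 999, "RTX5070Ti": 749, "RTX5070": 549}
     eu_markup = 1.25  # "at least 25% higher", per the post

     for model, usd in us_msrp.items():
         estimate = round(usd * eu_markup, -1)  # round to the nearest 10€
         print(f"{model}: ~{estimate:.0f}€")

     That lands at roughly 2500€ / 1250€ / 940€ / 690€, close to the figures above; actual retail prices will of course also depend on the exchange rate and on availability.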
  13. I only trialed those modules, and it was quite some time ago. I understand the appeal and place of jet trainers, and I'd bet that they are grossly overlooked by most users (but shouldn't be). The thing is, after trialing them, I really think it's one of those "to each their own" things, a bit of a personal/subjective preference or taste that some will enjoy and others not really. I tend to prefer simpler and older aircraft systems (as in, less modern and computerized) and don't enjoy FBW stuff as much, but the problem I had with those jet trainers when trialing them was that, although they feel great (IMO), once you try to do something more "serious" (combat wise), or reach for more "ooomph", they (obviously) fall short. So, I just tend to go to Cold War jets (50s, 60s and 70s), perhaps because I also enjoy their own quirks. There's something about a MiG-21Bis or an F-5E, even an F-86F or MiG-15Bis, that just feels more interesting and makes you want to come back to them. That said, I totally get why such jet trainers were picked in the first place and am glad that they exist as modules for DCS. Actually, I often wonder why a couple of those modules weren't made the "free" aircraft included in DCS World already, instead of the Su-25T or TF-51D.....
  14. This is one of those non-consensual topics because.... "it depends". At least in my experience, I'd say it really "depends" on the situation, person and scenario.

     If you have the space, and it's mostly for flight sims, then yes, it's worth getting a big 16:9 screen (42'' to 55'' size), be it a proper monitor or a decent 4K TV with gaming features. Notice I mention 16:9, but don't mention 21:9 or 32:9 widescreen formats. This is because, contrary to racing sims, which operate mostly from a flat point of view (mostly looking center, with slight left/right glances for corners and overtaking/disputes), in flight sims we operate in full 3D space (especially when dogfighting). So, losing the precious vertical real estate wastes a vital piece of immersion for flight-simming. Therefore, 21:9 and 32:9 widescreens are not the best choice for sims like DCS World, though they do excel with racing sims. Personally, I'd not recommend a 49" ultrawide monitor for DCS or flight sims in general (I had a Samsung C49 here a few years ago and ended up hating it).

     VR is not for everyone, as said. It "clicks" with some (immediately or not) and it doesn't with others. I fully agree that VR is a step well above (waaaaay above) in regards to immersion, and a totally different experience altogether. Basically, you're "there" in the game, in the cockpit. But you also get visually detached from all your physical surroundings, which can be annoying (if not disconcerting at first). I never looked at "pancake" (monitor) the same way again for DCS once I got mildly used to a VR headset. Yes, it took a bit of time to get used to it (initially getting heavy nausea). But the main problem with VR, in my opinion, is that it requires a LOT more hardware horsepower (also lots of fiddling with settings), which means you need to be ready to take the time to read/learn stuff and set it right, and to sacrifice a good deal of eye-candy, to keep framerate as high as possible and frametimes as low as possible. And that last part is a really big problem with DCS, especially if your preference lies with the "heavyweight" modules and maps, and even worse in multiplayer. As said above, not everyone is willing to sacrifice graphical fidelity to ensure performance/immersion, and that'll happen regardless in DCS VR (no current hardware solves it).

     Anyways, and to close the long post... I think if you got to the point of asking, then it means you'll always be tempted to at least try VR and see how it is, even if you're happy with a big screen (that was the case for me). So..... rather "just do it" and see for yourself. Just don't go wasting a big budget on a top fancy VR headset; disregard the latest (and even more demanding) trendy models that you'll see praised now, and instead just get a good and proven affordable one, even if it's 2nd hand (for example, the Quest 3 or the Pico 4, or the HP Reverb G2 if you still use Win10). If it doesn't work for you, then resell it later.
  15. So, this may be the single worst title in the gaming universe when it comes to VRAM consumption, especially in VR, and you're going again for overkill-sized textures? You haven't learned a thing, have you? DCS World, as it is, is already very problematic in this aspect, so why exacerbate the issue even more? ....why?? To please the few close-up screenshot nerds? Is this a screenshot contest game, or a practical, usable simulator able to accommodate the widest range of hardware from its user base? Huge thumbs down. I honestly don't understand who makes such decisions about hardware resource management when developing modules and maps. It just shows how disconnected ED is from the current issues, IMO.
  16. You're taking a comparison with an older version of DCS, and there have been multiple AGESA updates for AMD motherboards since then that have brought better/more stable performance. The thing is, as others said here, the X3D chips from AMD have a very clever "cheat", which is the 3D V-Cache. And that, on its own, is for now an unbeatable crutch over any downsides that the AMD chiplet design does have. It's a feature so good with CPU-limited games, including the simulation genre (DCS, Assetto Corsa Competizione, etc.) - especially on the single-CCD parts like the 7800X3D and 9800X3D - that it makes them the best gaming chips. Though they're definitely not as good for production/work tasks, are low in stock, and also (IMO) have become too expensive.

     That said, it makes sense why the i9 12900K holds up so well, even today - it's a good monolithic CPU design, which has significant advantages in latency (noticeably lower) over chiplet designs, such as those from AMD and the newest Intel (Arrow Lake) as well. And another upside: it's now much cheaper (half the price of the 9800X3D) and it doesn't suffer from any degradation whatsoever (as seen in the later Raptor Lake). If Intel's 12th gen Alder Lake and 13th/14th gen Raptor Lake had had 3D V-Cache, they'd certainly be unbeatable CPUs for gaming but, alas.... it's an AMD feature (so far).

     Although it is now a dead platform, the i9 12900K is still a valid solution and a great price/performance choice if somewhat on a budget. In such a situation, I'd still recommend it with a mid-range Z690 or Z790 motherboard (Asus TUF Gaming Plus, MSI Tomahawk), new or used. These also exist in DDR4 versions (not just DDR5 versions), which means they may allow re-use of DDR4 memory (in case you already have 64GB of DDR4 RAM) with a negligible difference in gaming performance.

     One thing is certain... avoid Intel 15th gen Arrow Lake (or rather, "Error Lake"), because of its inconsistency and huge latency penalty for gaming. It teased with great expectations but turned out to be the biggest flop I've seen since the AMD Bulldozer days.
  17. Try DCS 2.5.6, runs much better on older machines: https://www.digitalcombatsimulator.com/en/files/3319459/
  18. 9800X3D versus 14900KS @ 6GHZ (MAX OC) BENCHMARKS
  19. I haven't flown helicopters for some three years now (since 2.5.6 and 2.7), but I don't recall having issues with grass when using the Huey or the Hip on Caucasus and Persian Gulf. The fact that the Apache is one of the most demanding modules, and that Syria is more demanding than the Caucasus or Persian Gulf, may have something to do with it. Perhaps grass requires a lot more processing now than what I recall back then (the recent wind physics thing?). And as helicopters fly very low most of the time, that could explain it. Either way, I'm an advocate of "use what works for you". Thanks for the tip on the grass.
  20. ??? + infinite (I win? )
  21. Absolutely. It's the job of the server host to warn about and provide the list of mods (and respective versions) in use in that server session. The rest is up to people. ...just imagine joining a server that has various DCS aircraft mods AND asset packs AND a terrain textures mod AND liveries.....(the list goes on). With autodownload we'd be talking about possibly several gigabytes of mods. And possible conflicts with other similar mods covering the same aspects, which you might actually prefer and/or already have installed. Not to mention the obvious amount of waiting time lost with such a process. It's been done in some RTS and FPS games but makes no sense at all for DCS.
  22. Yeah, the ever-changing nature of DCS, it's both a boon and a bane. And unfortunately, as always, mod projects become part of the "collateral damage" of the changes from updates. Ultimately, with Taz's absence since earlier this year, and like his mod project before it, yours has been the solution for a long-neglected aspect. Many thanks for keeping at it.
  23. No, that makes no sense. Don't do it, it's not necessary for maps. Such optimizations work for the core game and modules because those only use a single set of textures. That's not the case for maps. CGTC already includes both "HIGH" and "LOW" resolution texture packages, so it makes no sense to resize anything in this mod. And the same applies to all other maps, the ones that you buy for the game. Leave the terrain textures as they are and, if you need better performance and/or lower VRAM usage on your GPU, then just set "Terrain Textures" to LOW (in the game options). You basically get the same result by doing that as you would by "optimizing" terrain textures. In short, there's no need to optimize anything here, and CGTC should always be installed last, after such optimized texture mods (batch process or not).