Everything posted by LucShep

  1. That was my point. My dad was very much in the race for a long time; in fact, he'd been at it since he swapped his second Atari for a PC. As a kid, hand-me-downs from him were powerful enough that I seldom complained about not being able to run something, except when things like SSE2 appeared: at the time I had the last high-end AMD CPU that didn't support it, which rather sucked once it became a hard requirement. At one point he just decided further upgrades weren't worth the money; the RTX 20 series being a bit of a dud had a lot to do with it, and the 30 series launch pricing sealed the deal. The "target consumers" are getting tired of ballooning prices for incremental performance gains. While I haven't yet declared the 3090 will be the last GPU I buy, unless something truly revolutionary is added (more than spamming fake frames, anyway), I expect it to suffice for the foreseeable future. When I do upgrade my rig, it'll likely be because something like retina-level VR came out.
     Well said. I'm not sure what other "gaming" interests people here on the DCS forums have, but I'm quite active in other PC gaming genres (and modding as well). I've been into it for well over two decades (maybe three?) and I clearly see a shift in mentality like I haven't seen for many years, somewhat similar to what I recall from the late 2000s. I'm noticing lots of new people in forums clearly turning to older games, and also to emulators (previous-gen console gaming on PC), finding out that there's a huge list of quality titles providing gameplay fun and enough eye candy. Even more so with modding, which empowers and prolongs the life of such games, which don't require uber hardware and are relatively bug-free at this point. UE5 is getting a pretty bad rap over its optimization issues, quickly becoming a meme among most gaming communities, who'll likely start boycotting any and all games powered by it. Which, in these days of social media and community-driven groups, may not be a minor thing. I'm also noticing increasing numbers of disgruntled DCS users joining the ranks of certain competitor titles. I can only guess, but I'd wager that having far fewer concerns and issues with performance and hardware requirements than you do in DCS (outrageous when it comes to VR) is a major part of it.
     Money doesn't grow on trees and, years after the pandemic (when so many bought decent gaming PCs), more and more people are deciding to keep what they have and use it for as long as they can, rather than spend another small fortune on yet another expensive system (or hardware) upgrade. I'm building or upgrading fewer and fewer computers these days and, of those I do get involved with, more and more are "mid-range" and far fewer are "high-range" systems, even for people I remember spending big bucks on "top stuff" years back. I too decided to hold on to my current GPU, also an RTX 3090 (I think I said so here before). It still runs great, and I'm not going to spend my hard-earned money, also as a matter of principle. The price gouging of GPUs (and other hardware) has become just stupid with every new generation and, as much as I like PC hardware, I won't be a part of it this time around.
  2. If you clicked "Apply Changes" in the NVInspector app, then... it's done! DLSSTweaks is just an alternative (i.e., not required on top of what you've already done), in case you wish to mess around and tweak stuff (I prefer it myself, see here).
  3. By trial and error, as the other method via NVInspector does not give me the same results with DLAA forced onto DLSS profiles (can't live without it). I later found out the same applies to other people on the Reddit and Guru3D forums, who had already arrived at the same results (for example, see here) - it seems I'm not alone in experimenting until stuff gets borked! LOL (in this case it didn't get borked ) Nope, still using the good old 537.58 drivers here (in my experience the best drivers to this day for my RTX3090). No issues whatsoever.
  4. For those accustomed to DLSS Tweaks (an old video tutorial here), you can keep using it with the new DLSS4 .DLL with the transformer model (v310.1.0.0). Just make sure to force preset G. This is because, somehow, forcing preset G will actually make it utilize the J preset (aka "transformer"), which does not appear in DLSS Tweaks for selection. Have to say, with a few settings changed (see image below, I also attach my .INI here if you wish to try it), this version of DLSS seems to work great in Cyberpunk2077. EDIT: actually, I'm noticing some shimmering in the foliage, whereas there is none with "preset C" hmmmm... (NOTE: haven't tested with DCS yet) I'm very impressed. The image clarity is noticeably better and, even in motion, most of the smear is gone. There's a tiny hit in performance, but the improvements are so good that you can even drop down one step to get the performance back (from Quality to Balanced, etc.).
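     For reference, a minimal sketch of the entries that matter in the dlsstweaks.ini (key names as I recall them from the .ini that ships with DLSSTweaks, so double-check against your own copy; the attached .INI is the real thing):

     [DLSS]
     ; optional: force DLAA (render at native resolution) on every DLSS quality level
     ForceDLAA = false

     [DLSSPresets]
     ; forcing preset G here is what ends up engaging the new transformer (J) preset
     ; with the v310.1.0.0 DLL, as described above
     DLAA = G
     Quality = G
     Balanced = G
     Performance = G
     UltraPerformance = G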
  5. For those accustomed to DLSS Tweaks (an old video tutorial here), you can keep using it with the new DLSS4 .DLL with the transformer model (v310.1.0.0). Just make sure to force preset G. This is because, somehow, forcing preset G will actually make it utilize the J preset (aka "transformer"), which does not appear in DLSS Tweaks for selection. Have to say, with a few settings changed (see image below, I also attach my .INI if you wish to try it), this version of DLSS seems to work great in Cyberpunk2077. EDIT: actually, I'm noticing some shimmering in the foliage, whereas there is none with "preset C" hmmmm... (NOTE: haven't tested with DCS yet) I'm very impressed. The image clarity is noticeably better and, even in motion, most of the smear is gone. There's a tiny hit in performance, but the improvements are so good that you can even drop down one step to get the performance back (from Quality to Balanced, etc.). dlsstweaks.ini
  6. GPU-Z will be updated at some point and rectify the sensor reading in the software. But... holy ****!!! 92ºC on the memory temp?? I sure hope GDDR7 is reliable, because (IIRC) GDDR6 used to slowly degrade if kept at 100ºC~105ºC.
  7. Honestly, the more reviews of the RTX 5090 I read/watch, the more I think this is suspiciously built as an AI research GPU meant to be more affordable for smaller businesses, who'll rack a bunch of them with the right software stack, i.e., a cheaper solution compared to the much(!!) more expensive professional Nvidia Hopper GPUs. I know the 90 series was always the "prosumer" model of the lineup and not a "gamer" product but, looking at the rasterization performance (an underwhelming generational improvement), which, by the way, VR also relies on, gaming sometimes looks like a secondary objective of the 5090? And then there's that insane power consumption (undervolting looks like a "must do"), especially with RT (PCWorld got Cyberpunk @ 4K+RT to push it to 700W peaks on the GPU alone!)... and all that added to the CPU bottlenecks? For example, just look at the Hardware Canucks review below and check the stats comparing the 5090 and the 4090. It makes no sense. I'm expecting really disappointing reviews of the RTX 5080 (and also of the 5070Ti and 5070) in the coming days/weeks. Considering the gap between the 5080 and 5090, I'm getting more and more certain that a 5080Ti 20GB or 24GB is going to be built sometime down the line (...in six months? a year? no idea). Regardless, considering that the RTX5090 is the "best case scenario" of the whole line-up, it does look like this really is a disappointing generation of Nvidia gaming GPUs. I'd say hold on to your wallets. Review by Hardware Canucks Review by Level1Techs Review by Gear Seekers
  8. Meanwhile.... TechPowerUp just released an interesting article about PCIe 5.0 scaling on GPUs, and whether it makes sense to spend extra on a PCIe 5.0 (Gen5 x16) motherboard just because of the newest GPUs. (spoiler alert: no need, as expected - a PCIe 4.0 (Gen4 x16) or PCIe 3.0 (Gen3 x16) dedicated slot is still perfectly fine for the newest GPUs) https://www.techpowerup.com/review/nvidia-geforce-rtx-5090-pci-express-scaling/
  9. Heh.... exactly as expected. Now prepare for the even more disappointing reviews of the RTX5080 tomorrow.... Nah, no thanks. I'd rather get a nice second-hand motorcycle (or "whatever"!) than overspend on power-guzzling GPUs in a revolting, price-gouging market, and it won't work miracles in VR for a game that has its own long-running issues yet to be solved anyway.
  10. Touché! For graphics cards, you need an Nvidia RTX 3090, 4090, 5090, or an AMD RX 7900XT or XTX (i.e., more than 16GB VRAM) if you're hoping to run those modules on one of the latest maps in VR. And it seems the trend is spreading to every new module and map. I don't care what "high-end-hardcore-elitism-best-art-uber-detail-fidelity" type of arguments are thrown around to excuse such a poor decision. This is absurd, and we've been saying it for some five years now. Worse, to add insult to injury, reducing the texture setting in the game options means a horrible downgrade in image quality, because it's done through the MIPs of the .DDS textures (modules use a single set of textures; they don't have HIGH and LOW packages, only the terrain maps do). The result is really blurry, as there is no proper texture resizing through a manual process to ensure it's done at the best possible quality. Honestly, at this point I simply give up, it's useless.
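     Just to make the MIP point concrete, a rough sketch (plain Python/Pillow, nothing to do with DCS or .DDS tooling): dropping to the next MIP level is roughly a box-filtered half-size reduction, whereas a manual repack could resize the same texture with a better filter at the same target size.

     # Rough illustration only: box-filtered half-size reduction (roughly what
     # falling back to the next MIP level amounts to) vs. a Lanczos resize to
     # the same target size. Synthetic image so it runs without any asset files.
     import numpy as np
     from PIL import Image

     rng = np.random.default_rng(0)
     size = 1024
     yy, xx = np.mgrid[0:size, 0:size]
     checker = ((xx // 4 + yy // 4) % 2) * 255.0                 # fine checker pattern
     tex = np.clip(checker * 0.7 + rng.normal(0, 20, (size, size)), 0, 255).astype(np.uint8)
     img = Image.fromarray(tex)

     target = (size // 2, size // 2)                              # one MIP step down
     mip_like = img.resize(target, Image.BOX)                     # box filter ~ next MIP level
     manual = img.resize(target, Image.LANCZOS)                   # higher-quality manual resize

     def detail(im):                                              # crude sharpness proxy
         a = np.asarray(im, dtype=np.float32)
         return float(np.abs(np.diff(a, axis=1)).mean())

     print(f"detail, box/MIP-like: {detail(mip_like):.2f}")
     print(f"detail, Lanczos     : {detail(manual):.2f}")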
  11. oh oh.... Hey guys, wanna hear a joke? It's hilarious. Ready? RTX 50 series. *hysterical laughter* And another joke: AMD competition *silence*
  12. Hi Ian, the Logitech (Saitek) X56 is basically the same thing as the old Saitek X55, with the compatibility issues on newer OSes solved. Some things to do for the X56, be it the Logitech or Saitek version: Make sure to connect the X56 through a self-powered USB hub, and the same for the pedals. No need for expensive ones; cheap ones like the TP-Link UH720 ($25.00) are absolutely fine. Keep the LEDs always OFF in the Logitech X56 Profiler software (download here), pretty much a requirement to avoid any issues. If ghosting and jittering problems occur (usual in the throttle and rotaries), do the following for both the throttle and stick (maybe pedals too?): use vJoy + Joystick Gremlin and follow this tutorial (see also the video description for software links):
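     For anyone wondering what that vJoy + Joystick Gremlin chain is actually doing to the jittery axes, here's a tiny conceptual sketch (plain Python, NOT the actual Joystick Gremlin plugin API): the raw axis gets a small deadzone plus an exponential-moving-average smoother before being passed on to the virtual device.

     # Conceptual sketch only: deadzone + EMA smoothing of a noisy axis value.
     import random

     def make_filter(alpha=0.25, deadzone=0.02):
         """alpha: 0..1, lower = smoother but laggier; deadzone: ignore tiny wiggle near centre."""
         state = {"value": 0.0}
         def apply(raw):
             if abs(raw) < deadzone:
                 raw = 0.0
             state["value"] += alpha * (raw - state["value"])     # EMA step
             return state["value"]
         return apply

     smooth = make_filter()
     random.seed(1)
     held = 0.40                                                  # pretend the throttle is held steady
     for _ in range(10):
         raw = held + random.uniform(-0.05, 0.05)                 # sensor jitter
         print(f"raw={raw:+.3f}  filtered={smooth(raw):+.3f}")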
  13. The sad reality is that Nvidia, with such strong domination of the GPU market, and leading in AI solutions (which all AAA game development now also holds onto as a crutch for bad optimization), can put whatever price on their GPUs - they will sell regardless. I've got nothing against premium stuff being priced as such (it has always existed); the fortunate and wealthy will always reach for it anyway. I mean, there are audiophile headphones selling at well over 5,000 Euros. And just look at the prices of high-end PC monitors and TVs.... So, stuff like the RTX 4090 and 5090 is just one more drop in the silly waters of that ocean. If there's a market for such, there's obviously an audience willingly paying for it. My problem is prices on the medium and medium/high range products getting so ludicrously high that it makes the adoption of PC gaming a really expensive affair these days. And it's not really a matter of "just lower the graphical settings in the game options", because some of these games (and to some extent that includes DCS) run and look downright horrible at reduced settings, worse than many older games. The Nvidia 50, 60, 70 and 80 series selling for nearly double their regular price within a matter of five years does not have to do with inflation or currencies falling against the dollar. It's pure greed and blatant exploitation of the market. And, unfortunately, it's something that AMD has also proven to go along with whenever they got a chance.
  14. I didn't read all the pages of this thread (sorry) but, the thing that gives me a "bad vibe" with the F-35 announcement goes beyond its debatable fidelity. It's also the context in which the module is going to be placed. I understand DCS needs to be profitable, and that this is obviously aimed at a certain crowd (a younger userbase, I suppose), perhaps at newcomers (maybe the casual type getting into more in-depth sims) who only want or know about the "newest, most modern stuff". The F-35 may make sense in this context. But then, and this is what makes me wonder about the module... DCS has been mocked for many years now as a "cockpit simulator for nerds", but it is, most of all and in essence, a combat flight simulator that takes itself pretty seriously. In combat, there are two sides. And if you've chosen a "protagonist", you must also have its respective "antagonist". I mean, doing an F-35 knowing there'll be no modern REDFOR counterpart module (J-20, Su-57, etc., not counting the missing 4th-gen ones) makes it all kind of, I don't know, miss the point? Personally, I'm a bit surprised ED went for a "full-fidelity" 5th-gen aircraft, for which I don't think there's much unclassified information to make such a module all that credible. I also have zilch interest (zero, nada, niente) in the F-35 (or any 5th-gen fighter, for that matter) but, most of all, I'm actually disappointed to see that ED went for the wrong "Lightning" as an FF module. Instead, I think the current userbase would've been thrilled with an announcement for a Lockheed P-38 Lightning (WWII warbird) or an English Electric Lightning (CW-era jet fighter, early 1960s), either of which would fit the current content better, and both have a LOT more documentation available to make them feasible (and far less controversial) as DCS modules.
  15. I concur. 16GB VRAM is getting to be short for DCS VR, which also likes very high raster performance and large memory-bus bandwidth. But, to be fair, the Nvidia RTX5080 and 5070Ti are also 16GB and 256-bit mem-bus, and those are considered the upcoming "top gaming models" (the RTX 5090 is enthusiast / prosumer level). AMD did state last year that they are not focusing on the higher-segment models anymore, so that still makes the 9070XT an interesting proposition, if it's close to the RTX5070 but with 4GB more VRAM (plus the promising FSR4), and at a lower price. With all their faults, I think the RX 7900XT 20GB and 7900XTX 24GB should have been at least "polished" (and made FSR4-capable) with prices readjusted, to stay competitive. They could still gather some attention if so. Instead, AMD dropped them altogether.
  16. Exactly. And the pill is even harder to swallow when you've presented DCS to your friends (who are into VR) and they get hooked, only to see how heavy and demanding the game is (and the prices of hardware...). I could almost swear this hobby gets closer and closer to cocaine-addiction prices!
  17. Sure, and that makes sense. But let's see: AFAIK, the MSRP of the 4070 Super is $600 (582.23€). It sure doesn't seem like we get close to that in practice, even with the cheapest bottom-of-the-barrel dual-fan models like you say, judging by the search engine/app for the very best prices in my country (it covers over 20 retailers, and also includes Amazon Spain). So, as for the "False" ............................. ? If it's that much, my math says it has to do with taxes (23% in my country), plus the retailer margin/fee - so, in line with what you say. Yet you can see prices fluctuate immensely (the ones you see there are the lowest of the low, but most stores go much, much higher than that!). That's why MSRP is basically "fictional" to me and to those in my country (Portugal), which belongs to the EU, because it hasn't corresponded to reality since 2020 (it did before!). ...and please, don't even get me started on the 4080s and 4090s (close to a comical horror movie)...
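     For what it's worth, the back-of-the-envelope math I mean looks like this (the exchange rate is the one implied by the 582.23€ figure, and the retailer margin is just an assumption for illustration):

     # Back-of-the-envelope only: why a 600 USD MSRP doesn't land at 600 EUR on the shelf here.
     msrp_usd = 600.00
     usd_to_eur = 582.23 / 600.00        # implied rate from the 582.23 EUR figure (~0.97)
     vat_pt = 0.23                       # Portuguese VAT
     retail_margin = 0.08                # assumed retailer margin/fee, purely illustrative

     pre_tax_eur = msrp_usd * usd_to_eur
     with_vat = pre_tax_eur * (1 + vat_pt)
     shelf_price = with_vat * (1 + retail_margin)

     print(f"pre-tax MSRP in EUR : {pre_tax_eur:7.2f}")    # ~582
     print(f"+23% VAT            : {with_vat:7.2f}")       # ~716
     print(f"+~8% retailer margin: {shelf_price:7.2f}")    # ~773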
  18. Nope. After the pandemic, MSRP became synonymous with "nothing", because that baseline became fictitious -- 25% to 50% over MSRP as the "real price" is normal these days for GPUs. That was not the case up to that point - GPUs were still selling at MSRP, even in Europe.
     The GTX670 "real price at retailers" was 300€~400€ in 2012 (depending on model version), which is 380€ to 510€ in today's money.
     The GTX770 "real price at retailers" was 300€~400€ in 2013 (depending on model version), which is 380€ to 510€ in today's money.
     The GTX970 "real price at retailers" was 300€~400€ in 2014/2015 (depending on model version), which is 370€ to 500€ in today's money.
     The GTX1070 "real price at retailers" was 350€~450€ in 2016/2017 (depending on model version), which is 440€ to 550€ in today's money.
     The RTX2070/S "real price at retailers" was 375€~475€ in 2018/2019 (depending on model version, and after the AMD RX5700XT came out), which is 450€ to 570€ in today's money.
     Notice a pattern here? The 70 series has always been a reference point in each Nvidia generation, because it represented a sort of "sweet spot": fast enough and not outrageously expensive. Where has that been since the pandemic? Or, more precisely, since the RTX3070, which was always sold at ridiculous prices? (often seen at over 750€!) You see, the pandemic and the mining craze were a justification for the outrageous prices then, but all that has been gone for years. Yet that pricing practice was, and is, maintained by the manufacturers - we've been duped. Do your own quick research - at least in Europe, the RTX4070 and RTX4070 Super still sell for 700€ to 850€, to this very day. And now, does anyone really believe the upcoming RTX5070 will be priced lower than that? "MSRP, where art thou?" PS: you can also apply that to the 60, the 80 and, to some extent, the 90 series as well.
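     If anyone wants to redo those "in today's money" conversions, it's just a cumulative-inflation multiplier; the factor below is the one implied by the 300€ -> ~380€ GTX670 example above, so plug in whatever index you prefer instead.

     # Simple inflation adjustment: price_then * cumulative_factor = price in today's money.
     def in_todays_money(price_then: float, cumulative_factor: float) -> float:
         return price_then * cumulative_factor

     factor_2012_to_now = 380.0 / 300.0          # ~1.27, implied by the GTX670 figures above

     for price in (300, 400):
         print(f"{price} EUR in 2012  ->  {in_todays_money(price, factor_2012_to_now):.0f} EUR today")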
  19. Some great points in the discussion, I can't disagree with any of it. I think what we can all agree on is that the GPU industry/market has changed since the pandemic. And, so far, it has definitely not changed for the better. We can also see, looking across nearly two decades, that the raw performance jump between Nvidia GPU generations has decreased (especially in the mid to high range models). The performance difference of the 5000 series over the 4000 series is expected to be as low as (if not lower than) the 2000 series over the 1000 series, and that was pretty mediocre. Perhaps we have hit the silicon limits. Perhaps this software-based AI rubbish is the (very) unfortunate solution for the foreseeable future.
     Also, you should all know by now that, for GPUs, the MSRP ends up being almost fictitious (it's the same story every release). As said before in this topic, it's pretty certain that real prices will be considerably higher than that (for sure in Europe, confirmed through contacts in the retail business), and comparatively higher than the previous generation if you take into account the expected raw performance benefit. Right now, Nvidia dominates and "it is what it is". AMD hasn't been able to keep up (and I'm not hopeful about the RX9070/XT myself). Intel is slowly getting there (B580/B570) but not yet. And, more than silicon limits or AI demands, it's this Nvidia "total dominance" that drives the silly prices. Which, unfortunately, many will still gladly sustain (and here I agree with both @The_Nephilim and @kksnowbear). Sometimes I'm dumbfounded how both PC gamers and HW enthusiasts can be so sheep-ish, and unable to react to any of it. Yes, it's a bit of a conundrum... the world economy is not that great, yet thousands upon thousands spend "second-hand car money" on a single overpriced GPU which is usually relevant for only three years, or four with luck.
     The price and worth of GPUs have often been discussed and, as with everything in hardware, are often in "the eye of the beholder". But, this time around, I suspect it may change. Personally, by this time last year, I was expecting to upgrade my RTX3090 to at least an RTX5080 and have now actually decided against it. To be fair, my RTX3090 still runs absolutely great, and plays everything I throw at it in 4K the way I like, with the sole exception of one single title in VR (and we all know which one that is...). I will hold on to it for as long as I can, and calmly wait and see. I'm not going to spend my hard-earned money this time around, solely as a matter of principle. Regardless, I'll still say to anyone using an older/slower GPU and looking to upgrade: the used and outlet market is always a valid solution (and it has never failed me). There will still be the occasional worthy deal on second-hand or refurbished units of certain models from previous GPU generation(s), which will still do almost as much and last almost as long as the newest ones, for far, far less. As in, "don't donate another stupid leather jacket to that guy"...
  20. No, no. It goes beyond spreading the workload across CPU cores; that on its own won't solve the problems. What matters is how it all works as a package for all the demanding content that is being added, right from the start, and how it manages all resources (CPU, GPU, RAM, storage and basically all the I/O load). DCS has many of its roots going back over 20 years, historically hogging resources like a pig since its release back then, and you know what they say about "lipstick on a pig"....
  21. Yep. You see, the fixes and the adoption of a more advanced game engine are inevitable for ED. It's no longer a "it'd be nice" as we thought many years ago; it has become a necessity. With the amount of much higher polycount maps and objects (aircraft and not only), a gazillion overkill 32-bit 4K/8K textures (on everything) now being added to the game, plus more complex scripts for avionics, systems, weapons, AI, weather and environment effects (and more complex missions too), etc., it can only be delayed for so long until the game becomes unusable as is. Because it has got to the point where you spend absurd money on the strongest hardware but get only a small fraction of its benefit, due to game code/engine constraints. If you've upgraded your system recently, try other games and, in comparison, you'll see a monster jump in performance...
  22. With the "AI fake frames" discussion so prevalent around the upcoming RTX5000 series, the RTX4090 will still be relevant and hold its value. People looking for an RTX4090 on the second-hand market (and why shouldn't you, it's a fantastic GPU) really need to be extra careful. This stuff is still happening, and more than ever, it seems...
  23. Oh don't you worry, the Jensen-leather-jacket guy knows very well that people also want a 20 or 24 GB variant model.... A matter of waiting for a supposed "5080Ti 24GB" within a year from now? Something like that couldn't be released now, or it would hurt sales of the 5090 and the 5080.