LucShep

Members
  • Posts: 1686
  • Joined
  • Last visited
  • Days Won: 2

8 Followers

About LucShep

  • Birthday 06/17/1975

Personal Information

  • Flight Simulators
    - DCS World
    - Falcon BMS
    - IL-2 Great Battles
    - Wings Over The Reich
    - Strike Fighters 2 (with mods)
    - IL-2 1946 (VP Modpack & JetWars)
  • Location
    LX - PT
  • Interests
    Gaming/simming and modding, PC hardware, motorcycles
  • Website
    https://www.digitalcombatsimulator.com/en/files/filter/user-is-Luke%20Marqs/apply/

Recent Profile Visitors

10146 profile views
  1. Make sure to take pictures and post them here. Seeing the actual product(s) should speed up the process.
  2. Really? hmmmm ok..... RTX6090 ! unless....
  3. No one should tell you how to spend your own money. That said, I honestly can't see any point in paying the absolutely ridiculous prices of any RTX 5090, especially if you already own a fully functional RTX 4090... In that case, upgrading from a 4090 to a 5090 for gaming/simming "leisure use", the performance advantage versus purchase cost is really, really bad. So bad that the only reasonable conclusion seems to be to wait for the 60 series (6090, 6080), which will be on a whole new architecture (the 50 series really wasn't), based on TSMC's 3nm process (the 50 series still uses the 5nm process like the 40 series did) and, supposedly, also faster GDDR7X memory, which should (speculation) provide the jump in performance that the entire 50 series failed to deliver. And that's ignoring the melting power connector and cable design issues, already worrisome on the 4090, which have been turned up to eleven on the 5090...
  4. Source, TechPowerUp: According to one of the most reliable AMD leakers, Kepler_L2, AMD's upcoming UDNA (or RDNA 5) GPU generation will reintroduce higher-end GPU configurations with up to 96 Compute Units (CUs) in the top-end Navi 5X SKU, paired with a 384-bit memory bus. In the middle of the stack, AMD plans a GPU with 64 CUs and a 256-bit memory bus, along with several variations of CUs and memory capacities around that. For the entry-level models, AMD could deliver a 32 CU configuration paired with a 128-bit memory bus. So far, memory capacities are unknown and may be subject to change as AMD finalizes its GPU lineup. We still don't know what type of memory AMD will ultimately use, but an early assumption could be that GDDR7 is on the table. After the RDNA 4 generation, which left AMD without a top-end contender, fighting for mid-range market share, UDNA / RDNA 5 will be a welcome addition. We are looking forward to seeing what the UDNA design is capable of and what microarchitectural changes AMD has made. Mass production of these GPUs is expected in Q2 2026, so availability is anticipated in the second half of 2026. A clearer picture of the exact memory capacity and type will emerge as we approach launch.
  5. Very cool. A nicely illustrated tour you've got there, with some very iconic aircraft. And I hope you continue at it (we want moooar!). Many thanks for sharing!
  6. Initial tests and comparisons suggest that Lossless Scaling still does the same job better than Nvidia's Smooth Motion. Granted, it's a payware app ($7 on Steam), but it works with any GPU, any driver, and any game... plus, it's lightweight - it doesn't suffer from the extra weight of Nvidia's bloatware app.
  7. In my brief experience with an RX 7900XTX, it runs DCS pretty darn well at 4K resolution on a 2D screen (monitor or TV). But it was not all that great for DCS VR (go figure - my older RTX 3090 performed better, IIRC). The 7900XT that you have is essentially the same design as the 7900XTX, just a little less powerful (~15% difference). If VR is your aim and you're planning for a new GPU, then I'd strongly suggest an Nvidia GPU, either an RTX 5070Ti or, if budget allows, an RTX 5080. Nvidia still performs much better than AMD in VR. For DCS VR, I'd say a "decent minimum" for the latest versions of DCS is an RTX 4070Ti Super 16GB and, in better times, I'd say to get one from the second-hand market... the problem is, even there all Nvidia GPUs are overpriced! If you can wait, the upcoming RTX 50 series "SUPER" refresh with considerably higher VRAM (18GB and 24GB) should be announced some months from now. Something like an RTX 5070Ti Super should be a considerable upgrade for VR then, and even more so the RTX 5080 Super (at higher prices, of course...).
  8. DCS VR is extremely demanding (increasingly so, with continuous updates bringing ever more detailed content), even on the most performant systems. What graphics card are you using? It looks like you're severely GPU limited, and the new faster CPU+RAM combo is not making a noticeable difference because it's being held back by the GPU. Even if just as a test, try reducing the settings that are GPU demanding (the ones that drive GPU usage up - avoid having it completely exhausted at or near 100%). Once GPU usage is decreased, you should see some difference for sure.
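The rule of thumb in that post can be sketched as a toy helper - the function name and the 95% threshold are my own choices for illustration, not anything from DCS or Nvidia tooling:

```python
def likely_bottleneck(gpu_usage_pct: float, threshold: float = 95.0) -> str:
    """Crude heuristic: a GPU pinned at or near 100% usage means the GPU is
    the limiter, so a faster CPU/RAM combo won't show its gains until
    GPU-heavy settings are reduced."""
    return "GPU-limited" if gpu_usage_pct >= threshold else "CPU/other-limited"

print(likely_bottleneck(99))   # GPU-limited
print(likely_bottleneck(70))   # CPU/other-limited
```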
  9. YAAY he's back!! and GFY!
  10. https://forum.dcs.world/forum/57-pc-hardware-and-related-software/
  11. F-100D Dev Log: FSExpo 2025 - An inside look at the 2025 FSExpo event, testing DCS VR with the F-100D Super Sabre by Grinnelli Designs. More videos and info: Grinnelli Designs website: https://grinnellidesigns.com/ Grinnelli Designs Youtube channel: https://www.youtube.com/@grinnellidesigns
  12. You definitely have the system for 4K resolution - do it! If you end up getting a high refresh 4K monitor (or TV - there are 120Hz and 144Hz VRR variants), the one thing I'd recommend is to lock your framerate regardless. Locking the framerate to a max of 100 FPS (like you already do) is the sweet spot, but a 90 FPS or even 80 FPS lock may be a better idea for demanding scenarios, to maintain frame pacing consistency, as it'll decrease both CPU and GPU % usage. I see that you're reluctant to use DLSS, but using it is actually a very good idea regardless of potent hardware, because the newest DLSS version, aka "Transformer", is much better (and it can be further adjusted). Start by getting the recently released DLSS 310.3.0 DLL - drop the .DLL file in the same folder as the game executable, to update the DLSS upscaler in use there. This new version, along with the improved image quality/clarity that the recent Transformer preset "K" brought, now also has lower VRAM consumption. Then, in the game, set DLSS at a desired setting (the "Upscaling" setting). On a 3840x2160 screen, the "Quality" setting (≈66% per-axis render scale) means you're basically running the game at 2560x1440 with AI upscaling. You may find that (re)adjusting the "Sharpening" setting is also needed, as the image may look a little too soft (personal taste, but from 0.3 up to 0.7 should be good). For reference, 4K (3840x2160) pushes 2.25x the total pixel count of 1440P (2560x1440), with the respective impact on GPU load. I'd suggest trying DSR at 1.78x and also 2.25x (on a 1440P screen, 2.25x matches 4K's pixel count exactly). It should give you a (very) rough estimation of the increased hit in rasterization performance (i.e., with no DLSS upscaling - which you can combine with it if desired). This is done through the Nvidia control panel.
For example (screenshot omitted - note that the resolutions shown in it were from a 4K screen, so those numbers are higher than what you'll see on your 1440P screen): after applying it, you'll see that the game's screen resolution options include new selectable resolutions (higher than what your monitor natively supports), corresponding to the DSR settings you've chosen.
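The napkin math behind those recommendations (DLSS render resolution, 4K vs 1440P pixel count, FPS-cap frame-time budgets) can be checked with a small sketch - the helper names are my own, just for illustration:

```python
def dlss_render_resolution(width: int, height: int, scale: float) -> tuple:
    """Per-axis internal render resolution before AI upscaling
    (DLSS "Quality" uses roughly a 2/3 per-axis scale)."""
    return round(width * scale), round(height * scale)

def pixel_ratio(w1: int, h1: int, w2: int, h2: int) -> float:
    """How many times more pixels the first resolution pushes vs the second."""
    return (w1 * h1) / (w2 * h2)

def frame_time_ms(fps_cap: int) -> float:
    """Frame-time budget at a given FPS lock."""
    return 1000 / fps_cap

# DLSS "Quality" at 4K renders internally at 1440P:
print(dlss_render_resolution(3840, 2160, 2 / 3))   # (2560, 1440)

# Native 4K pushes 2.25x the pixels of native 1440P (hence DSR 2.25x):
print(pixel_ratio(3840, 2160, 2560, 1440))         # 2.25

# FPS caps and their frame-time budgets:
for cap in (100, 90, 80):
    print(cap, round(frame_time_ms(cap), 1))       # 10.0 / 11.1 / 12.5 ms
```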
  13. What I find interesting is: why would you force a lower resolution when you can use DLSS upscaling in the DCS game options and your GPU supports it? In the DCS options, first set the native 3840x2160 (i.e., 4K) resolution, and then set DLSS at a desired setting (the "Upscaling" setting). - If set at "Quality" (≈66% per-axis render scale), you're basically running the game at 2560x1440 (i.e., 1440P), upscaled through the AI algorithm. - If set at "Performance" (50% per-axis render scale), you're basically running the game at 1920x1080 (i.e., 1080P), upscaled through the AI algorithm ("Balanced" sits in between, at roughly 58%). You'll then find that adjusting the "Sharpening" setting (right below it) will be required, as the image may look a bit too soft (personal taste - from 0.3 up to 0.7 should be good). It will look incomparably better than manually forcing a lower resolution, while keeping the native resolution and all the big performance benefits. - - - On a different note, about the Nvidia driver selection: do whatever suits you (it's your PC), but from personal experience with plenty of different troubled systems, I'd strongly advise against using the latest drivers (any post 566.36) on any RTX 30 series GPU (like yours), though there'll be plenty of "rubbish, it's perfectly fine, ignore that" comments. I'd actually urge you to install the Nvidia 537.58 WHQL drivers, and even better if it's a bloatware-cleaned driver version. These are considered the best drivers for the RTX 30 (and older) series, because of stability, latency and frame pacing benefits that no later version has been able to match (though driver 566.03 is not too bad). If interested, you can get a 537.58 WHQL "clean version" from here: https://mega.nz/folder/dQRX3AQI#9RmtXT0cTw45RsWxTQgYiw/file/JJ5kQAAC Uncompress (unzip) the package, then go to the folder with the extracted files and run setup.exe to start the driver installation.
And, while optional, it's a good idea to first uninstall the current drivers with DDU (https://www.tomshardware.com/how-to/uninstall-nvidia-amd-intel-gpu-drivers). This will ensure your system is clean of any display driver leftovers (which can cause issues and conflicts) before (re)installing any Nvidia drivers.
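The DLSS DLL swap mentioned earlier (dropping the new .DLL in the same folder as the game executable) could be scripted roughly like this - a sketch only; `swap_dlss_dll` is my own helper name, not a real tool, and you'd point it at your actual install and download paths:

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> Path:
    """Back up the game's bundled nvngx_dlss.dll (if present), then copy
    the newer DLL in its place, next to the game executable."""
    target = game_dir / "nvngx_dlss.dll"
    if target.exists():
        shutil.copy2(target, target.with_suffix(".bak"))  # keep a backup
    shutil.copy2(new_dll, target)
    return target

# Example call (made-up paths):
# swap_dlss_dll(Path(r"C:\Games\DCS World\bin"),
#               Path(r"C:\Downloads\nvngx_dlss.dll"))
```

Keeping the `.bak` copy means you can roll back instantly if the new DLL misbehaves.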
  14. Heh... on an RTX 30 series GPU? Sure thing. Enjoy your "placebo fix" with a pinch of performance loss and instability....