Everything posted by some1

  1. Your CPU frametimes look really high even on low settings. I wonder if this is Varjo overhead, a recording-software issue, or just the 5800X3D doing its magic on my system. Do you have a full track for this benchmark? Because for me, the mission looks like this: [screenshot] Compared to: [screenshot]
  2. It doesn't help much if the reprojection itself is of lower quality, with more artifacts showing. And the reprojection preemptively drops to 30 fps in places where it still maintains 45 on SVR (a rough sketch of how those fractional locks work is below). Anyway, I don't want this thread to turn into another SVR vs. OXR debate. I simply checked both, and currently I can't tell a clear winner in terms of performance.
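To illustrate the 45-vs-30 behaviour: motion reprojection locks the game to integer fractions of the headset's refresh rate. A minimal sketch, assuming a 90 Hz headset and a runtime that simply picks the largest fraction whose frame budget the GPU can still meet (real runtimes add extra headroom on top, which is why OXR can drop "preemptively"):

```python
# Sketch of fractional-rate reprojection at 90 Hz: the possible locks
# are 90/1, 90/2=45, 90/3=30, 90/4=22.5 fps.
REFRESH_HZ = 90.0

def reprojection_lock(gpu_frametime_ms: float) -> float:
    """Return the highest fractional rate whose frame budget is met."""
    for divisor in (1, 2, 3, 4):
        budget_ms = 1000.0 / (REFRESH_HZ / divisor)
        if gpu_frametime_ms <= budget_ms:
            return REFRESH_HZ / divisor
    return REFRESH_HZ / 4  # worst case: quarter rate

print(reprojection_lock(13.0))  # fits the 22.2 ms half-rate budget -> 45
print(reprojection_lock(24.0))  # misses it, falls to third rate -> 30
```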
  3. I may not be the right person to ask; I never saw any real gains with OXR on my system.
  4. I thought the same when I got the 3080. Yeah yeah, allocated doesn't mean it's actually used, GPU drivers can adjust to less VRAM, 10 GB will be fine, etc. Then DCS got to a point where I couldn't even bring up any menus or overlays in VR on top of DCS, because everything would slow down to a crawl and stutter massively from VRAM shortage. It may not always be 17 gigs and up, and it may run fine on a 16 GB card, but it certainly requires more than 10 GB. To my understanding, OpenXR only shows VRAM usage by the game itself and does not take into account the VRAM required by your system and other apps working in the background (see the sketch below for one way to check the difference). I did some comparisons yesterday, but just as I didn't see any performance benefit on my system before, I don't see it now. FFR improves things a bit, but I can also enable FFR with SteamVR for similar results.
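If you want to verify the game-VRAM vs. total-VRAM distinction yourself, here is a minimal sketch using the pynvml bindings (NVIDIA cards only; assumes `pip install nvidia-ml-py`). It contrasts device-wide usage with what each individual process holds:

```python
# Contrast device-wide VRAM usage with per-process usage (NVIDIA only).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# System-wide figure: everything on the card, desktop and background
# apps included.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device used/total: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

# Per-process breakdown: an in-game overlay typically reports only the
# game's own share, which is the smaller number.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gib = (proc.usedGpuMemory or 0) / 2**30  # may be None on some drivers
    print(f"pid {proc.pid}: {used_gib:.1f} GiB")

pynvml.nvmlShutdown()
```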
  5. I think 16 GB is the bare minimum. It may even be quite OK for now, but not necessarily for the next two years.
  6. I had a 3080 before the 4090 and unfortunately, with only 10 GB of VRAM, the RTX 3080 really sucks with high-resolution headsets. It was sort of "okay" two years ago, but ED has increased VRAM usage since then, especially with the new modules, terrains, and AI units. I have a simple mission with the AH-64: overcast, IHADSS on, sitting on the tarmac in the Caucasus. Even at 70% SS, the mission was nearly unplayable on the 3080, with GPU frametimes around 23 ms. Now with the RTX 4090 the frametimes dropped to 9.5 ms at the same location and settings. CPU frametimes also dropped, from 11 ms to 8.5 ms, since the CPU no longer has to juggle textures in and out of memory. It turns out that mission requires more than 17 GB of VRAM at the very beginning, and it only gets worse as you keep flying. So in this particular scenario, at 70% SS, I went from the low 40s, not enough for reprojection to kick in, to a pretty stable 90 fps. Even at 150% SS it still hovers around 90 fps; the increased resolution simply does not seem to take much of a toll on the new card.
  7. If your GPU does not show close to 100% utilisation when gaming on a 2D monitor, and you are not at the vsync limit, then yes, you are most likely limited by the CPU (a toy restatement of this rule is sketched below). With the CPU it's different: basically no game will load it to 100%, so looking at CPU utilisation is mostly pointless. In MSFS, switching from the RTX 3080, I saw a 40% increase on ultra settings in one spot I tested, and a 100% increase in another scenario that is not CPU limited. Keep in mind I have a 5800X3D, while 10900K performance in this game is comparable to a 5600X/5900X.
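That rule of thumb, restated as a toy check (my own framing, not a real profiler; the 95% threshold and 1 fps vsync tolerance are arbitrary cutoffs):

```python
# Toy restatement of the bottleneck rule of thumb above.
def likely_cpu_bound(gpu_util_pct: float, fps: float, vsync_hz: float) -> bool:
    """GPU not saturated and not pinned at the vsync cap -> suspect the CPU."""
    at_vsync_cap = fps >= vsync_hz - 1
    return gpu_util_pct < 95 and not at_vsync_cap

print(likely_cpu_bound(gpu_util_pct=70, fps=57, vsync_hz=144))  # True
print(likely_cpu_bound(gpu_util_pct=99, fps=60, vsync_hz=144))  # False
```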
  8. I see no point in going higher, at least for SP; it only makes missions load longer. Maybe it was a useful setting back in the Lock On days, when your HDD topped out at about 60 MB/s.
  9. I don't know if it's the Varjo Aero or your CPU, but with a 5800X3D and a Reverb G2 at 150% (3868x3784) I see much better frametimes in the same spot, despite the higher resolution. This is the High preset with mirrors enabled, like in your video: [screenshot] compared to: [screenshot] And this is with the settings I use and mirrors off, much more practical for VR than the "HIGH" preset: [screenshot] At these settings the 4090 holds 90 FPS quite well in many scenarios, not everywhere of course.
  10. Your setup is fine; the sockets on the PSU side are non-standard. Your previous post was a bit confusing and suggested you were running the whole card from two PCIe 8-pin cables.
  11. DLSS3 frame interpolation does not support VR. DLSS2 looks good in marketing materials, but in reality it reduces instrument readability: good for typical games, not so great for flight sims. Besides, MSFS runs even worse in VR than DCS. It's simply more demanding on hardware, and there's not much headroom left for increased VR requirements.
  12. You're CPU limited. A CPU frametime of 17.5 ms means your theoretical CPU limit is 57 FPS (1000/17.5); the real limit is lower, as not everything in the pipeline is accounted for (quick math below). But a GPU frametime below 11 ms means the GPU is capable of running DCS at 90 FPS, if the rest of your PC can keep up (a faster CPU or a simpler mission). I swapped the 3080 for a 4090 yesterday. I didn't have time to do much testing, but a simple overcast mission in the AH-64, which was barely playable with an RTX 3080/5800X3D/Reverb G2 at 70% SS, now runs at 90 fps at the same settings. GPU frametime went down from 17.5 to 8.5 ms without IHADSS, and from 23 ms to 9.3 ms with IHADSS. Of course, in the case of the RTX 3080, a lot of that uplift comes from the extra VRAM on the new card. Still, your graphs show that compared to the 3090, the RTX 4090 gives 50-80% more headroom for GPU processing. In MP or a heavy mission with a lot of units, I don't think any CPU is capable of running DCS at 90 FPS in VR, but at least with the 4090 you can crank up the resolution and eye candy.
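The arithmetic, spelled out with the numbers from this post. The point is the min(): whichever stage is slower sets the effective cap:

```python
# Theoretical FPS caps from the frametimes quoted above.
def fps_cap(frametime_ms: float) -> float:
    """FPS upper bound implied by one pipeline stage's frametime."""
    return 1000.0 / frametime_ms

cpu_ms, gpu_ms = 17.5, 11.0
print(f"CPU cap: {fps_cap(cpu_ms):.0f} FPS")  # ~57
print(f"GPU cap: {fps_cap(gpu_ms):.0f} FPS")  # ~91
# Actual FPS <= min of the two, and a bit lower still in practice,
# since not everything in the pipeline is accounted for.
print(f"Effective cap: {min(fps_cap(cpu_ms), fps_cap(gpu_ms)):.0f} FPS")
```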
  13. You are quoting a three-year-old post. Three years ago these shortcuts were not implemented.
  14. AM5 is rumoured to debut September 15th: https://www.notebookcheck.net/AMD-Zen-4-Ryzen-7000-to-launch-on-September-15-at-US-799-for-Ryzen-9-7950X-Ryzen-7-7800X3D-and-Ryzen-9-7950X3D-purported-3D-V-Cache-versions.637813.0.html There are also rumours about more AM4 X3D CPUs: https://www.techpowerup.com/296392/amd-readies-more-ryzen-5000x3d-processors?cp=5#comments Intel also has new CPUs coming this fall. So if you're aiming for a high-end machine and can wait a month or two, it's better to hold off. If you need something now or don't want to spend much, the current CPUs are not a bad choice.
  15. It's because it's conveniently located next to the pilot's hand resting on the throttle. The dogfight switch in the F-14 is in a similar location.
  16. BTW, there are rumours of more X3D processors coming to AM4, and AM5 is right around the corner. At this time of year, it's best to just sit and wait for Intel and AMD to show their new product lineups.
  17. All your typical background programs during a gaming session are not enough to make even one modern CPU core sweat. In games, the 5800X3D very rarely lags behind faster-clocked CPUs, and usually the larger cache is enough to stay on top, even in popular titles. With productivity tasks it varies from program to program; quite often in real-life applications the difference is insignificant. For example, going from a 5900X to a 5800X3D, compilation times in Visual Studio are slightly longer for me (10-20%), but that's something I can accept, as I get better performance in simulators.
  18. In the advanced waypoint options, you can add the option "Reaction to threat: no reaction".
  19. What we have in game is a virtual IPD adjustment. It does indeed change the sense of scale, hence the confusion (a rough sketch of the relationship is below): https://xinreality.com/wiki/Interpupillary_distance#Virtual_IPD
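A rough way to think about it, assuming the common rule of thumb that perceived world scale varies inversely with the virtual camera separation relative to your physical IPD (the numbers below are illustrative, not from the linked article):

```python
# Rule-of-thumb sketch: widening the virtual IPD (camera separation)
# beyond your physical IPD makes the world read as proportionally smaller.
def perceived_scale(physical_ipd_mm: float, virtual_ipd_mm: float) -> float:
    """Approximate perceived world scale relative to real life."""
    return physical_ipd_mm / virtual_ipd_mm

print(perceived_scale(64, 64))   # 1.0   -> true-to-life scale
print(perceived_scale(64, 128))  # 0.5   -> cockpit feels half size
print(perceived_scale(64, 48))   # ~1.33 -> everything feels larger
```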
  20. Added Mirage F1 CE.
  21. This can be done on any motherboard using software created by overclockers (PBO2 Tuner or Project Hydra). It's not official AMD software, so use it at your own risk. But I can confirm it works for me, lowering the core temps and raising the boost clock under heavy loads at the same time: https://github.com/PrimeO7/How-to-undervolt-AMD-RYZEN-5800X3D-Guide-with-PBO2-Tuner/blob/main/README.md A hypothetical way to script the re-apply at startup is sketched below.
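The offsets don't persist across reboots, so the linked guide re-applies them at logon via Task Scheduler. As a hypothetical convenience, the same launch can be scripted; the path is a placeholder and the per-core argument format is my assumption based on that guide, so verify against it before using:

```python
# Hypothetical launcher: re-apply per-core curve-optimizer offsets at
# startup. Path and argument format are assumptions from the linked
# guide; verify them yourself, and undervolt at your own risk.
import subprocess

offsets = ["-30"] * 8  # one negative offset per core on an 8-core 5800X3D
subprocess.run([r"C:\Tools\PBO2 tuner\PBO2 tuner.exe", *offsets], check=True)
```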
  22. The new generation of CPUs will be released in autumn. But if you cannot wait, then, if MSFS is any indicator, the 5800X3D beats the 12900K by a wide margin: https://forums.flightsimulator.com/t/amd-5800x3d-performance/510937 Keep in mind that other, more typical games do not always show such improvements.
  23. The next generation of AMD CPUs won't have 3D cache, at least not initially.