

Posted
On 4/14/2025 at 5:52 AM, onib89 said:

Guys, I have built a beast and I suffer from these stutters... AMD Ryzen 9950X3D, ROG ASTRAL 5090 OC, 64GB RAM 6000 CL32 (2x32), PF1 ARGB 1200W Platinum, ROG Ryujin III 360 Extreme, Samsung 9100 Pro 2TB, 990 EVO Plus 4TB, Samsung G8 OLED 34". On my monitor at 175Hz it's a stable 175fps and still stuttering; on the Pimax Crystal Light at 90Hz it's a stable 90fps and still stuttering. So many people with different systems have the same issues, I mean... what the heck, man... is it Nvidia drivers?? I tried all settings from low to highest and it's the same thing. I tried all refresh rates in VR, everything I saw on YouTube from guides to fixes, and nothing... it's just frustrating... I'm not into commercial flights, but I'm seriously thinking of buying MSFS 2020 or 24 to test if it stutters there...

In the Nvidia Control Panel, try setting Vsync to OFF for DCS. This worked for me. My rig is very close to what you are running.

Posted
1 hour ago, Rebel28 said:

In the Nvidia Control Panel, try setting Vsync to OFF for DCS. This worked for me. My rig is very close to what you are running.

No difference for me, I'm afraid.

I'm currently sticking to Half Frame Rate in the Pimax client; that seems to fix stutters best for me ATM.

Windows11
RTX 4090
Processor    13th Gen Intel(R) Core(TM) i9-13900K   3.00 GHz
Installed RAM    64.0 GB (63.7 GB usable)
System type    64-bit operating system, x64-based processor

Posted (edited)
On 5/9/2025 at 2:08 AM, MarkyMarkUK said:

No difference for me, I'm afraid.

I'm currently sticking to Half Frame Rate in the Pimax client; that seems to fix stutters best for me ATM.

Same here - I spent weeks messing about with this and XRFrameTools data - Half Frame Rate in the Pimax client was the ONLY option that minimised stutters.

Edited by nephilimborn

i7-13700KF; RTX-4090; 64GB RAM; Quest3 & PimaxCL; Virpil CM3 + VKB Gunfighter Mk.IV MCE-Ultimate + VKB T-Rudder Mk.Vl; Windows10 (F*ckOff W11)

  • 2 weeks later...
Posted
On 4/14/2025 at 6:15 AM, Blackhawk163 said:

When I upgraded to the 5080 I was still getting the micro stutters out the side. Then I decided to actually tax the GPU and not the CPU. First, DLSS/DLAA had to go, and then my Pimax custom settings that I had used on the 4080 Super (0.85 resolution) had to be increased to native, 120Hz (no capping at half).
 

I don't mind QV and FFR. All modules save for the F-4 Phantom (which cuts down to 32 fps for some reason) run smooth with no stutters at 60 capped in DCS and OpenXR Toolkit. I usually test this with the BFM mission over Tarin Kowt (just shoot down the MiG right away) and then fly circles over the airfield as the density of objects goes from low to high and then low again.

Are you saying that DLSS/DLAA taxes the CPU, and that it's better to have these off for those who are CPU bound?

I don't mind QV either, but my problem is that my CPU is being hammered on a single thread, and I get stutters at around 68-71 FPS (just under the native 72Hz of the Pimax). This is solely CPU related. I found that using QV made it worse - it seems QV increases the demand on the CPU in the main DCS render thread. 😢

I tried 120 locked at half for 60fps, which definitely helps with the CPU stutters (as it's under that threshold), but then the other artifacts come in (blur, and consistent minor stutters out the side).

Is this something that Vulkan is supposed to help with when it comes - reducing the demand on the CPU with additional threading? I'm running a 13900K and the crazy part is, I can run at 5.5GHz or 5.8GHz and it seems to make no difference - as though something else is jumping in and interrupting the core that the DCS main thread is using, creating the stutters.

I've done all the usual tricks: ReBAR enabled, NVIDIA power management to max performance, Park Control off, Process Lasso (P-cores only), disabled Tacview, Windows power plan to Ultimate, no gaming mouse / polling rates turned down, disabled Microsoft Device Association Root Enumerator, Xbox Game Bar off, Game Mode off, no_device_hotplug = true, disable_write_track = true, ran "logman stop HolographicShell -ets", VSync off, threaded optimization off, pre-rendered frames up to 3, etc. Nothing seems to get the CPU to operate at a level that will give me smooth, uninterrupted 72fps in VR. I can turn graphics down to Minecraft level and I'm still CPU bound.

Is there a tool that can inspect anything that's interrupting on the same thread(s) that DCS uses, just in case there is a Windows function or other process creating issues?
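For reference, the two "= true" flags in that list are DCS autoexec options; they live in a plain-Lua file at Saved Games\DCS\Config\autoexec.cfg (create it if it doesn't exist). A minimal sketch - the comments describe the commonly cited effects, so treat them as assumptions rather than documented behaviour:

```lua
-- Saved Games\DCS\Config\autoexec.cfg  (plain Lua; create the file if missing)
no_device_hotplug   = true  -- reportedly stops mid-mission USB/HID hotplug polling
disable_write_track = true  -- reportedly skips track recording, saving main-thread work
```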

  • 3 weeks later...
Posted
On 5/22/2025 at 9:25 PM, Dangerzone said:

Are you saying that DLSS/DLAA taxes the CPU, and that it's better to have these off for those who are CPU bound?

I don't mind QV either, but my problem is that my CPU is being hammered on a single thread, and I get stutters at around 68-71 FPS (just under the native 72Hz of the Pimax). This is solely CPU related. I found that using QV made it worse - it seems QV increases the demand on the CPU in the main DCS render thread. 😢

I tried 120 locked at half for 60fps, which definitely helps with the CPU stutters (as it's under that threshold), but then the other artifacts come in (blur, and consistent minor stutters out the side).

Is this something that Vulkan is supposed to help with when it comes - reducing the demand on the CPU with additional threading? I'm running a 13900K and the crazy part is, I can run at 5.5GHz or 5.8GHz and it seems to make no difference - as though something else is jumping in and interrupting the core that the DCS main thread is using, creating the stutters.

I've done all the usual tricks: ReBAR enabled, NVIDIA power management to max performance, Park Control off, Process Lasso (P-cores only), disabled Tacview, Windows power plan to Ultimate, no gaming mouse / polling rates turned down, disabled Microsoft Device Association Root Enumerator, Xbox Game Bar off, Game Mode off, no_device_hotplug = true, disable_write_track = true, ran "logman stop HolographicShell -ets", VSync off, threaded optimization off, pre-rendered frames up to 3, etc. Nothing seems to get the CPU to operate at a level that will give me smooth, uninterrupted 72fps in VR. I can turn graphics down to Minecraft level and I'm still CPU bound.

Is there a tool that can inspect anything that's interrupting on the same thread(s) that DCS uses, just in case there is a Windows function or other process creating issues?

Sorry for the very late reply. I'm afraid I can't answer the technical aspects of your question, as I'm not well versed in it all; I just do a lot of trial-and-error experimentation. I never really liked DLSS, even in its newer state - it helps, but I don't like the softer look it gives the game. Also look at reducing some settings in DCS. I know that when I had the 4080/5080 (I just upgraded to the 5090) I set clouds to Normal and textures (not ground textures) to Medium, as well as LOD to 0.4, along with other tweaks. I played around with all this until I could look out either side of the aircraft and see the scenery moving smoothly. Once there, I started adding settings back in until it started to stutter.

 

Sorry that I can't get into the technical weeds.


My first assigned aircraft is in my profile name

Ryzen 9800x3d/64gb DDR5 amd expo/RTX 5090/4tb m2/ Win11 pro/Pimax crystal light 

Winwing Orion F16ex (Shaker kit)/Skywalker pedals/Orion 2 F15EX II Throttle/3 MFD units/Virpil CM3 Mongoose Throttle/Trackir 5 

F-16/A10II A/C /F-18/F-15E/F-15C/F-14/F5E II/F-4/Ah64/UH60/P51-D/Super Carrier/Syria/Sinai/Iraq/Persian Gulf/Afghanistan/Nevada/Normandy 2.0

Posted
2 hours ago, Blackhawk163 said:

Sorry for the very late reply. I'm afraid I can't answer the technical aspects of your question, as I'm not well versed in it all; I just do a lot of trial-and-error experimentation. I never really liked DLSS, even in its newer state - it helps, but I don't like the softer look it gives the game. Also look at reducing some settings in DCS. I know that when I had the 4080/5080 (I just upgraded to the 5090) I set clouds to Normal and textures (not ground textures) to Medium, as well as LOD to 0.4, along with other tweaks. I played around with all this until I could look out either side of the aircraft and see the scenery moving smoothly. Once there, I started adding settings back in until it started to stutter.

 

Sorry that I can't get into the technical weeds.

Could I please ask you to send me a track of your data? I have a similar-spec PC to yours and am trying to gather data on VR - ideally several tracks with different settings, with a subjective review of the stutter experience for each. I just need the XRFrameTools log; I can convert this and analyse it.

Cheers
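For anyone wanting to run the same kind of analysis, here is a minimal sketch of the "convert and analyse" step: read a frametime log and boil it down to stutter metrics. The CSV layout and column name are assumptions - check the actual XRFrameTools export header before using it.

```python
import csv
from statistics import mean, quantiles

def summarise(path, column="appFrameTimeMs", target_ms=1000 / 72):
    """Summarise a frametime log into rough stutter metrics.

    The column name and units are hypothetical -- match them to the real
    XRFrameTools export. target_ms defaults to a 72Hz frame budget.
    """
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f)]
    p99 = quantiles(times, n=100)[98]               # 99th-percentile frametime
    spikes = sum(t > 2 * target_ms for t in times)  # frames that blew the budget
    print(f"avg {mean(times):.2f} ms, p99 {p99:.2f} ms, "
          f"{spikes}/{len(times)} frames over {2 * target_ms:.1f} ms")

# summarise("xr_frametools_log.csv")
```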

 

 

 

PC specs: 9800x3d - rtx5080 FE - 64GB RAM 6000MHz - 2Tb NVME - (for posts before March 2025: 5800x3d - rtx 4070) - VR headsets Quest Pro (Jan 2024-present; Pico 4 March 2023 - March 2024; Rift s June 2020- present). Maps Afghanistan – Channel – Cold War Germany - Kola - Normandy 2 – Persian Gulf - Sinai - Syria - South Atlantic. Modules BF-109 - FW-190 A8 - F4U - F4E - F5 - F14 - F16 - F86 - I16 - Mig 15 - Mig 21 - Mosquito - P47 - P51 - Spitfire.


Posted
On 5/22/2025 at 9:25 PM, Dangerzone said:

I've done all the usual tricks: ReBAR enabled, NVIDIA power management to max performance, Park Control off, Process Lasso (P-cores only), disabled Tacview, Windows power plan to Ultimate, no gaming mouse / polling rates turned down, disabled Microsoft Device Association Root Enumerator, Xbox Game Bar off, Game Mode off, no_device_hotplug = true, disable_write_track = true, ran "logman stop HolographicShell -ets", VSync off, threaded optimization off, pre-rendered frames up to 3, etc. Nothing seems to get the CPU to operate at a level that will give me smooth, uninterrupted 72fps in VR. I can turn graphics down to Minecraft level and I'm still CPU bound.

Have you tried setting your GPU's MSI mode to "on" and prioritizing it?

Try the little utility attached.

1. Open "MSI_util_v3.exe" as administrator

2. Find your GPU and turn on MSI mode if supported

3. Set priority to High

4. Apply and restart

It might also help to set everything else's priority to "undefined" and only keep the GPU on High.
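If you would rather verify what the utility changed than trust its UI, the settings live in the registry under the GPU's PCI device key. A read-only sketch, assuming the usual value names these MSI utilities edit; the device instance path is a placeholder you would look up in Device Manager (Details > Device instance path):

```python
import winreg

# Placeholder path -- substitute your GPU's actual device instance path.
DEVICE = (r"SYSTEM\CurrentControlSet\Enum\PCI\VEN_10DE&DEV_XXXX\..."
          r"\Device Parameters\Interrupt Management")

def read_dword(subkey, name):
    """Read one DWORD from the device's Interrupt Management key, if present."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DEVICE + "\\" + subkey) as key:
            return winreg.QueryValueEx(key, name)[0]
    except FileNotFoundError:
        return None

# 1 = MSI mode enabled for this device
print("MSISupported:", read_dword("MessageSignaledInterruptProperties", "MSISupported"))
# Interrupt priority: 0 = undefined, 1 = low, 2 = normal, 3 = high
print("DevicePriority:", read_dword("Affinity Policy", "DevicePriority"))
```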

MSI_util_v3.zip


------------------------------------------------------------------------------------------------------------------------------------------------------------

9800X3D, RTX 4090, 96GB DDR 5, MSI Tomahawk 870E, Crucial 2TB x 2, TM WARTHOG COMBO + PENDULAR RUDDER PEDALS, THE AMAZING PIMAX 8K X, Sony 5.1 Spks+SubW | DCS, A-10C_II, AH-64D, F-14/15E/16/18, F-86F, AV-8B, M-2000C, SA342, Huey, Spitfire, FC3.

Posted
7 hours ago, WipeUout said:

Have you tried setting your GPU's MSI mode to "on" and prioritizing it?

Try the little utility attached.

1. Open "MSI_util_v3.exe" as administrator

2. Find your GPU and turn on MSI mode if supported

3. Set priority to High

4. Apply and restart

It might also help to set everything else's priority to "undefined" and only keep the GPU on High.

MSI_util_v3.zip

Thanks WipeUout. I checked MSI - my GPU was selected for MSI mode, but the interrupt priority was undefined. I've changed that to High now and will be keen to see if this helps!

Cheers

DZ

  • 3 weeks later...
Posted

Setting Pimax render quality to High and then adjusting the pixel count to a similar average value as before (when on Medium) made the game vastly smoother. Very strange.



Website | Digital Coalition Air Force | Discord

CPU: AMD R9950X  \ Mobo: MSI MPG X670E Gaming Carbon WiFi \ RAM: Corsair Vengeance 96GB 6000MT/s \ GPU: RTX 5090 \ Various SSDs

Posted

 

Quote

adjusting pixel count to a similar average value as before

Hi @Panny, can you explain what exactly you mean by the above, please? Thanks.


Windows11
RTX 4090
Processor    13th Gen Intel(R) Core(TM) i9-13900K   3.00 GHz
Installed RAM    64.0 GB (63.7 GB usable)
System type    64-bit operating system, x64-based processor

Posted
8 minutes ago, MarkyMarkUK said:

 

Hi @Panny, can you explain what exactly you mean by the above, please? Thanks.

OK, so when opening Pimax Play > device settings > games, you have the option to set render quality. Basically this is how many pixels the GPU has to render. With the OG Crystal that's 2880x2880 for each eye. So with render quality at High, the headset is asking for every pixel on each panel, meaning 16,128,000 need to be rendered. However, by using Tallymouse's Quadviews Companion tool we can adjust that. For the foveated area I have 170% (let's say that covers 1/3 of what has to be rendered by the GPU), so that section alone is now asking for 8,225,280. However, I have set the peripheral resolution to 15%, which covers the remaining 2/3 of my view, so I need to render 1,596,672 there, making a total of 9,821,952 pixels that have to be rendered for my headset.


Before, when I had render quality set to Medium, the headset asked for fewer pixels than the panels actually have - 75%, so by default it was asking for 12,096,000. I used to have foveated resolution at 200% and peripheral at 30% to make up for this. Foveated asked for 7,983,360, while peripheral came to 2,431,296, for a grand total of 10,414,656.

So ironically, with my new settings my pixel count is slightly lower, but because the base resolution the headset is asking for is the same as the panel's, it feels clearer.
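The arithmetic above is easier to follow written out. A sketch mirroring the structure of the estimate - note it applies the foveated/peripheral percentages linearly to pixel counts, exactly as the post does, and Peedee's correction further down changes the base numbers anyway, so treat every figure as illustrative:

```python
def total_pixels(width, height, eyes=2):
    """Pixels the compositor asks the GPU to render across both eyes."""
    return width * height * eyes

def quadviews_pixels(base, foveated_frac, foveated_scale, peripheral_scale):
    """Quadviews-style budget: a high-res inset plus a low-res periphery."""
    foveated = base * foveated_frac * foveated_scale            # e.g. 1/3 of view at 170%
    peripheral = base * (1 - foveated_frac) * peripheral_scale  # the rest at 15%
    return foveated + peripheral

base_high = total_pixels(2880, 2880)   # "High": panel-native, per the post
print(f"High + Quadviews:   {quadviews_pixels(base_high, 1/3, 1.70, 0.15):,.0f}")
base_medium = base_high * 0.75         # "Medium": 75% of the base
print(f"Medium + Quadviews: {quadviews_pixels(base_medium, 1/3, 2.00, 0.30):,.0f}")
```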

 

Why my stutters largely disappeared going from Medium to High, I have no idea. The GPU workload according to MSI Afterburner is largely the same.

[Screenshots: Pimax Play render quality and Quadviews Companion settings]



Website | Digital Coalition Air Force | Discord

CPU: AMD R9950X  \ Mobo: MSI MPG X670E Gaming Carbon WiFi \ RAM: Corsair Vengeance 96GB 6000MT/s \ GPU: RTX 5090 \ Various SSDs

Posted
On 4/14/2025 at 12:52 PM, onib89 said:

Guys, I have built a beast and I suffer from these stutters... AMD Ryzen 9950X3D, ROG ASTRAL 5090 OC, 64GB RAM 6000 CL32 (2x32), PF1 ARGB 1200W Platinum, ROG Ryujin III 360 Extreme, Samsung 9100 Pro 2TB, 990 EVO Plus 4TB, Samsung G8 OLED 34". On my monitor at 175Hz it's a stable 175fps and still stuttering; on the Pimax Crystal Light at 90Hz it's a stable 90fps and still stuttering. So many people with different systems have the same issues, I mean... what the heck, man... is it Nvidia drivers?? I tried all settings from low to highest and it's the same thing. I tried all refresh rates in VR, everything I saw on YouTube from guides to fixes, and nothing... it's just frustrating... I'm not into commercial flights, but I'm seriously thinking of buying MSFS 2020 or 24 to test if it stutters there...

Good day - maybe not quite on topic, but you have similar hardware to mine. I bought a 5090, and when playing audio through the Crystal Light headphones there are clicks; with the 4090 there was no such thing. I tried reinstalling drivers and swapping cables, and it doesn't help. Have you perhaps had to deal with this problem? Thanks in advance.

Posted
10 hours ago, Panny said:

OK, so when opening Pimax Play > device settings > games, you have the option to set render quality. Basically this is how many pixels the GPU has to render. With the OG Crystal that's 2880x2880 for each eye. So with render quality at High, the headset is asking for every pixel on each panel, meaning 16,128,000 need to be rendered. However, by using Tallymouse's Quadviews Companion tool we can adjust that. For the foveated area I have 170% (let's say that covers 1/3 of what has to be rendered by the GPU), so that section alone is now asking for 8,225,280. However, I have set the peripheral resolution to 15%, which covers the remaining 2/3 of my view, so I need to render 1,596,672 there, making a total of 9,821,952 pixels that have to be rendered for my headset.

Before, when I had render quality set to Medium, the headset asked for fewer pixels than the panels actually have - 75%, so by default it was asking for 12,096,000. I used to have foveated resolution at 200% and peripheral at 30% to make up for this. Foveated asked for 7,983,360, while peripheral came to 2,431,296, for a grand total of 10,414,656.

So ironically, with my new settings my pixel count is slightly lower, but because the base resolution the headset is asking for is the same as the panel's, it feels clearer.

Why my stutters largely disappeared going from Medium to High, I have no idea. The GPU workload according to MSI Afterburner is largely the same.

Hmm... does this mean there's less overhead / CPU work within the Pimax app, because there's a 1:1 relationship between the image resolution it's being sent and what it has to display? Is it possible that part of the issue is double CPU work - QVFR is using the CPU to upscale/downscale, but then Pimax is also using the CPU to upscale/downscale - and that by taking Pimax's scaling out of the equation, things run much more efficiently on the CPU, even though the resolution setting in Pimax is higher?

Posted
11 hours ago, Panny said:

OK, so when opening Pimax Play > device settings > games, you have the option to set render quality. Basically this is how many pixels the GPU has to render. With the OG Crystal that's 2880x2880 for each eye. So with render quality at High, the headset is asking for every pixel on each panel, meaning 16,128,000 need to be rendered. However, by using Tallymouse's Quadviews Companion tool we can adjust that. For the foveated area I have 170% (let's say that covers 1/3 of what has to be rendered by the GPU), so that section alone is now asking for 8,225,280. However, I have set the peripheral resolution to 15%, which covers the remaining 2/3 of my view, so I need to render 1,596,672 there, making a total of 9,821,952 pixels that have to be rendered for my headset.

Before, when I had render quality set to Medium, the headset asked for fewer pixels than the panels actually have - 75%, so by default it was asking for 12,096,000. I used to have foveated resolution at 200% and peripheral at 30% to make up for this. Foveated asked for 7,983,360, while peripheral came to 2,431,296, for a grand total of 10,414,656.

So ironically, with my new settings my pixel count is slightly lower, but because the base resolution the headset is asking for is the same as the panel's, it feels clearer.

Why my stutters largely disappeared going from Medium to High, I have no idea. The GPU workload according to MSI Afterburner is largely the same.

First off - thanks for looking into this.

But you start off your calculations with the wrong numbers. The Pimax Crystal OG rendering resolution at 100% (“High” or 1.0 in Pimax Play settings) is 4312x5104 - per eye!
75% (“Balanced” or 0.75) is 3234x3828 - per eye.

You are correct that the physical resolution of the panels is 2880x2880 per eye. But the rendering resolution is much higher - like I wrote above. The reason is to adjust for barrel distortion. 🙂
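Worked through with the figures in this post, the gap is large - "High" renders roughly 2.65x the pixels the panel can physically show, and that headroom is what the compositor consumes when correcting for the lens:

```python
render = 4312 * 5104   # per-eye render target at 100% ("High" / 1.0)
panel = 2880 * 2880    # per-eye physical panel resolution
print(f"render {render:,} px, panel {panel:,} px, ratio {render / panel:.2f}x")
# -> render 22,008,448 px, panel 8,294,400 px, ratio 2.65x
```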

 

PC: I9 13900K, Asus ROG Strix GeForce RTX 4090 OC, 32 GB RAM@6000Mhz.

Thrustmaster Warthog Hotas. Virpil Base for Joystick. Thrustmaster TPR Pendular Rudderpedals. Realsimulator FSSB-RL MKII ULTRA base + Realsimulator F16SGRH V2 grip

VR: Pimax Crystal, 8KX, HP Reverb G2, Pico 4, Quest 2. Buttkicker Gamer Pro. Next Level Motion Platform V3.

Posted
3 hours ago, Peedee said:

First off - thanks for looking in to this.

But you start off your calculations with the wrong numbers. The Pimax Crystal OG rendering resolution at 100% (“High” or 1.0 in Pimax Play settings) is 4312x5104 - per eye!
75% (“Balanced” or 0.75) is 3234x3828 - per eye.

You are correct that the physical resolution of the panels is 2880x2880 per eye. But the rendering resolution is much higher - like I wrote above. The reason is to adjust for barrel distortion. 🙂

 

Yes, you are right - I completely forgot about that. That would suggest that, with my adjusted values, having render quality set to High actually gives a higher total render resolution.

 

4 hours ago, Dangerzone said:

Hmm... does this mean there's less overhead / CPU work within the Pimax app, because there's a 1:1 relationship between the image resolution it's being sent and what it has to display? Is it possible that part of the issue is double CPU work - QVFR is using the CPU to upscale/downscale, but then Pimax is also using the CPU to upscale/downscale - and that by taking Pimax's scaling out of the equation, things run much more efficiently on the CPU, even though the resolution setting in Pimax is higher?

I think, in the context of @Peedee's correction, what you suggest could well be the case. For what it's worth, I have generally tried not to get too engrossed in VR optimisation - I maybe double-check it once a year - so I try not to get too bogged down in FPS overlays and frametimes. I will check to see what they are presently as and when I have the time. It's a shame that the DCS FPS overlay isn't the most useful. Either way, the tinkering has made for a much more enjoyable experience, in which the visuals are much improved and significantly smoother. Sounds dumb on my end to have missed this, but still an interesting note.

My performance in dogfighting has definitely improved vastly. Because of the near-constant juddering it was much harder to track the aspect of the hostile and react accordingly. I noticed quite abruptly that against the AI it suddenly became trivial again.



Website | Digital Coalition Air Force | Discord

CPU: AMD R9950X  \ Mobo: MSI MPG X670E Gaming Carbon WiFi \ RAM: Corsair Vengeance 96GB 6000MT/s \ GPU: RTX 5090 \ Various SSDs

Posted
19 hours ago, Peedee said:

But you start off your calculations with the wrong numbers. The Pimax Crystal OG rendering resolution at 100% (“High” or 1.0 in Pimax Play settings) is 4312x5104 - per eye!
75% (“Balanced” or 0.75) is 3234x3828 - per eye.

You are correct that the physical resolution of the panels is 2880x2880 per eye. But the rendering resolution is much higher - like I wrote above. The reason is to adjust for barrel distortion. 🙂

Not sure about that. I believe barrel distortion correction is done through shaders, not rendering. In other words, the computer generates the image at the native resolution, then shaders are applied to enlarge the image (or resolution) into the shape of a barrel to compensate for the pincushion effect of the lenses. The barrel-shaped image has to be bigger in order to completely fill the panel in your HMD, thus exceeding native resolution, but the panel in your HMD will never display more pixels than native, because that is physically impossible. Transforming an image with shaders is much faster than generating an actual render, and GPUs are very good at this, with thousands of compute units to apply the shaders. At least that is my understanding.
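For a feel of what that shader step does, here is a toy polynomial distortion remap - the coefficients are invented for illustration and real compositors use per-lens calibration data, but it shows why the source image must be rendered larger than the panel:

```python
def barrel_remap(u, v, k1=0.10, k2=0.05):
    """Map a panel coordinate (centered, in [-1, 1]) to a source-image coordinate.

    k1/k2 are made-up coefficients; real HMDs use measured lens profiles.
    """
    r2 = u * u + v * v                    # squared distance from the lens centre
    scale = 1.0 + k1 * r2 + k2 * r2 * r2  # grows toward the edges of the view
    return u * scale, v * scale

# The panel's corner samples well outside the unit square, so the rendered
# source image needs extra pixels there -- hence rendering above panel resolution.
print(barrel_remap(1.0, 1.0))   # -> (1.4..., 1.4...) with these toy coefficients
```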

------------------------------------------------------------------------------------------------------------------------------------------------------------

9800X3D, RTX 4090, 96GB DDR 5, MSI Tomahawk 870E, Crucial 2TB x 2, TM WARTHOG COMBO + PENDULAR RUDDER PEDALS, THE AMAZING PIMAX 8K X, Sony 5.1 Spks+SubW | DCS, A-10C_II, AH-64D, F-14/15E/16/18, F-86F, AV-8B, M-2000C, SA342, Huey, Spitfire, FC3.

Posted
14 hours ago, Thorns said:

Panny, would you please show the rest of the Pimax Play screenshots? Thx!

Hi Thorns - what other information are you looking for specifically?


Website | Digital Coalition Air Force | Discord

CPU: AMD R9950X  \ Mobo: MSI MPG X670E Gaming Carbon WiFi \ RAM: Corsair Vengeance 96GB 6000MT/s \ GPU: RTX 5090 \ Various SSDs
