
Posted (edited)

My understanding is that judder is present in all VR headsets and games to some degree.  It's a problem that has yet to be fully solved by smart smoothing or motion reprojection.  What happens (correct me if I am mistaken) is that when the GPU can't make the vsync deadline (about 13.9ms of frame time at 72Hz), OpenXR sends an "older" frame to the headset.  Ideally, you have enough headroom to run 72fps at 72Hz, and you never see judder.  But if you don't have the headroom, is it better to judder at 72Hz or 120Hz?  At 72Hz you wait 13.9ms and maybe you get another old frame.  At 120Hz you wait only 8.3ms, and if you do get an older frame it's "less old", i.e. less judder, less positional difference between frames.  So it seems to me that although 72Hz/72fps is easier to hit, if you can't maintain that, then go 120Hz.  That seems logical to me.  Does the math add up?

I think this makes sense with higher FPS.  If you are only getting 30fps, you could wait two or three vsync cycles for a new frame and then get a very large positional difference between frames.  Judder fast at the highest Hz and fps you can manage for the smoothest VR experience short of fully syncing with the headset.
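To sanity-check the staleness arithmetic above, here's a quick Python sketch.  The model is my own simplification (not anything headset-specific): I assume the compositor simply repeats the previous frame for one extra vsync interval per missed deadline.

```python
# Sketch of the stale-frame math discussed above. Assumes the
# compositor re-shows the previous frame for one extra vsync
# interval each time the GPU misses the deadline.

def frame_interval_ms(refresh_hz: float) -> float:
    """Time between vsyncs, in milliseconds."""
    return 1000.0 / refresh_hz

def worst_case_staleness_ms(refresh_hz: float, missed_vsyncs: int = 1) -> float:
    """Extra age of the displayed frame after missing N vsync deadlines."""
    return frame_interval_ms(refresh_hz) * missed_vsyncs

for hz in (72, 90, 120):
    print(f"{hz} Hz: interval {frame_interval_ms(hz):.1f} ms, "
          f"one missed vsync -> frame is {worst_case_staleness_ms(hz):.1f} ms stale")
```

Running this shows the core of the argument: a repeated frame at 120Hz is only ~8.3ms stale versus ~13.9ms at 72Hz, so each individual judder event is smaller.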

Edited by Glide
Posted (edited)

In some circumstances, but mostly no.

To address this issue, most (if not all) headset manufacturers use a technique called motion reprojection, which synthesizes "fake" frames; it also goes by motion smoothing or frame generation. Although these terms used to have distinct meanings, they are now often used interchangeably, so it's simpler to treat them as one concept.

For the G2, which is an older headset, OpenXR reprojection works in stages. At 90Hz, if you can achieve more than 90 FPS, you experience no judder. When you drop to 89 FPS, motion reprojection kicks in, producing one fake frame for every real frame, effectively resulting in 45 real FPS plus 45 synthesized ("fake") FPS. It can also step down further to 30 FPS. This approach minimizes judder but can introduce artifacts. The severity of these artifacts depends on the headset manufacturer and the specific technique used—some may be more tolerable than others (G2 versus Pimax or Meta).

In practice, you are effectively at 90 FPS @ 90Hz (or FPS = Hz if you have a different refresh rate) unless your system drops below 45 FPS (i.e., half the refresh rate), in which case the experience will be less smooth and more like a fast slideshow.
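The staged behaviour described for the G2 can be sketched as picking the largest even divisor of the refresh rate that the GPU can still sustain, then synthesizing the rest.  This is an illustration of the idea, not WMR's actual algorithm; the divisor set is an assumption.

```python
# Illustrative sketch (not the real WMR implementation): staged motion
# reprojection locks to refresh_hz / d for the smallest divisor d whose
# target the GPU can sustain, and fills the remainder with fake frames.

def reprojection_target(refresh_hz: int, achievable_fps: float,
                        divisors=(1, 2, 3)) -> tuple[float, float]:
    """Return (real_fps, fake_fps) for the chosen reprojection stage."""
    for d in divisors:
        target = refresh_hz / d
        if achievable_fps >= target:
            return target, refresh_hz - target
    # Below the lowest stage: show what we have, synthesize the rest.
    lowest = refresh_hz / divisors[-1]
    return lowest, refresh_hz - lowest

print(reprojection_target(90, 95))   # (90.0, 0.0)  -> no reprojection
print(reprojection_target(90, 89))   # (45.0, 45.0) -> half-rate stage
print(reprojection_target(90, 40))   # (30.0, 60.0) -> third-rate stage
```

This matches the stages above: above 90 FPS no reprojection, at 89 FPS you fall to 45 real + 45 fake, and below 45 FPS you step down to 30 real.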

What you’re describing is essentially a simplified form of V-Sync, known as reprojection (without the motion part, though the term is often used for the overall concept). If your system can handle it, you’re generally better off running at half of the highest refresh rate you can manage. For example, at 120Hz, 60 FPS is preferable.

Given your example and figures, your math is correct. However, I would recommend using motion reprojection if available. Keeping the refresh rate at 72Hz means that if you can maintain 72 FPS or above most of the time, you’ll have true smoothness. In the moments when you can’t, reprojection will kick in at half rate, showing 36 real frames and synthesizing the other 36 to keep the display fed at 72Hz.
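The "less old" intuition from the original question can also be made concrete in angular terms: while a stale frame stays on screen, your head keeps moving, so the positional jump scales with the vsync interval.  A rough sketch — the head-turn speed is an assumed number purely for illustration:

```python
# Rough illustration: the angular error of a repeated frame grows with
# how long it stays on screen. The head speed below is an assumption.

HEAD_SPEED_DEG_PER_S = 100.0  # assumed moderate head turn

def judder_error_deg(refresh_hz: float, missed_vsyncs: int = 1) -> float:
    """Angular error accumulated while a stale frame remains displayed."""
    stale_ms = 1000.0 / refresh_hz * missed_vsyncs
    return HEAD_SPEED_DEG_PER_S * stale_ms / 1000.0

print(f"72 Hz, 1 missed vsync:  {judder_error_deg(72):.2f} deg")
print(f"120 Hz, 1 missed vsync: {judder_error_deg(120):.2f} deg")
print(f"72 Hz, 2 missed vsyncs: {judder_error_deg(72, 2):.2f} deg")
```

Under these assumptions a single missed vsync at 120Hz produces a noticeably smaller angular jump than at 72Hz, which is the whole case for juddering at the higher refresh rate when you can't hold native frame rate.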

Edited by nikoel
Posted

Yes, I understand all that quite well.  You have confirmed my thinking.  I have been kicking the tires with Smart Smoothing again lately.  In some cases it works quite well, but I still see "batches" of screen tears now and then, like it desyncs and needs to get back to square one.  I haven't found the sweet spot yet.  I had good success with MSAA and without Quadviews, tuning for around 40fps and then locking to 36 with smoothing.  But, for heavy missions and good graphics, I need quadviews.

What works well is just letting it go as fast as it can, then using the Max Frames slider at the beginning of each mission to just give yourself some headroom.  I can hit 72fps with my 3080Ti with "performance settings", but if I want to add SS, AO, etc. I'm not going to reach 72. 

Posted
24 minutes ago, Glide said:

Yes, I understand all that quite well.  You have confirmed my thinking.  I have been kicking the tires with Smart Smoothing again lately.  In some cases it works quite well, but I still see "batches" of screen tears now and then, like it desyncs and needs to get back to square one.  I haven't found the sweet spot yet.  I had good success with MSAA and without Quadviews, tuning for around 40fps and then locking to 36 with smoothing.  But, for heavy missions and good graphics, I need quadviews.

What works well is just letting it go as fast as it can, then using the Max Frames slider at the beginning of each mission to just give yourself some headroom.  I can hit 72fps with my 3080Ti with "performance settings", but if I want to add SS, AO, etc. I'm not going to reach 72. 

If it makes you feel better: after the last couple of updates I have been struggling too.

I have a 4090 with a highly tuned system, and even when I reduce the settings to potato quality I get a momentary frame spike every few seconds. That's with only the bare minimum of background services running (I even culled security settings), about 90% headroom, and GPU and CPU frametimes in the low single digits.
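Periodic spikes like that stand out clearly in a frametime capture (e.g. from CapFrameX or PresentMon) if you flag frames that take far longer than the typical frame.  A minimal sketch — the trace data below is invented for illustration:

```python
# Minimal sketch: flag frametime spikes in a captured trace.
# The sample trace is made-up data, not a real capture.
from statistics import median

def find_spikes(frametimes_ms, factor=2.0):
    """Indices of frames that took more than `factor` x the median frametime."""
    m = median(frametimes_ms)
    return [i for i, t in enumerate(frametimes_ms) if t > factor * m]

trace = [4.1, 4.0, 4.2, 4.1, 19.7, 4.0, 4.1, 4.2, 21.3, 4.0]
print(find_spikes(trace))  # -> [4, 8]
```

Checking the spacing between flagged indices against the capture timestamps would show whether the hitches really recur on a fixed interval, which usually points at a background task or driver rather than raw GPU load.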

Reverting to an older Nvidia driver version helped a little bit.

I've just come to the conclusion that there is something wrong with this build of DCS, that it was my turn to be the collateral damage, and I just need to wait for a fix.

Posted

No worries, I love testing this stuff.  I took another run at the Smart Smoothing settings.  It seems that 120hz works best with it.  Time to leave the FPS counter off for a week or so.
