mbucchia

Everything posted by mbucchia

  1. Disable Lens Effects, as indicated in the instructions: https://github.com/mbucchia/Quad-Views-Foveated/wiki#dcs-world I don't know how many times we're going to have to repeat that one before people start reading the instructions on the website...
  2. You can use SteamVR via OpenXR too. That's two things that aren't mutually exclusive.
  3. Look up "vr barrel distortion" on Google. Every device does this: you render at a resolution higher than the physical display (typically 1.4x) to account for the pixel distortion created by the optics.
  4. There is no motion reprojection setting for Quest in OpenXR Toolkit. So you're somehow mistaken.
  5. Yes ahah, the horrible hack worked. It looks like the Oculus runtime isn't properly resetting its frame state machine upon beginning a new session. I had to add some unnecessary calls (that would fail on non-Oculus platforms) to force it back into a good state. I will report the issue to the Oculus devs. I'll have that new version out later this week. Thank you @Sile for all the testing!
  6. @nikoel after some back and forth with @Sile and some experiments, I can now confirm that the crash on removing/putting back the headset looks like an Oculus bug and nothing I can fix. It is exacerbated by the Turbo Mode, but nonetheless an Oculus bug. EDIT: I have one last idea for a disgusting hack to try to fool the Oculus runtime, but after that I have no other avenues.
  7. You need to check if you are CPU bound. If you're CPU bound or close to CPU bound, quad views DFR isn't going to help and will actually do the opposite.
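As a rough illustration of that check (the frame times and the 1 ms margin below are made-up numbers, not from any real capture), comparing the CPU and GPU frame times tells you which side dominates:

```python
# Rough sketch: whether quad-views DFR can help depends on which frame
# time dominates. All numbers here are hypothetical examples.

def likely_bottleneck(cpu_ms: float, gpu_ms: float, margin_ms: float = 1.0) -> str:
    """Return which side of the pipeline dominates the frame time."""
    if cpu_ms >= gpu_ms - margin_ms:
        # CPU-bound (or close to it): shrinking the pixel count won't raise
        # FPS, and the extra quad-view submissions add CPU work instead.
        return "CPU-bound"
    return "GPU-bound"

print(likely_bottleneck(cpu_ms=14.0, gpu_ms=9.0))   # CPU dominates
print(likely_bottleneck(cpu_ms=6.0, gpu_ms=18.0))   # GPU dominates
```

If the CPU side dominates, rendering fewer pixels via quad views only shifts more of the relative load onto the CPU, which is why it can end up doing the opposite of helping.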
  8. Only the application can know certain information. The CPU frame times, when the application is doing multithreading, may overlap two frames and are therefore not measurable "externally" (i.e. by OpenXR Toolkit). I wrote a very detailed explanation for MSFS, but it is also applicable to DCS since the MT version was released: https://forums.flightsimulator.com/t/openxr-toolkit-upscaling-world-scale-hand-tracking-release-thread/493924/2903?u=mbucchia
  9. `smoothen_focus_view_edges=0.4`?? That is ridiculous: you are making your focus region opaque only on a tiny portion at the center, while still rendering all of its pixels!! `focus_section=0.5/0.5` means 1/4th of the screen, and on top of that you are only leaving 0.2 of it opaque (0.4 out of 0.5), which means only 4% of your screen is at uncompromised full resolution!! You don't need such a big transition between the two regions; the transition is only meant to make sure you're not abruptly going from full resolution to half resolution, so the smaller, the better! Use `debug_focus_view=1` to render the focus region only; it might help you visualize what these settings are doing.
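For what it's worth, the arithmetic above can be checked in a couple of lines of Python (the 1/4 and 4% figures are the post's own numbers):

```python
# Back-of-the-envelope check of the numbers in the post above.

focus_w, focus_h = 0.5, 0.5        # focus_section=0.5/0.5, per axis
rendered_area = focus_w * focus_h  # area rendered at full pixel density
print(rendered_area)               # 0.25 -> 1/4 of the screen

opaque_w, opaque_h = 0.2, 0.2      # what a 0.4 smoothing band leaves opaque
opaque_area = opaque_w * opaque_h
print(round(opaque_area, 2))       # 0.04 -> only 4% at uncompromised resolution
```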
  10. This is common to every VR headset: you render higher than the panel resolution to compensate for the lens barrel distortion. Pixels look magnified by the lenses, so you want to render at a higher pixel density to counter that and keep the graphics crisp. Your G2 had the same thing, with 100% being something like 3000x3000, i.e. 1.4x the panel's 2160x2160. The exact factor depends on the optics and should be advised by the vendor. Running the Crystal at 1.4x is likely too challenging, though; it's up to you to try if you can.
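A minimal sketch of that calculation, using the G2 numbers from the post (the function name is just for illustration; the 1.4 factor is the typical per-axis value mentioned above, not a universal constant):

```python
# Compute the "100%" render target from a panel resolution and the
# per-axis supersampling factor used to counter lens barrel distortion.

def render_target(panel_w: int, panel_h: int, factor: float = 1.4):
    """Return the (width, height) of the recommended render resolution."""
    return round(panel_w * factor), round(panel_h * factor)

# HP Reverb G2: 2160x2160 per-eye panel -> ~3000x3000 at "100%"
print(render_target(2160, 2160))  # (3024, 3024)
```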
  11. I got the trace from someone else already so it's ok
  12. > I can produce 60fps uncapped with MR off.
      > GPU frame times around 18-21ms

      That's contradictory: 1/0.021 ≈ 47 FPS, not 60 FPS. Your 2 FPS of headroom is insufficient to maintain a solid 45 FPS with MR on, hence you get dumped to 30 FPS (there is no granularity in between: you can get half, i.e. 45, or one third, i.e. 30). So MR isn't costing you 30 FPS, but it is costing you more than the 2 FPS of headroom you have, which in turn leads to a dramatic fall to the one-third mode. It's unclear whether you meant that this measurement is with MR on or off, but regardless, this is enough for me to think that getting 30 FPS is the expected outcome given your system's performance. You can try _forcing_ the algorithm to use the 45 FPS mode via OpenXR Toolkit settings as suggested above (set Motion Reprojection to On in OpenXR Toolkit, then use the Lock Motion Reprojection option right below it). But assuming that you often dip to even 44 FPS, the "forced" mode is likely going to make things worse.
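The arithmetic above, sketched in Python assuming a 90 Hz headset (the refresh rate is my assumption; the 21 ms frame time is the figure from the post):

```python
# Motion reprojection can only lock to integer fractions of the refresh
# rate: half or one third. There is nothing in between.

refresh_hz = 90                      # assumed headset refresh rate
half, third = refresh_hz // 2, refresh_hz // 3
print(half, third)                   # 45 30 -- the only two MR modes

# A 21 ms GPU frame time caps the raw frame rate well under 60 FPS:
gpu_frame_ms = 21
print(1000 // gpu_frame_ms)          # 47 -- barely above the 45 FPS target
```

With only ~2 FPS of margin over the half-rate target, any dip below 45 FPS forces the algorithm down to the one-third mode, which is the 30 FPS drop described above.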
  13. Can you do the following? Preferably with Quad-Views-Foveated alone and no other program like OpenXR Toolkit.
      1) Open an ADMIN command prompt.
      2) Run `wpr -start "%ProgramFiles%\OpenXR-Quad-Views-Foveated\Tracing.wprp" -filemode` (the quotes matter, since %ProgramFiles% expands to a path containing a space).
      3) Run DCS and reproduce the crash.
      4) Back in the command prompt, run `wpr -stop trace.etl`.
      5) ZIP up the `trace.etl` file that was created (the ZIP step is important!). It is created in the current folder, i.e. the one you can see in the command prompt's prefix. Send it to me. Thanks.
  14. There is no motion reprojection in OpenXR Toolkit; it's only a shortcut to control the motion reprojection of WMR. OpenXR Toolkit doesn't do any motion reprojection. SteamVR motion smoothing is only for SteamVR headsets AFAICT (Valve Index, HTC Vive), and you can't enable it with Pimax. Pimax Smart Smoothing has always been a little problematic. Are you using SteamVR or PimaxXR as your OpenXR runtime?
  15. @nikoel the only crash I got taking the headset off/on is with "double Turbo" aka having Turbo on in both Quad-Views-Foveated and OpenXR Toolkit. That's a really bad idea in the first place (and one of the reasons I ask people to reset all settings in OpenXR Toolkit). Use Turbo either in one or the other. I'm not sure this is your issue, but I couldn't get a crash unless I intentionally made that mistake.
  16. Can you try reinstalling the component called "OpenXR for Windows Mixed Reality" (not "OpenXR TOOLS for Windows Mixed Reality")? The service mentioned in that error lives in that package. Uninstall it from Add or remove programs in Windows, then reinstall it from https://apps.microsoft.com/store/detail/openxr-for-windows-mixed-reality/9P9596DJJ19R
  17. Can you tell me exact steps to reproduce? I tried yesterday with a Quest 2 (cause that's all I have that can use the Oculus runtime), and I have no issues when removing the headset.
  18. There is nothing in OpenXR Toolkit that precludes motion reprojection (well except Turbo mode). Motion Reprojection (aka motion smoothing) is a setting in Pimax Client, completely independent of OpenXR Toolkit.
  19. Use the Render Quality slider in Pimax Client and set it to 0.75; that will provide just enough supersampling. You always want the rendering resolution to be a bit higher than the panel resolution.
  20. Yes, unless you have set up an override specifically for DCS in Pimax Client.
  21. Thanks for trying. Yes, I should've mentioned that the purple thing is used for the foveated rendering; for an "aim" feature we would instead use its center. So you guessed right. Drawing the crosshair isn't hard; what is difficult is translating the input into something usable by the game. This is truly something the game NEEDS to implement itself. Otherwise we have to resort to faking input...
      - We can't really use the mouse. Let me explain why. All games have different ways of capturing mouse input, because the mouse is an OS concept, not an OpenXR one, and many games do not follow best practices. For example, games will typically use relative mouse movement after confining the cursor to a blocked region. "Relative" movement makes it impossible for external software to calibrate an absolute position (and by nature your eye gaze is a point projected in absolute coordinates onto a plane in front of you). This is why you can't see/use the mouse when the game window has focus (and you need Alt-Tab or the Win key to "release" the cursor to Windows). Apps like DCS are a good example: if you look closely at DCS after Alt-Tabbing, you'll notice that the cursor on the desktop and the cursor inside DCS aren't aligned at all. This is due to the difference between absolute and relative coordinates. It would be impossible for a "driver" to correlate these coordinate systems.
      - One thing that OpenXR eye tracking and the game do have in common is the motion controller. There is a direct and easy way to correlate your eye gaze position with your motion controller, but it isn't practical to use. We could make your eye gaze act as a virtual motion controller, but that comes with many issues, such as your motion controller not being "stuck to your face" but instead held at arm's length. So we'd have to somehow fake a certain depth for the "eye-to-controller". The problem is we don't know what the scene in the game looks like, so we'd arbitrarily pick a depth, which sometimes will be too far from the cockpit switches and sometimes will be behind them. That won't be usable.
      - I hear there are those "pointCTRL" devices, and @actually_fred has a software called HTCC to control them from OpenXR. In theory it might be possible to plumb the eye gaze input into it, but I suspect it would have the same issues described above related to motion controller usage.
      So it's unclear where to go from there. I don't know if you have any ideas based on the currently available input methods in the game and how we could fake the input using the eye tracker, now that I've explained some of the challenges with it.
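To illustrate the depth problem with the virtual-controller approach, here is a toy sketch (all names, poses, and depths below are hypothetical, not from any real API): projecting the gaze ray into the scene requires picking a depth, and any fixed choice will land in front of or behind the actual cockpit geometry:

```python
# Illustrative only: turn an eye-gaze ray into a 3D "pointer" position by
# guessing a depth. The depth is the unknown -- we don't know the scene,
# so any fixed value is sometimes short of the switches, sometimes past them.

def gaze_to_point(origin, direction, depth):
    """Place a point along the (normalized) gaze ray at an assumed depth."""
    return tuple(o + d * depth for o, d in zip(origin, direction))

gaze_origin = (0.0, 1.6, 0.0)   # roughly head height, hypothetical
gaze_dir = (0.0, 0.0, -1.0)     # looking straight ahead

print(gaze_to_point(gaze_origin, gaze_dir, depth=0.6))  # maybe near the panel
print(gaze_to_point(gaze_origin, gaze_dir, depth=2.0))  # maybe behind the cockpit
```

Both calls are geometrically valid; only the game, which knows the scene depth at that pixel, could pick the right one. That is why this really needs to be implemented by the game itself.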
  22. Very weird indeed. As you can see in the log file, you are rendering 63% fewer pixels... so it's unclear why the game would be working harder. I've been adding all sorts of performance instrumentation in a new version (not published yet); once I release it you'll have to try again. I don't think anybody else has reported such an issue so far; the only known performance issue is with OBSMirror, and that tanks the CPU frame times, not the GPU.
  23. Can you try disabling all other software you have, like XRNS? Even if they are not running.
  24. Please give us the full log from Quad-Views-Foveated