Everything posted by actually_fred

  1. As concrete examples:
     - Meta XR Simulator reports a FOV of 50 degrees upwards of center (well, 0.87266463 radians...) and 49 degrees down, which is close enough to symmetrical for this to be subtle
     - My Quest Pro reports a much more substantial difference of 42 degrees up and 53 degrees down, so a crosshair that is put in the center of the render target without paying attention to the FOV angleUp/angleDown will be quite substantially off

     These numbers are the angles provided to DCS by the runtime in the `xrLocateViews()` call - they are not a subjective measurement of my perceived field of view.
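As a rough illustration (mine, not from the post), the size of the error can be computed from the reported angles: an OpenXR view's vertical extent spans tan(angleUp) above the view direction and tan(angleDown) below it, so the forward direction is not at the vertical middle of the image unless the angles match. A minimal sketch, using the Quest Pro numbers above:

```python
import math

def forward_offset_fraction(angle_up_deg: float, angle_down_deg: float) -> float:
    """Fraction of the render target height by which the true forward
    direction sits above the vertical middle of the image.

    The image spans tan(angleUp) above the view direction and
    tan(angleDown) below it, so the forward direction is at
    tan(angleUp) / (tan(angleUp) + tan(angleDown)) from the top.
    """
    t_up = math.tan(math.radians(angle_up_deg))
    t_down = math.tan(math.radians(angle_down_deg))
    return 0.5 - t_up / (t_up + t_down)

# Quest Pro values from the post: a centered crosshair is nearly 10% of
# the image height away from where the user is actually looking.
print(round(forward_offset_fraction(42, 53), 3))
# Near-symmetric simulator values: the error is under 1% of the height.
print(round(forward_offset_fraction(50, 49), 3))
```

With symmetric angles the offset is exactly zero, which is why the problem is invisible on the simulator but obvious on the headset.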
  2. This probably varies depending on whether the headset has a symmetrical field of view: my guess is that DCS draws the crosshairs in the vertical middle of the panel, which isn't the center of the FOV on some recent headsets, but often is in simulators. This also ties into the "mirror window is too high" reports, but in that case it's not a bug - it's a mirror, and it's showing what's on the panels; the panels are mounted off-center intentionally.
  3. Hey - this is all a bit off-topic, and would be better suited to the main VR forum rather than the bug report/suggestions forum. In particular, there's another recent thread on thumb/finger mice.
  4. There's really just one thing needed here:
  5. This is halfway between a bug report and a feature request: DCS's built-in support for hand tracking is highly immersive, but not practical, because of how control interactions are triggered by a moment of 'touch'. For example, due to the switch positions and limitations of tracking:
     - pushing the throttle fully forward will often incorrectly turn off fuel pumps or engines in the A-10C, eject stores in the F-16, or activate the fire suppression in the F-18
     - due to tracking limitations and just closeness/stability, interacting with the UFC in the A-10C can incorrectly trigger a fire suppression handle
     - in many aircraft, interacting with the lower front and side panels can lead to accidentally ejecting

     HTCC still exists for DCS entirely because of this issue.

     Suggested fix:
     - Add an option to require a button to be held for an interaction to happen
     - Add an option for a pointing gesture triggering a 'laser', like a controller, which *also* requires a button to be held if the above option is also on

     "A button can be held" should support:
     - mouse buttons (e.g. PointCTRL with stock firmware, generic 'ring mice' from Amazon/AliExpress) - this would need to ignore the mouse cursor position and just use the hand tracking position
     - DirectInput game devices (e.g. PointCTRL with HTCC firmware, SlugMouse)
     - nice to have: optionally some kind of gesture, e.g. pinching thumb and index fingers. XR_FB_hand_tracking_aim makes this easy, but is not universally supported: it is currently supported on Quest Link in dev mode only, Quest-series headsets via Virtual Desktop, and Ultraleap-based devices (including the hand tracking module on the original Pimax Crystal). It can be implemented more generally by comparing joint positions.

     This should be optional because, like hand tracking overall, it is not perfect; people who have buttons bound are likely to want to disable this to further reduce the chances of incorrect interactions.
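"Comparing joint positions" for a pinch can be sketched as below. This is an illustration of the general idea, not HTCC or DCS code; the 2cm/3.5cm thresholds and the hysteresis are my assumptions, not from the post:

```python
import math

PINCH_ON_M = 0.02    # start the pinch when tips are within 2cm (assumed)
PINCH_OFF_M = 0.035  # only release past 3.5cm, for hysteresis (assumed)

class PinchDetector:
    """Detects a thumb-tip/index-tip pinch from tracked joint positions,
    as a fallback when XR_FB_hand_tracking_aim is unavailable."""

    def __init__(self) -> None:
        self.pinching = False

    def update(self, thumb_tip: tuple, index_tip: tuple) -> bool:
        """Feed the two fingertip positions (XYZ, meters) each frame."""
        d = math.dist(thumb_tip, index_tip)
        if self.pinching:
            # Already pinching: require a clearly larger gap to release,
            # so tracking jitter doesn't toggle the state every frame.
            self.pinching = d < PINCH_OFF_M
        else:
            self.pinching = d < PINCH_ON_M
        return self.pinching

det = PinchDetector()
print(det.update((0, 0, 0), (0.10, 0, 0)))  # fingers apart: False
print(det.update((0, 0, 0), (0.01, 0, 0)))  # pinched: True
print(det.update((0, 0, 0), (0.03, 0, 0)))  # still held (hysteresis): True
```

The hysteresis gap matters for exactly the reason the post gives: hand tracking is noisy, and a single threshold would cause the same accidental on/off flicker that makes touch-triggered switches dangerous.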
  6. Download link: <https://github.com/OpenKneeboard/Fresh-Start/releases/latest>

     For most people, uninstalling OpenKneeboard from add/remove programs is sufficient; there's a few cases where it isn't:
     - the uninstaller intentionally does not delete your settings; if you want to start with fresh settings, this tool can help you (or you can just delete the settings files yourself)
     - Microsoft limitations prevent OpenKneeboard installers/uninstallers from banning or repairing some edge cases, such as installing certain older versions after newer versions without uninstalling the newer versions first
     - if multiple versions were installed simultaneously (e.g. via the above edge case), Microsoft limitations sometimes prevented installs/upgrades from cleanly removing all previous versions instead of just one previous version; this tool cleans them all up

     In short, use this if:
     - you tried or used to use OpenKneeboard and want to kill it with fire
     - you want a complete fresh start of OpenKneeboard, deleting your previous settings
     - you're having problems and have been using OpenKneeboard for a long time; e.g. some of the issues it can repair only occur if you've had a 2021 version of OpenKneeboard installed, then installed a 2025 version, then installed a 2021 or 2022 version again
  7. HTCC is an alternative hand tracking implementation for DCS World. It:
     - reduces immersion: hand models won't line up and should be disabled (see instructions)
     - improves practicality: drastically reduces the chances of accidentally triggering switches, which frequently leads to accidentally ejecting, turning off fuel pumps, or pulling fire handles

     Download here: https://github.com/fredemmott/HTCC/releases/latest
     Then read: https://htcc.fredemmott.com/getting-started.html

     Highlights:
     - improved compatibility and reliability, including SteamVR
     - added basic report on OpenXR usability
     - added option to fix basic Ultraleap issues (including the hand tracking module for the OG Pimax Crystal)
     - added option to enable/disable the suspend/hibernate gesture
     - completely rewritten settings app

     Please note:
     - While SteamVR itself supports hand tracking, most SteamVR headsets and drivers do not
     - HTCC is limited by DCS's 'absolute mouse' support (e.g. like a touchscreen or tablet). HTCC's in-game experience will never substantially improve. Requests for an improved hand tracking experience should be sent to the game developers, and should be asking for improved hand tracking - not for improved HTCC support. HTCC is a workaround; everything it does could - and should - be done at least as easily and at least as well in the game itself.
     - I do not provide any support for HTCC on these forums. See https://htcc.fredemmott.com/getting-help.html
  8. Make sure you have the Windows Installer service running. If not, set it to start automatically and reboot. If you don't have the Windows Installer service or it won't start, you need to reinstall Windows - it's broken, usually due to an over-eager attempt to "de-bloat" it.
  9. Yep, and the current nvidia driver has a known issue in that it often crashes in D3D12 games with HAGS off.
  10. HAGS is *usually* best left on nowadays, but it's still a "can vary - try it and see" thing. For people holding back on Win10 or old versions of Win11, it's more likely to be worth turning off.
  11. Two contrasting approaches: RTSS is 'delay stuff', Turbo Mode is 'remove delays'. Both can increase latency, but 'remove delays' is usually the better of the two.

      The reason 'remove delays' can be bad for latency: say you want 1 frame per second, it takes 100ms to render a frame, and your next frame is due at exactly 10:00:00.000.
      - The theoretical ideal time to start that frame is 09:59:59.900 - in practice, most runtimes will aim to start a little earlier, say .895.
      - If the game says 'ready to start the next frame' at .801, the runtime may introduce a .094 delay.
      - Turbo mode gets rid of that delay, so your frame will be ready at .901, but it will still be displayed at the next panel refresh at 10:00:00.000 - so it will be 99ms out of date. Another frame will be ready at 10:00:00.101, but that will be discarded, with the goal of making another for 10:00:01.

      With RTSS or other non-VR tools, when they have any effect on VR, you'll still have whatever delay the runtime wants (which will change), but also whatever delay RTSS inserts - and RTSS has no idea when the next panel refresh is. Usually this is either 'no effect' or 'adds an extra 1 frame of delay'.
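The arithmetic in that example can be sketched as a toy model. This is only an illustration of the timeline in the post, not how any real runtime is implemented; the 105ms margin is derived from the .895 target start time in the example:

```python
RENDER_MS = 100          # time to render one frame (from the example)
REFRESH_MS = 1000        # toy 1-frame-per-second panel (from the example)
RUNTIME_MARGIN_MS = 105  # runtime aims to start rendering ~.895 (assumed)

def staleness_at_display_ms(ready_ms_before_refresh: int, turbo: bool) -> int:
    """How out-of-date the frame is when the panel lights up.

    `ready_ms_before_refresh` is when the game is ready to start the
    next frame, measured before the 10:00:00.000 refresh (199 means
    09:59:59.801 in the post's example). Time 0 is that refresh.
    """
    if turbo:
        start = -ready_ms_before_refresh  # start immediately, no delay
    else:
        start = -RUNTIME_MARGIN_MS        # runtime delays to its target
    finish = start + RENDER_MS
    # Displayed at the first panel refresh at or after the frame is done.
    display = 0 if finish <= 0 else REFRESH_MS
    return display - finish

# Game ready at .801: turbo finishes at .901, shown at .000 -> 99ms stale.
print(staleness_at_display_ms(199, turbo=True))
# With the runtime's delay, it finishes at .995 -> only 5ms stale.
print(staleness_at_display_ms(199, turbo=False))
```

The interesting third case is when the runtime's prediction is wrong and the frame misses the refresh entirely: then the non-turbo path is a full refresh late, which is the failure mode turbo mode works around.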
  12. You seem to have skipped or misread this section twice - note 'non-placebo effect': RTSS is not capable of modifying framerate in a way that respects VR panel timing; all it can do is delay the game loop based on non-VR data. If you are getting a smoother result with RTSS, it is most likely because it's delaying the game enough to entirely miss a panel refresh, effectively adding a 1 frame delay with out-of-date headtracking. If this is the case, the root cause would be either DCS having unpredictable CPU load or over-optimistic wait time prediction in the Pimax software. For these kinds of issues, Turbo mode has its own problems, but is often a better workaround, at the cost of increased GPU load (but not higher per-frame load) - the 'real' fix is to report the problems you're having to Pimax and ask them to improve their runtime - in particular the implementation of `xrWaitFrame`.
  13. Anyway, the actual way to limit the game to 72Hz in VR:
      - set the headset to 72Hz
      - turn off 'turbo mode', 'prefer framerate over latency', and any similar options in the headset software and any third-party mods. Note that if you're using mbucchia's quad views layer (you probably don't need it on the Pimax), it turns on turbo mode by default

      If that's not sufficient, test disabling all mods with https://github.com/fredemmott/OpenXR-API-Layers-GUI. If you have > 72Hz with all mods disabled and it set to 72Hz in the Pimax software, contact Pimax support.

      The *only* way to get correct frame pacing is for it to be dictated by the headset/runtime, not the game or any third-party software, as it should match the display panel timing.
  14. RTSS and similar tools literally modify the 'send this to my *monitor*' thing, which is not used for VR. The game will usually render to the mirror window in a way that just tells windows "here's a frame, display it whenever you're ready, or don't" - the game doesn't wait for it to display. A forced limit or forced vsync forces the game render to wait for the next permitted frame time *when sending to the monitor*. When not placebo, this limits the entire game, so it ties your VR headset to your monitor's behavior, which is bad. It's placebo when the game more thoroughly decouples VR from the monitor behavior.
  15. RTSS affects your mirror window - it does not directly affect your headset. There's a fair chance it's placebo, but if not, and the mirror window ends up somehow linked to the VR display, it likely varies depending on your vsync settings, your monitor refresh rate, and whether your monitor supports Variable Refresh Rate. When RTSS and similar non-VR tools have any non-placebo effect on VR, it's by tying the game to your monitor timing rather than the headset timing, leading at a minimum to microstutters, but it can also lead to motion prediction issues, and larger stuttering and tracking issues.
  16. For OpenXR, CapFrameX, RTSS, PresentMon and others may be easier, but the numbers they give are wrong, and they can actively harm the VR experience because they work on the "send to mirror window" part of the game, not the VR part. https://github.com/fredemmott/xrframetools?tab=readme-ov-file#why-should-i-use-this-instead-of-my-favorite-tool-for-non-vr-games

      If you go beyond monitoring with these into framerate limiting, vsync, or other modifications, they will either do nothing to VR beyond placebo, or tie your headset to your monitor's timing rather than the runtime's timing, often leading to stutters, motion misprediction, and other tracking issues.

      Edit: for example, accurate support for OpenXR has been an open feature request for CapFrameX for over a year: https://github.com/CXWorld/CapFrameX/issues/277

      While Google's AI and a guest post on Pimax's blog say PresentMon supports OpenXR, this is incorrect. This is obvious from the fact that it does not have an API layer. The only other way it could provide accurate numbers would be if it integrated with the runtimes - given PresentMon itself is open source, it would be an unusual choice for them to integrate with some runtimes but none of the open source ones (like VDXR) - and another bad choice to display incorrect information on other runtimes rather than a 'runtime not supported' message, which would require an API layer. PresentMon can hint about your load, but this says nothing about timings/bottlenecks unless something is at 100%. I'm also not seeing any trace of OpenXR-related code in PresentMon's source code.
  17. While I don't have a workaround or solution, please let Varjo know the bug in v4.4 and above is important to you; despite it not being in their 'known issues' list, they've been aware of it since at least November. It also affects other world-locked OpenXR overlays. I've been keeping track of the details over on https://github.com/OpenKneeboard/OpenKneeboard/issues/698 as Varjo do not appear to have a public bug tracker.
  18. Has something changed with this campaign recently to add spaces to the end of the mission names? I've had several reports like Farside's. While this crashing OpenKneeboard is a bug in OpenKneeboard (https://github.com/OpenKneeboard/OpenKneeboard/issues/774), this seems strange and undesirable. For whatever reason, this is mostly being reported with mission 5 of 6.
  19. You also received instructions on how to collect logs and ask for help, which you have not followed. It is not possible to provide any concrete suggestions if you do not follow the steps you have already been provided.
  20. New version with bugfixes: https://github.com/fredemmott/XRFrameTools/releases/tag/v0.2.0 Also worth noting that https://github.com/fredemmott/XRFrameTools?tab=readme-ov-file#why-should-i-use-this-instead-of-my-favorite-tool-for-non-vr-games applies to the several non-VR tools mentioned in this thread.
  21. You can also find various guides online to move Saved Games via other techniques such as the registry or various NTFS features - DO NOT USE THEM, they break things. The way described by others above is the only way to move Saved Games without subtly breaking some windows internals.
  22. This is true - it uses the OpenVR *driver* API, which is separate from the OpenVR game API. However, every other OpenXR runtime except for the simulators (e.g. Meta's XR Simulator, or the simulator built into Varjo Base to allow testing their runtime without a headset) also talks to the headset with its own internal driver/API (or in some cases, Windows.Devices.Display.Core - this is documented as a UWP API, but is not limited to UWP), not with OpenXR. While early OpenXR presentations mentioned an OpenXR device interface (a.k.a. driver), this was never actually specified or implemented.

      As a developer working with and on OpenXR, I would like to see more (good) runtimes. As a consumer interested in niche headsets from small manufacturers, I'd much rather have one with a mandatory complex piece of software developed by Valve rather than by the small manufacturer. I have more trust in Valve to keep delivering a reliable runtime for my expected lifetime of a headset.

      My definition of a 'good' runtime is:
      - publicly released versions pass a recent version of the freely-available OpenXR test suite
      - performant

      Sadly the majority of runtimes fail the first point - i.e. they do not work correctly. Most vendors seem to test with common games, and maybe run the test suite every year or two, but mostly leave it to users to discover issues for which the manufacturer already has access to nice, isolated, reproducible examples with clear pass/fail definitions. I personally consider it misleading to call something an 'OpenXR runtime' if it does not pass a recent version of the OpenXR test suite. Khronos have their own definition which overlaps, but is in some ways weaker (e.g. Meta's PC software is considered conformant, but the tested version is from 2020 on a Rift S, and Valve's is from 2021), and in some ways stronger (formal process and review, membership, and IP framework rather than just 'tests pass').
  23. SteamVR supports OpenXR natively. Other runtimes from hardware manufacturers are generally substantially worse than SteamVR, and I'm talking about correctness, not just performance. Valve did recently introduce some OpenXR correctness bugs, and when I reported them, they fixed them within two days - Meta is the only other hardware vendor to actually fix OpenXR correctness issues when I've reported them (but sadly seems to pretty much never fix issues reported in PTC before rolling out to stable), even when I point at failing tests in the OpenXR conformance test suite.

      OpenXR isn't a magic wand for lock-in - using your own example, WMR had its own OpenXR runtime, and Microsoft abandoned it. The Crystal is similarly tied to Pimax's software, Quests are tied to Meta's (or, when using the alternative to Link, still Meta's firmware), etc. That said, I'm 100% with you on *game* adoption of OpenXR being a good thing.

      On a minor note, OpenComposite (the OpenVR-to-OpenXR translation layer) is not one of mbucchia's projects, and for most of his VR projects, he explicitly did not support them when used with OpenComposite.

      Edit: While it is possible, this doesn't necessarily mean what you think it means; several other vendors ship additional OpenXR integrations as API layers which require SteamVR - for example, HTC have done this for several of their headsets, adding face and hand tracking via OpenXR on top of SteamVR.
  24. SteamVR supports OpenXR, and is one of the best OpenXR runtimes - but only when used for SteamVR-native headsets, like the BigScreen Beyond and the Valve Index. It tends to have issues/overhead when the headset is *not* SteamVR-native (e.g. when SteamVR is just wrapping other software from the manufacturer like Oculus Link or WMR). Most hardware manufacturers with their own runtime have done a substantially worse job than Valve have with SteamVR - especially the smaller manufacturers - and it takes engineering time from other things that are more useful. By asking BigScreen to create/support an alternative OpenXR runtime, you're asking BigScreen to spend significantly more time on software development for no concrete benefits, and a likely worse overall result. I understand some of you may dislike SteamVR, but that doesn't mean that SteamVR is technically bad or the wrong choice for all headsets.