
actually_fred

Members
  • Posts

    177
  • Joined

  • Last visited

3 Followers

About actually_fred

  • Birthday 03/01/1987

Personal Information

  • Flight Simulators
    FSX, DCS World
  • Location
    Austin, TX
  • Interests
    VR
  • Occupation
    Software Engineer


  1. This is false:
     - it changes the difference between what the left eye sees and what the right eye sees, that's all
     - look at the screenshots I attached: the corners of the highlighted squares are at the same cockpit positions, and are the same size
     Best: leave it off. If stuff feels wrong, it is best to adjust your headset, not this. It's the same thing, but sometimes better. Using this can have side effects, and will *at best* be as good as your headset setting. Using this is more likely to give you eye strain, headaches, or artifacts. If this feels fine to you, go ahead; it's the same thing, just potentially worse, and possibly more convenient for you.
  2. I've seen a few conflicting statements about these, so I decided to test using one of Meta's developer tools, which provides a simulated OpenXR-compatible headset.

     "It's not IPD": Mostly false
     It's inter-camera distance (ICD), which is roughly the same thing as IPD. While they are different measurements, changing IPD works by changing ICD. If you make your IPD 1cm larger, you're almost always making your ICD 1cm larger. Both are changing your binocular overlap/separation - that is, the difference between where things appear to be in your left eye compared to your right eye. The difference between ICD and IPD is largely irrelevant to anyone except headset manufacturers, runtime developers, or engine developers.

     "It's world scale": Mostly true in terms of perception
     Changing IPD or ICD is a form of world scale. This is also what OpenXR Toolkit's "world scale" option does: https://mbucchia.github.io/OpenXR-Toolkit/other-features.html#world-scale-override
     Reducing the ICD can make the world feel 'smaller', and increasing the ICD can make the world seem 'larger'; however, it also has an impact on depth perception. The experience also varies from person to person: because (spoiler alert: see below) it's not a true world scale, your brain is seeing inconsistent data about reality, which doesn't make sense. Some people will feel a depth change, some a size change, some neither, and some will just get eye strain or a headache.

     "It's world scale": Mostly false in terms of what the game actually does
     It does not change the size of anything sent to the displays. All it changes is how far apart the left eye view and right eye view are (see the sketch after this post). In theory, if you look through just one eye at a time, the "force IPD distance" will have no effect on perceived size - however, if you mix this with looking through both eyes, your brain is great at filling in the blanks, and it will still feel like it changes the one-eye size even though it objectively doesn't. If an MFD is 200px by 200px with "Force IPD Distance" set to 70, it's still 200px by 200px with "Force IPD Distance" set to 80.

     Practical advice
     Your runtime/headset options should be at least as good. This is because if a runtime uses IPD as part of reprojection, using "Force IPD Distance" will hurt it. Note you often can't entirely turn off reprojection, even if there's an option for it: if DCS starts rendering a frame 10ms before it will be displayed on your headset, the runtime provides a 'best guess' at where the headset will be in 10ms. Even without missed frames, 10ms later, runtimes will often reproject the image that DCS provided to account for the difference between the prediction and the reality. If you don't see a difference, maybe your headset/runtime doesn't use IPD as part of the reprojection, or maybe it's just not something you're sensitive to. In that case, use whatever's most convenient for you.

     Receipts
     I've used a simulated headset instead of a real headset so that:
     - there is absolutely zero head movement
     - it is pixel-perfect reproducible
     The fake headset is set to 60mm IPD for both screenshots. You can save them and see the pixel measurements are identical in both screenshots and have the same reference points, or you can see that the front panels are the same width in both screenshots - but the distance between the left and right eye changes. If this were a 'true' world scale, changing it would change the pixel size of objects. The easiest reference point for overlap is the hole/rivet on the front canopy, near the centerline.
     [Screenshots: 60 IPD headset with no override, and 60 IPD headset with an 80 IPD override]
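     To make "all it changes is how far apart the left eye view and right eye view are" concrete, here's a minimal sketch (my illustration, not part of the original post) that prints the inter-camera distance straight from the per-eye poses the runtime hands to the game; an IPD/ICD override only moves these two poses apart or together, it does not touch the per-eye FOV or image size. A running session, reference space, and display time from the frame loop are assumed.

         // Sketch: measure the inter-camera distance (ICD) from the per-eye view
         // poses returned by xrLocateViews(). Error handling omitted for brevity.
         #include <openxr/openxr.h>
         #include <cmath>
         #include <cstdio>

         void PrintIcd(XrSession session, XrSpace space, XrTime displayTime) {
             XrViewLocateInfo locateInfo{XR_TYPE_VIEW_LOCATE_INFO};
             locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
             locateInfo.displayTime = displayTime;
             locateInfo.space = space;

             XrViewState viewState{XR_TYPE_VIEW_STATE};
             XrView views[2]{{XR_TYPE_VIEW}, {XR_TYPE_VIEW}};
             uint32_t viewCount = 0;
             xrLocateViews(session, &locateInfo, &viewState, 2, &viewCount, views);

             // An IPD/ICD override only changes the offset between these two positions;
             // views[i].fov - and therefore rendered object size - is unaffected.
             const XrVector3f& l = views[0].pose.position;
             const XrVector3f& r = views[1].pose.position;
             const float dx = r.x - l.x, dy = r.y - l.y, dz = r.z - l.z;
             std::printf("ICD: %.1f mm\n", std::sqrt(dx*dx + dy*dy + dz*dz) * 1000.0f);
         }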
  3. 1. https://htcc.fredemmott.com/faq.html#my-anti-virus-says-it-found-something-what-do-i-do
     2. master is newer than the latest version, with the exception of the version number. If you're familiar with git you can easily confirm this yourself.
     3. (made an exception for those last two given how they look together, but):
  4. As concrete examples:
     - the Meta XR Simulator reports a FOV of 50 degrees upwards of center (well, 0.87266463 radians...) and 49 degrees down, which is close enough to symmetrical for this to be subtle
     - my Quest Pro reports a much more substantial difference of 42 degrees up, 53 degrees down, so a crosshair that is put in the center of the render target without paying attention to the FOV angleUp/angleDown will be quite substantially off (see the sketch below)
     These numbers are the angles provided to DCS by the runtime in the `xrLocateViews()` call - they are not a subjective measurement of my perceived field of view.
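     A minimal sketch (my illustration, not part of the original post) of why this matters: given the angleUp/angleDown values reported via xrLocateViews(), you can compute which row of the render target corresponds to "straight ahead", and with an asymmetric FOV it is not the halfway row. The render target height here is a made-up example value.

         // Sketch: find the render-target row for "straight ahead", given an
         // asymmetric vertical FOV as reported via xrLocateViews()/XrFovf.
         #include <cmath>
         #include <cstdio>

         int main() {
             const float kPi = 3.14159265f;
             // XrFovf convention: angleUp is positive, angleDown is negative (radians).
             // These are the Quest Pro values quoted above.
             const float angleUp   =  42.0f * kPi / 180.0f;
             const float angleDown = -53.0f * kPi / 180.0f;
             const int   height    = 2000;  // hypothetical render target height, pixels

             // The projection maps tan(angle) linearly onto the image plane, so the
             // "straight ahead" (angle 0) row, measured from the top, is:
             const float top    = std::tan(angleUp);
             const float bottom = std::tan(angleDown);
             const float centerRow = height * top / (top - bottom);

             std::printf("optical center at row %.0f of %d; naive center is row %d\n",
                         centerRow, height, height / 2);
             // With 42 up / 53 down this lands around row 808, well above row 1000, so
             // a crosshair drawn at the naive center appears noticeably below straight
             // ahead.
             return 0;
         }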
  5. This probably varies depending on whether the headset has a symmetrical field of view: my guess is that DCS draws the crosshairs in the vertical middle of the panel, which isn't the center of the FOV on some recent headsets, but often is in simulators. This also ties into the "mirror window is too high" reports, but in that case it's not a bug - it's a mirror, and it's showing what's on the panels; the panels are mounted off-center intentionally.
  6. Hey - this is all a bit off-topic, and would be better suited to the main VR forum rather than the bug report/suggestions forum. In particular, there's another recent thread on thumb/finger mice.
  7. There's really just one thing needed here:
  8. This is halfway between a bug report and a feature request: DCS's built-in support for hand tracking is highly immersive, but not practical, because of how control interactions are triggered by a moment of 'touch'. For example:
     - due to the switch positions and limitations of tracking, pushing the throttle fully forward will often incorrectly:
       - turn off fuel pumps or engines in the A-10C
       - eject stores in the F-16
       - activate the fire suppression in the F-18
     - due to tracking limitations and just closeness/stability, interacting with the UFC in the A-10C can incorrectly trigger a fire suppression handle
     - in many aircraft, interacting with the lower front and side panels can lead to accidentally ejecting
     HTCC still exists for DCS entirely because of this issue.
     Suggested fix
     - Add an option to require a button to be held for an interaction to happen
     - Add an option for a pointing gesture triggering a 'laser', like a controller, which *also* requires a button to be held if the above option is also on
     "A button can be held" should support:
     - mouse buttons (e.g. PointCTRL with stock firmware, generic 'ring mice' from Amazon/AliExpress) - this would need to ignore the mouse cursor position and just use the hand tracking position
     - DirectInput game devices (e.g. PointCTRL with HTCC firmware, slugmouse)
     Nice to have: optionally some kind of gesture, e.g. pinching thumb and index fingers:
     - XR_FB_hand_tracking_aim makes this easy, but is not universally supported. It is currently supported on Quest Link in dev mode only, on Quest-series headsets via Virtual Desktop, and on Ultraleap-based devices (including the hand tracking module on the original Pimax Crystal)
     - it can be implemented more generally by comparing joint positions (a rough sketch follows after this post)
     - this should be optional because, like hand tracking overall, it is not perfect; people who have buttons bound are likely to want to disable this to further reduce the chances of incorrect interactions
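     As a rough illustration of the "comparing joint positions" option (my sketch, not part of the original report): a pinch can be detected by measuring the distance between the thumb-tip and index-tip joints from XR_EXT_hand_tracking. The hand tracker, the base space, and the ~2cm threshold are all assumptions here.

         // Sketch: pinch detection via XR_EXT_hand_tracking joint positions.
         // Assumes a hand tracker created with xrCreateHandTrackerEXT, and that the
         // xrLocateHandJointsEXT function pointer was loaded via xrGetInstanceProcAddr.
         #include <openxr/openxr.h>
         #include <cmath>

         bool IsPinching(PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT,
                         XrHandTrackerEXT handTracker,
                         XrSpace baseSpace,
                         XrTime time) {
             XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
             XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
             locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
             locations.jointLocations = joints;

             XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
             locateInfo.baseSpace = baseSpace;
             locateInfo.time = time;

             if (XR_FAILED(xrLocateHandJointsEXT(handTracker, &locateInfo, &locations))
                 || !locations.isActive) {
                 return false;
             }

             const XrVector3f& thumb = joints[XR_HAND_JOINT_THUMB_TIP_EXT].pose.position;
             const XrVector3f& index = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose.position;
             const float dx = thumb.x - index.x;
             const float dy = thumb.y - index.y;
             const float dz = thumb.z - index.z;

             // Treat fingertips within ~2cm of each other as a pinch; tune to taste.
             return std::sqrt(dx*dx + dy*dy + dz*dz) < 0.02f;
         }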
  9. Download link: https://github.com/OpenKneeboard/Fresh-Start/releases/latest
     For most people, uninstalling OpenKneeboard from Add/Remove Programs is sufficient; there are a few cases where it isn't:
     - the uninstaller intentionally does not delete your settings; if you want to start with fresh settings, this tool can help you (or you can just delete the settings files yourself)
     - Microsoft limitations prevent OpenKneeboard installers/uninstallers from blocking or repairing some edge cases, such as installing certain older versions after newer versions without uninstalling the newer versions first
     - if multiple versions were installed simultaneously (e.g. via the above edge case), Microsoft limitations sometimes prevented installs/upgrades from cleanly removing all previous versions instead of just one previous version
     This tool cleans them all up. In short, use this if:
     - you tried or used to use OpenKneeboard and want to kill it with fire
     - you want a complete fresh start of OpenKneeboard, deleting your previous settings
     - you're having problems and have been using OpenKneeboard for a long time; e.g. some of the issues it can repair only occur if you've had a 2021 version of OpenKneeboard installed, then installed a 2025 version, then installed a 2021 or 2022 version again
  10. HTCC is an alternative hand tracking implementation for DCS World. It:
      - reduces immersion: hand models won't line up and should be disabled (see instructions)
      - improves practicality: drastically reduces the chances of accidentally triggering switches, which frequently leads to accidentally ejecting, turning off fuel pumps, or pulling fire handles
      Download here: https://github.com/fredemmott/HTCC/releases/latest
      Then read: https://htcc.fredemmott.com/getting-started.html
      Highlights
      - improved compatibility and reliability, including SteamVR
      - added a basic report on OpenXR usability
      - added an option to fix basic Ultraleap issues (including the hand tracking module for the OG Pimax Crystal)
      - added an option to enable/disable the suspend/hibernate gesture
      - completely rewritten settings app
      Please note:
      - While SteamVR itself supports hand tracking, most SteamVR headsets and drivers do not
      - HTCC is limited by DCS's 'absolute mouse' support (e.g. like a touchscreen or tablet). HTCC's in-game experience will never substantially improve. Requests for an improved hand tracking experience should be sent to the game developers, and should be asking for improved hand tracking - not for improved HTCC support. HTCC is a workaround; everything it does could - and should - be done at least as easily and at least as well in the game itself.
      - I do not provide any support for HTCC on these forums. See https://htcc.fredemmott.com/getting-help.html
  11. Make sure you have the Windows Installer service running. If not, set it to start automatically and reboot. If you don't have the Windows Installer service or it won't start, you need to reinstall Windows - it's broken, usually due to an over-eager attempt to "de-bloat" it.
  12. Yep, and the current NVIDIA driver has a known issue where it often crashes in D3D12 games with HAGS off.
  13. HAGS is *usually* best left on nowadays, but it's still a "can vary - try it and see" thing. For people staying on Win10 or old versions of Win11, it's more likely to be worth turning it off.
  14. Two contrasting approaches: RTSS is 'delay stuff', Turbo Mode is 'remove delays'. Both can increase latency, but 'remove delays' is usually the better of the two.
      The reason that 'remove delays' can be bad for latency: say you want 1 frame per second, it takes 100ms to render a frame, and your next frame is due at exactly 10:00:00.000.
      - The theoretical ideal time to start that frame is 09:59:59.900 - in practice, most runtimes will aim to start a little earlier, say .895.
      - If the game says 'ready to start the next frame' at .801, the runtime may introduce a .094 delay.
      - Turbo Mode gets rid of that delay, so your frame will be ready at .901, but it will still be displayed at the next panel refresh at 10:00:00.000 - so it will be 99ms out of date. Another frame will be ready at 10:00:00.101, but that will be discarded, with the goal of making another for 10:00:01. (The sketch below walks through the same numbers.)
      With RTSS or other non-VR tools, when they have any effect on VR, you'll still have whatever delay the runtime wants (which will change), but also whatever delay RTSS inserts, and RTSS has no idea when the next panel refresh is. Usually this is either 'no effect' or 'adds an extra 1 frame of delay'.
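      The same timeline as arithmetic (illustrative only, reusing the post's numbers; not real runtime behavior):

          // How long a finished frame waits before the panel shows it, with and
          // without the runtime-inserted delay described above.
          #include <cstdio>

          int main() {
              const double display   =  0.000;  // next refresh, relative to 10:00:00.000
              const double render    =  0.100;  // 100ms to render a frame
              const double gameReady = -0.199;  // game could start at 09:59:59.801

              // Runtime-paced: the runtime holds the game until .895, so the frame
              // finishes at .995 and waits only ~5ms for the refresh.
              const double pacedStart = -0.105;
              std::printf("runtime-paced: done at %+.3f, waits %.0fms before display\n",
                          pacedStart + render, (display - (pacedStart + render)) * 1000.0);

              // Turbo Mode: no delay, so the frame finishes at .901 and then sits for
              // 99ms - it is that much more out of date when finally displayed.
              std::printf("turbo mode:    done at %+.3f, waits %.0fms before display\n",
                          gameReady + render, (display - (gameReady + render)) * 1000.0);
              return 0;
          }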
  15. You seem to have skipped or misread this section twice - note 'non-placebo effect': RTSS is not capable of modifying framerate in a way that respects VR panel timing; all it can do is delay the game loop based on non-VR data. If you are getting a smoother result with RTSS, it is most likely because it's delaying the game enough to entirely miss a panel refresh, effectively adding a 1-frame delay with out-of-date head tracking. If this is the case, the root cause would be either DCS having unpredictable CPU load or over-optimistic wait time prediction in the Pimax software. For these kinds of issues, Turbo Mode has its own problems, but is often a better workaround, at the cost of increased GPU load (but not higher per-frame load) - the 'real' fix is to report the problems you're having to Pimax and ask them to improve their runtime - in particular the implementation of xrWaitFrame (see the frame loop sketch below).
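      For context, a minimal OpenXR frame loop skeleton (my sketch, not from the post): xrWaitFrame is where the runtime decides how long to hold the game before the next frame, using its knowledge of panel timing; an external limiter like RTSS sleeps outside this loop and has no access to that timing.

          // Minimal OpenXR frame loop; error handling and layer submission omitted.
          #include <openxr/openxr.h>

          void RenderOneFrame(XrSession session) {
              XrFrameState frameState{XR_TYPE_FRAME_STATE};
              XrFrameWaitInfo waitInfo{XR_TYPE_FRAME_WAIT_INFO};

              // The runtime blocks here until it wants the app to start the next frame.
              // This is the delay that turbo-mode-style tools remove, and the wait-time
              // prediction that a runtime can get wrong.
              xrWaitFrame(session, &waitInfo, &frameState);

              XrFrameBeginInfo beginInfo{XR_TYPE_FRAME_BEGIN_INFO};
              xrBeginFrame(session, &beginInfo);

              // ... render using frameState.predictedDisplayTime for pose prediction ...

              XrFrameEndInfo endInfo{XR_TYPE_FRAME_END_INFO};
              endInfo.displayTime = frameState.predictedDisplayTime;
              endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE;
              endInfo.layerCount = 0;   // no layers in this skeleton
              endInfo.layers = nullptr;
              xrEndFrame(session, &endInfo);
          }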