Everything posted by mbucchia

  1. Idk either. It used to work fine in DCS through OpenComposite. Probably just a badly emulated call that prevents DCS from seeing that the virtual controller is alive.
  2. I'm not following your math at all. You can use the option `debug_focus_view=1` to make it really obvious what your focus region looks like (see the focus-region config sketch after this list). The drop is linear, but it doesn't actually drop all the way to 0: it goes from 1 (fully opaque) to 0.5. This is inspired by what Varjo's implementation seems to be doing. The peripheral region should be at a resolution that doesn't make it too distracting. There aren't really any other rules.
  3. Yes, this is correct, but it's 100% different from what OpenXR Toolkit is doing (which is feeding the hand joint data as a motion controller). It would be neat to write the tool you mentioned above, but it would be limited exclusively to DCS (and maybe a small handful of other games with Leap Motion support).
  4. This is the "thickness" of the transition band contained within the focus area. You can see roughly what it looks like on the wiki if you zoom in. It's expressed as a fraction of the width of the focus area (so 0.1 is 10%), which explains why 0.5 is the maximum: it means half of the focus area, which is pretty much starting from the center. See the focus-region config sketch after this list.
  5. Displaying the hand joints is really the easy part; faking the buttons and controller interactions is the hard part. It can't be done smoothly enough to give you a good experience - it requires the game to be designed specifically for hand interaction. Even with all the work I put into this mode for MSFS, it's still a pretty mediocre experience, and I do not have time to invest into it for DCS. You can still persist and try to make it work: my guess is that you need to change the interaction_profile in the config to something else, probably the oculus/touch_controller one (see the hand-tracking config sketch after this list). There are instructions on the website to do that.
  6. This mode is primarily designed for MSFS and I don't expect it to work well in DCS. Instead, look into @actually_fred's HTCC: https://htcc.fredemmott.com/
  7. MT will NOT use OpenComposite, and that's a GOOD thing.
  8. Thanks for joining those threads. I'm actually no longer 100% convinced that this isn't an issue on my end. I'll have to give it a try on Quest 2 (without ET obviously) and see if I can repro. I might ask you to provide some more detailed logs then.
  9. If you pay close attention to the video, you'll hear Steve say very clearly that it's NOT released yet! He has been using a beta version that is not public. It won't be released for at least two, if not three, more weeks. Edit: and to be clear, there are 2 unreleased components here: the Pimax Client with eye tracking support, which I cannot comment on, and PimaxXR with quad views DFR, which is the one you will not see publicly for another 2-3 weeks at least.
  10. OK, I will need to put up a note about this so other ppl don't fall into this trap! I really, really hate this dev mode with a fiery passion.
  11. The software to do DFR in DCS with OpenXR isn't released publicly, and it won't be until mid-July at least. There are only Pimax beta testers with the eye tracking Pimax Client, and only 2 PimaxXR beta testers with the DFR tool for DCS.
  12. It's alright I am about to go to bed
  13. Using the quad views mode turns on all sorts of code in DCS that you have never exercised before; it makes the game behave differently, regardless of what Meta-Foveated itself does. The best you can provide is an ETL trace captured while the issue happens (see the trace-capture command sketch after this list):
      1) Open an Administrator command prompt.
      2) Navigate to `Program Files\OpenXR-Meta-Foveated`.
      3) Run the command `wpr -start tracing.wprp -filemode`.
      4) Capture a few seconds of the issue; make sure to move your headset around so we can see the tracking data passed from Oculus to the game.
      5) Back in the command prompt, run `wpr -stop trace.etl`.
      6) Compress the file `trace.etl` and send it over to me.
  14. There is nothing in Meta-Foveated that does anything to tracking data: it relays the head tracking data as-is to the game, and it certainly _does not touch anything related to motion controllers at all_. You might be looking at a DCS bug here.
  15. This has been said quite a few times, but DCS MT enables OpenXR by default, which on Varjo will automatically enable a mode called "quad views". This mode is incompatible with OpenXR Toolkit, as listed on my website: Compatibility | OpenXR Toolkit (mbucchia.github.io). You can either:
      - disable "quad views" by using something called OpenXR-InstanceExtensionsWrapper, or
      - embrace quad views and enhance it via Home · mbucchia/Varjo-Foveated Wiki (github.com) in order to take advantage of eye tracking.
  16. Not easily. The settings are stored per-app using the app's self-proclaimed name (meaning the name chosen by the app developer). Both ST and MT specify the same "DCS World" name. I don't plan on changing that logic.
  17. Can anyone file a bug report or start a thread for this NVG bug? Now that more people are using this quad views mode in DCS, maybe ED will be more inclined to fix the issue?
  18. Thank you all! I'm seeing that a lot of you folks from this thread made very generous contributions! This will for sure help me cover some of my past costs (like a new GPU) and future costs (signing certificates, which just quadrupled in price this year...). Much appreciated!
  19. TL;DR: I'm still undecided about this one. Long version (because y'all know I love to write up the details): Varjo-Foveated was comparatively simple. It 100% relies on Varjo's own quad views support in their OpenXR runtime. All Varjo-Foveated does is:
      - Add the necessary setup for foveated rendering on top of quad views - which DCS isn't doing otherwise - and that's the primary bit of magic;
      - Add the ability to override the focus region pixel density, size, etc., which people love to play with;
      - Implement a workaround for the incorrect frame metadata submission that DCS introduced with MT.
      Nothing there requires touching any of the pixels, so there is no setup in the code to do Direct3D stuff. This is also why Varjo-Foveated took maybe 1/5th of the time to implement compared to Meta-Foveated. Meta-Foveated does all the above too, but it cannot leverage existing quad views support (it's not in the Oculus OpenXR runtime), so it has to reimplement it (see the C++ sketch after this list for what quad views looks like at the API level), including:
      - Hooking into the eye tracker to tell DCS how to render the 4 views properly. Fortunately this code is 100% reused from OpenXR Toolkit for the eye tracker part and 100% reused from PimaxXR for the view projection calculations, so that bit was easy.
      - Compositing the 4 views into a stereo view. Now that's the complex one. It requires me to do some drawing of my own: first because the Oculus OpenXR runtime lacks certain features (which Pimax supported, for example, and therefore I did not have to implement them in PimaxXR), and second because there are edge cases in the Oculus OpenXR runtime (for example, submitting multiple stereo layers seems to preclude ASW). So for that last one, there is quite a bit of significant code needed to set up drawing via Direct3D, pull data from the DCS backbuffers, set up new backbuffers to receive the produced views, etc. That code does not exist in Varjo-Foveated, nor in PimaxXR.
      With all of that set up for Meta-Foveated, adding CAS was pretty trivial (I think it took an hour). There's also Varjo being an odd-ball, which caused quite a few issues in the past: the way they manage OpenXR backbuffers internally is 100% different from other OpenXR implementations, and it requires additional logic. It would be complex to add this into Varjo-Foveated. Not impossible, but probably 2-3 days of work. I know, I sound terrible, because 2-3 days of work sounds like nothing, but they are 2-3 days where I'm not doing something else. I'm literally that busy. So wait and see... Right now I will focus on applying my learnings from Meta-Foveated back into PimaxXR for Crystal support. Ideally I can find a solution where Quest Pro/Varjo Aero/Pimax Crystal/others all share the same solution without additional maintenance costs. For example, I have Meta-Foveated working on the G2 Omnicept, but releasing that is an additional maintenance cost (which I deem not worth it due to the low volume of users). Ideally, there would be >1 maintainer to do work like this, e.g. someone passionate about G2 Omnicept support willing to spend a couple of hours a month to maintain it. <long brain dump over>
  20. Are you using wireless? That could be a cause of the latency.
  21. It will always vary from user to user. If you were only getting a 10% gain, then you are on the very low end of the spectrum... What quad views does is help reduce the overhead of 1) pixel shading and 2) pixel rasterization. If you are not in a situation (game settings, resolution, GPU) where these are your bottleneck, then you won't see as much improvement. It's similar to CPU vs GPU bound. Your sig shows you are on a GTX 1080... I suspect such an old system will have many more bottlenecks than just 1) and 2).
  22. Possibly the reason, but also quite possibly my bug too. OpenXR Toolkit DFR on Quest Pro doesn't need that offset for some reason ^^ It's probably something I need to get to the bottom of, but it was OK to release as-is for now.
  23. I'm not sure. Remember that I don't have a Quest Pro. I use very similar code between my projects and the headsets I support (Pimax/Omnicept/Quest Pro). Somehow this bias isn't needed on the other ones; I'm not sure why it is on the Quest Pro. I trust my beta testers! Sharpening-only has very little cost. I haven't even tested without it, but with it I found the end-to-end overhead of the entire API layer to be less than 200 microseconds (it was closer to 160 microseconds on my 4070, and I rounded up). I suspect that if you remove CAS you probably go down to 80-100 microseconds. The performance gain is likely imperceptible, but I'm sure the quality will suffer more... Thanks, fixed! That's a typo. The parsing would probably just ignore it at this point and still see 0.25.
  24. It varies from person to person, it's not universal. Though with both the Aero and the Quest Pro so far, it looks like the large majority of people are fine with the defaults. Be sure to disable the two "bad" options in DCS, 'VR' -> 'Bloom effect' and 'System' -> 'Lens Effects', because they make things worse. Also, you may have noted that there is a lighting bug in DCS that may create a different contrast between the two views. Nothing we can do except hope ED will fix that.
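
The focus-region config sketch referenced in items 2 and 4: a minimal sketch of the relevant settings, assuming the key=value settings file described on the project wiki. `debug_focus_view=1` is quoted from the post above; `smoothen_focus_view_edges` is my assumed name for the transition-band option (check the wiki for the exact spelling), and its value is the band thickness as a fraction of the focus region width, so 0.1 means 10% and 0.5 is the maximum.

```
debug_focus_view=1
smoothen_focus_view_edges=0.2
```

With the debug option enabled, the focus region is made plainly visible, which makes it easy to judge both its size and where the linear 1.0-to-0.5 opacity ramp of the transition band sits.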
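
The hand-tracking config sketch referenced in item 5: the post suggests pointing the interaction_profile setting at the Touch controller profile. A minimal sketch under the assumption that the setting accepts a standard OpenXR interaction profile path (the exact value format may differ, so follow the instructions on the website).

```
interaction_profile=/interaction_profiles/oculus/touch_controller
```

`/interaction_profiles/oculus/touch_controller` is the standard OpenXR identifier for the Oculus Touch controllers.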
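
The trace-capture command sketch referenced in item 13: the full Windows Performance Recorder session from an Administrator command prompt. The `wpr` commands and file names are quoted from the post; expanding the folder via `%ProgramFiles%` is my addition.

```
cd /d "%ProgramFiles%\OpenXR-Meta-Foveated"
wpr -start tracing.wprp -filemode
rem ... reproduce the issue for a few seconds, moving the headset around ...
wpr -stop trace.etl
```

Compress the resulting `trace.etl` before sending it, since ETL traces tend to be large.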
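
The C++ sketch referenced in item 19: for readers unfamiliar with what "quad views" means at the API level, this is a minimal sketch of what an OpenXR application does when the XR_VARJO_quad_views extension is active - it locates and renders four views per frame (a peripheral stereo pair plus a focus pair) instead of two. This is generic OpenXR usage for illustration, not mbucchia's actual layer code.

```cpp
#include <openxr/openxr.h>
#include <vector>

// Minimal sketch: locating the 4 views exposed by XR_VARJO_quad_views.
// Views 0/1 are the peripheral stereo pair, views 2/3 are the focus pair
// (steered by eye tracking on headsets that support it).
void locateQuadViews(XrSession session, XrSpace appSpace, XrTime displayTime) {
    XrViewLocateInfo locateInfo{XR_TYPE_VIEW_LOCATE_INFO};
    locateInfo.viewConfigurationType = XR_VIEW_CONFIGURATION_TYPE_PRIMARY_QUAD_VARJO;
    locateInfo.displayTime = displayTime;
    locateInfo.space = appSpace;

    XrViewState viewState{XR_TYPE_VIEW_STATE};
    std::vector<XrView> views(4, {XR_TYPE_VIEW});
    uint32_t viewCount = 0;
    xrLocateViews(session, &locateInfo, &viewState,
                  static_cast<uint32_t>(views.size()), &viewCount, views.data());

    // The application then renders each of the 4 views into its swapchain images
    // and submits a single projection layer with 4 XrCompositionLayerProjectionView
    // entries at xrEndFrame time.
}
```

On Varjo the runtime itself composites the four views, which is why Varjo-Foveated never touches pixels; what Meta-Foveated has to add on the Oculus runtime is exactly that compositing step, flattening the four projections back into a regular stereo pair.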