Sielu

Members
  • Posts: 36

  • Joined

  • Last visited

About Sielu

  • Birthday August 2


  1. Funny! I have just come back to DCS after some time away and was having this exact issue. @frenzon your timing could not be better!
  2. Nope, this option in WMR settings: LMFAO, I love it. And yeah, yours might actually beat my first button + hot glue monstrosity.
  3. I've played around with this functionality and have so far found it to be pretty lacking, so you're not missing much. The toolkit solution emulates standard VR controllers based on hand position, orientation, and finger movements. Needless to say, this gets real finicky real fast. I actually find DCS's native Leap implementation to be better, but both approaches are still weak due to two fundamental issues:
     1. 3D vs. 2D: every current hand/controller-to-plane interface requires your hand to be in the 3D vicinity of the control you want to interact with. This seems great in theory, but in reality we all have 'stuff' in the way that can prevent that from actually happening. Most solutions therefore allow some form of ray cast from your hand to the control you want, but that typically requires contorting your hand to 'point' at a given control, which (for me at least) feels SUPER unintuitive and not immersive.
     2. Finger movement to buttons: since basically every solution is emulating a standard VR controller, button presses are registered via some form of finger movement or wrist turning. This is problematic, especially for ray-cast interaction, since moving your finger or wrist typically shifts your hand position slightly, creating a lot of 'missed' interactions. Not to mention the Leap is far from infallible and will frequently fail to catch the movement that should trigger the interaction.
     For these reasons, I like the Fingers solution better. It is effectively a 'ray cast' style interaction, but the ray is cast from your head, not your hand. You effectively get two control points for the ray, your hand AND your head, instead of just your hand, which makes it more stable. Then, relying on physical buttons to trigger the different interactions makes the interactions themselves more reliable and more stable. My ideal solution would be to program a head-based ray cast through each hand the Leap identifies, then use the Espruino buttons to register button interactions (see the ray-cast sketch after this post list). I'm not that good at programming (yet).
  4. I'm far from an expert, but I would also suspect that since most of these manufacturers are in the defense industry, not the hobbyist flight sim industry, these parts are not exactly on their legal radar, even if they are technically patent violations. Even that would be suspect though, because outside of copying the form of the various panels, we're definitely not replicating their "real-life" function. Movie Prop manufacturers would have a lot of headaches if patent law was that strict. But again, I literally have no idea what I'm talking about.
  5. I've also had an issue with the latest version of Gemini and second monitors messing Fingers up. WMR was creating 'extra' fake monitors to handle 'legacy' Windows apps every time it started, which was really chewing up Fingers. I could get it to work by restarting Fingers AFTER WMR, but disabling the extra-monitors setting in WMR fixed it for good. I was considering something like that for a future revision. Right now this whole solution doesn't work for beans in a certain other popular, WMR-based, non-combat flight simulator, because its cursor implementation doesn't like the winput() commands that Fingers sends to Windows. Outside of going down the rabbit hole of programming new OpenXR controllers entirely, I'm thinking of some way to get the Espruinos to send the PC a mouse change signal to kludge in support for other sims (see the mouse-injection sketch after this post list). Just gotta figure out a way for the Espruinos to know where they are in space.
  6. I probably shouldn't have generalized so much. I'm genuinely ecstatic that you've gotten your setup to work that well! Out of curiosity, what modules are you using? I found the 'touchability' of the F16 to be particularly finicky, and after a week couldn't get through half of a cold start without having to deploy the mouse. The hand-position-to-cursor emulation route has proven to be much more immediately intuitive to me, and I really enjoy the tactile feedback of having physical buttons to push once 'in position'.
  7. I use the Stereo IR 170 in my setup; it works just as well as the normal Leap (which is to say, not very) for 3D hand tracking in DCS. If anything, I would recommend going with the normal Leap for less money and then using the Fingers app to emulate a mouse cursor.
  8. Just a note - Tehrawk was able to fix their issue and I have updated the guidance on the RoosterHands github to hopefully prevent further mishaps!
  9. Don't know if we're a little past the proper technique debate, but I did some testing and found that perfectly aligning the boresight bullseye in the reticle, at least for me, produces the best results:
  10. For those with leap motion hardware, this feels like it may fit well with @frenzon's fingers app:
  11. Thanks Viper! If I was as artful at modeling/printing/building as you I would be making a lot more physical interaction doodads. Been following your build for a while, super impressive yourself!
  12. I think I've finally gotten my rig to a place where I'm reasonably happy with it, and wanted to show off! I come from a family of aviators, but I never got past soloing. The work + expense = payoff equation for IRL flying just never added up for me, but I never could shake the 'itch' either. Thus, I got into simming. I've had a forever-project simpit for about the last 12 years or so, making incremental advancements to it every couple of years. For the vast majority of that time I've focused on space, but I made the leap to DCS about two years ago and haven't looked back. While at this point I spend the majority of my time in DCS, I do enjoy a good general aviation romp every once in a while.
      VR has been an absolute game changer, and I've been "investing" in headsets since the DK2. One of the under-appreciated powers of VR--to me--is its flexibility. The ability to slide into any cockpit and be present is unmatched, and I wanted my physical rig to take advantage of that. I also tend to be a bit fickle with my aircraft, and often find myself swapping between modules. Thus, while I have massive admiration for the detailed replica cockpits on this forum and others, I can't justify building out so many physical buttons for a platform that I may get bored of in a month.
      Wanting to up my immersion game even more, and with the onset of the pandemic, I decided to start going down the motion-rig rabbit hole, and was able to design and complete a compact motion G-seat in the style of Bergison's. The whole build thread can be found over on Xsimulator if you want more info on that. The next piece of the equation was getting really good, immersive cockpit interaction. I love the elegance of the PointCTRL solution, but after sitting on the waitlist for more than a year I decided to take matters into my own... hands... and stumbled on the Fingers app. A relatively quick round of programming and prototyping my own finger buttons (affectionately named 'RoosterHands') and I was good to go!
      Finally, the OpenKneeboard project has been ported to OpenXR, which means my rig is now in its final form! Until next month when I figure out how I want to tweak it again... Without further ado, a cold start of the F16 with takeoff:
  13. I'm getting 30-45 depending on where I'm at: Georgia/PG/NTTR on the 45 end, Syria/Marianas closer to 30 in the urban areas. I'm running relatively high spec, a 10900K @ 4.8 GHz and a 3090, with 64 GB of RAM. I DO have a lot of extra CPU load, running both Leap tracking and SimTools in the background, which I think contributes to slightly less-than-stellar frames. In the OpenXR Toolkit I've got frames locked to 45 and turned on FFR. I know it doesn't do much, but I barely notice it even when I'm looking for it, so I figure something is better than nothing. Also upped the contrast a tad to compensate for the G2's washy colors.
  14. Just wanted to throw my thanks in for @nikoel et al. for putting this thread and walkthrough together. Absolutely incredible tool. My framerate is slightly worse after upping my PD and settings, but the OpenXR 50%/30%/25% reprojection capability has more than made up for it. Much, much better experience for me. FWIW, not getting any wobbles on the Apache rotors here. (no motion smoothing turned on either).
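A rough sketch of the head-through-hand ray cast described in post 3, just to make the geometry concrete. It assumes the head and hand positions are already in one coordinate frame and models a cockpit panel as a flat plane; every name and number here is illustrative and not taken from Fingers, the Leap SDK, or DCS.

```python
# Minimal sketch of a head-through-hand ray cast (the "ideal solution" in post 3).
# Assumes head and hand positions are already in the same coordinate frame;
# all names here are illustrative, not from the Fingers, Leap, or DCS APIs.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_plane_hit(head, hand, plane_point, plane_normal):
    """Cast a ray from the head through the hand and intersect it with a
    cockpit-panel plane. Returns the 3D hit point, or None if the ray is
    parallel to the panel or the panel is behind the head."""
    direction = sub(hand, head)                 # ray direction: head -> hand
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-6:
        return None                             # ray parallel to the panel
    t = dot(sub(plane_point, head), plane_normal) / denom
    if t <= 0:
        return None                             # panel is behind the head
    return (head[0] + t * direction[0],
            head[1] + t * direction[1],
            head[2] + t * direction[2])

# Example: head at the origin, hand slightly forward and right, panel 0.6 m ahead.
hit = ray_plane_hit(head=(0.0, 0.0, 0.0),
                    hand=(0.1, -0.1, 0.3),
                    plane_point=(0.0, 0.0, 0.6),
                    plane_normal=(0.0, 0.0, 1.0))
print(hit)  # the panel point being "aimed" at; a physical button press
            # (e.g. from the Espruino) would then trigger the actual click
```

The point of the two-control-point scheme is that the hand only steers the ray's direction, so a small finger twitch barely moves the hit point, and the 'click' itself comes from a physical button rather than a gesture the tracker has to recognize.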
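And for post 5's point about cursor commands: a minimal Windows-side sketch of the kind of mouse injection being discussed, using the standard user32 mouse_event call from Python via ctypes. Whether this behaves any differently than the winput() calls Fingers already makes, and how an Espruino would report its position, are assumptions on my part, not anything taken from the actual projects.

```python
# Rough PC-side sketch of injecting mouse input on Windows, in the spirit of
# the winput()-style cursor commands discussed in post 5. The Espruino side
# (buttons, positioning, serial protocol) is assumed, not real project code.
import ctypes

user32 = ctypes.windll.user32

MOUSEEVENTF_MOVE = 0x0001      # relative move by (dx, dy) pixels
MOUSEEVENTF_LEFTDOWN = 0x0002
MOUSEEVENTF_LEFTUP = 0x0004

def move_cursor(dx, dy):
    """Nudge the cursor by a relative offset, like a HID mouse report would."""
    user32.mouse_event(MOUSEEVENTF_MOVE, dx, dy, 0, 0)

def left_click():
    """Press and release the left button, e.g. on an Espruino button event."""
    user32.mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0)
    user32.mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0)

if __name__ == "__main__":
    move_cursor(10, 0)   # small nudge to the right
    left_click()
```

The other route hinted at in the post would be to skip PC-side injection entirely and have the Espruino enumerate as a USB HID mouse itself, so the sim just sees ordinary mouse hardware.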