Everything posted by Sielu

  1. Funny! I have just come back to DCS after some time away and was having this exact issue. @frenzon your timing could not be better!
  2. Nope, this option in WMR settings: LMFAO, I love it. And yeah, yours might actually beat my first button + hot-glue monstrosity.
  3. I've played around with this functionality and have so far found it to be pretty lacking, so you're not missing much. The toolkit solution emulates standard VR controllers based on hand position, orientation, and finger movements. Needless to say, this gets real finicky real fast. I actually find DCS's native Leap implementation to be better, but both approaches are still weak due to two fundamental issues:

     1. 3D vs. 2D: all the current hand/controller-to-plane interfaces require your hand to be in the 3D vicinity of the control you want to interact with. This seems great in theory, but in reality we all have 'stuff' in the way that can prevent that from actually happening. Thus, most solutions allow for a ray cast of some form from your hand to the control that you want, but this typically requires contorting your hand to 'point' at a given control, which (for me at least) feels SUPER unintuitive and not immersive.

     2. Finger movement to buttons: since basically every solution is emulating a standard VR controller, registering buttons is done via finger movement or wrist turning of some form. This is problematic, especially for ray-cast interaction, since moving your finger or wrist typically shifts your hand position slightly, creating a lot of 'missed' interactions. Not to mention the Leap is far from infallible, and will frequently not catch the movement that should trigger the interaction.

     For these reasons, I like the Fingers solution better. It is effectively a 'ray cast' style interaction, but the 'ray' is being cast from your head, not your hand. You're given two 'control points' for the ray, your hand AND your head, instead of just your hand, which makes it more stable. Then, relying on physical buttons to trigger the different interactions makes the interactions themselves more reliable. My ideal solution would be to program a head-based ray cast through each hand identified by the Leap, then use the Espruino buttons to register button interactions (a rough sketch of the math is below). I'm not that good at programming (yet).
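     For what it's worth, the head-through-hand ray is just a line-plane intersection. Below is a minimal C++ sketch of the idea as I understand it, not working code from any of these projects; Vec3 and all of the inputs are hypothetical stand-ins for whatever the headset and Leap APIs actually report:

     ```cpp
     // Sketch: cast a ray from the head through the tracked hand and find
     // where it hits a flat cockpit panel. All names here are hypothetical.
     #include <cmath>
     #include <optional>

     struct Vec3 {
         float x, y, z;
         Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
         Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
         Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
     };

     float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

     Vec3 normalize(const Vec3& v) {
         float len = std::sqrt(dot(v, v));
         return {v.x / len, v.y / len, v.z / len};
     }

     // Returns the point where the head->hand ray meets the panel plane,
     // or nothing if the ray runs parallel to it or points away from it.
     std::optional<Vec3> castThroughHand(Vec3 head, Vec3 hand,
                                         Vec3 panelPoint, Vec3 panelNormal) {
         Vec3 dir = normalize(hand - head);      // head and hand define the ray
         float denom = dot(dir, panelNormal);
         if (std::fabs(denom) < 1e-6f) return std::nullopt;  // parallel to panel
         float t = dot(panelPoint - head, panelNormal) / denom;
         if (t <= 0.0f) return std::nullopt;     // panel is behind the pilot
         return head + dir * t;                  // 3D hit point on the panel
     }
     ```

     The Espruino buttons would then fire the click once the hit point sits over the control, so small finger movements can't drag the ray off target.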
  4. I'm far from an expert, but I would also suspect that since most of these manufacturers are in the defense industry, not the hobbyist flight sim industry, these parts are not exactly on their legal radar, even if they are technically patent violations. Even that would be suspect though, because outside of copying the form of the various panels, we're definitely not replicating their "real-life" function. Movie Prop manufacturers would have a lot of headaches if patent law was that strict. But again, I literally have no idea what I'm talking about.
  5. I've also had an issue with the latest version of Gemini and second monitors messing Fingers up. WMR was creating 'extra' fake monitors to handle 'legacy' Windows apps every time it started, which was really chewing up Fingers. I could get it to work by restarting Fingers AFTER WMR, but disabling the extra-monitors setting in WMR fixed it for good. I was considering something like that for a future revision. Right now this whole solution doesn't work for beans in a certain other popular, WMR-based, non-combat flight simulator, because its cursor implementation doesn't like the winput() commands that Fingers sends to Windows. Outside of going down the rabbit hole of programming new OpenXR controllers entirely, I'm thinking of some way to get the Espruinos to send the PC a mouse change signal to kludge in support for other sims (a rough sketch of that idea is below). Just gotta figure out a way for the Espruinos to know where they are in space.
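     To sketch the kludge (purely speculative, and not something in the RoosterHands repo): a tiny Windows helper could take cursor deltas reported by the gloves and replay them through the Win32 SendInput API. SendInput and the INPUT struct are real Win32 APIs; how the deltas actually arrive from the Espruinos (serial, BLE, etc.) is the unsolved part.

     ```cpp
     // Speculative sketch: inject a glove-reported cursor delta as an
     // OS-level mouse move. Whether the target sim accepts this any
     // better than Fingers' own injection is an open question.
     #include <windows.h>

     void injectMouseDelta(LONG dx, LONG dy) {
         INPUT in = {};
         in.type = INPUT_MOUSE;
         in.mi.dx = dx;                     // relative movement
         in.mi.dy = dy;
         in.mi.dwFlags = MOUSEEVENTF_MOVE;  // add MOUSEEVENTF_ABSOLUTE (with
                                            // 0..65535 coords) if relative
                                            // moves get ignored
         SendInput(1, &in, sizeof(INPUT));
     }
     ```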
  6. I probably shouldn't have generalized so much. I'm genuinely ecstatic that you've gotten your setup to work that well! Out of curiosity, what modules are you using? I found the 'touchability' of the F16 to be particularly finicky, and after a week couldn't get through half of a cold start without having to deploy the mouse. The hand-position-to-cursor emulation route has proven to be much more immediately intuitive to me, and I really enjoy the tactile feedback of having physical buttons to push once 'in position'.
  7. I use the Stereo IR 170 in my setup; it works just as well as the normal Leap (which is to say, not very well) for 3D hand tracking in DCS. If anything, I would recommend going with the normal Leap for less money and then using the Fingers app to emulate a mouse cursor.
  8. Just a note - Tehrawk was able to fix their issue and I have updated the guidance on the RoosterHands github to hopefully prevent further mishaps!
  9. Don't know if we're a little past the proper technique debate, but I did some testing and found that perfectly aligning the boresight bullseye in the reticle, at least for me, produces the best results:
  10. For those with Leap Motion hardware, this feels like it may fit well with @frenzon's Fingers app:
  11. Thanks Viper! If I was as artful at modeling/printing/building as you I would be making a lot more physical interaction doodads. Been following your build for a while, super impressive yourself!
  12. I think I've finally gotten my rig to a place where I'm reasonably happy with it, and wanted to show off! I come from a family of aviators, but I myself never got past soloing. The work + expense = payoff equation for IRL flying just never added up for me, but I never could shake the 'itch' either. Thus, I got into simming. I've had a forever-project simpit for about the last 12 years or so, making incremental advancements to it every couple of years. For the vast majority of that time I've focused on space, but made the leap to DCS about two years ago and haven't looked back. While at this point I spend the majority of my time in DCS, I do enjoy a good general aviation romp every once in a while.

      VR has been an absolute game changer, and I've been "investing" in headsets since the DK2. One of the under-appreciated powers of VR--to me--is its flexibility. The ability to slide into any cockpit and be present is unmatched, and I wanted my physical rig to take advantage of that. I also tend to be a bit fickle with my aircraft, and often find myself swapping between modules. Thus, while I have massive admiration for the detailed replica cockpits on this forum and others, I can't justify building out so many physical buttons for a platform that I may get bored of in a month.

      Wanting to up my immersion game even more, and with the onset of the pandemic, I decided to start going down the motion-rig rabbit hole, and was able to design and complete a compact motion G-seat in the style of Bergison's. The whole build thread can be found over on XSimulator if you want more info on that. The next piece of the equation was getting really good, immersive cockpit interaction. I love the elegance of the PointCTRL solution, but after sitting on the waitlist for more than a year I decided to take matters into my own... hands... and stumbled on the Fingers app. A relatively quick round of programming and prototyping my own finger buttons (affectionately named 'RoosterHands') and I was good to go! Finally, the OpenKneeboard project has been ported to OpenXR, which means my rig is now in its final form! Until next month, when I figure out how I want to tweak it again... Without further ado, a cold start of the F16 with takeoff:
  13. I'm getting 30-45 depending on where I'm at: Georgia/PG/NTTR on the 45 end, Syria/Marianas closer to 30 in the urban areas. I'm running relatively high spec on a 10900K @ 4.8 GHz and a 3090, with 64 GB of RAM. I DO have a lot of CPU overhead, running both Leap tracking and SimTools in the background, which I think contributes to slightly less-than-stellar frames. For the OpenXR Toolkit I've got frames locked to 45 and turned on FFR. I know it doesn't do much, but I barely notice it when I'm looking for it, so I figure something is better than nothing. Also upped the contrast a tad to compensate for the G2's washy colors.
  14. Just wanted to throw my thanks in for @nikoel et al. for putting this thread and walkthrough together. Absolutely incredible tool. My framerate is slightly worse after upping my PD and settings, but the OpenXR 50%/30%/25% reprojection capability has more than made up for it. Much, much better experience for me. FWIW, not getting any wobbles on the Apache rotors here. (no motion smoothing turned on either).
  15. Hey folks, I've got a new version of my gloves printed. Switched to smaller batteries and mounted them directly on the back of the hands. Much more comfortable! Also, made a GitHub repo for the stl files and Espruino code I wrote: https://github.com/sielu-rooster/roosterhands
  16. I'm using VRK and a basic Wacom tablet strapped to my knee. VRK App: Wacom Tablet: https://www.wacom.com/en-us/products/pen-tablets/one-by-wacom VRK took a couple of evenings to get configured, but now that it's rolling it's been great!
  17. I will continue to proselytize the Fingers app for mimicking PointCTRL's scheme with a Leap Motion (now that you have one). I highly suggest installing it and trying it out; just map right/left click to your HOTAS for now and see how it feels... then you can start going down the rabbit hole of building your own finger buttons! This has literally been a game changer for me; as a VR pilot on multiple modules, this is the most immersion I've felt without having to build physical controls for each cockpit.
  18. If you already have PointCTRL, I would stick with that. I see a few fundamental barriers to native, 3D Leap support that PointCTRL and other hand-to-2D-plane solutions don't have:

      1. No feedback: gestures are great, but having tactile feedback, even if it's just a button push against the side of your finger, is REALLY nice. From what I understand, it's one of the complaints of actual F35/F22 drivers that their touch screens don't give that tactile "confirmation" of a button push. I think the Leap is the same way: even if you're looking right at the switch, it's nice to feel the actuation.

      2. Module support: as has been mentioned, third-party modules have the option of implementing Leap support, but to my knowledge none have so far. Anything that works with a mouse cursor will work with PointCTRL, though.

      3. Clearance: this, I'm realizing, is the biggest issue for me. I regularly fly the Viper and the Hog, and I'm starting to dabble in the Hornet as well. These three have different clearance requirements depending on what you're doing. Since PointCTRL maps your hand to a 2D plane (sketched below), you don't have to actually put your hand at the correct distance to actuate a switch... whereas with a true 3D solution you need full clearance to all the buttons and switches you want to actuate. Building a replica cockpit, as some folks above have, is an option, but given that you fly a variety of aircraft, this could pose a challenge.
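      To make the 2D-plane point concrete, here's a rough sketch of the mapping as I understand it (my own illustration, not PointCTRL's actual code; the calibration names are made up). The hand's x/y gets normalized against a rectangle you sweep out during calibration, and depth never enters the equation:

      ```cpp
      // Illustrative hand-to-2D-plane mapping; all names are hypothetical.
      #include <algorithm>

      // Calibration rectangle swept out in hand-space during setup.
      struct Calibration {
          float left, right, top, bottom;
      };

      struct Cursor { float x, y; };  // normalized screen position, 0..1

      // Depth (z) never appears: the hand is treated as living on a 2D
      // plane, which is why clearance to the physical switch never matters.
      Cursor handToCursor(float handX, float handY, const Calibration& cal) {
          float u = (handX - cal.left) / (cal.right - cal.left);
          float v = (handY - cal.top) / (cal.bottom - cal.top);
          return { std::clamp(u, 0.0f, 1.0f),   // pin to the screen edges
                   std::clamp(v, 0.0f, 1.0f) };
      }
      ```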
  19. This is a really elegant solution to a problem I've been considering for a while: how do I get the benefits of force feedback while still maintaining the investments I've made in high-end gimbals/sticks, etc.? I know next to nothing about the OpenFFB project, other than it exists: https://github.com/Ultrawipf/OpenFFBoard Did you try working with this at all, or did you just go with the Sidewinder since you had the hardware?
  20. Yeah, I had a hell of a time too... turns out the Pi-hole I had on my network (which blocks trackers) was blocking the login server. I disabled the Pi-hole temporarily and was able to get in... if @edmuss's download link doesn't work, try disabling whatever VPN/ad-blockers/etc. you have.
  21. You want to hit the bottom-left "BRT" rocker...
  22. I'd say go for it. The native Leap support right now still leaves a lot to be desired for me, although YMMV, and just manipulating the UFC and DDIs may be fine. I wasn't nearly satisfied with the out-of-the-box support, but I've had tremendous luck with the app @frenzon built: This approach requires you to either make some custom buttons to attach to your hands/fingers, or map right click, left click, and possibly mouse scroll to your HOTAS. That being said, it's much, much more reliable than the basic Leap support and--with a bit of extra work--extremely immersive. Especially if, like many of us, you're on the waitlist for a PointCTRL, this can help bridge the gap!
  23. Hey folks, made a quick video showing off Fingers tracking + my wireless button gloves, doing a cold start on the F16. Hope this inspires folks! Sorry the video is blurry in spots... the camera kept having a mind of its own on focus.
  24. Hey @Bojevnik, had to wait until the end of the day before I could finish a write-up. The process is relatively simple but was a little involved for me with the resources I had on hand. I only had an Arduino Uno when I started, which doesn't natively allow for connection to a PC as a controller (a 'Human Interface Device', or HID). Therefore I had to use the UnoJoy library, with instructions posted on their GitHub, to get the Arduino working: https://github.com/AlanChatham/UnoJoy

      If you're starting from scratch, I would recommend using a different model of Arduino. Later revisions can connect to a PC natively as an HID, which makes the process a bit simpler. A quick Google search tells me that this library is probably acceptable: https://github.com/NicoHood/HID, and it looks like you could use an Arduino Uno here as well. I'd recommend an Arduino Leonardo or Pro Micro to start with, for simplicity. Then just follow the instructions in the libraries above to set them up as a game controller. You don't need to worry about axes unless you want to. I just mapped 4 buttons to mine, and left the rest of the available controller unassigned.

      The wiring is pretty simple. I chose basic 12mm momentary buttons (https://www.amazon.com/TWTADE-Momentary-Tactile-Button-Switch/dp/B07CG6HVY9/ref=sr_1_48?crid=1LNA1PJU0ZESX&keywords=12mm+button&qid=1642627045&sprefix=12mm+button%2Caps%2C46&sr=8-48), which have a really nice click and are easy enough to wire. When the button is clicked, it closes a circuit between the two halves of the button, so to wire it you just connect one half to an input pin on the Arduino, and the other half to a ground (GND) pin on the Arduino. Multiple buttons can share GND pins, but the 'signal' pins have to be different. When the button is clicked and the circuit closes, it "drives the pin low", i.e. causes the voltage reading on the pin to drop, since the pin is now connected to ground. The software reads this as a click and sends the appropriate report to the PC, where it's interpreted as a controller button click (a minimal example sketch is below). I know that's a high-level runthrough; I hope it helps!
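      To make that concrete, here's a minimal sketch of the 4-button version, assuming an Arduino Leonardo/Pro Micro and the MHeironimus ArduinoJoystickLibrary (https://github.com/MHeironimus/ArduinoJoystickLibrary) rather than the libraries above; the pin numbers are arbitrary:

      ```cpp
      // Four momentary buttons, each wired from a pin to GND, reported to
      // the PC as game-controller buttons 0-3.
      #include <Joystick.h>

      Joystick_ Joystick;                       // enumerates as a USB game controller

      const int BUTTON_PINS[4] = {2, 3, 4, 5};  // one leg of each button; other leg to GND

      void setup() {
        for (int i = 0; i < 4; i++) {
          // INPUT_PULLUP holds the pin HIGH; pressing the button shorts it
          // to GND, "driving the pin low" as described above.
          pinMode(BUTTON_PINS[i], INPUT_PULLUP);
        }
        Joystick.begin();
      }

      void loop() {
        for (int i = 0; i < 4; i++) {
          bool pressed = (digitalRead(BUTTON_PINS[i]) == LOW);  // LOW = circuit closed
          Joystick.setButton(i, pressed);       // report as controller button i
        }
        delay(10);                              // crude debounce / report pacing
      }
      ```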