
Leap Motion implementation possibility


Kariyann71


On 6/21/2021 at 12:36 AM, BIGNEWY said:

Leap motion implementation is in early stages and will be refined and tweaked further. 

 

thanks

This is amazing news...

 

I hope it will be as well developed for desktop users as for VR users (though please, primarily for VR).

Even something as simple as having the mouse cursor move with it when it is placed in front of the throttle or display would help.

And I would like to know whether, in the future, it can be used with the Leap Motion mounted on the VR HMD at a slight downward angle (to maximize access to low-mounted buttons and switches).

 

 

i7-8700k, 32GB 2666Mhz DDR4, 2x 2080S SLI 8GB, Oculus Rift S.

i7-8700k, 16GB 2666Mhz DDR4, 1080Ti 11GB, 27" 4K, 65" HDR 4K.


On 6/20/2021 at 11:36 PM, BIGNEWY said:

Leap motion implementation is in early stages and will be refined and tweaked further. 

 

thanks

 

 

Hi, Bignewy

I think it is essential that there be an option to keep the virtual hands from operating the HOTAS controls in the simulator; otherwise it will be unusable.


My suggestion would be that if the DCS user is using physical HOTAS hardware such as a throttle and stick, something similar in concept to a dead zone should be implemented, though it would be more of a "rest zone". Each user would configure this rest-zone area by putting their hands on their HOTAS controls and moving the throttle and stick axes through their full travel, very much like a normal Windows game-device calibration.
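To make the rest-zone idea concrete, here is a rough Python sketch (all names are hypothetical, nothing like ED's actual code): sample hand positions while the user exercises the physical controls, then ignore virtual-hand input inside a padded box around those samples.

```python
# Hypothetical "rest zone" calibration sketch: record hand positions while
# the user moves the physical HOTAS through its full travel, then treat a
# padded bounding box around those samples as the zone where virtual-hand
# input is ignored.

from dataclasses import dataclass

@dataclass
class RestZone:
    lo: tuple  # min (x, y, z) corner of the calibrated box
    hi: tuple  # max (x, y, z) corner of the calibrated box

    def contains(self, pos):
        return all(l <= p <= h for l, p, h in zip(self.lo, pos, self.hi))

def calibrate_rest_zone(samples, padding=0.02):
    """Build a rest zone from hand positions sampled during calibration.

    samples -- iterable of (x, y, z) hand positions in metres
    padding -- extra margin (metres) so small drifts stay inside the zone
    """
    xs, ys, zs = zip(*samples)
    lo = (min(xs) - padding, min(ys) - padding, min(zs) - padding)
    hi = (max(xs) + padding, max(ys) + padding, max(zs) + padding)
    return RestZone(lo, hi)

# Example: the user moved their hand around the throttle during calibration.
zone = calibrate_rest_zone([(0.1, 0.0, 0.3), (0.15, 0.05, 0.35), (0.12, 0.02, 0.32)])
print(zone.contains((0.13, 0.02, 0.33)))  # hand resting on throttle -> True, ignore it
print(zone.contains((0.4, 0.3, 0.1)))     # hand reaching for a switch -> False, stay active
```

A per-module version would simply store one such box per aircraft profile.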

 

Also, there should be different profiles corresponding to different modules (e.g. Hornet vs. Viper), because some of us use different hardware for different planes, like a center stick vs. a side stick.

 

The principal concept would be that DCS, during this in-game calibration, tracks your hand gestures and "learns", on a per-module basis, exactly what a player's hands on the HOTAS controls look like for an F-18 vs. an F-16, etc. I wouldn't rely on individual finger tracking for that, but rather on the overall hand positioning in relation to the fingers. The reason is that I grab my stick different ways depending on the flight regime, e.g. dogfighting vs. formation flying or air-to-air refueling.

 

Something like that is how I'd suggest approaching the HOTAS hand-tracking problem. I'd also suggest the ability to copy profiles, so you can reuse the same "hand configuration" profile for modules you don't change your HOTAS hardware for, such as the F-18 and AV-8B, or the F-15.


I really hope it is a better integration than DCLeap for SteamVR or the VRK kneeboard with Oculus.
I tried them both. Unplayable.
My rig is not the best, but it runs DCS pretty well and smoothly; the hands, however, stuttered constantly.
Another option would be native Oculus hand recognition... when it arrives, and if it is compatible.
Keep up the good work!


I got my Leap Motion controller today. After reading a lot online and on this forum, I really was not expecting much at all; it was more out of interest than anything else. I had seen ED were starting to support it, but again, fairly low expectations.

However, I was somewhat surprised, and positively surprised at that. I plugged it in straight away: a very simple process requiring one USB plug. I stuck the controller onto the front of my Reverb G2, downloaded the software, and fired up DCS.

Basically, it worked straight away. My hands were visible, moving, and doing pretty much what I told them to do. After a little practice I was able to fairly easily get around the cockpit and activate all the switches. I did a full start-up, a short flight with some A2G missiles while operating a few systems, and a full shutdown afterwards, all just using the Leap Motion. I had no issues with the hands interfering with the controls, as I am using physical controls. As a very early implementation in DCS, I was pretty impressed.

There are certainly plenty of areas for improvement, but based on my test, nothing that shouldn't be possible with some coding and firmware. Unfortunately I have zero experience in any of this and will be relying on community developments and DCS updates. The first thing it probably needs, and which quite possibly exists already, is a simple way to customize the control gestures. The only thing we really need for DCS is to be able to point a finger and click. This is basically possible right now, but there are so many other gestures it picks up that it can get a bit confused and start to wander. I did find it a little too sensitive, so a way to dial that down would be a big improvement. I would ideally like the option of effectively using it just as a mouse: a pointed finger could physically move certain switches and buttons, and for those that can't be reached, a laser pointer from the finger and a simple click would be perfect.

All in all, I am absolutely thrilled DCS has picked this up, and I think if they can make some small adjustments to the implementation it is going to be pretty much the standard for anyone using VR. It has massive potential, and seems to be relatively low-hanging fruit to get working well fairly quickly. For me, the hand controllers are already ditched and I will only be using hand tracking. For those times I can't get it exactly right, I still have the mouse. Can't wait to see how this progresses.


For them to be useful, they need to work identically to PointCTRL: the VR interface is only active for a limited time when you call for it. No matter how well the hand tracking is implemented, if the hands are always active, you will bump controls. In some cockpits, canopy jettison handles and similar controls sit very close to commonly used ones. It is no fun to have your canopy jettisoned, your engine stopped, or some other catastrophic failure triggered by the limitations of VR hand tracking.

 

I haven't tried LeapMotion since it was recently integrated, but I used it for a few weeks using the SteamVR implementation. I also tried DCLeap, which does work a lot more like PointCTRL.  But whether I used VR hands or DCLeap, the controls were too sloppy/inconsistent, especially when using controls that were low or on the side. Depending on the angles, it would be a struggle getting a particular gesture to be recognized correctly. If you keep your hands high up in front, it works fairly well, but that isn't where most controls are located.

 

To date, PointCTRL is the only one that works well enough for me to endorse, because it was designed explicitly for DCS by someone who actually spends a lot of time playing DCS. Even with PointCTRL, you generally have to look at a control to operate it. If you are in a dogfight, it is much easier to reach blindly for a real-world switch than to take your eyes off the situation to look at some control you need to operate.

 

My LeapMotion has spent most of its time in a drawer. I might get it out to see how well the new implementation works. But past experience tells me to have low expectations.


So this is what we have at the moment. It seems promising, although it is at a very early stage.

The tracking is as good as in the Leap Motion demo apps. I'm not really sure how good the hand-tracking module from Pimax is compared to the real Leap Motion one.

I think it would help a lot if a finger could be used to press buttons instead of the pointer. And since a physical setup won't match the aircraft in DCS (that is just impossible unless you are a pit builder), it is necessary to have a keybind on the HOTAS to switch hand tracking on and off during flight. Otherwise it will interfere while you are using your physical controls.

 

Excited to see how it develops.

 


Edited by darkman222

I already have mine; I could not resist. The truth is... I am impressed, and not disappointed to be on one of the first iterations of the implementation. The integration of course has to improve enormously, but as BIGNEWY has said, it is in the early stages.

I have been able to start the F-18C (with the odd problem along the way), and the handling of the MFDs and their buttons alone already makes it worthwhile.

 

On 6/20/2021 at 11:36 PM, BIGNEWY said:

Leap motion implementation is in early stages and will be refined and tweaked further. 

 

thanks

 

 

 

Love it!

 

Here's the first video demo XD:

 


YouTube Channel


Update: MSI Z790 Tomahawk, i9 13900k, DDR5 64GB 640 MHz, MSI 4090 Gaming X Trio, 970 EVO Plus 1TB SSD NVMe M.2 and 4 more, HOTAS TM Warthog, Meta Quest Pro


  • ED Team

Nice 🙂 

 


 

Yes, I have some reports open to get it refined; as soon as I have more news I will let you all know, but I think it has potential.

 

Thanks for the feedback, all. We will continue to work on it.

 

 


Forum rules - DCS Crashing? Try this first - Cleanup and Repair - Discord BIGNEWY#8703 - Youtube - Patch Status

Windows 11, NVIDIA MSI RTX 3090, Intel® i9-10900K 3.70GHz, 5.30GHz Turbo, Corsair Hydro Series H150i Pro, 64GB DDR @3200, ASUS ROG Strix Z490-F Gaming, HP Reverb G2


13 hours ago, BIGNEWY said:

Yes I have some reports open to get it refined, as soon as I have more news I will let you all know but I think it has potential.

I am also using the G2, and like others I am very interested to see how this develops. I am not disappointed so far as a very early adopter, and see it very much as a work in progress to help improve our community. As I mentioned in my previous post, the potential is definitely there and we can already see the possibilities; in my view its implementation just needs to be simplified somewhat. For what it's worth, I am already using it with a certain amount of success in DCS.

After the summer I might dive into a bit of coding to see what is possible, although my experience in this is more than a little limited! I would start simply, with perhaps just two gestures and a mouse-type function, and reduced movement sensitivity. My suggestion would be a pointed index finger as the gesture to show the laser pointer, a thumb movement as a right click, and a trigger gesture as a left click. One further gesture may be needed to hide the hands or disable the function to prevent control interference.

I definitely feel this is not as far from being a fairly complete solution as some have described on various forums. Keep up the good work, and please keep posting feedback and updates. There are many of us who are extremely positive and realistic about the progress of this.


Edited by TED

I think that if you move the joystick or throttle axis, the hands should snap to that control; this would eliminate the need to position your controls exactly as in the cockpit. Maybe add an adjustable deadzone range so micro-movements won't cause the hands to inadvertently snap to the controls.
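A rough Python sketch of that snap logic (hypothetical names, not anything from DCS): the hand only snaps when the physical axis moves more than an adjustable deadzone since the last deliberate input, so jitter is ignored.

```python
# Hypothetical snap-to-control filter: only snap the virtual hand to the
# stick when the physical axis moves more than an adjustable deadzone,
# so sensor jitter and micro-movements don't grab the hand.

class SnapFilter:
    def __init__(self, deadzone=0.05):
        self.deadzone = deadzone  # minimum axis delta that counts as movement
        self.last = 0.0           # last axis value accepted as deliberate

    def should_snap(self, axis_value):
        moved = abs(axis_value - self.last) > self.deadzone
        if moved:
            self.last = axis_value  # only track "real" movement
        return moved

f = SnapFilter(deadzone=0.05)
print(f.should_snap(0.01))  # jitter, below the deadzone -> False, no snap
print(f.should_snap(0.20))  # deliberate stick input -> True, snap hand to stick
```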


F-14B, F-16, F-18C, A-10C, F-5E, F-86, FC3, BF-109, FW-190, P-51, Spitfire, UH-1,AJS-37 Viggen, MIG-15, MIG-19, MIG-21, AV-8B Harrier, P-47D

Persian Gulf, Caucuses, NTTR, Normandy, The Channel, Syria

Combined Arms, WWII Assets,Super Carrier

TM Warthog, Virpil VFX,BuddyFox UFC, Saitek Pro Flight quadrant & Switch Panel, Odyssey+ VR, Jet Pad w/ SSA, Voice Attack w/Viacom Pro

GeForce RTX2080TI OC, Intel Core i7-7700K 4.5Ghz, 64GB DDR4, Dedicated 1TB SSD


I am not really sure about that. Every cockpit in DCS is different, and your setup at home stays the same. Take the Viper, for example: you can see in my video above that my physical throttle is outside the cockpit, and my stick sits in the DED because I have a center-stick setup. Snapping the hands from the Viper's side stick to a center stick will end in weird results.

A disable/enable hand tracking bind on the HOTAS would be very useful.

Another problem in the Viper is that the MFDs are physically so much further away than the DED in front of me that I have to grab below my desk, which breaks Leap Motion tracking because my hands disappear.

I have not tested other aircraft, but I am sure there are other issues you'd run into.

Maybe an approach like the one used for PointCTRL, where you customize the areas where you'd expect certain buttons, would be useful. I don't know how much manpower ED will put on this project anyway; not sure how many Leap Motion users DCS will have in the future.


Edited by darkman222

I have just bought the hand-tracking module for my Pimax 8KX after reading this thread and seeing the progress being made by ED on this subject.

I haven't tried it yet, since the module arrives on Sunday.

I am not expecting it to be perfect, and probably not even usable at this point, having watched other users' videos, but it's very promising. I also agree there must be some kind of on/off switch for the hand tracking to avoid interference with the physical flight controls, given that they won't be placed exactly where they should be.

Great work, ED, keep it up!


I tried it out and have a couple of thoughts:

As silly as it may seem, it would be nice to have the option to disable interaction and just render the hands, so those with sim pits can visualize hand position without inadvertently blowing the canopy while trying to lower the hook. Those with 1:1 pits would benefit the most, as you wouldn't have to feel around for a switch if you can see where your hand is going.

 

Having an option for the hands to cast shadows in the cockpit would be great, so I could block the glare of the sun on a critical MFD or dial that I'm having trouble seeing otherwise.


A keybind for turning them on and off, like the control stick or pilot body, would be nice and super useful.

As I mentioned above, if you move the joystick or throttle axis, the hands should snap to that control and stay there. Let us set an adjustable dead zone to keep micro axis movements from snapping your hands inadvertently, and also set how far your hands have to move before they unlock from the control stick.

If you have the stereo parser set to TRUE in autoexec.cfg, the hands don't render at all.

 

Aiming at switches and buttons is kind of difficult to hold steady; it would be nice if whatever you're targeting provided a little aim assist. It's still not really clear what causes it to register a click: sometimes turning my wrist, sometimes just moving the thumb. I noticed that if my finger was directly on a switch and I moved the finger, it would press a button or throw the switch. Kind of dangerous.

Set the aim point to the tip of the index finger rather than a laser pointer, and make it "active" only when the index finger is extended and the other fingers are curled.
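That gate is easy to express in code. A rough Python sketch (the data shape here is made up, though the Leap SDK does expose per-finger extension info):

```python
# Hypothetical pointing-gesture gate: the fingertip cursor is only live
# when the index finger is extended and the middle, ring and pinky are
# curled, so an open or resting hand can't click anything.

def is_pointing(extended):
    """extended -- dict of finger name -> bool (True = finger extended)."""
    return (extended["index"]
            and not extended["middle"]
            and not extended["ring"]
            and not extended["pinky"])

open_hand = {"thumb": True, "index": True, "middle": True, "ring": True, "pinky": True}
pointing  = {"thumb": True, "index": True, "middle": False, "ring": False, "pinky": False}
print(is_pointing(open_hand))  # open hand -> False, cursor off, no accidental clicks
print(is_pointing(pointing))   # pointing -> True, cursor live at the fingertip
```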

I'll continue to monkey around with it. It's not there yet, but it could be good.

Separate from DCS, Leap almost needs a second stationary sensor to track the hands when they're out of your viewing cone.

 


On 7/10/2021 at 1:45 AM, LASooner said:

Separate from DCS, Leap almost needs a second stationary sensor to track the hands when they're out of your viewing cone.

 

It would be great if it were possible to buy more sensors and have them synced. The technology seems so good, yet so underutilized and unpolished, even after quite a long time.


I finally had a chance to try the Leap Motion with my Pimax 8KX.

I just tried the F-16, and I am easily able to push all the ICP pushbuttons, all the MFD pushbuttons, and basically every other switch on the main instrument panel with my finger, so this is very good.

As for the left and right consoles, I am unable to operate most of the switches.

I almost have the feeling that the hands sit higher in the 3D world than they should, so I lose hand tracking before I can actually reach the switches on the consoles; the shadowing created by the stick and throttle doesn't help either.

I know this is a work in progress and I am not discouraged at all; I am quite positive this will be amazing with a bit more development.

Also, my virtual hands often take control of the aircraft, and there should be a way to deactivate this.

But all in all, great potential...


Agreed. Most important: a HOTAS keybind to activate and deactivate it. Also some basic functionality to calibrate the distance to the cockpit. In the DCS F-16 the ICP is easily reachable, but the MFDs unfortunately sit 10 cm inside the desk in front of me. No problem for pit builders, but a problem for desktop users.

Same for me, like you said, with the side panels. I think it's the chair, or even my shoulder, that blocks the camera's view. I would consider that a design flaw of the Pimax, because the camera is mounted under the headset, close to my mouth.

Using multiple cameras to track the hands would be awesome, but I think the Leap Motion programmers would be in charge of that, not ED.


Edited by darkman222

9 hours ago, darkman222 said:

 

Regarding the hand tracking, yes, I think it's also the chair that blocks the camera, but in my opinion that happens because the hands are rendered too high compared to where they should be. You need to lower your hand a lot to reach the side consoles, and that is why tracking is lost: you keep lowering your hand until, at a certain point, as you said, the chair or the HOTAS masks the tracking.

And as you said, a calibration function would be great; it would solve all these problems. Or simply a reset-VR-hands function, similar to our current VR view reset.

With the VR view reset we have in DCS now, we can not only recenter the view; by moving forward or backward and recentering, we can bring the cockpit closer to us or push it further away.

It would be great to have a feature that works in a similar way for the hands only.

Would be great to know other people's thoughts and ideas on this.

 


1) The hand scaling is wrong. Everyone should hold their hands in front of them with fingers spread, then touch matching fingertips together (index finger, thumb, etc.). Right now the glove fingers are about 2 cm too long. The glove 3D model should scale to the finger positions and palm center so the virtual and real fingertips always match when they touch.

2) Tracking requires tweaking. It is currently very jumpy and gets confused between fingers very easily. As said, we need a calibration option; even in my short test period I have had the laser beams pointing 90 degrees in the wrong direction.

3) Deadzones. As LASooner suggested, when the physical controls are moved, the virtual hands should be turned off or snapped to them. The potential is that one can operate the cockpit without any extra work: just reach for the button and be done.

4) Just as with the touch controllers, remove the laser pointers. Just forget them. Make the fingertips (thumb and index finger at least) the active parts, so you only need to touch things. The icing on the cake would be switches that move away from the side they are touched on, or that simply toggle state on each touch. So if a switch is left/right, pushing from the right would move it to the left. But the easiest is to make each touch flip the switch or button, with a pause between activations (say 200 ms) so the switch doesn't start repeating at high speed.

If we don't have laser beams, then we don't need to worry about accidental touches either. So make the laser beam optional.

5) Design the system so it can be activated and disabled without a mouse and keyboard. I don't have a mouse or keyboard anywhere near the flight chair, so it was annoying jumping back and forth to disable the touch controllers, get the hands working, and even get the menu to open. Which leads to gestures.

6) We need some clear, intentional gestures for basic things like the menu. The easiest way, really, is what other VR games do: a watch or something on the wrist with buttons, so the menu is always accessible. Want to make it really fancy? Make a wristwatch that shows the current (real) time; touching the watch with the other hand opens the DCS menu. Even activation/deactivation could be done with gestures, like spreading the fingers to toggle that hand on/off, or requiring the index finger straight or thumb up to keep a hand temporarily active/visible, otherwise disabled.

Where I was disappointed was that I couldn't get the fingertips to work. The only things that really matter for me to operate cockpits are steady hands and the ability to push things with a finger. That requires accuracy, with fingers properly tracked like in the Orion demos.

Very good potential to offer a "controller-free" cockpit. It requires work, but with effort and good design it will succeed.
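The 200 ms pause described in point 4 is essentially a debounce. A rough Python sketch of that behavior (a hypothetical helper, not DCS code):

```python
# Hypothetical debounced switch: a touched switch flips once, then ignores
# further contact until the hold-off expires, so a finger resting on it
# doesn't make it repeat at frame rate.

class DebouncedSwitch:
    def __init__(self, holdoff=0.2):
        self.holdoff = holdoff   # seconds to ignore repeat touches
        self.state = False
        self.last_flip = -1e9    # time of the last accepted flip

    def touch(self, now):
        """Call on every frame the fingertip contacts the switch."""
        if now - self.last_flip >= self.holdoff:
            self.state = not self.state
            self.last_flip = now
        return self.state

sw = DebouncedSwitch(holdoff=0.2)
print(sw.touch(0.00))  # first contact flips the switch on -> True
print(sw.touch(0.05))  # still touching 50 ms later: ignored -> True
print(sw.touch(0.30))  # contact after the hold-off: flips off -> False
```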


On 7/17/2021 at 7:08 PM, Fri13 said:

+1000



@Fisu_MAD how did you get it working so that you can actually touch buttons with your finger, instead of using the fist as a laser pointer?

Even after yesterday's update I can't do it. It looks like maybe a checkbox is missing in the Leap Motion options.

I have "enable" and "arm visible" activated.

