

Posted (edited)

This $99 Jetson Nano neural net development kit seems like the perfect platform to develop a finger tracking system for VR. Based on the image tracking examples shown in the YouTube video below, it seems that it would be possible to track ten fingertips from multiple cameras in real time and map them in 3D to the clickable areas of a virtual cockpit. It would offload the heavy-duty image recognition and tracking from the game PC and then just pass the 3D coordinates to DCS through a custom mouse driver or perhaps even DCS-BIOS. Since neural nets learn, it seems like you could train it for the different DCS cockpits and it would improve over time.

 

https://www.youtube.com/watch?v=RpO_a10QmLk
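Just to make the data flow concrete, here is a minimal sketch of what the Jetson side of such a setup might look like, assuming the Nano streams fingertip coordinates to the game PC over UDP. This is not code from the video: detect_fingertips() is a placeholder for whatever network you would actually train on the Nano, and the game PC address, port, and packet format are made up for illustration.

```python
# Hypothetical Jetson-side sketch: stream fingertip coordinates to the game PC over UDP.
import json
import socket
import time

GAME_PC = ("192.168.1.50", 9000)  # assumed address/port of the DCS machine

def detect_fingertips():
    """Placeholder for the Jetson-side neural net; a trained detector would go here.
    Returns a list of (finger_id, x, y, z) tuples in tracking space."""
    return [(0, 0.12, -0.03, 0.45)]  # dummy data so the sketch runs

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    tips = detect_fingertips()
    packet = json.dumps({"t": time.time(), "tips": tips}).encode("utf-8")
    sock.sendto(packet, GAME_PC)  # game PC maps these onto clickable cockpit areas
    time.sleep(1 / 30)            # roughly match a 30 FPS tracking rate
```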

 

Hey Miles, how about a "Touch Control II using Neural Nets"?

Edited by TWC_Alamo


TWC_Alamo

Denver, CO

 

Military Flight Sim

I7-7700K, 4.9 GHz, Z270-Gaming MB, 16GB, 512GB EVO-960 NVMe M.2, 512GB WD Black NVMe M.2, 1 TB SSD Raid, EVGA RTX 2080ti, Samsung Odyssey Plus, TM HOTAS/MFDs, MFG Crosswinds, Gametrix 908 JetSeat

 

GA Flight Sim

I7-5820K, 4.2Ghz, Godlike Carbon MB, 16GB, 512 GB EVO 960 NVME M.2, 2 X SSD, EVGA 1080ti, HTC Vive, 3 X 4K 55" TVs, 4 X 27" Monitors, CH: Flight Yoke, Throttle Quadrant, Rudder Pedals

Posted
Why not just use Leap Motion?

 

Doesn't work in DCS, AFAIK.

New hotness: I7 9700k 4.8ghz, 32gb ddr4, 2080ti, :joystick: TM Warthog. TrackIR, HP Reverb (formerly CV1)

Old-N-busted: i7 4720HQ ~3.5GHZ, +32GB DDR3 + Nvidia GTX980m (4GB VRAM) :joystick: TM Warthog. TrackIR, Rift CV1 (yes really).

Posted
This $99 Jetson Nano neural net development kit seems like the perfect platform to develop a finger tracking system for VR. Based on the image tracking examples shown in the YouTube video below, it seems that it would be possible to track ten fingertips from multiple cameras in real time and map them in 3D to the clickable areas of a virtual cockpit. It would offload the heavy-duty image recognition and tracking from the game PC and then just pass the 3D coordinates to DCS through a custom mouse driver or perhaps even DCS-BIOS. Since neural nets learn, it seems like you could train it for the different DCS cockpits and it would improve over time.

 

https://www.youtube.com/watch?v=RpO_a10QmLk

 

Hey Miles, how about a "Touch Control II using Neural Nets"?

 

Interesting stuff, thanks for the PM about it. I will probably look at that for future projects. As far as PointCTRL goes, it's meant to be simple and to have something to actually press.

The original version used 3 accelerometers and a 3D gyro for sensing the finger moving a switch. It just didn't feel right, and neither did the touch-pads or even dome switches. I am working with some different tech for future versions, but that's only if nothing better comes out before then.

Now shipping up to website Pre-Order Form date 2022/11/15

Pre-Order Form Submission https://pointctrl.com/preorder-form/

PointCTRL Support Discord https://discord.gg/jH5FktJ

PointCTRL Website https://pointctrl.com/


Posted

Also, the Leap uses the PC's processing power, as I understand it. Neural nets are great for image recognition. The YouTube video, at about 6:20, shows the NVIDIA DeepStream application running on the Jetson Nano, tracking multiple objects at 30 FPS from 8 simultaneous H.264 video streams.
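To illustrate the "pass the 3D coordinates to DCS" half of the idea, here is a rough Windows-only sketch of the game-PC side, assuming packets in the format of the Jetson sketch in the first post. The port and project_to_screen() mapping are made up for illustration; a real version would need the headset pose and cockpit geometry, or would talk to DCS-BIOS instead of just moving the cursor.

```python
# Hypothetical game-PC side: receive fingertip packets and move the Windows cursor.
import ctypes
import json
import socket

LISTEN = ("0.0.0.0", 9000)  # port the Jetson-side sketch sends to (assumed)

def project_to_screen(x, y, z):
    """Hypothetical mapping from tracking space to desktop pixels.
    A real version would use the headset pose and DCS cockpit geometry."""
    return int(960 + x * 1000), int(540 - y * 1000)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(LISTEN)
while True:
    data, _ = sock.recvfrom(4096)
    tips = json.loads(data)["tips"]
    if tips:
        _, x, y, z = tips[0]  # this sketch only follows one fingertip
        sx, sy = project_to_screen(x, y, z)
        ctypes.windll.user32.SetCursorPos(sx, sy)  # stands in for the "custom mouse driver"
```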

 

 

 

Why not just use Leap Motion?

 

Doesn't work in DCS, AFAIK.


TWC_Alamo

Denver, CO

 

