Everything posted by Wyverex

  1. Hi, thanks for the feedback! There is no LeapMotion support in IvyVR as of now; that might be added later, although it would always suffer from the missing haptic feedback, which is actually a huge deal. But I can see the appeal of using LeapMotion. The reason `L DEVICE` and `R DEVICE` are showing is that you presently need to use VR controllers with IvyVR. Default bindings for a broad range of controllers are included. Also, the presence of `MODULE` indicates that IvyVR isn't talking to DCS properly. This could be related to you using the Steam version of DCS, which I've never used before. Where is the user data located in the Steam version? In Standalone it's in `Users\<Username>\Saved Games\DCS.openbeta`. I probably need to update the installer to put in the correct path for Steam. For now you should be able to fix the issue by copying the `Scripts` folder from that directory into the directory that Steam uses for its user data; it contains a Lua script that IvyVR uses to communicate with DCS.
  2. I've updated IvyVR to 2.1.0:
     - Fixed IvyVR not starting up on various systems. Unfortunately I didn't know about this until very recently. Please let me know if you still can't start it!
     - Pimax Crystal controllers should work correctly. There's a bug in the PimaxXR runtime that I had to work around until it's fixed properly in the runtime.
  3. Previously it only worked through OpenVR (which implied using SteamVR). Now it works only through OpenXR (and as an Index user that means you're still on SteamVR, just via its OpenXR runtime).
  4. Good news, IvyVR 2.0.0 now supports OpenXR natively! SteamVR is no longer required unless your headset needs it. Since my Pimax Crystal is still shipping, I've only been able to test this on my Index so far (which does require SteamVR). However, it should work on all OpenXR runtimes and with a large variety of controllers. Give it a try and please let me know if anything goes wrong! The website has been updated; you can download the new version here: https://ivyvr.net/download/
     Some other nice benefits in this new version:
     - Recentering now works using the same key as DCS (Ctrl+F12). Currently this is hardcoded, but I'll add keybinding configuration later if there's any demand for it.
     - Better performance and reduced input lag. IvyVR now runs in lockstep with the DCS frame loop, which feels more responsive.
     - Hint textures are only as big as they need to be.
  5. I actually started out with a similar concept. I bought one of these 3D touch pads and wanted to use that to control the mouse with finger gestures: https://www.microchip.com/en-us/development-tool/DM160225 But then I found out that you could render overlays into any VR game using OpenVR and that changed everything for me!
  6. Quick update since I've been silent for a while. I've changed my mind and instead of first supporting the remaining aircraft, I've started the OpenXR port. Progress is good and it looks like everything will work just fine (with probably a few snags we'll have to live with). But it's still a lot of work, so it might take a few weeks until I can release it. Then IvyVR should work with just about any VR headset out there and will no longer require SteamVR to run (except if your headset requires it).
  7. Possibly! However, I want this to work on as wide a range of headsets and controllers as possible. The point is that you shouldn't have to invest in special hardware just to get a feature you should already have access to with your motion controllers. I agree that picking up your controllers whenever you need them is still a bit cumbersome. I've gotten quite used to resting them in my lap and picking them up when I need them, but my endgame would currently be something like this: https://www.instructables.com/Etextile-VR-Gloves-for-Vive-Tracker/ It would be very easy to support in IvyVR. But before I start a project like that, I want to get OpenXR working first.
  8. For the most part, your controllers and your HOTAS don't get in each other's way, especially if your stick is located where it should be in the cockpit. I have my stick mounted to the right side of my chair, so for most aircraft it could obstruct access to the right console; in practice I haven't found that to be an issue, though. I also have a lateral offset configured (due to my armrests), which helps further with clearance. Sure, sometimes you need to adjust your position a bit to reach a certain switch, but I haven't found that to be a problem so far!
  9. I've always wanted to feel more immersed when playing DCS, especially after I switched to VR (~2.5 years ago). Back then we still had laser pointers to interact with switches, and that just didn't feel good enough. I wanted to reach out and do it instead of just pointing at a switch from a distance. That's what motion controllers were always meant for, so I wanted to make use of that. While hand controller support has gotten much better since, it still leaves a lot to be desired; it still doesn't feel good enough.
     That's where IvyVR comes in. It's an external program that projects two crosshairs into your cockpit and allows you to perform natural gestures on all available switches. Best watch the video to see how it works:
     I had a first working prototype in early 2021, but then life got in the way, I stopped playing DCS, etc. Now IvyVR is finally ready for release. I hope you enjoy it as much as I do! I couldn't imagine ever going back to using the mouse in VR.
     Note that IvyVR is not meant to replace your HOTAS; I still use mine and wouldn't give it away. But you can't possibly bind every cockpit function to HOTAS buttons and, frankly, I wouldn't even want to. There is an added level of immersion in physically reaching out to do something, even though VR currently can't give you the full haptic feedback of flicking a switch.
     Download here: https://ivyvr.net
     Now fully on OpenXR, you only need SteamVR if your headset requires it! I'm committed to making IvyVR work on as wide a range of headsets and controllers as possible, so if you have a combination that doesn't work, please let me know in this thread.
  10. I love this campaign! I played the first two Maple Flag training campaigns and they were already great, but this is an entirely different league. Thanks so much for creating this! I tend to agree with the sentiment about the DIVERT page. Given that the missions already go into so much detail about even fairly obscure things like the IFFCC Test menu, I'm actually surprised that the DIVERT page isn't mentioned at all when going in for the landing in Silverbow; that orbit time would be the perfect opportunity. I had totally forgotten about that page while flying the mission and was wondering how I'd know once I'm 5 miles out from the airfield. Feels like an inconsistency. I was considering manually stepping through all the waypoints to find the correct one in time. In the end I just winged it and it was OK.
  11. I had the same thing happen to me today. Biff flew all the way back to Moapa, turned, and then followed the FP to land. Things I might have done to cause this:
      - I landed on 21R and took a while to clear the runway.
      - I stopped between 21L and 21R because I wasn't sure how far to go. After a while I continued further beyond 21L until I saw the UI message to wait there.
      During my landing approach, Biff also stayed beside me for a long time and oscillated wildly up and down; I don't think he was orbiting around the truck stop. I didn't observe him any further after that until I was on the ground. I guess that's when he started back on his FP.
  12. Thanks guys! There's some very helpful stuff in here. I'm already banking towards the TGP, and I've noticed that the Sidewinders you sometimes get in the mission exacerbate the problem. I think my main problem is that I'm too low and too close. There's always a sense of urgency after a dive and coming back up again that probably causes me to not separate far and high enough to set myself up for success on the next run. I'd love to do that more. Unfortunately, with the limited resolution in VR, looking out the window doesn't give you much in terms of SA when it comes to ground targets. Even when diving in, I only see targets very late, so I'm actually very dependent on having a good TGP fix and then diving towards the SPI marker to get on target. I hope that changes when I upgrade from my Index to a Crystal later this year. The HMCS and TMS right long have actually been a lifesaver for me; I couldn't live without them anymore. But I haven't really used mark points a lot yet. I should probably start to, as they sound really useful when employed like you describe. I think I'll have to load up a practice mission and just try to keep a target in sight for as long as possible to get a better feel for what I'm doing.
  13. I've recently started playing the amazing The Enemy Within 3.0 campaign and I keep running into the same issue over and over. I anchor over a target area using ALT hold to search for targets, and half of the time I spend looking at a masked TGP. So I disengage hold mode and start maneuvering to keep the targets in sight; sooner or later that makes me oscillate wildly while being heads down dealing with the TGP. The worst part is trying to track moving targets, like three trucks that I have to destroy before they reach a target. They are hard to spot anyway (especially in VR), and I constantly lose sight of them while my TGP is masked. This is extremely stressful, and I waste so much time searching for them again after the TGP is unmasked. It gets even harder in the mountainous areas in the campaign, where I have to watch out not to fly into the mountains, and on top of that I regularly mask targets simply by having a mountain or trees in between.
      At this point I'm sure I'm doing something fundamentally wrong. Keeping your targets in sight is surely a basic skill that everything else builds upon. What are your techniques to minimize TGP masking time? (I doubt you can prevent it entirely, especially during bombing runs, since you have to reverse at some point.) For example, is there a sweet spot for both the bank angle and the distance to the target area while anchoring?
  14. No matter how I run DCS these days (ST, ST + force_steam_VR, MT), I can never get the SteamVR dashboard to show while in a mission. It works fine in the menu but not in game; the image just freezes. Does anyone else have this?
  15. When you download the ALVR v20 nightly for the streamer, there is also a generic alvr_client_android.apk included. Download that on your Pico and use that as the client. I did that yesterday and I had good performance
  16. ALVR v20 + ST + --force_OpenXR does indeed work. However, I can get the same result by just not using --force_OpenXR in ST and going through Streaming Assistant (or VD). I'd love to use MT though. Using ALVR v20 + MT gets rid of the extreme jankiness, but I also get a massive FPS drop if any overlay is active, so unfortunately this isn't a solution either.
  17. It seems that using OpenXR is the culprit here. Everything is fine if I start DCS single-threaded, without any arguments. If I add --force_OpenXR, I see jankiness with overlays. The same goes for the multi-threaded version (which always uses OpenXR as far as I know). With DCS obviously pushing towards both MT and OpenXR, that makes me wonder whether overlays will work again in the long term. There's no official OpenXR overlay support (and it's probably not going to come either), and even OpenKneeboard with its custom API layer has the same jankiness, at least on the Pico. I'll check out ALVR, thanks!
  18. So far I've played using a Valve Index, and I'm currently trying out a Pico 4. I'm using OpenKneeboard, which works fine with my Index. If I use the Pico 4 without OpenKneeboard (either with Streaming Assistant or Virtual Desktop), everything is smooth. As soon as I start OpenKneeboard, DCS gets extremely janky: while the frame rate seems to be more or less the same, there's extreme lag and it looks like both eyes have different refresh rates. It's basically unplayable. The problem goes away as soon as I close OpenKneeboard. The same happens with any other application that renders overlays into the scene. Does anyone else have this issue?
  19. For posterity, in case other people run into this: I struggled with this as well when I converted a mod to use the new Hooks approach instead of the old Export script approach. GetDevice() is available in the Export namespace, so Export.GetDevice() should do what you want.
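      A minimal sketch of what that looks like from a hooks script. The device id, command id and argument number below are made-up placeholders (you'd look up the real ones in the aircraft's devices.lua and clickabledata.lua), and I'm assuming the usual performClickableAction / get_argument_value methods on the returned device:

      ```lua
      -- Somewhere inside a hooks callback. All numbers are placeholders; look up
      -- the real ids per aircraft in devices.lua and clickabledata.lua.
      local function flipSomeSwitch()
          local dev = Export.GetDevice(1)               -- hypothetical device id
          if dev then
              dev:performClickableAction(3001, 1.0)     -- hypothetical command id and value
          end
      end

      local function readSomeGauge()
          local mainPanel = Export.GetDevice(0)         -- device 0 is the cockpit main panel
          if mainPanel then
              return mainPanel:get_argument_value(404)  -- hypothetical draw argument
          end
      end
      ```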
  20. Hi, I'm currently exporting data from both `LoGetCameraPosition` and `LoGetSelfData` to calculate the camera position within the cockpit:

      ```lua
      local selfData = LoGetSelfData()
      if selfData ~= nil then
          local cameraPos = LoGetCameraPosition()
          if cameraPos ~= nil and cameraPos.p ~= nil then
              local toSend = {
                  pos = selfData.Position,
                  h = selfData.Heading,
                  p = selfData.Pitch,
                  b = selfData.Bank,
                  c = cameraPos,
                  t = socket.gettime()
              }
              Ivy.sendCommand("cam", toSend)
          end
      end
      ```

      On the receiving end, I calculate a quaternion from the rotation data and transform the camera position relative to the aircraft position. This generally works fine, and I get correct results as long as the aircraft is standing still. However, in flight my results vary by up to 0.6 m (when flying at 160 km/h, for example). These two exports show such a difference along the X axis (which points forward in DCS):

      [2023-03-29 19:53:39.313] [IvyVR] [info] cam_ac (3.015137387800653, 0.3236163383971928, -0.0058042729454039765)
      [2023-03-29 19:54:12.014] [IvyVR] [info] cam_ac (2.416017720146929, 0.23703545435031226, -0.02618762237694683)

      (Note that my relative camera position didn't change except for minute movements of my head, which is reflected in the slight differences along the other axes.)

      A likely explanation would be that the results of selfData and cameraPos come from two different points in time, between which the aircraft has already moved a bit further, which is why there's a gap between them. I export both within the same tick though, so my assumption was that the data should be internally consistent. Does an export script run on a different thread, meaning there's enough of a time gap between LoGetSelfData and LoGetCameraPosition for them to return results from different game ticks? Is there a way I can get consistent results?
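      For what it's worth, the numbers would fit a one-tick offset: 0.6 m at 160 km/h (about 44 m/s) corresponds to roughly 13 ms, i.e. about one frame at ~75 FPS. If that's really what's happening, one possible (untested) workaround would be to also export the aircraft velocity and model time, so the receiver can shift the position by velocity * dt; this assumes LoGetVectorVelocity() and LoGetModelTime() are permitted in this export environment:

      ```lua
      -- Sketch only: additionally export world-frame velocity and model time so the
      -- receiver can compensate if the camera and self data are one tick apart.
      local selfData = LoGetSelfData()
      local cameraPos = LoGetCameraPosition()
      local velocity = LoGetVectorVelocity()        -- world-frame velocity in m/s
      if selfData and cameraPos and cameraPos.p and velocity then
          Ivy.sendCommand("cam", {
              pos = selfData.Position,
              h = selfData.Heading,
              p = selfData.Pitch,
              b = selfData.Bank,
              c = cameraPos,
              v = velocity,                         -- receiver can correct pos by v * dt
              mt = LoGetModelTime(),                -- sim time of this export tick
              t = socket.gettime()
          })
      end
      ```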
  21. Is it possible to create a new command that appears in the "Adjust Controls" dialog as part of a mod? Ideally you could define a callback function in Lua that's called when the player presses the assigned key binding. The only thing I've found so far (in DCS-SRS) is `window:addHotKeyCallback(srsOverlay.config.hotkey, srsOverlay.onHotkey)`. That works, but it's tied to a window and the key is hardcoded into the plugin. I'd like to expose it to the player (ideally without needing a window).
  22. While going through clickabledata.lua files, I'm having a hard time figuring out how the LEV class type defines how many steps are available in game. There always seems to be an example that contradicts every rule I come up with. These are the basic rules as I understand them:
      - class defines what type of element it is. If there is more than one, they seem to map to left mouse button, right mouse button and mouse wheel, in that order.
      - arg_lim defines the values the element can switch between for that class, basically a min and max value.
      - arg_value defines the step size that one activation advances. So for an arg_lim of {0, 1} and an arg_value of 0.1, you'd have 11 switch positions/detents.
      This seems to work well for the TUMB and BTN classes. For the LEV class there is also a gain value. My assumption so far was that it gets multiplied into the arg_value, but that doesn't really hold up when looking at different examples. Let's take the A-10C:
      PNT-LV-HSI-HDG: the heading select knob on the HSI has very fine precision; using the mouse wheel you can turn it by about a degree per detent. Expanding the definition gives arg_lim = {0, 1}, arg_value = 1, gain = 0.1. From that example alone I can't infer in any way how many steps the element has; multiplying gain into arg_value would suggest 10 steps, which is obviously wrong.
      PTR-TACAN-CHANNEL-SELECTOR-2: three different classes; the mouse wheel responds to LEV and controls the last digit of the TACAN channel. This one has a fixed number of 10 steps: arg_lim = {0, 1}, arg_value = 0.1, gain = 0.1. Assumption: an arg_value that isn't 0 or 1 defines an actual step count (and the gain is somehow ignored here). But we immediately find a counterexample:
      PTR-HARS-CP-PUSH-TO-SYNC: the LEV values are arg_lim = {0, 1}, arg_value = 0.5, gain = 0.1. Although it has an arg_value of 0.5, you have very fine control, there are no discernible detents, and it basically behaves like the heading knob.
      PTR-SASP-YAW-TRIM: the yaw trim rotary on the left console has arg_lim = {-1, 1}, arg_value = 0, gain = 0.1. An arg_value of 0 doesn't make much sense to me. For this one I actually counted the steps in-game and came up with 67; it doesn't have a full 360 degrees of range, so I assume the full range could be around 100 steps.
      So at this point I'm pretty much at a loss as to how you can tell from these definitions how many detents a LEV class control has. I'm pretty sure this isn't defined by the animation keyframes in the cockpit model either, since those just define the physical/visual range of motion and not how they relate to input. Does anyone know how this works? There are some properties I don't quite understand yet, but I don't think they are related to the precision; they seem to have a visual effect instead: use_OBB, side, attach_left, attach_right.
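      Just to make those counterexamples concrete, here's a tiny helper that encodes my current rule of thumb (step size = arg_value * gain, step count = range / step size). It's only as good as the rule, which, as the comments show, already breaks on most of the entries above; all numbers are the ones quoted from the A-10C clickabledata.lua:

      ```lua
      -- Encodes my (apparently wrong) working rule for LEV elements:
      -- step size = arg_value * gain, step count = (max - min) / step size.
      -- Returns nil when the rule can't produce a number (e.g. arg_value == 0).
      local function lev_steps(def)
          local min, max = def.arg_lim[1], def.arg_lim[2]
          local step = def.arg_value * def.gain
          if step == 0 then return nil end
          return (max - min) / step
      end

      print(lev_steps{arg_lim = {0, 1},  arg_value = 1,   gain = 0.1})  -- 10: HSI heading knob, far too coarse
      print(lev_steps{arg_lim = {0, 1},  arg_value = 0.1, gain = 0.1})  -- 100: TACAN digit, actually 10 in game
      print(lev_steps{arg_lim = {0, 1},  arg_value = 0.5, gain = 0.1})  -- 20: HARS sync, feels continuous
      print(lev_steps{arg_lim = {-1, 1}, arg_value = 0,   gain = 0.1})  -- nil: yaw trim, ~67 counted in game
      ```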
  23. Really enjoying these campaigns! I flew practice mission #04, did three bombing runs (one for each target) and followed the egress waypoints correctly. I got all the instructions for fencing out, hard deck, landing and parking as usual. But when I park in the spot I usually park in and shut down the bird, I don't get any voice-over from the instructor. I waited for a few minutes, but it seems the mission is stuck. When I then leave the mission, I only get a score of 50, so I didn't pass. Is this a bug or did I miss a requirement? How many bombing runs do you have to do to pass? Is one run sufficient? I did three to make sure, but the briefing doesn't really make it clear. If this is a bug, how can I manually progress the practice campaign to work around it?
  24. I have the same issue. Everything is pitch black. Is this intended? I attached a zoomed-out F2 view after having landed. In weather like this I'd expect the runway to be lit up like a Christmas tree to guide airplanes home; when flying by procedure you wouldn't even be allowed to land, since you can't see the runway. I barely managed to land on RWY 25 in the practice mission. Not sure I want to try in the real mission :shocking: NVG doesn't help either, since the clouds go down to ground level and you see less with goggles than without, especially through the HUD (and I've almost turned that off). Is this a bug with the Kobuleti lighting or is it intended behaviour?
  25. Thanks guys, that was very helpful. Managed to fill up for the first time yesterday! (Only took 30 minutes :music_whistling:) Turns out it's mostly a speed issue for me. Apparently you have to almost come to a standstill for a short while so that the boom can make the connection. I was always slightly too fast and overshot. Thanks for the video, Foka!