

Everything posted by St4rgun
-
Maybe two different methods could be used for the two different tasks:
- VR cursor stabilization: even a fairly large deadzone can be highly usable here, so that head tracking does not interfere with precise mouse movement.
- HMS symbology stabilization: here a deadzone is counterproductive; special noise filtering is required instead, such as a well-tuned low-pass filter that avoids lag (e.g. https://hal.inria.fr/hal-00670496/document — a sketch of it follows below).
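The linked INRIA paper describes the 1€ filter, a speed-adaptive low-pass filter designed exactly for this trade-off. A minimal Python sketch of it, assuming a fixed tracker sample rate; all parameter values here are illustrative and would need tuning against real HMD data:

```python
import math

class _LowPass:
    def __init__(self):
        self.y = None

    def apply(self, x, alpha):
        # First sample initializes the filter; afterwards blend new and old.
        self.y = x if self.y is None else alpha * x + (1.0 - alpha) * self.y
        return self.y

class OneEuroFilter:
    """Speed-adaptive low-pass filter (Casiez et al., the paper linked above)."""

    def __init__(self, rate_hz, min_cutoff=1.0, beta=0.007, d_cutoff=1.0):
        self.rate = rate_hz           # HMD tracker sample rate, e.g. 90 Hz
        self.min_cutoff = min_cutoff  # jitter suppression when nearly still
        self.beta = beta              # how fast the cutoff opens up with speed
        self.d_cutoff = d_cutoff
        self.x_lp, self.dx_lp = _LowPass(), _LowPass()
        self.prev = None

    def _alpha(self, cutoff):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau * self.rate)

    def apply(self, x):
        dx = 0.0 if self.prev is None else (x - self.prev) * self.rate
        self.prev = x
        speed = abs(self.dx_lp.apply(dx, self._alpha(self.d_cutoff)))
        cutoff = self.min_cutoff + self.beta * speed   # slow head: heavy filtering
        return self.x_lp.apply(x, self._alpha(cutoff)) # fast head: minimal lag
```

The key property for this use case: pulse-level jitter (low speed) gets filtered hard, while deliberate head movement (high speed) raises the cutoff, so the symbology doesn't feel laggy.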
-
The blue VR cursor persistence is good now; only a minor bug remains: if I switch to the external view (F2) and then back to the cockpit, the blue cursor shows up as if I had moved the mouse. It disappears quickly, but it's still a bug. The cursor should only show up upon actual mouse movement.
-
Several users have observed HMS symbology jitter, which is most conspicuous with the IHADSS of the Apache. It turns out that this phenomenon is not only HMS related; it's the same with the VR cursor. The head tracking of VR headsets is so precise that even my pulse can move the crosshairs, although I hold my head very steady. If someone only wants to use the crosshairs occasionally, it's fine; but once you become aware of this problem and start observing the fine details, it becomes obvious that something is fundamentally wrong with the HMS crosshair movement (including all helmet sights of the A-10, F-16, F/A-18, Ka-50 and the Apache as well).

The same goes for the mouse VR crosshair: because the VR cursor is linked to your head, it moves the same way. The slightest movement (a pulse) can jitter it around while you are trying to fight it with the mouse. This is even more pronounced in the Apache, where you type a lot of data on the keyboard.

The main cause may be the method used: the movement of the HMD is sensed at the position of the head, while the VR cursor and HMS symbology are placed in space at some distance. That means a given angular displacement of the head produces a fairly large displacement in space at the distance of the displayed VR cursor or HMS symbology (see the quick calculation below). IRL this would not happen, because the HMS sits close to the eye and is merely collimated to infinity.

IMHO this issue is serious enough to be corrected, both to eliminate the fatigue caused by unnaturally having to hold your head rigid and to improve the aiming experience with the crosshairs. I think a deadzone solution for the head tracking is generally not appropriate, because sometimes we want to move the crosshairs precisely, so a deadzone would introduce artificial lag. Filtering the jitter out of the HMD's raw head tracker data is probably the way to go. But all of this should be applied only to HMD symbology and the VR cursor; in every other respect the head tracking is perfect now. @BIGNEWY Any thoughts on this?
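A quick back-of-the-envelope calculation of the effect described above (the 0.05° jitter figure is my illustrative assumption, not a measured value): a small angular displacement θ of the head moves a point placed at distance d laterally by roughly d·θ.

```python
import math

jitter_rad = math.radians(0.05)  # assumed pulse-induced head jitter of 0.05 deg

for d in (0.05, 2.0, 100.0):     # near-eye combiner, VR focal plane, far plane (m)
    print(f"at {d:6.2f} m: {d * jitter_rad * 1000:7.2f} mm of lateral displacement")
# at   0.05 m:    0.04 mm
# at   2.00 m:    1.75 mm
# at 100.00 m:   87.27 mm
```

The same tiny head motion that would shift a near-eye reticle by a fraction of a millimeter moves a far-anchored symbol by many centimeters.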
-
All right, I'll make a separate topic then.
-
I'm using a Reverb G2. The head tracking is so precise that even my pulse can move the crosshairs, and I can assure you I can hold my head very steady. If someone only wants to "use" the crosshairs then it's fine, but once you become aware of this problem and start observing the fine details, it becomes obvious that something is fundamentally wrong with the HMD crosshair movement (including the A-10, F-16, F/A-18 and the Apache). IMHO it needs to be corrected. The same goes for the mouse VR crosshair: because the VR cursor is linked to your head, it moves the same way. The slightest movement (a pulse) can jitter it around while you are trying to fight it with the mouse. This is even more pronounced in the Apache when typing a lot of data on the keyboard. I think the deadzone solution is not good, because sometimes you want to move the crosshairs precisely, and a deadzone would introduce artificial lag (see the sketch below). Filtering the jitter out of the HMD's raw head tracker data is the way to go. But all of this should be applied only to HMD symbology and the VR cursor; in every other respect the head tracking is perfect now.
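A minimal sketch of why a deadzone hurts precise aiming (the threshold value is an illustrative assumption): anything below the threshold is discarded, so slow, deliberate corrections are swallowed while larger jitter still passes through.

```python
def deadzone(delta_deg, threshold_deg=0.2):
    # Suppress per-frame head-tracking deltas below a fixed threshold.
    return 0.0 if abs(delta_deg) < threshold_deg else delta_deg

print(deadzone(0.10))  # 0.0  -> a slow, precise correction is lost entirely
print(deadzone(0.35))  # 0.35 -> a larger twitch passes through unfiltered
```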
-
This noise filtering applied to head tracking (only for HMD symbology) would be key for all HMDs in DCS, if other techniques can't be used. It's so important that you should make a standalone thread in the VR bugs section just for this. I asked @BIGNEWY in my previous answer but he didn't reply. IMHO this issue needs further development to be eliminated completely.
-
The last part of the discussion was not about the aiming itself, but about the movement of the reticle (or rather the jitteriness and over-sensitivity of its motion). By the way, Casmo showed in one of his videos that a default manual range of 800 m is more appropriate for general use. I can also pinpoint and destroy targets perfectly with the gun; I've practiced a lot. But somehow we all feel that the movement of ALL VR helmet mounted sights is a bit odd in DCS: they are too wobbly. In the Apache it's more pronounced, maybe because of its high importance and continuous use as a pilot. I'm trying to find a solution to this issue.
-
I have a theory: for the IHADSS reticle to be correctly focused at infinity (mainly for the both-eyes mode), they placed the symbology on an imaginary plane FAR away. The effect is as if your head and that far-away plane were connected by a pole several hundred meters long, and you were trying to hold the OTHER end of the pole stationary by stabilizing your head. If this assumption is true, then it's a massively wrong idea from the devs: the light from the IHADSS reticle should indeed be focused at infinity IRL (meaning the light rays enter your eye parallel), but that affects only focus and depth perception. Meanwhile the reticle itself is very close to your eye, so its movement should be proportional to that close distance. Try placing a marker on a pair of eyeglasses or sunglasses and "aiming" with that marker at distant objects: you'll be able to stand perfectly still and hold the mark stationary on objects hundreds of meters away. @BIGNEWY Could you please check with the devs whether this is what's happening with the IHADSS? Because if it's true, it should be corrected from the ground up, because it makes the IHADSS feel totally false.
-
I have never tried the both-eyes setting. It's also unrealistic, because you CAN perceive the symbology as a picture far away (focused at infinity), yet it stays sharp when your eyes focus on the cockpit at a much closer distance, so it tricks your brain the other way. Besides that, it covers the MFDs from your left eye, which disturbs the view. Unfortunately, the best approach right now with current VR technology is the following:
- Use the IHADSS with only ONE eye, the dominant one (for 100% realism only the right eye is supported).
- Have the highest framerate possible for accurate head tracking without lag (45+).
- Have the best edge-to-edge clarity your VR headset can offer. If you have to wear glasses for perfect vision at monitor distances (50-60 cm), try them with the headset too; it's MUCH better, even though it shouldn't be needed for the so-called 2 m focus distance. If you don't wear glasses, are older than 45, and use a monitor a lot every day, get your vision checked, because it's most likely no longer 100%.
- Train your brain a lot to accommodate to the different views from your two eyes.
- Try to reduce the IHADSS symbology and video brightness to the minimum that still gives adequate contrast (especially at night, don't over-brighten your right eye while the other is in the dark).
- Practice, practice, practice.
-
Not really: it is focused at infinity, so the light rays from the symbology image are parallel (https://www.wikiwand.com/en/Reflector_sight#/Design). That's the problem with current VR headset technology: fixed focus. IRL the cockpit, the projected IHADSS sight, the reflector glass and the projector lens are all at totally different focus distances, so our eyes can distinguish between them by refocusing (see the quick numbers below). That is impossible with one eye in a VR headset. The panel in the G2 is indeed close to my eyes, but there's a lens in between which corrects the visuals for the ~2 m focal distance (see my previous post about varifocal lenses).
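To put numbers on those focus distances (accommodation demand is simply 1/distance in diopters; the distances below are rough figures taken from the discussion above):

```python
# Accommodation demand in diopters: D = 1 / distance_in_meters.
for label, d in (("projector lens (~0.04 m)", 0.04),
                 ("cockpit panel (~0.6 m)", 0.6),
                 ("G2 fixed focus (~2 m)", 2.0)):
    print(f"{label:30s}: {1.0 / d:5.2f} D")
print(f"{'collimated reticle (infinity)':30s}:  0.00 D")
# projector lens (~0.04 m)      : 25.00 D
# cockpit panel (~0.6 m)        :  1.67 D
# G2 fixed focus (~2 m)         :  0.50 D
# collimated reticle (infinity) :  0.00 D
```

IRL the eye can tell these planes apart by refocusing; in the headset everything is rendered at the same ~0.50 D plane, which is why the monocle and projector lens look unnaturally sharp.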
-
I'm using WMR with a Reverb G2 and a low-power GPU (only a GTX 1070), so the framerate is not great with the Apache (25+). With the TADS video switched on as pilot it's much lower (15+). But even at these low framerates the motion smoothing of WMR (OpenXR) is so good that it doesn't induce nausea, and I'm able to aim pretty well with the IHADSS crosshairs. At first it's weird that the slightest motion of your head (even your heartbeat) is sensed and translates into minor movement of the crosshairs, but with enough practice you can learn to aim steadily with it. I have 24 flight hours in the Apache right now, and I think any smoothing of the IHADSS movement would feel "laggy" or "floaty" to me. The real concern for me is the sharply focused visibility of the IHADSS monocle frame and projector lens. Personally I'd like it much more with some blur on those, to simulate their very close distance to the eyes. It's impossible for anyone to focus at such short distances (2-3 cm), so it's totally unrealistic right now. But if the framerate is high enough, the movement is fine as it is.
-
No. It's not the brain, it's your eye. Unfortunately, current VR HMD technology has a FIXED focus, so your eye does NOT focus at infinity correctly when using the IHADSS (as it would IRL). Right now proper depth perception can only be achieved by rendering the image for both eyes. The only real solution will be varifocal-lens HMDs, which are the future. For more details:
-
I still don't get it. There are a lot of modules in DCS, and there are module-specific details which interest a lot of people. But the HEART of DCS is the graphics engine. This should interest ALL players, so any information on core progress is genuinely useful for every user. ED was able to publish a "Development report" on Vulkan / multicore on 15th October 2021. They managed to write something that was not a "meaningless jumble of jargon" but an understandable status report. Since then, no such detailed report has been made. If ED decides not to share information on this progress, that's understandable. But why, WHY are there people in the community who feel the urge to "tell the truth" about the meaninglessness of such reports? If someone is not interested in core progress, don't read about it. But actively commenting on why it is wrong and impossible to ask for ANY status report is ridiculous. Please, stop it. I don't want to disturb any development. If ED wants to tell us something about this topic, they will find a way, I'm sure.
-
When LMC is OFF, you change the position of the TADS crosshair. When LMC is ON, you change the speed of the crosshair. Pressing LMC to switch it ON zeroes the crosshair speed, which makes the crosshair stationary ("ground stabilized"). When the helicopter is moving, this is an easy way to "fix" a position after chasing it with LMC off. Just don't forget that with LMC ON it takes really minuscule input to fine-tune the speed of the crosshair (see the sketch below). In LMC mode the motion of the helicopter is also taken into account to stabilize the crosshair.
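A minimal sketch of the position-control vs. rate-control distinction described above (the function name, gains and values are illustrative assumptions, not the actual DCS implementation):

```python
def tads_step(pos, rate, stick, lmc_on, dt=1.0 / 60.0):
    """One update step for the crosshair position (deg) and slew rate (deg/s)."""
    if lmc_on:
        rate += stick * 0.5   # LMC ON: the stick nudges the RATE, so tiny
    else:                     # inputs fine-tune the drift of the crosshair
        rate = 0.0
        pos += stick * 2.0    # LMC OFF: the stick moves the POSITION directly
    return pos + rate * dt, rate

# Engaging LMC zeroes the rate, so the crosshair simply holds where it is:
pos, rate = 10.0, 0.0
pos, rate = tads_step(pos, rate, stick=0.0, lmc_on=True)   # stationary, "ground stabilized"
pos, rate = tads_step(pos, rate, stick=0.02, lmc_on=True)  # minuscule input -> slow drift
```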
-
It's not exactly the brain, it's the eyes. When you look at IHADSS symbology, which IRL is focused at infinity, your eyes are focused far away. Meanwhile the monocle and the projector lens are 4-5 centimeters from your eyes, way out of focus. It's a total waste of polygons to make a 3D model of the IHADSS monocle in first-person view; or, if it is modelled, its edges should be heavily blurred. The biggest problem with blurring the monocle and projector lens is that this engine has a hard time handling semi-transparent textures in VR. On the other hand, why did they choose to make the 2D and VR versions different? Maybe it's simply not feasible in VR.
-
As I remember, Casmo may have said that IRL you can't see the monocle itself (it's too close to your eyes to be in focus), but the projector lens certainly blocks some part of your vision. So IMHO a middle-ground solution would be best: heavily blurred, semi-transparent edges for the monocle and the projector lens, but a NOT semi-transparent projector body. The way it is now in VR is highly unrealistic given current VR display technology.
-
I was just trying to give instructions on how to interpret my original reply, so I WAS being sarcastic; sorry for the misunderstanding: "Fortunately the cursor persistence in VR changed at once. It's a bit longer than last time..." So not eliminating a bug that would need so few development resources is unbelievable to me. As it annoys so many users, I expected it to be THE number one priority for the hotfix, but it didn't happen. Now it could end up in the stable version as well.
-
I noticed this in VR too, in a certain head position (like looking down at the CP/G's head with my HMD, then looking up at the rotor blades with my eyes). Higher or lower viewing angles seem perfect, but within a certain range this weird effect appears.
-
resolved DCS 2.7.11 - VR cursor persists too long
St4rgun replied to Raven (Elysian Angel)'s topic in VR Bugs
Now this issue has been inherited by the stable version as well. Nice. Unbelievable that it was not eliminated in the last hotfix (2.7.11.22211). -
...sarcasm.
-
Fortunately the cursor persistence in VR changed at once. It's a bit longer than last time...
-
resolved DCS 2.7.11 - VR cursor persists too long
St4rgun replied to Raven (Elysian Angel)'s topic in VR Bugs
Phew... Nice idea. The only problem is that it takes me longer to read all these instructions than it would take the devs to change a variable back to its previous value and solve the issue. I don't believe this bug can't be fixed in two minutes at most. Very annoying indeed. -
Yes, we know that well, but during the Vulkan implementation the team has the opportunity to change the whole graphics pipeline / technology. In a past Q&A session they didn't even exclude the possibility of switching from deferred rendering back to forward rendering (Vulkan supports both equally), because of the much better performance of forward rendering in VR and the possibility of applying hardware-assisted MSAA again (which is unusable with deferred rendering). As we know, VR performance gains (hopefully) have high priority during the rewrite, and the AA implementation is part of that. These are technical details, I know, and maybe much less relevant for people who only want to fly the airplanes, but the overall visual quality in VR depends heavily on these decisions. That's why I asked whether there are such plans as part of the Vulkan implementation.
-
Dear @BIGNEWY, sorry to bother you again about the engine rewrite; just a quick question. Is the team working on an applicable AA technology for VR, using promising new techniques instead of the existing MSAA? Previously there was some discussion about the AA / deferred-shading difficulties in VR, not to mention the transparent-texture problem. While we're waiting for Vulkan, it would be very reassuring to know that some fancy AA solution is underway for VR. Of course we don't need any detailed "classified" technology leak, just a YES or a NO.