
Posted (edited)

I just learned that DCS still does not properly support HMDs with non-planar screens, like the Pimax and the upcoming Valve Index.

 

I don't own and have not pre-purchased either of them; still, I'm completely puzzled as to why there is not much talk about this.

Having the image corrected in post ("parallel projection" compatibility) has a massive impact on performance and should not be considered a proper solution.

A proper solution would mean accounting for the canted perspective within the rendering pipeline itself, which should have little to no impact on performance.

 

As long as this is not addressed by ED, I don't see the Index as a viable option for DCS.

Edited by twistking

My improved* wishlist after a decade with DCS ⭐⭐⭐⭐🌟

*now with 17% more wishes compared to the original

Posted

Uh-oh.

 

On the one hand, that doesn't bode well for any software.

 

On the other hand, if it affects all software, I imagine Valve will be on top of it.

 

Would it definitely be up to ED to support or could it come from Valve? As part of SteamVR?

Posted (edited)

I've only started reading into this, so I'm by no means an expert on the subject.

 

 

What I understand:

- It would definitely be up to ED to properly (!) support canted displays, as it requires a different projection, which comes early in the rendering pipeline.

- All "made for VR" games already support angled projection

- Most multi-title game engines already support it

- The VR interface software (PiTool, potentially SteamVR in the future) can correct for the projection with post-process distortion correction, which works, but requires a much higher render resolution

- Even if there were a better way to inject the proper projection through the interface software, I doubt that Valve or Pimax would bother, considering that nearly all modern games already support the proper projection

- Implementation of proper projection for non-planar HMDs is supposed to be straightforward and relatively easy. It might depend on the engine in question, though...
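To make the last point concrete, here is a minimal sketch of what "a different projection early in the pipeline" means - plain Python with hand-rolled matrices, and a hypothetical 10° cant angle (not an actual Pimax or Index spec): each eye's view direction is simply yawed outward by the panel's cant before the normal perspective projection runs, so nothing has to be fixed up in post.

```python
import math

def eye_view_yaw(eye, cant_deg):
    """3x3 yaw rotation (row-major, right-handed, y-up, -z forward) that
    toes an eye's view frustum outward by the panel cant angle.
    cant_deg is a placeholder; a real engine would read it from the HMD."""
    sign = 1.0 if eye == "left" else -1.0   # positive yaw turns the view left
    a = math.radians(sign * cant_deg)
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

# With a 10 degree cant, the forward axes of the two eyes splay outward:
forward = [0.0, 0.0, -1.0]
left_fwd = apply(eye_view_yaw("left", 10.0), forward)    # x < 0 (leftward)
right_fwd = apply(eye_view_yaw("right", 10.0), forward)  # x > 0 (rightward)
```

An engine that already builds per-eye view matrices only needs to compose one extra rotation like this per eye, which is why the change is considered cheap.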

Edited by twistking


Posted

Not sure I understand this completely, are we expecting to see tangible performance degradation due to canted screens? I will definitely be testing DCS performance on the Index vs my Vive Pro once I get it, since the objective screen resolution is the same. To say the Index isn't a viable option for DCS is a bit premature, imo.

PC: 5800X3D/4090, 11700K/3090, 9900K/2080Ti.

Joystick bases: TMW, VPC WarBRD, MT50CM2, VKB GFII, FSSB R3L

Joystick grips: TM (Warthog, F/A-18C), Realsimulator (F-16SGRH, F-18CGRH), VKB (Kosmosima LH, MCG, MCG Pro), VPC MongoosT50-CM2

Throttles: TMW, Winwing Super Taurus, Logitech Throttle Quadrant, Realsimulator Throttle (soon)

VR: HTC Vive/Pro, Oculus Rift/Quest 2, Valve Index, Varjo Aero, https://forum.dcs.world/topic/300065-varjo-aero-general-guide-for-new-owners/

Posted (edited)
Not sure I understand this completely, are we expecting to see tangible performance degradation due to canted screens? I will definitely be testing DCS performance on the Index vs my Vive Pro once I get it, since the objective screen resolution is the same. To say the Index isn't a viable option for DCS is a bit premature, imo.

 

Well, I do expect tangible performance degradation, yes! How bad it will be, I don't know. Maybe the Index is built in a way that the misalignment is minimal and not a game breaker if left uncorrected, but that still won't be optimal, since the perspective will be slightly off.

The correction is what will take the toll on performance, because of the need for a higher render resolution. Maybe Valve can work some magic on this with super clever foveated rendering or something, but I doubt they will bother, since most modern engines already support canted HMDs and therefore won't need this correction.
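To put a rough number on that toll, here is a back-of-the-envelope sketch (Python; the 55° half-FOV and 10° cant are made-up illustrative values, not Index or Pimax specs): a straight-ahead render that must cover a canted eye's frustum has to span a wider angle on the outer side, and the pixel width grows with the tangent of that angle.

```python
import math

def parallel_render_width_ratio(half_fov_deg, cant_deg):
    """Ratio of horizontal image-plane width a straight-ahead (parallel)
    render needs in order to cover a frustum canted outward by cant_deg,
    versus rendering directly into the canted frustum. Ignores resampling
    losses, which push the required supersampling even higher in practice."""
    native = 2.0 * math.tan(math.radians(half_fov_deg))
    # The canted frustum spans (cant - half_fov) .. (cant + half_fov)
    # relative to straight ahead; tan() maps angles to the image plane.
    widened = (math.tan(math.radians(cant_deg + half_fov_deg))
               - math.tan(math.radians(cant_deg - half_fov_deg)))
    return widened / native

ratio = parallel_render_width_ratio(55.0, 10.0)  # about 1.10 for these values
```

Even in this idealized model the parallel render is wider per row, and because the reprojection then resamples the image, the real-world supersampling requirement (and the reported 30-50% cost) ends up considerably higher.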

 

Another possibility would be that the Index corrects for the angle optically through its lens design, but this is highly unlikely, both for the reasons above and because it would put more strain on the lens design, leading to very complex lenses whose only benefit would be compatibility with some games.

 

The obvious solution would be for DCS to support canted display HMDs.

It would mean that the renderer projects the world onto viewports/targets that are not coplanar, but canted in the same way as the screens in the headset. Simple stuff!

This would then result in a perspective-correct image without the need for post-process correction.

(Note that there will still be distortion correction to compensate for lens distortion - it would be great if this could also be done in the renderer, but that would require the renderer to project onto a warped surface... sounds complicated: something for John Carmack to figure out, perhaps...)

Edited by twistking


Posted (edited)

Do remember that Valve owns the Steam ecosystem, whose revenue in 2017 alone was $4.3 billion. That was less than a quarter of Facebook's revenue, but they're not some no-name company, and they certainly have the means to make things happen, especially when their primary focus is the gaming market. Why would they not fully support their own headset technology when there are tons of VR games/apps already on Steam that people will want to play using their new Valve headset?

Edited by Supmua


Posted
Do remember that Valve owns the Steam ecosystem, whose revenue in 2017 alone was $4.3 billion. That was less than a quarter of Facebook's revenue, but they're not some no-name company, and they certainly have the means to make things happen, especially when their primary focus is the gaming market. Why would they not fully support their own headset technology when there are tons of VR games/apps already on Steam that people will want to play using their new Valve headset?

They would not fully support it because they can't: the games themselves have to support VR, and in this case canted-screen VR, which many already do and which all games going forward will.

 

The only thing that Valve could technically do is enable a perspective correction in post-processing, but that would come with some form of performance toll.

 

The Pimax has a canted design, the Index has one, and in the future there will be more HMDs with similar designs: DCS just needs to support this.


Posted
I'd be surprised if DCS doesn't work with the new headsets.

 

Can you explain again why it won't?

 

It surely will work, but you may not have an optimal experience.

You can already see these problems with the Pimax HMDs: without compensation, the perspective is slightly wrong - like looking at your monitor from an angle, except the angle is different for each eye.

I can't say how intrusive that is in practice, but it is definitely not how it should be.

This is also why VR zoom makes you "cross-eyed" on the Pimax: the zoom amplifies the wrong perspective to the point where it becomes obvious.

 

To compensate for that, you could correct the perspective in post-processing, but this has detrimental effects on performance.

The proper way would be to have the canted screens accounted for when doing the projection in the renderer.

That is why a proper implementation has to be done by the developer of the game or engine.


Posted (edited)

I wouldn't say that making the new and/or canted HMDs work properly is the job of the DCS developers; it belongs to the developers of the particular HMD types and techniques.

 

The DCS developers shouldn't have to care about specific HMD techniques, but about improving the DCS engine for VR in general - and that's exactly what they do. It's definitely the best support for VR we could get from them.

 

Getting canted HMDs to work well with DCS should be done by programming the compositor the right way. At this point I am quite confident that the Valve engineers have done that.

The Pimax software engineers seem to have problems doing this. I would say the Pimax could run much better in every respect with a more properly engineered interface/compositor to interact with applications.

Pimax's vision is to have their product work with all(!) VR applications, which makes software development for the Pimax compositor more difficult than making the HMD work with selected applications.

It's a very good effort Pimax is making: the Pimax supports Oculus-exclusive games as well as SteamVR games right away, whereas with an HTC Vive or WMR HMD, Oculus-exclusive games can only be supported with a hack.

 

Now, with Pimax going open source, there might be a chance that the 8k/5k+ could be connected directly and better through SteamVR, without PiTool... maybe, if the Valve engineers implement full support for the Pimax headsets into SteamVR. That would bring the advantage of better and more advanced algorithms than the Pimax engineers could produce, which has always been a weakness of Pimax... but let's see about this; right now it's more wishing than expecting it to be that way.

 

I'm pretty sure that the Valve engineers programmed and tested the Index to work perfectly with DCS. Another point in favor of having the Pimax fully implemented into SteamVR could be that the Index shares the canted design and the adjustable panel frequencies with the Pimax.

 

In contrast to that, I still wonder about someone in the forum mentioning that VR zoom worked well for the Pimax before and now doesn't, though I'm not sure whether it was broken by a DCS update or by an update of the PiTool version.

 

There are also ways to adjust the Pimax compositor settings by hex-editing pi_server.exe, but this is real trial and error and as much fiddling as you could imagine. For me, I'll now wait for the DCS spring update, which can't be far away and which should bring more improvements for VR.

 

Also, for the moment I am enjoying more than fiddling: flying one of my first loves in DCS, the F-5E Tiger, which also gives much better performance in VR than the F-14B.

 

Let's see how things go... give it some time. The Eagle Dynamics developers truly make DCS the best it can be in VR. No need to assign blame in that direction. Making a specific VR headset work properly with DCS belongs to the engineers of the headset, from my point of view.

Edited by - Voight -

AH-64D  Apache  /   F-16C Viper  /   F1 Mirage   /   Mi-24 Hind  /   F-14b Tomcat

Posted (edited)
[...]

Unfortunately I still haven't read up on how a VR pipeline typically works, so you probably know better.

However, I can't really believe that the VR interface software (is that what you call the compositor?) could do anything other than brute-forcing the perspective correction.

 

Other game engines already do it properly. They have a little toe-out on their stereo render targets.

It's similar to what Nvidia SMP (Simultaneous Multi-Projection) or Nvidia VRWorks MVR (Multi-View Rendering) can do (but these techniques need implementation in the engine - they don't just plug in).

 

Also, once you add support for this, the game devs don't have to support every HMD individually. The HMD's software would just hand the HMD's geometry to the game engine, where this information is used to do perspective-correct projection for the HMD.

I assume that, as of now, the DCS engine already receives some information, like FOV, IPD and resolution. You would "just" add the toe-out angle to it and allow the renderer to account for it.
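As a sketch of that handoff - all field names and numbers invented for illustration; real runtimes such as OpenVR or OpenXR convey the same facts through per-eye projection and eye-to-head matrices - the per-eye description would just grow one extra number:

```python
from dataclasses import dataclass

@dataclass
class EyeRenderInfo:
    """Hypothetical per-eye packet a VR runtime could hand to the engine."""
    width_px: int          # render target size
    height_px: int
    fov_h_deg: float       # per-eye field of view
    fov_v_deg: float
    ipd_offset_m: float    # signed half-IPD; translates the eye camera sideways
    toe_out_deg: float     # 0.0 for planar HMDs; the one new field canted HMDs need

# Made-up numbers for a canted headset. An engine that already consumes the
# first five fields only has to rotate each eye camera by toe_out_deg
# instead of ignoring the value.
left_eye = EyeRenderInfo(2560, 1440, 110.0, 103.0, -0.032, 10.0)
```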

 

I want to stress that other engines already do that!

Edited by twistking


Posted

I have to disappoint you twice in advance: first, I don't know any better than anyone else how the VR pipeline works, and secondly, of the three and a half languages I speak, English is only my second best, so please forgive me for not expressing myself like a native English speaker... anyway, I'll try my best to have a nice chat.

 

Mentioning the "interface" or compositor or SDK or API or driver for VR is key to the situation we are facing with VR at the moment. The headsets themselves are pretty much all the same: one or two cheap smartphone displays, two cheap Fresnel lenses and a tracking system, all held together by some plastic... but the effect of experiencing VR is astonishing!

 

The current stage of VR development - or let's say the stage in the evolution of HMD technology - is still very much at the beginning. Regarding the different APIs, it is comparable to the old days when VHS, Betamax and Video 2000 were competitors in the market for videotape devices. Each standard was brought up by a different company to compete for first place and the biggest piece of the market. In the end, VHS made it as the one standard.

When Blu-ray started as the successor to DVD, there was also HD DVD as a competitor, but it vanished quickly, so Blu-ray became the standard format.

With VR it's virtually the same, and that's the reason why we have such a mess with the VR APIs. We have Oculus, SteamVR and WMR competing in the market, driven by individual companies trying to get the biggest piece of it. But as with every standard in consumer electronics in the past, there will eventually be one standard, and the others will vanish.

 

The current situation is not only bad for the consumer, but for the software developers as well. Just imagine - and the following comparison is very close to VR headsets - that we had three different standards for flatscreen monitors working with different software applications... it would be a mess, wouldn't it?

 

Nvidia VRWorks is great, but as far as I know not too easy to implement, though with a lot of support from Nvidia. Implementing VRWorks clearly would have advantages, but it also makes the developer dependent on Nvidia. AMD card users would lose any support at the same time, unless the AMD equivalent of VRWorks were implemented into the application as well. For Nvidia, VRWorks is more or less software to promote and sell more Nvidia cards. I think AMD's equivalent is named "crystal-something". Both SDKs support VR SLI, which promises a massive boost in performance for VR, besides SMP and MVR, which greatly improve performance and quality in VR (take a look at Nvidia's "clown house" demo, or "EVE: Valkyrie", which implemented some of the VRWorks features and ran much better than any other VR application).

VR hardware suffers from two issues at the moment: SDE and limited FOV. But with regard to both, there is good progress at hand, so one can be optimistic about the future development of VR hardware.


Posted

What is a canted screen? Sorry for the stupid question, but I want to find out what it is and how it is different.

Intel Ultra 265K 5.5GHZ   /  Gigabyte Z890 Aorus Elite  /  MSI 4070Ti Ventus 12GB   /  SoundBlaster Z SoundCard  /  Corsair Vengance 64GB Ram  /  HP Reverb G2  /  Samsung 980 Pro 2TB Games   /  Crucial 512GB M.2 Win 11 Pro 21H2 /  ButtKicker Gamer  /  CoolerMaster TD500 Mesh V2 PC Case

Posted
What is a canted screen? Sorry for the stupid question, but I want to find out what it is and how it is different.

 

Instead of the screens being perfectly perpendicular to your eye (i.e. flat straight in front of you), they are tilted outwards slightly, so the inside edge of each screen (each eye gets its own screen) is further away from your eye than the outside edge.

Posted
I have to disappoint you twice in advance: first, I don't know any better than anyone else how the VR pipeline works, and secondly, of the three and a half languages I speak, English is only my second best, so please forgive me for not expressing myself like a native English speaker... anyway, I'll try my best to have a nice chat.

Well, it's more or less the same with me :)

Mentioning the "interface" or compositor or SDK or API or driver for VR is key to the situation we are facing with VR at the moment. The headsets themselves are pretty much all the same: one or two cheap smartphone displays, two cheap Fresnel lenses and a tracking system, all held together by some plastic... but the effect of experiencing VR is astonishing! [...]

I agree with everything you said; however, I think you underestimate the significance of the actual game engine in delivering a good VR experience.

For example, you can't make a non-VR game run in true stereo-correct VR just by throwing some API between it and the HMD.

All the perspective-correct rendering is done by the renderer in the game engine, and at this point all the VR API has to do is feed the needed information to the engine.

I assume this would be:

- Resolution
- Head position (tracking)
- IPD
- Screen geometry

In the future there might also be the possibility to feed in the lens design, to have the renderer account for lens distortion etc.

 

Sure, it would help developers if there were only one API to account for, but even then they would still need to make sure that the renderer can work with the information the API feeds it.

I assume that in the case of DCS, the renderer gets the information from the API but just ignores the screen geometry, because the renderer is only able to render stereo viewports that are planar.

 

The API magic then kicks in when the image is completely rendered, applying lens distortion correction, motion smoothing, color correction and - in the case of DCS and other non-compatible games - perspective correction.

And it is this perspective correction that will always require a higher rendering resolution to not look blurry.

It's the same with distortion correction, if you think about it; however, it would be much more complex to account for lens distortion in the renderer, compared to just adding toe-out for the viewports.
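For comparison, the lens-distortion half of that post-processing pass is conceptually just a per-pixel coordinate remap. A minimal single-coefficient radial model in Python - real compositors use per-color-channel warp meshes and higher-order terms, and the k1 value here is an arbitrary illustrative number:

```python
def radial_predistort(u, v, k1):
    """Remap one centered image coordinate (u, v in [-1, 1]) with a simple
    Brown-style radial term, the way a compositor warps the rendered frame
    before scan-out so that the HMD lens undoes the warp optically."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2
    return (u * scale, v * scale)

center = radial_predistort(0.0, 0.0, -0.2)   # the image center never moves
corner = radial_predistort(1.0, 1.0, -0.2)   # edges are pulled inward for k1 < 0
```

Like the perspective correction, this resampling is exactly why the source render needs spare resolution: pixels near the edge get squeezed or stretched, and detail that was never rendered cannot be recovered.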


Posted (edited)

Moin Twistking,

 

You have to excuse some of my dramatized phrasing before - I just wanted to put the emphasis on the API for VR more than on the hardware.

It doesn't make sense to me to constantly blame a particular VR headset or manufacturer for the different performance individuals get on their systems. It's as if you bought a 4K flatscreen, turned the resolution up to 4K, and then blamed the monitor because you don't get the same performance as on your old Full HD flatscreen.

 

The Khronos Group is now more than two years old, and to me it looks like they simply put some company logos onto a sheet and spread it across the internet. As far as I know, the USB-C/VirtualLink connection on the newer RTX cards was the only effect so far of communication between industry leaders, with a standardized tethered connection from VR devices to PCs, notebooks and smartphones in mind. So far no HMD manufacturer uses this connection, and the mobile variants of VR headsets are developed untethered anyway. The Khronos Group feels a bit like OPEC meeting to discuss how to save the environment... lol... but let's keep cynicism aside...

 

When you use SteamVR, there is an option in the developer section that allows you to enter the web console. The web console gives a detailed overview of how the SteamVR compositor is working and which files are requested at which point in the process of running a specific headset with a specific application. Quite interesting, as a lot of different files correspond, and settings are read from them for the individual headsets.

 

The related file in the Pimax directory is pretty poor in its information and adjustable settings; maybe there will be more possibilities to adjust the settings for DCS once Pimax goes open source.

The Pimax "default.*" file sets the maximum resolution to 8192 pixels, which is crazy. But after resetting the value to 4096 (the same as the maximum render resolution in the steamvr.settings file), the slider in SteamVR gets redefined and can be adjusted to values closer to the native resolution of the 5k+'s displays. This way, by setting PiTool to 1.0 and SteamVR to 36%, I could get a render resolution of 2560 x 1580, which is still not right in the vertical resolution, but should create a clearer image (surely this could be seen differently and depends on individual perception).

 

Another file in which some corrections can be made is the "Stereo" file within the DCS directory /monitorviews.

There the viewports can be corrected to the aspect ratio of the large-FOV displays. The effect is like correcting the picture on a TV from a 4:3 aspect ratio to 16:9... quite advisable to edit this file if you are using large-field-of-view panels in DCS.

 

So far, I couldn't find the setting for canted displays. Not even in the OpenVR directory "settings" files (located in /user/apps/ local/ ...), which, according to the web console in SteamVR, also feed back settings for VR. I think the large-field-of-view headsets are still too new, and the angle of the canted design is not yet a factor in the settings files. But it needs to be in the future, as a large field of view can only be realized with a canted or curved panel design. I think for the Pimax it should be about a 20° angle.

 

You're right that this angle needs to be adjusted somewhere. With VR zoom in DCS, it can be observed that each eye zooms in from a viewport that is not at the 12 o'clock position of the eye, but the left eye at roughly the 10 o'clock position and the right eye at the 2 o'clock position - and the more both viewports are zoomed in, the more the 3D images overlap.

I guess the Pimax compositor does not implement a correction of the viewports when zooming in VR... I do agree with you that the Eagle Dynamics developers might find an algorithm to compensate for the overlapping images in VR zoom, but I wouldn't demand or complain for them to do so... I'd actually just be thankful if VR zoom could be made to work with non-planar headsets like it does with planar ones.

 

The distortion is not really magic and is only responsible for the 3D effect for the VR headset.

I had those Nvidia 3D Vision glasses back when the DK2 from Oculus became popular, to experience a 3D effect on a 144 Hz flatscreen. With the related Nvidia software, the distortion could be adjusted with a slider to improve the accuracy of the 3D effect and avoid doubled lines (not ghosting) in the process. I think Pimax uses a kind of shutter technique for the 3D effect in their headset, but it can also be switched to parallel projection, which other VR headsets use for the 3D effect, at the cost of more performance needed in the renderer/GPU to create twice the number of rendered images.

 

There is also an option within a DCS graphics settings file that allows rendering in stereo (parallel projection?), but I don't know exactly how this option affects the process; there is an effect when activating it. I also don't know whether setting the stereo render to true in the DCS graphics settings corresponds to the stereo file settings in the monitorviews/stereo file in another DCS directory.

 

It would be great to have more information in the files from the developers on how VR is implemented in DCS, to try adjusting the settings to the native resolution and the specific design of individual VR headsets, as we are now getting more diversification from the VR headset manufacturers.

 

By the way, do you know VorpX? It's software for making non-VR games work with VR headsets. VorpX does work, but it needs a lot of performance, and it comes with issues everywhere, like a mod put on top of the original programming.

 

I think the blurry image relates more to discrepancies between the render resolution and the native resolution of the panels in the VR HMD, but also to the lenses in the HMD, which work like magnifying glasses between the image on the displays and the eyes. Poor lenses result in a sweet-spot effect, a blurry image and visible SDE. I'm still curious whether Valve has solved, or at least improved on, these issues with their new Index HMD and lens technique, and whether XTAL has improved things with a much better lens design. If so, it could be a huge step toward better image quality for VR HMDs - once these lens techniques are publicly available, they could be reverse-engineered by all manufacturers for improved VR HMDs in the future.

 

At the moment, I find it very interesting that Sweviver runs his Pimax without SteamVR. I will try to follow this and see next whether the 5k+ can be improved for DCS by having more options, due to Pimax changing their render pipeline to open source.

Edited by - Voight -


Posted (edited)
Moin Twistking,

You have to excuse some of my dramatized phrasing before - I just wanted to put the emphasis on the API for VR more than on the hardware.

It doesn't make sense to me to constantly blame a particular VR headset or manufacturer for the different performance individuals get on their systems. It's as if you bought a 4K flatscreen, turned the resolution up to 4K, and then blamed the monitor because you don't get the same performance as on your old Full HD flatscreen.[...]

Hey Voight, and um... welcome to the unofficial German DCS VR regulars' table...

 

I agree; however, I still think that the canted-screen HMDs are a bit of an exception, because supporting them "properly" is not very difficult, and with more wide-FOV HMDs expected to be developed, it's a design that might become the norm going forward.

Although I must admit that after reading into all this, I was surprised how much else could be improved at the engine/renderer level, so that the canted screens wouldn't stand out so much anymore (e.g. complex projections to compensate for specific lens designs).

However, I still think it is reasonable to expect ED to work on that. I will again try to explain why waiting for better APIs and Valve's secret sauce will not be enough(!):

 

 

[...] So far, I couldn't find the setting for canted displays. Not even in the OpenVR directory "settings" files (located in /user/apps/ local/ ...), which, according to the web console in SteamVR, also feed back settings for VR. I think the large-field-of-view headsets are still too new, and the angle of the canted design is not yet a factor in the settings files. But it needs to be in the future, as a large field of view can only be realized with a canted or curved panel design. I think for the Pimax it should be about a 20° angle.

It has to be somewhere, as some game engines already support canted displays natively and therefore don't need any correction in post-processing (and therefore run faster). These engines simply apply the same angle the screens have to their viewports. So there must already be a way for the Pimax/SteamVR API to feed the screen geometry to the engine.

 

 

[...] You're right that this angle needs to be adjusted somewhere. With VR zoom in DCS, it can be observed that each eye zooms in from a viewport that is not at the 12 o'clock position of the eye, but the left eye at roughly the 10 o'clock position and the right eye at the 2 o'clock position - and the more both viewports are zoomed in, the more the 3D images overlap.

 

I guess the Pimax compositor does not implement a correction of the viewports when zooming in VR... I do agree with you that the Eagle Dynamics developers might find an algorithm to compensate for the overlapping images in VR zoom, but I wouldn't demand or complain for them to do so... I'd actually just be thankful if VR zoom could be made to work with non-planar headsets like it does with planar ones.

I think here lies your main misconception about the issue: the problem is not with the zoom feature. Technically, the scene renders incorrectly all the time; it just doesn't show in typical DCS gameplay (a sensitive person might still notice it, though - I assume it will be more obvious on very close objects). The zoom just amplifies the problem.

Think about a person who is slightly cross-eyed: they will have somewhat reduced depth perception, but all in all they will be fine.

Now think about a pair of binoculars that are cross-eyed. While the binoculars would align perfectly with the cross-eyed person's eyeballs, the effect would be that, through the magnification, the brain cannot fuse the two images, since they are too different.

 

Also, the Pimax compositor does indeed have the ability to correct for that. From what I gathered, it is called "Compatible with Parallel Projections", and it corrects for canted screens.

This is great, but it eats 30%-50% of performance, because it needs a higher resolution to work with, as it crops and skews the image to correct the perspective.

This is because it is done in post-processing. The better way would be to do this in the actual projection, but that is not something the API or any magic sauce can do. The API just tells the game engine's renderer the geometry, and the renderer then puts the render targets in the geometrically correct position, so that the projection works on the corrected targets.

 

 

[...]

The distortion is not really magic and is only responsible for the 3D effect for the VR headset.

I had those Nvidia 3D Vision glasses back when the DK2 from Oculus became popular, to experience a 3D effect on a 144 Hz flatscreen. With the related Nvidia software, the distortion could be adjusted with a slider to improve the accuracy of the 3D effect and avoid doubled lines (not ghosting) in the process. I think Pimax uses a kind of shutter technique for the 3D effect in their headset, but it can also be switched to parallel projection, which other VR headsets use for the 3D effect, at the cost of more performance needed in the renderer/GPU to create twice the number of rendered images.[...]

I think you are mixing something up here. What I meant by distortion was either lens distortion or perspective distortion. Perspective distortion can, for example, come from canted screens when they are not accounted for in the projection. Lens distortion is very common, especially with cheap lenses, or simply with simple lens designs, which are smaller and lighter than complex lens arrays.

If you streamed your rendered image directly to a common HMD, it would look distorted, since the lenses in the HMD are not able to project the image to your eyes (this time projection with actual light rays, not virtual ones) without distorting it.

It's the same in photography: even expensive lenses will give you a distorted image, which can easily be corrected in post-processing.

In the time before Photoshop, it was more common to use extremely complex and expensive lenses that would distort less severely and even allow for perspective correction through the optical design.

This is now less common, because cameras have such high pixel counts that it is acceptable to lose some resolution and correct distortion (and even perspective) in post-processing.
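To make the "correct it in post" idea concrete, here is a minimal sketch of single-term radial distortion correction; the coefficient k1 is a made-up number for illustration and does not come from any real lens profile.

```python
# Minimal sketch of software lens-distortion correction with one radial term.
# k1 is a made-up coefficient for illustration, not from a real lens profile.

def correct_radial(x, y, k1=-0.2):
    """Map an output pixel (normalized, center at 0,0) to its sample position."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

print(correct_radial(0.0, 0.0))  # center is untouched: (0.0, 0.0)
print(correct_radial(1.0, 0.0))  # edge sample pulled inward: (0.8, 0.0)
```

Because edge pixels sample well inside the rendered frame, some rendered resolution is effectively thrown away at the borders: the same trade-off as in the photography analogy.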

 

I think you were thinking of some kind of parallax correction from the "old" 3D screen technology, but that is something else. Compared to VR, I would say it is more related to IPD adjustment than anything else.

I also doubt that shutter techniques are used in VR at all. The sole use of those shutter techniques is to show two images (mostly a stereo pair) on a single physical screen, like on a 3D TV or in the cinema.

In VR each eye has its own screen, or at least its own part of a screen, if it is a single screen HMD.

There is technically a second way to generate stereo imagery, and I think you are referring to that. Instead of rendering two perspective-correct images, you render just one single image and then use the z-buffer to know the depth of each pixel.

You then mash up a stereo pair by warping the original image with the help of the z-buffer. It is unholy and should not be done. You will end up with depth perception, but it will not have the same fidelity as true perspective-correct stereography.
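For illustration, a toy version of this z-buffer "mash-up" might look like the following. Everything here is invented for the sketch (ipd_px, focal_px, the tiny image); real implementations are far more involved, but the holes it produces are exactly why the result looks wrong.

```python
import numpy as np

# Toy sketch of z-buffer "fake" stereo: one rendered image plus per-pixel
# depth is warped into a left/right pair. ipd_px and focal_px are made-up
# illustration numbers, not real HMD parameters.

def fake_stereo(image, depth, ipd_px=4.0, focal_px=100.0):
    h, w = image.shape
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            # closer pixels (small depth) get a larger horizontal shift
            d = int(round(ipd_px * focal_px / depth[y, x] / 2))
            if 0 <= x - d < w:
                left[y, x - d] = image[y, x]
            if 0 <= x + d < w:
                right[y, x + d] = image[y, x]
    # disocclusion holes (pixels never written) are the telltale artifact
    return left, right

img = np.arange(16.0).reshape(4, 4)
dep = np.full((4, 4), 200.0)   # constant depth -> constant 1-pixel shift
l, r = fake_stereo(img, dep)
```

With a real depth buffer the shift varies per pixel, which uncovers regions the single rendered image never saw; that missing information is why this shortcut cannot match two properly rendered viewpoints.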

 

 

[...]

By the way, do you know VorpX? It's a piece of software that makes non-VR games work with VR headsets. VorpX does work, but needs a lot of performance, and it comes with issues everywhere, like any mod you put on top of the original programming.

I've heard of it, and I am sure that it does "fake" 3D with the z-buffer. You can get the z-buffer from the graphics driver, which is the reason why the game itself does not need to support VR. The game renders just one viewport.

This might also be in some unholy relation to the old Nvidia 3D driver technology. I'm not sure, but it is surely not the "proper" way of rendering VR. Still a cool project though...

 

 

[...]

Poor lenses result in a sweet-spot effect, a blurry image and visible SDE. I am still curious whether Valve could have solved, or better solved, these issues with their new Index HMD and lens technique, and whether XTAL might have reduced these effects with a much better lens design and technique. If so, it could be a huge step towards better image quality for VR HMDs; once these lens techniques are publicly available, they could be reverse-engineered by all manufacturers for improved VR HMDs in the future.[...]

Yes, however those lenses will always be more expensive and heavier, though it might still be worth it considering the prices of graphics cards. If graphics cards were way cheaper, it might be better to keep the lens design simple and correct distortion in software.

 

 

[...]

At the moment I find it very interesting that Sweviver runs his Pimax without SteamVR. I will try to follow this and see next whether the 5k+ could be improved for DCS by having more options, due to Pimax changing their render pipeline to open source.

Sure, however I want to nitpick that it is probably not correct to call it the Pimax render pipeline, as the image does not get rendered by the Pimax API. The API just provides the info for the actual renderer (in our case: the DCS engine), and after the stereo images are rendered, the API does post-processing to compensate for everything the renderer could not account for (lens distortion, colour correction and, among other things, the canted displays - if you enable "Compatible with Parallel Projections" in Pitool).

Edited by twistking

My improved* wishlist after a decade with DCS ⭐⭐⭐⭐🌟

*now with 17% more wishes compared to the original

Posted

Let me put it this way, if ED can't figure out how to get the Valve Index to work properly, they'll be in a world of hurt and criticism. The Valve Index will probably sell better than any other VR system.


Posted (edited)

Very interesting talking, twistking!

 

I think I was wrong to hope that the Pimax would be integrated completely into SteamVR, as now, without SteamVR, the Pimax with Pitool alone seems to run much better.

 

You're right that the distortion is not only a factor for the 3D effect, but also concerns the correct distortion of the viewport with regard to the different levels of optical refraction of the lenses.

I remember this was an issue during development of the Pimax, as on the outer sides of the large-FOV lenses there was incorrect distortion, but Pimax managed to fix it with software distortion correction.

 

I think that so far, unless specialized, all the VR APIs deal with the same lens distortion level, screen aspect ratio and viewports, which until today were always the same for the first generation of HMDs, like the Rift CV1 and HTC Vive. The WMR headsets, Vive Pro and Odyssey were not that different with regard to FOV, lenses and aspect ratio, with only slight differences in resolution, which is why the VR settings implemented into applications via OpenVR still work fine, and identically, for the whole variety of first-generation HMDs. Nowadays things are changing, and HMDs are going to differentiate more in the named aspects, which makes it necessary to have more options to recognize these differences and adjust the software settings to different kinds of HMD techniques.

 

I still think this is done with the compositor/API, not with the game engine, for each HMD.

 

In DCS, so far, only a few files corresponding to the VR pipeline could be identified:

 

In the main directory c:\Programs\Eagle Dynamics\DCS World\bin\openVR_API.dll

 

... which might be the main file that translates the VR settings/resolution requests from the compositor of the specific VR hardware to the game engine and back.

 

There is also in

C:\Programs\Eagle Dynamics\DCS World\Config\Monitorsetup\stereo.lua

 

... in which some adjustments can be made to the VR settings, possibly overwriting the default VR settings from the openVR_API.dll.

 

And there is also in:

C:\Users\<username>\AppData\Local\openvr\...

 

... to which, I think, the openvr_api.dll in the DCS directory communicates back; the OpenVR files from the openvr directory then communicate with the specific HMD compositor (SteamVR, Oculus, WMR, Pitool).

But please don't take my findings for granted, as what is described is only conjecture, not knowledge.

 

If you like, try editing the following values for the viewport of the Pimax in Large FOV mode (!) within the stereo.lua in DCS and have a look whether the vision with the Pimax in Large FOV mode becomes more adequate:

 

Viewports =
{
    Left =
    {
        x = 0;
        y = 0;
        width = screen.width * 0.5;
        height = screen.height;
        viewDx = 0;
        viewDy = 0;
        aspect = screen.aspect / 2;
        eye_shift = -0.032;
        -- tans of side angles
        projection_bounds =
        {
            left   = -2.769231,
            right  =  1.346154,
            top    =  1.269841,
            bottom = -1.269841,
        }
    },
    Right =
    {
        x = screen.width * 0.5;
        y = 0;
        width = screen.width * 0.5;
        height = screen.height;
        viewDx = 0;
        viewDy = 0;
        aspect = screen.aspect / 2;
        eye_shift = 0.032;
        -- tans of side angles
        projection_bounds =
        {
            left   = -1.346154,
            right  =  2.769231,
            top    =  1.269841,
            bottom = -1.269841,
        }
    },
}

 

 

I think the original values for left, right, top and bottom in the stereo.lua are fine for the usual first-generation HMDs, but need to be adjusted for a more correct viewport for the large FOV.
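If those projection_bounds values really are tangents of the frustum half-angles (as the "tans of side angles" comment suggests), they can be converted back to degrees to sanity-check the field of view. This is plain arithmetic, not a DCS API:

```python
import math

# Assuming the projection_bounds values are tangents of the frustum
# half-angles (per the "tans of side angles" comment in stereo.lua),
# convert them back to degrees to sanity-check the per-eye FOV.
bounds = {"left": -2.769231, "right": 1.346154,
          "top": 1.269841, "bottom": -1.269841}

angles = {k: math.degrees(math.atan(abs(v))) for k, v in bounds.items()}
for k, a in angles.items():
    print(f"{k}: {a:.1f} deg")

h_fov = angles["left"] + angles["right"]   # total horizontal FOV per eye
v_fov = angles["top"] + angles["bottom"]   # total vertical FOV
print(f"per-eye horizontal FOV ~ {h_fov:.1f} deg, vertical ~ {v_fov:.1f} deg")
```

With the numbers above this works out to roughly a 123° horizontal by 104° vertical frustum per eye, strongly asymmetric towards the outside, which at least is plausible for a large-FOV headset.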

 

The canted displays are not so much of a problem, except for the VR zoom, if the tangent space for the large-FOV displays and the angle to the eye are corrected.

 

I know what you mean: the 3D image with the canted displays (currently we only have the Pimax as a reference) does not feel right, and there is always the impression of overlapping images, even without zoom. It can be visualized by focusing on contrails from another plane in the distance. When the contrails are viewed at a side angle they are visible; when you focus on them directly, they become invisible, because - as I think - while focusing directly, they fall into an area where the images overlap but are no longer visible on the displays. This is for sure an incorrect projection of the viewports onto the panels, but I am not so sure whether it depends on the canted panel design alone.

 

When looking straight into the canted display, there is no difference to a planar-display HMD, besides some feeling of incorrectness in the center of the combined left and right viewports: the eye looks to a self-defined focus point, and although the image is flat, the 3D distortion is translated correctly to trick the brain into a three-dimensional image with depth.

It appears flat because of the viewport correction done in the compositor. During development of the Pimax, the outer edges of the viewport appeared stretched, but this could be adjusted/corrected with further development.

 

What I think happens with the VR zoom is that the brain suddenly gets confused, because when zooming in, the formerly self-defined focus point of the eye into the image now reveals a wrong angle from the eye to the screens.

Difficult to explain, and only what I think happens...

Without zooming and looking straight ahead, the brain thinks the image is straight in front, as it should be, and the 3D effect works. At this stage we can set the focus of the eye wherever we want in the picture; the compositor compensates with viewport correction so that the image does not get stretched towards the edges because of the canted design.

 

But when VR-zooming, the zoom works with a fixed focus point at the center of the displays, offset by approx. 20° from the real focus point of the eye straight ahead. Now, when zooming in, the eyes, while focusing on a point at 0°, perceive the pictures zooming in from another focus point (the center of the displays).

As we aren't chameleons, the brain somehow overlaps both images. This can be followed by closing one eye and zooming in with one eye only, which works fine for the brain. It can also be observed, while zooming in with the canted displays, that we suddenly have two different focus points, each moving to one side, away from the shared focus point we had before zooming.

 

So what I think needs to be done is a shifting of the focus points of the left and right viewports, aligned with the degree of zoom, to compensate for the approx. 20° angle of the canted design when looking at a focus point straight ahead.

 

@remi

Don't worry, everything will be fine with the Index in DCS; only, the Index might have the same issue with VR zoom as the Pimax, if Valve doesn't have an algorithm to compensate for the VR zoom.

With the last SteamVR update, SteamVR already prepared for the coming Index and now recognizes different panel frequencies... maybe an adjustment for the canted design will follow within SteamVR.

 

EDIT: I just rethought it. I think you're completely right: the focus point of the screens for the canted displays needs to be corrected in general through the viewport, to eliminate the issues with canted-display HMDs.

Edited by - Voight -

AH-64D  Apache  /   F-16C Viper  /   F1 Mirage   /   Mi-24 Hind  /   F-14b Tomcat

Posted

Interesting finding with the DLLs; however, I don't own a Pimax, so I can't test it. I don't have another VR HMD either (but I plan to make the jump to VR later this year, when I get a new computer).

 

With the files you found, I wonder though whether those values are not more of a fallback, since I think we both agree that the game would receive all the necessary data from the VR API directly.

So when you change your HMD, maybe to another Pimax model, I would expect the Pimax API/driver to detect the new HMD and send the values to the supported program - in our case DCS.

My improved* wishlist after a decade with DCS ⭐⭐⭐⭐🌟

*now with 17% more wishes compared to the original

Posted (edited)

EDIT: I just rethought it. I think you're completely right: the focus point of the screens for the canted displays needs to be corrected in general through the viewport, to eliminate the issues with canted-display HMDs.

 

Yes, if it's supposed to be done properly, that would be the only way. However, I was wondering why Pitool's projection compatibility setting is able to compensate for it (at least I think it can compensate for it; as mentioned above, I don't own a Pimax) just by distortion correction.

 

I now realize that my comparison to cross-eyed binoculars is a little bit off, and I think you are also a little bit off when you think of it as a divergence problem (you wrote "focus point", but I assume you mean divergence).

 

I think it is better to see it as a perspective problem, since the HMD's displays, while tilted, are still in front of your eyes (more or less). See this example of perspective correction:

zperspector.jpg

 

This image is "corrected" (or artistically modified) vertically, but in VR we have to worry about horizontal correction. In this example you can also see why a correction in software requires more resolution and a higher FOV to begin with.

It is worth noting that in this example the "corrected" image is technically wrong (it's an impossible perspective); in photography, however, it is often a desirable artistic decision.

In VR you want the physically correct perspective, not a synthetic perspective that might look good when viewed on a flat screen.

So for VR we want to go from a skewed image to a natural image, while in flat photography/cinematography (where the sample image is from) you sometimes want to go from a natural perspective to a synthetic one.

 

The nice part is: if in photography/cinematography you wanted to modify the perspective without losing resolution from your sensor, you would use a special lens or camera that lets you change the geometry of the optical system. And this is what would be done in the renderer when you account for the canted screens by modifying the virtual projection targets (or viewports, as you would call them in computer terminology?).

 

It's a bit harder to see why the problem is so much more pronounced with VR zoom, but I think this is just because VR zoom is in itself less natural, and while non-zoomed VR is within the tolerance of the brain (both pictures above look "proper" at first sight), the combination of zoom and skewed perspective overwhelms the brain.

Also, if you zoom into a photograph that has strong "artistic" perspective manipulation applied (in other words, a synthetic perspective), it looks more off, while zoomed out it feels more natural (at least to my eyes).

 

So I think it is still valid to say that "proper" support for canted-display HMDs like the Pimax or Index can only be achieved if the HMD's screen geometry is accounted for in the game engine's renderer.

Post-process solutions will work, but they will require more performance, due to the overhead in resolution and FOV needed for the correction.
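As a rough illustration of that overhead (with invented angles, not real Pimax numbers): covering a canted eye's frustum with a single forward-facing "parallel" frustum widens the required image plane superlinearly, because the tangent function is convex.

```python
import math

# Rough illustration of the resolution overhead of parallel-projection
# compatibility. The angles below are invented examples, not Pimax specs.
# A display canted outward by `cant` degrees covers rays from -inner to
# +outer about its own normal; a forward-facing frustum covering the same
# rays needs half-angles (outer + cant) and (inner - cant).

def width_at_unit_distance(outer_deg, inner_deg):
    return math.tan(math.radians(outer_deg)) + math.tan(math.radians(inner_deg))

outer, inner, cant = 60.0, 50.0, 10.0
canted = width_at_unit_distance(outer, inner)
parallel = width_at_unit_distance(outer + cant, inner - cant)

overhead = parallel / canted - 1.0
print(f"extra horizontal image-plane width needed: ~{overhead:.0%}")
```

With these example angles the forward-facing plane is roughly a fifth wider per eye, and the larger the outward half-angle, the worse it gets, which is consistent with the reported performance hit of the compatibility mode.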

Edited by twistking

My improved* wishlist after a decade with DCS ⭐⭐⭐⭐🌟

*now with 17% more wishes compared to the original

Posted (edited)

hmmm... not to mix things up too much, and to keep it simple, I made some drawings.

You have to excuse the bad painting; I'm a writer, not a drawer.

 

Picture #1 should show a planar HMD design.

The perspective and focus of the natural view and the synthetic focus of the viewport are aligned. The brain surely takes the natural view to focus the eyes.

 

Picture #2 should show a non-planar HMD or canted HMD design.

The perspective and focus of the natural view (eye/green) are always straight ahead. The brain always follows its nature; it can be tricked by synthetic distortion into a synthetic 3D vision, but not when it comes to focus.

Picture #2 is the normal view within the viewport of canted displays. The brain simply doesn't care about the wrong focus of the viewport (pink), which is still at a 90° angle, like on planar displays.

This is what, in my opinion, needs to be corrected in the viewport: maybe +20° for the left eye and -20° for the right eye, to compensate for the approx. 20° angle of the displays. This is not so much of a problem, as we can still create one 3D picture in the brain. But when zooming, things get off, because the VR zoom follows the focus line of the viewport, which is not aligned with the natural view. The brain then tries to adjust the wrong angle of the viewport focus to the natural eye focus, which results in an overlapping 3D picture.

 

Picture #3 should show the process of the VR zoom and what we can observe while zooming with the canted displays, where the synthetic viewport focus line is not corrected to the natural focus line of the eyes. The brain forces the pictures of the left and right eye into line with the natural focus line of the eyes and creates a wrong picture.

 

Don't get me wrong, I'm not a programmer. DCS is my beloved hobby, but in my real life I'm also very interested in virtual reality, and hopefully soon I can afford an Insta360 Pro VR camera, which films VR in 8K resolution.

 

Man, you should really get a VR headset soon; it's such a great thing in DCS. I would never want to miss it.

picture_1.JPG.153b7156ac94d64a233a618315b3f84a.JPG

picture_2.JPG.1e0a31c04b8072d95393f2d5cc604c65.JPG

picture_3.JPG.a3afa61804be7700b91a8532b1a6eeaf.JPG

Edited by - Voight -

AH-64D  Apache  /   F-16C Viper  /   F1 Mirage   /   Mi-24 Hind  /   F-14b Tomcat

Posted
hmmm... not to mix things up too much, and to keep it simple, I made some drawings.

You have to excuse the bad painting; I'm a writer, not a drawer.

 

Looks good :thumbup:

 

However, if I understand it correctly, that is more in line with what I was describing with the "cross-eyed binoculars" issue, while I now think that the problem is actually less complex, and more in line with what I was describing above with the picture of the church.

 

I changed your image to be more in line with what I think is the main problem.

Notice that in my version both cameras still point forward, and the projection onto the canted target "only" results in a skewed image.

I think that would be enough to get messy when zooming in, but maybe you are also right, and there is an axis-convergence problem in addition.

 

Do you have a Pimax? How does the VR zoom in DCS work when "parallel projection compatibility" is activated? Is it completely corrected?

Also, can you carefully test whether, without "parallel projection compatibility", the view is correct when not zooming? I suspect that even without zoom the image should be ever so slightly skewed, as shown in the image. Probably most users just don't notice.

picture_4.jpg.9e7ebfeb764a1102c447096b88152fd1.jpg

My improved* wishlist after a decade with DCS ⭐⭐⭐⭐🌟

*now with 17% more wishes compared to the original

Posted (edited)

Evening,

 

we were not so far off about how it works, I think.

 

But there's a difference. The viewports, or let's describe them as the virtual cameras in the virtual reality, are determined by the screens and are looking at the screens (see picture 4).

 

Picture 5 only shows the viewport three-dimensionally.

 

The perspective corrections of the pictures of the buildings are hard to compare with what we have in VR. But in the compositor a correction is also applied to the frames the engine creates, to match the frames to the lenses in the HMD in a way that corrects the distortion resulting from the shape of the lenses (which is comparable to the building in the picture before and after correction).

 

The effect that can be seen in the pictures of the A-10 is exactly what can be corrected for the Pimax Large FOV by editing the stereo.lua, as described before.

The values I copied in are more adequate distances from the eye to the top, bottom, left and right of the viewports for each eye, as seen in picture 5.

 

The new SteamVR update now detects the frequency of the HMD, which is good, but so far there is no adjustment for the angle of the canted displays to correct the viewport and the focus line. But I'm pretty sure that Valve will provide a solution for the Index; we will have to see whether what works for the Index could also work for the Pimax with regard to the canted displays.

 

"Parallel Projection" is very interesting, as we already talked about VorpX before.

The Pimax 5k+ is now my fourth headset, and in the early days with my first, the HTC Vive, I played around a bit with VorpX. It depended on the game whether VorpX worked more like "fake" VR or "real" VR. Actually, the lack of information about the objects in the depth of the 3D space (which the z-buffer provides) forced VorpX into the "fake" 3D mode; if the application was programmed with a z-buffer, VorpX worked like "real" VR.

 

I think the "Parallel Projection" in Pitool does exactly the same as VorpX and makes the Pimax run with games or applications that are not programmed with a z-buffer.

DCS has z-buffer programming, so there is no need for parallel projection.

 

But it is interesting why the VR zoom seems to work in DCS with parallel projection on the Pimax.

My conclusion is simple: parallel projection ignores the position of objects in the depth of the 3D space, and it looks like it also ignores the focus line of the viewport, which can also be seen as the scale for the depth of the picture. There is actually no zooming into the depth of the picture along the focus line.

What can be observed when VR-zooming with parallel projection is that things don't appear closer, but rather more tiny than without zoom. This effect can also be seen in 3D cinema, when you watch a 3D movie with shutter glasses. In the cinema, the pictures of the movie also lack depth information, and in scenes, let's say a landscape with a view into the distance, objects appear tiny as soon as the eye has a reference from these objects to objects in the "depth" of the scene. Compared to VR, 3D cinema is like "fake" 3D.

Parallel projection should never be activated in DCS: it degrades the depth information of objects in DCS and costs much more performance. Parallel projection also doesn't work right with VR zoom.

picture_4.JPG.54bd6e33345f3b69adbda89564213b29.JPG

picture_5.JPG.a174666f6768425b7ebc9b6753289c10.JPG

Edited by - Voight -

AH-64D  Apache  /   F-16C Viper  /   F1 Mirage   /   Mi-24 Hind  /   F-14b Tomcat
