

Why does DCS have an IPD/world scale setting at all? Why would a player want to change that unless they want to make a first-person Godzilla movie? 😆

It seems like an odd setting to have and it’s just going to cause confusion. The headset already has an IPD setting which should be enough.

The trouble with IPD in VR is that you’d need it set to within a fraction of a millimeter for it to be truly accurate. I don’t imagine many headsets offer tuning that fine, just a dial or slider. In most games being off a bit probably isn’t a big deal, but in a game where you sit in a cockpit any small deviation is going to be very noticeable.

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | 24GB GeForce RTX 4090 | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5


I just leave the IPD option unticked. That works best for me 🤷‍♀️


Ryzen 9 5900X | 64GB G.Skill TridentZ 3600 | Gigabyte RX6900XT | ASUS ROG Strix X570-E GAMING | Samsung 990Pro 2TB + 960Pro 1TB NVMe | HP Reverb G2
Pro Flight Trainer Puma | VIRPIL MT-50CM2+3 base / CM2 x2 grip with 200 mm S-curve extension + CM3 throttle + CP2/3 + FSSB R3L + VPC Rotor TCS Plus base with SharKa-50 grip mounted on Monstertech MFC-1 | TPR rudder pedals

OpenXR | PD 1.0 | 100% render resolution | DCS "HIGH" preset

 


1 hour ago, SharpeXB said:

...any small deviation is going to be very noticeable.

Not really that much.

  • Like 1

🖥️ Win10  i7-10700KF  32GB  RTX3060   🥽 Rift S   🕹️ T16000M  TWCS  TFRP   ✈️ FC3  F-14A/B  F-15E   ⚙️ CA   🚢 SC   🌐 NTTR  PG  Syria


You feel the need for speed? Go into VR and fly through the Arc de Triomphe. The feeling of speed seems pretty accurate to me. Sorry for the guys on a flat screen constantly changing FoV to keep situational awareness 😉

 


Just now, darkman222 said:

You feel the need for speed? Go into VR and fly through the Arc de Triomphe. The feeling of speed seems pretty accurate to me.

You know because you’ve done this IRL? 😆

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | 24GB GeForce RTX 4090 | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5


Nope. I am a high-speed train driver, so I know how it has to look. Lol. Then how do people who just ride a bike in real life know how it's supposed to look?! That argument is really a lame starting point for a discussion.

As long as DCS is built from real-world values and your VR headset is not off, it should be as close as you can get.


Edited by darkman222
  • Like 3

8 minutes ago, darkman222 said:

Nope. I am a high-speed train driver, so I know how it has to look. Lol.

Fair enough 😃👍

But again there’s nothing wrong with DCS in this regard. Such an effect is easy to achieve. 

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | 24GB GeForce RTX 4090 | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5


I'd say it strongly depends on the hardware, including the VR headset you're using. I have the Pimax Crystal and the Varjo Aero. On the Crystal, the F-16 ICP, which sits very close to the pilot's head, hurts my eyes; I have to concentrate to focus on it. On the Aero it's no problem. Reading through this thread, I thought I might actually look into the DCS IPD setting.

But what I am saying is that if two different headsets already produce different impressions, the only way would be to build DCS around one certified VR headset as reference hardware, which is not feasible for a consumer product...


Edited by darkman222

14 hours ago, darkman222 said:

I'd say it strongly depends on the hardware, including the VR headset you're using. I have the Pimax Crystal and the Varjo Aero. On the Crystal, the F-16 ICP, which sits very close to the pilot's head, hurts my eyes; I have to concentrate to focus on it. On the Aero it's no problem. Reading through this thread, I thought I might actually look into the DCS IPD setting.

But what I am saying is that if two different headsets already produce different impressions, the only way would be to build DCS around one certified VR headset as reference hardware, which is not feasible for a consumer product...

 

What you describe really looks like the physical IPD setting of your Crystal needing to be set for you - on both of mine (Reverb G2, Quest 3) the only real impact of changing it is comfort when looking at the cockpit instruments. Nothing wrong on the DCS side on that one, I would say 😉


1 hour ago, zetikka said:

What you describe really looks like the physical IPD setting of your Crystal needing to be set for you

Yeah. But why does the same IPD setting in DCS produce different results/impressions with different headsets? I mean, if VR were a universal thing, then as long as my eyes stay in the same place when switching headsets, the result should be the same between them.


I can only guess here... Each headset is a different combination of micro-screens and optical lenses specific to the maker, so depending on the product you can have, in one case, a small (pixel-count-wise) screen paired with a strong set of lenses presenting you with the same FoV as a higher pixel-count screen paired with a less powerful set of lenses would.

As DCS only cares about the pixels, the result would look different to your eye.

And then you have the vendor with small screens who wants to sell you a huge FoV and uses huge lenses, and you get a 55" VGA display... 😉

(edited for typos)


Edited by zetikka
  • Like 1

World scale and IPD are different things. Also, when we say "IPD adjustment", we usually mean the distance between the user's eyes, which is adjusted using either motorized lenses or a physical slider on the headset (or, for some early headsets, not at all). World scale, meanwhile, sets the "IPD" of the character within the sim, and calling it IPD, while you could justify it in the most literal sense, only causes confusion by implying it's some sort of software hack for a hardware problem. For instance, if you have a headset with no IPD adjustment, messing with the IPD setting in DCS is not going to make your experience any better. The commonly accepted term for what this setting does is "world scale", and thus calling it IPD adjustment is, for all intents and purposes, incorrect.
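For what it's worth, here's a minimal sketch (in Python, with made-up names - DCS's internals aren't public, so this is only an assumption about how such a setting generally works) of a world-scale control implemented as a multiplier on the per-eye offsets the headset reports:

```python
import numpy as np

# Hypothetical helper; assumes a "world scale" (in-game "IPD") setting simply
# scales the per-eye camera offsets the VR runtime reports for the headset.
def apply_world_scale(eye_offsets_m, world_scale):
    # eye_offsets_m: left/right eye offsets from the head pose, in meters.
    # world_scale > 1 spreads the virtual eyes apart -> the world reads smaller;
    # world_scale < 1 brings them together -> the world reads bigger.
    return [np.asarray(offset, dtype=float) * world_scale for offset in eye_offsets_m]

# Example: headset reports a 64 mm IPD, world scale set to 1.2
left, right = apply_world_scale([[-0.032, 0.0, 0.0], [0.032, 0.0, 0.0]], 1.2)
print(right - left)  # virtual camera separation is now ~76.8 mm
```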

  • Like 1

On 3/1/2024 at 4:02 PM, SharpeXB said:

It seems like an odd setting to have and it’s just going to cause confusion. The headset already has an IPD setting which should be enough.

My IPD is below the minimum range of my headset, so this setting is very welcome.

Options are always welcome, they can only add flexibility.

Besides, we can never be sure that in the whole VR stack, from the game to the light hitting our eyes, every stage is perfect and the end result is 100% accurate. On top of that, in VR we don't have eye lens accommodation, which is also an important visual cue IRL for perceived distances and scales and can confuse us when we try to tell whether the scale is correct. So the option for manual tweaking can have its purpose. Maybe eye lens accommodation is part of the problem being discussed here.

 

5 minutes ago, Dragon1-1 said:

World scale and IPD are different things. Also, when we say "IPD adjustment", we usually mean the distance between the user's eyes, which is adjusted using either motorized lenses or a physical slider on the headset (or, for some early headsets, not at all). World scale, meanwhile, sets the "IPD" of the character within the sim, and calling it IPD, while you could justify it in the most literal sense, only causes confusion by implying it's some sort of software hack for a hardware problem. For instance, if you have a headset with no IPD adjustment, messing with the IPD setting in DCS is not going to make your experience any better. The commonly accepted term for what this setting does is "world scale", and thus calling it IPD adjustment is, for all intents and purposes, incorrect.

It changes the distance between the render cameras; nothing is done to the world. The change in scale happens inside our brains.


Edited by average_pilot
  • Like 1

2 minutes ago, average_pilot said:

It changes the distance between the render cameras; nothing is done to the world. The change in scale happens inside our brains.

Yes, exactly, and this is what the VR industry calls "world scale". As your own example shows, calling it "IPD" caused you to assume you could use it to work around your headset's inadequate IPD adjustment range. You're wrong on that count. What's happening is your brain compensating for the headset's IPD being off.

  • Like 1

7 minutes ago, Dragon1-1 said:

Yes, exactly, and this is what the VR industry calls "world scale". As your own example shows, calling it "IPD" caused you to assume you could use it to work around your headset's inadequate IPD adjustment range. You're wrong on that count. What's happening is your brain compensating for the headset's IPD being off.

Ok, now I get what you mean. You are right, it can be a source of confusion.

  • Like 1

22 minutes ago, average_pilot said:

My IPD is below the minimum range of my headset, so this setting is very welcome.

The in-game setting, regardless of what it is called, can't help you with that, though...

Physical IPD (your eyes and the lens system) and in-game IPD (or whatever term you see fit) are and do completely different things, as Dragon1-1 rightfully explained.

Edit: Oops, he was faster...

I have one little gripe though. The claim that the virtual IPD alone causes world scaling doesn't seem right to me. First of all - from my own experience with very early 3D games such as Quake (using early shutter glasses from Nvidia, for example) - setting a bigger value for the camera separation only increased or decreased the depth perception, but did nothing to the world scale. Also - how is the vertical axis affected by that?

Maybe it works totally differently with VR headsets than with 3D glasses, but I can't see why right now.

(Should be easy to test though, with a 2D mirror that only shows one eye's picture...)


Edited by Hiob
  • Like 1

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"


16 minutes ago, Hiob said:

The in-game setting, regardless of what it is called, can't help you with that, though...

Physical IPD (your eyes and the lens system) and in-game IPD (or whatever term you see fit) are and do completely different things, as Dragon1-1 rightfully explained.

Edit: Oops, he was faster...

 

Yes, it helps, because my headset is sending an IPD value that is wrong for me, and DCS would use that value to set up the separation between the cameras, which has an effect on my perception of scale. Although, like SharpeXB said, not really that much for such a small difference.

So IPD is the distance between my actual eyes, but it is also the separation between the optical centers of my headset's lenses, and also the distance between the cameras rendering the game. I should add that ideally these three values should match perfectly. In practice, things can be complicated.


Edited by average_pilot
  • Like 1

37 minutes ago, average_pilot said:

Yes, it helps, because my headset is sending an IPD value that is wrong for me, and DCS would use that value to set up the separation between the cameras, which has an effect on my perception of scale.

OK, I wasn't aware that there is a synchronization between hardware and software IPD. Interesting.

However - I'd still be grateful if somebody could explain to me how exactly IPD changes the world scale - outside of depth perception, that is.

  • Like 1

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"


10 minutes ago, Hiob said:

OK, I wasn't aware that there is a synchronization between hardware and software IPD. Interesting.

However - I'd still be grateful if somebody could explain to me how exactly IPD changes the world scale - outside of depth perception, that is.

If the separation between the left and right cameras inside the game is greater than the separation between your eyes, you perceive the world you see through the VR headset as smaller than real life. If it is the other way around, you perceive it as bigger. The greater the mismatch, the more pronounced the effect.

Google Earth VR exploits this effect to change the perceived scale of the world as you change your altitude.
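To put some rough numbers on that, here's a back-of-the-envelope sketch (hypothetical helper, simple vergence geometry - not anything from DCS or a VR runtime) of how a mismatch between the rendered camera separation and your physical IPD maps to a perceived scale factor:

```python
import math

# Rough vergence geometry for the effect described above (hypothetical helper).
def perceived_scale(physical_ipd_m, camera_sep_m, true_distance_m):
    # The renderer draws an object at true_distance_m using cameras
    # camera_sep_m apart, so the vergence angle delivered to your eyes is
    # 2 * atan(sep/2 / distance). Your brain decodes that angle with its own
    # baseline (physical_ipd_m), which yields a perceived distance and scale.
    vergence = 2.0 * math.atan2(camera_sep_m / 2.0, true_distance_m)
    perceived_distance = (physical_ipd_m / 2.0) / math.tan(vergence / 2.0)
    return perceived_distance / true_distance_m  # ~= physical_ipd / camera_sep

# 64 mm eyes looking at a scene rendered with 80 mm camera separation:
print(perceived_scale(0.064, 0.080, 10.0))  # ~0.8 -> world appears ~20% smaller
```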


15 minutes ago, average_pilot said:

If the separation between the left and right cameras inside the game is greater than the separation between your eyes, you perceive the world you see through the VR headset as smaller than real life. If it is the other way around, you perceive it as bigger. The greater the mismatch, the more pronounced the effect.

Google Earth VR exploits this effect to change the perceived scale of the world as you change your altitude.

I know how it's supposed to work and how you think it works. That wasn't my question.

I'm interested in an explanation of the exact mechanism that makes it work.

As I experienced in my example with rather simple 3D shutter glasses, increasing the camera IPD in-game led to greater depth perception, but not to a change of scale, IIRC. And I fail to see how a change in the horizontal distance between the cameras should affect the vertical size of objects. "Your brain figures it out" is not a sufficient explanation for me, since such adjustments (of which the brain is absolutely capable) are not instant.


Edited by Hiob

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"


45 minutes ago, Hiob said:

I know how it's supposed to work and how you think it works. That wasn't my question.

I'm interested in an explanation of the exact mechanism that makes it work.

As I experienced in my example with rather simple 3D shutter glasses, increasing the camera IPD in-game led to greater depth perception, but not to a change of scale, IIRC. And I fail to see how a change in the horizontal distance between the cameras should affect the vertical size of objects. "Your brain figures it out" is not a sufficient explanation for me, since such adjustments (of which the brain is absolutely capable) are not instant.

 

With shutter glasses and similar setups, the stereo separation changes the physical position of the left and right images on the screen, and with it how much you have to converge to look at objects rendered in front of or beyond the screen. The result is how strong the stereo effect is. Coupled with it you have the convergence setting, which tells at which distance inside the game objects converge into a monoscopic image on the display. This second one can have an effect on the scale of things if it doesn't match the IRL distance of the viewer to the screen.

I'm not sure if that is what you are referring to, because both in VR and with shutter glasses, changing the 'camera IPD' in-game has the same effect.

Changing the distance between the cameras changes how much you have to converge or diverge your eyes to merge the left and right images of an object into a single image. That's where the "magic" of stereopsis lies, or why looking at the world from two different vantage points feels 3D-ish.

Imagine you are far enough from a building that both your eyes are parallel when looking at it. It is perceived as a big object relatively far from you. Now move your eyeballs apart from each other far enough that, to merge the left and right images of the building as they form on the back of your eyes, you need some convergence angle. The building will now look like a smaller object closer to you. That is what happens when you change the cameras in the game: moving them closer or farther makes you change how much you converge your eyes to focus on different objects, and with it the perception of sizes and distances.
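If it helps, a quick numerical illustration of that building example (again just a sketch with assumed numbers, not code from any sim):

```python
import math

# Convergence angle (degrees) between the two lines of sight for an object
# at distance_m when the viewpoints are baseline_m apart.
def vergence_deg(baseline_m, distance_m):
    return math.degrees(2.0 * math.atan2(baseline_m / 2.0, distance_m))

building = 200.0  # meters away
print(vergence_deg(0.064, building))  # ~0.02 deg with 64 mm eyes: essentially parallel
print(vergence_deg(10.0, building))   # ~2.9 deg if the "eyes" were 10 m apart

# With normal 64 mm eyes, a ~2.9 deg vergence angle corresponds to an object
# only about 1.3 m away, so the widely spaced views read as a small, nearby model.
print((0.064 / 2.0) / math.tan(math.radians(2.9) / 2.0))
```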


Edited by average_pilot

15 minutes ago, average_pilot said:

With shutter glasses and similar setups, the stereo separation changes the physical position of the left and right images on the screen, and with it how much you have to converge to look at objects rendered in front of or beyond the screen. The result is how strong the stereo effect is. Coupled with it you have the convergence setting, which tells at which distance inside the game objects converge into a monoscopic image on the display. This second one can have an effect on the scale of things if it doesn't match the IRL distance of the viewer to the screen.

I'm not sure if that is what you are referring to, because both in VR and with shutter glasses, changing the 'camera IPD' in-game has the same effect.

Changing the distance between the cameras changes how much you have to converge or diverge your eyes to merge the left and right images of an object into a single image. That's where the "magic" of stereopsis lies, or why looking at the world from two different vantage points feels 3D-ish.

Imagine you are far enough from a building that both your eyes are parallel when looking at it. It is perceived as a big object relatively far from you. Now move your eyeballs apart from each other far enough that, to merge the left and right images of the building as they form on the back of your eyes, you need some convergence angle. The building will now look like a smaller object closer to you. That is what happens when you change the cameras in the game: moving them closer or farther makes you change how much you converge your eyes to focus on different objects, and with it the perception of sizes and distances.

 

How does that work in the vertical?

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"


36 minutes ago, Hiob said:

How does that work in the vertical?

With shutter glasses it doesn't work at all; they force you to keep your eyes level. If you tilt your head, your eyes will try to compensate to keep the stereopsis happening, but it's unnatural and very uncomfortable, and the eyes soon give up and the illusion fades away. In fact, with shutter glasses you are supposed to keep your head as still as possible. In CAVE-like systems, head tracking is used to account for the position and orientation of the user, and the software changes the rendering accordingly.

VR is equivalent to real life because the screens are attached to your head and move with you. Plus, the game knows where the displays are, because the headset provides parameters about its geometry as part of how the API works, and there is head tracking, so what the screens display always corresponds to what you'd see if the game were real, or if you were inside the game for real. That's the goal, after all.

If you mean how this eye convergence thing I explained before works for the size of objects in the vertical dimension, then I only have speculation. (Damn! I shouldn't have used a building as an example! :D)

Intuitively, I'd say that scanning the environment with the eyes and the head completes the picture. On top of that, there is also eye lens accommodation, plus knowledge about the objects you see and their expected dimensions.
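On the head-tracking point above, here's a tiny sketch (generic math with made-up names, not any real VR API) of why a head-mounted display keeps working when you tilt or roll your head: the per-eye offsets are defined in head space and rotate with the tracked head pose, so the stereo baseline follows your eyes instead of staying locked to a fixed screen:

```python
import numpy as np

# Sketch only (not a real VR API): compose the tracked head pose with per-eye
# offsets defined in head space. Because the offsets are rotated by the head
# orientation, the stereo baseline tilts and rolls with your head - unlike
# shutter glasses in front of a fixed screen, where the two images stay
# locked to the screen's horizontal axis.
def eye_positions(head_pos, head_rot, half_ipd=0.032):
    offsets = np.array([[-half_ipd, 0.0, 0.0],   # left eye in head space
                        [ half_ipd, 0.0, 0.0]])  # right eye in head space
    return [head_pos + head_rot @ offset for offset in offsets]

# Head rolled 90 degrees about the view axis: the eye-to-eye axis is now vertical.
roll90 = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
left, right = eye_positions(np.zeros(3), roll90)
print(right - left)  # baseline now points along the world "up" axis
```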


Edited by average_pilot

It’s interesting to realize that your vision and phenomena like depth perception are just signals your brain interprets. There was a case of a blind person having their vision restored who just couldn’t adapt to it. Depth perception is as much a learned behavior as it is a mechanical or physical product of stereo vision. You could be fitted with prismatic glasses that made you see upside down, and given enough time your brain would adapt to it.

  • Like 2

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | 24GB GeForce RTX 4090 | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5

