
World Scale is off (Everything is tiny!)


NakedSquirrel

Recommended Posts

If it were FOV, you would see some distortion at the edges of the screen, objects right next to you would appear as if they were in front of you, you would get motion sick, and 1:1 head tracking would feel strange. This is simply a general scale issue. Yes, objects are the correct size relative to each other, but that doesn't mean the scale is 1:1 with the real world.

And I will repeat it again: everyone who can't picture the scale in a 3D world, go download the demos of X-Plane/FlyInside, play with the scale setting, and then come back here and say what it is.

 


 

Another +1 here.


All that three-dimensional vision provides is a relationship between objects in three dimensions. What you perceive your own size to be is determined by external factors, i.e. your own perception of reality versus virtual reality.

 

When I see comments like this it makes me think that you have never actually tried VR but are equating it to something like watching a 3D movie.


I have sat in most of the aircraft depicted in DCS at airshows, museums, etc. The size of things in DCS VR doesn't seem off at all to me. If anything, the MiG-21 cockpit in game seems a bit bigger than the real one, but that might be because it is so small that your body fills a good proportion of it once you get in, and I usually fly with the pilot model off in game.


 

Callsign: BUNZ

 

https://www.5vwing.com/


I have done some tests with this as well, and I can say for certain that the pilot models are kind of funky in what seems to be every module where they exist. I am not a big guy, 5'10" 165 lbs, average hands and arms, not fat nor super skinny, and the pilot model is quite a bit smaller, with odd proportions: long forearms, tiny hands, thin thighs, long shins, narrow shoulders. He basically looks like Capt. America before he got the special treatment. This seems to be a factor in all of this too, perhaps.

I use a center stick and can position my stick in the exact same spot as shown in VR, then pop the headset off and on, back and forth, and compare the real world with the virtual one from the same spatial location. My hands are huge compared to the VR pilot's, and the stick in the A-10 does seem a tad smaller too. Again, I am flipping between the real world and VR quickly, or even peeking around the headset, so you can compare directly.

It seems a properly sized and properly proportioned pilot model would help this issue. I am not sure how it works with the various devs, but it would seem best for ED to create a pilot base model with correct proportions and provide that as the template for all devs' pilot models to be built from. That way there is uniformity across the board. If that is how it works currently, then ED needs to make a new pilot model, because the current one is funky.

 

I mean, the Warthog stick is a correct-size copy of the real one as represented in the A-10, right? Isn't that one of the selling points of the Warthog, that it has the same design and size as the real stick? Or is this incorrect as well?


Edited by Torso

 

We all know binoculars aren't magical grow and shrink rays (as much as we wish they were). They simply give us a different perspective (zoom).

 

 

Perspective is a point in space; zoom is a change of field of view.

 

To understand this, try the following:

 

Take a piece of cardboard and cut a square hole in it.

Move the cardboard in front of your face, bringing it closer to your eyes and then further away.

 

Your head position is the perspective point, the distance of the cardboard from your eyes sets the field of view, and the motion of moving it closer or further away is zooming (in photography this would be changing the focal length).

So if you want to change your perspective, you can't do it by zooming (changing the field of view); the only way is to move your head position in space.
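
To put rough numbers on this, here is a tiny Python sketch (just an illustration of the geometry, nothing DCS-specific): with a pinhole camera, changing the focal length only scales the whole image uniformly, while moving the camera actually changes how near and far objects sit relative to each other.

```python
# Toy pinhole camera: a 3D point lands on the image at (f*x/z, f*y/z).
# "Zooming" changes f (the field of view); "perspective" is the camera position.

def project(point, cam_pos, focal):
    """Project a 3D point onto the image plane of a pinhole camera at cam_pos."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    return (focal * x / z, focal * y / z)

near = (1.0, 0.0, 2.0)   # object 2 m ahead, 1 m to the side
far = (1.0, 0.0, 10.0)   # same sideways offset, 10 m ahead

# Zooming: doubling the focal length scales BOTH projections by exactly 2.
# The picture is only magnified; the relationship between the objects
# (relative positions, parallax) does not change at all.
for f in (35.0, 70.0):
    print(f, project(near, (0.0, 0.0, 0.0), f), project(far, (0.0, 0.0, 0.0), f))

# Moving the camera 0.5 m sideways shifts the near object's image far more
# than the far object's -- that is an actual change of perspective.
for cam in ((0.0, 0.0, 0.0), (0.5, 0.0, 0.0)):
    print(cam, project(near, cam, 35.0), project(far, cam, 35.0))
```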

 

 

VR glasses are really nothing more than a display split in two, and the machine has to render two different perspectives of the 3D space instead of the single one required for 2D. This is the same as in real-world 3D videography, where two cameras record simultaneously with the same focal length but a slight separation along the X axis. Usually this is done with a beam-splitter rig: one camera points straight down, the other records straight ahead, and a 45-degree mirror in front of both splits the light in two, 50% passing through the mirror and 50% being reflected upward. This allows the downward-pointing camera to be shifted sideways to get the perspective separation. That separation has to be calculated based on the 3D effect you want and the distance you want in focus: a close-by subject needs a smaller separation, a far-away subject needs more.

 

What software allows is changing the virtual cameras' distance from each other, since a virtual camera has none of the physical limitations (which is why the two cameras in 3D videography are mounted one pointing down and one pointing straight ahead: their bodies simply can't get close enough to each other), and that separation controls the 3D effect you get for both eyes. To be realistic, it should equal the distance between your pupils; then, if the 3D model is correct and the virtual camera perspective (the distance to everything in the 3D model) is correct, you would experience realistic 3D in that cockpit. But because this is 3D virtuality, we can simply change that 3D effect on the fly, suddenly making it stronger or weaker just by adjusting the virtual cameras' separation.
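
To give a rough feel for how that separation drives the 3D effect, here is another small sketch (plain Python with made-up numbers; this is not how DCS or any particular engine computes it): the screen disparity of a point falls off with distance, so the same separation gives a strong effect up close and almost none far away, which is also why the filming rig described above chooses its separation based on subject distance.

```python
# Rough stereo-disparity sketch: two virtual cameras separated along X by
# `sep` metres, both looking down +Z with focal length `focal_px` (in pixels).
# The horizontal disparity between the left and right image of a point at
# depth `z` is approximately focal_px * sep / z.

def disparity_px(sep_m, depth_m, focal_px=1000.0):
    """Approximate horizontal disparity (pixels) for a point at depth_m."""
    return focal_px * sep_m / depth_m

human_ipd = 0.064  # ~64 mm, a typical adult inter-pupillary distance

for depth in (0.7, 2.0, 10.0, 100.0):   # cockpit panel ... distant terrain
    print(f"{depth:6.1f} m -> {disparity_px(human_ipd, depth):7.1f} px")

# Doubling the virtual camera separation doubles every disparity, i.e. a
# stronger 3D effect; halving it flattens the image toward 2D.
print(disparity_px(2 * human_ipd, 2.0), disparity_px(0.5 * human_ipd, 2.0))
```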

 

But it would look very odd if done dynamically, as real eyes don't work that way.

 

What can be done also depends a lot on the VR glasses' optics, panel size, magnification, and so on. A typical human visual field of view is somewhere between 170 and 190 degrees; it is just a personal thing. I, for example, have around 182 degrees where I can detect motion, but that is with both eyes together. Because the nose ridge between the eyes keeps each eye from seeing far toward the other side, it really comes out to more like 110-120 degrees of field of view per eye, with the two overlapping to give the widest possible total.

 

I don't know how the VR rendering in DCS operates; most likely both virtual cameras point in the same direction, because each display is still just a flat panel set straight in front of the eye, rather than angled with a slight separation, which would cause physical difficulties for human eyes.

And this creates a problem: the two virtual camera angles can't be set differently from each other, which seriously limits the total field of view.

 

So the fix for that is likely just to adjust the virtual focal length (field of view) so it is shorter (wider) to get a wider field of view, and that then requires a different 3D projection to fit the circular surroundings into the rectangular 2D frame.

That is, in 2D we typically use a rectilinear projection, which means the edges of the screen are enlarged more than the center. A more realistic projection would be cylindrical or spherical, where the edges of the screen are smaller than the center. That would allow a wider field of view per eye, while letting the human brain naturally correct it back.
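
To make the projection difference concrete, here is a small comparison (again an illustrative sketch, not DCS's actual projection math): a rectilinear mapping puts a ray at angle theta at radius f*tan(theta) on the image, so magnification blows up toward the edges, while an equidistant "fisheye-style" mapping uses f*theta and keeps the edge magnification bounded, which is what allows a wider field of view per eye.

```python
import math

# Where a ray at angle `theta` from the optical axis lands on the image plane:
#   rectilinear (standard 2D games):  r = f * tan(theta)
#   equidistant "fisheye"-style:      r = f * theta
# The local magnification is dr/dtheta; for rectilinear it is f / cos(theta)^2,
# which explodes near the edge of a wide field of view.

f = 1.0  # focal length in arbitrary units

for deg in (0, 20, 40, 60, 80):
    theta = math.radians(deg)
    rect_r = f * math.tan(theta)
    fish_r = f * theta
    rect_mag = f / math.cos(theta) ** 2   # edge stretching of rectilinear
    fish_mag = f                          # equidistant keeps it constant
    print(f"{deg:2d} deg  rectilinear r={rect_r:6.2f} (mag {rect_mag:6.2f})"
          f"   fisheye r={fish_r:5.2f} (mag {fish_mag:.2f})")

# At 80 degrees off-axis the rectilinear image is stretched ~33x compared to
# the centre, which is why a single rectilinear view cannot usefully cover a
# very wide field of view.
```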

 

But that would definitely create an illusion that the cockpit is smaller than in 2D, and in some cases even smaller than in reality, because the virtual camera projection is not what we are used to.

 

Many people know fisheye lenses, the very wide-angle (180-degree) "distorted" camera lenses.

Well, that is actually how the human eye sees as well, but because a photo is a 2D projection, we can look at its corners and edges and see the distortion. If we bring such a photo (or video) very close to our eyes so it fills our vision and we look only at the center, it looks totally natural: all the bent curves turn back into straight lines, because our brain fixes everything naturally.

 

And it is actually "dangerous" to realize this: once you notice the effect, you can stare straight ahead and see every straight line bend as in a fisheye, because you are fighting against the brain's correction of the visual information.

 

It is just very difficult to get a 3D world (reality or a 3D model) captured and projected to the human eye through a pair of 2D displays and optics, when every person in the target audience has a unique way of seeing.

i7-8700k, 32GB 2666Mhz DDR4, 2x 2080S SLI 8GB, Oculus Rift S.

i7-8700k, 16GB 2666Mhz DDR4, 1080Ti 11GB, 27" 4K, 65" HDR 4K.



Nicely explained, thank you.

But anyway, it looks like both Project CARS and FlyInside managed to make the scale adjustable, which more or less solves the scale problem.

 



  • 2 years later...

+1

 

Any news/Fixes?

 

Just popped into VR (yay!) and this was the first thing that struck me.

 

The cockpit seems to be seen through a 50mm lens and the outside world through a 35mm one.

 

Also, the distortion from the Lock On era is back: LOMAC was the only game that distorted the cockpit like a wide-angle lens, especially when looking up. LOMAC and DCS are the only games (flight sims) I have played that have this ugly distortion. Oleg's IL-2 didn't have it, and neither did the rest of the games that followed under the same name.

 

What gives?


I5 4670k, 32GB, GTX 1070, Thrustmaster TFRP, G940 Throttle extremely modded with Bodnar 0836X and Bu0836A,

Warthog Joystick with F-18 grip, Oculus Rift S - Almost all is made from gifts from friends, the most expensive parts at least



 

Make sure your IPD is set correctly on your VR hardware. Google it if you don't understand.


I have had prescription glasses since I was 8, and I am 44 now. I know what IPD is, and mine is 62; it's written on my prescription. I set that in the Oculus software and enforced it in DCS as well from the first second. It has nothing to do with the scale. If I get glasses with the wrong IPD, I don't see things smaller, I see things double.

 

The IPD setting just makes the two images overlap so that they correctly recreate the 3D perception of objects.

 

The problem in game is due to different focal lengths being used to project the two parts of the DCS image.

As I said, the cockpit and plane are seen through a longer focal length than the rest of the world. It is sort of like a dolly zoom effect...

 

Another thing is that this effect is not seen in other VR flight sims.


I have sat in most of the aircraft depicted in DCS at airshows, museums, etc. The size of things in DCS VR doesn't seem off at all to me.

I haven't sat in any of them, but the scale of the cockpits seems right to me; bigger ones would mean wasted space.

 

I guess some people are having a bug related to IPD, so they see it differently.


Edited by cercata

Can you explain what IPD does in your opinion?

 

The cockpit sizes are OK. It's not the size!!! It's how they are projected, and how they are projected differently from the outside world (not the plane, wings, etc.).

 

So whether you went to a museum, sat in a cockpit, and the "size" of it seems OK in DCS is IRRELEVANT! The cockpits are fine!


I have an Oculus Rift CV1, and on the headset I set the IPD to 64 (my true value), but for the world and cockpits to look the correct scale I set the in-game IPD to 72.

Don't forget to tick the box in front of the in-game IPD setting (I forgot and wondered why nothing changed at first :D).

Higher in-game numbers reduce the apparent size, lower ones increase it. At least that is my take on it.
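
For what it's worth, that take matches the simple geometry. Assuming the forced value just sets the virtual camera separation while your eyes stay at their real spacing (my assumption, not anything from the DCS docs), the world should appear scaled by roughly real IPD / forced IPD; a quick sketch:

```python
# Rough rule of thumb (assumption, not documented DCS behaviour): if the game
# renders for a forced IPD but your eyes are at your real IPD, distances and
# sizes are perceived scaled by about real_ipd / forced_ipd.

def perceived_scale(real_ipd_mm, forced_ipd_mm):
    return real_ipd_mm / forced_ipd_mm

real = 64.0
for forced in (57.0, 62.0, 64.0, 72.0):
    s = perceived_scale(real, forced)
    label = "bigger" if s > 1 else "smaller" if s < 1 else "unchanged"
    print(f"forced IPD {forced:4.1f} mm -> world looks ~{s:.2f}x ({label})")

# Forced 72 mm with real 64 mm -> ~0.89x, i.e. everything ~11 % smaller,
# which is consistent with "higher numbers reduce size".
```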


_____________Semper paratus, In hoc signo vinces________________

 

PC: Intel i7-8700K (4.9 GHz), Aorus Ultra Gaming Z370 MB, Gigabyte RTX 3080, 32 GB DDR3 (3,2 GHz), Samsung EVO 860 M.2 500 GB SSD + Samsung 960 M.2 250 GB SSD Gaming: Virpil T-50 CM2, TM WH Throttle, Crosswind pedals, HP Reverb


This is so strange. I will try to increase it over 62. Thanks.



 

:thumbup:

Gigabyte Z390 Gaming X | i7 9700K@5.0GHz | Gainward Phantom GS RTX 3080 | 32GB DDR4@3200MHz | HP Reverb | TrackIR 5 | TM Warthog HOTAS | MFG Croswinds | DCS PD 1.0 / Steam VR SS 170%


Ok, it seems I must apologize.

 

Changing the IPD while observing the image on the monitor does what I expected (it moves the two images apart). But in the Oculus it has the effect you are describing: it makes the cockpit be perceived as bigger/smaller and the world the other way around.

 

I stand corrected... but I have no complete explanation for this. It is true that moving the eyes further apart or closer together would, in a sense, make close things seem wider/narrower (if you look at a box in front of you and then pull your eyes apart while still focusing on the box... yeah... the box might seem to enlarge itself compared with the world), but I didn't expect it to be that dramatic.
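
For whatever it is worth, the usual back-of-the-envelope explanation (my own reasoning, not an official one) is plain triangulation: the renderer fixes the convergence angle using the virtual eye separation, but your brain reads that angle back using your real eye separation, so every perceived distance, and with it every perceived size, gets multiplied by roughly real IPD / virtual IPD:

```python
import math

# Stereo triangulation sketch (my own reasoning, nothing DCS-specific).
# The renderer places an object at `true_dist` and draws it for two virtual
# eyes `virtual_ipd_m` apart, which fixes the convergence angle. Your visual
# system then inverts that angle using your *real* eye spacing.

def perceived_distance(true_dist_m, virtual_ipd_m, real_ipd_m):
    vergence = 2 * math.atan(virtual_ipd_m / (2 * true_dist_m))  # rendered angle
    return real_ipd_m / (2 * math.tan(vergence / 2))             # what you infer

for d in (0.7, 2.0, 10.0):
    print(d, round(perceived_distance(d, virtual_ipd_m=0.072, real_ipd_m=0.064), 3))

# Every distance comes out as ~0.89x the modelled one (64/72), and since the
# angular size on screen is unchanged, objects feel correspondingly smaller --
# hence the surprisingly strong effect of the IPD setting on perceived scale.
```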

 

Another problem was that I had set my IPD to 62... but, and I can't explain why, I found out afterwards it was actually set to 6.2 (yes, with a dot between the numbers...) :D. So the effect was really dramatic.

 

Now I have settled on 62, as the difference between 62 and 72 is not really big, and I just hope the rest of the viewing is OK and I don't see "double".

 

So thank you everyone for pointing me in the correct direction (and insisting on it :D). I stand corrected.


  • 1 month later...

I forgot to come back to this thread since I posted.

 

DCS now has the option to "Force IPD" in the VR settings (it got the option a long time ago). You can use this to effectively adjust the game scale.

 

I was originally using the Vive, which uses both hardware and software IPD, but once you load into the game, the IPD doesn't update, so I would adjust the lenses to get good focus on the screen, but the software IPD might be vastly off. The "Force IPD" setting fixed this.

 

I found that somewhere between 57 and 62 works well for me. I used "Show pilot body" and "Enable motion controllers" to find a decent scale. Enabling motion controllers gives you virtual hands in game, so I could compare against pictures, videos, or what I remember from museums or air shows to tweak the scale.

 

The WW2 aircraft seem to have pretty decent proportions, as do the F-14, F-18, and F-16. The only aircraft I've been able to actually sit in is the F-5E, and that one seems to have fairly good scale as well, at least as far as I can remember.

 

That said, some cockpits have scale issues. The MiG-21 is the most obvious if you look at the side switches: it would be difficult to fit a finger between the metal guards to get at any of them. The MiG-19 has similar guards, but they are spaced far enough apart to give reasonable access to the switches, and the switches appear sturdy/large enough not to break easily, whereas the 21's are smaller than a toothpick (at least the side-panel switches are).

 

So if you are going to use Force IPD to tweak the scale, I would suggest using one of the WW2 aircraft or P51 trainer and "Show pilot body."

 

-----Note-----

 

M3LLC is working on the MiG-21 Cockpit, specifically the VR issues. Hopefully that includes scale:

 

Yes, Phase I of the cockpit update is meant primarily to address VR issues, as the cockpit was created at a time when VR was not yet a thing.

Edited by NakedSquirrel

Modules: A10C, AV8, M2000C, AJS-37, MiG-21, MiG-19, MiG-15, F86F, F5E, F14A/B, F16C, F18C, P51, P47, Spitfire IX, Bf109K, Fw190-D, UH-1, Ka-50, SA342 Gazelle, Mi8, Christian Eagle II, CA, FC3

