Everything posted by Rosebud47

  1. @Highwayman-Ed Actually I wouldn't want to interrupt you from posting more compliments, but thank you very much for this. :) -2.32 is accurate with render target settings of 1.0 and 1.25 in PiTool. To meet the exact aspect ratio at higher render targets (1.5, 1.75 or 2.0) a third or maybe fourth decimal place would be needed.

Some thoughts on judder and ghosting (double-image effects) in DCS: Dedicated VR games and experiences are built from scratch with a minimal load on the render process and the GPU to keep the experience smooth. Most VR experiences or games also use some kind of teleport system to move through the 3D world and/or snap turning to rotate. As long as you don't move, or move only slowly, you get a smooth experience, provided the graphical fidelity (high polygon counts, a large number of 3D structures, high-resolution textures) does not exceed the GPU's ability to update the rendered image in time with your movement. DCS is quite the opposite: it gives you a fairly accurate sensation of speed while flying a jet, which means the rendered image has to change very often when flying at low level.
- There are more structures to render over a city than over the sea.
- The closer you get to the ground, the more often the rendered image has to change to maintain the sensation of speed.
- When you fly past another plane, both objects move at high speed over ground that also has to be rendered while moving.
- Most demanding is rolling the jet while looking left or right out of the cockpit rather than straight ahead, because the rendered image has to change extremely fast through every degree of the roll to keep the experience smooth. Some jets can do a 360° roll in roughly one second; in that second the render process would need to create, say, a new image for every degree to give you the perception of a smooth roll. The figures are not exact, they are just an example to visualize the problem.
So the load on the GPU and the render process increases massively with very fast movement in DCS VR, while high-polygon environments still have to be rendered. The first step towards a smooth experience in VR is to hold the maximum frame rate: 90 fps, or 45 fps with reprojection/ASW/motion smoothing. At a stable 45 fps the supporting algorithm generates a copy of a rendered frame, which is faster than actually rendering the frame on the GPU, and interleaves the copies with the 45 real frames to reach a perceived 90 fps. This works fine as long as you don't move too fast through the 3D environment, or put differently, as long as the 3D environment does not have to change too fast to convey fast movement. If you move faster than the render process can deliver new real and copied frames, you perceive the copies as doubled or ghost images every, say, 20 milliseconds. So the first thing to mention is: try to keep a stable average frame rate under all circumstances, at half the refresh rate your VR headset is set to, to maintain a constant, fluid alternation of real rendered frames and copied frames, which together should create a smooth perception of the 3D environment while moving. If you drop below the fixed 45 fps, things get much worse for your perception. It gets more delicate with frametime, which is the more meaningful measurement of VR performance, since the FPS should be fixed at 45 anyway.
BTW: I would expect that when measuring FPS with the built-in fps counter in DCS you only measure the frames on your monitor, not the frames in the VR headset; you would need a dedicated VR frame counter to measure the fps in VR. The frametime is the time the render process needs to deliver one frame to the headset. 11 milliseconds is the target frametime for 90 Hz headsets to reach the 90 fps goal. If the render process can keep creating one frame in about 11 ms at a frequency of 90 Hz, you shouldn't perceive any judder or ghosting/double images. Remember that the load for creating one frame is much higher in a high-polygon scene than over the sea or at 30,000 ft. So while the render process may deliver a frame in roughly 11 ms over the sea, it might need maybe 20 ms to create a frame of a city with many polygons, details and eye candy. So while you get the impression of a smooth image over the sea, over a city you will get judder and ghosting unless you keep the frametime as close to 11 ms as you can. When the refresh rate is set to 90 Hz (to reach 90 fps) but your frametime is above 11 ms, a frame stays visible longer, because the display has to wait longer than 11 ms for the next frame. The higher your frametime, the longer each frame stays visible to your perception, resulting in ghost images in the worst case. If you can't even hold a stable 45 fps you will get juddering or stutter in general, and on top of that the perception of double images during fast movement gets much worse. ...oh, have to go back to work - please add or correct if I'm thinking about this wrongly.
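To make the frame-budget arithmetic above concrete, here is a minimal C++ sketch (the refresh rates in the list are just examples, not a recommendation): it prints the 1000/Hz budget per frame and the doubled budget you effectively get when the frame rate is locked to half the refresh rate and reprojection fills in the synthesized frames.

```cpp
// Frame-budget arithmetic from the post: at a given refresh rate the renderer
// has 1000/Hz milliseconds per frame; with reprojection (ASW/motion smoothing)
// only every second frame is rendered for real, which doubles the budget.
#include <cstdio>

int main() {
    const double refreshRates[] = {60.0, 90.0, 120.0, 144.0};  // example headset refresh rates
    for (double hz : refreshRates) {
        double fullBudgetMs   = 1000.0 / hz;         // budget to hit the native frame rate
        double reprojBudgetMs = 2.0 * fullBudgetMs;  // budget when locked to hz/2 with synthesized frames
        std::printf("%5.0f Hz: %5.2f ms per real frame, %5.2f ms at half rate (%3.0f real fps)\n",
                    hz, fullBudgetMs, reprojBudgetMs, hz / 2.0);
    }
    return 0;
}
```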
  2. Yeah, the problem with 120 Hz is that you need to get 60 real FPS for optimum smoothness, because reprojection needs those 60 FPS to work properly in 120 Hz mode. Ach, we'll see ...
  3. Once I tested the 60 Hz mode of the Odyssey just for fun, but it wasn't fun: it felt uncomfortable to the eyes. It's very interesting that Valve is coming up with higher display frequencies than the 90 Hz standard. When the first-gen VR headsets were introduced, it was said that 90 Hz is the minimum you need to avoid nausea in VR. But there is already a connection between the display frequency and the FPS the graphics card delivers, so saying that 90 Hz is the minimum display frequency could also mean, more specifically, that 90 FPS is needed to avoid nausea or discomfort in VR. You can't get 90 FPS (with or without ASW) displayed on panels running at less than 90 Hz. But with Valve's Index in mind, you can get 90 FPS displayed on a 120 or 144 Hz display, I would think. I don't think, referring to your post in the other thread, that the increased Hz of the Index is only marketing; I would rather expect the 120 Hz to create a calmer, less nervous-looking image, something that has been observable with every VR headset since the first generation was introduced. In DCS this "nervous" image can also be observed at a distance of an estimated 5 NM: things get blurry and jagged, as if the pixels in the distance are constantly in a kind of nervous movement. Maybe, or hopefully, a higher refresh rate of the displays can counter this perception. I have a BenQ monitor with 144 Hz, and when I switch the desktop from 60 Hz to 144 Hz the desktop image instantly looks completely calm, even though I didn't feel the 60 Hz were uncomfortable before; at 144 Hz looking at the desktop feels more like looking into a book or magazine... I think that's what Norman from the Tested guys meant when he said he felt much more present in VR with the higher Hz mode of the Index. In contrast, I think the wider FOV the Index is promoting is more or less marketing, because you only perceive it as wider the closer you move the displays to your eyes, while the rendered image of the displays (and of the viewports) keeps the same FOV; the wider FOV you could reach with the Index is not translated into a wider FOV in the image, with more to see on the left and right. You know, I think the closer we get to some kind of improvement, the more we tend to complain. With all the individual impressions we get from different VR headsets, it should also be taken into account that everyone's eyes are individually inert and different, so what one person feels is an absolute inconvenience in VR may not bother another at all. I am very excited to find out how the Reverb and the Index will finally perform in DCS.
  4. @Nagilem In one of the video reviews (I don't remember which one), someone mentioned that he's running the Reverb fine at 60 Hz. The Reverb runs at 90 Hz natively, but there is an option in Windows, under the settings for Windows Mixed Reality, to run the WMR headset at 60 Hz, so I assume that setting was meant. I'm also not sure whether you can directly compare the performance needed to process a supersampled image at a resolution of 1600 x 1440 with a non-supersampled image at 2160 x 2160 (a rough pixel-count comparison follows below). Guess we have to wait and see results from practice; nothing helps until someone does it for real. Oh, another thing worth mentioning: there is a significant difference in performance between the P-51 Mustang (as it was shown in one of the review videos) and the F-14 Tomcat, and an even bigger difference performance-wise with the MiG-21. At least I'm hoping the spring update for the MiG will bring better performance with it.
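Just to put rough numbers on that comparison, a small sketch follows; the per-eye panel resolutions are the ones quoted above, while the 1.5x supersampling factor is purely a made-up example, not a measured or recommended value.

```cpp
// Rough per-eye pixel-count comparison between a supersampled Odyssey-class
// panel (1600x1440) and the Reverb's native 2160x2160, as discussed above.
// The supersampling factor is a hypothetical example only.
#include <cstdio>

int main() {
    const double odysseyW = 1600, odysseyH = 1440;   // per-eye panel resolution
    const double reverbW  = 2160, reverbH  = 2160;   // per-eye panel resolution
    const double ssFactor = 1.5;                     // assumed pixel-count supersampling

    double odysseySSPixels = odysseyW * odysseyH * ssFactor;
    double reverbNative    = reverbW * reverbH;

    std::printf("Odyssey at %.0f%% SS: %.2f MPix per eye\n", ssFactor * 100.0, odysseySSPixels / 1e6);
    std::printf("Reverb native:        %.2f MPix per eye (%.2fx the supersampled Odyssey)\n",
                reverbNative / 1e6, reverbNative / odysseySSPixels);
    return 0;
}
```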
  5. True. I could get the MiG down in one piece with all wheels intact, no question, but Steinsch's landing is really something to accomplish. Yeah, a little bit too slow and the Fishbed falls from the sky like a stone, but exactly for this sensitivity and its other special behaviours I love to fly this bird. :)
  6. Really a wonderful landing. :) Landing the MiG most of the time feels to me like someone dropping a basketball ...need to practice landing the MiG more.
  7. The lighthouse boxes are a good investment. I'm still using the lighthouse boxes I bought together with the Vive some years ago, and I believe they could also be useful for the next VR headset.
  8. The Index is the only one of the new headsets I would get, but recently I became aware again of how much sharpness and clarity you can get out of the Pimax with a combination of render target and pixel density, in addition to the large FOV... I think the Pimax could still be state of the art in three years if the performance can be improved. As for the Reverb, I think people might regret purchasing it. The 60 Hz WMR mode is not good and is meant for movies, not 3D games. Driving the resolution of the Reverb in DCS is just as challenging as the Pimax resolution, and the Reverb has nothing more to offer than its resolution. For someone coming from one of Oculus' headsets, the Index will surely be the most perfect upgrade for DCS.
  9. No, not necessary. You can also mount the boxes on tripods or any photographer-compatible equipment, like clips, or simply put the box on a piece of furniture.
  10. So far I don't see much of a difference in ghosting compared to before. Ghosting in DCS behaves as it did: the lower the frametime, the less ghosting. But I found another interesting entry in the profile.json file (located in C:/user/´username´/AppData/local/Pimax/runtime/...), as follows:
"fov_outer_adjust_degree" : -2.32,
"lens_separation" : 0.08988272398710251
The profile.json reflects the settings you can make in PiTool, like display timing selection (Hz), contrast, brightness, etc. The two lines above are not in the PiTool user interface; maybe there will be sliders or options for them in a future release of PiTool. Anyhow, "field of view outer adjust degree" is set by default to "0" - I edited it to "-2.32", which defines the aspect ratio of the large field of view, so the render target can now be set to the native resolution of the displays (like the XTAL does) with the following settings:
PiTool render target: 1.0, large FOV mode
profile.json: fov_outer_adjust_degree : -2.32
SteamVR SS: 30% (edit: forgot to mention that I set the maximum render resolution to 4096 instead of 8192, which I think is meant more for the Pimax 8K, not for the 5K+)
= 2560 x 1440 render resolution
In consequence there is no longer an overhead rendered through the process, which results in better performance. I measured a benefit of approx. 10% - 15%, or concretely approx. 3 ms gained in frametime (also depending on the plane you're flying!). The point is that if there is already an overhead at any render target, it gets multiplied just as the render target or supersampling gets multiplied. But now the aspect ratio of the large field of view can be properly defined, so there is no overhead anymore, to the benefit of performance. Does this make sense? Anyway, if you use the Pimax in large field of view mode, I would recommend setting fov_outer_adjust_degree to -2.32. I guess the default value of "0" corresponds more to the aspect ratio of the normal FOV mode. The render resolution resulting from this readjustment can be observed through the SteamVR supersampling settings.
EDIT: just made some further checks:
PiTool 1.0 = SteamVR 30% render target: 2560 x 1440 / coefficient 1.77
PiTool 1.25 = SteamVR 30% render target: 3200 x 1800 / coefficient 1.77
PiTool 1.5 = SteamVR 30% render target: 3840 x 2164 / coefficient 1.77 (<--- interesting: native resolution of the Pimax 8K panels: 3840 x 2160)
PiTool 1.75 = SteamVR 30% render target: 4484 x 2524 / coefficient 1.77
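To make the EDIT check above easy to reproduce, here is a tiny sketch that scales the 2560 x 1440 base measured at PiTool 1.0 / SteamVR 30% per axis by each render target value and prints the width/height coefficient. That the output deviates by a handful of pixels from the values read out of SteamVR (e.g. 2160 vs. 2164, 4480 x 2520 vs. 4484 x 2524) I take to be rounding in the runtime, which is an assumption on my part.

```cpp
// Reproduces the check at the end of the post: starting from the 2560x1440
// resolution reported for PiTool 1.0 + SteamVR 30%, the listed render targets
// scale each axis linearly while the width/height coefficient stays ~1.77.
#include <cstdio>

int main() {
    const double baseW = 2560.0, baseH = 1440.0;          // measured at PiTool 1.0, SteamVR 30%
    const double piToolTargets[] = {1.0, 1.25, 1.5, 1.75}; // render target values from the post

    for (double t : piToolTargets) {
        double w = baseW * t;
        double h = baseH * t;
        std::printf("PiTool %.2f -> %4.0f x %4.0f, coefficient %.2f\n", t, w, h, w / h);
    }
    return 0;
}
```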
  11. New official PiTool version 1.0.0.132 is available for download. The release notes say: Issue Fixed: Fix the occasional ghosting problem caused by the high GPU occupancy https://pimaxvr.com/pages/pitool ...sounds good, looking forward to checking it with DCS this evening.
  12. That's true, PiTool is really a bad beta at the moment. I didn't try the SweViver thing with the default-button surprise... I also believe that a low frametime with stable FPS under reprojection or smart smoothing is more desirable in VR than a high frametime with relatively high, fluctuating FPS. As far as I could see in the video, the frametime was just as high after he pushed the default button as before. It's also interesting that SteamVR is not completely excluded when running PiTool and DCS without starting SteamVR: the render resolution set in SteamVR is still combined with the render target set in PiTool, even when the SteamVR application is not running. Yeah, ... PiTool really is like a construction site :):):)
  13. For better clarity/less blurriness, set "No scaling" in the Nvidia control panel: Nvidia control panel --> Adjust desktop size and position --> Scaling mode --> No scaling
  14. So, what would it take to support canted displays?
  15. It's getting a bit like Catch-22... May I forward the question to twistking? :)
  16. I wouldn't have asked for "support" for canted VR headsets this way, as it implies that canted VR headsets might not work with DCS unless they are explicitly supported. Canted-screen, or non-planar, VR headsets do work well with DCS! But there are minor issues with the Pimax, like the VR zoom. So far the Pimax is the only non-planar headset available, and issues might also depend on the software design of Pimax's engineers. I'm pretty sure that the ED developers are just as enthusiastic about VR as we are, and the new design of VR headsets is just as new to them as it is to us. Maybe Wags could tell us a bit more about which VR headsets are tested during the development of DCS. I think he will be blown away by seeing DCS through the XTAL HMD :) What I like about Oculus and the Rift S is that they get a lot of people into VR who never tried it before or hesitated. The Rift S can convince people because it is easy to handle and more affordable than the other VR HMDs. Where Microsoft failed with the regular Windows Mixed Reality headsets of its first generation, Oculus reached more people through its popularity. So VR becomes more important for DCS as more people use it as a feature. I think the distortion reports you mentioned are from an early development stage of the Pimax and are solved by now. The main problem with the Pimax and DCS is still that it needs a highly performant PC to run acceptably... but these issues will surely be balanced out by the further development of graphics hardware, software APIs and the DCS engine ... and with regard to the last one, maybe tomorrow we get a nice surprise :) :) :)
  17. It's true that there is no common-sense explanation of how a particular HMD works, but the concept is quite clear for every HMD. It's not only guessing we're talking about; it's observation, techniques, facts and conclusions. The conclusions themselves are the subject of the discussion. Apart from the, let's say, marketing-competition-oriented discussion around VR headsets, this discussion here should stay open and not be moved to PM, so that everybody can follow, add to or contradict it without being afraid of being ridiculed or anything. The intention should be to optimize the VR experience in DCS for all HMDs, or for particular ones. So there is also no need for me to keep you posted; you can simply add your conclusions and findings to the discussion. My conclusions on the issue with the VR zoom in DCS with the Pimax are right. The problem can also be visualized in the drawings: your drawing, in which you added the viewports, shows how it should be to make the VR zoom work; my drawing, picture_4, shows how it actually is and why the VR zoom doesn't work. But I think I may have found a possible solution, which is the following call that would need to be implemented in the VR render pipeline (see the small sketch at the end of this post): "" IVRSystem::GetEyeToHeadTransform Jeremy McCulloch edited this page on 1 Nov 2018 · 2 revisions HmdMatrix34_t GetEyeToHeadTransform( Hmd_Eye eEye ) Returns the transform between the view space and eye space. Eye space is the per-eye flavor of view space that provides stereo disparity. Instead of Model * View * Projection the model is Model * View * Eye * Projection. Normally View and Eye will be multiplied together and treated as View in your application. This matrix incorporates the user's interpupillary distance (IPD). eEye - Eye_Left or Eye_Right. Determines which eye the function should return the eye matrix for. "" If I find more time, I will try to look into and/or add to the render pipeline; maybe it could be added to the stereo.lua (I don't think it will be that easy) or maybe by hex-editing the OpenVR API in DCS. I think the best possibility would be if PiTool became open source and easier to edit, to implement the eye-to-head transformation there. Beside this single call, it will need to reference the IPD settings to work. The IPD setting within the DCS system options is a different thing than the physical IPD setting of the HMDs, but the reference needs to be made to the IPD setting of the left and right viewport. Parallel Projection in PiTool surely needs more investigation to know exactly how it works and what it does. You're right that my conclusion that it ignores the depth information in the image is more of a guess. Fact is that it doesn't work right for the VR zoom. Another fact is that the Pimax works differently from the HMDs that work with parallel projection anyway. Without parallel projection the Pimax works more like 3D shutter glasses: the image is projected to one eye while the other gets no image on the same clock of the panel frequency, and vice versa. That would explain why parallel projection consumes more performance than running without it. Actually I think the engineers of the XTAL achieved a perfect match between their compositor, the shape of the lenses and the headset design, and that's the reason why the image quality and performance are so much better than with the Pimax, but that, honestly, is just a guess on the side. Knowing more about how the VR pipeline can be adjusted for DCS will also be an advantage for other new and upcoming HMDs.
The HP Reverb's panels have an aspect ratio of 1:1; they are square. If the rendered VR image is based on the usual rectangular aspect ratio, more will always be rendered than can be projected into the visible field of view, i.e. onto the displays. The presumption is that these differences will also come with a loss of image quality, beside the waste of performance. Time will tell how things work out, and I think we are now in a time of change, with more different HMDs available for DCS, which also brings the need to adjust the settings in the VR pipeline to the different specifications of the HMDs to get the best out of the headsets and DCS.
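For reference, here is a minimal sketch of how the eye-to-head transform quoted above can be read through the OpenVR C++ API. It assumes the OpenVR SDK headers and a running SteamVR/PiTool session; it only prints the per-eye matrices (on a canted headset the rotation part should differ from identity), it does not patch anything in DCS.

```cpp
// Minimal OpenVR sketch: read the per-eye eye-to-head transforms quoted above.
// On a planar HMD the 3x3 rotation part is (close to) identity and only the
// x translation (roughly half the IPD) differs per eye; on a canted HMD such
// as the Pimax the rotation part also carries the panel cant, which is why a
// zoom that only scales a shared projection with parallel viewports breaks.
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError initError = vr::VRInitError_None;
    vr::IVRSystem* system = vr::VR_Init(&initError, vr::VRApplication_Background);
    if (initError != vr::VRInitError_None || system == nullptr) {
        std::printf("OpenVR init failed (error %d)\n", static_cast<int>(initError));
        return 1;
    }

    const vr::EVREye eyes[] = { vr::Eye_Left, vr::Eye_Right };
    for (vr::EVREye eye : eyes) {
        vr::HmdMatrix34_t m = system->GetEyeToHeadTransform(eye);
        std::printf("%s eye-to-head transform (rotation | translation):\n",
                    eye == vr::Eye_Left ? "Left" : "Right");
        for (int row = 0; row < 3; ++row) {
            std::printf("  %8.4f %8.4f %8.4f | %8.4f\n",
                        m.m[row][0], m.m[row][1], m.m[row][2], m.m[row][3]);
        }
    }

    vr::VR_Shutdown();
    return 0;
}
```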
  18. @ram Thanks for the tip about the VR mod for the F-14. That's exactly what the F-14 needed in VR. Everything is much easier to read with the mod.
  19. That's kind of you, Grizzly, I can't say no to that. :) The new VPC MongoosT-50CM base should also arrive soon, and according to Joker's words it should make a huge difference compared to the Warthog HOTAS base... if not, I'll have to have a serious word with Joker anyway, the thing did cost 350 euros after all. :) :) :)
  20. Same with me, I never needed the IPD adjustment to get a sharp image. It's a human factor: most people need the IPD adjustment, while a few see everything sharply all the time.
  21. Hi twistking, it's really interesting, and there are things to learn from your perspective as a professional photographer/cinematographer. But I also think there are lots of differences to virtual reality and VR HMDs. Most interesting is that similar effects can be found in both, but for different reasons. My goal is to learn exactly how it works, to be able to find the right switches to improve things, and that's quite thrilling with VR, as the technology we have is pretty new to everybody. Anyhow, I'd like to suggest a break in this discussion, as I don't always want to test, conclude and make findings that may or may not point in the right direction. I just preordered the F-16 and still need to learn some systems of the F-14 and improve my flying skills with practice, or in short: enjoy DCS. There is still a lot to say about the viewports and canted displays, and some pictures to draw from my side, but let me keep it in mind and come back to it later, if you agree and are still interested. Maybe ED will have fixed the VR zoom in the meantime and we won't have to discuss this problem anymore... hahaha. It would also have been an advantage if you had a VR headset, ideally a canted one like the Pimax, to verify the findings I observed, and for me to confirm the conclusions you could draw by using the headset... just so we're not theoretical all the way.
  22. Totally agree in theory, let's see it in practice first ...
  23. H = height, W = width. Before buying the HP Reverb "blind", I would really wait for some testing with DCS, as it is to be expected that running the Reverb in DCS at its full resolution will have a massive impact on performance and may not be satisfying on midrange systems (like a GTX 1080 Ti). Whereas the Reverb counteracts the disadvantages of a less clear and detailed image caused by the design of the HMDs, especially the so far usual Fresnel lenses, with a massive increase in panel resolution at the cost of GPU, CPU and DCS engine performance, the Index sounds more promising to me for a good-quality image in VR. I need to know exactly when the Index becomes available, but I would expect that Valve has engineered a lens technology that counteracts the distortion of the lenses on a physical basis, for example one convex lens blows up the image from the panels and another concave lens counteracts the distortion before the image reaches the eye. As a result the Index might have a clear image all over, with no sweet spot and no need for the super-high resolution the Reverb requires to achieve a comparable level of detail. But that's more or less speculation; what is sure is that the Reverb will have performance issues in DCS at its full resolution with the current generation of GPUs.
  24. Well, I had a great time with the Ugly Angels. A very nice and relaxed group. Maybe the Mi-24 will come at some point, or at least the update for the Ka-50, so that the helicopters become interesting for me again ... edit: ... this is not childish psycho-babble either, it is meant exactly as it is written
  25. When set up as seated, not room scale, there are no boundaries. I always had the Odyssey in room scale. I didn't like 3DOF.... 6DOF feels more natural to me. Oh, I think the slider to set the visibility of the boundaries was for the Vive in Steam, but there is also an option somewhere for the Odyssey to set the boundaries to invisible.