

Fri13
Everything posted by Fri13
-
I don't use zoom. And where did I say anything about using zoom? So you claim that sitting in the co-pilot seat to operate its functions is unrealistic, but operating those same functions from the commander's seat with a laser beam is realistic? Okay... Don't build straw-man arguments about zooming. Zooming is unrealistic unless you are using binoculars (which pilots do happen to carry). In reality you can't spot things nearly as easily as you can in DCS; in DCS you have superior vision all the time without ever needing binoculars, and zooming makes it even more unrealistic.

Is it a game or not? Yes, it is a game, but it also tries to offer as realistic an experience as the player wants, and the player has the option to go for unrealistic means if they choose. It is not a double standard to say that zooming is unrealistic (you have no way to "zoom" in reality without optical devices) while also saying you should fly either as the commander or as the co-pilot. In reality you do not swap seats in the middle of a flight so the commander can select the weapon pylons; the co-pilot does that. If you have a problem with that, then you are the one applying a double standard by arguing that it is realistic for the commander to reach his left hand across the cockpit with a laser beam and say "Sorry Ivan, I'll just rotate that knob right there...".

One could easily build a simple voice-command macro that rotates that specific knob X clicks clockwise and put it behind a command like "Ivan, give me GUV, please!", and the knob would magically move to the GUV position, since no AI is doing it. With a bit more effort, saying "Switching to 7.62" could run the macro properly from the co-pilot seat, and "Switching to pylons 2 and 5" could set those correctly as well. But the way you would really do it with another human being is how you do it in co-op: the other player sits there and does it when you communicate.
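The voice-macro idea above can be sketched very simply: a table mapping a recognized phrase to a sequence of simulated inputs. Everything here is an illustrative assumption — the phrases, the key names, the click counts, and the `send_key` stub (which stands in for whatever key-injection tool you actually pair with your speech-recognition software) are not real DCS bindings.

```python
# Sketch of a phrase -> macro dispatcher for the "Ivan, give me GUV" idea.
# All names and counts below are hypothetical, not real DCS controls.

MACROS = {
    "ivan give me guv": ["knob_cw"] * 3,          # rotate 3 clicks clockwise
    "switching to 7.62": ["knob_ccw"] * 2,        # rotate 2 clicks back
    "switching to pylons 2 and 5": ["pylon_2", "pylon_5"],
}

def expand(phrase):
    """Return the key sequence for a recognized phrase, or [] if unknown."""
    return MACROS.get(phrase.lower().strip(), [])

def send_key(key):
    # Stub: a real macro tool would inject the keystroke into the game here.
    print(f"pressing {key}")

def run_macro(phrase):
    for key in expand(phrase):
        send_key(key)

run_macro("Ivan give me GUV")  # presses knob_cw three times
```

The speech-recognition part is deliberately left out; any tool that can turn a phrase into a callback could drive `run_macro`.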
Or, when flying alone, you can simply tilt a hat switch and you are the co-pilot and do it yourself, then tilt the hat again and you are back as the commander. You just played two separate roles, because your friend wasn't flying with you that time, there is no AI that understands what you are about to do, and there is no flight engineer preparing the weapon systems for launch. This is why Heatblur developed Jester and Iceman, so you don't need to be in both seats when flying alone. This is why ED is developing their AI for the Mi-24: even there you need to perform some duties when playing alone. They also chose the Mi-24P partly because you can fly it from either seat and do everything required from the front as well as the rear.

Want the ultimate realism? Play only with friends. Not alone, not with AI... You can keep zooming as much as you want, but I don't, because it is unrealistic, and I don't need to since I can read almost everything in the cockpits. My goal is to minimize unrealistic things as far as it is sensible and possible. You know this, as we have had this discussion before.

Everyone has their own limitations, and that is understandable. Someone uses an old VR HMD that isn't detailed enough to read things without zooming: fine, that is their limitation. Someone doesn't like the idea that their neck turns 210 degrees with just a slight nod, so they stay on TrackIR. Someone has an amputated left hand and needs to do everything with the right; it is their limitation. Someone is blind in one eye but still wants to use VR. Isn't it the fun of gaming that we get all these options and possibilities to enjoy? Are there tens of thousands of players putting their names behind requests for these features? So why don't you? I have. I have given the developers more input on what options players would need and want than you have, and helped them see which features should be added to make things better than ever.
And you are against all of it, because you just want the old ways back. You even stated that it is more realistic to use a set of VR controllers than a proper HOTAS for the aircraft controls... while flying with your left hand on a laptop, pressing the space bar to launch missiles and using comma, semicolon and so on to steer the TGP, because you don't have a throttle where those functions would live.

They have already redeveloped the controls; that is the point! They have already taken more steps forward, not backward. You can now use the right controller to operate the cameras and move across the battlefield (or at least I just discovered this), as they are adding the core RTS elements to the game. As I have told you before, and you know very well, I did use VR controllers a looooong time ago. But they are not realistic the way a proper HOTAS is, and they don't offer the capabilities a proper HOTAS does. If you want the experience of flying an aircraft with one hand on the throttle and one on the stick, you need a throttle and a stick to lay your hands on. The attached images show the difference: Touch controllers and real HOTAS hardware are not the same thing at all and don't offer the same functions.

As for having the proper features on each proper device while still interacting with the cockpit by hand: ED has offered all of that, down to small details, with almost everything clickable and usable. All you need to do is loosen your grip from the stick or throttle, reach your hand out to the system, use it, and then lay your hands back on the throttle and stick, which carry all the proper functions and nothing more.

The OP's claim falls short: those systems have not been removed. Do you understand what I wrote in my first message? Do you understand what the original poster wrote in his bug report? I understand very well that there are only two real bugs here, which the OP didn't even talk about:
1) No laser beam (a matter of opinion whether it should be there or not). 2) When you grab a virtual control with the VR controller, the trigger and buttons do not output anything (not the topic of this thread, so start a new one for that).

But the claim that the common bindings were removed from the Touch controllers is false: they are still implemented and there to be used. Nothing needs to be done to get them back, because they never went anywhere. If you had read my first post, you would understand that the claim was 100% false. So back to the topic: as the evidence shows, only the laser pointer is lost; the bindings are still there and usable. For some of us it is good that the laser pointer is gone. We understand that players like you want it back, but we do not accept it being forced back so that we have to suffer it again. Instead we recommend it become an option in the VR settings, so a player can enable the laser beam if they want it and everyone gets the choice, Off or On. If you don't accept that, it is the same as forcing others to use their gaming devices only the way you want.
-
Did you do a clean install, removing all the configs and saves from the previous version? Everything the OP says no longer exists for the mini-stick is still there. Those features have not been removed or replaced. What was removed is the laser beam (already discussed), and a bunch of new things were added, like the VR camera controls on the right hand. So tell me, what doesn't work, compared to what the OP says needs to be brought back?

"Before I didn't have to setup the controller in the usual input bindings, the trigger did left click but the thumb stick would rotate or click switches up or down. Now the beam is gone so I can't easily see what I'm pointing at and only the trigger works, so I can't turn dials or flick switches easily anymore. Any idea when these things would be back as at the moment the game is pretty much unplayable and 2D is not an option for me."

So were those things removed? No. Are those the usual input bindings? Yes. Do they need to be bound? No; it is impossible anyway, since you can't bind anything on the VR controllers. Everything the OP says is missing (except the laser beam) is still there. Not missing, not gone...
-
That is not the OP's bug report topic. He does not report the mini-stick failing to do left/right mouse clicks, rotate knobs and all that. There is nothing like "Hey, when I grab a virtual stick with the VR controller the trigger doesn't do anything, even though the index finger of the new gloves moves as if pulling a trigger". Yes, I have acknowledged that from the beginning... That has been missing since day one, years back: we cannot bind all the buttons we need on the VR controllers when we grab things; we rely on a few specific ones. And as I have explained, if you go through the binding process using vJoy, you can see those controls are still bound at the VR system level. If you try to bind them, the conflict dialog appears, just as with binding the same button/hat/axis twice: "This is already in use for this, do you want to remove it there?".

We need an official way to bind the trigger to do what we need. We need a way to bind a TDC axis to the mini-stick, and to assign even NWS/Undesignate or a lock function to buttons, so that no one is forced to use a keyboard or anything else for the basic tasks (naturally you can't fit everything a Warthog throttle has onto a left Touch controller, but the basics can be done: TDC, Lock, China Hat Forward). Zoom is a feature not everyone needs, or maybe someone wants it on the left controller instead of the right. "Fix the cause, not the symptoms..." Even if zoom and the trigger started working again, it wouldn't fix the problem. It doesn't improve the game for those who are forced to fly with one hand on a laptop to operate the TDC, launch a missile or release bombs, because they can't do those basic things with the left and right VR controllers.
-
I don't use zoom. It is 100% unrealistic. But as I said, it is already acknowledged that bindings are gone when grabbing something; that needs fixing by implementing a totally new binding system, not by reverting to something old that was left behind on purpose.

The mouse cursor doesn't move for me. It stays in the same position in DCS as in the Oculus menu or the virtual desktop. The problem, as I explained, is that the Oculus hand itself is offset for me in the same wrong way. It was correct some time ago, back when the Rift S came out, on the CV1, but a few patches later it drifted off, even though the 3D-modeled Touch controllers are aligned exactly right. Trying to fix something that new code introduced by patching old code back in is not easy. It is easier to just go and do what needs to be done. It is not like changing car tires, where you notice a new tire is flat and can simply put the old tire back on until the new one is fixed.

"Yet there are tens of thousands who do not complain..." Sorry, but that is not an argument; it is just evidence that there are people who want something else. As I have said, everything is fine as long as it is optional, enabled by those who want laser beams, their own custom bindings, their own pink flying horse gloves, whatever. The more options, the merrier.

The common issue was that we had a laser beam coming from our fingertip regardless of distance. Any time you wanted to use your hands, it was a laser show inside the cockpit: completely unrealistic and immersion-breaking. It is far better to have the laser beam gone at short range (within arm's reach), since your task in a cockpit is to actually reach up to the control and do something. And yes, make it an opt-in setting in VR, as I have said from the start, so those who need it can enable laser beams and be happy.
But at some point people need to understand that if they keep hitting their joystick on the table, the problem is the location of the joystick and the table, not the game requiring them to move the joystick as far as the joystick allows. If someone builds an 80 x 60 cm box to play room-scale VR games in, it is their problem that they didn't make it at least 2 x 2 meters.

As I have said from the beginning, there are issues that need implementing and fixing. Part of bug reporting is telling what you think is wrong, and you can suggest a solution. If a game crashes when you press button A, there is no solution other than "make it not crash". But when the bug report is about bindings, usability and features, every opinion counts. Hence: more options. Which options do we need? Who wants what kind of options? And so on. If you want something improved, you need to say what you want. If someone just shouts "IT IS BROKEN BECAUSE WE DON'T HAVE LASER BEAMS!", then simply adding laser beams back would be 100% wrong, because it is great that we no longer have laser beams, and that is not just my opinion alone. So what do you do when two groups of people disagree on the same topic, laser beams ON or OFF? You suggest making it an option in the VR settings: enable the laser beam if you want it, leave it disabled if not.

When a usability improvement is made, like angling the virtual hands so you can point at things in a more relaxed way, it is counter-productive to demand that it cannot be in that position and must only be in position X. No, it needs to be a setting where the player can adjust the virtual hand position and angle to fit their own purposes, if the default doesn't please them.
People need to understand that bug reports are exactly about discussing what should happen and why. When simulating reality there isn't much room for opinion, because the only factor is reality: if a light switch works only in the up/down direction in the real aircraft, it doesn't help to say you want it to work left/right. But when it comes to controller bindings and the like, there are per-player considerations. Someone wants a zoom button, someone doesn't. Someone wants hold instead of toggle, someone the other way around. Someone wants to use VR controllers, someone doesn't. If one group declares their preferred setting the only way, then after the next update there will be a bug report requesting the opposite. The only choice is to gather the opinions and turn them into options, so people can choose how they want their controllers to behave.

This is not the right place for that kind of demand, and you know it. If people feel something is broken, they can share their opinions and findings. I already did, and pointed out the very problems you now claim I am against, but I came with suggested solutions, while you just want everything back the old way. We finally got improvements, and you want to go back "until things get fixed"... So what would the fix be, then? Really, let's see what the OP says:

"Just installed 2.7 and my VR controllers have lost the laser pointer (on grip) and key bindings."

Yes, he lost the laser pointer; we have all already agreed on that. But the people demanding it back should not ignore the opinions of those who want it to be optional only, not a forced change. It is just as wrong to demand it be there for everyone as it was to remove it from everyone. The solution: make it optional, so everyone can choose to enable or disable it. Then comes the other point, "lost key bindings".
And here it is: "Before I didn't have to setup the controller in the usual input bindings, the trigger did left click but the thumb stick would rotate or click switches up or down."

That is false. It is all there. It all works exactly as before, contrary to what the OP says. The trigger is left click, the thumb stick rotates knobs and clicks switches, and so on. It is ALL THERE! So the first half of the bug report is simply false, because nothing was lost. Let's look at the second part:

"Now the beam is gone so I can't easily see what I'm pointing at and only the trigger works, so I can't turn dials or flick switches easily anymore."

Let's split that up. "Can't easily see what I am pointing at": can't everyone see where their own finger is pointing? Again, in reality I don't need a laser beam from my finger to press keyboard keys, click a mouse button or flip a light switch on the wall. It all works correctly because the touch point sits in the proper position on the fingertip. But of course people want to click things in the pilot's cockpit from the rear RIO cockpit, or reach from the commander's seat to the co-pilot's seat in the Mi-8 or UH-1. A real aircraft is designed so that some things are easy to reach and others harder: important controls are made more accessible, while unimportant or rarely needed ones are put aside. That is why, for example, some functions sit behind you in awkward places while others are right in front of you, and why the HOTAS carries the important functions so nothing else needs to be pressed, while some tasks require moving your hands.

It is a fairly moot point for someone to say they can't move their hands without colliding with their table, when they fly only in VR because they change playing locations so often that they can't carry a HOTAS while traveling. If all they need is a simple chair, then move that simple chair a little further from the table so they have a bit more space.
Again, if their problem is that they can't have all the functions on the VR controller and have to lay a hand on the laptop, maybe they should support fixing that problem: making the VR controllers bindable for various functions. They could use the mini-stick as a TDC hat and the mini-stick press as TDC Action/lock. They could use the trigger as the trigger and one of the buttons as the bomb release. Then they would no longer need to move their hands to the laptop for tasks they previously had to, or keep the laptop near them at all. Properly bindable buttons, hats and the rest on the VR controller would fix a lot of problems and improve gameplay at the same time.

The other part of that sentence just repeats that the OP can't turn dials or flick switches, while that has been possible ever since hand interaction was added to the VR hands (in the beginning we had just floating hands that couldn't operate anything). And then the last part of the OP's post:

"Any idea when these things would be back as at the moment the game is pretty much unplayable and 2D is not an option for me."

They are back... or rather, those functions never left! The trigger is LMB, the hat clicks LMB/RMB and rotates knobs and all. The fingertip works just like before; everything is like before, except there is no laser beam visible at close range. It is all right there already...

The OP never says "I need to loosen my grip on the virtual stick to zoom in" anywhere in his post. There is no "I can't fire the cannon with the trigger while holding the stick", and no comparison like "I play with my left hand on the laptop, using space, alt+space, comma, period and L to move the TDC around". Why? Because those functions aren't there: yes, the trigger animates the virtual trigger finger on the stick, but that is it.
And when someone suggests that the proper fix is a way to bind VR controller functions in two modes plus a third for when grabbing a stick/throttle/lever, that gets dismissed as "not a fix, the old mechanic should be brought back". But since when did the old 2.5 version let you use the TDC with the left controller's mini-stick while grabbing the throttle? Or give you a bomb release button, trimmer and trigger on the right hand while grabbing the stick?

If the system still has all those things in place, they have not gone away. And if something is not gone, it cannot be "brought back". It is possible that some people just updated to 2.7 without clearing their settings and have a conflict there; those things happen. If some people have all those functions working as they should and some don't, the latter have a problem, but that doesn't mean the new features should be reverted; that is not a fix! Should developers change a game's code because a player has broken configuration files from a previous version?
-
It is more fun and more immersive to be required to move your hands around the cockpit than to just point at things with a mouse or a laser pointer. The actual process of reaching a button above you or flipping a switch behind the UFC is the experience. But you don't do that very often; mostly you are holding your hands on the HOTAS. We can never have fully realistic interaction, like resting your left hand on the side of a physical UFC while your thumb presses the buttons, or bracing yourself to look back: https://youtu.be/WS4mZiIqzxU?t=458

That is why VR controllers are not good: you can't operate the aircraft in a realistic or proper manner the way you can with a HOTAS. It is easy to fly a helicopter for a couple of hours with your arm resting on your joystick and leg, compared to holding a Touch controller in your lap and either keeping it still or waving it in the air. And in DCS it is not easy to flip switches that start rapidly bouncing between positions; it is annoying because you can't be sure that what you intended to do was actually done.
-
To get all that unrealistic stuff back eventually (not all of it is gone), like the laser beam. The crosshair is still there when you point at something far away; it just doesn't appear when you point at something close. They made it very clever: the crosshair and the laser beam no longer distract or block your view, because it really is an immersion killer when every action you take shoots a laser beam from your finger like a third-class Jedi. The crosshair doesn't pass through the canopy (unless you are pointing through the glass) if the module's cockpit supports it — in the AV-8B Harrier it goes through, in the Mi-8 it doesn't, but that is fixable — and its size changes dynamically. It works in a very pleasant way and increases immersion, since you don't have a constant reminder that you are in a game. At long distance it is large and correctly placed; the closer you bring it, the smaller it becomes, compensating to keep the illusion that it is outside the cockpit. It is rendered only for the left eye, so it can look odd at first.

The default bindings are still there:

Mini-stick Up = Right click
Mini-stick Down = Left click
Mini-stick Right = Rotate clockwise
Mini-stick Left = Rotate counter-clockwise
Trigger = Left click
Grip = Grab / activate pointer
X = Menu
Y = Re-center
B = Zoom
A = ???

Those are the defaults, still there like before. What DCS still doesn't have is support for detecting these controllers and letting you bind anything on them in three modes:

1) When GRIP is held down
2) When GRIP is not held down
3) When the virtual hand is grabbing something

That is what we need: options in the settings, among the others already mentioned (show/hide laser beam etc.). The current main problems are: 1) the virtual glove does not match the real controller position, and therefore the real hand, even though the idea behind it is good; 2) the "trigger" does not work as a trigger finger when grabbing a stick that has a trigger. Again, fixable with vJoy, but it should be done by ED.
How are you going to fix a bug without a patch? It is actually comfortable now as it is; it is just difficult to point at things quickly, because you need to work by what you see, not by what you feel. So what exactly is your problem? You said the fingertip is angled incorrectly: it is as you wanted. You said the grip was press-to-grab, press-to-release: it is as you wanted (press to grab, release to release). The new hand is better than before.

1) Fix the hand angle and position by adding an option for it, as I suggested.
2) Add binding options (as people have wanted since 2016 or so), with a couple of mode sets.
3) Add back some of the options for those who want to enable lasers, crosshairs and so on.

You can't judge that from the loading screen and such. Tracking goes wonky when the VR app stops receiving inputs during the loading phase; that is why you so often need to reset the view after loading a mission or the main menu. And the angle is already baked into the VR controller's case angle; nothing new there. Yes, that is the problem I have mentioned from the start: the virtual hand is not tied properly to the physical controller. You can see it in my MiG video, where I point at the red controls and show how far off-angle they are from the real finger.

We don't need "a patch", we need a new configuration system, because people have different controllers, different hand sizes, and hold the same controllers in different ways. The Oculus SDK includes everything needed to track each controller properly. All you really need to do is build your own virtual hand model around that tracked model's axes, and place any held objects (guns, if we had them) on the hand the same way. The problems start when the held object is shaped differently from the controller. There is a reason the developers set the hand angle the way they did: it is a more relaxed way to point at things in a cockpit, since you don't need to twist your wrist downward as much, and the hand stays out of the way instead of blocking the finger position.
It is purposely made that way, and it works at close range, but at longer ranges it becomes a little distracting because your own hand doesn't match it. That is why we need a configuration option that each of us can adjust as wanted. ED added support for $1000 VR gloves (VRFree), but they can't add binding options for hardware people already own?

I would love to see those three modes (modifiers) of binding options. When I am not holding the GRIP button down, I can press A/B to re-center the view or open the game menu. When I am holding GRIP down, I am pointing at something, and I might want A to be left click and B right click. And when I am grabbing something like the virtual stick, I want A to be the weapon release button and B something else. DCS already recognizes when grabbing happens, so it can enable additional bindings for that mode; the others work just like modifiers on normal buttons.
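The three-mode binding scheme described above is essentially a small lookup keyed by controller state. A minimal sketch, with purely illustrative mode names, button names and actions (none of these are real DCS bindings):

```python
# Three binding modes: grip released, grip held, and virtual hand grabbing.
# All names below are hypothetical examples, not actual DCS controls.

BINDINGS = {
    "grip_up":   {"A": "recenter_view",  "B": "open_menu"},
    "grip_down": {"A": "left_click",     "B": "right_click"},
    "grabbing":  {"A": "weapon_release", "B": "trim_reset"},
}

def current_mode(grip_held, hand_grabbing):
    """Grabbing takes priority over the plain grip state."""
    if hand_grabbing:
        return "grabbing"
    return "grip_down" if grip_held else "grip_up"

def resolve(button, grip_held, hand_grabbing):
    """Map a controller button to an action for the current mode."""
    return BINDINGS[current_mode(grip_held, hand_grabbing)].get(button)

print(resolve("A", grip_held=False, hand_grabbing=False))  # recenter_view
print(resolve("A", grip_held=True,  hand_grabbing=True))   # weapon_release
```

The point of the priority rule is the one made in the post: the game already knows when the virtual hand is grabbing something, so that state can safely override the ordinary grip modifier.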
-
We are not talking about the MFCD alone. We are talking about the limitations of the pod itself. The Litening TGP has a 640 x 512 resolution camera; you cannot get better than that by any means. It has two optical magnifications, Wide and Narrow, and that is the best it gets. Either one gives you roughly a 500 x 500 pixel video, because the image is cropped to a square and part of it is used for stabilization and such. And that is the best-case scenario: each digital zoom level shrinks it further. Digital processing exists, but it is nothing amazing; blur is blur no matter how you try to deconvolve it when you don't know exactly what caused it, or when you have to do it in real time. A 9x digital zoom on a Litening gives you, at best, about a 52 x 52 pixel source blown up to full screen. That is blur on blur, just blobs everywhere. And that doesn't even count the original video being soft from all the atmospheric effects it has to see through.

A military targeting pod is not even better than binoculars. Go look around with binoculars and you will see that, at various times, things just get soft; even looking down from high altitude, the problems are there, and there is nothing you can do about it. Enable FLIR and it is soft on soft: thermal imagery at that camera class is simply not highly detailed. In DCS it is as sharp and detailed as any other video you will ever see. In reality you take that already-soft video, capture it through digital zoom processing, and get soft video out, which is then shown on a low-resolution display that adds its own pixelation instead of the sharp 1024 x 1024 we have now. So you get pixels and you get softness.

The AV-8B Harrier just received the 4th-generation Litening (2008) from Razbam in the 2.7 update. It is the best there is now: 1024 x 1024 FLIR and CCD, multi-target capability and so on. And no, the G4 is from 2008, the Viper's pod from 2007, and the Hornet's from 2005.
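The arithmetic behind the zoom claim above is simple: digital zoom adds no pixels, it only crops and enlarges, so each zoom factor divides the side length of the source region actually shown. The ~500-pixel usable image and the 9x zoom come from the post; the exact result depends on how the crop is rounded (the post quotes ~52 x 52), so this sketch only shows the idealized integer division.

```python
# Digital zoom shrinks the number of real sensor pixels on screen.
# usable_px ~500 per the post's description of the Litening's cropped image.

def source_pixels(usable_px, zoom):
    """Side length (in real sensor pixels) of the region shown at a given digital zoom."""
    return usable_px // zoom

for zoom in (1, 2, 4, 9):
    side = source_pixels(500, zoom)
    print(f"{zoom}x zoom -> {side} x {side} source pixels stretched to full screen")
```

So at 9x zoom the display is being fed from roughly a 55 x 55 pixel patch of sensor data, which is why everything past moderate zoom levels is blobs.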
Because ED doesn't want to provide more realistic modules where compatible systems become available as the mission year advances, you are always stuck with one specific year, no matter how realistic it would be for something to be available. Targeting pods, for example, are independent systems: they do all the work themselves, just feeding video to the cockpit and receiving basic control inputs. All the fancy things a pod does, from tracking to datalinks, happen inside the pod itself. It is basically plug and play.

It is simulated too well; that is a fact. You are looking at a perfectly rendered world without any atmospheric limitations, through perfect optics, with no technical limits on resolution, without any of the digital processing that "enhances" some characteristics at the expense of others, output on a high-resolution, clean virtual display. There is a major difference between looking at a 1024 x 1024 video captured in perfect conditions with perfect stabilization on a perfectly rendered 1024 x 1024 display, and looking at something that starts as a 52 x 52 pixel crop played back on a 640 x 640 display (or whatever it is). Finding things with these old targeting pods is soft on soft on soft. It is squinting at what you are seeing and making assumptions like "they say it is next to the building, so that must be it", never getting a clear picture of what it really is.

https://www.youtube.com/watch?v=XFj6f9L827A

Even though those clips were recorded to 8 mm tape, you can judge the underlying digital quality from the symbology. The YouTube compression is not what degrades it, yet you can easily see how targets at 3-5 nmi are blurry without any digital zoom, and at longer ranges the FLIR gets even worse. This is why the Sniper pod is such a major upgrade over these old pods: starting from such low quality, it is easy to do much better. And in DCS, that is not the case.
We have perfect quality, where you can count almost every finger a soldier has from 10 nmi at maximum digital zoom. We just received the 2.7 update with the new clouds; that was only the first step of many for the whole system, and one major part still coming is the humidity simulation. We are also getting a new FLIR simulation, and far better modeling of the optical challenges.

Right now we don't even have a proper laser beam simulation: no reflection rules, no angles, nothing. Some weapons had an artificial 8 nmi (or was it 18...) limit on beam length, so if you were outside that range, the beam ended in mid-air and the weapon hit that point in the air. Meanwhile, you can point the laser at the far side of a vehicle and the weapon can still impact it from the opposite direction, because nothing checks the line of sight as long as there is no building (AFAIK) or terrain in between. A common JTAC laser designator has, in excellent conditions, maybe a 3-5 km maximum designation range and a 5-10 km laser ranging range, and its battery life under use is counted in minutes, not hours. Add some heat shimmer, dust, moisture and so on, and those ranges drop quickly to much shorter distances. We simply have totally unrealistic targeting pod quality. It is being worked on, though; the F-14's own pod is already a far better simulation, even if it too needs a little nerfing.
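The missing line-of-sight rule described above (a weapon hitting a laser spot it physically cannot see) comes down to a simple geometric test: the lit surface must face both the designator and the seeker. A toy sketch of that check, with made-up vectors and helper names (nothing here is DCS code):

```python
# A laser spot is only usable by a seeker if both the designator and the
# seeker are on the lit side of the surface. Illustrative geometry only.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def spot_visible(surface_normal, to_designator, to_seeker):
    """True if both designator and seeker face the lit side of the surface."""
    return dot(surface_normal, to_designator) > 0 and dot(surface_normal, to_seeker) > 0

# Surface faces +x; designator is in front of it, seeker approaches from behind.
normal = (1, 0, 0)
print(spot_visible(normal, (1, 0, 0), (-1, 0, 0)))  # False: spot is on the far side
print(spot_visible(normal, (1, 0, 0), (1, 1, 0)))   # True: both on the lit side
```

A real implementation would also raycast for terrain and buildings, which is roughly the only part DCS currently checks.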
-
Some improvements have been made. For example, the VR view is now shaped by what the optics can actually show: instead of rendering large rectangular screens, oval-shaped areas are rendered. Clipping those corners has a good chance of cutting 15-30% of wasted rendering work, since you can't see them anyway. You can see the effect in the reprojection, but that doesn't matter, because the VR view is what counts.

My primary visual glitch is the whole cloud layer jumping up and down. The shimmering at the horizon is not so bad, but the constant feeling that the world jumps around is, so I expect that to be fixed soon. Have you disabled the VR option for bloom? Get a video of the light sources so it can be fixed.
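The claimed 15-30% savings from the oval mask is easy to sanity-check: an ellipse inscribed in a rectangle covers pi/4 of its area, so clipping the corners skips about 21% of the pixels in the idealized case. Real headset visibility masks are irregular, which is presumably why the post gives a range rather than one number.

```python
# Idealized corner savings when rendering an inscribed ellipse instead of
# the full rectangle. Actual HMD masks differ; this is just the upper-bound
# sanity check on the 15-30% figure.

import math

def corner_savings():
    """Fraction of a rectangle NOT covered by its inscribed ellipse."""
    return 1 - math.pi / 4

print(f"idealized savings: {corner_savings():.1%}")
```

That idealized ~21.5% sits comfortably inside the quoted 15-30% range.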
-
You are repeating the findings I have been discussing all along.... Check point #3. Guess who was the first one in this thread to bring up this problem? So why are you arguing against me when you are agreeing with me?
-
reported Overly dark when flying in shade under a cloud
Fri13 replied to jrowland96's topic in Weather System Bugs & Problems
The problem is that you would want the game to automatically adjust the lighting when you fly under a cloud shadow, brightening the shadow the same way your real eye adjusts itself when looking at the display. It is surprisingly dim under some cloud shadows. It can be even darker than being under a sunshade (like an umbrella), because the cloud covers such a large area that no reflected light reaches the shaded spot, while under a small sunshade the surrounding lit area reflects light in and illuminates the shade. One of the major things people don't get is that diffused light can give very dim illumination while still looking bright, because the high contrast needed to tell highlights from shadows is lost. I think the lighting is about right at the moment. There is always room to tweak things, but the first weather I tested was the heaviest rain preset, and that was an amazing experience: you really got that dim feeling where everything seems like someone just turned the lights off. And what DCS offers is not even as dark as it can get, after which normal lighting starts to feel super bright. We need a little more contrast, like contrast at cloud shape edges (we might not yet have proper cloud formations for that, so it may come in the future), as well as sharper, darker shadow edges. -
You should clear your configs and remake them. Here is how it works for me: as you can see, nothing like the wrong fingertip angle you report is happening on my side. And you can also see the problem I mentioned: every switch and button goes crazy while the finger stays in its activation zone, because there is no limit on how many times it can be actuated per second. So finger operation gets difficult when a switch fires 50 actuations per second; unless you manage to "swipe" over it quickly, it can toggle a couple of times and end up back in the position you wanted to flip it away from. Below is a video of the problem I mentioned at the beginning, the VR hand being seriously offset from the real hand. In most VR games this is not such a problem, because the hands are properly angled, and many offer a configuration option to adjust the virtual hand angle (usually a gun grip angle) on the Y and X axes. Some games even offer automatic virtual alignment: either you point at an object with your real hand and the virtual hand is automatically adjusted to point at it, or you point at the object with the virtual hand, which locks that direction, and then move your real hand to what you see and feel is "correct", and the virtual hand angles are adjusted to match the real hand. Here I am pointing at the two red objects with my real finger, the way I would point and say "Press this". Then I show the offset angle in both the X and Y axes, how much the virtual finger deviates from my real finger. In Oculus Home the virtual hands are wrong as well: they also point in the wrong direction, but if I enable the VR controllers to be shown instead of the hands, the controller models align perfectly with the real controllers. The problem is Oculus's, and now ED's, own "optimized hand angle/position".
-
reported Overly dark when flying in shade under a cloud
Fri13 replied to jrowland96's topic in Weather System Bugs & Problems
Actually there are black and white films that can capture far more than the human eye can see at once. We are talking about roughly 10 stops that the average human eye sees in a single glance. But the eye adapts so fast between highlights and shadows that it is like a camera aperture rapidly controlling the amount of light. It is an extremely rapid process, and the brain does amazing work building the picture in the mind, even though we only see about a 2 degree FOV as a sharp area. It would be good for people to go out with a spot meter and check how much the luminance actually changes between shadow and direct sunlight in various scenes. -
reported Overly dark when flying in shade under a cloud
Fri13 replied to jrowland96's topic in Weather System Bugs & Problems
You are talking about dynamic range, and it fools people: the human eye does not have a wide instantaneous dynamic range but very quick adaptation to exposure changes. One can take photos like the ones above, compare them in the scene, and see they are exactly as the scene appears to the naked eye. Nothing odd there. But our problem is that DCS World is not simulating cameras; we are limited by display dynamic range. An average monitor manages maybe 5-6 stops. Even the so-called "HDR" displays are very limited, unless we are talking about real HDR monitors that cost tens of thousands. No matter what is done, the real scene cannot be captured, even in a 3D environment, and presented realistically via a monitor. But it doesn't matter: 5-6 stops is plenty, and 7-8 is amazing. You can capture just about everything in nature within 10 stops, where the 1st and 10th stops are pure white and pure black, leaving 8 stops for everything else. The problem is that the human eye adapts so well, and the brain creates the illusion that it sees everything at once. Looking out a north-facing window at midday, the outside looks bright and nice; then looking back inside, that looks bright and nice too, but really the eye adapted quickly and the brain created the illusion that indoors is just as bright. We need to wait for ED to complete the weather engine, as not all clouds are the same. Some diffuse light only slightly, cutting maybe 1-2 stops, while the cloud right next to it has a different water composition and cuts 5-6 stops. One day there are lots of clouds that just diffuse the sun softly everywhere; other days they are hard, tightly packed clouds that block sunlight as if it were about to rain, even though they look totally white and puffy. People just usually love high contrast and colorful imagery.
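The "stops" arithmetic here is just base-2 logarithms of luminance ratios: one stop is a doubling of light. A quick illustration (the luminance values are made-up examples, not measurements):

```python
import math

def stops_between(bright, dim):
    """Number of photographic stops between two luminance values
    (each stop = a factor of 2 in light)."""
    return math.log2(bright / dim)

# Direct sunlight vs. deep cloud shadow, hypothetical cd/m^2 values:
print(round(stops_between(10000, 156.25), 1))   # 6.0 stops apart

# A display spanning 6 stops covers a contrast ratio of 2**6 = 64:1,
# far short of a 15-16 stop outdoor scene (2**15 = 32768:1).
print(2 ** 6, 2 ** 15)                          # 64 32768
```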
That is again one of the things that has driven the recent shader mods to increase saturation, even when it looks unrealistic and very unnatural. Trying to present a 15-16 stop environment through a display capable of 7 stops is not an easy task. -
Empty + under 200 L of fuel... If that is overweight....
-
Yep. I was expecting this would have been fixed. But I can't read the displays, so it's back to configuring Display_StrokeDefs.lua again to get it fixed.
-
People should call it what it really is. Page 230, TAC-000:

"The thermal cuer is not a hot spot tracker, but a delta T-cuer. Thus, when flying over a coast line or river, one could expect cuers along the beach or riverside where there is a large temperature difference. The thermal cuer does not incorporate any target recognition logic either; thus, any object which meets the preprogrammed target size, falls within the cuer capture gate, and has the required delta T for the TEMP sensitivity setting is a potentially ″cueable″ object. Cattle, trees, and rocks are often cued when not desired. Figure 1-103 provides some general guidance for use in determining TEMP settings. Mission requirements, EOTDA PAR’s, and pilot experience must all be considered in order to establish optimal TEMP settings."

What is a "hot spot tracker"? It is a system that will only, and only, point out the hottest spots in the thermal image. It cannot be configured or adjusted; it does one thing and one thing only.

What is the Thermal Cuer? It is a system programmed to cue a chosen heat change: a chosen spot size and a chosen temperature delta anywhere across the visible temperature range, and finally the area in which the search is done. These configurable settings need to be modeled once ED completes the new FLIR:
- Gain control (MAN or AUTO)
- Nudges (Auto Gate and Off-Set)
- Reject (to clear the HUD of symbols like the VV, waterline etc.)
- FLIR menu (FLMR) to program the cuer function
- Temperature Sensitivity
- Capture Gate / Area of Regard
- Target Size
- Limit (already in, 0/4/8)
- Black Hot/White Hot

We need those functions and capabilities to properly use and utilize it, to get an authentic simulation of it. A hot spot tracker is what you find in, for example, a Maverick IR seeker, which does not have these capabilities.
It will try to lock onto either the hottest or the coldest spot it is pointed at, depending on the polarity setting (white or black crosshair). It doesn't have the logic to ignore unwanted thermal deltas, sizes or positions; it just goes for the coldest or hottest target it is pointed at.
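The difference can be sketched as a filter: a hot spot tracker picks the single hottest object and nothing else, while a delta-T cuer keeps every object whose temperature difference against the background, size, and position satisfy the pilot-programmed settings. A minimal illustration (the object fields and thresholds are hypothetical, not the real TAC-000 parameters):

```python
def delta_t_cuer(objects, background_temp, temp_sensitivity,
                 min_size, max_size, gate):
    """Return every object meeting the programmed cuer settings.

    objects: list of dicts with 'temp' (C), 'size' (m), 'pos' (x, y).
    gate: (x_min, y_min, x_max, y_max) capture gate / area of regard.
    """
    cued = []
    for obj in objects:
        delta_t = abs(obj["temp"] - background_temp)   # delta-T, not absolute heat
        in_gate = (gate[0] <= obj["pos"][0] <= gate[2] and
                   gate[1] <= obj["pos"][1] <= gate[3])
        if delta_t >= temp_sensitivity and min_size <= obj["size"] <= max_size and in_gate:
            cued.append(obj)
    return cued

def hot_spot_tracker(objects):
    """By contrast: always the single hottest object, no settings at all."""
    return max(objects, key=lambda o: o["temp"])

scene = [
    {"temp": 35.0, "size": 6.0, "pos": (10, 10)},   # warm truck-sized object
    {"temp": 60.0, "size": 0.5, "pos": (12, 11)},   # tiny hot rock
    {"temp": 18.0, "size": 6.5, "pos": (50, 50)},   # cool object outside gate
]
print(delta_t_cuer(scene, background_temp=20.0, temp_sensitivity=10.0,
                   min_size=3.0, max_size=12.0, gate=(0, 0, 30, 30)))
print(hot_spot_tracker(scene)["temp"])   # 60.0 -- the tiny rock wins
```

The cuer returns only the truck-sized object: the rock is hotter but fails the size test, exactly the "cattle, trees, and rocks" problem the manual warns about when TEMP and size settings are too loose.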
-
A whiskey barrel. An advanced one.... Ready to be dropped to men in trouble.
-
Well, the pilots say that 450-500 km/h should give good stable handling, but compared to the F, which flies like a dart, the Bis flies like a beer bottle. The flight performance is otherwise identical in speed, turn rates etc. At the moment it is crazy that you need to use afterburner on landing: holding a 320-350 km/h landing slope, it easily drops like a rock, and the moment you put the afterburner on, the aircraft becomes stiff and easy to control, even though your AoA doesn't change and the speed stays the same until it (slowly) starts accelerating. It is almost constant dancing between mil power and AB to keep the required 3-5 m/s sink rate, as pulling the nose up too much bleeds speed quickly; easiest is to go almost full mil and fly the proper speed (280-290 at touchdown).
-
I don't know when it was done, but I just noticed in the 2.7 update that the MiG-21Bis mirror rotates when you open/close the canopy. Last time I recall, the mirror kept a fixed angle to the rear, as if the "virtual camera" never moved with the canopy. But it now rotates in the wrong direction: when you open the canopy to the right, with the mirror tilted slightly downward, you see ground up and sky down. Watch the canopy animation carefully and you see the mirror image rotating the wrong way and ending up upside down. The other issue is that the mirror center is still off to the left: sitting in the cockpit and looking at the mirror, the vertical stabilizer sits too far right, and I need to move my head left, off-axis, to get it centered with the fuselage. So the mirror camera is rotated slightly to the left. In VR this is more obvious with the left eye, but even with the right eye the mirror points too far left. The mirror camera should be rotated so it looks centered, with each eye seeing it slightly toward the opposite side depending on which one you use.
-
The AI in the P only needs to use the ATGM. Sure, it does spotting without the scope, but it is no longer required, in the same time frame, to use the 3/10x sight for engaging targets, as it would be with the YakB sight at close range when you perform overflights and passes in tight areas. The AI needs to perform different roles in how and when it spots:
- Naked eye: widest field of view, and the ability to scan close-range areas differently and focus on different target types.
- The YakB sight: aiming at targets in a 120 degree FOV area, pointing quickly from target to target and correcting aim while firing.
- The same optical 3/10x sight in the P as in the V: scanning areas at long range, indicating target positions spotted with the naked eye first so the pilot can find them easily.
Remove the YakB element completely in the P variant and only the naked-eye spotting and optical sight utilization remain. A third of the required work was eliminated with that simple decision. Right now, when searching and spotting targets with the naked eye, the WSO has two decisions: do nothing and just report to the pilot, or report and start using the optical sight if the range suits. The third option, deciding between the ATGM and the YakB, is gone. There is no need to think about what to shoot when only the pilot can actually shoot with the cannon or rockets; it is just pointing out where the target is or marking it with the sight. Example 1: you pop up over a hill to an enemy position 1-2 km ahead: a few trucks and maybe an APC. The WSO can't do anything about that, as you are not going to waste an ATGM on targets you could have taken out effectively with the YakB, while the pilot can also try rockets to destroy them. Now it is just the pilot aiming the whole helicopter to take them out with the 30 mm cannon or rockets.
Sure, the pilot wouldn't have the 30 mm cannon, but the YakB would have been enough, and the WSO would have had an easier time aiming and shooting the targets from the moment of spotting all the way to the overfly position where the gun gimbal can't reach them anymore. Example 2: you pop up on the opposite side of large open crop fields with a similar setup at 4-5 km. That is more about rockets and the 30 mm cannon than it would have been with the YakB, until reaching that 1-2 km range to engage effectively with the YakB. Now you can do it from 2-3 km with the 30 mm cannon or rockets. Example 3: the same scenario, but the target area is a platoon with APCs and an ATGM or AAA vehicle. Now you start considering the ATGM and rockets as primary and the 30 mm cannon as secondary. You don't want a shooting competition against a ZSU-23-4 with the 30 mm cannon; it is superior to you even with its 23 mm cannons. Example 4: the enemy position is at the edge of a tight open field between forests, your own troops are 150-200 meters away in the opposite forest and need support, but nearby MANPADS and AAA provide air cover, so you must stay very low and fly in from the only safe direction. Your engagement opportunities are very limited since you can't gain altitude: come in fast and low and you have a couple of seconds to shoot. Flying straight at them is risky, as you will take fire; a turning approach or near pass is safer, and you can still use the YakB (while rockets remain the preferred option for saturating large areas). That again is co-op, the pilot using rockets and the WSO the YakB, where the 30 mm cannon and ATGM are more useless. When the AI doesn't need to utilize one more weapon, it is easier to program. It is as simple as that: when the WSO can't use the ATGM, it just does naked-eye spotting and nothing else.
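The decision logic in those four examples could be expressed as a simple range- and threat-based weapon selector. A toy illustration of my own reading of them (the categories and thresholds are invented, not any actual DCS AI logic):

```python
def wso_weapon_choice(range_km, target_type, crew_has_yakb=True):
    """Rough weapon preference per the four examples above.

    target_type: 'soft' (trucks, infantry), 'armored' (APC, AAA, ATGM vehicle).
    Returns an ordered list of preferred weapons for the crew.
    """
    if target_type == "armored" and range_km >= 2.0:
        return ["ATGM", "rockets"]            # Example 3: stand off from AAA
    if range_km >= 2.0:
        return ["rockets", "cannon_30mm"]     # Example 2: soft targets, long range
    # Examples 1 & 4: close-in soft targets
    if crew_has_yakb:
        return ["YakB", "rockets"]            # V variant: WSO gun + pilot rockets
    return ["cannon_30mm", "rockets"]         # P variant: pilot does everything

print(wso_weapon_choice(1.5, "soft"))                       # ['YakB', 'rockets']
print(wso_weapon_choice(1.5, "soft", crew_has_yakb=False))  # ['cannon_30mm', 'rockets']
print(wso_weapon_choice(4.0, "armored"))                    # ['ATGM', 'rockets']
```

The `crew_has_yakb` flag shows the point of the post: drop the YakB branch and the WSO's close-range decision collapses to "spot and report".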
-
The comment box did it again: replaced all the text and quotes with a post made an hour earlier and slapped an attachment into it.... It sure does. DCS World got what feels like a completely new element for Air-to-Air, Air-to-Ground and Surface-to-Air navigation, combat and everything. My settings are below, almost all maxed out. On the Rift S I get a constant 40/80 FPS depending on map or area. The only time FPS drops below 40 is, for example, when firing a 5-second burst with the gunpods on the Mi-8 and the bullets hit the ground and all the dust effects appear; then FPS drops to a slideshow. But that has almost always been so, sometimes it happens and sometimes not. This 2.7 update has been a major performance boost for overall flying. I do experience slowdowns into single digits in two cases: 1) Adding an aircraft module to the map in the mission editor. The moment it is clicked onto the map, it runs at 1-2 FPS for 10-15 seconds until the aircraft appears, then all is fine again. Previously this happened only in the AMMO tab on the right when the 3D model was loaded, taking about 5 seconds, so this has to do with the new 3D model preview and loadout panel. 2) When selecting an aircraft to fly at mission start, for 10-15 seconds I see only the HUD and display symbols floating in the air, then suddenly the cockpit appears and all is fine. In the MiG-21Bis, for example, I had half the aircraft missing, seeing through to the nosewheel and radar cone. Everything is functional rendering-wise, and it flies smoothly, but I can't click anything until the cockpit appears.
-
Well, he is right on some points. The known problems mentioned in the patch notes are there (shimmering at the horizon etc.), but for me all the clouds jump up/down by something like 50-100 meters in various situations. It is pretty annoying in mountain flying when the overcast layer keeps jumping above you and it feels like you move up/down each time. The current cloud templates have very soft edges even at the highest detail. Hopefully we get the possibility of sharper (higher definition) edges among the nice soft ones. I also noticed that where I previously wanted 1.8 gamma, I am now at 2.2-2.3, as clouds otherwise look too dim/grey in direct sunlight (flying above clouds); but that hides all the cloud-top forms, as you can no longer see the shapes there, and the terrain becomes too bright again. It also makes cloud shadows way too soft. I have yet to go through all the templates, but none of them has so far given me realistic sharp shadow edges, the kind that span just 15-20 meters on the ground, because the clouds are too soft-edged. All kinds of clouds can have very sharp edges, from a meter to ten meters. This is easily visible in summer when a cloud shadow moves across the terrain like someone pulling a large cloth over it, or when rain cuts off like a razor on the ground, as if someone had drawn a line with a stick between full rain on one side and no rain at all on the other. It is like stepping out of the shower, or out from under an umbrella used for sun shade. ED is of course developing these things further and fixing problems, and the start is excellent. I might have been hasty, but today I deleted my 2.5.6 installations (both open beta and stable), as 2.7 looks way too good to go back. ps. Surprisingly, increasing the antialiasing and pixel density made the ugly horizon shimmering go almost completely away.
-
I can zoom in and out by pressing the right controller button; nothing has changed there. The left controller has Reset VR View and the End Game menu, and the right controller has Zoom and something I don't recall (maybe nothing). When I was rebinding the Touch controllers with vJoy, DCS World listed conflicting bindings for the installed modules (I have only three installed right now, as I made a fresh install; maybe that is why I don't have the problems you describe): Y was a trim button, the trigger was the ICS and the grip was the machine guns. How do I work around the problems I faced? I tried to find such an app back when the Touch controllers first became supported as virtual hands only (no buttons or anything) and had lost interest in finding a solution until I came across this today. It would be nice to have the VR bindings override the bound functions. But you are correct that we should not be required to use any third-party software for input devices. All of them should be read directly by DCS World as common devices and be bindable as we want; they are all equal. ED needs to do all of it, not just one part or another: add the suggested features and configuration options. There are bug reports about this: even when controls are invisible, you still grab them, and those reports are years old now. All the problems need to be solved, not just one or another, and the solution is to add options so everyone can be happy. It actually makes more sense now to press the UFC buttons and such, since you do it with the pad of the finger rather than the tip, just as a real button press uses the fingerprint part, not the fingertip. Again, all switches and buttons are terrible as long as the input repeats for as long as you are touching the switch/button. You can't do a simple LTD/R flip, as it keeps flipping back and forth as long as your finger touches it.
That is what forces the idiotic "swipe" technique and trying to avoid touching anything else, or falling back to the laser pointer and mini-stick Up/Down to do the work. A simple fix is a delay setting: once a control has been operated, wait X milliseconds before it can be operated again. And you are not supposed to use your right hand to operate the left panel: you use the left hand for the left panel and the right hand for the right panel.
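That delay fix is a classic debounce/cooldown: ignore repeated actuations of the same control until a configurable interval has passed. A minimal sketch of the idea (the class and timings are hypothetical, not anything from DCS):

```python
class SwitchDebouncer:
    """Suppress repeated actuations while a finger rests in a
    switch's activation zone: pass at most one event per cooldown."""

    def __init__(self, cooldown_ms=250):
        self.cooldown_ms = cooldown_ms
        self.last_fired = {}  # switch id -> timestamp of last accepted event

    def try_actuate(self, switch_id, now_ms):
        last = self.last_fired.get(switch_id)
        if last is not None and now_ms - last < self.cooldown_ms:
            return False          # still in cooldown: swallow the repeat
        self.last_fired[switch_id] = now_ms
        return True               # accept the actuation

deb = SwitchDebouncer(cooldown_ms=250)
# A finger resting on the LTD/R switch at ~50 events/second (every 20 ms):
events = [deb.try_actuate("LTD_R", t) for t in range(0, 400, 20)]
print(events.count(True))   # 2 -- instead of 20 flip-flops in 0.4 s
```

With the cooldown in place a finger resting on the switch toggles it once, and a deliberate second flip still works after a quarter of a second.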
-
Because the phonebook has one wrong number for one contact, that doesn't render all the other contacts' information invalid....
-
In the MiG-21Bis that happens. You can see the IR missile seeker pipper jump to flares, so if you launch at that moment, the missile will definitely go for the flare (and never for the aircraft itself; yet you can launch while locked on the aircraft and still have the missile go to a flare... lol). I don't even know anymore what the realistic behavior of the MiG-21Bis gunsight is. At least with rockets and the gun in A-G mode you should get a CCIP calculation, but for bombs you shouldn't get any CCIP/CCRP functionality; you need to bomb by the numbers. AFAIK the IR missiles should signal heat detection only by audio, not by placing the pipper on the target when the seeker is uncaged to it. And when the radar is locked on a target, the pipper should not follow the target but stay centered, with the radar pointing angle visible only on the radar scope. It would be awesome to get proper IR effects for missiles and targeting systems, and we are getting there. Razbam just added the "Hot Spot Detector" in 2.7 (in fact it is not a hot spot detector but a delta-T detector: the pilot selects the target size, the temperature range from the scale and some other values, and the FLIR symbolizes wherever something matches those parameters, so the hottest targets, the literal hot spots, are not what gets detected; you can just as well detect the coldest or middle ones, or wherever there is temperature variation). And ED is working on the new improved FLIR, which should account for the new clouds and other reflections that lure IR missiles, give FLIR cameras a realistic look (no more units popping out automatically as bright hot objects), and make flares emit actual heat. Hopefully we someday get chaff modeling and proper radars, so targets would appear and disappear on radar at various ranges and scenarios, and we would see radar get confused by chaff, with seekers having to really work against the CM instead of just rolling a die: "Do I go to the chaff/flare or not, even if it is 20 degrees outside my field of view?"
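The "die roll" complaint boils down to this: a physical seeker can only transfer lock to a flare that is actually inside its field of view, whereas a pure probability check ignores geometry entirely. A toy comparison (the FOV angle and probability are invented for illustration):

```python
import math, random

def die_roll_decoyed(resist_prob=0.7):
    """The criticized model: pure chance, geometry ignored."""
    return random.random() > resist_prob

def physical_seeker_decoyed(seeker_dir, flare_dir, fov_deg=4.0):
    """A seeker can only be lured by a flare inside its field of view.

    Directions are 2D vectors; the angle between them is checked
    against the seeker's half-FOV.
    """
    dot = sum(s * f for s, f in zip(seeker_dir, flare_dir))
    norm = math.hypot(*seeker_dir) * math.hypot(*flare_dir)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= fov_deg / 2.0

# A flare 20 degrees off the seeker axis: a real seeker never sees it,
# but a die-roll model can still be "decoyed" by it.
seeker = (1.0, 0.0)
flare = (math.cos(math.radians(20)), math.sin(math.radians(20)))
print(physical_seeker_decoyed(seeker, flare))   # False
```

A flare dead on the seeker axis passes the geometric test; one 20 degrees off never can, no matter what the dice say.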