Everything posted by Fri13

  1. I have said this from the start, and now you claim I am being condescending because you keep ignoring the fact that I have shown it is not broken for everyone. So stop the insults if you can't provide evidence that it is broken for everyone and that nobody still has the features the OP said were removed from the game.
  2. Agreed. If, and that is a BIG IF, we get proper FLIR and proper TV optics and all the rest, everything will change radically toward shorter ranges. (You know all this, so I'm not saying it for your benefit.) It would make a huge difference in how capable each aircraft is to operate. Example: NiteHawk -> LANTIRN -> Shkval -> DMT -> Litening II -> ATFLIR -> Litening G4 -> SNIPER. We would get genuinely different experiences and capabilities to fly and operate with on the digital battlefield. We already have far too accurate GPS coordinates, with no drift, no jitter, etc. If we also got proper jamming equipment, we could jam GPS for aircraft and weapons, deny their use, and force players to fly on INS alone. So many options for combat performance are missing, starting with the quality of the targeting systems. For example, one major strength of the AV-8B N/A is much better bomb delivery accuracy than a Hornet or Viper bombing by radar; you need a laser-guided bomb to get better, more consistent accuracy. Because the targeting systems are modeled so simply, we are in a situation where everyone is far too effective and capable. No challenge, and no tactics required like there should be.
  3. It is more annoying to be required to move your hand far away and back again than to simply hold it there and wait until you can operate the switch or button again (when it isn't one that needs multiple actuations anyway). So the size of such a bubble matters. It would fix the situation where a finger that has already passed through a control doesn't trigger it again, and it would help a lot: you could relax when pushing things instead of trying to avoid triggering something just because your hand went through it. I would take that only for panels, not for individual switches, because the last thing I want is a finger stuck to something. (A small sketch of such a re-trigger guard follows after this post.) Yes, as I have said from the start (if anyone reads and remembers anything), this is why it needs to exist as an optional setting, so those who need it can enable it and operate things from any distance. Better to make beams opt-in, since they are an assisting feature that spares the player from moving the hand as much. If someone can't manage without, they can enable assisting features like a beam, a dynamic pointer that snaps to the closest button or switch, or even haptic feedback (vibration). This is why I have always said that we need zoom and all the other assisting features (labels and so on), because there are players who require them. The problem is that people do not understand how difficult it is in reality to spot a military vehicle next to a forest edge, even when it isn't trying to conceal itself. A rule of thumb is that a vehicle of a similar colour becomes visible to a pilot at 1500 meters or less when it isn't moving. Add some camouflage pattern and it gets very difficult; add nets and branches and you can walk into it and hit your head on it before you spot it, unless it reveals itself. In DCS I could see an APC on the ground from 15 nmi with a Rift. Put a vehicle at the edge of a city next to a building, or in the open on a street, and it was as easy as having a label on it. This is why all kinds of terrain texture mods matter: you get clutter that starts hiding units the way it should. It is not fun to look outside and spot dozens of units in a 15 nmi radius unless they are inside a forest or happen to blend in with fake cars and such. With an AI Su-27 and F-15 dogfighting at 10 km you could see them against the ground, and at 5 km you could tell which one was which, so you knew which one to go after. Zooming was totally unnecessary, as it was possible to tell an MBT from an APC at 8-10 km when flying the Ka-50, and that in rainy weather. Actually, not so much: average vision is not that super wide, since the human fovea can't be moved across the whole FOV. The only sharp area is about 2 degrees, which also keeps the required megapixel count very low. They are for those who need them... As I have always said, a player's physical and technical limitations shouldn't deny them the enjoyment of DCS. If someone can't afford pedals, that shouldn't stop them from flying helicopters, because they can use assisting features. If someone can't see well or doesn't have a high-resolution display, they can use zoom. Etc. But I have never said that my standards or my ways are what everyone else should have or do. I just don't use many unrealistic things because I don't need them, just as I don't buy button boxes: I don't need extra buttons, because I use VR to operate everything in the cockpit other than the realistic HOTAS layout. A perfect system would be a real simpit, so that when you place your hand on something you see in VR, you feel it exactly as in the real world.
That is just impossible for a scenario where you want to fly multiple aircraft in one room.
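A minimal sketch of the kind of re-trigger guard described in the post above, assuming a simple distance-based hysteresis around each control; the class and parameter names (SwitchBubble, trigger_radius, release_radius) are illustrative and not anything from the DCS code:

```python
# Sketch: one actuation per entry into a switch's activation zone,
# with a larger "release" radius so the finger must clearly back off
# before the switch can fire again.
import math

class SwitchBubble:
    def __init__(self, position, trigger_radius=0.01, release_radius=0.04):
        # trigger_radius: how close the fingertip must come to actuate (metres)
        # release_radius: how far it must retreat before it can actuate again
        self.position = position
        self.trigger_radius = trigger_radius
        self.release_radius = release_radius
        self.armed = True  # can the switch fire on the next contact?

    def update(self, fingertip_pos):
        """Return True exactly once per entry into the trigger zone."""
        d = math.dist(fingertip_pos, self.position)
        if self.armed and d <= self.trigger_radius:
            self.armed = False          # fired; ignore until the finger backs off
            return True
        if not self.armed and d >= self.release_radius:
            self.armed = True           # finger left the bubble, re-arm
        return False
```

The larger release radius is what makes it relaxed to use: passing a fingertip through the zone fires the switch once, and nothing happens again until the hand has clearly moved away.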
  4. It was in the process of being removed, but it took years before it was really gone. Here is an example about the Marines: "The Marine Corps is at a crucial crossroads in its constant effort to remain technologically relevant. Currently, Marine F/A-18 Hornets are not authorized to employ laser-guided bombs (LGBs) when illuminating a target with its NITEHAWK targeting pod, due to the pod's low fidelity and increased chances of target misidentification. As a remedy, the Navy and Marine Corps, as well as F/A-18 air forces around the world, are in the process of selecting and integrating a new targeting pod. The contenders are the LITENING AT, in service with Marine AV-8B squadrons, and the Advanced Tactical Forward Looking Infrared (ATFLIR) pod, in service with Navy F/A-18 Super Hornet squadrons. Current plans have the Marine expeditionary (land-based D model) Hornets slated to receive the LITENING AT, while the Marine carrier-based (A+ and C) Hornets will receive the ATFLIR. However, the Marine Corps should equip its carrier-based F/A-18s with the LITENING AT instead of the ATFLIR, because the LITENING AT is equally capable, less costly, and more quickly available." https://www.researchgate.net/publication/235120937_The_Next_Marine_Corps_FA-18_Targeting_Pod_ATFLIR_or_LITENING That was in 2006. What we would plausibly have is about 80-90% flying with the NiteHawk and the remaining 10-20% split between ATFLIR and LITENING AT. Instead it is no NiteHawk at all and everyone flying with the ATFLIR in 2005. Players would scream at the idea of being limited mostly to the NiteHawk and having horrible visual quality at 10-40 nmi, since they are so accustomed to spotting and launching at all ranges.
  5. Everything above is true. And I should say that the loss of toe brakes is not a problem for other aircraft, as the pedals can be set to work as toe brakes for taxiing. I don't think 200 € for excellent pedals is much to ask; considering that even CH pedals can cost 270 € these days, the VKB are cheap. They are the best for helicopters short of far more expensive complete setups: helicopters benefit a lot from precise, small movements, and the lack of linear motion from the hips makes the VKB a more enjoyable option. But buying them just to buy the Mi-24P and fly it, I don't see as the first choice; a twist stick should be a good starting point even if it is very limited.
  6. Sorry to quote you like that; it was all interestingly said. I knew that with the MiG-21 the approach should be flown faster, timing specific speeds at the outer and inner markers, with an oddly rapid deceleration. What you wrote explains it better: you come in fast and complete the landing quickly by using the aircraft's speed to decelerate rapidly in a controlled manner. Yeah, insulting people and making guesses (AoA indicated below 10) is a great way to improve someone's airmanship... If you don't know, and you don't even consider that someone might need help, it doesn't help them.
  7. I can accept the footwells being lit too brightly, since they are deep and should be well shaded. But the other way around, the hard contrast between sunlight and shadow is gone, and that huge canopy can actually start acting as a diffuser, softening the light and scattering it in all directions rather than giving only direct light. With clouds and so on there are plenty of other reflections adding new directions as well. It is maybe too soft at the moment, but better than before, IMHO.
  8. So could we see a G4 or something in the A-10C II? I wish we could get the older targeting pod for the Hornet, the Nite Hawk or whatever it was. Something more appropriate for the year than what was just added in 2.7... Just offer it as an option. Three pods for the Hornet would be a nice thing: Nite Hawk, Litening II, ATFLIR, each with its own characteristics and time period.
  9. Please don't hurt me by telling me they haven't even done the new pod properly... Please say they succeeded, because the next time I take the Harrier up I want to check out the marvelous new targeting pod... I already threw in the towel: the first thing I tested was the TDC Action mode from the special settings, then I went to use the DMT/TV, and they didn't even model TDC Action properly. Same slew speed all the time, regardless of anything. Why add such a feature back when it doesn't work? I then checked whether the clock's Zulu time was fixed... and exited, as I couldn't really take any more. Such a unique airframe. Such a special and unique targeting system. Such unique flight capabilities... all at the level of the Su-25T... Well, maybe everything changes once the new FLIR system is added by ED...
  10. I don't remember where I read it, but someone commented that he had contacted ED about the targeting pod limitations in gimbal speed, stabilization and so on. He said he was a civilian engineer who worked on those targeting pods, and that he offered ED public (non-classified) information about the systems, saying the current gimbal speeds are far too fast, accurate and responsive, and that the gimbal would not stay on target or maintain lock at the maneuvering rates we have now. So, as you say, "the devil is in the details", and that part of the details is what makes simulators interesting: the challenge, the designs, the technology, the future improvements and benefits. When a 1996-1999 targeting pod works as well as a 2019 one, a lot is wrong right there. This attention to detail, in the bad and the good, is what drives people from Flaming Cliffs to DCS World. It is what makes the Cold War era so interesting, given the technology made back then, the space race, the fighter development and so on. Just a few days into 2.7 and it is an amazing experience to fly CAS in scenes where you have terrible visibility of the ground, when you need to get in close inside a thunderstorm because your sensors are useless, visibility is bad, and you still need to do the job. Then you climb up out of the clouds into sunshine, and everything feels like you don't want to go back... If we had targeting pod limitations, challenges and the rest, it would make one appreciate the more modern systems, because it would show how it was in the past. Edit: "Don't even get me started on how "well" the DMT/ARBS is "modeled" on the harrier right now." Damn you... how it hurts again to see such an amazing system modeled so terribly wrong... so badly that the Harrier should be part of the FC3 package instead of being DCS-branded. It would be amazing if you really needed to use the DMT to get a proper lock on a target, with all its challenges, or to use the INS mode correctly by making proper visual corrections, and then fly so the ARBS can do its calculations. So many things would need to be done right to deliver accurate bombs, which was the Harrier's huge advantage over the Hornets, Vipers and the rest.
  11. Better than in DCS, or than in reality? (Well, for the era.) This is why I would like to see more of those as optional loadouts. Why did Razbam need to remove the 2nd-gen variant from 1999 and replace it with the 4th-gen one? Why not keep both? It would have kept nice options for various missions, especially when you get to compare the video quality and the controls... not to mention software tracking capabilities, etc.
  12. I don't use zoom. And where did I talk about using zoom there? So you claim that sitting in the co-pilot's seat to operate its functions is unrealistic, but operating those functions from the commander's seat with a laser beam is realistic? Okay... Don't build straw-man arguments about zooming. Zooming is unrealistic unless you are using binoculars (which pilots do happen to carry). In reality you can't spot things anywhere near as easily as you do in DCS; in DCS you have superior vision the whole way without needing binoculars, and zooming makes it all even more unrealistic. Is it a game or not? Yes, it is a game, but it also tries to offer as realistic an experience as the player wants, and the player has the option to go for means as unrealistic as they like. It is not a double standard to say that zooming is unrealistic, since in reality you have no way to "zoom" without optical devices, in a discussion about flying as commander or as co-pilot. In reality you do not swap seats in the middle of a flight so that the commander can select the weapon pylons; the co-pilot does that. If you have trouble with that, then you are the one with double standards when you use it to argue that laser beams are realistic, with the commander just extending his left hand from the other side of the cockpit and saying "Sorry Ivan, I'll just rotate that knob right there...". One could very well write a simple voice-command macro that rotates that specific knob X times clockwise and put it behind a phrase like "Ivan, give me GUV, please!", and the knob would magically rotate to GUV, since no AI is doing it. With a little more effort, saying "Switching to 7.62" could run the macro from the co-pilot's seat properly, and "Switching to pylons 2 and 5" could set that as well. But the way you would really do it with another human being is how you do it in co-op: the other person sits there and does it when you communicate. Or, when flying alone, you simply tilt a hat switch and you are the co-pilot and do it yourself, then tilt the hat again and you are back as commander. You just played two separate roles, because your friend wasn't flying with you that time and there is no AI that understands what you are about to do, and no flight engineer who prepares the weapon systems for launch. This is why Heatblur developed Jester and Iceman, so you don't need to be in both seats when you fly alone. This is why ED is developing its AI for the Mi-24: even there you need to perform some duties when playing alone, and they chose the Mi-24P partly because you can fly it from both seats and do everything required from the front as well as from the rear. Want the ultimate realism? Play with friends only. Not alone, not with AI... You can keep zooming as much as you want, but I don't, because it is unrealistic, and I don't need to, because I can read almost everything in the cockpits. My purpose is to minimize unrealistic things as far as is sensible and possible. You know this, as we have had this discussion before. Everyone has their own limitations, and those are understandable. Someone uses an old VR HMD that isn't detailed enough to read without zooming: fine, that is their limitation. Someone doesn't like the idea that their neck won't turn 210 degrees with a slight nod, so they stay on TrackIR. Someone has an amputated left hand and has to do everything with the right. Someone is blind in one eye but still wants to use VR.
Isn't gaming fun precisely because we get all the options and possibilities to enjoy something? Are there tens of thousands of players putting their names down so that these features exist? So why don't you? I have. I have given them more options for what they would need and want, helped them more than you have, and helped the developers see which features would need to be added to make things better than ever. And you are against it all, because you just want the old ways back. And you stated that it is more realistic to use a set of VR controllers than a HOTAS for the aircraft controls they fit... yet you fly with your left hand on a laptop, pressing the space bar to launch missiles and using comma, semicolon and so on to steer the TGP, because you don't have a throttle where those functions would live. They have already redeveloped the controls, that is the whole point! They have already taken more steps forward, not stayed behind. Now you can use the right controller to operate the cameras and move across the battlefield (or at least I have just discovered it), as they are now adding the core RTS elements to the game. As I have told you before, and you know very well, I did use them a long time ago. But they are not realistic the way a proper HOTAS is; they don't offer the capabilities a proper HOTAS does. If you want to experience flying an aircraft with one hand on the throttle and one on the stick, you need a throttle and a stick to put your hands on. Those are not the same thing at all, and they don't offer all the same things. But as for having the proper functions on each proper device and otherwise using the clickable cockpit: ED has offered all that, down to small details, with almost everything clickable and usable. All you need to do is loosen your grip on the stick or throttle, reach your hand to the system and use it, then put your hands back on the throttle and stick that carry all the proper functions and nothing more. The OP falsely claimed that those systems have been removed. They have not. Do you understand what I wrote in the first message? Do you understand what the original poster wrote in his bug report? I understand very well that you have only two real bugs here, which the OP didn't even talk about: 1) no laser beam (a matter of opinion whether it should be there or not), and 2) when you grab a virtual control with a VR controller, the trigger and buttons output nothing (not the topic of this thread, so start a new one for it). But the claim that the common bindings were removed from the Touch controllers is false: they are implemented, they are there to be used, and nothing needs to be done to get them back, because they never went anywhere. If you had read my first post, you would understand that claim was 100% false. So, back to the topic: as the evidence you have been given shows, only the laser pointer is gone; the bindings are still there and usable. And it is good for some of us that the laser pointer is gone, but we understand that players like you want it back. What we do not accept is that it be forced back so we have to suffer it again. Instead we recommend that it become an option in the VR settings, so the player can enable the laser beam if they want it and everyone has the choice to have it Off or On. But since you do not accept that, it is the same as forcing others to use their gaming devices only the way you want them to.
  13. Did you do a clean install, removing all the previous version's configs and saves? Everything that the OP says no longer exists for the mini-stick is still there. Those features have not been removed or replaced. What has been removed is the laser beam (a topic already discussed), and a bunch of new things have been added, like the VR camera controls on the right hand. So tell me what does not work, compared to what the OP says needs to be brought back: "Before I didn't have to setup the controller in the usual input bindings, the trigger did left click but the thumb stick would rotate or click switches up or down. Now the beam is gone so I can't easily see what I'm pointing at and only the trigger works, so I can't turn dials or flick switches easily anymore. Any idea when these things would be back as at the moment the game is pretty much unplayable and 2D is not an option for me." So were those things removed? No. Are those the usual input bindings? Yes. Do they need to be bound? No; that is impossible, since you can't bind anything on VR controllers. Everything the OP says is missing (except the laser beam) is still there. Not missing, not gone...
  14. That is not the topic of the OP's bug report. He says nothing about the mini-stick doing left/right mouse clicks, rotating knobs and all that. Nothing like "Hey, when I grab a virtual stick with the VR controller the trigger doesn't do anything, even though the index finger of the new gloves moves as if it were pulling a trigger." Yes, I have acknowledged that from the beginning... It has been missing since day one, years back, that we cannot bind all the buttons and everything as required on the VR controllers when we grab things; we are relying on a few specific ones. And as I have explained, when going through the binding process with vJoy, you can see that those are still bound on the VR system side. If you try to bind them anyway, a conflict appears and the system (as with binding the same button/hat/axis twice anywhere) asks whether you want to remove the existing assignment. We need an official way to bind the trigger to do what we need it to do. We need a way to bind a TDC axis to the mini-stick, or even to put an NWS/Undesignate or Lock function on the buttons, so that nobody is required to use a keyboard or anything else for the most basic things (naturally you can't fit everything the Warthog throttle has onto a left Touch controller, but the very basic things can be done: TDC, Lock, China Hat Forward). Zoom is one of those features that not everyone needs, or maybe someone would like it on the left controller instead of the right one. "Fix the cause, not the symptoms..." Even if zoom and the trigger started working again, it wouldn't fix the problem. It doesn't improve the game for those who are forced to fly with one hand on a laptop to slew a TDC, launch a missile, or release bombs, because they can't use the left or right VR controller to do those basic things.
  15. I don't use zoom. It is 100% unrealistic. But as I said, it is already acknowledged that bindings are gone when grabbing something; that needs fixing by implementing a totally new binding system, not by reverting to something old that was left behind on purpose. The mouse cursor doesn't move for me; it stays in the same position in DCS as in the Oculus menu or virtual desktop. The problem, as I explained, is that the Oculus hand itself is offset wrong for me in the same way. It was correct some time ago, around when the Rift S came out, but on the CV1; some patches later it drifted off, even though the 3D-modelled Touch controllers are aligned exactly right. Fixing something that new code introduced is not done by simply patching the old code back in; it is easier to just do what needs to be done. It is not like changing car tyres, where if a new tyre is flat you can put the old one back on until the new one is fixed. "Yet there are tens of thousands who do not complain..." Sorry, but that is not an argument; it is just evidence that some people want something else. And as I have said, everything is fine as long as it is optional, enabled by those who want laser beams, custom bindings, pink flying horse gloves, whatever. The more options the merrier. The common issue was that we had a laser beam coming from our fingertip no matter how far away the target was. Any time you used your hands it was a laser show inside the cockpit, completely unrealistic and immersion-breaking. It is far better to have the laser beam gone for short ranges (within arm's reach), because your task in the cockpit is to actually reach up to the control and do something. Yes, make it an opt-in setting in VR, as I have said from the start, so those who need it can enable laser beams and be happy. But at some point people need to understand that if they keep hitting their joystick on the table, the problem is the location of the joystick and the table, not the game asking you to move the joystick as far as the joystick allows. If someone builds themselves an 80 x 60 cm box and wants to play room-scale VR games in it, it is their own problem that they didn't make it at least 2 x 2 metres. As I have said from the beginning, there are things that need implementing and fixing. Part of bug reporting is saying what you think is wrong and suggesting a solution. If a game crashes when you press button A, there is no solution other than "make it not crash". But when the bug report is about bindings, usability and features, every opinion counts. Hence: more options. Which options we need, who wants which kind, and so on. If you want something improved, you need to say what you want. If someone just shouts "IT IS BROKEN BECAUSE WE DON'T HAVE LASER BEAMS!", then simply adding laser beams back would be 100% wrong, because it is great that we no longer have them, and that is not only my opinion. So what do you do when two groups of people hold opposite opinions on the same question of laser beams ON or OFF? You suggest making it an option in the VR settings: enable the laser beam if you want it, leave it disabled if you don't.
When a usability improvement is made by angling the virtual hands so that you can point at things in a more relaxed way, it is counter-productive to demand that they cannot be in that position and must only be in position X. No, it needs to be a setting where the player can adjust the virtual hand position and angle to suit their own purposes if the default doesn't please them. People need to understand that bug reports are exactly about discussing what should happen and why. When simulating reality there isn't much room for opinion, since the only factor is reality: if a light switch works only up/down in reality, it doesn't help to say you want it to work left/right. But when it comes to controller bindings and the like, the considerations are per player: someone doesn't want a zoom button, someone does; someone wants hold instead of toggle, someone the other way around; someone doesn't want to use VR controllers, someone does. If one group declares their preferred setting the only way, then after the next update there will be a bug report requesting the opposite. The only choice is to gather the opinions and provide options, so people can choose how they want their controllers to behave. This is not the right place for that, and you know it. If they feel something is broken, they can share their opinions and findings. I already did, and I pointed out the very problems you now say I am against, and I came with suggested solutions, while you just want everything to go back to the old way. We finally got improvements and you want to roll back "until things get fixed"... and what would the fix be, then? Really, let's look at what the OP says: "Just installed 2.7 and my VR controllers have lost the laser pointer (on grip) and key bindings." Yes, he lost the laser pointer; we have all agreed on that. But people who demand it back should not ignore the opinions of those who want it to be optional, not a forced change. It is as wrong to demand it for everyone as it was to remove it from everyone. The solution is to make it optional so everyone can choose to enable or disable it. Then the other point, the "lost key bindings", which is this: "Before I didn't have to setup the controller in the usual input bindings, the trigger did left click but the thumb stick would rotate or click switches up or down." And that is false. It is all there. It all works exactly as before, unlike what the OP says: the trigger is left click, the thumb stick rotates or clicks switches, and so on. It is ALL THERE! So the first half of the bug report is simply false, because nothing was lost. Now the second part: "Now the beam is gone so I can't easily see what I'm pointing at and only the trigger works, so I can't turn dials or flick switches easily anymore." Let's split that up. "Can't easily see what I'm pointing at": can't everyone see where their own finger is pointing? Again, in reality I don't need a laser beam from my finger to press keyboard keys, click a mouse button, or flip a light switch on the wall. It all works correctly because the touch point is in the proper position on the finger. But of course people want to click things in the pilot's cockpit from the rear RIO cockpit, or reach from the commander's seat into the co-pilot's seat in the Mi-8 or UH-1.
The real aircraft is designed so that some things are easy to reach and some are harder. Important controls are made more accessible while unimportant or rarely used ones are put aside. That is why some functions sit in awkward places and some are right in front of you, and why the HOTAS carries the important functions so nothing else needs to be pressed, while some tasks require moving your hands. It is a pretty moot point for someone to say they can't move their hands without hitting the table, when they fly only in VR because they move around so much that they can't carry a HOTAS with them when travelling. If all they need is a simple chair, then move that chair a little further from the table so there is more space. Again, if their problem is that they can't have all the functions on the VR controller and must rest a hand on the laptop, maybe they should instead support the idea of fixing that: making the VR controllers bindable for various functions. They could use the mini-stick as a TDC hat and the mini-stick press as TDC Action/lock, the trigger as the trigger and one of the buttons as the bomb release. Then they would no longer need to move their hands onto the laptop for tasks they previously had to, or keep the laptop nearby at all. Properly bindable buttons, hats and the rest on the VR controllers would fix a lot of problems and improve the gameplay at the same time. The other part of that sentence just repeats that the OP can't turn dials or flick switches, while that has been possible ever since the VR hands gained functions at all (at the beginning we just had floating hands that couldn't operate anything). And then the last part of the OP's post: "Any idea when these things would be back as at the moment the game is pretty much unplayable and 2D is not an option for me." They are back... all those functions are there as if they never left! The trigger is LMB, the hat clicks LMB/RMB and rotates knobs, and the fingertip works just like before. Everything is as it was, except that no laser beam is shown at close range. It is all right there already... The OP says nothing about "I need to loosen my grip on the virtual stick to zoom in" anywhere in his post. There is no "I can't fire the cannon with the trigger while holding the stick", and how is that different from "I play with my left hand on a laptop, using space, alt+space, ,.- and l to move the TDC around"? Because those functions aren't there: yes, the trigger animates the virtual trigger finger on the stick, but that is it. And when someone suggests the proper fix, namely a way to bind VR controller functions in two modes plus a grabbing mode (stick/throttle/lever), it gets dismissed as "not a fix", with the demand that the old mechanic be brought back instead. Since when did the old 2.5 version let you slew the TDC with the left controller's mini-stick while grabbing the throttle? Or have a bomb release button, trimmer and trigger on the right hand while grabbing the stick? If the system still has all those things in place, they have not gone away; and if something is not gone, it cannot be "brought back". It is possible that people simply updated to 2.7 without clearing their settings and have a conflict there. Those things happen. If some people have all those functions working as they should and some don't, the latter have a problem, but it does not mean the new features need reverting, because that is not a fix!
Should developers change the game's code because a player has broken configuration files left over from a previous version?
  16. It is more fun and immersive to have to move your hands around the cockpit than to just point at things with a mouse or a laser pointer. The actual process of reaching a button above you or flipping a switch behind the UFC is part of the experience. But you don't do that all that much. We can never have truly realistic things, like resting the left hand on the side of a physical UFC while the thumb presses the buttons, or bracing yourself to look back: https://youtu.be/WS4mZiIqzxU?t=458 Mostly you keep your hands on the HOTAS. That is why VR controllers are not good: you can't operate the aircraft in a realistic manner, or as properly as you can with a HOTAS. It is easy to fly a helicopter for a couple of hours with your arm resting on the joystick and your leg, compared to holding a Touch controller in your lap and either keeping it still or waving it in the air. And it is not easy in DCS to flip switches that start changing position rapidly; it gets annoying because you can't be sure that what you intended actually happened.
  17. To get all that unrealistic stuff back eventually (not all of it is gone). Like the laser beam: the crosshair is right there when you point at something far away, it just doesn't appear when you point at something close. They made it in a very clever way, so the crosshair and the laser beam are not distracting and not blocking your view, because it is a real immersion killer when every time you do something a laser beam shoots from your finger like a third-rate Jedi's. The crosshair doesn't go through the canopy (unless you are pointing through the glass) if the module's cockpit is made to support it; in the AV-8B Harrier it goes through, in the Mi-8 it doesn't, but it is there, it is fixable, and its size changes dynamically. It works in a very pleasant way and increases immersion, because you don't have a constant reminder that you are in a game. At long distance it is large and placed correctly; the closer you bring it, the smaller it becomes, which compensates for the illusion that it is "outside the cockpit". It is visible only to the left eye, so it can look odd at first. The default bindings are still there:
Mini stick:
Up = Right click
Down = Left click
Right = Rotate clockwise
Left = Rotate counter-clockwise
Trigger = Left click
Grip = Grab / Activate pointer
X = Menu
Y = Re-center
B = Zoom
A = ???
Those are the defaults, still there as before. What DCS still doesn't have is support for detecting these controllers and letting you bind anything on them in three modes: 1) when GRIP is held down, 2) when GRIP is not held down, 3) when the virtual hand is grabbing something. (A minimal sketch of that idea follows after this post.) That is what we need, plus options in the settings for the other things already mentioned (hide/show laser beam etc.). The current main problems are: 1) the virtual glove does not match the real controller position, and therefore the real hand, even though the idea behind it is now good; 2) the "trigger" does not work as a trigger finger when grabbing a stick that has a trigger. Again, fixable with vJoy, but it should be done by ED. How are you going to fix a bug without a patch? It is actually comfortable now as it is; it is just difficult to point at things quickly, because you have to work by what you see, not by what you feel. So what exactly is your problem? You said the fingertip was incorrectly angled: it was the way you wanted. You said the grip was press-to-grab then press-to-release: it was the way you wanted (press to grab, release to release). The new hand is better than before. 1) Fix the hand angle and position, as I suggested, by adding an option for it. 2) Add binding options (as people have wanted since 2016 or so) with a couple of sets. 3) Add back some of the options for those who want to enable lasers, crosshairs and the rest. You can't judge that from the loading screen; tracking goes wonky when the VR app stops receiving inputs during the loading phase, which is why you so often need to reset the view after loading a mission or the main menu. And the angle offset is already there in the VR controller's own angle, nothing new in that. Yes, that is the problem mentioned from the start: the virtual hand is not tied to the physical controller properly. Yes. You can see it in my MiG video, where I point at the red controls and show how far off-angle they are from the real finger. We don't need "a patch", we need a new configuration system, because people have different controllers and different hand sizes and hold the same controllers in different ways. The Oculus SDK includes everything needed to track each controller properly.
All you really need to do is build your own virtual hand model around that tracked controller axis, and place all the guns (if we had any) and other objects onto the hand the same way. The problems start when the held object has a different shape than the controller. There is a reason the developers set the hand angle as they did: it is a more relaxed way to point at things in a cockpit, because you don't need to twist your wrist so far downward, and the hand stays out of the way so it doesn't block the view of the finger. It is deliberately made so, and it works at close range, but at longer ranges it becomes a little distracting because your own hand doesn't match it. That is why we need a configuration option that each of us can adjust as wanted. ED added support for $1000 VR gloves (VRFree), but they can't add binding options for hardware that already exists? I would love to see those three modes (modifiers) of binding options: when I am not holding the GRIP button down, I can press A/B to re-center the view or open the game menu; when I am holding the GRIP button down, I am pointing at something and might want A as a left click and B as a right click; and when I am grabbing something like the virtual stick, I want A to be the weapon release button and B to be something else. DCS already recognizes when grabbing happens, so it can enable additional bindings for that mode; the others are just modifiers on ordinary buttons and such.
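A minimal sketch of the three binding modes described above, assuming a simple per-mode lookup table; the mode, button and action names here are placeholders and not ED's actual binding system:

```python
# Sketch: the same physical control resolves to a different action
# depending on the controller state (grip released, grip held, grabbing).
from enum import Enum, auto
from typing import Optional

class Mode(Enum):
    FREE = auto()      # grip button not held
    POINTING = auto()  # grip button held down
    GRABBING = auto()  # virtual hand is holding a stick/throttle/lever

BINDINGS = {
    Mode.FREE:     {"A": "recenter_view", "B": "open_game_menu"},
    Mode.POINTING: {"A": "left_click", "B": "right_click"},
    Mode.GRABBING: {"A": "weapon_release", "TRIGGER": "gun_trigger",
                    "MINISTICK": "tdc_slew"},
}

def resolve(mode: Mode, control: str) -> Optional[str]:
    """Map a raw controller input to an action for the current mode."""
    return BINDINGS[mode].get(control)

# While grabbing the virtual stick the trigger fires the gun;
# with the grip released the same A button re-centers the view.
assert resolve(Mode.GRABBING, "TRIGGER") == "gun_trigger"
assert resolve(Mode.FREE, "A") == "recenter_view"
```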
  18. We are not talking about the MFCD alone. We are talking about the limitations of the pod itself. The Litening TGP has a 640x512 resolution camera; you cannot get anything better than that by any means. It has two optical magnifications, Wide and Narrow, and that is the best it gets. Either one gives you roughly a 500 x 500 pixel video, because the image is cropped to a square and some of it is used for stabilization and so on. And that is only in the best possible scenario; each digital zoom level shrinks it further. There is digital processing to be done, but it is nothing amazing: blur is blur, no matter how you try to deconvolve it, when you don't know exactly what caused the blur or when you have to do it in real time. A 9x digital zoom in a Litening gives you, in the best case, roughly 52 x 52 pixels of source video (a quick arithmetic check follows after this post). That is blur on blur, just blobs everywhere, and it doesn't even count the original video being soft because of all the atmosphere it has to see through. A military targeting pod is not better than a pair of binoculars: go look around with binoculars and you will see that at various times things simply go soft, even when looking down from high altitude. There is nothing you can do about it. And when you enable FLIR, that is soft on soft; thermal imagery is not highly detailed with that class of camera. In DCS it is as sharp and detailed as any other video you will ever see. Then you take that already-soft video, capture it through digital zoom processing, and get a soft video as output, which is then shown on a low-resolution display that adds its own pixelation, instead of the sharp 1024 x 1024 we have now. So you get pixels, and you get softness. The AV-8B Harrier just received the Litening 4th-gen (2008) pod from Razbam in the 2.7 update. It is the best there is now, with a 1024 x 1024 FLIR and CCD, multi-target capability, etc. Nope: the G4 is from 2008, the Viper is from 2007 and the Hornet is from 2005. Because ED doesn't want to provide more realistic modules where compatible systems become available once the mission year is late enough, you are always stuck with a specific year, no matter how realistic an alternative would be. These targeting pods, for example, are independent systems: they do all the work themselves, feed just the video to the cockpit, and receive basic control inputs. All the fancy stuff the pod does happens inside the pod, from tracking to datalinks to everything; it is basically plug and play. It is simulated too well. That is a fact. You are looking at a perfectly rendered world without any atmospheric limitations, through perfect optics, with no technical limits on resolution, without any of the digital processing that "enhances" some characteristics at the expense of others, output on a high-resolution, clean virtual display. There is a major difference between looking at a 1024 x 1024 video captured in perfect conditions with perfect stabilization on a perfectly rendered 1024 x 1024 display, and looking at something that starts as a 52 x 52 pixel video played on a 640 x 640 display (or whatever it is). Finding things with those old targeting pods is soft on soft on soft. It is trying to make out what you are looking at and guessing, "they said it is next to the building, so that must be it", without ever getting a clear picture of what it really is.
https://www.youtube.com/watch?v=XFj6f9L827A Even though those are recorded on tape, you can estimate the quality from the purely digital parts via the symbology. YouTube compression aside, it is easy to see how blurry things are at 3-5 nmi even without digital zoom; go further out and the FLIR gets even worse. This is why the Sniper pod is such a major upgrade over these old pods: it is easy to improve on quality that low to begin with. And in DCS that is not the case. We have perfect quality, where you can count nearly every finger a soldier has from 10 nmi away at maximum digital zoom. We have just received the new clouds in the 2.7 update; it is only the first step of many in the whole system, and one major part still coming is the humidity simulation. We are also getting a new FLIR simulation, and far better modeling of the optical challenges. Right now we don't even have a proper laser beam simulation: no reflection rules, no angles, nothing. Some designators had an artificial 8 nmi (or was it 18...) beam length, so if you were out of that range the beam ended in mid-air and the weapon hit there, in the air. Yet you can point the laser from the opposite side of a vehicle and the weapon can still impact it from the other direction, because the simulation doesn't care about line of sight as long as there is no building (AFAIK) or terrain in between. A common JTAC laser designator has, in excellent conditions, a maximum designation range of about 3-5 km and a laser ranging range of 5-10 km, and its battery life in use is counted in minutes, not hours. Add some heat shimmer, dust, moisture and the like, and those ranges drop quickly to much shorter distances. We simply have totally unrealistic targeting pod quality. It is being worked on; the F-14's own pod is already a far better simulation, though even it needs a little nerfing.
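A quick arithmetic check of the digital-zoom argument above, using the figures from the post (640x512 sensor, roughly 500x500 usable crop, 9x digital zoom); real pod processing is of course more involved than a straight crop:

```python
# Each digital zoom step only crops the same sensor, so the effective
# source resolution shrinks with the zoom factor.
def effective_pixels(usable_crop: int, digital_zoom: float) -> float:
    """Source pixels per axis that actually feed the display at a given zoom."""
    return usable_crop / digital_zoom

print(effective_pixels(500, 1))   # ~500 px per axis at the optical Narrow FOV
print(effective_pixels(500, 9))   # ~55 px per axis at 9x digital zoom
# The post rounds this to roughly 52 x 52; either way the source image is a
# handful of blurry blocks before atmosphere and display losses are counted.
```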
  19. Some improvements have been made, like the VR view now being shaped by what the optics can actually show. Instead of rendering large rectangular screens, it renders oval-shaped areas. Clipping those corners has a good chance of cutting 15-30% of wasted rendering work, since you can't see them anyway (a quick arithmetic check follows after this post). You can see the effect in the reprojection, but that doesn't matter, because the VR view itself is what counts. My primary visual glitch is that the whole cloud layer jumps up and down; the shimmering on the horizon is not so bad, but the constant feeling that the world jumps around is, so I expect it to be fixed soon. Have you disabled the VR option for bloom? Get a video of the light sources so it can be fixed.
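A back-of-the-envelope check on the 15-30% figure above, assuming the visible area is a simple ellipse inscribed in the rectangular render target (real HMD visibility masks differ per headset):

```python
import math

def corner_waste(width: int, height: int) -> float:
    """Fraction of a rectangular render target outside an inscribed ellipse."""
    rect = width * height
    ellipse = math.pi * (width / 2) * (height / 2)
    return 1 - ellipse / rect

print(f"{corner_waste(2880, 1700):.1%}")  # ~21.5%, independent of resolution
```

The corners alone are about 1 - pi/4 of the pixels, which lands inside the 15-30% range the post mentions.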
  20. You are repeating the findings I have been discussing the whole time... Check point #3. Guess who was the first in this thread to bring up this problem? So why are you arguing against me when you are agreeing with me?
  21. The problem is that you would want the game to automatically adjust the lighting when you fly under a shadow, brightening the shadow so that your real eye, looking at the display, wouldn't need to adapt. It is surprisingly dim under some cloud shadows. It can be even darker than under a sunshade (like an umbrella), because the cloud covers such a large area that there is no reflected light, while under a sunshade the shaded area is so small that the surroundings reflect light into it and illuminate it. One of the big things people miss is that when light is diffused, the illumination can be very dim yet still look bright, because the high contrast needed to compare highlights and shadows is lost. I think the lighting is about right at the moment; there is always room to tweak things, but the first weather I tested was the heaviest rain preset, and it was an amazing experience. You really got that dim feeling where everything seems like someone just turned the lights off, and what DCS offers isn't even as dark as it can get before the lighting starts to feel super bright again. We need a bit more contrast, like on cloud shape edges (though we may not yet have the proper cloud formations for that, so it may come later), as well as sharper shadow edges and deeper darkening.
  22. You should clear your configs and remake them. Here is how it works for me: as you can see, nothing like what you report about the fingertip having the wrong angle is happening on my side. And you can also see the problem I mentioned: every switch and button goes crazy while the finger stays in its activation zone, because there is no limit on how many times they can be manipulated per second. So finger operation becomes difficult, as a switch actuates something like 50 times per second if you don't manage to just "swipe" across it quickly, and even then it can move a couple of times and end up in the same position you were trying to flip it away from. Below is a video of the problem I mentioned at the beginning, the VR hand being seriously offset from the real hand. In most VR games this is not such a problem, because the hands are properly angled, and many offer a configuration option to adjust the virtual hand angle (usually a gun grip angle) in the Y and X axes. Some games even offer automatic alignment, where you either point at an object with your real hand and the game adjusts the virtual hand to point at it, or you point at the object with the virtual hand, which locks that direction, and then you move your real hand to what you see and feel as "correct", and the virtual hand is adjusted to match the real hand angles that way. (A small sketch of that idea follows after this post.) In the video I am pointing at the two red objects with my real finger, the way I would point and say "press this", and then I show the offset angle in both X and Y between my real finger and the virtual finger. In Oculus Home and the rest, the virtual hands are wrong as well: they also point in the wrong direction, but if I enable the VR controllers to be shown instead of the hands, the controllers are positioned exactly right, so the 3D models align perfectly with the real controllers. It is the Oculus, and now ED's own, "optimized hand angle/position" that is the problem.
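A minimal sketch of the automatic alignment idea mentioned above: the player points at a known object with the real hand, the game records where the virtual finger actually points, and the angular difference becomes a stored per-user correction. This is plain vector math, not Oculus SDK or DCS code:

```python
import numpy as np

def pointing_offset(virtual_dir, real_dir):
    """Axis-angle rotation that would turn the virtual pointing ray onto the real one."""
    v = np.asarray(virtual_dir, dtype=float)
    r = np.asarray(real_dir, dtype=float)
    v /= np.linalg.norm(v)
    r /= np.linalg.norm(r)
    axis = np.cross(v, r)
    angle = float(np.arccos(np.clip(np.dot(v, r), -1.0, 1.0)))
    n = np.linalg.norm(axis)
    axis = axis / n if n > 1e-9 else axis
    return axis, angle

# Example: the virtual finger points roughly 10 degrees too low in pitch.
axis, angle = pointing_offset([0.0, -0.17, -0.98], [0.0, 0.0, -1.0])
print(round(np.degrees(angle), 1))  # ~9.8 degrees of correction to store per user
```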
  23. Actually, there are black-and-white films that can capture far more than the human eye can see. We are talking about roughly 10 stops that the average human eye can see at once. But the eye adjusts so quickly between highlights and shadows when looking at them that it is like adjusting a camera aperture, rapidly controlling the amount of light. It is an extremely fast process, and the brain does amazing things to build the picture in the mind, even though we only see about a 2-degree FOV as a sharp area. It would be good for people to go outside with a spot meter and check how much the luminance actually changes between shadow and direct sunlight in various scenes.
  24. You are talking about dynamic range, and it fools people: the human eye does not have a wide dynamic range, it has very quick adaptation to exposure changes. You can take photos like the ones above, compare them against the scene, and see that they match exactly what you see with the naked eye. Nothing odd there. But our problem is that DCS World is not simulating cameras; we are very much limited by the dynamic range of displays. We are already talking about only 5-6 stops on an average monitor, and even the so-called "HDR" displays are very limited, unless we are talking about real HDR reference monitors that cost tens of thousands. No matter what is done, the real scene cannot be captured, even in a 3D environment, and presented realistically through a monitor. But it doesn't matter: 5-6 stops is plenty, and 7-8 is amazing. You can capture everything in nature within about 10 stops, where the 1st and 10th stops are pure white and pure black, leaving only 8 stops for everything else (a quick conversion of stops into contrast ratios follows after this post). The problem is that the human eye adapts so well, and the brain creates the illusion that it sees everything at once. Looking out of a north-facing window at midday, the outside looks bright and nice, and then looking back indoors, that looks bright and nice too; in reality the eye adapts quickly and the brain creates the illusion that the indoors are seen just as brightly. We need to wait for ED to complete the weather engine, because not all clouds are the same: some diffuse the light only slightly, cutting maybe 1-2 stops, while the next cloud over has a different composition of water and cuts 5-6 stops. One day there are clouds that just diffuse the sun softly everywhere; another day they are hard, tightly packed clouds that block the sunlight as if it were about to rain, even though they look totally white and puffy. People just usually love high-contrast, colorful imagery, and that again is one of the things that has driven the recent shader mods to increase saturation, even when it looks unrealistic and very unnatural. Trying to present a 15-16 stop environment through a display capable of about 7 stops is not an easy task.
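For reference, a stop is a factor of two in luminance, so converting the stop counts above into contrast ratios is a one-liner; the stop figures themselves are the post's estimates, not measured display specs:

```python
def stops_to_contrast(stops: float) -> float:
    """One stop is a doubling of luminance, so N stops = 2**N : 1 contrast."""
    return 2 ** stops

for s in (5, 8, 10, 15):
    print(f"{s} stops ~ {stops_to_contrast(s):,.0f}:1 contrast")
# 5 stops ~ 32:1, 8 ~ 256:1, 10 ~ 1,024:1, 15 ~ 32,768:1
```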
  25. Empty + under 200 L of fuel... If that is overweight....