Everything posted by Fri13

  1. Yep. I was expecting this to have been fixed by now. But I can't read the displays, so it's back to configuring Display_StrokeDefs.lua again to fix it myself.
  2. People should call it what it really is. Page 230 of TAC-000: "The thermal cuer is not a hot spot tracker, but a delta T-cuer. Thus, when flying over a coast line or river, one could expect cuers along the beach or riverside where there is a large temperature difference. The thermal cuer does not incorporate any target recognition logic either; thus, any object which meets the preprogrammed target size, falls within the cuer capture gate, and has the required delta T for the TEMP sensitivity setting is a potentially ″cueable″ object. Cattle, trees, and rocks are often cued when not desired. Figure 1-103 provides some general guidance for use in determining TEMP settings. Mission requirements, EOTDA PAR’s, and pilot experience must all be considered in order to establish optimal TEMP settings."
What is a "hot spot tracker"? It is a system that will only point out the hottest spots in the thermal image. It cannot be configured or adjusted; it does one thing and one thing only.
What is the thermal cuer? It is a system programmed to cue on a chosen temperature change, a chosen target size, and a chosen temperature value anywhere across the visible temperature range, as well as the area over which the image is searched. These configurable settings need to be modeled once ED completes the new FLIR:
- Gain control (MAN or AUTO)
- Nudges (Auto Gate and Off-Set)
- Reject (to clear the HUD of symbols such as the VV, waterline etc.)
- FLIR menu (FLMR) to program the cuer function
- Temperature Sensitivity
- Capture Gate / Area of Regard
- Target Size
- Limit (already implemented: 0/4/8)
- Black Hot / White Hot
We need those functions and capabilities to use the system properly and get an authentic simulation of it. An example of a plain hot-spot seeker is the Maverick IR seeker, which does not have these capabilities. It will simply try to lock onto the hottest or coldest spot it is pointed at, depending on the selected polarity (white or black crosshair). It has no logic to ignore unwanted thermal differences, sizes or positions; it just goes for the coldest or hottest target it is pointed at.
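To illustrate the distinction, here is a rough, hypothetical sketch (not ED's or the real system's logic; all names and numbers are made up) of how a delta-T cuer differs from a plain hot-spot detector: the cuer gates candidate objects by temperature difference against the background, by size, and by position inside a capture gate, rather than just ranking the hottest returns.

```python
# Hypothetical illustration only: a toy "delta-T cuer" vs. a "hot spot detector".
# Scene: list of objects with temperature (deg C), size (m), and position (x, y).

def hot_spot_detector(objects, polarity="white"):
    """Return the single hottest (or coldest) object. No other logic at all."""
    key = (lambda o: o["temp"]) if polarity == "white" else (lambda o: -o["temp"])
    return max(objects, key=key)

def delta_t_cuer(objects, background_temp, temp_sens, min_size, max_size, gate):
    """Cue every object whose delta-T, size, and position pass the gates."""
    (x0, y0, x1, y1) = gate  # capture gate / area of regard
    cued = []
    for o in objects:
        delta_t = abs(o["temp"] - background_temp)
        in_gate = x0 <= o["x"] <= x1 and y0 <= o["y"] <= y1
        if delta_t >= temp_sens and min_size <= o["size"] <= max_size and in_gate:
            cued.append(o)
    return cued

scene = [
    {"name": "tank",  "temp": 35.0, "size": 7.0, "x": 50, "y": 50},
    {"name": "rock",  "temp": 40.0, "size": 1.0, "x": 55, "y": 52},  # hot but too small
    {"name": "truck", "temp": 30.0, "size": 6.0, "x": 90, "y": 90},  # outside the gate
]

# The hot spot detector blindly picks the rock (hottest); the cuer rejects it
# on size, rejects the truck on position, and cues only the tank.
hottest = hot_spot_detector(scene)
cued = delta_t_cuer(scene, background_temp=20.0, temp_sens=10.0,
                    min_size=3.0, max_size=15.0, gate=(0, 0, 80, 80))
```

Note that, exactly as the manual excerpt warns, the cuer has no target recognition: a sufficiently large, sufficiently warm rock inside the gate would be cued too.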
  3. Whiskey barrel. Advance one.... Ready to be dropped to men in trouble.
  4. Well, the pilots say it should be stable and well-behaved at 450-500 km/h, but compared to the F, which flies like a dart, the Bis flies like a beer bottle. The flight performance is otherwise identical in speed, turn rates etc. It is crazy at the moment that you need to use afterburner on landing, as trying to hold a 320-350 km/h landing slope easily drops the aircraft like a rock. And the moment you engage afterburner, the aircraft becomes stiff and easy to control, even though your AoA doesn't change and the speed stays the same until it (slowly) starts accelerating. It is an almost constant dance between mil power and AB to hold the required 3-5 m/s sink rate, as pulling the nose up too much bleeds speed quickly, so the easiest approach is to go to nearly full mil and fly the proper speed (280-290 on touchdown).
  5. I don't know when it was changed, but I just noticed in the 2.7 update that the MiG-21Bis mirror rotates when you open/close the canopy. As I recall, the mirror previously kept a fixed angle toward the rear, because the "virtual camera" never moved with the canopy. But now it rotates in the opposite direction: when you open the canopy to the right and the mirror points slightly downward, you see the ground up and the sky down. If you watch the canopy animation carefully, you can see the mirror image rotating the wrong way and ending up upside down. The other issue is that the mirror center is still offset to the left. Sitting in the cockpit and looking at the mirror, the vertical stabilizer appears too far to the right, and I need to move my head off-axis to the left to get it centered with the fuselage. So the camera for the mirror is rotated slightly to the left. In VR this is more obvious with the left eye, but even with the right eye the mirror points too far left. The mirror camera should be rotated so the view looks centered; alternating eyes will then make it look slightly off to opposite sides depending on which eye you use.
  6. The AI in the P only needs to use the ATGM. Sure, it does spotting without the scope, but it is not required to simultaneously use the 3/10x sight to engage targets, the way it would be with the YakB sight at close range when you perform overflights and passes in tight, small areas. The AI needs to perform different roles depending on how and when it spots:
- Naked eye: the widest field of view, with the ability to scan close-range areas in different ways and focus on different target types.
- The YakB sight: for aiming at targets within a 120-degree FOV area, pointing it quickly from target to target and correcting aim while firing.
- The same optical 3/10x sight in the P as in the V: for scanning areas at long range and marking target positions after spotting them with the naked eye, so the pilot can find them easily.
Removing the YakB element completely from the P variant leaves only naked-eye spotting and optical sight use; a third of the required work was eliminated with that one simple decision. Right now, when searching and spotting targets with the naked eye, the WSO has two options: do nothing and just report to the pilot, or report and start using the optical sight if the range is suitable. The third option, deciding whether to use the ATGM or the YakB, is gone. There is no need to think about what to shoot when only the pilot can actually fire the cannon or rockets, so the WSO is reduced to pointing out where the target is or marking it with the sight.
Example 1: You pop up over a hill to an enemy position 1-2 km ahead: a few trucks and maybe an APC. The WSO can't do anything about it, as you are not going to waste an ATGM on targets you could have taken out effectively with the YakB, while the pilot can try rockets on them as well. Now it is just the pilot aiming the whole helicopter to take them out with the 30 mm cannon or rockets. Sure, the pilot wouldn't have the 30 mm cannon, but the YakB would have been enough, and the WSO would have had an easier time aiming and shooting at the targets from the moment of spotting all the way to the overflight position, where the gun gimbal can no longer reach them.
Example 2: You pop up on the opposite side of large open crop fields with a similar setup at 4-5 km. That is more a case for rockets and the 30 mm cannon; with the YakB you would have had to close to 1-2 km to engage effectively. Now you can do it from 2-3 km with the 30 mm cannon or rockets.
Example 3: The same scenario, but the target area is a platoon with APCs and an ATGM or AAA vehicle. Now you can start considering the ATGM and rockets as primary and the 30 mm cannon as secondary. You don't want a shooting competition with a ZSU-23-4 even with the 30 mm cannon, as it is superior to you even with its 23 mm cannons.
Example 4: An enemy position sits at the edge of a tight open field between forests; your own troops are 150-200 meters away in the opposite forest and need support, but there are MANPADS and AAA nearby for air cover, so you must stay very low and fly in from the only safe direction. You have very limited opportunities to engage, as you can't gain altitude, so you come in fast and low with only a couple of seconds to shoot. Flying straight at them is risky as you will take fire; a turning approach or near pass is safer, and you can still use the YakB (while rockets remain the preferred option for saturating large areas). That is again pilot-WSO cooperation, the pilot using rockets and the WSO the YakB, where the 30 mm cannon and ATGM are less useful.
When the AI doesn't need to use one more weapon, it is easier to program. It is as simple as that: when the WSO can't use the ATGM, it just does naked-eye spotting and nothing else.
  7. The comment box did it again: it replaced all the text and quotes with a post made an hour earlier and slapped an attachment into it.... It sure does. DCSW got what feels like a completely new element for air-to-air, air-to-ground and surface-to-air navigation, combat and everything else. My settings are below, almost all maxed out. On the Rift S I get a constant 40/80 FPS depending on the map or area. The only time FPS drops below 40 is when, for example, firing a 5-second burst from the Mi-8 gunpods and all the dust effects appear where the bullets hit the ground; then it drops to a slideshow. But that has almost always been the case, sometimes it happens and sometimes not. Overall, the 2.7 update has been a major performance boost for flying. I do experience single-digit slowdowns in two cases: 1) Adding an aircraft module to the map in the mission editor. The moment it is clicked onto the map, it runs at 1-2 FPS for 10-15 seconds until the unit appears, and then all is fine again. Previously this happened only in the AMMO tab on the right when the 3D model was loaded, which took about 5 seconds. So this has to do with the new 3D model preview and loadout panel. 2) When selecting an aircraft to fly at mission start, there are 10-15 seconds where I see only the HUD and display symbols floating in the air; then the cockpit suddenly appears and all is fine. In the MiG-21Bis, for example, I had half of the aircraft missing, seeing through to the nosewheel and radar cone. Everything is functional and operational rendering-wise and the aircraft flies smoothly, but I can't click anything until the cockpit appears.
  8. Well, he is right on some points. The known problems mentioned in the patch notes are there (shimmering on the horizon etc.), but for me all the clouds jump up/down by about 50-100 meters in various situations. It is pretty annoying when flying in the mountains, as the overcast layer keeps pumping above you and it feels like you go up/down each time. The current cloud templates have very soft edges even at the highest detail setting. Hopefully we get the possibility of sharper edges (higher definition) among the nice soft ones. I also noticed that where I previously preferred 1.8 gamma, I am now at 2.2-2.3, as the clouds otherwise look too dim/grey in direct sunlight (flying above the clouds); but this hides all the cloud-top forms, as you can no longer see the shapes there, and the terrain becomes too bright again. It also makes cloud shadows far too soft. I have yet to go through all the templates, but so far none has given me the realistic sharp shadow edges, like 15-20 meters on the ground, because the clouds are too soft-edged. All kinds of clouds can have very sharp edges, from one meter to ten. This is easily visible in summer, when a cloud shadow moves across the terrain like someone pulling a large cloth over it. Or how rain cuts off like a razor on the ground, as if someone had drawn a line with a stick marking which side gets full rain and which side none at all; it is like stepping out of the shower or out from under an umbrella used for sun shade. ED is of course developing these things further and fixing problems, but the start is excellent. I may have been hasty, but today I deleted the 2.5.6 installations (both open beta and stable), as 2.7 looks far too good to go back to. PS: surprisingly, increasing antialiasing and PD made the ugly horizon shimmering go away almost completely.
  9. I can zoom in and out by pressing the right controller button; nothing has changed there. The left controller has Reset VR View and the End Game menu, and the right controller has Zoom and something I don't recall (maybe nothing). When I was rebinding the Touch Controllers with vJoy, DCS World listed conflicting bindings for the installed modules (I have only three installed right now, since I did a completely fresh install; maybe that is why I don't have the problems you describe): Y was a trim button, the trigger was the ICS, and the grip was the machine guns. How do I work around the problems I faced? I tried to find such an app back when the Touch Controllers first became supported as virtual hands only (no buttons or anything) and had lost interest in finding a solution until I came across this today. It would be nice to have VR override the bound functions. But you are correct that we should not be required to use any third-party software for input devices. Everything should be read directly by DCS World (they are common devices) and be bindable as we want; they are all equal. ED needs to do all of it instead of just one part or another: add the suggested features and configuration options. There are bug reports about this; even when controls are invisible, you can still grab them, and those reports are years old now. All the problems need to be solved, not just one or another, and the solution is to add options so everyone can be happy. It actually makes more sense to press the UFC buttons and the like now, as you do it with the pad of your finger rather than just the tip, which is how you press a real button. Again, all switches and buttons are terrible as long as the input repeats for as long as you are touching the switch/button. You can't do a simple LTD/R flip, as it keeps flipping back and forth for as long as your finger touches it. That is what forces the idiotic "swipe" technique, trying to avoid touching anything else, or just using the laser pointer and mini-stick Up/Down to do the work. A simple fix is a delay setting: once a button has been operated, wait X milliseconds before it can be operated again. You are not supposed to use your right hand to operate the left panel; you use the left hand for the left panel and the right hand for the right panel.
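The delay fix suggested above is essentially a debounce: each control remembers when it last fired, and any touch input arriving within the cooldown window is ignored. A minimal sketch, purely hypothetical (not DCS code; class and parameter names are made up):

```python
import time

class DebouncedSwitch:
    """Toy debounce: ignore repeated touch input within a cooldown window."""
    def __init__(self, cooldown_ms=500, now=time.monotonic):
        self.cooldown = cooldown_ms / 1000.0
        self.now = now            # injectable clock, which eases testing
        self.last_fired = None
        self.position = 0         # e.g. 0 = down, 1 = up

    def touch(self):
        """Called every frame the finger overlaps the switch.
        Returns True only when the switch actually flips."""
        t = self.now()
        if self.last_fired is not None and t - self.last_fired < self.cooldown:
            return False          # still in cooldown: finger resting on switch
        self.last_fired = t
        self.position ^= 1        # flip the switch exactly once
        return True

# Simulated frames at ~60 Hz: the finger rests on the switch for 0.1 s,
# which without debouncing would flip it every frame; now it flips once.
clock = iter(i / 60.0 for i in range(6))
sw = DebouncedSwitch(cooldown_ms=500, now=lambda: next(clock))
flips = [sw.touch() for _ in range(6)]
```

With a 500 ms cooldown, a second deliberate press of the same control (say, the 8 key twice in a laser code) still registers, while a finger merely resting on the switch no longer hammers it.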
  10. Because the phonebook has one wrong number for one contact, that doesn't render all the other contacts' information invalid....
  11. In the MiG-21Bis that happens. You can see the IR missile seeker pipper jump to flares, so if you launch at that moment the missile will definitely go for the flare (and never for the aircraft itself; yet you can launch with a lock on the aircraft and still have the missile go for a flare... lol). I don't even know anymore what the realistic behavior of the MiG-21Bis gunsight is. At least with rockets and the gun in A-G mode you should get a CCIP calculation, but for bombs you shouldn't get any CCIP/CCRP functionality; you need to bomb by the numbers. AFAIK the IR missiles should signal heat detection only by audio, not by placing the pipper on the target when the seeker is uncaged to it. And when the radar is locked on a target, the pipper should not follow the target but stay centered, with the radar pointing angle visible only on the radar scope. It would be awesome to get proper IR effects for missiles and targeting systems, and we are getting there. Razbam just added the "Hot Spot Detector" in 2.7 (in fact it is not a hot spot detector but a delta-T detector: the pilot selects the target size, the temperature range on a scale, and some other values, and the FLIR then symbolizes wherever something matches those parameters; so it is not only the hottest targets (literal hot spots) that get detected, as you can also detect the coldest or middle ones, or wherever there is a variation in temperature). ED is working on the new, improved FLIR that should account for the new clouds and all other reflections that lure IR missiles, give FLIR cameras a realistic look (no more units automatically popping out as bright hot objects), and make flares emit actual heat. Hopefully we someday get chaff modeling and proper radars, so targets appear and disappear on radar at various ranges and in various scenarios; we would see the radar get confused by chaff, and seekers would really have to work through the countermeasures instead of just rolling a die: "Do I go for the chaff/flare or not, even if it is 20 degrees outside my field of view?"
  12. And what manual is that?
  13. Quick testing: it was possible to get all the buttons and triggers working via vJoy in DCS, but not the right controller mini-stick axes or the grip axis (the left controller grip is an axis). The problem is that DCS World registers these inputs properly in the bindings, but once you are in the game it overrides most of them with the default actions. I did, for example, manage to register an AV-8B N/A axis so that the left Touch controller mini-stick was the TDC and pressing it down was TDC Action (lock etc.), so I could control the TV by slewing it with the left Touch controller mini-stick and designate targets. Another problem was that the AV-8B doesn't have support for VR controllers ("shocking", given that they stated in the patch notes last December that support had been implemented). So for me the Auto_Oculus_Touch -> AutoHotkey -> vJoy (Joystick Gremlin) -> DCSW chain was only a partial success. But I believe any of these tools will capture the Touch controller input and simply listen and react to it, so DCS World will still read the real inputs and react to them as well. That may be the main problem.
  14. Well, the animation of the virtual hand grabbing a switch is fancy, but it also feels a little off. In VTOL VR (one of my favorite VR games; I would have hoped to see it standalone for the Quest 2.... Think about playing VTOL VR anywhere with just a Quest 2.... Oh boy), the cockpit is designed around the VR controllers: large buttons, well-separated big switches and levers. In a real aircraft, switches, buttons and levers are placed so that you manipulate them with your thumb or just flip them with a finger. For example, the MiG-21Bis side panel switches are each individually guarded, so you can lay your hand on the panel to feel where your hand is, flip a switch up/down with your thumb, glance to check you flipped the proper one, and go back to looking outside. The landing gear lever, in every aircraft, is something you don't look at: you get your hand on it, feel it, and operate it. The UFC is built so you place your hand beside it and operate it with your thumb while the hand rests supported on the side. Having the VR hand perform a "pinch" on every widget and knob in DCS, as in VTOL VR, could be challenging. It might be a good solution, but we would then need the virtual hand to pass through everything except those controls. I would like to try it, as it could give good feedback, but right now it is easy to point the virtual index finger at something and operate it. The problem is that once you do, the hand keeps sending input and the widget keeps reacting as fast as it can, so you can see switches jumping back and forth at rapid speed.
That was the old setting. I don't want the beam at all, just what we have now: no beam, as it is distracting. The glove as it is now is amazing. Note that the invisible beam still exists and can even become visible: I noticed in the Mi-8MTV2 that it appears when you point at the far side of the cockpit. So it seems to be designed so that the beam is invisible if you can reach the thing, but visible when you point at something you can't reach! But I don't want the beam visible at all, nor its function; I want an option to have it gone. So, options for the VR virtual-hand settings: 1) Dynamic laser beam (current) / constant laser beam (old) / no laser beam. 2) On/Off for the ability to operate widgets from further away than a touching finger (i.e. the end of the laser beam). 3) Laser beam visible all the time / only when the grip is held down.
I think canopy opening, ejection seat handles and the like should require a button click, not a touch. A UFC panel should work by touch alone, but accidentally rest your hand on the ejection handle above the seat you are sitting in and BOOM, you are out. No: it should require clicking three times (and this should apply in all aircraft, as only a few do it; otherwise it's the best way to get ejected...). Totally agree, an On/Off setting for that too. There was also a bug report about grabbing those controls even while they were hidden; it needs checking whether that is still a thing or is gone. It was bad when you accidentally moved your hand through empty space and suddenly pushed the throttle to idle or the stick fully to one side. If invisible, there should be no touch reaction.
I actually started checking some of the solutions for binding the Touch Controllers, and at least this part of one of them was interesting. AutoOculusTouch can give you:
- index and hand triggers of a Touch, as floats from 0.0 to 1.0;
- thumbstick axes as floats from -1.0 to 1.0;
- all Touch, Remote and XBox buttons (except the Oculus Home button and remote volume buttons);
- all Touch capacitive sensors;
- all Touch capacitive gestures (index pointing and thumbs up for either hand);
- pitch, roll and yaw of both Touch controllers and the headset, in degrees;
- position of both Touch controllers and the headset, in metres;
- continuous or limited-time vibration effects of different frequencies and amplitudes on either Touch controller.
vJoy support: normally AutoHotKey can only generate keyboard and mouse events. vJoy is a driver that emulates one or more virtual joysticks with configurable features. AutoOculusTouch can now send analog axis and digital button values to vJoy. This lets you use Touch (or the remote) as a gamepad in games that support DirectInput. Note: while most of the controls match an XBox controller, it technically isn't one. Any game that uses XInput directly can't see vJoy; only DirectInput games will work here.
  15. Most of the bindings are there:
Trigger = left click
Trigger (CCW) = right click / middle click, to drag knobs around by moving the hand
Up = right click
Down = left click
Right = rotate right
Left = rotate left
X/Y = center VR view
A/B = zoom view (I don't really recall which was which, as I don't even remember which side has which button without looking at a photo, or which button did what)
Grip HOLD = grab virtual controls for as long as it is held down.
So if something has changed, it is possibly the trigger acting as LMB while grabbing the stick. I don't recall how it worked back when the virtual controllers gained the ability to grab these, as I don't use this unrealistic feature (nice to have if traveling with a laptop and VR to fly a simple aircraft, like WW2 birds that have no more than one button on the throttle or stick). What I do know is that you have never been able to properly bind these Touch controller buttons and triggers in game, which has long been on the wishlist. One can try DirectInput-to-XInput wrappers to get the button part working by binding them to the keyboard etc.:
Dolphin VR https://dolphinvr.wordpress.com/
Auto_Oculus_Touch https://github.com/rajetic/auto_oculus_touch/releases
XOutput https://github.com/csutorasa/XOutput
XBOX 360 Controller Emulator https://www.x360ce.com/
The Oculus Touch Controllers should be DirectInput devices, so you should be able to read those inputs and convert them to XInput for standard DirectX input. I might try those later today, as I am very interested in getting more out of them, but I want the hands to keep moving in space as they do now. Personally, the ultimate for me is still no lasers and no crosshairs: get the virtual hand positioned properly relative to the real hand, and then have each button/switch trigger only once every X milliseconds (i.e. touch input registered for the same button/switch only once every 200-2000 ms, instead of, as now, as fast and as long as the finger is on it).
  16. But they just added things, a lot of things, and they have modified many things that are not in the changelog. Right now, for example, the Mi-8 weapon safety light has actual text on it instead of being just a lamp. We have a "VRFree" controller in the VR settings. We have this new hand (with the new angle and all). Clearly they take these things seriously; they just need to do what is required to complete them. Sure, holding the grip is the realistic thing, but the problem is again that the real controller's grip button sits in an unrealistic position: you can't hold the cyclic with just two or three fingers, because you need to hold the controller in your hand to keep the grip button pressed. Hence it is better to go for press-to-grab, press-to-release, as then you can fly holding the controller with just two or three fingers, like the real thing. The current system is what you asked for, HOLD mode: as long as you keep the grip button down, you are grabbing the virtual controls; once you release the grip button, you release them. It is as it was. It works as a hold, not a toggle.
  17. You are not swapping seats, you are swapping characters; that is the difference. You are in an RPG where you select which AI seat you are playing. You are not flying the helicopter alone but with the AI, so you just "allocate the soul" of the AI. Just as you virtually pretend to be the pilot, you are pretending to be the commander, co-pilot, flight engineer or door gunner.... Again, you are not there to swap seats with some other person; you are just taking over the person in that position. And you stretch your arm as far as you can, but no further. In the Mi-8, for example, it is easy for the commander to flip the weapons safety above his head; in VR you can do it without even looking, once you have learned where it is. It is like in reality: you just reach for it and flip it. But it is easier for the flight engineer to prepare the weapons, turning the weapons safety off on the top-left ceiling panel and then on the center top panel and priming the weapons, while the commander's task is to operate his own weapons panel above his head. Actually, it is not a toggle. I just tested: as long as you keep pressing the Touch controller's grip button, you are grabbing the cyclic or the collective; once you release the grip button, you are no longer grabbing the virtual controls. So it works exactly as you would want (I would like to see an option for press once to grab, press again to release). The usual problem is that in the Mi-8 you cannot bind the weapon release button to the mouse, hence you can't just fly with the right controller and fire with its trigger (default left mouse button). In the UH-1H this binding is allowed, so you can set LMB to Weapon Release/Machine gun, but you still can't fire with the right controller trigger, because at that moment the trigger is not recognized as the left mouse button.
  18. That is not trouble for me; that is the realistic experience. In reality you can't reach everything, which means you need multiple people in the cockpit, each doing different tasks. And that is well done in the Mi-8, where you can jump to the second seat. Right now the AI can't keep flying smoothly if you jump to the co-pilot or flight engineer seat, so you need to keep flying even though the flight engineer doesn't have controls, but it is better than nothing. It's like in reality, where you can hand the controls to the co-pilot when they have better visibility to the right side, so the job can be done from there at the flip of a hat.
  19. Instead of reversing the work and returning the new code and hand to the old ones, it is better to fix the problem by making the suggested changes:
1) Add an option to enable the laser beam and crosshair for those who need them (plus an adjustment slider for beam length, from about 2 cm to 200 cm).
2) Add an option to restore the old grip button behavior, where you must hold it down to keep grabbing the stick/throttle, so that when flying you keep the grip button pressed.
3) Add an option to bind the A/B and X/Y buttons as separate functions when holding the grip versus when not holding it (with the opposite state keeping the defaults: VR reset, VR zoom, end mission etc.). This would allow setting them, for example, as trim or shoot/release buttons, or just keeping them as left/right mouse buttons for easy use (and allow the trigger to be bound as well).
4) Allow adjusting the hand position and angle, as it is now off-axis from the real controller. The trigger by default acts as LMB, and when you rotate the hand past the 12 o'clock position it acts as RMB / middle mouse button, depending on whether the target is a button/switch or a knob/dial.
It has been a problem from the beginning that we can't bind those buttons and hats as we want; instead we are forced to use whatever ED decides for them. For flying a helicopter, I would much rather press the grip once to grab the control and then hold the controller gently to fly; I don't need to keep the grip button down to maintain the grip, and to let go I press the grip again. A helicopter is hands-on flying: you can't release the grip unless you have an autopilot, like in the Ka-50, to take care of it, and there you can release the grip since the AP doesn't mind much. Flying a conventional helicopter without holding the cyclic is unrealistic, and you want to fly for long periods holding the controller with just two or three fingers instead of a full grip keeping the grip button down.
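The hold-versus-toggle grab behavior discussed above can be sketched as a tiny state machine: HOLD mirrors the raw grip button, while TOGGLE flips the grab state only on the press edge, letting you fly with a loose grip. This is a hypothetical illustration, not DCS code; all names are made up.

```python
# Hypothetical sketch of the two grab modes (not DCS code):
# HOLD keeps the virtual control grabbed only while the grip button is down;
# TOGGLE flips the grab state on each press, so a relaxed hand keeps flying.

from enum import Enum

class GrabMode(Enum):
    HOLD = 1
    TOGGLE = 2

class VirtualGrip:
    def __init__(self, mode: GrabMode):
        self.mode = mode
        self.grabbing = False
        self._was_down = False

    def update(self, grip_down: bool) -> bool:
        """Feed the raw grip button state each frame; returns grabbing state."""
        if self.mode is GrabMode.HOLD:
            self.grabbing = grip_down
        else:  # TOGGLE: react only to the press edge, not while held
            if grip_down and not self._was_down:
                self.grabbing = not self.grabbing
        self._was_down = grip_down
        return self.grabbing

# Frames: press, keep holding, release, press again, release.
frames = [True, True, False, True, False]
hold = VirtualGrip(GrabMode.HOLD)
toggle = VirtualGrip(GrabMode.TOGGLE)
hold_states = [hold.update(f) for f in frames]
toggle_states = [toggle.update(f) for f in frames]
```

In HOLD mode the grab state simply tracks the button; in TOGGLE mode the first press grabs and the next press releases, which is the press-once-to-grab option requested above.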
  20. That is a problem similar to many VR games: some people don't have the space even to flex their arms when standing, let alone to take a side step. So why should those who do have space be limited to a seated posture and not even be able to move their arms away from their bodies? Sure, return the laser and crosshair as an option in the VR settings, but allow them to be turned off separately, so that the VR hand itself has to be used to activate things. Like many others, I have empty space around my flight chair; there is nothing there but a throttle on the left side and a stick between my legs or on the right side (depending on the setup). For years I have moved my hands to physically touch all the buttons and switches in the cockpits. In the Mi-8MTV2 this means, for example, that I need to swap to the co-pilot seat if I want to operate the right side of the cockpit, and the upper panel is best operated from the flight engineer's seat. It is great that we finally have the laser and crosshair gone, but there should be an option to turn them back on for those who require the "extra reach".
  21. All of those should be optional in the VR settings. You can disagree, but not everyone wants laser beams, crosshairs and constantly holding the grip. The old triggers are still there; you just can't use them properly, because the "invisible laser beam" points in a different direction than your finger does. Mini-stick Up moves switches up and Down moves them down; Left turns all knobs counter-clockwise and Right turns them clockwise. The trigger still acts as a left click, and rotating the palm makes a right click on buttons/switches, while on knobs it is a "grab" and moving the hand adjusts them. You just need to know where to point the hand to use those things as before.
  22. The new American flight glove is interesting (liked the black leather with nice stitching in the upper part too) and it is now better sized IMHO. I was as well happy to see first the beam lasers gone, but there are few problems now. The previous mini-stick Up/Down/Left/Right works as previously, but you just don't see the cross or the laser beam to know that what you are pointing, and it is pointing wrong direction so it is difficult to do without labels. 1) Regardless you have controllers in use, the mouse cursor moves with the HMD in VR. So you get all over the places the buttons labels popping up and away. Very annoying. You can't disable the "Use Mouse" in VR settings as then it turns to HMD centered mouse dot. Like, I want the hands to be only active when I press the grip button (Oculus Touch). 2) I want to see a option to enable/disable the laser beam as here are people wanted to see it. I am more than happy for having it gone, as I want to point things with my finger (have been doing so for long time) instead use a beam/cross. Have a option to hold the mini-stick Up/Down/Left/Right and have its function to be input when you touch the buttons, switches and knobs etc. Now your poke will just trigger mouse left button and you need to point object and utilize the proper function separately. 3) Hand angle and finger direction is wrong. If having hand in controller properly and pointing with index finger straight, the game hand and finger points to right and below from the real hand position. It feels just wrong and bad. So a hand axis adjustment is required to be in the VR settings. There could be automatic one where you point with finger a dot on the screen and press button and the system automatically adjust hand axis to match the real hand pointing direction. Or just have two axis sliders to adjust the hand X/Y off-set. 4) For the touch / pointing to work properly, we need new setting for all cockpit switches. A delay between activation. 
Right now, if you have a switch that moves up and down, touching it with the index finger makes it flip up and down at rapid speed. You have to try to just barely touch the switch from the correct position so it isn't touched again when it flips to the other position. There should be an adjustable 0.5-2 second delay before a touch can re-trigger: if you hold a finger on the switch, it changes position only after the set delay has passed, so with a 2-second delay it flips again only every 2 seconds. This would also make pressing MFCD buttons easy; right now you press a button and it cycles between pages on that one OSB because you can't move your hand away in time. Your only option is to quickly "swipe" a finger over the button, carefully, so you don't touch the buttons next to it. A 0.5-second re-trigger delay would let you simply point at buttons, and it wouldn't even be a problem when the same button must be pressed multiple times, like 8 on the UFC when entering laser code 1688: 0.5 seconds is a short enough delay to accept the second press of 8 (2 seconds would be annoying there, but not on switches). I might be the only one who is happy about the loss of the laser beam and mouse cursor, but IMHO the virtual hand should be usable without either one. Just get the hand angle and the repeat delay fixed, and at least I will be happy!
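The automatic calibration suggested in point 3 (point your finger at a dot, press a button, and let the game compute the offset) can be sketched roughly. This is purely a hypothetical illustration of the idea, not how DCS actually works internally; the vector convention and function name are my own assumptions:

```python
import math

def angular_offset(reported_dir, true_dir):
    """Hypothetical one-shot hand-axis calibration.

    reported_dir: unit vector the tracked hand claims to point along.
    true_dir:     unit vector from the hand to the calibration dot.
    Returns the (yaw, pitch) error in degrees that should be subtracted
    from future hand poses. Convention (an assumption): +z forward,
    +x right, +y up.
    """
    def yaw(v):
        return math.degrees(math.atan2(v[0], v[2]))

    def pitch(v):
        # Clamp to avoid domain errors from floating-point noise.
        return math.degrees(math.asin(max(-1.0, min(1.0, v[1]))))

    return (yaw(reported_dir) - yaw(true_dir),
            pitch(reported_dir) - pitch(true_dir))
```

If the tracked hand points 10 degrees right and 5 degrees below the dot, this returns roughly (10, -5), which is exactly the kind of fixed X/Y offset the two manual sliders in the post would expose.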
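The re-trigger delay described above is essentially a per-switch debounce. A minimal sketch of the idea, assuming a hypothetical per-switch hold-off timer (class and method names are my own, not anything from DCS):

```python
import time


class TouchDebouncer:
    """Sketch of the proposed re-trigger delay for virtual-cockpit switches.

    After a switch reacts to a finger touch, further touches on that same
    switch are ignored until `delay` seconds have passed, so a finger
    resting on a two-position switch no longer flips it every frame.
    """

    def __init__(self, delay=0.5, clock=time.monotonic):
        self.delay = delay          # user-adjustable, e.g. 0.5-2.0 s
        self.clock = clock          # injectable for testing
        self._last_fired = {}       # switch id -> time it last reacted

    def try_trigger(self, switch_id):
        """Return True if the switch should react to this touch."""
        now = self.clock()
        last = self._last_fired.get(switch_id)
        if last is not None and (now - last) < self.delay:
            return False            # still inside the hold-off window
        self._last_fired[switch_id] = now
        return True
```

With delay=0.5 a finger held on a switch flips it at most twice per second, and typing 1688 on the UFC still works because each deliberate re-press of 8 is comfortably more than 0.5 s apart.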
  23. All the years of waiting (7 years here) for the improved clouds have served their purpose. I just did a fresh installation alongside the previous 2.5.6 version (in case it's needed), and after a couple of hours I am ready to delete the old one. Flying has gained depth; the feeling when you can now speed upward with clouds below and above you in multiple layers is amazing. The tactics have improved multifold. Having to fly to find an opening between the clouds at altitude is, as expected, an amazing experience. The lighting changes on the terrain from cloud shadows are just marvelous. And this is just the beginning of a completely new DCS World! We just received something that is not mere eye candy, as many said clouds would be, but an actual multiplier for navigation and tactical engagements. I have only tried three cloud profiles so far and can't wait to test them all, and then to get the editor for them! Flying at 12 o'clock at midday has never been so visually beautiful and interesting.
  24. Don't stress about it; I trust that you have a good starting point for finding the source of the problem. I have my own opinions about what is missing in the Mi-8MTV2, like some kind of "weight" or "grip" that is difficult to really pin down. It may also be about how the VRS feels to behave; for some reason that is enjoyable too, even though something could be off there. Don't Mi-8 pilots train for VRS in a controlled and safe fashion at higher altitudes, to learn how it happens and how to recover from it? At least I think some would have experience of it, while no one really wants to run into it at any altitude. That might be it. I just have the feeling that the mass/inertia is somewhat missing, and it may be that a lightly loaded Mi-8 "goes in" a little too easily, and even at maximum load something is odd. I believe you might have something there. If I compare the Mi-8 to something else like the KA-50, which the author of that testimony considers heavier, it has this interesting inertia, the feeling that the air grabs you. That is missing from the Mi-8. Based on testimonies about the KA-50 vs the Mi-8 and Mi-24, the KA-50 should be superior at high altitudes and in strong crosswinds, like in the mountains. But I don't get that feeling. That is why I am interested to see how the Mi-24 behaves, as it should be more stable and more "direct". But none of that can be trusted as more than an opinion, and it requires more evidence. Edit: I need to point out that I have been flying the SA342 as my primary helicopter for the last couple of weeks, and its flight model is destructive: it is terrible for your ability to fly any other helicopter. It takes time to get used to when transitioning from the others, but you adapt to its game-style flying, wrong as it is. And when you come back to the Mi-8, as I have now done to prepare for the Mi-24, it makes perfect sense to say "THIS IS WRONG!", as you just came from a completely different experience.
I don't recall such major problems when switching between just the KA-50 and the Mi-8, or the Mi-8 and the UH-1, as all of those behave more as expected.