Everything posted by sk000tch

  1. Others have commented on the penalty and high Vc scenarios, so I'd just echo that. As Eteokls points out, it's more that the penalty is occurring in geometries and closure rates where it shouldn't, rather than the strength of the penalty itself. The ranges are a little concerning. It's currently detecting large fighters at ranges that are well below NLT ranges for most of the intercept timeline. That's vague enough to not get in trouble yet clear enough to illustrate the issue, I hope? You can't PRI and Commit by 30nm on something you can't see until 20nm, so to speak. Right now the Hornet feels like an Eagle compared to the Viper, which isn't right. The Hornet should feel similar to the Viper, as there's only a ~40mm difference in aperture (compared to the 200-250mm difference for the Eagle/Tomcat).
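To put a rough frame on the aperture point: from the radar range equation, holding transmit power, wavelength, and target RCS fixed, two-way gain scales with antenna area squared, so detection range scales roughly linearly with antenna diameter. The diameters below are placeholders for illustration, not published figures.

```python
# Toy scaling from the radar range equation: R_max^4 ∝ Pt * Gt * Gr * λ² * σ.
# With G ∝ A/λ² and A ∝ D², holding power, wavelength, and RCS fixed,
# detection range scales roughly linearly with antenna diameter: R ∝ D.
# The diameters below are illustrative placeholders, NOT real radar specs.

def relative_range(d_ref_mm: float, d_other_mm: float) -> float:
    """Detection range of 'other' relative to 'ref', all else equal."""
    return d_other_mm / d_ref_mm

apertures_mm = {"Viper": 680.0, "Hornet": 720.0, "Eagle": 900.0}  # hypothetical
base = apertures_mm["Viper"]
for jet, d in apertures_mm.items():
    print(f"{jet}: {relative_range(base, d):.2f}x Viper detection range")
```

The point being: a ~40mm aperture difference should move detection range by a few percent, not by an Eagle-vs-Viper margin.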
  2. Neither; it's just aircraft g, aka load factor, so n = L/W, or in a level turn n = 1/cos(ɸ), where ɸ = bank angle. The lateral component isn't factored in, except in simplifying the lift equation via the equivalency of the vertical lift component and weight in level flight.
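Since it's just 1/cos(ɸ), the level-turn numbers are easy to sanity check:

```python
import math

def load_factor(bank_deg: float) -> float:
    """Load factor n = L/W in a level, coordinated turn: n = 1/cos(phi)."""
    return 1.0 / math.cos(math.radians(bank_deg))

print(load_factor(0.0))    # wings level: 1 g
print(load_factor(60.0))   # 60 deg of bank is ~2 g at any airspeed
```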
  3. OK, what the hell... I wrote a response addressing many comments to me but I have one sentence left? I have no idea how this quote system works, so no point-by-point rebuttal. Probably good, actually. My point was, actually my points were, twofold. These forums are overflowing with awful information, much of it made up or pure conjecture; answers to questions that begin with "I think it should" or similar are rampant. I'm glad y'all read a war college thesis or something, and maybe it's a good piece of work -- many are. Nevertheless, extrapolating real-world combat effectiveness to DCS isn't going to work well, though hopefully it provided a lot of detail about current or recent-past capabilities. Keep in mind that if it's discussing late-model pods it will assuredly not contain a discussion of all capabilities. Anyway, as to resolution -- my point was that you can't examine a single characteristic without considering the whole. At least not meaningfully. A real-world pod's image quality is a product of physics, lenses, wave mechanics, weather and other atmospheric phenomena, image processors, software, etc., and as I previously explained, they change. LRUs are upgraded continuously, and in some cases the upgrades can significantly impact utility. What is the point in focusing on the exact zoom level if the pod is not correctly drawing contrast? If I say I am able to ID an AK at 40k slant range, and perhaps cite the lighting conditions/time of day, etc., and if the game pod does not resolve sufficient detail to make that ID at that specific range, but zooms in further on the blurry image, is the solution to decrease zoom? I don't want to drone on regarding every single reply, but my point about undisplayed detail seems to have been misunderstood. It ain't exactly rocket science, but if I'm flying a USMC legacy Hornet with a Litening AT, I have a 1024x1024 EOTS that must be downsampled by the MC for the display.
Again, I know the pilot shit but not the computer shit, but the azimuth/angular resolution of the base image contains detail the display cannot show; thus when zoomed, that detail is displayed, without any digital-zoom interpolation. Hopefully that makes sense. The radar does something similar, btw, but that is for another post. Again though, resolution matters, but it's one factor among many. The move from bandgap to quantum-well imagers was huge, far more important than higher resolution of poor-detail FLIR. There is a whole suite of detection-related capabilities the pods had by the mid-to-late 2000s that is not simulated. How should that factor in? Anyway, I said my piece. I don't come around here much, so I won't do the back-and-forth nonsense. And toilet2000, I have done my research. I won't boast about RL quals, or how many DCS modules or hours, or even anecdotal personal opinions about specific pods. I'm not here to attack anyone or prove anything, quite the opposite actually. I came here for settings info for VR and swung by to answer a few questions, but certainly not to get drawn into a bickering match. It's either that or leave like so many others did, though I suppose my infrequent use is a less committed version of exactly that.
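The undisplayed-detail point can be sketched numerically. The display width below is an assumed placeholder, not the real MFD spec; the idea is just that "zoom" crops a smaller window of native sensor pixels rather than interpolating, until the crop matches the display:

```python
# Why zooming a 1024x1024 sensor image on a lower-res cockpit display reveals
# real detail: at 1x, the full frame is downsampled to fit the display; zooming
# shows a narrower crop of NATIVE pixels. Only past the crossover point would
# you be interpolating (digital zoom with no new information).

SENSOR = 1024          # native sensor width (px)
DISPLAY = 480          # display width (px) -- assumed, illustrative

def displayed_detail(zoom: float) -> float:
    """Fraction of native sensor detail visible at a given zoom level."""
    crop = SENSOR / zoom                 # sensor pixels shown across the screen
    return min(1.0, DISPLAY / crop)      # capped at 1.0 = pixel-for-pixel

for z in (1.0, 2.0, SENSOR / DISPLAY, 4.0):
    print(f"zoom {z:.2f}: {displayed_detail(z):.0%} of native detail on screen")
```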
  4. Precisely. DCS doesn't really model different "kinds" of zoom, or the limitations of optical/digital zoom and so forth, nor sensor resolution limits. Please don't state conjecture as fact. These forums are riddled with answers to questions that begin with "It should" or "airplane Y does this so..." Just because you have experience with something kind of like another thing does not equate to knowledge of the other. I recognize you know about pixels, and you have evidently zoomed in on your camera phone or in Photoshop beyond where actual pixel data exists. However, those systems have nothing in common with the pods. The MFDs in the legacy Hornet are low-res; depending on what pod and what mode you are in, there are usually many more levels of zoom data there before you would have the problem you're talking about. Moreover, the pods we have in the sim cannot be compared in any meaningful way to RL pods. There are a few reasons for this, including that it's very difficult to compare a pod zooming into detail of a rendered, relatively low-res environment vs. a pod zooming into a real-world, very high-res environment, and that the factors that contribute to loss of clarity are often optical or environmental in nature and aren't sim'd in DCS. Moreover, image processing and stabilization play a HUGE role in how an image appears, yet they don't show up in pure stats. As a result, it's impossible to compare apples to apples. What pods specifically are we talking about? Litening II with the 320 or ER with the 600 res FLIR? What about ATFLIR? Is it after the LRU upgrade and software suite? Our Litening pods don't have basic Gen 3 functionality like multiple-target cueing, so it must be a II. What about Sniper XR? The first HTS dual-pod-compatible targeting pod of any kind was Sniper, and even then it was, iirc (<-- see what I did there when I wasn't sure?), release 7 and S-3.
So if they go with that, we've got multi-target cueing, auto target recognition and threat categorization with vastly improved coordinate accuracy, the full suite of NTISR... actually, I guarantee this won't be in game, but we will be able to ID an AK-47 from 60k ft slant range, right? *crickets* Actually, in fairness, if that's the Sniper version then Litening should be G4, or at least the G4 kit upgrade, right? 1024x1024 FLIR, an amazing CCD, and the NIR/IR laser imaging? That's what Rafael's got with its AVP, so aren't we... Yes, we should definitely get G4, right? I'm not trying to rub any noses in piss, it's just this place has a way of creating and then spreading misinformation. I don't know how much it affects the game design, but it doesn't take much for some nonsense to get repeated as fact. So please, I beg, unless you really know what you're talking about, please hold off on comments like "the Litening pod is simulated too well." It's just not true; it's not even close. Even if it were true optically, the pod is missing so much else that the optics don't make up for it. And even if you had lots of experience using the pods, as many do, you would likely agree that comparing a simulated pod to a real pod is harder than it sounds. Even if at a specific time we can document that the apparent zoom is higher than it should be, what matters is the whole package that sums to the pod's effectiveness. And in asking those questions, it really matters exactly which pod, and when, you're talking about.
  5. Man, sometimes I sympathize with ED... The irony here is that what you ask for is counterproductive to the larger goal you say you want. It's an absolute truth in software development that strict adherence to frequently updated incremental releases is horribly inefficient. If what you want is the best game as fast as possible, shut up and let them work. There will always be delays in development. Sometimes you plan well and release as planned, sometimes you don't. If you have to waste a week getting an incremental patch ready for public release before it makes sense from a development perspective, that week is wasted: it is spent only on that incremental release and has negligible bearing on the final product. And btw, the whole "everybody knows that" or "nobody believes you ED" thing is Trumpy and unpersuasive. I'm honestly not trying to pick on you, just noting that relying on inaccurate broad characterizations like that, particularly when they are obviously untrue (I disagree with almost every sentence you wrote, and I suspect many others feel the same), distracts from the point you're trying to argue rather than augmenting it. Just chill a bit. Even if the community agreed with your viewpoint, the incentive you create for ED is to reduce the amount of information they share regarding their development plan. There will always be delays, and product features will also change; forcing strict adherence to the information they release will not change that, it will only change the amount of information they release.
  6. I am curious about the opinion of those with the stick as well. I know it had some issues early on, but has it improved? Any of the stick snobs pick one up? I've got an FSSB R3L w/ F-16GRH on the side, w/ Gunfighter and MCG Pro in center. I've been contemplating going Virpil so as to be able to use RealSimulator's F-18 grip. I don't suppose anybody has had hands on both? btw, I intend to pick up the WinWing throttle and probably panels, hence the question. Big RealSimulator fan, fwiw, if it matters.
  7. The obvious potential explanation is priority; if you're not aware, it can only display 7 tracks. I haven't had the opportunity to play with it in MP with flight members, but in the little time I did have, donor tracks did seem to get a little laggy. Do you think it was a network issue?
  8. I suspect you might be taking "do not loft" as an absolute. Elevation-angle rate of change and flight condition allow the LOBL mode to utilize a variant of PN guidance (RWR-slaved modes as well, to a lesser extent). The flight path ends up being an arc with, in some cases, pitch stepping like you would see from a JDAM. If you compared it to a range-known flight path, the range-known profile would climb much more and stay at altitude longer, though once it switches to homing it will fly similarly. How much lofting are you seeing? Like, what were altitude and range at launch, and how much did it climb?
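For reference, the textbook proportional-navigation law that kind of seeker variant builds on is a = N · Vc · λ̇ (navigation constant times closing velocity times line-of-sight rotation rate). The numbers below are arbitrary illustration, not weapon data:

```python
# Classic pure proportional navigation: commanded lateral acceleration is
# proportional to how fast the line of sight to the target is rotating.

def pn_accel(nav_const: float, closing_vel_mps: float, los_rate_radps: float) -> float:
    """Lateral acceleration command (m/s^2) from pure PN: a = N * Vc * lambda_dot."""
    return nav_const * closing_vel_mps * los_rate_radps

# N = 3, 300 m/s closing, LOS drifting at 0.02 rad/s -> ~18 m/s^2 (~1.8 g)
print(pn_accel(3.0, 300.0, 0.02))
```

A constant-bearing (zero LOS rate) geometry commands no correction, which is why the resulting flight path is a smooth arc rather than a pronounced loft.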
  9. Sk00tch also said that straight and level release provides superior stand-off in all flight conditions. There is no benefit to lofting in range, and it imposes a substantial precision penalty. Same with dives, for that matter: no increased penetration or accuracy/precision. The non-ballistic JDAM flight path is most efficient when given altitude, speed and time. A release at 30k ft and 0.9 Mach should give you 12-15nm stand-off, with maximum GPS guidance time. There are always tactical situations you can come up with; as a result, there is a procedure for loft release, but it says the same thing I'm saying. Again though, this is a game... everybody should fly however they want! Just because it's less efficient doesn't mean it's not fun.
  10. I believe HTS is scheduled for later in phase II. I wouldn't expect anything until you see PB mode in the Hornet, since they are both range-known LOAL modes. Probably expect incremental capability as well, i.e. HTS first allows HARM range-known mode, then JHMCS integration, then precise geolocation for PGMs, then maybe left-side mounting with dual-pod/enhanced PGM capability, then HTS threat categorizing/sharing via Link 16, then maybe the 3-aircraft rapid acquisition and targeting stuff, or maybe someday we'll get a Rivet Joint and that dlink -- just examples, I have no idea how they will actually release it. Question though... I know many here have played other F-16 consumer-level sims; how did they do the HTS simulation? I'm sympathetic to the classification issues ED deals with, and the need for documentation. The AN/ASQ-213 is only classified confidential, but documentation of a technical nature, or that discusses performance parameters, vulnerabilities, performance against specific threats, or wartime threat tables, is mostly classified secret for HTS R7 -- the 2007 revision. I've never seen any decent HTS documentation floating around, much less anything with the detail needed to simulate it. Also, I haven't been following the Viper too closely, but has ED said whether they are going to dual-pod HTS w/ Litening(v)? Far as I know that's not possible, but whatever... if there's some reason they can't do Sniper, I'd rather they bend the realism rules a bit than lose the dual-pod config. Did they ever explain why Sniper ATP got axed? I mean, it's real similar to Litening(v), a little less max range, but I doubt that extra 10-15% would be all that usable in DCS anyway. I'm more worried about the dual-pod thing.
  11. Not an A-10 pilot and I don't have any 1st-hand knowledge, but there were issues early on, at least in development, where the rocket motor exhaust caused damage over time to the laser sensor of other rockets in the launcher (even though the sensors are ~center of rocket). I don't think that's what this is, though. Again, basically an educated guess, but I remember something about a variant or 2nd gen, at least for some platforms, that is longer and would be the likely cause for a longer launcher. I hate it when people spread bad info here with posts that start with "I think," and I am breaking my own rule here, but this forum is pretty slow still. I'm sure someone who knows what they're talking about will come around eventually and correct me.
  12. Yoda -- I was just commenting on the question of losing JDAMs as a general matter and his 2nd question of how you typically attack a high-threat target. I don't know that I've seen the same lone-wolf stuff in forums; I think most DCS players understand fighters fight as a section. While I do see players and/or mission designers tend toward older tactics, specifically low level, I suspect this is due to DCS's lack of EW (and stealth). Because that critical element of modern tactics is missing, players defeat surface-to-air threats by flying nap of the earth. I am not criticizing DCS, it would be impossible to accurately simulate. And, frankly, overbanking ridge lines is much more fun than ensuring you're positioned between a Growler and the threat. While I wouldn't say every strike includes sweep, SEAD, strike and EW, yours is a good description. There are many who argue that the massive reduction in low-level training hours is a mistake. But, while pilots still terrain-mask and penetrate low level when given the option at LFEs, low-level strikes against defended targets died 30 years ago when the F-117 flew circles around Baghdad and Intruders/Tornados came back with 30mm holes. For better or worse, it's not a part of modern doctrine. For anyone interested, YouTube has a nice, almost Tacview-style overview of the first night that includes all of this plus some of what gods described:
  13. Couple things -- let's just agree not to cite Reaper videos as evidence of real-world employment. Loft delivery profiles do not improve JDAM performance, due to the shaped trajectory commanded by the autopilot. Lofting reduces the range capability afforded by a straight and level release under the same flight conditions, and significantly increases the standard deviation from the mean point of impact due to autopilot inconsistencies and LAR uncertainties in the dynamic IZLAR. The release profile against high-threat targets is ideally high, fast and on-axis to maximize standoff distance and TOF, to maximize GPS guidance. Against isolated SAMs, a tangential off-axis release along the threat ring to exploit JDAM's high off-boresight capability can be used to good effect. Against IADS or well-defended targets, JDAM is not the most desirable weapon due to its minimal standoff relative to JSOW and propelled weapons. I don't know what you mean by "first world opponent," as it's not a well-defined term. But, assuming a modern force with a significantly degraded IADS, the high potential of GPS jamming also makes loft deliveries a poor choice. When GPS is denied, the weapon can be released INS-only, but due to INS drift the accuracy degrades significantly as TOF increases and/or off-axis or shaped loft deliveries are used. Additionally, the GPS antenna is rear-mounted in part to utilize weapon-body masking of jamming signals. A loft trajectory's upward glide will maximize exposure of the antenna to the jamming signal regardless of orientation, and its long TOF will maximize INS drift -- the two most important parameters in accuracy/CEP. Bottom line: while you can loft a JDAM in manual release, if the 12-15nm standoff of a JDAM isn't enough, use a JSOW.
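To illustrate the INS-drift point only (this is not real CEP data; both coefficients below are invented), a crude linear model is enough to show why the longer time of flight of a lofted, GPS-denied release hurts accuracy:

```python
# Toy model: with GPS denied, JDAM accuracy degrades as INS drift accumulates
# over time of flight, so trajectories that stretch TOF (lofts) hurt most.
# cep0_m and drift_mps are INVENTED illustration values, not weapon data.

def ins_only_cep(tof_s: float, cep0_m: float = 5.0, drift_mps: float = 0.8) -> float:
    """Crude linear drift model: CEP grows with time of flight."""
    return cep0_m + drift_mps * tof_s

for tof in (30, 60, 120):   # short level release vs. progressively longer TOF
    print(f"TOF {tof:>3d}s -> ~{ins_only_cep(tof):.0f} m CEP (INS only)")
```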
  14. It seems from comments on their forum that it's moot at this point, but I don't think RealSimulator would produce a consumer product and price it out of the consumer price range. Their current lineup is a good guide: their stuff is expensive, but not prohibitive for the segment they target. I would add that extrapolating costs from Otto switches doesn't make much sense. On average I fly IRL probably about as much as I play DCS, and while it does make me extremely picky about things like stick feel, the deflection force (or switch force) that feels right when you're sitting on a chute at 9G would feel outrageous for a sim control. For example, I like about an 8lb pull on my RealSimulator FSSB/F16GRH combo, but my RL ride is a 40+lb pull at 9G. Strength/robustness (and reliability) is also a factor; I'm pretty sure if I stomped my Crosswinds the way I do RL pedals they would disintegrate...
  15. Neither, or it depends, I guess. Instead of "see," let's use "communicate," because seeing where other units are is just one small aspect of what links do. Real-time ISR, targeting, avoiding blue-on-blue, better informed and faster command; it's an endless list. But to answer your question: the Viper can communicate with both Hogs (SADL) and Hornets/Eagles (Link 16). The Hog can communicate with Hogs, Vipers, the AC-130, many helos, some Army small drones like Puma/Shadow, larger drones like Predator/Gray Eagle, Global Hawk, tankers, and any ground unit with an EPLRS radio. Hornets/Eagles can communicate with other Eagles/Hornets, Vipers, F-35s, the F-22 (receive only), B-1s, B-2s, AWACS, JSTARS, Rivet Joints, tankers, ships, most other ISR assets from U-2s to Sentinels to Global Hawks, and NATO fighters (Typhoon/Rafale). There are many others. The F-22 uses IFDL (but can receive Link 16); the F-35 uses MADL, but also Link 16. Global Hawks have everything from Ku-band satcom to UHF tactical links, and can operate as a dedicated gateway (EQ-4B variant). In reality it's not this segmented. "Gateway" is a misleading name; it's not a firewall/router-type device but rather a translator/forwarder that transfers data from one datalink to another, or more accurately between several. Some are air-based; in DCS I guess AWACS is performing the job, though the E-11 and EQ-4B are real-world airborne communication nodes. Air-based has the advantage of fewer line-of-sight issues, but there are ground gateways as well. BUG-Es are one example of a Link 16 - SADL ground gateway, or Link 16/SADL - SIPRNet (WAN-based). This is really just scratching the surface, and just public info. OEF was the ultimate proof of concept and it's only accelerated since.
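The translator/forwarder idea is simple enough to sketch in code. The message fields and network names here are invented for illustration; a real gateway also re-encodes message formats between standards, which is elided:

```python
# Minimal sketch of a datalink "gateway" as described above: not a
# router/firewall, but a forwarder that re-broadcasts tracks between the
# networks it bridges (e.g. Link 16 <-> SADL). Fields are invented.

from dataclasses import dataclass

@dataclass
class Track:
    track_id: str
    lat: float
    lon: float

class Gateway:
    """Forwards tracks between subscriber networks."""
    def __init__(self) -> None:
        self.networks: dict[str, list[Track]] = {}

    def join(self, name: str) -> None:
        self.networks.setdefault(name, [])

    def publish(self, source: str, track: Track) -> None:
        # Re-broadcast to every other network the gateway bridges.
        for name, inbox in self.networks.items():
            if name != source:
                inbox.append(track)

gw = Gateway()
for net in ("LINK16", "SADL"):
    gw.join(net)
gw.publish("LINK16", Track("F-16 #1", 36.2, -115.0))
print(gw.networks["SADL"])   # the SADL side now sees the Link 16 track
```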
  16. The A-10C cannot transmit or receive Link 16 messages, only SADL; the F-16 can do both, though. As to the main point: I don't think ED simulated gateways; they simulated the function, but not specific assets. The SADL-Link 16 gateway still works if there is no AWACS. Perhaps they built an automatic reversion to ship-based or other terminals/gateways, but the more likely answer, in light of the choices they made regarding simplification of datalinks as a whole, is that ED chose to implement a gateway function without requiring a specific asset. It makes sense for a number of reasons (other than strict realism). Like all systems they model, they make choices, for a variety of reasons, about what behavior of the real system to simulate. So, an individual fighter gets a line-of-sight check. That's good low-hanging-fruit realism, so to speak. A fair amount of what they did include can be inferred from how the system works, what functions we have in the jet, etc. We don't, for example, have different modes, or peacetime constraints like mode 1 frequency hopping. That makes sense because, well, not peacetime. But we also don't have frequency separation concerns or IPF failures, and while DCS is not "big" enough for time slot duty factor to be an issue, it's also not simulated. The F-16 fine/coarse sync is a good example. It's in there, you can change it, but there's no system time, network time reference or event time slots... so it's pointless. That detail would probably be difficult to simulate for several reasons and would add very little to the sim, so I'm fine with it. AWACS is just one gateway and generally is used for a limited purpose. Voice bridging and tactical gateway functions are usually provided by redundant layers, with specific assets serving particular forces.
A drone bridging SADL-L16 for force protection of a large ground maneuver, the Navy's TADIL satellites for the fleet, a modified Gulfstream V bridging frequencies w/ NATO allies, etc. If someone were really curious it would be easy to test. We don't have Milstar satellites or modified Gulfstreams, just AWACS and surface ships. Gateway altitude is critical to line of sight, so if DCS is simulating gateways, we should see the SADL-Link 16 gateway degrade if you build a mission with no relays and terrain blocking LOS to the fleet, then test on the other side of the terrain.
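For anyone running that test, the smooth-earth radio horizon gives a feel for how much altitude buys a gateway. The standard 4/3-earth approximation is d ≈ 1.23·√h (nm, with h in feet); terrain masking, the thing actually being tested, is ignored here:

```python
import math

# Radio line of sight between two platforms over a smooth 4/3 earth:
# each platform sees out to ~1.23 * sqrt(alt_ft) nm, and the ranges add.

def los_range_nm(alt1_ft: float, alt2_ft: float) -> float:
    """Max radio LOS range (nm) between two altitudes, smooth 4/3 earth."""
    return 1.23 * (math.sqrt(alt1_ft) + math.sqrt(alt2_ft))

# AWACS at 30,000 ft vs. a ship's mast at ~100 ft, each to a fighter at 500 ft:
print(f"AWACS gateway: {los_range_nm(30_000, 500):.0f} nm")
print(f"Ship gateway:  {los_range_nm(100, 500):.0f} nm")
```

The ~6x spread between the two gateways is exactly the kind of difference the mission test above should surface if specific assets are being modelled.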
  18. Hope nobody got in trouble; I was surprised to see comments re: how it's employed/effectiveness. We are better off citing public-domain docs, guys; it provides something ED can actually use. Curly -- that is a cool way to represent/show how cell size affects IQ, but if I understand correctly it's not accounting for the supersampling-like effect used. Google "pixel spacing SAR resolution" for a better explanation than I can give. I am nowhere near as informed as y'all about the specific block we have or exactly when specific upgrades landed, etc., but if we have BRU-55s that is at least 05/06-ish, iirc? That was a significant 8 years relative to 97-98. With SAR the tradeoff is latency vs. resolution, and latency is most prominently affected by image processing speed, so... No theorycrafting, just pointing out the significant unknowns. The legacy Hornet's SAR was good enough for ATA terminal or mid-course updates for SLAM-ERs, which had the same or a modified version of the Block III TLAM's seeker; even if not ideal, there are plenty of public docs on optimizing cruise missile delivery and the difficulty of timely ISR from which you can infer a minimum image quality. Anyway, I just came here to ask whether ED has said anything about this? I am not seeing anything that even acknowledges EXP 3 isn't just an additional zoom level of the DBS patch.
  19. The number of arguments that begin with "I think" or "It would make sense" is one of the reasons many don't post here anymore, at least regarding technical topics. Tactical or procedural conversations are fair game, I figure; some people want to learn to fly correctly, others are bored with or uninterested in that and would prefer to fly as they wish. Nothing wrong with either of those, even if a bit funny at times, but for topics like this it doesn't benefit anyone to guess. It just confuses the topic with bad information. FYI, the F-35 has the same size antenna as a legacy Hornet, both significantly smaller than the 70s-era F-14 and F-15 radars. I am inclined to defer to ED and assume WIP status on most things unless given reason to think otherwise. It is public-domain manual stuff that EXP1 and EXP2 are DBS sector and patch, respectively. To the pilot they are functionally the same as MAP, with improved azimuth resolution at the expense of forward-looking capability. EXP 3 SAR is different; DBS is technically SAR, but most here seem to understand that in the Hornet it refers to processing returns to generate an EO-"like" image. Just so we are all on the same page... ED depends on documentation, and I don't think there is much regarding specific SAR cell resolution, latency, and the other details required to properly model various geometric distortions, multipath errors, foliage or cloth penetration, etc. In fairness, it gets hyper-complicated quickly trying to simulate SAR imaging: mimicking geometric distortions, how shadows, reflectors, and vibrating or spinning objects (like compressor blades) appear. So some simplification is required, but that is true of every system in DCS. But again, is there agreement that there is even a problem?
I won't argue effectiveness other than to note that an argument based on how it was/is employed in real life makes no sense unless the jet is employed in the same manner, with realistic planning, briefings, support, ISR, etc. AG radar is just another sensor; it has some unique capabilities and plays nice with some of the stand-off stuff we have yet to get. Depending on implementation, DCS pilots might find more utility than RL given the small-force style of missions in the sim, but that's not really the point. If it's in the jet it should be in the sim, right?
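For the DBS/EXP discussion, the generic (non-Hornet, public-domain textbook) resolution relationships are worth having in front of you: real-beam azimuth resolution is range times beamwidth (θ ≈ λ/D), while a synthetic aperture of length L gives δ ≈ λR/(2L). The antenna size and geometry below are illustrative only:

```python
# Why DBS/SAR sharpening beats a real-beam ground map at long range.
# All values are generic illustrations, not Hornet radar data.

WAVELENGTH = 0.03      # m, X-band (~10 GHz)
ANTENNA_D = 0.6        # m, assumed aperture

def real_beam_res(range_m: float) -> float:
    """Azimuth cell size of a real-beam map: R * (lambda / D)."""
    return range_m * (WAVELENGTH / ANTENNA_D)

def synthetic_res(range_m: float, synth_len_m: float) -> float:
    """Azimuth resolution with a synthetic aperture of length L: lambda*R/(2L)."""
    return WAVELENGTH * range_m / (2.0 * synth_len_m)

R = 40_000.0  # 40 km to the patch
print(f"real beam map:            {real_beam_res(R):.0f} m per azimuth cell")
print(f"100 m synthetic aperture: {synthetic_res(R, 100.0):.1f} m")
```

This is also where the latency-vs-resolution tradeoff mentioned above comes from: a longer synthetic aperture means finer cells, but more flight time and processing before the image exists.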
  20. The role of ships is more complex than that; tactical networks must be managed like any other network. Most of that is not modelled in DCS, but certainly AEGIS systems should contribute the way SURV/AWACS is modelled (though I am not sure to what extent they currently do). The only thing I would add to fmedge's comment is the line-of-sight requirement. I don't know to what level of detail DCS models this, whether LOS to each donor is checked, if there are relays, etc. But you will notice missed messages in canyons and such, so it's in there to some extent.
  21. RealSimulator's grips are a big step up in quality. The F-16 CE won't be plug-and-play with VKB, but the buttons can be passed via Bluetooth while the X/Y axes come from the Gunfighter. I have their F-16 grip on an FSSB, but with their F/A-18 grip coming out soon I am contemplating putting it on my Gunfighter.
  22. fwiw it will change with weight and config, but 250 is a fine approximation. Frankly, it is completely irrelevant outside very limited circumstances. It isn't a Cessna with a 20:1, or whatever insane number it is, glide ratio. The Hornet has, if I had to guess, probably close to ten times the wing loading of a 172 (~120 lb/ft2 vs. maybe 15 or 20?). For a fighter that's not that high, really; the Viper is higher, for example, as are many larger aircraft like a B-1 or 747/A380. That's not the whole story though. The trapezoidal wings favored by the US have excellent flight performance at transonic/supersonic speeds, but require aero devices like leading-edge flaps or strakes (the F-16 uses strakes, but they typically don't play well with twin tails) to prevent poor high-AoA performance. Actually, that's a complex topic you didn't ask about... so before this becomes a wall of text, just take my word for it: the Hornet is a terrible glider. The ratio is around 1:1, similar to a brick. More importantly, nowhere in the emergency procedures is there anything resembling "trim for best glide and scan for a suitable emergency landing location." The actual number is in the 200-300 knot range, but it almost never factors into desired airspeed. With a single engine failure she can get home. If both engines are lost, the priority is restart. By the way, you also lost both generators, so the MFDs and HUD are gone, and the FCS is likely MECH ON. Some of it can be brought back by cycling batteries after minimizing electrical load. Again, regardless of best glide, the procedure is nose down until airspeed is ~400 knots, then check RPM (easy does it if MECH ON). If RPM is under 12%, increase airspeed until 12% (450-500) and attempt a windmill restart. If stuck at 0% it's seized, from the thermal stress of rapid cooling. Level off to 200-300 kts (again, easy does it when pulling), keep wings level, avoid slip, etc. If there's time, attend to other troubleshooting (hydraulics, electrical, etc.) while descending to APU restart max altitude (10k).
The APU accumulator can be charged via the HYD ISO switch to ORIDE if you have hydraulic pressure. Good time to get pointed away from population or bad guys, and radio (if powered, else survival radio). At 10k you can attempt an APU restart; you should be wings level and 200 kts or so at this point, into the wind if possible, straps/helmet tight and gear checked, as you won't have much time before ditching. TL;DR: it's 200-300 kts, but there aren't many questions it's the right answer for.
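The wing-loading and glide numbers above are easy to ballpark. The weights and areas here are rough public figures, and the ~1:1 L/D is the claim made above, so treat the output as illustration rather than performance data:

```python
# Rough numbers behind the wing-loading comparison and "flies like a brick."

def wing_loading(weight_lb: float, area_ft2: float) -> float:
    """Weight divided by wing area, lb/ft^2."""
    return weight_lb / area_ft2

print(f"Hornet (loaded, ~45k lb / 400 ft^2): {wing_loading(45_000, 400):.0f} lb/ft^2")
print(f"C-172 (~2,400 lb / 174 ft^2):        {wing_loading(2_400, 174):.0f} lb/ft^2")

# Still-air glide distance from altitude at a given lift/drag ratio:
# horizontal distance = (L/D) * height.  6076 ft per nm.
def glide_nm(alt_ft: float, ld_ratio: float) -> float:
    return ld_ratio * alt_ft / 6076.0

print(f"Hornet from 30k ft at L/D ~1: {glide_nm(30_000, 1.0):.0f} nm")
print(f"C-172 from 30k ft at L/D ~9:  {glide_nm(30_000, 9.0):.0f} nm")
```

Which is the practical point: from typical altitudes a 1:1 glider buys you single-digit miles, so the checklist is built around restarts, not best-glide field selection.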
  23. Jesus, I have barely any recollection of this thread... flashback to when I was recovering from an injury/surgery. I don't generally do the walls-of-text thing; fortunately it's not too bad. If it helped you then great, I am just glad I didn't embarrass myself. btw -- the solution to your problem is time in seat and BFM drills. Athletes don't practice by playing full games; they do small-sided drills, then large-sided drills, and finally scrimmage. No different here. To be relevant to this thread: rather than practicing full BFM engagements, work on in-close defense. Focus on specifics like lift vector and airspeed/AOA/energy management at first, then maintaining tally, recognizing the WEZ, assessing AWE, opportunities to separate, etc. Fly perch sets -- lots of them, and do them precisely. Know your game plan so well you don't forget 90% of it in the moment. At first, just breaking at 9 is difficult and your lift vector will be all over the place... eventually plane control will become much better and you can focus on maintaining tally and recognizing the visual cues that tell you what the bandit is doing. Review your flights in Tacview and note all the mistakes you didn't see/failed to exploit. Next time pick something else, maybe scissors, or snapshots, etc. Bonus if you train with the same guy, as you'll have a well-trained wingman for SEM. Somehow many words were written arguing whether going vertical with a bandit in close, deep on your 6, was a good idea...
  24. Most guys agree that G-LOC in DCS is unrealistic, but for a number of reasons beyond max tolerance; the consistency and low peak G of onset is just one aspect. I remember that main thread well, and I don't remember anyone singling out the Viper as specifically unrealistic - the Viper just makes reaching DCS's limits easier. Pilots on these forums have lost friends in the 16 though, and virtually everybody knows someone. The Viper had a tendency to bite, often unexpectedly, often guys with low time in type, and for more reasons than just the 9g peak. It's better now: they can identify students physiologically predisposed to G-LOC during the F-16 B-course, and with the improved g-suit, and of course Auto-GCAS, the occurrence rate per million flight hours is now comparable to other fighters.

That said, beyond some simple G warm-up logic, DCS doesn't seem to have much nuance or complexity in the current model. There are some big questions ED needs to answer when redesigning the effect: how to account for variances between aircraft, or for the training requirements (i.e. centrifuge profiles) the pilots must pass; whether to include fatigue or randomness, both of which are big factors. Which, btw, is why Auto-GCAS is as cool as it is. This vid is a good example of both points - note peak G barely above 8 with relatively slow onset, yet G-LOC came on very fast, unusually fast actually:

Very cool about the incentive flights, sounds like you enjoyed them. It's much different as a passenger, where your sole task is to grunt and bear it. Managing pilot workload, particularly while looking over your shoulder, plus the responsibility for the aircraft and passengers, changes the equation. Even then, the only time I have completely blacked out was not dissimilar from the video: very low G, sudden onset.
I have been over it a thousand times - the negative G preceding the positive maneuver, the drinks the night before and the lack of breakfast - all things I'd done a thousand times, none of which had ever resulted in me waking up knife-edge, 60 degrees nose down. Yet it still happened. Anyway, ED has access to plenty of centrifuge and flight analysis data containing mathematical predictions of the probability of G-LOC onset at various loads and durations, plus a whole bunch of anecdotal pilot opinions with suggestions on how to make the system more realistic. Some of that is game friendly, some perhaps not so much. A good example is the unexpected sudden onset described above: it's a big real-world hazard, but like mechanical failure, probably not something most would enjoy in a game. But what about normalizing different aircraft? Or accounting for fatigue, or jerk (onset rate), or not allowing continued control inputs while repeatedly or continuously riding the grey-out limit? All would make the overall effect more realistic, but perhaps not all of it belongs in a game. Most everyone agrees the current effect comes on a bit soon, and with the Viper now in the hangar that's much more of a problem - we've been spoiled by the gentle Hornet, and the Typhoon will be even worse. Wags did say they were working on it, though when, or what that means, I have no idea. Will be interesting to see what they come up with.
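For anyone curious what "nuance beyond a hard G limit" might even look like, here's a purely illustrative toy model - this is NOT ED's implementation or any published physiological model, and every number in it is made up. The idea is simply a "tolerance reservoir" that drains faster the further you are above a resting-G threshold and slowly refills below it, which naturally produces time-under-load and recovery effects instead of a fixed cutoff.

```python
# Toy G-tolerance "reservoir" model. All constants are invented for
# illustration; reservoir hitting 0.0 represents G-LOC.

def step_tolerance(reservoir: float, g: float, dt: float,
                   rest_g: float = 4.0, drain_rate: float = 0.08,
                   recover_rate: float = 0.03) -> float:
    """Advance the reservoir by one time step of dt seconds."""
    if g > rest_g:
        reservoir -= drain_rate * (g - rest_g) * dt  # drain under load
    else:
        reservoir += recover_rate * dt               # slow recovery below threshold
    return max(0.0, min(1.0, reservoir))             # clamp to [0, 1]

# Sustained 8 g from a full reservoir: drains at ~0.32/s, so ~3 s to G-LOC.
r, t = 1.0, 0.0
while r > 0.0:
    r = step_tolerance(r, 8.0, 0.1)
    t += 0.1
print(round(t, 1))  # prints 3.2
```

Hooks for fatigue (lower the starting reservoir), aircraft or g-suit differences (scale the drain rate), and randomness (jitter the constants per flight) fall out of a structure like this almost for free - which is presumably the kind of design decision ED is weighing.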
  25. To answer your question, the roadmap lists the JPF as a 2021 project. I don't know where damage modeling that would include any JPF capabilities sits in the larger DCS roadmap, however. Since it would be just a couple of DDI screens and a graphic effect until that happens, I suspect that's why it's listed as a late-stage project. Time delay or void-sensing fuzing is pointless if the effect on a structure is the same as an instantaneous detonation. The same is true for terminal trajectories, JDAM airburst, etc. - until weaponeering effects on structures, and some type of risk-estimate distances for different units, are modeled, there's not much reason to build it.