Sn8ke_iis

Everything posted by Sn8ke_iis

  1. This might be a bit out of date, but hopefully it will help. From your desktop, right-click to open the context menu and launch the Nvidia Control Panel, or click on the Nvidia icon in your system tray. There's a utility called Nvidia Profile Inspector that I like to use, but become familiar with NCP before you jump to that. This one might be more current for an i9.
  2. Fair enough. I was actually wondering about any exemptions. When I lived in Wiesbaden I had one, but that was part of the NATO SOFA. It would be a shame for AI researchers and artists to have to pay that premium if they are actually using the cards to be productive and not just goofing off like us.
SLI is fun if you are an "enthusiast" and really into benchmarking on the 3DMark leaderboards, but the cards don't really scale well outside of synthetic benchmarks. Steve from Gamers Nexus uses Sniper Elite 4 as a benchmark for SLI. It actually scales at 100%, but that is the exception; most games only scale about 30-40%. Besides the cost of the extra GPU, you also need a motherboard with the appropriate slots and an adequate power supply. Good mini-ITX motherboards cost about $150 tops, whereas ATX motherboards can easily run $500-$900 at the top end. Most people don't use the extra PCIe slots, and you could buy some good pedals or something for the price difference.
I was hesitant to go VR; before I got the Reverb I played on a 4K big screen. But when I first tried it my exact words were "Oh wow!". I spent a couple of days just playing demos and checking out all the different cockpits in the modules that I have. Being able to naturally pan your head around the cockpit and look behind you makes a huge difference in immersion. I rarely look straight ahead while playing flight sims anymore; I'm always scanning to the side, up, or behind. I still get eye fatigue, so I try not to wear it for more than an hour or two between breaks, but no nausea. The first weekend I had it, I made the mistake of not taking breaks and my eyes hurt quite a bit for a few days. Elite Dangerous is very impressive too.
  3. Thanks Der Hirte. I've been building PCs since the late '80s, starting with my first 80486. I was lucky enough to have a parent who worked at IBM, so I grew up on the first IBM PC and PC AT. I'm actually about to sell my second 2080 Ti and go back to an ITX motherboard and a smaller case. My rig is heavy, I like to move it between my living room, bedroom, and simchair often, and it's not very convenient for that. I made a bargain with myself that if I liked the Reverb enough to keep it, I would sell the second GPU, as VR can't utilize SLI now or in the foreseeable future.
I mostly play DCS, but I have a big backlog of older games on Steam. I recently played about 100 hours in a heavily modded version of Skyrim Special Edition and the second card was at 0% utilization. Not sure if that was inherent in the engine or because of all the third-party mods. It's a blast to play older games with everything maxed out as the developer intended. You can even supersample above your native resolution in the Nvidia driver. If you are into recent AAA games like Shadow of the Tomb Raider in 4K, it's jaw-droppingly amazing. Looks absolutely gorgeous. But the law of diminishing returns applies: you are paying quite the premium for a few graphics settings and frame rate. I love helping people get into gaming and tune/build their PCs. Even the 4K videos and screenshots you see don't really compare to the native content. You have to see it with your own eyes.
For DCS on a flat screen with TrackIR and SLI enabled, I was able to maintain 60 fps at 4096 x 2160 with almost everything maxed and shadows kicked up to Ultra manually in the config file. I kept depth of field off, heat blur on medium, and civilian traffic on medium. On a few modules like the P-51 in single player I was even able to keep MSAA at 4x. Gorgeous... If I have time before I sell it, I'll try to do some SLI benchmarks for DCS to see how well it scales. Best guess at this time is about 30-40%.
Most games don't scale very well with SLI. But assuming you have the power supply to handle it, two used 1080 Tis would probably do well compared to a single 2080 Ti at current prices.
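To sanity-check the scaling math, here's a trivial sketch of what a second card actually buys you at a given SLI scaling factor. The numbers are my own illustrative figures, not real benchmark data:

```python
# Hypothetical helper: effective framerate with SLI, given the per-game
# scaling percentage (the fraction of a full second GPU you actually get).
def sli_fps(single_gpu_fps, scaling_pct):
    return single_gpu_fps * (100 + scaling_pct) / 100

# Typical game (~35% scaling) vs. the Sniper Elite 4 exception (~100%)
print(sli_fps(60, 35))   # -> 81.0
print(sli_fps(60, 100))  # -> 120.0
```

So at the typical 30-40% scaling, a second card turns 60 fps into roughly 80, not 120 — which is why the math rarely favors SLI over simply buying one faster card.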
  4. Define "optimize" in this context? Have you set the power settings in Windows and the Nvidia driver to "performance"? What motherboard do you own? Nowadays most gaming boards come with pretuned overclocking profiles in their BIOS. Enable a conservative one, and if it boots you are good. There are benchmarks out there to test stability, but just playing DCS for an hour or two is the best stability test. My more aggressive overclocks will crash Cinebench but are fine in DCS. Have you enabled the XMP profile for your RAM? Like the CPU overclock profiles, these are pretuned for your motherboard, CPU, and RAM. You can squeeze out a frame or two that way, and it can prevent microstutters. Have you installed Process Lasso? Highly recommended; it's more convenient and in-depth than Windows' built-in tools. I'm away from my rig right now, but I can post a snip of my settings later. I recommend the on-screen display that comes with Afterburner for flat screen; fpsVR is a utility available on Steam. I don't own the Index, but with the Reverb it was a whole new rabbit hole of troubleshooting and tuning. Good luck! In general, DCS and VR like fast CPUs and RAM. Learning to overclock is worth it and very user friendly these days.
  5. I've had mine since November '18, no issues at all. I upgraded from the Titan Xp. Definitely worth it; the faster VRAM and wider memory bandwidth help a lot in VR. It's only about 6 months or so until Ampere comes out, hopefully. I wouldn't get your hopes up that those cards will be cheaper than the RTX generation, but there will be used RTX cards on eBay for, hopefully, a decent price. I still can't reconcile that people spend so many hours on a hobby and then blame the manufacturers when they don't have the money to buy a new toy. Makes no sense. Looks like I stirred up a hornet's nest of salt with that post. LOL. If someone is physically disabled and on a fixed income, that doesn't apply, of course. $1200 for a new card isn't that much money in the context of our hobby; you just have to sell your old card. There's a big market for used parts, in the States at least. High-end cards keep their resale value and are in high demand. You can get $400-$500 for a 1080 Ti on eBay right now. That resale value will drop significantly when the next gen comes out, so the timing of the upgrade cycle is very important. This sounds like I'm shilling, but the reason I am partial to EVGA cards is that they have a transferable 3-year warranty while the upgrade cycle is 2 years, so they hold their value really well. When I bought my cards I had the full intention of returning them if they weren't worth it. It's telling that all the people saying the 2080 Ti isn't worth it have never actually used one themselves and base their decision on forum posts. Hmmm...
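To make the resale argument concrete, here is the back-of-the-envelope math as a tiny sketch. The prices are just the example figures from this post, not quotes:

```python
# Net cost of ownership over an upgrade cycle: what you paid minus what
# you recover on resale, spread over the years you actually used the card.
def cost_per_year(purchase_price, resale_value, cycle_years=2):
    return (purchase_price - resale_value) / cycle_years

# e.g. a $1200 card that resells for $500 after a 2-year cycle
print(cost_per_year(1200, 500))  # -> 350.0
```

In other words, the sticker price matters less than the spread between what you pay and what you sell for, and that spread is smallest if you sell before the next generation launches.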
  6. No, what he stated was completely inaccurate. If you think the Ampere Ti model is going to cost less than the 2080 Ti I wouldn't get your hopes up. VAT isn't going away for Europeans. If you are in the States this stuff is a lot cheaper all around.
  7. Oh Mr. Biggs, as you figured out people on this board are easy to make fun of. You shouldn't read too much into it. It's just a game. I just lose patience with people who put out bad information and try to use the boards to promote a specific company. It doesn't help people who are trying to build a new rig.
  8. DING, DING, DING, DING, DING, DING...We have a winner!!! I'm definitely not an engineer but I did have math classes with them at my University. I'm pretty sure they didn't pass those classes with their subjective feelings.
  9. The detail in the radial engine and folding wing structure is very impressive. Can't wait to see the final product. Pappy would approve.
  10. Somebody help me out here. How are we defining "accurate" and "realistic" in this context? I see how we can get a reasonable approximation of D (drag) and Cd (the drag coefficient) from a photograph, because we can see A (the reference area).
https://www.grc.nasa.gov/WWW/k-12/airplane/dragco.html
https://www.grc.nasa.gov/WWW/k-12/airplane/drageq.html
Where are we getting all the other variables and parameters, e.g. thrust, burn time, weight? I keep seeing phrases like "OP", "I feel", "based on the information available", "behaving like I would expect". I don't get it.
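The NASA pages above give the drag equation D = Cd * (rho * v^2 / 2) * A. Here's a minimal sketch of why a photograph only gets you part of the way. Every number below is a made-up placeholder, not real missile data, which is exactly the point:

```python
import math

def drag(cd, rho, v, a):
    # Drag force in newtons: coefficient * dynamic pressure * reference area
    return cd * 0.5 * rho * v ** 2 * a

rho = 1.225                 # sea-level air density, kg/m^3
v = 300.0                   # airspeed, m/s (placeholder)
a = math.pi * 0.09 ** 2     # frontal area of an ~18 cm diameter body, m^2
cd = 0.3                    # guessed drag coefficient (placeholder)
print(drag(cd, rho, v, a))  # drag force in newtons
```

A photo gets you a decent A, and wind-tunnel analogues get you a plausible Cd, but v over the whole flight depends on thrust, burn time, and weight — exactly the parameters that are classified.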
  11. You speak so authoritatively, yet you provide no evidence of this. People who rely on word of mouth in forums, without any data or testing to back up their statements, are just giving opinions. You know what they say about opinions. Steve from Gamers Nexus did actual tests and was not able to pinpoint the problem other than "test escapes" and inadequate QC in the rush to get the cards to market to meet demand. https://www.gamersnexus.net/guides/3394-rtx-2080-ti-artifacting-failure-analysis-crashing-black-screens There was never any actual evidence of a higher failure rate relative to other GPUs. People with good GPUs were too busy playing games, while someone with a failed card would obviously be more likely to go on forums and tell people about it.
If you are suggesting that you can ignore cooling and temperatures, go ahead and take the heatsinks and fans off your GPU, play DCS for a couple of hours, and see how that goes for you. A couple of years back I was playing The Witcher 3 in 4K on a brand new 1080 SLI system that I had built. The cinematic scenes will bring any GPU to its knees. I made the mistake of relying on the cards' stock firmware and fan profiles to keep them cool and operating within specification. That led to a trip to Microcenter to return one of the cards, which had fried. Since then I have never relied on stock firmware and fan profiles; I now consistently check temps on an OSD that I can toggle with a hotkey. My current motherboard has an LED display so I can monitor clock speed, voltage, and temperature in real time, but it's not convenient while gaming on my big screen. With a liquid cooling loop, though, it has never been an issue. Overclocked PCs are a lot like a performance car with a supercharged engine: push the rpm too high without a proper radiator and it will blow. You know what the best part of liquid cooling is?
Not the overclock headroom, not the higher framerates or graphics settings. It's so quiet. Air-cooled CPUs/GPUs can sound like a turbofan engine at max utilization, which is typical when playing DCS at high settings. If you are coming here just to market and shill for AMD, you should probably refrain from commenting in the future, as that just undermines your credibility and people will ignore you. It's really obvious when someone is a fanboi. In the computing field we use benchmarks to test GPU performance, not opinions. Per the Passmark GPU benchmark, the $499.00 Nvidia 2070 Super surpasses the $529.00 AMD Radeon VII. The Radeon VII, released in February '19, is still surpassed by the 1080 Ti, which was released in March '17. That's almost two years, which is a long time according to Moore's Law. As of this writing you can purchase used hybrid-cooled 1080 Tis on eBay for $500.00. Newegg has the Radeon VII available at $529.00, but they are being scalped on Amazon for $770.98. https://www.videocardbenchmark.net/high_end_gpus.html
Speaking of Intel, they wisely poached a very talented engineer from AMD: Raja Koduri is now head of Intel's visual computing group. Apparently he was not happy with AMD's priorities in GPU development. Intel now also has Jim Keller, formerly of Tesla and before that AMD, where he was a lead architect of several successful CPU lines. What was the point of your post exactly, and how does it help the OP? But hey, thanks for sharing your opinion and letting everybody here know that Nvidia is beneath you and you are way too intelligent to upgrade your GPU. I've been having a blast with my Reverb and 2080 Ti. I'll sell you my 2080 Ti used when I upgrade next summer. I wouldn't get your hopes up that the new Nvidia Ampere cards on a 7nm process will be more economical by any stretch. There's always demand from people with adequate disposable income to enjoy their hobbies.
You may be asking yourself how or why I would know all this. The bulk of my income comes from investing. To be successful at that you have to exercise due diligence and critical thinking skills and ignore Fanbois in forums. I am completely agnostic to whether I or anyone else builds a system with Intel, Nvidia, or AMD. I have traded stocks and options in all three. I make money regardless of what individual gamers choose. I just prefer to build my PCs with the fastest components available, because I can. :thumbup:
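Since I keep saying benchmarks over opinions: the comparison above is really just dollars per benchmark point, which you can compute in a couple of lines. The scores below are illustrative placeholders, not actual Passmark numbers — plug in the real ones from the link above:

```python
# Price/performance comparison: lower dollars per benchmark point is better.
# (name, street price in USD, benchmark score) -- placeholder scores
cards = [
    ("RTX 2070 Super", 499.00, 100),
    ("Radeon VII",     529.00,  95),
    ("GTX 1080 Ti",    500.00,  96),  # used/eBay price
]
for name, price, score in sorted(cards, key=lambda c: c[1] / c[2]):
    print(f"{name}: ${price / score:.2f} per benchmark point")
```

With these placeholder numbers the 2070 Super comes out on top; the exercise, not the specific ranking, is the point — run it with current prices and scores before you buy.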
  12. Yes, please. I love having a Cougar throttle with realistic buttons, but I think I've spent more time troubleshooting that thing than playing the F-16 module. I adjusted the detents from stock but it's still a little janky. It needs the pinky lever for throttle cutoff too.
  13. Hey Formski, I wish I had kept some data from real benchmarks, but last year I upgraded from a 7700K/Titan Xp to a 9900K/2080 Ti. Before I sold it, I threw the Titan into my new build and got about a 10% bump in framerates, so I can say pretty confidently that your GPU is bottlenecked. Between the faster CPU, newer CPU architecture, new chipset, and faster DDR4 RAM with the XMP profile enabled, you should see a pretty nice bump in performance, especially for VR. How much, and whether it's worth the money for you, I can't say. The 9900K will have better single-thread performance than the 9700K, but unless you are using some kind of productivity software or editing video with Adobe Premiere or Blender, the extra 8 logical cores won't be used, as the DCS engine doesn't use hyperthreading. If you are familiar with overclocking, even better: the custom loop will give you nice headroom for OC. DCS will take all the CPU you can throw at it. Settings like Visibility Range, Trees Visibility, and Shadows are heavily CPU-bound; resolution and AA settings are more GPU-limited. I use the Reverb, and with the WMR diagnostic tool I'm either CPU bound or CPU/GPU bound, but rarely GPU bound by itself. If you want to try AMD, I believe the 3700X had the fastest single-thread performance for a reasonable price last time I checked, but be wary of third-party vendors scalping on Amazon and Newegg. Looks like the 3800X is actually in stock. I would defer to Bitmaster for an AMD build; I've never overclocked an AMD chip before.
I just saw that you are down under. I feel bad for Aussies and Kiwis; you guys have it worse than Canucks, Brits, and Euros. There's a pretty steep premium for this stuff. I would probably be breaking some kind of Australian import regulation, but I wish I could just mail the stuff to you guys as a "gift". I live about an hour's drive from a Microcenter. It's a beautiful place.
My state has to charge sales tax for online sales now, so the price difference between Amazon and Newegg is negligible.
  14. Yahtzee!!!! There are a lot of people on this thread who simply aren't as knowledgeable as they think they are, or who lack critical thinking skills. Even that awesome white paper in the Google Drive link (reading through it right now) had to make an educated guess about thrust values at some point. And for the record, I was going to buy this module next year on sale to support the developer; I have more modules than time. But I think I might buy it now to check out these new missiles. Is there a preorder discount like the F-16's, perhaps through their website like HB? 2019 has been a great year for DCS: F-14, F-16, and two new third-party developers with full-fidelity modules. Good times!
  15. First off, that sucks! I've had both of my 2080 Tis for over a year now, no issues. Mine are cooled with water, not air. Are you watching your temps with the Afterburner OSD? Are you using the cables that came with your new power supply, or old cables? The pinouts are not standardized on the power supply side, only on the motherboard side. If the polarities were reversed it would have fried on first boot, but maybe the ground? Do you have a multimeter to test your cables and connectors?
  16. "The missile's geometry and aerodynamic characteristics were modelled as accurately as possible without having access to classified sources." All the math is open source as they are physics equations that have been around for 100 years. I appreciate the work they put in for this analysis for our hobby simulator but we still keep coming up against the same issue. Accurate data.
  17. I was curious: other than the references to Wikipedia, what open-source information are you guys basing this whole debate on? The size of a missile and the width of its fins can be easily determined from a photograph, but what about the weight? The thrust value of the rocket motor? The mass listed on the wiki page for the AMRAAM is 352 lb / 152 kg, but it has no citation. This is Raytheon's page on the AIM-120; no data on range, speed, etc. https://www.raytheon.com/capabilities/products/amraam
If you follow the link for the citation of the AMRAAM's range (">57", which is obviously vague), the source clearly states: "Note: Data given by several sources show slight variations. Figures given below may therefore be inaccurate! Especially the range figures are rough estimates only." It's just some random guy named Andreas Parsch who's into missiles and made a webpage. This is a quote from one of the sources linked on the SD-10 wiki page: "Absolute determination of AAM capabilities is greatly hampered by the efforts of governments and manufacturers to deny information, such as that regarding missile range and countermeasures, which would allow potential adversaries to gain an advantage."
Now I'm all for a good aviation-nerd debate, but you guys do realize this is all conjecture and educated guessing, right? One guy in the thread said there should be better chaff resistance for the AMRAAM. How do you know that? How do you know how well notching works? Especially if there is a data link from AWACS or multiple aircraft looking at the target from different angles. DCS is a hobby flight simulation product that people fly for fun. As far as I can tell, none of the figures and capabilities you guys are quoting as gospel are based on any kind of reputable source or verifiable intelligence product.
We aren't going to get an accurate BVR/EW simulation for these weapons in the next 10-20 years, probably longer, because export versions of these missiles will still be in service in various countries.
  18. How do you know that the force required is equal? The F-16's stick actually requires varying force depending on which way you push or pull. Your arm is naturally stronger in flexion than extension, and in adduction than abduction, i.e. nose down and roll right require less force because you are naturally stronger pulling back and toward you, even more so in the F-16's sidestick configuration. Works great with the FSSB, no issues at all. It would seem to make sense that greater force is required in pitch than in roll, as pitch is what will black you out from G-LOC. I've never used an FFB joystick; I'd like to try the Brunner FFB for the other modules, but I spent my Christmas money on the FSSB. Is there a way to calibrate force feedback outside DCS? The FSSB has its own calibration software that can tune the force required.
  19. I was curious, what exactly are you expecting from a FFB joystick for the F-16? I'm not sure how they would model that except to copy and paste the F-18's profile. The F-16's stick doesn't move, how can they accurately model FFB? Other than some vibration corresponding to engine power state or firing the cannon?
  20. Who is buying a $750-$1000 CPU to play DCS? DCS players and gamers should be looking at Intel's 9700K/9900K or AMD's 3600X/3700X for this build season. I'm all for competition, and Intel already had to lower their prices, but let's not get carried away here, guys. The 9900K has been out for over a year now; that's a long time in computer time. The KS version, which is overpriced, is just a top-percentile bin of the 9900K. I bought one of those last year through Silicon Lottery that's stable on all 8 cores at 5.0 GHz, for a $50 premium. Intel's stockholders and engineers aren't just going to give up and say "AMD wins" because AMD finally caught up over a year later. We'll have the next gen from Intel by this time next year. Linus and Jay know what they are doing when they create videos like this: more views = more $ for them. They are just creating drama for ratings, to justify higher rates for their commercials and to sell merchandise. That Linus video has over 2,000,000 views already. How does Intel lifting the NDA before AMD's prevent him or Jay from comparing processors? It's just an inconvenience for them to create content. Jay was ranting about the box it came in. Who cares? Since the 2080 Ti came out there hasn't been much to make videos about until now. We should all be very happy that this is the best time ever to buy a CPU for gaming. But let's keep our eye on the ball here, guys. The benchmarks that Linus did still had the 9900KS winning in single-thread and gaming performance benchmarks. All those productivity benchmarks for HEDTs aren't really relevant to most DCS users except in rare use cases. Here are all the objective benchmarks and ratings that I could find with a quick search:
https://www.tomshardware.com/reviews/best-cpus,3986.html
https://www.tomshardware.com/reviews/cpu-hierarchy,4312.html
https://cpu.userbenchmark.com/
https://www.cpubenchmark.net/singleThread.html
Gamer's Nexus 2019 CPU awards:
Best Overall: AMD 3600 (low cost too!)
Best Gaming: Intel 9900K
I wouldn't worry about multi-core performance for Vulkan until there's actually a DCS build that uses it; by then there will be another generation of processors out. Team Blue, Team Red, whatever... A build with a 9900 or 3700 would probably be within the margin of error for DCS framerates. Edit: Happy Thanksgiving, everybody!
  21. Make sure to check out rtings.com for your model. They have calibration guides for most models. Turning on the PC gaming mode properly will turn off all the movie filter gimmick features that you don't need to minimize input lag. TV's are just big monitors nowadays. Samsung, TCL, and LG all have models with really high gaming/pc monitor review scores. I believe my Samsung supports Gsync over HDMI with the latest Nvidia drivers but I haven't tried it yet. Doesn't matter with TrackIR but it's nice for other games.
  22. Is the power supply liquid cooled, or the whole system? Concur with 32 GB of RAM instead of 64. Are you planning on storing a lot of movies/games? Adding an SSD is an easy upgrade for sometime in the future. You could get away with a 1 TB NVMe SSD, or even 512 GB if it were only for DCS; that will hold all the maps just fine. If you buy storage through Alienware it will be overpriced; you could buy a faster SSD for the same money from Newegg or Amazon. EK makes a nice aluminum-block cooling kit that works very well for overclocking. It's much cheaper than the copper-block stuff and works better than an AIO for CPU OC. DCS runs well with the stock turbo boosts, but you can squeeze some frames out by only OCing the two cores that DCS uses; then the rest of the cores don't put out as much heat.
  23. Not my content, and not DCS specific, but this series is really good for those who are new to aviation and want to establish a good foundation of aeronautical knowledge, based on the appropriate FAA handbooks and from a reputable flight school. That being said, you can completely ignore FAA/ATC rules, physics, take off from the taxiway, whatever. But this is how you are supposed to do it. The series is intended for those pursuing a civilian private pilot general aviation license in Cessna 172s in the US. USAF/USN/NATO regs and procedures can vary, but as you advance into the more sophisticated modules, you will be assumed to have certain knowledge and fundamentals. This series helps a lot with that.
  24. Nice! If you are good with hardware, you will probably have more trouble installing Windows than putting the PC together. Hit me up if you need any advice on tuning, OC, calibrating controllers, etc. It will be a lot at first, but once you get past the initial barrier to entry, you are good. The underlying principles and tech haven't really changed since the '80s; the transistors are just smaller and more numerous, and you don't have to be a PC tech to overclock anymore. All I know is a little R, and I can barely compile "hello world" in C++; don't worry about it, you'll be fine. DCS is a complex piece of software, but there's a tutorial, forum thread, or some kind of documentation for almost everything. Just don't try to learn everything yourself by trial and error; it will take too long and you'll get frustrated. People here are usually very willing to share their experience, or at least point you in the right direction.
  25. The FSSB is hands down the most precise joystick I have ever used. I feel like I'm an F-16 demo pilot at an airshow, she handles so well. If you are willing to part with the funds, it is worth it. Make sure you set the force differential curves in the FSSB software, as you are naturally stronger in muscle flexion than extension, i.e. tune the force inputs so they are comfortable for your setup. I have mine set at a comfortable forward angle, but you can calibrate to compensate for the leverage advantage. I flew the F-16 the first day with my usual center-mounted extension setup, but the FSSB in sidestick configuration is so much better. That, in combination with VR, really fools your brain into thinking you are flying a completely different plane with unique fly-by-wire characteristics. I haven't really used it much for other aircraft, as I prefer the traditional center stick for those, but it's awesome for space sims. Edit: I can't remember the exact post, but Wags said one of their SME pilots broke a Warthog stick mounted on an FSSB because he was used to putting so much more force on the stick in the real aircraft. I have mine tuned for a lighter touch.