Everything posted by Sn8ke_iis

  1. Yes, DCS will use all the CPU speed you can give her. There are a couple of power-saving features in Windows and the graphics drivers that are on by default because they really aren't needed for day-to-day computing like web browsers, Excel, etc. Assuming you have an Nvidia GPU, you need to adjust a similar feature in the Nvidia Control Panel as well. "Reducing System Latency: Enable Max Frame Rate and set your power management mode to 'Prefer maximum performance' to reduce latency. While in this mode, the GPU is kept at higher frequencies to process frames as quickly as possible. To maximize latency reduction in GPU bound scenarios where FPS is consistent, set Max Frame Rate to a framerate slightly below the average FPS and turn Low Latency Mode to Ultra." And cap your frame rate at 60 fps if you are using TrackIR (there's a small scripted example below). https://nvidia.custhelp.com/app/answers/detail/a_id/4958/~/max-frame-rate%3A-cap-frame-rates%2C-save-power%2C-and-more
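     The Nvidia Control Panel settings have to be set by hand, but the Windows side can be scripted. A minimal Python sketch (my own illustration, not something from the Nvidia guide) that just shells out to Windows' built-in powercfg tool to activate the stock High Performance plan before a DCS session; the GUID is the default Windows one, and a custom plan has its own GUID, which "powercfg /list" will show you:

     # Minimal sketch: switch Windows to the High Performance power plan via powercfg.
     # Assumes the stock Windows scheme; custom plans have their own GUIDs (see "powercfg /list").
     import subprocess

     HIGH_PERFORMANCE_GUID = "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c"  # stock High Performance scheme

     def set_high_performance():
         # Print the available plans so you can confirm the GUID on your machine.
         subprocess.run(["powercfg", "/list"], check=True)
         # Activate the High Performance plan.
         subprocess.run(["powercfg", "/setactive", HIGH_PERFORMANCE_GUID], check=True)

     if __name__ == "__main__":
         set_high_performance()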
  2. This guy's test was geared towards FPS games and mouse input lag but was very informative.
  3. As the previous poster mentioned, I used V-sync because my 4K panel could only take 60 Hz max at that resolution through the HDMI cable. You want to cap your frame rate at 60 fps through the Nvidia Control Panel or through Riva Tuner. You may also be able to lower the refresh rate of your monitor in its own settings menu. I also maxed out the Smooth slider in the TrackIR settings. You want a nice flat line on the frametime graph of MSI Afterburner; it will pick up any little stutters so you don't have to rely on your eyes alone. After a couple of hours of tuning graphics settings your brain can play tricks on you, and changes can have a placebo effect without an objective way to measure them. Here's a screenshot I have from some benchmarks I ran last month. I wasn't able to maintain that high a framerate in complex missions, only in solo free flight. Frametime for 60 fps is 16.7 ms (the conversion is spelled out below). Here's a quick guide to turning the frame rate limiter on in the latest Nvidia driver: https://www.ghacks.net/2020/01/06/geforce-driver-441-87-introduces-framerate-limiter/ This guy does some excellent test videos on framerate limiters and input lag. This one is mostly geared toward FPS games but it's a good intro to the topic; we just substitute TrackIR for the mouse to change our POV. And here's an actual test comparing framerate limiters for input lag.
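     The conversion is just 1000 divided by the frame rate. A tiny Python snippet for the other common targets, handy when reading the Afterburner frametime graph:

     # fps <-> frametime conversion
     def frametime_ms(fps: float) -> float:
         # milliseconds per frame at a given frame rate
         return 1000.0 / fps

     for target in (30, 60, 90, 120, 144):
         print(f"{target:>3} fps -> {frametime_ms(target):.1f} ms per frame")
     # 30 -> 33.3 ms, 60 -> 16.7 ms, 90 -> 11.1 ms, 120 -> 8.3 ms, 144 -> 6.9 ms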
  4. It will boost automatically on a single core. Just make sure you are in High Performance mode in the Power Options of the Windows Control Panel when playing DCS and other games. You can also double-check in your BIOS that Turbo Boost is on, but it should be on by default.
  5. Any of those would be fine. Buy the one you think looks cool in your case and has a good price/warranty. Since you are installing a non-K CPU, VRM quality for overclocking isn't really an issue. If you do upgrade later to a K-version CPU, it's highly unlikely that any of those boards would overclock better than the others to any significant degree.
  6. I was skeptical myself when I first read this in threads here. The thing was, I had been playing DCS at 4K/60 Hz with V-sync on. When I played first-person shooter games on my 165 Hz G-Sync monitor, I didn't use TrackIR. TrackIR was engineered before FreeSync/G-Sync became a thing. As Bit mentioned, it's due to the polling rate of the IR camera and probably the software running in OpenGL as well. Set up a framerate/frametime graph in Afterburner and then pan around and look over your shoulders in the cockpit while flying. At 80-90ish fps it's most noticeable; the microstutters are really bad. Less so the closer you are to 60 or 120 fps. At 120+ it's not really noticeable, but I was only able to sustain those framerates in single-player free flight in the TF-51. As soon as I loaded a complex mission with wingmen over Nellis in another module, sustaining 120 wasn't happening. Then, to test, pause TrackIR and use your mouse to pan your view: there will be no issues and it will look smooth, as G-Sync is intended to. One of the nice things about VR is you can get a smooth 90 fps versus the 60 fps limitation of TrackIR. But when the framerate is capped at 60 fps for TrackIR, gameplay is very smooth. I think a lot of players just get used to framerate drops and stutters and think that's normal because they have no other frame of reference. When tuned properly for a consistent 60 fps, DCS plays really nice to the eye.
  7. If you already have it capped through Riva Tuner I wouldn't bother changing it. I doubt you'd notice any difference. I watched a YT video where a guy tested both with a high-speed camera and mouse inputs, and the difference was a few milliseconds. He got the best results with in-game frame rate limiters. On other threads I've read that DCS's limiter is borked, but I've never actually tried it myself.
  8. Hopefully it's higher; the 1080 Ti was a much bigger improvement over the 980 Ti than the 2080 Ti was over the 1080 Ti. Hopefully we'll know by summertime. The good news is that percentages compound, so the 3080 Ti should be a nice improvement over the 1080 Ti (quick math below). Something to keep in mind as the rumor mill winds up over the next few months: none of the leaks about the 2080 Ti were accurate until about a week before the NDA lifted. So be very skeptical about the clickbait articles from tech "journalists". They're fun to read though. A 50% performance bump at $1000 instead of 30% at $1200 would be most welcome compared to last time. I'll believe it when I see it though. 1080 Tis are still selling for $400-$500 on eBay, so I would guess you could still get at least $300 for one after the new gen comes out.
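     Quick math on how the percentages compound, using the 30% and 50% figures above purely as illustrations (the 50% is only a rumor):

     # Illustrative only: generational uplifts multiply, they don't add.
     gen1_uplift = 0.30   # approx. 2080 Ti over 1080 Ti (figure used above)
     gen2_uplift = 0.50   # rumored 3080 Ti over 2080 Ti (unconfirmed)

     combined = (1 + gen1_uplift) * (1 + gen2_uplift) - 1
     print(f"Combined uplift over the original card: {combined:.0%}")  # roughly 95%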
  9. Hate to be a bummer, but TrackIR doesn't work with G-Sync for VRR purposes. You will get improved latency and input lag, but the frame rates won't match if you are controlling your view with TrackIR. Have you tried Adaptive VSync in the Nvidia Control Panel instead of V-sync in the DCS settings? It should reduce stuttering from frame rate drops. If you are seeing tearing, that is actually from the frame rate exceeding 60 fps. If you haven't already, you should try the frame rate limiter in the NCP and lock it to 60 fps for DCS. Otherwise your GPU is doing more work than it needs to, and capping keeps the animation smoother for TrackIR. It should help keep temps down too and prevent throttling. https://www.geforce.com/hardware/technology/adaptive-vsync/technology If you set up Afterburner with a frametime graph it should keep a smooth line at 16.7 ms for 60 fps. Your eyes can detect even slight hiccups in frametime, and the graph is a good way to make sure you aren't just hallucinating. And thank you so much for posting these benchmarks. It corroborates the benchmarks that Aurelius did. I did some myself before upgrading, but I forgot to save the folder with the data when I reinstalled Windows, like a dummy. It's interesting seeing the differences between modules and maps.
  10. Since you are located in the US, CyberPower and iBuyPower have the best prices in my experience for build-to-order rigs, for a very reasonable premium over building yourself. You can configure just about anything you want on their sites, and if they don't have the case or specific component you want, they'll build anything you want if you send them a parts list from PCPartPicker. Don't tell them your max budget, though; that's a bad negotiation tactic that will cost you money. I built the rig in my sig for less money than your budget, as a point of reference, although I have a lot of experience building custom PCs and know how to do custom liquid cooling. There's always an opportunity cost in the time it takes to learn and build, though. I do it because I enjoy it as much as or more than gaming. Stay away from Alienware, Falcon NW, and Digital Storm unless you want to pay more money for the same performance. I'm not sure how Falcon NW is still in business, to be honest; you are mostly paying for the name and a fancy custom case. I'm sure they have great customer service/warranty if that's what you want to pay for. All good PC components have their own warranty, some of which, like power supplies, go up to 10 years. EVGA has a 3-year warranty on their video cards. If you can wait till the summer, Nvidia's new cards should be out by then, but they will be scarce and overpriced, as is usual with new launches. Right now, the 9900KS for CPU and the EVGA 2080 Ti Kingpin are the best you can get. The 3900X does very well in games too and costs less than the 9900K. The 3950X is an editing/workstation CPU not meant for gaming and costs more than the 9900KS. And the Reverb is amazing. I thought it would be at least a couple more years before we had such a headset, especially for only $650.
  11. Nice! Glad to see you went big; $100 more for a good CPU isn't that much in the grand scheme of things, especially since you have a motherboard that supports it. DCS likes fast CPUs, even more so for VR. 12C/24T will keep you a happy gamer for a good while.
  12. You can overclock one core and Intel's Turbo Boost will take care of the rest automatically for DCS's rendering thread, the one that sends draw calls to the GPU. The other cores don't really need to be overclocked so aggressively above stock for DCS. Be careful with core affinity: when I restricted DCS to just 2 cores in Process Lasso it turned into a choppy mess, but it does well restricted to 3 cores, at least on the 9700K/9900K (there's a rough scripted equivalent below). Can't speak for AMD or older Intel processors.
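     A rough Python sketch using the third-party psutil package, for anyone who would rather script the affinity than click through Process Lasso (my own illustration, not an official DCS or Process Lasso tool). The core indices assume HT is off; with HT on, logical cores come in pairs per physical core, so pick indices accordingly.

     # Rough sketch: pin DCS.exe to three cores with psutil (pip install psutil).
     import psutil

     DCS_EXE = "DCS.exe"  # process name as shown in Task Manager

     def pin_dcs_to_cores(cores=(0, 1, 2)):
         for proc in psutil.process_iter(["name"]):
             if proc.info["name"] == DCS_EXE:
                 proc.cpu_affinity(list(cores))  # restrict the process to these logical cores
                 print(f"Pinned PID {proc.pid} to cores {list(cores)}")

     if __name__ == "__main__":
         pin_dcs_to_cores()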
  13. I believe he was referring to his GPU usage, not his CPU usage. Ideally you want your GPU at 90%+ to maximize performance and get what you pay for. In my experience, DCS is still mostly CPU bound on the Reverb, per the native diagnostic tool that gives you a colored dot specifying whether you are CPU bound, GPU bound, or both. This is all situation-dependent, though; "it depends" is the key takeaway. When flying low level around lots of trees and buildings I'm still CPU bound. Draw distance graphics settings are mostly CPU dependent as well. As you get away from complex areas and gain altitude you get the green dot for true 90 fps without any frame interpolation, and it's very smooth to the eye.
  14. Hey, thanks for posting this! I've never watched this guy's videos before; the YouTube algorithm has been failing me. I'm surprised he hasn't popped up in my feed with almost 500,000 subscribers. I like the way he benchmarks with the 4-way split screen. Much more edifying than just looking at graphs. He makes good points about future proofing, costs, real-world usage, etc. Very sound rationale for specific use cases. Can't really go wrong with the 9700K, especially if you get a good deal on it. Something to keep in mind: logical cores ≠ physical cores. Even in a heavily multi-threaded application that properly utilizes Hyper-Threading you won't get a 100% bump in performance from a 9700K to a 9900K; it's more like 20-30% even under ideal circumstances. 8 fast cores that you can overclock will keep you happy for a while for gaming. DCS only needs 3. What the guy said in your video is true about games using more cores and multithreading over time, but it's going to be a while before games are bottlenecked by 8 fast cores. You'll have a couple more generations of processors by then that will smoke a 9900K and cost a lot less. "Future proofing" has diminishing returns when tech moves so fast.
  15. HeHe, I really need to watch that movie again. I forgot John Goodman was in it as the football coach. The guy who played Ogre passed away a little while ago IIRC. "Eat a pie for charity" ;)
  16. I wasn't trying to imply that it wasn't a working semiconductor, just making a joke about how the diagrams are simplified. I find microprocessor design fascinating but have pretty much reached the limit of my understanding without going back to school to take EE classes.
  17. Are you running a separate computer with the dedicated server client, or are you trying to do both on your gaming rig? If so, show us some stats running in single player or connected to somebody else's server to establish a baseline for performance comparison. Your framerates and frametimes are all over the place. A 6700 only has 4 cores, and I'm not sure you can do both on the same machine. You can try running DCS with core affinity set to 3 cores and use the 4th core for the server client, or try to set up a VM, but even then I think you are asking a little too much. I'm not sure I could run a server simultaneously on my 9900K, though I've never tried. When you play on console or host other multiplayer PC games, there is infrastructure provided by the company. As far as I know, all DCS multiplayer servers are provided by individual players or squadrons who pitch in to buy a server or rent one from an off-site hosting service. If you look in the multiplayer forum you'll see guys running servers on Xeon CPUs, which are expensive workstation/server chips that can have 12+ cores.
  18. At first I thought that was a reference to Cyberdyne Systems from the Terminator movies. :) They make it look so simple in those diagrams. Pffft... I can draw one of those out on a PowerPoint slide, no problem. I should be an Electrical Engineer. :smartass: If you are into this sort of thing, der8auer made a cool little series where he took a 9900K to U of Berlin and scanned it with different electron microscopes.
  19. What motherboard do you have? Assuming it supports the latest Ryzen chips, I would look at a 3600 or, even better, a 3700X; the 3700X has really good performance for the money. Then I would use the leftover budget for RAM and a cheap SATA SSD. Then sell your old components and use that money to subsidize the cost of a new GPU when you have more money to spend. You would still be limited by your GPU to a certain extent in the meantime, but you should see a decent performance bump at 1080p and have a good foundation for future upgrades. By November there should be some good deals on mid-range and used graphics cards. A 3700X/1660 Ti is a good pairing for gaming at 1080p. You won't be able to play with all the settings cranked up, but when you are in the moment in a good game, graphics settings are just eye candy. Graphics settings can make a good game great but can't make a bad game good, if that makes sense.
  20. A 380 W power draw wouldn't be out of line for a 10-core CPU that's overclocked and/or under stress tests; the 9900K can easily draw 30 W+ per core, and I've seen other overclockers get 50 W+ on single-core benchmarks when heavily overclocked (rough math below). I've read the same rumors, but they are just rumors that have not been verified. https://videocardz.com/newz/intels-10-core-comet-lake-s-cpus-could-draw-up-to-300w An Intel employee would be breaking their NDA and would get fired and/or sued if they gave out that info without permission. AMD wouldn't hire them either and their career would be over. Now, is a 300 W+ CPU efficient? Probably not, but getting powerful processors to draw less power is a difficult design challenge for EEs. If you can design more efficient CPUs that match or beat current performance, you can get a job at Intel or AMD and make a lot of money. The TDP is a design guideline for coolers; it's not really indicative of what the CPU can actually draw under max loads. The 9900K has a TDP of 95 W and the KS 127 W due to the higher stock clocks, and both can draw a lot more than that. Noctua makes the best air coolers, and there are several good options for AIOs and custom loops. But it's a lot more money for just a little more performance, so diminishing returns definitely apply. Overclockers are more into racing their computers on synthetic benchmarks for bragging rights. It's not really practical for gaming without an expensive custom loop or an even more expensive chiller that also draws a lot of current.
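     Back-of-the-envelope on that 380 W figure, using the per-core numbers above plus an assumed allowance for the uncore/memory controller (the 20 W is my guess and varies by chip and clocks):

     # Rough estimate: 10 overclocked cores at ~30-38 W each plus uncore overhead.
     cores = 10
     watts_per_core_low, watts_per_core_high = 30, 38
     uncore_w = 20  # assumed uncore/IMC allowance

     low = cores * watts_per_core_low + uncore_w     # 320 W
     high = cores * watts_per_core_high + uncore_w   # 400 W
     print(f"Estimated package draw under an all-core load: {low}-{high} W")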
  21. What's your budget? A Reverb will take all the money you can throw at it; it will run best on a 9900K or 3900X with a 2080 Ti. For VR, video memory bandwidth is very helpful, so a 2070/2080 Super would be my minimum recommendation. That was one of the few benchmarks I was able to find that wasn't trying to sell you something and compared different GPUs. It won't be the same framerates with a Reverb, but it gives you an idea of the performance hierarchy. I would also speculate that he was hitting CPU bottlenecks in his gaming benchmarks. For RAM you can get away with 16 GB, but DCS will use 32 GB, especially in multiplayer. Unless you want to spend time watching your Windows OS boot and your games load, I would recommend an NVMe drive; prices have come down a lot in the last couple of years. For DCS loading there won't be much difference versus a SATA SSD, but for Windows booting up and applying updates, or just transferring large files in general, an NVMe can be a lot faster. Transfers are bottlenecked by the slowest drive, though; for example, if you are reading from or writing to an HDD, the HDD's speed will be the limiting factor. Installs from Steam or from the DCS client are very fast either way. Since you are in the States you can sell your used stuff pretty easily on eBay or Craigslist; there are always people buying used parts for budget builds and other use cases. Or just make a hand-me-down system for a family member or friend.
  22. Nope, I keep HT off for my overclocked DCS/gaming profiles and only turn it on when I am messing around in Blender; I have more headroom for overclocking without it. For older CPUs with fewer cores it's usually better just to keep it on, but you have to test it yourself to see. Neither the 9600K nor the 9700K has HT, and both are made for gaming. You can also OC 1-2 cores higher than 5.0 GHz for gaming; that will use less voltage and put out less heat than OCing all 8 cores. With the DCS TrackIR use case you hit the 60 fps ceiling, but if you are playing PUBG or other first-person shooters at 1080p a fast CPU can get you more frames. Less so as you get into higher resolutions; then you are GPU limited. They make 240 Hz and soon 360 Hz monitors at 1080p for FPS games because they are so popular and competitive online. I was able to get 120 fps on my rig at 1080p when flying DCS single player at high altitude, but as soon as you fly low and/or have wingmen next to you the FPS drops and you can't maintain 120 fps at the High preset. I didn't even bother checking online. When you look at the absolute numbers in benchmarks you'll see that in some games there's not that much difference that would be perceivable to your eye, so the limiting factor is more your budget. A good overclocking motherboard is the EVGA Z390 Dark, but it costs $500. If you aren't planning on doing an SLI build you can save money by buying a Mini-ITX motherboard; Asus, ASRock, Gigabyte, etc. all make good ones for Intel and AMD, and the money you save can be put toward a better GPU and faster RAM. Ultimately, for motherboards, as long as you are buying from a reputable manufacturer with a good VRM for OCing, just buy whatever you think looks cool in your case. Here are some good benchmark sources for games besides DCS. They tend to focus on 1080p, which is where the CPU makes the most difference in framerates. You'll see that a 9700K matches or beats the 9900K in a lot of games. The AMD 3900X did beat the 9900K in CS:GO and Rainbow Six, but it costs $100 more than a 9700K. https://www.pcgamer.com/best-cpu-for-gaming/ https://www.gamersnexus.net/hwreviews/3489-amd-ryzen-5-3600-cpu-review-benchmarks-vs-intel I would agree that if you don't want to build another computer for 5 years you might consider a chip with HT, but I've never had a problem selling my stuff to upgrade. When you can sell your used stuff it's a lot easier to justify upgrading every couple of years. The new PlayStation and Xbox consoles are rumored to have multi-threaded chips, but it will still be a while before games make heavy use of hardware multi-threading. By the time games that use it are common, we'll have another 2-3 generations of CPU performance to choose from. Both BigNewy and NineLine have said that the Vulkan API build isn't coming anytime soon.
  23. For the Reverb I have my settings tuned for 45 fps at low level and in complex areas, and the frame interpolation doubles that to a perceived 90. I have read some posts where people tune for 30/60, though; I might try that soon and see how it looks.
  24. Nice! The 9700 is a good chip for gaming; can't really go wrong there. There's still a lot of demand for the 8700, so you should be able to get a decent price for it on eBay or Craigslist. Lots of budget builders out there are just waiting for people who've upgraded to sell their used stuff. Have fun!
  25. I think that's a good call. Going by the performance test chart in that guide, you got a good chip. Pushing it any harder would just increase the risk of degrading it without buying a new cooler. Might as well save the money for a new GPU or CPU.