Everything posted by Headwarp

  1. What card did you upgrade from? Is there any chance it's using a different version of HDMI (e.g. 2.0) or DisplayPort (1.2 vs 1.4)? If so, the fix could be as simple as a new cable. The HDMI port on my previous card didn't work, and a random DP-to-HDMI adapter limited my Odyssey to a 60Hz refresh rate; I ended up having to use an active DP 1.2 to HDMI 2.0 adapter to fix it, which I found for about $10 on Amazon. The 1080Ti's HDMI port is 2.0b and should work fine with the Odyssey, so I can't be certain what's going on there, but it does use DP 1.4 where Maxwell was using DP 1.2. Grain of salt, though: the DP cable I was using with my 980Ti works fine with my 2080Ti, also on 1.4, and I'm not sure there's actually a distinction between a DP 1.2 and a DP 1.4 cable. Might be worth at least trying another cable, or another port on the GPU or monitor. That's about the only other possibility I can think of, personally.
  2. A random passerby feels comforted by the "in-stock" notice he got from EVGA, and by knowing he'd get a new card within a couple of weeks should something go wrong. But he thinks he has a keeper, and he also made sure both of his 8-pin power connections were seated snugly in both the GPU and the modular PSU. >.< But yeah, the GeForce forums have quite a few posts about issues with the 2080Ti FE. I do have to wonder about the guys on their third RMA, though.
  3. Have you tried using DDU in safe mode to uninstall everything NVIDIA and then reinstalling your NVIDIA drivers? Sometimes that's a must when switching video cards. Anyway, good luck man, I hope it turns out not to be a faulty card. When you do get there, my post above is how I found the best settings for my Odyssey on my rig.
  4. Try 2x MSAA and no SSAA, then use the Pixel Density setting in the VR menu to increase your supersampling until you can no longer maintain 45fps, and turn it back down to where you can. There will still be a slight shimmer no matter what, but Pixel Density is the VR-specific supersampling method in DCS. I would also make sure SteamVR's SS setting is at 100%, so the only supersampling comes from Pixel Density (rough render-target math after this post). Normally I'd set negative LOD bias to Clamp, but setting High Quality texture filtering disables the other texture filtering options, which works out: anisotropic sample optimization is supposed to be turned off if you see shimmering, and turning off trilinear optimization may produce better image quality as well. Sorry, you'll have to click the screenshot to see the settings in my DCS profile in NVIDIA Control Panel. There will be SOME shimmering no matter what you do in DCS VR, but this gave me the best balance of image quality and performance. Also, in the Mixed Reality Portal settings you can change the quality of how the cliff house appears, and if you dig through the SteamVR settings you can individually set the supersampling for apps that use SteamVR, including the SteamVR room.
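For a sense of why Pixel Density gets expensive fast, here's some napkin math in Python. The Odyssey's per-eye panel is 1440x1600; the assumption that PD multiplies each axis linearly (so total pixels grow with the square) is mine, so treat the numbers as ballpark, not DCS internals.

```python
# Napkin math for per-eye render size on the Samsung Odyssey (1440x1600 per
# eye). Assumption (mine, not confirmed DCS behavior): Pixel Density scales
# each axis linearly, so total pixel count grows with the square of it.
BASE_W, BASE_H = 1440, 1600  # Odyssey per-eye panel resolution

for pd in (1.0, 1.2, 1.5, 2.0):
    w, h = round(BASE_W * pd), round(BASE_H * pd)
    print(f"PD {pd}: {w}x{h} per eye, {2 * w * h / 1e6:.1f} MP both eyes")
```

PD 2.0 works out to roughly four times the pixels of PD 1.0 under that assumption, which matches how brutal it feels on the GPU.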
  5. I should have mentioned in my post that even using NVME as a Windows drive knocks at most a few hard-to-notice seconds off your boot time, speaking from experience here. Where you'll see the difference with NVME is transferring or writing large files, and even then you're limited by the device you're transferring to or from. Transferring to an HDD you're lucky to sustain 150MB/s; to a SATA III SSD from SSD or NVME, lucky to maintain 500MB/s; from 970 Evo to 970 Evo, what is that, like 1200-1500MB/s writes? Think transferring 120-200GB of data in mostly 20-60GB files: to HDD would take 20-45 minutes, to SATA III more like 8-15 minutes, NVME to NVME maybe 3-10 minutes, if you don't run into limitations of the chipset PCI lanes (guesstimates, don't quote me, I didn't bother doing the math, though there's a rough sketch below). Games and loading Windows are more about access times for batches of small files. Don't feel too pressured on the choice of SSD type for gaming purposes, as long as you have an SSD for booting and gaming. HDDs are mostly about inexpensive storage should you need it, e.g. my 4K video projects ate a ton of storage space, so I picked up a 4TB HDD for under $100, and it doesn't see much use outside of storing my video files. SSD prices overall are kind of awesome lately in my book: 250GB 970 Evo under $100, 500GB 970 Evo as low as $150, 500GB SATA III SSDs I've seen between $70-90, 1TB SATA 3 SSDs like $160? Hard to go wrong.
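Here's the math I didn't bother doing, as a quick Python sketch. The speeds are the rough sustained figures from the post, not benchmarks, and real HDDs often dip well below their peak rate mid-transfer, which is why my ranges above are wider than these ideal-case numbers.

```python
# Transfer time at a given sustained write speed:
# minutes = GB * 1024 MB / (MB/s) / 60. Speed figures are the rough
# sustained rates quoted in the post, not measured benchmarks.
SPEEDS_MBPS = {"HDD": 150, "SATA III SSD": 500, "NVME-to-NVME (970 Evo)": 1350}

for size_gb in (120, 200):
    for target, mbps in SPEEDS_MBPS.items():
        minutes = size_gb * 1024 / mbps / 60
        print(f"{size_gb}GB -> {target}: ~{minutes:.0f} min (best case)")
```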
  6. NVME drives won't offer much benefit over a SATA III SSD in gaming/DCS, if at all. Also, looking at EVGA's Z370 FTW manual, only the third 32mm M.2 slot shares bandwidth with a PCI-E x1 slot; the two 80mm M.2 slots share with SATA ports, two of which could end up disabled with a second M.2 drive, and they share the 24 PCI-E lanes from the PCH on the motherboard, which connect through the 4 CPU lanes that aren't devoted to your PCI-E slots. I think the third M.2 slot is likely there for Intel Optane, which would only really benefit an HDD. I'd expect the Z390 to be the same in that regard, but I'd verify that once you can actually download a PDF of the manual, which is what I was looking for but couldn't find when I referenced the Z370 FTW manual. The PCI-E layout between the Z370 and Z390 chipsets is the same, though, AFAIK.
  7. I'm not getting stutters, just a somewhat random drop to 30fps while flying the areas surrounding Dubai with all the tall buildings. Pretty much anywhere else on the map I'm maintaining 45fps (VR). Not using MSAA, but it's the same behavior whether I'm at 1.5 pixel density or 2.0. I'd suspect that if this were hardware-bound I'd see something different at lower pixel density, but it's 30fps either way, not higher, not lower, exactly 30fps, and only while flying toward certain areas of the cities; otherwise 45fps. I'll have to play around with other settings to see if maybe it has something to do with tree distance or something else. But thank all of you for your insight, the information is helpful.
  8. I find that surprising. With my Odyssey I can spot blurs flying around at over 30NM. It was much harder to spot them when they were little black dots on my 3440x1440 monitor; my ability to spot aircraft at a distance improved greatly when I switched to VR. I mean, I can't ID them until they get closer and are no longer a blurry object in the sky, but spotting air targets got easier. Ground targets are another story altogether, but with things in the air it was mostly just a matter of time to acquire the skill. A couple of years ago I might as well have been a blind "pilot", but I've had a lot of A2A encounters in sims since then. There's also something to be said for seeing afterburner flames at a distance, especially at night. On the topic of contrails, I agree these shouldn't be something that appears or disappears based on zoom level or distance.
  9. I need two volunteers: one user with 16GB of RAM and another with 32GB, both not CPU limited (i.e. no single core of the CPU hitting 99% while flying). I'm using the Hornet at the moment, in the instant action free flight over the Persian Gulf. Flying over the largest cities up the coast from Al Maktoum Intl through Dubai and Ajman at low altitude, how do your framerates behave as you navigate the cities and look around? I'm not interested in the actual framerate value, just the behavior. I can fly around Caucasus @45fps in VR all day (asynchronous reprojection), and it makes sense that my GPU isn't running at 99% with it on. But flying low over the cities with the most and largest buildings in PG, my frames randomly dip to 30 and my CPU/GPU still aren't maxed, monitoring with a 100ms poll rate. I'm wondering whether this is RAM related, although after I save a little money up I'm going to buy a 32GB kit anyway (just because DCS.. cuz DCS). For the time being, is anybody willing to share whether they're getting sudden frame drops when flying low, i.e. down low you normally get 50-60fps but approaching the large cities around Dubai you get random 15-20fps drops? With my settings I'm using anywhere between 13-14GB of RAM and ending up with a 10-20GB swap file (a simple way to log this is sketched below), so I'm wondering if someone with twice the RAM I have also experiences framerate dips in those areas. Pixel density changes show pretty much the same behavior, just with less GPU utilization at lower pixel density.
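If any volunteers want to log RAM and swap the same way I'm describing, here's a minimal sketch. It assumes Python with the third-party psutil package (pip install psutil); run it in the background while you fly and compare the peaks between the 16GB and 32GB rigs.

```python
# Minimal RAM/swap logger for comparing runs of the test above.
# Requires the third-party psutil package: pip install psutil
import time

import psutil

while True:
    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()
    print(f"RAM {vm.used / 2**30:5.1f} / {vm.total / 2**30:.1f} GiB | "
          f"swap used {sw.used / 2**30:5.1f} GiB")
    time.sleep(5)  # one sample every 5s is plenty for spotting trends
```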
  10. Admittedly, a CPU upgrade would've been my choice before a GPU upgrade coming from Pascal Titans and the X99 chipset, and if you've got the luxury of money to spend, I also very strongly agree with the choice of a 300-series chipset + 9900K over X299. The guidance I've offered is mostly about ruling out the possibility of running into the same issues on newer, expensive hardware due to something like too much resolution. Pinpoint a driver issue that has no real solution, or pinpoint a bottleneck, and there's a bit more certainty. That said, I also get your reasoning if you were determined to get new GPUs anyway with the way stock has been. If you have to recoup the bankroll, a decent AIO cooler now would at least be able to follow you to the new build when you get there, and most fit a variety of Intel sockets. Glad Bitmaster also mentioned msi_util; I downloaded and ran it myself, and it's actually really useful for troubleshooting. IRQ conflicts are a definite possibility, and I'm kind of embarrassed that it's not already part of my troubleshooting routine.
  11. Other things to check: download a large file and watch your CPU usage while downloading (a quick per-core monitor is sketched below). I'm downloading right now at 1Gbps with about 21% total CPU usage spread pretty evenly across cores, with several Chrome windows open, and that's just download, not upload. Speedtest.net should do the trick as well; that got me to 32% CPU usage, but at 1Gbps; on a slower connection I'd likely not see so much CPU usage. If CPU usage is high, I'd suspect network driver issues, IF it's not this: try transferring a large file between your drives. If you see heavy CPU usage in that scenario, I'd suspect SATA controller drivers causing issues. It also might be worth checking chipset drivers. Go through Device Manager and make sure Windows didn't install generic drivers on these components over your manufacturer's drivers. Sometimes it's worth digging up drivers from the manufacturer of the motherboard component rather than the manufacturer of the motherboard, and if anything is going on there, there are probably topics on Google that might lead you to a better driver version. I had to do a lot of this with my 2500K as it got older and stopped getting driver updates; Windows 10 can be picky about drivers, and sometimes Windows updates break them altogether. I even remember having to manually extract drivers from an 8.1 install package to calm my system down before I finally upgraded. It was hangups in various games that eventually pointed me to faulty drivers hogging my CPU. Good luck with the PSU monitoring; if there's anything wrong there, EVGA should hook you up. I'm just tossing ideas into the bucket at this point.
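Here's the per-core monitor I mean, sketched in Python (again assumes the third-party psutil package). An evenly spread load during a download or file copy looks normal; one pegged core, or total usage that climbs with the transfer, points toward a driver problem.

```python
# Quick per-core CPU monitor for the download/file-copy checks above.
# Requires the third-party psutil package: pip install psutil
import psutil

for _ in range(60):  # one sample per second for a minute
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    total = sum(per_core) / len(per_core)
    print(f"total {total:5.1f}% | cores " +
          " ".join(f"{c:3.0f}" for c in per_core))
```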
  12. Sending you a PM to discuss temps/OCing and CPU hardware, as it's really a do-it-at-your-own-risk kind of thing and I can't keep it short and sweet. Watch dem temps. :) The default Tj max on a 5960X is something like 87C, at which point the CPU throttles or downclocks trying to cool off. 70-80C is the preferable sweet spot; 85C is the max I'd allow under full load personally, but I'd aim for lower. The fps bump you managed honestly might be an indicator of a GPU hungry for CPU cycles. That being said, on to seeing if there's a needle in this haystack that might let you solve your issues without added cost, because we all have such different configurations; if you'd like to talk OCing or CPU hardware, feel free to reply to my PM, or maybe start a thread in the PC Hardware section where people smarter than me may chime in. My questions: with the Bluetooth/wireless disabled, has your idle CPU usage changed? And how about single-monitor flight, to rule out tri-screen and DCS not being friendly together for some reason? Also, there's a setting in DCS for the number of monitors; is that set to 3 when running with 3 displays? (Honestly, things we should have checked before hardware changes or CPU speeds.) The good news about the game not running well @ 5760x1080 is that it makes me think there's some solution to this somewhere. We just have to find it.
  13. What I gather is that devs have to send NVIDIA a development version of their game, and NVIDIA uses their AI supercomputer to train the neural network that makes DLSS work, then implements the code and sends it back. So... totally up in the air; again, just wishful thinking on my part. NVIDIA still hasn't released a public download link for the NGX SDK, and there's no telling if ED would be into it. At this point it's all speculation, but it sounds pretty cool based on what I've been reading and watching, and we should have a better idea what it's all about in the coming months. I'm pretty certain I've heard NVIDIA reps mention that if devs owned their own supercomputers they could do this themselves... but those cost like $130,000, so personally I'd be hitting NVIDIA up to do it for free, lol.
  14. My apologies, Goa, for misinterpreting. I won't disagree with you guys' opinions on 20-series cards. The only card in the series that makes any sense to me at all without RTX-specific features is the 2080Ti, and that's only for tri-screen, 21:9, 4K and VR users. I'm definitely not disputing that the 1080Ti is a beast and a better bang for many people's buck right now, as is a 1080 or 1070 depending on one's budget and willingness to compromise. Trust me, I recently helped a friend part out his rig; he was coming from an ancient AMD Athlon system that ran at like 2.5GHz but had a 1070, and he wasn't aware of how badly his CPU was bottlenecking the GPU. He was all like "I'm getting a 1080Ti", playing on a 1080p monitor, and I'm like "Man, hang on to your 1070 for now. Wait till you see what your new PC can do, because it's going to smoke your old rig." And it did: he got a 70fps increase from that alone in another game I won't name cuz forum rules. Finally got him to download DCS too. :)

So I'm not about blindly throwing money at things, and I'm not made of money either. My house was a foreclosure auction buy for less than some people make in a year, and I've only ever owned used vehicles, but I've never bought used PC parts. >.< For my purposes the 2080Ti was the logical choice, I'll still be using this rig when it's no longer considered enthusiast level, and I spend a lot of time with my machine.

Still, I'm on about DLSS and anti-aliasing performance in DCS, whether it comes from Turing, from AMD offering an AI-enhanced AA solution with their next GPUs, or from 30-series NVIDIA GPUs. My mind is in the future, not the present, hence all the "if"s in my previous posts. I don't think there's anybody I fly with who doesn't want higher framerates without shimmers and jaggies in DCS World, albeit my wingmates have mostly adopted VR, but there are 4K users as well. I also know people who use budget PCs and GPUs, and I'd love to say "Hey, this new feature that got implemented would give you slightly better than 1080Ti performance for $519 on a brand-new 2070; you could finally hook up that 1440p display, or get that VR headset you keep asking about 'borrowing' when we aren't using them *cough*."

Mostly, though, I have at times been at odds with MSAA performance in DCS, and although my system runs great without it, I just can't seem to ignore the shimmering textures around me. I heard this "DLSS" thing mentioned once or twice in between all the ray-tracing talk coming out of Jensen Huang, and the more I look into DLSS, the more a bell starts going off in my head like "Ding ding ding, we have a winner." And DCS is honestly the demanding hunk of software I most want to see the effects of such a feature in. I don't even care about ray tracing and where it might take us; DLSS, on the other hand, kind of made me pick up my jaw. So I'm all for DLSS, obviously, as in the long run, repeating myself, I think it's just a win/win. I'm not trying to sell GPUs; I'm looking for a solution to anti-aliasing woes that right now just doesn't otherwise exist for DCS users.
  15. Heck, my 980Ti pushes 80-100fps @3440x1440 without AA over Caucasus, with dips to 55ish with MSAA 2x. Persian Gulf would maybe see 35fps lows over the largest cities. 980Ti performance is comparable to 1070 performance; neither should struggle with 1080p.
  16. Did you stop reading at the part where I said I'm waiting on my 2080Ti? The topic I was responding to was anti-aliasing and its performance in DCS. Using a 2070 as an example, I even said that without DLSS these price tags look pretty sad for Turing; adoption of such a feature might actually let the 2070 outperform the 1080Ti while providing equal or better image quality. I wasn't suggesting anybody go out and buy a 2080Ti, but in response to your comment: for me, not a waste of money at all. Maybe if I were running at 1440p I'd consider the fact that 5.0GHz 8700Ks have shown CPU limitations with the 1080Ti in certain DX11 titles, but I run a 3440x1440 monitor and VR, both of which can put the 2080Ti at 99% utilization in most DX11 titles with my CPU. I'm also coming from a 980Ti; I buy the best GPU I can afford at the time and as a result get to skip one or two generations of GPUs. If I hadn't bought my 21:9 or a VR headset, I wouldn't even be considering an upgrade from my 980Ti, and while $1200 is a much higher asking price than I'd like, I tend to make things last long enough to more than get my money's worth. Every review I've seen shows a 20-30 average fps gain @4K over the 1080Ti. That isn't horrible, even if the price tag may be. DLSS should be what makes the 2080 and 2080Ti leave the 1080Ti in the dust, with the 2070 sprinting slightly ahead of Pascal's best offering. So, on the topic of AA and performance: the only comment I made about the 2080Ti in my previous post is that I'm waiting for one to arrive. :) Not providing or asking for shopping advice.

I do also wonder if maybe there's room for higher texture resolutions, and whether that would reduce the desire for AA on high-resolution monitors. Jagged edges don't bother me; in certain games I hardly notice them. It's the shimmering of treelines and buildings, and the missing wings on aircraft until they get really close, that make me use AA, and I'm wondering how other titles get around this. I don't know enough about the development side to say it's more than a thought based on some reading I've been doing. I'd certainly be willing to test an optional "high resolution texture pack", but part of my keenness for DLSS is that it sounds like less work for the dev team. One way or another, I'm certain I'm not alone in desiring a solution to AA and its performance in DCS, and DLSS sounds very much like a good solution going forward, especially for 4K and VR users; it at least gives us an option not currently available. Of course, we can only wait and see. AI-enhanced graphics sound to me like a possible future that would be silly to ignore if it truly offers the performance benefits it boasts and NVIDIA is willing to do the implementation work just to prove it. The money is in the tensor cores, and DLSS performance should make a $500-600 2070 a lot more interesting, with 1080Tis still running $630-$1000 new, excluding the $1499 EVGA Kingpin. We'll have to wait and see, as the NGX SDK download link still puts you at a "Notify me" submission form for public availability.
  17. Yes to a minimum of 16GB of RAM, and yes to installing DCS on an SSD. My DCS install takes up about 130GB, so if you're limiting yourself to one drive, get at least 500GB so there's enough room for your OS and DCS. SATA III should be sufficient; don't feel like you need an NVME drive or anything. As far as budgeting the rest of your PC parts, that's all up for grabs. If you're patient, it might be worth spreading your component purchases out over time so the higher-priced parts don't hurt as much, but if you want it ASAP, definitely get the best parts you can afford, keeping single-core CPU frequency and VRAM in mind. A 1060 6GB should be okay @1080p. DCS is a resource hog, and people have been brute-forcing it for years, trying to increase performance with systems that would be overkill for other modern titles. It kind of makes sense given the size of the maps and all the objects that render at a distance in DCS.
  18. 10-15% CPU usage at idle sounds a little suspect. As I'm typing this, my 8700K is at 3% utilization; 10-15% on your 5960X would be comparable to the 30% idle CPU utilization on my 2500K if you look at the core count. That'd be enough for me to ask Google why I have high CPU usage at idle, and it sounds like a potential driver issue. Take a look at the performance tab in Task Manager and see if any one of your cores is running at high utilization when idle. Make sure you didn't overlook any drivers from your mobo manufacturer when you reinstalled Windows; with some Googling you might find that for any given one of those drivers there exists a version more friendly with Win10. I actually had to do this every time I reinstalled Windows 10 on my 2500K. I didn't see much difference in gaming performance between that and my new rig, but having drivers with continuing support has made life easier in that regard.

NV Link still requires developer support and therefore likely won't work with DCS, but I hope you find use for your new video cards. Good plan to keep one unopened so you have the chance to return it. I mean, nothing wrong with SLI/NV Link if you run games that support it, but I wouldn't get my hopes up on DCS utilizing that technology any time soon. *edit* I linked a thread at the bottom of this post that seems to contradict what I'm saying here, at least in regards to SLI.

Also, an X299 chipset is a bit on the overkill side in core count but not in clock frequency, as with DCS we're mostly concerned with single-core performance. 8th and 9th gen Intels (Z370 or Z390), or even a Ryzen 2700X, are among the most impressive chips for gaming atm, with the newer Intel chips pushing 5.0GHz on a single core out of the box and easily handling overclocks to those speeds on all cores. As it stands, given at least a 7th gen quad-core, more cores does not equal more gaming performance, although higher clock speed will, up to the point of being limited by the GPU. That route is likely a tad less expensive altogether while offering the clock frequencies that should definitely rule out any CPU bottleneck above a given resolution. A 7th gen quad-core that can push 4.5GHz or more will get better performance out of DCS than 12 cores @3.0GHz. (This could change when the devs implement the Vulkan API, but we have no idea when that's coming or what kind of performance benefits we'll actually see.)

At least we can say this much: when you shop for PC components, you like to go big. ;) Personally, I'd have probably ordered a CPU/RAM/mobo combo to see if that increases what you're getting out of your Titan XPs, as you could probably build an 8th or 9th gen Intel system for around a grand using the same drives, case, GPUs and PSU you're currently using. I also want to clearly state that I did not advise running out and buying $2400 worth of GPU, so if it doesn't change anything, please don't murder me. But combined with a CPU/chipset upgrade (again, 8th gen, 9th gen, Ryzen 2700X) you'll be rocking the latest and greatest gaming hardware known to man. And the 9th gen Intel K-series processors use solder TIM between the silicon and the heatspreader, which should amount to less heat and higher overclocks. An i9 9900K would be overkill, but at least you get high clock frequencies out of it compared to X299, for a lot less money.

A quick Google search for X299 immediately pulled up a $600 motherboard and a $1000 processor that comes stock at 2.9GHz, whereas you can get an i9 9900K for $580, or an i7 9700K for $419 and get away with a Z370 board, which would be like $130-$250 depending on what you get, although Z390 boards offer at least a bit more in the way of features. As it stands with DX11, the 9900K and 9700K would perform about equally in DCS given the same clock frequency; Vulkan could put the 9900K well ahead, as well as Ryzen, but that's not a guarantee.

*edit* https://forums.eagle.ru/showthread.php?t=201454 An interesting thread on SLI; it seems some people do get it to work by changing various settings in NV Profile Inspector, and you hear a lot of talk about CPU bottlenecks in that thread as well. Could be good news for NV Link too, but we'll have to see. IF people actually are able to manage higher performance in SLI, and those same methods also apply to NV Link, then NV Link and the way it shares memory bandwidth should be superior. But grain of salt, YMMV. Personally, I've always just gone for the best single-GPU solution I can afford as long as it isn't overkill for my display setup, the same reason I went for a 3440x1440 monitor vs 3x 1080p monitors: less overall fiddling to make things work, while offering performance I can get by with and some of the additional FOV I was looking for.
  19. AFAIK you're a 1440p user; with SSAA 1.5 and MSAA 2x, aren't you pretty much running @4K with MSAA on? People on triple-monitor setups, 21:9 or 4K are already at high resolution. Supersampling at even 1.5x is a slideshow on my 980Ti on a native 3440x1440 monitor. For SSAA 1.5x just multiply: (1.5 × horizontal resolution) × (1.5 × vertical resolution), and the numbers will probably speak for themselves (quick sketch after this post). An upside of VR, I guess, is that you can change pixel density in increments of 10% instead of 50%; the downside is it won't be as clear a picture as a monitor no matter what you do. More upsides: depth perception (I'm six feet from that wall, or inches away from my HUD after leaning forward) and spotting blurry objects in the sky at a distance. Well, I guess if you supersample through SteamVR rather than the pixel density setting in DCS, you can do it in increments of 1%.

As I wait for my 2080Ti to show up, I can't help but mention that DLSS sounds amazing, like a solution to all of our anti-aliasing performance woes, and I think we should all talk Eagle Dynamics into sending NVIDIA code to get it implemented. It's okay to lol at me for that. The value of the 20-series cards would go way, way up for DCS World pilots. A $500-$600 2070, I can't help but imagine, would be a bit impressive if the tech were put to use, where without it it seems to land almost exactly in the middle between a 1080 and a 1080Ti performance-wise. Without DLSS they just look sad at those price tags.

This turned into a wordy summary of what's been going through my mind with my limited research into DLSS and what it can do. I would like some clarity from NVIDIA on the "we'll implement the code for free" statements from their employees: free for all, or for partnered developers? The SDK is still unavailable for download, but the "Download" button is there for whenever they have it ready for the public; it links you to a "Notify me" submission form at the bottom of the page, which says specifically, "Can be implemented into any game." I'm going to save the rest of my thoughts for threads about RTX features, but the only 20-series feature I'm interested in atm is the future of DLSS. That's an AI-enhanced anti-aliasing feature I believe will be relevant to discussions about AA's effect on performance. I could be completely wrong, though, as hands-on experience isn't readily available to us yet. I'm hoping it lives up to the hype, in which case I'd happily be an advocate for its implementation. I also wouldn't be against the same for AMD if they manage an appropriate response, which is still up in the air, as we don't know if Navi is going to provide anything that beats the 1080Ti's rasterization performance, much less whether they'll feel a need to respond to RTX-specific technology. They've been doing well with CPUs; we'll see with GPUs. This move from NVIDIA is going to be either an impressive advancement in graphics technology, whether via DLSS or lighting effects, or the fumble that gives AMD a chance to catch up, similar to Skylake/Kaby Lake/Coffee Lake and Ryzen. And I'm wondering what AMD's response to the whole AI-enhanced graphics thing will be. (Yeah, this is the short version; too much free time, I know.)
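Here's that multiplication spelled out in a few lines of Python. Note that 2560x1440 at SSAA 1.5x lands exactly on 4K UHD (3840x2160), which is the "pretty much running @4K" point above.

```python
# SSAA render-target math from the post: at SSAA 1.5x, each axis is
# multiplied by 1.5, so total pixel count goes up by 1.5^2 = 2.25x.

def ssaa_target(width, height, factor):
    """Render-target size for a given per-axis SSAA factor."""
    return round(width * factor), round(height * factor)

for name, (w, h) in {"2560x1440": (2560, 1440), "3440x1440": (3440, 1440)}.items():
    sw, sh = ssaa_target(w, h, 1.5)
    print(f"{name} @ SSAA 1.5x -> {sw}x{sh} ({sw * sh / 1e6:.1f} MP)")
```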
  20. A fellow simmer of mine sold one of his two Titan XPs sometime this year due to the lack of SLI support in most games, DCS definitely included. Everybody I know who flies DCS wants better performance right now, although it's at least playable in VR and 4K if you have a beefy GPU/CPU. I wouldn't completely rule out the spikes somehow disrupting the flow of data between CPU and GPU, but we'll have to see after you try 5760x1080, which AFAIK should be playable on a Titan XP; again, I can only speak from my own experience with bottlenecks. Have you paid attention to the terrain around you when you notice the spikes? Near large cities, maybe? If you still run into issues at the lower res, there's more troubleshooting to do.

Things to check if things are still spiking: CPU and memory usage when the system is idle. High CPU or memory usage could indicate faulty drivers, and you can find out which by downloading the Windows Driver Kit so you can use poolmon (Google is your friend) and hunt down driver versions that work better. Windows 10 normally likes to use up to a quarter of my RAM if I'm not using it for something else. On my old i5 2500K rig, which hadn't received driver support since 2010-2013, I ran into a network driver that had me running at 30% total CPU usage at idle (i5 quad-core, mind you) and using up to half my available RAM; the caveat of old hardware that otherwise still runs like a champ. At most you should see spikes of 1-5% total when idle, although the CPU may spike when you load a new window. In my case the culprits ranged from SATA controller drivers to chipset drivers and network drivers, as mentioned before. Windows 10 can be finicky about old drivers; my 2500K's SATA controller required a specific version to get iastore.exe to quit throwing errors in Event Viewer and occasionally choking up my system.

If all that checks out, maybe dig deeper through these forums, as I remember some folks having issues with tri-monitor setups in the DCS 2.1 alpha, though that was like a year ago and I personally suspect it's not the issue. Swap your Titan XPs out to rule out hardware faults (you may have already, but I'm not scrolling through three pages right now), and maybe, if you have a spare PSU lying around with enough wattage, rule out the PSU too. Doubtful; my PSUs either last forever or die instantly from dust or a hairball causing a short. On that note, if you're using any USB hubs, I'd hope they use external power. I've sometimes had them become momentarily disconnected (tapping a cable on accident, or knocking the power cable loose) and gotten weirdness out of my rig until I unplugged everything from them and plugged it back in. Interestingly enough, that caused a bout of DCS freezing up whenever I'd plug in a controller that wasn't plugged in before running the game, where normally it would recognize the new device and let me fly with it. I also use Event Viewer (included in Windows) to check the application and system logs, and Google the error codes if I run into problems. In Windows 10 there's a known system error related to RuntimeBroker (I think) labeled "DistributedCOM", which is safe to ignore IME, but if you're seeing more yellow triangles or red circles in the Windows system logs than that, they could be an indicator of something funky going on and sometimes even lead to solutions. I'm hopeful the lower resolution sorts you out, as it saves you the tedium of troubleshooting routines.
>.> Good luck man. Do tell if you figure it out.
  21. If there is a CPU bottleneck, you generally won't see any improvement in FPS from a more powerful GPU at all. In my experience of running into bottlenecks, it's either one core of the CPU running at 99% constantly (CPU limited) or the GPU sitting at 98-99% constantly (GPU limited, preferable for getting your bang for your buck) during gameplay, but that's just my personal experience (a rough monitoring sketch follows this post). I'm still wondering if your GPU is choking at that resolution. I've been GPU limited in most modern titles on my 980Ti ever since I got my 3440x1440 monitor, and the same with the Odyssey I now use for simming, and it's been a couple of years now on the monitor. I'd expect the same out of Pascal or Turing in many cases. So fly your heart out @4K via one monitor and SSAA 1.5 and see if things change; if it happens at more reasonable resolutions, then you've got other issues on your hands. If the spiking behavior continues, I'd start digging around for errors caused by outdated drivers, or rather drivers without continuing updates/support, checking all my power management settings, using poolmon to look for leaks, and making sure my CPU heatsink was clean and my system wasn't throttling. I've seen CPU heatsinks get so dirty it's less effort to earn the money to replace them entirely. I'd also make sure the GPU fans are clean.

But a 3GHz CPU is on the slow side for single-core speed these days, so that bottleneck can't be ruled out either. At least with today's CPU/GPU combos, increases in resolution will reverse that, making the GPU the bottleneck. So start by seeing what happens at a lower res, to rule out that you aren't whipping your Titan XP like a slave, trying to get it to build you a pyramid on its lonesome while it crumbles under the weight of the stone you want it to move. :)

If the CPU is the bottleneck, you should notice an increase in performance from upping the clock speed of your current CPU. You can run Prime95 or something to see how hot your CPU gets under full load at default clocks, which should give you an idea whether you have any headroom to OC at all with your current cooler. I mean, I wouldn't try to take it to 4GHz on air, but you should still be able to raise it enough to see if there's any performance increase. Look for reputable guides on the subject for your CPU and motherboard combination; I'm sure they exist. But I'm reading that 4.2-4.4GHz is about the limit before it takes too much power to be worth more frequency on the 5960X. If things get better with faster clocks, it should give you an idea whether a CPU upgrade might be worth it, or even alleviate your issues, because at that resolution you shouldn't need extremely high clock speeds; your GPU should be doing the brunt of the work with that many pixels. I do have to admit I love my AIO liquid unit. I have a Corsair H100i, but there are a lot of brands out there. I used the thermal pad that came on the cooler and average about 75C when stress testing my 8700K @4.9GHz. I would still see how things go on a single monitor with SSAA 1.5x and rule everything else out before throwing money at it; I wouldn't want to be the guy who talked you into buying new hardware only for it not to help. Today's CPUs are on another level from the ol' 5960X, though. I'll also say this: if I were on a 1080Ti or Titan XP, I probably wouldn't have bought my 2080Ti, but my 980Ti has been struggling to maintain the FPS I want on my 3440x1440 monitor and VR headset for a while now, and it was time.
Your resolution is way higher than what the majority of people are trying to run DCS or most other games at, and could likely bring even a 2080Ti to its knees. I know the cards are capable of driving up to an 8K resolution (33-something megapixels), but I'd wager that only sees use in professional photo/video work, if at all. Games are still growing out of 1080p standards and jumping for the 4K marks. Most of us aim for the best single-GPU solution for our needs as well, since SLI just doesn't have the developer support it needs; NV Link, while far more impressive, may take the same route in the gaming world. Right now the only real upgrade for your Titan XP, without DLSS support being adopted, is a 2080Ti, and it might not be a huge jump. For more perspective: in recent 1080Ti vs 2080Ti benchmarks we've actually learned that the 1080Ti is powerful enough to be limited by an 8700K @5.0GHz at 2560x1440 in certain DX11 titles. With the 900 series we were running into CPU limitations at 1080p on 4GHz+ CPUs, and it took exceeding 1080p, or in the 1080Ti's case exceeding 1440p, to see the performance difference from the following generation of card. At those higher resolutions you could likely even begin to lower CPU clock speeds to an extent without seeing a decline in framerate.
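For the CPU-limited vs GPU-limited check I described at the top of this post, here's a rough Python sketch of how I'd automate it. It assumes the third-party psutil package and an NVIDIA card with the nvidia-smi command-line tool on PATH; the 97-98% thresholds are my rules of thumb, not anything official.

```python
# Rough bottleneck check: one pegged CPU core with the GPU below ~95%
# suggests a CPU limit; GPU pinned at 98-99% suggests a GPU limit.
# Requires psutil (pip install psutil) and nvidia-smi on PATH.
import subprocess

import psutil

def gpu_percent():
    """Query GPU utilization via nvidia-smi (NVIDIA cards only)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return float(out.decode().splitlines()[0])

while True:  # Ctrl+C to stop; run alongside the game
    cores = psutil.cpu_percent(interval=1, percpu=True)
    gpu = gpu_percent()
    if gpu >= 98:
        verdict = "GPU limited"
    elif max(cores) >= 97 and gpu < 95:
        verdict = "likely CPU limited (one core pegged)"
    else:
        verdict = "no obvious single-component bottleneck"
    print(f"GPU {gpu:3.0f}% | hottest core {max(cores):3.0f}% | {verdict}")
```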
  22. I'm honestly quite curious how the 9600K is going to compare to the 8600K. I wouldn't be caught off guard if they were pretty similar, with just one not requiring a delid to achieve extreme clock speeds at reasonable temps.
  23. Call me surprised if 9GB is all that resolution is using, even over Caucasus. No sarcasm. You say this has been going on for months; was there a point when it wasn't happening? If a single monitor and lower res don't help you out, running "DCS Repair" from the Start menu should also repair the VC runtimes installed with DCS. I also started using Process Lasso myself on Bitmaster's advice throughout these forums, and DCS isn't the only executable I have set to always run in single-threaded performance mode, which basically just sets CPU affinity to the physical cores without you having to go into Task Manager every time you run DCS.exe for the same result (a bare-bones version of that trick is sketched below). Also, is DCS installed on an HDD or an SSD? My instinct says you're using an SSD and that it's not completely full, so that shouldn't be the cause, but it's worth mentioning, as I haven't seen a drive type (sorry if you've mentioned it, and sorry for the basic questions; you sound like a guy who's probably done his research). I'm under the impression you're using Windows 10; it could also be worth seeing how it runs with a system-managed page file. Aside from running poolmon to make sure there aren't any memory leaks from drivers or background tasks that might have popped up with a Windows update, have you considered installing an AIO liquid CPU cooler and OCing the CPU? 3GHz is on the slow end these days, despite being fast for an 8-core processor at the time. A quick Google search found a forum thread with people reporting that 1.15-1.2 Vcore got them to 4.2GHz, though that would be pretty toasty on an air cooler, and single-threaded performance is what matters if a CPU limitation is the cause of your issue. On the point of upgrading from Titan XPs to 2080Tis: for that kind of money you could build a brand-new 9th gen i7 system and slap your current system's drives and GPUs into it. But it's probably worth seeing what kind of solutions you can come up with without throwing money at things, i.e. OC the CPU and see how she goes if none of the above helps. I'll shush now and wish you luck resolving your issues; sorry if I've been no help, but I do hope it's something you can resolve without much expense.
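For the curious, here's roughly what that Process Lasso affinity trick boils down to, sketched with the third-party psutil package. The assumption that hyperthread siblings are the odd-numbered logical CPUs is the common layout on Intel desktop chips, but verify it on your own machine before relying on this.

```python
# Bare-bones version of the affinity trick: pin DCS.exe to one logical
# processor per physical core. Assumes HT siblings are odd-numbered
# logical CPUs (common on Intel desktops; check your own topology).
# Requires the third-party psutil package: pip install psutil
import psutil

physical_cpus = list(range(0, psutil.cpu_count(logical=True), 2))

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "DCS.exe":
        proc.cpu_affinity(physical_cpus)  # restrict to physical cores
        print(f"pinned PID {proc.pid} to logical CPUs {physical_cpus}")
```

Process Lasso just does this automatically every time the executable launches, which is the convenience you're paying for.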
  24. Hehe, I hope your new rig brings you relief from the anticipation and all the excitement you expected. Your current rig probably handles VR better than mine atm, at least until my 2080Ti arrives... although it's going to be hard to fly having traded an arm and a leg for it.
  25. Afterburner works with anything, except maybe overclocking newer EVGA GPUs, and I'm uncertain whether that's still true; we're not talking about overclocking GPUs anyway. You have to play with the settings and what shows up on the on-screen display, but you can definitely get an idea of your CPU and GPU workload with it while flying around in DCS. For seeing what's happening while you game, it's honestly the easiest to configure. Don't chew too much... you're either getting a CPU or a GPU upgrade soon, and you get to check one off your list. ;)