Sub2K Posted September 3, 2018 Author Posted September 3, 2018 @ElementLT, I already had it selected, but I now also selected the other settings that I can see on your screenshot. I'll give them a try. :) i7-9700K 3.6@5.0GHz / 32GB DDR4 3200 / XPG SX8200 SSD / GTX 1080 Ti / 3 x 23" LCDs (5760x1080) / TrackIR 5 / TM T-Flight HOTAS
Tj1376 Posted September 3, 2018 Posted September 3, 2018

@TJ, Actually, since I'm using 3 x 1080p monitors, I'm at 5760.

You're talking about width, which does task the GPU harder but isn't the same as the last number (which in your screenshot is still 1080). Once you move to 1440 resolution, you'll have 25% more pixel density per square inch of monitor space. Or put another way, take two monitors of exactly the same size, one running 1080p and the other 1440p: the 1440p one will require 25% more GPU power, as it has 25% more pixels in the same area as the 1080p monitor. The GPU has to render more pixels in the same space as that 1080p monitor, and that's much tougher on the GPU. Hence, if you really want to see that 1080ti pushed to the limit on your setup, trade those 1080p panels for 1440p panels.

TJ

Sent from my iPhone using Tapatalk
Tippis Posted September 3, 2018 Posted September 3, 2018 (edited)

You're talking about width, which does task the GPU harder but isn't the same as the last number (which in your screenshot is still 1080). Once you move to 1440 resolution, you'll have 25% more pixel density per square inch of monitor space. Or put another way, take two monitors of exactly the same size, one running 1080p and the other 1440p: the 1440p one will require 25% more GPU power, as it has 25% more pixels in the same area as the 1080p monitor. The GPU has to render more pixels in the same space as that 1080p monitor, and that's much tougher on the GPU. Hence, if you really want to see that 1080ti pushed to the limit on your setup, trade those 1080p panels for 1440p panels.

You're confusing a couple of things here. You're right that the sheer number of pixels matters, but screen size and pixel density do not. Nor does vertical resolution take any kind of precedence over horizontal — it's the product of the two that matters, so increasing one is exactly the same as increasing the other. Just because he's running 1080 vertical resolution does not mean he's not putting a lot of strain on the graphics card — he is, just on the horizontal axis instead. In fact, he's doing a lot more so with his setup than what you're suggesting.

At 5760×1080, he's pushing 6.2 Mpx compared to merely 3.7 Mpx for a regular 1440p display — almost 70% more — and it's 25% more than the 3440×1440 resolution Headwarp is describing. Sub2K has already put a lot more strain on the graphics card than just going from a 1080p display to any regular 1440p one.

Edited September 3, 2018 by Tippis

❧ ❧ Inside you are two wolves. One cannot land; the other shoots friendlies. You are a Goon. ❧ ❧
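For anyone who wants to sanity-check the figures being thrown around here, a minimal back-of-the-envelope sketch in Python follows. It only counts frame-buffer pixels; in-game render scale, MSAA and DCS's multi-viewport overhead are left out and will change the real GPU load.

# Frame-buffer pixel counts for the setups discussed in this thread.
resolutions = {
    "3 x 1920x1080 (5760x1080)": (5760, 1080),
    "3440x1440 ultrawide": (3440, 1440),
    "2560x1440 single 1440p": (2560, 1440),
    "1920x1080 single 1080p": (1920, 1080),
}

baseline = 5760 * 1080  # Sub2K's triple-screen setup
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:27} {px / 1e6:5.2f} Mpx ({px / baseline:6.1%} of the triple-1080p setup)")

Running it gives roughly 6.22, 4.95, 3.69 and 2.07 Mpx, which is where the "almost 70% more" and "25% more" figures above come from.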
Tj1376 Posted September 3, 2018 Posted September 3, 2018

You're confusing a couple of things here. You're right that the sheer number of pixels matters, but screen size and pixel density do not. Nor does vertical resolution take any kind of precedence over horizontal — it's the product of the two that matters, so increasing one is exactly the same as increasing the other. Just because he's running 1080 vertical resolution does not mean he's not putting a lot of strain on the graphics card — he is, just on the horizontal axis instead. In fact, he's doing a lot more so with his setup than what you're suggesting. At 5760×1080, he's pushing 6.2 Mpx compared to merely 3.7 Mpx for a regular 1440p display — almost 70% more. He has already put a lot more strain on the graphics card than just going from a 1080p display to a 1440p one.

We were comparing two setups from two people, both with triple monitors. One went to 1440p, the other stayed at 1080p, and the two posters were trying to figure out why the 1080p setup wasn't taxing the GPU like the other poster's 1440p one. My post still stands. The only way he is going to tax the 1080ti GPU further is to increase resolution. And since he already has three screens, that means more pixel density. 1440p it is!

Also, pixel density is everything. If I can cram 25% more pixels into the same space, the GPU will work harder to render that frame. Your first sentence in your second paragraph is factually incorrect. Although I do agree with the rest of your post, I think you just lost the context of the conversation.

TJ

Sent from my iPhone using Tapatalk
Sub2K Posted September 3, 2018 Author Posted September 3, 2018 @TJ, If I'm gonna upgrade my three 1080 monitors to three 1440..... I'm gonna need a loan!... ;) i7-9700K 3.6@5.0GHz / 32GB DDR4 3200 / XPG SX8200 SSD / GTX 1080 Ti / 3 x 23" LCDs (5760x1080) / TrackIR 5 / TM T-Flight HOTAS
Tj1376 Posted September 3, 2018 Posted September 3, 2018 @TJ, If I'm gonna upgrade my three 1080 monitors to three 1440..... I'm gonna need a loan!... ;) #TruthFact! I’m still nursing a 780Ti for a similar reason! TJ Sent from my iPhone using Tapatalk
Tippis Posted September 3, 2018 Posted September 3, 2018 (edited)

We were comparing two setups from two people, both with triple monitors. One went to 1440p, the other stayed at 1080p, and the two posters were trying to figure out why the 1080p setup wasn't taxing the GPU like the other poster's 1440p one. My post still stands.

I'm simply looking at the context of that quote and what Sub2k was responding to. It seemed like you were suggesting that the difference Headwarp saw when he went to 3440×1440 was somehow due to the change in vertical resolution, whereas (by the sound of it) Sub2k should not have seen a change because he remained at 1080. The exchange I followed was:

That being said - my experience ends at the 980Ti - I might learn some things when I do upgrade, hopefully this month. For the life of me - the fact that with more pixels, your 1080Ti is giving the exact same framerate as your 1060, as if you were CPU bound on both cards, where my 980Ti has been at 99% ever since I got a 3440x1440 monitor on the i5 counterpart of your exact build, has me baffled.

Resolution is the reason. You went from 1080p to 1440p- original poster did not (he is still 1080p.) Increasing resolution puts relatively little strain on CPU but tortures the GPU.

Actually, since I'm using 3 x 1080p monitors, I'm at 5760.

I'm saying that the vertical resolution isn't what matters — it's the total number of pixels that need to be pushed that does. The reason Headwarp's card is taxed is because he's pushing out almost 5 Mpx per frame, which is (apparently) at the very edge of what the card can handle. Sub2k is pushing out even more — 6.2 Mpx. It's 69% more than you have to feed a regular 2560×1440 display; it's 26% more than what's needed for the 3440×1440 that Headwarp was talking about; it is (obviously) 3× more than a regular 1920×1080 display.

So the 1440p vs 1080p comparison is a red herring. Between the 5760×1080 and 3440×1440 setups, the former — the 1080p setup — is the more taxing one. More to the point, while it might not max out a 1080 Ti, it should be a severe strain on a 1060, and thus the upgrade from the latter to the former should have made a difference. But it didn't. So either a 1060 is more than enough to process that huge resolution, and the Ti much more so, but in that case he shouldn't suffer such low fps on either card. Or he was pushing the limit of the 1060 (which would explain the low fps there), in which case we have to wonder why he did not see an improvement when he got a graphics card that isn't tortured by that resolution — some other bottleneck is keeping it down, and it's not resolution bound. Or, last option: that large number of pixels is actually enough to choke the Ti, in which case the mind boggles as to how the 1060 managed to keep up the same frame rate.

Also- pixel density is everything. If I can cram 25% more pixels into the same space- the gpu will work harder to render that frame.

No. Pixel density is just a matter for the monitor manufacturer to worry about in terms of maintaining build and image quality. It makes zero difference for the graphics card, because the graphics card does not care about the monitor size — it just pushes pixels. It takes the exact same amount of work to push 3.7 Mpx to a 24" display (122 px/in) as it does to push 3.7 Mpx to a 30" display (98 px/in), even though the former has 25% higher pixel density. The GPU has to work harder if you increase the resolution, but that has nothing to do with pixel density.
Edited September 3, 2018 by Tippis ❧ ❧ Inside you are two wolves. One cannot land; the other shoots friendlies. You are a Goon. ❧ ❧
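To make the 24" vs 30" comparison above concrete, here is a small illustrative sketch (same Python as before, numbers chosen to match Tippis's example): the same 2560x1440 frame has a different physical pixel density on each panel, but the number of pixels the GPU has to render is identical.

import math

def ppi(width_px, height_px, diagonal_in):
    # Physical pixel density: diagonal resolution in pixels over diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

# The same 2560x1440 frame on two differently sized panels.
for diagonal in (24, 30):
    print(f'{diagonal}" panel: {ppi(2560, 1440, diagonal):5.1f} px/in, '
          f'{2560 * 1440 / 1e6:.2f} Mpx for the GPU to render')

That prints about 122 px/in for the 24" panel and 98 px/in for the 30" one, with 3.69 Mpx in both cases.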
Headwarp Posted September 3, 2018 Posted September 3, 2018 (edited)

Resolution is the reason. You went from 1080p to 1440p- original poster did not (he is still 1080p.) Increasing resolution puts relatively little strain on CPU but tortures the GPU. This is also why most folks don't recommend a 1080ti for 1080p gaming- you just aren't going to see any major improvements because at 1080p there still isn't enough CPU power to push that card. TJ Sent from my iPhone using Tapatalk

He's not @ 1080p, he's using 3 screens; that's 5760x1080. Again, something is hogging his CPU and keeping his GPU from working hard, even though he has 2 million more pixels to process than if he were using my single 21:9 monitor. And having experience with 2nd gen intel chips, my guess is a faulty system driver, one that may well be the latest provided by his mobo manufacturer, eating up cpu cycles. His 1060 should have been pegged at that resolution. If you dig hard enough - at least up until I built this system at the end of last year - you can find drivers that work better than others. Google how to find faulty drivers with Poolmon; the Windows developer kit is free.

If I'm wrong, increasing CPU clockspeed should provide an increase in FPS.

Edited September 3, 2018 by Headwarp

Spoiler Win 11 Pro, z790 i9 13900k, RTX 4090 , 64GB DDR 6400GB, OS and DCS are on separate pci-e 4.0 drives Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles. Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener. Obutto R3volution gaming pit.
Tj1376 Posted September 3, 2018 Posted September 3, 2018

I'm simply looking at the context of that quote and what Sub2k was responding to. It seemed like you were suggesting that the difference Headwarp saw when he went to 3440×1440 was somehow due to the change in vertical resolution, whereas (by the sound of it) Sub2k should not have seen a change because he remained at 1080. The exchange I followed was: I'm saying that the vertical resolution isn't what matters — it's the total number of pixels that need to be pushed that does. The reason Headwarp's card is taxed is because he's pushing out almost 5 Mpx per frame, which is (apparently) more than the card can handle. Sub2k is pushing out even more — 6.2 Mpx. It's 69% more than you have to feed a regular 2560×1440 display; it's 26% more than what's needed for the 3440×1440 that Headwarp was talking about; it is (obviously) 3× more than a regular 1920×1080 display. So the 1440p vs 1080p comparison is a red herring. Between the 5760×1080 and 3440×1440 setups, the former — the 1080p setup — is the more taxing one. More to the point, while it might not max out a 1080 Ti, it should be a severe strain on a 1060 and thus the upgrade from the latter to the former should have made a difference. But it didn't. So either a 1060 is more than enough to process that huge resolution, and the Ti much more so, but in that case, he shouldn't suffer such low fps on either card. Or he was pushing the limit of the 1060 (which would explain the low fps there), in which case we have to wonder why he did not see an improvement when he gets a graphics card that isn't tortured by that resolution — some other bottleneck is keeping it down, and it's not resolution bound. No. Pixel density is just a matter for the monitor manufacturer to worry about in terms of maintaining build and image quality. It makes zero difference for the graphics card because the graphics card does not care about the monitor size — it just pushes pixels. It takes the exact same amount of work to push 3.7 Mpx to a 24" display (122 px/in) as it does to push 3.7 Mpx to a 30" display (98 px/in), even though the former has 25% higher pixel density.

I just can't, man. I don't even know where to begin. You obviously can't compare Mpx across two different monitor sizes! Sure, a triple monitor setup will require MORE PIXELS to compute, which requires more GPU. Also, if you increase the pixel density with the same size monitor (a la moving from 1080 to 1440) you'll tax the GPU much harder. Again - more pixels per square inch (say two 27 inch monitors, one at 1080 the other at 1440) causes the GPU to work about 25% harder on each frame for the 1440 due to the increase in pixel density per square inch. These are basic fundamental facts.

His CPU (as posted in a screenshot earlier) is the cause of his poor performance. It's old and showing its age in its single thread performance. Sure, he might clean some drivers up and get a small increase, but an old Sandy Bridge won't power his 1060, let alone the 1080ti. Again, at 1080. Increase your resolution and you can tax that GPU much harder - say by moving to three same size 1440p monitors. Or add a fourth 1080... I didn't realize nvidia allows four monitors on the 10xx series.

I'm going to let this die. It's clear from your long post (and the multiple edits that took place while I wrote this quick reply) that you have time to argue and debate. I'm not interested.
I was here to help a guy understand how his 2600k was his bottleneck and what he might do to improve it. I’ve proven that point and will move on. Good day. TJ Sent from my iPhone using Tapatalk
Headwarp Posted September 3, 2018 Posted September 3, 2018 (edited)

I just can't, man. I don't even know where to begin. You obviously can't compare Mpx across two different monitor sizes! Sure, a triple monitor setup will require MORE PIXELS to compute, which requires more GPU. Also, if you increase the pixel density with the same size monitor (a la moving from 1080 to 1440) you'll tax the GPU much harder. Again - more pixels per square inch (say two 27 inch monitors, one at 1080 the other at 1440) causes the GPU to work about 25% harder on each frame for the 1440 due to the increase in pixel density per square inch. These are basic fundamental facts. His CPU (as posted in a screenshot earlier) is the cause of his poor performance. It's old and showing its age in its single thread performance. Sure, he might clean some drivers up and get a small increase, but an old Sandy Bridge won't power his 1060, let alone the 1080ti. Again, at 1080. Increase your resolution and you can tax that GPU much harder - say by moving to three same size 1440p monitors. Or add a fourth 1080... I didn't realize nvidia allows four monitors on the 10xx series. I'm going to let this die. It's clear from your long post (and the multiple edits that took place while I wrote this quick reply) that you have time to argue and debate. I'm not interested. I was here to help a guy understand how his 2600k was his bottleneck and what he might do to improve it. I've proven that point and will move on. Good day. TJ Sent from my iPhone using Tapatalk

Dude, if his drivers are faulty and causing high CPU usage, cleaning them up will increase performance in a CPU hogging game by multitudes. I've experienced the issue and what it can do to gaming. I never once mentioned going from 1080p to 1440p, and 3440x1440 isn't 1440p lol. What I did say is that modern cards are powerful enough that @ 1080p (which is 16:9 1920x1080, not 5760x1080) CPUs under a certain clockspeed are bottlenecked. Even a faulty network driver can hog CPU cycles and drastically reduce performance on a machine. In fact the only difference between his rig and my 2500k is hyperthreading, resolution (of which he has more), and his GPUs, beyond perhaps different branding of the mobo/ram components.

With that res his 1060 should have been pegged... but the fact that NO fps increase came from the upgrade hints at that same CPU bottleneck being present before the GPU swap, which simply doesn't make sense, considering again that a 980TI at lower resolution, more powerful than a 1060, was not held back by a 4.3ghz second gen intel cpu. Again, if it's JUST his CPU, increasing clock speed would net pretty large FPS increases. You've not really proven a thing; there are multitudes of things that can cause high cpu usage, and it takes more than looking at task manager and coming to a conclusion.

The fact that I didn't get an increase going from a 4.3ghz 2500k to a 4.7ghz 8700k tells me that I could downclock to a lower CPU speed than 4.3ghz and not take an fps hit, because my GPU is maxed. If he were to plug his 1060 back in and it not be pegged at 99% at that resolution - I'd bet money it was one of the janky drivers I was talking about. I'd be willing to bet that if you overclocked that 3770k of yours to 4.5 or more ghz you'd see increases in fps without paying for more than a new cpu cooler, provided you have the best working drivers for your system.

@Sub2k don't buy 3 1440P monitors. ;P But you can try supersampling in game or DSR from the NV control panel to achieve higher resolutions without it looking bad, forcing more strain on the GPU.

Anytime I encounter fishy behavior on my PC, even if it's only in ONE cpu hogging game, troubleshooting steps start at making sure I have the best working drivers for my system and no CPU throttling, which usually isn't much of an issue with newer hardware. Only after I'm sure all of that is correctly installed, and that I'm not suffering memory leaks or CPU hogging faulty drivers, do I start blaming the hardware itself. This is something I had to do regularly with my 2500k; as I wipe and reinstall everything on my rigs once or twice a year, every time I did, I had to go through some routine with my 2nd gen intel board to find the best set of drivers. It's one of the things that sold me on finally upgrading: getting current driver updates. As otherwise, sandy bridge is still an overclocking beast.

Edited September 3, 2018 by Headwarp

Spoiler Win 11 Pro, z790 i9 13900k, RTX 4090 , 64GB DDR 6400GB, OS and DCS are on separate pci-e 4.0 drives Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles. Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener. Obutto R3volution gaming pit.
Tj1376 Posted September 3, 2018 Posted September 3, 2018

Dude, if his drivers are faulty and causing high CPU usage, cleaning them up will increase performance in a CPU hogging game by multitudes. I've experienced the issue and what it can do to gaming. I never once mentioned going from 1080p to 1440p, and 3440x1440 isn't 1440p lol. What I did say is that modern cards are powerful enough that @ 1080p CPUs under a certain clockspeed are bottlenecked. Even a faulty network driver can hog CPU cycles and drastically reduce performance on a machine. In fact the only difference between his rig and my 2500k is hyperthreading, resolution (of which he has more), and his GPUs. With that res his 1060 should have been pegged... but the fact that NO fps increase came from the upgrade, shows that it wasn't. Again, if it's JUST his CPU, increasing clock speed would net FPS increases. You've not really proven a thing; there are multitudes of things that can cause high cpu usage, and it takes more than looking at task manager and coming to a conclusion. The fact that I didn't get an increase going from a 4.3ghz 2500k to a 4.7ghz 8700k tells me that I could downclock to a lower CPU speed and not take an fps hit because my GPU is maxed. If he were to plug his 1060 back in and it not be pegged at 99% at that resolution - I'd bet money it was one of the janky drivers I was talking about. I'd be willing to bet that if you overclocked that 3770k of yours to 4.5 or more ghz you'd see increases in fps without paying for more than a new cpu cooler, provided you have the best working drivers for your system.

Show me a driver issue that nets you even a ten percent change in performance and I'll be amazed. This isn't an AMD platform we are talking about here. :)

TJ

Sent from my iPhone using Tapatalk
Tippis Posted September 3, 2018 Posted September 3, 2018 (edited)

I just can't, man. I don't even know where to begin. You obviously can't compare Mpx across two different monitor sizes!

Of course you can, because monitor size simply does not matter. Pixel density is a physical characteristic of the monitor build — stuff the graphics card couldn't care less about. If you push an image of a given resolution to a 10" screen, it will require exactly the same amount of GPU power as if the same resolution was displayed on a 100" screen, irrespective of the much lower pixel density of the latter. If you pushed a 25% larger image to a 25% larger screen, you'd need 25% more GPU power, even though the pixel density would be exactly the same.

Also, if you increase the pixel density with the same size monitor (a la moving from 1080 to 1440) you'll tax the GPU much harder.

Close, but not quite. If you move to a higher resolution, you tax the GPU more, but again, monitor size and pixel density are not factors in that increased load. Only the resolution itself matters.

Again, at 1080. Increase your resolution and you can tax that GPU much harder - say by moving to three same size 1440p monitors. Or add a fourth 1080.

Exactly. The whole 1080 vs 1440 is not really what matters — it's the total resolution, and his setup means that he's already running at a resolution that taxes the GPU pretty hard. Further increasing this resolution might tax the card more — no surprise there — but that doesn't really help resolve anything. Yes, CPU limitations are a likely candidate for explaining the lack of improvement, just as the screwy multi-monitor rendering suggested by toutenglisse, or any of a number of driver issues as Headwarp suggests. That was never really the question — just that even with his current setup, he is pushing the card harder than the 1440 setup that it was compared against, and that you can't just look at vertical resolution and conclude which one is more punishing for the graphics card.

Edited September 3, 2018 by Tippis

❧ ❧ Inside you are two wolves. One cannot land; the other shoots friendlies. You are a Goon. ❧ ❧
Headwarp Posted September 3, 2018 Posted September 3, 2018 (edited)

Show me a driver issue that nets you even a ten percent change in performance and I'll be amazed. This isn't an AMD platform we are talking about here. :) TJ Sent from my iPhone using Tapatalk

Man, 2nd gen drivers are from 2010 and 2013. I promise you, there's at least one that causes just about any process to spike to 99% cpu usage under what normally wouldn't be a heavy load. I've had to hunt down drivers to fix that often enough to know. The main culprits again were sata controller drivers and network drivers. Try a fresh windows install without installing any of the drivers for your motherboard at all. You might get lucky and Windows 10 has them in its database. But if not - you can watch the cpu go to 99% generally just by moving your mouse.

I'm not completely ruling out the 1080Ti being cpu bottlenecked @ 4.3 ghz, but at the same time - his 1060 should not have been CPU bottlenecked at his high resolution, though it would struggle with it, which would at least give SOME increase upon swapping to a more powerful gpu, up to the point of the cpu not being fast enough. I've had to troubleshoot a 2nd gen intel system enough to say - check for driver faults with Poolmon; it's pretty informative. You might find an older driver works better than the latest one.

Here, I did some googling for you. Please check the troubleshooting steps listed in these guides for high cpu usage errors/system interrupts, including "Update device drivers": https://blog.pcrisk.com/windows/12795-system-interrupts-causing-high-cpu-usage https://superuser.com/questions/1133501/25-cpu-usage-at-idle-windows-10-system-interrupts

I mean, drivers only tell your hardware how to operate, even your CPU itself. You can't imagine errors in driver software causing problems?

Edited September 3, 2018 by Headwarp

Spoiler Win 11 Pro, z790 i9 13900k, RTX 4090 , 64GB DDR 6400GB, OS and DCS are on separate pci-e 4.0 drives Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles. Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener. Obutto R3volution gaming pit.
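Poolmon itself ships with the Windows Driver Kit and only names a pool tag, not the driver behind it. The usual follow-up step is to search the driver binaries for that tag. Below is a minimal Python sketch of that lookup; the tag 'Abcd' is a placeholder for whatever four-character tag Poolmon flags on your machine, not something taken from this thread.

import pathlib

TAG = b"Abcd"  # placeholder: substitute the four-character pool tag Poolmon reports
DRIVER_DIR = pathlib.Path(r"C:\Windows\System32\drivers")

# Pool tags are embedded as literal bytes in the driver image, so a plain
# substring search over the .sys files usually points at the owning driver.
for sys_file in DRIVER_DIR.glob("*.sys"):
    try:
        if TAG in sys_file.read_bytes():
            print(sys_file.name)
    except OSError:
        pass  # some driver files can't be read without elevation

The same lookup can be done with findstr from an elevated prompt; the point is just to turn Poolmon's tag into a driver name you can then update or roll back.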
Tj1376 Posted September 3, 2018 Posted September 3, 2018

Of course you can, because monitor size simply does not matter. Pixel density is a physical characteristic of the monitor build — stuff the graphics card couldn't care less about. If you push an image of a given resolution to a 10" screen, it will require exactly the same amount of GPU power as if the same resolution was displayed on a 100" screen, irrespective of the much lower pixel density of the latter. If you pushed a 25% larger image to a 25% larger screen, you'd need 25% more GPU power, even though the pixel density would be exactly the same.

But this isn't even the argument we are making.... We are discussing two monitors (well, in this case six monitors) of exactly the same size. Three of those monitors are 1080p, three are 1440p. If you are going to sit there and tell me that the 1440p monitors won't tax the GPU harder, then there really isn't anything I'm going to be able to say to convince you otherwise. My post here was telling the OP that if he wants to see that 1080ti tremble at the knees, he'd have to increase his resolution - because there is nothing his CPU will be able to do to increase FPS in any significant way.

Close, but not quite. If you move to a higher resolution, you tax the GPU more, but again, monitor size and pixel density are not factors in that increased load. Only the resolution itself matters. Exactly. The whole 1080 vs 1440 is not really what matters — it's the total resolution, and his setup means that he's already running at a resolution that taxes the GPU pretty hard. Further increasing this resolution might tax the card more — no surprise there — but that doesn't really help resolve anything. Yes, CPU limitations are a likely candidate for explaining the lack of improvement, just as the screwy multi-monitor rendering suggested by toutenglisse, or any of a number of driver issues as Headwarp suggests. That was never really the question — just that even with his current setup, he is pushing the card harder than the 1440 setup that it was compared against and that you can't just look at vertical resolution and conclude which one is more punishing for the graphics card.

I agree with all of this, except the last paragraph second sentence. I was suggesting to increase monitor resolution by moving to 1440 - the OP agreed with his "I'd need a loan" comment. This wouldn't tax the CPU anywhere near as much as it would hurt the GPU. (Of course I also don't recommend DCS in 1440p - plane spotting is hard enough in 1080! That increase in pixel density makes it much harder for your physical eye to see the speck on the horizon!)

Cheers - I'm off to the day job.

TJ

Sent from my iPhone using Tapatalk
Tj1376 Posted September 3, 2018 Posted September 3, 2018

Man, 2nd gen drivers are from 2010 and 2013. I promise you, there's at least one that causes just about any process to spike to 99% cpu usage under what normally wouldn't be a heavy load. I've had to hunt down drivers to fix that often enough to know. The main culprits again were sata controller drivers and network drivers. Try a fresh windows install without installing any of the drivers for your motherboard at all. You might get lucky and Windows 10 has them in its database. But if not - you can watch the cpu go to 99% generally just by moving your mouse. I'm not completely ruling out the 1080Ti being cpu bottlenecked @ 4.3 ghz, but at the same time - his 1060 should not have been CPU bottlenecked at his high resolution, though it would struggle with it, which would at least give SOME increase upon swapping to a more powerful gpu, up to the point of the cpu not being fast enough. I've had to troubleshoot a 2nd gen intel system enough to say - check for driver faults with Poolmon; it's pretty informative. You might find an older driver works better than the latest one.

Yeah, but in your scenario I imagine he'd have a horrible experience with his machine in general. I doubt he would have just dropped all that coin to upgrade to a 1080ti if his overall machine was horrible. Instead, it's exactly this that I've seen time and time again on these forums: 2600K folks upgrading to a 1080 or 1080ti and not understanding why they don't see a difference in performance. This screen grab of the 2600k cpu running DCS is exactly why the 1080ti makes no (or little) difference. There is no headroom in the single thread CPU performance, hence the CPU bottleneck when running DCS.

Sent from my iPhone using Tapatalk
Tippis Posted September 3, 2018 Posted September 3, 2018 (edited)

But this isn't even the argument we are making.... We are discussing two monitors (well, in this case six monitors) of exactly the same size. Three of those monitors are 1080p, three are 1440p. If you are going to sit there and tell me that the 1440p monitors won't tax the GPU harder, then there really isn't anything I'm going to be able to say to convince you otherwise.

No, I'm simply telling you that the size is irrelevant, as is the pixel density (since that's a function of size) — only the resolution matters. My point was that his resolution was already significant — indeed higher than the one it was compared against — and making his graphics card tremble wouldn't exactly improve his FPS. In addition, if taxing the GPU was the end goal, then based on what we've seen so far, DCS isn't really the right software to do that in because something else is holding him back. And in other games, he can achieve the same effect simply by ramping up graphics quality until that resolution becomes an issue for the rendering pipeline (which often can be done without causing any additional CPU load). My other point was that, if you're going to suggest things that get more out of the graphics card, then something as irrelevant as pixel density isn't a good choice because, again, it's not a factor.

Oh, and…

I agree with all of this, except the last paragraph second sentence

If you mean the part about how "with his current setup, he is pushing the card harder than the 1440 setup that it was compared against and that you can't just look at vertical resolution and conclude which one is more punishing for the graphics card" then I'm sorry, but that's just a fact. 5760×1080 is more taxing than 3440×1440, and vertical resolution alone simply cannot tell us which is tougher to render because it is, quite literally, only half of the equation.

Edited September 3, 2018 by Tippis

❧ ❧ Inside you are two wolves. One cannot land; the other shoots friendlies. You are a Goon. ❧ ❧
Tj1376 Posted September 3, 2018 Posted September 3, 2018

No, I'm simply telling you that the size is irrelevant, as is the pixel density (since that's a function of size) — only the resolution matters. My point was that his resolution was already significant — indeed higher than the one it was compared against — and making his graphics card tremble wouldn't exactly improve his FPS. In addition, if taxing the GPU was the end goal, then based on what we've seen so far, DCS isn't really the right software to do that in because something else is holding him back. And in other games, he can achieve the same effect simply by ramping up graphics quality until that resolution becomes an issue for the rendering pipeline. My other point was that, if you're going to suggest things that get more out of the graphics card, then something as irrelevant as pixel density isn't a good choice because, again, it's not a factor.

Agree to disagree.

TJ

Sent from my iPhone using Tapatalk
Headwarp Posted September 3, 2018 Posted September 3, 2018 (edited)

Yeah, but in your scenario I imagine he'd have a horrible experience with his machine in general. I doubt he would have just dropped all that coin to upgrade to a 1080ti if his overall machine was horrible. Instead, it's exactly this that I've seen time and time again on these forums: 2600K folks upgrading to a 1080 or 1080ti and not understanding why they don't see a difference in performance. This screen grab of the 2600k cpu running DCS is exactly why the 1080ti makes no (or little) difference. There is no headroom in the single thread CPU performance, hence the CPU bottleneck when running DCS. Sent from my iPhone using Tapatalk

You'd be surprised. There were games I played with a throttling CPU due to heat, before I ever OC'd my 2500k, that ran fine, while cpu intensive games crapped on themselves, indicating to me I needed to troubleshoot. There were games that didn't care that my faulty drivers were stealing CPU cycles, while others would run at lower FPS if not grind to a freezing halt, sometimes recovering.

That screen grab only states that one core of the cpu is taxed at 99%, not the cause of it being so. The question is - was he also at 99% on 1 core of his cpu with low GPU utilization on the 1060? If absolutely zero fps increase between cards, that sounds like yes. If so - that indicates something going on that doesn't quite make sense, given the results I had on a comparable machine running a lower resolution on a more powerful GPU, also a 2nd gen intel, where I had to be super picky about which drivers I installed or I'd run into issues in games that made the CPU do any work.

There could be more to the cause of it. There could be faulty drivers causing the cpu to endure more stress than it otherwise would. It could be malware, it could be Windows Power Management rather than nvidia power management, it could be anti-virus software running in the background, windows defender, it could be a number of things. Windows 10 might have finally broken all remaining drivers for chipsets that old. Who knows. Regardless - if the cpu is a bottleneck not as a result of faulty drivers or system settings - clockspeed increases should net performance gains, which that sandy bridge could probably do with a more efficient CPU cooler.

Running an old system with drivers that are not updated further requires a bit more in terms of maintenance than newer hardware with continuing support. Wrapping my head around that let me game on a 2500k for 5+ years, feeling like a brand new machine when I finally OC'd it long after it was out of warranty. I'd still be doing it if I had opted for a 1080Ti rather than a CPU upgrade. It took me 6 generations to be convinced to upgrade my CPU. 2 more cores, stock clocks faster than my 2500k was OC'd on air. Not having to remember which specific driver version I needed for a stable system. If it were just gaming performance, then I wasted $1200 on a new rig.

Edited September 4, 2018 by Headwarp

Spoiler Win 11 Pro, z790 i9 13900k, RTX 4090 , 64GB DDR 6400GB, OS and DCS are on separate pci-e 4.0 drives Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles. Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener. Obutto R3volution gaming pit.
Calinho Posted September 4, 2018 Posted September 4, 2018 (edited)

Just crank that 2600k to 4.8GHz. My i7 2700k runs @ 4.8GHz 24/7, air cooled, with 1.37V. I have very stable FPS, 70+ FPS (GTX 1070) at 1080p, with RAID 0 SATA3 SSDs and 16GB DDR3 1866 CL9, almost everything maxed out.

Edited September 4, 2018 by Calinho
Mars Exulte Posted September 4, 2018 Posted September 4, 2018 That CPU was launched in 2011, and discontinued in 2013. You have a current gen 1080ti. You are CPU bottlenecked. Де вороги, знайдуться козаки їх перемогти. 5800x3d * 3090 * 64gb * Reverb G2
Headwarp Posted September 4, 2018 Posted September 4, 2018 (edited)

Screen grab came from this video (turn down your sound, the music is awful). The 8700k gets a huge leap in multi-core performance, but the single threaded performance of a 2600k @ 4.7ghz is almost up there with a stock 8700k turboing all of its 6 cores to 4.4ghz. I'm only using it because it's about the only thing I can find directly comparing a stock 8700k to an OC'd 2600k. Now, the 8700k does give improved performance, but the OC'd 2600k isn't detrimentally behind the stock 8700k CPU.

4.3ghz is likely bottlenecking the 1080Ti. But to the point of 22 fps? The same 22 fps you got on a 1060? Very coincidental that your cpu bottleneck would be right at the maximum capability of a gpu that should perform worse than a high end 9 series card. Crank the clockspeeds of your CPU and see if it helps; if not - I'm still wagering drivers. Bear in mind DX12 benchmarks will likely make better use of multi-core (like Vulkan should), whereas in DX11 titles the two CPUs remain pretty close in their single threaded performance. Pretty close isn't that bad if you aren't happy at the idea of investing another $1300 USD or more into your gaming rig.

There could also be some truth to using Process Lasso to ensure DCS is using physical cores rather than logical cores, which should give the same effect as disabling hyperthreading while playing DCS, but still offering the extra threads for Windows and background tasks. Being bottlenecked at the GPU myself, I haven't stumbled across a need to find out.

Edited September 4, 2018 by Headwarp

Spoiler Win 11 Pro, z790 i9 13900k, RTX 4090 , 64GB DDR 6400GB, OS and DCS are on separate pci-e 4.0 drives Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles. Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener. Obutto R3volution gaming pit.
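As a footnote to the Process Lasso idea above, here is a minimal Python sketch of the same trick using psutil. The process name DCS.exe and the assumption that logical processors pair up as 0/1, 2/3 and so on are guesses about a typical hyperthreaded Intel layout, not something taken from this thread; verify your own core topology (and whether it helps at all) before relying on it.

import psutil

# Rough equivalent of the Process Lasso suggestion: pin the sim to every other
# logical processor, i.e. roughly one thread per physical core on a typical
# hyperthreaded Intel CPU. The 0/1, 2/3 pairing is an assumption.
one_per_physical_core = list(range(0, psutil.cpu_count(logical=True), 2))

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == "dcs.exe":  # process name is an assumption
        proc.cpu_affinity(one_per_physical_core)
        print(f"Pinned PID {proc.pid} to logical CPUs {one_per_physical_core}")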
David OC Posted September 4, 2018 Posted September 4, 2018

Have you tried setting this up a different way? I cannot see that you have posted how you set up the 3 screens in DCS. Are you using Nvidia Surround? All the old posts and old DCS point to...

Quote LINK "GTX 1080 FTW I'm only getting 70% GPU utilization and about 20-30 FPS." "UPDATE" "My three lower monitors that are joined via Nvidia Surround were plugged into my gtx 1080 and the 4th monitor was plugged into my second GPU (non-sli). I plugged my 4th monitor into the gtx 1080 and BAM! 60fps! The GPU is not running at 99%, but the 1080 can easily maintain 60fps even using the "High" preset in the graphics settings." End Quote LINK

Have you tried any of these old settings on 2.5 now to see if you gain 40 fps?

i7-7700K OC @ 5Ghz | ASUS IX Hero MB | ASUS GTX 1080 Ti STRIX | 32GB Corsair 3000Mhz | Corsair H100i V2 Radiator | Samsung 960 EVO M.2 NVMe 500G SSD | Samsung 850 EVO 500G SSD | Corsair HX850i Platinum 850W | Oculus Rift | ASUS PG278Q 27-inch, 2560 x 1440, G-SYNC, 144Hz, 1ms | VKB Gunfighter Pro

Chuck's DCS Tutorial Library Download PDF Tutorial guides to help get up to speed with aircraft quickly and also great for taking a good look at the aircraft available for DCS before purchasing. Link
Sub2K Posted September 5, 2018 Author Posted September 5, 2018

@ David OC, My 3 x 1080p monitors are all connected with individual DisplayPort cables to the 1080 Ti. I no longer use Nvidia Surround as I found it to be a bit 'flaky' at times. I simply set up my monitors as '3 Screen' with a 5.33 aspect ratio in DCS.

i7-9700K 3.6@5.0GHz / 32GB DDR4 3200 / XPG SX8200 SSD / GTX 1080 Ti / 3 x 23" LCDs (5760x1080) / TrackIR 5 / TM T-Flight HOTAS
vortexringstate Posted September 11, 2018 Posted September 11, 2018

I've just put in a new 1080ti and although it's light years better than my old 1060, my FPS is locked at 45. Only in the start screen does it sit at 90. I've been tweaking all the Gfx options up and up but nothing changes the FPS...?! Someone suggested unchecking full screen (didn't make any difference). Help please

Oculus Rift i7-7700 3.6Ghz 16GB Ram 1080Ti 11GB SSDx2
Sub2K Posted November 10, 2018 Author Posted November 10, 2018

I don't want to resurrect this old thread, but I want to pass on that the helpful folks here were right: I upgraded to an i7-8700K, and my DCS frame rate has easily more than doubled. That definitely confirms that I was greatly CPU-bound with my good ol' 2600K... :(

i7-9700K 3.6@5.0GHz / 32GB DDR4 3200 / XPG SX8200 SSD / GTX 1080 Ti / 3 x 23" LCDs (5760x1080) / TrackIR 5 / TM T-Flight HOTAS