Everything posted by Headwarp
-
I'm all for it. I want to know how much truth there is behind Nvidia offering to implement DLSS for free if given a development copy of a game to work with. Adding DLSS wouldn't negatively affect current users of non-capable cards in any way, yet it paves the road for future cards. I think AMD needs to step up their game and come up with a response, but if DLSS offers the performance benefits shown in the reviews I've watched, it could only help DCS World to support it, both for people already buying 20-series cards and for future Nvidia cards down the road. DCS seems like a game that would benefit the most from AA being offloaded to the tensor cores.
-
Hehe, on my Odyssey I have to compromise texture quality and stay at 1.0 PD for a constant 45 fps. Hoping for high/high settings and perhaps a PD increase, but we'll see. I'm eager to see what things will look like when I can get away with added pixel density on top of the native 1440x1600 per eye. I'm thinking SSAA + PD increases could be rough. Other than that, just kind of dreaming ED will be like "DLSS sounds awesome."
-
We would, but that's up to the devs to submit their code to nVidia for DLSS implementation. I'd love this honestly, but don't hold your breath.
-
Try running without MSAA and enable SSAA. It basically increases the render resolution, which should get around any CPU limitations at 1440p (quick numbers sketched below). In reviews I've watched, an 8700K at 5.0 GHz has ended up CPU-limited with a 1080Ti at 1080p and 1440p in some DX11 titles. 4K should be a different story, and I'd imagine VR as well (slightly higher pixel density). I don't know if it will provide framerates as high as your 1080Ti/8700K did at plain 1440p, but with SSAA enabled on both cards the 2080Ti should come out ahead of the 1080Ti at 1440p. If your only interest is DCS as it is right now, make sure you're already GPU-limited before upgrading; a 2080Ti should offer better performance with enough supersampling to put the 1080Ti at 99%, though you may still be happy with the 1080Ti's results in that regard. The 2080Ti is a 4K card. Until the 1080Ti, 1080p was the resolution where we were most likely to see a CPU limitation; now apparently we've moved up to 5 GHz and 1440p, disregarding VR.
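For a rough sense of why SSAA shifts the load onto the GPU, here's a minimal sketch of the pixel counts involved. It assumes the SSAA factor applies per axis; I can't confirm that's exactly how DCS applies its factor, so treat it as illustrative.

```python
# Rough illustration: supersampling multiplies the pixels the GPU must shade,
# which pushes the bottleneck away from the CPU. The factors shown are
# examples, not necessarily the exact SSAA options DCS exposes.

def pixels(width, height, ssaa_per_axis=1.0):
    """Total pixels rendered per frame for a given per-axis SSAA factor."""
    return int(width * ssaa_per_axis) * int(height * ssaa_per_axis)

base  = pixels(2560, 1440)        # plain 1440p
ss_15 = pixels(2560, 1440, 1.5)   # 1.5x per axis = 2.25x the pixels (4K-class load)
ss_20 = pixels(2560, 1440, 2.0)   # 2x per axis = 4x the pixels

print(f"1440p:          {base:>12,} px")
print(f"1440p SSAA 1.5: {ss_15:>12,} px")
print(f"1440p SSAA 2.0: {ss_20:>12,} px")
```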
-
I've seen at least one review where a guy even shows CPU limitations on an 8700K @ 5.0 GHz at 1080p and 1440p with a 1080Ti (in some games; others were more GPU-hungry). I'm not going to link it because it covers other games and I don't want to break any forum rules, but the 2080 and 2080Ti are the latest UWQHD/4K cards. I'm watching a review right now where the 2080 is gaining 20-30+ fps at 4K using DLSS over a 1080Ti using TAA in more than one game, and 40-45+ fps at 4K on the 2080Ti. So again, to me, the value is in whether or not game developers will step up and utilize this tech to improve performance with a new AA rendering technique on hardware designed for it. Otherwise we just aren't utilizing its full potential.

I honestly think Nvidia should have put more effort into hyping DLSS than ray tracing. Ray tracing looks like it's going to take die shrinks to make room for more RT cores before it becomes mainstream at framerates gamers tend to find acceptable. The 2080 without DLSS? On par with a 1080Ti. The 2080 with DLSS? 30 fps can mean a lot, especially if you're already sub-60 fps, or using, say, VR. Right now, as it stands, don't expect the 2080 to outperform a 1080Ti in DCS by much, if at all. If you're not already CPU-limited, a 2080Ti will offer performance increases, but not quite $600 more performance. Buying a 2080 or 2080Ti right now is mostly about helping to adopt a new technology and huge epeen, or increased high-res/VR performance. But if somehow we managed to get DLSS implemented, the 2080 would smoke the 1080Ti. You can PM me if you want links to the reviews I've mentioned.

Also, things to consider are VR headset resolutions. I've a mate who says he gets away with about 1.2 PD with high settings to achieve 45 fps in DCS VR on his Vive Pro; on my 980Ti I have to deal with lower textures/shadows to achieve the same at 1.0 PD on my Odyssey (same native resolution as the Vive Pro). Prior to that he was running a Vive above default PD on a 980Ti. You can also set supersampling from SteamVR itself, so there are a lot of factors there. For my purposes the 2080Ti is the fastest option available to crank those settings up, and DCS is not the only use I get out of a GPU. For me, buying Pascal just makes less sense; I don't buy many things new other than my PC components, and I tend to make them last. I'm obviously drooling over what DLSS could do for that experience in DCS. For the time being, for someone who primarily uses their PC for DCS, a used 1080Ti is probably the best bang per buck. Chances are, unfortunately, we won't see DLSS at all (even if I want ED to be like "HAH, you're wrong!").
-
It's the same for me. I wasn't going to replace my 980Ti with a 10-series card as a matter of principle. And, like any other time I've upgraded my GPU (previously a GTX 680 4GB, and prior to that a GTX 285), I'm buying the most powerful single-GPU solution I can (albeit this time I'm spending way more money), and it will likely last me 3-5 years. I'll get my money's worth. Nobody's saying someone running a 1080Ti or another 10-series card should run out and upgrade. Nobody's saying you should feel bad about buying a used 1080Ti if you can't afford a $1200 GPU, either. Just like nobody's going to make me feel bad about my upcoming GPU upgrade.

Ray tracing is cool and all, but again, the potential of DLSS could be pretty groundbreaking/game-changing on its own, and it seems like Nvidia is poised to go out of their way to help developers implement it with minimal effort on the developers' part. What these tensor cores are doing is, like, the biggest change to how graphics are handled since the release of the Monster Voodoo, which also took the support of game developers to become as popular as it was when "3D gaming" first came to light. Provided DLSS offers the performance boost we witnessed at Gamescom over TAA, it won't be until developers begin adopting that capability that we really see what these cards can do. So far, the potential of DLSS says that if a gamer's selection of games implements support for this feature, the performance value of the 2080 to that gamer goes way up. We'll still have to wait and see, but to me that's nothing to scoff at. That's a new set of hardware taking over the role of traditional AA, taking the load off of whatever part of the GPU handled it in the past.

As someone who planned to upgrade GPUs whenever the next thing came out anyway, and is still going to do so, and who doesn't believe AMD is going to suddenly become competitive in the GPU market, DLSS is the thing I want to experience the most, especially in DCS with its MSAA performance. Even without it? I'm getting a huge upgrade from a 980Ti to a 2080Ti, albeit I might as well be giving up an appendage. Not blindly, either: I didn't pre-order. I run resolutions, both on my monitor and in VR, that demand a fair amount of GPU power, and to me, especially given my experiences with MSAA performance in DCS, DLSS is a very exciting feature. I can only hope my upcoming purchase helps pave the way for such a feature to become a future standard.

That being said, I can easily see ED being stubborn about this, no disrespect intended to the team, so I'm not overly hopeful. But I'd love to be surprised and find DLSS as an option in DCS in the future. I think that is likely where the value in these high-priced cards will begin to show. For real, think about how DCS performs when you disable MSAA, those lovely high framerates you get. Now imagine getting that SAME performance, only with a supersampled image provided by the tensor cores on an RTX card clearing up all the jaggies and shimmers on straight and curved lines. Yeah, you'd quickly open your wallet, I'm sure. Benchmarks are showing that for traditional methods, a 1080Ti is still a valid/practical option. But I'm looking at it from the viewpoint of: how long will it be before game devs give in and show us what deep-learning AI can really do for graphics in gaming? This is innovation, gentlemen. We've never had this opportunity before. Where would we be if id Software and Parallax/Interplay never adopted 3D graphics into their games?
Look how that blew up. Without the S3 ViRGE, ATI Rage, 3dfx Monster Voodoo, and devs jumping on that tech to showcase it, where would graphics be today? Before the introduction of the current state of console gaming, game developers and hardware engineers complemented each other well as the driving force of innovation in this hobby of ours. Ever since, it seems like an uphill battle trying to see the benefits of what could be some pretty big strides in technological advancement. Oh, by the way, the majority of signatures in these forums seem to include an Nvidia GPU proudly displayed along with the other hardware in the system. DCS seems to make us chase better performance, which DLSS could rightly offer. Nvidia is like Charlie Sheen, brosephs: "Winning." Only without the health issues. I had to edit one more point in here: how many of you were building PCs for gaming when a top-of-the-line GPU cost $120 US plus tax, some 20+ years ago? Everything in the US, and perhaps the world, has been growing the size of its price tags. Not just GPUs.
-
Not holding my breath, but DLSS would definitely be nice compared to the current anti-aliasing options within DCS. The devs claim 4K won't need anti-aliasing, but I know VR and my 3440x1440 21:9 have to use MSAA 2x at least, or else almost everything looks pretty bad/jaggy/shimmery, and FXAA didn't do much to alleviate it. I also know that MSAA wrecks our framerates. I don't know the science well enough, but if an anti-aliasing technique is invented that offers double the performance and likely clearer/smoother image quality, especially in a program as resource-intensive as DCS World, it makes no sense not to research and eventually implement it. People who choose not to opt for an RTX card can stick to old methods, but completely ignoring this tech sounds silly to me.

The NGX SDK will supposedly be available for download in a few weeks, and I could be mistaken, but in the interviews I've been watching, Nvidia will train the AI neural network on their super-expensive supercomputer and implement code for DLSS support for free. The basic interpretation I got was: "Hey, we want our technology to succeed because it's honestly pretty awesome, we want your games to run well, and we're sure your userbase will appreciate it; submit a development copy of your game and we'll get it working for you." Win/win if you ask me. https://developer.nvidia.com/rtx/ngx

Early adoption may be a slow process given that the majority of developers develop for console or mobile, but this just isn't PhysX or Hairworks. AI-enhanced graphics are the future, real-time ray tracing aside. We'll have to wait for games with DLSS support and Windows updates to be certain, but so far it sounds like DLSS is going to be a substantial performance/quality increase compared to current graphics techniques. DLSS and tensor cores? There are BOOKS worth of complaints about the current state of anti-aliasing in DCS World within these forums. DLSS COULD be the thing that allows for 90 fps VR in DCS with a picture much clearer than what we can achieve currently.

I won't keep visiting these threads, as I've said similar in another, but as a fan of this game who will likely be spending money on it for years to come, hoping these guys improve and go the distance, I strongly encourage Eagle Dynamics to look into what it would really take to implement DLSS functionality, of course after we see some real performance comparisons in upcoming games released with DLSS support. We've got a lot of folks naysaying because of high price tags, which is understandable; however, I strongly believe that if a feature were implemented that allowed for near-double performance with BETTER image quality, quite a few DCS World "pilots" would be forking out the dough for a card capable of supporting it, and would at last be able to witness the full potential of deferred shading without compromising framerates or resolution.

I'll stop posting in these threads about this subject, but the thought does plague my mind regarding DLSS and DCS World. It does seem like a solution to a lot of performance complaints, and it's not like adding it would make the game any less playable for people who don't adopt capable cards. Although I'm in the mindset that EVENTUALLY, even if a couple of years down the road, we'll all be sporting some kind of card capable of AI-enhanced graphics. *edit* - At least one of the games already announced to feature DLSS (Ark) isn't even DX12-exclusive, so, realm of possibility here, anyone?
-
Nvidia RTX cards and DCS - ?
Headwarp replied to 0414 Wee Neal's topic in PC Hardware and Related Software
I have to admit my mind has been on this topic a lot lately. For me, I'm more interested in the tensor cores and deep-learning AI than in ray tracing. At 3440x1440, DCS without any AA can just look outright bad: shimmers all over the place, wings disappearing off aircraft in the distance, and the sort. FXAA never did much to alleviate this, and MSAA just wreaks havoc on framerates since the implementation of deferred rendering. All of this is true in the Odyssey as well. Flight/racing sims also tend to be a driving reason for having triple-monitor setups or a VR headset.

I know the cards aren't even out yet and I'm not all that hopeful that ED is going to pick up on this, but the more I read about DLSS and its performance increases with the 20-series cards, the more I feel like it's the solution to our anti-aliasing woes ever since the switch to deferred shading. That being said, it's too early to know everything about it. It might require DX12, or they may look into Vulkan support. What we do know is that, lacking a most likely rather expensive AI supercomputer, developers would have to send code to Nvidia for AI training and implementation. Of course, it could be a few years before enough people are sporting GPUs with tensor cores to justify the adoption of this tech, also pending any changes to the graphics API ED ends up using. I'm not holding my breath for ED offering that kind of cooperation, but I can't help but desire it. I'm not trying to force this opinion, just some wishful thinking going on here, as anti-aliasing has been my biggest concern with the jump to 2.5. Regardless, I'm still most likely retiring my 980Ti for a 2080Ti. The price tag is high, but it's still going to offer the best performance for my setup at this point in time, and my GPU is showing its age. -
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
Screen grab came from this video (turn down your sound, the music is awful). The 8700K gets a huge leap in multi-core performance, but the single-threaded performance of a 2600K @ 4.7 GHz is almost up there with a stock 8700K turboing all six of its cores to 4.4 GHz. I'm only using it because it's about the only thing I can find directly comparing a stock 8700K to an OC'd 2600K. Now, the 8700K does give improved performance, but the OC'd 2600K isn't detrimentally behind the stock 8700K. 4.3 GHz is likely bottlenecking the 1080Ti, but to the point of 22 fps? The same 22 fps you got on a 1060? It's very coincidental that your CPU bottleneck would sit right at the maximum capability of a GPU that should perform worse than a high-end 9-series card. Crank the clockspeeds of your CPU and see if it helps; if not, I'm still wagering drivers. Bear in mind that DX12 benchmarks will likely make better use of multiple cores (as Vulkan should), whereas in DX11 titles the two CPUs remain pretty close in single-threaded performance. Pretty close isn't that bad if you aren't happy at the idea of investing another $1300 USD or more into your gaming rig. There could also be some truth to using Process Lasso to ensure DCS is using physical cores rather than logical cores, which should give the same effect as disabling hyperthreading while playing DCS, while still offering the extra threads for Windows and background tasks (a rough sketch of the idea is below). Being bottlenecked at the GPU myself, I haven't stumbled across a need to find out.
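Roughly what I mean by the affinity trick, as a hypothetical sketch using Python's third-party psutil package rather than Process Lasso itself. The even/odd core numbering is an assumption about how hyperthreaded siblings are laid out (common on Intel desktop chips, not guaranteed), and the PID in the usage line is made up.

```python
# Pin a process to one logical CPU per physical core, approximating the
# "physical cores only" effect of disabling hyperthreading for that process.
# Assumes sibling logical CPUs are numbered in adjacent pairs (0/1, 2/3, ...),
# which varies by system. psutil is third-party: pip install psutil.
import psutil

def pin_to_physical_cores(pid):
    proc = psutil.Process(pid)
    logical = psutil.cpu_count(logical=True)
    physical = psutil.cpu_count(logical=False)
    if logical == physical:
        return  # no hyperthreading, nothing to do
    one_per_core = list(range(0, logical, 2))   # one logical CPU per core pair
    proc.cpu_affinity(one_per_core)
    print(f"PID {pid} pinned to logical CPUs {one_per_core}")

# Example call (hypothetical PID of the game process):
# pin_to_physical_cores(4242)
```
-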
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
You'd be surprised. There were games I played with a throttling CPU (due to heat, before I ever OC'd my 2500k) that ran fine, while CPU-intensive games crapped on themselves, indicating to me I needed to troubleshoot. There were games that didn't care that my faulty drivers were stealing CPU cycles, while others would run at lower FPS, if not grind to a freezing halt, sometimes recovering. That screen grab only states that one core of the CPU is taxed at 99%, not the cause of it being so. The question is: was he also at 99% on one core of his CPU with low GPU utilization on the 1060? If there was absolutely zero fps increase between cards, that sounds like a yes. If so, that indicates something going on that doesn't quite make sense given the results I had on a comparable machine running a lower resolution on a more powerful GPU, also a 2nd-gen Intel, where I had to be super picky about which drivers I installed or I'd run into issues in games that made the CPU do any work.

There could be more to the cause of it. There could be faulty drivers causing the CPU to endure more stress than it otherwise would. It could be malware, it could be Windows power management rather than Nvidia power management, it could be anti-virus software running in the background, Windows Defender, it could be a number of things. Windows 10 might have finally broken all remaining drivers for chipsets that old; who knows. Regardless, if the CPU is a bottleneck and it's not the result of faulty drivers or system settings, clockspeed increases should net performance gains, which that Sandy Bridge could probably manage with a more efficient CPU cooler. Running an old system with drivers that are no longer updated requires a bit more maintenance than newer hardware with continuing support. Wrapping my head around that let me game on a 2500k for 5+ years, feeling like a brand new machine when I finally OC'd it long after it was out of warranty. I'd still be doing it if I had opted for a 1080Ti rather than a CPU upgrade. It took me six generations to be convinced to upgrade my CPU: two more cores, stock clocks faster than my 2500k was OC'd on air, and not having to remember which specific driver version I needed for a stable system. If it were just about gaming performance, then I wasted $1200 on a new rig. -
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
Man, 2nd-gen drivers are from 2010 and 2013. I promise you, there's at least one that causes just about any process to spike to 99% CPU usage when it normally wouldn't be a heavy load. I've had to hunt down drivers to fix that often enough to know; the main culprits, again, were SATA controller drivers and network drivers. Try a fresh Windows install without installing any of the drivers for your motherboard at all. You might get lucky and Windows 10 will have them in its database, but if not, you can watch the CPU go to 99% generally just by moving your mouse (a quick idle-usage check is sketched below). I'm not completely ruling out the 1080Ti being CPU-bottlenecked at 4.3 GHz, but at the same time, his 1060 should not have been CPU-bottlenecked at that high a resolution (though it would struggle with it), which would at least give SOME increase upon swapping to a more powerful GPU, up to the point of the CPU not being fast enough. I've had to troubleshoot a 2nd-gen Intel system enough to say: check for driver faults with Poolmon, it's pretty informative, and you might find an older driver works better than the latest one. Here, I did some googling for you; please check the troubleshooting steps listed in these guides for high CPU usage / system interrupts, including "Update device drivers": https://blog.pcrisk.com/windows/12795-system-interrupts-causing-high-cpu-usage https://superuser.com/questions/1133501/25-cpu-usage-at-idle-windows-10-system-interrupts I mean, drivers only tell your hardware how to operate, even your CPU itself. You can't imagine errors in driver software causing problems?
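If it helps, here's a minimal sketch of the idle-usage check I keep describing, using Python's third-party psutil package (nothing to do with Poolmon or DCS, just a quick sanity check that the desktop really is idle). The 10% threshold is just a number I picked, not anything official.

```python
# Quick idle-CPU baseline check (illustrative only; install with: pip install psutil).
# Run it with no games open. On a healthy, idle desktop total usage should sit
# in the low single digits; a misbehaving driver or background process shows up
# as a persistently high reading.
import psutil

samples = [psutil.cpu_percent(interval=1) for _ in range(30)]   # ~30 seconds
avg = sum(samples) / len(samples)
print(f"Average idle CPU usage over 30s: {avg:.1f}%")
if avg > 10:
    print("Unusually high for an idle desktop; worth chasing drivers/background tasks.")
```
-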
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
Dude, if his drivers are faulty and causing high CPU usage, cleaning them up will increase performance in a CPU-hogging game by multitudes. I've experienced the issue and what it can do to gaming. I never once mentioned going from 1080p to 1440p, and 3440x1440 isn't 1440p, lol. What I did say is that modern cards are powerful enough that at 1080p (which is 16:9 1920x1080, not triple-screen 5760x1080), CPUs under a certain clockspeed are the bottleneck. Even a faulty network driver can hog CPU cycles and drastically reduce performance on a machine. In fact, the only differences between his rig and my 2500k build are hyperthreading, resolution (of which he has more), and his GPUs, beyond perhaps different branding of the mobo/RAM components. At that res his 1060 should have been pegged, but the fact that NO fps increase came from the upgrade hints at that same CPU bottleneck being present before the GPU swap, which simply doesn't make sense, considering again that a 980Ti at a lower resolution, more powerful than a 1060, was not held back by a 4.3 GHz second-gen Intel CPU. Again, if it's JUST his CPU, increasing clockspeed would net pretty large FPS increases. You've not really proven a thing; there are multitudes of things that can cause high CPU usage, and it takes more than looking at Task Manager and coming to a conclusion.

The fact that I didn't get an increase going from a 4.3 GHz 2500k to a 4.7 GHz 8700K tells me that I could downclock below 4.3 GHz and not take an fps hit, because my GPU is maxed. If he were to plug his 1060 back in and it not be pegged at 99% at that resolution, I'd bet money it was one of the janky drivers I was talking about. I'd also be willing to bet that if you overclocked that 3770k of yours to 4.5 GHz or more, you'd see increases in fps without paying for more than a new CPU cooler, provided you have the best working drivers for your system. @Sub2k don't buy 3 1440p monitors. ;P But you can try supersampling in game, or DSR from NV control panel, to achieve higher resolutions without it looking bad, forcing more strain onto the GPU.

Anytime I encounter fishy behavior on my PC, even if it's only in ONE CPU-hogging game, troubleshooting starts at making sure I have the best working drivers for my system and no CPU throttling, which usually isn't much of an issue with newer hardware. Only after I'm sure all of that is correctly installed, and that I'm not suffering memory leaks or CPU-hogging faulty drivers, do I look elsewhere. This is something I had to do regularly with my 2500k; I wipe and reinstall everything on my rigs once or twice a year, and every time I did, I had to go through some routine with my 2nd-gen Intel board to find the best set of drivers. It's one of the things that sold me on finally upgrading: getting current driver updates. Otherwise, Sandy Bridge is still an overclocking beast. -
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
He's not at 1080p, he's using 3 screens; that's 5760x1080. Again, something is hogging his CPU, causing his GPU not to work hard even though he has about 1.3 million more pixels to process than my single 21:9 monitor (quick arithmetic below). And having experience with 2nd-gen Intel chips, my guess is a faulty system driver, which may well be the latest one provided by his mobo manufacturer, but is eating up CPU cycles. His 1060 should have been pegged at that resolution. If you dig hard enough (at least up until I built this system at the end of last year), you can find drivers that work better than others. Google how to find faulty drivers with Poolmon; the Windows developer kit is free. If I'm wrong, increasing CPU clockspeed should provide an increase in FPS.
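The quick arithmetic, for reference (plain pixel counts, nothing DCS-specific):

```python
# Total pixels per frame for the resolutions discussed in this thread.
resolutions = {
    "1920x1080 (single 1080p)":   (1920, 1080),
    "3440x1440 (21:9 ultrawide)": (3440, 1440),
    "5760x1080 (triple 1080p)":   (5760, 1080),
}

counts = {name: w * h for name, (w, h) in resolutions.items()}
for name, total in counts.items():
    print(f"{name}: {total:,} px")

diff = counts["5760x1080 (triple 1080p)"] - counts["3440x1440 (21:9 ultrawide)"]
print(f"Triple 1080p renders {diff:,} more pixels per frame than the ultrawide.")
# 6,220,800 - 4,953,600 = 1,267,200, i.e. roughly 1.3 million extra pixels.
```
-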
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
Personally, Windows 10 handles page files rather well; I'd just let it do its auto-management thing. You are definitely maxing out one of your cores, and you're at 4.3 GHz. I still feel like you should be able to get more FPS than I did on my 2500k rig, which I had at 4.3 GHz using a 980Ti. What kind of frames do you get over the Caucasus?

A cheaper solution than building an entirely new computer: throw a Corsair H115i or something in your rig as a CPU cooling solution rather than air. I have an H110i and my 8700K doesn't even get above 70C when stress testing at 4.7 GHz on all cores. I'd wager you'd be able to get that CPU up to at least 4.7 GHz, which should offer you a framerate increase. 4.5 GHz was about the max I could push on air with my Sandy Bridge, and I backed that down to 4.3 because I was reaching 85C during stress tests. Disabling hyperthreading might allow for higher clockspeeds with lower temps as well. Frankly, I'm pretty sure the Sandy Bridge heatspreader is soldered to the CPU, unlike newer Intel chips, meaning it's easier to cool. I wouldn't worry about the cache frequency; all I ever did on the 2500k was adjust the clockspeed multiplier, the voltage, and like two other settings. I have adjusted the cache frequency on my newer rig, but my reading and experience lead me to believe upping the cache frequency has minimal effect on gaming performance versus clockspeed adjustments.

But I do have to wonder how you could have been CPU-bottlenecked with a 1060 when my GPU was, and still is, the bottleneck at lower resolutions with a more powerful card. Check your CPU usage with no games running, pretty much idle on the desktop; it should only be between about 1-5%. If it's much higher, I'd be getting the Windows development kit for Poolmon to try to track down faulty drivers. I had to hunt down Intel drivers for a couple of components, rather than use the ones on EVGA's product page, because of faulty drivers eating up CPU cycles and/or memory. I think specifically the NIC drivers had to be a specific version, and perhaps the SATA controller and chipset drivers too. 2nd-gen boards haven't received proper driver updates for almost a decade now. If it does turn out to be something like faulty onboard NIC drivers, you could even fix that with a PCI-E NIC card. Also check memory usage; faulty drivers may not always directly affect CPU usage at idle.

The only other thing I can think of that explains why, at a higher resolution, you were CPU-bound with a weaker GPU than my 980Ti is that I built my 8700K rig before the fixes for Meltdown/Spectre came out, and older Intel chips took the hardest hit. But everything I've read says you won't really see the effects of that when it comes to gaming. Which is also a little disheartening, because I kind of had plans for old Sandy, since I'd have a 980Ti to throw in it after I upgrade my GPU. That being said, my experience ends at the 980Ti; I might learn some things when I do upgrade, hopefully this month. For the life of me, the fact that with more pixels your 1080Ti is giving the exact same framerate as your 1060, as if you were CPU-bound on both cards, where my 980Ti has been at 99% ever since I got a 3440x1440 monitor on the i5 counterpart of your exact build, has me baffled. *edit* Googled "i5 2500k after meltdown" and found some reddit threads regarding the 2500k and 2600k saying they didn't notice much of an impact as far as gaming benchmarks.
I personally think it's either some kind of system driver issue, or just a matter of more efficient cooling and upping your CPU clockspeeds, unless maybe your SSD has no free space (it likes free space). Your 1060 should have been pinned at 99% usage. The 1080Ti might be able to make use of higher clockspeeds at that resolution, I don't know, but the SAME framerate between the two indicates you were also bottlenecked on the 1060, and that just doesn't make sense to me without something causing higher CPU usage than normal. If you want to confirm the single-core bottleneck while flying, a rough per-core check is sketched below.
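That per-core check, as a hypothetical sketch with Python's third-party psutil package, run alongside the game; the 95%/40% thresholds are just illustrative numbers I picked, not anything official.

```python
# Sample per-core CPU usage while the game is running. A classic main-thread
# bottleneck shows one logical core near 100% while overall usage stays low
# (on an 8-thread CPU, one pinned core is only ~12.5% overall usage).
import psutil

for _ in range(10):                                  # ten 2-second samples
    per_core = psutil.cpu_percent(interval=2, percpu=True)
    hottest = max(per_core)
    overall = sum(per_core) / len(per_core)
    print(f"hottest core: {hottest:5.1f}%   overall: {overall:5.1f}%")
    if hottest > 95 and overall < 40:
        print("  -> looks like a single-threaded (main-thread) bottleneck")
```
-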
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
Just a guess here: you aren't running with civilian traffic on, are you? Turn that off, it's a resource hog. Also, it might be worth disabling MSAA in game and enabling FXAA in NV control panel. You'll get shimmers, but the framerate will go up quite a bit. It's also worth running "dcs_updater.exe repair", as DCS can be weird about graphics driver updates sometimes. Without MSAA on my 3440x1440 monitor I'd easily be pushing 80-100 fps; turning it on dropped me to about 50-60 at low altitudes. That was also similar to my experience on my 2500k build. On both builds the 980Ti would be at 99% utilization in just about any game I play at that resolution.

The only other thing I can think of is to open Task Manager and make sure your CPU is indeed running at 4.4 GHz and not being throttled. If it isn't running at 4.4 GHz, clean or replace your CPU heatsink and apply some new thermal paste, because if you don't dust them regularly they will get caked up in dust. It was a flight sim that made me investigate performance issues I was having, when I realized my CPU wasn't even running at stock clocks, and it was when I replaced my CPU cooler that I overclocked it for the first time. My 2500k OC'd to 4.5 GHz on air; I can only imagine I could've gotten it faster with an AIO liquid cooler. I'm doubting, given your benchmarking scores, that heat is the issue, but it's worth looking at.

And I'm sorry I seemed to hone in on Bonedust's VR experience in this thread when I wrote that post and somehow managed to think you were using VR too. I mean, you're running at a pretty high resolution with 3 screens, but I imagine there's a setting being overlooked somewhere causing you to run at such low fps. You say you get the same framerate you had with a 1060, and I just don't see that relating to a CPU bottleneck unless the CPU was throttling due to heat. A 1060 would surely be pegged trying to run 6,220,800 pixels, which is a bit over 1.2 million more pixels than I was running on my 980Ti (a card closer to a 1070 in performance). I.e., if my 980Ti was doing all the work at 3440x1440, your 1060 should definitely have been doing all the work at 5760x1080, and you should be seeing higher framerates with a faster GPU. Then again, if you've got even one core of your CPU pegged to 100% (which would equal 12.5% total CPU usage with 8 threads) and no heat issues when flying, then I'm just completely wrong and my 2500k was simply amazing.

P.S. - Click the "Go Advanced" button at the bottom of any post you're trying to edit or make; at the top will be an icon that looks like a paperclip, which opens a new window that allows you to upload files like screenshots. Once you've uploaded them, use the dropdown menu next to the paperclip icon, click your file, and it will attach that screenshot to your post based on where your cursor is. Use PRNT SCRN to take a screenshot of your system settings in DCS; they get saved to C:\users\xxxx\saved games\DCS.openbeta\screenshots\ (if on the beta version). You can also use the Windows snipping tool to show us what you've done within NV control panel. -
Much better video card = same frame rate...
Headwarp replied to Sub2K's topic in Game Performance Bugs
We're not even sure a 2080Ti can handle high settings + MSAA and get over 45 fps in DCS VR yet; 45 fps is basically ASW kicking in because you can't hold a stable 90 fps. But there's more to it than simply swapping the GPU. Check your DCS profile in NV control panel; if it doesn't exist, add it, and make sure the power management option is set to "maximum performance". Sometimes it's best to completely uninstall everything Nvidia with DDU and install fresh when swapping cards as well. Also, be aware of Steam's supersampling settings; personally I recommend setting it to "manual" and 100%, because it will default to higher resolutions, which impact performance. From there, start at 1.0 PD in DCS and, if it allows for it, turn the pixel density up until you can no longer maintain 45 fps, then drop back down a notch to where you can. This is DCS's method of supersampling for VR, and turning it up will make your GPU work harder while providing a clearer image. It should also have the effect of lowering the ceiling for CPU clockspeed bottlenecks, if any are present to begin with and you manage to pin your 1080Ti.

I sometimes get 60-80 fps at altitude in my Odyssey with my 980Ti. One of my mates is also in ASW with his 1080Ti on a 7700K (turbos to 4.5 GHz out of the box; I've heard more than once it's capable of being OC'd to around 4.9 GHz). Pretty much no CPU/GPU upgrade is going to make DCS run without ASW on current hardware, unless the RTX series turns out to be amazing and ED manages to allow for deep-learning AA/SS on the tensor cores, which I'm not holding my breath for, considering the time development takes with this team. It would be nice, though, because MSAA is currently a huge FPS hit in DCS, and it's too shimmery without at least 2x, even if turning it off allows for much higher framerates. That being said, 45 fps in VR in DCS is absolutely playable; it's a feature of your VR headset kicking in to keep you from getting woozy, using rendering techniques to keep what you see looking smooth. You can probably still achieve it with high textures, medium shadows, and pixel density increases, where I have to compromise with my 980Ti on medium textures and shadows off or flat.

Sandy Bridge CPUs OC'd to 4.5 GHz or more are still excellent gaming CPUs, especially given that VR and high resolutions/supersampling put more of the workload on the GPU, and also because a single GPU still won't saturate PCI-E 2.0 x16. However, moving up to a 1080Ti or better, I hear some people claiming they can actually make use of a 5 GHz CPU, though I'm unsure what resolutions those people play at. Anything GTX 970 or better will likely be CPU-bottlenecked at 1080p, allowing for FPS gains from CPU clockspeed increases or faster architectures, but the higher the resolution (or, in VR's case, rendering each eye), the more likely your GPU is the bottleneck and the less clockspeed matters on the CPU. ASW might make your GPU work less hard, because it only needs to achieve 45 fps to function, which flat-out pins my poor 980Ti in DCS World even at 1.0 PD, although the Odyssey's native resolution is on par with the Vive Pro, so it's comparable to a pixel density increase on the Rift/Vive.

Basically, the moral of my story: if you're getting and maintaining 45 fps in DCS World in VR, rejoice; framerate isn't a reason for upgrading CPU/mobo/RAM. If you absolutely must hit 90 fps in VR, we're simply still waiting on a GPU that can achieve it in DCS with MSAA on. That's going to be a few years.

I say that coming from an i5 2500k @ 4.5 GHz rig to my current 8700K at 4.7 GHz and getting almost identical performance in DCS, at least on my 3440x1440 monitor, which allows for higher settings than VR at Odyssey/Vive Pro resolutions (I didn't have my Odyssey when I was still using that system). Same GPU between both systems; that equals a GPU bottleneck. It's just against my principles to upgrade my GPU without skipping at least one generation, no matter how much my fancy monitor was begging for more GPU power (mostly in flight sims, go figure). This is also why I buy the best version I can when I do upgrade, looking to get 3-5 years of use out of it. Getting my money's worth. If you do upgrade your core components, do it because of new features and drivers that will get updated for at least a couple of years, unlike 2nd-gen Intel boards that are using drivers from 2010 and 2013: PCI-E x4 NVMe drives, faster RAM, PCI-E 3.0/4.0, RGB lighting (cuz it's shiny/seksai), and better multitasking/production performance, or because the latest Intel CPUs hit 5 GHz pretty easily and you just like being a tech nerd. I doubt you'll see much, if any, difference in gaming performance, based on personal experience. Heck, Intel's 9th gen will feature mainstream 8-core processors in response to Ryzen that should achieve clockspeeds similar to the 7th and 8th-gen Intel CPUs. That won't impact gaming much, as we're still mostly dependent on single-core performance, but for multitasking/production work on a rig that also mains as a gaming PC, that sounds pretty awesome, considering they'll have the single-core performance to maintain the position of king of gaming CPUs.

*Final edit of this lengthy, informative post*: you can also disable ASW/ASR in SteamVR (maybe Oculus Home too, I've never used a Rift to know) altogether and get an idea of what framerates your setup is actually capable of with your current settings, but if you're prone to motion sickness, I wouldn't wear that headset for very long. Ensure only "Allow asynchronous reprojection" is checked in SteamVR settings, and leave "Allow interleaved reprojection" unchecked, for when you're ready to live with 45 fps and smooth gameplay and turn the feature back on, that is. :) Also forgive me if DCS runs natively out of Oculus Home; my WMR headset uses SteamVR to launch DCS, but I'd imagine Oculus Home would have similar settings. It's like not being able to help AMD users with specifics because Nvidia likes to take my money, but I can at least give a general guideline. -
Just chiming in to say the F-10 map is janky in VR as well, as I'm sure others have said. Several friends using VR have given up on using the map for this update. I may try seeing if I still get decent performance by upping the resolution in game to match my Odyssey, and whether that's a viable workaround until ED fixes it.
-
I actually get by with VR on my Odyssey with my 980Ti, but I had to lower some settings to get at least a steady 45 fps at 1.0 PD/SS (at altitude I sometimes get 60-80 fps, but mostly ASW kicks in while I'm flying). I still run MSAA 2x, however, to reduce shimmering, which has a huge impact on FPS. That being said, when running at high resolutions or in VR, the GPU is bottlenecking so hard that I could drop my CPU clockspeed to 3.7 GHz and not notice a difference in performance.

#1 Make sure Steam isn't trying to do some ridiculous amount of supersampling. It tries to guess how much SS your system can handle, and in my experience it's not very good at doing so. In this screenshot you'll notice it recommends 126% SS for me; when I first started using my Odyssey it defaulted to something like 185%. And yeah, no. People with a Rift or Vive already have to supersample to reach the Odyssey's resolution/image clarity (rough numbers sketched at the end of this post). Maybe after a GPU upgrade I might be able to get away with pixel density increases, which I would change in DCS rather than SteamVR.

#2 The image quality in VR is just never going to be as pretty as it looks on your monitor, even with supersampling, largely due to the screen-door effect. But even on medium textures and low landscape textures, over time your eyes adjust to what you're doing, and personally I don't really mind the lower quality, although I am in the process of working up funds for a 2080Ti to replace my 980Ti (not that we know much about how the card will perform yet; I've just been waiting for a new GPU for a while now, because I decided to run a fancy high-res 21:9 monitor and a WMR headset with the same resolution as the Vive Pro, and while my 980Ti handles most games well at 3440x1440, flight sims tend to make it show its limitations). The more I fly DCS in VR, the more impressed I am with it and how good it looks even at lower settings.

#3 In game, start with PD 1.0 and see how it performs. I can't speak for the RX 580, but when stepping into the realm of high resolutions and VR, especially in combat flight sims, the demand for GPU power is real, and AMD just hasn't been competing with Nvidia in that department lately.
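For a ballpark of what those supersampling numbers mean in pixels, here's a minimal sketch. It assumes DCS's pixel density multiplies each axis of the per-eye render target (so pixel count scales with PD squared) and that the SteamVR percentage scales the total pixel count; both are common interpretations rather than anything official, so treat the output as illustrative.

```python
# Per-eye render-target pixel counts under different supersampling settings.
# Assumptions (not official figures): DCS pixel density (PD) is a per-axis
# multiplier, and the SteamVR SS percentage scales the total pixel count.
ODYSSEY_PER_EYE = (1440, 1600)   # Samsung Odyssey / Vive Pro native per-eye panel

def pixels_with_pd(res, pd):
    """Per-eye pixels if PD multiplies each axis."""
    w, h = res
    return int(w * pd) * int(h * pd)

def pixels_with_steamvr_ss(res, percent):
    """Per-eye pixels if the SteamVR slider scales total pixel count."""
    w, h = res
    return int(w * h * percent / 100)

print(f"PD 1.0:        {pixels_with_pd(ODYSSEY_PER_EYE, 1.0):,} px per eye")
print(f"PD 1.2:        {pixels_with_pd(ODYSSEY_PER_EYE, 1.2):,} px per eye")
print(f"SteamVR 126%:  {pixels_with_steamvr_ss(ODYSSEY_PER_EYE, 126):,} px per eye")
print(f"SteamVR 185%:  {pixels_with_steamvr_ss(ODYSSEY_PER_EYE, 185):,} px per eye")
# Both eyes get rendered every frame, so GPU load is roughly double these figures.
```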
-
Samsung Odyssey <3
-
I'm personally going to order one as soon as I have the funds and my preferred brand releases the version I want, but I already knew that well before Nvidia was like "yo, we want to change the rendering game, say hi to RTX." That's mainly because my monitor at 3440x1440 wants more GPU power, and now I've started using an Odyssey for simming, where my 980Ti is maxed out as well. I'm looking at it this way: it should certainly outperform the 10 series (even if by how much is still up in the air), even in games that don't support RTX, and I'll be able to enjoy our first glimpses of real-time ray tracing in games that do support it. I almost got the GAS for a 1080Ti throughout this year, but there's just no way I'll ever be convinced to upgrade GPUs without skipping at least one generation, especially when you have to fight with resellers jacking up prices. Don't get me wrong, this is just what I'm doing. For flying in DCS? If you have a 10-series card, definitely wait to see what people say about them; it could be some years before developers produce titles that utilize this tech, although the list of titles already slated to use it was honestly impressive. And that's in no way trying to speak for what plans ED might or might not have with this in the future. If you're comfortably sitting on a Pascal GPU, I'd advise holding off until people have had a chance to review and benchmark these things.
-
I have to chime in and share my thoughts on the potential of RTX. The kind of reflections and shadows this hardware has made possible is kind of a flight simmer's wet dream. Vulkan is supposedly implementing RT capability into the API. I know we just had a major graphics overhaul and everything, but I must say the thought of shadows behaving the way they do with RTX in DCS World would be icing on the cake; seeing sunlight glint off cockpits in the distance and things of that nature just sounds amazing. These are things we've been told up to this point are impossible with today's hardware, and suddenly they're in the realm of possibility. So while we may not see it in DCS in the near future, it seems to me like game developers are already taking interest in this technology, and personally, I couldn't see a better place for it than the flight-simming world. I'm not holding my breath for something anytime soon, but I do have a tinge of hope that ED has their eye on technology's current course, and similar dreams. From the way the NV CEO was talking, RTX tech could offer performance benefits to deferred rendering techniques as well, although it was only briefly mentioned and I don't want to speculate too much, especially with my limited knowledge of programming and rendering. I'll leave my thoughts at that. I don't really have expectations, but this particular advance in tech has given me food for thought. I do have to wonder what the sim could look like IF the capability were coded.
-
Have to chime in and say I'd like an absolute value for the radar elevation on an axis, as opposed to a relative axis. It might be different if I had a spring-loaded axis that returns to center, but the old Warthog throttle doesn't provide one. For now it's just easier to use a hat switch than the slider I use for radar elevation in any other aircraft with an A2A radar.
-
If you want to use the AB detent, you can simply leave the screws hand tightened. It's sufficient. Still can be a pita when jumping to aircraft without AB.
-
Thanks, I'll see if I can manage to set up a default snap view I like. Still, I don't think it would hurt to have a centered view as some kind of default to begin with. ;) Edit: FWIW, looking through that thread, one of the instructions is to pause TrackIR. I'm not certain there is a way to pause VR like you could TrackIR. Through enough tedium I may get it sorted, but it's going to require fumbling around blindly for my keyboard while trying to hold completely still, or trying to do it on screen with my keyboard and mouse. For future installations and new DCS VR users, I'd hope ED could see the usefulness of an option for a default centered view without the involved hassle from the end user. At the very least, VR users should, in my opinion, start centered in the seat; it's not like TrackIR where it was hard to find the right spot. Although a keybind that switches between a centered view and a view lined up with the optical sight has been used in older sims and would be acceptable, and likely appreciated by non-VR users as well. Thanks for the workaround; I tried this last night but I was doing something wrong. Randomly feeling my way out from RAlt to RCtrl and hitting Num 0 eventually put me where I wanted to be in my seat. Easier than I thought.