StrongHarm Posted January 8, 2010 (edited) Hey all... first post. Just started DCS... love it! I used military-grade sims when I was in the Navy, and this gives me the same feeling. I'm seeing an oddity in that my video cards aren't heating up. I'm using nVidia 8800GTX 768s in SLI. I've checked my nVidia profile, verified that SLI is working using the indicator, turned off AA, etc., as listed in previous posts. I'm running high settings at 1920x1200, but my GPUs are only heating up by about 10°F and my frame rates average about 20-30fps. Other modern games raise the temp by about 50°F. My system is in excellent condition (water cooled, etc.), as I've been a sysadmin for about 15 years. Anyone know what I'm missing? Thanks in advance, StrongHarm EDIT: Thanks for all the great help. Staggering conclusions HERE. Edited January 9, 2010 by StrongHarm It's a good thing that this is Early Access and we've all volunteered to help test and enhance this work in progress... despite the frustrations inherent in the task with even the simplest of software... otherwise people might not understand that this incredibly complex unfinished module is unfinished. /light-hearted sarcasm
Made.In.China.00 Posted January 8, 2010 Well, DCS isn't very GPU intensive, and your nice video cards are simply overkill for it. Your GPUs aren't heating up because they don't have much work to do.
Panzertard Posted January 8, 2010 (edited) o7 Welcome to the forums, sir. Even the best of us forget the obvious things now & then. Heating: <nm - failed to read - I thought you said your system was heating up 10°F above normal games> As for FPS: DCS is heavily dependent on the CPU, and it's not very well optimized for multithreading. There are a few tips around on how to get the most speed out of DCS - water can be set to 0, and so on. And of course any background processes on the system will affect performance. Here's one: http://forums.eagle.ru/showpost.php?p=760153&postcount=6 Many good tips are to be found in the Problems and Bugs section ;) Edited January 8, 2010 by Panzertard The mind is like a parachute. It only works when it's open | The important thing is not to stop questioning
159th_Viper Posted January 8, 2010 Welcome Agent Smith - as said, it's your CPU that gets clubbed :joystick: Novice or Veteran looking for an alternative MP career? Click me to commence your Journey of Pillage and Plunder! '....And when I get to Heaven, to St Peter I will tell.... One more Soldier reporting Sir, I've served my time in Hell......'
StrongHarm (Author) Posted January 8, 2010 Thanks for the warm welcome. The DCS community seems like a great one. My CPU is a Core 2 Extreme 6850 running at 3.66 GHz. I'm using Vista 64, properly configured, and I see a good spread across the four cores. Combined average utilization during DCS:BS is around 70%. I do understand when you say that DCS is CPU intensive, so my CPU is taking a bigger hit than my GPUs. It just amazes me that the graphics are decent yet the GPUs aren't taking that big of a hit. Do the Russians understand DirectX that much better than American developers? heheh... or is the app just not using DirectX properly? My machine eats games like Crysis and Dragon Age for breakfast, nearing 100fps, and those apps are super graphics intensive. Things that make you go hmmmm..... It's a good thing that this is Early Access and we've all volunteered to help test and enhance this work in progress... despite the frustrations inherent in the task with even the simplest of software... otherwise people might not understand that this incredibly complex unfinished module is unfinished. /light-hearted sarcasm
isoul Posted January 8, 2010 As the posters above said, I can confirm that DCS:BS performance won't increase much with a more powerful GPU. Having tested with 3 already (8800GTS/768MB, 9800GTX+/1GB, GTS250/1GB) I got almost the same results (25-35fps while in the cockpit). That's easy to understand: DCS may have nice graphics, but the really hard work is the CPU calculating all the trajectories of bullets and missiles and the forces applied to your helicopter in flight (very realistic flight model). I hope that next week, or the week after, I'll be able to post any increase with my new Q9400 CPU. Currently I am using an E6600. Oh, btw... get DCSMax, since this utility/launcher will help you enable the additional cores your CPU may have and hopefully increase the game's performance (someone post a link for downloading DCSMax please).
Boulund Posted January 8, 2010 Oh, btw... get DCSMax, since this utility/launcher will help you enable the additional cores your CPU may have and hopefully increase the game's performance (someone post a link for downloading DCSMax please). This is no longer needed since the patch. The patch fixed the issue where the game wouldn't be given affinity to all available cores on startup. Reiterating what several others have said before me:
* You get "low" fps because DCS is not able to fully utilize several cores, BUT operating systems like Windows 7 or Vista improve the core load-balancing and thus perform better than operating systems without such features (e.g. Windows XP).
* The graphics cards you're using are kind of relaxing when playing Black Shark because there is very little graphics load going on - hence the low temperatures.
* What you can do to improve performance is to lower some settings - especially water. There is a configuration file, ...\BlackShark\Config\graphics.cfg, where on line 149 you find the following (it is VERY IMPORTANT that you use a good editor for this, e.g. Notepad++ - do NOT use Windows' built-in Notepad):
WaterQuality = 0;
Setting the water quality to 0 massively improves your frame rate, because of an inefficient implementation of the water-surface calculation in the game engine. Searching the forum will help you find even more performance hints, as several of us here have been through this process :) Welcome and have fun Core i5-760 @ 3.6Ghz, 4GB DDR3, Geforce GTX470, Samsung SATA HDD, Dell UH2311H 1920x1080, Saitek X52 Pro., FreeTrack homemade cap w/ LifeCam VX-1000, Windows 7 Professional x64 SP1. FreeTrack in DCS A10C (64bit): samttheeagle's headtracker.dll
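Boulund's WaterQuality tip can also be scripted rather than hand-edited. Below is a minimal Python sketch - an illustration, not an official ED tool - that backs up graphics.cfg and rewrites the WaterQuality line. The file path is a placeholder; point it at your own ...\BlackShark\Config\graphics.cfg.

import re
import shutil

CFG = "graphics.cfg"  # placeholder -- substitute your ...\BlackShark\Config\graphics.cfg path

shutil.copyfile(CFG, CFG + ".bak")  # keep a backup, just as you would before hand-editing
with open(CFG, "r", encoding="utf-8") as f:
    text = f.read()

# Replace whatever value is currently assigned to WaterQuality with 0.
new_text, count = re.subn(r"WaterQuality\s*=\s*\d+;", "WaterQuality = 0;", text)
if count != 1:
    raise SystemExit("expected exactly one WaterQuality line, found %d" % count)

with open(CFG, "w", encoding="utf-8") as f:
    f.write(new_text)
print("WaterQuality set to 0; backup written to", CFG + ".bak")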
sobek Posted January 8, 2010 Do the Russians understand DirectX that much better than American developers? heheh... or is the app just not using DirectX properly? My machine eats games like Crysis and Dragon Age for breakfast, nearing 100fps, and those apps are super graphics intensive. Things that make you go hmmmm..... Apples and oranges, m8. Dragon Age etc. were written at a time when powerful GPUs were already common; the DCS engine has been continuously upgraded, but you can still notice that it's basically from the early 2000s. DCS runs a lot of stuff in the background - weapons ballistics, the flight model (a large CPU eater), you name it - that all has to run on the CPU (for now) for various reasons. You see, this is a combination of, on the one hand, an old engine that cannot easily be tweaked towards parallel processing and GPGPU tasks, and on the other hand (while I'm no experienced programmer, I do believe this is one reason) GPUs (especially ATI) lack the sophisticated command sets of CPUs and can therefore not be made to run all the stuff that goes on in DCS. Good, fast, cheap. Choose any two. Come let's eat grandpa! Use punctuation, save lives!
Feuerfalke Posted January 8, 2010 As the posters above said, I can confirm that DCS:BS performance won't increase much with a more powerful GPU. Having tested with 3 already (8800GTS/768MB, 9800GTX+/1GB, GTS250/1GB) I got almost the same results (25-35fps while in the cockpit). Inconclusive. The late 8800GTS (especially the 768MB OC versions) is only marginally slower than the 9800GTX; basically just the DX compatibility was changed. The same goes for the GTS250: almost identical frequencies, almost identical chipsets, just new DX features DCS simply doesn't profit from. It also depends a lot on your resolution and quality settings, as StrongHarm is already aware. I fly at 1920x1200 with 8xAF and 8xFSAA, and when going from my 8800GTS 768 OC to the GTX285 my FPS doubled. @ StrongHarm: DCS is mostly demanding on your CPU, as previous posters already pointed out. As such, it is also vital to have enough fast RAM installed and, of course, AHCI enabled for fast access to your HDD. Also, please make sure that you run DCS in fullscreen mode; windowed mode costs you a lot of FPS as well. You can also try to decrease the "cockpit resolution" setting. It does not change your textures, only the rendering resolution of your SHKVAL and ABRIS. In my experience, 1024 every frame is overkill in 90% of my engagements (and probably unrealistic, as I doubt the screens have such a high resolution in RL). Gigabyte GA-Z87-UD3H | i7 4470k @ 4.5 GHz | 16 GB DDR3 @ 2.133 Ghz | GTX 1080 | LG 55" @ 4K | Cougar 1000 W | Creative X-Fi Ti | TIR5 | CH HOTAS (with BU0836X-12 Bit) + Crosswind Pedals | Win10 64 HP | X-Keys Pro 20 & Pro 54 | 2x TM MFD
EtherealN Posted January 8, 2010 (edited) The DCS engine has been continuously upgraded, but you can still notice that it's basically from the early 2000s. Nitpicking here, but that should actually say "the TFCSE engine". Or well, it shouldn't, 'cause that becomes a redundant "The The Fighter Collection Simulation Engine Engine", but you know what I mean. :P Anyways, welcome to our humble abode, StrongHarm. :) As people have been mentioning, the GPU thing is easy to explain. Basically, a single 8800 can probably run DCS about as well as most other graphics cards or SLI/Tri-SLI/Crossfire combos, since that's about the point where the GPU starts getting unused cycles. Obviously, though, a stronger GPU setup allows more aggressive use of AA/AF and such. I would recommend taking some time to increase those settings, with test flights in between, to see how much AA and such you can add without losing FPS. For my single 9800GTX+ this went to around 8x on both. On the other hand (while I'm no experienced programmer, I do believe this is one reason) GPUs (especially ATI) lack the sophisticated command sets of CPUs and can therefore not be made to run all the stuff that goes on in DCS. Not quite true. Most GPUs from the 8800 generation and newer (and their ATI equivalents) do have programmable shaders (simplified, they effectively work like separate "cores") that are very capable at physics calculations - they're used in science for that purpose as a "poor man's supercomputer". The problem lies more in the architecture than in the specific calculation capabilities: a graphics card is actually very, very slow. After all, it usually only needs to deliver output between 60 and 100 times per second, which means the compute resources are heavily parallelised to allow independent calculation of separate parts of the screen. Unfortunately, there are some types of calculations this doesn't lend itself well to. For example, applying AA is a pretty simple thing - just take the rendered screen frame, chop it up for the shaders to work on, and reassemble the AA'd product. But many physics problems that apply to flight simulation are more in-line, in the sense that to run calculation X you need some of the output of calculation Y, which needs some of the output of calculation Z, and so on. This is why some things are very difficult to migrate to the GPU - they simply need a fast in-line process to be done effectively. The "big thing" in moving code to the GPU like that isn't really the actual "programming" in the sense of hacking code, but more the traditional "flowchart programming" where the software engineers have to spread the process flowchart in front of them and try to figure out a way to split it up without breaking anything. And in a modern software application that flowchart can be BIG. :P EDIT: As such, it is also vital to have enough fast RAM installed I'm not really sure of that. I get zero impact (that is, nothing outside the margin of error) when flipping my RAM sticks between 800MHz and 1100MHz. Edited January 8, 2010 by EtherealN Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
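EtherealN's X-needs-Y dependency chain is the heart of the matter, and a toy sketch makes it concrete. The Python below is purely illustrative (it has nothing to do with DCS's actual code): the per-tile smoothing can be farmed out to a process pool because every tile is independent, while the physics loop cannot be, since each step consumes the previous step's output.

from multiprocessing import Pool

def smooth_tile(tile):
    # Each screen tile can be processed independently of every other tile,
    # which is why AA-style work parallelises so well.
    return [sum(tile) / len(tile)] * len(tile)

def integrate(state, dt=0.01, steps=1000):
    # Each physics step needs the previous step's output: an inherently
    # serial dependency chain. pos/vel are toy stand-ins for a flight model.
    pos, vel = state
    for _ in range(steps):
        acc = -9.81 - 0.1 * vel      # toy gravity plus drag
        vel += acc * dt
        pos += vel * dt              # uses the vel we just computed
    return pos, vel

if __name__ == "__main__":
    tiles = [[float(i + j) for j in range(64)] for i in range(256)]
    with Pool() as pool:             # data-parallel: completion order doesn't matter
        frame = pool.map(smooth_tile, tiles)
    print("tiles smoothed:", len(frame))
    print("final state:", integrate((1000.0, 0.0)))  # serial: must run in order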
Boulund Posted January 8, 2010 (edited) Inconclusive. The late 8800GTS (especially the 768MB OC versions) is only marginally slower than the 9800GTX; basically just the DX compatibility was changed. The same goes for the GTS250: almost identical frequencies, almost identical chipsets, just new DX features DCS simply doesn't profit from. Slightly off topic: it's actually really interesting to note how well nVidia's marketing works here. Slapping a new name on what's essentially the same card makes people go nuts. I believe the 8800 series (G92) must be the most rebranded line of GPUs ever. What you're saying is totally true: the 8800, 9800 and GTS250 are in effect the exact same card. This is surely why isoul's testing showed no differences. Edited January 8, 2010 by Boulund Core i5-760 @ 3.6Ghz, 4GB DDR3, Geforce GTX470, Samsung SATA HDD, Dell UH2311H 1920x1080, Saitek X52 Pro., FreeTrack homemade cap w/ LifeCam VX-1000, Windows 7 Professional x64 SP1. FreeTrack in DCS A10C (64bit): samttheeagle's headtracker.dll
Feuerfalke Posted January 8, 2010 True, it's really stunning how ATI and nVidia in particular manage to define new names for basically the same hardware. Both have even added, raised or lowered type numbers against their own established naming conventions, just to confuse customers and sell older hardware as new. In fact, when the late 8800GTS & GTX hit the market, hardware magazines noticed that they were already equipped with the new chipset of the 9800. As nVidia noticed the GTX was preferred over the GTS, they simply changed the name, and so the 8800GTS with the slightly upgraded chipset became the new 9800GTX. And indeed there were a lot of reports about how MUCH faster the new card was - LOL. Funny sidenote: as the 9800 series also included PhysX, they locked this feature in the gfx card BIOS, so it wasn't detected by games and drivers. Of course, workarounds were quickly available. ATI did the same, e.g. with the 9800pro's chipset, which was remarkably strong and found its way onto later graphics cards as well, just with a new label - and a higher price, of course. :doh: Gigabyte GA-Z87-UD3H | i7 4470k @ 4.5 GHz | 16 GB DDR3 @ 2.133 Ghz | GTX 1080 | LG 55" @ 4K | Cougar 1000 W | Creative X-Fi Ti | TIR5 | CH HOTAS (with BU0836X-12 Bit) + Crosswind Pedals | Win10 64 HP | X-Keys Pro 20 & Pro 54 | 2x TM MFD
isoul Posted January 8, 2010 (edited) Inconclusive. The late 8800GTS (especially the 768MB OC versions) is only marginally slower than the 9800GTX; basically just the DX compatibility was changed. The same goes for the GTS250: almost identical frequencies, almost identical chipsets, just new DX features DCS simply doesn't profit from. It also depends a lot on your resolution and quality settings, as StrongHarm is already aware. I fly at 1920x1200 with 8xAF and 8xFSAA, and when going from my 8800GTS 768 OC to the GTX285 my FPS doubled. That's true for the 8800, 9800 and GTS250... but I am not so sure the overall performance is exactly the same, since with the GTS250 I get better performance (in most other games), given that all 3 cards (especially the 9800GTX+ & GTS250) have almost the same specs (frequencies etc.). Edited January 8, 2010 by isoul
Boulund Posted January 8, 2010 (edited) That's true for the 8800 and 9800... but I am not sure it is for the GTS250, since I get much better performance (in most other games), given that all 3 cards have almost the same specs (freq etc.). This is derailing the thread so hard, I know... :smilewink: Reading up on the specifications on nVidia's homepage suggests that the differences most probably responsible for your observed performance gap are slightly higher clocks on the "newer" cards (equally reachable by overclocking the card on your own) and, in some cases, a handful more shader cores (nothing you can change on your own, though, and nVidia likes to call them "CUDA cores" nowadays). The GTS250 and the 9800GTX+ both have 128 CUDA cores, against the 112 found on the 8800 GT - noteworthy is that the 8800 GTX already introduced 128 cores, while the old G80-based 8800 GTS had a mere 96. Edited January 8, 2010 by Boulund Core i5-760 @ 3.6Ghz, 4GB DDR3, Geforce GTX470, Samsung SATA HDD, Dell UH2311H 1920x1080, Saitek X52 Pro., FreeTrack homemade cap w/ LifeCam VX-1000, Windows 7 Professional x64 SP1. FreeTrack in DCS A10C (64bit): samttheeagle's headtracker.dll
Feuerfalke Posted January 8, 2010 Please also bear in mind that the 8800 and the 9800 use the same drivers, whereas the GTS250 uses features of the newer drivers - or did you really use one and the same driver for all cards? In other words, the performance increase is not necessarily based on more powerful hardware. Performance can also differ if you used the cards under different circumstances, as they scale quite a bit with CPU and RAM and behave quite differently under XP and Vista/Win7. Gigabyte GA-Z87-UD3H | i7 4470k @ 4.5 GHz | 16 GB DDR3 @ 2.133 Ghz | GTX 1080 | LG 55" @ 4K | Cougar 1000 W | Creative X-Fi Ti | TIR5 | CH HOTAS (with BU0836X-12 Bit) + Crosswind Pedals | Win10 64 HP | X-Keys Pro 20 & Pro 54 | 2x TM MFD
Distiler Posted January 8, 2010 It would be nice to pass a couple of effects from the CPU to the GPU, just to add something like 5fps. There are a lot of people on the brink of 20-25 fps, and a little performance gain means going from unplayable to playable. I suppose this must be difficult, but it would be nice anyway. AMD Ryzen 1400 // 16 GB DDR4 2933Mhz // Nvidia 1060 6GB // W10 64bit // Microsoft Sidewinder Precision 2
isoul Posted January 8, 2010 (edited) Oh, I give up! Probably that's why my retailer suggested them as a replacement for my faulty original 8800GTS... Still, I get better performance with the GTS250 compared to the 9800GTX+, which has exactly the same (basic) specs. EDIT: The rest of the system remained exactly the same. The driver used with the 8800 and 9800 was the same, but with the GTS250 it was the newest version. Still, the performance increased quite a lot for just a new driver version. I can't say anything else, because I haven't looked into the matter further since the 9800 and GTS250 were free replacements for my faulty 8800 (as mentioned above). Edited January 8, 2010 by isoul
StrongHarm (Author) Posted January 8, 2010 (edited) Awesome! Thanks for the patient and insightful answers. I saw a bumper sticker that said 'pilots do it better'... did it mean answering questions? EtherealN, thanks for taking the time to post. Although I'm not a dev, your explanation brought it to my level, and I'm thankful for the understanding. I've been doing some testing and found some interesting things. Could you comment on these settings and give your opinions please, EtherealN? I might also respectfully request that you or someone at ED post a sticky on performance. What a huge difference some config made! The best post I found for performance config was http://forums.eagle.ru/showthread.php?t=43222 and the best post I found for an explanation of nVidia settings was http://www.tweakguides.com/NVFORCE_6.html . I followed his suggestions closely, comparing them with other posts, and did some testing of my own. His suggestions were as follows:

//graphics.cfg:
MaxFPS = 60;

//options.lua
["water"] = "1",
["shadows"] = "0",
["effects"] = 2,
["terrPrld"] = "100",

//InGame Options:
Textures - High
Scenes - High
Civ Traffic - Yes
Water - Normal
Visib Range - Medium
Heat Blur - Off
Shadows - None (see above)
Resolution - 2560x1600
Res. of Cockpit Displays - 1024

//nVidia options:
Anisotropic Filtering - 16x
AA Gamma Correction - On
AA Mode - Override Application
AA Transparency - Off
AA Setting - 4x
Triple Buffering - Off
Extension limit - Off
Error Reporting - Off
Maximum pre-rendered frames - 3
Force Mipmaps - None
Multi Display - Single
Negative LOD Bias - Clamp
Texture Filtering Quality - High Quality
Threaded Optimisation - Auto
Vertical Sync - Automatically On

I was finally able to heat up my cards while doubling my FPS! For my testing I created a mission where I start on the pad and several flights of Russian Sus and KA-50s are pounding some French armor (sorry, it's an American thing) right in front of me on the other side of the pad. I turned my IT-23 and ABRIS on during testing. Interestingly, when testing my new config with time acceleration on full, my FPS went down to the values from before the reconfig. hah

BEFORE CONFIG: 20-30FPS with GPU temps approx 120°F. CPU utilization averaged 70-90%.
AFTER CONFIG: 40-60FPS with GPU temps around 150°F. CPU utilization averaged 40-70%.

IMPORTANT DIFFERENCES FROM PREVIOUS POSTS' CONFIGS:
Full screen increases performance by about 5FPS.
Shadows=0 and Shadows=FULL show only about a 2FPS difference.
Water=0 gave me the largest increase in FPS (20ish) over the default, but the water looks like something from Falcon 1.0 :doh:. Water=1 makes the water look very good and costs only 1 or 2FPS more than Water=0... I kept it at 1.
Res. of Cockpit Displays from 1024 down to 512 made no difference in performance, but I also saw no decrease in visual quality, so I left it at 512 just in case.
Multi-Display/Mixed-GPU Acceleration set to 'Single display performance mode' brought my CPU utilization down by about 10%. I don't really understand how this could be, but the explanation at the nVidia settings URL above was that some apps that allow multiple monitors in high graphics try to span resources across both monitors, even if you're only using one for the app.
The AA setting in the nVidia control panel made THE biggest difference. I went from 4x to 'SLI 32xQ' and saw a breathtaking difference in video quality. I actually saw an increase of 5-10FPS. My CPU usage went down quite a bit, and my GPUs started to heat up!
Thanks for all of your comments and help in reaching this point. I'm very happy with the performance, visual quality, and most of all the amazing content in DCS: Black Shark. Thanks for a great product. I'll be purchasing A-10 /excited! and all future variations. Edited January 8, 2010 by StrongHarm It's a good thing that this is Early Access and we've all volunteered to help test and enhance this work in progress... despite the frustrations inherent in the task with even the simplest of software... otherwise people might not understand that this incredibly complex unfinished module is unfinished. /light-hearted sarcasm
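Since GPU temperature is being used throughout this thread as a stand-in for GPU load, it is worth noting that the load can be read out directly. A hedged sketch using Python's pynvml bindings - an assumption on my part, as the package (and NVML itself) postdates this thread, and it requires an NVIDIA card with a recent driver:

import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetTemperature,
                    nvmlDeviceGetUtilizationRates, NVML_TEMPERATURE_GPU)

nvmlInit()
try:
    handles = [nvmlDeviceGetHandleByIndex(i) for i in range(nvmlDeviceGetCount())]
    for _ in range(30):                  # roughly a minute of samples
        for i, h in enumerate(handles):  # one line per GPU, e.g. both SLI cards
            temp = nvmlDeviceGetTemperature(h, NVML_TEMPERATURE_GPU)  # degrees C
            util = nvmlDeviceGetUtilizationRates(h).gpu               # percent busy
            print("GPU%d: %d C, %d%% busy" % (i, temp, util))
        time.sleep(2)
finally:
    nvmlShutdown()

Run it in a second window while flying; a card that stays near idle here confirms that the CPU, not the GPU, is the bottleneck.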
EtherealN Posted January 8, 2010 It would be nice to pass a couple of effects from the CPU to the GPU, just to add something like 5fps. There are a lot of people on the brink of 20-25 fps, and a little performance gain means going from unplayable to playable. I suppose this must be difficult, but it would be nice anyway. It is indeed difficult, since a CPU and a GPU, while deep down based on the same type of technology, differ vastly in their fundamental architectures. They are different in a way that makes the x86 and the old Motorola 68000 seem like one and the same chip. Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
EtherealN Posted January 8, 2010 (edited) EtherealN, thanks for taking the time to post. What can I say, I'm a proper forum whore. :P Although I'm not a dev, your explanation brought it to my level, and I'm thankful for the understanding. I'm actually not a dev either, in the sense of actually hacking code - the most advanced application I wrote on my own was a set of command-line calculators on an old barebones Linux box without a GUI. I do, however, have a background in the "business", so to speak. :P An aside, which I myself consider important to remember, is that several of the foremost Linux "kernel hackers" didn't actually write a single line of code. What they did was look at the big picture - the flowchart, so to speak - and pick and choose from existing code to assemble a kernel that fitted their needs, and some were so successful that their kernels became as "mainstream" as any desktop Linux distribution. (Often the cause for their work was that they liked the solidity of the OS but found its server orientation counterproductive to their own desktop usage, and therefore decided to tweak it to make sure the things they use as an "office user" worked as well and as responsively as possible, with the more server-side apps given a lower priority.) I might also respectfully request that you or someone at ED post a sticky on performance. What a huge difference some config made! It's a good idea, but a big job, since you need a big, coherently assembled dataset to be sure that the advice you are giving is applicable enough to a general audience to be worth a sticky position. I'll try though. :)

//graphics.cfg:
MaxFPS = 60;

Not entirely sure about this one, really. It's the equivalent of vSync on most screens, and as far as I've seen it really only makes a difference for two things: tearing and stuttering. The former is most pronounced on some setups using TrackIR (I've seen this) but can sometimes come at a total FPS cost. Stuttering is when a system generally runs at a high FPS but is unable to keep it fluent - something the human brain is very good at picking up and getting irritated by, even though it's not actually low FPS. One of the cases where this solved an issue was a user who never went below ~45fps, but the rapid changes from 100 to 45 were enough to cause disturbance.

//options.lua
["water"] = "1",
["shadows"] = "0",
["effects"] = 2,
["terrPrld"] = "100",

Looks pretty good. I haven't played around a lot with the preload, but water 1 or 0 definitely saves a lot of resources.

//InGame Options:
Textures - High
Scenes - High
Civ Traffic - Yes
Water - Normal
Visib Range - Medium
Heat Blur - Off
Shadows - None (see above)
Resolution - 2560x1600
Res. of Cockpit Displays - 1024

In my opinion there is very little visible difference between cockpit displays at 512 and 1024. The biggest saver is to stay away from the "every frame" option, though.
//nVidia options:
Anisotropic Filtering - 16x
AA Gamma Correction - On
AA Mode - Override Application
AA Transparency - Off
AA Setting - 4x
Triple Buffering - Off
Extension limit - Off
Error Reporting - Off
Maximum pre-rendered frames - 3
Force Mipmaps - None
Multi Display - Single
Negative LOD Bias - Clamp
Texture Filtering Quality - High Quality
Threaded Optimisation - Auto
Vertical Sync - Automatically On

Here we are entering what is more like pure GPU territory, where my recommendation would be to just play around and find settings that don't affect your total FPS. A good tool here is the application called FRAPS, which is usually used for video recording but also has a very useful benchmarking feature. It is payware, and I'm not sure what benchmarking features the shareware version has. But it's definitely worth the money even if you don't intend to make videos, in my opinion, since it can output the FPS data in Excel format and thus let you make visual comparisons of the performance different settings gave you. And even if that is a bit of overkill for tuning a single application, it also has uses in deciding on computer upgrades, since you can run tests at different settings and find out where your bottlenecks are, ensuring that when you spend your rubles on new hardware, it is hardware you actually need. That aside, your conclusions at the end of the post largely look good. The multi-display stuff is interesting and something I'll have to look at at some point. Also, an interesting thing with some nVidia card/driver combos is that you can sometimes increase performance by increasing AA settings. I'm not entirely sure how this happens "under the hood", but it should be a driver issue. For that reason, it's always a good idea to run a benchmark before and after updating your drivers, since you might sometimes get better performance on an older driver, depending on the exact hardware/software configuration. Edited January 8, 2010 by EtherealN Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
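To put numbers on both of EtherealN's points - comparing settings with FRAPS, and the 100-to-45 swings that read as stutter even at a high average - here is a small Python sketch. It assumes the format of FRAPS' per-frame "frametimes" log (a header line, then frame number and cumulative milliseconds per row); adjust the parsing if your log differs.

import csv
import sys

def analyse(path):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    stamps = [float(r[1]) for r in rows[1:]]                    # cumulative ms per frame
    deltas = sorted(b - a for a, b in zip(stamps, stamps[1:]))  # per-frame times, ms
    avg_fps = 1000.0 * len(deltas) / (stamps[-1] - stamps[0])
    worst = deltas[int(len(deltas) * 0.99):]                    # slowest 1% of frames
    low_fps = 1000.0 / (sum(worst) / len(worst))
    print("%s: avg %.1f fps, 1%% low %.1f fps" % (path, avg_fps, low_fps))

for log in sys.argv[1:]:  # e.g. python fraps_stats.py before.csv after.csv
    analyse(log)

A run whose average looks fine but whose 1% low collapses is exactly the "high FPS but not fluent" case described above.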
Panzertard Posted January 9, 2010 What can I say, I'm a proper forum whore. :P Quoted for truth. In my opinion there is very little visible difference between cockpit displays at 512 and 1024. The biggest saver is to stay away from the "every frame" option, though. The option says "cockpit", but it should probably say "Shkval". You'll notice the Shkval looks like Atari 2600 Space Invaders at 512, but like a proper image at 1024. "Every frame" steals more FPS, and IMO I've not been able to spot any significant difference on the Shkval. The mind is like a parachute. It only works when it's open | The important thing is not to stop questioning
Feuerfalke Posted January 9, 2010 Setting it to 1024 or 512 depends a lot on your overall graphics settings and personal preference. Yes, 512 looks clumsier, but to be honest I doubt the SHKVAL or ABRIS monitor uses a high-resolution display in reality, so what? You only notice the difference when you zoom in very close anyway. ;) @ StrongHarm: Kudos for checking out other threads, gathering information and details about the settings, and testing which settings are best suited to your rig. IMHO there would be a lot fewer problem and bug reports if everybody took the initiative you did. :thumbup: Gigabyte GA-Z87-UD3H | i7 4470k @ 4.5 GHz | 16 GB DDR3 @ 2.133 Ghz | GTX 1080 | LG 55" @ 4K | Cougar 1000 W | Creative X-Fi Ti | TIR5 | CH HOTAS (with BU0836X-12 Bit) + Crosswind Pedals | Win10 64 HP | X-Keys Pro 20 & Pro 54 | 2x TM MFD
Boulund Posted January 9, 2010 @ StrongHarm: Kudos for checking out other threads, gathering information and details about the settings, and testing which settings are best suited to your rig. IMHO there would be a lot fewer problem and bug reports if everybody took the initiative you did. :thumbup: The next-to-perfect forum manners displayed by our new friend StrongHarm are an inspiration to us all: a good read-up on information in the forum, a specific and well-thought-out question, and finishing it off by saying "hey, I got your help and it worked, thanks". Topping that off by editing the first post to point to the answer that solved the problem is why I think everyone should rep this guy right now =) Core i5-760 @ 3.6Ghz, 4GB DDR3, Geforce GTX470, Samsung SATA HDD, Dell UH2311H 1920x1080, Saitek X52 Pro., FreeTrack homemade cap w/ LifeCam VX-1000, Windows 7 Professional x64 SP1. FreeTrack in DCS A10C (64bit): samttheeagle's headtracker.dll
sinelnic Posted January 9, 2010 StrongHarm is a pro - always nice and refreshing to find one! Welcome! And btw, about the sticky: don't hesitate to come up with a candidate yourself! Once, a certain guy named EinsteinEP sustained a great discussion on the flight model with ED, along with other forumites and myself, and then went ahead and published a great article at SimHQ with all the info he gathered. This really is a community that values input! Westinghouse W-600 refrigerator - Corona six-pack - Marlboro reds - Patience by Girlfriend "Engineering is the art of modelling materials we do not wholly understand, into shapes we cannot precisely analyse, so as to withstand forces we cannot properly assess, in such a way that the public has no reason to suspect the extent of our ignorance." (Dr. A. R. Dykes - British Institution of Structural Engineers, 1976)
goldfinger35 Posted January 9, 2010 I don't know why, but I get a 23% increase in FPS if I do the following: when the mission loads (cockpit view and game paused), I press alt-tab to exit the game and then alt-tab again to return to it. Here are some results from FRAPS testing:

2010-01-08 23:04:05 - not using the above-mentioned trick: Frames: 2857 - Time: 93700ms - Avg: 30.490 - Min: 26 - Max: 38
2010-01-08 23:09:53 - alt-tab trick: Frames: 3447 - Time: 93577ms - Avg: 36.835 - Min: 32 - Max: 46

My PC specs are in my sig. i7 920@4.0Ghz, 12 GB RAM, ATI 4890, LG L246WHX@1920x1200, Saitek X52 Pro, Saitek pro flight rudder pedals, TrackIR4, Audigy 2ZS, Logitech G9x, Vista 64bit.
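The two result lines above are standard FRAPS benchmark summaries. Below is a short Python sketch (the format is assumed from the lines quoted in this post) that parses them and computes the change; note that the averages work out to about +21%, while the quoted 23% matches the Min column.

import re

LINE = re.compile(r"Frames:\s*(\d+).*?Time:\s*(\d+)ms.*?Avg:\s*([\d.]+).*?"
                  r"Min:\s*(\d+).*?Max:\s*(\d+)")

def parse(line):
    # Pull the five benchmark fields out of one FRAPS summary line.
    frames, ms, avg, mn, mx = LINE.search(line).groups()
    return {"frames": int(frames), "ms": int(ms), "avg": float(avg),
            "min": int(mn), "max": int(mx)}

before = parse("Frames: 2857 - Time: 93700ms - Avg: 30.490 - Min: 26 - Max: 38")
after = parse("Frames: 3447 - Time: 93577ms - Avg: 36.835 - Min: 32 - Max: 46")

for key in ("avg", "min", "max"):
    gain = 100.0 * (after[key] - before[key]) / before[key]
    print("%s: %.1f -> %.1f (%+.1f%%)" % (key, before[key], after[key], gain))
# Prints roughly: avg +20.8%, min +23.1%, max +21.1%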