
Posted

I am running Black Shark on an AMD X2 dual-core 7750 with an Nvidia 285 graphics card and have some questions.

(1) What are the best Nvidia control panel settings to use for the game?

(I have no idea what to set options like force mipmaps, texture filtering (there are four options to select) and pre-rendered frames to, to get the best out of the game.)

(2) In Black Shark, do you need to set anti-aliasing and anisotropic filtering in the Nvidia control panel, or should you select application-controlled and then set them in game?

(3) Would you recommend using the above settings for Lock On as well?

Posted (edited)

If you're on XP, I suggest full screen in the in-game settings, and create a profile in the Nvidia Control Panel for Black Shark with AA = 8x, AF = 8x and vsync on. I don't mess with the other settings you mentioned, so I'm not sure on those. As for Lock On, I run in windowed mode with AA = 4x and AF = 8x. I run the AA at 4x because of weirdness with zooming in at higher AA settings. No vsync on Lock On for me.

 

Works great for me.

 

Out

Edited by PoleCat
Posted

For BS, I leave V-Sync forced, with 4x AA and 16x AF. Pretty much everything else (I think) is still at default. But you have a better card than my 9800GTX+, so I think PoleCat is right, set the AA to 8x.

Posted

Can't seem to post the screenshot, but below are my current settings.

 

Anisotropic filtering - Application-controlled
Antialiasing - Gamma correction - On
Antialiasing - Mode - Application-controlled
Antialiasing - Setting - Application-controlled
Antialiasing - Transparency - Off
Conformant texture clamp - Use hardware
Error reporting - Off
Extension limit - Off
Force mipmaps - Trilinear
Maximum pre-rendered frames - 6
Multi-display/mixed-GPU acceleration - Single display performance mode
Texture filtering - Anisotropic sample optimisation - On
Texture filtering - Negative LOD bias - Allow
Texture filtering - Quality - Quality
Texture filtering - Trilinear optimisation - Auto
Triple buffering - Off
Vertical sync - Force off

 

I tend to run with vsync off (and therefore switch triple buffering off), as I don't seem to suffer from any screen tearing.

Also, I'm finding I don't get particularly high frame rates with the above settings: typically around 20-35 fps with the in-game settings at medium.

Posted

Your graphics card is almost irrelevant to your FPS. Set your graphics to high.

 

Your FPS is governed by your CPU - that's where the heavy workload in a sim like this lies.
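For anyone who would rather measure this than take it on faith, here is a minimal sketch (assuming Python 3 with the third-party psutil package) that logs per-core CPU load while the sim runs; if one core sits near 100% while graphics settings barely move the FPS, that points to a CPU limit.

```python
# Minimal per-core load logger: run it in a second window while flying.
# Assumes Python 3 with the third-party psutil package installed.
# If a single core is pegged near 100% while the others idle, the sim is
# most likely CPU-limited on that core.
import psutil

def log_core_load(duration_s=60, interval_s=1.0):
    for _ in range(int(duration_s / interval_s)):
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        print(" | ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(per_core)))

if __name__ == "__main__":
    log_core_load()
```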

Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted

Can someone help with the SLI performance mode settings?

I run 2x 9600GT and don't know if there is anything to gain from force split frame rendering, force alternate frame rendering 1, or force alternate frame rendering 2.

I've had BS for two weeks and I'm hooked, and this forum has been more help than any other source, so thanks everyone.

Posted
Your graphics card is almost irrelevant to your FPS. Set your graphics to high.

 

Your FPS is governed by your CPU - that's where the heavy workload in a sim like this lies.

 

 

Has anyone really ever tested this? I'd love to see some benchmarks.

 

I know the FPS would change at different AA/AF settings, but beyond that is where I'm interested in the frame rates.

The right man in the wrong place makes all the difference in the world.

Current Projects:  Grayflag ServerScripting Wiki

Useful Links: Mission Scripting Tools MIST-(GitHub) MIST-(Thread)

 SLMOD, Wiki wishlist, Mission Editing Wiki!, Mission Building Forum

Posted

Well, the fact that the affinity trick works is test enough, to be honest, even though it does not really give massively detailed numbers in any of the tests I've seen (they've mainly compared affinity tweaking results to each other rather than CPU vs GPU).
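For reference, the "affinity trick" is just restricting the game process to particular CPU cores. Below is a rough sketch of doing that from a script rather than through Task Manager; it assumes Python 3 with the third-party psutil package, and the process name and core choice are placeholders to adjust for your own install.

```python
# Pin the game process to chosen CPU cores from a script instead of
# Task Manager. Assumes Python 3 with the third-party psutil package;
# the process name and the core list are placeholders for your own setup.
import psutil

PROCESS_NAME = "dcs.exe"   # placeholder - check your actual executable name
CORES = [1]                # e.g. restrict the game to the second core

def set_affinity(name=PROCESS_NAME, cores=CORES):
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == name:
            proc.cpu_affinity(cores)   # apply the affinity mask
            print(f"Pinned PID {proc.pid} to cores {cores}")
            return True
    print(f"{name} not running - start the game first")
    return False

if __name__ == "__main__":
    set_affinity()
```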

 

I'll see if I can devise some exhaustive-yet-simple testing scheme tomorrow that I can run on my machine and that would also be easy for others to replicate.

 

Would indeed be interesting to have something concrete. :)

Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted

On my PC, some settings are very GPU-dependent and some are very CPU-dependent.

 

 

Check this test I did with FRAPS and my custom-made mission:

1. AA 4x, AF 4x, everything on except mirrors

Avg: 27.988 - Min: 16 - Max: 45

2. AA 4x, AF 4x, everything on except mirrors, medium shadows, normal water

Avg: 45.114 - Min: 23 - Max: 107
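For anyone who wants to crunch their own runs the same way, here is a rough sketch of deriving Avg/Min/Max figures from a FRAPS-style frametimes export, assumed here to be a two-column CSV of frame number and cumulative time in milliseconds; the filename is a placeholder, and per-frame min/max will not exactly match FRAPS's own per-second figures.

```python
# Rough sketch: derive average / min / max FPS from a frametimes export,
# assumed to be a two-column CSV of frame number and cumulative time (ms).
# The filename is a placeholder; per-frame min/max will differ slightly
# from FRAPS's per-second figures.
import csv

def fps_stats(path="frametimes.csv"):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                                  # skip the header row
        times_ms = [float(row[1]) for row in reader if row]
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]
    fps = [1000.0 / d for d in deltas if d > 0]       # instantaneous per-frame FPS
    avg = 1000.0 * len(deltas) / (times_ms[-1] - times_ms[0])
    return avg, min(fps), max(fps)

if __name__ == "__main__":
    avg, lo, hi = fps_stats()
    print(f"Avg: {avg:.3f} - Min: {lo:.0f} - Max: {hi:.0f}")
```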

i7 920@4.0Ghz, 12 GB RAM, ATI 4890, LG L246WHX@1920x1200, Saitek X52 Pro, Saitek pro flight rudder pedals, TrackIR4, Audigy 2ZS, Logitech G9x, Vista 64bit.

Posted

To really test whether something is GPU or CPU dependent you need to test identical scenarios with variable hardware configuration. This is most easily achieved through underclocking your hardware for some of the tests, which is what I intend to do. (Though since most of my hardware is actually overclocked, part of my test will technically not be through underclocking, but rather returning to stock... :P )

 

But anyway, I need my computer free for the rest of the evening to rescue my Folding@home team from some awful daily statistics, but I'm devising a regime that will hopefully let me run some exhaustive and enlightening tests tomorrow. (Though I'll do that in a new thread in that case, since we're already off-topic here. :P Included in that thread will be all relevant track files and so on.)

Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted
I am running Black Shark on an AMD X2 dual-core 7750 with an Nvidia 285 graphics card and have some questions.

(1) What are the best Nvidia control panel settings to use for the game?

(I have no idea what to set options like force mipmaps, texture filtering (there are four options to select) and pre-rendered frames to, to get the best out of the game.)

(2) In Black Shark, do you need to set anti-aliasing and anisotropic filtering in the Nvidia control panel, or should you select application-controlled and then set them in game?

(3) Would you recommend using the above settings for Lock On as well?

 

I agree with what has been said by others here. BS does not appear to utilise the GPU efficiently. In addition, eye-candy post-processing effects like AA can pretty much be set to max without fear of diminishing FPS, since BS is under the threshold anyway. Other settings like vsync are pointless, as it is unlikely anyone will exceed the monitor's refresh rate at this time.

 

As an alternative to the rather unfriendly Nvidia CP, you may want to consider nHancer, which allows you to customise profiles more easily. The application also contains hyperlinks to online help, which explains each setting and shows comparisons between the possible choices.

hardware: Alienware Area-51 7500 - 2x 8800 GTX 768 MB SLI - 4GB RAM - Vista 64-bit - Saitek X52 Pro - TrackIR 5 Pro

Posted

Thanks for all the replies.

EtherealN, are you saying that I should actually see no frame rate difference in moving the in-game detail from Medium to High?

Also, could someone please give some guidance on what I should set the Nvidia control panel options to, to get the best from the game? (AA and AF I will try at 8x, and vsync I will turn off as I understand it can slightly reduce the frame rate, but for the other settings I have no idea what to select.)

Posted

Also, could someone please give some guidance on what I should set the Nvidia control panel options to, to get the best from the game? (AA and AF I will try at 8x, and vsync I will turn off as I understand it can slightly reduce the frame rate, but for the other settings I have no idea what to select.)

 

I have included hyperlinks to nHancer's help. Though you may not be using it, it does explain the same settings found in Nvidia's control panel, complete with pretty piccys :thumbup:

 

  • AA should be good at 8x

 

  • You don't need to worry about vsync unless your frame rate EXCEEDS the refresh rate of the monitor. Most monitors these days are VESA 75 Hz or better for CRTs, or 60 Hz for LCDs. Those are rather large numbers as far as BS is concerned: many people are only getting 10-25 FPS, so turning on vsync is unlikely to hinder game performance (since your monitor is at least 60 Hz). [1] www.nhancer.com/?dat=d_enhancements#VSync

 

 

  • Anisotropic filtering is a good one. Perhaps start at 4x and work your way up; valid values depend on what card you have. www.nhancer.com/?dat=d_AFPres

hope this helps

 

[1] This is because we are only seeing one (1) in-game frame for every three (3) monitor refreshes. In other words, by the time BS has rendered a scene, three monitor refreshes have occurred anyway, regardless of what the computer is outputting: 'the monitor is waiting for the game'. With a very fast game, one that exceeds 65 FPS on a 60 Hz monitor, enabling vsync will 'slow' the game, limiting the FPS to the monitor's refresh rate of 60 Hz: 'the game is waiting for the monitor'. Seriously, turning off vsync on fast games is ridiculous, since you are not seeing a true frame rate anyway - merely half frames - not to mention frame tearing.
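To put that footnote's arithmetic into something you can play with, here is a tiny sketch of the relationship being described; it deliberately ignores the further rate quantisation that strict double-buffered vsync can introduce.

```python
# Illustration of the footnote: vsync only caps the frame rate once the
# game renders faster than the monitor refreshes. (The further rate
# quantisation of strict double-buffered vsync is deliberately ignored.)
def vsync_effect(render_fps, refresh_hz=60):
    displayed_fps = min(render_fps, refresh_hz)            # the vsync cap
    refreshes_per_game_frame = refresh_hz / displayed_fps
    return displayed_fps, refreshes_per_game_frame

for fps in (20, 35, 60, 90):
    shown, repeats = vsync_effect(fps)
    print(f"render {fps:3d} fps -> displayed {shown:3d} fps, "
          f"each game frame lasts ~{repeats:.1f} monitor refreshes")
```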

hardware: Alienware Area-51 7500 - 2x 8800 GTX 768 MB SLI - 4GB RAM - Vista 64-bit - Saitek X52 Pro - TrackIR 5 Pro

Posted

my advice:

 

Anisotropic filtering - Force 16x
Antialiasing - Gamma correction - On
Antialiasing - Mode - Override application setting
Antialiasing - Setting - 16xQ
Antialiasing - Transparency - On
Conformant texture clamp - Use hardware (use default)
Error reporting - Off (use default)
Extension limit - Off (use default)
Force mipmaps - Trilinear (use default)
Maximum pre-rendered frames - 6 (use default)
Multi-display/mixed-GPU acceleration - Single display performance mode
Texture filtering - Anisotropic sample optimisation - On
Texture filtering - Negative LOD bias - Allow (use default)
Texture filtering - Quality - Quality
Texture filtering - Trilinear optimisation - Auto
Triple buffering - On
Vertical sync - Force on

 

 

Then you will get better graphics.

RTX 3070

Posted
EtherealN, are you saying that I should actually see no frame rate difference in moving the in-game detail from Medium to High?

 

I have a 9800GTX+ currently running at stock settings, and it barely feels any settings in there unless I ridiculously overdo them - and it seems to feel the water, but I am devising tests aiming to discern whether it actually is the GPU that's being hit by that. I keep nicely fluid FPS even if I go for 16x AA and 16x AF (though I was forced away from the highest settings since they caused some flickering).

 

But your graphics card is much stronger than mine, and the native graphics settings within DCS: Black Shark barely change anything in my DCS performance - as mentioned, except for the water.

 

If you have performance issues, it's the CPU I would be looking at, since the governing factor is the MASSIVE amount of physics calculations that the CPU has to do. I have also seen people upgrade from 8800GTs to GTX 285s and similar and report zero performance difference, which would indicate that even their old 8800s weren't under any serious strain.

 

Other things to remember are that mirrors eat a good bit of performance on your computer, and playing in windowed mode is absolute murder.

 

But in the end, yes, upping to maximum (except for the water) should have no or next to no effect on your FPS. (Adding AA and AF and so on might have an effect, and depending on the card you might actually get better FPS with 16x AA than with 8x or lower... :P )

Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted
To really test whether something is GPU or CPU dependent you need to test identical scenarios with variable hardware configuration. This is most easily achieved through underclocking your hardware for some of the tests, which is what I intend to do. (Though since most of my hardware is actually overclocked, part of my test will technically not be through underclocking, but rather returning to stock... :P )

 

 

That is exactly what I did. I didn't really play the game for the first two days because I was testing different settings, until I finally found my optimal ones.

i7 920@4.0Ghz, 12 GB RAM, ATI 4890, LG L246WHX@1920x1200, Saitek X52 Pro, Saitek pro flight rudder pedals, TrackIR4, Audigy 2ZS, Logitech G9x, Vista 64bit.

Posted

Okeys, but that wasn't entirely evident in the results display you made. I would be interested in your test methodology - who knows, you may have noticed something I missed.

 

But the important point is that just because a setting governs a graphical effect does NOT mean that it has anything to do with the GPU. It probably does, but there's nothing really stopping the code from being written to execute certain bits of the graphics work on the CPU.

 

Then of course there are always some things that are technically 100% GPU-resident but that will not affect your FPS at all, and that is the most common case - for things that are GPU-resident (like texture sizes and so on), tweaking them does not involve the CPU - but with DCS:BS it seems to me that those settings will not stress the GPU to the point where it is working at capacity. In that case, changing those settings will have no effect on overall performance, and that is the main thing I want to highlight. (Just as in the discussion about people wanting BS to be SLI-enabled, I had to ask what the point was, since my previous-generation card isn't being run at capacity anyhow... Why spend energy letting people run two or three new-generation cards when the extra capacity will go unused?)

 

Anyways, if you feel like it, please PM me the details of your testing methodology - I'm not a game QA professional so there might be important parameters I have forgotten to adjust for. :)

Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted
Okeys, but that wasn't entirely evident in the results display you made. I would be interested in your test methodology - who knows, you may have noticed something I missed.

 

 

Sorry, I didn't mention that I have tested many, many different settings.

I posted only two results as an example. I created my own track (around 15 minutes long) where I fly with the Shkval always on.

My idea was that the most important information is the minimum FPS: it occurs with the Shkval on and a lot of action on screen, so I adjusted my settings accordingly.

My major conclusion was that this game is mostly about the CPU, not the GPU.

Also, RAM plays a major role during the loading of missions.

I will send you a summary of my tests.
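If anyone wants to judge their settings by the worst case in the same way, a small variation on the earlier frametime sketch is to measure how much of a run drops below a chosen floor; the 20 FPS threshold and filename below are placeholders.

```python
# Count how much of a run falls below a chosen FPS floor, using the same
# kind of cumulative frametime log (in ms) as the earlier sketch.
# Threshold and filename are placeholders.
import csv

def time_below(path="frametimes.csv", floor_fps=20.0):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                       # skip the header row
        times_ms = [float(row[1]) for row in reader if row]
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]
    slow_ms = sum(d for d in deltas if d > 1000.0 / floor_fps)
    total_ms = times_ms[-1] - times_ms[0]
    return 100.0 * slow_ms / total_ms

if __name__ == "__main__":
    print(f"{time_below():.1f}% of the run spent below 20 FPS")
```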

i7 920@4.0Ghz, 12 GB RAM, ATI 4890, LG L246WHX@1920x1200, Saitek X52 Pro, Saitek pro flight rudder pedals, TrackIR4, Audigy 2ZS, Logitech G9x, Vista 64bit.
