Posted

FWIW, I got my EVGA GTX 670 FTW overclocked to 1238 MHz (Max Boost Freq) and 3691 MHz memory clock (7382 MHz effective), and its power stays below 112% the entire time under full load. (I have my power target set to 114% so it doesn't throttle from exceeding the TDP.) Max temps in any benchmark, and the highest I've seen (in BF3), have been 67 C using a fan curve that gives me 80% fan speed at 69 C (to keep it under the first step of Kepler thermal throttling, which starts occurring at 70 C).

 

Just did a quick test in BF3 after running numerous Heaven DX11 benchmarks flawlessly... no less than ~65-70 fps with maxed-out ultra settings and 16xQ CSAA, 16x AF, 8x SS in the NVIDIA CP. I'm using a single display at 1920x1080...

 

This card absolutely kicks ass!!!


[sigpic]http://www.virtualthunderbirds.com/Signatures/sig_LD.jpg[/sigpic]

Virtual Thunderbirds, LLC | Sponsored by Thrustmaster

 

Corsair 750D Case | Corsair RM850i PSU | ASUS ROG MAXIMUS X CODE | 32GB Corsair DDR4 3200 |

Intel i7-8086K | Corsair Hydro H100i v2 Cooler | EVGA GTX 1080 Ti FTW | Oculus Rift |

X-Fi Titanium Fatal1ty | Samsung SSD 970 EVO 1TB NVMe | Samsung SSD 850 EVO 1TB | WD Caviar Black 2 x 1TB |

TM HOTAS Warthog | TM Pendular Rudder | TM MFD Cougar Pack | 40" LG 1080p LED | Win10 |

Posted

Not yet, using 301.42 ATM.


Posted
Not yet, using 301.42 ATM.

 

I'm using the newest driver and I haven't noticed a difference. But I was only on 301.42 for a day, so I didn't have enough time to test it out.

 

What I'm also finding impressive is I've been able to record with FRAPS at a steady 30FPS during almost half a dozen BS2 missions. This card truly is a monster. Makes me wonder what the 690 4GB can do.

 

Now if only DCS in future updates could utilize more cores, then with good video cards with lots of VRAM, along with 64-bit goodness, we will hopefully end up with missions that have hundreds or even thousands of units. Yummy.

i7-12700k, 32GB Ram, RTX 3060 12GB, TrackIR 5, Lots of SSD Space, etc etc

DCS World - All the cool modules

Posted
...Just did a quick test in BF3 after running numerous Heaven DX11 benchmarks flawlessly... no less than ~65-70 fps with maxed-out ultra settings and 16xQ CSAA, 16x AF, 8x SS in the NVIDIA CP. I'm using a single display at 1920x1080...

 

This card absolutely kicks ass!!!

 

Yes I'm still looking forward to spending a few hours in BF3 with ultra high settings. I bet it looks amazing! :thumbup:


Posted
Great to hear! I'll be spending some time this week tweaking mine. What version of EVGA's GTX 670 do you have, and if you don't mind, what max boost freq did you manage (and at what boost clock offset did this OC run without thermal throttling)? Just looking for a comparison...

 

I have a GTX 670 FTW.

 

I believe I put the slider up to +100 on the core clock and +250 on the memory in the EVGA OC'ing software. My experience with overclocking video cards is almost nil, so I'm keeping the OC light for now, until I learn more. I'm not even sure what temps are being produced yet when running games at high settings. I will have to do some testing at some point. Maybe a few days off work..heheh. :-)


Posted
I believe I put the slider up to +100 on the core clock and +250 on the memory in the EVGA OC'ing software. My experience with overclocking video cards is almost nil, so I'm keeping the OC light for now, until I learn more. I'm not even sure what temps are being produced yet when running games at high settings. I will have to do some testing at some point. Maybe a few days off work..heheh. :-)

 

Be sure to post your MBF (Max Boost Freq), not the core clock offset or mem clock offset. The reason: my card, the EVGA GTX 670 FTW, runs a +50 clock offset that bumps it from its default boost clock of 1084 MHz to 1134 MHz. Add the Kepler boost (set by card design; mine is an extra 104 MHz before thermal throttling starts, which happens above 69 C), and the MBF comes out to 1238 MHz (default boost + offset + Kepler boost). That's what my clock is actually running at, as long as the temp stays below 70 C so the Kepler boost doesn't ratchet down.

 

For someone else using a different version of the GTX 670, the default boost clock could be 915 MHz, for instance. Add a +100 clock offset and that card's Kepler boost (it varies, but often seems to be 104 MHz), and that owner ends up with an MBF of 1119 MHz.

 

Hence, my +50 clock offset = 1238 MHz, while someone whose GTX 670 has a lower boost clock than mine (915 MHz in my example) and a +100 clock offset = 1119 MHz. This is why the MBF is the ONLY number that really counts in the end, as every card is different (reference cards, superclocked versions, FTW editions, etc.).
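If it helps, the MBF arithmetic above is just addition and can be sketched in a few lines of Python. The 104 MHz Kepler boost default is the figure from my FTW card; it varies per card, so treat it as a placeholder:

```python
# Sketch of the MBF arithmetic: default boost clock + manual clock offset
# + Kepler boost. The 104 MHz default is one observed value, not universal.
def max_boost_freq(default_boost_mhz, clock_offset_mhz, kepler_boost_mhz=104):
    return default_boost_mhz + clock_offset_mhz + kepler_boost_mhz

print(max_boost_freq(1084, 50))   # FTW card, +50 offset -> 1238 MHz
print(max_boost_freq(915, 100))   # example reference card, +100 offset -> 1119 MHz
```

Two cards with different offsets can land on very different MBFs, which is exactly why the offset alone tells you nothing.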

 

In the end, the headroom for overclocking is probably a little higher for the base/reference models than it is for a factory overclocked card, which means the "slower" stock cards will probably run with a higher + clock offset, but rarely faster than the cards that are factory overclocked even if their + clock offset is lower.

 

Hopefully that all made sense.

 

Just keep your temp under 70 C while under full load (running the Heaven DX11 benchmark maxed out) and you'll get the max out of your GTX 670. The guide will help you find your optimal + clock offset and + mem offset. Once you've found your offsets, set a fan curve that keeps temps below 70 C. The last step is looking at the power % in GPU-Z (the highest value noted) and dragging your power slider up above that value, so the card doesn't throttle your OC for exceeding the power target you've set.
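To illustrate the fan-curve part, here's a minimal sketch of the idea. The breakpoints are made-up example values (only the 80% at 69 C matches what I actually run); in practice you'd set this graphically in the EVGA OC'ing software mentioned above:

```python
# Illustrative fan curve: ramp hard before the 70 C Kepler throttle step.
# Breakpoint values below are examples, not a recommended curve.
CURVE = [(40, 30), (55, 50), (65, 70), (69, 80)]  # (temp C, fan %)

def fan_percent(temp_c):
    for temp, pct in CURVE:
        if temp_c <= temp:
            return pct
    return 100  # past the last breakpoint, run the fan flat out

print(fan_percent(69))  # the 80% at 69 C point mentioned above
```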

 

Just remember to post MBF when comparing clock speeds! :thumbup:


Posted

Honestly, the memory size (2G vs 4G) really shouldn't matter much when you're talking about a triple screen setup. The overwhelming majority of your video memory is consumed by texture caching, not screen draws.

 

Put it this way... suppose you're running a 1920x1200 display. That's about 2.3 million pixels; assume 32-bit color and you're talking about 10 megabytes to contain what's on the screen. Video works by displaying one frame while rendering the next, so you need another 10 for the extra frame. That's 20 megabytes of video memory per screen, and three screens means 60 MB. You'd still have about 1988 MB of VRAM available with 3 screens on a 2 GB card...
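That back-of-the-envelope math in Python form, assuming 4 bytes per pixel and simple double buffering (AA and driver overhead will add a bit more on top):

```python
# Rough framebuffer footprint per screen: width * height * bytes-per-pixel,
# doubled for front + back buffer. Ignores AA and driver overhead.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

per_screen = framebuffer_mb(1920, 1200)  # ~18 MB, the "about 20" above
print(round(3 * per_screen))             # three screens: ~53 MB total
```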

 

On my 1 GB GTX 460, if I load up A-10C sitting at Batumi... I'm pulling this a bit outta my arse because I don't remember precisely... but if I set textures to "low", I use around 650 MB of VRAM. Putting it to medium boosts me to 9xx somewhere and it still performs fine (generally 40-60 fps, even on the ground at Batumi). If I set it to "high" textures, it maxes out the video memory and I get crap performance.

 

One side note that affects video memory: I always set my max pre-rendered frames to zero, as it's almost a necessity in multiplayer (a big cause of microstutters is pre-rendered frames that have to be scrapped because they're no longer accurate due to something another player did). Obviously, if you're setting that up around 8 (as I know some people do), that ratchets up the amount of VRAM you're using...

 

The main point, though, is that the vast majority of your video memory is being used to cache textures, the same textures that will be used on the other displays. So if you need 1.5 GB to run 1 screen, that does not mean you'll need 4.5 GB to run 3; more like 1.6 GB.
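Or as a rough model: total VRAM is a shared texture pool plus a small per-screen framebuffer. The 0.02 GB (~20 MB) per screen is the estimate from above, not a measured value:

```python
# Rough model: VRAM = shared texture cache + one framebuffer per screen.
# fb_gb_per_screen (~20 MB) is the back-of-envelope estimate from the post.
def total_vram_gb(texture_gb, screens, fb_gb_per_screen=0.02):
    return texture_gb + screens * fb_gb_per_screen

print(round(total_vram_gb(1.5, 1), 2))  # one screen
print(round(total_vram_gb(1.5, 3), 2))  # three screens: ~1.56 GB, nowhere near 4.5
```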

"Tank! I need a program for a TM Warthog!"

 


Virtual Thunderbirds, LLC | Sponsored by Thrustmaster

 

Thermaltake V9 SECC case | Corsair RM750 PSU | Asus ROG Ranger VIII | Intel i7 6700K | 16GB 3000mhz RAM |

EVGA GTX 980Ti FTW | TrackIR 4 w/ pro clip | TM HOTAS Warthog | TM MFD Cougar Pack | Win 10 x64 |

Posted

Well...yes...there is SOME memory taken up depending on how much AA you crank in there, too. Forgot about that. But then, my eyes personally don't require those settings to be too high before I'm happy.

"Tank! I need a program for a TM Warthog!"

 

[sIGPIC][/sIGPIC]

Virtual Thunderbirds, LLC | Sponsored by Thrustmaster

 

Thermaltake V9 SECC case | Corsair RM750 PSU | Asus ROG Ranger VIII | Intel i7 6700K | 16GB 3000mhz RAM |

EVGA GTX 980Ti FTW | TrackIR 4 w/ pro clip | TM HOTAS Warthog | TM MFD Cougar Pack | Win 10 x64 |
