Lock On and SLI technology



Well, it appears that SLI does not work with Lock On. The most likely reason is that NVIDIA has not optimized the SLI driver for Lock On.

 

Is Eagle Dynamics in contact with NVIDIA to solve this "problem"? Or is it the Lock On engine, the technology behind the Lock On code, that prevents performance gains with SLI?

 

Check this link and you will see that there is no gain when using SLI with Lock On:

 

http://firingsquad.com/hardware/nvidia_nforce_4_sli/page7.asp

Thermaltake Kandalf LCS | Gigabyte GA-X58A-UD3R | Etasis ET750 (850W Max) | i7-920 OC to 4.0 GHz | Gigabyte HD5850 | OCZ Gold 6GB DDR3 2000 | 2 X 30GB OCZ Vertex SSD in RAID 0 | ASUS VW266H 25.5" | LG Blu-Ray 10X burner | TIR 5 | Saitek X-52 Pro | Logitech G930 | Saitek Pro flight rudder pedals | Windows 7 Home Premium 64 bit


SLI tech

 

It's not the Direct3D-based engine that is the problem; it's the SLI technology itself.

SLI, IMHO, is a technology demo/gimmick rather than a mass-market product.

I know many people won't agree, but SLI will create more problems than it solves. Take PSU requirements, for example.

Marketing-wise, though, it's a great idea (benchmarks).

But I believe it may slow down driver development.

NVidia seems to have a difficult time making new drivers available because of the Unified Driver Architecture, and now... SLI.

BTW, I am a very happy owner of a 6800GT.

LockON is IT!

AMD 3500+ - GF6800GT - 1GB RAM low latency - MSI NEO2 Platinum

20" BENQ S-IPS TFT 1600X1200 - 32 bit color


  • 2 weeks later...

Devs, Have you evaluated LOMAC compatibility with SLI?

 

NVidia has a programming guide on their site with a section pertaining to SLI.

 

http://developer.nvidia.com/object/gpu_programming_guide.html

 

My recent experiences running LOMAC with an Athlon 3500+ and 6800GT SLI initially led me to believe that LOMAC was strictly CPU-limited, but after overclocking my CPU 10% I saw only a 1% gain in LOMAC performance at 1024x768. Further, I've tweaked every setting I could to reduce CPU overhead and created missions devoid of AI units, and saw minimal improvement in LOMAC framerates. Turning on the GPU load-balancing display in the NVidia drivers confirmed that no matter the settings or mission parameters in LOMAC, the GPUs simply aren't sharing the graphics load in AFR or SFR mode. So the whole "CPU bottleneck" idea may be distracting from an additional problem: SLI being fundamentally unworkable with LOMAC in its current state. This message is not intended to start some pathetic flame war about SLI; IT'S ADDRESSED SPECIFICALLY TO ED, ABOUT SLI OBSERVATIONS AND FUTURE POTENTIAL :wink:
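
If anyone wants to repeat the experiment, here is a bare-bones sketch of how I'd separate per-frame CPU work from render submission on the timing side. To be clear, UpdateSimulation() and RenderScene() are stand-ins I made up (we obviously can't see LOMAC's real loop); only the measurement idea matters.

[code]
// Back-of-envelope frame profiler (Win32, D3D9 era). UpdateSimulation()
// and RenderScene() are hypothetical stand-ins for a game's per-frame
// work, stubbed here with Sleep() so the program actually runs.
#include <windows.h>
#include <cstdio>

void UpdateSimulation() { Sleep(10); }  // stub: AI/physics (pure CPU)
void RenderScene()      { Sleep(5);  }  // stub: draw submission + Present

int main()
{
    LARGE_INTEGER freq, t0, t1, t2;
    QueryPerformanceFrequency(&freq);

    for (int frame = 0; frame < 100; ++frame)
    {
        QueryPerformanceCounter(&t0);
        UpdateSimulation();                 // CPU-only work
        QueryPerformanceCounter(&t1);
        RenderScene();                      // if the GPU is the bottleneck,
                                            // the driver stalls in here
        QueryPerformanceCounter(&t2);

        printf("frame %3d: update %6.2f ms, render %6.2f ms\n", frame,
               1000.0 * (t1.QuadPart - t0.QuadPart) / freq.QuadPart,
               1000.0 * (t2.QuadPart - t1.QuadPart) / freq.QuadPart);
    }
    return 0;
}
[/code]

If the update column dominates, you're CPU-bound; if the render column dominates and enabling SLI changes nothing, the GPUs aren't splitting the load.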

 

The NVidia programming guide section on SLI is a very quick read. ED, could you provide feedback on the feasibility of SLI support, or on any correspondence with NVidia on that topic? Any feedback from ED would be greatly appreciated.

 

Thanks,

 

spearsd


Re: Devs, Have you evaluated LOMAC compatibility with SLI?

 

ED, could you provide feedback on the feasibility of SLI support, or on any correspondence with NVidia on that topic? Any feedback from ED would be greatly appreciated.

 

ED and All,

 

I, for one, thoroughly endorse the above plea and invite everyone interested in this subject to come here and write a line or two, just to let ED know how important SLI in LOMAC would be for us all.

 

To the devs: thank you in advance for letting us know your plans regarding SLI.

FlyHigh and Check Six!

Muttley, out.


All of whom? The few people who can currently afford this are likely a very small minority.

 

I agree with you, GGTharos. I, myself, will have to go to hell and back to afford it now, but don't lose perspective... prices drop, and the sooner support for SLI is implemented, the better.

 

Could someone enlighten me about one SLI question? This issue with SLI and LOMAC, the fact that the latter doesn't derive any benefit from the former... is this a common scenario with other games/sims? Which other games face the same problem? I ask because if, for instance, 90% of games suffer the same problem, then considering what GGTharos said, maybe the devs shouldn't spend time on this. But if it's the other way around, I would like to know why the devs couldn't or shouldn't fix this issue.

 

Anyways, it would be wonderful to hear directly from them.

FlyHigh and Check Six!

Muttley, out.


Honestly, I think if it's an easy fix for them, they should do it. If it's not an easy fix... well. But they know better than me, and I think they'll make the right decision :)

 

I wonder what's involved in the SLI programming myself.


Reminder: SAM = Speed Bump :D

I used to play flight sims like you, but then I took a slammer to the knee - Yoda


Honestly, I think if it's an easy fix for them, they should do it. If it's not an easy fix... well. But they know better than me, and I think they'll make the right decision :)

 

I wonder what's involved in the SLI programming myself.

 

To partially answer your question, some of the guidelines and "don'ts" can be found in the short SLI section of the NVidia GPU Programming Guide, here...

 

http://developer.nvidia.com/object/sli_home.html

 

There really is no coding directly for SLI, just certain guidelines and particular programming practices to avoid. In a nutshell, it could be a simple enhancement to existing code :) or a major rewrite :cry: . It all depends on the developer's existing design.
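
To give a flavor of the "don'ts", the one the guide stresses most is synchronous render-target readback. The fragment below is purely illustrative D3D9 that I made up (not anything from LOMAC), showing the pattern that defeats AFR.

[code]
// Illustrative AFR-hostile pattern from the SLI guidelines (invented
// example, not LOMAC code). A synchronous GPU-to-CPU readback every
// frame forces the CPU to wait on whichever GPU owns that frame, so
// the second GPU can never run ahead.
#include <d3d9.h>

void EndOfFramePath(IDirect3DDevice9* dev,
                    IDirect3DSurface9* renderTarget,  // current frame's RT
                    IDirect3DSurface9* sysmemCopy)    // D3DPOOL_SYSTEMMEM
{
    // ... scene has just been drawn into renderTarget ...

    // Anti-pattern: stalls the pipeline once per frame.
    dev->GetRenderTargetData(renderTarget, sysmemCopy);

    // Guideline: keep intermediate results on the GPU (render-to-texture),
    // and read back rarely (screenshots, debugging), not every frame.
}
[/code]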

 

-spearsd


It is my understanding that NVIDIA creates the SLI driver support for each specific game! So there is nothing Eagle Dynamics can do other than send an e-mail to NVIDIA requesting that the SLI driver be updated to include Lock On!

 

Maybe we could all send an e-mail or two asking NVIDIA directly to update the driver.

 

After all, Lock On is the only modern combat aircraft simulator on the market today!

Thermaltake Kandalf LCS | Gigabyte GA-X58A-UD3R | Etasis ET750 (850W Max) | i7-920 OC to 4.0 GHz | Gigabyte HD5850 | OCZ Gold 6GB DDR3 2000 | 2 X 30GB OCZ Vertex SSD in RAID 0 | ASUS VW266H 25.5" | LG Blu-Ray 10X burner | TIR 5 | Saitek X-52 Pro | Logitech G930 | Saitek Pro flight rudder pedals | Windows 7 Home Premium 64 bit


Hajduk,

 

If you look at the NVidia documentation I've linked to above, it explains how SLI works and what is necessary. No, you typically don't need game-specific SLI support in the NVidia driver, although some games will, by design, be unable to use SLI effectively (also explained there). The NVidia drivers appear to use SLI effectively to improve most of the games I've tried. The Forceware drivers let you create profiles for specific games and set the SLI mode to "Auto", "AFR", "SFR", or "Single GPU".

 

Then again, maybe I've misunderstood what I've read there. Read it for yourself and let me know what you think. I'm not making a sales pitch for SLI. I'm simply noting that if it can be supported through simple changes, then maybe it should be.

 

-spearsd

 



I doubt the NVidia driver is the culprit. NVidia claims that SLI doesn't require any extra developer support whatsoever; the same techniques that work for single-GPU systems work for multi-GPU systems as well.

 

I think LOMAC's sucky performance with SLI is probably the result of Ubisoft pressuring ED to release LOMAC prematurely, without proper optimization. Remember, LOMAC performance sucked to begin with.

 

The good news: if I'm right, Flaming Cliffs should work just fine with SLI, since ED has now had time to fix what's broken.


Yeasty,

I doubt the NVidia driver is the culprit. NVidia claims that SLI doesn't require any extra developer support whatsoever; the same techniques that work for single-GPU systems work for multi-GPU systems as well.

This is quite correct, although certain design approaches may render SLI completely useless (as detailed in the GPU Programming Guide).

 

I think LOMAC's sucky performance with SLI is probably the result of Ubisoft pressuring ED to release LOMAC prematurely, without proper optimization. Remember, LOMAC performance sucked to begin with.

I very much doubt this is the case, at least for SLI; remember, SLI was only recently introduced by NVidia. ED probably didn't even take it into consideration during the design of LOMAC's engine, because it wasn't available, and consequently may have implemented rendering such that SLI can't do what it's supposed to. Maybe in a complex way, but maybe in a very, very simple way. I'm just hoping they look at it.
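
A concrete example of the "very, very simple way" I mean (again an invented, hypothetical D3D9 fragment, not LOMAC's code): any effect where frame N+1 samples a surface rendered during frame N, since under AFR those two frames live on different GPUs.

[code]
// Hypothetical frame-to-frame dependency (not LOMAC's code). Under AFR,
// GPU0 renders frame N and GPU1 renders frame N+1; if frame N+1 samples
// a texture produced in frame N, the driver must transfer that surface
// between the cards, serializing them and erasing the SLI gain.
#include <d3d9.h>
#include <utility>  // std::swap

void RenderWithHistory(IDirect3DDevice9* dev,
                       IDirect3DTexture9*& prevFrameTex,  // written last frame
                       IDirect3DTexture9*& currFrameTex)  // written this frame
{
    // Cross-frame read: e.g. a motion-blur or heat-haze accumulator.
    dev->SetTexture(0, prevFrameTex);   // <- forces a GPU-to-GPU copy in AFR

    // ... draw this frame into currFrameTex using prevFrameTex ...

    std::swap(prevFrameTex, currFrameTex);  // ping-pong for next frame
}
[/code]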

 

 

The good news: if I'm right, Flaming Cliffs should work just fine with SLI, since ED has now had time to fix what's broken.

If they had, I'm sure they would have told us by now ;-) We shall probably have to wait beyond 1.1 for the possibility of SLI compatibility, if we see it at all.


The good news: if I'm right, Flaming Cliffs should work just fine with SLI, since ED has now had time to fix what's broken.

 

I haven't seen any statement from ED that Flaming Cliffs will benefit from SLI.

Thermaltake Kandalf LCS | Gigabyte GA-X58A-UD3R | Etasis ET750 (850W Max) | i7-920 OC to 4.0 GHz | Gigabyte HD5850 | OCZ Gold 6GB DDR3 2000 | 2 X 30GB OCZ Vertex SSD in RAID 0 | ASUS VW266H 25.5" | LG Blu-Ray 10X burner | TIR 5 | Saitek X-52 Pro | Logitech G930 | Saitek Pro flight rudder pedals | Windows 7 Home Premium 64 bit


I'm not like you.

 

If you're like me, you're going to buy an SLI-capable board and get a kick-ass NON-SLI-CAPABLE CARD (X850XT PE). This way, when a good SLI solution comes along, I'll see if it's better than the X850 (doubtful :D)

 

Sorry, but last time I checked, this thread wasn't about what "still to be released in stores" video card to buy for LOMAC. It's about "Lock On and SLI technology".

 

BTW, I'm an ATI fan from way back (I've owned a Mach64, 8500, 9700Pro, 9600XT, and 9800XT), and I think the X850XT PE will be a wonderful card when it's finally fabricated reliably and available to the masses. If it had been available a month ago, I probably would have bought one instead of my SLI pair :lol: .

 

So, let's get back to the topic, eh? Wow, I feel sort of like a moderator here :roll:.


Flaming Cliffs will not be delayed according to Valery.

 

Besides, what's SLI gonna do for you? Can you afford it? If you can, you can afford a kick-ass card too, and you can live with it as well.


Reminder: SAM = Speed Bump :D

I used to play flight sims like you, but then I took a slammer to the knee - Yoda


GGTharos,

I agree with you about FC not being delayed; it shouldn't be. CPU increases seem to be more important for LOMAC than SLI right now.

 

However, my initial experiments with overclocking an A64 3500+ to 2605MHz (up 18% from 2200MHz) showed a small performance increase of only 3%. I'm still quite confused by the small gains, and it leads me to believe that LOMAC may have performance issues that go beyond simply being CPU-bound. What might those be? I don't know.

 

The important thing is to get the devs to investigate LOMAC performance with CPU scaling (including multiprocessing) and multi-GPU (SLI or other) in mind, so that in the near future we will all be able to leverage these technologies as they become more commonplace on the desktop.

 

-spearsd

 

 

 

 



I agree that it requires some investigation... I too think the AI code in particular could probably be optimized to run faster, and the same goes for the shaders, etc. However, I'm only guessing. :/


Reminder: SAM = Speed Bump :D

I used to play flight sims like you, but then I took a slammer to the knee - Yoda


spearsd,

 

Your point seems to be well substantiated. The performance increase after CPU overclocking seems negligible. One thing I've been asking myself since you reported your findings is how LOMAC would perform with a Gigabyte 3D1, the single-PCB SLI'ed 6600GT. :?:

FlyHigh and Check Six!

Muttley, out.


spearsd,

 

I'm guessing about the same as a pair of 6800GTs in a system like mine with a stock 3500+. As I noted before, it seems like the 6800GT pair is barely breathing hard. I just want to know why my 20% CPU gain equals a 3% LOMAC gain! :-) LOL!

 

-spearsd

 



spearsd,

 

I understand your point, but there's also the possibility that your SLI setup shows no improvement in LOMAC because LOMAC is incompatible with SLI. If that's the case, it's OK for you, since LOMAC could simply disregard the second board and work from the first one.

 

But if that's the case, what would LOMAC do when you can't "undo" the SLI, at least hardware-wise? Maybe just deselecting the SLI option in NVidia's driver would do for the 3D1, but I'd still like to hear from someone with hands-on experience with this board.

FlyHigh and Check Six!

Muttley, out.


No, my point is that LOMAC currently doesn't benefit from SLI, regardless of whether it's operating in single or dual GPU mode, because that's not where the bottleneck is.

 

Everyone says the bottleneck is the CPU, and part of me wants to jump on that boat, but I've tried overclocking my CPU 18% and received only a 3% gain in LOMAC performance. Now, I don't expect a 1:1 performance increase from overclocking, but 6:1 seems somewhat below my expectations if LOMAC is as CPU-bound as I keep hearing throughout the LOMAC community. Maybe LOMAC uses some funky job manager for certain "less critical" computations and load-sheds them based on priority. This could explain the funky 6:1 correlation. I just don't know. :roll:
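
Putting numbers on that 6:1 with a simple Amdahl-style model (my own back-of-envelope, nothing official): an 18% clock increase buying only 3% more fps implies that only about a fifth of the frame time actually scales with CPU clock.

[code]
// Amdahl-style back-of-envelope: what fraction p of frame time scales
// with CPU clock if +18% clock yields only +3% fps?
// Normalized model: newTime = (1 - p) + p / 1.18, and a +3% frame rate
// means newTime = 1 / 1.03. Solve for p.
#include <cstdio>

int main()
{
    const double clockGain = 0.18;  // +18% CPU clock
    const double fpsGain   = 0.03;  // +3% frame rate observed

    double timeSaved = 1.0 - 1.0 / (1.0 + fpsGain);          // ~0.029
    double p = timeSaved / (1.0 - 1.0 / (1.0 + clockGain));  // ~0.19

    printf("clock-scalable share of frame time: ~%.0f%%\n", 100.0 * p);
    // ~19%: the other ~80% of the frame is limited by something that
    // doesn't speed up with core clock (memory, driver, GPU, etc.).
    return 0;
}
[/code]

If that model is anywhere near right, most of the frame is spent somewhere that neither the CPU clock nor the second GPU is touching.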

 

As for operating in single GPU mode: yes, you can do this effectively through NVidia's control panel. There's no need to reconfigure the mainboard if that's what you want to accomplish (unless you will also alternatively be running a quad display).

 



Hey spearsd, you got me all wrong... :shock: it's prolly my poor English, sorry... :oops:

 

Anyways, as I said before, I *had* understood your point, and I agree with it totally. After your overclocking, I too would have expected a greater performance increase from such a (supposedly) CPU-bound sim.

 

What I was trying to say about the single-PCB SLI solution offered by Gigabyte is that it would be interesting to use it to break the problem LOMAC has with SLI in two: hardware and software.

 

The idea would be to forget about the option to disable SLI in NVidia's driver and see whether LOMAC performs as if it had dual 6600GTs or a single one (regardless of how that performance compares to any other board). If LOMAC performs with only one GPU, nothing's new. But if it performs with both GPUs, I guess we would be much closer to a solution for regular SLI... the programming changes might not be so complicated after all!

 

Of course, much simpler than all that would be if someone from the dev team could take 5-10 minutes to tell us what's going on and what the plans are. :roll: It's just me being overcurious about this LOMAC/SLI thing...

FlyHigh and Check Six!

Muttley, out.

