
Recommended Posts

Posted

I have a question regarding ED's plans for the upcoming Vulkan API. Maybe you (ED) are willing to shed some light on this topic.

 

As we know, VR demands a lot of GPU power due to the need to render two screens, often at upscaled resolutions, and all of that within a complex environment (a sim). SLI, which was never really ideal either (a low percentage performance increase per added card), has not been an option for quite a few years now.

 

On the other hand, my understanding is that the Vulkan API and DX12 have multi-GPU support built in. The big question, then, is: does ED plan to support multi-GPU in their Vulkan API release?

 

As a short background, for someone who has two top-specced PCs, I feel that the RTX 5090 is still not enough for VR if image quality and framerate are to be top notch. I might be changing to an RTX 6000 PRO just to get that 10%-15% more, but it would actually help immensely if I knew that e.g. two RTX 6000 PROs would be supported by the API. What I'm effectively saying is that the VR experience is not all that it could be, and the problem isn't so much DCS as it is available GPU power lagging behind the VR demand curve.

 

I hope an ED rep can chime in on this and shed some light on what near-future hardware capabilities DCS as a platform will support.

 

-=zerO=-

  • Thanks 1


Posted

I'm not exactly deep into the details, but if the Vulkan API has multi-GPU support, is dedicated support by the application even necessary? IIRC, back in the day SLI accelerated all games, no? (Different animal, I know; that was supported at the GPU-driver level, but still.)
 

  • Like 1

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"

Posted (edited)

Or, like Lossless Scaling: one GPU does the frame generation and/or scaling, and the other one handles the actual rendering?

That way, the two GPUs would not have to be from the same vendor or be the same model, because the API does not rely on SLI.

Edited by scommander2
  • Like 2

Dell XPS 9730, i9-13900H, DDR5 64GB, Discrete GPU: NVIDIA GeForce RTX 4080, 1+2TB M.2 SSD | Thrustmaster Warthog HOTAS + TPR | TKIR5/TrackClipPro | Total Controls Multi-Function Button Box | Dell 32 4K UHD Gaming Monitor G3223Q | Win 11 Pro

 

Posted (edited)
24 minutes ago, BIGNEWY said:

Isn't SLI being dropped by NVIDIA on newer cards? I must be out of touch.

I think multi-GPU nowadays is something different from what SLI was back in the day. SLI connected the cards directly and was managed by the Nvidia driver.
Multi-GPU today combines two or more cards logically, purely in software, and divides tasks between them like scommander2 described...
But that is a very vague recollection on my side. I might be completely wrong...

I think the reason it is so out of fashion today is that it is cheaper to just use the next-tier GPU than to combine two lesser ones. (Unless you are on a 4090/5090, but then you are CPU-bottlenecked anyway in VR, at least with DLSS.) And of course it becomes stupidly expensive...

Edited by Hiob
  • Like 2

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"

Posted

I believe BIGNEWY is correct that SLI is dead, even for the RTX 6000 PRO.

A 5090 should be more than sufficient for VR. I'm using a 3090 and get a very small amount of stutter with most settings set to the highest. I think I have smoke and traffic lowered. The stutter is only noticeable when I look at the ground out of the corner of my eye. Overall I have a smooth experience.

Image quality is largely going to be limited by the headset lenses.

Posted
1 hour ago, zerO_crash said:

if I knew that e.g. two RTX 6000 PROs would be supported by the API

You know that’s like $20K worth of hardware 😮 

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | T.Flight Rudder Pedals | TrackIR 5

Posted (edited)
3 hours ago, Hiob said:

I'm not exactly deep into the details, but if the Vulkan API has multi-GPU support, is dedicated support by the application even necessary? IIRC, back in the day SLI accelerated all games, no? (Different animal, I know; that was supported at the GPU-driver level, but still.)
 


What I read is that the support is there, but the individual developer has to enable it in their project. I am not sure, however, how simple or difficult it is to enable, and whether there is ongoing maintenance associated with it (on the final developer's side, i.e. ED's). You don't even need a driver from Nvidia/AMD supporting this, as the underlying functionality (distributing work among the GPUs recognized by the Windows installation) lies entirely in the Vulkan API.
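To make that opt-in concrete, here is a minimal sketch of my own (nothing to do with DCS or ED's code) of the Vulkan 1.1 device-group query an application would start from; the driver reports which sets of physical GPUs can be combined into a single logical device. It assumes a VkInstance created with API version 1.1 or later.

/* Minimal sketch: list the multi-GPU "device groups" a Vulkan 1.1
 * instance exposes. Each group is a set of physical devices that the
 * driver allows one logical device to span. */
#include <stdio.h>
#include <vulkan/vulkan.h>

static void list_device_groups(VkInstance instance)
{
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, NULL);
    if (groupCount > 8) groupCount = 8;   /* small fixed cap for the sketch */

    VkPhysicalDeviceGroupProperties groups[8];
    for (uint32_t i = 0; i < groupCount; ++i) {
        groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        groups[i].pNext = NULL;
    }
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups);

    for (uint32_t i = 0; i < groupCount; ++i)
        printf("Group %u: %u physical device(s)\n", i, groups[i].physicalDeviceCount);
}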
 

3 hours ago, BIGNEWY said:

Isn't SLI being dropped by NVIDIA on newer cards? I must be out of touch.


SLI doesn't exist anymore. The connectors have been dropped since at least the 1080 Ti, as far as I remember. The idea with the multi-GPU approach mentioned here is that the cards are not supposed to need NVLink (SLI); the "cooperation" between them is meant to go over PCI Express.
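For illustration only (again my own sketch, not ED's method), this is roughly what the explicit opt-in looks like on the application side: the device group is chained into normal device creation, and no vendor link hardware is referenced anywhere in the API. It assumes an already prepared VkDeviceCreateInfo (queues, extensions) exists.

/* Hedged sketch: create one logical device spanning all GPUs in a device
 * group. 'group' is one of the entries enumerated in the previous sketch;
 * 'baseInfo' is an already-filled VkDeviceCreateInfo. */
static VkDevice create_group_device(VkPhysicalDeviceGroupProperties group,
                                    const VkDeviceCreateInfo *baseInfo)
{
    VkDeviceGroupDeviceCreateInfo groupInfo = {
        .sType = VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO,
        .physicalDeviceCount = group.physicalDeviceCount,
        .pPhysicalDevices = group.physicalDevices,
    };

    VkDeviceCreateInfo createInfo = *baseInfo;
    createInfo.pNext = &groupInfo;   /* the multi-GPU opt-in */

    VkDevice device = VK_NULL_HANDLE;
    /* Any member of the group may serve as the handle passed to vkCreateDevice. */
    vkCreateDevice(group.physicalDevices[0], &createInfo, NULL, &device);
    return device;
}

After that, work can be routed to individual GPUs in the group via per-device masks on commands and allocations, which is where the real engine work (and the per-frame load balancing) would live.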

 

A good example of multi-GPU work running across multiple GPUs in unison is AI modelling/simulation software. There is other professional software supporting this as well.

 

I'm posting links to a scientific paper from the Norwegian university NTNU that goes in depth on this matter. It would be interesting to hear from a competent programmer how doable this would be here.

 

Short version:

https://www.ntnu.no/ojs/index.php/nikt/article/view/5367/4843

 

Full version:

https://bora.uib.no/bora-xmlui/bitstream/handle/1956/19628/report.pdf?sequence=1&isAllowed=y

Edited by zerO_crash
  • Like 1


Posted (edited)
2 hours ago, Zebra1-1 said:

I believe BIGNEWY is correct that SLI is dead, even for the RTX 6000 PRO.

A 5090 should be more than sufficient for VR. I'm using a 3090 and get a very small amount of stutter with most settings set to the highest. I think I have smoke and traffic lowered. The stutter is only noticeable when I look at the ground out of the corner of my eye. Overall I have a smooth experience.

Image quality is largely going to be limited by the headset lenses.


What is and isn't enough is very individual. I'm currently waiting for the Bigscreen Beyond 2e, and meanwhile am using the Meta Quest 3. In order to get anything close to 2D-monitor clarity, I have to render at 1.7x resolution. Even if I stayed at native resolution and kept many settings at medium or close to it (view range, clouds, shadows, mirror resolution, etc.), ASW still drops me down to 45 fps. In particular, the Ka-50 BS3 cockpit (with the vanilla high-res textures) is a monster on the fps and brings the 5090 to its knees. We are very far away from "enough", tbh.

 

2 hours ago, SharpeXB said:

You know that’s like $20K worth of hardware 😮 

 

Yup, money was never a problem for me. Time, on the other hand... 🤷‍♀️ But that's me. The genius part of this implementation is that it would allow you to mix and match GPUs from a manufacturer and still have it work. You wouldn't be locked to, say, 2x 980s; instead, a 2080 Ti and a 3090 Ti would work together. I'm pretty sure it would be a welcome addition across the community.
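For what it's worth, and purely as my own illustration of why mismatched cards could work at all: Vulkan also lets an application skip device groups entirely, open every physical GPU as its own independent logical device, and split the work itself (say, one eye per card, or post-processing offloaded to the second GPU). A rough sketch under those assumptions:

/* Hypothetical sketch: open every GPU in the system as a separate logical
 * device; the engine then decides what work goes where. 'baseInfo' is a
 * queue/extension setup that may need adjusting per GPU (glossed over here). */
#include <vulkan/vulkan.h>

#define MAX_GPUS 4

static uint32_t open_all_gpus(VkInstance instance,
                              const VkDeviceCreateInfo *baseInfo,
                              VkDevice outDevices[MAX_GPUS])
{
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    if (count > MAX_GPUS) count = MAX_GPUS;

    VkPhysicalDevice gpus[MAX_GPUS];
    vkEnumeratePhysicalDevices(instance, &count, gpus);

    uint32_t opened = 0;
    for (uint32_t i = 0; i < count; ++i) {
        VkDevice dev = VK_NULL_HANDLE;
        if (vkCreateDevice(gpus[i], baseInfo, NULL, &dev) == VK_SUCCESS)
            outDevices[opened++] = dev;
    }
    return opened;   /* number of GPUs successfully opened */
}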

Edited by zerO_crash
  • Like 1


Posted

@zerO_crash

I doubt that any amount of GPU power beyond a 5090 will benefit your VR experience unless you figure out a way to Multi-CPU (😆). As far as I'm aware that's the bottleneck.
At least it is for me (4090), but I don't have the latest and greatest.

  • Like 2

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"

Posted (edited)
33 minutes ago, Hiob said:

@zerO_crash

I doubt that any amount of GPU power beyond a 5090 will benefit your VR experience unless you figure out a way to Multi-CPU (😆). As far as I'm aware that's the bottleneck.
At least it is for me (4090), but I don't have the latest and greatest.


Well, some settings are GPU-heavy, some CPU-heavy. Consider this, though: when MT was announced publicly, the idea was that we would go from utilizing one CPU core to two. Last I checked, DCS can utilize as many as four cores (with less load on the remaining two). My point is that we are slowly but surely improving. With VR, though, we've actually hit the GPU bottleneck; I can verify that based on my settings and observations from HWMonitor.
 

The real point of this proposal is that currently some video-related options still claim medium-to-high CPU demand. What if that load could sit on the GPU alone? This is all up to the creativity of the programmer.
 

In any case, the measured results in the scientific paper speak for themselves: around 80% utilization of the additional GPUs. For reference, SLI/Crossfire could only ever attain sub-60%, and that was on select titles (PR for selling the GPUs). Realistically, you'd be closer to 40%, and even that wasn't common.

Edited by zerO_crash
  • Like 1


Posted
1 hour ago, zerO_crash said:

In order to get anything close to 2D-monitor clarity, I have to render at 1.7x resolution

It's a lost cause, because all you're doing is supersampling. If you really want your $20K's worth, the headset would need that native resolution to make it worthwhile.

You're making VR look hopeless to the mere mortals here 😆 On a monitor, meanwhile, it's possible to fully exploit the currently available hardware at a fraction of the cost.

 

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | T.Flight Rudder Pedals | TrackIR 5

Posted

Haha 😂 Well, as said, the Bigscreen is on the way. If this were implemented, there would be even more reason to go with Shiftall. That said, while upscaling will never equal pixel-per-pixel rendering, it's amazing how good DCS starts to look even with pixel upscaling.

 

As to the general benefit, consider that everyone gains here, especially people with weaker PCs. Not everyone can run maxed out even on a 2D screen. If you have a spare card lying around, it's free performance for you. You can get two 2080 Tis for less than a single 5090, for example.

  • Like 1


Posted
17 minutes ago, zerO_crash said:

Not everyone can run maxed out even on a 2D screen.

Yeah, but the threshold for that is much more attainable for the mainstream. This is far from the most demanding game.

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | T.Flight Rudder Pedals | TrackIR 5

Posted (edited)

Indeed, VR is the devil in question here. Over time, even that becomes more mainstream. Look, personally I support implementing technologies that might not necessarily benefit me. DLSS with in-app upscaling is an example: it allows people with lesser hardware to experience VR, which is great. For me, though, it degrades detail too much. Given how complex and time-consuming programming generally is, one has to think way ahead about what technologies to implement, based on the predicted development of the sim.

 

Funnily enough, the original HTC Vive was the only VR headset I was ever able to max out (on a 1080 Ti). Ever since then, while GPUs get 15%-20% more powerful per release (roughly every 3 years) on average, the release of new VR headsets and the implementation of new features make the demand grow almost exponentially. In particular, if you look at this generation, the main technologies promoted by Nvidia were further DLSS advancements (degrading your picture quality for better performance; in 2025 we call it a feature, whereas pre-2000 this would have been criminal). The other technology is Frame Generation, and again, multiplying frames in order to trick the user does not actually address the underlying frametime. This can be compared to a well-known issue in DCS for some users where their FPS is above 30 (often above 80), yet when they look at the ground from the cockpit, it jitters.
 

To keep it short: besides the incremental updates from e.g. Nvidia, there really are no new technologies being implemented to mitigate the ever-increasing demand from VR. Now consider upcoming features like the dynamic campaign, ever-increasing rivet-level detail on each new module, etc. We should frankly, as a community, welcome any performance-giving addition with open arms, be it CPU- or GPU-related. What I'm also trying to get across is that it isn't only a win for the top-end user. Remember that some people here live in countries where the 5090, or even the 4090, is not available yet and may never be. This technology gives us all new options.

Edited by zerO_crash
  • Like 1


Posted

I think we should first wait for Vulkan and the VR optimizations that will come with it. If the 5090 isn't enough to run DCS maxed out in VR (for the record, I find my Reverb G2 works fine with a 3090), the answer isn't to throw even more processing power at the problem, but to optimize DCS, because there is other VR software that has no problems at that resolution. Vulkan should allow that to be done; the legacy DX11 code is likely at the root of our performance issues.

2 hours ago, zerO_crash said:

SLI doesn't exist anymore. The connectors have been dropped since at least the 1080 Ti, as far as I remember.

Technically true, but its successor interface, NVLink, is still available; my 3090 comes with it. They killed this tech on all but top-end hardware, probably because people were buying two cheaper cards and linking them to get the performance of a higher-end one. Obviously that's not a concern at the very top end of the range, or with the Quadro line. This seems to be more of a thing for crypto mining than for gaming, though.

  • Like 1
Posted (edited)

Part of the optimization lies in replacing old technologies with new ones; that is why ED is going Vulkan to begin with. Also, the jump from one DX version to the next has never yielded the performance gains you stipulate. There is nothing to suggest that DX12 alone would solve a noticeable part of the performance question.

 

As to your second point, I'll refer to the motivation laid out in the paper. Apparently someone who has studied this very subject proves you wrong: a single person was able to write a basic library and the components needed to utilize this Vulkan functionality and, in doing so, bring technologies that were formerly restricted to the professional user to the consumer. Take a look at what I linked earlier.

Edited by zerO_crash
  • Like 1


Posted

Multi-GPU support is very much dead outside of certain server-farm applications. Nvidia and AMD stopped making multi-GPU-capable consumer hardware simply because it was not popular or efficient enough to be worth it anymore. Most people don't even own motherboards with more than one PCIe slot, so most people can't use SLI/Crossfire anyway. Hardly any DCS players are going to buy a workstation-class PC and a pair of workstation GPUs just to improve their DCS performance. Plus, the workstation-class cards that have NVLink connectors don't have game-optimised drivers anyway, because they're intended for working, not gaming. When Nvidia writes the drivers for those cards they don't optimise them for games; they never have and never will. So even if ED wanted to work on multi-GPU support for workstation-class cards, they would be hamstrung by drivers that are not optimised for DCS.

ED would be better off trying to leverage the NPUs in the latest generation of Intel and AMD CPUs, because those will eventually be in every CPU within the next ten years... unless the AI trend turns out to be another bubble. I mean, I don't know what they could actually do with NPU cores, but I can imagine they could think of something. Maybe make the ATC smarter?

Remember: just because you can do something doesn't mean you should.

Posted (edited)

There seems to be a serious problem, at times, with communication in these forums. Madman1, your whole comment is irrelevant to the technology presented here. Multi-GPU has absolutely nothing to do with SLI/Crossfire, nor with purely workstation applications. You also can't assume that "most" people do or don't have motherboards with multiple PCI Express slots (practically every motherboard today that isn't ITX/mini-ATX has at least two PCI Express slots, and most have three or more), nor does this multi-GPU approach rely on any drivers from the GPU manufacturers...

 

You need to properly read up on what is being discussed here. It cannot be that hard to open the documents that I posted and read, at the very least, the abstract. It's literally one paragraph. Come on people! 
 

At this point, I'll leave it to ED to decide whether they want to look into this more thoroughly. I simply don't have the time to explain things five times over in reply to posts that mislead the discussion.

Edited by zerO_crash
  • Like 2


Posted
1 hour ago, zerO_crash said:

There seems to be a serious problem, at times, with communication in these forums.

100% true!

Thank you for starting this thread! This is an important one for everyone. I really hope we get multi-GPU support through Vulkan. Even today we can use Lossless Scaling, like @scommander2 mentions, with a mix of cards, but I would prefer to have ED give us native support in Vulkan.

Cheers! 

PS: Off topic, but as you point out, SLI is not part of this discussion and is "dead". I just want to mention that @Dragon1-1 is correct. The 3090 has the link connectors/hardware support, as do other cards before the 4000 series. SLI still works, sort of, in DCS today through Nvidia Inspector.
