
Aussie guy compares a 4090 and 5090 in DCS in VR using Pimax Crystal Light and Quest 3


Recommended Posts

Posted

This is probably the first VR review featuring the 5090 and DCS. It also covers MSFS 2020/2024. Spoiler: the 5090 is faster 🙂, by roughly 20 to 25 fps.

Video: 

 

  • Like 3
  • Thanks 1
Posted (edited)

His results are almost exactly what I can recreate. Four notable points:

1 - Currently, with the very latest drivers for the RTX 5090 (572.42, released three days ago), the 5090 in DCS specifically is averaging a 23% gain over the 4090 when using DLSS at the very highest settings combined with MBucchia's QuadViews software. (That is a gain over what I measured just two weeks ago.) When using DLAA, the gain is about 18% at the very highest settings (averaged across the six headsets we have in the lab). Among what I think of as the "high-end" headsets (the Somnium Ultimate, the Crystal, and the Varjo Aero), the 5090's gain over the 4090 in DCS specifically averages 18.9% as of driver version 572.42, with settings absolutely maxed. In IL-2 the gains are slightly larger, averaging 27% across the six headsets. There are bigger gains to be had in MSFS 2020 and 2024 than in DCS (there the 5090 beats the 4090 by about 33%, averaged over the six headsets). I have not had a chance to test X-Plane yet.

2 - Just like I did, he got the 5090 and the Pimax Crystal working together fine. I believe what other posters have written online about their Crystals not working with the 5090 is true and valid, but for me personally it works. (Note that I am not using an Nvidia Founders Edition card; I am using an ASUS TUF 5090.)

3 - If or when DCS implements DLSS 4 (as Cyberpunk 2077 has), then yes, the price and power draw of the 5090 may be worth the resulting gains. For now, you will have to decide whether a 23% performance increase is worth the price over the 4090. I suspect it is a moot point anyway for many months unless you want to pay vastly inflated scalper prices for the 5090. The upcoming trade war and recession will push these prices even higher.

4 - I am beginning to build some experience with the 5090 now, and overall it impresses me as a "poor" Nvidia release. The 2080 Ti was a "good" release, whereas in retrospect the 3090 Ti is often seen as a "poor" one (hot, loud, and expensive to run). Following that cycle, the 4090 was an "awesome" release, arguably the best in the last decade if not longer. By comparison, the 5090 is a "poor" release.

Edited by Tensorial_Architect
  • Like 1
  • Thanks 4

A wonderful method for appreciating the beauty of the Multiverse is to learn the language in which it was written, ... mathematics.

(Intel 13900k, Aorus Z790, DDR5 6400, Asus TUF 5090 (testing), Samsung 990 Pro, VKB Gun IV SCG/STECS/Slaw Viper RX, Varjo Aero, Quest Pro, Somnium VR1 Ultimate)

Posted
11 hours ago, Tensorial_Architect said:

4 - I am beginning to build some experience with the 5090 now, and overall it impresses me as a "poor" Nvidia release. The 2080 Ti was a "good" release, whereas in retrospect the 3090 Ti is often seen as a "poor" one (hot, loud, and expensive to run). Following that cycle, the 4090 was an "awesome" release, arguably the best in the last decade if not longer. By comparison, the 5090 is a "poor" release.

The 2080 Ti is very similar to the 5090: a 20-30% performance increase, but with a price to match. So I'm not sure why you would classify them differently. And I would argue that the 1080 Ti was the good release. People generally ignore the 2080 Ti.

And the 3090 Ti was never considered good by those in the know.

Posted

Yeah, it has been a while since I thought about the 1080 Ti. I would agree with you.


Posted (edited)

This is just an update for Varjo Aero users: Varjo has been in contact with Nvidia, and last night Nvidia released new drivers (572.47) that fix some of the issues. I can now confirm that the Aero works stably with the RTX 5090 using MBucchia's QuadViews in Fixed Foveated mode (but with eye tracking disabled in Varjo Base). Hope this helps a few of you. (Still working on getting QuadViews with eye tracking running.)

(I am also not having any issues getting the Pimax Crystal to work other than the regular issues that occur every week with trying to get the Pimax Crystal to work. 🙂 )

Edited by Tensorial_Architect


Posted (edited)
On 2/16/2025 at 2:29 PM, Tensorial_Architect said:

His results are almost exactly what I can recreate.

 

You previously reported an 11-15% performance increase. This video shows something closer to 25-30%. Way bigger gains.

Edited by Jimmy8x

5800X3D | 64GB DDR4 3600 | RTX 4090 + Varjo Aero VR | F-16, F-15E, F-18 

Posted (edited)

No, I reported an initial increase of 11.3% to 17.3% for the 5090 over the 4090. Also, Aussie Dude above says nothing about eye tracking or the use of MBucchia's software, which is what I used; that is why my numbers differ and why, as of Feb 24th, they still do not scale up to his results. (My numbers more closely match his in MSFS 2020 and 2024 because there I am not adding any additional APIs.) In addition, there have been one Nvidia beta and two hotfix driver releases since Aussie Dude put up his video. In DCS specifically, with the very latest drivers (572.47) and MBucchia's QuadViews OpenXR layer, I am measuring a 24.1% increase for the 5090 over the 4090 with every setting in DCS at "high" or "ultra" while using DLSS, across six headsets (Reverb G2, Quest 3, Quest Pro, Varjo Aero, Pimax Crystal, Somnium Ultimate). With DLAA, I am measuring a 19.5% increase. Why the improvement? Simple: Nvidia is ironing out the problems, which happens every time a new GPU series is released. That said,

1 - The 5090 draws roughly 33% more power than the 4090 for less than a 33% gain in framerate. In comparison, the 4090 beat the 3090 and 3090 Ti by a far larger margin. This means that after two years, the efficiency gains of this GPU are fairly poor. In electrical engineering speak, the 5090 is a s$#! release.
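The perf-per-watt point can be put in numbers. A minimal Python sketch, assuming the official board power specs (450 W for the 4090, 575 W for the 5090) and the 24.1% DLSS framerate gain reported in this thread; actual in-game power draw will vary.

```python
# Back-of-envelope perf-per-watt comparison.
# Assumptions: official board power specs (4090: 450 W, 5090: 575 W)
# and the 24.1% DLSS framerate gain measured in this thread.

def perf_per_watt(fps, watts):
    """Frames per second delivered per watt of board power."""
    return fps / watts

fps_4090 = 100.0             # normalized baseline framerate
fps_5090 = fps_4090 * 1.241  # +24.1% measured gain

change = (perf_per_watt(fps_5090, 575) / perf_per_watt(fps_4090, 450) - 1) * 100
print(f"Perf-per-watt change, 5090 vs 4090: {change:+.1f}%")
```

On those assumptions the 5090 delivers slightly *less* framerate per watt than the 4090, which is the sense in which the efficiency story is poor.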

2 - When the 4090 was released, its retail price was $1,599, and street prices ran $1,599 to $1,899 for most of the variants (Nvidia, ASUS, Gigabyte, Zotac, etc.) in that first year (late 2022 and all of 2023). In comparison, the 5090 carries a retail price of $2,000 but street prices are $4,500 to $6,800, pure scalper prices. The very small number of cards that Nvidia can actually ship to reputable retailers (Amazon, Newegg, Best Buy, Micro Center, etc.) are gone in seconds, which makes the 5090 all but unavailable. TSMC cannot make many GB202 chips, and Nvidia is going to squeeze every wealthy American gamer it can for its flagship. Because of the current White House tariffs and the upcoming recession, these prices are not likely to come down in the next two years. Millions who straddled the fence voted for lower grocery prices and are going to see not only higher grocery prices but increased inflation (inflation has risen to 3.1% since Jan 21st); that is what tariffs and trade wars do. In essence, the 5090 is TWICE as expensive as the 4090 for gains in the 20 to 30% range. Again, that makes it a s%#! release.

3 - The electricity cost of this card will really show up in your monthly bill. If you live in rural Louisiana, where you can purchase electricity in the 11.3 to 12 cents per kilowatt-hour (kWh) range, or in Bumpkee, North Dakota, the card will still be visible on the bill if you do any significant DCS gaming. In the far more populous urban areas of America (roughly 80% of Americans live in urban areas), the 5090 will show up much more sharply; running this card several hours a day in a state like California, Hawaii, or Florida will eat into your grocery budget very fast. There is also the issue that many people will need to upgrade to a 1200 to 1600 W power supply to run the 5090 reliably.
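To put rough numbers on the electricity point, here is a minimal sketch. It assumes the card draws its full 575 W board power for the whole session (GPU only, not the whole system) and uses two illustrative rates: roughly the rural-Louisiana figure above versus a high-cost urban rate of about 30 c/kWh.

```python
# Rough monthly GPU electricity cost: watts -> kWh -> dollars.
# Assumes full 575 W board power draw for the whole session
# (GPU only; the rest of the system adds more on top).

def monthly_cost(watts, hours_per_day, cents_per_kwh, days=30):
    kwh = watts / 1000 * hours_per_day * days
    return kwh * cents_per_kwh / 100  # dollars per month

for rate in (11.5, 30.0):
    cost = monthly_cost(575, 3, rate)
    print(f"{rate:.1f} c/kWh at 3 h/day: ${cost:.2f}/month")
```

The GPU-only figure is modest on its own; it is the full-system draw, longer sessions, and high urban rates that make the difference noticeable.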

I continue to stand by my original sentiment: the 5090 is a poor release.

Edited by Tensorial_Architect
  • Like 1
  • Thanks 1


Posted

Also, a few extra tidbits to help inform the larger DCS community: when you purchase a high-end GPU, Nvidia (and to a lesser extent AMD) is charging you a "gamer" premium.

As an example, take a Rolex Submariner wristwatch. It typically sells in the $9,000 to $10,000 range but is estimated to cost somewhere around $520 to $700 to actually manufacture. Even after marketing costs and additional fees, Rolex is making roughly $8,000 on each Submariner.

No one but Nvidia knows exactly what it cost in research, development, and marketing to get the RTX 4090 into stores, but many outlets put the actual manufacturing cost of an RTX 4090 at roughly $420 to $450 when it came out in late 2022 (against a street price of $1,600 to $2,000 for most variants). If the real all-in cost was, say, $600, then you have roughly a 3:1 price-to-cost ratio.

In the same vein, I have been watching some of the TSMC feeds and those in Malaysia and Indonesia who actually assemble the RTX 5090 (for companies like Zotac). They guess that it costs about $600 to actually make an RTX 5090. If the real cost to Nvidia to get the RTX 5090 onto the street is $700, then again you are not far from a 3:1 ratio.
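The ratio math above, as a quick sketch. The $600 and $700 cost figures are the guesses quoted in these posts, and the launch MSRPs ($1,599 for the 4090, $1,999 for the 5090) stand in for street prices, so treat the outputs as illustrative only.

```python
# Price-to-manufacturing-cost ratio, using launch MSRPs and the
# rough cost guesses quoted above.

def margin_ratio(price, cost):
    return price / cost

print(f"RTX 4090: {margin_ratio(1599, 600):.1f}:1")
print(f"RTX 5090: {margin_ratio(1999, 700):.1f}:1")
```

Both come out near 3:1, which is the point of the comparison; scalper street prices for the 5090 would push the effective ratio far higher, but none of that margin goes to Nvidia.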

  • Thanks 1

