
Posted

Very annoying. 


My first assigned aircraft is in my profile name

Ryzen 9800x3d/64gb DDR5 amd expo/RTX 5080/4tb m2/ Win11 pro/Pimax crystal light 

Winwing Orion F16ex (Shaker kit)/Skywalker pedals/Orion 2 F15EX II Throttle/3 MFD units/Virpil CM3 Mongoose Throttle/Trackir 5 

F-16/A10II A/C /F-18/F-15E/F-15C/F-14/F5E II/F-4/Ah64/UH60/P51-D/Super Carrier/Syria/Sinai/Iraq/Persian Gulf/Afghanistan/Nevada/Normandy 2.0

Posted
1 hour ago, Aapje said:

This statement betrays that you have a very simplistic mental model of the situation, and do not in fact 'understand plenty.'

I'll give you one example to show why you are wrong. The key to this is to understand that thinking in FPS is deceptive and the actual time to render the frame is more important.

If the GPU takes the entire 1/120th of a second to render a frame, and can thus barely produce 120 FPS, then the world state (which includes, but is not limited to, your controller input) that the rendered image is based on is going to be at least 1/120th of a second old. But if the GPU takes only 1/240th of a second per frame, it can render two frames for every frame shown on the screen. Obviously both cannot be shown to the user, since the monitor is not fast enough for that, so the older frame is dropped and the newer frame is shown. The shown frame is then based on world state that is at least 1/240th of a second old, which means lower latency (up to half, although that also depends on the CPU) compared to the situation with the slower GPU.
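The argument above can be sketched numerically. This is a minimal simulation, assuming an uncapped GPU rendering frames back-to-back for a 120 Hz display; CPU simulation time and display scan-out are ignored:

```python
def displayed_state_age(gpu_frame_time, refresh_interval):
    """How old the world state behind the displayed frame is.

    Simulates a GPU rendering frames back-to-back; at the refresh,
    the monitor shows the newest *completed* frame and older ones
    are dropped. Assumes the GPU finishes at least one frame per
    refresh interval.
    """
    t = 0.0
    newest_state_time = 0.0
    while t + gpu_frame_time <= refresh_interval + 1e-12:
        newest_state_time = t       # this frame's input/world snapshot
        t += gpu_frame_time
    return refresh_interval - newest_state_time

slow = displayed_state_age(1 / 120, 1 / 120)  # GPU barely hits 120 FPS
fast = displayed_state_age(1 / 240, 1 / 120)  # GPU could do 240 FPS
print(f"slow GPU: state >= {slow * 1000:.2f} ms old")   # ~8.33 ms
print(f"fast GPU: state >= {fast * 1000:.2f} ms old")   # ~4.17 ms
```

The faster GPU halves the age of the displayed state even though the monitor still only shows 120 frames per second.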

Of course, this is rather wasteful, since you are rendering frames that are never used. This is actually what Nvidia Reflex is made for: it delays the moment that the CPU processes the latest world state and asks the GPU to render a frame until the last possible moment. So if it only takes the GPU 1/240th of a second to render a frame, but you have the Nvidia Reflex framerate limit at 120 FPS, then the game is asked to render the world state at such a moment that there is only 1/240th of a second left before the monitor refreshes.
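That Reflex-style scheduling boils down to simple arithmetic. This is illustrative only; the real Reflex SDK measures render times dynamically rather than taking a fixed estimate:

```python
def reflex_style_start_time(refresh_interval, expected_render_time):
    """When to sample input and submit the frame so it completes just
    before the refresh, instead of queuing up early (illustrative
    sketch, not the actual Reflex algorithm)."""
    return refresh_interval - expected_render_time

refresh = 1 / 120            # 120 Hz monitor, one refresh every ~8.33 ms
render = 1 / 240             # GPU needs ~4.17 ms per frame
start = reflex_style_start_time(refresh, render)
latency = refresh - start    # state age at display: just the render time
print(f"submit at t={start * 1000:.2f} ms, latency ~{latency * 1000:.2f} ms")
```

The same low latency as the brute-force double-rendering approach, without wasting a frame.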

Of course, DCS does not support Reflex, which requires the game to cooperate, but the more wasteful method does work for DCS to lower the delay between what happens in the world state, and what is shown on the screen.

So there is a real benefit, although that doesn't mean that it is necessarily worth the downsides.


Very informative, thank you for that clearly presented and understandable data.

The other guy, over a period of days and posts I have found you to be very articulate but essentially toxic as fk. You have been weighed, measured and found wanting. Woke victimhood is how I've heard such behaviour described. Please knock it off before yet another member (me) adds you to the block-list.

Posted
16 minutes ago, Panzerlang said:


Very informative, thank you for that clearly presented and understandable data.

The other guy, over a period of days and posts I have found you to be very articulate but essentially toxic as fk. You have been weighed, measured and found wanting. Woke victimhood is how I've heard such behaviour described. Please knock it off before yet another member (me) adds you to the block-list.

Everything else is pretty much it. The bolded part, not so much, unless you can define what that even means.
 

Anyway, steering my ship back on course here.

I’m conflicted as to what any of this (the 5090 reviews) actually means to me in terms of VR performance. 2D performance be damned, as my 4080S was more than capable of letting me play at 4K at a minimum of 90 FPS, maxed out.
 

What I want is vr performance reviews. 
 

And the whole cost issue is a bit lost on me, honestly. Back in May when I built my PC, 4090s were already selling at $2k+ and inventory was limited. There were no MSRP-priced cards. So instead I bought an ‘80 Super for $1100. So for me, going to a 5090 would be a huge jump in performance, as it would be for those with cards below that tier. If you have a 4090, I guess a 25% price increase for a roughly equal performance bump would be questionable.
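The upgrade-value question above can be made concrete with a back-of-envelope ratio. The percentages here are hypothetical, purely for illustration, not benchmark results:

```python
def perf_per_dollar_ratio(perf_gain_pct, price_increase_pct):
    """> 1.0 means performance grows faster than price;
    < 1.0 means you pay proportionally more than you gain."""
    return perf_gain_pct / price_increase_pct

# Coming from a lower-tier card: large gain for the price jump
print(perf_per_dollar_ratio(60, 25))   # 2.4 -> worth considering
# Coming from a 4090: modest gain for the same price jump
print(perf_per_dollar_ratio(25, 25))   # 1.0 -> break-even at best
```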
 

But even then my post is still a little off topic. I’m imagining that $2400 will be the average for those partner cards. Maybe a PNY card will be around $2200? No idea. Guess we’ll find out in a few days/weeks.

Good luck to anyone trying to snag one tomorrow(?).



Posted (edited)
28 minutes ago, Panzerlang said:

Woke victimhood is how I've heard such behaviour described. 

Honestly, I think a more apt description of that behavior is "thinks he knows better than anyone else just because he builds rigs for money, and is being a jerk about it". It has nothing to do with the real woke movement, and the right-wing hijacking of the term essentially boils down to "a thing that we don't like", which is not terribly useful (nor is the left-wing hijacking of the term to include gender/class/sex equality struggles; the real deal is about black people always getting the shaft in the US).

Myself, I just want to know how much the 5090 gains in VR, and without the BS frames, please. I've seen some reviews, and it does seem to outperform my 3090 (it had better!), but I haven't seen anything conclusive on just how much of a gain there is.

Edited by Dragon1-1
Posted
1 hour ago, kksnowbear said:

A 50 series GPU cannot increase a monitor's maximum refresh rate. A 50 series GPU cannot make a monitor display frames at a rate higher than its maximum refresh rate. It doesn't work that way.

You claimed that there is no benefit to having a faster GPU once you hit the max frame rate of the monitor. I have shown that this is not necessarily true, and there can be a benefit to having a faster GPU in that situation.

Moving the goal posts to a different claim, that I never addressed in the first place, is pointless.

Posted
28 minutes ago, Blackhawk163 said:

What I want is vr performance reviews. 

Unfortunately, I found only two people who did somewhat serious VR reviews, Maraksot78 and Babeltechreviews, but the guy who did those reviews on that latter website sold the site and retired. So no more VR reviews over there.

And neither of them did DCS reviews anyway. So I think that the best bet is to wait for the reviews on this forum.

Since you have a 4080 Super, I would suggest not rushing into things. Even in the worst case, where you cannot get your hands on one (or only for an absurd price), you still have a very strong card.

Posted (edited)
42 minutes ago, Aapje said:

Unfortunately, I found only two people who did somewhat serious VR reviews, Maraksot78 and Babeltechreviews, but the guy who did those reviews on that latter website sold the site and retired. So no more VR reviews over there.

And neither of them did DCS reviews anyway. So I think that the best bet is to wait for the reviews on this forum.

Since you have a 4080 Super, I would suggest not rushing into things. Even in the worst case, where you cannot get your hands on one (or only for an absurd price), you still have a very strong card.

I watched one just the other day, from a fledgling account (at least to me), that tested DCS in VR on the Syria map, I believe, across a range of GPUs. So hopefully that channel pops off soon and he gets higher-end GPUs.

 

I'm in no super rush to replace my 4080S, but if I can snag an FE for MSRP then sure, why not? There's value in it for me, even if there really isn't any for others. Would I pay well above that? Absolutely not. I'll sit it out for the inevitable Super and Ti cards that will come out, or for the 6000 series.

Edited by Blackhawk163


Posted

A refreshingly interesting look at the 5080 launch. Not a 5080 review. 

 

Windows 11 23H2| ASUS X670E-F STRIX | AMD 9800X3D@ 5.6Ghz | G.Skill 64Gb DDR5 6200 28-36-36-38  | RTX 4090 undervolted | MSI MPG A1000G PSU | VKB MCG Ultimate + VKB T-Rudders + WH Throttle |  HP Reverb G2  Quest 3 + VD

Posted

 

 

So basically: if on a 3xxx series, maybe worth the upgrade; if on a 4xxx, not so much, beyond bragging rights. I still sit on the fence. I would like something better than the current 4070 Ti I have; maybe I will wait and see what used 4080S cards are selling for.

 

Still a good breakdown, with benchmarks and such for the raw performance.


Intel Ultra 265K 5.5GHZ   /  Gigabyte Z890 Aorus Elite  /  MSI 4070Ti Ventus 12GB   /  SoundBlaster Z SoundCard  /  Corsair Vengance 64GB Ram  /  HP Reverb G2  /  Samsung 980 Pro 2TB Games   /  Crucial 512GB M.2 Win 11 Pro 21H2 /  ButtKicker Gamer  /  CoolerMaster TD500 Mesh V2 PC Case

Posted
8 hours ago, Blackhawk163 said:

I'm in no super rush to replace my 4080S, but if I can snag an FE for MSRP then sure, why not? There's value in it for me, even if there really isn't any for others. Would I pay well above that? Absolutely not. I'll sit it out for the inevitable Super and Ti cards that will come out, or for the 6000 series.

Do keep in mind that there are already reports of PCIe 5 issues with the FE models. Their design effectively builds a riser cable into the card, which is known to cause issues, so this may be another case of Nvidia rushing new tech without testing it properly (like with the power connectors). Then again, the performance loss of switching to PCIe 4 is minimal.
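For context on why the PCIe 4.0 fallback costs so little, the theoretical link bandwidths work out as follows (standard 128b/130b encoding figures, not card-specific measurements):

```python
def pcie_bandwidth_gb_s(transfer_rate_gt_s, lanes=16):
    """Theoretical one-direction PCIe bandwidth in GB/s.

    PCIe 3.0 and later use 128b/130b encoding; 8 bits per byte.
    """
    return transfer_rate_gt_s * lanes * (128 / 130) / 8

gen4_x16 = pcie_bandwidth_gb_s(16)   # PCIe 4.0: 16 GT/s -> ~31.5 GB/s
gen5_x16 = pcie_bandwidth_gb_s(32)   # PCIe 5.0: 32 GT/s -> ~63.0 GB/s
print(f"PCIe 4.0 x16: {gen4_x16:.1f} GB/s, PCIe 5.0 x16: {gen5_x16:.1f} GB/s")
```

Games rarely saturate even the Gen 4 link outside of initial asset loading, which is why benchmarks show only a small difference.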

But of course you are free to do what you want.

Posted (edited)
On 1/28/2025 at 12:22 PM, okopanja said:

Well aware, but the thing here is how you market and sell it.

From the money point of view, the GPU core is the primary value in such a combo, with the CPU as added value, as opposed to CPU-as-main-value.

 

Meanwhile, with DeepSeek, I wonder if Jensen should reconsider the pricing for the 5090 (and the rest)...

Looks like he will have extra capacity to offer within high end chips to the gaming market.


Only now noticed this post, and well noted.

Nvidia will quickly recover (it already is), because it seems DeepSeek (the open-source AI model that is all the rage now) has all its users rushing for any ≥16GB VRAM GPU that fits the application.
Meaning, regardless of a possible failure of the RTX 5000 series in the gaming market, it'll be a success in this area, which will (unfortunately) also push prices up.

The problem now for PC gamers (including DCS users who want ≥16GB VRAM GPUs) is that not only brand-new units but also second-hand units will probably become scarce, and at inflated prices.....

It could be the "2020 mining craze" all over again! 😬  lol 

Edited by LucShep

CGTC - Caucasus retexture  |  A-10A cockpit retexture  |  Shadows Reduced Impact  |  DCS 2.5.6 - a lighter alternative 


Win10 Pro x64  |  Intel i7 12700K (OC@ 5.1/5.0p + 4.0e)  |  64GB DDR4 (OC@ 3700 CL17 Crucial Ballistix)  |  RTX 3090 24GB EVGA FTW3 Ultra  |  2TB NVMe (MP600 Pro XT) + 500GB SSD (WD Blue) + 3TB HDD (Toshiba P300) + 1TB HDD (WD Blue)  |  Corsair RMX 850W  |  Asus Z690 TUF+ D4  |  TR PA120SE  |  Fractal Meshify-C  |  UAD Volt1 + Sennheiser HD-599SE  |  7x USB 3.0 Hub |  50'' 4K Philips PUS7608 UHD TV + Head Tracking  |  HP Reverb G1 Pro (VR)  |  TM Warthog + Logitech X56 

 

Posted

It's pretty clear that DeepSeek has lied about its training costs, and there has been a lot of misinformation about how easy it is to run the model. If you want to run the R1 model yourself, then the 32 GB of the 5090 is not going to cut it if you want any serious speed out of it. There are some people who have made cut-down versions of R1 that run on 20 GB cards, but even then it runs far slower than on the commercial H100 systems.
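A rough back-of-envelope shows why 32 GB doesn't cut it. The ~671B parameter count for full R1 is a public figure; the 20% overhead for KV cache and activations is an assumption for illustration:

```python
def min_vram_gb(params_billion, bytes_per_param, overhead_frac=0.2):
    """Very rough VRAM floor: weight storage plus ~20% (assumed)
    for KV cache and activations."""
    return params_billion * bytes_per_param * (1 + overhead_frac)

# Full DeepSeek-R1 is ~671B parameters; even at 4-bit quantization
# (~0.5 bytes/param) the floor is on the order of 400 GB:
print(f"R1 @ 4-bit: ~{min_vram_gb(671, 0.5):.0f} GB")
# A 32B distilled variant at 4-bit fits a 24 GB card:
print(f"32B distill @ 4-bit: ~{min_vram_gb(32, 0.5):.0f} GB")
```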

Of course, it may be true that even at these much slower speeds, very many people will want to run it on desktop GPUs, but I have my doubts.

Posted (edited)

Remember that 4080 12GB which Nvidia "unlaunched" at the last minute (which became the 4070Ti), right after the general raging of the public?

Well, it could be that they pulled a similar trick on us again - successfully this time - and the 5080 that we're seeing should've actually been the 5070Ti.... 🙄

One among other videos which you'll see soon on youtube about this:

Edited by LucShep


Posted (edited)
3 hours ago, LucShep said:

Nvidia will quickly recover (it already is), because it seems DeepSeek (the open-source AI model that is all the rage now) has all its users rushing for any ≥16GB VRAM GPU that fits the application.

You can run the 8B version of this model; it works fairly well. IMHO the rebound occurred after the claims of derived work; however, using this AI feels very different from ChatGPT.

As for "home" use, it appears that VRAM size is more relevant than the number of cores or their speed. E.g. you might be better off with a 3090 than with a 4070 or 4080 when using it for AI.
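The VRAM-over-cores point can be made concrete: single-stream LLM generation is typically memory-bandwidth-bound, since each new token streams every weight through the memory bus once. The bandwidth figures below are the cards' public peak specs; this is an idealized ceiling, not a benchmark:

```python
def max_tokens_per_s(model_size_gb, vram_bandwidth_gb_s):
    """Idealized throughput ceiling for single-stream generation:
    every new token reads all weights from VRAM once."""
    return vram_bandwidth_gb_s / model_size_gb

model_gb = 8                            # hypothetical quantized model size
print(max_tokens_per_s(model_gb, 936))  # RTX 3090 (936 GB/s): ~117 tok/s
print(max_tokens_per_s(model_gb, 504))  # RTX 4070 (504 GB/s): ~63 tok/s
```

The 3090's wider bus wins despite its older, slower cores, and its 24 GB lets larger models stay in VRAM at all.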

Also, they sank again, likely due to the Qwen 2.5 Max announcement.

Edited by okopanja
Posted (edited)
1 hour ago, LucShep said:

Remember that 4080 12GB which Nvidia "unlaunched" at the last minute (which became the 4070Ti), right after the general raging of the public?

Well, it could be that they pulled a similar trick on us again - successfully this time - and the 5080 that we're seeing should've actually been the 5070Ti.... 🙄

The situation is not the same, because back then they announced a nearly full AD103-based 4080 and also a very much cut down 4080 based on the same chip. This made absolutely no sense, as normally a cut down version of a chip is sold as a lower tier, which in this case would be the 4070 Ti, which it of course became after Nvidia unannounced that cut down 4080.

But the 5080 does seem to be almost the full chip. The AD103 and GB203 chips are almost identical in die size, and the CUDA core count is very similar, at 10,240 for the 4080 Super and 10,752 for the 5080. Given that the process node is almost the same, I don't see how the GB203 chip could have a significantly larger number of cores that are disabled.
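As a quick sanity check on those core counts (figures from the post above):

```python
cuda_4080_super = 10_240
cuda_5080 = 10_752
gain_pct = (cuda_5080 - cuda_4080_super) / cuda_4080_super * 100
print(f"5080 has {gain_pct:.1f}% more CUDA cores than the 4080 Super")
```

Only a ~5% increase in cores on a near-identical process, which supports the point that there is little headroom left to unlock in GB203.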

So I'm pretty sure that Nvidia doesn't have the option to unlock a whole lot more power from the GB203. The chip simply is not all that much faster compared to the generation before it.

The only thing that Nvidia could have done, once they finished the design of the chips, is to put a cut down GB202-chip in the 5080, just like the 3080 had a cut down GA102. However, they are never ever going to do that, for a bunch of reasons, like the GB203 being way more expensive to produce than the GA102.

I don't even see them selling a cut down GB202 as a 5080 Ti, because they can just sell the partially broken chips as 5090D cards to China.

I think that we simply have to accept that this is it, for now. I do wonder what Nvidia is going to do if the 5080 is going to sell extremely poorly, which I think it will. I don't think that they even have the option to release a Super with significantly more performance, similar to how the 4080 Super was almost identical in speed to the 4080. But I also doubt that they will be willing to do a price cut for a 5080 Super, after Jensen already had to eat crow by doing a significant price cut for the 4080. Perhaps they'll just accept really poor sales for this tier.

Edited by Aapje
Posted (edited)
2 hours ago, Aapje said:

The situation is not the same, because back then they announced a nearly full AD103-based 4080 and also a very much cut down 4080 based on the same chip. This made absolutely no sense, as normally a cut down version of a chip is sold as a lower tier, which in this case would be the 4070 Ti, which it of course became after Nvidia unannounced that cut down 4080.

But the 5080 does seem to be almost the full chip. The AD103 and GB203 chips are almost identical in die size, and the CUDA core count is very similar, at 10,240 for the 4080 Super and 10,752 for the 5080. Given that the process node is almost the same, I don't see how the GB203 chip could have a significantly larger number of cores that are disabled.

So I'm pretty sure that Nvidia doesn't have the option to unlock a whole lot more power from the GB203. The chip simply is not all that much faster compared to the generation before it.

The only thing that Nvidia could have done, once they finished the design of the chips, is to put a cut down GB202-chip in the 5080, just like the 3080 had a cut down GA102. However, they are never ever going to do that, for a bunch of reasons, like the GB203 being way more expensive to produce than the GA102.

I don't even see them selling a cut down GB202 as a 5080 Ti, because they can just sell the partially broken chips as 5090D cards to China.

I think that we simply have to accept that this is it, for now. I do wonder what Nvidia is going to do if the 5080 is going to sell extremely poorly, which I think it will. I don't think that they even have the option to release a Super with significantly more performance, similar to how the 4080 Super was almost identical in speed to the 4080. But I also doubt that they will be willing to do a price cut for a 5080 Super, after Jensen already had to eat crow by doing a significant price cut for the 4080. Perhaps they'll just accept really poor sales for this tier.

You're thinking of only the previous generation.
Like others into this subject, I'm thinking of the generations before that.

For instance, the RTX 3080 was a variant of the very same GA102 as the 3090.
As had been the case previously, when the 2080 Ti (the class the current xx80 became) had the same TU102 as the Titan RTX (the class/model that the xx90 became).

They can do it, they just don't want to. For obvious reasons. ($$$)

The 4080 was no longer based on the "xx2" top chip used on the 4090 (AD102); it was AD103 (higher numbers are reserved for lower-tier models), as was the 4070 Ti Super.
And they're doing the same with the RTX 5000 series: the 5090 is GB202 and the 5080 is a lower-spec GB203, as is, again, the 5070 Ti.

I hope that makes sense. It's no surprise the gap between the 4090 and 4080 was so big, and now even bigger between the 5090 and 5080.
The 5080 Ti 20GB or 24GB (with a variant of the GB202) just isn't out yet, but we all know they have it up their sleeve, in case the "dumbing down" plan they've run since the 4000 series backfires now (and hopefully it does)...
 

Edited by LucShep


Posted (edited)

I've addressed the 3080. The GA102 chip is smaller than GB202 and was made on a far cheaper process. There is also no reason for them to sell the GB202 for so 'little' when they can sell it for much more.

2 hours ago, LucShep said:

As had been the case previously, when the 2080Ti (the class the current xx80 became) had the same TU102 of the Titan RTX (the class/model that the xx90 became). 

The 2080 Ti was made on the last TSMC process (12 nm) before wafer prices started going up enormously:

Alleged Prices of TSMC Silicon Wafers Appear | TechPowerUp

Then Nvidia dodged the cost of TSMC 7 nm by using Samsung for the 30-series. But Samsung's process was too poor, so for the 40-series they went back to TSMC, using 5 nm. But at those prices, they are never going to give you 600-800 mm² of die for $1k. You can complain about them being greedy, but they are obviously not going to cut into their profits rather than pass those increased TSMC prices on to us.
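The die-cost argument can be sketched with the classic dies-per-wafer approximation. The wafer prices below are hypothetical round numbers in the range of the rumored figures, and the 70% yield is an assumption:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation; ignores defect density and scribe lines."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r ** 2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

def cost_per_good_die(wafer_price_usd, die_area_mm2, yield_frac=0.7):
    """Wafer prices passed in here are illustrative, not confirmed."""
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_frac)

# ~750 mm2-class dies: TU102 (12 nm era) vs a modern 5 nm-class wafer
print(f"12 nm-era: ~${cost_per_good_die(4000, 754):.0f} per die")
print(f"5 nm-class: ~${cost_per_good_die(17000, 744):.0f} per die")
```

Even with these rough inputs, a big die on a modern node costs several times what the same silicon area cost in the 2080 Ti era.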

Edited by Aapje
Posted (edited)

Gainward GeForce RTX 5080 Phoenix graphics card is listed at €1169 in Germany.

Not in stock; others easily 30-40% more.

Edited by okopanja
Posted
32 minutes ago, Aapje said:

I've addressed the 3080. The GA102 chip is smaller than GB202 and was made on a far cheaper process. There is also no reason for them to sell the GB202 for so 'little' when they can sell it for much more.

The 2080 Ti was made on the last TSMC process (12 nm) before wafer prices started going up enormously:

Alleged Prices of TSMC Silicon Wafers Appear | TechPowerUp

Then Nvidia dodged the cost of TSMC 7 nm by using Samsung for the 30-series. But Samsung's process was too poor, so for the 40-series they went back to TSMC, using 5 nm. But at those prices, they are never going to give you 600-800 mm² of die for $1k. You can complain about them being greedy, but they are obviously not going to cut into their profits rather than pass those increased TSMC prices on to us.

That means absolutely jack sh!t for the end customer if, all of a sudden, the GPUs are left forgotten on the shelves....

We've seen before that they can decrease prices on a whim, when nobody was buying the 4080 and the "Super" versions came out with a $200 discount. 🙂 

They're obviously still throwing clay at the wall to see if it sticks.
It's all a matter of seeing who bites the bait and keeps their plan going, until the next gen comes along, when all this milking gets repeated.




Posted

Apparently you can overclock some 5080s to 4090 levels, though you can't simulate the extra 8GB of VRAM (I can't believe the industry has come down to this).

 



Posted
Apparently you can overclock some 5080s to 4090 levels, though you can't simulate the extra 8GB of VRAM (I can't believe the industry has come down to this).
 
Exactly this!


Posted

Well, all I can say is this... another Nvidia paper launch!

Asus ROG Crosshair Hero VIII , Ryzen 3900X, Nzxt Kraken Z73, Vengence RBG Pro DDR4 3600mhz 32 GB, 2x Corsair MP 600 pcie4 M.2 2 TB , 2x Samsung Qvo SSD 2x TB, RTX 3090 FE, EVGA PSU 800watt, Steelseries Apex Pro. TM WartHog,TM TPR, Track IR, TM 2 x MFD, Asus VG289Q, Virpil Control Panel#2
