

Posted (edited)
15 minutes ago, SharpeXB said:

Well it has every time for me or the demand dies down and they’re available everywhere. No reason to expect that will be different this time. I’ll let you know. 😉

Your own personal experience buying a few GPUs does not a market make.  Being on a wait list (still) doesn't guarantee a card.

(And yes, we all now understand you'll be buying a 5090, as I'm sure was already the case before this discussion ever started...see above re: target market.  Can't wait to see your new sig...lol)

15 minutes ago, SharpeXB said:

Nvidia doesn’t have a monopoly here although they do make a good product. Again both the RTX 4080 and equivalent AMD 7900 XTX sell at the same $999. If Nvidia is price fixing why is the AMD competitor just as expensive?

No one said anything about a monopoly.  And what Nvidia is doing with 50-series supply has zero to do with AMD or the 4080.

As I said already, Nvidia is perfectly happy if some people believe that simple supply and demand explains their behavior.  Some of us know better.

Edited by kksnowbear

Free professional advice: Do not rely upon any advice concerning computers from anyone who uses the terms "beast" or "rocking" to refer to computer hardware.  Just...don't.  You've been warned.

While we're at it, people should stop using the term "uplift" to convey "increase".  This is a technical endeavor, we're not in church or at the movies - and it's science, not drama.

Posted (edited)
21 minutes ago, kksnowbear said:

Your own personal experience buying a few GPUs does not a market make. 

Well that’s just my experience here in the US. I can’t imagine that’s any different for other people here. Obviously I can’t speak for the whole world. 

21 minutes ago, kksnowbear said:

And yes, we all now understand you'll be buying a 5090

I’m not totally sold on it yet but we’ll see. My usual thing is to just upgrade the GPU every cycle and sell my old card while it can still get a good price. Thus cushioning the bleeding edge blow. Bottom line a 5090 is still an improvement albeit a small one. And it will still be a boost in DCS. It’s either that or wait two more years…

21 minutes ago, kksnowbear said:

No one said anything about a monopoly.

So how could Nvidia be engaged in price fixing without a monopoly? If their prices were artificially high you’d see AMD undercut them. Yet that competitor charges the same. 

“Demand will still be high because it's relative to supply, which everyone already knows Nvidia is manipulating to control the market.”

PS that statement implies Nvidia has a monopoly otherwise how else would they “control the market”? 

Edited by SharpeXB

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5

Posted (edited)
1 hour ago, SharpeXB said:

Obviously I can’t speak for the whole world. 

Obviously.  Realistically, not for anything of any statistical significance.

1 hour ago, SharpeXB said:

Bottom line a 5090 is still an improvement albeit a small one. And it will still be a boost in DCS.

Beyond a frame rate you've indicated several times you already cap, because your 4090 exceeds your monitor's 120Hz refresh in DCS?

So you'll be buying a new monitor (to get a higher refresh rate; see the $4300 breakdown above)...or getting more frames than your monitor can physically display. 

Alrighty then.

1 hour ago, SharpeXB said:

So how could Nvidia be engaged in price fixing without a monopoly?

You keep trying to impose a grade-school explanation of simple supply-and-demand economics...Nvidia is limiting supply so that demand, relative to that supply, drives the price up.  Pretty simple, really.  Most people seem to get what's happening, anyway.

No, you wouldn't necessarily see AMD undercut them, because some people won't buy an AMD GPU regardless.   AMD's smart enough to realize that there's no point cutting their own throats to entice people who aren't going to buy anyway.

1 hour ago, SharpeXB said:

PS that statement implies Nvidia has a monopoly otherwise how else would they “control the market”?

Only if you impose a simple 'supply and demand' argument.  Look around online; there are plenty of reputable sources discussing how Nvidia is limiting supply to levels not seen before.  Again, that 'law' exists only as a means to explain an inversely proportional relationship at a grade-school level.  It only explains what happens *when* supply goes down; it doesn't account for *how*.

Nvidia isn't controlling the entire GPU market and I didn't say that.  They're controlling the Nvidia 50-series GPU market.

Edited by kksnowbear


Posted (edited)
1 hour ago, SharpeXB said:

I’m not totally sold on it yet but we’ll see. My usual thing is to just upgrade the GPU every cycle and sell my old card while it can still get a good price. Thus cushioning the bleeding edge blow. Bottom line a 5090 is still an improvement albeit a small one. And it will still be a boost in DCS. It’s either that or wait two more years…

huh.... 😮 so what's this then?
 

15 hours ago, SharpeXB said:

DCS is an older game. It’s far from the most demanding title out there. It’s not MSFS, that’s for sure. Until I replaced my 7-10 year old PC two years ago, it was the only game I could still run decently. The performance trouble with DCS comes from trying to run it in VR at the higher settings, which are intended for 2D.

You wrote that just yesterday.  So, which is which? 

You don't use VR (IIRC?). And with that top system, why buy the 5090 then? 🤨 

hmmmmmmm contradictory, fishy comments there... 🤔


You definitely don't need it. If you want to blow that money, then make some donations to ED - perhaps it'll help with more manpower to bring us more performance/bug fixes (and less need for expensive hardware)...... :music_whistling:

 

Edited by LucShep

CGTC - Caucasus retexture  |  A-10A cockpit retexture  |  Shadows Reduced Impact  |  DCS 2.5.6 - a lighter alternative 



Win10 Pro x64  |  Intel i7 12700K (OC@ 5.1/5.0p + 4.0e)  |  64GB DDR4 (OC@ 3700 CL17 Crucial Ballistix)  |  RTX 3090 24GB EVGA FTW3 Ultra  |  2TB NVMe (MP600 Pro XT) + 500GB SSD (WD Blue) + 3TB HDD (Toshiba P300) + 1TB HDD (WD Blue)  |  Corsair RMX 850W  |  Asus Z690 TUF+ D4  |  TR PA120SE  |  Fractal Meshify-C  |  UAD Volt1 + Sennheiser HD-599SE  |  7x USB 3.0 Hub |  50'' 4K Philips PUS7608 UHD TV + Head Tracking  |  HP Reverb G1 Pro (VR)  |  TM Warthog + Logitech X56 

 

Posted (edited)
49 minutes ago, LucShep said:

You definitely don't need it. If you want to blow that money, then make some donations to ED - perhaps it'll help with more manpower to bring us more performance/bug fixes...... :music_whistling:

+1

Seriously.

48 minutes ago, LucShep said:

And with that top system, why buy the 5090 then? 🤨

I think this question more or less answers itself... 😄 😄 😄

 

Edited by kksnowbear


Posted (edited)

Incidentally, I should say here that I have absolutely no problem with "bragging rights" if that's what makes someone happy...

Just "own it", that's all.

Edited by kksnowbear


Posted
2 hours ago, LucShep said:

And with that top system, why buy the 5090 then?

I’m actually GPU limited with the 4090 so I can only assume I would see a boost in this game. Not much though, I expect; all the tests seem to indicate just 25%. Not great. Plus, over the lifetime of using that card, things can change. Like I said, I’m not entirely sold on the idea. I do also play other games where staying current with the graphics card probably helps. 
 

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5

Posted (edited)
1 hour ago, SharpeXB said:

I’m actually GPU limited with the 4090 so I can only assume I would see a boost in this game.

But you've mentioned several times that your 4090 gives you a frame rate in DCS that exceeds your monitor's 120Hz refresh rate, so you cap frames at 117.

Even if a 5090 were an improvement, how's that going to increase your monitor's refresh rate?  You'll be generating frames that your monitor cannot display.

As far as the future and other games go...well, unless you're getting a native render rate around 100-120, MFG is likely to do more harm than good (please see the HUB video linked above, and note that this is more likely in games like flight sims, where your view is constantly and quickly changing).

And if you *are* getting 100-120 "non-magic" frames, why would you pay $2150 (possibly much more) for more frames, when your monitor cannot display more than 120 regardless?

Edited by kksnowbear


Posted
1 hour ago, kksnowbear said:

But you've mentioned several times that your 4090 gives you a frame rate in DCS that exceeds your monitor's 120Hz refresh rate, so you cap frames at 117.

In some situations it does exceed the refresh rate. In some it doesn’t. And DCS gets more demanding all the time. Or maybe DCS gets easier…? Maybe Vulkan lightens the load. Or maybe Vulkan allows ED to pack in more stuff and performance drops again 🤷‍♂️ lots can change year after year. 

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5

Posted (edited)
28 minutes ago, SharpeXB said:

In some situations it does exceed the refresh rate. In some it doesn’t. And DCS gets more demanding all the time. Or maybe DCS gets easier…? Maybe Vulkan lightens the load. Or maybe Vulkan allows ED to pack in more stuff and performance drops again 🤷‍♂️ lots can change year after year. 

Yes, but the entire intent of getting a 5090 is to get a higher frame rate - so if the 4090 is right on the edge of that, unless you get another monitor...you're generating frames (and heat) you can't see.  If the 'magic' smoke and mirrors were to work (not in DCS) then you'll exceed 120, but your monitor can't display that.

And if a game that actually supports MFG drops below a 100-120 native rendering rate - for whatever reason - then MFG is likely to look bad (and worse in things like flight sims).  FG *might* be OK (it has limits too), but that's fully supported by the 40 series; no need to buy into the 50s.

Seems to me no matter how you look at it, you're either buying something you can't use currently for the features that are its primary selling points, or, in order to use those features, you're looking at double the price of the GPU itself - some $4000 - for maybe ~20% gain in any game(s) that don't support the 'magic' (like DCS), or 350FPS using MFG that might be a laggy, blurry mess.
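To put rough numbers on that last figure (purely illustrative arithmetic, assuming the usual 4x MFG multiplier - not measurements):

```python
# Illustrative sketch only: what a "350 FPS with MFG" headline implies.
# Assumes MFG 4x (1 rendered frame + 3 generated frames per render).
def native_rate_for_output(output_fps: float, mfg_factor: int = 4) -> float:
    """Native (actually rendered) frame rate behind a given MFG output rate."""
    return output_fps / mfg_factor

output_fps = 350
native = native_rate_for_output(output_fps)   # ~87.5 FPS really rendered
frame_time_ms = 1000 / native                 # ~11.4 ms per real frame (latency floor)
monitor_hz = 120
displayable = min(output_fps, monitor_hz)     # a 120Hz panel still shows at most 120

print(f"native render rate: {native:.1f} FPS ({frame_time_ms:.1f} ms per real frame)")
print(f"frames a 120Hz monitor can actually display: {displayable}")
```

Which is the point: the native rate behind that 350 sits below the ~100-120 range where MFG tends to hold up, and the monitor can't show most of those frames anyway.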

But hey, you do you.

Edited by kksnowbear


Posted
20 hours ago, LucShep said:

I'm noticing lots of new people in forums, clearly resorting to older games, and also emulators (previous-gen console gaming, on PC), finding out that there's a huge list of quality game titles providing gameplay fun and enough eye candy. More so with modding, which also empowers and prolongs the life of such games, which don't require uber-hardware and are relatively bug-free at this point.

Yeah, this is the way if you want to game for cheap. You can use older hardware and buy games with huge discounts. And in many genres the graphics improvements aren't as big as they used to be, so you don't necessarily miss all that much.

However, there are a bunch of caveats, like VR being hard to do on older hardware, some multiplayer titles having dead servers if they are old, etc.

Posted
51 minutes ago, kksnowbear said:

Yes, but the entire intent of getting a 5090 is to get a higher frame rate - so if the 4090 is right on the edge of that, unless you get another monitor...you're generating frames (and heat) you can't see.  If the 'magic' smoke and mirrors were to work (not in DCS) then you'll exceed 120, but your monitor can't display that.

Yeah, right at the moment the potential improvement wouldn’t be much. The only point of upgrading is to be ready for the future, and that’s only a so-so reason. DCS does take on a nice shine at 120Hz, but getting that at the cost of quality, like enabling DLSS, isn’t worth it IMO. Multi Frame Generation from a 5090 just seems like overkill. And any loss of quality doesn’t seem worthwhile at all. 
 

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5

Posted
3 hours ago, kksnowbear said:

But you've mentioned several times that your 4090 gives you a frame rate in DCS that exceeds your monitor's 120Hz refresh rate, so you cap frames at 117.

You are ignoring his FPS lows & averages and just looking at his FPS highs, which is what the capping is based on.

Also, there is actually a benefit to rendering more frames than the monitor can display: https://www.quora.com/Why-does-higher-FPS-than-my-monitor-can-support-make-games-smoother

Not saying that it is necessarily worth the money, but you might want to learn a bit more about how the rendering pipelines and such work, so you understand that your simplistic model of 'card FPS should match monitor FPS' is not necessarily correct.

Posted (edited)
9 hours ago, LucShep said:

As expected, the RTX5080 is even more underwhelming when compared to previous gen competitors...

RTX5080 review (same as youtube's HUB, but in written format):  https://www.techspot.com/review/2947-nvidia-geforce-rtx-5080/

Quoting:
"Sure, the 5080 was, on average, 11% faster – so at least we hit double digits – but that's still quite underwhelming for a next-gen GPU.
Compared to the original RTX 4080, it's only 14% faster, and when stacked against AMD's nearest competitor, it offers just an 8% gain over the 7900 XTX."

[HUB benchmark average charts: 1080p / 1440p / 2160p]


...and if you consider that the expected "real" price increase (forget MSRP, it won't happen in practice) is far higher than the very small performance improvement over the current RTX4080S, it starts to look like this might be the worst release for an Nvidia GPU series since the maligned FX 5000 series of the early 2000s.  
 

The 17-game HUB average at 1080p: 5080: 176FPS, 4080 Super: 177FPS.  (See below about why a 4080 Super - but even if you used a plain 4080, it averaged 175.)  1 or 2 frames at ~175...that's margin of error.
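If anyone wants to sanity-check that arithmetic, here's a quick pass over the averages quoted above:

```python
# Quick check on the quoted 1080p 17-game averages (FPS figures as published by HUB).
averages = {"RTX 5080": 176, "RTX 4080 Super": 177, "RTX 4080": 175}

baseline = averages["RTX 5080"]
for card, fps in averages.items():
    delta_pct = (fps - baseline) / baseline * 100
    print(f"{card}: {fps} FPS ({delta_pct:+.1f}% vs RTX 5080)")
# Both older cards land within ~0.6% of the 5080 - comfortably inside run-to-run variance.
```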

So much for 'objective gain' in price/performance of rasterization, without factoring in the fake frames, for all but the 5090.

But, to be accurate, the point I raised about that comment was that there was actually no data at that time to support the claim - and there wasn't, of course.

Now the data is available, and we'll just say "it ain't good" 😄 😄 😄 

Per Steve:  Essentially, the GeForce RTX 5080 only managed to match the 4080 Super at this resolution. It seems the Blackwell architecture struggles slightly more than previous generations at lower resolutions, and this isn't always due to a CPU bottleneck.

Essentially, if you wanted this level of performance from a 16GB GeForce GPU, you could have gotten it a year ago.

Some may argue that the RTX 5080 should be compared to the original 4080, but that's nonsense.  The RTX 4080 was essentially a failed product – and that's not even referring to the "unlaunched" AD104 version. The $1,200 RTX 4080 we ultimately received was a disappointment, and most gamers agreed by not buying any. This led to stock sitting on shelves, forcing Nvidia to release the 4080 Super, which was essentially the same GPU in terms of performance but with a $200 price cut.

Maybe if Nvidia lowers the 5080 by $200, it'll be more of an objective gain in price/performance.  Unfortunately, that doesn't seem likely - at least for a while.

Edited by kksnowbear


Posted
6 minutes ago, Aapje said:

Also, there is actually a benefit to rendering more frames than the monitor can display

The trouble with that is screen tearing, which I think competitive gamers just ignore in the quest for low latency. I’m not good enough that latency matters 😆

I understand the conventional wisdom with G-Sync is to turn off Vsync and avoid the latency, then cap your rate at or just below the screen refresh rate. If I cap right at 120 I get a little tearing, hence the 117 setting. I set Low Latency in the Nvidia Control Panel to On; setting that to Ultra, I think, made DCS crash. 
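For what it's worth, that 117 lines up with the usual rule of thumb of capping a few FPS under the panel refresh so G-Sync stays engaged (a minimal sketch; the 3 FPS margin is just the figure most guides suggest):

```python
# Rule-of-thumb sketch: cap a few FPS below the refresh rate so frames never
# outrun the panel and G-Sync/VRR stays engaged. The 3 FPS margin is an assumption.
def suggested_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    """Frame cap slightly below the monitor's refresh rate."""
    return refresh_hz - margin_fps

print(suggested_cap(120))   # 117 - the cap described above
print(suggested_cap(144))   # 141
```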

i9-14900KS | ASUS ROG MAXIMUS Z790 HERO | 64GB DDR5 5600MHz | iCUE H150i Liquid CPU Cooler | ASUS TUF GeForce RTX 4090 OC | Windows 11 Home | 2TB Samsung 980 PRO NVMe | Corsair RM1000x | LG 48GQ900-B 4K OLED Monitor | CH Fighterstick | Ch Pro Throttle | CH Pro Pedals | TrackIR 5

Posted (edited)
1 hour ago, Aapje said:

Not saying that it is necessarily worth the money, but you might want to learn a bit more about how the rendering pipelines and such work, so you understand that your simplistic model of 'card FPS should match monitor FPS' is not necessarily correct.

The thread topic is NVIDIA 5 Series cards, as renamed.  My comments are strictly related to 50 series card performance (including as applies to refresh rates).  Your comments are about how monitors work, and have nothing directly to do with the topic of Nvidia 5 series cards.

I have asked many times to stay on topic, stop the personal attacks and insults, and stop trying to pick a fight with me.

Edited by kksnowbear


Posted (edited)
50 minutes ago, kksnowbear said:

I understand plenty, thanks. Your understanding apparently fails to grasp the concept that a 120Hz monitor physically cannot display more FPS than 120, period.

This statement betrays that you have a very simplistic mental model of the situation, and do not in fact 'understand plenty.'

I'll give you one example to show why you are wrong. The key to this is to understand that thinking in FPS is deceptive and the actual time to render the frame is more important.

If the GPU takes the entire 1/120th of a second to render a frame and can thus barely produce 120 FPS, then this means that the world state (which includes, but is not limited to your controller input) that the rendered image is based on, is going to be at least 1/120th of a second old. But if the GPU takes 1/240th of a second to render a frame, then the GPU can render two frames for every frame that is shown on the screen. Obviously both cannot be shown to the user, since the monitor is not fast enough for that. So what happens is that the oldest frame is dropped and the newer frame is shown. So now the shown frame is based on world state that is at least 1/240th of a second old, which means that you will have lower latency (up to twice as low, although that also depends on the CPU), compared to the situation with the slower GPU.

Of course, this is rather wasteful, since you are rendering frames that are not used. This is actually what Nvidia Reflex is made for. It delays the moment that the CPU processes the latest world state and asks the GPU to render a frame, until the last possible moment. So if it only takes the GPU 1/240th of a second to render a frame, but you have the Nvidia Reflex framerate limit at 120 FPS, then the game is asked to render the world state at such a moment that there is only 1/240th of a second left before the monitor refreshes.

Of course, DCS does not support Reflex, which requires the game to cooperate, but the more wasteful method does work for DCS to lower the delay between what happens in the world state, and what is shown on the screen.

So there is a real benefit, although that doesn't mean that it is necessarily worth the downsides.
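A minimal sketch of that arithmetic (idealized: only the GPU's own frame time is counted, which is an assumption; real latency also includes CPU time, queueing and display scan-out):

```python
# Idealized illustration of the frame-time argument above.
def newest_frame_age_ms(render_fps: float) -> float:
    """Age of the world state in the most recently completed frame (GPU time only)."""
    return 1000 / render_fps

monitor_hz = 120
for render_fps in (120, 240):
    age = newest_frame_age_ms(render_fps)
    shown = min(render_fps, monitor_hz)   # the panel still displays at most 120 of them
    print(f"render at {render_fps} FPS -> newest frame is ~{age:.2f} ms old, "
          f"{shown} frames/s actually displayed")
```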

Edited by Aapje
Posted (edited)
7 minutes ago, Aapje said:

This statement betrays that you have a very simplistic mental model of the situation, and do not in fact 'understand plenty.'

I'll give you one example to show why you are wrong. The key to this is to understand that thinking in FPS is deceptive and the actual time to render the frame is more important.

If the GPU takes the entire 1/120th of a second to render a frame and can thus barely produce 120 FPS, then this means that the world state (which includes, but is not limited to your controller input) that the rendered image is based on, is going to be at least 1/120th of a second old. But if the GPU takes 1/240th of a second to render a frame, then the GPU can render two frames for every frame that is shown on the screen. Obviously both cannot be shown to the user, since the monitor is not fast enough for that. So what happens is that the oldest frame is dropped and the newer frame is shown. So now the shown frame is based on world state that is at least 1/240th of a second old, which means that you will have lower latency (up to twice as low, although that also depends on the CPU), compared to the situation with the slower GPU.

Of course, this is rather wasteful, since you are rendering frames that are not used. This is actually what Nvidia Reflex is made for. It delays the moment that the CPU processes the latest world state and asks the GPU to render a frame until the last possible moment. So if it only takes the GPU 1/240th of a second to render a frame, but you have the Nvidia Reflex framerate limit at 120 FPS, then the game is asked to render the world state at such a moment that there is only 1/240th of a second left before the monitor refreshes.

Of course, DCS does not support Reflex, which requires the game to cooperate, but the more wasteful method does work for DCS to lower the delay between what happens in the world state, and what is shown on the screen.

So there is a real benefit, although that doesn't mean that it is necessarily worth the downsides.

 

The topic was explicitly renamed to "NVIDIA 5 Series cards". This was made very clear.  Nothing in your post has anything to do with the topic.  Please stay on topic and stop the insults and personal attacks.

Edited by kksnowbear


Posted (edited)
7 minutes ago, kksnowbear said:

The topic was explicitly renamed to "NVIDIA 5 Series cards". This was made very clear.  Nothing in your post has anything to do with the topic.  Please stay on topic and stop the personal attacks.

Seriously? You talked about this as well, so then you somehow weren't concerned about it being off topic. Yet once you were shown to be wrong, you suddenly discovered that this is off topic and you sadly can't respond to my comment. How convenient.

Also, you are again making a false claim of being a victim of a personal attack, even though I merely said that you have an inaccurate understanding, which is not a personal attack at all. If one is not allowed to say that someone else is wrong, then a friendly argument becomes impossible.

You might want to read up on what a personal attack actually is, and that does not include disagreement with someone's beliefs: https://en.wikipedia.org/wiki/Ad_hominem

Edited by Aapje
Posted (edited)

I simply responded to your assertions concerning my prior comments - which were absolutely on topic.  Yours were not.

Please stay on topic and stop trying to pick a fight with me.  I've asked repeatedly.

Edited by kksnowbear


Posted (edited)
4 minutes ago, kksnowbear said:

I simply responded to your assertions concerning my prior comments - which were absolutely on topic.

Please stay on topic and stop trying to pick a fight with me.  I've asked repeatedly.

You are the one going off topic by playing the victim, which is a strategy that you have employed repeatedly. Please respond to the actual arguments.

Edited by Aapje
Posted

Please stay on topic and stop trying to pick a fight with me.  I've asked repeatedly and politely.

Free professional advice: Do not rely upon any advice concerning computers from anyone who uses the terms "beast" or "rocking" to refer to computer hardware.  Just...don't.  You've been warned.

While we're at it, people should stop using the term "uplift" to convey "increase".  This is a technical endeavor, we're not in church or at the movies - and it's science, not drama.

Posted (edited)

I was being on topic by addressing whether the 5090 can provide a benefit even if you are already capped on your highs. You are the one who went off topic with your little victim spiel, once I showed that your claim is wrong.

PS. Repeating a falsehood does not make it true.

Edited by Aapje
Posted (edited)

A 50 series GPU cannot increase a monitor's maximum refresh rate. A 50 series GPU cannot make a monitor display frames at a rate higher than its maximum refresh rate.  It doesn't work that way.

That's your response.

The topic has nothing to do with 'victim spiel'. Please stop the personal insults.  I've asked you many times, politely, to stop.

Edited by kksnowbear


This topic is now closed to further replies.