

Everything posted by kksnowbear
-
Actually, the overwhelming majority of gamers still play at lower resolutions than 4k*. Competitive gamers in particular almost universally prefer 1080p: it's cheaper, with much higher frame rates and refresh rates. Many of them make money gaming...so that seems pretty serious to me. And the fact that CPUs can't keep up only makes the value proposition of more expensive GPUs even less compelling. *According to at least one source that's fairly reputable, anyhow.
-
Yeah...I'm gonna go with the perspective of an actual world-renowned expert, just the same. My guess is I'm far from the only one who will share Steve's perspective. Maybe he just doesn't understand the science either. PS, I believe Steve was considering MSRP in his comment about no increase in cost per frame. He actually says "Yeah it's probably going to be pretty damn ugly in terms of value, because at $2,000 US it's already pretty ugly..." Perhaps you missed that. But, like me, his comments indicate he seems to believe MSRP is a farce. In any case, the more someone actually pays, the worse it gets; even worse than "no improvement".
-
The performance increase also isn't anywhere near 25% unless you're running 4k. For others running far more common resolutions like 1080p and 1440p, it's down as low as single digits. (And I'm pretty sure Nvidia hasn't said this GPU is only for people with 4k monitors...unless they intend that everyone also has to buy a 4k monitor now, just to get a "meh" increase for the cost of the 5090 *and* a 4k monitor...)

Steve at HUB said it best: "After a little over two years, we're seeing no improvement in cost per frame." And that's at 4k and considering MSRP...so it's actually worse at lower resolutions and/or higher prices - hard as it is to imagine something worse than "no improvement".

He also discusses MSRP vs "real" pricing toward the end of the video. He addresses the question of a 5090 actually being available at MSRP ("...do we honestly believe the 5090 will be $2,000 US? Do we really believe that? Yeah it's probably going to be pretty damn ugly in terms of value, because at $2,000 US it's already pretty ugly...").

So, no improvement in "cost per frame", even at 4k - according to a widely respected, competent reviewer, based on first-hand measurements and factual data, as opposed to baseless speculation and uninformed opinion. Oh, and I believe it's fair to say the 5080 is generally expected to be worse. As I said, even before the reviews were out: Just don't do it.
-
RTX 50 No hot spot sensor data?
kksnowbear replied to AngleOff66's topic in PC Hardware and Related Software
Just another strike against improved price v performance IMO. Not at all surprised by anything Nvidia does anymore TBH. -
Sure, when you present data to back up your opinion, rather than stating it as if it's fact without any proof. This is really simple: When you make a statement as if it's fact, be prepared to substantiate your position. Show data; have proof. Conversely, don't represent opinion as if it's fact. Don't get all ticked off because someone asks you to show data that you know you don't have. (Note there still is no data on pure rasterization performance of the 5080 and below, so...you couldn't very well have had data when you made the statement.) Unless you're a reviewer, in which case you'd likely just say you were under NDA. But I get a feeling you're not.
-
More name-calling. I asked if you could stop the personal attacks. Guess not. Well, at least now that we do have reliable data, I guess the matter of price performance is pretty much resolved. No improvement in cost-per-frame. Again, you can lead a horse...I am very glad that at least the majority seems to get it. 5080 tomorrow...one down, one to go
-
Thank God for HUB...one chart to nicely summarize average performance for each resolution (why GN Steve can't get it is beyond me...then again, I consider the source). And the results? Improvement over 17 games, average FPS at 1080p: 204 vs 202 FPS - effectively zero and within margin of error. No human will ever be able to see the difference of two frames when already over 200. Even at 1440p, the average improvement across 17 games was 12.2%: 192 frames vs 171; 21 frames when you're already getting 170+. Again, no human can reliably tell the difference.

HUB Steve also discusses the MSRP vs "real" pricing value toward the end of the video. Nice of him to include the perspective I've been discussing all along (and getting jumped on for it). His words (21:29): "After a little over two years, we're seeing no improvement in cost per frame." (And that's at 4k, so it only gets worse at lower resolutions.) He addresses the question of a 5090 actually being available at MSRP ("do we honestly believe the 5090 will be $2,000 US? Do we really believe that? Yeah it's probably going to be pretty damn ugly in terms of value, because at $2,000 US it's already pretty ugly...") And that's a competent, highly reputable reviewer. "Reliable enough", as it were. (Of course, I guess his science is all wrong too.)

So much for "objective gain in price performance". Not really. You're confining your perspective to people more like yourself, is my guess. Of the people I build machines for - we're talking probably 100 machines over the past few years, at least - I've done exactly one for someone using 4k. The others - and this includes older flight simmers, younger 'twitch gamers' and MMO/RPG players - were running 1080, 1440 or some wider aspect ratio variant of those. All but 1 of 100 don't even want 4k, for their own reasons. I've actually told people that even a 4090 is overkill for 1080.
But I'm still getting calls - from people running 1080p - asking when I can get them a 5090 *LMAO* Of course, I (do my best to) talk them out of it...but the fact that I have to talk them out of it illustrates the problem. It illustrates why Nvidia marketing BS and bragging rights are completely out of control. People are spending tons of money that many of them don't honestly have, for crap that isn't really going to do much of anything for them. Wanting 300+ frames when they use a 165Hz (or even 240) monitor and they're getting 250 already. Nvidia wants people to forget empirical and quantifiable data, because then they'll spend money they don't need to, on promises that don't pan out. They want people to buy on impulse, based on marketing blather, plain and simple. It's wrong, it's misleading, and it's harmful to gamers in general. And yes, that's an opinion...but it seems to agree with the data. You know, science.
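For what it's worth, the percentage figures above are simple arithmetic on the 17-game FPS averages quoted from the HUB review; a quick sketch (no new data here, just the numbers already cited):

```python
# Sanity-check of the percent-improvement figures quoted above.
# FPS values are the 17-game averages from the HUB review, nothing new.

def pct_improvement(new_fps: float, old_fps: float) -> float:
    """Percent FPS gain of the newer card over the older one."""
    return (new_fps - old_fps) / old_fps * 100

print(round(pct_improvement(204, 202), 1))  # 1080p: ~1.0% - within margin of error
print(round(pct_improvement(192, 171), 1))  # 1440p: ~12.3% with these rounded averages
```

Rounding of the underlying per-game averages explains the small gap between this and the video's quoted 12.2%; either way, it's a single-digit-to-low-teens gain at the resolutions most people actually run.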
-
Pretty sure he states that kind of increase is "at 4k". I believe that's still not the majority of gamers, by far. He goes on to acknowledge that at the more common resolutions of 1080p and 1440p, the advantage - even when it's not really CPU bound - "really starts to come down" (~00:50). So I guess, according to Nvidia's reasoning, we all must also pay for 4k monitors to get the benefit of paying for a 5090. Would've been appropriate to show a combined chart of the games tested at 1080p and 1440p...but I guess that's not how Steve at GN sees things lol...as it is, you have to plod through manually to see that the lower resolutions are...well, much less favorable, we'll say (I'm admittedly still plodding through). I've already seen quite a few games with 6 or 8% improvement at 1080/no 'magic'.
-
Just read Nvidia has bumped the reviewers embargo on the 5080...(not the 5090 mind you, just the 5080). I bet the reviewers, who already expended resources working with 5080s and now will probably wind up starting over, are really pleased with Nvidia right now. Hard to believe some people can't see all this nonsense for what it is. Oh well. Ya can lead a horse...
-
What you're now describing is various people's opinions about whether something or someone is (or isn't) reliable. Unfortunately that's not the same as the meaning of "reliable" itself. Reliable means consistently good in quality or performance; able to be trusted. There's no opinion in what "reliable" means. Something is reliable or it's not. It's consistently good in quality or it's not.

And again with the personal attacks! LOL How, exactly, do I prevent people from having a different opinion? There are pages in this very thread to show you're wrong about this, too: If it were true that I'm keeping anyone from having an opinion of their own, then I'd have prevented anyone from arguing with me here. Yet that's not happening. I'm simply expressing my own perspective, calling BS when I see it. You're just ticked off because you had no data to back up your claim when I called you on it, simple. All this going on about science and why numbers can't be empirical is just subterfuge to obscure the fact that you made an unqualified statement when you had no data to support your assertion. In this mysterious "scientific" world of yours, nothing can be quantified, making it very convenient for people like you, who want to say anything they want and have it accepted as fact.

In the real world (where I earned a living in a very technical industry), there are quantifiable values. There are empirical proofs. Electronics don't work on opinions, and computers don't work on guesses. It's still the mathematical difference between a 1 and a 0.

Now you assert that all advice is unreliable. I guess it's impossible, then, that I've been professionally compensated my entire adult life based on my perspectives about how computers work. Fact is, many people here and on other gaming/sim forums have at times trusted my advice as "reliable enough" to spend their hard-earned money on.
It's really hard to imagine successful businesses and individuals with well-functioning machines would trust someone with a complete and utter lack of understanding about how all this works. But then, in your world where nothing is empirical and reliable, $100 is apparently not $100, and we can all pay Nvidia with fake bills in exchange for their fake frames. Now we're talkin'. PS I'm an honorably discharged veteran who spent about a quarter of my adult life defending people's right to have and express their views, and I find any suggestion to the contrary genuinely offensive. I'd really appreciate it if you laid off the personal attacks and stuck to the actual subject.
-
Adequate Storage Suggestion!
kksnowbear replied to mytai01's topic in PC Hardware and Related Software
Yes, the faster the storage the less the impact, generally. However, as I mentioned it will still happen at times, and ideally it would be completely eliminated. The cost of a separate, small boot drive is a tiny fraction of a typical gaming machine these days, so it's not usually a big obstacle. There are other benefits to having a separate OS drive as well. And it is a great idea to include storage considerations as part of designing any build. -
Adequate Storage Suggestion!
kksnowbear replied to mytai01's topic in PC Hardware and Related Software
It's always better for performance to have separate drives for the OS and any game(s). Otherwise, requests for data from a single drive will cause "contention", which happens when multiple processes try to get data from a single storage device at the same time. Regardless of how fast the storage device is, somebody's having to wait. How often it happens, and to what extent it causes delays, will vary - but it will happen. The easiest way to avoid this is to keep things on different drives, as much as practical/within reason. Naturally, it does have to be balanced against cost, but that needn't always be prohibitive. And as mentioned above, most full-sized (ATX) motherboards have two or even three M.2 slots - but it does depend on what generation of board, and smaller "micro ATX" models have fewer. If you can provide your motherboard model number, it will help confirm details. -
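To illustrate the contention point above, here's a deliberately oversimplified toy model: each drive serves its queued requests one at a time. The request counts and the 1 ms service time are made-up figures, not measurements; real drives queue and reorder requests far more cleverly, but the basic serialization effect is the same.

```python
# Toy model of storage "contention" - an illustration, not a benchmark.
# Assumption: each drive serves its queue strictly in order, 1 ms per request.

def worst_case_wait_ms(requests_per_drive: list[int], ms_per_request: float) -> float:
    """Time until the busiest drive finishes its whole queue."""
    return max(q * ms_per_request for q in requests_per_drive)

# Hypothetical burst: 40 OS requests + 60 game requests, 1.0 ms each.
shared   = worst_case_wait_ms([40 + 60], 1.0)  # OS and game on one drive
separate = worst_case_wait_ms([40, 60], 1.0)   # OS drive and game drive

print(shared, separate)  # 100.0 60.0 - on the shared drive, somebody's waiting
```

Same total work either way; splitting it across drives just means neither process sits behind the other's queue.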
Again with the personal attacks. And nope, I don't think any such thing. I just don't call something "proof" unless it actually is proof (even though I can be mistaken, and make corrections as needed). My "personal threshold of what is valid" is based on observable fact. Empirical measurement. Reliable data. Not speculation and opinion, or bullsh*t marketing. I'm not sure why that offends you, other than because it's your opinion I'm not just going to accept without question. And I already explained I have no problem with expression of opinion; I enjoy discussion involving different perspectives. But if someone represents something as fact, they need to have data to support that. Like the grade school math teacher tells us: Show your work. Otherwise there's no way for the reader to actually trust anything written here; basically *all* advice becomes unreliable. People can just make up whatever they want, since nothing is ever subject to scrutiny. That's why facts and reliable data are important. They help separate bad ideas from good ideas, and what is BS opinion vs what is trustworthy advice.
-
Nope. The fact that you have to use the word "enough" means you're talking about an opinion. I did not use the word "enough". Reliable data means exactly what it says; it is not subjective. $100 is $100. 100mph = 100mph. These are empirical fact, not subjective. Nope. Again, you used the word "enough", which changes the entire context from objective to subjective. I didn't say "reliable enough".
-
Uhh...no. Again: the continued discussion about MSRP is because you cannot have any meaningful price/performance assessment without reliable data for both price and performance. No real prices are in effect, since you cannot buy what nobody has in stock. And no real performance data is available, because Nvidia has only published biased marketing drivel. No reliable data means no valid price/performance comparisons are possible...but some in this thread continue to act as if valid conclusions are possible. How's that, when no accurate performance data is available yet (not counting marketing BS)? How do you establish price v performance based on prices that apply to stock no one has?

And I don't know where your idea of MSRP comes from, but this isn't car sales. MSRP in this market is typically the lowest price you'll see. So, since most any price for actual in-stock hardware will be higher (if you can get the cards at all), price v performance will be worse at "street" price than if you could buy cards at MSRP. The significance of MSRP (price) and fake frames etc. (performance) is that these things mean Nvidia's trying to convince consumers they're getting more (fake frames) for less (MSRP). The reality is exactly the opposite: you're getting less performance for more money. That's what many people are pissed about. You can't discuss price v performance without meaningful, accurate data for both price and performance. Hopefully that clarifies things for you.

(This was actually your quote, not sure why the site is saying it's mine.) Anyway, this part is actually correct - which is another reason AIB MSRP is not really the number to use if you're trying to prove price performance is better, like some here are: it'll only come out worse by using a higher value for price (more cost, roughly same performance). You'd actually want the lowest possible cost (Nvidia MSRP) and the highest performance data.
Trouble is, as before, Nvidia MSRP isn't reliable because it won't apply to most of the people most of the time; almost certainly not right at release.
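To put rough numbers on it: cost per frame is just price divided by average FPS, which is exactly why it needs reliable figures for both inputs. A sketch using numbers already mentioned in this thread (the $1,599 4090 MSRP, the $2,000 5090 MSRP, and the 1440p 17-game FPS averages); the $2,400 "street" price is purely hypothetical:

```python
# Cost per frame = price / performance, so both inputs must be reliable.
# Prices and FPS averages are figures quoted in this thread;
# the $2,400 street price is a hypothetical above-MSRP example.

def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    return price_usd / avg_fps

cpf_4090        = cost_per_frame(1599, 171)  # 4090 at Nvidia MSRP, 1440p average
cpf_5090_msrp   = cost_per_frame(2000, 192)  # 5090 at Nvidia MSRP, 1440p average
cpf_5090_street = cost_per_frame(2400, 192)  # same card at a hypothetical street price

print(round(cpf_4090, 2), round(cpf_5090_msrp, 2), round(cpf_5090_street, 2))
# 9.35 10.42 12.5 - dollars per average FPS; higher is worse
```

Even granting the best case for the new card (Nvidia MSRP), the number goes the wrong way at 1440p; any street price above MSRP only pushes it further.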
-
Well, I'm not 100% sure what you mean...but I do believe that's what I've been saying all along. You can't discuss price v performance without solid data on either one. So far no one here has this data (in spite of trying to sound like they do). And using Nvidia's BS marketing to establish meaningful values for either price or performance is delusional.
-
I think you missed my point (about you buying a 5090)... But, whatever. I'm considering selling my 4090, but for a reasonable and fair price. I wouldn't ask scalpers' prices for anything, ever. Guess that's just me. Now if you'll excuse me, I gotta go talk to a guy I bet that I could get you to say you were going to buy a 5090 before they were even reviewed...
-
You were likely to be among the first to buy a 5090 to begin with, in my estimation. And buying one, regardless of price, doesn't change the fact that performance for the price is said to be "disappointing" even at MSRP (although maybe you can get it for a lot less than MSRP...that's what it would take). But if you wanna spend that kind of money on a GPU that's "disappointing" right out of the box, just to make yourself feel like you're right, by all means. Wouldn't surprise me in the least.
-
You did not get your 4090 for Nvidia's $1599 MSRP. This is really, really simple. The fact you paid more for an AIB card, but still only got a 4090 isn't exactly helping your argument. As I've repeatedly explained, most sensible people and a good number of (actual) experts are calling the whole thing for what it is. So you're not arguing with me, you're arguing with all of them.
-
Uh, no. I've identified my perspectives as just that. I also stated quite clearly I have no problem with expressing opinion, perspectives etc. The problem is when someone, like you, wants to post opinion and have it treated as fact. I explained why it's misleading and harmful. I've already quoted you several times, showing you were representing your opinions as fact. If you'd like, I can quote you again: "There is objectively a gain in price/performance of rasterization, without factoring in the fake frames, for all but the 5090" You have no factual data upon which to base comments like this, but it is stated as if it's fact. I asked multiple times, but the truth is you don't have it. That means your claim about rasterization is (at present) an opinion. Since this was also the basis for your assertion regarding price/performance, that claim is also unproven as of when it was stated. It was speculation, just as I said.