Everything posted by Waxer
-
I am done with this thread. Too many narrow-minded "people with limited understanding" diverting the thread into CPU issues - without reading the context of the thread.
-
The i5 10600K is a very interesting chip. And for a gamer, good value.
-
Good data. Thank you, mate. If I get one of these rather than Navi 21, then I will go Founders, Asus TUF or Asus Strix. Maybe FTW3 if I can get one without paying silly prices or waiting till 2021!
-
Base model Zotac = bad. Base model EVGA XC3 = "within specifications". Base model Asus TUF = using the more expensive capacitors. Testers found the TUF one of the best budget-end cards of the 3080 AIBs and a better performer than the Founders Edition. Asus engineers did a good job. You can take comfort in Jacob's words about the XC3, but the hardware re-engineering was limited to the FTW3. The XC3 was not intended for pushing overclocks, and the likelihood is that such activity will be hardware limited and that EVGA will push a BIOS that prevents the failure / crash limit being reached.
-
JayzTwoCents has a video where he takes the backplate off the XC3 to show the cheaper capacitors. The solution is likely a software patch to lower the allowed overclock bins. I'm pretty confident the higher-specification FTW3s would not repeat the same mistake. Edit: the XC3 uses 5 of the cheaper caps, not 6, with 1 bank of the more expensive mini caps (not 2, not 0). And EVGA's position is that for the XC3 this is "within specifications", i.e. don't expect stability if you push the overclock. FTW3: EVGA re-engineered the card to remove 2 of the cheaper caps and install two banks of the many smaller caps. This explains the shipping delay at launch, but the card should be good for retail release.
-
Never mind: [interesting. Please share details: specifically which AIBs?] For people interested in the details, look at the latest Igor's Lab or JayzTwoCents videos. Low-end EVGA, Zotac and Gigabyte cards are all using lower-specification capacitor arrays on their low-end boards. This may not be the case (I hope not) on the more expensive boards.
-
Sure, sure. But the OP made an unsubstantiated comment. He was told he was wrong. He doubled down and offered proof. So yeah... I'd like to see nessuno0505's data. I'll laugh if he links to the pre-release leaker that was showing game benchmark results extrapolated from synthetic benchmarks. But yeah... let's see what you've got, nessuno0505.
-
I'd still like to see his data.
-
Sure: go ahead. Let's see your source.
-
Right. Somewhere else on this forum - can't remember where - people were insisting that their mid-range 650W, 750W and 850W PSUs would definitely be sufficient. What these people don't seem to understand is that hitting the overall rated wattage - let's take the middle case, a typical nice mid-range 750W PSU - is NOT THE ONLY ISSUE. Far more fundamental is that these PSUs have detailed specifications limiting the power draw on individual rails. If peak current draw on a single +12V rail exceeds the maximum specified, the user can exceed the limit of that rail even while the overall power draw of the PSU is within specification. What happens then? A system crash, artefacts, or an unexpectedly slow, unstable system.

The GDDR6 used on Turing cards and many earlier generations of Nvidia GPUs is not error correcting: if the GPU asks for more current and it is not supplied quickly enough, non-error-correcting memory will often exhibit artefacts on screen before ultimately crashing the system. The GDDR6X used on Ampere is error correcting: if the GPU detects memory inconsistencies, it tries to resolve them by redoing calculations. Hence before a crash your system might just run slowly, and only if the GPU really can't figure out what the hell is going on might it crash.

Igor's Lab has done testing showing that the GDDR6X on the Founders Edition 3080 runs at c.100 deg C with the backplate on. The memory is rated to 105 deg C, so that is within specification... but it is awfully close. For that reason - while Micron produces GDDR6X capable of running at 21Gbps - Nvidia chose 19Gbps-rated GDDR6X for the 3080 and 3090. Had they fitted the higher-clocked parts, they would have had bigger problems cooling the memory and the memory controller of their 8nm Samsung GPU die.

So on top of all the usual reasons for RMAs (manufacturing defects, component failures etc.) we have two additional sources from weak design / design compromises resulting from Nvidia's choice of Samsung 8nm and GDDR6X: 1) high system power draw, causing instability on systems with insufficiently powerful (or old, degraded) PSUs, and 2) crashes caused by system components - very often the memory - getting too hot.

Note that electronic components wear out: electromigration in the silicon happens faster when chips run close to their maximum temperatures for long periods. Hence the importance of 3-year-plus warranties and an AIB service centre with customer-friendly turnaround. I am not saying don't get Ampere... I might get one myself. But I am saying buyer beware. And AMD's offering this season deserves a look, as it uses a more power-efficient TSMC node for the GPU and GDDR6, which is likely to be easier to cool. This is why Nvidia's Ampere Quadro cards use the "inferior" GDDR6, not 6X. "Moore's Law Is Dead" did a good video on this for people interested in the detail. You might not care about a melting polar ice cap. Maybe you do. Maybe you don't. But I do expect you care whether your GPU works and keeps working over its lifecycle.
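To make the per-rail point concrete, here is a minimal back-of-the-envelope sketch. The wattage spike and the 30A rail limit below are illustrative assumptions, not figures for any specific GPU or PSU model - check the label on your own unit.

```python
def rail_current_amps(watts: float, volts: float = 12.0) -> float:
    """Current drawn on a supply rail: I = P / V."""
    return watts / volts

# Hypothetical transient: a 3080 spiking to ~450 W across its +12V feed,
# on a multi-rail PSU whose individual +12V rails are limited to 30 A.
gpu_spike_w = 450
single_rail_limit_a = 30.0

current = rail_current_amps(gpu_spike_w)  # 450 / 12 = 37.5 A
status = "OK" if current <= single_rail_limit_a else "exceeds rail spec"
print(f"{current:.1f} A vs {single_rail_limit_a:.0f} A rail limit: {status}")
```

The point: a 750W unit has headroom overall, but if that transient lands on one 30A rail, over-current protection can trip and the system crashes even though total draw looks fine on paper.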
-
I've seen that claim and it is just not credible. Either some settings have been changed, or there is a CPU or memory bottleneck.
-
No. No. No. More like 10% in the best-case scenario of 4K gaming, and 18% only in benchmarking. It is all in the thread already, if you bother to read it. (Or read or listen to the numerous reviews out there.)
-
Crikey... that sounds like an advert for the 3090. :megalol: We really need the fps number for a 3080 under the same test conditions to put that in greater perspective.
-
I am thinking along the same lines as you guys. My intention is to wait until October and see what the 6900 XT is like. While I think it will be good, I am not expecting it to surpass the 3080, except in memory capacity. There might be specific titles where it outperforms, but I doubt it will be even a marginal win overall, let alone a significant one. It might be a bit cheaper, but even at a c.$100 difference I would rather get the higher-performing card.

AMD does have ray tracing (not that I care), but its DLSS equivalent is not as powerful as DLSS 2.0 yet, so big advantage to Nvidia there. And overall Nvidia's software support is stronger. This might well change over the years, with software development done primarily for the new consoles, but by the time that has happened it will be GPU upgrade time again.

The 3080 / 20GB will be interesting for sure, but if the 6900 XT does not clearly beat the 3080 / 10GB then I imagine Nvidia will be quite greedy on the pricing of that card, especially considering how high demand was at the 3080 / 10GB's launch. Meanwhile I am not yet convinced that 20GB of GDDR6X or 16GB of GDDR6 is necessary for DCS at 4K. 10GB will probably be fine. Even if you find yourself hitting GPU memory limits, all it takes is lowering one or two settings and you are off to the races.

I will probably get a 3080 / 10GB: either the beautifully designed Founders Edition, a Strix or an FTW3. But I still want to wait and see what the 6900 XT is like. ... Besides... I wasn't as fast as the bots on launch day!
-
Screen res: I actually use a 5K monitor, so c.15 million pixels compared to a 4K monitor's c.8 million. So yeah, I am interested in 1440p --> 4K performance scaling because it gives me a taste of what to expect at 5K. (And this explains why I get peeved with people talking about CPU bottlenecks who refuse to accept that someone else with a different system could be GPU bottlenecked.)

CPU: on the comment about "not much you can do to improve CPU performance" - assuming you are already running a Comet Lake at 5.0-5.3 GHz, yes and no. Clearly if you have such a rig you will already be using a relatively fast, low-latency memory kit such as 3200 CL14 or 3600 CL16. But I've seen people get a few extra percent of performance from a 10xxx-series chip using expensive Samsung B-die overclocking memory and tuning the timings carefully. Here is a link to one such guy who has managed to squeeze a little extra performance from his system. Note that he is using an Asus Z490 Apex motherboard. (The other option would be the unobtainium EVGA Z490 Dark.) https://imgur.com/a/OkMl9sN Results from memory testing on an earlier build of his: https://kingfaris.co.uk/ram/15
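For anyone wondering where those pixel counts come from, here is the quick arithmetic, assuming 5120x2880 for 5K and 3840x2160 for 4K UHD:

```python
# Pixel-count comparison between a 5K and a 4K panel.
five_k = 5120 * 2880   # 14,745,600 pixels (~15 million)
four_k = 3840 * 2160   # 8,294,400 pixels (~8 million)

# The GPU has ~1.78x the pixels to shade at 5K vs 4K.
print(f"5K/4K pixel ratio: {five_k / four_k:.2f}")
```

That ~1.78x ratio is why 1440p --> 4K scaling results are a useful proxy: the workload jump is of a similar order to going 4K --> 5K.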
-
We've got some folks in the community upgrading from the 2080 Ti, so rather than speculate about DCS by extrapolating from MSFS 2020 (they are quite different implementations of a flight simulator, and MSFS is also very early in release and will no doubt get lots of optimisation in future), let's just wait a little and get feedback from the early adopters within our community. And yeah... I have no patience for a CPU-bottleneck discussion. It has been done to death. Yes, we know it is important, and it could be an issue for some people with some setups. But let's wait and see people's actual experience.
-
Wasn't much luck for the poor soul having a heart attack either. Hope you were able to help them. :thumbup:
-
The testing shows that the 3090 is at most 18% better than the 3080 at 4K, and 4K results could be as little as 4%. The typical / average result is 8-11%, so Nvidia is not flat-out lying, but erring on the side of generosity to their new product, as you would expect. Some people just want the best because they can. (They make money, and they don't spend it on racing yachts or private jets etc.) And don't overlook the fantastic Founders Edition cooler: at 3 slots, I expect it to perform very favourably against the AIBs. The 3080 Founders Edition cooler suffered in comparison to the AIBs because it was only 2 slots against their two-and-a-half to three, so for a 2-slot solution it was punching above its weight. Anyway, I would much rather have a 3090 at RRP than overpay for a 3080 right now.
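For clarity on how those uplift percentages are derived (the fps figures below are hypothetical, just to illustrate the arithmetic):

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage performance uplift of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical 4K numbers: a 3080 at 100 fps and a 3090 at 110 fps
# would be the ~10% "typical" gap quoted above; 118 fps would be the
# best-case 18% seen in benchmarks.
print(round(uplift_pct(110, 100), 1))  # 10.0
print(round(uplift_pct(118, 100), 1))  # 18.0
```

Worth keeping in mind when reading reviews: a 4-18% spread at 4K means the gap often disappears into run-to-run variance at lower resolutions.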
-
I had a chat with a very nice / helpful sales guy at one of the stores. He basically told me that they had 3080s (the conversation was not about the 3090) in stock currently. However, because "they didn't make much money from the GPUs", all of the 3080s they had were reserved for their full system builds, where overall sales and margins are a lot higher. They expect to free up GPU stock for self-builds / upgrades in October. I doubt many system builds will include 3090s, and I imagine margins are a little better, so maybe retailers are more willing to pass those on without keeping many back for system builds? I did hear of some nutters on Reddit buying entire systems just so they could get their paws on 3080s at launch. Seems a bit ridiculous to me. But that's capitalism for you.
-
There you go: enjoy! And remember your mates on the forum... let us know how much of an improvement from the 2080 Ti in DCS.
-
Ahh, that sucks. OP is in Sweden, not the UK. I guess we are a bigger market for the bots to target. Never mind... you get to wait until all the information is in to make your choice. I remember you saying that you were more focused on ultimate performance than on price, but even so you will be armed with the data and hopefully have DCS benchmark numbers to make your decision. Even if you still go for the BFGPU in the end! Besides, the 2080 Ti is still no slouch.
-
That is great. Congrats. Glad the actual humans got in on this one.
-
Well... this isn't the same as RMA figures over a 12 month period, but this is an inauspicious start. https://videocardz.com/newz/geforce-rtx-3080-sees-increasing-reports-of-crashes-in-games
-
Upgrade? Yes, if you can easily afford it. *** BUT *** You are literally 2 weeks away from the announcement of Zen 3. Sure... the bots will be rebooted for action, but I would wait myself. I know you can always make this argument... but 2 weeks...