Waxer

Everything posted by Waxer

  1. AMD will be announcing Zen 3 in two and a half weeks. And... it is going to be good. Very, very, very good. Will it beat Comet Lake in gaming? Yes, I believe so. Will it beat Rocket Lake? Maybe, but even if it doesn't it is likely to be a dead heat at anything above 1080p. Big IPC gains, and the clock speed gap with Intel closes by another 200MHz or so. I would be tempted by a 10600K, but knowing Zen 3 is coming out, waiting is a no-brainer. (Well, only if you intend to upgrade, that is.)
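A back-of-envelope on what those rumours would add up to, if they pan out. The 4.9GHz boost and ~20% IPC figures are the rumoured numbers mentioned later in this thread; the ~4.7GHz Zen 2 baseline is my own assumption:

```python
# Single-thread performance scales roughly with IPC x clock.
zen2_boost_ghz = 4.7   # assumed Zen 2 baseline boost clock
zen3_boost_ghz = 4.9   # rumoured Zen 3 boost clock
ipc_gain = 0.20        # rumoured IPC gain vs Zen 2

uplift = (1 + ipc_gain) * (zen3_boost_ghz / zen2_boost_ghz) - 1
print(f"Estimated single-thread uplift: {uplift:.1%}")  # ~25.1%
```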
  2. No. Your maths is wrong: pay 114% more for a 4-11% gain in games. :megalol: https://videocardz.com/newz/nvidia-geforce-rtx-3090-gaming-performance-review-leaks-out The 20% figure is from synthetic benchmarks only. And, no. You don't need to say anything.
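The sums behind that, assuming the $699 / $1,499 US MSRPs (a sketch, not a review):

```python
# Price premium of the 3090 over the 3080 vs the leaked gaming gains.
price_3080, price_3090 = 699, 1499    # assumed US MSRPs
premium = price_3090 / price_3080 - 1
print(f"Price premium: {premium:.0%}")   # ~114%

for gain in (0.04, 0.11):                # leaked 4-11% in-game range
    value = (1 + gain) / (1 + premium)   # fps per dollar vs the 3080
    print(f"+{gain:.0%} fps -> {value:.2f}x the fps per dollar")
# +4% fps -> 0.48x, +11% fps -> 0.52x: roughly half the value per dollar
```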
  3. The video is 95% useless. All he does is multiply the 3080's results by 20% on the basis of synthetic benchmark results. Meanwhile a Chinese site has leaked in-game results at 4K on a Z490 / OCed 10900K system. Bottom line: 4-10% better fps in a selection of AAA games. Not 20%.
  4. There is nothing fundamentally wrong with this shopping list. In some areas what has been selected is overkill for DCS, but if the PC is being used for other applications too then there could be good reasons for this. Some specifics:
     1. Getting the K rather than the KF = good suggestion. The price differential is small, it makes near as no difference to overclock potential, and the iGPU gives you the option to keep using the computer (get online, web browse etc.) if you are ever without a graphics card while you wait for a new GPU to be delivered.
     2. No need for 10 cores; the 8-core i7 is fine too. And now that hyperthreading is enabled on the i5, the 6-core 10600K is a great, inexpensive CPU as well. For DCS you would be giving nothing up by getting the i5. (It is also a bit easier to keep cool than the 8- and 10-core parts.)
     3. 2x32GB of memory is overkill; 2x16GB is plenty for DCS. I'd also get the 3200 CL14 or the 3600 CL16, as the OP advised (see the quick latency sum below). Get Samsung B-die if you can... typically G.Skill or Team Group. Not Corsair, as their secondary timings are loose.
     4. I used to own a Corsair HX1000i. It blew up. No reason. It just blew up. Big bang. Blue smoke. You don't *need* 1.2kW, but it is fine: the PSU will hardly break a sweat. 850W, 1000W and 1200W are all fine. I'd probably pay the extra for Gold or Platinum, and I typically favour Seasonic or EVGA.
     5. Nice Asus Hero motherboard chosen, but it is more than you need to spend. The MSI MEG Unify is also good for the price (a bit cheaper). If I were going to spend more than the Unify I would get the Apex in preference to the Hero. But that is personal preference, not a big deal or difference really.
     6. I would stick to M.2 drives... just less messing around with cables and all.
     Basically what you've got is great; you could just save a bit if you wanted and make a few slightly better choices. Nothing major.
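On point 3, the quick latency sum showing why 3200 CL14 and 3600 CL16 come out near enough identical (the standard first-word latency formula; kit numbers from the post above):

```python
# First-word latency (ns) = CAS latency / effective clock (MHz) * 1000;
# effective clock is half the DDR transfer rate.
def first_word_latency_ns(transfer_mt_s: int, cas: int) -> float:
    return cas / (transfer_mt_s / 2) * 1000

for mt, cl in ((3200, 14), (3600, 16)):
    print(f"DDR4-{mt} CL{cl}: {first_word_latency_ns(mt, cl):.2f} ns")
# DDR4-3200 CL14: 8.75 ns
# DDR4-3600 CL16: 8.89 ns  -> same latency; the 3600 kit adds bandwidth
```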
  5. We've been through it multiple times and it is getting very, very boring. I want information on the new GPUs with DCS and VR. I am not interested in opinions about CPUs limiting DCS. I simply ask that people who want to share their opinions on that do it in a thread about CPU vs GPU throttling, separate to this one. Is that too much to ask?
  6. Indeed. And most people already understand that. They also know, based on their own use case, how much of the time they are CPU bound and how often it is the GPU that limits their performance. Hence the wish to get BACK ON TOPIC, and the desire to learn from the experience of lucky community members who have taken delivery of an Ampere card and are now using it in our preferred application.
  7. I think that you are correct: RTX features use GPU RAM. Currently most RTX titles only have a couple of effects active, but DOOM, for example, uses a number of RTX features, and this is one of the reasons it hammers the memory hard. 24GB certainly future-proofs you, but so would the 20GB version of the 3080. If you don't want to wait and you want the extra 5-10% performance... that is fine. There is also the consideration that Ampere's fps horsepower might still not be enough to run many RTX features at once to any large extent; in that case the speed of the card would likely become an issue before 16GB (3070 16GB), 20GB (3080 20GB) or 24GB (3090) did.
  8. Please can people stay on topic and discuss the 3080 and 3090? If you want to talk about CPU bottlenecks please start a new thread.
  9. Thanks for posting that link, FoxTwo. If true - and we can't be sure which results are actual gaming and which are synthetic - then RIP. That makes the additional £750 over and above the 3080's price very, very hard to justify, at least for all but the most price-insensitive. And yes... the extra heat... RIP.
  10. Yeah. It is worth it. At least on Intel's side: there is low-hanging fruit, and you can easily get a decent few percent increase in DCS performance, and all-round performance, from clicking a few buttons in the BIOS. Provided that your case ventilation and CPU cooling are not completely potato, there is very little cost and no risk in doing so. Marginal downside (a bit of extra power consumption and Saint Greta of Stockholm is angry with you, that's all), small upside. But pushing a high overclock starts to involve compromises and is best left to someone who knows what they are doing. On AMD the built-in boost algorithms are so effective that, no, overclocking is not worth it, and more often than not you go backwards.
  11. Of course not. And I don't, and never have, advocated buying power-hungry components for the sake of it. In fact, if you check my recent posts you will find that I have been critical of Ampere for being a bit of a power hog. (Problems: noise, heat, energy bill, polar bears.) That is why I have a relatively modest i7-7800X 6-core CPU (which I bought at 50% of RRP) and overclock it to 4.8GHz.
     I think the 10600K is a great-value CPU. I see loads of people with that CPU and a mid-range motherboard getting 5.0GHz overclocks, for relative peanuts. If I were buying a CPU for DCS and didn't already have anything, that is what I would pick if I had to buy this week. But I would still prefer to wait a few weeks to see what Zen 3 comes up with, given that clock speeds are rumoured to be reaching 4.9GHz boost, boost is likely to be maintained for longer, and IPC is rumoured to be improved 20% over Zen 2.
     But the point is that my PSU operates in silent mode with passive cooling only, and my system is easily upgradable to ANYTHING I throw at it without having to upgrade the PSU. I don't even have to think about it. It is a non-issue for a component I only have to buy once and where technology is moving relatively slowly. And - personal choice - I built my own custom PSU cables to the correct lengths. Not changing PSU for a long time means I can stick with my custom cables without worrying about any potential compatibility problems. Likewise, I spent a lot on my case, but it means that I never have to buy another PC case. Like ever.
     Someone who a year ago sat with an online calculator that spat out "650W" as the answer, and is now looking to upgrade to a 3080 - or heaven forbid a 3090 - is facing the prospect of buying a new PSU as well as a new GPU. Had they bought an 850W PSU to begin with, they would only have to upgrade one component. That is my point.
  12. My understanding is that the application leaves some headroom for non-application system use. But the allocation that it has taken can run out.
  13. It allocates 10GB of the 11GB (if available). That does not mean it uses 10GB. There is plenty of discussion of this online. Unfortunately there is no easy way of telling how much is actually used until you run into a memory wall and your frame rates tank. I wish one of the YouTubers would do an analysis on this; it would be a popular video.
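For what it's worth, you can at least watch the allocation number while you play. A minimal sketch, assuming an NVIDIA card with the standard nvidia-smi tool on the PATH; bear in mind it reports memory allocated, not what the game actively needs:

```python
# Poll GPU memory allocation every few seconds while the game runs.
# Stop with Ctrl+C.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)      # e.g. "9234 MiB, 11264 MiB"
    time.sleep(5)
```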
  14. I think that is the issue. DLSS fills in the blanks and makes guesses. The nature of DCS - outside the cockpit at least - is that you are scanning the sky for a bandit who is a couple of fast-moving pixels. Maybe the technique would work if the algorithm were instructed to leave bandit dots alone and render them fully. RTX ray tracing and DLSS would certainly be nice in cockpits, however. But I'd rather see better multicore CPU optimisation as a priority than GPU bells and whistles like those: CPU generational performance is moving much more slowly than GPU performance.
  15. With respect, this is a perfect example of someone who has armed himself with an online PSU calculator and made sensible choices for a mid-range PC. Nothing wrong with that. But back to my point: upgrading to balls-to-the-wall components like an overclocked 10900K and a 3080 OC or 3090 OC (we are on a thread titled "3090? 3080? AMD!" after all) will require an upgrade to the PSU. How do you do that on your 600W PSU? Nvidia themselves make the point that customers underestimate both the impact of age on a PSU's peak performance and the importance of a PSU's transient response; both points were made explicitly in the deep-dive Q&A Nvidia gave prior to Ampere's launch. People making PSU choices based on wattage alone can expect to run into trouble. RMAs of Ampere will be higher than average for an Nvidia release.
  16. I wish that I had £1 for every time some well-meaning, but ill-informed, person advised me to get a 550-600W PSU for my PC builds. Variations of "ohh, you don't need more than that for 1 CPU and 1 GPU. And there is no point in SLI anymore." This flawed advice always ignores:
     • headroom for power-hungry components, for adding pumps, fans etc., and for overclocking;
     • that the £10-20 saved on a smaller PSU is peanuts in the context of a rig - it is far more important to get quality components that last through multiple rebuilds, which works out cheaper than buying multiple barely-sufficient PSUs;
     • that PSUs degrade in efficiency and performance as their components age;
     • that PSUs are at peak efficiency when run at around half their maximum rated capacity;
     • that PSUs run quiet at low loads relative to their rating;
     • that bigger units have nice modular cables as a rule, not as an exception;
     • that a PSU running at currents well below its peak will cope with large, sudden transients far better than one running towards its limit.
     In fact this last point is likely to cause problems with the 3080 and 3090 launches: so much so that Asus has installed LEDs at the power connectors on its GPUs to signal supply failures where the transient response was not fast enough to deliver the load asked of it by the GPU. I am guessing that the RMA return rate is going to be quite high on Ampere, with this likely being one of the reasons. (Another being the GDDR6X memory.)
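To make the headroom point concrete, a rough sizing sum. Every draw figure here is an illustrative assumption, not a measurement:

```python
# Back-of-envelope PSU sizing with headroom for transients and ageing.
cpu_w  = 200    # assumed: overclocked high-end CPU under gaming load
gpu_w  = 320    # assumed: 3080-class card at stock
rest_w = 80     # assumed: board, drives, fans, pumps, USB
transient_factor = 1.3   # assumed margin for sudden GPU load spikes

steady_w = cpu_w + gpu_w + rest_w
peak_w = steady_w * transient_factor
print(f"Steady draw: {steady_w} W, transient peaks: ~{peak_w:.0f} W")

# PSUs tend to be most efficient (and quietest) around half rated load:
print(f"Rating for the ~50% sweet spot: ~{steady_w / 0.5:.0f} W")
# -> a calculator's 550-650 W answer leaves no margin; 850-1200 W does
```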
  17. Yeah, I agree with you. If I had a 1080 Ti or older I would be upgrading to something this year for sure. But I already have a 2080 Ti, which I am fundamentally quite happy with, so I've been crunching the numbers and looking at the early reviews. My conclusion is that - even though I can afford it - the 3080 is a nice upgrade but not great value at £800-850 including the cost of a water block, and 10GB is a step backwards. Nor am I convinced that £750 extra for a 3090 is worth it. I am also not impressed by the heat output of these cards: heat = noise, and an uncomfortable room during the summer, especially wearing VR kit.
     So my plan is to wait for the G2, use it initially with my 2080 Ti, and be happy. Only if I am then crying out for more GPU performance will I consider one of the upgrade options. By the time I have assessed the G2 paired with the 2080 Ti we should also know about the performance of the 6900 XT, and Nvidia should have announced their 20GB 3080. Waiting is the smart play right now for 2080 Ti owners. I might even sit this cycle out.
     Likewise, Zen 3 should be very, very good, with IPC gains and clock speed gains (base and sustained turbo). But I don't fancy investing in the last of the AM4 chips ahead of an architecture change, DDR5 and all the other bells and whistles of Zen 3's successor. Again, my Haswell is reaching 4.8GHz and getting the job done for peanuts, and it is a fully depreciated sunk cost anyway.
  18. $900 for a 10GB 3080? There will be a 16GB 6900 XT announced in October, and so Nvidia will announce a 20GB 3080 for c.$1000 in October or November... It is not long to be patient. And yes, I know... there will most likely be some supply delays for those products too, but still.
  19. Oh good grief. Not this tangential discussion again. Please! We've been through this before, and you've made your opinion clear. Okay? Let's move back to discussing the 3080.
  20. Hardware Unboxed specifically addressed this question. There are one or two slides that summarise the results of their testing.
     1) They tested an X570 board at PCIe 4.0 x16, then at PCIe 3.0 x16. Result: at 4K, literally no difference; at 1440p, c.2%; at 1080p, c.4%. But literally no one will be buying a 3080 for 1080p, and even for 1440p a 3080 is overkill unless you are playing e-sport first-person shooters. At the 4K resolution the 3080 is best suited to, there is no difference.
     2) They also tested X570 vs Z490. Result: at 4K, near-identical performance, as both are GPU bound; below 4K, the speed of the Intel chips cancelled out the small advantage of PCIe 4.0. So effectively a dead heat at 1080p, 1440p and 4K.
     Nvidia's RTX IO SSD tech could change the maths on that for new titles.
  21. Well congrats. And enjoy! What are you running currently? Any chance you can give us some before-and-after scores on whatever system you have once you are up and running? (What screen res are you using, etc.?)
  22. Yes, agreed. I was assuming that pushing the power limit over 400W would be accompanied by very beefy cooling, like a powerful custom loop; otherwise you would indeed get chips running hot and becoming less efficient. Vicious cycle. Pushing them much beyond the FE will not be cheap. Meanwhile I've seen a tester get 4% better fps from just the memory overclock on his test card.
  23. No. You can OC the memory... it is rated for 21Gb/s and running at 19. The GPU core will also run faster, but the constraint is the BIOS power limit. AIB cards like the FTW3 will have a higher BIOS power limit, allowing significant OC headroom. The early / lower-end cards are mostly power limited: not temperature limited, not voltage limited, and not crashing. OCing will mean GPU power draw of over 400 watts, however!!!
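A rough sense of what that memory OC is worth in bandwidth. The 19 and 21 Gb/s per-pin figures are from above; the 320-bit bus is my assumption for the 3080:

```python
# GDDR6X bandwidth = per-pin rate * bus width / 8 bits per byte.
bus_bits = 320                      # assumed 3080 memory bus width
for gbps in (19, 21):
    gbytes_per_s = gbps * bus_bits / 8
    print(f"{gbps} Gb/s per pin -> {gbytes_per_s:.0f} GB/s")
# 760 -> 840 GB/s is ~10.5% more bandwidth, which squares with the
# ~4% fps the tester mentioned above saw from the memory OC alone.
```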
  24. It is pretty funny: the FE went straight from "notify me" to "out of stock" at £649 in the UK. The Asus TUF went from £649 before release to £799.99 when I last looked. I ain't playing this game. I am checking out. I'll wait for the 6900 XT announcement and reassess then. Not saying I won't get Ampere, just that I will keep my money and make an informed decision. And I will not pay scalper prices for any of this consumer tech. All this stuff is landfill in 5 years' time anyway. Alright... slight exaggeration, but it is not like buying AMD shares when they were at $2.50 a share!
  25. https://videocardz.com/newz/nvidia-teases-geforce-rtx-3080-with-20gb-memory