Everything posted by BitMaster

  1. That will likely NOT work with DDR5 at this stage. My advice: check what speed your board supports with 64GB as 2x32GB modules and buy that as one 64GB kit. With four modules, the supported speeds drop WAY down at the current stage of DDR5 maturity.
  2. When you look at CPUs for other market segments, you can see this is a decade-old strategy: give more IPC to a lower-clocked core. Server CPUs usually run at lower clocks but carry a massive cache compared to desktop CPUs to partly compensate for the lack of speed. Same for some laptop CPU variants: to give them more oomph at, say, 1.8GHz, they get more L3 cache, 6MB instead of 3 for example, the cheapest way to remedy some of it. I am really looking forward to those 7000-series X3D CPUs. I hope they offer more than one SKU.
  3. With all the cost involved, why only 32GB?
  4. There are statements in this forum, one even from today or yesterday, regarding the 5800X3D. My takeaway from them is very positive. Same goes for Guru3D; they say the 5800X3D is likely the better deal for now.
  5. It seems odd that it lacks some power in many desktop and synthetic benchmarks, but when you look at the real deal, FPS in actual games, it just plays its cards better than any other CPU out there yet. Maybe 13th gen or AMD's own next 3D-Cache CPU will change that; until then, I think the best gaming CPU is the one with the most cache.
  6. Sadly yes... wouldn't be surprised if the 5800X3D skyrockets in price as well.
  7. If it shows anything, it's "keep data close to the core": L1-L2-L3 cache, the closer the better. Screw the MHz race when the cost is 150+ watts on top of what's needed, when more cache can deliver better results with less energy consumption to boot. Considering the much lower clocks of the 5800X3D, it beats them all in FPS per watt and FPS per MHz.
  8. Take the ACE MAX: plenty of USB directly from the board, 2.5G LAN, WiFi (sometimes it just comes in handy, believe me), and likely better components as well. Best of all, it's actually a 570S, S for silent: no chipset fan, a big PLUS. No fan that can fail after a couple of years... Alternatively, the board from my sig, about the same spec, and it runs very, very robust. Best board I've had in 10+ years, hands down.
  9. So, after skimming the first few reviews, it looks like the "old" 5800X3D is still the best gaming CPU. Here and there it only sits in 2nd or 3rd place, but overall it leads the pack. Together with the ATX 3.0 issues, this makes the brand-new stuff not so tasty to buy into: slower and flawed... WTF is going on??
  10. No one can know for sure the details you need. Wait for multiple reviews, cross-read, and decide then.
  11. Let me forecast the GPU weather: dark smoky clouds above certain homes, followed by a spot-on rain of 4k gallons of freshwater, delivered by your personal DIY cooling loop, the local Fire Department. Jokes aside, this is borderline life-threatening. Would you, if money were not a factor, let your 6-year-old girl play a game on such a computer without CONSTANTLY monitoring the device? I am not even trying to put such a shoe on my own foot. PCI-SIG... you failed miserably.
  12. As it seems, according to the latest GPU and ATX 3.0 news, any card that draws that much power will likely run into an issue... and it's not Nvidia or AMD to blame here; it's a plug poorly designed for the intended use case. This will draw some attention and may render the plug unusable, or unsafe to use, for cards like a 4090. Since the main issue is the tiny connectors, and since the damage can happen at either end of the cable, at any device (PSU-CABLE-GPU), one risks a lot of money trying to push a 4090 to its default limits, let alone overclocked, and do a 4h burn-in. Imho, they have to redesign the plug with bigger, more robust, more bend-resistant pins. I wouldn't be surprised if that's the outcome of the letter shown. I would not buy or recommend buying such a card from any vendor at this point, just because of that issue... and you can't solve it as it is; the design is wrong. https://www.youtube.com/watch?v=p48T1Mo9D3Q&t=322s&ab_channel=GamersNexus https://www.youtube.com/watch?v=K6FiGEAp928&ab_channel=JayzTwoCents
  13. Have you considered the 5800X3D? I need my cores for virtual machines, but if my own rig were only for DCS/gaming, I would now pick the 5800X3D over my 5900X. Don't get me wrong, I love it, but for DCS 8 cores are enough; I'd rather take the higher IPC instead.
  14. Yep, exactly how I do it. Get the 420 and 4x 200mm fans, a D5-PWM and a Heatkiller 200ml res. *Look for that nice star-shaped fan hub they have for the 420 + 200mm-fan combo; a NICE little detail that saves headaches when you know the circumstances.
  15. Indeed, let's wait for reviews with official, off-the-shelf CPUs. There's some disturbing news at Guru3D that the 7900X was still slower in single-thread than a 12700K... if that is true, no gamer will go AMD this round, tbh... anyway... a couple of days and we'll know.
  16. He'd have been better off not asking; now he needs a bottle of wine to digest all this.
  17. Get the fastest your budget can buy after you've bought the biggest GPU. The 12700K/12900K can use DDR4, but the soon-arriving Ryzen 7000 is likely considerably faster... though it needs DDR5 and has a steep price curve on boards. With Intel you can get away cheaper, unless you consider Ryzen 5000 an option; the 5800X3D would be my pick then. If I had to buy now, today, I would find an excuse not to and wait till Ryzen 7000 is here with some reviews; it's literally only weeks till you can buy them. When it comes to the GPU, boy, you are lucky! ETH just dropped proof of work and prices are falling. Hurray! For VR you should go Nvidia, but VR is not really my expertise atm.
  18. More of this! That's the language Jensen understands.
  19. I am not a VR user anymore, but from what the forum generally says, VR is 3080 and up. Maybe wait till the new GPUs arrive to get a better deal on a VR-capable card.
  20. Those 60 watts wasted for nothing at idle are more than many laptops, Apple's foremost, need for all their work; they come with a 65W AC adapter. Nvidia, god damn, wake up!
  21. Rephrase it to... 32 or 64GB RAM. 16GB and eye candy... ehhh, no.
  22. Just one thing I want to add. I often read loose usage of "Nvidia G-Sync... Compatible". You have to read very closely, or check the technical docs, for what it exactly says: G-Sync Compatible is not the same as plain "G-Sync". Real Nvidia G-Sync needs a chip inside the monitor, hence the higher price, and only works over DisplayPort, not HDMI. So if you buy a TV that has no DP, it also doesn't have true G-Sync; maybe G-Sync Compatible, if Nvidia also lists that TV in their chart for a given driver.
  23. +1 for a water-cooled GPU. These days you'd rather watercool the GPU than the CPU. The big problem is finding a matching block for the myriad of AIC designs; that has to be thought through before you click the BUY button. The hottest I have seen my GPU, though an old 1080 Ti, is ~52°C when fully stressed. Idle is a few °C above ambient air... and it's very silent, even under full-tilt load.
  24. 60°C at idle? There is a problem.
  25. It's a performance gap that might make a difference; just check the 5800X3D reviews to get a glance at how well it performs in games in general. I would only pick the 3D variant if I were buying AM4 now... and get 64GB; DCS can easily flood those 32GB in seconds.
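The "FPS per watt" argument in post 7 boils down to simple arithmetic, sketched below. All numbers are made up for illustration (not measured benchmark results); the point is only that a cache-heavy, lower-clocked part can lose on raw FPS yet win on efficiency.

```python
def efficiency(fps: float, package_watts: float) -> float:
    """Frames rendered per watt of CPU package power."""
    return fps / package_watts

# Hypothetical sample data: a big-L3, modest-clock CPU vs. a small-L3, high-clock CPU.
cache_cpu = efficiency(fps=120.0, package_watts=95.0)
clock_cpu = efficiency(fps=125.0, package_watts=220.0)

# The high-clock part wins raw FPS, but the cache-heavy part wins FPS per watt.
print(f"cache-heavy: {cache_cpu:.2f} FPS/W, high-clock: {clock_cpu:.2f} FPS/W")
```

The same division works for FPS per MHz; only the denominator changes.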
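The 60W idle complaint in post 20 is easy to put into numbers. A quick sketch, assuming 8 hours of idle per day and an electricity price of 0.30 per kWh (both assumptions, not figures from the post):

```python
def yearly_kwh(watts: float, hours_per_day: float = 8.0) -> float:
    """Energy in kWh consumed over one year at a constant draw."""
    return watts * hours_per_day * 365 / 1000.0

# Hypothetical usage pattern: 60 W wasted at idle, 8 h/day, 0.30/kWh.
idle_waste = yearly_kwh(60.0)
print(f"{idle_waste:.1f} kWh/year, ~{idle_waste * 0.30:.0f} at 0.30 per kWh")
```

At those assumptions, the idle draw alone is roughly 175 kWh a year, i.e. more energy than a small laptop uses for actual work.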