Everything posted by BitMaster

  1. Those 60W wasted at idle for nothing are more than many laptops, Apple's foremost, need for all their work; they get by with a 65W AC adapter. Nvidia, god damn, wake up!
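      To put those 60W in perspective, here is a quick back-of-the-envelope sketch (the 8 hours/day of use and the 0.40 EUR/kWh electricity price are assumptions of mine, not figures from the thread):

         idle_waste_w = 60        # extra watts burned at idle, as discussed above
         hours_per_day = 8        # assumed typical desktop usage
         price_per_kwh = 0.40     # assumed electricity price in EUR

         kwh_per_year = idle_waste_w / 1000 * hours_per_day * 365
         cost_per_year = kwh_per_year * price_per_kwh
         print(f"{kwh_per_year:.0f} kWh and ~{cost_per_year:.0f} EUR per year")
         # -> roughly 175 kWh and ~70 EUR per year, spent on doing nothing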
  2. I'd rephrase that to... 32 or 64GB RAM. 16GB plus all the eye candy... ehhh, no.
  3. Just one thing I want to add: I often see loose usage of "Nvidia G-Sync ... Compatible". You have to read very closely, ideally in the technical docs, to see what it exactly says. G-Sync Compatible is not the same as plain "G-Sync". Native Nvidia G-Sync needs a hardware module inside the monitor, hence the higher price, and it only works over DisplayPort, not HDMI. So if you buy a TV that has no DP, it also doesn't have true G-Sync; at best it is G-Sync Compatible, and only if Nvidia lists that TV in their compatibility chart for a given driver.
  4. +1 for water-cooling the GPU. These days you'd rather watercool the GPU than the CPU. The big problem is finding a matching block for the myriad of AIC (partner card) designs; that has to be sorted out before you click the BUY button. The hottest I have seen my GPU, though it's an old 1080 Ti, is ~52°C when fully stressed. Idle is a few °C above ambient air... and it's very quiet, even under full-tilt load.
  5. 60°C at idle? Then there is a problem somewhere.
  6. It's a performance gap that can make a real difference; just check the 5800X3D reviews to get a glimpse of how well it performs in games in general. I would only pick the 3D variant if I were buying AM4 now... and get 64GB; DCS can easily flood 32GB in seconds.
  7. Ehhh, OK, the new 7000-series AMD chips do show higher temps... BUT... there is a reason for this: it's t.i.n.y. 5nm lithography, things get very, very small, and the watts get burned in a smaller oven that then glows hotter. AMD still needs a lot LESS power than Intel and will probably keep that advantage this coming round; it's just that AMD's process node is a few nm smaller than Intel's, which is why hotspots are more pronounced. I wouldn't judge the cooler by the CPU temperature alone; the actual wattage it has to move is the real deal. As long as AMD says those temps are fine, I wouldn't worry too much. But what Luc says about the 3D cache makes sense... though it is always the "wrong" time to buy parts; the timing never quite fits, and you either have to decide or wait forever for the perfect day. If I had to buy today and couldn't wait, I'd go 5800X3D, 64GB (maybe even 128), and a nice board that fits the needs. The rest is the same regardless of what you buy. Maybe the PSU is the big "?". If you buy new, get one that has the new ATX 12VHPWR connector for GPUs. That is WAY safer than any adapter, if one ever ships. The new connector talks to the GPU and negotiates the power draw according to the PSU's specs, so a 600W GPU won't pull 600W over 12V from a PSU that is only rated for 400W. Those new toys talk to each other, clever. I doubt an adapter can mimic the PSU's sense lines or give it the ability to talk bidirectionally over wires it doesn't have :(. So if you plan a new rig, consider a modern PSU.
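      To illustrate the idea of that negotiation: the 12VHPWR plug carries two sideband sense pins that advertise how much power the PSU/cable combination can deliver, and the card caps itself at that value. The sketch below is only illustrative; the function name is mine, and the exact pin coding should be checked against the ATX 3.0 / PCIe CEM 5.0 documents:

         # Approximate SENSE0/SENSE1 coding of the 12VHPWR sideband pins
         # (True = pin grounded, False = pin open). Illustrative only --
         # verify against the ATX 3.0 / PCIe CEM 5.0 spec before relying on it.
         SENSE_TO_WATTS = {
             (True,  True):  600,   # both grounded -> full 600W cable/PSU
             (True,  False): 450,
             (False, True):  300,
             (False, False): 150,   # both open -> lowest power class
         }

         def allowed_gpu_power(sense0, sense1, gpu_max_w):
             """Hypothetical helper: the card limits itself to the smaller of its own
             maximum and whatever the PSU advertises via the sense pins."""
             return min(gpu_max_w, SENSE_TO_WATTS[(sense0, sense1)])

         # A 600W-capable GPU on a PSU/cable that only signals 300W:
         print(allowed_gpu_power(sense0=False, sense1=True, gpu_max_w=600))  # -> 300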
  8. I used to do that with my Asus screen and it was not reliable in my case, but that was down to the screen's USB hub constantly disconnecting after sleep mode, plus general random disconnections. I haven't had the screen's USB connected for a long time now, but if that hub were OK I'd still use it. When it worked, it worked very well; only the disconnects were a PITA.
  9. +1 to what LucShep said. Take 64GB of DDR4-3200 to -3600 with a matching board and go as high up the stack as you can with the GPU. I don't know that tower, but make sure it has PLENTY of airflow for the GPU.
  10. Actually, it's nearly impossible to avoid LEDs; everything has some blinky-blinky somewhere. The board, the GPU and the case nowadays mostly come with some kind of LED sensation. I wouldn't need it, tbh. Remember to have some thermal paste at hand if you have to remove the cooler! ...and some rubbing alcohol to clean the old paste off the CPU and cooler...
  11. You should be OK with 32GB for single player, though I have seen 40+ GB used in SP just sitting on the ramp. Removing the cooler is usually not that complicated, but I get what you're saying: small screws and old eyes, gentle tightening and "look at my old numb fingers"... Best is to read the cooler manual before you touch any screw; maybe it's enough to take one fan off the cooler, if that's the only thing in the way. Yeah, who doesn't hate that.
  12. Well, we all knew those 16GB wouldn't last long. You could have asked them to take the 16GB back and send 64GB instead.
  13. I may not have been clear enough about the point I wanted to stress. Yes, you are right, AM5 will be THE socket for AMD for the coming years, but that's it. Don't get me wrong, I don't want to blame AMD or anything. See, the 670-series chipsets and boards can only be validated with the RAM and CPU features that exist NOW. In the past you could swap a 1800X for a 2700X and later for a 3900X on most X370 boards, but could you also reach the same RAM speeds with a new DDR4-3600 kit? Those kits mostly wouldn't run at full speed on X370, IIRC, and then there are the features you only get if you also move to the new chipset, whatever those may be. Yes, it works in general, but there are pitfalls, let me put it this way. It's impossible to say exactly where the pits are and where they aren't, but fast RAM well beyond the 670-series specs is a typical candidate.
  14. I wouldn't bet on using a 1st-generation AM5 board with the 2nd and 3rd CPU generations, precisely because of DDR5. Once the 2nd generation of AM5 chips arrives with better memory support and faster RAM is available, the limiting factor will be the 1st-gen motherboard not being able to run those new settings. That's how it has mostly gone across vendors, sockets and generations over the past decade.
  15. No 32GB on the i7-870! Copied & pasted: "The DDR3 standard permits DRAM chip capacities of up to 8 gigabits (Gbit), and up to four ranks of 64 bits each for a total maximum of 16 gigabytes (GB) per DDR3 DIMM. Because of a hardware limitation not fixed until Ivy Bridge-E in 2013, most older Intel CPUs only support up to 4-Gbit chips for 8 GB DIMMs (Intel's Core 2 DDR3 chipsets only support up to 2 Gbit). All AMD CPUs correctly support the full specification for 16 GB DDR3 DIMMs.[1]" Intel says the same, i7-870 max 16GB: https://ark.intel.com/content/www/de/de/ark/products/41315/intel-core-i7870-processor-8m-cache-2-93-ghz.html
  16. IIRC the biggest DDR3 module that platform will take is a 4GB DIMM, so with 4 slots it tops out at 16GB (4x 4GB). Anything bigger would need dual-rank DIMMs with 16 high-density dies on them, likely made for servers; I would not try that route. No idea which GPU is best for it; get the fattest one you can for 150-200€, I'd say.
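      The arithmetic behind those capacity numbers, as a rough sketch (assuming the common x8 chip organisation, where one 64-bit rank needs 8 DRAM chips; the helper name is mine):

         def dimm_capacity_gb(chip_density_gbit, chips_per_rank=8, ranks=2):
             """Capacity of a DIMM built from x8 DRAM chips (Gbit -> GB)."""
             return chip_density_gbit * chips_per_rank * ranks / 8

         print(dimm_capacity_gb(2))   # 4.0 GB per DIMM with 2 Gbit chips
         print(dimm_capacity_gb(4))   # 8.0 GB per DIMM with 4 Gbit chips
         print(dimm_capacity_gb(8))   # 16.0 GB per DIMM with 8 Gbit chips (full DDR3 spec)
         print(4 * 4)                 # 16 GB total: four 4 GB DIMMs, the practical i7-870 ceiling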
  17. If someone has the talent, it is 100% worth learning basic computer hardware. You don't need an MCSE certificate to build, run and service your very own rig. It also pays to understand the OS a bit better than the average Joe; again, you don't need to be a dev to properly configure Windows. Those who, for whatever reason, cannot put their minds and hands into this pay guys like me so that their stuff runs as intended, and so that someone tells them what it can do, what makes sense and what is better left alone ;). Those who can answer this for themselves save a ton of money and get things done much faster whenever a PC is involved. It's like owning a car: either you can fix it yourself or you need a garage and have to pay. It's easier to fix a PC than a car, hands down.
  18. Yes, invest in plenty of fans. Even if you watercool everything you'll likely end up with a gazillion fans, and it pays not to pick the cheapest ones. Better too many than too few if your GPU dumps all its heat into the case. I myself recently changed the fans on my external radiator, as the ones I had turned squeaky after 5 years, but since you just can't get 4x 180x180x20mm fans anymore I had to opt for 9x 120mm fans. That is a big acoustic change as well, from 4x 180mm to 9x 120mm... dang! I did the right thing and bought 9x Noctua, and they are even quieter than the 180mm Phobyas I had. The downside is 4x the price. FYI, for anyone planning on an external Mo-Ra3 (they seem to be getting more and more popular): by all means get the 420 instead of the 360 size. Both can take many small fans or 4 "BIG" fans, but nobody makes those big, slim 180x180x20mm fans for the 360 anymore. Noctua, on the other hand, now makes a superb 200mm fan, even in a Mo-Ra3 edition, so on the 420 you can skip 9x 140mm, bolt on 4x 200mm Noctuas, be super silent and save money too. IMHO the best rad solution out there.
  19. We have had this question many, many times over here. As a rule of thumb: no matter where you buy, have them list EACH & EVERY component they intend to use. Two examples:
      Version A: LGA1700 motherboard, USB 3.2, PCIe, SATA, NVMe = U S E L E S S info.
      Version B: MSI Tomahawk Z690 B2-A, 8x USB 3.2, 6x SATA, 3x NVMe (2x Gen4, 1x Gen3), WLAN-AX, BT 5.2, Intel I211 Gigabit LAN, etc. = T H I S is useful and lets you compare.
      The same goes for the cooling and the power supply unit (PSU): have them list EXACTLY what they use, or those are precisely the items they'll rip you off on. If money is secondary, YOU tell them what to put together... and that's where we, the community, can help pick the best for your money. But to do that, you need a company that is transparent about their builds. IIRC BuzzU's rig had it all listed!? BuzzU, can you comment on that, did they list the details from A to Z?
  20. There is no reason not to move on to 11 unless you dislike Windows in general. There aren't more of those BS apps preinstalled than on 10; yes, it's not nice, but it only takes 5-10 minutes to clear them all out, and that goes for both 10 and 11. 11 is also pretty rock solid (it needs solid hardware to run error-free in the first place), and Windows 10 is EOL by 2025 anyway, which is less than 3 years away. It could also well be that, as with many new CPU generations, they run best and fastest on the current top-tier OS, and that is 11, not 10.
  21. The "light & thin" actually speaks against it. Cooling capacity needs some mass and volume to work consistently; otherwise things get heat-soaked pretty quickly and start to throttle. I don't think there are major exceptions to this physical limit.
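      A toy model of why mass and volume matter: a rough lumped-capacitance sketch with made-up numbers (the thermal resistance and heat capacity values are illustrative, not measurements of any real machine):

         # Lumped thermal model: dT/dt = (P - (T - T_ambient) / R_th) / C_th
         # C_th ~ how much material there is to soak up heat (J/K),
         # R_th ~ how well the cooler sheds it to the air (K/W).
         def temp_after(seconds, power_w, r_th, c_th, t_ambient=25.0, dt=1.0):
             t = t_ambient
             for _ in range(int(seconds / dt)):
                 t += (power_w - (t - t_ambient) / r_th) / c_th * dt
             return t

         # Same 45W load for 5 minutes, made-up "thin & light" vs. chunky desktop cooler:
         print(round(temp_after(300, power_w=45, r_th=1.2,  c_th=60), 1))   # -> ~78 °C
         print(round(temp_after(300, power_w=45, r_th=0.25, c_th=600), 1))  # -> ~35 °C
         # The small cooler heads toward a ~79 °C steady state (and throttling) within
         # minutes, while the bigger one settles around ~36 °C under the same load.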
  22. Without any further details, all I can suggest is a fresh GPU driver install and a DCS repair.
  23. I hope it does! It didn't work out on my end, and I finally gave up VR for now. Some say: fly 15 minutes, then pause, then another 15-minute stint, and so on until you feel it getting better. Good luck and enjoy.
  24. Yes and no. I dare say a budget desktop is not meant to run 24/7/365 at 99% CPU & GPU load; if it fails after 3 weeks, you picked the wrong parts for server-grade loads. On the other hand, a well-built desktop should be able to run at full tilt for many weeks and months if it needs to. Desktops are in general made for 8 hours of use per day, 9-to-5. Workstations and servers are what should cope with the 24/7/365 full-tilt scenario.