Everything posted by BitMaster

  1. Take a Seasonic Prime TX 850W Platinum. One of the best you can get. If you plan on buying one of those rumoured Nvidia 4xxx cards going 500+ watts, you may want to consider the 1kW version of this PSU. https://seasonic.com/consumer/power-supplies?attr_80plus=16&wattage=20&wattage_range=820_928
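     A rough way to sanity-check the wattage is to add up the worst-case component draw and leave headroom for transient spikes. A small sketch of that arithmetic (Python); the component figures are assumed example values, not measurements:

        # Rough PSU sizing: sum of worst-case draw plus headroom for transient spikes.
        # All wattages below are assumed example values, not measured figures.
        components_w = {
            "GPU (rumoured next-gen)": 500,
            "CPU": 150,
            "board, RAM, drives, fans": 100,
        }
        headroom = 1.3  # ~30% margin for load spikes and the efficiency sweet spot

        total = sum(components_w.values())
        print(f"estimated draw: {total} W, recommended PSU: >= {total * headroom:.0f} W")
        # -> ~750 W draw, ~975 W recommended, hence the 1 kW suggestion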
  2. I just built a new system and put 64GB in it. IMHO 16GB is not really an option if you want to use DCS seriously. You don't need the fastest RAM money can buy, neither for Intel nor for AMD. The sweet spot is somewhere around 3200-3800MHz, CL14-CL18. The best kits you can get use Samsung B-die, and one way of finding them is looking at the latency values: 3200-14-14-14-34 is only available on B-die kits, and so is 3600-16-16-16-36. Those kits are 100% B-die and like to overclock if you intend to do so. My new 64GB kit is a 3200-14-14-14-34 B-die and my old 8700K uses a 32GB 3600-16-16-16-36 kit, also B-die. Kits with 3600 16-18-18-38 can be B-die but could also be made from a few other ICs, and none of those can do timings as tight as B-die can.

     Going above 3800-4000MHz makes little to no sense with DDR4: the latency penalties usually cancel out the gains in clock speed, and the price is disproportionately high. With AMD it's mostly counterproductive above 3600-3800MHz. The sweet spot is 3600 CL16, with a slight edge in bandwidth at a higher cost, versus 3200 CL14 with equal latency and a little less bandwidth but roughly 100€ cheaper for a 64GB kit. I bought the 3200 CL14 kit and overclocked it to 3600 16-16-16-36 1T, and it runs fine, not even the slightest glitch. But to be honest, you won't see much difference as long as you stay somewhere between 3200 and 3600, even if the latency is a bit worse. Still, if you want to be on the safe side, get low-latency modules. Since overclocking beyond 3600 makes little sense with my AMD CPU, I opted for the 3200 kit and hoped it would run 3600 at CL16 all the way, and it did. And yes, they are also black, no LEDs!

     edit: usually, 3600 CL14 is only available at 1.45V... and I would be cautious with that, depending on what CPU you run, because it may not harm the RAM but your IMC may not like 1.45V (or even 1.47xV, depending on your BIOS/board) for a longer period of time. I would not do this on my 8700K, as I know that chip doesn't like such high voltage on the RAM modules. I honestly don't know how other CPUs, or even my 5900X, would handle it. IIRC Buildzoid considers 1.45V the absolute maximum for everyday use, which is already a warning for me to step back to 1.40V, which is what I use for most overclocks and currently use on this rig. Out of the box the board applies 1.374V for XMP 3200, and for 3600 it needs a little more with 4 modules. If it needed 1.45V to be stable, I would dial back until 1.40V was enough. That's roughly my guideline when I play with RAM.
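     The "equal latency" point is easy to verify by converting CAS latency into nanoseconds. A quick sketch (Python) using the kits mentioned above; first-word latency only, secondary timings ignored:

        # First-word latency in ns: CAS cycles divided by the real clock (half the MT/s).
        def cas_latency_ns(transfer_rate_mts: int, cl: int) -> float:
            clock_mhz = transfer_rate_mts / 2      # DDR: clock is half the transfer rate
            return cl / clock_mhz * 1000           # cycles / MHz -> nanoseconds

        for rate, cl in [(3200, 14), (3600, 16), (3600, 18), (4000, 18)]:
            print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")
        # DDR4-3200 CL14 (8.75 ns) and DDR4-3600 CL16 (8.89 ns) are practically identical.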
  3. If I had to bet, I'd bet that the current Z690 boards will not accept every DDR5 module from the future... and there is also the IMC; both have to accept the RAM. DDR5 is nice, but not for now, IMHO.
  4. As a side note, if you need to clean out Windows you might want to give jv16 a look. It's been developed for almost two decades (it changed its name from RegSupreme to RegSupreme Pro, jv16 PowerTools, ... maybe a few more) and it has only gotten better. It has a 14-day trial, which is basically enough to clean out Windows. During my 20+ years of fixing PCs this tool has helped on many, many occasions to get a PC working as it should again. As always, such tools are like an axe in the forest: use it wisely and always use the backup feature (jv16 does create a backup) in case something goes wrong. In all those years it failed only once on my own system, when it deleted the Brother scanner GUI... many, many years ago; ever since, it has never failed again cleaning the registry or leftover installs.
  5. Honestly, who cares about the warranty with a RAM kit? How could they ever prove I ran it above 3200? There is no way to do that. The only thing you have to watch out for with Ryzen is that you'd better not force the IO die into 1:2 mode, which will likely happen somewhere between 3600 and 4000MHz. If you stay below that crucial tipping point, say 3600 as a safe setting, you don't have to fear any downsides. I am very sure Intel uses a whole bunch of kits for testing and not only this ONE kit; be realistic.
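     To illustrate that tipping point: Ryzen keeps the Infinity Fabric (FCLK) and memory clock 1:1 only up to roughly FCLK 1800-1900MHz, i.e. about DDR4-3600 to 3800; above that the IO die drops to 1:2 and latency suffers. A rough sketch of that check; the exact limit is an assumption and varies from chip to chip:

        # Rough check whether a DDR4 speed still allows 1:1 FCLK on Ryzen.
        # FCLK_LIMIT_MHZ is an assumed, chip-dependent value (~1800-1900 MHz).
        FCLK_LIMIT_MHZ = 1900

        def fabric_mode(ddr_rate_mts: int) -> str:
            fclk_needed = ddr_rate_mts / 2     # 1:1 mode needs FCLK = memory clock
            return "1:1 (good)" if fclk_needed <= FCLK_LIMIT_MHZ else "1:2 (latency penalty)"

        for rate in (3200, 3600, 3800, 4000, 4400):
            print(f"DDR4-{rate}: {fabric_mode(rate)}")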
  6. I run a true 3200-14-14-14-34-17 kit, 4x16GB, and under full stress it is outperformed by the very same kit at 3600-16-16-16-36-1T. There is no drop in performance anywhere: my fps don't go down, my latency is ~99% the same, whereas AIDA64 tells me I improved by a few nanoseconds, and bandwidth is a little higher, as expected. Bottom line, I cannot see any proof that 3200/14 is always better than 3600/16, nor does my CPU go nuts and cut back on performance. The good thing is, I saved quite a bit of cash getting the 3200/14 kit and overclocking it to 3600/16. They are basically the same kit. My other true 3600/16 kit (only 32GB, 4x8) has 3200/14 as one of its profiles... OK, I need to raise the voltage to 1.40V, but that is well within DDR4 specs for permanent use.
  7. Actually, a GPU running at 99-100% is what you paid for and is the desired state of the card when gaming, as long as it is NEITHER bottlenecked somewhere else NOR fps-capped by the user. A GPU running in the 80% range and not fps-capped is a clear sign of a big bottleneck, usually the CPU in the case of DCS. Those cards are meant to run at 100% by design.
  8. VIA 686B southbridge, best when fully loaded with 4 devices @ Ultra-DMA133(?)... maybe one of the worst southbridge chips ever released, LoL. Jokes aside, I upgraded both as well, newest X570 chipset driver + latest Nvidia, and all seems normal.
  9. I briefly tried 11 on my new 5900X system and it was meh. I had a serious performance loss, but I can't say for sure it was 11 alone, as I also had a freeze bug that made me install 11 in the first place; I couldn't find a fix under 10, so I gave it a shot. Sadly I carried the bug over, and DCS also performed badly, somewhere around a 25% loss in fps. But I ran DCS on my 8700K with 11 and had no such loss, IIRC. If I don't remember a loss, there was none, that's my logic here. With the 5900X and 11 the loss was so obvious that you can't not remember it; I hope you get what I mean.

     I might try 11 on this rig again, just for kicks. I will play around with it a little longer before I put a final install on it anyway; it installs so quickly, and it would be even quicker if this damn Gigabyte BIOS had working netboot, but it doesn't. I for one am lucky that I found the culprit for the freezes on this Gigabyte Aorus X570S Master board: it's their SIV utility, any version of it. Install it, on my specific rig at least, under 10 or 11, and you get annoying freezes watching YT, gaming, etc., rendering the whole system useless for anything content-related. Type a letter, yes! But for that I don't need 12 cores and 64GB, ehhh. It's nothing new; Gigabyte sometimes bundles just the same rubbish software with their systems as Asus does. Great hardware, but the software... not really.
  10. No, you can't even install it. Mind the architecture, x86 vs. ARM. You'd need to install Win10 for ARM.
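     If you are unsure what architecture a machine actually runs, a one-minute check (Python; assuming a Python install is at hand):

        import platform

        # Prints OS and CPU architecture, e.g. 'Windows AMD64' (x86-64) vs 'Windows ARM64'.
        print(platform.system(), platform.machine())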
  11. I ran into a distorted-sound issue under high load, and the Realtek Audio Control won't let me split front and rear audio out into 2 separate jacks, which I think it should be able to do in 2021 on that kind of board. So... after trying many things that didn't fix the sound issue, I am trying out 11... let's see if it fixes it; heck, there is always something... PXE doesn't work on that board either, which is a pain... I like to install via PXE and it errors out when connecting to the WDS... LoL
  12. Factor in the kWh price that one pays. What pays off in Texas might not pay off in Denmark, and for sure not in Germany, where electricity is so damn expensive. But true, I used my 1080 Ti for mining back the extra money I paid when I got it "overpriced" (you would call that a bargain now, I guess). Actually, if I had kept the coins I made with it I could have made a few grand off of it, but I sold mine when they were at 6-8k€... crying doesn't help.
  13. My 8700K runs OK on 11; a few things are annoying, for example the left-mouse-click bug in File Explorer, but that is minor and likely to get fixed. With my new 5900X I am sticking to 10 for now. Not even trying to think about installing it, LoL. Man, this 5900X flies... on 10 :=)
  14. You misinterpret the CPU value. You refer to the total CPU load, but for DCS it is far more important to look at each individual core, as it mainly uses 1 core to run the game plus a few others for additional tasks. You will never see DCS use 100% on a modern CPU; they have too many cores that DCS currently makes no use of. Look at the individual core graphs (logical cores in Task Manager) and you will see that 1 core is likely at 95-100% most of the time. More RAM makes sense, but I would hesitate to buy a new "old" board and another 16GB of new "old" DDR3 RAM. Maybe a new board-CPU-RAM combo makes more sense!?
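     If you'd rather log this than watch Task Manager, a minimal sketch using the third-party psutil package (assumption: it is installed via pip install psutil) prints the per-core load so the one hammered core stands out:

        import psutil  # third-party package: pip install psutil

        # Sample per-core utilisation once per second; the core running the game's main
        # thread will sit near 95-100% while the overall average stays much lower.
        for _ in range(10):
            per_core = psutil.cpu_percent(interval=1, percpu=True)
            avg = sum(per_core) / len(per_core)
            print(f"avg {avg:5.1f}% | " + " ".join(f"{c:5.1f}" for c in per_core))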
  15. 1.5V is likely DDR3 (which is what your system runs on), but 1.35V is a DDR4 value, and that RAM will not fit into your system. Best is to go to a local shop, a friend, etc. who can guide you. Replacing just that board might not be the best idea; rather think of a board+CPU+RAM combo then, which makes far more sense if you can afford it.
  16. So, I ordered my 5900X today, along with 64GB and the Aorus X570S Master mobo. As a result of the Win11-AMD issue I am installing this new rig with 10 Pro for the time being, though my 8700K runs OK with 11 so far. Tbh, my idle power usage is down by 40 watts with 11. Can't tell why, but my Corsair AXi PSU now reports 100-120W instead of the 160W idle it has always shown with this combo.

     edit: found it, the 1080 Ti was always consuming 67W in idle under 10 and the drivers of the past 3 years. Now with 11 it is only 18-19W!!!! Why was it consuming that much more under 10??? 0.04kW x 24h x 365d x 3y x 0.30€/kWh = 315€ on the electric bill, that's a nice 2TB Samsung wasted in idle! OK, in real life it slept 2/3 of that time, so it only wasted around 100€ worth of energy... still, small things add up to a big pile.

     edit 2: also found out why the GeForce card used more in the past. It's the Maximum Performance setting in the GeForce Control Panel that causes it. I thought it would not affect idle consumption, but it does, and you have to reboot for the effect to show. Most settings in there work with "Apply", but this one needs a reboot before the idle consumption hops from less than 20W to an almost constant 76W. Yeah, I had tried that option long before to see if it matters for idle consumption and saw no change... I didn't reboot!! Anyway, I got myself an eBay copy of 10 Pro and will start there. I will reinstall the OS a few times anyway while the rig is new. Luckily I have a WDS server with 10 and 11 on PXE.
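     The back-of-the-envelope cost above as a small sketch; tariff and idle hours are of course assumptions, adjust to your own numbers:

        # Cost of extra idle draw: power (kW) x hours x price per kWh.
        extra_idle_kw = 0.04      # ~40 W extra idle draw, as in the post above
        price_per_kwh = 0.30      # assumed electricity price in EUR per kWh
        years = 3

        hours = years * 365 * 24                  # 26,280 h over three years
        cost = extra_idle_kw * hours * price_per_kwh
        print(f"always idling: {cost:.0f} EUR")               # ~315 EUR
        print(f"asleep 2/3 of the time: {cost / 3:.0f} EUR")  # ~105 EUR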
  17. Yep, just hit the button on my 64GB kit. 64 is the new 32
  18. They have a dossier on everyone, incl. you, Thinder, I am afraid.
  19. Now that you have the naked board, take the CR2032 battery out and put a coin in its place for 15 minutes to fully drain the circuit. This has sometimes helped to revive boards that seemed dead, though usually the CPU had to be removed to make it work, which you have already done. Also place the cap on the Clear CMOS jumper to drain it completely. If that won't boot it, and if your board doesn't have the Q-Flash option, then I guess the board needs either a new BIOS, IF you can get the EPROM out; many vendors don't have replaceable EPROMs anymore, and that would suck in your case, but IIRC one pilot wrote that there are PROMs for your board available on eBay, so yours can likely be pulled out. We used to do that on previous TM HOTAS gear back in the day... or it's beyond user means to repair = RMA. Such things suck! New stuff ought to be fun, and then it turns into a medium nightmare of an RMA. Hate it, but such things do happen.
  20. That is indeed best; the downside is that you need a full 2nd license to do so.
  21. Ripjaws V goes way down in latency, just as the LED versions do; it's just a matter of how much you want to spend. There is a reason why those are often used in LN2 overclocking instead of the LED versions: less trouble booting and a less complicated PCB (LED traces, LED power, etc.). Why don't you go to G.Skill and check those kits?
  22. I just upgraded my current install. I have, same as you, no interest in installing 11 from scratch plus all my software and licenses... ahh NO, only if I really need to (and for that case I have Acronis Cyber Protect and a backup). No, just upgrade it; it takes about an hour and all is good.
  23. To make use of my board's LEDs and GPU LEDs (EK block with RGB) I need a ton of Asus software and, in addition, Corsair iCue to combine them all and have the LED colour correspond to each piece of hardware's temperature (green --> yellow --> red). That is somewhat useful, as I can tell by the colour of the LEDs how warm/hot the system is. Unfortunately, no other vendor has that temperature feature and lets you assign the zones independently; I don't need the GPU showing the CPU temp or vice versa. It's a mess, and more often than not it has caused trouble over the years. The combo with iCue has been somewhat stable for months now, but I really dislike most LED software I have ever come across. Yes, I agree, LEDs boost your overall perception of FPS. That's why I keep the door open for the LED G.Skill kit. I just want to point out that there is a kit with the same performance for less. Actually, all of those GTZR kits have a GVK counterpart with the exact same latencies and voltages, from 3200 up to the 4800MHz range. Once my payment arrives I will order the last missing parts and start building it. I am really eager to see the difference between 3600-16 and 3200-14, and whether I can stress it so much that 3600 falls significantly behind 3200, as Thinder says. I don't question it, but I want to see it myself.
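     The green --> yellow --> red idea is simple enough to sketch yourself; a minimal example of mapping a temperature to an RGB colour (the temperature limits are assumed values, not what iCue actually uses):

        # Map a temperature to an RGB colour: green when cool, yellow in between, red when hot.
        # The 40/80 degree limits are assumed example values, not taken from iCue.
        def temp_to_rgb(temp_c, cool=40.0, hot=80.0):
            t = max(0.0, min(1.0, (temp_c - cool) / (hot - cool)))  # clamp to 0..1
            if t < 0.5:                                # green -> yellow: ramp red up
                return int(2 * t * 255), 255, 0
            return 255, int((1 - t) * 2 * 255), 0      # yellow -> red: ramp green down

        for temp in (35, 50, 60, 70, 85):
            print(temp, temp_to_rgb(temp))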
  24. I did the upgrade yesterday and it all "seems" to work as intended. No errors or glitches so far.