
Anyone know if there's any point going from a 5600X to a 5800X3D for non-VR 1440p on a 3070?


Recommended Posts

Posted (edited)

I have a new 5800X3D on hold at Memory Express here for $429 CAD ($320 USD).  I do have to upgrade or get a new system eventually so I can hand mine down to my sons (the older one would give the younger one his 3600/3060 Ti when he wants a PC).  I will also likely get a 7900 XT/4080/4070, or whatever AMD comes out with in the 4070 tier. 

I'm at 3070/5600X/1440p and wonder if I'd just be better off getting a higher-refresh monitor (currently 75 Hz, 1440p, IPS).  My older son will have to get a CPU regardless, but he could spend a third as much and just get another 5600X or even a 5600 (he has a 165 Hz, 1440p monitor). 

I mostly play single-player DCS, but may get into MP someday if I have time.  I'm also assuming the dynamic campaign will tax the CPU more, but if and when ED introduces multi-threading, that might negate any benefit of a 5800X3D.  Or will it be able to use the extra 2 cores?  Thoughts?  

Edited by aleader

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

Posted

The 5800X3D's appeal is in a VR scenario. If you're not doing that, you don't need it. If you're playing 2D, honestly, IMO there's little point in upgrading at all. My wife's laptop with a 2060M will run the game at max settings, albeit on a 1080p screen. A desktop 3070 should be handling 1440p just fine.

"Where there are enemies, Cossacks will be found to defeat them."

5800x3d * 3090 * 64gb * Reverb G2

Posted

Thanks.  It is handling it, but I was thinking long-term: get the fastest CPU brand new now, and not have to upgrade anything but my GPU for 10 years or so.  It's hard to tell from the overly exuberant reviews on Newegg, many from people who probably think they're seeing a big improvement but really aren't, unless they're upgrading from a 1600 or an older 4690K.  

I don't care so much about the overall FPS, as it hits over 100 quite a bit, but more about the 1% lows, and whether the 5800X3D would smooth out the experience down low, because I mostly fly helos.
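For anyone wondering what "1% lows" actually measure: they are usually derived from a frame-time log rather than from an FPS counter. A minimal Python sketch of the usual calculation (the sample frame times below are made up for illustration; real logs come from tools like CapFrameX or FRAPS):

```python
# "1% low" FPS: the average frame rate over the slowest 1% of frames.
# Frame times are assumed to be in milliseconds.

def one_percent_low(frametimes_ms):
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # the slowest 1% (at least one frame)
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                        # convert ms/frame back to FPS

frametimes = [13.9, 14.1, 13.8, 35.0, 14.0, 14.2, 13.7, 41.5, 14.1, 13.9]  # hypothetical
avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
print(f"average: {avg_fps:.1f} FPS, 1% low: {one_percent_low(frametimes):.1f} FPS")
```

Two runs with identical average FPS can have very different 1% lows, which is why stutter-prone helo flying shows up there first.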

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

Posted

Like I said, in 2D it's unlikely to matter one way or the other. The 5800X3D helps with draw calls in VR. If you're not using VR and don't intend to, then it hardly matters what CPU you have, unless it's really out of date.

"Where there are enemies, Cossacks will be found to defeat them."

5800x3d * 3090 * 64gb * Reverb G2

  • 4 weeks later...
Posted (edited)

I wonder about this, since I have the 3070 Gaming X Trio and a 5600X. While playing DCS in VR (Reverb G2) at medium settings, my CPU runs at only about 30%, but my 3070 is at 100% almost constantly. To me it doesn't seem that a CPU faster than the 5600X would benefit DCS at all.
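One caveat when reading that 30% figure: Task Manager style CPU usage is averaged across all logical threads, so a single maxed-out main thread barely moves the total. A rough illustration in Python (the per-thread loads are purely hypothetical):

```python
# Aggregate CPU usage on a 6-core/12-thread 5600X when only a few threads are busy.
# The per-thread loads below are invented numbers, just to show the averaging effect.

logical_threads = 12
loads = {"main/render thread": 100, "audio": 40, "misc": 25}  # percent busy

aggregate = sum(loads.values()) / logical_threads  # idle threads contribute 0
print(f"aggregate CPU usage: {aggregate:.0f}%")    # ~14%, yet one thread is pegged
```

So a low overall percentage doesn't by itself rule out a single-thread CPU limit; per-core graphs tell you more.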

Edited by empec
Posted

I went from a 5600X to a 5800X3D. In some scenarios I now have visibly increased FPS.

I don't agree that an upgrade doesn't help, since DCS is often CPU-limited, no matter whether in 2D or VR.

But the question is whether the upgrade is worth it. Now that ED has officially stated that multi-threading (or, from what I understood, dual-threading) is close, I highly doubt it.

Posted (edited)

There is little point in swapping to a 5800X3D without a CL14 RAM kit to take full advantage of it. CL14 was already good with the 5600X, and it is even better with the 5800X3D, preferably as a kit of 4 single-rank sticks; it's a bit pricey, but in this particular case well worth the investment.

Those kits come in DDR4-3200 or DDR4-3600; 3600 works well with CL14 B-die, and the gain is significant at 4K with 2x MSAA. Just make sure your motherboard supports it.

My suggestion: G.SKILL TridentZ RGB Series 32GB (4 x 8GB) 288-Pin PC RAM DDR4 3600 (PC4 28800) Desktop Memory Model F4-3600C14Q-32GTZRA

Gains from a 5600X paired with a CL16 Crucial kit to the same CPU with a CL14 B-die kit: 4K, 2x MSAA, 3DMark Pro. CPU and GPU boosted (Ryzen Master and MSI Afterburner). G.SKILL TridentZ RGB Series 32GB (4 x 8GB) 288-Pin DDR4 3200 (PC4 25600) Desktop Memory Model F4-3200C14Q-32GTZR

GSKILL.jpg

Gain from the setup above to the 5800X3D with a CL14 3600 RAM kit. Same test settings. Same GPU (EVGA GeForce GTX 1080 Ti). No boost.

Gains-Stage-1.jpg

The difference between the gains comes mostly from the 5800X3D's cache, and perhaps a little from the increased frequency, but as you can see, all channels are fully functional at 4K (under max load) and the CPU doesn't throttle back, which it would do under the conditions below:

Latency higher than CL14.

More than 4 banks (ranks). The maximum for Zen 3 is 4 x 1 bank or 2 x 2 banks. The advantage is interleaving: the controller can spread data across all 4 banks.

Frequency higher than 3200 MHz with RAM latency above CL14; those kits don't have the timings to run beyond the recommended limit of 3200 MHz. The latency sketch below puts numbers on this trade-off.
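For context on why CL matters as much as frequency, first-word CAS latency in nanoseconds follows directly from CL and the transfer rate. A small sketch (this is the standard formula; nothing kit-specific is assumed):

```python
# First-word CAS latency: latency_ns = 2000 * CL / (MT/s).
# The factor is 2000 rather than 1000 because DDR transfers twice per clock.

def cas_latency_ns(cl, mts):
    return 2000.0 * cl / mts

for cl, mts in [(14, 3200), (14, 3600), (16, 3200), (16, 3600)]:
    print(f"DDR4-{mts} CL{cl}: {cas_latency_ns(cl, mts):.2f} ns")
# DDR4-3600 CL14 -> 7.78 ns; DDR4-3200 CL16 -> 10.00 ns
```

By this measure a CL14-3600 kit is meaningfully quicker to first word than a CL16-3200 one, which is the core of the argument here.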

>>>>

From my PoV, going from 1.33% to 18.91% in Graphics score is already a good reason for pairing this CPU with a CL14 kit; the 14.09% gain in Physics score demonstrates how much the CPU benefits from this RAM, and the 32.34% Combined score makes it a winning combination.

Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 

Posted
4 hours ago, Thinder said:

There is little point in swapping to a 5800X3D without a CL14 RAM kit to take full advantage of it. [...]

Incorrect. 

The 5600X to 5800X3D is on average a 15-25% gain in performance, a fairly significant increase. 

Lower-latency (CL14) dual-rank RAM will get you another couple of percent at best, not exactly a deal-breaker. 

 

Unless you have absolute trash RAM, there will be a performance increase. Now, will it be noticeable? That will depend on the scenario. VR sees the biggest net gain; 2D does as well, but it may not be as noticeable. 

 

 

Windows 11 23H2| ASUS X670E-F STRIX | AMD 9800X3D@ 5.6Ghz | G.Skill 64Gb DDR5 6200 28-36-36-38  | RTX 4090 undervolted | MSI MPG A1000G PSU | VKB MCG Ultimate + VKB T-Rudders + WH Throttle |  HP Reverb G2  Quest 3 + VD

Posted (edited)
4 hours ago, EightyDuce said:

Incorrect. 

The 5600X to 5800X3D is on average a 15-25% gain in performance, a fairly significant increase. [...full reply quoted above, snipped for length...]

Yeah, right...

You're quick to try to correct my posts at every opportunity, but did you notice the difference between his system configuration and mine? SINGLE-RANK Trident Ripjaws V sticks at CL16 for the 5800X3D. How is that not BIASED when you see what AMD has to say about it?

biased-test.png

 

Good thing I don't rely on YouTube videos but get my info from AMD and G.Skill before committing to a RAM kit. Which, by the way, I didn't do for once when I ordered and tested an intermediate G.Skill kit (not mentioned here) that is 2-rank but B-die, and the difference between it and my last results is 2%, same as the DDR5 tested in this video.

PS: I did it for free thanks to a refund from Newegg.

Now, what AMD says is the following:

ranks.jpg

Quotes:

1) "Running 4 dual rank sticks is very taxing for the CPU's memory controller. The more ranks present in the system, the harder it gets for the CPU to manage all these ranks, especially at high memory speed, and also officially supported memory speed by AMD decreases accordingly".

2) "In your case, running 2 dual rank sticks or 4 single rank sticks might be the better option if you are aiming for both high memory clock and low latency. The best memory for this is the ones using B-die IC chips underneath but they are sold with much higher premium price".

>>>

1) No different from what I say; at least I can quantify my tests properly with similar-latency RAM kits. In his first test, the difference at 1080p, best score obtained, is already 7.34%+; then he ran his test with a CL16 non-B-die kit at a frequency not officially supported by the manufacturer and with the wrong timings.

In any case, he never compared the same two configurations I tested, and how he can show the same results at 4K just going from 3200 to 3600 MHz, with ZERO% gain, with a straight face is beyond me. Good for him, he is just doing a cost-per-frame test; that can "explain" some of it.

2) I didn't know the Trident Ripjaws V DDR4-3600 CL16 was a B-die kit, did you? No, it ain't. You just can't claim a configuration to be faster while using different IC chips, as specified by AMD, simply because you won't have the same stability and timings.

I hope you don't expect the same performance from a CL16 kit costing £155 less than my CL14 3600 kit. I know Xmas is close, but seriously?

That's why I don't mix oranges and lemons in a test and don't use games but 3DMark Pro at 4K with 2x MSAA, plus I give my full configuration as well, which he never does. Q: How many sticks does he use for his test? B-die or not B-die? And WHY doesn't he provide the CPU with the best combination for his test?

How it works by AMD:

interleaving.jpg

4 x 1 bank (or rank) is the optimum configuration for Zen 3, and the 3D in particular, whose cache will take full advantage of interleaving. So there is no way around it: if you want maximum performance from it, you need B-die, CL14, at most the 3600 MHz allowed by the B-die kit's timing range, and 4 x 1 bank (or rank).
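To illustrate what interleaving buys, here is a deliberately simplified toy model; the real AMD memory controller's address mapping is more involved, so treat this as a sketch of the concept only:

```python
# Toy rank-interleaving model: consecutive cache lines rotate round-robin
# across ranks, so back-to-back accesses can overlap instead of queueing
# on a single rank. NOT the actual AMD address mapping.

CACHE_LINE = 64  # bytes

def rank_of(addr, ranks=4):
    return (addr // CACHE_LINE) % ranks

for i in range(8):
    addr = i * CACHE_LINE
    print(f"line {i} (0x{addr:04x}) -> rank {rank_of(addr)}")
# With 4 single-rank sticks, 4 consecutive lines land on 4 different ranks.
```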

>>>

Quote

Unless you have absolute trash RAM, there will be a performance increase. Now, will it be noticeable? That will depend on the scenario.

 

It's not a question of "trash", it's a question of latency. Anyone asking AMD or G.Skill technical support will get the same answer: best is B-die, CL14, 4 x 1 bank (or rank). But apparently that's something you haven't come to terms with yet; better to post little videos with plenty of unknown parameters.

He does cost per frame, I do optimization...

 

Quote

VR sees the biggest net gain; 2D does as well, but it may not be as noticeable.

 

You've got it backwards, but yeah, you just made my point: at 1080p he already had a 7.34% difference with the wrong kit for the 5800X3D, and at 4K, as expected, the non-B-die CL16 causes the CPU controller to throttle down and the difference is ZERO.

I strongly suggest you climb the same learning curve I did and inform yourself from people who know what they're talking about, not YouTube contributors who start by trashing B-die CL14 kits as "ultra-expensive stuff" while pretending to do a cost-per-frame test. Then you post it with the mention "Incorrect", proving you haven't got it yet.

Quoting myself:

Quote

it's a bit pricey but in this particular case well worth the investment.

So AGAIN: optimization following AMD and G.Skill recommendations means NO DDR5 and NO CL16, which is by far NOT the best for the 5800X3D. It means B-die, with optimization done by using RAM and a configuration the CPU controller can run at full throttle under full load (4K), with the right timings and interleaving. That is not what you get from a CL16 non-B-die kit, what you could call "trash": branded G.Skill, Corsair or not, they are simply no better than a high-street CL16 kit...

Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 

Posted
2 hours ago, Thinder said:

Yeah, right...

[...full reply quoted above, snipped for length...]

 

That's a nice wall of text, but at the end of the day they took the same hardware (MB and RAM), compared the 5600X vs. the 5800X3D, and the 5800X3D came out 15-25% faster. 

You should develop and present your comprehensive testing method, since they're apparently doing it wrong. I'd be curious to see your results instead of a bunch of copy/paste text. 

 

 

 

Windows 11 23H2| ASUS X670E-F STRIX | AMD 9800X3D@ 5.6Ghz | G.Skill 64Gb DDR5 6200 28-36-36-38  | RTX 4090 undervolted | MSI MPG A1000G PSU | VKB MCG Ultimate + VKB T-Rudders + WH Throttle |  HP Reverb G2  Quest 3 + VD

Posted (edited)
On 11/10/2022 at 4:56 PM, aleader said:

I have a new 5800X3D on hold at Memory Express here for $429 CAD ($320 USD). [...original post quoted in full, snipped for length...]

 

It's simple: your 5800X3D can handle much more than you might think thanks to its cache, even at high resolutions and with multi-threading.

But if you ignore the limitations of the controller and fit a RAM kit that limits its bandwidth, you will not see its optimum performance at any resolution, single- or multi-threaded.

People keep trying to imply that you could use whatever RAM kit your motherboard supports and still see the same results, but it is not so.

There IS a RAM-CPU bottleneck, the same way there are CPU-GPU (or reverse) bottlenecks on any platform, but it is even more important with Zen 3, and the 3D in particular, since the maximum bandwidth they can use depends entirely on the RAM latency, timing range, and frequency.
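For a sense of scale, the peak theoretical bandwidth of a dual-channel DDR4 setup is easy to compute; a back-of-the-envelope sketch (real-world throughput is lower and also latency-sensitive):

```python
# Peak DRAM bandwidth: channels * bus width (bits) * MT/s / 8 bits-per-byte.

def peak_bandwidth_gbs(mts, channels=2, bus_bits=64):
    return channels * bus_bits * mts / 8 / 1000  # GB/s

for mts in (3200, 3600):
    print(f"DDR4-{mts} dual channel: {peak_bandwidth_gbs(mts):.1f} GB/s peak")
# DDR4-3200 -> 51.2 GB/s; DDR4-3600 -> 57.6 GB/s
```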

So it is up to you to choose: either cost per frame and save 150 bucks, or optimization for your 5800X3D. I answered your concern about its maximum performance.

For my last enquiry to G.Skill I didn't provide them with the details (I will), but their reply is already clear about the limitations of your CPU: CL14 is quoted for 3600 MHz, and this is what will remove the bottleneck on a Zen 3/3D platform.

GSkill-Reply.png

 

15 minutes ago, EightyDuce said:

That's a nice wall of text, but at the end of the day they took the same hardware (MB and RAM), compared the 5600X vs. the 5800X3D, and the 5800X3D came out 15-25% faster. [...]

Mate, you keep trying to sell wooden chariots as Rolls-Royces, continually contradicting what AMD and G.Skill have to say about it with "cost per frame" videos, comparing them with my optimized setup, and proving time and time again that you didn't understand the A to Z of it.

I already posted my results, but you keep coming back with mediocre posts and then asking for them; reminds me of some guys...

AGAIN: you've got it all wrong, and it would be nice if you stopped misinforming people. Cheers. And something else: when I need an educated opinion, I know where to ask, and if I had listened to you or your YouTube contributors instead, I wouldn't have obtained the results I got today.

PS: the copy/paste texts are replies from AMD and G.Skill to my email enquiries.

Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 

Posted
3 hours ago, Thinder said:

It's simple: your 5800X3D can handle much more than you might think thanks to its cache, even at high resolutions and with multi-threading. [...full post quoted above, snipped for length...]

 

I look forward to your continued comprehensive counter-shill culture analysis of CPU and memory tech that everyone else seems to be missing. 

You'll be great. 


Windows 11 23H2| ASUS X670E-F STRIX | AMD 9800X3D@ 5.6Ghz | G.Skill 64Gb DDR5 6200 28-36-36-38  | RTX 4090 undervolted | MSI MPG A1000G PSU | VKB MCG Ultimate + VKB T-Rudders + WH Throttle |  HP Reverb G2  Quest 3 + VD

Posted

Thinder,

didn't you previously suggest 3200 MHz as the "ONLY" top speed for Ryzen 5000 and categorically exclude 3600 as a valid setting? I recall a few threads where you were pointed to various AMD statements, including Lisa's own show presenting the CPUs, stating that 3600 is the actual sweet spot, and you were not too fond of that 3600 setting at all, iirc.

Now you run 3600 yourself, overclocked and overvolted (like I do).

Am I wrong, or what change of mind have I missed?

I assume, since you run them at 3600 CL14 @ 1.45 V, that 3600 is the actual faster setting vs. 3200, even under heavy load on the CPU and IMC.

 

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted (edited)
10 hours ago, BitMaster said:

Thinder,

didn't you previously suggest 3200 MHz as the "ONLY" top speed for Ryzen 5000 and categorically exclude 3600 as a valid setting? I recall a few threads where you were pointed to various AMD statements, including Lisa's own show presenting the CPUs, stating that 3600 is the actual sweet spot, and you were not too fond of that 3600 setting at all, iirc.

Check out AMD's own comments on their website, on top of the technical support comments I posted. And PS: those comments aren't "MINE"...

The 3600 "sweet spot" is specific to their settings, not mine, and instead of relying on Youtube shows, I opted to ask to whom knows best.

AMDConnectivite1.jpg

AMD Ryzen™ 5 5600X

vs

AMDConnectivite.jpg

AMD Ryzen™ 7 5800X3D

I have explained WHY I can run a 3600 MHz kit several times over, but I will reiterate:

My CPU's controllers didn't, and still don't, have to deal with too many banks (the maximum is 4, in 2 x 2 or 4 x 1, as suggested by the AMD and G.Skill techies), with the wrong timings you'll get from anything higher than CL14, or with the wrong number of sticks for this all to happen.

Interleaving (4 x 1 banks) does have an effect, and it's not mentioned often in YouTube shows, especially not "cost per frame" tests, but it is mentioned in a comparison between 2 and 4 sticks: the 4-stick combo is faster (as I said), but they couldn't take the RAM up to 4000 MHz (also as I said).

Note that in some cases maximum frequency also depends on the motherboard BIOS; some motherboards could run the same RAM a bit faster than mine at CL14, but still not at 4000 MHz. You won't learn that from RAM tests but from motherboard comparisons; mine runs cooler than most, which is the reason for my choice.

AMD Ryzen: 4 vs. 2 Sticks of RAM on R5 5600X for Up to 10% Better Performance

The Zen 3/3D limit IS 3200 MHz with "high street" (non-B-die) RAM kits, UNLESS, like you, you provide the controllers with the right range of timings, which is precisely what a CL14 kit does by removing the RAM-CPU bottleneck created by going over those limitations. If you don't use a B-die CL14 kit, you won't remove the bottleneck.

You won't notice it at lower resolutions, but at 4K there is a huge difference, which is not apparent in those "cost per frame" videos because they are not using CL14 and optimized settings/timings to compare with.

Quote

Now you run 3600 yourself, overclocked and overvolted (like I do).

Am I wrong, or what change of mind have I missed?

I assume, since you run them at 3600 CL14 @ 1.45 V, that 3600 is the actual faster setting vs. 3200, even under heavy load on the CPU and IMC.

 

You assume wrongly; my kit isn't overclocked or overvolted, it's a stock G.Skill CL14/3600 kit. I never even tried to OC my previous CL14 kit, and I have posted its full reference multiple times since I procured and tested it: G.SKILL TridentZ RGB Series 32GB (4 x 8GB) 288-Pin DDR4 3200 (PC4 25600) Desktop Memory Model F4-3200C14Q-32GTZR

Now I use:

Cl14-3600.jpg

https://www.newegg.com/g-skill-32gb-288-pin-ddr4-sdram/p/N82E16820374226?Item=N82E16820374226

What I can tell you about it is that there is no provision in the BIOS to run it above 3600 MHz with standard timings. Whether that is down to my motherboard's BIOS or not, I don't know (it is not impossible), but from what both AMD and G.Skill tech support say, chances are the CPU controller won't take much higher with CL14, and I didn't even consider trying.

AMD doesn't support overclocking RAM, and when I tried to OC my Crucial CL16 3200 MHz kit with my 5600X using the best timings available, I lost just over 1% at 4K. The reason you're able to tweak your kit is B-die, its stability and its range of timings; for me, that is the reason I rely on tech support.

A "high street" kit won't do that (even branded GSkill or Corsair), and to run their high-speed kits above 3600, RAM manufacturers have to increase latency to keep their kits stable.

Those kits, with much higher latency, will run, but they never get rid of the bottleneck at 4K; they just don't have the right timing range. You get a much higher frequency, but under load (4K) they will cause the controllers of a Zen 3/3D to throttle down, hence the lack of optimization and the limited bandwidth.

It's not the RAM but the CPU controller that dictates the tempo of the waltz, yet there are some who still try to contradict the AMD and G.Skill technicians by pretending otherwise.

From my first conversation with those technical support teams, my goal was always to remove this bottleneck, especially because I purchased the "cheap" 5600X instead of a faster CPU. Since I wanted to play VR at 4K, I needed to squeeze the heck out of my platform without overclocking and losing the manufacturer warranties, so I only used boosts (AMD Ryzen Master and MSI Afterburner, within the manufacturers' recommendations), NO OC.

But when I enquired about CL16 kits out of curiosity, they advised me the same way. They will advise you on the kit they think will work for your platform at the frequency or latency you ask about, the same way a company like Newegg will sell you their product; but if you want to know what works best (optimization) for your Zen 3, you need to ask them specifically.

---

 

Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 

Posted (edited)

OK, I misread some of it then. My fault.

I thought you "tuned" your B-Die from 3200-14 to 3600-14, but you did the safer bet and bought them that way. I said to myself if they are advertised at 1.45v I may try on my own...and got lucky this time, which is often not the case when you run into HW limits of what can be done and what is BSOD  LoL.

 

Anyway, I can follow your idea and it makes sense in general.

B-die!  

Edited by BitMaster

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted (edited)
1 hour ago, BitMaster said:

OK, I misread some of it then. My fault. [...full post quoted above, snipped for length...]

 

No problem.

I get that a lot. First, this subject is rarely debated, so people are not aware; second, English is not my first language, and despite my best efforts I can often go the long way round in my explanations, which can be confusing at times.

Anyway, good for you choosing a B-die kit; AMD recommends them for Zen 3, and for good reason. My tip: if you plan an upgrade from the 5600X for Xmas, the Ryzen 7 5800X3D is a very good match. I'd check the timings to make sure your Infinity Fabric runs at a 1:1 ratio.
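On the 1:1 point: DDR4-3600 means 3600 MT/s, i.e. an 1800 MHz memory clock, so the Infinity Fabric clock (FCLK) wants to sit at 1800 MHz to stay synchronized. A tiny sketch of that relation (standard Zen 3 behaviour; the exact attainable FCLK varies per chip):

```python
# Zen 3 runs best with FCLK = UCLK = MEMCLK (1:1:1).
# MEMCLK in MHz is half the DDR transfer rate in MT/s.

def fclk_for_1to1(mts):
    return mts // 2

for mts in (3200, 3600):
    print(f"DDR4-{mts}: set FCLK to {fclk_for_1to1(mts)} MHz for 1:1")
# DDR4-3600 -> FCLK 1800 MHz, part of why 3600 is often called the sweet spot.
```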

If you upgrade this way, try to do a back-to-back test at 4K; in all logic you should see results comparable to mine, as the 400 MHz shouldn't make a huge difference.

Here are the results of my tests: 3DMark Pro, 4K, 2x MSAA. 5600X CL14 3200 MHz vs 5800X3D CL14 3600. EVGA GeForce GTX 1080 Ti (no boost for the 5800X3D), and the RAM kits were as explained in my previous post.

Gains-Stage-1.jpg

Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 

Posted

I ain't upgrading yet, since I run this 12-core 5900X system and fewer cores would be no good for VMware.

I'll definitely wait until decent 128 GB DDR5 kits are on the shelf for fair money before I upgrade, unless I have to.

Afaik there should also be the option to mix modules without penalty, like 2x32 GB + 2x16 GB for a total of 96 GB without breaking things; that's what the DDR5 specs say, iirc. But who knows what your IMC thinks of such an idea.

 

Sure, my ratio is 1:1 on the Infinity Fabric. It is also stable: it hasn't crashed in a year, and I often saturate my RAM at full tilt for hours and nothing bad happens, stable and quiet.

I think if I had 128 GB I would just scale my VMware tests bigger and run into the same RAM wall, LoL.

 

 


Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted
1 hour ago, BitMaster said:

I ain't upgrading yet, since I run this 12-core 5900X system, and I'll wait until decent 128 GB DDR5 kits are on the shelf for fair money. [...full post quoted above, snipped for length...]

Got some potentially bad news for you on the 128 GB front if you're looking at AM5. At least for now. 

https://youtu.be/P58VqVvDjxo

 

Windows 11 23H2| ASUS X670E-F STRIX | AMD 9800X3D@ 5.6Ghz | G.Skill 64Gb DDR5 6200 28-36-36-38  | RTX 4090 undervolted | MSI MPG A1000G PSU | VKB MCG Ultimate + VKB T-Rudders + WH Throttle |  HP Reverb G2  Quest 3 + VD

Posted

Thanks, I saw this last night already and it really doesn't look too good.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

  • 2 weeks later...
Posted
On 12/4/2022 at 5:05 PM, Thinder said:

There is little point in swapping to a 5800X3D without a CL14 RAM kit to take full advantage of it. [...full recommendation quoted from earlier in the thread, snipped for length...]

 

Thanks, but there's no way I'll be spending almost an additional $500 CAD on new RAM to gain such a small amount of performance at 1440p.  Your idea of 'well worth the investment' and mine must be very different 😉.  I'll put that towards a 4070 or 4070 Ti (or the AMD equivalent) instead.

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

Posted (edited)
On 12/21/2022 at 6:50 PM, aleader said:

Thanks, but there's no way I'll be spending almost an additional $500 CAD on new RAM to gain such a small amount of performance at 1440p. [...]

First, I never quoted gains at 1440p at ANY time, so I hardly see how you can quantify them as "small"; second, playing a game like DCS at that resolution is my definition of a waste of money. I just hope you didn't invest in too many modules.

When all of a sudden you realise that your system is bottlenecked because of your "economy", want to play at higher resolutions or even try Microsoft Flight Simulator, and find no other solution but to splash out on a substitute such as more RAM or an OC, losing the manufacturer warranty for a much smaller gain, then tell us again how this unknown quantity is "small".

Because, you see, I don't formulate opinions out of thin air, and I never tested at 1440p since I want to play full-blown 4K 2x MSAA VR; but even on my pre-upgrade system, the best addition I made to the 5600X was the CL14 kit, no contest.

Some of us like their wine without water...

Edited by Thinder

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 

  • 2 weeks later...
Posted (edited)
On 12/21/2022 at 5:19 PM, Thinder said:

First, I never quoted gains at 1440p at ANY time, so I hardly see how you can quantify them as "small". [...full post quoted above, snipped for length...]

 

I can't really understand some of what you're saying here because your English isn't great, but playing at 4K and in VR is a TINY (read: extremely small) niche that only people who spend a LOT of time sitting in front of their PC would want to spend all that money on.  I put in my original post that I was non-VR at 1440p; I never asked for VR 4K data.  MY idea of spending thousands of dollars to play a single game is MY definition of a waste of money that could be much better spent on other things (i.e. outdoor activities and/or a video card upgrade when I NEED one).  

What I will do is post some actual in-game (DCS) performance gains at 1440p if and when I ever put a 'better' CPU in my system.  3DMark scores and synthetic benchmark numbers aren't much use to me, as they rarely represent in-game performance.

Edited by aleader

"I mean, I guess it would just be a guy who you know, grabs bananas and runs. Or, um, a banana that grabs things. Why would a banana grab another banana? I mean, those are the kind of questions I don't want to answer." - Michael Bluth

Posted
2 hours ago, aleader said:

I can't really understand some of what you're saying here because your English isn't great... [...full post quoted above, snipped for length...]

 

Get off my back.

 

You don't understand because 1) you don't want to, and 2) you're way too ignorant to get the basics, which is why you take me on over my English.

I have NO patience for geezers like you, got it?

Bye.

Win 11Pro. Corsair RM1000X PSU. ASUS TUF Gaming X570-PLUS [WI-FI], AMD Ryzen 7 5800X 3D, Sapphire Radeon RX 7900 XTX Nitro+ Vapor-X 24GB GDDR6. 32 GB G.SKILL TridentZ RGB Series (4 x 8GB) RAM Cl14 DDR4 3600. Thrustmaster HOTAS WARTHOG Thrustmaster. TWCS Throttle. PICO 4 256GB.

WARNING: Message from AMD: Windows Automatic Update may have replaced their driver by one of their own. Check your drivers.

M-2000C. Mirage F1. F/A-18C Hornet. F-15C. F-5E Tiger II. MiG-29 "Fulcrum".  Avatar: Escadron de Chasse 3/3 Ardennes. Fly like a Maineyak.

 
