Everything posted by BitMaster
-
I may repeat myself, but the "GVK" RipJaw-V version of the module uses the same B-die; it just skips the LEDs and thus cuts cost by 60€, maybe more in Brazil. BTW, those GVK RipJaw-V kits are listed many times in the mprime world-best list. If they were any lesser than the LED version, I am sure those hardcore overclockers would use the LED variant of that module. Anyway, I second the CL14-3200 approach, B-die... yes yes yes. I am just not sure anyone needs the LED gimmick when all it mostly does is cause software trouble with the LED software ( board + tower + GPU + RAM... all LED... what a mess to get that all under one umbrella. ) If my budget allows I may get the LED version to match the nice new case, but if the budget says NO, it is no lesser kit, just no shiny LEDs.
-
Tbh, 2023 is FAR away in the computer universe. It is next to impossible to tell you now what to buy in March 2023. All I can say is: watch the market and the technology arriving, and form your own picture of what's worth waiting for and what's worth spending extra cash on. For sure you want a CPU with excellent IPC and also high clocks, and at least 32GB of whatever RAM there is in 2 years; I would think 64GB rather than 32GB by then. GPU... well, most cannot spend 3k€ on a GPU, so many have to settle for cards around 1k€, +/- what the budget says. Hard to say; 2023 will have Intel offering cards too, and that might stir up the GPU market, I really hope so tbh. I have been following the prices for the past 6 months and planned various systems, compared, erased, started all over... it's a process. It wasn't easy, and with 12th gen arriving I was waiting for those prices. Now, reading how much a 12900K and DDR5 will cost... no thank you, I'll grab that Ryzen 9 + DDR4 and call it a day. I have settled on an AMD and X570S, 64GB, 2 x 1TB 980 Pro and a new case. I will keep the DIY loop, PSU, GPU and some of my SSDs; the rest stays in my 8700K on air.
-
CPU overclock - is it truly worth it for DCS?
BitMaster replied to sirrah's topic in PC Hardware and Related Software
What I forgot to say about the volts... Asus, ASRock, Gigabyte, MSI and all others use different approaches to Load Line Calibration. If you are not familiar with it, google it real quick. The essence is what your dialed-in volts read under load, not the value that you dialed in. Those may be, and likely are, two different pairs of shoes. LLC is the corrective for that!

To read the real, present value under load, I personally use HWiNFO in sensors-only mode. You can read the correct value there while you stress the CPU. The more it drops from the set value under load, the less stable it may get; use a higher LLC setting to correct the drop ( some vendors number LLC 1-7 from low to high and some do it vice versa, iirc ). Buffer the drop so that when you dial in 1.33 V, it won't read any lower than 1.32 V under Prime95 small FFTs.

With Z370 and Asus there is a BIOS setting to pre-calibrate the LLC and adaptive-voltage offset ( other vendors may have one too ), with the options Intel Default, Best, Medium and WCS ( Worst Case Scenario ). To be as linear as possible, use Best, as it adds no extra voltage to whatever you dial in anywhere. WCS, for example, adds some extra volts, and that extra heat is likely the reason why auto-overclocking seldom brings excellent results vs. DIY with blood, sweat & tears. With HWiNFO you can easily read what each setting adds vs. Best ( or the equivalent term in your BIOS, if present ) with a fixed LLC. I use Best so Asus doesn't add its juice to my soup when I dial in volts. It makes a considerable difference in the amount of heat produced at the LLC level you have applied under full load. You may fail 4.8, 4.9 or even 5.0 GHz just because LLC adds way too much voltage and heats the CPU up so much that it tilts. With adaptive volts capped at 1.35 V, the correct pre-calibration setting if available, and the right ( or nearly right ) LLC setting, volts won't go much higher or much lower under load than what you dialed in.
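To make the vdroop/LLC relationship concrete, here is a toy model. The per-level loadline resistances below are invented numbers for illustration, not any vendor's real values; the point is only that a higher LLC level means a smaller effective loadline resistance and thus less droop from the dialed-in voltage.

```python
# Toy vdroop model: Vload = Vset - Icc * R_loadline.
# The milliohm values per LLC level are made up for illustration;
# real boards use their own (often undocumented) loadline curves.
def v_under_load(v_set, i_cc_amps, r_loadline_mohm):
    return v_set - i_cc_amps * (r_loadline_mohm / 1000.0)

llc_mohm = {"low LLC": 1.6, "medium LLC": 0.8, "high LLC": 0.1}  # hypothetical

for name, mohm in llc_mohm.items():
    v = v_under_load(1.33, 120, mohm)  # 1.33 V dialed in, ~120 A full load
    print(f"{name}: {v:.3f} V under load")
```

With the made-up numbers, the low LLC setting sags well below the dialed-in 1.33 V, while the high setting stays within about 0.01 V of it, which is the behavior described above.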
Adaptive volts with all Speed Step settings left enabled lets Windows lower the frequency & volts in energy-save modes; if you want, you can switch to high-performance mode and the CPU will kick into 5 GHz and 1.35 V even at idle. When you load it fully, it should not drop more than 0.01-0.02 V and not get too hot. That setting is the sweet spot for my 8700K. Depending on the silicon, yours may need more or less voltage to run stable. Delidding certainly helped reduce the heat on mine by high single-digit values across all cores, together with a beefy DIY loop. Not delidded and on air, the limit under full load may well be under 5 GHz, but tbh for DCS you could try 100-200 MHz higher than what is workstation-stable. I can play DCS at 5.1-5.2 GHz for at least 2 h ( I hardly ever use it longer ), but that is way off from stable under full load because of the heat from the extra 0.05 V it needs to push the die to 5.2 GHz. I haven't run the chip above 5 GHz for a long time, but it has run 24/7/365 at 5 GHz ever since and only reboots if needed. With VMware I often kill it, thrash it & really, really need every little bit of power it's got, and it has delivered under full-tilt loads. When it crashed and BSOD'ed, it did so under no load, YT & idle etc. I will upgrade pretty soon, and 128GB seems more likely than 64GB, plus more cores, as much as the budget allows, due to VMware, not DCS. Since the DDR5 latency leaks are less than stellar, I think it's still a good time to go team red and get a Ryzen 9. Dial in PBO2 and call it a day. Overclocking is fun, but at the end of the day I paid way, way too much to get that 8700K to where it is now ( either that or I got lesser silicon ): delidding kit, liquid-metal thermal paste, water cooling... I think AMD's approach with PBO2 is better. Give it a good cooler and lean back for best single-core performance, which is what DCS needs. -
CPU overclock - is it truly worth it for DCS?
BitMaster replied to sirrah's topic in PC Hardware and Related Software
Never really used XTU tbh; I use the BIOS and also Asus AI Suite sometimes. For a 24/7/365 OC you definitely want to do it in the BIOS. There are so many test suites and benchmarks around; pick a few to find the obvious weaknesses in your OC. The real test is many, many days without a reboot and lots of different tasks in between: DCS/gaming, browsing, backups over many hundreds of gigabytes... If it does all that, including not failing the tests, it might be stable, LoL. Tbh, Win10 itself BSODs enough; sometimes you won't be able to tell easily why it happened. I see a lot of Win10 installs and honestly, Microsoft... ahhh, I'll stop here... LoL. You get the idea. -
CPU overclock - is it truly worth it for DCS?
BitMaster replied to sirrah's topic in PC Hardware and Related Software
sirrah, I have OC'd my 8700K ever since I've owned it; it was meant to be overclocked because it is one of the "older" CPUs where you could still gain some serious MHz with modest effort. The two things you should care about are volts and heat, directly connected to each other. But hey, even if you can cool that CPU with a super cooler, it is no good idea to pump more than 1.35 V through it under load and run it like that for months and years to come. I would start upping the MHz first until it fails; that produces less heat. Increase volts only if needed, in 0.01-0.02 V increments. Use adaptive voltage for the CPU and set it to 1.30 V, set your CPU to 4.7 GHz all-core and test it; if all green, do 4.8, then 4.9... until it fails. If it fails, increase voltage by 0.01 or 0.02 V and try again, but always check that temps don't end up in the high 80s. 80-85°C max is still OK under full load, but I wouldn't run a setting that pushed it from the high 80s into the 90s °C. Things you may need to tweak are VCCIO and System Agent voltage; both may get auto-set to values way above 1.25-1.35 V, which puts a lot of stress on the IMC. You may set both to 1.15 V manually if you start to get RAM errors or POST issues. Other than that, most chips do 4.8 GHz with a good air cooler, and really good ones do 5 GHz on air, but that is borderline if the 8700K is not delidded ( mine is ). With a good AIO or DIY loop most do 5 GHz, or even more if delidded and lucky. One more thing: by ALL MEANS, I would NOT boot into my Win10 to test this until you have a somewhat stable OC. Overclocked reboots often cause file corruption; I have ruined many, many Win10 installs this way. Best if you make a bootable USB stick with a Linux and boot that; if it fails or crashes, nothing happens to your install. If your Win10 errors out, run the sfc and dism commands to check file integrity ( google them ). -
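The step-up routine above can be sketched as a loop. The `is_stable()` check here is a stand-in toy model, not a real stress test; in reality that step is you running Prime95 while watching temps in HWiNFO.

```python
# Sketch of the manual OC routine: raise MHz first, bump volts only on failure,
# stop at the 1.35 V ceiling. Voltages in millivolts to keep arithmetic exact.
# is_stable() is a toy stand-in, NOT a real stress test.
V_CAP_MV = 1350

def is_stable(mhz, mv):
    # Toy assumption: every 100 MHz past 4700 needs another 20 mV.
    return mv >= 1300 + (mhz - 4700) // 100 * 20

def find_oc(start_mhz=4700, start_mv=1300):
    mhz, mv = start_mhz, start_mv
    while True:
        if is_stable(mhz, mv):
            mhz += 100            # all green -> try the next 100 MHz step
        elif mv + 10 <= V_CAP_MV:
            mv += 10              # failed -> small 0.01 V bump, retry
        else:
            return mhz - 100, mv  # voltage ceiling hit -> last good step

print(find_oc())  # -> (4900, 1350) with this toy model, i.e. 4.9 GHz @ 1.35 V
```

The shape of the search is the point: frequency goes up for free until it fails, voltage follows in small steps, and the hard cap ends the climb.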
Do a DCS repair and also an MS "sfc /scannow", and if things are really bad, follow up with a DISM command. Google the commands: use sfc first, and if that fails, follow the guides using DISM. It might well be that you have corrupted some files with the sudden reboot; that is a very common thing.
-
Suggestion for a gaming PC upgrade - main use for DCS
BitMaster replied to maquez's topic in PC Hardware and Related Software
I would skip the 32GB kit and buy a 64GB kit right away, but settle on 3200-CL16 to keep the price down. With Intel the actual DDR4 speed/performance is not as critical as it is on early Ryzens; you won't really see a lot of difference between 3200-16 and 3600-16 on Intel CPUs when gaming. If you follow the HW threads, more and more of them report RAM usage well above 32GB on the right map with the right module etc... Adding RAM a year later can work, but chances are good it will not. I just bought 64GB yesterday for a buddy: 3200 CL16 for 250€, G.Skill. You can get the "same" kit with CL14, but the price is double that, 480€... For AMD I would consider the lower CL; for Intel, skip it and use the cash elsewhere. My 2 cents -
You are mixing up two latencies. The 92.5 ns figure is the CPU<->RAM latency, if I got this right, and the other value is the RAM's internal speed, how fast it can switch etc. I am not super sure that is totally correct; I can only say that, for example, my RAM at 3000-14-14-14-34 currently shows roughly 50 ns latency on my Z370/8700K. The same RAM kit on Z270 did its XMP 3600-16-16-16-36 and had round about 40 ns latency. The 1st-gen Ryzens iirc had values between 60-ish and 90-ish ns and were accused of being too slow for gaming. It will be interesting to see how the first real-world benchmarks of DDR5 Alder Lake perform. I expect a rather meh experience, with the usual "wait for the 2nd gen if you can" kind of tip.
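The "internal" number is just the CAS latency converted to nanoseconds: CL cycles divided by the actual clock, which is half the MT/s rating. A quick sketch of that arithmetic:

```python
# CAS latency in nanoseconds: CL cycles / real clock.
# Real clock in MHz = MT/s / 2, so ns = CL / (MT/2) * 1000 = 2000 * CL / MT.
def cas_ns(mt_per_s, cl):
    return 2000 * cl / mt_per_s

print(f"DDR4-3200 CL14: {cas_ns(3200, 14):.2f} ns")  # 8.75
print(f"DDR4-3600 CL16: {cas_ns(3600, 16):.2f} ns")  # 8.89
# A measured 40-90 ns CPU<->RAM figure adds memory-controller and
# fabric overhead on top of this module-internal number.
```

That is why a measured ~50 ns system latency and a ~9 ns module timing can both be correct at the same time: they describe different parts of the path.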
-
https://www.caseking.de/team-group-elite-u-dimm-ddr5-4800-cl40-on-die-ecc-16-gb-dual-kit-metg-398.html

329.90€ for 16GB of DDR5-4800 with... CL40!!! I am in the same situation: want to upgrade, 64GB minimum, 12 cores minimum, but this Z690 thing does not hype me at all. It may be faster in IPC, but check how much you will have to pay for FOUR of those kits from the link above, with even higher speeds... ehhhh. I will rather pull the trigger on 64GB DDR4 3200/CL14 and a Ryzen 5900 or 5950 once the Gigabyte boards are available again. I like the idea of DDR5, but the first iteration of any new DDRx was expensive, not satisfying at initial speeds and latencies, with few modules to choose from, BIOS/RAM issues, etc... Those 64GB of DDR4 3200/14 go for 485€, a real deal compared to 16GB for 330€. That's 1,320€ for 64GB of DDR5-4800, +/-. Let's hope the prices come down and don't stay that high once the CPUs and boards arrive.

edit: just looked at the calc on my screen. It says 1,367.62€, which is 47.62€ more than what that potential 64GB DDR5 kit would cost. Just that this number stood for a Ryzen 5900X + Gigabyte Aorus Master X570S + 64GB G.Skill Trident Z Neo DDR4 3200-14-14-14-34 ( 4x16GB ), a total of 1,367.62€ only one click away. That is the difference between DDR4 and DDR5. Wow! Indeed a bit shocked.
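For anyone double-checking the numbers above, the comparison boils down to this ( figures copied from the post; current prices will differ ):

```python
# The arithmetic behind the DDR5 vs DDR4 comparison (prices as listed that day).
ddr5_kit_eur = 329.90            # one 16 GB DDR5-4800 CL40 kit
ddr5_64gb = 4 * ddr5_kit_eur     # four kits to reach 64 GB
ddr4_full_build = 1367.62        # 5900X + X570S Aorus Master + 64 GB DDR4-3200 CL14

print(f"64 GB DDR5 alone:  {ddr5_64gb:.2f} EUR")
print(f"Whole DDR4 build:  {ddr4_full_build:.2f} EUR")
print(f"Gap:               {ddr4_full_build - ddr5_64gb:.2f} EUR")
```

Four DDR5 kits alone cost nearly as much as the entire CPU + board + 64GB DDR4 build.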
-
Recommendations for tracking adversaries in DCS.
BitMaster replied to Paver's topic in PC Hardware and Related Software
1: I'd try one out before I'd buy one. Most love it, but some, incl. me, can't overcome nausea after a few minutes and can't use VR as it currently is. 2: Make sure your system can handle VR at your desired LOD. -
Usefull hardware / software links.
BitMaster replied to Groove's topic in PC Hardware and Related Software
If you really want to build a proper Win10 installer image, this is how you do it. Be warned: you will need access to a Server ( can be a trial 180-day license ) and some time to get it done. Prepare the Server with WDS, then make your custom image, permanently banning the bloatware. Take your time....

Bit

P.S.: Even if you don't plan to do all this, it is very informative, and you can also refrain from using the decrapify script and just disable all the questions MS asks you while you are installing.... I recommend this to all users, to get a better understanding of what MS does with your PC. -
FYI: If you are installing fresh or need to update software, you can save lots of time using the ninite.com installer script. It is one of the best tools I have found in 25 years of servicing computers. Absolutely free of any bloatware; it actually strips it out of the software packages you choose, picks your language, etc. And best of all: run the script once a week and it will update all the software managed by the script with one click. Scroll down the ninite page and check who is using it ( the paid version, likely ). It is 100% free for personal use. You won't regret reading that page and trying it out! https://www.ninite.com
-
YES, you need to install the chipset drivers!!! Either the ones from your MB support site or the newest ones from Intel. You likely need the proper audio driver pack as well; for LAN/WLAN/BT you can use the drivers Win10 provides, if it does, but I personally would install the latest drivers from the MB page. ALSO... install the Samsung NVMe driver!!! Otherwise you lose some performance; Samsung NVMe drives like their own driver far more than the generic MS version! Samsung Magician is the tool you need to check the drive. The same site has the NVMe driver ( further down the page ) and may also hold new firmware: https://www.samsung.com/semiconductor/minisite/ssd/download/tools/ Samsung Magician, on the same page as the NVMe driver, will scan your drive and read its SMART data. Install Magician + the NVMe driver from Samsung if you run a Samsung SSD of any kind.
-
Imho, wipe the drive and do it again, Jack. But before you do that, update your stick to Win10 21H1 so you don't have to download and install prior builds and updates etc... This is the actual MS site to download the tool that prepares the stick ( this one is in German; best to google it from your desktop and the results will offer the one in your native language ): https://www.microsoft.com/de-de/software-download/windows10

You have to distinguish between TIME TO INITIALIZE THE BOARD and TIME TO BOOT WIN10. If it takes very, very long to get past the motherboard initialization, your RAM might be at the wrong setting. If you have to wait ages for Win10 to boot, then it's most likely some driver causing issues. Either way, wipe that thing and install a clean 21H1, then install chipset + GPU + sound + LAN/WLAN/BT drivers, then update Win10 ( with optional updates; you need .NET etc. ) so it has all updates, and then proceed with your apps.

If I set my RAM to "strange" settings, my BIOS initialization can take up to 30-40 seconds to get to the BEEP... plus all the rest that comes after that. With the correct settings it goes BEEP in less than 2 seconds after hitting the power button. There are other things that can slow down the process: chipset RAID arrays and add-in boards like RAID cards, feature-rich NICs, active ISDN cards. But that is likely not your scenario with a home computer running Win10 Home.
-
Thinder, which RAM kit would you bundle with a 5900X ( I need the cores for VMware, the extra RAM too ) on the Gigabyte X570S Aorus Master ( once it's available again; it seems to have disappeared into a black hole after the first batch sold out here in Germany )? Ah, yes: 64GB, and a 500€ RAM budget. I thought about 128GB, but I think 64GB will be enough for my VMs and also future DCS needs. The price difference between 3200-14-14-14-34 and 3600-16-16-16-36 is marginal and both land around 8.8 ns latency... but I hear you shouting at me, "But the CPU doesn't like 3600 under heavy stress." And after 4 years of finest 3600-16-16-16-36 32GB RAM running at 3000-14-14-14-34 as the max it will take, I'd rather take the safe bet than be stuck with uber-RAM in a board that refuses to work properly with it. Eager to hear your input. Bit
-
From my POV, it will take at least until 2023 for DDR5 to be a practical thing to do. 2022 will be the year of high-priced and highly hyped modules, paper launches and what not. Also, to actually get faster and lower latency: in the past you had to wait about two years until DDR4 exceeded DDR3, and DDR2 surpassed DDR... in real life. I assume the same will happen with DDR4 -> DDR5.

PCIe 5 sounds cool, if it weren't so expensive for the board makers. Signal integrity once more needs to be pushed beyond current limits to double the bandwidth; it doesn't come for free. If GPUs adopted PCIe 5 and could run happily @ x8, it would mean we'd finally have a 2nd slot available again for high-bandwidth add-in boards ( NVMe, 2nd GPU, RAID card, 10-40Gb NIC, etc. ). I think until the end of 2022, nothing much will really happen with those two.

I am still battling with myself over what to buy: 5900X + 64GB ( 3200/14 or 16, or 3600/16?? ) + Gigabyte Master X570S... or a simple MacBook Air M1, 16GB, 512GB SSD... or wait for the MacBook Pro 14" M1X... heck, I am really torn, but neither the DDR4/5 nor the PCIe 4/5 question is a real factor for now. If anything, it is the M1's brutal performance and architectural advantages. The 5900X as a 12-core will make my life with VMware easier ( M1 ARM is a VMware brick ); for DCS a 6-8 core is plenty in my use case. 64GB is the minimum I want; I actually think about 128GB of 3200/16, which costs about the same as 64GB of fast 3200/3600 memory. But then 128GB and only 12 cores... might as well take the 5950X and call it a day... I am so confused about what to spend the bucks on. Where is my Wollmilchsau gaming & work PC that is portable too? ( Now google that term. ) The only thing that's sure is I am not buying a new GPU until prices come down or I can get twice the power of my 1080 Ti for less than 1k€, maybe a 5060 Ti or 9600 AMD in the far future, LoL.
-
Change AMD 5900 for intel? Advice please
BitMaster replied to markturner1960's topic in PC Hardware and Related Software
Actually, you could be right. I wish I had a 5900X here and a few RAM kits to test this. What is not good, as you said, is 4 x dual-rank. But that is also usually stated in the QVL and the CPU spec sheet; if you want to go 128GB you need 4 x 32GB single-rank, iirc. Anyway, those kits are rare and not the norm. 64+GB is imho too much for dual channel, but times have changed, and I honestly say DCS will make use of more than 32GB RAM in certain scenarios. For me, I do VMware and can make use of 64 or 128GB as long as I have enough cores and I/O; no worries there, LoL. 64GB of 3600 CL16 is very expensive anyway; for the same money you can get 128GB of 3200 CL16 or, if you want... 32GB of 3200-14-14-14-34. If I buy, I buy at least 64GB; maybe, if I go 16-core, I get the 128GB 3200-CL16 quad kit. In the next 2 months I will pull the trigger, either that or a new Apple M1/M1X. Need to spend the dough. -
Change AMD 5900 for intel? Advice please
BitMaster replied to markturner1960's topic in PC Hardware and Related Software
I honestly don't think that any kit faster than 3200 is consequently slowing down the CPU. Your promoted solution is indeed great, no doubt, but it is not the only supported config giving great results. As far as I can remember, AMD mentioned 3600 MT/s as the sweet spot, and some chips even accept 3733 to 3800 before the Infinity Fabric switches modes. I do agree this is above the specs and outside warranty; just mentioning that so I don't get nailed to the floor. If it wasn't for the money, I would prefer 3600 over 3200 any day if the CL is not off. A 3600 CL16 has 8.88 ns latency, just 0.13 ns slower than 3200 CL14 @ 8.75 ns. Balancing bandwidth gain vs. latency increase vs. cost: as both kits are in about the same ballpark price-wise ( with true CL16 or CL14 settings; if you go e.g. CL16-18-18-42 the price drops significantly ), I'd prefer the higher bandwidth and take the little latency hit.

My personal rule of thumb: memory size divided by bandwidth should land around 1 second or below, e.g. 64GB ÷ ~55-60 GB/s ≈ 1.1 s. If you have 128GB of DDR4-2133 in dual channel, you can go and get a coffee when you shuffle lots of GB around, as it takes roughly 5 seconds to read that amount of RAM even in a best-case scenario, just to give an idea. With a 4-, 6- or 8-channel config that goes waaay quicker, or boost up the MHz.

Above all, I think in the future, with Win11 and new GPUs etc., the amount of VRAM is getting more important than before, once we can load from storage directly into VRAM. Then it may also pay off if you can read fast, very fast or very, very fast. The market is forced to change, and we can call ourselves lucky if gamers with big fat x86 CPUs don't get left behind. It all moves towards SoC, ARM etc. for the main market. How much care can we expect from Intel and AMD if their main focus shifts towards SoC and the like? Apple has had such a strong impact with their M1 chip that MS is back to developing its own ARM chip, and I don't expect it to fail.
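The rule of thumb above can be put into numbers. This sketch uses theoretical peak dual-channel bandwidth ( MT/s × 8 bytes × 2 channels ); real-world throughput lands lower, which is why the 128GB case ends up closer to the roughly 5 seconds mentioned than to the theoretical figure.

```python
# Theoretical peak bandwidth for dual-channel DDR4: MT/s * 8 bytes * 2 channels.
# Real-world copy throughput is noticeably lower, so actual times are longer.
def dual_channel_gb_s(mt_per_s):
    return mt_per_s * 8 * 2 / 1000  # -> GB/s

for gb, mt in [(64, 3600), (128, 2133)]:
    bw = dual_channel_gb_s(mt)
    print(f"{gb} GB @ DDR4-{mt}: {bw:.1f} GB/s peak, ~{gb / bw:.1f} s to read it all once")
```

64GB at DDR4-3600 works out to about 1.1 s per full pass over RAM, right at the threshold; 128GB at DDR4-2133 is already several seconds even at the theoretical peak.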
Things are moving: Dell and HP have sold a lot fewer notebooks since this M1 came out. They are all forced to do something, and Apple is about to launch the next iteration in 2 months already, likely taking the crown in high-end notebooks. If the money stops coming in, things get changed that were once written in stone, and this M1 was the beginning of the end of x86 chips as we know them, and it will come fast. Once AMD and Intel have branched out to issue ARM designs as well, how much R&D will go into our aging class of x86 CPUs? Most software these days is already ARM-compatible thanks to the mobile market; there are gazillions of apps to get started with, so this is not an empty-AppStore scenario as in the early days. ...and that's why I will likely prefer a new Apple notebook over a CPU-mobo-RAM upgrade, but I have not decided yet which route I will go; maybe I will get my last x86 CPU, a 5900X... LoL. Sorry for hijacking this thread. -
Latest Benchmarks - Intel vs AMD
BitMaster replied to m335's topic in PC Hardware and Related Software
Oh well, that is hand-made with lots of sweat & time. There is no app or mode that will measure DCS' performance for you. You have to dial the settings in by hand, try to fly the same path every time, note the fps, and then rinse & repeat with a new setting, new GPU, faster RAM, etc. etc... As you can see, if you have a normal life & maybe a family on top, you usually don't have the time to do that. Also, with that many changes happening, the value of such graphs is rendered useless after a few months: new GPU drivers, DCS updates and new modules, a newer Win10 build, Win11 soon, etc. etc. I test it on my system with my settings, and that counts for my experience. If it has too few fps, I know I have to tune the LOD down etc... Graphs from others are only a rough guide to what to expect; it only takes one little overlooked setting to ruin it all. -
That's a must have Warbird imho. Preordered
-
Latest Benchmarks - Intel vs AMD
BitMaster replied to m335's topic in PC Hardware and Related Software
Benchmarks:
- 3DMark via Steam: https://store.steampowered.com/
- PassMark: https://www.passmark.com/

Stability tests:
- Intel Burn Test IBT v2.6: https://www.majorgeeks.com/files/details/intelburntest.html
- Prime95: https://www.mersenne.org/download/
- AIDA64: https://www.aida64.com/
- If you know Linux, I recommend stressapptest from Google, available via most software management tools.

Ultimate test: 14+ days, no reboot and no BSOD. -
QUOTE: "...and a new GPU equals the end of my marriage" Buy some FLOWERS for your wife.