Everything posted by BitMaster
-
Change AMD 5900 for intel? Advice please
BitMaster replied to markturner1960's topic in PC Hardware and Related Software
Actually, you could be right. I wish I had a 5900X here and a few RAM kits to test this. What is not good, as you said, is 4x2R (four dual-rank DIMMs). But that is also usually stated in the QVL and CPU spec sheet. If you want to go 128GB you need 4 x 32GB single-rank iirc. Anyway, those kits are rare and not the norm. 64+GB is imho too much for dual channel, but times have changed, and I honestly say DCS will make use of more than 32GB RAM in certain scenarios. For me, I do VMware and can make use of 64 or 128GB as long as I have enough cores and I/O, no worries there LoL. 64GB 3600 CL16 is very expensive anyway; for the same money you can get 128GB 3200 CL16, or if you want... 32GB 3200-14-14-14-34. If I buy, I buy at least 64GB; maybe if I go 16-core I get the 128GB 3200 CL16 quad kit. Next 2 months I will pull the trigger, either that or a new Apple M1/M1X, need to spend the dough.
Change AMD 5900 for intel? Advice please
BitMaster replied to markturner1960's topic in PC Hardware and Related Software
I honestly don't think that any kit faster than 3200 consequently slows down the CPU. Your promoted solution is indeed great, no doubt, but it is not the only supported config that gives great results. As far as I can remember, AMD mentioned 3600MHz as the sweet spot, and some chips even accept 3733 to 3800MHz before the Infinity Fabric switches modes. I do agree this is above spec and outside warranty, just to not get nailed to the floor if I don't mention it.

If it wasn't for the money, I would prefer 3600 over 3200 any day if the CL is not off. A 3600 CL16 has 8.89ns latency, just 0.14ns slower than 3200 CL14 @ 8.75ns. It's a balance of bandwidth gain vs. latency increase vs. cost. As both kits are in about the same ballpark price-wise (with true CL16 or CL14 settings; if you go e.g. CL16-18-18-42 the price drops significantly), I'd prefer the higher bandwidth and take the little latency hit. My personal rule of thumb: memory size divided by bandwidth should stay around 1 second or below, e.g. 64GB / ~55-60GB/s ≈ 1.1s. If you have 128GB of DDR4-2133 in dual channel, you can go and get a coffee when you shuffle lots of GB around, as it takes roughly 5 seconds just to read that amount of RAM, to give an idea. With a 4-, 6- or 8-channel config that goes waaay quicker, or boost up the MHz.

Above all, I think in the future, with Win11 and new GPUs etc., the amount of VRAM is getting more important than before once we can load from storage directly into VRAM. Then it may also pay off if you can read fast, very fast or very very fast. The market is forced to change, and we can call ourselves lucky if gamers with big fat x86 CPUs don't get left behind. It all moves towards SoC, ARM etc. for the main market. How much care can we expect from Intel and AMD if their main focus shifts towards SoC? Apple has had such a strong impact with their M1 chip that MS is back to developing its own ARM chip, and I don't expect it to fail.
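The latency and read-time figures above are easy to reproduce. A quick sketch (theoretical numbers only; real-world memory efficiency is lower, which is why a best-case ~3.8s read lands closer to 5s in practice):

```python
# True CAS latency in nanoseconds: CL cycles at the memory clock,
# which is half the transfer rate (DDR = double data rate).
def cas_latency_ns(transfer_rate_mts, cl):
    return 2000.0 * cl / transfer_rate_mts

# Theoretical dual-channel DDR4 bandwidth: 2 channels x 8 bytes x MT/s.
def dual_channel_bw_gbs(transfer_rate_mts):
    return 2 * 8 * transfer_rate_mts / 1000.0

print(round(cas_latency_ns(3600, 16), 2))  # -> 8.89 ns
print(round(cas_latency_ns(3200, 14), 2))  # -> 8.75 ns

# Time to read 128 GB once over dual-channel DDR4-2133:
# ~3.8 s theoretical, roughly 5 s with realistic efficiency.
print(round(128 / dual_channel_bw_gbs(2133), 1))
```

This is why a "true" 3600 CL16 kit and a 3200 CL14 kit are nearly identical in latency, and the 3600 kit wins on bandwidth.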
Things are moving. Dell and HP have sold a lot fewer notebooks since the M1 came out; they are all forced to do something, and Apple is about to launch the next iteration in 2 months already, likely taking the crown in high-end notebooks. If the money stops coming in, things that were once written in stone get changed, and this M1 was the beginning of the end of x86 chips as we know them, and it will come fast. Once AMD and Intel have branched out to issue ARM designs as well, how much R&D will go into our aging class of x86 CPUs? Most software these days is already ARM-compatible thanks to the mobile market; there are gazillions of apps to get started with, this is not an empty-AppStore scenario as in the early days. ...and that's why I will likely prefer a new Apple notebook over a CPU-mobo-RAM upgrade, but I have not decided yet which route I will go; maybe I will get my last x86 CPU, a 5900X.. LoL. Sorry for hijacking this thread
Latest Benchmarks - Intel vs AMD
BitMaster replied to m335's topic in PC Hardware and Related Software
Oh well, that is hand-made with lots of sweat & time. There is no app or mode that will let you measure DCS' performance. You have to dial the settings in by hand, try to fly the same path etc. every time, note the fps, and then rinse & repeat with a new setting, new GPU, faster RAM, etc etc... As you can see, if you have a normal life & maybe a family on top, you usually don't have the time to do that. Also, with that many changes happening, the value of such graphs is rendered useless after a few months: new GPU drivers, DCS updates and new modules, newer Win10 builds, Win11 soon, etc etc.. I test it on my system with my settings and that counts for my experience. If it has too little fps, I know I have to tune LOD down etc... Graphs from others are only a rough guide of what to expect; it only takes one little overlooked setting to ruin it all.
That's a must-have Warbird imho. Preordered
-
Latest Benchmarks - Intel vs AMD
BitMaster replied to m335's topic in PC Hardware and Related Software
Benchmarks:
3DMark via Steam https://store.steampowered.com/
Passmark https://www.passmark.com/

Stability tests:
Intel Burn Test IBT v2.6 https://www.majorgeeks.com/files/details/intelburntest.html
prime95 https://www.mersenne.org/download/
Aida64 https://www.aida64.com/
If you know Linux, I recommend stressapptest from Google, available via most software management tools.

Ultimate test: 14+ days with no reboot and no BSOD
QUOTE: "...and a new GPU equals the end of my marriage" Buy some FLOWERS for your wife.
-
Microsoft Windows 11? Errrr. Perhaps not.
BitMaster replied to Thinder's topic in PC Hardware and Related Software
There won't be 100% compatibility, that is the only thing that is 100% certain. With the right HW you can get pretty damn close to fire & forget when it comes to the basic stuff: CPU, chipset, audio, LAN, WLAN, BT. GPU is usually not a big issue, but sometimes you have to fiddle with the version of the Nvidia driver you are using to get it working (I am really happy to have an iGPU just for this reason). For gaming devices, it all depends on whether the sources are open and whether the community can build a driver, or not.
-
The trend towards 64GB in DCS is slowly arriving with Syria and the newer modules. If I were you, and if you can afford it, go 64GB right away. It may be very tricky or impossible to add another 32GB a year later. I would personally prefer an AMD these days, but the above-mentioned HW is fine as well. Meanwhile, we have choices.
-
Will the M1X drag players to Mac ?
BitMaster replied to BitMaster's topic in PC Hardware and Related Software
I have no idea what language he's speaking in that video.
Will the M1X drag players to Mac ?
BitMaster replied to BitMaster's topic in PC Hardware and Related Software
As far as I could read up, that version of Windows never made it to release. I am honestly not a big fan of messing too much with a Windows install. The more functions and apps you have installed, the more dependencies you have, and it then takes only little things to break big things. I would consider it without question with one of my VM Win10's. Provide me a link and I will happily spend some time with it.

It's actually not about OS stability foremost, that is only a nice extra; the big thing is graphics power combined with very high IPC and multi-core performance as well, all that with much less wattage/heat and very likely for a fraction of the cost. I mean, right now you can get a very capable MBP 16" with some real extras on top for the price of a 3090. With a sober mind and hard-earned cash you will think twice if alternatives arise. I am confident DCS won't move to macOS-ARM before I die, that's not my topic. It is more or less the whole home computing market shifting towards ARM in general IF Apple can deliver as expected with the M1X; the M1 is already a big hitter even though it's a first. For 90% of the people I know (and whose little PC issues I fix) a MacBook Air M1 would be a blessing in every aspect. It roughly pulls equal with an R5 5600X at a fraction of the energy and cooling needed.

Now imagine Apple manages to really succeed with the M1X across a variety of their computers and mobile devices as well. It's the same damn die all over; they won't need 5 different dies and sockets etc... they need just ONE. Yield will be much higher, cost significantly lower, and if the rumors are right, tack a few of them together and you get a 40-core CPU. That will hunt in Threadripper territory, a la Mac Pro or high-end iMac class devices, which usually have high-core-count workstation CPUs. If Apple manages to use one and the same core across most of their devices, it will be a winning strategy, and others, whether they like it or not, will have to do ""something"" about it.
I think the wrong answer would be to follow the x86 road for much longer. Imho, the future is ARM: multiple cores split between high performance and high efficiency (8+2, 16+4, 32+8 etc..), combined with a multi-core iGPU, all tied to the same nearby DRAM. Things would need to change; there would be no more empty mainboard to buy. Likely they would come as a board+CPU/GPU+RAM combo, is what I could imagine. As it already is nowadays with highly energy-efficient devices, sockets and pins are a thing of the past in this regard. With the green idea behind it, Net-Zero America or the equivalent movement in Europe, tightened regulations, all of that points in the same direction. In contrast, Intel's latest Alder-Lake-S insanity with ~230W TDP for an 8-core feels like "Processic Park II, The Revival of the Dead", a real hot movie if you ask me. Whatever comes out of it, it will affect the CPU landscape significantly and thus, through the backdoor, force software companies to again take care of ARM compatibility. It's nothing new; it's already done for billions of ARM-based mobile devices, likely outnumbering x86 devices already.
Don't beat me now, LoL. Looking at the assumed numbers for the coming successor of the M1 chip, the M1X, it looks like we are watching a paradigm shift happening silently. When that new MBP with a 40-core CPU / 32-core GPU plays at 4K+, incl. RT, at a MUCH lower TDP and also a MUCH MUCH lower price, I could see that happening. Actually, I was planning for an AMD Ryzen and maybe a MacBook Air M1; hey, it looks like I will stick to my rig and wait to see what the new MBPro offers. When it comes to quality, I can only say my mid-2012 MBP Retina still runs great. Never had an issue in almost a decade, never ever did it let me down or fail to wake from sleep, etc etc etc... I could list a ton of stuff that my Windows machines sometimes do and the Mac never did. That alone makes me want Apple and pick Windows only if there is no other way. In the time I have owned the MBP I went through more than a handful of gaming rigs: 2700K, 6700K, 7700K, 8700K, 980 GTX, 1080 Ti, etc. ..and each one of those gave me more grey hair. I am really looking forward to this. It can't hurt to stir the gaming hardware market up a bit. It will take time to adapt to ARM, but I think the age of x86 is coming to an end in the next 5 years.
-
I cannot deny the truth above, LoL. So far it runs ok in a virtual machine with nothing to do. On a workhorse machine it may look totally different.
-
Depending on your CPU cooler, you can get some better frametimes if you overclock the CPU towards 5GHz, which many 8700Ks are capable of. As a guideline, the upper voltage limit should be around 1.35V for the CPU, and when you do stress tests, keep an eye on the watts (HWinfo); you can go well north of 200W and the heat will spike like mad. A good start could be MCE, MultiCore Enhancement, which will likely put all cores to a static 4.7GHz at around 1.35V. You can use that (activate it in the BIOS) to get a feeling for how your CPU clocks. It's a very time-intensive thing if you are new to OC. Take your time, watch YT vids and don't hesitate to ask.
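As a rough sanity check for why the watts spike so hard when you overclock: dynamic CPU power scales roughly with frequency times voltage squared. This is a back-of-the-envelope model, not a measurement, and the baseline numbers below are assumptions for illustration:

```python
# Dynamic power scales roughly as P ~ f * V^2 (switched capacitance assumed constant).
def scaled_power(p_base_w, f_base_ghz, v_base, f_new_ghz, v_new):
    return p_base_w * (f_new_ghz / f_base_ghz) * (v_new / v_base) ** 2

# Hypothetical baseline: 8700K at 4.3 GHz all-core, 1.20 V, ~95 W under load.
# Pushing to 5.0 GHz at 1.35 V:
print(round(scaled_power(95, 4.3, 1.20, 5.0, 1.35)))  # -> 140 (watts, roughly)
```

A ~16% clock bump at higher voltage lands near +50% power, and stress tests like prime95 push even higher, which is why cooling matters so much here.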
-
Find cause for crazy high CPU frametime?
BitMaster replied to Donglr's topic in PC Hardware and Related Software
I wouldn't buy new RAM either unless really forced to. Enjoy
Exactly! I will not pay 2-3k€ to get a great GPU. Actually, I just got a call today from one of my nephews asking if he could upgrade his GPU etc... Guess what I told him: stick to your GPU, and if it all goes south, get a new console. This situation ruins a lot; the damage just doesn't show up yet, but it will. I am pretty sure there will be a measurable shift towards consoles, away from PC, if the prices don't come down for everyday people with normal budgets and family responsibilities. I just can't pay 3k€ and tell my kids it's gonna be noodles and ketchup for the next 12 weeks.
-
If you have the KEY it should work, but many PCs I serviced didn't have a key sticker, or didn't have it anymore. It will work and activate as long as you keep the same HW, as MS ties that HW checksum to the KEY your Win10 uses; you don't need any MS account for that. Just when you change HW the hassle starts... Things might work differently if you live in a different country with differing laws etc. I can only speak about Germany, and partly about US-owned machines operated in Germany (like a NATO soldier's private PC, which would actually fall under the SOFA agreement iirc... now it gets complicated.. and I am a PC guy, not a lawyer)
-
Find cause for crazy high CPU frametime?
BitMaster replied to Donglr's topic in PC Hardware and Related Software
Good move with the 5600X! I don't think it will disappoint you. It runs circles around my enthusiastically cooled 5GHz 8700K even w/o PBO2 engaged and with the stock AMD cooler. It should give you a good boost forward.

For the RAM: once you can run it stable at 3200/CL16 without any other manual OC, you can try to up the volts to say 1.385-1.40V and lower the CL to 14. It's trial and error and may not be worth it, but it's a good way to waste dozens of hours, and maybe you can get it to 3200-14-14-14-34-1T @ 1.38-1.40V. That's what I would do if it runs XMP/DOCP just fine. If it doesn't, you have to find the correct settings manually or buy new RAM. I am also keeping my RAM, 3600-16-16-16-36-2T 32GB/4x8, for my planned 5900X. Only if it doesn't run properly will I buy new RAM... and if I have to buy, I will go 64GB, likely 2x32GB if I can get those with the timings I want. RAM can be really tricky if it won't work out of the box as intended.
AMD RYZEN 7 5800X vs AMD RYZEN 9 5900X
BitMaster replied to SirWoofer's topic in PC Hardware and Related Software
IIRC the 5800X runs a bit hotter than the 5900X despite having fewer cores. If you want to exploit PBO2 you do need a serious cooler. I am actually aiming for the 5900X myself, just waiting for the "S" boards to arrive; I really want to avoid that southbridge fan.
FYI: for a DCS simmer or PC enthusiast in general, it may pay off to not only activate Win10 online (which you must do obviously) but also to register that license to your MS account, for a very simple reason: if you do NOT do that, you can easily reinstall Windows 10 on the same rig and it will activate again (again only online, which is actually not my topic here), BUT if you change MB, CPU or many other components and your Win10 decides to deactivate itself, you are locked out of your license. Only if you have registered the license to your account can you tell MS "I have changed my hardware..." with a button, log in and choose the previous PC license you want to use to activate again. Saves real money and headache. You don't have to use that online account if you don't like it afterwards; I don't use it either. Create a new admin account and wipe that online account if you wish, but secure your license across HW changes.

I did activate TPM 2.0 on my 8700K through the Asus BIOS and it worked; I can also use that function inside VMware now and have 2 Win11s installed, Beta and Dev editions, in VMware so far. Mind you, for VMware I upgraded two Win10 Pro installs; that did not need the TPM function. I enabled it a day after I installed Win11. At least for now and in VMware you don't need TPM enabled. It might be needed if you install from scratch or in later editions, I simply don't know. What I do know is that the 8700K has a built-in TPM 2.0 function via firmware.
-
Standard benchmark/record on DCS ?
BitMaster replied to dureiken's topic in PC Hardware and Related Software
I can answer your last question: many use MSI Afterburner, which you can download from Guru3d.com. It has a configurable OSD via the included RivaTuner Statistics Server (RTSS). You can, if needed, also route your HWinfo values into that OSD and literally show each and every value HWinfo is capable of on your OSD inside DCS. It has many features, the OSD is only one part of it. You can tune and overclock/undervolt your GPU with it too... it's worth spending some time with it to learn how it works, not only for DCS.

The first question is really hard to answer: the "standard" benchmark... ehhh... there is the problem. With so many updates in DCS, coupled with new GPU drivers and Windows builds, it is almost impossible to create a benchmark that has a half-life longer than 14 days tbh. Some have done really nice graphs with lots of effort, but that was before 2.7, before 2.x etc., so they have become more or less useless for present setups & versions. The best benchmark imho is your own judgement. Does it stutter? NO = good. Does it have enough fps to satisfy me? YES = good. If you are on the other side of those 2 answers, then you may need to tinker with the LOD, OC etc... and if all that doesn't bring relief, you may need to upgrade hardware
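If you do log frametimes (RTSS/Afterburner can write them out), turning a run into comparable numbers is straightforward. A sketch, assuming you have the frametimes as a plain list of milliseconds (the sample data below is made up):

```python
def summarize(frametimes_ms):
    # Average fps over the run: total frames / total seconds.
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # "1% low" fps: implied by the worst 1% of frametimes; a stutter indicator
    # that average fps completely hides.
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    low_1pct_fps = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, low_1pct_fps

# Hypothetical run: mostly ~16.7 ms frames (60 fps) with a few 50 ms stutters.
avg, low = summarize([16.7] * 197 + [50.0] * 3)
print(round(avg, 1), round(low, 1))  # avg stays near 58 fps, 1% low drops to 20
```

This is exactly the "does it stutter?" question in numbers: the average barely moves while the 1% low collapses.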
Don't panic yet! It's beta time and beta conditions too (TPM 2.0 only for now in beta; the release will work with 1.2 as well), and most boards have at least TPM 1.2 in firmware, so no need to add a TPM module iirc. If they really were to exclude the R5 1600X, 7700K etc.. from Win11... well, then MS really screwed up this time. Acceptance would be a lot lower if that were the case upon release. No, I cannot believe they will be that rough.
-
SATA SSD or PCIe NVME SSD?
BitMaster replied to Hammerhead's topic in PC Hardware and Related Software
Raid-5... I never liked that specific type. It's not bad, but imho if you can go Raid-6 then do that. Not seldom do multiple drives in an array fail within a relatively short time, and rebuild speed, global hot spare(s) etc. all factor in. Raid-6 for critical stuff, or Raid-1 or Raid-10 for the OS... and Raid-0 for pure speed, gaming etc., a la f... data security LoL
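The trade-off between the levels mentioned above boils down to usable capacity vs. how many drive failures the array is guaranteed to survive. A simplified sketch (identical drives, no hot spares; for Raid-10 only the guaranteed tolerance is shown, since surviving more failures depends on which mirrors die):

```python
# Usable capacity (TB) and guaranteed drive-failure tolerance
# for common RAID levels, given n identical drives of size_tb each.
def raid_summary(level, n, size_tb):
    usable = {
        "0":  n * size_tb,         # stripe, no redundancy
        "1":  size_tb,             # n-way mirror
        "5":  (n - 1) * size_tb,   # single parity
        "6":  (n - 2) * size_tb,   # double parity
        "10": n // 2 * size_tb,    # striped mirrors
    }[level]
    tolerance = {"0": 0, "1": n - 1, "5": 1, "6": 2, "10": 1}[level]
    return usable, tolerance

print(raid_summary("5", 6, 4))   # (20, 1): 20 TB usable, survives 1 failure
print(raid_summary("6", 6, 4))   # (16, 2): 4 TB less, but survives 2 failures
```

With six 4TB drives, Raid-6 costs you one drive's worth of space compared to Raid-5 and buys you a second failure during the (long) rebuild window, which is the whole point above.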
-
New drive, looking for advices
BitMaster replied to Gianky's topic in PC Hardware and Related Software
This is partially true. Smaller drives, usually the smallest one or two drives of a series, have too few dies to connect one to each channel of the controller, and thus much of the performance cannot be leveraged, e.g. only 4 out of 8 channels populated. The parts, controller and storage dies, are as fast as on the big TB drives, just fewer of them, and that hurts parallel I/O performance. To be honest, at this specific capacity those drives are usually the most expensive per GB as well: when you look at the 980 Pro, the 256GB is very expensive compared to the 500GB and 1TB models.
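The price-per-GB point is worth checking before you buy. A quick sketch with made-up example prices (not real quotes; plug in current street prices for the tiers you are comparing):

```python
# Hypothetical prices (EUR) for three capacity tiers of the same drive series.
drives = {"256GB": (256, 90), "500GB": (500, 120), "1TB": (1000, 200)}

for name, (gb, eur) in drives.items():
    print(f"{name}: {eur / gb:.3f} EUR/GB")
# With these example numbers, the smallest tier is by far the worst EUR/GB,
# on top of its channel-count performance handicap.
```

So the small drive often loses twice: on parallel I/O and on price per GB.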