MIghtymoo Posted June 7, 2023 I have recently gotten a used RTX 4090 (still expensive) and have an Intel i5 11600K CPU. The CPU has a stable overclock on all cores at 4.9 GHz. I am running DCS capped at 60 FPS in 4K with everything maxed out. I get occasional drops on heavily scripted multiplayer servers, but it is otherwise smooth on all maps in SP and MP, even Normandy 2.0. Is there much to gain from going to a newer 13th generation CPU for DCS compared to my overclocked 11th generation?
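For context on why a 60 FPS cap can still show drops: at 60 FPS every frame has to fit in roughly 16.7 ms, so any scripted spike that pushes the CPU frame time past that budget shows up as a stutter no matter how strong the GPU is. A minimal sketch of the arithmetic, with made-up per-frame costs (none of these numbers are DCS measurements):

```python
# Frame-budget arithmetic for a capped frame rate.
# All per-frame CPU costs below are hypothetical, for illustration only.

FPS_CAP = 60
frame_budget_ms = 1000.0 / FPS_CAP  # ~16.7 ms per frame at 60 FPS

quiet_cpu_ms = 9.0   # made-up CPU frame time in a quiet mission
spike_cpu_ms = 20.0  # made-up CPU frame time during a heavy scripted event

for label, cpu_ms in [("quiet", quiet_cpu_ms), ("scripted spike", spike_cpu_ms)]:
    fps = min(FPS_CAP, 1000.0 / cpu_ms)
    verdict = "over budget -> visible drop" if cpu_ms > frame_budget_ms else "within budget"
    print(f"{label}: {cpu_ms:.1f} ms CPU -> {fps:.0f} FPS ({verdict})")
```

A faster CPU does not raise the cap; it only widens the headroom before a scripted spike blows the 16.7 ms budget.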
Glide Posted June 7, 2023 This is a good video explaining what impacts the CPU in game. Some multiplayer servers have a lot going on, which impacts the ability of the CPU to keep up. I think you are good with your 11th gen. It's not you, it's the server.
MIghtymoo Posted June 8, 2023 Thanks for the feedback and the video link. Based on this I will hold off on a CPU upgrade for now.
darkman222 Posted June 8, 2023 Of course the way the mission is built has a huge impact on CPU frame time. But what happens with the same mission and the same server on a better CPU? Quite some time ago I was hoping to get better frame times by running my single player mission on a second PC. So I turned an SP mission into an MP mission running on a separate PC, with my gaming PC as the client. It turned out that CPU frame times stayed almost the same. So every client has to do its own calculations, which sounds very, very unoptimized for a multiplayer game. Edit: I ordered a new CPU. I hope I can do some research on this exact topic when I switch to the new CPU.
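darkman222's result makes sense if DCS-style multiplayer uses client-side simulation: the server mostly distributes events, and every client still computes the full world locally. A minimal sketch of that idea, assuming such an architecture (all names and numbers are hypothetical, not actual DCS internals):

```python
# Toy model of client CPU cost per frame under two multiplayer designs.
# All costs are made-up illustrative numbers, not DCS measurements.

AI_COST_MS = 6.0       # hypothetical per-frame AI/sensor simulation cost
PHYSICS_COST_MS = 4.0  # hypothetical per-frame physics cost
RENDER_COST_MS = 3.0   # hypothetical per-frame render-thread CPU cost

def client_frame_ms(client_side_sim: bool) -> float:
    """CPU milliseconds the *client* spends each frame."""
    cost = RENDER_COST_MS
    if client_side_sim:
        # Each client recomputes AI and physics for the whole world,
        # so hosting the mission on another PC removes nothing.
        cost += AI_COST_MS + PHYSICS_COST_MS
    return cost

print(client_frame_ms(client_side_sim=True))   # 13.0 ms: SP, or MP where every client simulates
print(client_frame_ms(client_side_sim=False))  # 3.0 ms: what a server-authoritative design could offload
```

Under the client-side model the client's frame time is the same whether the mission nominally runs locally or on a dedicated server, which matches the unchanged frame times in the experiment above.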