
Recommended Posts

Posted (edited)

Although I thought and felt otherwise after my first glances and flights, it still makes a HUGE difference what speed your CPU runs at.

 

I was really satisfied with the MiG-15 performance: usually well above 100 fps, seldom in the low 80s, and only very rarely did I have to accept 60 fps for a moment or two.

 

Well, that changed when I got into the MiG-21. While taxiing to the runway my fps was more or less locked at around 60, which I found rather unsatisfying compared to what I had in the MiG-15.

I changed to windowed mode (still with G-Sync), opened my Asus tool and clicked the CPU from 3.4 up to 4.8 GHz, and that very moment my fps went to a stable mid-80s. I could reproduce the fps jump, downclock-upclock, etc... the fps value scales directly with your CPU power.

 

I then left the CPU at 4.8 GHz and went on to the GPU, which was underclocked as low as it goes in the MSI tool, all sliders to the LEFT; even then it gives 121 fps in most airframes. I clicked it up to the Asus Gaming setting = no change in fps, still those mid-80s. I went further to 1500 MHz core and 8000 MHz VRAM = same fps, if not 1-2 fps slower.
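A quick back-of-the-envelope check on those numbers (toy arithmetic of my own, assuming the game is fully CPU-bound at that moment):

```python
# Toy sanity check: if a game is fully CPU-bound, fps should scale
# roughly linearly with core clock. Numbers are the ones from above.
def predicted_fps(base_fps, base_ghz, new_ghz):
    return base_fps * new_ghz / base_ghz

boosted = predicted_fps(60, 3.4, 4.8)  # ~84.7, i.e. the mid-80s I saw
```

The predicted jump from 60 fps matches the observed mid-80s almost exactly, which is what you'd expect from a single-core bottleneck.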

 

Baseline:

Overclocking your CPU will still deliver more fps if your GPU is hungry enough. To bottleneck even a GTX 980 (not a Ti, any Titan, or SLI), a 4.8 GHz CPU is NOT ENOUGH; you'd better get liquid nitrogen cooling and 6.8 GHz to please my 144 Hz screen.

 

It's the CPU, still, and I still wonder why SMP is not even in question. The DCS exe process clearly floods any CPU around on one core, as hard as you wish or as hard as you dare to OC your rig.

 

Each and every tool that does massive processing these days, from Handbrake to iTunes, WinZip, etc., has gone SMP. They have understood that there will be no 10 GHz CPU in the foreseeable future, and that if you want to process more you need more processors, aka cores, because the cores themselves will not get much faster, only smarter and more energy efficient. Single-core performance has meanwhile come to a standstill while the wattage drops!!!
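As a rough sketch of what "going SMP" means in practice (plain Python with `concurrent.futures`; the sum-of-squares workload and the chunking scheme are invented for illustration, nothing DCS-specific):

```python
# Toy SMP sketch: split one CPU-bound job across worker processes,
# the way SMP-aware tools like video encoders do.
from concurrent.futures import ProcessPoolExecutor

def busy_work(chunk):
    """Stand-in for a CPU-heavy task: sum of squares over a range."""
    lo, hi = chunk
    return sum(i * i for i in range(lo, hi))

def run_parallel(n, workers=4):
    # Split the range into equal chunks, one per worker.
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs the remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(busy_work, chunks))
```

With enough work per chunk, wall time drops roughly with the number of cores; the catch is that the work has to be split up explicitly, which is exactly the engine rewrite being discussed here.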

 

I don't want to risk another 10% warning for stressing things ED has more than said NO to. Still, for an IT guy who is used to servers with more cores than one wants to count, it makes plainly no sense to rule it out categorically. I would never close that door, as it is the one chance for more processing speed.

 

I am no coder, but it is not hard to understand that this would mean deep changes to the code and maybe even the general layout of the exe. I don't know what it will take, but it will take far, far more to convince Intel to give us faster cores! They will rather give us 16 cores at 2.4 GHz than 4 cores at 12 GHz.
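That trade-off (many slow cores vs. a few fast ones) is exactly what Amdahl's law describes. A hedged back-of-the-envelope, where the 90% parallel fraction is purely my assumption:

```python
# Amdahl's law: speedup over one core when only part of the work
# can be parallelized. The 90% parallel fraction is an assumption.
def speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

def effective_ghz(base_ghz, cores, parallel_fraction):
    """Rough 'single-core-equivalent' clock for this workload."""
    return base_ghz * speedup(parallel_fraction, cores)

wide = effective_ghz(2.4, 16, 0.90)   # 16 slow cores: ~15.4 "GHz"
fast = effective_ghz(12.0, 4, 0.90)   # 4 fantasy cores: ~36.9 "GHz"
```

The flip side: with a mostly serial engine (parallel fraction near zero), all the extra cores buy almost nothing, which is why a single fast core matters so much today.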

 

 

I would love to use 144 Hz, but with those 4.8 GHz alone you don't even get close. Add a few explosions and ground units and 2 rotaries and you are back to 27 fps (happened last night in the Mi-8), which is more than disappointing.

 

 

....looking for an LN bottle and a How-To PDF, LoL. I WANT 144 Hz flooded !!!!!

Edited by BitMaster

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted

Even though the engine now relies much more on the GPU, it's still a fact that a GPU cannot do its job without a CPU that feeds it jobs to do.

 

Google "API overhead" and start reading, and you will understand the relationship between CPU and GPU much better.
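A minimal model of that relationship (my own toy numbers, loosely matching the figures in this thread): with CPU and GPU working in a pipeline, the slower stage sets the frame rate.

```python
# Toy pipelined frame-time model: the slower of the two stages
# (CPU submission vs GPU rendering) dictates the frame rate.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 1000.0 / 121          # a GPU capable of 121 fps on its own
cpu_bound = fps(12.0, GPU_MS)  # ~83 fps: the CPU caps the GPU
gpu_bound = fps(6.0, GPU_MS)   # 121 fps: now the GPU is the limit
```

Halving the CPU time per frame (e.g. via a much higher clock) lets the GPU's own limit finally show, which is the pattern described in the opening post.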

 

Also, there's no problem here. We are seeing great performance. You are bringing up a discussion on something that is a non-issue.

Posted (edited)

That is some bug report, BitMaster...

You have a severe "G-Sync syndrome". I fear you just realized that your system is aging and is only as fast as your slowest PC part.

 

Every sim-player always wants next-generation hardware for today's games. I doubt ED can help you, although... maybe this can be moved to chit-chat.

 
:smilewink:

Edited by piXel496
bazinga
Posted

Oh boy, someone isn't happy with 60 FPS. Like 60 FPS is for the plebs. Quickly ED, write completely new code so the prince can get 300 FPS.

Posted
Oh boy, someone isn't happy with 60 FPS. Like 60 FPS is for the plebs. Quickly ED, write completely new code so the prince can get 300 FPS.

 

LMAO, maybe he has robot eyes. And let him mess up his PC's lifespan with extreme OC. Not to mention that the human eye cannot tell the difference past 60 FPS. 4K UHD and HD videos are 60 FPS. Technically anything higher than 60 FPS is currently useless to the human eye, but 120 FPS keeps the eyes from seeing dips in FPS within video creation.

Thanks,
Lt. Commander Jason "Punisher" M

Hardware:
i7 10700K 5 GHz Quad Core, Water-cooled , 32GBs 2400 DDR4 RAM, MSI Intel Z470A GAMING MB, MSI RTX 3080 GPU W/10GBs GDDR6X, 512GB NVME.2 SSD, 1TB NVME.2 SSD, 2TB External SSD, 2 512Gb SSD's & 1 350 Gb HARDDRIVE, WinWing Orion 2 Stick Base and Throttle Base, Quest 2, Windows 11 (64bit)

Posted
LMAO, maybe he has robot eyes. And let him mess up his PC's lifespan with extreme OC. Not to mention that the human eye cannot tell the difference past 60 FPS. 4K UHD and HD videos are 60 FPS. Technically anything higher than 60 FPS is currently useless to the human eye, but 120 FPS keeps the eyes from seeing dips in FPS within video creation.

 

Well that's a load of bullshit :smilewink:

 

Yes, over-clocking theoretically shortens the lifespan of components. In reality, even highly over-clocked components will last long enough to be completely obsolete by the time they stop working.

 

Oh, so you believe in the 60 fps myth, huh? This is an old myth created long before high-refresh-rate monitors were widely available. Anyone with a 120+ Hz monitor who has done even a little bit of gaming will tell you that you are wrong. Not only that, but the benefits of a high-refresh-rate monitor can easily be proven in blind tests.

 

In DCS the benefits of going beyond 60 fps are not huge, but anyone who uses a TrackIR 5 will easily feel the difference in latency when moving their head. Latency in head tracking is distracting, and going just from 60 fps to 75 fps makes for a much better experience.
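The latency difference is just frame-interval arithmetic (assuming, as a simplification, one frame of tracking latency):

```python
# Frame-interval math behind the head-tracking latency point.
def frame_ms(fps):
    return 1000.0 / fps

saved_75 = frame_ms(60) - frame_ms(75)    # ~3.3 ms shorter interval
saved_144 = frame_ms(60) - frame_ms(144)  # ~9.7 ms shorter interval
```

A few milliseconds per frame sounds small, but head tracking amplifies it because your eyes are comparing the view directly against your own head motion.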

 

Oculus decided to use 90 Hz displays in their Rift VR headset because they found that 60 Hz makes people dizzy when wearing a VR headset. That pretty much says it all.

Posted

Here's a blind test of 60 Hz vs. 120 Hz, just in case you're curious, and just to prove I'm not making things up.

 

Personally I use a 144 Hz monitor and I know how easy it is to spot the difference. It doesn't even take me a second to know what refresh rate I'm running at. 60 Hz is sort of a gold standard, or minimum requirement, to have a good time in most games, but something like Counter-Strike feels like a slideshow at 60 Hz once you've gotten used to 144 Hz.

Posted
Add a few explosions and ground units and 2 rotaries and you are back to 27 fps (happened last night in the Mi-8), which is more than disappointing.

 

This is the real issue. DCS is not a "cruise from A to B and enjoy the scenery" game, it's a combat simulation. This shouldn't happen.

Posted (edited)

I am not complaining about my fps; it's higher than expected, and the 27 fps in the Mi... well.

 

The reason I type this is the servers, the ones that will have to host 64+ player missions on one core.

 

 

And yes, 144 Hz is nice with TIR5. If the 4790K were faster I'd have it; I've built it a few times, but paying 500€ to end up with 5 fps less is not a good thing. They also hate overclocking; the heat spreader is of the lesser kind.

 

 

I am with Brisse: those who haven't seen it may not believe it. My son won't play BF4 on his (not slow) machine anymore... because mine rocks it at 141 fps, all maxed out at 1440p. He says his locked 60 Hz looks like shit... and he's 14!

 

It's not the client, it's the server (and maybe the Mi-8 with the 27 fps, though I don't know what caused that).

I am not aware of a SINGLE server out there that can run 64+ players without performance issues. Are you?

Edited by BitMaster

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted

The issue is still the same from DX9 to DX11.

 

DirectX 11 offers REDUCED CPU overhead, not ELIMINATED CPU overhead.

 

Rendering 100,000+ objects (terrain, aircraft, trees, traffic, fences, etc.).

 

The more CPU power you have to process those draw calls, the cleaner and more efficiently those calls are fed to the GPU.
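To put rough (entirely invented) numbers on it: per-draw-call CPU overhead multiplied across that many objects is what eats the frame budget, and a higher clock shrinks it.

```python
# Invented numbers: CPU time spent just submitting draw calls,
# assuming the per-call overhead scales inversely with clock speed.
def cpu_submit_ms(objects, us_per_call, ghz, baseline_ghz=3.4):
    return objects * us_per_call / 1000.0 * (baseline_ghz / ghz)

at_stock = cpu_submit_ms(100_000, 0.12, 3.4)  # 12 ms of pure submission
at_oc = cpu_submit_ms(100_000, 0.12, 4.8)     # same calls, ~8.5 ms
```

With a 60 fps budget of roughly 16.7 ms per frame, 12 ms of submission alone leaves very little for everything else, which matches the CPU-scaling behaviour reported above.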

Windows 10 Pro, Ryzen 2700X @ 4.6Ghz, 32GB DDR4-3200 GSkill (F4-3200C16D-16GTZR x2),

ASRock X470 Taichi Ultimate, XFX RX6800XT Merc 310 (RX-68XTALFD9)

3x ASUS VS248HP + Oculus HMD, Thrustmaster Warthog HOTAS + MFDs

Posted (edited)
The issue is still the same from DX9 to DX11.

 

DirectX 11 offers REDUCED CPU overhead, not ELIMINATED CPU overhead.

 

Rendering 100,000+ objects (terrain, aircraft, trees, traffic, fences, etc.).

 

The more CPU power you have to process those draw calls, the cleaner and more efficiently those calls are fed to the GPU.

 

 

 

Exactly. Now take into consideration that this 12.5% CPU load (one core out of eight fully busy) could spread among more cores, let's assume 8 cores (just an example for the math); then you would have done ALL the processing in 1/8th of the time, having all those calculations a lot closer together.

 

At 4.8 GHz, one core still needs 1 real-time second to run around the clock. If I can spread this to more "runners", each one only has to run a fraction of the time.

 

It's not for more fps; actually it's not about fps at all. It is to avoid calculations that are torn out of the timeline, causing jitter/stutter/whatever.
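A toy frame-budget check makes the point (perfect scaling assumed, which no real engine ever reaches):

```python
# Does the per-frame CPU work, split across N cores, still fit
# the 60 fps budget of ~16.7 ms? Perfect scaling is assumed.
FRAME_BUDGET_MS = 1000.0 / 60

def fits_budget(work_ms, cores):
    return work_ms / cores <= FRAME_BUDGET_MS

heavy_frame = 20.0                         # a spike: explosions, AI, etc.
one_core = fits_budget(heavy_frame, 1)     # False: the frame is late, stutter
eight_cores = fits_budget(heavy_frame, 8)  # True: 2.5 ms, deadline met
```

The same total work either blows the deadline on one core or fits comfortably when spread out, which is exactly the jitter-vs-smoothness argument, independent of the average fps number.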

Edited by BitMaster

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted

In my opinion, multithreading is the way to go to make better use of today's (and tomorrow's) hardware. But apart from the fact that it is complicated to implement, for ED to adopt this technology would mean a complete rewrite of the engine, which of course is not financially feasible.

Windows 10 64bit, Intel i9-9900@5Ghz, 32 Gig RAM, MSI RTX 3080 TI, 2 TB SSD, 43" 2160p@1440p monitor.
