darkman222 Posted August 21, 2019 (edited)

I recently ran a test on my PC because CPU time seems to bottleneck my 2080. I measured CPU frame time at a low and at a high monitor resolution: CPU time rose with resolution, and connecting my VR headset roughly doubles it. The correlation between higher resolution and higher CPU time is obvious in the measurements, but I don't understand the reason for it. The CPU runs the game's calculations and then hands the results to the GPU for rendering, so why does the amount of extra CPU work grow so much with resolution? It almost seems proportional.

Edited August 21, 2019 by darkman222
METEOP Posted August 22, 2019

Hmm, interesting indeed. My guess is that some texture manipulation/rendering functions are not done purely on the GPU. I've also wondered why we seem to be hitting diminishing returns on overall performance these days: raising CPU and RAM speeds or upgrading the GPU no longer makes as big a difference as it used to. Just a feeling. Are you suggesting it might be inefficient coding?

METEOP | i5-6600K OC@4.5GHz, GTX 1070 OC, 32Gb RAM, M.2 NVMe SSD, Warthog HOTAS, Saitek Rudder Pro, Trackhat Clip, 1080p projector, custom touchscreen rig, Ikarus touchscreen panel, Voice Attack, ReShade, Simshaker Aviator
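One way to reason about the VR observation in particular: a minimal toy model (not DCS internals; all numbers and names below are hypothetical) where CPU frame time is split into resolution-independent simulation work plus render-thread work that scales with the number of draw calls submitted per rendered view. Since VR renders a separate pass per eye, it roughly doubles the submission work, which matches the "CPU time doubles" observation in the first post:

```python
# Hypothetical cost model, not measured DCS behavior.
# Assumed constants: per-frame simulation cost and per-draw-call
# CPU submission cost are illustrative placeholders.

SIM_MS = 4.0              # assumed simulation cost per frame, in ms
SUBMIT_US_PER_CALL = 2.0  # assumed CPU submission cost per draw call, in us

def cpu_frame_ms(draw_calls: int, views: int = 1) -> float:
    """Estimate CPU frame time: simulation plus draw-call submission per view."""
    return SIM_MS + views * draw_calls * SUBMIT_US_PER_CALL / 1000.0

flat = cpu_frame_ms(3000, views=1)  # single monitor: one view
vr = cpu_frame_ms(3000, views=2)    # VR: one rendering pass per eye

print(f"flat: {flat:.1f} ms, VR: {vr:.1f} ms")  # prints "flat: 10.0 ms, VR: 16.0 ms"
```

Under this sketch the simulation portion stays fixed while the render-thread portion doubles with a second view, so total CPU time grows substantially but not exactly 2x; how close it gets to doubling depends on how much of the frame is submission work rather than simulation.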