DX12 Shenanigans



So, while scrutinizing more closely the claims and counterclaims regarding Kaby Lake vs. Ryzen, I came across what was for me new and troubling information. Apparently, the driver framework that MS uses for DX12, Windows Display Driver Model 2.0, forces the GPU to use the Windows 10 graphics compositing engine, so that the display will support the various UI overlays that MS wants to integrate across all devices and platforms, such as a recording bar and on-screen keyboard.

 

From the article I found:

 

Down the road, it appears that Microsoft thinks that running all games through the compositing engine will allow for unique features and additions to PC games, including multi-plane overlays. Multi-plane overlays allow two different render screens, one with the 3D game and another with the UI, for example, to be rendered at different resolutions or even updated at different rates, merging together through the Windows engine. Pushing games through the MS Windows engine will also help to improve on power efficiency, a trait that is more important as PCs [move] into the realm of mobile devices. It is laudable that MS wants to improve the PC gaming experience and bring some unique features from the Xbox to the PC – we just have questions on how it will be done and if they will be sacrificing some of what makes the PC, "the PC" to get it done. https://www.pcper.com/reviews/General-Tech/PC-Gaming-Shakeup-Ashes-Singularity-DX12-and-Microsoft-Store
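Conceptually, that multi-plane merging can be sketched with a toy compositor loop. A minimal sketch in Python, with made-up update rates for each plane (this is not the actual DWM/DXGI machinery, just the idea of independently-updating planes merged at scanout):

```python
# Toy model of multi-plane composition: the game plane and the UI plane
# update at independent rates, and the compositor merges whichever frame
# each plane last produced at every scanout. Rates are hypothetical.

def composite(duration_s, scanout_hz=60, game_hz=30, ui_hz=60):
    frames = []
    for tick in range(int(duration_s * scanout_hz)):
        t = tick / scanout_hz
        game_frame = int(t * game_hz)   # latest frame the game plane finished
        ui_frame = int(t * ui_hz)       # latest frame the UI plane finished
        frames.append((game_frame, ui_frame))
    return frames

merged = composite(1.0)
# In one second the compositor scans out 60 merged images, but the game
# plane contributes only 30 distinct frames while the UI contributes 60.
distinct_game = len({g for g, _ in merged})
distinct_ui = len({u for _, u in merged})
```

The point of the model: the screen refreshes at one rate while each plane is free to update at its own, which is exactly the flexibility the article describes.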

 

At this stage in DX12's development, utilizing the OS compositing engine has some peculiar effects on display output, like automatically capping the framerate at 60 FPS, no matter the refresh rate of the monitor or the in-game V-sync settings.

 

The benchmark testing the author conducted on "Ashes of the Singularity" revealed that AMD's GPU drivers were achieving this by simply dropping frames whenever the GPU's output exceeded 60 FPS, similar to how some low-end displays claim to be able to "overclock" their refresh rates. Thus, the in-game benchmark reported much higher framerates than were actually output to the screen. Nvidia avoided this result by using a workaround in DX12 that bypasses OS compositing and controls the display directly. The aforementioned article explains all of this in full detail.
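The discrepancy the author measured is easy to model: a fixed-refresh display can only present one frame per refresh interval, so everything rendered above that is dropped while the in-game counter keeps climbing. A rough sketch with hypothetical numbers (not AMD's actual driver logic):

```python
# Sketch of why an in-game FPS counter can report far more frames than
# the display shows: the game renders freely, but a 60 Hz scanout can
# only present one frame per refresh; the surplus is silently dropped.

def presented_frames(rendered_fps, refresh_hz, duration_s=1.0):
    rendered = int(rendered_fps * duration_s)
    refreshes = int(refresh_hz * duration_s)
    presented = min(rendered, refreshes)  # at most one frame per refresh
    dropped = rendered - presented
    return rendered, presented, dropped

rendered, presented, dropped = presented_frames(rendered_fps=144, refresh_hz=60)
# A benchmark counting rendered frames would report 144 FPS here,
# even though only 60 of those frames ever reach the screen.
```

This is why "benchmark FPS" and "frames actually displayed" can diverge so badly once the compositor enforces the cap.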

 

In my view, MS is using DX12 as part of a strategy to force all games built for Windows to behave like Xbox console games, no matter what platform they're intended to be played on. This only increases my distaste for the Universal Windows Platform and my skepticism of its goals. I truly wish there were a realistic alternative.

 

In addition, all of this underscores for me the importance of deconstructing the hype surrounding metrics such as FPS, thread count, and core count and closely examining how any given architecture impacts the experience of the end-user in real-world terms.

 

Maybe in the near future, all of this will be moot, as we'll all be running around sporting all-in-one VR visors or some such, but moves like this make me feel like the era of PC building for anything other than boutique purposes is quickly coming to a close.

EVGA GeForce GTX 1070 Gaming | i5 7600K 3.8 GHz | ASRock Z270 Pro4 | Corsair Vengeance LPX DDR4 3200 16 GB | PNY CS2030 NVMe SSD 480 GB | WD Blue 7200 RPM 1TB HDD | Corsair Carbide 200R ATX Mid-Tower | Win 10 x64

really, all those "numbers" come down to... how well do the games you play run with X CPU at X frequency on X motherboard with X RAM at X frequency and X video card at X frequency...

 

anyone who just takes random benchmarks of games nobody plays, made by someone who's trying to sell you something, not help you, but sell their stuff to you, is an idiot...

 

I will believe AMD has a better product than Intel when I see the new i9 give lower performance in benchmarks of the games I actually play...

 

price has nothing to do with what's better; price has to do with what you get for what you pay... a CPU that costs $1000 but is only 2% faster than a CPU that costs $500 is still the better CPU.

 

the big thing is the games you play, the programs you use on a daily basis... not just random benchmarks...

My youtube channel Remember: the fun is in the fight, not the kill, so say NO! to the AIM-120.

System specs:ROG Maximus XI Hero, Intel I9 9900K, 32GB 3200MHz ram, EVGA 1080ti FTW3, Samsung 970 EVO 1TB NVME, 27" Samsung SA350 1080p, 27" BenQ GW2765HT 1440p, ASUS ROG PG278Q 1440p G-SYNC

Controls: Saitek rudder pedals, Virpil MongoosT-50 throttle, WarBRD base, CM2 stick, TrackIR 5 + Pro Clip, WMR VR headset.


Lol, you'd be a fool to buy a 2% faster CPU for double the price!

 

Fast CPUs are not the problem anymore.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


Lol, you'd be a fool to buy a 2% faster CPU for double the price!

 

Fast CPUs are not the problem anymore.

 

it was just an example... an exaggeration to make the point...

 

you could have something that costs twice as much as a competitor but be worse, too... shoes are a good example of that... a $70 pair of steel-toed boots will last far longer than a $400 pair of Nikes...

 

just trying to say price and performance are not the same thing, and not linked in any way, shape, or form... if you have the money, more performance is more performance; some people seem to forget that now that AMD has something closer to Intel IPC-wise...

 

even if Intel and AMD CPUs had exactly equal IPC, or AMD even had slightly better IPC than Intel, the new Intel CPUs still clock to 4.5+ GHz and support quad-channel, much higher frequency memory... we shall see what more cores does for Intel soon™


Edited by Hadwell


in the GPU market, there are graphics cards that sell for double the next level down, and the only place you'll see any increase is in benchmark numbers.

Windows 10 Pro, Ryzen 2700X @ 4.6Ghz, 32GB DDR4-3200 GSkill (F4-3200C16D-16GTZR x2),

ASRock X470 Taichi Ultimate, XFX RX6800XT Merc 310 (RX-68XTALFD9)

3x ASUS VS248HP + Oculus HMD, Thrustmaster Warthog HOTAS + MFDs


If budget efficiency is your goal (and I think it has to be, no matter the size of your budget), then you always want to optimize the price/performance ratio, i.e. get the best performance possible for the lowest price possible within your budget, bearing in mind that "best performance" is relative to your personal goals. If bragging rights are your goal, then by all means spend away on a benchmark-crushing beast. If task-specific performance is your goal, find the system that's best suited to that task, for the lowest price.
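As a rough sketch of that budget-efficiency idea, with invented part names, prices, and scores (none of these are real benchmark figures):

```python
# Pick the best performer within a budget, and compute perf-per-dollar
# to see why a marginal gain at double the price is a poor value.
# All names, prices, and scores below are invented for illustration.

cpus = [
    {"name": "CPU A", "price": 250, "score": 100},
    {"name": "CPU B", "price": 500, "score": 140},
    {"name": "CPU C", "price": 1000, "score": 143},  # ~2% over CPU B
]

def best_within_budget(parts, budget):
    affordable = [p for p in parts if p["price"] <= budget]
    return max(affordable, key=lambda p: p["score"]) if affordable else None

def perf_per_dollar(part):
    return part["score"] / part["price"]

pick = best_within_budget(cpus, budget=600)
# With a $600 budget the pick is CPU B; CPU C wins on raw score,
# but its perf-per-dollar is far below either cheaper option.
```

The same arithmetic covers the earlier point in the thread: unconstrained, the $1000 part is "better", but as a ratio it is usually the worst buy on the list.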


GPU prices are gonna spike; Bitcoin jumped to $3000.


Wow, damn!

 

If those miners weren't so expensive! Have a pal that runs a whole cellar full of them, lol.


Wow, damn!

 

If those miners weren't so expensive! Have a pal that runs a whole cellar full of them, lol.

 

My GPUs are worth an estimated $120 now (I paid $700 for one, and $300 each for the 2nd and 3rd Lightning).

 

I can literally sell them for $650 each to miners right now.


My GPUs are worth an estimated $120 now (I paid $700 for one, and $300 each for the 2nd and 3rd Lightning).

 

I can literally sell them for $650 each to miners right now.

 

Hell yes, and DO IT! I have read from guys who sold an RX 480 and got a 1070 instead, plus cashback for a big pizza!!! They are crazy for those Radeons now.

 

I have been reading the HardOCP forum today and they say eBay is going nuts with Radeon cards; they sell for astronomical prices now that Bitcoin is so high. GPUs seem to make sense again with such a high exchange rate.

 

From what I know, those GPUs were useless 6 months ago, as the dedicated miners are so superior in BTC-per-watt terms, but as the exchange rate goes up, GPUs may pay off again.

 

 

If electricity were free, I'd run them all, HAHA!


Afaik not that much. Mining has always been an AMD/ATI realm.


AMD is going to release stripped-down Vega cards as "mining AIBs".


From what I know, those GPUs were useless 6 months ago, as the dedicated miners are so superior in BTC-per-watt terms, but as the exchange rate goes up, GPUs may pay off again.

 

Doesn't seem logical. Sure, with BTC prices higher there's more chance to make a profit after paying for the electricity, but surely you'd make even more profit using a dedicated miner, which uses less power; so why not spend the £650 on those and avoid tying up your PC?
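The power argument is just arithmetic: daily profit is coin revenue minus electricity cost, so at equal revenue the lower-wattage device always keeps more. A sketch with invented figures (not real hashrates, revenues, or tariffs):

```python
# Daily mining profit = coin revenue - electricity cost.
# All figures below are illustrative only, not real hardware specs.

def daily_profit(revenue_per_day, watts, price_per_kwh):
    kwh_per_day = watts * 24 / 1000
    return revenue_per_day - kwh_per_day * price_per_kwh

# Same hypothetical daily revenue, very different power draw:
gpu_profit = daily_profit(revenue_per_day=3.00, watts=250, price_per_kwh=0.15)
asic_profit = daily_profit(revenue_per_day=3.00, watts=60, price_per_kwh=0.15)
# The dedicated miner keeps more of the same revenue, which is the
# "why not buy dedicated hardware" point above. A rising exchange rate
# lifts revenue_per_day for both, which is what pulls GPUs back above
# their electricity cost.
```

It also shows why GPUs flip from "useless" to "profitable" as the exchange rate rises: the electricity term is fixed, while the revenue term scales with the coin price.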

 

What do I know, though! I came in too late a few years ago to make any profit mining with my 6950, and then invested in the wrong currency and lost money. I made more profit betting on the World Cup.

Main rig: i5-4670k @4.4Ghz, Asus Z97-A, Scythe Kotetsu HSF, 32GB Kingston Savage 2400Mhz DDR3, 1070ti, Win 10 x64, Samsung Evo 256GB SSD (OS & Data), OCZ 480GB SSD (Games), WD 2TB and WD 3TB HDDs, 1920x1200 Dell U2412M, 1920x1080 Dell P2314T touchscreen

