
Posted

No.

 

The key difference, as that article actually explains, is that AMD's solution relies on eDP, which common desktop monitors do not use. As the article also states:

 

"As it stands today, the only way to get variable refresh gaming technology on the PC is to use NVIDIA's G-Sync enabled monitors and GeForce graphics cards."

 

G-Sync will probably become obsolete at some point in the future, but that's a different matter. Monitors with G-Sync give you the capability today, rather than making you wait for DP 1.3-enabled cards and monitors.


Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted

Both of them are still gimmicks to me... just like dedicated PPUs.

Windows 10 Pro, Ryzen 2700X @ 4.6Ghz, 32GB DDR4-3200 GSkill (F4-3200C16D-16GTZR x2),

ASRock X470 Taichi Ultimate, XFX RX6800XT Merc 310 (RX-68XTALFD9)

3x ASUS VS248HP + Oculus HMD, Thrustmaster Warthog HOTAS + MFDs

Posted
No.

 

The key difference, as that article actually explains, is that AMD's solution relies on eDP, which common desktop monitors do not use. As the article also states:

 

"As it stands today, the only way to get variable refresh gaming technology on the PC is to use NVIDIA's G-Sync enabled monitors and GeForce graphics cards."

 

G-Sync will probably become obsolete at some point in the future, but that's a different matter. Monitors with G-Sync give you the capability today, rather than making you wait for DP 1.3-enabled cards and monitors.

I think you're missing one important point here. nVidia obviously didn't design, prototype, and sign agreements with hardware vendors overnight. It took time. The thing is, instead of going for the technology that is going to be available soon (namely DP 1.3), they went not only for a proprietary extra hardware module, but for one that will effectively let display manufacturers squeeze extra bucks out of buyers. In other words, they could have put that money and time into pushing DP 1.3 adoption.

 

Now it's only a matter of time before nVidia starts lobbing crap at FreeSync in the press and producing papers addressed to VESA pointing out how 'suboptimal' the DP 1.3 standard is, how 'we need time to improve it', and how 'rushing it is in no one's interest'.

Posted (edited)

nVidia is a member of the VESA governing body, and the slowness of DP 1.3 development and adoption is one of the reasons they've done this. The fact that eDP has it (and has had it for a while), yet no one from VESA down to the manufacturers bothered with a desktop implementation, can serve as a hint there.

 

You could just as easily turn this argument against AMD: they didn't do anything to push this through VESA either, but when nVidia takes it into their own hands, AMD tries to take credit for already having the feature, obscuring the fact that what they're using is not something they developed, and something they also failed to push through for desktop monitors.

 

It's simple really: developers of the chips used in desktop monitors were slow as heck and just didn't do it. nVidia had the feature ready in their cards, but no customers could use it because of this fact. So they made it happen with their own chip. How is this evil?

Edited by EtherealN


Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted

I think you mean duopoly, since there are two GPU makers. For one, I think they are doing a good job competing, as they keep one-upping the other. Maybe complain about the CPU duopoly instead, given how much more GPU performance has increased (at least tenfold) than CPU performance over the same period.

Pacotito

 

I7-5820k@4.5 Z99 extreme4 16gb ddr4

520gb ssd. Gigabyte ssc GTX960 SSC 4gb

Posted
nVidia is a member of the VESA governing body, and the slowness of DP 1.3 development and adoption is one of the reasons they've done this. The fact that eDP has it (and has had it for a while), yet no one from VESA down to the manufacturers bothered with a desktop implementation, can serve as a hint there.

 

You could just as easily turn this argument against AMD: they didn't do anything to push this through VESA either, but when nVidia takes it into their own hands, AMD tries to take credit for already having the feature, obscuring the fact that what they're using is not something they developed, and something they also failed to push through for desktop monitors.

 

It's simple really: developers of the chips used in desktop monitors were slow as heck and just didn't do it. nVidia had the feature ready in their cards, but no customers could use it because of this fact. So they made it happen with their own chip. How is this evil?

I didn't start the topic solely to bash nVidia. What you're saying would be a valuable addition to what we can read in the press, under one condition: is there any proof that nVidia was trying to push DP 1.3 and failed for reasons beyond their control? Sitting on a standardization committee says nothing about one's agenda. There are plenty of tech hyenas in such bodies whose sole purpose is fighting standardization and actual progress.

Posted

Sure, but is there also any proof that nVidia has been blocking it?


Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted

I think we can agree that we need a starting point here, and that would be proof of nVidia seeking a way to solve the tearing issue by some route other than 'We've got another sticker for your displays. Pay us.' What I'm saying is that right now this is the basic point against nVidia, while the only point against AMD is that they are late to the party. Quite a different weight.

Posted

But I don't see how that is a point AGAINST nVidia?

 

They've got it NOW. You can get that feature NOW. Rather than later. And they made that happen. Whereas AMD were like "oh, but we've got it too... err... on laptops... but we'll keep quiet about that little tidbit until nVidia people call us out on it". :P


Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | Life of a Game Tester
Posted (edited)

It syncs the monitor's refresh rate to the fps output of the video card. If your card is capable of, say, 49 fps on a 60 Hz monitor, vsync as it works today drops you to a fraction of 60 Hz; in the 49/60 case it would fall to 45 fps and throw out 4 frames a second, which can cause microstutters and input lag from the lost frames. With vsync off you can get tearing even at frame rates below your refresh rate, because the video card can deliver a new frame in the middle of a refresh. G-Sync and FreeSync are better ways to "vsync" below the refresh rate of the monitor.
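To put rough numbers on that, here's a quick Python sketch of my own. It's a deliberately simplified model (plain double-buffered vsync, constant render time), so treat it as an illustration rather than how any particular driver actually behaves:

import math

RENDER_TIME = 1.0 / 49   # GPU needs ~20.4 ms per frame
REFRESH     = 1.0 / 60   # 60 Hz panel refreshes every ~16.7 ms
FRAMES      = 300        # roughly five seconds of output

def fixed_vsync():
    """Double-buffered vsync: a finished frame is shown at the next
    refresh boundary, and the GPU waits for that flip before starting
    the next frame."""
    shown, t = [], 0.0
    for _ in range(FRAMES):
        t += RENDER_TIME                       # frame finishes rendering
        t = math.ceil(t / REFRESH) * REFRESH   # wait for the next vblank
        shown.append(t)
    return shown

def variable_refresh():
    """G-Sync/FreeSync idea: the panel refreshes whenever a frame is ready."""
    shown, t = [], 0.0
    for _ in range(FRAMES):
        t += RENDER_TIME
        shown.append(t)
    return shown

def effective_fps(times):
    return (len(times) - 1) / (times[-1] - times[0])

print(f"fixed 60 Hz vsync : {effective_fps(fixed_vsync()):.1f} fps")
print(f"variable refresh  : {effective_fps(variable_refresh()):.1f} fps")

Strict double buffering gives the worst case (this prints 30.0 fps versus 49.0 fps). With triple buffering or a render queue the average lands somewhere between 30 and 49 fps (the 45 fps ballpark above), but individual frame times still alternate between roughly 16.7 ms and 33.3 ms, which is exactly the microstutter described.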

Edited by pacotito

Pacotito

 

I7-5820k@4.5 Z99 extreme4 16gb ddr4

520gb ssd. Gigabyte ssc GTX960 SSC 4gb

  • 3 months later...
Posted

People can criticize nVidia all they want for coming out with 'their own' standard and charging for it, but the plain fact of the matter is that the industry has spent the better part of a decade completely ignoring the problem, and as a result everyone except the screen manufacturers has suffered for it.

 

nVidia has simply decided that it's time to force the hands of the display manufacturers. I fail to see how there is anything wrong with them entering into a partnership with ASUS to provide their users with a feature their cards already support, one that users and video card makers alike have wanted for years. That's the great thing about a free market: if someone has an idea to make something work better, they can hash it out and put it on the market. If the rest of 'the industry' wants to be included, perhaps they should get off their asses and do something about it.

  • 8 months later...
Posted

AMD owner here.

Dual card rendering, Nvidia (via 3Dfx acquisition). Native 3D support, Nvidia. Micro stutter fix, Nvidia.

When one of the manufacturers can drive three 4K monitors in 3D without tearing or stuttering at 60 FPS, then I will buy that solution. Until then it's just a game of leapfrog for me.

All of my posted work, ideas and contributions are licensed under the Creative Commons Attribution-NonCommercial 4.0 International license (CC BY-NC 4.0), which precludes commercial use but encourages sharing and building on for non-commercial purposes. ©John Muldoon
