Posted

This article is a bit above my pay grade, but I found it interesting to my special sort of genius.:music_whistling: One thing I found enlightening was the idea that the new Nvidia GT300 series might have 6 GB of on-board memory. I wonder whether a new GPU architecture will actually make use of that much memory? More memory does not necessarily mean more FPS, but it does help at very high resolutions.

Oh, but I digress. Here's the link. http://www.theinquirer.net/inquirer/opinion/1560330/cpu-gpu-convergence-goes

 

Maybe someone can break down its meaning for the rest of us?

Flyby out

The U.S. Congress is the best governing body that BIG money can buy. :cry:

Posted (edited)

For us gamers, the problem in following the article is that it's written from a completely different perspective: that of the people who use GPUs for pure number crunching, as in scientific applications (geological, weather, folding@home, etc.).

 

The article goes on to explain how ATI and Nvidia are struggling to provide much tighter interoperability between the CPU and GPU. The main news, at least for me, is that the new GT300 will be able to run some C/C++ code, which is remarkable. But before anyone believes you'll be able to run DCS A-10 on that card and magically get collidable trees and intelligent AI (:P), the first thing to know is that it can only run very specific C programs, specially coded and compiled for it. Having no affiliation with ED, I can still say with 100% certainty that this will not be the case for any DCS product in the foreseeable future.

 

Then it explains that the true breakthrough will come when nVidia finds a way to run standard CPU code on their GPUs, which requires a kind of special hardware "front-end" for the GPU. But that's much farther down the road (i.e. not in the GT300), and there's even a possibility that nVidia will need to cooperate with Intel to do it. At the same time, Intel's Larrabee should do the same thing, but what does Intel know about high-performance GPUs?

 

So for us gamers, this implies that by around 2012 we'll have to upgrade our entire systems again. This convergence should allow developers to more easily incorporate number-crunching-intensive functions into games, such as physics, AI, advanced lighting, and the like. It's more a benefit in ease of coding than in actual new hardware capabilities.

 

It is not stated in the article, but it is my personal opinion that if nVidia has any kind of success (money, of course) moving general-purpose CPU code to the GPU, Intel will look seriously into buying them.

Edited by sinelnic

Westinghouse W-600 refrigerator - Corona six-pack - Marlboro reds - Patience by Girlfriend

 

"Engineering is the art of modelling materials we do not wholly understand, into shapes we cannot precisely analyse so as to withstand forces we cannot properly assess, in such a way that the public has no reason to suspect the extent of our ignorance." (Dr. A. R. Dykes - British Institution of Structural Engineers, 1976)

Posted

give that man a cookie!

 

Sinelnic, old man, I'd say you broke it down pretty damn well! Thanks for the technical translation.:thumbup: Here in the U.S. I think this is grounds for a new cash-for-clunkers initiative: trade in old PC tech for new PC tech.:D Hey, it could happen. Thanks again,

Flyby out


Posted

Thanks for the kind words! :blush:

 

Come to think of it, there's another thing related to the article worth mentioning: see how ATI came out with their 5970 card, DX11 and all, and nVidia is not even close to releasing their own? This is because nVidia believes the PC market has little growth potential and is focusing on expanding towards the budget number crunchers (those who cannot afford Deep Blue). They see big ka-ching potential there.

 

This is sad news (we simmers know what it means when "our" market is not seen as attractive), but on the other hand, something must be wrong when the company with everything to win from convergence (AMD/ATI!!) is not pushing hard for it, and is instead pushing further into DX11 and PC graphics.

 

Now, DX10 was heavily marketed by nVidia, but we did not see any noticeable enhancement in graphics, basically because DX10 was more about optimizing the API than about new features. DX11, on the other hand, brings a genuinely revolutionary feature: the hardware tessellator. See for yourself:

 

The funny thing here is that ATI cards have included hardware tessellators for at least two generations, never used, while nVidia has never had one.

 

Just food for thought. But between DX11 and Eyefinity, for the first time in eons my next card will be an ATI.


Posted

S~ Sinelnic,

That's an interesting viewpoint on Nvidia. I'm not sure I totally agree. The Green Machine has been slow off the mark with its GT300-series GPUs, but there may be other reasons for that. I don't think it is ready to totally cede the top-tier gaming GPU territory to AMD/ATi just yet. I am a bit fearful that Nvidia may release a super GPU at an unreachable price, as it did with its 280/285 single-GPU cards. AMD/ATi really beat them down in the price/performance/value race there.

 

I do agree with you about DX10: it was not the leap that DX11 appears to be (nice video in that link, too ;)). Yet I'm not sure how prevalent DX11 will be in our beloved flight sims. TBH, I don't have any other sort of gaming in my collection besides (PC) flight sims, and FS2004 is the only non-combat flight sim I own. Looking at the most recent screenshots of SoW_BoB, I'm not sure I'm looking at a DX10 engine, but I know it's not DX11. I'm not aware of what plans, if any, DCS has for implementing DX11 either. My point is, when deciding which GPU to buy today, the price/performance/value equation may shift back to Nvidia's current-generation GPUs (if Nvidia decides it must lower the price of the GTX285 to compete with AMD's 5870 at the top end of the market) in the narrow market of flight simmers like me. Of course, if you expand to other gaming genres, then the 5870 is clearly the way to go, IMO, as I expect most people will. That card is a screamer. I might have to get one for my new gaming system (which is on perpetual hold due to my economic malaise).

Well, now I have a headache from trying to form an intelligent reply!:helpsmilie:

Flyby out

