Pilotasso Posted October 4, 2009 Hey guys. Just delivering a few ideas for ED's consideration in a future DCS module. It appears computing is being offloaded more and more to the GPU. A few months ago we watched a military sim use 200 GPUs for object physics and management. A few days ago NVIDIA announced the Fermi architecture, further expanding the programming languages and capabilities supported on the GPU, with 512 CUDA cores. http://www.hardwarecanucks.com/news/video/nvidia-officially-unveils-generation-cuda-gpu-architecture-codenamed-fermi/ ATI has followed NVIDIA on the CUDA and PhysX front, and there are reasons to believe they will also compete with Fermi somehow in the future. 512 cores' worth of processing power is too much to ignore; it blows any CPU out of the water. It would be nice for immersion to see skyscrapers in cities, traffic, thousands of units, infantry, etc., along with AFM for all aircraft. All of it would be possible. Is this within ED's reach in the coming years?
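To make the idea concrete, here is a rough CUDA sketch (a toy of my own, not anything from ED or the Fermi announcement) of what "thousands of units" on the GPU could look like: one thread integrates one unit per frame, so hundreds of cores chew through huge unit counts in parallel. The array layout and the plain Euler step are purely illustrative assumptions.

```cpp
// Toy CUDA kernel: advance every unit's position by one time step.
__global__ void stepUnits(float3* pos, const float3* vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x; // one thread per unit
    if (i < n) {
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

// Host side: launch enough 256-thread blocks to cover all n units, e.g.
// stepUnits<<<(n + 255) / 256, 256>>>(d_pos, d_vel, dt, n);
```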
Krippz Posted October 4, 2009 Who knows. Looks like ED plans on using a modified DCS engine for a while. Maybe the guys back in Russia are feverishly working on a new engine? I understand why ED will not provide any concrete info now (if they tell us that they're doing something and don't deliver, they will disappoint the community). I guess we can only hope and play the guessing game for now...
Kuky Posted October 4, 2009 I don't think this multi-GPU approach is the answer; all we need is software that supports multiple CPUs. Imagine having a dual-socket Xeon motherboard with two quad-core CPUs... that's 8 cores to run things on... if one core right now can give 20 FPS minimum, then eight could manage 60 FPS minimum or more... not to mention CPU architecture will improve and dies will shrink further, so you get even more processing power per core... oh, the dreams will come true one day :D
Boulund Posted October 4, 2009 Multiprocessing solutions can surely benefit a lot of scientific calculations, but I reckon a lot of games won't in fact gain the miracle boost everyone is dreaming of. Here is a link to something that will get you thinking about why it isn't all gold just because it's "multiprocessor": http://en.wikipedia.org/wiki/Amdahl's_law A whole lot of things are just not parallelizable the way we want them to be =D I've played around a bit with Jacket/CUDA in MATLAB to test things in a program I was involved in earlier. It would have achieved great speed-ups via parallel processing, but unfortunately GPUs are not the omni-capable processing units that regular CPUs are. There were several operations in our program that were impossible to perform on the GPU, preventing full-scale parallelization and thus the really big performance increases. But hey, who knows, maybe this Fermi thing is just what everyone needs to get going? ;) Looking forward to the future.
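To put a number on Amdahl's law: if P is the fraction of the work that can actually run in parallel, the best possible speedup on N cores is bounded as below. The P = 0.6 figure is made up purely for illustration.

```
S(N) = 1 / ((1 - P) + P/N)          (P = parallel fraction of the work)

P = 0.6, N = 8:   S = 1 / (0.4 + 0.6/8) = 1 / 0.475 ≈ 2.1x
N -> infinity:    S -> 1 / (1 - P)      = 2.5x
```

So even with eight cores (or 512 CUDA cores), a workload that is only 60% parallelizable never beats 2.5x, which is why the serial parts of a simulation matter so much.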
Doc. Caliban Posted October 4, 2009 Does Black Shark even properly take advantage of SLI? SLI has been around since the late '90s.
Ightenhill Posted October 4, 2009 Does SLI really make any sense anymore? I could see the point of using it back when I was on 8- or 9-series cards, but now it seems more cost-effective to move up to the next generation of card to get a better FPS return. On another note, why are flight sims so poor at using the GPU anyway?
Boulund Posted October 4, 2009 Because they generally don't need to use the GPU, since there aren't many eye-candy calculations going on? Most of the computation goes into simulating the flight, so there isn't much room left over for graphics, I think =)
Doc. Caliban Posted October 4, 2009 SLI is simply the paralleling of multiple GPUs, no matter what kind they are. It's the simple principle of "if n is good, then n*2 is better." And it is. GPUs are used for much more than video processing now, as the OP is pointing out. Unfortunately, this sim, and it sounds like others as well, is not written to take advantage of that power. Another way to think of SLI is to ask, "Now that I have a faster, newer CPU, why should I bother having more than one core?" Same idea. Faster, better, AND multiple cores. That's SLI/Crossfire.
joey45 Posted October 4, 2009 It would be good if most of the graphics work were offloaded to the GPU... otherwise there's no point to these next-gen graphics cards.
ZaltysZ Posted October 4, 2009 "If n is good, then n*2 is better." And it is. If n is sufficiently large, n*2 becomes worse than n due to the overhead of resource management.
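A quick way to see why (a toy model of my own, with made-up numbers): extend Amdahl's law with a per-core management cost c*n in the denominator, and the speedup curve peaks and then falls.

```
S(n) = 1 / ((1 - P) + P/n + c*n)    (c = per-core coordination overhead)

P = 0.95, c = 0.01:
  S(8)  = 1 / (0.05 + 0.11875  + 0.08) ≈ 4.0x
  S(16) = 1 / (0.05 + 0.059375 + 0.16) ≈ 3.7x   <- doubling n made it slower
```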