jmod Posted March 3, 2013

Since ED is making the new graphics engine, EDGE, I think it would be great to update the physics engine as well, by adopting the new PhysX technology. Lots of modern game engines, like Real Virtuality 4, are going to use it.

Features: PhysX is a multi-threaded physics simulation SDK available for Microsoft Windows, Mac OS X, Linux, PlayStation 3, Xbox 360 and Wii. It supports rigid body dynamics, soft body dynamics, ragdolls and character controllers, vehicle dynamics, volumetric fluid simulation and cloth simulation, including tearing and pressurized cloth.

Let's fly together for the sake of peace :)
Witchking Posted March 3, 2013

Problem is, PhysX is NVIDIA-only... however, I think it would be amazing, and just insane, if ED licensed Havok and Euphoria for EDGE. I mean, imagine GTA 4 style physics for infantry, ground vehicle physics, all the freaking demolition and debris we want at smooth frame rates... and ED can layer that with their in-house AFM etc. Now that's a true next-generation Eagle Dynamics simulation.

WHISPR | Intel I7 5930K | Nvidia GTX980 4GB GDDR5 | 16GB DDR4 | Intel 730 series 512GB SSD | Thrustmaster WARTHOG | CH Pro Pedals | TrackIR4 pro | |A-10C|BS2 |CA|P-51 MUSTANG|UH-1H HUEY|MI-8 MTV2 |FC3|F5E|M2000C|AJS-37|FW190|BF 109K|Mig21|A-10:SSC,EWC|L-39|NEVADA|
jmod (Author) Posted March 4, 2013

Although PhysX runs natively on an NVIDIA PPU, it can run on the CPU as well! Video games supporting hardware acceleration by PhysX can be accelerated by either a PhysX PPU or a CUDA-enabled GeForce GPU, thus offloading physics calculations from the CPU and allowing it to perform other tasks instead. This typically results in a smoother gaming experience and additional visual effects. I'm in love with this kind of physics engine.
Witchking Posted March 4, 2013

If it's PhysX... no way it will be allowed to run on a non-GeForce card, and running it on the CPU is just too much. The old demos used to allow comparing CPU vs GPU, and a CPU just can't handle everything. You need GPU acceleration. This is where the other third-party, non-affiliated physics engines come in: Havok, DMM, Euphoria, etc.
Wolf Rider Posted March 4, 2013

This is where the proponents of "interoperability" should be making their noise (but not at the game/sim developers, though).

City Hall is easier to fight, than a boys' club - an observation :P "Resort is had to ridicule only when reason is against us." - Jefferson "Give a group of potheads a bunch of weed and nothing to smoke out of, and they'll quickly turn into engineers... its simply amazing." EVGA X99 FTW, EVGA GTX980Ti FTW, i7 5930K, 16Gb Corsair Dominator 2666Hz, Windows 7 Ultimate 64Bit, Intel 520 SSD x 2, Samsung PX2370 monitor and all the other toys - "I am a leaf on the wind, watch how I soar"
ARM505 Posted March 4, 2013

I must say, that's one thing that really impressed me about GTA 4 - the physical behavior of the people. Bumps, explosions etc., and they all fell about in a plausible manner. That being said, I doubt we have much hope of seeing that in DCS - far, far too many other priorities stand in its way. I guess sims are doomed to have 'wooden' infantry for the foreseeable future. I'm looking at you, Steel Beasts :) (who only just recently got actual 3D infantry - up until then they were sprites!). Yes, the best tank simulator in the world (commercially available to the average civilian, I suppose, although military ones usually look worse) had sprite-based infantry up until 2012. I guess the rule is: if you see laughable human modelling, it must be a sim?
mjeh Posted March 4, 2013

I suppose one could dedicate a processor core to physics handling?
sobek Posted March 4, 2013

ED are perfectly capable of implementing their own solid body physics; it's a matter of time and resources to apply it to all parts of the sim. I guess that ragdoll animation for infantry is still way out of scope, though.

Good, fast, cheap. Choose any two. Come let's eat grandpa! Use punctuation, save lives!
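mjeh's idea above - dedicating a core to physics - is essentially what a decoupled physics thread does. A minimal sketch (my own illustration, assuming nothing about ED's actual engine internals): a fixed-timestep physics loop running on its own thread, while a render thread would read state snapshots under a lock.

```python
# Hypothetical sketch: physics stepped on a dedicated thread at a fixed
# timestep, independent of the render loop. The "simulation" here is a
# trivial stand-in (constant gravity on one body).
import threading

class PhysicsThread:
    def __init__(self, dt=1.0 / 120.0):
        self.dt = dt              # fixed physics timestep (120 Hz here)
        self.position = 0.0
        self.velocity = 10.0
        self.steps = 0
        self._lock = threading.Lock()

    def step(self):
        # Stand-in for real rigid-body integration work.
        with self._lock:
            self.velocity += -9.81 * self.dt
            self.position += self.velocity * self.dt
            self.steps += 1

    def run_for(self, n_steps):
        for _ in range(n_steps):
            self.step()

    def snapshot(self):
        # The render thread reads a consistent state under the lock.
        with self._lock:
            return self.position, self.velocity

sim = PhysicsThread()
worker = threading.Thread(target=sim.run_for, args=(120,))
worker.start()
worker.join()
print(sim.steps)  # 120 (one simulated second at 120 Hz)
```

The point is only the structure: the physics state advances on its own thread at its own rate, and other threads read snapshots rather than sharing the loop.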
ZaltysZ Posted March 4, 2013

PhysX is a recurring topic for flight sims; however, there are issues because of which PhysX is still not used in them:

1) It is tied to NVIDIA. The CPU implementation is poorly optimized and falls too far behind in performance. This means that devs have to limit its GPU-assisted usage, or they will lose the user base with ATI cards.

2) Mostly only effects (smoke, debris, explosions, etc.) can be run on the GPU. Physics calculations like collision detection, ragdolls and so on always run on the CPU.

3) PhysX is just physics engine middleware, which allows devs to quickly write code for trivial/general-case physics interactions and offload some stuff to the GPU. If you want general rag doll dynamics, vehicle dynamics, or particle effects, then it is OK, as it saves time and gives good performance. However, if your game/sim requires very specialized calculations (like a detailed FM or DM), then most of PhysX's features become too crude to get the wanted fidelity. PhysX is designed for games (whose players do not moan that a zombie head bounces a few centimeters too high), not for sims or general computations.

4) Plugging PhysX into an existing engine is not an easy task. Even little rag doll soldiers with RPGs might require big architectural changes.

Wir sehen uns in Walhalla.
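The "trivial/general case" physics ZaltysZ describes in point 3 is the kind of thing middleware gives you for free. A toy illustration in plain Python (the function and constants are mine, not from any PhysX API): a point mass bouncing with a coefficient of restitution - easy for generic middleware, but nothing like the specialized fidelity a detailed FM or DM needs.

```python
# Toy "general case" physics: a point mass dropped from a height, losing
# energy at each ground impact. Semi-implicit Euler integration.
def simulate_bounce(height, restitution=0.5, g=9.81, dt=0.001, t_max=10.0):
    y, v, t = height, 0.0, 0.0
    bounces = 0
    while t < t_max:
        v -= g * dt                  # update velocity first (semi-implicit)
        y += v * dt
        if y <= 0.0 and v < 0.0:     # ground contact
            y = 0.0
            v = -v * restitution     # restitution: lose energy per bounce
            bounces += 1
            if v < 0.05:             # treat as at rest; stop bouncing
                break
        t += dt
    return bounces

print(simulate_bounce(10.0))
```

A few lines cover the generic case; the hard part of a sim is everything this toy ignores (aerodynamics, structural damage, ground interaction), which is exactly where middleware gets too crude.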
SkateZilla Posted March 4, 2013

Ever since nVidia bought the technology it's gone downhill; they purposely cripple the CPU code, and the list of games it supports is small. The "PhysX On/Off" videos mostly showcase residual dust that reacts to players, random sprites that react to the player (cobwebs, curtains, whatever) and water flow effects. None of which we need. PhysX is overrated; everything that PhysX does in most of these games can easily be coded. It's a gimmick: they code the effects to PhysX, so when it's off your presentation is stripped and watered down. For example, everything that is "PhysX exclusive" in Batman: Arkham Asylum, I had in Rainbow Six 3: Black Arrow 10 years ago (smoke that reacts to wind and players, vinyl curtains that react to players' bodies, etc.).

Windows 10 Pro, Ryzen 2700X @ 4.6Ghz, 32GB DDR4-3200 GSkill (F4-3200C16D-16GTZR x2), ASRock X470 Taichi Ultimate, XFX RX6800XT Merc 310 (RX-68XTALFD9) 3x ASUS VS248HP + Oculus HMD, Thrustmaster Warthog HOTAS + MFDs
VincentLaw Posted March 4, 2013

"If it's PhysX... no way it will be allowed to run on a non-GeForce card. Running it on the CPU is just too much. The old demos used to allow comparing CPU vs GPU, and a CPU just can't handle everything. You need GPU acceleration. This is where the other third-party, non-affiliated physics engines come in: Havok, DMM, Euphoria, etc."

Since I have a quad-core processor, when I play DCS I basically have two cores doing nothing at all. I'm pretty sure it wouldn't be too much for the very bored half of my CPU that has nothing better to do anyway. In the meantime, my GPU is busy rendering all the fancy graphics, so why would I want to add physics calculations on top of its workload? It doesn't make sense.

"Ever since nVidia bought the technology it's gone downhill; they purposely cripple the CPU code, and the list of games it supports is small."

Not to mention they completely dropped support for the original Ageia PPU, so if you have one of those in your computer, it is good for nothing but clogging your ventilation.
SkateZilla Posted March 4, 2013 (edited)

"Since I have a quad-core processor, when I play DCS I basically have two cores doing nothing at all. I'm pretty sure it wouldn't be too much for the very bored half of my CPU that has nothing better to do anyway. In the meantime, my GPU is busy rendering all the fancy graphics, so why would I want to add physics calculations on top of its workload? Not to mention they completely dropped support for the original Ageia PPU, so if you have one of those in your computer, it is good for nothing but clogging your ventilation."

There are 3rd-party modded drivers for the Ageia PPU cards, as well as 3rd-party modded drivers that will allow you to use a cheap nVidia GPU as a PhysX processor in an AMD/ATi GPU system. I ran that setup for 3 months last year (AMD 7950 OC, with an eVGA 8800GTS G80 as the PhysX unit). Other than some eye candy, it didn't improve anything, apart from removing the stutter caused by forcing PhysX onto the CPU.

PhysX on with the 8800GTS: fluid, with dynamic and interactive smoke... (sarcastic yay).
PhysX on with the CPU: stuttering, frame rate drop every time a PhysX effect was present.
PhysX off: fluid, but no interactive smoke, pieces of glass, debris, etc.

Decided the extra heat and power consumption wasn't worth it for the 2 games that supported it (8800 G80s idled at 55-60°C), so I uninstalled the 8800 and the software. Might be better with a single-slot 650 Ti and a ribbon/riser cable to mount it in a different slot out of the way.

IMHO, PhysX is dead. Havok and, as much as I hate to say it, Frostbite 2 are way more evolved. Eagle Dynamics' current flight model engine has better physics than PhysX; you just don't see it because you're in the air and not on the ground. Broken glass and debris on the ground really isn't showcasing "physics".
Edited March 4, 2013 by SkateZilla
danilop Posted March 4, 2013 (edited)

What about the huge computation increase (FFT and SGEMM/DGEMM) in the GTX Titan? In double-precision FFT, the GTX Titan is more than 3x faster than the 680! It would be great if DCS could tap into this almost supercomputer-like processing power. We are CPU-limited ATM, so the newest GPUs are the only "cheap" option to dramatically increase the computation power of gaming systems intended for calculation-intensive simulations.

Edited March 4, 2013 by danilop
SkateZilla Posted March 4, 2013

The GTX Titan is a limited-edition PCB, and it has more minuses than pluses. There's no sense coding an engine to take advantage of something that's limited to 5,000 units, when only about 50 of them are even being used to run anything close to flight sims. That's like asking ED to code the engine to run on the Asus Ares II.
danilop Posted March 4, 2013 (edited)

Yeah, true, but new GPUs are around the corner. The GK110 is here to stay - not in the Titan necessarily, but in future versions. Cheaper (and faster) versions of the Titan are certain. IMO, it's a smart strategy to code with the NVIDIA CUBLAS/CUFFT libraries now, especially when it's much more probable to see dual Titans (or whatever future GPU based on the GK110 NVIDIA may release) than a full-blown dual or quad Xeon setup in a gaming computer. In the long run, support for this aspect of the newest GPUs would pay off.

Edited March 4, 2013 by danilop
SkateZilla Posted March 4, 2013

DirectCompute is here to stay.
danilop Posted March 4, 2013 (edited)

:thumbup: DirectCompute, OpenCL, the NVIDIA CUDA C/C++ toolkit, whatever... The technology for accessing the underlying GPU power is not that important - what's important is that developers finally exploit the parallel computing available on modern GPUs. CPU-intensive games like our beloved DCS: World would benefit the most.

Edited March 4, 2013 by danilop
SkateZilla Posted March 4, 2013 (edited)

DCS: World's problem isn't physics processing. DCS: World's problem is that the base code is aging, while its features have outgrown the base code and the hardware profile it was originally programmed for: TFCSE 1.0 targeted 1- and 2-core CPUs and geometry-driven GPUs with a few extra shader cores, whereas now we have 4-, 6-, 8-, 12- and 16-core CPUs and shader-core/compute-driven GPUs. TFCSE was originally coded for Lockon/FC and expanded for the DCS series (or so Wikipedia says) <- I'd take that with a grain of salt, especially coming from Wikipedia. But let's just say it's true: the engine has evolved from mid fidelity to high fidelity in many areas, growing from fixed-wing simulation to include rotary-wing simulation and ground unit simulation, and eventually marine/naval unit simulation. Sooner or later, cramming all this stuff into ONE process/thread/core is gonna bite you.

Edited March 4, 2013 by SkateZilla
DayGlow Posted March 4, 2013

Arma 2 uses PhysX 2.0 and Arma 3 will use PhysX 3.0.

"It takes a big man to admit he is wrong...I'm not a big man" Chevy Chase, Fletch Lives 5800X3D - 64gb ram - RTX3080 - Windows 11
SkateZilla Posted March 4, 2013

And I see no difference with it on or off.
ZaltysZ Posted March 4, 2013

":thumbup: DirectCompute, OpenCL, the NVIDIA CUDA C/C++ toolkit, whatever... The technology for accessing the underlying GPU power is not that important - what's important is that developers finally exploit the parallel computing available on modern GPUs."

Not necessarily. There are pitfalls. If you want a performance benefit from calculations on the GPU, you need highly parallel/vectorized computations with an infrequent need to transfer data between CPU/RAM and GPU, or else access latency will eat most of the performance gains.
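ZaltysZ's latency point can be put in rough numbers. A back-of-envelope sketch - every bandwidth and throughput figure below is my own ballpark 2013-era assumption (PCIe 2.0 x16, a high-end GPU), not a measurement: the GPU only wins when the arithmetic per transferred byte is high enough to amortize the round trip over the bus.

```python
# Illustrative break-even model for CPU vs GPU offload. All constants are
# rough assumptions, not benchmarks.
PCIE_BW = 8e9      # bytes/s, ~8 GB/s effective PCIe 2.0 x16
GPU_FLOPS = 1e12   # ~1 TFLOP/s sustained single precision on the GPU
CPU_FLOPS = 5e10   # ~50 GFLOP/s for a few well-used CPU cores

def gpu_time(n_bytes, flops):
    # Round trip over the bus plus kernel execution time.
    return 2 * n_bytes / PCIE_BW + flops / GPU_FLOPS

def cpu_time(flops):
    return flops / CPU_FLOPS

# 1 MB of state with only 10 FLOPs per byte: the transfer dominates.
light = (1e6, 1e7)
# Same 1 MB with 10,000 FLOPs per byte (e.g. an iterative solver).
heavy = (1e6, 1e10)

print(gpu_time(*light) < cpu_time(light[1]))   # False: CPU wins
print(gpu_time(*heavy) < cpu_time(heavy[1]))   # True: GPU wins
```

This is exactly why effects-style workloads (lots of independent particles, data resident on the GPU) offload well, while chatty per-frame physics state does not.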
danilop Posted March 4, 2013

Well, you could load a huge chunk of code onto the graphics card itself and basically forget about it there. No need to bounce data around - just "pick up" the results from time to time, or when the need arises. There is 6GB of memory on the Titan, for example, and increasing the memory on graphics cards is an ongoing trend... However, the complexity of parallel processing is real, and the effort needed to code it correctly is not trivial.
jmod (Author) Posted March 5, 2013

The main parts of a game (simulation) engine are:
1) Graphics engine
2) Physics engine

We could think of the PhysX SDK as the DirectX SDK of the physics side! So:
1) DirectX 11 for EDGE (the graphics engine)
2) PhysX 3.0 for the physics engine
EtherealN Posted March 5, 2013

Why does this make me remember the threads about how ED should use CryEngine? :P

Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules | | | Life of a Game Tester