Graphic engine


Demongornot


Understanding "how the SIMULATOR" works = :doh:

To make a long story short: the CURRENT DCS engine uses DX9, which is not designed for multi-GPU or multi-core. With DCS A-10 they moved the sound onto a separate CPU thread to lighten the load. They are changing everything progressively. As you have been told, they are working on their new EDGE engine, which will use DX11. Don't expect an extreme graphical change, but it's a start :thumbup:

Porting all of a sim's code over to a new engine is extremely expensive and takes a lot of time.

 

And having all the AI and other calculations on the CPU, plus Skyrim/Crysis-style graphics, is impossible with today's hardware, no matter which engine is currently on the market.
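The "sound on its own thread" change mentioned above is a classic producer/consumer offload. A minimal Python sketch of the general pattern (the names and sample strings are purely illustrative, not DCS code):

```python
import queue
import threading

def audio_worker(jobs: queue.Queue, played: list) -> None:
    """Drain audio jobs on a dedicated thread so the main loop never blocks."""
    while True:
        sample = jobs.get()
        if sample is None:  # sentinel: shut the worker down
            break
        played.append(sample)  # stand-in for actually mixing/playing the sound
        jobs.task_done()

jobs: queue.Queue = queue.Queue()
played: list = []
worker = threading.Thread(target=audio_worker, args=(jobs, played), daemon=True)
worker.start()

# The main "simulation loop" hands sounds off instead of playing them inline.
for sample in ["gun_burst", "engine_spool", "warning_tone"]:
    jobs.put(sample)

jobs.join()     # wait until the worker has consumed everything
jobs.put(None)  # tell the worker to exit
worker.join()
print(played)
```

The main loop only pays the cost of a queue push per sound; the blocking work happens on the second core.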

Do you think that getting 9 women pregnant will get you a baby in 1 month?


Mobo: Asus P8P67 deluxe Monitor: Lg 22'' 1920*1080

CPU: i7 2600k@ 4.8Ghz +Zalman CNPS9900 max

Keyboard: Logitech G15

GPU:GTX 980 Strix Mouse: Sidewinder X8

PSU: Corsair TX750w Gaming Devices: Saitek X52, TrackIr5

RAM: Mushkin 2x4gb ddr3 9-9-9-24 @1600mhz

Case: 690 SSD: Intel X25m 80gb

 


Hm, wow. My two years of French in high school are not completely gone. I didn't even have to use Google Translate. :D


Daniel "EtherealN" Agorander | Даниэль "эфирныйн" Агорандер

Intel i7 2600K @ 4.4GHz, ASUS Sabertooth P67, 8GB Corsair Vengeance @ 1600MHz, ASUS GTX 560Ti DirectCU II 1GB, Samsung 830series 512GB SSD, Corsair AX850w, two BENQ screens and TM HOTAS Warthog

DCS: A-10C Warthog FAQ | DCS: P-51D FAQ | Remember to read the Forum Rules |

|
| Life of a Game Tester

Hm, wow. My two years of French in high school are not completely gone. I didn't even have to use Google Translate. :D

Yeah, sorry for the French text... (it's 2 AM, and it's my 5th night, so I don't have the will to argue with a Frenchman in English :music_whistling:;))


No worries. I just wasn't really prepared for the fact that I'd be able to read French and actually understand it. :)

Avoiding language barriers can be useful, though of course we shouldn't make it a habit. :P


No worries. I just wasn't really prepared for the fact that I'd be able to read French and actually understand it. :)

Avoiding language barriers can be useful, though of course we shouldn't make it a habit. :P

Glad for you; I tried to learn German but failed :cry:, so I'm stuck with French and English.

 

On topic: has it been said whether the new EDGE engine will support multi-GPU and/or more multi-threading? Thanks.


As far as I know there have been no statements on that yet. My understanding is that this generally depends on co-operation from nVidia and AMD. (Without special profiles for the driver, AFAIK you're "stuck" with alternate frame rendering and similar techniques, which still help, but not as much as a "proper" profile. I don't use multi-GPU myself, though, so I never went into the deepest detail on it.)


OK, this is going to be confusing because I can't quote from the locked thread. Whatever, here goes:

 

@Aaron:

 

My point is not that the graphics engine is perfect; I know the devs are playing catch-up technology-wise. But the approach the OP proposed is extremely simplistic, and therefore the results he expects are just not possible.

 

Yet the discussion continued along the lines of "why didn't anybody try this? I know that it works, and yet it is so simple..."

 

My response to that is: show me. A simple tuning of texture quality vs. draw distance can be done. If that is all it takes, go for it.

 

To be honest, I find it a little strange for somebody to think he can waltz in here and educate people who (probably) hold master's or even PhD-level software engineering degrees, with 10 or more years of experience in 3D engine development, about how they should do their job.

I'm not saying that keeping them on their toes isn't good, or that we shouldn't demand solid work from them. But throwing around terms that have barely any meaning for the engine at hand, bullshit-bingo style, without actually understanding how the underlying technology works and what its strengths and weaknesses are, just doesn't cut it. If you want to talk the talk, be prepared to walk the walk. ;)

 

We've been down that path often enough. :)


Edited by sobek

Good, fast, cheap. Choose any two.

Come let's eat grandpa!

Use punctuation, save lives!


I just saw this on the Russian part of the forums:

 

http://forums.eagle.ru/showthread.php?t=81200

Novice or Veteran looking for an alternative MP career?

Click me to commence your Journey of Pillage and Plunder!


'....And when I get to Heaven, to St Peter I will tell....

One more Soldier reporting Sir, I've served my time in Hell......'


I don't think the DCS graphics engine is bad. In fact, I think it's rather pretty and immersive most of the time. Explosions could be better, though, but that's a mere detail.

The real problem is that it's very difficult to get it running smoothly.

For example, SLI rendering currently does not work well in DCS. I have managed to get SLI working sometimes, and when it works, it dramatically increases FPS. However, almost always when I have SLI on, everything turns half blue, transparent, laggy, flickering and awfully ugly. This could be a problem only I am having, but then again, no other game has ever done this. If someone happens to have a workaround, I'd love to hear it.

 

 

Anyway, one more thing. When I play a fantasy game, I want it to look as beautiful as possible. However, when I play a combat simulator, I want the view to be as realistic as possible. That is, I want the simulation to give visual information which corresponds to what a real pilot sees. E.g. it's more important for a tank to be spottable from a realistic distance than for it to look pretty. Realism and good looks aren't mutually exclusive, of course, but I'd dare to argue that it's a lot more difficult to create a game with both of them than with just one.


Edited by Randolf

@EtherealN

Yeah, it's a bad optimization problem, because you are not the only one with a simple "old" computer who can run DCS at nice FPS, and I'm not the only one with a powerful computer getting bad FPS. I ran a test, and for the same scene I had already seen (granted, that's bad, it's only a single scene, from the A-10C presentation, and not always representative): the scene at maximum graphics settings gave more FPS than the same scene at minimal graphics settings. That's just bad...

 

Outerra is a graphics engine with very little physics to compare, and I'm talking about the graphics engine, not the full software. FSX is not made for combat, yet look at the new VRS Superbug, which can use weapons; ArmA 2 is not made for radio simulation, yet a single person made an addon that simulates where a radio can receive.

I'm only talking about Outerra's graphics engine, not about any complete product.

 

Can you tell me more about the buffer program, please?

 

And yes, but flexible screens haven't been around for long, and projectors aren't free and need a big room; these people wouldn't be sad if they could have a full-canopy field of view.

 

The only thing I see is that other software can do better (sometimes including draw distance, like Outerra, where we can see much farther away) without using top-secret or alien technology, that's all...

 

And that's exactly what I'm saying, and the first thing I don't like about the graphics engine: if even less powerful graphics cards can get good FPS, why don't more powerful graphics cards do better, like in any other software (and I'm not just talking about other games, but EVERY software, including 3D software)? And the CPU shouldn't have a big impact on the graphics engine.

You talk about the graphics engine as if it were the same thing as the physics engine and the rest of the software configuration. I'm sorry, but you can't tell me I don't understand and then say that: a graphics engine only affects graphics rendering, which is why we see the same big graphics engines used in several games with totally different physics, AI, scripting and more.

 

Yeah, but I've never seen it; many times, through camera hacks or simple bugs, I have seen under the ground in lots of different games and software. OK, maybe some titles do that, but you're doing the same as me and comparing DCS with other games. I've never seen it in Crysis, and that game is a perfect example of bad optimization: it eats more power than (for example) Battlefield 3, yet BF3 has better graphics rendering, just like the DCS graphics engine versus Outerra...

 

And with this "trick" or not, the DCS graphics engine eats a lot of power and is not well optimized for graphics cards, mainly the more powerful current ones, because even with completely minimal graphics settings I (and many others) don't get decent FPS; it stays low, and the problem doesn't come from my computer, which can run any other 3D application at decent FPS without lag...

 

I have the LOMAC box in front of me right now; it says compatible with DX9.0b audio, and the CD includes the DX9 suite, and the same graphics engine can be "updated": for example, Crysis 2 before the patch could only use DX9... And a lot of people (including people who know a lot about programming) have told me it's the same graphics engine. Anyway, problem solved: ED is working on a new graphics engine...

 

And if anyone can tell me more about the new graphics engine...

And where are the screenshots? Are they the Nevada screenshots? Any link, please? =)

And yes, sorry for the language barrier; I need to practice my English more, I can only speak it at a basic level for the moment :/

 

And anyway, I'm not asking for the graphics engine to look like an FPS or have extreme rendering, but something like what we currently see in Grandsurf's video would be perfect, with better scaling on graphics cards, and maybe the new graphics engine ED is making right now can do it.

 

And thank you for quoting my entire post =)

Anyway, problem solved; we just have to wait for the new graphics engine coming "soon", and maybe an SDK too, to make graphics and addons more easily, maybe even full map addons like Grandsurf's, and maybe new maps as well.

 

Thanks Genbrien, and yes, same question: will the new EDGE engine support multi-GPU and/or more multi-threading?

CPU : I7 6700k, MB : MSI Z170A GAMING M3, GC : EVGA GTX 1080ti SC2 GAMING iCX, RAM : DDR4 HyperX Fury 4 x 8 Go 2666 MHz CAS 15, STORAGE : Windows 10 on SSD, games on HDDs.

Hardware used for DCS : Pro, Saitek pro flight rudder, Thrustmaster HOTAS Warthog, Oculus Rift.

Own : A-10C, Black Shark (BS1 to BS2), P-51D, FC3, UH-1H, Combined Arms, Mi-8MTV2, AV-8B, M-2000C, F/A-18C, Hawk T.1A

Want : F-14 Tomcat, Yak-52, AJS-37, Spitfire LF Mk. IX, F-5E, MiG-21Bis, F-86F, MAC, F-16C, F-15E.


Thanks Genbrien, and yes, same question: will the new EDGE engine support multi-GPU and/or more multi-threading?

 

My understanding is that this generally depends on co-operation from nVidia and AMD. (Without special profiles for the driver, AFAIK you're "stuck" with alternate frame rendering and similar techniques, which still help, but not as much as a "proper" profile.)

 

This comes to mind...



[Screenshots attached: Screen_111129_195024.jpg, Screen_111129_195209.jpg]

Not quite sure whether the trees use geometry instancing to avoid being CPU-bound.

http://http.developer.nvidia.com/GPUGems3/gpugems3_ch02.html
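The point of geometry instancing is on the CPU side: instead of one draw call per object, you issue one call per distinct mesh, however many copies are in the scene. A toy illustration (not DCS code; the mesh names and object counts are made up):

```python
from collections import Counter

def draw_calls_naive(scene: list) -> int:
    """One draw call per object: the per-call CPU cost that makes a scene CPU-bound."""
    return len(scene)

def draw_calls_instanced(scene: list) -> int:
    """One instanced draw call per distinct mesh, regardless of how many copies exist."""
    return len(Counter(scene))

# A made-up forest: three tree meshes, thousands of copies each.
scene = ["pine"] * 5000 + ["birch"] * 3000 + ["oak"] * 2000

print(draw_calls_naive(scene))      # 10000 calls without instancing
print(draw_calls_instanced(scene))  # 3 calls with instancing
```

The GPU still shades every tree either way; what instancing removes is the CPU-side driver overhead of submitting thousands of individual calls.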

It seems the cockpit is the main problem for FPS.

Is syncing between CPU and GPU for the cockpit, without becoming CPU-bound, unsolvable after all?

 

Multithreaded rendering (rendering to the same Direct3D device object from different threads, for multi-core CPUs) should help with denser object placement.

Tessellation should too: there will be less vertex transfer.

 

But the cockpit remains a problem, doesn't it?


Edited by Fifou265

VEAF 735th - www.veaf.org - Ka-50 instructor

French-speaking squadron flying on DCS.

Learn more: http://www.veaf.org/fr/735-escadrille-virtuelle-dcs-fancaise

Join us: http://www.veaf.org/fr/nous-rejoindre


Multithreaded rendering (rendering to the same Direct3D device object from different threads, for multi-core CPUs) should help with denser object placement.

Tessellation should too: there will be less vertex transfer.

 

As was said, multithreading only makes sense if the operations are parallelisable; otherwise the overhead from scheduling the threads will make performance even worse.

 

Amdahl's law is idealised and doesn't account for the overhead of multithreading.
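That caveat can be made concrete with a toy model: below, Amdahl's idealised speedup is compared with a variant that adds a linear per-thread scheduling cost (the 5% overhead figure is an illustrative assumption, not a measurement of any real engine):

```python
def amdahl(parallel_fraction: float, threads: int) -> float:
    """Idealised Amdahl speedup: no cost at all for coordinating threads."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

def amdahl_with_overhead(parallel_fraction: float, threads: int,
                         overhead_per_thread: float = 0.05) -> float:
    """Same model plus an assumed linear scheduling/sync cost per extra thread."""
    serial = 1.0 - parallel_fraction
    cost = serial + parallel_fraction / threads + overhead_per_thread * (threads - 1)
    return 1.0 / cost

# With 80% of the frame parallelisable, the ideal curve keeps rising with
# thread count, but the overhead-aware curve peaks at 4 threads and then
# actually gets worse, which is sobek's point.
for n in (1, 2, 4, 8, 16):
    print(n, round(amdahl(0.8, n), 2), round(amdahl_with_overhead(0.8, n), 2))
```

With these numbers, 16 threads under the overhead model perform no better than a single thread, even though the idealised law promises a 4x speedup.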


Hi everyone,

 

I found an article about 3D engines and how a 3D scene is built.

The article is in French, but it has many stats on GPU usage during the whole 3D construction process.

 

It's based on the 3DMark11 engine, so it doesn't reflect our exact situation, but it's interesting and helps you understand how our GPUs work...

 

http://www.hardware.fr/articles/845-1/comprendre-rendu-3d-etape-par-etape-avec-3dmark11.html

DCS Wish: Turbulences affecting surrounding aircraft...


Gigabyte GA-Z170-HD3P - Intel Core i5 6600K - 16Gb RAM DDR4-2133 - Gigabyte GeForce GTX 1080 G1 Gaming - 8 Go - 2 x SSD Crucial MX300 - 750 Go RAID0 - Screens: HP OMEN 32'' 2560x1440 + Oculus Rift CV1 - Win 10 - 64bits - TM WARTHOG #889 - Saitek Pro Rudder.


As was said, multithreading only makes sense if the operations are parallelisable; otherwise the overhead from scheduling the threads will make performance even worse.

 

Amdahl's law is idealised and doesn't account for the overhead of multithreading.

I'm not talking about multithreading the game logic (physics, AI), but about multithreaded rendering: rendering to the same Direct3D device object from different threads on multi-core CPUs. Not the same thing ;)

