

Posted

 

 

Things I never knew about x-Tracing.

 

 

The comment section has some high-grade info too, imo.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted

Wow...really informative! Thanks

hsb

HW Spec in Spoiler

---

 

i7-10700K Direct-To-Die/OC'ed to 5.1GHz, MSI Z490 MB, 32GB DDR4 3200MHz, EVGA 2080 Ti FTW3, NVMe+SSD, Win 10 x64 Pro, MFG, Warthog, TM MFDs, Komodo Huey set, Reverb G1

 

Posted (edited)

To give people an idea of how computationally expensive ray-tracing is, I'm going to link to a PC Gamer article from last summer, and the video that came with it, as an example.

https://www.pcgamer.com/uk/check-out-quake-2-played-on-a-titan-xp-with-real-time-gpu-path-tracing/

 

 

The thing to note is that Quake 2 is now a 21-year-old game and runs on pretty much anything (even the old Amiga!), yet even on top-of-the-line hardware the guy was struggling to get 30+ fps.

 

https://www.youtube.com/watch?v=x19sIltR0qU

 

 

The video is also a great example of the noise problem :)

Edited by Buzzles
Posted

What I got out of it is this: ray- or path-tracing is indeed a step forward, mimicking reality even better than we have for the last 20-30 years with rasterization. The bad part is that no chip available has the power to do it properly; Nvidia's RTX series is just a drop of water on a very hot stone...if I can take those 1, 10, 500, 120000-ray examples as fact. With 1 ray per pixel you get nothing done, and with 50 it still looks awful but already brings any GPU to its knees...

 

 

The noise...they need to find a miracle algo to fix that issue.
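To see why those ray counts matter, here's a toy Monte Carlo sketch in plain Python (an illustration of the math only, not any engine's actual code; the 0.5 "surface" is made up). A pixel's value is an average of random ray samples, so the noise only shrinks with the square root of the sample count: quadrupling the rays merely halves the noise, which is exactly why everyone is chasing denoisers.

import math
import random

def shade_pixel(samples):
    # Stand-in for tracing one pixel: average of random "path" results
    # whose true mean is 0.5 (a hypothetical grey surface).
    return sum(random.random() for _ in range(samples)) / samples

def noise(samples, trials=2000):
    # Standard deviation of the pixel estimate across many renders.
    vals = [shade_pixel(samples) for _ in range(trials)]
    mean = sum(vals) / trials
    return math.sqrt(sum((v - mean) ** 2 for v in vals) / trials)

for spp in (1, 10, 50, 500):
    print(f"{spp:4d} rays/pixel -> noise ~ {noise(spp):.4f}")
# Noise falls roughly as 1/sqrt(rays): 100x more rays for 10x less noise.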

 

This will be a long and rocky road for us.

 

 

The 2080 Ti is a joke in that respect, a door-opener that leaves a lot to be desired.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted

At the moment the feature with the most immediate effect is DLSS, and that is the more interesting part compared to ray tracing (they will keep adding ray samples and throwing fps down the drain each generation). It boosts fps considerably, and for games that support it a 4K monitor is finally viable at high fps. That being said, by the time it's widespread, the 3080 Ti will probably be out.
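For a rough sense of where that fps boost comes from: the game renders internally at a lower resolution and the upscale fills in the rest. Back-of-the-envelope arithmetic below (the 1440p internal resolution is my assumption for illustration, not a published DLSS figure):

# Rough pixel-count arithmetic for upscaled rendering (illustrative only).
native_4k = 3840 * 2160
internal = 2560 * 1440  # assumed internal render resolution

ratio = native_4k / internal
print(f"Shaded pixels per frame: {internal:,} vs {native_4k:,}")
print(f"That is {ratio:.2f}x fewer pixels to shade before upscaling.")
# ~2.25x fewer shaded pixels is where most of the fps headroom comes from.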


Posted (edited)

DLSS is just the first feature of the NGX SDK, which is going to be responsible for tensor core operations.

 

https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/

 

Tensor cores may be capable of handling physics and other aspects of the rendering pipeline, as well as smarter enemy AI and adaptive cheat detection, and we have NO real idea what this technology will be capable of over the next 5-10 years. We haven't even scratched the surface.
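For context on what a tensor core actually is: a small matrix engine. In the Volta/Turing generation, each one performs a fused multiply-accumulate on small 4x4 matrix tiles, D = A x B + C, with half-precision inputs and higher-precision accumulation. A plain-numpy sketch of that single operation (just the math, not real CUDA):

import numpy as np

# One tensor-core-style operation on a 4x4 tile: D = A @ B + C,
# half-precision inputs, full-precision accumulation.
A = np.random.rand(4, 4).astype(np.float16)
B = np.random.rand(4, 4).astype(np.float16)
C = np.zeros((4, 4), dtype=np.float32)

D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D)
# Denoising, DLSS-style upscaling, and neural-net inference in general
# all boil down to huge numbers of exactly this multiply-accumulate step.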

 

I'm personally curious whether the 20 series is a hybrid of the best of both worlds, and whether we'll perhaps be less reliant on CUDA cores in the future, making more room for tensor and RT cores on the die.

 

Deep Learning AI IS the same technology being used to teach vehicles to drive themselves in the real world. It may take the next generations of cards before we see gaming applications beyond DLSS from it, but the future of gaming could be very interesting. RTX is just paving the way. Now if they'd only release some stuff for us to experience it with...given that it's November >.<

 

I mean.. am I really the only one who thinks a supercomputer training itself to fly AI aircraft with more human-like reactions and decisions sounds like amazing potential for the future of combat flight sims? It's just a pipe dream right now, but this is stuff that was previously considered impossible, or at least requiring too much effort.

 

Anyway - all just speculation right now. Grain of salt and all. Who knows at what point a dev team would have to invest in their own $130K supercomputer to train the neural networks required for such tasks. I'm not hopeful, just intrigued. I've been hearing about deep learning for a while now, and I'd been wondering whether it was going to find a place in gaming.

 

Not going to lie.. I like the sound of deep learning AI for gaming more than I like the sound of it for self-driving cars. I mean.. ever seen the movie Maximum Overdrive? That movie was never scary to me, but now that we're actually building Skynet and cars are being taught to drive without human input, I mean...

Edited by Headwarp

Win 11 Pro, Z790, i9 13900K, RTX 4090, 64GB DDR5 6400, OS and DCS are on separate PCIe 4.0 drives 

Sim hardware - VKB MCG Ultimate with 200mm extension, Virpil T-50CM3 Dual throttles.   Blackhog B-explorer (A), TM Cougar MFD's (two), MFG Crosswinds with dampener.   Obutto R3volution gaming pit.  

 

Posted

If Eagle Dynamics were really smart, they would "teach" their AI iteratively using machine learning.

 

 

Set up a dogfight between two aircraft, run 1,000,000 iterations, and you have the best pilot AI in existence.

 

 

Put that AI up against human players by adjusting its decision-making speed. The slower the AI's "thinking", the more it will screw up.

 

 

You don't even need expensive processing once the AI has learned all it needs to; the training is the costly part, not running the finished AI.
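To make the self-play idea concrete, here's a toy loop in plain Python (a deliberately silly stand-in: the game of Nim instead of a dogfight, a tabular value lookup instead of a neural net, and nothing Eagle Dynamics has announced). The pattern is the AlphaGo-style one: a single shared policy plays both sides and improves from the outcomes.

import random

# One shared value table plays both sides of a trivial turn-based game
# (Nim: take 1-3 counters, whoever takes the last counter wins).
# A stand-in for "two aircraft, a million iterations" at toy scale.
Q = {}  # (counters_left, take) -> estimated value for the player to move

def pick(n, eps):
    # Epsilon-greedy choice among the legal takes.
    takes = [t for t in (1, 2, 3) if t <= n]
    if random.random() < eps:
        return random.choice(takes)
    return max(takes, key=lambda t: Q.get((n, t), 0.0))

def play_one_game(start=11, eps=0.1, alpha=0.2):
    n, moves = start, []
    while n > 0:
        t = pick(n, eps)
        moves.append((n, t))
        n -= t
    # Whoever took the last counter wins.  Credit every move with the
    # final outcome, flipping the sign because the players alternate.
    reward = 1.0
    for sa in reversed(moves):
        Q[sa] = Q.get(sa, 0.0) + alpha * (reward - Q.get(sa, 0.0))
        reward = -reward

for _ in range(50000):
    play_one_game()

# Nim theory says: from 11 counters, take 3 to leave a multiple of 4.
# After training, the greedy (eps=0) policy should agree.
print("best take from 11 counters:", pick(11, eps=0.0))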


Posted

I don't get where the surplus of wisdom in AI is supposed to come from.

 

 

No doubt, standard situations are best dealt with by machines and routines, but what IF...some idiot out of nowhere does THIS, something no AI ever predicted, that kind of thing.

 

 

In other words, if I had to kill an AI, fight against it, what could work is doing something totally odd, something forbidden, something so far off and contrary....

 

 

There will be a weak spot in that idea of AI, I am almost certain.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted

There will be a weak spot in that idea of AI, I am almost certain.

 

I'm counting on it :)

9700k @ stock , Aorus Pro Z390 wifi , 32gb 3200 mhz CL16 , 1tb EVO 970 , MSI RX 6800XT Gaming X TRIO , Seasonic Prime 850w Gold , Coolermaster H500m , Noctua NH-D15S , CH Pro throttle and T50CM2/WarBrD base on Foxxmounts , CH pedals , Reverb G2v2

Posted
I don't get where the surplus of wisdom in AI is supposed to come from.

 

 

No doubt, standard situations are best dealt with by machines and routines, but what IF...some idiot out of nowhere does THIS, something no AI ever predicted, that kind of thing.

 

 

In other words, if I had to kill an AI, fight against it, what could work is doing something totally odd, something forbidden, something so far off and contrary....

 

 

There will be a weak spot in that idea of AI, I am almost certain.

 

 

Wrong, you're thinking in the old way of programming "AI".

 

 

Look at AlphaGo and you'll see the future of gaming AI. But you don't have to be Google to design a machine-learning AI...

