
Nvidia RTX cards and DCS - ?


Recommended Posts

  • Replies 142

25-35% is actually a big gain.

 

25-35% (projected gain) RTX 2080 vs GTX 1080....

 

Assuming there is no RTX Titan Xp hiding behind the curtain, and based on pricing and tech specs (CUDA cores etc.)...

 

  • RTX 2080ti = Upgrade from Titan Xp
  • RTX 2080 = Upgrade from GTX 1080ti

 

The 1080 Ti is roughly 20-25% faster than the 1080.

 

That could potentially cut the 2080's lead down to roughly 10% over a 1080 Ti.
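A quick back-of-the-envelope sketch makes that arithmetic explicit. The percentages are the projections quoted above, not benchmarks:

```python
# All speedups are relative to a baseline GTX 1080 (= 1.0).
# The ranges are the thread's projected/cited figures, not benchmarks.
rtx2080 = (1.25, 1.35)    # projected 25-35% gain over a 1080
gtx1080ti = (1.20, 1.25)  # commonly cited 20-25% gain over a 1080

# Dividing out the 1080 Ti's lead leaves the 2080's remaining edge.
low = rtx2080[0] / gtx1080ti[1] - 1    # worst case: 1.25 / 1.25 - 1
high = rtx2080[1] / gtx1080ti[0] - 1   # best case:  1.35 / 1.20 - 1
print(f"2080 vs 1080 Ti: +{low:.0%} to +{high:.0%}")
```

That works out to somewhere between roughly 0% and 12%, which is where the ~10% figure comes from.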


The 7 nm cards will be the interesting ones. They will be able to squeeze in a lot more cores and crank up the clocks at the same time.


My PC specs below:

Case: Corsair 400C

PSU: SEASONIC SS-760XP2 760W Platinum

CPU: AMD RYZEN 3900X (12C/24T)

RAM: 32 GB 4266Mhz (two 2x8 kits) of trident Z RGB @3600Mhz CL 14 CR=1T

MOBO: ASUS CROSSHAIR HERO VI AM4

GFX: GTX 1080Ti MSI Gaming X

Cooler: NXZT Kraken X62 280mm AIO

Storage: Samsung 960 EVO 1TB M.2+6GB WD 6Gb red

HOTAS: Thrustmaster Warthog + CH pro pedals

Monitor: Gigabyte AORUS AD27QD Freesync HDR400 1440P

 


Is it even possible? I mean, radar waves are not light. I don't know if you can "generate" ray physics and use it as game logic.

 

 

You can use the GPU to train machine-learning algorithms or, more generally, to perform highly parallelized matrix and vector algebra, so yes, it's definitely possible.
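As a toy illustration of what "highly parallelized vector algebra" means here, this pure-Python sketch applies the radar range equation's 1/R^4 power falloff to a batch of made-up target ranges; on a GPU the same element-wise map would run as one parallel kernel:

```python
# Made-up target ranges in metres; a plain list stands in for a GPU
# buffer, and the list comprehension for a parallel kernel launch.
ranges_m = [1_000.0, 5_000.0, 25_000.0, 50_000.0]

# Received power falls off as 1/R^4 (radar range equation with the
# constants dropped), computed for every target "at once".
returns = [1.0 / r**4 for r in ranges_m]

# Sanity check: the nearest target produces the strongest echo.
assert max(returns) == 1.0 / min(ranges_m) ** 4
```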


Edited by sobek

Good, fast, cheap. Choose any two.

Come let's eat grandpa!

Use punctuation, save lives!


Remember, this comparison is vs. the 1080, not the 1080 Ti. And I'm not sure ED will support DLSS. It isn't just an Nvidia thing, right? The game has to support it?

hsb

HW Spec in Spoiler

---

 

i7-10700K Direct-To-Die/OC'ed to 5.1GHz, MSI Z490 MB, 32GB DDR4 3200MHz, EVGA 2080 Ti FTW3, NVMe+SSD, Win 10 x64 Pro, MFG, Warthog, TM MFDs, Komodo Huey set, Rverbe G1

 


I truly hope DLSS will be incorporated into DCS World. It could give a big performance boost, especially in VR. A released video showed an almost 100% fps gain for a 2080 Ti with DLSS versus a 1080 Ti with TAA in 4K, while retaining the same visual fidelity.


Edited by Supmua

PC: 5800X3D/4090, 11700K/3090, 9900K/2080Ti.

Joystick bases: TMW, VPC WarBRD, MT50CM2, VKB GFII, FSSB R3L

Joystick grips: TM (Warthog, F/A-18C), Realsimulator (F-16SGRH, F-18CGRH), VKB (Kosmosima LH, MCG, MCG Pro), VPC MongoosT50-CM2

Throttles: TMW, Winwing Super Taurus, Logitech Throttle Quadrant, Realsimulator Throttle (soon)

VR: HTC Vive/Pro, Oculus Rift/Quest 2, Valve Index, Varjo Aero, https://forum.dcs.world/topic/300065-varjo-aero-general-guide-for-new-owners/


Is it even possible? I mean, radar waves are not light. I don't know if you can "generate" ray physics and use it as game logic.

As far as I understand RT, developers can now add lights the way they behave in real life and expect a realistic outcome in real time, nothing more than that.

 

Actually, radar waves are light. Visible light and radio waves are both electromagnetic radiation, just at different wavelengths; one of them happens to be visible to humans. As Nvidia described their ray tracing, they trace inversely, from the eye/camera outward, to target only the handful of rays that would actually hit the eye. The newer radar modeling is ray-casting, which in theory would work physically the same way, just not to light the scene. The problem here is that it's exclusive to both Nvidia and DX12.
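For the curious, the intersection test at the heart of both uses really is wavelength-agnostic. A minimal 2D ray-cast sketch, with all geometry made up for illustration:

```python
import math

def ray_hits_circle(origin, direction, center, radius):
    """Nearest hit distance of a ray against a circle, or None.

    Solves |o + t*d - c|^2 = r^2 for t; the same test serves a light
    ray or a radar "ray" -- only the material response would differ.
    """
    ox, oy = origin
    dx, dy = direction
    fx, fy = ox - center[0], oy - center[1]
    a = dx * dx + dy * dy
    b = 2 * (fx * dx + fy * dy)
    c = fx * fx + fy * fy - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses entirely
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearest intersection
    return t if t >= 0 else None

# "Radar" at the origin looking down +x at a target of radius 5
# centred 100 units away: first return at distance 95.
assert ray_hits_circle((0, 0), (1, 0), (100, 0), 5) == 95.0
```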


Not for $1000-1400 it isn't.

 

There really is a lot of ill-informed chatter going on here. The Nvidia performance chart details the RTX 2080, which is NOT $1000 to $1400. It is $799.

 

The RTX 2080 is based on the TU104 chip (2,944 CUDA cores).

 

The RTX 2080 Ti Dual is based on the TU102 chip (4,352 CUDA cores), which is a hell of a lot bigger and more powerful. The OC version of that is the one at £1,300 and change, including the 20% VAT sales tax; I know because I ordered one. Whether it was a wise decision is debatable, but I needed to replace my aging, struggling, mega-overclocked 980 Ti, so I went for it.

 

Everything will come out in the wash. Extrapolating performance from figures from totally different cards is pure clickbait conjecture.

 

A more balanced view, but yet again pure speculation:

 

 

The one thing I am confident of is that it will comfortably outperform my old 980 Ti, and I have absolutely no regrets. I always replace my card every other generation anyway; the next update will be the 2280 Ti. If, as Jay Z says, the 2080 Ti Dual is replacing the Titan series, I will be over the moon. September 20th cannot come soon enough.


Edited by Tinkickef

System spec: i9 9900K, Gigabyte Aorus Z390 Ultra motherboard, 32Gb Corsair Vengeance DDR4 3200 RAM, Corsair M.2 NVMe 1Tb Boot SSD. Seagate 1Tb Hybrid mass storage SSD. ASUS RTX2080TI Dual OC, Thermaltake Flo Riing 360mm water pumper, EVGA 850G3 PSU. HP Reverb, TM Warthog, Crosswind pedals, Buttkicker Gamer 2.


There really is a lot of ill-informed chatter going on here. The Nvidia performance chart details the RTX 2080, which is NOT $1000 to $1400. It is $799.

 

The RTX 2080 is based on the TU104 chip (2,944 CUDA cores).

 

The RTX 2080 Ti Dual is based on the TU102 chip (4,352 CUDA cores), which is a hell of a lot bigger and more powerful. The OC version of that is the one at £1,300 and change, including the 20% VAT sales tax; I know because I ordered one. Whether it was a wise decision is debatable, but I needed to replace my aging, struggling, mega-overclocked 980 Ti, so I went for it.

 

Everything will come out in the wash. Extrapolating performance from figures from totally different cards is pure clickbait conjecture.

 

A more balanced view, but yet again pure speculation:

 

 

The one thing I am confident of is that it will comfortably outperform my old 980 Ti, and I have absolutely no regrets. I always replace my card every other generation anyway; the next update will be the 2280 Ti. If, as Jay Z says, the 2080 Ti Dual is replacing the Titan series, I will be over the moon. September 20th cannot come soon enough.

 

I think the point he is making is that there is a huge price increase, and the Ti is likely to see a similar increase over the previous Ti card.

1080 = $550 → 2080 = $700

1080 Ti = $700 → 2080 Ti = $1,000, for a possible 25-30% increase

The 980 Ti → 1080 Ti jump was about $100 more for a 50%+ increase.
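Plugging those quoted launch prices into a quick performance-per-dollar comparison (using the midpoint of the estimated 25-30% gain; this is speculation, not a benchmark):

```python
# Relative performance divided by launch price, using the figures
# quoted above; the 1.275 multiplier is an assumed midpoint estimate.
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

gtx1080ti = perf_per_dollar(1.000, 700)   # baseline
rtx2080ti = perf_per_dollar(1.275, 1000)  # assumed +27.5% for +$300

# The new card would deliver less performance per dollar than its
# predecessor did at launch -- the "value" complaint in a nutshell.
assert rtx2080ti < gtx1080ti
```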

 

So for those not so interested in the new tech, I would be less enthusiastic about getting one, if the guy in that video is correct.

Win 10 64//4.5g i7 Kaby Lake//gtx Titan x pascal//16gb 3200ram//Asus Maximux Hero IX//Oculus Rift//


Actually, radar waves are light. Visible light and radio waves are both electromagnetic radiation, just at different wavelengths; one of them happens to be visible to humans. As Nvidia described their ray tracing, they trace inversely, from the eye/camera outward, to target only the handful of rays that would actually hit the eye. The newer radar modeling is ray-casting, which in theory would work physically the same way, just not to light the scene. The problem here is that it's exclusive to both Nvidia and DX12.

 

 

That they are related in real life has nothing to do with graphics-engine coding, especially when said coding is of a highly proprietary nature.

 

But as you say, it's a DX12 thing anyway, and DCS is not going DX12, so the whats and wherefores of RT are utterly irrelevant since we won't be getting it :P

-edit Also, Remi, who seems obsessed with having "the latest and greatest": congratulations, you are the primary target audience for every marketer who ever drew breath. They love people like you.

Where there are enemies, Cossacks will be found to defeat them.

5800x3d * 3090 * 64gb * Reverb G2


RTX is on 12 nm, but that lithography will be superseded by 7 nm next year, so there's a lot of speculation that Nvidia might do a 7 nm refresh of the Turing architecture in 2019. That could enable more CUDA cores and higher clocks at the same time.


My PC specs below:

Case: Corsair 400C

PSU: SEASONIC SS-760XP2 760W Platinum

CPU: AMD RYZEN 3900X (12C/24T)

RAM: 32 GB 4266Mhz (two 2x8 kits) of trident Z RGB @3600Mhz CL 14 CR=1T

MOBO: ASUS CROSSHAIR HERO VI AM4

GFX: GTX 1080Ti MSI Gaming X

Cooler: NXZT Kraken X62 280mm AIO

Storage: Samsung 960 EVO 1TB M.2+6GB WD 6Gb red

HOTAS: Thrustmaster Warthog + CH pro pedals

Monitor: Gigabyte AORUS AD27QD Freesync HDR400 1440P

 


That they are related in real life has nothing to do with graphics-engine coding, especially when said coding is of a highly proprietary nature.

 

But as you say, it's a DX12 thing anyway, and DCS is not going DX12, so the whats and wherefores of RT are utterly irrelevant since we won't be getting it :P

-edit Also, Remi, who seems obsessed with having "the latest and greatest": congratulations, you are the primary target audience for every marketer who ever drew breath. They love people like you.

 

My point being that the core coding would be fundamentally no different, other than the target viewport being a radar screen. The two being exactly the same in real life means the computational work is also exactly the same, probably less for a radar simulation. Furthermore, anything added to the DX API becomes a vendor-agnostic spec that any GPU maker can build hardware to address. At that point it'll be the tessellation shenanigans all over again, along with GameWorks competition-nerfing.


Am I the only one who finds all the negative posting irritating? Maybe that's why you can't afford a new graphics card? If spending $1000 on a new card is going to bankrupt you and your family, you shouldn't be wasting your time flying imaginary airplanes on a computer or commenting in a flight-sim forum. Take the money and put it in a college fund for your children, or use it to enhance your salary and marketable skills. How much time did you spend gaming and simming in the last year? Unless you flip burgers for a living, that would have more than paid for any computer upgrade you want.

 

For those subject to high taxes, import fees, and exchange rates, that has nothing to do with Nvidia. Change your government's economic policies.

 

There are serious entitlement issues in the PC gaming and flight-sim communities. But I have a solution: if you don't like Nvidia's prices, don't buy them. It really is that simple. 1000-series prices have dropped; buy one used. $500 spent today gets you a whole lot more bang for your buck than 5 years ago. What level of price and performance would make you happy, exactly? 50 teraflops for $400? Maybe in 10 years.

 

I just started The Witcher 2 because I liked 3 so much. It's a 2011 game that looks gorgeous in 4K with the settings maxed, way ahead of its time, and I bought it for like $5 on a Steam sale. Getting a 4K panel to display it on in 2011 would have been impossible, let alone a graphics card that could push that many pixels. We've come a long way in a very short period of time, gentlemen. It's important to keep that in perspective.

 

Nvidia has some of the smartest people in the world working with multibillion-dollar budgets. Where do you think that money comes from? If you think you can do better, get an EE degree, start your own GPU company, and compete with Nvidia. Good luck with that.

 

My Titan Xp was one of the best computer purchases I ever made. The 1080 Ti came out like a month later, but I hit break-even on my Titan months ago.

 

Right now, there isn't a single person on these boards who knows how the 2080 Ti will perform in DCS. It's uninformed speculation at best, and you are cluttering up the forums with useless information. No amount of trash-talking Nvidia online is going to make AMD cards perform better, draw fewer watts, or cause prices to drop.

 

All the 2080 Ti preorders in the world are currently sold out. Those are the real numbers Nvidia is looking at, not the uninformed, whiny "experts" on forums or YouTube. Stop embarrassing yourself. Computer gaming is a luxury purchase for your leisure time and hobby pursuits. I have yet to see one developer, AI researcher, or artist complain about the price, because the card will make them more productive and efficient. Faster machine learning, faster rendering.

 

Sorry about the rant, but this is getting old quickly. I'm sure there are tons of people who aren't going to upgrade their PC due to budget constraints. They don't whine about it publicly or blame Nvidia.

 

 


Well I sure hope I am lucky enough to be able to snag an EVGA 2080 Ti FTW3 the day they are available to order.

Yeah I am crazy, but I fly in VR and just want it.

Don B

EVGA Z390 Dark MB | i9 9900k CPU @ 5.1 GHz | Gigabyte 4090 OC | 64 GB Corsair Vengeance 3200 MHz CL16 | Corsair H150i Pro Cooler |Virpil CM3 Stick w/ Alpha Prime Grip 200mm ext| Virpil CM3 Throttle | VPC Rotor TCS Base w/ Alpha-L Grip| Point Control V2|Varjo Aero|


Rant...

 

The problem is the value. They carry such a huge price increase over the last generation for such a negligible (estimated) performance increase that they arguably shouldn't even exist as a consumer product.

 

And to your main point: imagine if every time performance doubled, the price also doubled. That would mean essentially zero technological progress in the consumer market. If we had been using that model for even the last 10 years, the new 2080 would cost more than your house. Yet this appears to be what you're arguing for. :doh:

System specs: i5-10600k (4.9 GHz), RX 6950XT, 32GB DDR4 3200, NVMe SSD, Reverb G2, WinWing Super Libra/Taurus, CH Pro Pedals.


The problem is the value. They carry such a huge price increase over the last generation for such a negligible (estimated) performance increase that they arguably shouldn't even exist as a consumer product.

 

And to your main point: imagine if every time performance doubled, the price also doubled. That would mean essentially zero technological progress in the consumer market. If we had been using that model for even the last 10 years, the new 2080 would cost more than your house. Yet this appears to be what you're arguing for. :doh:

 

Thanks for making my point. I'm going to go out on a limb and say not too many people would buy it if it weren't that expensive. Any more imaginary benchmarks you care to share with the group? Look up the cost of old Silicon Graphics workstations and rendering workstations and compare the performance. The new Ti model costs the same as the Titan it is replacing. The 2070s are $100 US more. Oh, the humanity. I hope your family survives.

 

Nvidia is sure going to be making a lot of money off a product that shouldn't exist. I really hope you are a non-native English speaker if you came away thinking I'm arguing costs should double. I'm a big fan of Moore's Law; we wouldn't be here without it. My point was that people who waste time playing video games shouldn't be complaining about the cost of a card to play said video games. If the 2000 series is too much for you, buy a used 1080 Ti or wait for the 2060 or 2050 models. Since you play DCS, I'll give you a below-market price on a 1070 if you want it. You sound like spoilt children who didn't get what they wanted for their birthday.

 

I'm much more interested in long-term Vulkan API performance and driver support, given that's where ED seems to be going.

 

To the people waiting for the FTW model, or any particular model for that matter: if you can stand the wait, go with the EVGA Kingpin model; the 1080 Ti version could beat my Titan in most games when properly overclocked. It'll be a custom PCB with upgraded voltage regulators and binned GPUs with lots of OC headroom. They are even coming out with a hybrid model. He's an electrical engineer who holds all the benchmark records; I'm not even sure he actually games much, if you can believe that. He would rather experiment or use liquid N2.

 

 


I have also been surprised by the negative stuff on the forum about a card with no applicable benchmarks yet. It's the same way on Wall Street, with people trying to trash NVDA with their bear thesis.

 

Nvidia is a great company. I have been playing flight sims since Falcon AT. Every card I have bought from them has been a material improvement over the last, and I expect and trust that the 2080 Ti will be the same.

 

I have also made a lot with the stock. When Nvidia stock fell around the product release, with Citron and others trashing it, I bought more, even though it was already one of my largest positions due to its performance.


Am I the only one that finds all the negative people posting irritating?......rant about people who have a different opinion than me....

 

The irony.

Win 10 64//4.5g i7 Kaby Lake//gtx Titan x pascal//16gb 3200ram//Asus Maximux Hero IX//Oculus Rift//


The irony.

 

We should just shut down all the forums on the net. :D

 

I will perhaps upgrade at some point, but I'm still very happy with my system now. I'm really waiting to see the Rift 2 and what the 2080 Ti will do, and whether it can hold 90 fps with the Rift 1. That might also make me decide to sell my 1080 Ti.

 

There are just so many decisions at the moment, which is very cool, and "my" system is fine for now in DCS until then:

 

Vulkan performance / Threadripper 16 core potential

2080ti / 2nd Gen VR

2080ti / 1st Gen VR @ 90 fps?

 

2080ti / 4K performance

i7-7700K OC @ 5Ghz | ASUS IX Hero MB | ASUS GTX 1080 Ti STRIX | 32GB Corsair 3000Mhz | Corsair H100i V2 Radiator | Samsung 960 EVO M.2 NVMe 500G SSD | Samsung 850 EVO 500G SSD | Corsair HX850i Platinum 850W | Oculus Rift | ASUS PG278Q 27-inch, 2560 x 1440, G-SYNC, 144Hz, 1ms | VKB Gunfighter Pro

Chuck's DCS Tutorial Library

Download PDF Tutorial guides to help get up to speed with aircraft quickly and also great for taking a good look at the aircraft available for DCS before purchasing. Link


We should just shut down all the forums on the net. :D

 

I will perhaps upgrade at some point, but I'm still very happy with my system now. I'm really waiting to see the Rift 2 and what the 2080 Ti will do, and whether it can hold 90 fps with the Rift 1. That might also make me decide to sell my 1080 Ti.

 

There are just so many decisions at the moment, which is very cool, and "my" system is fine for now in DCS until then:

 

Vulkan performance / Threadripper 16 core potential

2080ti / 2nd Gen VR

2080ti / 1st Gen VR @ 90 fps?

 

2080ti / 4K performance

 

This is about where I'm at. I was in much more of a hurry when the 10 series came out, as my 980 Ti struggled with some VR, but now it's just about getting better-quality VR, and I can take my time a bit and see what else comes out.

I'll also be interested to see whether those who did get 2080 Tis saw much of an improvement in DCS VR.

Win 10 64//4.5g i7 Kaby Lake//gtx Titan x pascal//16gb 3200ram//Asus Maximux Hero IX//Oculus Rift//


MSI listed the 2080 Ti in Asia for €1,349 excluding tax.

 

 

In Germany that means another 19% VAT on top of that.

 

 

:(

 

 

Insane numbers

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


Current prices are too much for a graphics card; nevertheless, I'm wondering whether to replace my 1070. I'm quite satisfied with its performance in VR, but I could change for a substantial upgrade. My goal is to raise my pixel density from 1.0 to 1.4-1.5 max while keeping my graphics settings. Will that be possible with a 2070? Will I need a 2080? On the monitor I'm still at 1080p, but with a new card I could move to 1440p, maybe 4K. What card will I need for 4K? Without real benchmarks, no one can answer these questions, so for now it's better to wait and see. Besides that, I think prices will drop in the next 6 months.

