
Recommended Posts

Posted

I currently have a Reverb G1 (the v2 hardware revision); it drives the same number of pixels as the G2. On my current hardware and settings I get a good, consistent 45 FPS across most scenarios, except in some very heavy missions. I use a PD of 1 and my SS is typically at 100%... I can read the cockpit fine.

 

A PD of 1.7 would have a massive impact on performance, but you simply don't need it with the Reverb. Increasing SS in SteamVR makes much more sense if you want enhanced clarity, as it provides a more granular level of control.

SYSTEM SPECS: Hardware AMD 9800X3D, 64GB RAM, 4090 FE, Virpil T50CM3 Throttle, WinWing Orion 2 & F-16EX + MFG Crosswinds V2, Varjo Aero
SOFTWARE: Microsoft Windows 11, VoiceAttack & VAICOM PRO

YOUTUBE CHANNEL: @speed-of-heat


Posted
Hi, everybody:

 

 

I only use virtual reality in DCS, and I'm just about to pre-order a new Reverb G2, because it looks like it's going to be the best headset for simulation, and I want to take advantage of the 100 Euro discount on the pre-purchase.

But first I would like to ask a question to those of you who currently have a Reverb:

 

My equipment is a Ryzen 3700X, with 32 GB at 3600 MHz and an RTX 2080.

Do you think this is enough equipment to handle the new G2?

 

Reading around the forums, I see that the graphics are decent, but people who have a 1080 Ti or a 2080 say it is "just" enough. Logically a 2080 Ti will do better, since for VR you always want the best GPU possible, and even that may not be enough?

 

Currently I fly with a Rift S, always offline and in light missions, and I am happy with the performance as long as I don't go to a PD higher than 1.2. This way I get 40 fps with ASW active on smooth NOE flights.

 

People who argue that the Reverb doesn't need an increased PD say that performance even improves, but of course they usually come from flying with high PDs (1.7 or even more) on their previous HMDs, hence my doubts...

 

I look forward to your contributions, which will always be welcome...

 

Regards

 

 

You can get a steady 45 FPS with the HP Reverb G2 on your current system!

 

 

I get 45-90 FPS with the Reverb G1 now. See system in signature.

 

 

Because I want the best of the best, I'm going to upgrade at the end of the year. But will it be Nvidia or AMD... time will tell.

 

 

90 FPS in DCS on high settings is a dream. I hope it finally comes true. :D

Newest system: AMD 9800X3D, Kingston 128 GB DDR5, MSI RTX 5090 (ready for buying), Corsair 150 Pro, 3x Samsung 970 Pro, Logitech X-56 HOTAS, Pimax Crystal Light (Super is purchased), ASUS 1200 Watt.

New system: i9-9900KS, Kingston 128 GB DDR4 3200 MHz, MSI RTX 4090, Corsair H150 Pro RGB, 2x Samsung 970 EVO 2 TB, 2x Samsung 970 EVO 1 TB, SanDisk M.2 500 MB, 2x Crucial 1 TB, T16000M HOTAS, HP Reverb Professional 2, Corsair 750 Watt.

Old system: i7-4770K (OC 4.5 GHz), Kingston 24 GB DDR3 1600 MHz, MSI RTX 2080 (OC 2070 MHz), 2x 500 GB SSD, 3.5 TB HDD, 55" Samsung 3D TV, TrackIR 5, Logitech HD Cam, T16000M HOTAS. All DCS modules, maps and campaigns :pilotfly:

Posted (edited)
My MB does have a front-panel USB-C port, which is hooked up to the front of my case.

 

My Quest with Link works just fine in it so hopefully my Reverb G2 will also.

Yeah, hopefully it does. I looked up the specs of my motherboard and my USB-C port only supplies 3 amps. So as long as yours puts out 6 watts you should be good.

 

Looks like I'll need to use an adapter and also have the headset plugged into a power outlet.

Edited by sze5003

Asus ROG Strix Z790-E | Core i9 13900K-NZXT Kraken X73 AIO | 32GB DDR5 G Skill Neo 6600mhz | 2TB Sk Hynix P41 Platinum nvme |1TB Evo 970 Plus nvme | OCZ Trion 150 960GB | 256GB Samsung 830 | 1TB Samsung 850 EVO | Gigabyte OC 4090  | Phanteks P600S | 1000W MSI  MPG A1000G | LG C2 42 Evo 3840x2160 @ 120hz

Posted
Yeah, hopefully it does. I looked up the specs of my motherboard and my USB-C port only supplies 3 amps. So as long as yours puts out 6 watts you should be good.

 

Looks like I'll need to use an adapter and also have the headset plugged into a power outlet.

 

Just looked up the specs on my Z390 Dark MB; the USB-C front-panel port I want to use is listed as 5V 3A. None of the USB headers on this MB go above 5V. The manual does say to just use a powered USB hub if more is needed, though.

Don B

EVGA Z390 Dark MB | i9 9900k CPU @ 5.1 GHz | Gigabyte 4090 OC | 64 GB Corsair Vengeance 3200 MHz CL16 | Corsair H150i Pro Cooler |Virpil CM3 Stick w/ Alpha Prime Grip 200mm ext| Virpil CM3 Throttle | VPC Rotor TCS Base w/ Alpha-L Grip| Point Control V2|Varjo Aero|

Posted
Just looked up the specs on my Z390 Dark MB; the USB-C front-panel port I want to use is listed as 5V 3A. None of the USB headers on this MB go above 5V. The manual does say to just use a powered USB hub if more is needed, though.

 

Unless I'm forgetting my Ohm's Law [nerdiest thing I've said today, which is saying a lot], 5V at 3 amps is 15 watts. So maybe you're fine?
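(Sanity check, purely illustrative: the relation being used here is power = volts x amps, rather than Ohm's law proper.)

```python
# Power (W) = voltage (V) x current (A)
volts, amps = 5.0, 3.0
print(f"{volts} V x {amps} A = {volts * amps} W")  # 15.0 W, comfortably above the ~6 W mentioned earlier
```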

Posted
Unless I'm forgetting my Ohm's Law [nerdiest thing I've said today, which is saying a lot], 5V at 3 amps is 15 watts. So maybe you're fine?

 

OK thanks, we'll see; it probably will be fine.

Don B

EVGA Z390 Dark MB | i9 9900k CPU @ 5.1 GHz | Gigabyte 4090 OC | 64 GB Corsair Vengeance 3200 MHz CL16 | Corsair H150i Pro Cooler |Virpil CM3 Stick w/ Alpha Prime Grip 200mm ext| Virpil CM3 Throttle | VPC Rotor TCS Base w/ Alpha-L Grip| Point Control V2|Varjo Aero|

Posted

It's all pretty confusing, with USB-C being a connector and not a USB standard. Example: USB 3.1 -- which is what my mobo's Type C connector is -- is supposed to deliver up to 20 volts at 5A (= 100W).

 

Yet my motherboard's specs say its USB 3.1 port can deliver 3A of current, not 5A. So... is that 3A at 20 volts (60W)? Or is the power delivery spec for USB 3.1 just the maximum, and anything less than that is fair game to be called USB 3.1 as long as the data throughput requirement is met?

 

Not being smart enough to puzzle it out, my plan when my G2 arrives is to plug it into the USB-C port and just see what happens.

Posted
It's all pretty confusing, with USB-C being a connector and not a USB standard. Example: USB 3.1 -- which is what my mobo's Type C connector is -- is supposed to deliver up to 20 volts at 5A (= 100W).

 

Yet my motherboard's specs say its USB 3.1 port can deliver 3A of current, not 5A. So... is that 3A at 20 volts (60W)? Or is the power delivery spec for USB 3.1 just the maximum, and anything less than that is fair game to be called USB 3.1 as long as the data throughput requirement is met?

 

Not being smart enough to puzzle it out, my plan when my G2 arrives is to plug it into the USB-C port and just see what happens.

Yeah, my Z370-E Asus board has some USB-A 3.1 ports plus Type-C, and it says they support 3A power output. The USB-C is in the same section as the other two USB-A ports.

 

I may as well try it out, and if not I'll just have to find a longer extension cord to plug the damn headset in when I get one.

Asus ROG Strix Z790-E | Core i9 13900K-NZXT Kraken X73 AIO | 32GB DDR5 G Skill Neo 6600mhz | 2TB Sk Hynix P41 Platinum nvme |1TB Evo 970 Plus nvme | OCZ Trion 150 960GB | 256GB Samsung 830 | 1TB Samsung 850 EVO | Gigabyte OC 4090  | Phanteks P600S | 1000W MSI  MPG A1000G | LG C2 42 Evo 3840x2160 @ 120hz

Posted

The Reverb is 2160 x 2160 per eye (x2).

SYSTEM SPECS: Hardware AMD 9800X3D, 64GB RAM, 4090 FE, Virpil T50CM3 Throttle, WinWing Orion 2 & F-16EX + MFG Crosswinds V2, Varjo Aero
SOFTWARE: Microsoft Windows 11, VoiceAttack & VAICOM PRO

YOUTUBE CHANNEL: @speed-of-heat


Posted
????

 

...hence more pixels to push to the Reverb than to a 4K monitor, and hence more "graphical load" (assuming there is no efficiency gain from rendering essentially the same scene twice, once for each eye). There are also complications in VR like supersampling and reprojection, but I am not nearly smart enough to know how those impact the GPU versus the CPU.

Posted
????

 

 

It's easy to figure out. Multiply out the total pixels first, then multiply by the refresh rate, and you get the graphical load in terms of pixels per second that need to be rendered:

 

4k: [3840 x 2160] x 60 = 497,664,000

 

G2: [2160 x 2160 x2] x 90 = 839,808,000

 

Conclusion: the G2 at 90 Hz is roughly 1.7x the load of 4K at 60 Hz.
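For anyone who wants to plug in their own numbers, here's a minimal sketch of the same back-of-the-envelope arithmetic. It deliberately ignores supersampling, reprojection and any single-pass-stereo savings, so treat it as a rough comparison only:

```python
# Rough pixels-per-second load: total panel pixels multiplied by refresh rate.
def pixel_rate(width, height, eyes, refresh_hz):
    return width * height * eyes * refresh_hz

monitor_4k = pixel_rate(3840, 2160, 1, 60)  # 497,664,000
reverb_g2 = pixel_rate(2160, 2160, 2, 90)   # 839,808,000

print(f"4K @ 60 Hz: {monitor_4k:,} px/s")
print(f"G2 @ 90 Hz: {reverb_g2:,} px/s")
print(f"Ratio: {reverb_g2 / monitor_4k:.2f}x")  # ~1.69x
```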

Posted
It's easy to figure out. Multiply out the total pixels first, then multiply by the refresh rate, and you get the graphical load in terms of pixels per second that need to be rendered:

 

4k: [3840 x 2160] x 60 = 497,664,000

 

G2: [2160 x 2160 x2] x 90 = 839,808,000

 

Conclusion: the G2 at 90 Hz is roughly 1.7x the load of 4K at 60 Hz.

 

Conclusion -> 1.7x ... get a 3080 Ti

Posted (edited)
Except that for VR it's twice as much CPU work, because the game actually renders two different frames.

 

I think that's covered...

G2: [2160 x 2160 x2] x 90 = 839,808,000

 

I'm almost never bound by CPU... except in the heaviest of missions.

Edited by speed-of-heat

SYSTEM SPECS: Hardware AMD 9800X3D, 64GB RAM, 4090 FE, Virpil T50CM3 Throttle, WinWing Orion 2 & F-16EX + MFG Crosswinds V2, Varjo Aero
SOFTWARE: Microsoft Windows 11, VoiceAttack & VAICOM PRO

YOUTUBE CHANNEL: @speed-of-heat


Posted
I'm almost never bound by CPU... except in the heaviest of missions.

 

What's your preferred method of monitoring GPU and CPU utilization? FPSVR seems to show GPU headroom even when I suspect one core is slammed, so I'm looking for something else to give me a better picture of what's really going on under the hood. Windows Performance Monitor? AIDA64? Thanks!

Posted

I'm aiming for 45 FPS. 90 FPS is not gonna happen with today's hardware. If a video card comes out that is capable of 90 FPS, then my CPU may be the limiting factor.

Posted
What's your preferred method of monitoring GPU and CPU utilization? FPSVR seems to show GPU headroom even when I suspect one core is slammed, so I'm looking for something else to give me a better picture of what's really going on under the hood. Windows Performance Monitor? AIDA64? Thanks!

 

I use FPSVR as well, and I'm looking more at the frame times for the CPU than the state of the core, which usually shows me 2 or 3 ms of headroom, e.g. frame times between 6 and 9 ms for the CPU.
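For context, those headroom figures follow from the frame-time budget, which is simply 1000 ms divided by the target frame rate; a quick illustrative sketch:

```python
# Frame-time budget in milliseconds for a given target frame rate.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

for fps in (90, 45):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.1f} ms per frame")
# 90 FPS -> 11.1 ms, 45 FPS -> 22.2 ms.
# CPU frame times of 6-9 ms against the ~11.1 ms budget at 90 Hz is
# where the couple of milliseconds of headroom comes from.
```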

 

I'm aiming for 45 FPS. 90 FPS is not gonna happen with today's hardware. If a video card comes out that is capable of 90 FPS, then my CPU may be the limiting factor.

 

pretty much this

SYSTEM SPECS: Hardware AMD 9800X3D, 64GB RAM, 4090 FE, Virpil T50CM3 Throttle, WinWing Orion 2 & F-16EX + MFG Crosswinds V2, Varjo Aero
SOFTWARE: Microsoft Windows 11, VoiceAttack & VAICOM PRO

YOUTUBE CHANNEL: @speed-of-heat


Posted (edited)
It's easy to figure out. Multiply out the total pixels first, then multiply by the refresh rate, and you get the graphical load in terms of pixels per second that need to be rendered:

 

4k: [3840 x 2160] x 60 = 497,664,000

 

G2: [2160 x 2160 x2] x 90 = 839,808,000

 

Conclusion: the G2 at 90 Hz is roughly 1.7x the load of 4K at 60 Hz.

 

 

Well, according to this method, on my Odyssey Plus I have roughly 414,720,000. I am near the limit for a GTX 1080, as I get around 45-55 FPS most of the time in MP...

So I am guessing that if I did buy the G2, I would most likely need a new video card as well, since my card is nearly maxed out with the O+? I doubt this 1080 could push twice the pixels??

Where would I find info on how many pixels a 1080 can push per second? From what I see, wouldn't this calculation be bandwidth-based, i.e. about what the cable can carry?
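Applying the same pixels-per-second method to compare the two headsets directly (a rough sketch only; it assumes Odyssey+ panels of 1440 x 1600 per eye at 90 Hz, which matches the ~414.7 million figure above, and it ignores CPU load, supersampling and reprojection):

```python
# Same back-of-the-envelope method: total panel pixels x refresh rate.
def pixel_rate(width, height, eyes, refresh_hz):
    return width * height * eyes * refresh_hz

odyssey_plus = pixel_rate(1440, 1600, 2, 90)  # 414,720,000
reverb_g2 = pixel_rate(2160, 2160, 2, 90)     # 839,808,000

print(f"G2 / O+ ratio: {reverb_g2 / odyssey_plus:.2f}x")  # roughly 2x at native resolution
```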

Edited by The_Nephilim

Intel Ultra 265K 5.5GHZ   /  Gigabyte Z890 Aorus Elite  /  MSI 4070Ti Ventus 12GB   /  SoundBlaster Z SoundCard  /  Corsair Vengance 64GB Ram  /  HP Reverb G2  /  Samsung 980 Pro 2TB Games   /  Crucial 512GB M.2 Win 11 Pro 21H2 /  ButtKicker Gamer  /  CoolerMaster TD500 Mesh V2 PC Case

Posted

Remember guys, the solution to improving frame rates is not just to buy a more expensive graphics card. There are plenty of graphics settings where you can compromise on "quality", and oftentimes the quality sacrifices are barely noticeable.

 

Anti-aliasing and shadows tend to be the two biggies, where an intermediate sweet spot gives you visual quality without being too much of a GPU resource hog.


Posted

I've said it a zillion times before, but IMO you can tweak till the cows come home; the only things that REALLY affect performance are MSAA, SteamVR SS/PD and shadows. Everything else is fairly inconsequential.

Intel i7 12700K · MSI Gaming X Trio RTX 4090 · ASUS ROG STRIX Z690-A Wi-Fi · MSI 32" MPG321UR QD · Samsung 970 500Gb M.2 NVMe · 2 x Samsung 850 Evo 1Tb · 2Tb HDD · 32Gb Corsair Vengance 3000MHz DDR4 · Windows 11 · Thrustmaster TPR Pedals · Tobii Eye Tracker 5 · Thrustmaster F/A-18 Hornet Grip · Virpil MongoosT-50CM3 Base · Virpil Throttle MT-50 CM3 · Virpil Alpha Prime Grip · Virpil Control Panel 2 · Thrustmaster F-16 MFDs · HTC Vive Pro 2 · Total Controls Multifunction Button Box

Posted
Well, according to this method, on my Odyssey Plus I have roughly 414,720,000. I am near the limit for a GTX 1080, as I get around 45-55 FPS most of the time in MP...

So I am guessing that if I did buy the G2, I would most likely need a new video card as well, since my card is nearly maxed out with the O+? I doubt this 1080 could push twice the pixels??

Where would I find info on how many pixels a 1080 can push per second? From what I see, wouldn't this calculation be bandwidth-based, i.e. about what the cable can carry?

 

The G2 will have a 60Hz mode (as opposed to 90), as well as a mode to set it to half its native resolution, presumably to help folks with less-than-cutting-edge hardware.

 

Of course, I can see little point in buying a G2 just to run it in half resolution, but it might provide a bridging solution if you get the G2, find you can't get it to run smoothly, and then need some time to convince the wife that you need a 3080Ti more than the kids need new shoes.
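To put rough numbers on the 60 Hz fallback (same back-of-the-envelope method as earlier in the thread; this deliberately doesn't try to guess how the half-resolution mode scales):

```python
# Effect of the G2's 60 Hz mode on the raw pixels-per-second load.
def pixel_rate(width, height, eyes, refresh_hz):
    return width * height * eyes * refresh_hz

g2_90 = pixel_rate(2160, 2160, 2, 90)   # 839,808,000
g2_60 = pixel_rate(2160, 2160, 2, 60)   # 559,872,000
uhd_60 = pixel_rate(3840, 2160, 1, 60)  # 497,664,000

print(f"G2 60 Hz vs G2 90 Hz: {g2_60 / g2_90:.2f}x")   # ~0.67x
print(f"G2 60 Hz vs 4K 60 Hz: {g2_60 / uhd_60:.2f}x")  # ~1.12x
```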

Posted
The G2 will have a 60Hz mode (as opposed to 90), as well as a mode to set it to half its native resolution, presumably to help folks with less-than-cutting-edge hardware.

 

 

 

Of course, I can see little point in buying a G2 just to run it in half resolution, but it might provide a bridging solution if you get the G2, find you can't get it to run smoothly, and then need some time to convince the wife that you need a 3080Ti more than the kids need new shoes.

Yep, that 3080 Ti will be pricey, especially if you need to swap out other components too. I recently read an article that said the 3080 Ti needs a more powerful PSU connector, possibly a 12-pin instead of an 8-pin. So for people who have modular PSUs, I guess you just add another cable if you have the room and the ability.

 

Otherwise you may need another PSU. Not that expensive in reality, but it's always a pain to swap out PSUs.

Asus ROG Strix Z790-E | Core i9 13900K-NZXT Kraken X73 AIO | 32GB DDR5 G Skill Neo 6600mhz | 2TB Sk Hynix P41 Platinum nvme |1TB Evo 970 Plus nvme | OCZ Trion 150 960GB | 256GB Samsung 830 | 1TB Samsung 850 EVO | Gigabyte OC 4090  | Phanteks P600S | 1000W MSI  MPG A1000G | LG C2 42 Evo 3840x2160 @ 120hz
