Everything posted by Tensorial_Architect

  1. I can, ... assuredly, ... say that using ultra-high rendering does not net a 50% performance gain with the RTX 5090 over the 4090. The best I have personally seen with everything maxed on the Crystal is 21% over the 4090. I should add, though, that I don't fly helos, which seems to be where so many guys have massive problems. I have an ASUS TUF 5090. With regular voltage and high resolutions on the Crystal or Somnium, it gets up to 75 deg C for the GPU and 84 to 85 deg C for the memory (on the Marianas or Syria map). I have not tried under- or over-volting yet, ... I have a wife and son. Fooling with endless graphics card settings is about priority #183 in life at the moment.
  2. Mr Sukebe, when opportunity presents, would you mind showing us a few pics of your setup? I'd love to see your Moza, Jetseat, and general layout.
  3. It is not that Nvidia is not interested in selling 5090s, ... they are VERY happy to gouge wealthy American consumers for their flagship. The problem is that TSMC cannot produce the GB202 in moderate to high yields. Sadly, Blackhawk, I would also describe the RTX 5090 as, ... middling. Go for an RTX 4090, ... still the best high-end sweet spot for DCS.
  4. One of the grad students in my lab brought in his MeganeX 8K about ten days ago for some of us to sample. It was, ... ehh, ... okay. I will be sticking with my Quest Pro for everyday use and my Varjo Aero when I want that last little bit of resolution.
  5. On paper, the Pimax Crystal Light would seem like a clear winner for the price range you indicate. But that is, ... on paper. In real life, Pimax headsets and their accompanying software (Pimax Play) suffer from a number of quality control issues. Pimax as a company in some ways almost resembles a Ponzi scheme, ... they release numerous headsets all promising wonderful performance, but when you actually get your hands on them, most are in various stages of being half-baked. They are also not very good about following up and correcting lingering issues down the line. Having used all the headsets you mention and considering your $1000 budget, my recommendation is the Quest Pro. (New is about $850, while a lightly used model on eBay will run you $500 or so.) You would not think it to look at the resolution specs, but the in-headset view turns out to be extremely good with its pancake lenses (close to the Varjo Aero, Pimax Crystal, and Somnium Ultimate), and it is far easier to run at 72 Hz with the 5070 Ti you plan on than those higher-resolution headsets. The Quest Pro does feature true eye tracking, and you can use this with MBucchia's Quad Views software to further increase your frames per second. The Quest Pro also offers exceptional hand tracking (which you can further improve with custom GitHub software), ... so much so that you do not need the various finger-pointing devices you will sometimes see mentioned here in the forums. The Quest Pro also has no need for base stations (which the Valve Index and Varjo require), and its hand controllers are much better than those that come with the Quest 3. Either the Quest Pro or the Quest 3 will serve you well. Both have a large user base and both are friendlier for new guys like yourself to get up and running. Both also work well with the RTX 5000 series of GPUs (whereas Varjo, Pimax, and Somnium experienced teething problems). In the end, despite how much I dislike the little weasel Zuckerberg, the Quest Pro or Quest 3 will just provide a better newbie experience for a guy such as yourself.
  6. Salute! Well done.
  7. Excellent find, kacper! I used the Intel 13900K in my tests on an Aorus motherboard with DDR5-6400 RAM. It is interesting to see how much better he does with the 9800X3D. AMD really is the sweet spot right now for latency. Overall, I am averaging 22% higher FPS with the 5090, whereas he is at about 27% higher. This is one of the best videos I have ever seen showing an exact side-by-side comparison of CPU-limited vs GPU-limited. I might just have to get myself an AMD chip.
  8. I have an RTX 5090 and the improvement overall with DCS in VR is anywhere from 17% to 24% depending upon DLSS or DLAA and settings. Overall, I find the gains to be minimal. Aside from bumping up MSAA slightly, there is nothing you can do with the 5090 you cannot do with the 4090. You can read more here: I do not recommend the RTX 5090. It consumes 33% more electricity for less than a 33% gain in framerate (a rough perf-per-watt calculation is sketched after this list). It is also slightly louder, slightly hotter, and far more expensive than the 4090. The 4090 release was arguably the best of the last decade or more. The 5090 release is arguably the worst in the past decade. Thanks to Trump tanking the entire economy and the massive recession that is coming, the RTX 5090 will be selling for over $4000 for some time to come. $4000 for marginal gains even with MBucchia's Quad Views is not worth it.
  9. Roger that. Thanks, Archangel, for that link. Someone at Trend Micro is up, ... I got a detailed, signed report saying it is not malware. The more interesting question is: why is this being detected as a trojan? DCS has never given me a problem before.
  10. Tried to update DCS today through Steam to the latest Beta, and I have multiple programs giving me warnings that the trojan Win32/Wacatac.B!ml is embedded in D:\SteamLibrary\steamapps\common\DCSWorld\Mods\aircraft\F14\bin\F14-HeatblurCommon.dll. I assume this is a false positive, but thought I would pass it on to BigNewy and the other powers that be. I did submit the file to Microsoft and Trend Micro for analysis, which I am awaiting.
  11. Also, a few extra tidbits to help inform the larger DCS community: When you purchase a high-end GPU, Nvidia (and to a lesser extent AMD) is charging you a "gamer" premium. As an example, take a Rolex Submariner wristwatch. It typically sells in the $9000 to $10,000 range but is estimated to cost somewhere around $520 to $700 to actually manufacture. When marketing costs and additional fees are added, Rolex is making roughly $8000 on each Submariner. No one but Nvidia knows exactly the research, development, and marketing costs to get the RTX 4090 into stores, but many outlets put the actual cost to manufacture an RTX 4090 at roughly $420 to $450 when it came out in late 2022 (against a street price of $1600 to $2000 for most variants). If the real cost was, say, $600 to bring the RTX 4090 to market, then you have roughly a 3:1 profit ratio. In that same vein, I have been watching some of the TSMC feeds and those in Malaysia and Indonesia who actually work to assemble the RTX 5090 (for companies like Zotac). They are guessing that it costs about $600 to actually make an RTX 5090. If the real cost to Nvidia to get the RTX 5090 onto the streets is $700, then again, you aren't far from a 3:1 ratio. (The arithmetic is sketched after this list.)
  12. Hoggorm, you asked, "For my part I would extend it to say that I have really no idea what OpenXR is... If you have a simple explanation I'd be happy to hear it." OpenXR is an API, ... an application programming interface, ... computer code, ... that various hardware (Meta (Oculus), Varjo, HTC, Pimax, etc ...) or software (Steam, DCS, specific games) vendors can implement to send and receive a set of commands to a VR headset. It acts as a bridge between the application (DCS, Steam, etc ...) and the VR headset, translating the application's requests into commands (rendering visuals, tracking movement, processing input from hand controllers, etc ...) that the headset can use. OpenXR is also an "open" standard developed by the Khronos Group (an industry standards consortium), which means developers and software engineers (like Eagle Dynamics (DCS), Steam, Meta, Varjo, Pimax, MBucchia, etc ...) don't need to pay fees to anyone to use it. The main benefit of OpenXR is that it allows a software engineer to write code only one time and deploy it across multiple platforms such as Meta Quest, SteamVR, Windows Mixed Reality, etc ... This means that a guy like MBucchia, who is trying to help DCS simmers get better framerates from a given graphics card by rendering full detail only in the area where the eye's fovea is pointed (the area outside the fovea is still rendered, but at lower resolution), can write his code only once instead of needing to write completely different code for every separate VR headset model. (A rough sketch of why that foveated approach saves so much GPU work appears after this list.) Hoggorm (that sounds for some reason like the name of a Klingon or a dwarf from Middle Earth), you should be using MBucchia's Quad Views/Fixed Foveated software to increase the framerate for your HTC headset. You can find an overview and the links here: https://github.com/mbucchia/Quad-Views-Foveated/wiki
  13. For some reason, my lizard-like hippocampus recalled the HTC Vive Cosmos (non-Elite) that we once had in our lab, which worked fine without SteamVR, as representing most of the HTC headsets. Sadly, I have not had an HTC headset in our labs in the last few years. Thanks for the memory recall and clarification!
  14. I don't believe that is required any longer. A DCS simmer used to have to enter specific commands to get the multi-threading or OpenXR versions to run. For many months now, that has not been necessary. When I fire up DCS, I have three launch options, ... the middle one runs DCS using its (now) native OpenXR support.
  15. I've dealt with VKB for years now and find them to be a very reputable manufacturer. I am sure they will help you get things sorted. My Gunfighter IV base has zero issues.
  16. I don't have a Vive headset to specifically test but I believe that Vive Hub does support OpenXR. This means you do not need SteamVR. When you fire up DCS, you can select to run it in OpenXR mode (which DCS supports natively) without SteamVR.
  17. No, I reported an initial increase of 11.3% to 17.3% for the 5090 over the 4090. Also, Aussie Dude above indicates nothing about eye tracking or the use of MBucchia's software, which is what I used. That is why my numbers are different and why, as of Feb 24th, they still do not scale up to his results. (My numbers more closely match his with MSFS 2020 and 2024 because I am not adding any additional APIs.) In addition, there has been one Nvidia beta and two hotfix driver releases since Aussie Dude put up his vid. In DCS specifically, with the very latest drivers (572.47) and MBucchia's QuadViews OpenXR API, I am measuring a 24.1% increase for the 5090 over the 4090 with every setting in DCS at "high" or "ultra" while using DLSS, across six headsets (Reverb G2, Quest 3, Quest Pro, Varjo Aero, Pimax Crystal, Somnium Ultimate). With DLAA, I am measuring an increase of 19.5%. Why the increase? ... It's simple, ... Nvidia is ironing out the problems, which happens every time a new GPU series is released. That said:
      1 - The 5090 is doing 33% more work than the 4090 (electrically speaking, wattage is the rate of doing work) for less than a 33% gain in framerate. In comparison, the 4090 beat the 3090 and 3090 Ti by a far larger percentage. This means that after two years, the optimization of the GPU is fairly poor. In electrical engineering speak, ... the 5090 is a s$#! release.
      2 - When the 4090 was released, its retail price was $1500 but its street price was $1599 to $1899 for most of the variants (Nvidia, ASUS, Gigabyte, Zotac, etc ...) in that first year (late 2022 and all of 2023). In comparison, the 5090 carries a retail price of $2000 but its street price is $4500 to $6800, ... pure scalper prices. The very small number of cards that Nvidia can actually ship to reputable retailers (Amazon, Newegg, Best Buy, MicroCenter, etc ...) are gone in seconds. This makes the 5090 all but unavailable. TSMC cannot make many chips (GB202) and Nvidia is going to gouge every wealthy American gamer they can for its flagship. Because of the current White House tariffs and the upcoming recession, these prices are not likely to go down in the next two years. Millions straddling the fence voted for lower grocery prices and are going to see not only higher grocery prices but increased inflation (inflation has risen to 3.1% since Jan 21st). That is what tariffs and trade wars do. In essence, the 5090 is TWICE as expensive as the 4090 for gains in the 20 to 30% range. Again, ... that makes it a s%#! release.
      3 - The electricity costs of this card will really show up in your monthly bill. If you live in rural Louisiana, where you can purchase electricity in the 11.3 to 12 cents per kilowatt-hour (kWh) range, or Bumpkee, North Dakota, the card will show up on your monthly bill if you do any significant gaming with DCS. If you live in the far more populous urban areas of America (about 80% of Americans live in urban areas), the 5090 is really going to show up. Running this card several hours a day is going to cut into your ability to buy food in a state like California or Hawaii or Florida very fast. (A rough monthly-cost calculation is sketched after this list.) There is also the issue that many people are going to need to upgrade to a 1200 to 1600 Watt power supply unit to run the 5090 reliably.
      I continue to stand by my original sentiments, ... the 5090 is a poor release.
  18. This is just an update for Varjo Aero users: Varjo has been in contact with Nvidia, and last night Nvidia released new drivers (572.47) that fix some of the issues. I can now confirm that the Aero is working stably with the RTX 5090 with MBucchia's QuadViews in Fixed Foveated mode (but with eye tracking disabled in Varjo Base). Hope this helps out a few guys. (Still working to get QuadViews with eye tracking working.) (I am also not having any issues getting the Pimax Crystal to work, other than the regular issues that occur every week with trying to get the Pimax Crystal to work.)
  19. Yeah, it has been a while since I thought about the 1080 Ti. I would agree with you.
  20. His results are almost exactly what I can recreate. Four notable points:
      1 - Currently, with the very latest drivers for the RTX 5090 (572.42, released three days ago), the 5090 across the board in DCS specifically is averaging a 23% gain over the 4090 when using DLSS at the very highest settings combined with MBucchia's QuadViews software. (This is a gain over what I measured just two weeks ago.) When using DLAA, the gain is about 18% at the very highest settings (averaging the six headsets we have in the lab). With regards to what I think of as the "high-end" headsets, ... the Somnium Ultimate, the Crystal, and the Varjo Aero, ... the gain of the 5090 over the 4090 in DCS specifically is averaging 18.9% as of driver version 572.42 (with settings absolutely maxed). In IL2, the gains are slightly larger, ... averaging 27% across six headsets. There are bigger gains to be had in MSFS 2020 and 2024 than with DCS (there, ... the 5090 is beating the 4090 by about 33% averaged over the six headsets). I have not had a chance to test with X-Plane yet.
      2 - Just like I did, he got the 5090 and the Pimax Crystal to work together fine. I believe what other posters have written online about their Crystals not working with the 5090 is true and valid, ... but for myself personally, I was able to get it working. (I also am not using an Nvidia-branded card, ... I am using an ASUS TUF 5090.)
      3 - If or when DCS implements DLSS 4 (as Cyberpunk 2077 has), then yes, the price and electricity expenditure of the 5090 may be worth the resulting increase. Currently, ... I guess you will have to decide if a 23% increase is worth the price over the 4090. I suspect it is going to be a moot point anyway for many months unless you want to pay vastly inflated scalper prices for the 5090. The upcoming trade war and recession will make these prices even higher.
      4 - Beginning to build some experience with the 5090 card now, I can say that overall it impresses me as a "poor" Nvidia release. The 2080 Ti was a "good" release, whereas in retrospect, the 3090 Ti is often seen as a "poor" release (hot, loud, with high electricity costs). Following that cycle, the 4090 was an "awesome" release, ... arguably the best in the last decade if not longer. In comparison, the 5090 is a "poor" release.
  21. Thank you Diego, ... somewhere distantly in the back of my memory I knew that the Quest models could play games entirely untethered. I appreciate becoming better educated!
  22. Can you clarify a bit, Aapje, on the, ... "which makes it rather poor for standalone use" aspect? I own two Quest Pros, and my son has used the second in wireless mode with AirLink or VD to play games like Skyrim, Fallout 4, and The Room VR: A Dark Matter to outstanding effect, with excellent resolution. With the Meta catalog, the QP offers wireless headset VR gaming, ... something that the Varjo Aero and Crystal cannot do (the Crystal can actually do it, but it rarely works well, ... another of its numerous problems). I can even play DCS fairly well with no data link cable. It's great. Combine that with the GitHub hand tracking project and it has been my high-end sweet spot over the last year.
  23. If a headset has eye tracking, it can efficiently use MBucchia's QuadViews software package to increase your frame rate. (Basically, the image is rendered at very high resolution where your fovea is pointed, and at lower resolution outside that zone.) So, it's a good thing!
  24. Apologies, I forgot the Focus Vision has eye tracking. It is a headset I have never owned or had a chance to play around with.
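
For the perf-per-watt point in post 8, here is a minimal sketch of the arithmetic, assuming the post's own round numbers (+33% power draw, roughly +20% framerate); the helper function and figures are illustrative, not measurements.

```python
# Rough perf-per-watt comparison (post 8). The +33% power and +20% FPS
# deltas are the post's round numbers, used here only for illustration.

def perf_per_watt_ratio(fps_gain: float, power_gain: float) -> float:
    """Return the new card's perf/watt relative to the old card (1.0 = parity)."""
    return (1.0 + fps_gain) / (1.0 + power_gain)

if __name__ == "__main__":
    ratio = perf_per_watt_ratio(fps_gain=0.20, power_gain=0.33)
    print(f"Relative perf/watt: {ratio:.2f}")  # ~0.90, i.e. about 10% worse efficiency
```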
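The 3:1 margin estimate in post 11 is simple division; here is a small sketch using the post's own (unverified) cost and price estimates as inputs.

```python
# Back-of-the-envelope margin math (post 11). All figures are the post's
# estimates, not audited numbers.

def price_to_cost_ratio(street_price: float, estimated_cost: float) -> float:
    """Street price divided by estimated build cost."""
    return street_price / estimated_cost

if __name__ == "__main__":
    cards = [("RTX 4090 (late 2022)", 1800, 600),
             ("RTX 5090 (2025)", 2000, 700)]
    for name, price, cost in cards:
        print(f"{name}: roughly {price_to_cost_ratio(price, cost):.1f}:1")
```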
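To illustrate the quad-views idea described in post 12 (full detail only around the fovea, reduced detail everywhere else), here is a conceptual pixel-budget sketch; the eye-buffer size, focus-region fraction, and peripheral scale below are made-up illustrative values, not QuadViews defaults.

```python
# Why foveated "quad views" rendering saves GPU work (post 12): shade a small
# high-resolution focus region plus a reduced-resolution full-field view,
# instead of shading the whole eye buffer at full resolution.
# All numbers below are illustrative assumptions.

def shaded_pixels_full(width: int, height: int) -> int:
    """Pixels shaded per eye with ordinary full-resolution rendering."""
    return width * height

def shaded_pixels_quad_views(width: int, height: int,
                             focus_fraction: float,
                             peripheral_scale: float) -> int:
    """Pixels shaded per eye with a foveated (focus + periphery) split."""
    focus = (width * focus_fraction) * (height * focus_fraction)          # sharp inner view
    periphery = (width * peripheral_scale) * (height * peripheral_scale)  # lower-res outer view
    return int(focus + periphery)

if __name__ == "__main__":
    w, h = 2160, 2160  # per-eye render target (illustrative)
    full = shaded_pixels_full(w, h)
    quad = shaded_pixels_quad_views(w, h, focus_fraction=0.4, peripheral_scale=0.5)
    print(f"Full render: {full:,} px")
    print(f"Quad views:  {quad:,} px (~{quad / full:.0%} of the shading work)")
```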
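For point 3 of post 17, the electricity cost is a straightforward kilowatt-hour calculation. A sketch follows; the 575 W draw, 3 hours per day, and the per-kWh rates are assumptions to be replaced with your own card's draw and your local rate.

```python
# Monthly electricity cost of GPU gaming time (post 17, point 3).
# Wattage, hours per day, and rates below are illustrative assumptions.

def monthly_cost(watts: float, hours_per_day: float, rate_per_kwh: float,
                 days: int = 30) -> float:
    """Cost in dollars: energy used (kWh) times the per-kWh rate."""
    kwh = watts / 1000.0 * hours_per_day * days
    return kwh * rate_per_kwh

if __name__ == "__main__":
    for region, rate in [("rural Louisiana (~$0.12/kWh)", 0.12),
                         ("a high-rate urban market (~$0.30/kWh)", 0.30)]:
        cost = monthly_cost(watts=575, hours_per_day=3, rate_per_kwh=rate)
        print(f"{region}: about ${cost:.2f} per month at 3 h/day")
```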