

Everything posted by HoBGoBLiNzx3
-
not planned Unreal Engine 5 viable for DCS?
HoBGoBLiNzx3 replied to Bog9y's topic in DCS Core Wish List
That was the whole point of the thread. Was it possible? Yes. Is it feasible now? No. The argument made was that it is not possible for UE5 to run advanced, physics-based, large-scale simulations such as DCS. That seems to be false according to Epic themselves. Hence my post and its links. People can go argue with them about the information they put out on their engine's capabilities. -
not planned Unreal Engine 5 viable for DCS?
HoBGoBLiNzx3 replied to Bog9y's topic in DCS Core Wish List
They are improving things that are still 10+ year old features... The amount of time they can spend developing content greatly increases if they have to worry less about the engine itself. Plus, we aren't talking about DCS' current build. We are talking about 5 years from now. Where does the company go? What are they going to do with the next iteration of DCS? The amount of technology coming out at a rapid pace is hard to keep up with now. Pretty soon MIXED reality will be a part of gaming as well. That will be huge for a lot of people with cockpits that have been built, etc., etc. How about the plugin where you can load up any part of the world from Google Maps as highly detailed, real-time 3D spaces? That is already available with Cesium. This is the CURRENT level of development. What do you think it will look like in 10 years? Do you think ED will be able to develop an engine that keeps up with everything while also making high-quality content? That remains to be seen. I'm not even saying that ED can't create something similar that uses plugins like these, but with the number of developers they currently have and the timelines they are already having issues with, I think that is something ED will probably look into. However, the claim that UE5 can't handle flight sims is ridiculous. Is it cost effective, or worth changing to? Probably not right now... but the future is full of what-ifs, isn't it? -
not planned Unreal Engine 5 viable for DCS?
HoBGoBLiNzx3 replied to Bog9y's topic in DCS Core Wish List
So you think this engine will just be "updated" going into what, 2040? Did anyone say it would be an easy task? It's hard enough for them to implement things that are 10 years out of date into their engine and have to rewrite code anyway. At least you'd have a modern, improving platform that will keep being updated into the future outside of ED. No one claimed the current version should be redone, so don't get your panties in a bunch. As for DCS being heavily influenced by physics over visual performance, that seems pretty preposterous. Consider the level of physics UE5 can achieve, running real-time simulation with millions of physics-based objects, down to the smallest triangles, all without any major loss of visual fidelity, versus a single (or should I say a handful of) objects that rely on that scripting... Every single game in today's world has physics built into its engine; coding it how you want it is exactly the point of having an out-of-the-box engine to work with. Hence your space sims. I see absolutely no reason to believe that DCS physics are more complex than any modern game with destruction, tessellation, and other objects working in real time. Yes, I understand that the aircraft themselves have a lot of complex physics such as drag, air density, wind, etc., but the assumption that UE5 can't handle that workload is one I just don't see. -
not planned Unreal Engine 5 viable for DCS?
HoBGoBLiNzx3 replied to Bog9y's topic in DCS Core Wish List
It's so funny reading these responses. UE5 is miles better than most other engines, period. And yes, it CAN do large-scale rendering. If ED wanted to move into future development with DCS 2.0 or whatever they want to call it, without having an entire team try to rewrite a new engine to keep up with technology they probably don't even have time to learn, then they should probably have a small team start working on the next iteration... It's really not hard to find all of this information.. lmao.. https://www.unrealengine.com/en-US/blog/unreal-engine-5-offers-significant-new-potential-for-the-simulation-industry User made clip... -
Yeah, I figured as much. I didn't know how closely ED and HB work together when it comes to stuff like that, since they do roll the content updates for all modules out together, etc.
-
I know this would be one of the last things on a priority list, but can you add the BLANK function as a programmable key for HOTAS? Currently, RECCE works for turning it ON/OFF, and a slider or knob works for ON/OFF and dimming. An actual bindable key to decouple the HMD would be awesome and extremely helpful, unless there are plans to implement the datalink/HMD display through the HUD (which I know is an option IRL, though I'm not sure what year it was implemented). Obviously the latter would be preferred, but a bindable key would be a welcome workaround if you have no plans to update the HUD.
-
Appreciate it. I didn't mean a day's work literally, I meant it's a quick solution that would keep a lot of people happy. Thank you regardless, Bignewy.
-
Make sure you let the INS align properly too; having the INS set to IFA and GPS is important, as NAV will deteriorate over time. And if you're not hot starting or auto starting, you must align your HMD. Also, it's hard for the HMD to keep moving targets perfectly tracked, so it won't always be right on a jet, but a ground target should be fine as long as you align both the HMD and INS. There are other options that I personally use, like blanking it out so it overlays the HUD and using the RECCE button to turn it on or off. I also have a slider set for the brightness knob. Set it to 80-100 km under the MIDS setup. I do wish there was a blank/unblank button to lock it in and out of the HUD, but that's for a separate topic.
-
I'm not frustrated; it's constructive criticism on a topic that seems to reappear constantly. It just seems like an extremely excessive amount of time to be flying some of your main airframes without a pilot's body when one is currently available. That is like a day's work, and if there is something there, they won't get pressured every week to put one in. lol. Also, I didn't post it in the F-18 subsection because it's more for the F-16 and F-14.
-
Why is the F-18 pilot body not available in the F-16, F-14, etc., if anything as a placeholder? This should be one of the easiest implementations in the game and is big for both 2D and VR users in terms of authenticity when flying. It doesn't even have to be permanent, but it's better than not having one at all, since one is modeled already.
-
OpenXR Toolkit Tuning Guide (updated 21/02/23)
HoBGoBLiNzx3 replied to edmuss's topic in Virtual Reality
I stated that in my response: FFR is kind of a hack imo, but it works to save some fps. You ideally don't want to have to use it. DLSS is a much better option than NIS but needs to be implemented on the development side as well, and you need an RTX card. But let's be honest, who isn't buying RTX cards now... If you're going AMD, they could also implement that option as well. If you guys haven't given DSR a shot on top, give it a go. I wouldn't use the legacy resolutions, but they will work; the other 2 are much better.
-
OpenXR Toolkit Tuning Guide (updated 21/02/23)
HoBGoBLiNzx3 replied to edmuss's topic in Virtual Reality
These are some things that helped with OC/OXRTK. (I run mine at 75% NIS, Override Resolution ON at default, FFR 1/2 @ 85 & 1/16, smoothing at -20.) Adding DSR at the driver level also helps you gain some resolution back at a very minimal cost; you can find it under Global Settings in your NVIDIA Control Panel (NCP). I also add antialiasing in NCP at 2x (Enhance application setting) so I can enable MFAA. Note: you must have at least MSAA 2x on in game to activate it. DSR: "With the displays in a VR headset resting close to the user's eyes, higher resolution can improve the VR experience. Dynamic Super Resolution (DSR) – which we're introducing with Maxwell – helps us take the resolution from 1 megapixel per eye to 4 megapixels per eye." DLSS would be a very welcome feature if we ever actually see an update for MCS/Vulkan.
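To put rough numbers on what a 75% NIS render scale saves before DSR/MFAA cleans things up, here is a minimal Python sketch; the per-eye render target is an assumption purely for illustration (the real number depends on your Oculus resolution slider):

```python
# Illustrative only: assumed Quest 2-class per-eye render target at PD 1.0.
BASE_W, BASE_H = 2352, 2592   # hypothetical per-eye target, not an official value

NIS_SCALE = 0.75              # OpenXR Toolkit NIS render scale (applied per axis)

native_px = BASE_W * BASE_H
nis_px = int(BASE_W * NIS_SCALE) * int(BASE_H * NIS_SCALE)

print(f"native target : {native_px:,} px/eye")
print(f"NIS 75% render: {nis_px:,} px/eye ({nis_px / native_px:.0%} of native)")
# 0.75 per axis -> roughly 56% of the pixels actually rendered, then upscaled back.
```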
-
lol that is so wrong it's scary.
-
Do you work for ED? If not, don't speak for them, and stop giving lectures to people and telling them how to feel. I understand your point of view, but you're doing a lot of assuming and a lot of pandering. Sounds like you're dating half the devs, for fk sake.
-
Yes, I'm aware of the BLNK option; that was what I was saying in regards to having to blank it out in order to carry that information across to the HUD. From my understanding, the same set of MIDS options under the HMD subpage is available in the HUD with the HMD turned on, meaning it will display datalink contacts, friendlies, and whatever options you want to share via link, the same way you do with the JHMCS, and it uncages the LHCQ/LACQ/9X in the HUD as well, so you don't have everything displayed over each other. I would assume that the BLNK function would disable the link between the two, as it does, but still not display over the HUD (I'm not sure of this, it would just make sense). IRL it seems that the RECCE button not only can be set to record, but also disables/enables the HMD. Regardless, the usability of the HMD when blanked renders the latter useless except for a cleaner, less invasive experience when looking around the controls/HUD, but it loses its functionality. I can't see the US Navy having it any other way... but like I said, I'm just a civilian going off the answers given to me by an instructor. Hopefully there are a few Hornet pilots that could clear that up without committing treason hahah
-
Ahh, so it's all for network configuration between the Link systems? I think I read it's also where you reset aircraft system faults/reboots. VOCA and VOCB work through SRS as radios 3 and 4, but I don't know beyond that. Any plans of implementing it further? They did say they were still working to make their in-game VOIP work like SRS as a vanilla feature. The thing that I was hoping to find, and I know it probably won't be there, is that a Hornet instructor for the Navy told me there is a way to configure the datalink to show symbology in the HUD instead of having to blank out the JHMCS; it also allows the LAHQ and boresight to move freely within the HUD instead of caging to the nose when passing through the display. He didn't tell me if it was possible to use in DCS, but he said it is definitely possible in the Hornet. Maybe Wags has some info on its development or whether they will be adding the feature at all. Currently, blanking out the JHMCS is a much more streamlined process than working through the actual HUD.
-
Any information on the MIDS subpage? This seems to be where all the datalink information and its options live. The manual says it will be added later, but it looks like it works. There are quite a few different options that I was curious about. This is the MIDS page next to BIT testing, not the one under the HMD subpage.
-
When using Briefing Room to create missions the rain mod doesn't work, but when I create a mission via the editor it does. Any way of getting around this? Also, is it possible to get the drops working in the mirrors? I know it's a 2D image and that likely falls under the monitor version, but it would be epic to have it added.
-
If it ain't broke, don't fix it I guess haha. I've noticed massive performance drops with ASW and GPU scaling; DCS's VR section even brings this up as a caution. But I'd be willing to bet that if you kept a constant high bitrate (dynamic bitrate does some funky stuff to make it energy efficient), turned off GPU scaling so you're not changing resolutions to hold frame rate, and kept ASW off so it doesn't throttle you down and start synthesizing frames, you'd see a big jump in performance. Matching the encode resolution to your pixel density also matters (I think the default is actually set way too high for average cards; OTT addresses this). It can all be done in OculusDebugTool. You're essentially opening the valve on your hose, with no restrictions or special features that mess with DCS, because it's not an officially supported VR game quite yet. https://www.digitalcombatsimulator.com/en/support/faq/VR/ As I said though, if it's working smoothly and looks good, no need to mess around with it, but I find that with updates things generally keep getting better, and using those features seems to give you a pretty big amount of tailored performance. ASW can feel weird in faster-paced games, with ghosting and jumping resolutions/reprojection always being a bigger problem than frame rate.
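As a rough sanity check on what "matching the encode rate to pixel density" means in practice, here is a small Python sketch; the bits-per-pixel heuristic, the square-frame assumption, and the example numbers are my own illustration, not documented Oculus guidance:

```python
# Rough, hypothetical heuristic: how many bits of the Link video stream each
# encoded pixel gets per frame. Purely illustrative, not an official formula.
def bits_per_pixel(bitrate_mbps: float, encode_width: int, refresh_hz: int) -> float:
    encode_height = encode_width            # assume a square encode frame for simplicity
    pixels_per_frame = encode_width * encode_height
    return (bitrate_mbps * 1_000_000) / (pixels_per_frame * refresh_hz)

# Example with values quoted later in this thread: 500 Mbps, 2352 encode, 72 Hz.
print(f"{bits_per_pixel(500, 2352, 72):.2f} bits per pixel per frame")
```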
-
I have before. I like having at least OTT because it saves my Link/ASW/GPU scaling settings, and I can set Windows to run in Ultimate Performance mode. The debug tools really need at least some options to save profiles.
-
I might be able to get rid of OpenComposite in general now, because OpenXR Toolkit works directly with Oculus. Prior to its release you couldn't use any of the OpenXR tools, because they were only available through OpenXR Tools and would only work on Windows Mixed Reality headsets like the Reverb G2. Seeing as I can't get the toolkit to load anyway, I might just delete it and run natively. To be completely honest, I'm not entirely sure how all the APIs work within DCS, as it's not officially supported by the application; I kind of just followed all the guides for it.
-
It isn't hard to do if you cut corners on the FOV like you stated. If you can narrow the overall number of pixels and supersample them in a narrower space, the performance hit shouldn't be nearly as much of an issue. You're completely right about the full FOV though; I don't think I could supersample, especially with MSAA, without narrowing it. However, clarity is a much better trade than FOV for me, given that the distortion at the far edges renders them pretty useless anyway, and I haven't noticed much, if any, loss in viewable real estate. The way I look at it, you gain performance by getting rid of the blurry edges and supersampling only what you will actually be viewing, and with that lowered number of pixels rendered, supersampling becomes much less taxing, simply because of the math (see the sketch below). As I said about OpenXR: DCS uses OpenVR, so that is the API that is called from DCS. To get the Oculus to use its native OpenXR API, you must call for it through SteamVR (OpenComposite). When you check the API layers in OpenXR Tools, you can see it links the Oculus OpenXR layer to it. It's apparently the only way to call it until Vulkan is implemented; then it will be called directly from the application itself.
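Here is a minimal sketch of that pixel-budget math in Python; the per-eye render target and the 0.75/1.2 factors are assumptions for illustration only:

```python
# Hypothetical per-eye render target; real numbers depend on headset and settings.
BASE_W, BASE_H = 2352, 2592

def rendered_pixels(fov_mult: float, supersample: float) -> int:
    """Pixels rendered per eye for a given FOV multiplier and per-axis supersample factor."""
    return int(BASE_W * fov_mult * supersample) * int(BASE_H * fov_mult * supersample)

full = rendered_pixels(1.0, 1.0)
narrow_ss = rendered_pixels(0.75, 1.2)   # cut FOV to 0.75, then supersample 1.2x per axis
print(f"full FOV, no SS   : {full:,} px/eye")
print(f"0.75 FOV + 1.2x SS: {narrow_ss:,} px/eye ({narrow_ss / full:.0%} of full FOV)")
# Narrowing first leaves headroom: ~81% of the full-FOV pixel count even after supersampling.
```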
-
I've always been a stickler for FOV, but with the Q2's narrow FOV I found that cutting it back really didn't reduce any usable peripheral view. This post was for people with Q2s and mid to lower end cards that are having problems even achieving a decent frame rate without MSAA or anything more than 4x AF. I saw a pretty big quality difference from supersampling on objects that are otherwise boxy, because there isn't the performance headroom for anti-aliasing at higher-quality textures. That can be said for literally any screen that is supersampled; that is what supersampling is: a method of antialiasing that attempts to reduce jagged, pixelated edges (aliasing) in images. It works by sampling a higher resolution version of the image to get the average color of a pixel before reducing it to the intended size. The issue with having a lower end card is that you are fighting either a lower resolution with higher-quality textures, which doesn't really help all that much with the perceived quality of the texture, or a higher resolution with lower-quality textures. Either way, aliasing will be less noticeable at higher resolutions. If you can SS along with MFAA, the downscaling will result in a much cleaner edge. No one is maxing out the graphics settings with this hardware; I don't think anyone ever even claimed it was possible. It's a solution to the problem of terrible jaggies when you can't afford any sort of MSAA to smooth them out without tanking performance. So if you want cleaner, lower-quality textures at higher resolutions, this is essentially the only way to do it that I've found, at least. Also, OpenXR is part of the native Oculus software. The problem is getting it to run, as DCS calls for OpenVR. From my understanding OpenComposite calls the OpenXR API from SteamVR, as the game is technically launched through SteamVR; any SteamVR API launches OpenXR with OpenComposite.
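As a toy illustration of that definition (render high, average down), here is a tiny Python/NumPy sketch of a 2x box-filter downscale; the random image is just a stand-in for a supersampled render:

```python
import numpy as np

def downsample_2x(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of an (H, W, C) image into one output pixel."""
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

hi_res = np.random.rand(8, 8, 3)    # stand-in for a 2x supersampled render
lo_res = downsample_2x(hi_res)      # displayed image; edges get averaged/smoothed
print(hi_res.shape, "->", lo_res.shape)   # (8, 8, 3) -> (4, 4, 3)
```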
-
I was just able to get a Quest 2 on a 2060 and i7-9700K with 32 GB to run at a stable 60 fps in MP (30 players on the SoCal Caucasus server) with very, very good settings. Let me know if this works for any of you who are struggling with it as I had for about a year until today. No mods except sound mods and OpenXR/OpenComposite. I have OpenXR Tools installed as well as the toolkit, but I didn't touch any of its settings as I can't get the menu to work anyway.

Oculus:
Full resolution at 72 Hz

OTT:
Encode resolution 2352 = 1.1 PD <- important
Bitrate = 500 <- important
Dynamic bitrate OFF / dynamic bitrate max 500 (in case I turn it back on for whatever reason) <- important
GPU scaling and ASW OFF <- important
FOV 0.70/0.75
Realtime mipmap on

NVCP:
DSR - select all options + 30% smoothing (global) <- important imo
Antialiasing - enhance application setting (rest on app)
Low latency - ultra
MFAA - on
Power - max performance
Texture LOD - clamp
Texture filtering - high quality
Pre-rendered frames - 2

DCS:
Basic VR settings with textures low/med
Shadows normal; secondary shadows on; global shadows default; global illumination on
Clouds high
Cockpit MFD 1024 every frame
AF 16x, MSAA 2x (with MFAA from NVCP enhancing it)
No civ traffic; sliders at 10k and the rest around 70%
Rest of the options off (besides rain drops etc.)
VR tab - PD 1.0 (leave everything default)

I changed the IPD to 56 because it scales properly. Mirrors in the cockpit are butter smooth, not much ghosting at all, no noticeable hiccups except for a few spots, and loading was fairly quick. Clarity is almost perfect: almost no shimmering and jaggies, and even the cables looked nice and clean. Shadows didn't lag out the cockpit, and textures worked well with all the shading as well. Also, for Windows, make sure you run everything as administrator and DCS.exe with fullscreen optimizations off. I'm not exactly sure which of these is the most important, but I can tell you with 100% certainty that the OTT settings are. Hope this helps anyone looking for some help.
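If it helps to keep track of all that, here is the same recipe condensed into a plain Python dict that can be tweaked and printed as a checklist; the keys are just my own shorthand, not any tool's API:

```python
# Shorthand checklist of the settings listed above (illustrative labels only).
settings = {
    "Oculus": {"resolution": "full", "refresh_hz": 72},
    "OTT": {"encode_res": 2352, "pixel_density": 1.1, "bitrate_mbps": 500,
            "dynamic_bitrate": False, "gpu_scaling": False, "asw": False,
            "fov": (0.70, 0.75)},
    "NVCP": {"dsr": "all factors + 30% smoothing", "mfaa": True,
             "low_latency": "ultra", "prerendered_frames": 2},
    "DCS": {"msaa": "2x", "af": "16x", "vr_pixel_density": 1.0},
}

for group, opts in settings.items():
    print(f"{group}: {opts}")
```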
-
So I was reading through the OTT user guide and there were some things I definitely missed. I have a 2060, i7-9700K, 32 GB, Q2. The encode resolution needs to match your PD. I always ran it at the max, 3648, with a 300 bitrate. After actually reading it, it clearly states not to overdo the encode resolution or it will decrease performance, which it does. 2352 = 1.1 PD (I moved my bitrate to 500 and have had zero issues with it). I had my headset resolution maxed out but pulled it back to 4864 x 2488 at 80 Hz with ASW off and GPU scaling off. GPU scaling does not play nice with DCS, especially with ASW. I'm thinking I may drop it to 72 Hz and move it back to full resolution, then increase PD to 1.2 and a 2912 encode resolution. I'll have to check the values to see what SS rate corresponds with what encode resolution. It also states that supersampling without increasing the encode resolution will decrease performance, so it goes both ways, as per OTT. I also enabled DSR upscaling in NCP and it looks pretty nice, without any noticeable performance loss. If anyone here is savvy on VR development, it would be nice to get some details on how this works, as it is one of the most confusing parts about VR.

From the Oculus Developer Community:
In ODT, find the "Oculus Link" section with "Encode Resolution Width" and "Distortion Curvature". Also note the "Pixels Per Display Pixel Override" (short "Pixel Density") value up top. For most fields in ODT, 0 or "Default" means "do not override the value". Changes to the "Oculus Link" values require an Oculus Server restart to take effect, which can be done directly inside the ODT menu, or from the "Beta" tab in the Oculus desktop application. Depending on the VR app, changes to the "Pixel Density" might require the VR app to be restarted, but won't require the Oculus Server to be restarted. "Oculus Link" fields will persist after the Oculus Server is restarted; "Pixel Density" will not.

Initial tuning recommendations - start with these and adjust as needed:
NVIDIA RTX 2070+ or comparable GPUs - Curvature "Low", Encode Resolution "2912", Pixel Density "1.2"
NVIDIA GTX 1070+ or comparable GPUs - Curvature "High", Encode Resolution "2352", Pixel Density "1.1"
NVIDIA GTX 970+ or comparable GPUs - Curvature "Default", Encode Resolution "2016", Pixel Density "1.0"

When tuning, keep in mind:
"Pixel Density" is per-dimension, e.g. a setting of 1.2x means 1.44x in 2D, which means 44% more rendered pixels; 2.0x means 300% more rendered pixels.
Higher "Pixel Density" can cause dropped VR app frames and will vary based on the performance characteristics of the VR app.
Higher "Encode Resolution" can lead to dropped compositor frames as well as visible tearing. This is mostly tied to the encoding capabilities of the PC GPU and Quest.
Higher resolutions in general can also lead to higher latency.
Unnecessarily high resolutions (especially "Encode Resolution") can lead to aliasing artifacts (i.e. pixel crawling) on high frequency details.
To revert changes, set them back to 0 or "Default" where applicable.
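The per-dimension point is easy to verify with a couple of lines of Python; this is just the arithmetic from the quoted note, nothing headset-specific:

```python
# Pixel Density is applied per axis, so rendered pixels scale with its square.
def extra_pixels_pct(pixel_density: float) -> float:
    """Percent more rendered pixels relative to PD 1.0."""
    return (pixel_density ** 2 - 1.0) * 100

for pd in (1.0, 1.1, 1.2, 2.0):
    print(f"PD {pd:.1f} -> {extra_pixels_pct(pd):+.0f}% pixels vs PD 1.0")
# PD 1.2 -> +44%, PD 2.0 -> +300%, matching the Oculus note above.
```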