
Panzerlang

Members · Posts: 479 · Joined · Last visited

Everything posted by Panzerlang

  1. It all worked fine after I put the missing line back into the LUA file.
  2. I did a performance comparison, fwiw. My personal takeaway is that ED need to look at why DCS is whaling on a single core, if that is actually what's going on. It certainly appears so, and it would explain the stutter.
  3. CPU. Even with a 13900k and 4090, I'm CPU bound.
  4. And yeah, from the end of the video, theoretically at least, it looks like Reflex would work wonders in VR. It brings frame-time well under the threshold of visual acuity.
  5. I would really like to see this kind of analysis done exclusively for a flight sim, focusing on the CPU/GPU render. They make the entirely valid point about system latency vs human latency: ten or twenty milliseconds of system lag is inconsequential compared to the average human's physical response time of about 200 milliseconds. That's relevant to an FPS but not to a flight sim, where the latency we're seeing is visual. Human visual acuity is in the range of 20 milliseconds, and for us to see stutter and jerking, the render queue in DCS must be getting seriously choked. The 13900K has 24 physical cores (8 P-cores plus 16 E-cores), yet we see "CPU Bound, Main Thread" in red. My layman's guess is DCS isn't spreading its load across enough cores OR, if it is, some other component can't keep up (the memory controller, maybe). Next year, Intel saves itself by offering flight-sim-only CPUs. Lol.
  6. Brand new install of the OS and DCS etc., and I think DCS is running as well as it can; all the tweaks have cumulatively been very successful.
  7. Yeah, I have to admit the latest throttle was underwhelming. Seems all they did was re-jig the previous one into something with more metal and pretty colours.
  8. It's not "fealty", it's a considered estimation that there's a point at which a company *should* realise that if it doesn't markedly change it will go under. Plus other calculations... a) First units are more likely to be of a better quality, if the company is playing that game (and it's been suggested Pimax did just that, as OG Crystals had none of the major issues the CLs did). b) Not waiting and thus avoiding short supply that appears later. c) Unit can be returned within two weeks for a full refund. An additional calculation now is will there be a 'run on the bank' that'll push Pimax under? The 'insanity' of announcing the Dream isn't so insane if Pimax is desperately trying to raise money on pre-orders. In that case it makes absolute sense but it would indicate just how close Pimax is to the edge. Unfortunately there's no ATM in view at which to check the queue. Lol.
  9. Biggest takeaway for me was them saying the image quality/clarity isn't noticeably better than the OG or Light. WTF?! I thought the various iterations of the Super were supposed to be 'retina level'. Was the test model being run with big downsampling? They don't say. Lots of people in the comments saying they have cancelled or will cancel their pre-order. I'm considering the same, with my mind also on getting that done before Pimax folds. It's becoming clear that Pimax is running on pre-order money at this point and, if true, it ain't good. Not to mention...this is just three weeks before final-production units start shipping to customers, and they're showcasing a flaky pre-production unit at CES?!
  10. Ah, the scum crawl out from under their rocks again...it's GPU scalping time!
  11. No arguments there, they're the only ones making stuff that wouldn't feel or look out of place in a real machine.
  12. HAGS on gives me a tiny bit better performance (it's hard to tell the difference, but it's definitely not worse). And no, it's not getting on my nerves; I count myself fortunate that I have to look for the stutter to really see it.
  13. My 13900k is significantly weaker than my 4090, according to DCS. Yeah, see my thread 'Hunting for the stutter-free VR experience'.
  14. I'm wondering if there'll be significantly extra goodies for HAGS, though my estimation of its value is seen through a fog of dumbwittery. It seems to me the CPU has become a major bottleneck in DCS and more money spent on GPUs is pretty much wall-decoration spaffage.
  15. Yeah. I'm guessing/assuming all those extra fake frames still have to go through the CPU in some way. Absolutely no point at all in getting a 5090 for VR DCS as far as I can see. For the Super? No, same condition applies. I'm getting it for the 135-degree FOV (and will then return it if/when that turns out to be yet another lie). I'll downscale it to suit my GPU and CPU.
  16. Do we have any explanation why DCS isn't DX12? Is potentially better performance being left on the table? Ok: "Digital Combat Simulator (DCS) currently utilizes DirectX 11 as its graphics API. Instead of transitioning to DirectX 12, Eagle Dynamics, the developer of DCS, has chosen to implement the Vulkan API. Reasons for choosing Vulkan over DirectX 12: cross-platform compatibility (Vulkan is cross-platform, enabling potential support beyond Windows, whereas DirectX 12 is Windows-only); performance (Vulkan offers low-level access to the GPU, similar to DirectX 12, allowing improved performance and better utilization of modern hardware); and development flexibility (Vulkan gives developers more control over rendering processes, facilitating optimizations tailored to DCS's specific needs). In their "2023 & Beyond" newsletter, Eagle Dynamics mentioned that the introduction of their render graph would improve DCS's efficiency and deliver optimal performance with modern graphics APIs such as Vulkan." Puh. CPU on its knees, sucking up the micro-stutter. Lol. However, something tells me DX12 would be no better. But I guess we won't be getting multi-frame generation then. So, a 5090 with maybe an extra 20% grunt over the 4090, for what, $3000 after all the rip-offs go into effect? I think I might pass.
  17. So glad I cancelled my order for this.
  18. Frame Generation is available for 40-series GPUs? Ok, DLSS 4.0. So DCS will have to be updated I guess. NVIDIA DLSS 4 Introduces Multi Frame Generation & Enhancements For All DLSS Technologies | GeForce News | NVIDIA
  19. I'll be ok down sampling the Super, it's the greater FOV I'm after...assuming that's not going to turn out to be another Pimax lie.
  20. The race is now officially on...which reaches the line first, the Pimax Super or 5090?
  21. That's been noted elsewhere: guys who thought their eyes were fine discovered they weren't, and upon wearing specs the issue cleared up.
  22. I had similar grief with the eye-tracking calibration; in my case it turned out to be the USB connections being plugged directly into the mobo. Once I redid everything using the powered Pimax hub, it all worked correctly. I've read conflicting reports, however...some people have the opposite experience. I have zero idea why that would be, other than mixing up blue and red slots on different brands and versions of mobos with different USB specs, even incorrectly coloured USB slots, or simple user error. So, try the Pimax hub, investigate your mobo's slots and what priority is given to each, etc.
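The frame-time reasoning in post 5 can be sketched with some quick arithmetic. This is a minimal illustration, not anything from DCS itself: the ~20 ms perceptibility figure is the post's own number, the refresh rates are common VR values, and the function names and the sample frametime trace are hypothetical.

```python
# Frame-time budgets at common VR refresh rates, and a naive spike check.
# Assumption (from the post): frame-time excursions in the ~20 ms range
# are what we perceive as stutter.

PERCEPTIBLE_MS = 20.0  # rough visual-acuity threshold cited in the post


def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz


def dropped_frames(frametimes_ms: list[float], refresh_hz: float) -> list[int]:
    """Indices of frames that exceeded the budget, i.e. missed their sync slot."""
    budget = frame_budget_ms(refresh_hz)
    return [i for i, t in enumerate(frametimes_ms) if t > budget]


if __name__ == "__main__":
    for hz in (72, 90, 120):
        print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
    # Hypothetical trace with one main-thread spike well past the 90 Hz budget:
    trace = [11.0, 10.8, 10.9, 25.4, 11.1]
    print("spikes at frames:", dropped_frames(trace, 90))
```

At 90 Hz the budget is about 11.1 ms, so a single main-thread hitch of 25 ms eats more than two frame slots — comfortably past the ~20 ms threshold, which is consistent with the post's point that a "CPU Bound, Main Thread" stall is visible even when average frame rate looks fine.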