Everything posted by Thinder

  1. You're not going to gain performance at high resolutions by doubling your RAM, for the simple reason that DDR4 memory controllers can only manage 4 ranks. There are no single-rank 16GB sticks; only 8GB sticks can be single-rank, and not all of them are. As a result, under load your CPU will throttle down, even with a B-Die 3600MHz kit; in my case, tested back to back, the loss was roughly 31% with the same RAM, just 16GB sticks instead of 8GB sticks. You're better off investing in a B-Die RAM kit from Corsair or G.Skill; my choice would be the G.SKILL TridentZ RGB Series 32GB (4 x 8GB) 288-Pin PC RAM DDR4 3200 (PC4 25600) Desktop Memory Model F4-3200C14Q-32GTZR. They are designed to be overclocked, so you can easily get 3600MHz out of them, but do not purchase two different 16GB kits: you would risk incompatibility between them even within the same batch, since the sticks need to be tested together, so a single four-stick kit is safer. Intel: Overclocking RAM. More capacity is often not really the answer; performance depends on the CPU's memory controller, not the RAM kit itself. If you have 8GB or 16GB you are of course short on capacity, but 32GB of low-latency RAM works faster with every DDR4 CPU. There is a point where more is less, and it shows under load, we're talking 4K or even MT; see the rank-count sketch below.
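     A minimal sketch of the rank arithmetic behind this, assuming the typical case that 8GB B-Die sticks are single-rank and 16GB DDR4 sticks are dual-rank (check your exact SKU), and taking the 4-rank comfort limit as stated above rather than as an official spec:

        # Illustrative only: rank counts are assumptions (typical 8GB B-Die sticks are
        # single-rank, 16GB DDR4 sticks dual-rank); the 4-rank limit is the post's claim.
        DDR4_CONTROLLER_RANK_LIMIT = 4

        def total_ranks(sticks, ranks_per_stick):
            return sticks * ranks_per_stick

        configs = {
            "4 x 8GB single-rank (32GB)": total_ranks(4, 1),
            "2 x 16GB dual-rank (32GB)": total_ranks(2, 2),
            "4 x 16GB dual-rank (64GB)": total_ranks(4, 2),
        }
        for name, ranks in configs.items():
            verdict = "OK" if ranks <= DDR4_CONTROLLER_RANK_LIMIT else "over the limit, expect lower clocks"
            print(f"{name}: {ranks} ranks -> {verdict}")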
  2. Thanks for the info. As I expected, we are gonna have to wait for Vulkan to finish the fine-tuning of MT in this game; I hope it will help the GPU keep up with the CPU. I never thought I'd see a Ryzen 7 5800X3D bottlenecking a GPU running at 3000MHz. What can we expect from Vulkan in terms of performance compared to the current MT? I remember 3DFX, I had a Voodoo GPU once...
  3. Thanks for your reply, it's good info for everyone. It still isn't PCIe Gen 5, and yes, there are still ways to gain FPS with PCIe Gen 4, something I need to explore. I know there is a bottleneck in the rendering API, I just haven't got the full picture yet to try to tweak it; if you have a fix, you're more than welcome to post it... In my screenshot there is no reason why the GPU only produces 66 FPS, especially because the frame time is relatively low, so I think there is still more to come from this combo. Note that my goal here was to demonstrate the lack of maturity of DDR5 systems. I had a clue looking at the available RAM specs, there is no equivalent of B-Die there yet, and the Ryzens are designed for lower latency, even the 7000 series, so investing in a DDR5 system to play DCS today is perhaps not the best option.
  4. I checked my own GPU's specs and those of the NVIDIA 4080 and 4090, and I can't see any of them supporting PCIe Gen 5 yet, while DDR5 platforms rely on the 64 GB/s of a Gen 5 x16 link to get the performance increase manufacturers claim for their new motherboards, CPUs and RAM. So what gives? Is there any GPU supporting the 64 GB/s of PCIe Gen 5 x16 today, or does it depend on the GPU's own bandwidth limits only? In any case, if there is no GPU support for PCIe Gen 5, once players start using multi-threading in DCS they will quickly hit a GPU bottleneck; that's what my Sapphire Radeon RX 7900 XTX, clocking as high as 3000MHz, ends up doing from a roughly 15% CPU bottleneck. Multi-threading is already a huge improvement in terms of performance in DCS, but the drawback is that GPUs are now the weak link of the systems we put together, and with CPUs faster than my Ryzen 7 5800X3D the difference will be even greater. There's no way I'm gonna push mine further; I'm not planning to watercool my GPU, only the CPU, whose Max Operating Temperature (Tjmax) is only 90°C, still higher than the 7800X3D, 7900X3D and 7950X3D (89°C), so AMD players had better watch their cooling... So I have to live with this new situation, even before the final release of the DCS MT engine, and see where gains can be made; frame times come to mind, but also better case and CPU cooling. Your opinions... (A rough PCIe bandwidth sketch follows below.)
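     For context, here's a minimal sketch of where the per-direction bandwidth headlines come from (per-lane transfer rates from the PCIe generations, with the 128b/130b encoding overhead treated as an approximation):

        # Approximate usable throughput per lane, per direction, in GB/s,
        # after 128b/130b encoding (Gen 3 and newer): GT/s * 128/130 / 8.
        PER_LANE_GBPS = {
            "Gen3": 8 * 128 / 130 / 8,    # ~0.985 GB/s
            "Gen4": 16 * 128 / 130 / 8,   # ~1.969 GB/s
            "Gen5": 32 * 128 / 130 / 8,   # ~3.938 GB/s
        }
        LANES = 16
        for gen, per_lane in PER_LANE_GBPS.items():
            print(f"{gen} x{LANES}: ~{per_lane * LANES:.0f} GB/s per direction")
        # Gen4 x16 ~32 GB/s, Gen5 x16 ~63 GB/s (the "64 GB/s" marketing figure).
        # Current 4080/4090/7900 XTX cards are Gen4 devices, so they top out around 32 GB/s.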
  5. STFU. I was responding to THIS below; IF YOU'RE NOT HAPPY, COOK YOURSELF AN EGG or apply for a moderator position. So what was the relevance of my post? ASUS motherboards cooking the Ryzen 7000 CPUs? And of course there is no immaturity in new technologies (!?!). You guys wanted better than a DDR4 system, but my detailed performance figures show one thing very clearly: with MT, you'll need more than a 4080 to avoid a bottleneck, because the performance of your CPU is way beyond what a 4080 or RX 7900 XTX can produce stock. So at the end of the day, if it's going to be limited by the GPU, there is no advantage in going for a DDR5 system right now.
  6. I used the stock BIOS and updated it after I realized that the stock settings were limiting the CPU to 3600MHz; once the BIOS was updated it runs at its optimal frequency without issue. The CPU-cooking BIOS issue doesn't concern my TUF GAMING board; once again, one must be careful not to generalize when one type of device has issues. There have been a number of such cases from AMD, Intel and NVIDIA alike, and it didn't make all of their devices plagued with the same problem. I frankly don't regret my choice to stick to DDR4. I was reluctant to make the jump to DDR5 because the technology is not yet mature, and after some initial problems caused mainly by anything other than the CPU or GPU, once the system is stable it flies in VR/MT like it really was meant to. Things to look at which are, or can be, a source of problems: Microsoft automatic updates replacing AMD drivers with their own, and MPO settings limiting the performance of the GPU (see the MPO fix sketched below). The AMD Ryzen 7 5800X3D's Max Operating Temperature (Tjmax) of 90°C is not very high; if your case cooling is average it might cause the CPU to throttle down due to thermal limits. Progress due to MT: from a roughly 15% CPU bottleneck, even with the GPU running at 3000MHz (with the previous driver), the CPU now works all cores in a way that gets the GPU to work a lot harder to process all the data. The bottleneck is now the GPU, and I bet it would be much the same with a 4080 just looking at its boost stats; my Sapphire ran 19.50% faster, so you'll need a watercooled, overclocked GPU to achieve parity with the humble 5800X3D. Where it can be improved: frame times, use of memory, cooling (case, CPU), B-Die RAM (if you haven't got one of those kits yet).
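     Here is a hedged sketch of the commonly circulated MPO-disable workaround, as I understand it; the registry key and value below are what the public guidance describes, so back up your registry and verify against your GPU vendor's instructions before applying it (run as administrator):

        # Sketch of the widely shared MPO-disable workaround (Windows only, needs admin).
        # Back up the registry first; remove the value again to restore default behaviour.
        import winreg

        key_path = r"SOFTWARE\Microsoft\Windows\Dwm"
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                                winreg.KEY_SET_VALUE) as key:
            # OverlayTestMode = 5 is the documented value that turns Multi-Plane Overlay off.
            winreg.SetValueEx(key, "OverlayTestMode", 0, winreg.REG_DWORD, 5)
        print("MPO disabled; reboot for the change to take effect.")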
  7. Don't want Windows to install tons of garbage while you log in? Here you go; I can vouch for a 500Mb/s speed through it (tested with Ookla). Turn it off before logging off, turn it on after logging in and cleaning your system: Vegamax Power Button Kill Switch Shut Off for Ethernet Internet Network Cable Cord.
  8. In any case, the better the cooling, the better the performance. Recommended cooler: liquid cooler recommended for optimal performance. Max. Operating Temperature (Tjmax): 90°C. Perhaps watercooling the CPU would be overkill, but my case cooling was set up for a 1080Ti and 5600X and it needs more airflow to keep a good headroom, so the two low-pressure Noctua NF-A12x25 F12 industrialPPC fans and one NF-A12x15 are going to go and be replaced by high-airflow NF-A12x25s, that's 8.5% and 2 x 20.82% more airflow respectively (the percentage math is sketched below). I use three 120 x 15 fans because 120 x 20mm wouldn't fit behind the GPU or at the bottom of the case, but other than that I'm not limited to thin fans and can upgrade three of them. Now that my motherboard works perfectly I don't need a thin fan aligned with PCI_E2, so the GPU can stay where it is and get more airflow from that fan. Here is my cooling configuration: there are no fans at the top; the vertical 120mm on the right, the second and third from the top, and the case-bottom fan are thin 120 x 15s; the two other fans are low pressure and feed/exhaust the CPU cooler airflow; the bottom fan is aligned with the opening in the case, closer to the vertical fan.
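     A minimal sketch of how those airflow gains are worked out from datasheet airflow in m³/h; the NF-A12x25 and NF-A12x15 figures are Noctua's published numbers quoted elsewhere in this thread, while the other "old fan" value is just an example consistent with the ~20.8% gain mentioned above, so plug in your own fans' specs:

        # Percentage airflow gain from swapping a case fan, given datasheet airflow in m3/h.
        NEW_FAN_AIRFLOW = 102.1  # Noctua NF-A12x25

        def gain_percent(old_airflow_m3h, new_airflow_m3h=NEW_FAN_AIRFLOW):
            return (new_airflow_m3h / old_airflow_m3h - 1.0) * 100.0

        old_fans = {
            "NF-A12x15 (thin)": 94.2,
            "low-pressure fan (example value)": 84.5,
        }
        for name, airflow in old_fans.items():
            print(f"{name}: +{gain_percent(airflow):.1f}% airflow with an NF-A12x25")
        # 94.2 -> ~+8.4%; 84.5 -> ~+20.8%, in line with the figures quoted in the post.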
  9. I tried Right Ctrl + Pause/Break, it doesn't work; I'll try again next time I do another test flight.
  10. I just restarted. I had to ask Support about setting up the full-screen mirror so I can record with the overlay. I'm trying your procedure now. I can't get it to work, or it doesn't show on screen; if it produces a screenshot, is there a folder where I can see the result?
  11. I'll do it when I restart, right now I'm on a lunch break. I'll PM it to you, but as far as I know my frame times are low to moderate, nowhere near the value you posted in your screenshot.
  12. Thanks mate. My system isn't getting anywhere near this temperature, but I am trying to gain anyway; I'll swap one or more of my case fans for Noctuas with 102.1 m³/h instead of the current 94.2 m³/h, every little bit helps... All in all I am trying to keep the whole system cooler. The solution I used up to now with the 5600X and 1080Ti (a mix of high-airflow and low-pressure fans) worked, but it is now a limitation, and even if my GPU is cooling rather well, I'd prefer to have more headroom, especially since I've really overclocked my GPU now...
  13. I had a G2 but it packed up, so I used the money from the refund to buy my first G.Skill CL14 kit. Not sure what you mean by frame time deltas; if you mean spikes in frame time, then no, in my case it is fairly stable and stays low, as you can see from my screenshots taken in the worst conditions possible (there's a small sketch below of what I'd measure). The issues I have with this setup are not directly related to the headset but rather to what it needs to run on a PC, namely Steam VR and Virtual Desktop. Yes, I think setting up an under-volt in the BIOS might be a good idea, thanks for the suggestion. About the Pico 4, before you jump to conclusions, here are a couple of things you might want to take into consideration: the G2 sweet spot is slightly sharper but way smaller. In terms of visuals, the Pico relies on settings and image quality changes accordingly, but you have a much larger sweet spot and don't need to turn your head to see things clearly around you, whereas with the G2 that is the only way; it helps in the cockpit. I think Steam VR and Virtual Desktop need to be kept as discreet as possible; I experienced a slight fogging in the middle of the screen during very low/fast phases, and it doesn't show in replays, so I turned Virtual Desktop off entirely and am experimenting with Steam VR to make sure there is nothing between the headset and the GPU... Its battery life is limited, but if you take breaks to spare your eyes and recharge it when you do, you'll be fine; it's just a matter of getting into the habit. I also play Elite Dangerous with it and the graphics are stunning, but as usual it depends on settings; if I use my O.C. profile I can notice the improvement, mostly in anti-aliasing and image clarity.
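     Since "frame time deltas" came up, here's a minimal sketch of how I'd read them from a per-frame millisecond log; the sample values are made up for illustration, and a capture tool such as CapFrameX or OCAT exports similar data:

        # Frame-to-frame deltas: how much consecutive frame times differ.
        # Large deltas read as stutter even when the average FPS looks fine.
        frame_times_ms = [11.2, 11.5, 11.3, 18.9, 11.4, 11.6]  # made-up sample values

        deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
        avg_ft = sum(frame_times_ms) / len(frame_times_ms)
        print(f"average frame time: {avg_ft:.1f} ms (~{1000 / avg_ft:.0f} FPS)")
        print(f"worst frame-to-frame delta: {max(deltas):.1f} ms")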
  14. I am. It still demands a lot of firepower; my GPU is O.C.ed and I play in MT/VR, so the GPU is running at 100% nearly all the time while the 8 cores are more often idle, waiting for it to process the frames, especially because I use quite high DCS settings and test low/fast in a rich environment (between hills in a forest)... This is where I can really see what my system can cope with, and lately I have been tweaking both the CPU with PBO2 Tuner, lowering the voltage offset by -30, and the GPU with an increase in both VRAM frequency and GPU frequency plus an under-volt. The system runs smooth and cool. I reduced the pixel density from 1.6 to 1.2 (the pixel math is sketched below) and changed a few things; it looks as if Steam VR is having somewhat unwanted effects on the image quality too, so I'm still experimenting with settings to find the best compromise between FPS and image quality. The power limit setting that you cannot see bottom right is at 15. Mirage 2000C, Caucasus map, Free Flight: the test starts, I break right and dive to treetop from there, following the valley curves; in the valley is where the system has to work the hardest. I haven't managed to gain much at this point, but the play is smooth with no or very little flickering at the sides. I'm trying to figure out how I can improve my frame time without further GPU tweaking; I think I should consider myself lucky the GPU is taking it without a glitch.
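     For a sense of why dropping pixel density from 1.6 to 1.2 helps so much, here's a minimal sketch of the relative pixel load; the base per-eye resolution is roughly the Pico 4's panel resolution, but treat it as a placeholder and substitute your headset's actual render resolution:

        # Pixel density (supersampling) scales both axes, so pixel count goes with its square.
        BASE_PER_EYE = (2160, 2160)  # placeholder per-eye resolution, adjust for your headset

        def pixels(density, base=BASE_PER_EYE):
            w, h = base
            return round(w * density) * round(h * density)

        hi, lo = pixels(1.6), pixels(1.2)
        print(f"PD 1.6: {hi / 1e6:.1f} MP per eye, PD 1.2: {lo / 1e6:.1f} MP per eye")
        print(f"relative GPU pixel load at 1.2 vs 1.6: {lo / hi:.0%}")  # ~56%, i.e. ~44% fewer pixels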
  15. Yep, that's what makes the difference between lows and highs. I added frame time to the OSD, we will see what we get. But seriously, who would have thought that a GPU running at 2764MHz would become the bottleneck of a 5800X3D? So better frame times are the answer to better performance, although my settings are still insanely high compared to the majority of GPUs used in DCS, and it also has the disadvantage of having to feed a Pico 4, probably one of the most demanding headsets for a GPU... Researching frame time reduction. Meanwhile, I'm playing around with the GPU VRAM frequency, which was only boosted conservatively, and already see some improvements at 4K with 2x MSAA. (The FPS-to-frame-time conversion is sketched below.)
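     Since frame time is just the reciprocal of frame rate, here's a minimal conversion sketch showing what the OSD numbers need to look like for the usual VR refresh targets:

        # Frame time (ms) and FPS are two views of the same thing: ft_ms = 1000 / fps.
        def frame_time_ms(fps):
            return 1000.0 / fps

        for target_fps in (45, 72, 90):  # common VR targets (half-rate reprojection, Pico 4 / G2 refresh)
            print(f"{target_fps} FPS needs every frame finished in {frame_time_ms(target_fps):.1f} ms")
        # 72 FPS -> ~13.9 ms, 90 FPS -> ~11.1 ms: a single 20 ms frame is enough to drop a frame.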
  16. That's not what I am saying; I know the CPU controller manages channel bandwidth. What I wanted to demonstrate is whether the core clocks are or are not dependent on the GPU clock and, if they are, to what degree. That was the idea, and the CPU can only feed as much data to the GPU as it can itself process, which is why we see some difference. All in all, it shows that there is room for improvement when it comes to core frequencies and usage; the lowest reading of 27 FPS, 4225MHz and 9% usage is where progress can be made. My cores have been recorded on screen at 4450MHz and above, so there is an unnecessary loss of performance there that could be recovered with further MT tweaking.
  17. Gradually decreasing Maximum GPU Frequency from my personal game settings: -15% gives a 2,614MHz setting >>> -20% gives 2,460MHz >>> Cockpit and HUD views at 2,460MHz >>> Something that doesn't show in those screenshots is the slight amount of fogging I get in the middle of the screen in the valley at very low altitude, making it a bit harder to navigate safely at that altitude and speed; the game still plays really smooth with very little flickering. As I was expecting, lowering the GPU frequency doesn't make much of a change to the CPU frequency (about 6% compared with the Maximum GPU Frequency I normally use, while I lowered the Maximum GPU Frequency by 15% and 20%), but FPS and visual quality (fogging) are lower. Conclusion: I'm not too sure what I am looking at; the CPU cores don't seem to be much affected by GPU performance, going from 67% to 73% under the same conditions and settings. On the other hand, there is room for improvement for the developers to let the cores work a bit harder... (The frequency steps are worked out in the short sketch below.)
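     A minimal sketch of how those stepped-down Maximum GPU Frequency values are derived; the ~3,075MHz baseline is simply back-calculated from the two settings quoted above, not an official spec for the card:

        # Back-calculating the baseline and the stepped-down max GPU frequency settings.
        # Baseline inferred from the post: 2,614 / 0.85 = 2,460 / 0.80 = ~3,075 MHz.
        BASELINE_MHZ = 3075  # assumed personal max-frequency setting, not a card spec

        for cut in (0.15, 0.20):
            print(f"-{cut:.0%}: {BASELINE_MHZ * (1 - cut):.0f} MHz")
        # -15% -> ~2614 MHz, -20% -> ~2460 MHz, matching the settings listed above.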
  18. Mentioning FPS is not enough; what matters most is where you get your FPS. From mid to high altitude there isn't the same load on your PC, due to a much lower level of redraw than at very low level in a valley surrounded by trees, which is where I do my flight testing... My DCS settings are maxed out, but even at lower FPS the game plays smooth; the GPU frequency has been lowered by 10% for the purpose of this test... From mid to high altitude you can nearly double your FPS, so there isn't much point testing above 500ft; 100ft and even lower is better for testing a rig. >>> Flight test: Saab J29 Tunnan.
  19. Well, that's your assessment; mine is that DDR5 systems rely on an immature technology which doesn't have an equivalent of DDR4 B-Die, and since AMD designed their X3D series for lower latency, I'll wait until someone comes up with better RAM. Right now I am finishing setting up my system and I believe there is still more performance to get from it; I don't feel the need to swap to DDR5 even if I had the budget, and I'm not sure that, bang for buck, I would be able to get the same level of performance from a DDR5 system... Whatever suits your needs; I don't advocate one system or another for other players, I share my experience for those interested. Considering the good experience I had with my EVGA GTX 1080Ti, I'm not shy about getting an NVIDIA GPU if there is something that good in the range I'll be looking at when I decide to upgrade this one. The problem for me was to get a system running DCS in VR at a good pace with good visuals, and I have it now, with a GPU that was £450 cheaper than the cheapest 4090 and gives me what I wanted, so all in all this justifies my choice to stick to DDR4 and go all AMD...
  20. From my PoV it depends on your own needs, there is no standard at all; your case might be better suited to cooling, or your motherboard and a lot of other things can affect the results. So what I am doing now is just fine-tuning my rig to squeeze the very last bit of performance I can out of it, and so far I get excellent results at high settings in DCS MT in VR. I'm just not too happy seeing my CPU temperature creeping into the red zone, even if it's just for a blink. I'm testing my rig with different Maximum GPU Frequencies to see how my CPU cores react to those changes; I post my results in this topic.
  21. From now on I will reduce the Maximum GPU Frequency step by step, in increments, to see what the CPU cores are doing.
  22. There is something else that causes this to occur, a setting (I can't remember which) that changes the resolution where you look, or in the center of the screen; the solution is to turn it off. As for performance, here you go: what you need is to optimize your system, including RAM-to-CPU tuning, because DCS is so demanding; with the Pico's high requirements on top, you end up with mediocre VR performance, which is why I don't rely on my rig's RAM, CPU or GPU alone, I keep fine-tuning them and get them to work for me. In my last flight test (Caucasus map) I flew at treetop with very smooth visuals but had this occasional fogging in the center of the screen; when I remember what causes it I'll sort it out. The headset itself is not at fault, it just demands more resources than a G2, but overall, once it is sorted, it is better.