
Thinder

Members · 1413 posts
Everything posted by Thinder

  1. Getting there. Little by little, settings and performance are improving. I just did a flight test on the Caucasus map as usual, tree-top flying, high settings, very smooth, and I managed to get my CPU cores to boost as well, which they couldn't do before the BIOS update, being stuck at 3600MHz. Here, same DCS settings, with the maximum GPU frequency reduced by 10%. Approach to the valley: I dive, breaking right in full PC right after this moment, and from then on I fly at tree-top above 500kt most of the time; the environment on the side tells me if settings are too high, and so far so good. In the valley, as low and fast as possible, is where redraw is very hard on the system. Zooming up to check the environment visuals, we can see that the GPU frequency went down from 2764MHz to the low 2600s, with GPU usage still at 100%. When the FPS was at its lowest, visuals were still very smooth; in fact I didn't notice the lower FPS, only the visual quality degraded slightly, with some blur in the HUD area making it more difficult to navigate at this altitude and speed... In the most demanding part of the test flight, not all the CPU cores were where I expected them to be: in the top screenshot, before lowering the GPU frequency, all cores clocked at 4450MHz, but not in the middle of the valley, where two of them went down to 3560MHz. Perhaps one test is not enough to draw a conclusion, but apart from losing FPS, I see no advantage in reducing the GPU frequency, and it appears that visual quality also decreases with lower GPU performance.
  2. Occasionally, Steam VR plays the very same trick on me: it won't recognize the controllers, or even the headset, when they have been working just fine for months. Since Steam support keeps replying "we don't support Pico", I told them what I thought of them, and it wasn't pretty. It's not the hardware, it's their goddamned package which is unstable, and they wash their hands of it. It became obvious when both the headset and controllers were given the OK in the Steam settings but, for some reason, Steam VR still screws you; by that I mean I can't play two games out of three. I'm not taking it: as soon as there is a solution to get rid of Steam altogether, I'll take it. Anyway, the issue has been identified (no thanks to Steam support), and it comes from AMD driver strings throwing Steam VR into orbit. When I say instability, here you have a clear example. Solution: a fresh AMD driver install.
  3. I have been working (hard) at the optimization of my system in order to squeeze the last bit of performance out of it. I had my share of problems, bugs, stuff that didn't work as intended, made mistakes, etc., but the last hurdle is one I didn't fully anticipate. My current cooling solution is reasonably good: an Arctic 7X for the CPU and 5 x 120mm good-quality case fans. I thought the axial airflow going from the front through the CPU cooler then out of the case, basically aligned with each other, would help keep the CPU cool, and it does. But just not enough, and to figure this one out I had to start playing around with the PBO2 Tuning Tool and CPUID HWMonitor while running tests with Cinebench. After updating my BIOS to get rid of the locked 3600MHz frequency, the CPU is now working as it should, with the standard base clock of 3.4GHz boosting toward its designed max boost clock of "up to" 4.5GHz. Note AMD's disclaimer on this topic. First, don't forget that one cannot OC the AMD Ryzen 7 5800X3D, just undervolt it, so I don't really expect a huge gain in performance from this upgrade; the intention is just to lower the temperature. Is it worth it? Well, if you split your time between professional, social and family life and want to get your partner and kids away somewhere at the weekend, you don't need a much better solution than an Arctic 7X; it does a good job of keeping your CPU alive, so keep your dosh for relaxing times. On the other hand, if you have become the local anorak and spend most of your time gaming, then reducing the temperature of your system further makes sense. In my last test (1 min ago), I got this result: 4449MHz captured on screen, but temperature readings that all went into the red at 90.4°C and 91.0°C; in fact they spend way too much time blinking red for my liking.
And this is without the GPU kicking in, which adds to the overall case temperature by a fair margin. In most scenarios the GPU core will stay around 81°C, but I have seen it go higher than that occasionally, especially recently when the room temperature was high or the case filters were dirty (check them regularly and clean them). So in my case it makes sense to try to keep the CPU cooler. There isn't much I can do for the GPU for lack of space (2 x water coolers?), but here is my solution: the Arctic Liquid Freezer II 120. It can be mounted at the back of the case with two 120mm fans, which would go some way toward reducing both CPU and case temperature. Your thoughts?
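To put those blinking-red readings in context, here is a minimal sketch comparing them against the 90°C maximum operating temperature AMD lists for the Ryzen 7 5800X3D. The readings are the ones quoted above; the headroom function is just an illustration, not part of any monitoring tool.

```python
# Quick check of the logged CPU temperatures against the 5800X3D's limit.
# AMD specifies 90 C as the maximum operating temperature for this chip.

TJ_MAX_C = 90.0  # max operating temperature, Ryzen 7 5800X3D

readings_c = [90.4, 91.0]  # the values that "went into red" in HWMonitor

def thermal_headroom(temp_c, limit_c=TJ_MAX_C):
    """Degrees of margin left before the limit; negative means over it."""
    return round(limit_c - temp_c, 1)

for t in readings_c:
    print(t, "C ->", thermal_headroom(t), "C of headroom")
```

Both readings come out with negative headroom, i.e. the air cooler is already at its limit under Cinebench load, which is exactly what motivates looking at the Liquid Freezer II 120.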
  4. I don't know about DCS in 2D anymore, since I only play it in VR since I got a headset, and now with multithreading. What I have noticed since the last MT update is real progress in terms of performance. I have a new motherboard and didn't update the BIOS at first, so my CPU frequency was limited to 3600MHz, and its usage was well below that of the GPU, which was at 99%, but I was already scoring the highest with 3DMark Pro at 4K, and DCS was really smooth. One could think that the GPU is the bottleneck, but no: the previous BIOS simply pinned the CPU frequency at 3600MHz instead of the 3.4GHz base clock, and it stayed there and didn't boost regardless of which game or app was running, be it 3DMark Pro, Furmark at 4K or Cinebench. Once I updated the BIOS, the base clock went down to 3.4GHz and it boosts; I tested it with Cinebench and HWMonitor, and even undervolted with PBO2 Tuner by -30 without any problem. So from where I'm standing, considering the performance I can squeeze out of this rig when set properly, everything is interdependent: BIOS settings; RAM latency, number of sticks, number of ranks per stick; overall RAM-to-CPU bound; Microsoft updates (they replace AMD drivers without your knowledge); SSD speed and SSD socket (bandwidth); OS settings including background apps, etc. I'm sure they all make a difference because I tested them. It's only when one has taken care of every single one of those issues that the CPU or GPU can run at their true potential: run a RAM kit with too many ranks (64GB as 4 x 16GB 3600MHz) and you lose 31% at 4K; run a RAM kit with a latency higher than optimum and it's a bottleneck, some 30% of bandwidth at 4K; if your SSD sits on the wrong socket, that's 50% or more of R/W speed gone, up to 5% of speed at 4K; and with mediocre case and/or CPU cooling, that's your thermal limit down the drain and yet a few more percent of final performance gone.
Since at the end of the day it's the CPU's memory controller which has to manage all the channels, taking these parameters into account, before saying my CPU is slow I look at what limits its capacity to run at optimum. It takes time, effort and more than a few apps for testing/boosting its performance (the 7 5800X3D doesn't OC but can be undervolted). That's what I've been after for months and I'm not finished yet; the joys of computing...
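One point worth making explicit: the losses listed above don't simply add up, because each bottleneck scales what is left after the previous one. A minimal sketch, using the rough percentages from the post purely as illustrative numbers:

```python
# Why the bottlenecks above compound: each one scales what remains after
# the previous, so remaining performance is a product, not a sum.
# The loss fractions are the rough figures quoted in the post.

losses = {
    "RAM with too many ranks":   0.31,  # ~31% lost at 4K
    "RAM latency above optimum": 0.30,  # ~30% bandwidth lost at 4K
    "SSD on the wrong socket":   0.05,  # up to ~5% at 4K
}

def remaining_performance(loss_fractions):
    """Fraction of peak performance left after stacking all losses."""
    remaining = 1.0
    for loss in loss_fractions:
        remaining *= (1.0 - loss)
    return remaining

r = remaining_performance(losses.values())
print(f"{r:.3f} of peak left, i.e. ~{(1 - r):.0%} lost overall")
```

With those example figures, less than half of peak performance survives, which is why fixing only one of the issues never shows the full gain.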
  5. Just done that, and I ran both Cinebench and 3DMark Pro Firestrike at 4K 2x MSAA. I pushed it to -25, then reverted to -20 because I couldn't see any significant progress, and my GPU temperature seems to be the limiting factor anyway. I also lowered my GPU maximum voltage by a further 5mV; I don't know if it has much of an impact, but I try to lower the overall system/case temperature any way I can. I have very good air cooling, with 5 case fans and 1 GPU fan; one of the case fans blows upward from the bottom of the case, and 3 of them are Noctua NF-A12x15 PWM, which can mount there and in front of the GPU where the standard 25mm fans wouldn't fit. What I have noticed after setting the voltage to -20 is a slight improvement in performance in 3DMark Pro Firestrike at 4K 2x MSAA, but not so much a lower temperature. Graphics score: 19 596 vs 19 550. Graphics test 1: 97.23 FPS vs 96.97 FPS. Graphics test 2: 75.82 FPS vs 75.66 FPS. Physics score: 28 975 vs 27 993. Physics test: 91.99 FPS vs 88.87 FPS. Combined score: 11 094 vs 10 999. Combined test: 51.60 FPS vs 51.16 FPS. 3.12 FPS in the physics test is not much, but I'll take it; it might show when playing DCS-MT in VR.
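The before/after scores above are easier to read as percentage changes. A small sketch, using only the numbers quoted in the post (first value after the -20 undervolt, second before):

```python
# Percentage change between the two Firestrike runs quoted above
# (after undervolt vs before).

scores = {
    "Graphics score": (19596, 19550),
    "Physics score":  (28975, 27993),
    "Combined score": (11094, 10999),
}

def pct_gain(after, before):
    """Relative improvement of 'after' over 'before', in percent."""
    return 100.0 * (after - before) / before

for name, (after, before) in scores.items():
    print(f"{name}: {pct_gain(after, before):+.2f}%")
```

The physics (CPU-bound) score moves the most, around +3.5%, while graphics barely changes; that is consistent with the undervolt mainly giving the CPU more thermal room rather than helping the GPU-bound tests.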
  6. First test flight since the last PC upgrade (SSD), on the Mirage 2000C, Caucasus map, at tree-top/560kt. Very smooth: the combo didn't blink, no flickering from the side, just a little fogging in the central HUD area, but I think it is a graphics setting that can be turned down or off; I have to research again what causes this to occur. I still can't get MSI Afterburner to show in replays, but my CPU was firing on all 8 cores, well above 3600MHz at times, so a lot of progress achieved there. I'll try to get the graphs to show in replays. Overall, this runs a lot smoother than my first videos: stunning visuals, good frame rate... The screenshots are from the 2D version, where I watch the replays; I have just reset the "Enable Virtual Reality Headset" option.
  7. Very interesting. Please post your complete system stats, including RAM latency and frequency; this way we can really see what may be going on. Note that I experienced similar gains with the use of CL14 RAM, and that I found that OCing my GPU to a maximum frequency of 3075 actually translated quite well into DCS, despite a BIOS setting limiting the CPU to an average 3600MHz before the BIOS update. The stock BIOS apparently limited the CPU boost frequency to 3600MHz and prevented the 8 cores from boosting. In 3DMark I also found that my current 32GB kit (4 sticks) of 3600 CL14 RAM was 31% faster at 4K than the 4 x 16GB it replaced; that's the CPU throttling down under load because it cannot manage 8 ranks. Tests were conducted in 3DMark Pro Firestrike at 4K 2x MSAA. My GPU ran at 2662MHz and stayed cool, max junction temperature was 81°C, and the frequency is roughly 6.5% above stock boost speed.
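The "roughly 6.5% above stock" figure can be checked against AMD's spec sheet: the reference boost clock for the RX 7900 XTX is 2500MHz. A one-line sanity check with the frequency quoted above:

```python
# Sanity check of the "~6.5% above stock" claim: 2662 MHz observed during
# the Firestrike run vs the RX 7900 XTX reference boost clock of 2500 MHz
# (AMD's spec figure).

STOCK_BOOST_MHZ = 2500  # reference boost clock for the RX 7900 XTX
observed_mhz = 2662     # frequency measured during the test

def pct_over_stock(observed, stock=STOCK_BOOST_MHZ):
    """How far above the stock boost clock the observed frequency sits."""
    return 100.0 * (observed - stock) / stock

print(f"{pct_over_stock(observed_mhz):.1f}% above stock boost")
# prints: 6.5% above stock boost
```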
  8. Yeah, I had a G2; it packed up, and even if, in the sweet spot, it was marginally sharper than the Pico, overall the Pico is a better headset. I don't need to point my head at what I want to see in order to see it, and in the cockpit that makes a lot of difference. On top of which, I can watch movies, download them, play games or go online without even having to link it to my PC. Bang for the money, it beats the G2 hands down. Once set up, the only drawback is a somewhat limited battery autonomy, but I don't really care, since every time I take a break I recharge it. When it came out, its library was limited, and its apps too, but now it has got a lot better; from my PoV it will replace the G2 without problem, especially because the G2 is out of production and will run out of support before I feel the need for a new headset. The thing is, with a higher-spec PC you can play DCS with settings you don't get anywhere close to on a low- or mid-range one. In terms of visual quality, it's obviously an advantage, and I'm not playing only DCS but also Elite Dangerous; graphics are stunning in both games, only DCS is a lot more demanding at higher settings and harder to set up properly. I say that, but you wouldn't believe what I had to go through just to get a 5600X/1080Ti system running; I know what it is to be skinned. That's why I haven't been seriously looking at the 4090s: with the difference in money I was able to purchase a good CL14 kit, and with this CPU it makes a lot of difference, plus my system is future-proof for a few years from now. So consider this: you can buy second-hand, like I did with the 1080Ti, at half the price of a new one, with a two-year warranty (CEX), and I sold it back to them too, because even after another two years of service with me it still passed their tests. You can pretty much put a high-end PC together for two-thirds or even half the price of new if you can find the warranted gear. The 4090s are a hell of a GPU series, but you don't need one.
  9. Well, I'm sure you'd run it very well with an RX 7900 XT; those GPUs have a lot more potential than what is said of them. But it looks like you're looking at playing DCS in VR, a game which is notoriously demanding on specs, but don't have the budget, or the will to spend it, and it's not all about the GPU. I wouldn't swap mine for a 4090, and it was about £450 cheaper than the cheapest of them when I purchased it. My EVGA 1080Ti was a very good GPU, one of the best I have used so far if not the best, but it was even better with a kit of CL14 RAM and a 5600X CPU; it's all a question of balance.
  10. I just rebooted from the BIOS and my CPU speed shows 3600MHz, which is also what the CPU-Z test results show. If there is a BIOS setting to get it to run at its normal boost speed, I haven't found it yet; I'm in touch with ASUS support, and that's the next thing I'll be looking at. I am not sure that it is a DCS/AMD thing; more likely a BIOS setting, or my GPU is not working properly. What I can do next is install Ryzen Master and see if I can unlock it. >>> Update: after a BIOS update, the CPU runs at the base clock recommended by AMD (3.4GHz), and during heavy load it should boost up to a higher frequency... Right... Sounds like the automatic ejection seat of the Yak-38; I'll see how it goes.
  11. You're not the only one asking questions about this CPU score. You're right, the maximum boost clock should be 4.5GHz; that is something I should be looking at, and it is not my RAM. I am still not familiar with my new motherboard's BIOS, and I ran the RAM at 3200MHz for a while thinking I had set it up to its designed CL14 3600MHz frequency... So it is worth investigating, but as I was saying, there is still a lot more to come from this rig, including the GPU. You can't OC the 7 5800X3D...
  12. I will, I have bookmarked the topic, but be patient: I still have a lot of work to do before I can post viable comparatives, with a stable rig, apps and games.
  13. The Pico 4 demands a lot of firepower from the GPU, but it will also always work better with a proper RAM-to-CPU pairing, and it looks like your specs are more to blame than the headset itself. Of course it puts more load on a PC than a G2, but once you get your rig set up, visuals are great; I wouldn't swap it for a G2. Having said that, my G2 was working just fine with a 5600X and a 1080Ti; the Pico wouldn't, you need a stronger GPU and the best RAM possible.
  14. Your suggestion is a very good solution for testing, and the alternative would be to set my GPU maximum frequencies lower instead of lowering the load; I always start by testing with 3DMark Pro at 4K 2x MSAA anyway. At the moment they both clock slightly higher than the standard maximum boost; I could lower those to the standard ones and see what the gain with the CPU might be, since I already have a reference, but I prefer to retain maximum load on my system for testing, which implies high DCS settings and a harsh environment when testing it in MT-VR. I'm not sure that at 2662MHz the GPU qualifies as the bottleneck; it's more a case of the CPU being limited, and I was wondering whether, by tweaking the MT algorithms, DCS developers wouldn't be able to "free" some more performance from the CPU independently of the GPU load; I mean, what limits it? For the time being, I'm prepping my PC for a further upgrade to gain some bandwidth with an ultra-fast M.2_2 SSD running on the 4.0 x4 slot. I had to move Windows and all the apps to the SATA SSD by cloning with Hasleo Backup Suite Free edition, which works perfectly. The new SSD should run at 7,450 MB/s read and 6,900 MB/s write, so it will contribute to better gaming performance, and the SSD I had on the fast slot is free for whatever I want to use it for on the slower M.2 slot. I hadn't thought about testing the performance of the CPU by lowering the GPU load, but I will; it will be interesting and should give some useful feedback to DCS developers.
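The slot choice matters more than it might look: a PCIe 3.0 x4 link can't carry the new drive's rated sequential read speed, while a 4.0 x4 link can. A minimal sketch, using the PCIe spec link rates (8 and 16 GT/s per lane with 128b/130b encoding) and the 7,450 MB/s rating quoted above:

```python
# Why the M.2 slot matters for the new SSD: theoretical usable bandwidth
# of a PCIe link vs the drive's rated sequential read speed.
# GT/s and 128b/130b encoding are the PCIe 3.0/4.0 spec values.

def link_bandwidth_mbps(gt_per_s, lanes):
    """Approximate usable MB/s for a PCIe 3.0/4.0 link (128b/130b)."""
    return gt_per_s * 1000 / 8 * (128 / 130) * lanes

gen3_x4 = link_bandwidth_mbps(8, 4)   # the slower M.2 slot, PCIe 3.0 x4
gen4_x4 = link_bandwidth_mbps(16, 4)  # the M.2_2 slot, PCIe 4.0 x4
rated_read = 7450                     # rated sequential read, MB/s

print(f"PCIe 3.0 x4: ~{gen3_x4:.0f} MB/s -> would cap the drive")
print(f"PCIe 4.0 x4: ~{gen4_x4:.0f} MB/s -> {gen4_x4 - rated_read:.0f} MB/s of margin")
```

On the Gen3 slot the drive would be limited to roughly 3,900 MB/s, about half its rating, which is exactly the "50% or more of R/W speed gone" effect mentioned in an earlier post.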
  15. It's a solution; however, I am after quality of visuals and frame rate, so the CPU and GPU loads are not what I am looking at, but rather the results of my settings. I always conduct my tests on this map, or at Nellis, at very low level, because it takes a lot of redraw and it is where the GPU limit shows first, especially with high settings between hills covered with trees. Those are tests to determine settings in DCS or, as it turned out more than once, RAM kits and even cooling. I generally look sideways to see how the environment zooms past the aircraft, and if I see too much flickering, then the settings are too high. In this particular test I had a low of just above 40FPS, but little flickering, and image quality was very good, as you can guess from the image. On the Nellis map it's pretty much the same, except that the distance at which I can see the power lines is greater than on the Caucasus map.
  16. Once I got it to work, the improvements in VR are striking. Curiously, my CPU is still way behind my GPU in terms of load percentage; I think it could be improved, and performance will go up as well.
  17. I think there is a misconception about what the RX 7900 XT/RX 7900 XTX really are. It first started with all the AMD bashing we've seen on YouTube, done by guys who make their money selling bad news to get monetized; as it turned out, it was all about the mishandled GPUs which were the subject of this topic. Then we had people complaining about AMD drivers, including myself when I didn't know better. I was provided with a link to a pool of 4 different AMD drivers for those GPUs and tested them; there was absolutely no difference between them, but once I figured out the MPO issue, I realized that there was more than drivers involved in this. If you look at my signature, you'll see that I inform players of a message I received from AMD through their Adrenalin interface: they were warning us that Microsoft was replacing their native drivers with its own. Instead of ignoring it, I decided to investigate, and figured it out thanks to some NVIDIA users, because the fix was not designed for AMD GPUs first; nobody on the AMD forum actually paid attention to it. Now it seems that Microsoft has stepped up its little war with AMD and PC users, because their triggers are embedded all over Windows 11 (in my case), and no matter what, even with Windows Automatic Update and the Windows Update Medic Service disabled, after a given number of cycles (boots, I guess), one of them is going to open the door for some data you don't want in your system. This is mainly why the RX 7900 XT/RX 7900 XTX have been under-performing, just like the NVIDIA cards which suffered the effects of the MPO setting did before them.
So if you're still sceptical, here is the latest: despite updates supposedly being disabled, if I boot with the internet connection ON, my PC will still collect tons of cr@p from Microsoft, and it does affect the AMD driver. Only this time Microsoft overdid it: their replacement driver doesn't work (it did previously, and you'd not notice apart from a loss of performance) and I end up with a black screen (a reboot with the internet connection off solves the issue). "AMD Software Failed to Launch Because Windows Update Has Replaced the AMD Graphics Driver" >>> Now, looking at the actual performance of my GPU after this little glitch was sorted: comparing my 3DMark scores to those of 4080 users, I realize that those above me in the ranking are ALL watercooled with a huge level of OC; other than that, I walk all over them, especially at 4K. Just a reminder: my RX 7900 XTX is air-cooled and can run at 2823MHz cool, no problem. Since there are still more than a few little issues to sort out, I guess this GPU can only get better, but here is your reality check: the RX 7900 XT/RX 7900 XTX are not the slouches we read about all the time... And oh, I nearly forgot: until I figure out where Microsoft hid their triggers, I'll use the internet cable as a kill switch, boot with the connection off until the AMD driver kicks in, and clean the directory. STUFF Bill Gates.
  18. Preparing the installation of my new SAMSUNG 990 PRO PCIe 4.0 NVMe M.2 SSD (as usual, the DPD courier pretended to have "missed me" when I was at home all day, so I set the option for them to leave the package in a safe place; delivery expected tomorrow, the 25th). Moved Windows 11 Pro to the SATA SSD, fast enough for the OS (cloned with Hasleo Backup Suite Free edition, works perfectly). Moved the SAMSUNG 970 EVO Plus SSD to socket M.2_1, which supports PCIe 3.0 x4 and is slower than socket M.2_2, which supports 4.0 x4 mode; wiped it and repartitioned it (NTFS) in SAFE MODE. The SAMSUNG 990 PRO SSD will sit on the M.2_2 and be able to run at full speed: sequential read up to 7,450 MB/s, sequential write up to 6,900 MB/s. This is the SSD where I will install my games, currently DCS, Elite Dangerous and World of Warships. All my apps are installed on the SATA SSD (C:, Windows 11 Pro), so I have the choice to use both M.2 SSDs for games and/or apps, such as 3DMark Pro on the M.2_2 for test accuracy, since it will run at the same speed as DCS. As soon as everything is set, I will conduct some tests with 3DMark Pro (Firestrike at 4K 2x MSAA) before installing DCS-MT, then do further testing.
  19. I don't know, I will try this. Thanks for the suggestion.
  20. Events: I start my system normally but get a black screen, which is weird because the system was working optimally when I logged out. Restart in SAFE MODE, use DDU. Reboot: same result, black screen, unable to reinstall the driver, but this time I have a clue (I was unable to start Paint to paste the screenshot, so I had to use my cellphone camera). Restart in SAFE MODE, clean the registry using Glary Utilities Pro, DDU again. Reboot with the internet disconnected. Reinstall the driver. Restart. >>> Now the system is working normally and the driver is signed by Advanced Micro Devices, Inc. All good, but what the heck happened? Both Windows Automatic Update and the Windows Update Medic Service are disabled. I am thinking about setting my internet connection to NO GO in the startup options to prevent any trigger from downloading more Microsoft cr@p. Your thoughts?
  21. This GPU has huge potential, but many things are holding it back. I'm still trying to figure out a few things to improve performance further, but it is as I thought when I first tested it: a strong GPU.