
Thinder

Members
  • Posts

    1413
  • Joined

  • Last visited

Everything posted by Thinder

  1. I swapped my MSI X570S ACE MAX for an ASUS TUF Gaming X570-Plus [Wi-Fi] after the failure of the MSI's PCI_E1 slot, and frankly I wonder why I splashed the money on the MSI in the first place. Perhaps you should consider the best mid-range board instead, and I think that is the ASUS TUF Gaming. About the RX 7900 XTX: the GPU is good, but there are driver issues, apparently not AMD's fault. Windows Update replaced the AMD drivers with its own, and there are also settings preventing the GPU from boosting; when you try to boost it manually you get crash after crash. The solution is disabling Multiplane Overlay (MPO): now I can OC my card, it boosts, and it runs smooth and cool.
  2. Pico 4 running on a PC requires a connection to the PC, either Wi-Fi or USB; use the fastest USB port you have (USB 3.2, the blue one, is recommended).
  3. You can use Steam VR to play DCS with the Pico 4 and its app, Virtual Desktop; in fact it is quite simple to set up. Go to your Open Beta folder, then the bin-mt folder (for multi-threading), create a shortcut, cut it and paste it on your desktop, then add the line --force_enable_VR --force_OpenXR after (with a space) "D:\OtherGames\DCS World OpenBeta\bin-mt\DCS.exe". It should look something like this: "D:\OtherGames\DCS World OpenBeta\bin-mt\DCS.exe" --force_enable_VR --force_OpenXR. Then 1) start Streaming Assistant, 2) start Steam VR to detect and link your headset to your PC, then 3) start Virtual Desktop. In the Games page of Virtual Desktop, add the exe you just made a shortcut of; it should start automatically when prompted. You can choose whether or not to use Steam VR for processing frames, and you can leave every option at its default or play around with it; personally I leave it alone. I use Steam VR Beta, and after finally setting everything up to play DCS, I did my first tests in multithreading/VR. It looks rather good, a ton better than with the first "DCS VR" update: I experience no lag, flickering, or any loss of image quality, but I had to boost/OC my GPU (good case cooling helps a lot) to get similar performance to where I was before the first DCS update. NOTE that apparently Microsoft is messing with AMD drivers: one Windows 11 update came with settings which prevented the GPU from boosting, and when I tried to do it manually I had crash after crash with the mention "Driver Time Out". The solution is to 1) disable Windows Update before you play, and 2) use an app to disable Multiplane Overlay (MPO), a fix designed for NVIDIA GPUs which works with AMD as well. Something else: I received a message from AMD asking me to check my drivers, as Windows Update replaces them with its own. Something of a cold war is going on...
Disabling Multiplane Overlay (MPO). Since I've done that I have been able to OC my card quite well with no adverse effects; my GPU runs within thermal limits, the maximum recorded under heavy load being 81°C, but then again I have excellent case cooling (4 fans in, one fan out, plus the CPU cooler fan) and took care to adjust the fan speed curve.
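The shortcut Target line from the steps above can be sketched as a small script. This is just an illustration of the quoting: the install path is the example path from the post, so substitute your own, and the `shortcut_target` helper is mine, not part of DCS.

```python
# Sketch: build the shortcut Target line for DCS multithreaded VR.
# The path below is the example from the post; change it to your install.
DCS_EXE = r"D:\OtherGames\DCS World OpenBeta\bin-mt\DCS.exe"
VR_FLAGS = ["--force_enable_VR", "--force_OpenXR"]

def shortcut_target(exe_path, flags):
    """Quote the exe path (it contains spaces) and append the VR flags."""
    return '"{}" {}'.format(exe_path, " ".join(flags))

print(shortcut_target(DCS_EXE, VR_FLAGS))
```

The quotes around the path matter: without them, Windows cuts the command at the first space in "DCS World OpenBeta".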
  4. You might want to try this... mpo_disable.reg. I lost about 10% performance from stock (brand-new, out-of-the-box stock settings) after testing my GPU in DCS (VR). Since there had been a DCS "VR Update" at the same time, I was sure it came from that, which in part was true, but there are other things which prevent GPUs from performing to their true potential in Windows 11 (Pro in my case). You won't believe that I now have to boost my card to get within a few % of the performance I had with a stock card, and when I tried to OC it, it crashed with the mention "Driver Time Out". So I investigated and found this solution used by an AMD user; it seems to work on many levels, but I am sure it is not the only thing in Windows 11 holding back GPU performance. Here are my actual settings: the card runs smooth, no crashes, no flickering, image quality and clarity are good, and performance is within a few % of what I had before losing it to whatever it was, so this app seems to be working. Note that I have very good case cooling (5 case fans + 1 CPU fan) and the GPU temperature stays within reasonable limits (max 81°C junction under heavy load). I can't guarantee that this solution will work for you, but it's worth a try.
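For reference, the widely circulated mpo_disable.reg (originally published as an NVIDIA workaround) typically contains the lines below. This is the commonly shared version of the file, not necessarily the exact one attached to the post; back up your Registry before applying it, and a reboot is needed for it to take effect.

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm]
"OverlayTestMode"=dword:00000005
```

Deleting the OverlayTestMode value restores the default MPO behaviour.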
  5. To get to this point we need GPUs left free to perform by the OS. I've lost roughly 10% raw performance in benchmarks without explanation; one of the culprits is Microsoft's Multi-Plane Overlay, which prevents the GPU from boosting. I've tried several different drivers (thanks to a link to a pool of AMD drivers provided by Overclockers) with no improvement since I noticed the loss. For those interested, there is a file disabling Multi-Plane Overlay via the Registry Editor; at least now I can OC the card up to boost values and it gets there. My card was tested by Overclockers, who didn't find any hardware fault with it, the same conclusion as mine before I shipped it to them for testing. The file was originally designed for NVIDIA but it works for AMD, as it prevents Driver Time Outs and random crashes when you try to boost the card. After testing I'm only a few % (1.855) from where I was before I noticed this drop in performance. Having said that, the card performs as it should (as advertised) without OC/boost; it's only that I have to boost it to 3050MHz core and 2650MHz RAM to get similar test results in 3DMark Pro to what I had out of the box. For info, even at those frequencies it runs cool and stable. The video I posted was recorded before that 10% loss on stock settings, after a fresh Windows 11 Pro install and complete update; there is more than one thing that messes up AMD GPU drivers in Windows... MPO-Disable. Now I can try to play DCS MT in VR; I'll see how it goes. I've done the exact same thing for the exact same reason. I thought Pimax was taking the mickey; chances are the cost of top VR headsets will come down anyway, and I don't regret my choice...
  6. Frankly, when you play it is as good as a G2 (Pico 4); it's not as if USB 3.2 Gen 2 with 10Gbps bandwidth is slow enough for you to notice. Up to now I have been rather happy with my Pico 4 despite the little "DCS VR Support" update that resulted in so many issues; nobody came up with a viable solution until the next one. I have done all my GPU testing with this combo, mainly treetop flying at high speed, and noticed no lag. You can play at Ultra resolution, 150Mbps bitrate, 90Hz refresh rate; it will reach the limit of the GPU before you can complain about USB link performance. The only problem I have with it is that the USB isn't recharging the battery when in use, so you need to take regular breaks; other than that it's OK. With a resolution of 4,320 x 2,160 (2,160 x 2,160 per eye), it would take a Pimax Crystal to need a different connection, but that's a $1,599 starter vs $599 for the 256GB version. While my GPU was being tested by Overclockers, I used the Pico's internet connection, watched a lot of movies, downloaded more than a few, played Pico-native games, etc. For the price, I wouldn't swap it for a G2; the Crystal can wait...
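To put a number on the claim that the USB link is nowhere near the bottleneck, a quick back-of-the-envelope check using the figures quoted in the post (10 Gbps link, 150 Mbps stream):

```python
# Rough check: how much of a USB 3.2 Gen 2 link a 150 Mbps VR stream uses.
# Figures are the ones quoted in the post; real-world throughput is lower
# than the 10 Gbps signalling rate, so this is a best-case estimate.
LINK_MBPS = 10_000   # USB 3.2 Gen 2 nominal rate, in Mbps
STREAM_MBPS = 150    # Virtual Desktop bitrate used in the post

utilisation = STREAM_MBPS / LINK_MBPS * 100
print(f"{utilisation:.1f}% of the link")  # 1.5% of the link
```

Even if protocol overhead cut usable throughput to a quarter of nominal, the stream would still use only a few percent of the link.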
  7. The minimum frequency setting can be left at its default of 500, it matters little; same for Zero RPM. Whatever you're trying to do, under-volting or OCing your GPU, the most important things are to make sure 1) Smart Access Memory is turned ON, and 2) your cooling is adequate. Then run tests between each adjustment to verify stability; you can use Valley Free. Consider this: every chip is different, and yours might run faster or slower than another strictly identical model (silicon lottery), so use other people's settings only as a starting point for finding what suits your GPU. What I have found is that these GPUs are OK being OCed when running benchmarks but not so stable with the same settings in gaming. Drivers have played a large role in the performance of the 7900 series; I got my best performance with a driver from early January, and the difference can be as high as 10% in testing at 4K.
  8. True, but only because the PCI_E1 slot never provided the volume of bandwidth the CPU would struggle with; in fact, x16 PCI_E1 on this board only kicks in when the limit of x8 bandwidth is reached, and since the slot was faulty and running slower than PCI_E2 (x8 only), that never happened. As for the CPU controller limitation, I have long had the information from G.Skill and now confirmation from ASUS: the Ryzen 3/3D controllers will throttle back if they have to run more than 4 ranks, but this only occurs under load. If you play at a lower resolution than 4K you might not even be aware of it, the same way a PCI_E1 slot running at x8 will fool you and your CPU controller; at least, this is what I understood from the G.Skill and ASUS emails. The only G.Skill sticks with one rank are the 8GB 3200 or 3600 MHz; the 16GB sticks are two ranks, so I had this issue without being aware of it (or rather without understanding it) from the time I installed this kit. No. The MSI B450 and 5600X always ran with CL14 3200 MHz from the time I replaced the Crucial "Gaming" 3200MHz kit of the same capacity (32GB), but the G.Skill was a 4 x 1-rank kit, which allows for interleaving on those CPUs. I purchased the 32GB kit on 1/22/2021, the 3600 32GB CL14 on 11/16/2022, then the 64GB kit on 1/31/2023. In retrospect, the last purchase was a mistake because I overlooked the rank limitation issue: I already knew only the 8GB sticks are 1 rank, I just didn't put the dots together and focused on my idea of using 3DS Max again for spaceship design. So right now I have only lost interleaving, which accounts for roughly 2-3% in performance but perhaps makes the job of the CPU a lot easier, because it can access and write data on more than one stick, which with the Ryzen 7 5800X3D might be useful. Also, both recent emails, from Overclockers and from ASUS, came to the same conclusion as what G.Skill was telling me a long time ago: under load, the number of ranks per stick causes the CPU to throttle down.
Using only two sticks doesn't bother me more than that, as long as it works well enough, and it is way faster than using all four. Here is what MSI support told me on the subject when I inquired about different RAM kits.
  9. First things first: go to your BIOS and turn on AMD Smart Access Memory. Check your case and CPU cooling: those cards have a thermal limit and will throttle down when they reach it. Use the manual fan control curve to make sure your CPU and GPU don't overheat; you can set a more aggressive curve, it's easy to do and will avoid some unnecessary trouble. The cooler your case airflow, the more you can push the GPU if you boost it. Here is my fan curve to give you an idea; I don't use Zero RPM.
  10. If you want people to give you relevant comments/answers, you need to put your complete system specs in your signature. As BitMaster pointed out, we need to know: PSU wattage; motherboard model and BIOS; type of RAM (frequency and CAS latency). This way we will be able to figure out whether your system is capable of supporting a Ryzen 3 or not, whether your RAM is properly matched to your CPU, and whether you have enough power output to feed the whole thing. If you decide to stick with AMD and go for a Ryzen 3, the best upgrade available is a CL14 B-die RAM kit: the gain in performance is noticeable enough to equal a water-cooled GPU overclock, it will keep your CPU safer than OCing it, keep your warranty, and remove the RAM-CPU bottleneck so the CPU keeps running at full throttle under load (4K). If you're not planning to play at 4K, you don't need a B-die kit; a good CL16 will do. The whole point is bandwidth: if your system doesn't push your CPU controller beyond a certain level, it will manage the bandwidth without throttling down, and a CL16 kit will be good enough. There is one catch though: make sure you do not fit more than 4 ranks in your system, either 4 x 1-rank or 2 x 2-rank sticks, as the Ryzen 3 controller won't manage more than that. It's on you to do your homework and figure out which RAM you want and how many ranks your RAM of choice has. If your budget is limited, I can advise a very good mobo: the MSI B450 GAMING PLUS supports the 5600X and CL14/3200 MHz RAM, and the G.Skill kit can also be OCed to 3600MHz, it was designed for that purpose so there is little risk in doing it. Such a combo will serve you well, and the 5600X with good RAM is a good match for a 3080; just don't waste this combo on a non-B-die CL16 RAM kit...
  11. Thanks! Now I run an ASUS TUF Gaming X570-PLUS [WI-FI]. Installation was pretty straightforward, the BIOS easy to update, and I immediately noticed a clear increase in image quality and clarity at standard GPU settings playing Elite Dangerous Odyssey. It has slightly fewer features than the MSI but the quality is high, the battery is accessible without having to dismount anything, and I can still run 5 + 1 fans and an M.2 SSD... There is one little problem though. With the MSI, the PCI_E1 slot never gave me x16 bandwidth, it was running slower than PCI_E2, so the CPU could tolerate my G.Skill CL14 kit at 3600MHz; the controller would manage since the bandwidth never reached its load limits. With this one, it seems the CPU controller won't take 3600MHz, only the next lower frequency, because the CPU would become unstable: 4 x 2 ranks at 3600MHz doesn't work with this controller. That negates my assumption that these controllers were different from the other Ryzen 3s; they apparently are not, so I might have to swap RAM again and go back to a 4 x 1-rank kit. I still have to run performance tests with 64GB and 32GB to see if it makes a difference (it should in theory). Then, if I'm lucky and CEX haven't sold my old kit, I'll buy it back from them; otherwise it has to be new, and the cheaper 3200 MHz kits are out of stock, so if I want a 4 x 1-rank kit it has to be 3600MHz. Once I'm done with this, I'll start with multithreading in DCS VR. Here we go. Back to back, standard GPU settings, 4 sticks vs 2 sticks in 3DMark Time Spy: 4 sticks, 18,349 overall (CPU score 9,095); 2 sticks, 23,227 overall (CPU score 9,912). As I expected there is a clear difference between the two: using the 4 x 2-rank RAM kit resulted in a loss of 21% in overall performance (-8.98% CPU score). That's the result of the Ryzen controller throttling down under load. Frequency was 3200MHz; now it runs at 3600MHz without problems.
Conclusion: 4 x 1 rank is the maximum the Ryzen 3/3D will cope with. I was right before; changing my mind on a false assumption was a costly mistake.
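As a sanity check, the percentages above can be recomputed from the posted Time Spy scores (18,349/9,095 with four sticks, 23,227/9,912 with two). Note the two figures use different baselines: the overall number is a loss measured against the 2-stick score, while the CPU number is the 2-stick gain over the 4-stick run.

```python
# Recompute the 4-stick vs 2-stick gap from the 3DMark Time Spy scores
# quoted in the post.
four_overall, four_cpu = 18349, 9095
two_overall, two_cpu = 23227, 9912

# Overall score, expressed as a loss relative to the faster 2-stick run.
loss_overall = (two_overall - four_overall) / two_overall * 100
# CPU score, expressed as the 2-stick run's gain over the 4-stick one.
gain_cpu = (two_cpu - four_cpu) / four_cpu * 100

print(f"overall loss with 4 sticks: {loss_overall:.1f}%")  # 21.0%
print(f"CPU gain with 2 sticks: {gain_cpu:.2f}%")          # 8.98%
```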
  12. As far as I know, the G2 has one of the best colour displays of any VR headset, better than my Pico for example; your solution lies in your GPU settings.
  13. To motherboard hell and back... When I got myself a brand new MSI MEG X570S ACE MAX as part of my 2023 upgrade, it all went downhill from there. First I had no signal with a brand new GPU: the PCI_E1 slot wouldn't work, so I had to install it in the PCI_E2 slot, losing performance (x8 vs x16) and a USB connection in the process (the GPU is too thick to allow the connector to be fitted when in PCI_E2). Other little details... The Flash BIOS button never worked; the battery is quasi-inaccessible because it sits halfway under one of the mobo heatsinks; and the BIOS is unstable, displaying boot options instead of the USB drive. I managed to get around this issue by dropping the BIOS file into an empty listed partition; once it became visible I was able to update my BIOS... As much as I would recommend the MSI B450 GAMING PLUS motherboard to anyone wanting to build a starter system, I'm saying stay away from the MEG X570S ACE MAX; if you're as unlucky with it as I've been, you will regret it. I just ordered an ASUS TUF Gaming X570-Plus WiFi (AMD AM4) DDR4 X570 chipset ATX motherboard. It should contribute to solving some of my issues; then I'll be able to work on the AMD driver issues (if the GPU keeps under-performing), re-install DCS, and get it to work in multithreading. At the moment I do not have the necessary testing equipment to tell which part of my PC, other than the motherboard and its BIOS, is causing the remaining problems. Wish me luck.
  14. Agreed, your analysis is spot-on. But if I remember well, B-die wasn't necessarily designed for industry. I used to work as a CG techie, using 3D Studio Max, CATIA (a Dassault Systèmes package for industrial CAD design), and Fluent for fluid simulation, and I never heard of B-die back when all the gamers were using it to OC their RAM to reach the frequencies that Ryzen and Intel RAM didn't support. If anything, we wouldn't even have considered using it, simply because we went for the most stable and safe system possible rather than OCing anything (we built our PCs in house, btw), and in the case of 3DS Max, lowering cost was a major goal. Imagine: from a £35,000-per-seat Silicon Graphics machine to a network of PIIs with 256MB of RAM each, allowing techies to work on different aspects of a project and then network-render, for a fraction of the price. It was a revolution... Gaming has another advantage looked at this way: it pushes the boundaries of technology and supports the top end of R&D in this particular field. It was for gamers that Silicon Graphics decided to go from the specialized aviation/military niche to mainstream, because they figured gamers would largely contribute to R&D finances. That gave you all the advances in GPUs and RAM, and it ended up with simulation at squadron level on COTS PC platforms, as used by the French Air Force (RAZBAM M-2000). If it hadn't been for us, buying unproven, ever more competitive GPUs from the local shops, their R&D costs would have stayed as high as they were, and it was a very expensive business at the time.
So we can take some credit for the advances in technology we've seen since the mid-80s, while keeping in mind that manufacturers tend to abuse this and get us to pay premiums just for the sake of lowering their R&D costs, with equipment that isn't always worth it. I doubt very much that a company like Dassault Aviation or Lockheed Martin is going to call Corsair or G.Skill for an upgrade of their systems; even their recommended systems for students don't come anywhere close to gaming standards.
  15. True, but bandwidth is not everything. As SkateZilla pointed out, the number of times your RAM is accessed by your GPU matters, and high latency remains a bottleneck, even more so for a Ryzen CPU. Ryzen 3/4 caches were conceived for lower latencies, and it matters little that AMD increased bandwidth with their new CPU design; they will still miss out on this capability as long as RAM manufacturers do not produce high-frequency/low-latency sticks, as was the case for Zen 3 and CL14. In fact, the situation is like not having B-die RAM today: the Zen 3/3D wouldn't be able to produce the performance they can with CL14, and under load (4K) it matters a lot. So I felt I was already taking enough risk with a new GPU without having to pay the premium for a new socket/CPU and RAM; they can do their R&D without my money... Having said that, I took the risk of going for a mediocre motherboard design with an unstable BIOS, so nobody is perfect.
  16. Thanks guys for your help. At the moment I'm dealing with a failing mobo, so I can't tell which problems are due to errors coming from that failure. For the time being I'm sticking to Elite Dangerous, with the GPU on PCI_E2, which performs better than PCI_E1; this shouldn't be the case (x8 vs x16), and I will replace this motherboard shortly. Fly safe.
  17. Thank you guys for your insights, and sorry for my late reply; I've been struggling with a failing motherboard and will get a new ASUS TUF X570 within a couple of weeks. The MSI mobo has been a source of problems for a long time now; the PCI_E2 slot is faster than PCI_E1 at 4K and it shouldn't be. As happy as I was with the B450 Gaming, this one is garbage. Anyway, the rest of the PC performs very well. Something I learned is that the Ryzen 7 5800X3D doesn't seem to have the same limitations on the number of ranks as the 5600X it replaces. It might be due to the marginal gain obtained with interleaving (4 x 1 rank): with double the capacity, the amount of data available is high enough to compensate, or the controller is also different and deals with more than 4 ranks under load. Obviously with this sort of RAM the problem is the cost; the premium is significant compared to a non-B-die kit, but the quality and performance are there, so I can't really complain, especially because it works wonders with the 5800X3D's cache. Now if anyone wonders whether I regret my decision not to go for DDR5, the answer is a definitive NO. From my PoV, the technology involved in the DDR5 die, its stability, and the latency needed to squeeze the maximum performance out of a Ryzen 3D are simply not there yet; it might, on the other hand, suit Intel CPUs much better (a familiar pattern). So this system is going to do for the time being, until hopefully some manufacturer comes up with a DDR5-6000 (for example) with half the latency they have today or lower (we can dream and add those to our Xmas list)...
  18. There is a good reason why I chose to remain with the DDR4 standard, and it's the Zen 3/3D Infinity Fabric design and the fact that they were conceived for lower latency, even more so the 3D series. I don't know how much weight you'll give to this video, but it seems to me that in some ways it validates my choice. I keep saying it: DDR5 is not ready for the AMD processors, which need lower CL to take full advantage of their cache... From my PoV we will need to wait until some manufacturer launches a DDR5 RAM kit without the stability issues that come with lower latency; higher frequencies don't always cut it, and before we see such a RAM kit on the market, my DDR4 system will have served me well for some years.
  19. Yes, I had another email from Mike at support. Thanks! I still haven't solved the problem, but we're getting there.
  20. Thanks BIGNEWY. I realized that the quotation marks around DCS.exe are missing from the line I wrote; this was highlighted by Mike from support, who gave me a link. Thanks for your help.
  21. The string is rewritten whatever I do; there is nothing I can do by editing it. It shows normally on the icon, but the .exe is added in the error message.
  22. Good suggestion, if that is the case the setting changed by itself... I'll check. Yep, it was enabled. A repair (all files, plus the additional "Search for extra files" step) did find a file with this name (.exe); I moved it to a backup and the normal start icon is still there, so I guess it was a different file taking over and triggering the error message. Testing now; you never know, I might be lucky. >>> It didn't work, still the same error message; on the other hand, DCS starts, although it is taking time and might crash...
  23. I know, but I don't write the .exe; I copy/paste the line from support, which is correct. Something writes those letters and they don't show on the thumbnail. Then I tried DCS in 2D and got the same error message, all of that after a fresh install; DCS is on its own M.2 SSD, and I ran a full system anti-virus scan before reinstalling. Anyway, the ticket has gone to support; I'm sure they'll figure it out.
  24. This is what worked for me the last time I was able to launch the game... OK, a wipe and fresh install are in order. Something is adding to the line and it's not me; new ticket sent to support. I'm happy they are there, to be honest.
  25. OK, done. Thanks again. No, it doesn't work. I didn't touch the line before it first failed to start; now I have two .exe on this line.