Everything posted by LucShep

  1. That higher resolution (7680x1440) will push the VRAM of 16GB GPUs near the limit with some DCS module and map combinations (the "heavy" ones), more so in multiplayer. You can, of course, reduce the texture quality to "medium" (or terrain to "low" only) whenever necessary, which alleviates the problem. Raw performance wise, I'd say any GPU that is close to 250% faster (or better) than your GTX1080Ti will be a huge upgrade, good for triple 1440P. Not sure it helps but, for sim-racing, there are a few comparisons of the AMD RX 7900XTX with the Nvidia RTX 4080 Super on triple screens. Those roughly translate to the AMD RX 9070XT and Nvidia RTX 5070Ti, which have comparable performance respectively. The RTX 5080 is about 10% faster than these (at 4K and higher resolutions) but it's significantly more expensive right now (too much for what it is, IMO). DCS is a performance hog, with a large hit on triple screens (and VR). There are sim-racing titles that get somewhat close, like Assetto Corsa Competizione (unlike the original Assetto Corsa, which is much lighter on resources) or even Automobilista 2 and iRacing, which can also be demanding on triple 1440P. So, GPU comparisons/benchmarks with such sim titles on triple screens may be worth looking into (see time stamp 7:21 onwards in the first video, and 6:17 onwards in the second). Triple 1440P (7680x1440 = 11,059,200 total pixels rendered) is a lot more demanding than Triple 1080P (5760x1080 = 6,220,800 total pixels rendered), so take that into account when looking at framerates for that resolution in such videos for these GPUs.
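     A quick back-of-the-envelope sketch of that pixel-count comparison (plain Python, using the numbers from the post; the single 4K row is added just for reference):

     ```python
     # Rough per-frame pixel-load comparison for the resolutions discussed above.
     resolutions = {
         "Triple 1080P (5760x1080)": 5760 * 1080,
         "Triple 1440P (7680x1440)": 7680 * 1440,
         "Single 4K    (3840x2160)": 3840 * 2160,   # added for reference
     }

     baseline = resolutions["Triple 1080P (5760x1080)"]
     for name, pixels in resolutions.items():
         print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x triple-1080P)")
     ```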
  2. Also, Park Control is a nice complementary app to Process Lasso (it's from the same developer, Bitsum), to unpark the cores. https://bitsum.com/parkcontrol/ After it is installed, run it and make sure it's set to start at Windows login (settings are in the app's icon, at the bottom right of the desktop). Some will change the "Active Power Profile" manually to "Bitsum Highest Performance" before running the game. More than just a high power profile, this also unparks any and all cores. But if you also have Process Lasso installed, then I personally find it's better to do it this way, to automate things:
     Open Process Lasso. Go to "Options / Power / Performance Mode" and include the game's executable (browse to the game's directory and add the game's .EXE to the list). This automatically enables a higher power plan (Bitsum Highest Performance), with noticeable performance benefits, whenever you run a program (game, etc.) added to that list, reverting to whatever it was when you close it. With Park Control also installed, it will then also ensure that no cores are parked when using that highest power plan, for whatever apps/games are included in that list.
     Run the game... Alt+Tab... back to Process Lasso. In the game's executable (listed among all the others), right click and change or tick these options:
     - CPU Affinity / "Always" / "Select CPU Affinity" / change according to preference (to "P-Cores" only, or to all cores, as you wish) and click OK.
     - Induce Performance Mode (checked)
     - Exclude from ProBalance (checked)
     *side note: there's the odd game that likes to run in 'High Priority' -- for that, if necessary, also do "CPU Priority" / "Always" / change to "High"
     With both Park Control and Process Lasso installed, that should be it.
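     For anyone who prefers scripting over clicking through the GUI, below is a minimal sketch of the same core-affinity idea using Python's psutil. It is only an illustration of what a Process Lasso affinity rule effectively does, not the app itself; the executable name and the core list are hypothetical and must be adapted to your own CPU's P-Core/E-Core layout.

     ```python
     import psutil

     TARGET_NAME = "DCS.exe"        # hypothetical target process
     P_CORES = list(range(16))      # hypothetical P-Core logical indices; check your CPU layout

     for proc in psutil.process_iter(["name"]):
         if proc.info["name"] == TARGET_NAME:
             proc.cpu_affinity(P_CORES)              # restrict the process to these cores
             proc.nice(psutil.HIGH_PRIORITY_CLASS)   # optional 'High' priority (Windows only)
             print(f"Pinned PID {proc.pid} to cores {P_CORES}")
     ```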
  3. Not sure Process Lasso will help in your case, considering that DCS is always changing and evolving. But assigning the P-Cores only to DCS, and all background apps (the non-Windows-OS ones) to the E-Cores, may help. It works for some and doesn't for others; some prefer to have all cores (P-Cores and E-Cores) assigned to DCS, so YMMV. Nothing to lose, other than your time that is...
  4. The GPU vs CPU usage discussed in the early 2010s does not relate at all to present DCS versions. AFAIK, dual-GPU tech like SLI and Crossfire stopped working with the release of DCS 2.7 and later versions (to this day). DCS version 2.5.6 (the version in my signature, back from early 2021) seems to be the last version with which SLI and Crossfire still worked. Also, notice that in the early 2010s we were on the old legacy engine (forward rendering, non-deferred shading) and single-threaded. DCS in later years (since version 2.5.0) makes much, much bigger use of the GPU, and now also of the CPU with multi-thread adoption (since version 2.8, with MT).
  5. Yes, it's a PITA: you'll have to pay for shipping the board to them, trust the hardware gods that your motherboard will be correctly handled the whole time (it can take a while), and in the meanwhile you'll be deprived of the system (or need to mount your previous system components if you need a working gaming PC). But in the end it may be better. The worst case scenario usually is that they tell you the motherboard has no problem whatsoever, that the problem lies elsewhere (RAM, CPU, PSU, etc.), and they send it back. Best case scenario (and the intended result, I guess), a defect is found and you're either given a replacement (same product or a direct equivalent) or a refund. https://softhandtech.com/how-do-you-rma-a-motherboard/ (and before anything, contact the shop/seller where you got that motherboard and ask for their direct assistance in the process) The thing I'm still not understanding, when you mention that disabling the XMP profile presents no issue, is... 1) ...is this with the basic (safest but slower) auto speed/timings? 2) ...or, if you manually insert "XMP-alike" values for speed, primary timings, correct gear mode, and the right voltages (etc.), does it then work fine with no bootup issue?
  6. As someone who decided to go on the same adventure two years back, I can tell you about an important thing that you should not ignore... VR headsets render scenes twice (once for each eye), for the stereoscopic view that each screen needs to provide (a POV slightly different from the other). This translates into much, much higher hardware requirements than even a 4K screen. Moreover, the sensitivity to frametimes and framerate is amplified tenfold. Meaning, you'll need a considerably beefy system to make it run smooth and pleasant all the time (not easy, even with reprojection), with every DCS module and map. Also, be prepared to fiddle with settings on and on, read and learn stuff, etc., until you reach a balance of usage (performance vs quality), for which settings end up sacrificed. You need to be patient and willing to do that; if you aren't, then I don't think it's for you. I do agree that DCS in VR is a whole new experience of immersion (amazing!) but it is a different animal, and the current iteration of DCS is (still) way too demanding IMHO. That's why I ended up using a much older version to far better effect (the one in my signature).
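     To make the "renders twice" point concrete, here is a rough, hedged calculation. The per-eye resolution and supersampling factor below are assumed example values, not any specific headset's spec:

     ```python
     # Hypothetical per-eye render resolution and supersampling factor (examples only).
     per_eye_w, per_eye_h = 2160, 2160
     render_scale = 1.3   # assumed supersampling factor, for illustration

     vr_pixels = 2 * int(per_eye_w * render_scale) * int(per_eye_h * render_scale)  # both eyes
     four_k = 3840 * 2160

     print(f"VR per frame : {vr_pixels:,} pixels")
     print(f"4K per frame : {four_k:,} pixels")
     print(f"Ratio        : {vr_pixels / four_k:.1f}x the 4K pixel load")
     ```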
  7. @scommander2 You mentioned you would like to use AA batteries instead. Grassmonkeysimulations (U.S. based) has wireless IR controllers for head-tracking that work with any 1.5v AA battery, including rechargeables. They seem far more robust than the TrackIR Clip Pro as well. They're not cheap ($49.95) but maybe worth considering. They exist in two formats, Puck and Odyssey, and are TrackIR compatible: https://grassmonkeysimulations.com/product/the-puck-ir-w-o-camera-trackir-version/ https://grassmonkeysimulations.com/product/odyssey-ir-w-o-camera-trackir-compatible-version/
  8. IIRC, Gigabyte motherboards usually have a section for DRAM voltage control with that setting (DRAM Voltage); it's located under the TWEAKER tab. I haven't looked yet, but it helps to check the BIOS manual of your motherboard: https://download.gigabyte.com/FileList/Manual/mb_manual_intel800-bios_e.pdf?v=3a1f079a1e6e76af6f34412100dfb61a The example below is from a 13th/14th gen Z790 Aorus DDR4 motherboard:
  9. You're crazy, that CPU is going to get fried soon with SA Voltage at 1.40v! CPU System Agent Voltage is the power for the memory controller, and it should not be messed with except in specific OC'ing cases. Yes, it usually helps with general OC'ing and also for RAM, but you need to know first what the safe voltage limit is for it on your specific CPU model. With the Intel 12th, 13th and 14th gen CPUs you'd risk degradation with it set at over 1.30v, and I don't think Arrow Lake accepts using that much (stock for it is what, 1.10v?). The setting you need to look at in your BIOS is DRAM Voltage (which every manufacturer usually has in the BIOS) - this is where you adjust the voltage for your RAM sticks.
  10. Some months ago I made a sort of tutorial on the same subject, I hope it helps somehow:
  11. I presume from previous replies that you already checked that the socket is OK (no bent pins or dirt) and reseated the CPU, so at this point... yeah, it could be any of those things. The problem with this kind of issue is not having other components to test with (another compatible CPU, another memory kit); there'll always be a hint of doubt (is it? could it be?). But, considering such limitations are normal for any regular user, and since you have a valid warranty, you're entitled to RMA that board if you're unhappy with how it works.
  12. It's unfortunate (and most common) that you don't have another CPU for that socket, to test whether the CPU and/or socket pins are to blame. But yeah, you might be getting somewhere, because A2 and B2 should each be able to run single-channel mode (1x stick only) just fine, and they are the ones to use for 2x sticks of RAM (dual-channel mode). If A1 and B1 do not present the problem (no boot issue and no issues in normal use, other than the wrong-slot message from the DRAM LED) then you may have found a culprit. Or a partial (temporary) solution, if that pair of slots in use results in less of an annoyance and you don't want to RMA the board yet. From your motherboard's manual:
  13. You can try to repeat the same process again with only one of the two sticks: load BIOS optimized defaults and save/restart. Then enter the BIOS, and either load XMP or do your own manual RAM setting adjustments (from the default optimized ones, without XMP loaded), as you've done already. Try repeating the process with one stick in different RAM slots of your motherboard (of the four available), to isolate possible issues happening in a specific slot - then it'd be the motherboard's fault. Supposing that only one of the two memory sticks has that bootup issue, in whatever RAM slot of the motherboard, then you have found a possibly faulty memory stick. It could be that it is an untested memory kit that isn't listed in the QVL (see here if listed) and/or possibly unstable on that board. Also remember - XMP isn't always guaranteed to work with full stability on every motherboard. Sometimes it doesn't. Check if it's in the correct gear mode for the memory speed in place. If it already is, then you may have to relax the primary timings (try loosening them one step) and/or slightly increase the DRAM voltage (say, by 0.05v). But even that isn't guaranteed to work. Again, I have no experience with Arrow Lake, it works completely differently from previous Intel CPUs, so I can only speculate. Maybe look into SkatterBencher's Arrow Lake MemSS OC guide (here), it could lead to some helpful pointers.
  14. Well, then it seems the problem is not in the motherboard or its default (optimized) settings. That calls for troubleshooting, if you have the patience. It seems the fault is within some OC setting. Whether it's CPU related or RAM related is your investigation mission now, again if you have the patience. XMP is a different thing (it could be a memory fault then); you need to test one without the other and, once you've found which one it is, see which related setting is causing it. All that fun stuff!
  15. Whether or not you want to RMA the motherboard is up to you. Personally, I too don't think it's a big problem you've got there. If all is good apart from that little detail, maybe see how it goes and re-evaluate the need closer to the end of the warranty. And, yes, I'd strongly suggest always loading optimized defaults (on any motherboard), then rebooting, and doing your personal preference adjustments in the BIOS afterwards (OC or not), then saving a BIOS profile of your new settings, and you're off to the races. Which it seems you already did; if it no longer presents the problem, then I'd just do some stress testing to monitor temperatures, voltages and wattage, to be sure nothing is "odd" (if it is, adjust accordingly). Some manufacturers pump "funny" higher voltages for better benchmark scores with the optimized defaults BIOS settings, so this is just to be sure all is good. If all is good, consider it solved for now.
  16. The last time I saw that "C5" code on a Gigabyte motherboard was with a Z170X UD5. But then, that was many generations ago (an i7 6700K back then) and a lot has changed since. I remember updating the BIOS, then resetting it afterwards, loading optimized defaults, rebooting, and only then starting to change and test things as necessary. In that case, the C5 code, which is usually listed in the manual as "reserved" (thanks Gigabyte!!), was memory related. Though that code may also pop up with a badly seated CPU (or dirt) in the motherboard socket. It might be a memory-training related thing, which could explain why it only happens on cold boot, not on a Windows restart(?). I have no experience with the current "Arrow Lake" generation of Intel CPUs, or their motherboards, so I'll refrain from further speculation. But, admittedly, I personally avoid Gigabyte motherboards like the plague (a beefy mid-range Asus or MSI is still my go-to, every time) because every single one I've messed with has been finicky, very sensitive to different BIOS versions, with (too much) varying behaviour when it comes to memory OC. In the meantime, I'd suggest having a look at similar subjects on Reddit and the official Gigabyte forums, as brainstorming occurs there and might lead to something: https://www.reddit.com/search/?q=gigabyte+z890+c5+error&cId=50b4f566-4b3d-405b-9c07-64dbf85fde16&iId=e306d4d6-b4db-431e-8c59-205ed1992d64 https://forum.giga-byte.co.uk/index.php
  17. Agreed, I think we may be witnessing a slow and gradual shift in the gaming GPU arena, perhaps for the next years to come. Similar to what happened with CPUs right before 2020, when AMD Ryzen became a true alternative to Intel "K" processors at very competitive prices. The globally inflated prices and low availability, and also constant driver development, are factors that can favor AMD GPUs right now - even if prices are bad at the moment.
  18. It depends on the module + map combination, and whether you're using it in SP or MP. And, of course, the higher the quality settings and the resolution (4K or VR, for example), the higher the VRAM guzzling. Try the F-14A/B, or F-4E Phantom II, or CH-47F Chinook, or AH-64D Apache, or KA-50 Black Shark modules in Afghanistan (any variant), or Iraq, or Kola, or Sinai, or South Atlantic, or Marianas (maybe some other new-ish maps and modules too). Then, and more so in MP, you'll see much bigger VRAM usage. This is problematic for perception because, for example, if it's a newcomer just using the default Caucasus map with the Su-25T or FC3 (now FC4) aircraft in SP, then the VRAM usage isn't too much of a problem (though it certainly could be better). The VRAM concerns will probably not even be realized until later, when acquiring "heavier" modules and maps. PS: for VRAM monitoring, I'd suggest either LibreHardwareMonitor or HWInfo, as these separate the VRAM measurements (what's actually in use vs what's allocated).
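      If you'd rather have a quick scripted readout than a GUI monitor, here's a minimal sketch using NVIDIA's NVML Python bindings (pynvml). Note this reports device-level VRAM usage for the whole GPU (NVIDIA only), not the per-process "in use vs allocated" split that the tools above provide:

      ```python
      import pynvml  # pip install nvidia-ml-py (NVIDIA GPUs only)

      pynvml.nvmlInit()
      handle = pynvml.nvmlDeviceGetHandleByIndex(0)     # first GPU
      mem = pynvml.nvmlDeviceGetMemoryInfo(handle)      # bytes: .total / .free / .used

      print(f"VRAM used : {mem.used / 1024**3:.2f} GiB")
      print(f"VRAM total: {mem.total / 1024**3:.2f} GiB")

      pynvml.nvmlShutdown()
      ```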
  19. It depends on your needs/preferences as a user, more than just budget (though of course value for money is important). The Nvidia RTX 5080 16GB is just too costly for what it is; IMO it makes no sense at current "real" prices in any of its versions. The AMD RX 7900XTX 24GB is indeed an interesting proposition because of the much bigger VRAM buffer (24GB). But its performance is roughly equivalent to the new RX 9070XT while being considerably more expensive, it also runs hotter (higher power consumption), plus it's limited to FSR3 (which is far lower quality), and with the newer RX 9000 series getting all the focus from AMD for driver updates now, that makes it less appealing. Instead, I'd suggest getting an Nvidia RTX 5070Ti 16GB or an RTX 4080 Super 16GB, if you can get one priced up to $100 more than the AMD RX 9070XT 16GB. If the price difference gets bigger than that, then I'd entertain the idea of an RX 9070XT 16GB acquisition. Overall, for a 4K resolution screen, any of these three mentioned GPUs will satisfy. If you intend to use the DLSS upscaler (and maybe also its Frame Generation), either Nvidia model will, of course, be the immediate choice. But with the AMD RX 9070XT you can force FSR4, with a similar result, through the drivers on games that haven't adopted FSR4 - that is, if the game supports an older version of FSR. And if it doesn't, and the game of your choice only supports DLSS, then there's also OptiScaler to work around that with AMD (see video demo here). If considering VR use later, then either Nvidia model will surely be a better choice. But that doesn't mean the new AMD RX 9070XT won't work "well enough" in VR (see here).
  20. Really? ...So updates of Windows 10 are no longer breaking WMR??
  21. Nice one. Another happy Sennheiser user. PS: yep, that's it. The audio crackles and gets wonky if the connector is not all the way in and correctly seated/secured (as per instructions, if confused). Once that's done, the skies open and the dark clouds are gone. PS2: if you ever need replacement cables or ear-pads, they're also sold separately on Amazon.
  22. Yeah, that is a good remark. The advantage of a CPU like that could be that you can allocate every "non-gaming" application to the "non-X3D" CCD, freeing the CCD with the 3D V-Cache just for DCS. A bit like you can do with Intel CPUs with E-Cores, freeing the P-Cores for DCS.
  23. 300.00EUR is a huge difference between those Ryzen 9 chips. It shouldn't be more than 150.00EUR between the 9950X (~660€) and the new 9950X3D (~810€). Are you sure you're searching for the best prices? But maybe it's the "novelty factor" of the 9950X3D inflating prices(?), and the 9950X appearing with discounts...