
Everything posted by LucShep

  1. Hey, nice one! Strange days we live in, when people get congratulated for getting a GPU at (or near) MSRP! Agreed, it would make more sense for such tech to work the other way around than it currently does for triples. That said, it may still make sense to use DLSS in DCS. Even if the gain sounds negligible in this case (it only works on 1/3 of the displayed pixels), in theory it should reduce VRAM usage and make rendering less compute-intensive for the game.
  2. I always use "Quality", even if "Balanced" is now digestible with presets J and K (I'm quite allergic to aliasing *sigh*). I do notice that dlsstweaks.log, and the app itself, mention that DLAA is not being forced onto presets. And only now do I notice that I left that option on by mistake earlier, I guess when testing the urban myth(?) that it applies an extra DLAA effect over DLSS presets regardless. ...may be a bug? ...or perhaps old eyes going for placebo. Will have to disable "Force DLAA", yep. You're right about the Global Forced Preset. The thing is, some games prefer a different preset (Cyberpunk 2077 "C", WRCG "C", GoT "K", ACC I prefer "C", and DCS I prefer "K"), so that isn't always a universal approach. And yes, I do rename the dxgi.dll regardless (to XInput9_1_0.dll, which doesn't interfere with anything AFAIK) because I use Reshade, even in VR (VRToolkitReshadeUniversal, via SteamVR).
  3. Yep, that's a very demanding display setup (triple 1440P) for DCS. It's a darn shame that both DLSS and FG only work on the main display in a multi-monitor setup, or at least that was my experience helping someone with triple screens a couple of years ago. I don't have experience with AMD on triple screens, but I suppose it'll be the same for their equivalent FSR tech. Perhaps FG (x2) works on triples with Lossless Scaling (see here)? I think you're in that "damned if you do, damned if you don't" position with a 1080Ti; the upgrade will have to happen at some point, sooner rather than later. I'd have a look at AMD. The RX 7900XT 20GB and 7900XTX 24GB can still be found, some at a discount in the usual places if you're in the USA (MicroCenter, Newegg, etc.), and probably a few of the new RX 9070XT 16GB as well. Any of these three will be a tremendous upgrade over the good old GTX1080Ti and, worst-case scenario, you can later sell them on the 2nd-hand market (or here in the forum?) if a better opportunity for a higher-end Nvidia GPU appears later. But as it is, better to forget Nvidia for the next few months...
  4. The newer presets J and K both have that soft image with less aliasing than any other preset (very nice there, though sometimes too soft), but the ghosting is still present for me. The older presets C and E have the least ghosting for me.
  5. YMMV but I'm getting better performance and image quality results with DLSS Tweaks alone. No messing with DLSS settings in Nvidia Inspector (or the equivalent NVPI Revamped) whatsoever; all those DLSS settings are best left at driver default for best results. PS: if you're the impatient, short-attention-span type, go to the end of this post. *
     I get worse results with the DLSS settings messed with in Nvidia Inspector: a slightly bigger hit in GPU usage and higher wattage consumption (so, worse impact) on my RTX3090 and, although subtle, the image quality does not look as good. DLSS Tweaks with adjusted settings seems devoid of issues and brings the desired results for me. I'm also using the latest Nvidia DLSS .dll files on their own, which also means I don't have to download the latest Nvidia garbage drivers either (I can use whichever older ones I like). What I do is:
     - Download the latest DLSS .dll files from TechPowerUp and unzip (or copy) the files into the folder where the game's executable is.
     - Download the most recent DLSS Tweaks application and unzip (or copy) the files into an empty folder.
     - Run the "EnableNvidiaSigOverride.reg" file (double-click -> Run -> Yes to all prompts).
     - Double-click the DLSSTweaksConfig.exe file, change any settings to suit your preferences, and then press "Save". Personally, I changed mine to exactly as they are in this image: (click image to enlarge it)
     - Copy and place all the files into the folder where the game's executable is.
     - Side note and optional step: in case you use Reshade or any other app that needs its own existing dxgi.dll, rename the dxgi.dll that comes with DLSS Tweaks to either XInput1_3.dll, XInput1_4.dll, or XInput9_1_0.dll.
     - Run the game and see how it goes.
     ____ ____ _____
     * Now, let's imagine that you're the impatient type and want as little fuss as possible... Then you'd download THIS (includes all the mentioned files, with tweaked settings, and compatible with Reshade), then unzip or copy all the files into the folder where the game's executable is. You'd then only need to double-click the EnableNvidiaSigOverride.reg (run, yes, etc.), adjust settings in the app if you wish (though not needed)... and it's ready to go. Run the game and see how it goes (again, YMMV; it may or may not work better for you). BTW, personal opinion, but I still think presets "C" and "E" are the best alternatives to preset "K" or "J".
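For illustration only, the copy/rename steps above can also be scripted. Here is a minimal Python sketch; the folder paths are just assumed examples (adjust them to your own downloads and game folder), and the EnableNvidiaSigOverride.reg and DLSSTweaksConfig.exe steps still need to be done manually:

```python
# Minimal sketch of the manual copy/rename steps above (paths are assumed examples).
import shutil
from pathlib import Path

dlss_dll_dir   = Path(r"C:\Downloads\DLSS_latest")   # assumed: unzipped TechPowerUp DLSS .dll files
dlsstweaks_dir = Path(r"C:\Downloads\DLSSTweaks")    # assumed: unzipped DLSS Tweaks files
game_dir       = Path(r"C:\Games\DCS World\bin")     # assumed: folder with the game's executable
use_reshade    = True                                # set True if Reshade already provides its own dxgi.dll

# 1) Copy the latest DLSS .dll files next to the game's executable
for dll in dlss_dll_dir.glob("*.dll"):
    shutil.copy2(dll, game_dir / dll.name)

# 2) Copy the DLSS Tweaks files next to the game's executable;
#    if Reshade needs dxgi.dll for itself, install the DLSS Tweaks wrapper
#    under one of the alternative XInput names mentioned above instead.
for item in dlsstweaks_dir.iterdir():
    if not item.is_file():
        continue
    target = "XInput9_1_0.dll" if (use_reshade and item.name.lower() == "dxgi.dll") else item.name
    shutil.copy2(item, game_dir / target)

# Note: EnableNvidiaSigOverride.reg and the DLSSTweaksConfig.exe settings
# still have to be handled manually, as described in the steps above.
print("Files copied to", game_dir)
```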
  6. That higher resolution (7680x1440) will push the VRAM of 16GB GPUs near the limit with some DCS module and map combinations (the "heavy" ones), more so in multiplayer. You can, of course, reduce texture quality to "medium" (or only terrain textures to "low") whenever necessary, which alleviates that problem. Raw-performance-wise, I'd say any GPU that is close to 250% faster (or better) than your GTX1080Ti will be a huge upgrade, good for triple 1440P. Not sure it helps but, for sim racing, there are a few comparisons of the AMD RX 7900XTX with the Nvidia RTX 4080 Super on triple screens. Those roughly translate to the AMD RX 9070XT and Nvidia RTX 5070Ti, which have comparable performance respectively. The RTX 5080 is about 10% faster than these (at 4K and higher resolutions) but it's significantly more expensive right now (too much for what it is, IMO). DCS is a performance hog, with a large hit on triple screens (and VR). There are sim racing titles that get somewhat close, like Assetto Corsa Competizione (unlike the original Assetto Corsa, which is much lighter on resources), or even Automobilista 2 and iRacing, which can also be demanding on triple 1440P. So, GPU comparisons/benchmarks with such sim titles on triple screens may be worth looking into.
     time stamp 7:21 (and on) in the video:
     time stamp 6:17 (and on) in the video:
     Triple 1440P (7680x1440 = 11,059,200 total pixels rendered) is a lot more demanding than triple 1080P (5760x1080 = 6,220,800 total pixels rendered), so take that into account when looking at framerates for that resolution in such videos for these GPUs.
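The pixel counts quoted above come straight from multiplying the resolutions; a trivial Python check, for reference:

```python
# Total pixels rendered per frame for the resolutions mentioned above.
resolutions = {
    "Triple 1080P (5760x1080)": 5760 * 1080,
    "Triple 1440P (7680x1440)": 7680 * 1440,
    "4K (3840x2160)":           3840 * 2160,
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

print(f"Triple 1440P vs triple 1080P: {7680 * 1440 / (5760 * 1080):.2f}x the pixels")
# Output: 6,220,800 / 11,059,200 / 8,294,400 pixels; ratio ~1.78x
```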
  7. Also, Park Control is a nice complementary app to Process Lasso (it's from the same developer, Bitsum), to unpark the cores. https://bitsum.com/parkcontrol/
     After it is installed, run it. Make sure it's set to start at Windows login (settings are in the app's icon, at the bottom right of the desktop). Some will change the "Active Power Profile" manually to "Bitsum Highest Performance" before running the game. More than just a high power profile, this also unparks any and all cores. But if you also have Process Lasso installed, then I personally find it's better to do it this way, to automate things:
     - Open Process Lasso.
     - Go to "Options / Power / Performance Mode" and include the game's executable (browse to the game's directory and add the game's .EXE to the list). This automatically enables a higher power plan (Bitsum Highest Performance), with noticeable performance benefits, whenever you run a program (game, etc.) added to that list, reverting to whatever it was when you close it. And now, with Park Control also installed, it will also ensure that no cores are parked when that highest power plan is in use, for whatever apps/games are on that list.
     - Run the game... Alt+Tab... back to Process Lasso. In the game's executable (listed among all the others), right-click and change or tick these options:
       - CPU Affinity / "Always" / "Select CPU Affinity" / change according to preference (to "P-Cores" only, or to all cores, as you wish) and click OK.
       - Induce Performance Mode (checked)
       - Exclude from ProBalance (checked)
     *side note: there's the odd game that likes to run in 'High Priority'; for that, if necessary, also do "CPU Priority" / "Always" / change to "High".
     With both Park Control and Process Lasso installed, that should be it.
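As a rough illustration of what the "P-Cores only" affinity rule does (this is not what Process Lasso itself runs, just a sketch using Python's psutil, and it assumes that logical CPUs 0-15 are the P-core threads and that the game process is named DCS.exe):

```python
# Sketch: pin a running game process to the P-cores only, using psutil.
# Assumptions: logical CPUs 0-15 are the P-core threads on this hypothetical
# hybrid CPU, and the game's executable is DCS.exe. Unlike Process Lasso's
# "Always" rule, this only affects the instance that is currently running.
import psutil

P_CORE_CPUS = list(range(16))  # assumed P-core logical CPU indices

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "DCS.exe":
        proc.cpu_affinity(P_CORE_CPUS)
        print(f"Pinned PID {proc.pid} to logical CPUs {P_CORE_CPUS}")
```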
  8. Not sure Process Lasso will help in your case, considering that DCS is always changing and evolving. But assigning only the P-cores to DCS, and all background apps (the non-Windows-OS ones) to the E-cores, may help. It works for some and doesn't for others; some prefer to have all cores (P-cores and E-cores) assigned to DCS, so YMMV. Nothing to lose, other than your time, that is...
  9. The GPU vs CPU usage discussed in the early 2010s is not relatable at all to present DCS versions. AFAIK, dual-GPU tech like SLI and Crossfire stopped working with the release of DCS 2.7 and later versions (to this day). DCS version 2.5.6 (the version in my signature, from early 2021) seems to be the last version with which SLI and Crossfire still worked. Also, note that in the early 2010s we were on the old legacy engine (forward rendering, non-deferred shading) and single-threaded. DCS in later years (since version 2.5.0) makes much, much bigger use of the GPU, and now also of the CPU with multithreading (since version 2.8, with MT).
  10. Yes, it's a PITA: you'll have to pay for shipping the board to them, trust the hardware gods that your motherboard will be handled correctly the whole time (it can take a while), and meanwhile you'll be deprived of the system (or will need to put your previous system components back together if you need a gaming PC to work). But in the end it may be better. The worst-case scenario usually is that they tell you the motherboard has no problem whatsoever, that the problem lies elsewhere (RAM, CPU, PSU, etc.), and they send it back. Best-case scenario (and the intended result, I guess), a defect is found and you're either given a replacement (same product or a direct equivalent) or a refund. https://softhandtech.com/how-do-you-rma-a-motherboard/ (and before anything, contact the shop/seller where you got that motherboard and ask for their direct assistance in the process) The thing I'm still not understanding, when you mention that disabling the XMP profile presents no issue, is... 1) ...is this with the basic (safest but slower) auto speed/timings? 2) ...or, if you manually insert "XMP-like" values for speed, primary timings, the correct gear mode, and the right voltages (etc.), does it then work fine with no bootup issue?
  11. As someone who decided to go on the same adventure two years back, I can tell you about an important thing that you should not ignore... VR headsets render the scene twice (once for each eye), for the stereoscopic view that each screen needs to provide (each eye's POV is slightly different). This translates into much, much higher hardware requirements than even a 4K screen. Moreover, the sensitivity to frametimes and framerate is amplified tenfold. Meaning, you'll need a considerably beefy system to make it run smooth and pleasant all the time (not easy, even with reprojection), with every DCS module and map. Also, be prepared to fiddle with settings on and on, read and learn stuff, etc., until you reach a balance of performance vs quality, for which some settings end up sacrificed. You need to be patient and willing to do that; if you aren't, then I don't think it's for you. I do agree that DCS in VR is a whole new experience of immersion (amazing!) but it is a different animal, and the current iteration of DCS is (still) way too demanding IMHO. That's why I ended up using a much older version to far better effect (the one in my signature).
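To put a rough number on "much higher requirements than a 4K screen", here is a small illustrative calculation; the per-eye resolution and refresh rates below are only assumptions for the sake of the example, not figures from any specific headset:

```python
# Rough, illustrative comparison of VR vs 4K rendering load (assumed numbers).
vr_per_eye = 2160 * 2160          # assumed per-eye render resolution
vr_pixels  = 2 * vr_per_eye       # scene rendered twice, once per eye
vr_hz      = 90                   # assumed VR refresh target

mon_pixels = 3840 * 2160          # 4K monitor
mon_hz     = 60

print(f"VR: {vr_pixels:,} px/frame, {vr_pixels * vr_hz:,} px/s, {1000 / vr_hz:.1f} ms frame budget")
print(f"4K: {mon_pixels:,} px/frame, {mon_pixels * mon_hz:,} px/s, {1000 / mon_hz:.1f} ms frame budget")
print(f"Throughput ratio: ~{vr_pixels * vr_hz / (mon_pixels * mon_hz):.1f}x")
# More pixels per second AND a tighter per-frame deadline (11.1 ms vs 16.7 ms).
```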
  12. @scommander2 You mentioned you would like to use AA batteries instead. Grassmonkey Simulations (U.S. based) has wireless IR controllers for head-tracking that work with any 1.5v AA battery, including rechargeables. They seem far more robust than the TrackIR Clip Pro as well. They're not cheap ($49.95) but maybe worth considering. They exist in two formats, Puck and Odyssey, and are TrackIR compatible: https://grassmonkeysimulations.com/product/the-puck-ir-w-o-camera-trackir-version/ https://grassmonkeysimulations.com/product/odyssey-ir-w-o-camera-trackir-compatible-version/
  13. IIRC, Gigabyte motherboards usually have a DRAM VOLTAGE CONTROL section with that setting (DRAM Voltage); it's located under the TWEAKER tab. I haven't looked yet, but it helps to check the BIOS manual for your motherboard: https://download.gigabyte.com/FileList/Manual/mb_manual_intel800-bios_e.pdf?v=3a1f079a1e6e76af6f34412100dfb61a The example below is from a 13th/14th gen Z790 Aorus DDR4 motherboard:
  14. You're crazy, that CPU is going to get fried soon with SA Voltage at 1.40v! CPU System Agent Voltage is the power for the memory controller and shouldn't be messed with except in specific OC'ing cases. Yes, it usually helps with general OC'ing and also with RAM, but you need to know first what the safe voltage limit is for it on your specific CPU model. With the Intel 12th, 13th and 14th gen CPUs you'd risk degradation with it set over 1.30v, and I don't think Arrow Lake accepts using that much (stock for it is what, 1.10v?). The setting you need to look at in your BIOS is DRAM Voltage (which every manufacturer usually has in the BIOS); this is where you adjust the voltage for your RAM sticks.
  15. Some months ago I made a sort of tutorial on this same subject, I hope it helps somehow:
  16. I presume from the previous replies that you already checked that the socket is OK (no bent pins or dirt) and reseated the CPU, so at this point... yeah, it could be any of those things. The problem with this kind of issue is not having other components to test with (another compatible CPU, another memory kit); there'll always be a hint of doubt (is it? could it be?). But, considering such limitations are normal for any regular user, and since you have a valid warranty, you're entitled to RMA that board if you're unhappy with how it's working.
  17. It's unfortunate (and most common) that you don't have another CPU for that socket, so as to test whether the CPU and/or socket pins are to blame. But yeah, you might be getting somewhere, because A2 and B2 should each be able to run single-channel mode (one stick only) just fine, and they're also the two slots to use for two sticks of RAM (dual-channel mode). If A1 and B1 do not present the problem (no boot issue and no issues in normal use, other than the wrong-slot message from the DRAM LED), then you may have found a culprit. Or a partial (temporary) solution, if using that pair of slots results in less of an annoyance and you don't want to RMA the board yet. From your motherboard's manual:
  18. You can try repeating the same process again with only one of the two sticks: load BIOS optimized defaults and save/restart. Then enter the BIOS and either load XMP or make your own manual RAM setting adjustments (from the optimized defaults, without XMP loaded), as you've done already. Try repeating the process with one stick in different RAM slots of your motherboard (of the four available) to isolate possible issues happening in a specific slot; then it'd be the motherboard's fault. Supposing that only one of the two memory sticks has that bootup issue, in whatever RAM slot of the motherboard, then you have found a possibly faulty memory stick. It could be that it's an untested memory kit that isn't listed in the QVL (see here whether it's listed) and/or is possibly unstable on that board. Also remember: XMP isn't always guaranteed to work with full stability on every motherboard. Sometimes it doesn't. Check that it's in the correct gear mode for the memory speed in place. If it already is, then you may have to relax the primary timings (loosen them by one step) and/or slightly increase the DRAM voltage (by, say, 0.05v). But even that isn't guaranteed to work. Again, I have no experience with Arrow Lake, it works completely differently to previous Intel CPUs, so I can only speculate. Maybe look into SkatterBencher's Arrow Lake MemSS OC guide (here), it could lead to some helpful pointers.
  19. Well, then it seems the problem is not in the motherboard or its default (optimized) settings. That calls for troubleshooting, if you have the patience. It seems the fault is within some OC setting. Whether it's CPU related or RAM related is your investigation mission now, again if you have the patience. XMP is a different thing (it could be a memory fault then); you need to test one without the other and, once you've found which one it is, see which related setting is causing it. All that fun stuff!
  20. Whether or not you want to RMA the motherboard is up to you. Personally, I too don't think it's a big problem you've got there. If all is good apart from that little detail, maybe see how it goes and re-evaluate the need closer to the end of the warranty. And, yes, I'd strongly suggest always loading optimized defaults (on any motherboard), then rebooting, and making your personal preference adjustments in the BIOS afterwards (OC or not), saving a BIOS profile of your new settings, and you're off to the races. It seems you already did that and, if it no longer presents such a problem, then I'd just do some stress testing to monitor temperatures, voltages and wattage, to be sure nothing is "odd" (if it is, adjust accordingly). Some manufacturers pump "funny" higher voltages with the optimized-defaults BIOS settings for better benchmark scores, so this is just to be sure all is good. If all is good, consider it solved for now.
  21. The last time I saw that "C5" code on a Gigabyte motherboard was with a Z170X UD5. But then, that was many generations ago (an i7 6700K at the time) and a lot has changed since. I remember updating the BIOS, then resetting it afterwards, loading optimized defaults, rebooting, and only then starting to change and test things as necessary. In that case, the C5 code, which is usually listed in the manual as "reserved" (thanks Gigabyte!!), was memory related. Though that code may also pop up with a badly seated CPU (or dirt) in the motherboard socket. It might be a memory-training related thing, which could explain why it only happens on cold boot, not on a Windows restart(?). I have no experience with the current "Arrow Lake" generation of Intel CPUs, or their motherboards, so I'll refrain from further speculation. But, admittedly, I personally avoid Gigabyte motherboards like the plague (a beefy mid-range Asus or MSI is still my go-to, every time) because every single one I've messed with has been finicky and very sensitive to different BIOS versions, with (too much) varying behaviour when it comes to memory OC. In the meantime, I'd suggest having a look at similar threads on Reddit and the official Gigabyte forums, as brainstorming occurs there and might lead to something: https://www.reddit.com/search/?q=gigabyte+z890+c5+error&cId=50b4f566-4b3d-405b-9c07-64dbf85fde16&iId=e306d4d6-b4db-431e-8c59-205ed1992d64 https://forum.giga-byte.co.uk/index.php
  22. Agreed, I think we may be witnessing a slow and gradual shift in the gaming GPU arena, perhaps for years to come. Similar to what happened with CPUs just before 2020, when AMD Ryzen became a true alternative to Intel "K" processors at very competitive prices. The globally inflated prices and low availability, plus constant driver development, are factors that can favor AMD GPUs right now, even if prices are bad at the moment.
  23. It depends on the module + map combination, and whether you're using it in SP or MP. And, of course, the higher the quality settings and the resolution (4K or VR, for example), the higher the VRAM guzzling. Try the F-14A/B, F-4E Phantom II, CH-47F Chinook, AH-64D Apache, or KA-50 Black Shark modules on Afghanistan (any variant), Iraq, Kola, Sinai, South Atlantic, or Marianas (maybe some other new-ish maps and modules too). Then, and more so in MP, you'll see much bigger VRAM usage. This is problematic for perception because, for example, if it's a newcomer just using the default Caucasus map with the Su-25T or FC3 (now FC4) aircraft in SP, then VRAM usage isn't too much of a problem (though it certainly could be better). The VRAM concerns will probably not even be realized until later, when acquiring "heavier" modules and maps. PS: for VRAM monitoring, I'd suggest either LibreHardwareMonitor or HWInfo, as these separate the VRAM measurements (what's actually in use vs what's allocated).
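If you just want a quick logged reading on an Nvidia card without installing anything extra, nvidia-smi can also be polled; a small Python sketch is below. Note that nvidia-smi reports the driver-level "used" figure (closer to allocated VRAM), whereas the tools mentioned above are the ones that separate "in use" from "allocated":

```python
# Sketch: poll VRAM usage once per second via nvidia-smi (Nvidia GPUs only).
# nvidia-smi's "memory.used" is the driver-level allocated figure, not the
# "actually in use" number that HWInfo/LibreHardwareMonitor can show separately.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

for _ in range(10):  # take 10 samples, one per second
    first_gpu = subprocess.check_output(QUERY, text=True).splitlines()[0]
    used, total = (int(v) for v in first_gpu.split(","))
    print(f"VRAM: {used} MiB / {total} MiB")
    time.sleep(1)
```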