LucShep

Members
  • Posts

    1693
  • Joined

  • Last visited

  • Days Won

    2

Everything posted by LucShep

  1. Posted a year ago on these forums... And no, since then it doesn't seem to have made a difference for DCS, or at least not in a meaningful way like it does for other games.
  2. @PeeJott17 and @Stratos - this doesn't seem related to CGTC. If the subject is Marianas modding, perhaps it's better to open a separate thread for it?
  3. Oh sorry, I didn't notice you already have a 32'' 1440P gaming monitor. Disregard my previous suggestions then. Personally, I don't think the MSI MAG 321CUP is worth the expense as an upgrade from yours (also for reasons previously stated), but you might see that differently... Honestly, I'd only upgrade what you already have to a bigger monitor. Which then, yes, would have to be 4K. The problem there is budget... the bigger good stuff is also much more expensive. If it's for desk use, one that immediately pops into my mind is the 38'' ASUS ROG Swift PG38UQ (38'' 4K 144Hz IPS): https://rog.asus.com/monitors/above-34-inches/rog-swift-pg38uq/ This is a different and interesting option from ASUS, because it's a nice big monitor but not "too big" for desktop use, if that's the main intended use. And then there are the various 42'' OLED monitors on the market, including LG OLED TVs (which also work great as monitors). They'll probably feel a bit too big on a desk (if you're coming from smaller monitors) but are perfectly suitable for that role.
- ASUS PG42UQ - review: https://www.rtings.com/monitor/reviews/asus/rog-swift-oled-pg42uq
- Philips 42M2N8900 - review: https://www.tomshardware.com/reviews/philips-evnia-42m2n89-review
- KTC G42P5 - review: https://www.guru3d.com/review/ktc-g42p5-oled-monitor-review/
- LG OLED Flex LX3 (TV, HDMI 2.1 only, no DisplayPort; screen is bendable - flat to 900R curve) - review: https://www.rtings.com/tv/reviews/lg/oled-flex
- LG OLED42C3 (TV, HDMI 2.1 only, no DisplayPort) - review: https://www.rtings.com/tv/reviews/lg/c3-oled
- LG OLED42C4 (TV, HDMI 2.1 only, no DisplayPort) - review: https://www.rtings.com/monitor/reviews/lg/42-c4-oled
- LG OLED42C5 (TV, HDMI 2.1 only, no DisplayPort) - review: https://www.rtings.com/tv/reviews/lg/c5-oled
  4. What monitor are you currently using? (size, resolution, refresh rate, panel type?) The 7800XT is certainly capable of 4K gaming, but not really meant for it. While DCS at 4K on a 7800XT is doable, you'll be forced to cut down some settings. I'd also suspect that, for the heavier maps, FSR may become necessary at 4K resolution - and here is where things get less positive... The 7800XT uses FSR3 and, as you may know, the image is nowhere near as good as Nvidia's equivalent DLSS (it gets a bit blurry and grainy, comparatively). I honestly think 4K makes more sense (to me anyway) on bigger panels, from 38'' upwards in size, where that sort of sacrifice/compromise is more justifiable. It's a matter of personal opinion but I think you'd be better served with a good 32'' 1440P VRR 144Hz+ monitor with an IPS panel. And there are some very good examples out there, very reasonably priced now (I'll leave this for last). My reasons in this case are: The 1440P resolution still provides a very good image at a 32'' screen size, and will be much better handled by your GPU. This way you could maintain the native resolution (2560x1440) with no upscaling, using TAA (maybe with some sharpening added) as the anti-aliasing method in the game options - which results in both good image quality and performance with your 7800XT. Regarding high refresh rates, you'll notice 144Hz, 165Hz, 180Hz and 240Hz are the most common these days for gaming monitors. That said, you won't see much benefit from maxing out such high refresh rates in flight sims - it's unnecessary for the genre and becomes increasingly harder to run. The point here - and in DCS for that matter - is that a high-refresh gaming monitor handles motion (i.e., movement) a lot better regardless of frame rate (so long as it's above 60FPS, that is), resulting in a more "natural picture" when things are moving, with far less blur and ghosting, which is very much appreciated.
And here, with any gaming monitor these days, what you should do is enable VRR (FreeSync in your case, with the 7800XT) and lock the max frame rate in game to something like 120FPS (or 100FPS, or 90FPS, or 80FPS, etc., whatever you prefer), then let the monitor panel do the rest. The last thing is the panel type. The different types are TN, VA, IPS and OLED (the latter subdivided into WOLED and QD-OLED):
- TN is focused on very high refresh rates, fast response times and motion handling, not so much image quality, so it's not a good choice for flight sims. (not recommended)
- VA is focused on high contrast and vibrant colors, but at the cost of motion handling (blur, ghosting) and poor viewing angles, which you may notice and dislike in flight sims. (not recommended)
- IPS is focused on color accuracy, motion handling, response times and better viewing angles, at the cost of lower contrast, but generally a great choice for flight sims. (recommended)
- OLED is the best for image quality (true blacks, no backlight bleeding, no glowing or blooming) and instantaneous response times, but very expensive and with a risk of burn-in. (recommended, with that caveat)
You mention budget is a concern. There are plenty of 32'' 1440P IPS gaming monitors that are very good and not too expensive (from $200 to $400 USD).
There are so many monitors coming out all the time that it's hard to keep track of what's best but, having tried these myself, I'd definitely recommend them:
- Asus TUF Gaming VG32AQL1A --- 32'' 1440P (2560x1440) 165Hz / OC 170Hz - IPS https://www.asus.com/displays-desktops/monitors/tuf-gaming/tuf-gaming-vg32aql1a/
- Asus ROG Swift PG329Q --- 32'' 1440P (2560x1440) 165Hz / OC 175Hz - IPS https://rog.asus.com/monitors/32-to-34-inches/rog-swift-pg329q-model/
- LG 32GS75Q-B --- 32'' 1440P (2560x1440) 180Hz - IPS https://www.lg.com/us/monitors/lg-32gs75q-b-gaming-monitor
- LG 32GP850-B --- 32'' 1440P (2560x1440) 165Hz / OC 180Hz - IPS https://www.lg.com/us/monitors/lg-32gp850-b-gaming-monitor
- LG 32GP750-B --- 32'' 1440P (2560x1440) 165Hz - IPS https://www.lg.com/us/monitors/lg-32gp750-b-gaming-monitor
- Gigabyte GS32Q --- 32'' 1440P (2560x1440) 165Hz / OC 170Hz - IPS https://www.gigabyte.com/Monitor/GS32Q#kf
- Gigabyte M32Q --- 32'' 1440P (2560x1440) 165Hz / OC 170Hz - IPS https://www.gigabyte.com/Monitor/M32Q#kf
I'd say you can't really go wrong with any of these (all are good) and they're not too expensive.
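To put rough numbers on the frame-cap advice above, here's a tiny Python sketch of the frame-time budget each cap implies (pure arithmetic for illustration; the cap values are just the ones mentioned in the post):

```python
# Frame-time budget for a few common FPS caps used with VRR.
# Pure arithmetic illustration: frame time (ms) = 1000 / FPS.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

caps = {fps: round(frame_time_ms(fps), 2) for fps in (80, 90, 100, 120)}
# A steady 80-120 FPS cap keeps frame times in the ~8-12.5 ms range,
# comfortably inside a typical 144Hz+ monitor's VRR window.
```

The takeaway: any of those caps keeps the panel refreshing in sync with the GPU, which is what makes motion look smooth even well below the monitor's maximum refresh rate.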
  5. Hi there. I don't run the Marianas map (pretty bad performance and of very little interest to me, ended up deleting it), so unfortunately I can't help you there. But my guess is that anyone with basic modding and 2D texturing skills will be able to easily do it. It's a matter of finding the respective textures inside the texture packages, then adapt and repackage the new ones.
  6. Agreed, 96GB seems like the sweet spot for RAM capacity now, for those building a new system for DCS.
  7. Well, regardless, that's good news. If you cleaned the previous Nvidia drivers with DDU and installed the Nvidia 537.58 drivers, it could be that - that alone could be enough to immediately improve things. However, undervolting is a very strong recommendation for any RTX 3090 (otherwise it's always a struggling furnace and a wattage guzzler!). So make sure MSI Afterburner is always working (i.e., starting with Windows) with the undervolting settings applied, and that GPU of yours will last a long, long time trouble-free. I just noticed a prior post of yours here, asking about a possible swap to the AMD RX 9070 XT. As good as that is, I don't think it's that much of an improvement over the RTX 3090, especially considering current prices. Keep yours, as it's still a really good GPU in 2025 (plus, it's an EVGA FTW - among the very best ones). Personally, I would only upgrade an RTX 3090 to an RTX 5080 or RTX 4090.
  8. What motherboard is this? I suppose it's an AM5 B850 or X870 model, and on those I believe only two NVMe slots can be used without impacting the GPU (which ones depends on the motherboard). The ideal setup for NVMe drives in your case would be two, to maintain PCIe 5.0 x16 for the GPU. As you describe: one main NVMe drive for the OS etc. (in M2A), the other for games etc. (you mention it's in M2D?). As you say, if you use the other two slots it will slow your GPU slot down to PCIe 5.0 x8. However, testing with the "almighty" RTX 5090 has shown that it only loses 1%~3% performance when losing half of its bandwidth - basically unnoticeable (see it HERE and HERE). That said, even if the difference is insignificant, since you're building this system from the ground up, for the long term and wanting the large storage, I'd certainly consider going for two good 4TB NVMe Gen4 drives.
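As a rough back-of-the-envelope on what "half the bandwidth" means in absolute numbers, here's a small Python sketch using PCIe 5.0's nominal figures (32 GT/s per lane, 128b/130b encoding); real-world throughput is somewhat lower than these theoretical values:

```python
# Nominal one-direction PCIe 5.0 bandwidth: 32 GT/s per lane,
# 128b/130b encoding overhead, 8 bits per byte.
def pcie5_bandwidth_gbs(lanes: int) -> float:
    per_lane_gbs = 32 * (128 / 130) / 8  # ~3.94 GB/s per lane
    return lanes * per_lane_gbs

x16 = pcie5_bandwidth_gbs(16)  # ~63 GB/s
x8  = pcie5_bandwidth_gbs(8)   # ~31.5 GB/s - still enormous headroom
```

Even at x8, ~31 GB/s is far more than a GPU typically streams over the bus during gameplay, which is why the measured performance loss is only 1%~3%.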
  9. I have no idea TBH - I don't have an AMD 9000 series GPU to test this on. I'm just relaying what comments on forums suggest. ...I guess trying it on Win10 isn't a bad idea?
  10. That file contains over 2GB of display driver files in a 500MB compressed .7z package... that's why it takes so long to uncompress. Yes, MSI Afterburner needs to always start with Windows, and be running in the background at all times, for the applied undervolt settings to stay in effect. You can change this in the general settings (click the "sprocket"/gear symbol).
  11. To me, that starts to look like power-delivery issues on your RTX 3090, either on the GPU itself (hardware or software related) or on the PSU side. The RTX 3090 comes severely overvolted from the factory and, also because it has double the memory modules compared to the RTX 3080/Ti, it runs really hot and power hungry. Also, the RTX 3000 series is far more sensitive to Nvidia driver versions than the later RTX 4000 series, which can aggravate how the GPU behaves (stuttering, instability, even erratic power consumption). All this can be somewhat alleviated, though. Having an RTX 3090 (EVGA FTW Ultra) myself, there are two things that I found to be "must do" for it, and I would urgently recommend them to you:
First, install the Nvidia 537.58 drivers (the best drivers for the RTX 3000 series - I think I've tried them all), even better if it's a clean driver version (no bloatware from Nvidia). You can get these from here: https://mega.nz/folder/dQRX3AQI#9RmtXT0cTw45RsWxTQgYiw/file/JJ5kQAAC Uncompress (unzip) the package, then go to the folder with the extracted files and run setup.exe to start the driver installation. Before that, I'd strongly suggest first uninstalling your current Nvidia drivers with DDU: https://www.wagnardsoft.com/forums/viewtopic.php?t=5324 This ensures your system is clean of any display-driver leftovers before (re)installing Nvidia drivers. (SIDE NOTE: I prefer the portable version of DDU - it extracts files to a directory of your choice, where you then run DDU by double-clicking the executable.) If you're not accustomed to this procedure with DDU, there's a tutorial here: https://www.tomshardware.com/how-to/uninstall-nvidia-amd-intel-gpu-drivers
Second, undervolt your RTX 3090 - I'd suggest about 1830MHz core frequency @ 850mV. This is absolutely crucial and makes a world of difference (how it should always have been). For this you'll need MSI Afterburner, regardless of the brand/model of RTX 3090 you may have. You can download it (version 4.6.5 Final) from here: https://www.msi.com/Landing/afterburner/graphics-cards If you use EVGA Precision-X or Asus GPU Tweak, those must be set aside (i.e., no longer used) in favor of MSI Afterburner. MSI Afterburner will then need to always start with Windows for the applied undervolt settings to stay in effect. I left some pointers about it in a previous post HERE. There's also a quick video tutorial on how to undervolt the RTX 3090 HERE. I hope it helps. Sorry if it sounds like "too much info" or complicated - after going through the process you'll see it isn't.
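As a back-of-the-envelope illustration of why the undervolt matters so much: a GPU's dynamic power scales roughly with voltage squared times frequency. The "stock" figures below (~1.05V at ~1900MHz) are illustrative assumptions for the sketch, not measured values for any particular RTX 3090:

```python
# Dynamic power scales roughly as P ~ V^2 * f (simplified CMOS model).
def relative_dynamic_power(v: float, f: float, v0: float, f0: float) -> float:
    return (v / v0) ** 2 * (f / f0)

# 1830 MHz @ 0.850 V versus an assumed stock ~1900 MHz @ ~1.050 V:
ratio = relative_dynamic_power(0.850, 1830, 1.050, 1900)
# ratio is ~0.63, i.e. roughly a third less dynamic power
# in exchange for only ~4% less clock speed.
```

That large power (and heat) reduction for a tiny clock sacrifice is exactly why the undervolted card runs cooler, quieter and more stably.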
  12. Some comments suggest that Windows 11 and an AMD RX 9000 series GPU are requirements for FSR4 to work with OptiScaler. That could explain why some have problems (Windows 10 and/or older AMD GPUs, or Nvidia GPU users) and others don't. There are various FSR4/OptiScaler tutorials on YouTube, albeit for other games (see videos below). The method should be similar for DCS.
  13. No one recommends DDR5 6400 memory for AM5 CPUs like the 9800X3D, mostly because the performance difference is insignificant (to the point of "absolute zero perceived gains") at the cost of instability, higher SOC voltage, and higher acquisition cost. Not all AM5 CPUs will be able to do 6400 (3200 UCLK), especially in dual-rank configuration. Some can, but it's never guaranteed - it's a complete lottery. Even if it does work, it'll almost surely require 1.3V(+) on the SOC to avoid crashes... which then risks the CPU's mid/long-term reliability. That's why DDR5 6000 CL30 AMD EXPO kits (with two sticks, like 2x32GB for 64GB, or 2x48GB for 96GB) are still the most recommended for AM5 CPUs, including the 9800X3D. These kits are AM5 "plug & play", no fuss or issues - all you need is to load EXPO1 in the BIOS (just like you do XMP for Intel CPUs) and that's it. And if you really want more, it's very possible that a DDR5 6000 CL30 kit will do 6200 with little effort (SOC at 1.2V~1.25V). But that, again, may require some tuning.
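A quick bit of arithmetic on why the faster kit doesn't buy you latency either: first-word latency in nanoseconds is the CAS count divided by the real memory clock (half the MT/s data rate). A minimal Python sketch, with CL32 as an assumed typical timing for a 6400 kit:

```python
# First-word latency (ns) = CAS cycles / clock (MHz) * 1000,
# where clock = data rate (MT/s) / 2 for DDR memory.
def first_word_latency_ns(mt_s: int, cas: int) -> float:
    return cas / (mt_s / 2) * 1000

lat_6000_cl30 = first_word_latency_ns(6000, 30)  # 10.0 ns
lat_6400_cl32 = first_word_latency_ns(6400, 32)  # 10.0 ns - no latency gain
```

So a typical 6400 kit lands at the same ~10 ns first-word latency as a 6000 CL30 kit, while adding the SOC-voltage and stability headaches described above.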
  14. Yep. The distant clouds jittering is always present. It happens regardless of upscaling techniques being used (or not, it doesn't matter) or AA techniques (no matter which). The pixelation also happens regardless of clouds quality setting. It becomes painfully glaring (very blurry and jaggy clouds) once you decrease the clouds quality setting to anything below "High", which of course takes its toll on any system that isn't considerably fast for DCS. At this point, and after four years with this cloud system, you have two options: you swallow the pill, learn to live with it. Or use older DCS versions prior to 2.7 if it annoys you (and boy, it does annoy me!). But that, of course, also comes with a plethora of other problems/limitations...
  15. Note that all those B850 motherboards I've listed are PCIe 5.0 on both the graphics slot and the main NVMe slot. I've noticed you've selected (good) NVMe Gen4 drives, both for primary and secondary - so that makes no difference whatsoever for your components. Also, those B850 motherboards I've listed are among the most robust, able to power even a 9950X or 9950X3D "24/7" with no issues whatsoever. So, in that respect, there's no need for a more expensive X870 motherboard - the power delivery is more than covered for the 9800X3D (PBO included). These are not "locked" motherboards like Intel does with "Z" versus "B" boards. What the X870 motherboards (the good ones, at least) notoriously have over the B850 ones is PCIe 5.0 also on the remaining PCIe and NVMe slots, with unshared lanes. Something that content creators (to whom time is money) really want, so they can fully load up on the fastest (and very hot) expensive NVMe Gen5 drives. That's not meant for gaming - good NVMe Gen4 drives (like the SN850X and 990 Pro) vastly surpass any performance needs and also run cooler (and will do so for years). And that's why it's widely considered wasted features and money even on enthusiast-level gaming PCs. Some may argue that the good X870 motherboards have even more USB ports on the back, but it's hard to justify the price difference with that, when a good B850 motherboard already has plenty (even more so when any decent USB hub makes that redundant for gaming/simming). Regarding the cooler, I think the highest power consumption you'll see from a (stock) 9800X3D is 160W(?), with gaming mostly at 45W~95W, similar to the 7800X3D. Even if you go "OC mode" and load it with PBO settings, it won't really turn into a furnace (like Intel 13th/14th gen i9s and i7s do). A good dual-tower air cooler (like the mentioned PS120) will more than suffice. But some people prefer a liquid cooler regardless of temps, and that's absolutely fine.
  16. The way I see it, that system could be a little more optimized for value (some things there are excessive) and should have at least one aspect corrected (DRAM). Going through the parts I'd likely change:
CPU - the 9800X3D is a perfect choice here. However, if budget or availability becomes a concern - and only if so - the 7800X3D is still an alternative for less money (the performance difference is there but it's small - for example see 7800X3D vs 9800X3D For VR DCS, FS2024 & CP2077).
CPU Cooler - for the 9800X3D you don't need liquid cooling. I'd go for the Thermalright Phantom Spirit 120 and keep the money difference or spend it elsewhere. Note that any version of the Phantom Spirit 120 is good, among the very best air coolers out there, and it's just $45 (it even matches 240 AIOs at over double its price!).
Motherboard - X870 motherboards are great, but a tad expensive (most at well over $300) and you really don't need one for a gaming-oriented system. You've chosen the "Eagle" from Gigabyte at a far lower price, but I'm not so sure about the low-budget orientation of that board (cheap components)... A good B850 motherboard (usually around $200, considered "mid-range") is all you'll need, but you have to be careful when picking one (not all are good). I'm very partial to the MSI Tomahawk in this segment (what I'd go for) but any of these will be good:
- Asrock B850 Steel Legend WiFi
- Asrock B850 Pro RS WiFi
- MSI MAG B850 Tomahawk Max WiFi
- MSI PRO B850-P WiFi
- Gigabyte B850 Aorus Elite WIFI
- Gigabyte B850 Gaming X WiFi
Regardless of the AMD X870 or B850 board you choose, it'll require the latest BIOS right away (for voltages and performance to be safe and sound) - do not skimp on this!
MEMORY (RAM) - as mentioned already, avoid DDR5 6400 memory for AMD Ryzen 9000 and 7000 series, unless you're into lengthy/fiddly tuning and adjustment of memory settings. You don't need higher speed than 6000MT/s or lower latency than CL30 (30-36-36-96), because the 3D V-Cache on the X3D chips makes it redundant. Go for a DDR5 6000 CL30 AMD EXPO kit of 96GB (2x48GB). Basically "plug & play" - just make sure EXPO1 is loaded in the BIOS (same as XMP on Intel). For example, I'd look at one of these kits from G.SKILL:
- F5-6000J3036F48GX2-FX5
- F5-6000J3036F48GX2-TZ5NR
- F5-6000J3036F48GX2-RM5NRK
The rest looks OK to me - maybe see what the next replies indicate.
  17. Yes, that's true. It's VA, but that's a really good premium panel, and Samsung handles VA better than most, especially at these very large sizes. Having had the older 55'' NU8500 with less pronounced curvature, which was also VA (and had an awesome panel for the time), I'd say that 2nd gen Odyssey Ark 55 is certainly worth a look. I've often entertained the idea of a 48'' or 55'' OLED in front of my own desk. But, as good as OLED looks (and I've tried plenty), I've decided to refrain. Simply because, after using a really big 16:9 curved monitor, a big flat monitor always makes you feel that something is missing (I'm using a Philips 50'' 4K TV, so that's first-hand experience). The 55'' panel was a bit too big for my own personal (desk) use and that's the single reason I got rid of it - years later I still often regret it. But if it were for a sim rig with no budget limit, I'd definitely pick that Odyssey Ark 55. There's nothing else like it on the market. I've only seen one working once, at an expo (and it was the older 1st gen), and I think it's fantastic for this type of hobby, IMHO. Whatever you end up deciding to get, and if you're not restricted by budget, don't miss the opportunity to check it out first.
  18. hmmm... tough one. For me that would be a huge screen with a nice pronounced curvature (much nicer immersion for simming - if you already tried it, you know it's better). I'd say, it would have to be 16:9 screen format, 4K resolution and a good quality panel at that, with gaming focused features (high refresh, VRR, good motion handling, etc). And, actually, there's one model like that already... Honestly, I'd take a good look at the 2nd gen of the Samsung Odyssey Ark 55'': https://www.samsung.com/us/computing/monitors/gaming/55-odyssey-ark-2nd-gen-4k-uhd-165hz-1ms-quantum-mini-led-curved-gaming-screen-ls55cg970nnxgo/
  19. If you're using the most recent (beta) release of OpenTrack then that could be it - people have reported it as buggy (even the author warns about it): https://github.com/opentrack/opentrack/releases I'd suggest uninstalling it, then installing version 2023.3.0, or 2023.2.0, or 2023.1.0, or 2022.3.2, or even 2022.3.1 (further down in that link), and seeing if that helps.
  20. I concur with @speed-of-heat. That said, speculating on specs alone (TPU's GPU database), performance should be slightly better than the current RX 7700XT 12GB and the new RTX 5060Ti 16GB. Which means it will not be a recommendable GPU for DCS VR - something more powerful (and preferably Nvidia) is needed to make that satisfying. But, on a monitor, it should be a tremendous 1080P and decent 1440P performer (also thanks to FSR4) at a good price, if the base $349 MSRP holds.
  21. If Noctua has really managed to achieve AIO cooling performance without a pump, then it's a more reliable cooling solution, which is important for those of us still viewing watercooling pumps with suspicion (I still prefer dual-tower air coolers because they're utterly reliable - they never fail). The problem for me is the unknown pricing. Noctua is not known for making low-priced products (compared to its competitors) and I'd imagine the R&D costs on this are significant. If it ends up working well and priced competitively, then Noctua might have a winner here. I'd surely get one then.
  22. I think you misunderstood that. That quote you got there was a reply to SharpeXB (post below your previous one), not to you. If you noticed, my reply to yours had no sarcasm. And had no other motivation than to make a case to say that Linux OS (through distros), as little or far-fetched as it is right now for gaming, isn't so insignificant anymore.