
LucShep

Members · Posts: 1687 · Days Won: 2

Everything posted by LucShep

  1. AMD introduces RDNA4 - Radeon RX 9070 XT 16GB at $599 and RX 9070 16GB at $549 (MSRP)
     https://videocardz.com/newz/amd-introduces-599-radeon-rx-9070-xt-and-549-rx-9070-rdna4-gpus
     AMD and some AIBs already list models that will be available soon (March 6th). There are some models using the 12V-2x6 power connector:
     - AMD RX 9070 XT and RX 9070
     - ASUS RX 9070 SERIES (TUF and Prime models)
     - SAPPHIRE RX 9070 SERIES (Nitro, Pure and Pulse models)
     - POWERCOLOR RX 9070 SERIES (Red Devil, Hell Hound and Reaper models)
     - ASROCK RX 9070 XT and RX 9070 (Taichi, Steel Legend and Steel Legend Dark models)
     - XFX RX 9000 SERIES (Mercury, Quicksilver and Swift models)
     Acer, Gigabyte and Yeston should also announce and list their models very soon.
  2. This is really bad news, both intriguing and worrisome. It does seem like the latest Win10 update breaks WMR, which then makes VR headsets that depend on WMR stop functioning. Just brainstorming here: any of you afflicted by this, have you tried a system restore, back to a restore point dated prior to this update? (see video below) Supposing that it does fix the problem (no guarantees, but fingers crossed!), do NOT let Windows Update work in the background; immediately block it with either WUB or InControl.
  3. No problem whatsoever here with the Windows Update Blocker app. No interference with WMR in the two years I've run it, still today. All is good here (see image below). In any case, there's another 3rd-party app as an alternative, which disables Windows non-security updates. InControl: https://www.grc.com/incontrol.htm
  4. Constantly updating to the very latest NV drivers is not the best option; it will backfire at some point. They're also historically prone to errors when a new architecture launches, as has been the case this month. And it's even worse with "Beta" non-WHQL drivers (never, ever install these).
     No NV driver is perfect; they're in constant correction, and new bugs appear at times where previously there were none. The latest GPU hardware and popular game titles get priority for improvements, while other titles are deprecated and can get worse in the process (and eventually they do). Simply put, stick to the good old "if it ain't broken, don't fix it".
     If you have an RTX 40 series GPU, stick with driver 566.36 (link here), as that was the last "trouble free" driver for those models. There's a better-performing alternative in driver 566.03 (here), but it's not as refined bug-fix wise.
     If you have an RTX 30 series GPU (or older), stick with 537.58 (link here), because it's still the very best driver (great overall framepacing) for those models.
     I also strongly suggest debloating the NV drivers before installing them; they always carry a heap of nefarious bloatware taking precious HW resources for no good reason. You can do this with NVCleanstall (link here). If you've never done this, there's a tutorial here (there may be others around) showing how it works.
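The "pin a known-good driver" advice above boils down to simple version comparison. A minimal illustrative sketch (not an official NVIDIA tool; the helper names are made up, the pinned version numbers come from the post):

```python
# Compare an installed NVIDIA driver version string against a pinned,
# known-good version. NVIDIA driver versions look like "566.36".
# Pins below are the ones recommended in the post.

RECOMMENDED = {
    "RTX 40": "566.36",   # last "trouble free" driver for RTX 40 series (per the post)
    "RTX 30": "537.58",   # best overall framepacing for RTX 30 and older (per the post)
}

def parse_version(v: str) -> tuple[int, int]:
    """Turn a driver string like '566.36' into a comparable (566, 36) tuple."""
    major, minor = v.split(".")
    return int(major), int(minor)

def is_newer(installed: str, pinned: str) -> bool:
    """True if the installed driver is newer than the pinned one."""
    return parse_version(installed) > parse_version(pinned)

# A hypothetical 2025-era driver vs the RTX 40 pin:
print(is_newer("572.16", RECOMMENDED["RTX 40"]))  # True -> consider rolling back
print(is_newer("537.58", RECOMMENDED["RTX 30"]))  # False -> already on the pin
```

Tuple comparison handles the "570.x vs 566.x" case correctly, which naive string comparison would not (e.g. "537.58" > "1000.0" lexicographically).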
  5. I don't have any issues with mine (so far), but I remember reading about some mods people made when their cable went kaput (stuff I researched before committing to a used unit). Look into one of these three threads (there may be more around); perhaps one may lead to a solution?
  6. I have no idea atm but, seeing that it's a cumulative non-security update released just two days ago (25th February 2025), it could be interfering with WMR somehow, yes. https://www.tenforums.com/windows-10-news/218354-kb5052077-windows-10-cumulative-update-preview-build-19045-5555-22h2.html Personally, I'm not going to update Win10 until it's proven WMR is not affected. Others unwilling to give up on WMR (essential for VR HMDs like the HP Reverb G1 and G2) shouldn't either, as reverting a cumulative update is not guaranteed to fix the problem.
  7. You can do that with a 3rd-party app, with just one click of a button. Windows Update Blocker: https://www.sordum.org/9470/windows-update-blocker-v1-8/
  8. i7 9700K to i9 9900K is too small of a jump for gaming today, though the price on a used one is tempting. The AMD Ryzen 7700X is a good balanced CPU, but it's not really all that great for gaming. A better choice would probably be something like this popular combination:
     CPU: AMD Ryzen 7 7800X3D (yes, others will recommend the 9800X3D, but it's expensive and only a tiny bit better)
     CPU COOLER: Thermalright Phantom Spirit 120 (any version of the Phantom Spirit 120 is good)
     MOTHERBOARD: an inexpensive(ish) mid-range B650 motherboard with robust VRM. For example, any of these is great for the AMD 7800X3D:
     - ASRock B650E Steel Legend WiFi
     - ASRock B650E PG Riptide WiFi
     - ASRock B650 Steel Legend WiFi
     - ASRock B650 Pro RS
     - MSI MAG B650 Tomahawk WiFi
     - MSI PRO B650-P WIFI
     - Gigabyte B650 Gaming X AX v2
     MEMORY (RAM): any kit of DDR5 64GB (2x 32GB) 6000 CL30 "AMD EXPO". For example, G.Skill F5-6000J3040G32GX2-FX5 or G.Skill Trident Z5 Neo F5-6000J3040G32GX2-TZ5N.
     See if your budget allows for something like that; it'd be a kickass upgrade over your current system.
  9. hmmmmm.... Have you tried booting the PC with just one stick at a time, with the XMP profile loaded in BIOS? It could be that one of the sticks is faulty. I've heard numerous times that 4 sticks of such high-density RAM (4x 32GB) can be complicated, depending on the specific module and vendor (SK Hynix, Micron or Samsung) and motherboard / BIOS version. But if the memory kit is advertised at that speed/timings (3200 CL16-18-18-38), and seeing that your motherboard (MSI MPG Z690 EDGE WIFI DDR4 ?) is listed in its QVL by G.Skill themselves, then I think it's worth contacting them directly for assistance, via email.
     GSKILL support: https://www.gskill.com/techsupport
     Your RAM QVL on the model's page: https://www.gskill.com/qvl/165/166/1582537062/F4-3200C16Q-128GTZR-QVL
     Tell them what memory kit you have, what motherboard you have, what you've already done to try to make it work (still no luck), and ask for their help. They're not fast to answer (i.e., it can take time to get a reply), but they're usually very helpful, and I don't recall a single time G.Skill didn't stand behind their products once contacted.
     PS: not related but FWIW, I have 4x 16GB of Micron 3200 16-18-18-36 1.35V (two kits of Crucial BL2K16G32C16U4B, for 64GB total) and it's been working overclocked at 3700 17-19-19-38 1.40V (done over the XMP base profile) for nearly four years now with two different motherboards: first an MSI Z490-A Pro, then an ASUS Z690 TUF Gaming Plus DDR4. MemTest, Karhu, Y-cruncher, etc., it doesn't matter; it always passes with flying colors, rock solid reliable.
  10. I suppose you mean the 16-core/32-thread AMD Ryzen 9950X (there is no 9950XT model in their lineup). It's not worth going to that from an Intel i9 14900K, because they're direct equivalents, i.e., it'd be a side-grade at best, and an expensive one at that.
     There's the upcoming AMD 9950X3D 16c/32t later in March (next month), the same chip as the 9950X but with one of the two CCDs making use of 3D V-Cache. Maybe wait for in-depth reviews of that, and see if it's worth the big money and all the hassle. I have my doubts that it's worth leaving behind what you already have, but I'll reserve judgement until it's out and thoroughly tested.
     Notice also that you have a 128GB DDR4-based system. Meaning, you'd need to buy at least CPU + motherboard + RAM and, it being a completely different architecture, it's basically a whole new system built from the ground up, all over again.
     Personally, and looking at your signature, I would keep that system (still very, very good) as-is until 2026 or so, when the following new generations of Intel (16th gen) and AMD (Ryzen 10000 series?) chips should be announced, then later released. But then I don't suffer from FOMO (a very common affliction in these forums). Meaning... that's up to you. You may be interested in changing platforms right away if only for the novel experience, and there's nothing wrong with that if you can spend on it.
     I see that you have a Pimax Crystal for VR. So, all that said, it wouldn't be a bad idea to look for an RTX 4090 24GB (last units in stock, or used in mint condition) if at a good price. While your RTX 4080S is a great GPU for that task, you'd notice a bigger jump in performance with that swap alone than with the new CPU + mobo + DDR5 combined investment. And yes, you'll notice the new RTX 5090 32GB, but it's outrageously expensive for what it is (totally absurd, and a fire hazard waiting to happen); it cannot be recommended.
  11. We're getting way off topic, but anyway... I personally don't have experience with 128GB kits (4 sticks of 32GB), so I can only speculate. Looking at your sig, that's DDR4. And if it's G.Skill, it's most likely 3200 CL16-18-18-38 1.35V (is it?). My guess is that with such high memory density, you may have to adjust DRAM voltage (slight increase) and/or frequency and/or timings (relax them a bit). That's not uncommon with 4 sticks of RAM, even with Intel 13th/14th gen CPUs (which are a bit more memory-agnostic compared to AMD Ryzen).
     For example, and just supposing that memory is DDR4-3200 CL16-18-18-38 1.35V, one quick method could be attempted. Once in the BIOS, I'd try loading XMP first (or set it to Auto if all else fails, then repeat the following procedure), then manually adjust things as below, one section at a time:
     DRAM voltage ---- 1.35V (increase to 1.40V; it should be safe to go up to 1.45V, but no more)
     Get into the DRAM timing control settings (for manual timing adjustments) and relax the main ones a step or two, something like this:
     CL --------- 16 (relax it to 17; if that fails, try 18)
     tRCD ------- 18 (relax it to 19; if that fails, try 20)
     tRP -------- 18 (relax it to 19; if that fails, try 20)
     tRAS ------- 38 (relax it to 39; if that fails, try 40)
     CR --------- 2T (because 1T is pushing it with 4 sticks of such high density, IMO)
     Last resort (IMO) would be dropping the memory speed a notch:
     DRAM Frequency ---- 3200 (decrease to 3000)
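The trial order above (voltage first, then primary timings, frequency last) can be sketched as a small Python generator. This is purely illustrative pseudologic, since BIOS settings aren't scriptable, and the function name and dict keys are made up:

```python
# Enumerate stabilization attempts in the order the post suggests:
# 1) raise DRAM voltage, 2) loosen primary timings a step or two,
# 3) as a last resort, drop the memory frequency a notch.

def relax_plan(cl: int, trcd: int, trp: int, tras: int,
               freq: int, vdimm: float):
    """Yield candidate BIOS configs, one attempt at a time."""
    # Step 1: bump voltage (post says 1.40 V is fine, 1.45 V max)
    for v in (vdimm, 1.40, 1.45):
        yield {"CL": cl, "tRCD": trcd, "tRP": trp, "tRAS": tras,
               "freq": freq, "VDIMM": v, "CR": "2T"}
    # Step 2: loosen the primary timings one step, then two
    for step in (1, 2):
        yield {"CL": cl + step, "tRCD": trcd + step, "tRP": trp + step,
               "tRAS": tras + step, "freq": freq, "VDIMM": 1.40, "CR": "2T"}
    # Step 3 (last resort): drop the frequency a notch (3200 -> 3000)
    yield {"CL": cl, "tRCD": trcd, "tRP": trp, "tRAS": tras,
           "freq": freq - 200, "VDIMM": 1.40, "CR": "2T"}

# Starting point from the post: DDR4-3200 CL16-18-18-38 at 1.35 V
plan = list(relax_plan(16, 18, 18, 38, 3200, 1.35))
print(len(plan))         # 6 candidate configs to try in order
print(plan[-1]["freq"])  # 3000 -> the last-resort frequency drop
```

In practice you'd test each candidate with MemTest/Karhu before moving to the next, stopping at the first stable one.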
  12. I still refuse to "upgrade" my Win10 Pro x64 installation for now. Because 1) it still works very close to flawless while being a bit less resource-hungry, and 2) I've seen enough shenanigans on W11 machines (that I've built or helped build) appear out of nowhere after an update: a PITA to solve issues that shouldn't have happened at all.
     It's time people start realizing this: the days when everybody absolutely needs to use Windows on PC, especially for gaming, are ending. The general discontent with Win11 (perceived as much higher than it already was with Win10) is a reality and, exactly because of that, alternatives are starting to exist (with varying degrees of success). Some may soon become direct alternatives.
     As we all know, Steam is pretty much the #1 platform in PC gaming. You have the store, where you can buy pretty much every PC game there is (save a few exceptions from some publishers), and the platform also has a community market for in-game items of different games. It also has a working social platform for games, including modding support for plenty of games (Steam Workshop). And then, currently, there's a profusion of handheld devices for PC games (boosted no less by the Steam Deck), which require a light and uncomplicated OS for performance and ease-of-use reasons.
     You may think "okaay... but what does that have to do with my gaming desktop PC?" Well, this has been igniting interest and fast development in the OS area (especially Linux distros): ones that can "game out of the box" (right after OS installation), with hardware, peripheral and game-controller drivers already implemented in the OS. This directly benefits desktop PCs used for gaming (so, what most of us here use). There are people already converting their main PC to such OS alternatives.
     Now, with all this said, I'm not saying people should jump head-first into this. That is not what I mean. What I mean is: be curious, start paying attention to the developments that are happening. Because at a certain point (probably sooner than most expect), and for any "normal" PC user, gaming or not, you'll start to see 100% valid solutions, very optimized and far less fussy to use and maintain than what we have today with Win11. For now, the transition to such new OS alternatives is not yet 100% guaranteed for everybody (problems with some drivers, incompatible software and games, limitations, certain games' anticheat systems, etc.), but we are certainly getting there soon. BAZZITE is one good example today. Among other Linux distros, this one is quickly gaining popularity because it's a simplified, hassle-free Linux + Steam based OS (cloud-based for updates), one that is considered very user-friendly for both old-time Windows users and PC newcomers (many coming from consoles now).
  13. LOL as if melting connectors and price gouging wasn't enough.... nice job Nvidia and AIBs!
  14. I'd take that video with a grain of salt.... I look at it as somewhat of an anomaly (related to a specific mission or game version). What you should notice, because it's important, is that AMD inverted the CCD+L3D stack for the 9800X3D (and the upcoming 9900X3D and 9950X3D), which corrected the clock-speed limitations seen in past X3D chips (so it runs higher clocks now), and that is where most of the performance improvements come from. In the past two generations of X3D desktop processors, namely the 7800X3D and the 5800X3D, the 3D V-Cache die was stacked on top of the CPU complex die (CCD), which required lower CPU clocks than their non-X3D brethren due to thermal constraints. With the 9800X3D, the CCD is now on top and the L3D below it. This makes the CCD's thermals behave like they do on the regular Ryzen 9000 series processors without 3D V-Cache (much improved now), which is how AMD was able to increase the base frequency significantly. It's also considered to have the same overclocking capabilities as the regular 9000-series processors (not the case before).
  15. I'd be careful about choosing a CPU+motherboard platform on longevity arguments alone... AMD's roadmap indicates that Zen 6 is going directly to 2nm and is expected in (late) 2026. It has also been noted that AMD is retiring AGESA for OpenSIL. This means that the firmware which tells the motherboard how to communicate with the CPU is being replaced right about the time the new CPUs are expected to launch. It could be that the socket will not be replaced, and that every motherboard currently in service will require not only a new BIOS but a new BIOS type in order to function. But it also could be that AM5 ends up a two-generation socket (Ryzen 7000 and 9000 series). Meaning, it's unclear if the next AMD Ryzen (10000 series?) will be supported on current AM5 motherboards, and perhaps it's better to expect it not to be.
     The Intel i9 14900K is a really good CPU that has proven to perform in whatever single- or multi-threaded task (8c/16t P-Cores and 16 E-Cores), great for everything. But its outrageous power consumption and hot temperatures at full load, and especially the infamous degradation issues (which may or may not have been mitigated by the latest BIOS/microcode updates), make it somewhat of an insecure investment now, even if its lowered prices (because of all this) are tempting.
     The AMD Ryzen 9800X3D is currently the best CPU proposition if gaming is the sole priority, thanks to a specific feature it has: the 3D V-Cache. Which, for now (as in, "great today but unknown in the future"), immediately gives a performance advantage in many games, DCS included. It does (unfortunately) come at a higher price, very much so considering its limitations with demanding multi-threaded tasks; it has "only" 8 cores / 16 threads.
     The AMD Ryzen 9950X is rather interesting for its performance given the price (lower today than at launch). While not as fast for gaming as the 9800X3D (because it lacks the 3D V-Cache), it's not that far behind; it's still really good. And having double the cores/threads (16 cores and 32 threads) makes it much better for multi-threaded tasks for years to come (if that matters), at an identical price. It is AMD's direct competitor to Intel's i9 14900K.
     If willing to upgrade, I'd wait for the upcoming AMD 9950X3D, see reviews, and decide. Promised as the best of both worlds (gaming and productivity), but expect a salty price. And, honestly? ...I'd not disregard your i9 12900K yet; it's still a very good CPU, better than what recent benchmarks may suggest.
     from https://www.techpowerup.com/review/amd-ryzen-7-9800x3d/
  16. Weekends at evening or night is all I have for flying (DCS or other), due to work and other time constraints.
  17. Meanwhile...
     - Gigabyte RX 9070 XT unboxed and pictured
     - Looks like scalpers or retailers got their hands on RDNA4 cards too early.
     - Radeon RX 9070 XT confirmed to have 64 Compute Units; PowerColor Red Devil (overclocked model) with 900W+ PSU requirement
  18. Here we go. Here's the difference between a proper techtuber and others who are there merely to entertain and profit from views/clicks. He's interested in going down the rabbit hole and proving why this design is hot garbage (which it is). It could be the case that the RTX 5080 is not free of these issues either. It has been suggested that the safe limit is 375W per single connector (12VHPWR or 12V-2x6) and, if true, at ~360W the 5080 is borderline. Remember, we also have to keep in mind spikes that may surpass the announced max TDP at full load, and possible manufacturing tolerances.
     source: reddit.com/r/pcmasterrace/comments/1io4a67/an_electrical_engineers_take_on_12vhpwr_and/
  19. It hasn't been confirmed that ReBAR works for DCS (not implemented by ED, is my guess), and I honestly never noticed a difference with it on or off. I do use it for most other games, and it does make a really noticeable difference in some.
  20. Yep, the memes are deserved. LOL Here's a fun fact - the CEOs (or spivs, depending on POV) of two of the biggest colossi in this industry - and certainly the leading ones in the GPU market - are actually relatives. https://www.tomshardware.com/news/jensen-huang-and-lisa-su-family-tree-shows-how-closely-they-are-related
  21. Was just now watching Paul's Hardware, today's "Probing Paul". Questions #1 and #2 are Nvidia related (attached vid below is time-stamped). When he goes on that small trip down memory lane, and remembering what I just.... eeeeeeeh
     Every electrical expert: This stuff is bad, like, really.
     Tech companies: It's fine.
  22. MUHAHAHA congrats to the author ... "you win the internet!" absolutely friggin brilliant!
  23. LOL As nice as it looks, I think it's worth a repair attempt. Contact these guys: Northwestrepair and KrisFix-Germany.
  24. We're long past the days of 250W GPUs, where pretty much any decent PSU could (and can) easily and safely handle the load with the good old trusty double 8-pin connectors, which is what some users will be jumping from. Any of these new 5090s will have spikes reaching (surpassing?) the 600W limit, beyond the 575W TDP, at full load. That's on a single 12VHPWR (or 12V-2x6) connector and cable. Check Igor'sLAB's review of the 5090 FE: https://www.igorslab.de/en/nvidia-geforce-rtx-5090-founders-edition-review-the-600-watt-powerhouse-in-gaming-and-lab-tests/14/ So, sure, certain AIB models may have the power-monitoring issue resolved (as all should have, IMO). But they'll still be far too power-hungry for their single connector, and they still lack active load balancing. The concerns and possible issues in the matter remain.
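Some rough back-of-the-envelope math for why the single connector is a concern. This is an illustrative sketch, not a measurement: it assumes perfectly even current sharing across the six 12V pins and the commonly cited ~9.5A per-pin rating, both of which are assumptions here, and the whole point of the load-balancing criticism is that sharing is NOT guaranteed to be even:

```python
# Per-pin current on a 12VHPWR / 12V-2x6 connector at a given board power.

PIN_RATING_A = 9.5   # commonly cited per-pin rating (assumption)
PINS = 6             # 12V current-carrying pins in the connector
VOLTS = 12.0

def amps_per_pin(watts: float, pins_carrying: int = PINS) -> float:
    """Current per pin, assuming the load splits evenly across the pins."""
    return watts / VOLTS / pins_carrying

# 575 W TDP spread across all six pins: ~8 A each, already close to the rating.
print(round(amps_per_pin(575), 2))     # ~7.99 A per pin

# Without active load balancing, a bad crimp can shift current onto few pins.
# If only two pins end up carrying the load, each is far past the rating:
print(round(amps_per_pin(575, 2), 2))  # ~23.96 A per pin
```

That second number is the failure mode der8auer and Buildzoid demonstrated: the connector has no per-pin margin to absorb uneven contact resistance, which is why even a "working" cable can cook.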
  25. Unfortunately I don't have the money but, even if I had, I wouldn't get a 5090, probably not even a 4090, regardless of "best version" and "bestest PSU and cable" for it. Some units will last far longer than others, or present fewer symptoms, but all have this garbage power design on the boards, that much we now see (ignore it at your own peril). We will see renowned people in the area showing that it "works just fine", somewhat in contradiction to what Der8auer, Igor'sLAB and Buildzoid have shown and/or explained (also the electrical engineer fella in the reddit post). The problem is, proving that a certain combination of "specific RTX 5090 model + specific PSU + specific cable" is fine won't accomplish what actually matters in this subject, which is that every model of these GPUs works fully reliably and safely with all compatible PSU and cable hardware, which, as it seems, they don't. Hey, at least there's one positive side to this: honest working guys like Northwestrepair and KrisFix-Germany will have more work and keep doing what they do best, which is repairing GPUs for people who (perhaps) should have known better.