Everything posted by LucShep

  1. igor'sLAB has released an article now too. Groundhog Day: The 12V2X6, melting contacts and unbalanced loads – what we know and what we don’t know https://www.igorslab.de/en/groundhog-day-the-12v2x6-melting-contacts-and-unbalanced-loads-what-we-know-and-what-we-dont-know/
  2. Thanks for sharing! That's mandatory reading for anyone interested in this issue with the RTX 5090, 5080 and 4090 models, and an excellent take on the subject. I just hope the guy doesn't suffer any "retaliation" for speaking so openly. And before it gets deleted on reddit, quoting his entire post. The simple conclusions (quoting the author, also from one of his follow-up replies in the discussion):

"Any card drawing more than the base 375W per 12VHPWR connector should be avoided. Every single-cable 4090 and 5090 is in that mix, and the 5080 is borderline at 360W. Messing up the 4090 is a one-off mistake. It happens. It's not good, but one bad product does not set a pattern. The 5090 is BAD. They knew about the problem from the 4090 already. They had what at least appears to be a working solution from the 3090. They chose not to re-implement that solution after seeing the lack of it cause failures.

So there is no way to safely use a 5090? As far as my opinion goes, no. Unless you cripple it down to 5080 power levels, it is simply too power hungry for this connector. It either needs active load balancing (and it better be good at ~600W) or multiple connectors to brute-force a big safety factor. The Galax HOF 4090 is actually a good example. 2x 12-pins, so in theory 1320W of power capacity on a 450W card, and if I use the derated connector spec of 375W, that's still 750W. If you find a 5090 like that, only then would I be comfortable running at full TDP."
  3. HEH, true. And agreed, wait for the reviews. I'm actually more concerned about AMD's Radeon division's ability to unwittingly self-sabotage... it's been recurrent in previous releases. First time in many years that the wind blows in their favor; it's all on them now to lose this.
  4. With the messy release of the NVIDIA RTX 5000 series, attention is now quickly turning to its competition: AMD's upcoming GPU releases, the RDNA4 RX 9000 series. After NVIDIA's RTX 5090 and 5080 supply issues, it's rumoured that the RTX 5070 and 5060 series releases have been delayed, as chip supply constraints also affect them. It wouldn't be wrong to say that AMD has a great chance to dominate the mainstream GPU segment, since the general sentiment is definitely against NVIDIA for now.

The RX 9070 series (with the 9070 XT and 9070 models) will be the first and, perhaps, most important release of RDNA4. It is (supposedly) the rival to the NVIDIA RTX 5070/Ti series, but at lower prices to be more competitive. It launches on March 6th, available worldwide immediately, or within the same month in the worst scenario. Performance of the RX 9070 XT is rumoured to be on par with (or better than) the Nvidia RTX 4070Ti Super and the previous AMD RX 7900 XT. Some suggest performance actually closer to the RTX 4080 in rasterization. The 9070 series will feature a 16GB GDDR6 (20 Gbps) 256-bit memory configuration. There are also rumours of a 32GB XTX version to be launched at a later date.

The RX 9070 XT reference models are ~300W TBP with a 750W+ PSU requirement, while the RX 9070 reference models are ~220W TBP with a 650W+ PSU requirement. Of course, expect the TBP to be slightly higher on the OC models from AIBs. It will feature the new dedicated FSR 4 (an AI algorithm, similar to Nvidia's DLSS), which may also be added to previous FSR 3 and FSR 3.1 game titles. The RX 9060 series has been confirmed for the 2nd quarter of 2025, to compete with Nvidia's current RTX 4060 series and upcoming 5060 series.

AMD has confirmed that the Radeon RX 9070 series will launch on March 6. Officially, the RX 9070 XT costs $599 while the RX 9070 will retail at $549. That said, we all know that prices on new GPUs have been well above MSRP, and some level of scalping could ruin (once again) a very anticipated GPU launch. Stock availability at launch is unpredictable, judging by the latest GPU releases - maybe it's good, maybe it isn't. Rumour has it that AMD has been shipping the RX 9070 series since late December, that a large stock is waiting at retailers, and that cards are already in the hands of reviewers (for whom you should wait).
  5. Nice! Following this for sure, keep it going.
  6. There's still no safety monitoring. The current is still not evenly distributed across the pins. There's still too little margin for error with one single connector. The problem remains, i.e. the new cable does make a difference, but the issues are only better disguised.
  7. I'm still perfectly happy with my RTX 3090 and won't be upgrading anytime soon. That said, even though its much larger 24GB VRAM buffer surely helps with DCS (being the VRAM guzzler it is), especially in VR, I'm not so sure it's that big of a jump from an RTX 3080 (~15% performance difference at 4K).
  8. In the EU, the CE-marking is merely a legal construct. Putting it on the product is considered to be a promise by the manufacturer that they ensured that their product complies with the EU standards. So if they put the CE-marking on the product without actually doing that, the courts can conclude that it was not a mere oversight, but a willful act of not following the law. But there is no testing required by an independent/government body. Testing it would be hard anyway, since the Low Voltage Directive just uses generic language stating that the product should be safe to use and connect, and that the manufacturer should recall or fix the device if this turns out not to be the case (despite a solid effort to make it safe). Presumably, the courts would create jurisprudence, or it already exists, on what is considered to be safe enough, based on expert testimony or the assessment by national agencies, and when a recall is warranted. In the US, the laws seem more centered around empowered agencies making and enforcing rulings, but GPUs are probably not on the radar of any safety agency right now. So in practice, we probably either need a sufficiently big scandal with people dying for this to get on the radar of the agencies, or people need to sue themselves.

Woooaaa ...I had no idea it was that lenient! (and thanks for the explanation) I suppose that, yes, it'll have to get far worse before it gets any better. The coming months will be revealing, I guess.

+1. Ditto. It'd be a shame to be without future GPUs from the leading hardware/software manufacturer in the area. But then, if this is really their modus operandi, releasing products on the verge of price gouging while being potentially faulty (and dangerous), then I honestly think we'd be better off without them (good riddance). There'd still be AMD and Intel (and possibly others who'd venture in, like in the past) picking up where they left off.
  9. Honestly, the more I read about this, the more I find it (far) worse than what happened with Intel Raptor Lake CPUs. Exactly: no safety monitoring and too little margin for error. The power port on the 5090 overheats and burns because the current isn't evenly distributed across the pins. In addition, it should have had at least a second connector (not just one).

As the Der8auer tests show, one of the 12-pin connector wires draws 23 amps (over double what it is supposed to carry?!?), causing temperatures to spike to 150ºC. That's crazy. A connector heating up to 70ºC, while the PSU side gets extremely hot at 150ºC, all in just a few minutes of benchmark testing, is perhaps all that needs to be seen.

And now think about this: if the mentioned headroom of 5% (Nvidia 600W rating) or 15% (Molex 660W rating) is already too little in itself, then one has to wonder about possible transient spikes, possibly going over 750W(?), in that puny cable and the connector on each end, all prone to manufacturing tolerances...

What I wonder now is, seeing how this can get dangerous real quick, how could this have passed safety tests for the consumer market? Always good videos from Buildzoid.
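Just to illustrate the physics at play (my own toy model, not from the video): the six 12V pins behave like parallel resistors, so current divides by contact conductance. A minimal Python sketch, with made-up illustrative contact resistances, shows how a few degraded contacts push one pin toward the 20A+ range Der8auer measured:

```python
# Toy model: six 12V pins of a 12VHPWR connector as parallel resistors.
# Contact resistances (milliohms) are made-up illustrative values.
TOTAL_CURRENT = 600 / 12  # 600 W at 12 V -> 50 A shared across six pins

def pin_currents(resistances_mohm):
    """Current through each pin, split proportionally to conductance."""
    conductances = [1.0 / r for r in resistances_mohm]
    g_total = sum(conductances)
    return [TOTAL_CURRENT * g / g_total for g in conductances]

balanced = [6.0] * 6                           # all contacts alike
degraded = [6.0, 6.0, 40.0, 40.0, 40.0, 40.0]  # four worn contacts

for label, pins in (("balanced", balanced), ("degraded", degraded)):
    print(label, " ".join(f"{i:4.1f}A" for i in pin_currents(pins)))
# balanced: ~8.3A per pin (fine)
# degraded: ~19.2A on each of the two good pins,
#           way past the commonly cited ~9.5A per-pin rating
```

Without per-pin sensing, the card never sees this imbalance; it only knows the total is still 50A.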
  10. Yep. And he already did a video on the RTX 4090 before, explaining the issues.* If the 5090 is now pushing up to 25% more power than the previous 4090, then of course it's now a problem of "when", not "if". Undervolting the 5090 is an absolute "must do", even more than before. To put it simply, this connector should never have been used. And if they insisted on it, there should have been two of them on the 5090 and 4090. * If you're short on time or patience, skip to the 13:27 mark of the video:
  11. Yeah, I agree with @HansPeter1981, it does look like a description of a GPU with exhausted VRAM. @goot66 and @psychotik2k3, can you please try again after changing the game options "Textures" to LOW and "Terrain Textures" to LOW? Just for testing purposes, to see how it impacts things and whether it improves the issues you mention - if it does, it's definitely VRAM related, as suspected. I'd also suggest considering the following (see the HAGS sketch after this list):

- If using less than 64GB of RAM (48GB, 32GB, or less), set the pagefile fixed to at least 32GB (32768 in both initial and max size) on the fastest drive of your system (be it NVME or SSD). It helps tremendously with games that use lots of VRAM (like DCS) and sometimes exceed its limits, possibly exhausting the RAM as well - when that happens, the pagefile (aka virtual memory) steps in to assist.
- Disable HAGS. For most systems it's best left OFF (disabled), because it can introduce issues such as stuttering and hitching, and it increases VRAM consumption. It should only be enabled in rare cases where the CPU is vastly weaker than the GPU (not true on most gaming systems), as enabling it makes the GPU's scheduling processor and memory (VRAM) take over work that should be done by the CPU and, of course, that penalizes GPU resources.
- Use a higher performance power plan in Windows, such as the Ultimate Performance power plan (enabling it before launching the game, reverting to "Balanced" after exiting). While peak performance won't improve much, it can benefit things once in game, reducing stuttering and hitching issues.
- Disable Core Parking, as it's a feature meant for laptops and office PCs. Disabling it can reduce micro-stutters when playing games or using resource-heavy apps. There's also a newer (free) application called ParkControl which handles this better on modern CPUs, along with Windows power plans.
- Disable HPET and Dynamic Ticks, as this allows unrestricted I/O to occur and helps decrease the micro-stuttering and hitching that may occur during gameplay. These are meant for portable and battery-powered systems, can be a problem for desktops, and are known to cause issues especially when gaming.
- Disable VBS / HVCI if the system is meant for gaming - the setting is useless in that case, and disabling it can benefit gaming performance.
- Enable rBAR (if using an NVIDIA GPU of the 30, 40 or 50 series). While the benefits for DCS are debatable (or not felt), it benefits too many games for it not to be used.
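For the HAGS item above, a minimal Python sketch (my own, not an official tool) to check the current state without digging through menus. It reads the HwSchMode registry value (2 = enabled, 1 = disabled); I'm assuming the value may simply be absent on older Windows 10 builds. Actually toggling HAGS is best done through Settings > System > Display > Graphics, and requires a reboot either way:

```python
# Read Windows' Hardware-Accelerated GPU Scheduling (HAGS) state.
# HwSchMode: 2 = enabled, 1 = disabled. Windows-only (winreg).
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        mode, _ = winreg.QueryValueEx(key, "HwSchMode")
    print("HAGS is", "ENABLED" if mode == 2 else "disabled")
except FileNotFoundError:
    print("HwSchMode not present - HAGS unsupported or never configured")
```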
  12. @AvgWhiteGuy Only now noticed your system specs... an Intel i5 4460 (3.4GHz, 4 cores/threads) and GTX 1070 8GB, with DCS 2.9x at 3440x1440... That's not going to work well. Of course it runs worse than at 1920x1080, because you're now asking that old system to do over double the work in number of pixels on screen (before 2073600, now 4953600). Honestly, for DCS 2.9x at 3440x1440 you'll really need a complete new system; it's not even worth upgrading parts of what you have there. Until you have a new system, and depending on what modules and maps you have, I'd suggest trying an older version such as DCS 2.5.6 in the meanwhile.

Depending on the exchange rate, a budget of ¥18.000 (about 2.390€, or 2.470$ USD) might be a bit tight for that new DCS PC system that you're planning. But, in any case, if this is a system to last, and for just a bit over that budget, I'd consider something more or less like this:

CPU: AMD Ryzen 7 7800X3D (yes, others will recommend the 9800X3D, but it's expensive and only a tiny bit better)

CPU Cooler: Thermalright Phantom Spirit 120 (any version of the Phantom Spirit 120 is good)

GPU: Nvidia RTX 4070Ti Super 16GB, any mid-range model version. For example: ASUS TUF, MSI Gaming Trio, Palit Gaming Pro, PNY XLR8 Verto, Zotac Extreme Airo, Gainward Phoenix, Galax/KFA2 ST Plus... or wait for the upcoming RTX 5070Ti 16GB?

MOTHERBOARD: a good but not expensive mid-range B650 motherboard. For example, any of these is great for the AMD 7800X3D (IMO):
- Asrock B650E Steel Legend WiFi
- Asrock B650E PG Riptide WiFi
- Asrock B650 Steel Legend WiFi
- Asrock B650 Pro RS
- Asrock B650 LiveMixer
- MSI MAG B650 Tomahawk Wifi
- MSI PRO B650-P WIFI
- Gigabyte B650 GamingX AX (rev 1.3)

MEMORY (RAM): any kit of DDR5 64GB (2x 32GB) 6000 CL30 "AMD EXPO". For example, Gskill X-Flare F5-6000J3040G32GX2-FX5 or Gskill Trident Z5 Neo F5-6000J3040G32GX2-TZ5N.

STORAGE: I'd suggest getting two separate NVME Gen4 drives, one for system and general files (1TB or more), and the other (2TB or more) for DCS and other games. For example, the WD Black SN850X or Samsung 990Pro are fast and reliable, among others. Of course, you can also add and reuse SSD drives from your older system.

PC (ATX) case: a well-ventilated mid-tower ATX case with fans in front (cold air in) and back (hot air out), at or below 110$ (no need for expensive cases), for example:
- Montech AIR 903 MAX
- LianLi Lancool 216
- LianLi Lancool 215
- Phanteks XP Pro Ultra

PSU: a good quality "mid-range" 850W 80+Gold, PCIE5/ATX3 certified. For example, any of these will do super fine for this system:
- Thermaltake Toughpower GF3 850W
- Seasonic Focus GX ATX3 850W
- Super Flower Leadex III ATX3.1 850W
- BeQuiet! PurePower12M 850W
- MSI MAG A850GL PCIE5 850W

A system like this will run DCS at 3440x1440 really, really well. It's not "overkill" and, I think, it's worth saving/paying for a good system base that can keep you satisfied from day one and for a long time after investing in it.
  13. 100% agreed. Pick your maximum possible budget for the new system first. With that in mind, we can then assist you here, if necessary, with a possible selection/list of components.
  14. Yeah, that's basically the problem with many of the clueless arguments that you'll read in this thread. These guys saying "how much better it is now", trying to defend the latest GPUs' utterly ridiculous prices, aren't even comparing "apples to apples" when bringing up the equivalents of back in the day, getting segments and models completely wrong. The real GPU equivalents were not only much cheaper, even after currency-era conversion but, actually, may also have been put to better effect with the demanding games of their respective eras. That's when the "price" problem rears its ugly head.

That, and the "modern features" defense, as if gunpowder had just been discovered. Take the old CRT monitors, for instance (still the best motion clarity and response times, ever). You could choose from a list of widely different resolutions and refresh rates, which could be selected without any blurring - there was no "native res", because that was whichever one the user found best for "quality-vs-performance", like intended, and which was easier to achieve irrespective of GPU - for many years, long ago. We had what was like "DLSS before its time", in the friggin monitor. FrameGen what? Your average CRT (and Plasma TV) at 60Hz always felt like 120Hz+ feels on modern LED panels. Different subject but, really, that and so many other great things were lost along the way and, unfortunately, never picked up again (and I still crave a Pioneer Kuro or a Panasonic VT/ST 1080P Plasma...).
  15. As I said before, it doesn't have to do with them complaining or "has to have". Wrong expectations? Of course they are! As I said, it has to do with the "parent and son" aspect of PC gaming, in many cases a repeat of similar experiences that a parent had (or wishes they had) back in the day. Many of these dads remember what components existed then and what their prices were. Seeing them realize in shock how expensive these are now reveals a lot (IMO). Of course the kid gets a below-average PC in the end, because no parent is willing to spend that much, and they usually end up going the used-components route (still the best solution). Of course the kid has no say and "it'll have to do". Even the RTX 4060 8GB that you mention, which is not much more than an entry-level GPU (only the lowly 4050 sits below it), still sells for 300€+. And the 4060Ti 8GB is 450€+. That's the modern equivalent of a late-2006 GF7600GS, which sold for 80€ or so here back in its day. GPU prices are crazy today, yes, even the entry-level stuff.
  16. Yep, your replies clearly justify the "TROLL" label that people have constantly put on you through the years. And shame on me, for taking the bait and thinking otherwise before (a waste of time, I now realize).
  17. I don't know, probably the "parent and son" interaction aspect of PC gaming goes completely over your head? Many of the complete systems I build or "list" (as components) are meant for that, as something that many parents here use to connect with their kids. And many parents go for it when their teenage kids do great in school, grades and such. The kids, in turn, have inherited their parent's (very) old PC before, and have gone "more advanced" in their gaming habits. Many of these kids are (unfortunately) heavily influenced by tech-tubers and streamers, and are aware of what is good and what isn't. What's next? Maybe they should try fishing? Or planting some cereal in someone's backyard to make bread and eat? ...oooh, the poor dirty peasants with dreams of PC gaming, right?
  18. Trolling much today, are we? Kids also don't need an iPhone, but some get one anyway. What do you mean? Presenting such prices to the parents doesn't mean they go for the purchase (much to the contrary, for the vast majority of them). It just means presenting the reality of GPU prices to them, a very different one from the 10 or 20 years ago that those parents remember from their PC gaming days.
  19. I guess we just live in different realities then. You live in a very wealthy country, I guess? Try telling that to a parent who, in early 2025, has to pay 530€+ for an RX7800XT, or 640€+ for an RTX4070S, to put into their kid's new PC. I have to do it all the time. That's about half the average (net) salary today in my country, which is 1150€ (and inflation goes.... heh!)

And no, before that argument comes up, the RX7600XT and RTX4060Ti are not "mid-range" at all. The RX7800XT and RTX4070S are "mid-range", the same segment as the old GTX970, which completely changes your narrative there.

In 2005 we used to fit an ATI X700 256MB for 150€ (that's 220€ in today's money) or an Nvidia GF6600GT 256MB for 200€ (today's 300€), which were mid-range segment. The average (net) salary here was 750€ in 2005. In 2015 we used to fit an AMD R9 280 3GB for 200€ (that's 250€ in today's money) or an Nvidia GTX970 4GB for 300€ (that's 380€ today), which were mid-range segment. The average (net) salary here was 950€ in 2015.

Both you and @SharpeXB mentioned the ATI HD4850, a (fantastic) true mid-range card, which people bought for 200€ or less in the 2nd half of 2008, when it came out. We used to build gaming systems by the dozens with those. Even the mighty HD4870 and HD4890 1GB models were sold for 300€ or less each, in their day. It makes you think for a while, knowing that for 300€ and change you could have one of the fastest GPUs on the market (in late 2008). ...how much for an RX7900XTX now?? LOL

I've dealt with the subject of both gaming and hardware for decades (professionally in both areas, now merely a side job or hobby) and, to say that, today, mid-range GPUs cost the same as before (or that we have it better now, as you've said previously) is, IMO, to be completely clueless about what goes on in one's own society. As I said, "it used to be better", and it really was for GPUs. Paint it as fancy as you like, but this particular subject is close to me and, beyond this, I think it's one of those things where I'll politely "agree to disagree".
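To make the comparison concrete, a quick Python sketch using the figures above (the inflation multipliers are the rough ones implied by my "today's money" conversions, not official CPI data):

```python
# Mid-range GPU price as a share of the local average net monthly salary,
# using the post's own figures and rough implied inflation multipliers.
examples = [
    # (card,                  price EUR, salary EUR, inflation to today)
    ("ATI X700 256MB, 2005",       150,        750,  220 / 150),
    ("GTX 970 4GB, 2015",          300,        950,  380 / 300),
    ("RTX 4070 SUPER, 2025",       640,       1150,  1.0),
]
for card, price, salary, infl in examples:
    share = 100 * price / salary
    print(f"{card}: {price * infl:4.0f}€ in today's money, "
          f"{share:4.1f}% of that era's net monthly salary")
# 2005: ~20% of a month's pay; 2015: ~32%; 2025: ~56%
```

Same segment, nearly triple the bite out of a paycheck. That's the whole point.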
  20. I suppose this will be for gaming (DCS etc.) and not for office-type work. If so, I'd recommend a 4K 120Hz VRR OLED PC monitor or TV, either 42'' or 48'' size (pick your poison), because of the impeccable image, motion handling and response times. Any of these will be awesome:

42” 4K 120Hz (138Hz w/OC):
- ASUS PG42UQ
- Philips 42M2N8900
- KTC G42P5

42” 4K 120Hz:
- LG OLED42C2 (TV)
- LG OLED42C3 (TV)
- LG OLED42C4 (TV)

42” 4K 120Hz bendable (adjustable from totally flat to 900R curve):
- LG OLED Flex LX3

48” 4K 120Hz (138Hz w/OC) OLED:
- AOC AG485UD2
- Acer CG48
- LG 48GQ900-B
- Innocn 48Q1V
- ASUS PG48UQ

48” 4K 120Hz OLED:
- Gigabyte AORUS FO48U
- Skyworth G90
- BenQ EX480UZ
- AOC AG485UD
- LG OLED48C2 (TV)
- LG OLED48C3 (TV)
- LG OLED48B4 (TV)
- LG OLED48C4 (TV)

That said, there is a small number of 50'' and 55'' 4K TVs (not OLED) that are 120Hz VRR and HDMI 2.1, which are more affordable than all those I've just listed. There are some good examples in the American market. At 50'' you have the Vizio MQX (mentioned above by @rmm) and the Sony X85K. At 55'' you have the Hisense U7N, U7K and U7H, and the TCL Q7 and QM7. Most people consider these too big for PC gaming, but it's a matter of taste and preference. And, while not as good as OLED, these 4K TVs are also great for gaming.
  21. LOL! That one almost sounded "woke"! ...you have to be really desperate to hold on to that argument, bringing friggin laptops (considered downright luxury back in the day) into the conversation... Really, search for the same "general positivity" articles for PC hardware, such as CPUs, motherboards, PSUs, and especially GPUs. You'll likely find none. 2024 is widely considered one of the worst years for PC hardware in a long, long time. Same for gaming releases, with the exception of a couple of AAA titles. Weak releases in 2024 and the beginning of 2025, of highly anticipated (and now even more expensive) products which have shown themselves to be disappointing (Intel Arrow "Error" Lake + Z890 motherboards, AMD Ryzen 9000 "NothingBurger" + X870 motherboards, Nvidia 5000 series "for your 10.000$ gaming-center", etc.), don't paint 2025 any better, so far.
  22. Not true, and this has been the subject of many a discussion elsewhere in recent years. It'll depend on the time period and genre but, in general, it's really the contrary. Compare today with a decade ago (2015), and with two decades ago (2005). Your average "mid-range" GPU, its cost, and especially its relative performance versus cost for its time period (more so when put in % difference against the top-level models of the time) represented far better value for the average person.

One may argue that there are more games today than there were then, so there's now more to choose from in the current pool, and thus wider/different performance requirements. But modern AAA games are more demanding than ever, requiring hardware at "Recommended Settings" that, compared to one and two decades back, is far costlier against average wages in western countries.

One may also argue that AI solutions for upscaling and frame-generation may alleviate things - that's the intention - but there's no such thing as a "free lunch"... Those come at a cost in picture and motion quality, and are now a way to exploit the game development process itself, by lazy gamedevs who already count on them as the baseline for the game's "default" performance and, therefore, skimp on the optimizations that would (and should) have been done in the first place. Meaning... you're back to the beginning, and their purpose - being twisted - may well become lost.

So... paint it as you like, but "it used to be better", really.
  23. Actually, and AFAIK, Nvidia SLI and AMD Crossfire worked in DCS with versions up to 2.5.6 (they stopped working with 2.7).
  24. HEH... pitchforks... and the "standard"? The "standard" is to have full-sized diffuse maps and 1/2 that size on every other map (speculars and glows). Which is what everybody has been asking for, if you noticed. But ED and 3rd parties actually make them all full size (diffuse, speculars and glows), and most times at 4K size. That's even in situations where diffuse at 2K and speculars and glows (Spec/Norm/NRM/Roughmet) at 1K would be more than sufficient (with armament/weapons, for instance). It's crazy!

Also, it's a mistake to think that textures need to be 4K (4096 pixels), 2K (2048 pixels) or 1K (1024 pixels). You can use in-between sizes and get further gains with them as well - the dimensions only need to be multiples of four. So many textures could very well be 3K (3072 pixels) and be practically indistinguishable from 4K ones, with huge gains once you add up all the textures. Or be 1.5K (1536 pixels) and nearly indistinguishable from 2K ones, again with noticeable gains once you add up all the textures.

And then comes the format... Who in their right mind would risk using 32-bit textures in a game like this, with so many textures at such high resolutions? 8-bit is plenty, no one will notice any difference whatsoever, and it's at least 2x smaller (see here)...! And transparency formats (DXT5) are used where there is no transparency at all (so it could have been DXT1) - yet more gains ignored, textures that could have been, yet again, 2x smaller...!

The part that irritates me, in all these years that we've been begging ED and 3rd parties to do it, is the sheer lack of will to even experiment, so that they can see for themselves what we've been talking about. Because it's that evident, once in game, after you correct the sizes and formats of textures in everything. ED and 3rd parties need to do better; there are significant gains to be had for everybody, and it's all neglected. It's no wonder DCS is such a VRAM guzzler.
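To put rough numbers on it, a back-of-the-envelope Python sketch (my own figures: DXT1/DXT5 compress 4x4-pixel blocks into 8/16 bytes, uncompressed 32-bit RGBA is 4 bytes per pixel, and a full mipmap chain adds about a third):

```python
# Approximate VRAM footprint of one square texture at various sizes/formats.
BYTES_PER_4X4_BLOCK = {"DXT1": 8, "DXT5": 16}

def texture_mb(side, fmt, mipmaps=True):
    """Size in MiB; 'RGBA8' means uncompressed 32-bit color."""
    if fmt in BYTES_PER_4X4_BLOCK:
        size = (side // 4) ** 2 * BYTES_PER_4X4_BLOCK[fmt]
    else:
        size = side * side * 4          # uncompressed RGBA8
    if mipmaps:
        size = size * 4 // 3            # full mip chain ~ +33%
    return size / 2**20

for side, fmt in [(4096, "RGBA8"), (4096, "DXT5"), (4096, "DXT1"),
                  (3072, "DXT5"), (3072, "DXT1"), (1536, "DXT1")]:
    print(f"{side}x{side} {fmt}: {texture_mb(side, fmt):6.1f} MiB")
# 4096 RGBA8 ~85.3, 4096 DXT5 ~21.3, 4096 DXT1 ~10.7,
# 3072 DXT5 ~12.0, 3072 DXT1 ~6.0, 1536 DXT1 ~1.5 MiB
```

Multiply those per-texture savings by the thousands of textures in a module and the point makes itself.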
  25. Sure, there are plenty of indie games that are cool, and a few one-man projects that are really interesting (I'm waiting for Mass Conflict:Ignition and Over Jump Rally). But, let's be honest... the indie arena today has been completely polluted with a heap of crappy isometric games and side-scrollers. I still miss that period from the late 1990s to the late 2000s, when a lot of new awesome games (now part of gaming history) were popping up left and right. People were doing really innovative, complex things (ED is no stranger to it); we never got to see that again, not at such volume. There's a sense of "fast food" in the gaming market. First it was the FPS a-la CoD and BF, then the "BattleRoyale" genre, and now ultra-flashy (and HW-intensive) "3rd Person Fantasy RPG" clones flooding the market. I mean... at least make them without silly requirements, so that everyone can play them, even with more "economic" GPUs? For sure, considering how the remaining 4090s in stock are holding their prices, as are the used ones (actually increased in value), that looks like a good decision! It may well be the case that not even the future 6080 will beat a 4090, much less a 5090...