Everything posted by LucShep

  1. @pauldy Before quitting and resorting to such drastic methods... Please try following this tutorial (exactly, step by step) with a method for a Win10 reinstallation without losing any programs, settings or drivers. The part you want starts at 6:43 and ends at 10:54 in this video (the "Last Resort" Repair Upgrade section); nothing else, before or after that, is of interest to you. Like I said, it reverts your Windows installation to default (so, like a fresh install) but without losing anything at all. Everything stays the same: apps, drivers, documents and whatnot. It takes about 20 minutes to do the whole thing and, who knows, it may help you there. Meaning, at worst, you lose 20 minutes of your time, and at least you'll know whether it's related to corrupted/borked Windows files or not. After that, if it still does it, then yes... it's fair to suspect something else went wrong and some program, or even malware or a virus(?), is running in the background. If so, try running SuperAntiSpyware Free; it scans for nasty stuff like malware, rootkits, spyware, trojans, etc., and hopefully that part gets cleared too.
  2. If you wish to decline Win11 updates, it can be done with WUB (Windows Update Blocker): https://www.sordum.org/9470/windows-update-blocker-v1-8/ Very easy to use (a one-button affair, enable/disable), and it works with all Windows versions, from XP all the way to the current 11. PS: please note that blocking Windows updates with WUB also blocks app downloads/updates from the Windows Store.
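For reference, a rough manual equivalent (an assumption on my part; I'm not claiming this is literally how WUB works internally) is the documented Windows Update group policy value, which can be applied with a .reg fragment like this:

```reg
Windows Registry Editor Version 5.00

; Disable automatic updates via the Windows Update policy key.
; Delete this value (or set it to 0) to re-enable automatic updates.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU]
"NoAutoUpdate"=dword:00000001
```

A dedicated tool like WUB is still the easier and more reversible route; this is just for those who prefer to see what's being changed.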
  3. That's really odd, and I'm divided. On one hand, for some reason it looks like games are running in Borderless Windowed mode, not "true" Fullscreen. For example, in your video, check how quickly you Alt+Tab'ed from the game to the desktop (and vice versa)... there should have been a bigger delay, like a pause, if it were Fullscreen. What I mean is, it doesn't look to me like your game is running in "true" Fullscreen. Makes me wonder what happens if you press Alt+Enter once in the game (any game)... On the other hand, it looks like another app running in the background is causing that "sudden flash". What it is, I don't know, but I suspect it may be some kind of automated process being triggered in the background, like a cleaner, an optimizer, or maybe an antivirus? If you use CCleaner, make sure "Smart Cleaning" is deactivated in its options. If you use a third-party antivirus, make sure every game is added to its exceptions list.
  4. What relatively cheap FFB stick devices were sold under the patents? AFAIK, only Microsoft (Sidewinder FFB 1 and 2) and Logitech (G940 and Wingman Strike Force 3D FFB) sold "cheap" FFB sticks, both under patent licenses. Of course it couldn't be the only factor. But, of course, it is a considerable contributing factor.
  5. That's odd. Windows 10 or Windows 11? Just for a test, try disabling both Fullscreen Optimizations and the Game Bar. Here's a step-by-step on how to do that, in both Win10 and Win11....
Disabling Fullscreen Optimizations:
1. Right-click the game's desktop shortcut (or the game's .EXE file) and click Properties.
2. Select the Compatibility tab at the top.
3. Tick the box that says "Disable fullscreen optimizations" and save the change by clicking Apply > OK.
- - - - - - - - - - - - - - - - - - - - - - - - - - - -
How to disable the Game Bar on Windows 10:
1. Open Windows Settings. To do so, open the Start menu and click the small "gear" icon, or press Windows+I on your keyboard. In Settings, click "Gaming".
2. Under the "Game Bar" settings, click the switch beneath "Enable Game Bar" until it is turned off. Also untick the box that says "Allow your controller to open Game Bar". After that, close the Settings app.
All done, the Game Bar is now disabled. If you try pressing Windows+G, nothing will pop up. Even if you press the Xbox button on an Xbox controller, nothing will happen. The Game Bar has been fully disabled. PS: if you'd like to enable the Game Bar again later on, just reverse the process (switch the toggle back on and tick the box again).
- - - - - - - - - - - - - - - - - - - - - - - - - - - -
How to disable the Game Bar on Windows 11:
1. Press Windows+I to open Settings.
2. Click on Gaming in the left pane.
3. Then click on Xbox Game Bar in the right pane.
4. Switch off the toggle for "Open Xbox Game Bar using this button on a controller".
5. Next, click on Apps in the left pane and select Apps & features in the right pane.
6. Use the app list search bar to search for Xbox.
7. Click on the three-dotted icon for Xbox Game Bar.
8. Then click on Advanced options.
9. Click on the drop-down menu under "Background apps permissions" and select Never. From here on, the Xbox Game Bar won't run in the background and consume system resources.
10. Scroll down and click the Terminate button to immediately terminate the app and its related processes.
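If you'd rather skip clicking through Settings, the same Game Bar switches map to two well-known per-user registry values. This is a sketch, not the officially supported route (that's the Settings steps above), so back up the keys first:

```reg
Windows Registry Editor Version 5.00

; Turn off the Game Bar / Game DVR capture features for the current user.
; Set both values back to 1 to re-enable them.
[HKEY_CURRENT_USER\System\GameConfigStore]
"GameDVR_Enabled"=dword:00000000

[HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\GameDVR]
"AppCaptureEnabled"=dword:00000000
```

Sign out and back in (or reboot) for the change to take effect everywhere.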
  6. I certainly won't disagree with the last paragraph (the flight sim genre nosedived after 2002, until Covid). But those are certainly not just rumours. And what actual facts contradict them? Immersion was until recently the patent holder for nearly all things FFB, haptic and touch feedback (see here, and here as well). They're known to be quite strict and have sued quite a number of companies, if you look into that part of their history, including Sony in 2002, Apple in 2016 and Valve in 2023. When Immersion sued Microsoft in 2002, Microsoft settled by buying 10% of Immersion Corporation (which, I think, is why some of the patents have Microsoft listed as the patent holder). Of course license fees exist and deals can be struck - that's part of their business - but it doesn't mean such deals are cheap. The opposite is actually considered to be the truth, because even one of the big companies in the console market decided not to license - actually ditched the tech - for one of its most anticipated products (Sony did not feature touch-feedback technology in the pack-in controller for the PlayStation 3). Logitech had to take a license for their G940 FFB stick when it released in 2009 ("I-FORCE™ Force Feedback Technology Licensed from Immersion Corporation"). But that's a huge multi-million company that could afford the risk, and went for it (at the wrong time for flight-simming, it should be said, which is why it flopped). Thrustmaster, and Saitek (now property of Logitech), didn't go for it, and they certainly had the means and expertise. Perhaps it was circumstance - too costly to develop and produce for a small market, and then the Immersion licensing fee on top made those a no-go. How that went for Brunner, VPForce and FFBeast I don't know, nor do you. But I can't believe (do you?) that a licensing deal with Immersion is so affordable that it wouldn't impact those smaller companies' resources, or that it wouldn't impact the final cost of already expensive boutique products sold to a niche market. Three of the important patents were listed as expired around 2021 (last time I went around this subject, that link was working), so my understanding is that both the force feedback hardware and the "haptic" software simulation driving it can now be done without paying this company. Adding to that, the flight sim market reemerged (exploded is more correct) around that time, and it's still growing. So, I don't think it's wrong or a "rumour" to say that it's one less (considerable?) barrier to producing force feedback stick options, which are finally reappearing and will probably multiply in the coming years.
  7. IIRC, there were two companies holding patents on force feedback joysticks, Microsoft and Immersion Corporation, and the latter had exorbitant licensing fees, which discouraged manufacturers from pursuing the (re)adventure in this specific market. Thankfully those patents have expired, and we're now seeing the initial steps toward what may be a small flood of force feedback options in the coming years. I'd be all in, not for these but for a more affordable model, à la "VKB Gladiator FFB" (.....come on VKB!!! @AeroGator).
  8. It may be the case that the Arctic Freezer III works better than the Freezer II with Intel LGA1200 (10th/11th gen) and also AMD AM4/AM5, as those sockets are "square-ish". They don't have the problems that affect the stock LGA1700 (12th/13th/14th gen) ILM, such as bending, etc. (which affects temperatures); LGA1700 has a much more "rectangular" shape and is more sensitive to the contact patch with the cold plate. See the image attached below, maybe it'll make sense. For LGA1700, it's when you use an improved contact frame made specifically for it by third parties, such as the one from Thermalright (or equivalent), that it works as desired. The new Freezer III (unfortunately) uses a proprietary contact frame, so you're forced to use that, and it doesn't provide an optimal contact patch with LGA1700. It's not that it's "bad" (it's still good!) but you can do better for the same budget (as I posted above). The previous Freezer II, with a specific improved LGA1700 contact frame, gives slightly better results than the newer Freezer III (I tested this myself recently on a 14700K system). Of course, such third-party contact frames also benefit other coolers (air or liquid) that don't use proprietary contact frames.
  9. With that system, for 1080p on a single screen, I'd suggest getting one of these, used in good condition: RTX 2080 8GB (Nvidia), RTX 2080 Super 8GB (Nvidia), RTX 3060Ti 8GB (Nvidia), RTX 3070 8GB (Nvidia), RX 6700XT 12GB (AMD). All of these perform fairly similarly. Models with dual fans are "ok", but models with triple fans (better cooling) are preferred. You should find plenty of these "pre-owned" at £200-ish on eBay. Look for those listed by sellers with a good reputation. Ask the seller what use the GPU has had, and whether there are any problems at all. Ask for more pictures if the ones in the advert aren't enough to judge its condition. If any mention of "mining" comes up in the response, then run away and look elsewhere. Otherwise, if it's fully working and in good condition, any of these GPUs should be reliable and do what you want. If it really needs to be brand new, I'd say look for the Nvidia RTX 4060Ti 16GB (any variant), or the AMD RX 7700XT 12GB (any variant). But, these being newer models and brand new, be prepared to pay considerably more. PS: I know people frown upon used GPUs but, personally, I think they're absolutely worth considering, given how overpriced the market is for brand new GPU products.
  10. What's your current system (to fit the new GPU) ? And what's your budget for this GPU upgrade ? We need more info before recommending anything.
  11. Absolutely. Armoury Crate is a POS software and ASUS should be embarrassed by it. The only thing it's good for is getting/installing motherboard drivers, when you can't be arsed to pick them up yourself from the manufacturer's website.
  12. Ah yes.... an AVX2 and AVX512 ratio offset is a must. Try setting those to per-core "user specify" with the ratio offset at "3". Go for it again and see how it goes. For AVX, those are pretty decent temps for air cooling anyway.
  13. 1.5V... "mother of god" meme. You need LN2, not a 360 AIO!! That's what I meant before with the previous wall of text... you hit a wall with 12th gen (and it's why Intel went all-out with 13th/14th gen Raptor Lake for higher clocks). At some point you need outrageous voltages for a mere 100MHz bump in clocks! LLC 4 to 5 helps a lot with vdroop for a more stable overclock (but temps/wattage go up too).
  14. That's right, that's really the point of OC'ing (to me anyway). It's a "K" CPU (and a 12th gen, free of the degradation issues) and you've got a motherboard able to do it. If stuff was meant to be pushed (so long as it doesn't break), it's low-hanging fruit right there.
  15. All four 360 AIOs I've listed will do fine, as any other good 360 AIO out there should. FWIW, I've used all of those on 12900K, 13700K, 13900K and 14700K systems without any issues, all far hotter than an (OC'ed) 12700K. If those fall short of what you require, even with nice static-pressure quality fans added later, then it means you're going extreme and need custom watercooling. The thing is - and I've already been through it with mine - I sense you may be searching for unicorns where there aren't any... Perhaps rein in expectations and avoid aiming straight at the moon (a 12700K 5.4GHz OC?? woah?!?), because a wall is hit at around 5.2 and overvoltage becomes a reality. I'm pretty sure you'd need to at least sacrifice Hyper-Threading and/or the E-Cores. Meaning, you may get extra performance from the P-Cores, but at a high cost in temperature and voltages (can't really see it happening unless at 1.45V+?). Which could also mean risk of mid/long-term CPU degradation, not to mention some possibly (very) annoying instability occurring at random times. Get that 360 AIO if you intend to overclock, yes, for sure. But try an overclock that is useful, not suicidal/unrealistic. For the 12700K, if this is for gaming, personally I'd aim no higher than this (I saw no practical benefit beyond it):
5.3 for 2 P-Cores (decrease to 5.2 if unstable)
5.2 for 4 P-Cores (decrease to 5.1 if unstable)
5.1 for 8 P-Cores (decrease to 5.0, or even 4.9, if unstable) <--- this one is usually the most problematic
4.0 for all 4 E-Cores (decrease to 3.9, or even 3.8, if unstable)
Ring Ratio at 4.0 (easy and safer with ~1.25 SA voltage*) ....or 4.1 and up to 4.2 (harder/harsher on the CPU, requires more SA voltage - do not go over 1.35V on it!)
* supposing I read right and the RAM (in your signature) is DDR4 3400
V-Core (CPU Core Voltage), if manually fixed, ideally below 1.39V (with 1.44V being the absolute max limit IMO), and with LLC ideally set at 4 (up to 5, avoid more).
Anything more than this, and I think it gets too far into that hot zone of more extreme overclocking and benchmark score contests, not gaming and daily usage.
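Just to make those limits concrete, here's a toy sketch (my own illustration, not any official tool) that encodes the voltage/LLC guidelines from this post as a sanity check. The profile values are the illustrative numbers suggested above:

```python
# Illustrative sanity check for the 12700K profile suggested above.
# Limits are the guidelines from this post (Vcore ideally < 1.39 V,
# hard max 1.44 V, SA voltage <= 1.35 V, LLC 4-5) - not Intel specs.

SUGGESTED = {
    "p_core_ratios": [53, 52, 51],  # 2-core / 4-core / 8-core multipliers
    "e_core_ratio": 40,
    "ring_ratio": 40,
    "vcore": 1.38,
    "sa_voltage": 1.25,
    "llc": 4,
}

def check_profile(p):
    """Return a list of warnings for settings outside the post's guidelines."""
    issues = []
    if p["vcore"] > 1.44:
        issues.append("Vcore over the 1.44 V absolute limit")
    elif p["vcore"] > 1.39:
        issues.append("Vcore above the 1.39 V comfort zone")
    if p["sa_voltage"] > 1.35:
        issues.append("SA voltage over 1.35 V")
    if not 4 <= p["llc"] <= 5:
        issues.append("LLC outside the suggested 4-5 range")
    return issues

print(check_profile(SUGGESTED))  # an empty list means "within the guidelines"
```

Obviously your BIOS won't read a Python dict; this is just a compact way to restate the limits in one place.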
  16. I'd recommend any of these 360 AIOs, and all are relatively affordable. In no particular order of preference:
Arctic Liquid Freezer II 360: https://www.arctic.de/us/Liquid-Freezer-II-360/ACFRE00068B
Yes, I prefer the Arctic Freezer II over the new Freezer III as well, because it does not have a proprietary contact frame. The one they force you to use on the new Freezer III really sucks (awful contact with the cold plate). The Freezer II is older but its performance is still good.
Thermalright Frozen Edge 360 (Black): https://www.thermalright.com/product/frozen-edge-360-black/
This is one of the most affordable 360 AIOs on the market (check Amazon) but make no mistake - it performs really well. It's like the 360 AIO version of the Peerless Assassin air cooler - the best bang for the buck around, in my relative experience with AIOs.
Thermalright Frozen Notte 360 (Black ARGB): https://www.thermalright.com/product/frozen-notte-360-black-argb-v2/
This is slightly better than the Frozen Edge, and also affordable (check Amazon). But one advantage it has is also a downside... the pump runs at up to 5300rpm (vs 3300rpm on the Frozen Edge). Meaning, more performance, but it tends to make more noise, and may be prone to a shorter lifespan if it's meant to run always at (or near) 100% speed.
DeepCool LT720: https://www.deepcool.com/products/Cooling/cpuliquidcoolers/LT720-360mm-Liquid-CPU-Cooler-1700-AM5/2022/16286.shtml
This is the best-performing cooler (and with the best fans) on this short list, though not by much. It's also the most expensive - still reasonably priced, though. There have been some "scandals" involving DeepCool (search the web), so you may find it hard to get (the brand is banned in some markets).
Regardless of your choice of AIO for the i7 12700K, you'll also need an aftermarket corrected LGA1700 "contact frame" (replacing the stock ILM), because the stock one is utter garbage. This little thing alone will gain you another ~7°C of cooling, and it also eliminates the bending problems (which the stock one has!). I find it absolutely recommended for 12th/13th/14th gen "K" CPUs, be it with air or liquid coolers: https://www.thermalright.com/product/lga1700-bcf-black-v2/ It's also available on Amazon. There are alternative contact-frame solutions on Amazon as well (pretty much copies), which can also work fairly well and are cheaper (9.00 Euros +/-). If you go that route, be careful to choose ones with good feedback from users, and make sure it's a proper one made of aluminium - some out there are plastic!!
As a side note, I won't recommend anything other than the AIOs I've listed, because I find that 360 AIOs from other "renowned" brands (with much higher price tags) don't perform all that differently (actually, most perform slightly worse while costing 2x). It becomes a search for useless vanity reasons. Absolutely not worth paying more, IMHO. If you want to extract a little more performance from those 360 AIOs, you can later add better static-pressure fans if you're not satisfied (I like the Arctic P12 MAX). And if you really require more cooling performance than a 360 AIO can give, then go for full-blown custom water cooling (yep.... far more expensive and complicated!).
  17. So it's better to exclude the obvious part of the ownership experience, and get stuck on numbers which cannot translate to what one actually perceives IRL? Do you even know that current mid-to-high-range 4K TVs, including (so, not only) those 42'' and 48'' OLED TVs, are fully capable monitors in disguise? Have you even tried both solutions directly? I don't think you have. Because, if you had, you'd acknowledge how absurd your "math is always right" argument is, for a PPI difference that is meaningless IRL. Can you give math numbers for how beautiful one feels an image is or isn't? Can you present numbers to translate the intensity of the higher satisfaction and pleasure one immediately feels with one screen solution versus the other? Or how one screen format versus the other makes you form an immediate opinion when using it for a given sim/game title? You can't, can you? Because it's impossible. That's what I've been saying - that it is the biggest part of the equation, of the whole first-hand experience. If that still needs to be explained to you, then I don't think you'd ever comprehend it anyway. By all means... "compete on a level playing field" on your own then, I'm done. PS: Unrelated, but... I honestly feel sorry for you, reading that you don't buy things like cars based on emotion. You've missed probably the most rewarding (human) aspect of what makes the whole thing about cars truly interesting. Or at least it really is, for those who care enough about them (that and motorcycles, as is my case).
  18. No, no. That's not how you debate. Do not cherry-pick one point I present while excluding all my other points, and then call it a "straw man". *tut tut tut* A straw man is actually what you're doing there with the insistence on the "math numbers are always right, therefore I win" argument. When the fact is that the math (PPI perception) will mean jack sh!t once you see both screen formats at similar/equivalent screen area, at normal working distance, regardless of panel tech. Empiricism means everything when you're staring at an image, static or in motion. Not rationalism. What you feel, sense and perceive is what will count in the end to form your own opinion of a given screen (monitor or TV) that you experience or experiment with. Always, every single time. Not the data numbers that you won't be able to perceive. You presented the Samsung G9 as your example, did you not? I presented the LG C3 as its price equivalent, representative of the 16:9 market it seems to actually compete with. To make it simple, I presented images for everyone to get an idea of what they face as a choice, for both formats (16:9 vs 32:9), when using DCS or even for general usage. Again... just by looking at this image, think for yourself - which one would you really prefer to use DCS on? I know which one I'd choose (and actually did!), even if the one on the left were also a VA panel, as the one on the right is.
  19. The problem with that argument is that it becomes impossible not to bring the comparison with OLED to the table, because every single 32:9 is still waaay overpriced. For their price, you can get infinitely better 16:9 OLED panels that beat them on every single point, be it quality or price, with only a small niche of sim-racing users holding on to that 32:9 ultrawide format. The lower pixel density of 4K vs DQHD (aka "5K") becomes a matter of numbers, like 500hp vs 510hp in a combustion engine (meaningless), because your eyes, and everyone else's, will not complain about it even on a 16:9 48'' screen positioned at one meter (plus) of distance from you. Especially not if it's an OLED. The math is just math. It's not human; it doesn't interpret your feelings and perceptions. Again, what you feel is the truth; it's what creates your reality and your opinion. Try both, then you'll see and form yours too. I did, and for me the winner is obvious (and it's not even close).
  20. The problem is, every 49'' 32:9 screen like that Samsung G9 uses a VA panel that can't hold a candle to any and every 48'' OLED panel, be it TV or monitor, such as the LG C3 48 OLED.... which actually sells at a lower price! Overall image quality (static or in motion) is not even comparable, really. Again, for the same price or lower! If you've seen the two working, the C3's OLED panel is s-t-u-n-n-i-n-g. The G9's VA panel is just "meh" after it. We must add another fact as well: the math you implied there cannot take into account how human vision works - no one will be using a 48'' 16:9 screen at less than one meter of distance, and the PPI difference becomes completely irrelevant and forgotten at such a distance, especially with a 48'' OLED (like that relatively affordable LG C3), as crisp and sharp as anyone could wish for. So much so that all the major manufacturers have simply quit the idea of an 8K screen on anything less than 65''. Because 4K in 16:9 format on 42'' and 48'' OLED is *muuaa* (chef's kiss!). That's why all these 42'' and 48'' OLEDs have been selling like hot cakes and will keep doing so, on and on... The 49'' G9 is, along with so many other 32:9 monitors, and in my experience, one of the biggest scams in the monitor market - it's not worth anything close to its price. Especially with OLEDs overshadowing it instantly, and that's even before we take into account the screen format, and then the size and aspect. Not going to repeat myself from previous posts, but consider what I just wrote and check the following images; hopefully they illustrate my point... Look at them, and think: which one would you really prefer to use DCS on? I mean... really? So, same price (actually lower on the C3), but far more immersive (bigger screen estate) and a far superior panel (in every single aspect)? If you can, try to work and sim/game (DCS?) on both. You'll easily see what I mean when I say it's a really, reeaally easy choice...
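To put some numbers on the "PPI difference is irrelevant at distance" point, here's a quick back-of-the-envelope sketch (my own illustration; the ~60 pixels-per-degree figure is the commonly quoted approximation of 20/20 visual acuity):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_value, distance_m):
    """Approximate pixels subtended by one degree of visual angle."""
    pixel_pitch_mm = 25.4 / ppi_value                        # size of one pixel
    one_degree_mm = distance_m * 1000 * math.tan(math.radians(1))
    return one_degree_mm / pixel_pitch_mm

g9 = ppi(5120, 1440, 49)   # Samsung G9, 49'' 32:9 DQHD -> ~108.5 PPI
c3 = ppi(3840, 2160, 48)   # LG C3 48, 16:9 4K          -> ~91.8 PPI

# At one meter, both sit at or above the ~60 PPD acuity threshold,
# so the raw PPI gap is below what the eye resolves at that distance.
print(round(pixels_per_degree(g9, 1.0)), round(pixels_per_degree(c3, 1.0)))
```

In other words, the G9 does win on paper PPI, but at a one-meter-plus viewing distance both screens land past the point where the difference is perceivable, which is the argument being made here.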
  21. Share what? Common sense? What I'm saying is that there is no "standard" that can be applied to all cases. What you imply cannot be applied to each and every case. Practicality and compromise are what define what you do with your own space, for your own purpose(s). Insist on it as you may, I'll simply ignore it - been there, done that, every solution tested for myself. I believe you mentioned having a Samsung G9 49'', right? That's basically the same as two 27'' 1440p monitors side by side, in a 1000R curvature. With the kind of work I do (and so many others, surely), with that monitor I ended up having more neck strain than with a 55'' 16:9, from being constantly forced to look left to right. Actually much worse because, in addition, I'd always end up getting closer and hunched over towards it, missing the vertical space of a big 16:9 screen. That doesn't happen with a 42'' to 50'' 16:9 screen (not for me anyway). And that's for both work AND gaming/simming. With the aggravation that 32:9 ultrawide looks wrong in a flight sim - which looks perfect in 16:9, more so on a bigger screen, for the desired 1:1 scale of objects and cockpit. How do I know? ...I had one for three months, before going back to a TV (a 4K 55'' curved NU8500). It had been recommended by a "so-called friend" with the same "neck strain and posture" argument, the same one that you too seem to believe is a "standard" everyone should follow. A huge waste of time and money in my case (it was an overpriced POS, in my experience). The problem with your case is this: everyone is different, and not everyone is doing the same exact thing, or sitting at the same distance from the screen, or looking for the same end use. I have no problem "looking up" (for years now) and have no posture or neck issues, spending 10+ hours per working day multitasking in front of a (biiiiig) screen.
  22. Those "standards" are stuck in the mid 1990s, when a big brick CRT on top of a horizontal PC case, with a tiny 13'' or 15'' screen, was the norm in offices, and at home too. These days multi-monitor setups (three and four screens) are completely normal for office work, for the necessary multitasking; Alt+Tab'ing between windows no longer cuts it. I have to use at least three 24'' screens when I'm at the office; there's simply no other way to get the (huge amounts of) work done. The only way to have that same convenience and practicality at home (working remotely), while at the same time getting the benefit of immersion for gaming/simming hobbies, is with a big screen. In this scenario, it becomes a real case of "have your cake and eat it too". While this isn't yet everyone's personal use, it is becoming the norm. Simply put, the old "standards" no longer apply. Example 1: check this video at the 9:09 mark. The guy is using a 55'' and comparing it to a quad 27'' monitor setup. Example 2: check how this guy splits the screen into various windows (he's using a 48''). See what I mean? ...how the heck can one do that on a (tiny) 27'' single screen without being literally face-planted on it to read things?
  23. @SharpeXB and @kksnowbear you two need a room... it always gets weird every time you guys go at the back-and-forth comments. Personally, I don't give a hoot about the experts and the "standards". I know exactly what I like and what works for me. There's no way to convince me to go back to a simple desk monitor of "regular size". It'd be the same as telling you to swap your "regular size" monitor for a tablet screen! For over fifteen years I've been using both monitors and TVs, back to back, smaller screens (below 27'') and bigger screens (32'' and way above that), for PC use, both for gaming and office work. I always ended up favoring the latter. I personally feel that anything less than 40'' is just too small, 42'' and 43'' are very acceptable, and 48'' is the ideal size (I couldn't afford an OLED; the alternative was a 50'' 4K TV). I've had a 55'' (curved) and also tried a 65'' (flat), which I agree becomes too big (the scale in game versus FOV gets messed up for me, but YMMV). The benefits go beyond the obvious immersion for gaming/simming, provided by the much more realistic scale (close to 1:1) of objects and the cockpit, be it car or aircraft. If it's a big, quality panel, at the proper distance from you, and with head-tracking, then IMO only VR can beat it... (but that's a world of pain, at various levels LOL). How you can use that real estate for regular "non-gaming use", for daily multitasking, is the other major benefit of a bigger screen. The multitasking advantages are real, and the benefits for productivity/work are immediate. I split my big screen into various windows (usually 4, sometimes 8, it depends) which, pretty much, translates to various office screens (I need at least three monitors if I'm at the office!). If someone tells me he/she is viewing this forum thread in a browser at FULL SCREEN on a 42'' or bigger panel, then I'd have to say it... you're not getting the point of a big screen!
  24. That's the 850.00 USD/EUR/GBP (+/-) question, really.... There are benefits, but they come at a considerable cost. Whether that's too much or not depends on opinion and budget.
  25. I would probably not go back to LED either, because the image from OLED is that good (it needs to be seen to understand how good it really is). And yes, I too use a big screen (50'') and can't go back to regular (smaller) monitor sizes - it's something you get used to and then find impossible to "downgrade" from. But the thing with OLED is that degradation will always happen, sooner or later, depending on whether it gets intensive or light use, and on how (and if) mitigations are used (short compensation cycles, etc.) to work around the unavoidable burn-in. It's something that adopters of the tech just learn to accept. You really need to baby an OLED monitor in ways you won't ever need to for LED monitors, and it will always have a shorter life regardless. That's on top of a much heftier price tag. RTINGS has some pretty revealing articles about burn-in and degradation of TVs (OLED, and LED as well), and they're running an ongoing two-year test: https://www.rtings.com/tv/learn/longevity-results-after-10-months So far this is what they say about monitors, as recently as this month, with results coming soon: