
LucShep

Everything posted by LucShep

  1. Now that's a more relevant point. And one that can divide opinions (to each his/her own). I personally prefer a glossy panel to a matte panel, because the matte coating usually dilutes the color accuracy a bit and almost gives a "milky" tone to things (some may be less sensitive to it, but I really dislike it). At least that's my experience with 5+ year old models, not sure how it is with the latest "higher end" ones. Buuuut... of course, the downside of a very glossy screen is that any sort of light behind you will glare/reflect on the screen immediately. The upsides are substantial though (IMO) - dark-to-light (and vice-versa) color transitions are better, more so in a darker room. And that makes sense on an OLED. That monitor you've chosen is not "mirror glass reflection" like old ones used to be; these high-end ones now have reflection and glare reduction (it's almost a middle ground). At least you aren't hooked on the bigger screens like I am... I can almost hear myself doing all sorts of funny whining sounds when looking at these latest 48'' OLED monitors... and their price tag. *sigh* LOL Indeed, OLED is especially expensive but.... aaaaawww ... that crisp image, the zero latency, beautiful color reproduction, and dark tones that are really dark (as they should be). With a monitor with such specs in addition to OLED tech, you'll be blown away, guaranteed. If that's the size and specs you want, and you can afford it, I'd say "hey, you only live once..."
  2. If text is really that important, then none of these will really be a good choice (for that, you'd want a 5K Samsung ViewFinity S9 or Apple Studio Display, or equivalents). If you want an OLED 1440P gaming monitor, and your only real negative point towards it is the subpixel layout, then I'm afraid you'll encounter a similar issue in all of them. AFAIK, they're all either RWBG, or WRGB, or triangular RGB. So, none is "wow, great" on text. Personally, I don't think it's such a big deal. It's nothing like BGR (aka "inverted RGB", much worse - I know because I have one here) and you get used to it very quickly. And if you don't, you can also more or less circumvent the problem with apps such as these, for example:
     Better ClearType Tuner: https://github.com/bp2008/BetterClearTypeTuner
     MacType: https://www.mactype.net/
     If viewing/testing before buying isn't an option, getting it from somewhere with a somewhat "easier" return policy (Amazon?) could be a solution.
  3. It was pushed back, but it'll be released in less than 5 months (supposedly before CES 2025, which is in early January). It is also rumoured that, this time around, the RTX 5080 will come out first (maybe along with the RTX 5070), and only later is the RTX 5090 expected to be released. I'd wait for the RTX 5080... but I'm not you. If you're not using it for VR, I think the RTX 3060Ti (with limitations, I know, I had one) will at least hold on somewhat fine until then (update DLSS and use it in the game!). But if it's for VR, yeah, it's more complicated.
  4. Yep, agreed. As much as I like my big screen with headtracking, once you taste DCS in VR (in good conditions) nothing else feels as good. The meme "once you taste VR you never go back" was real in my case. DCS is one of the sims where VR really makes all the sense (you're there, in the cockpit). Buuuuut.... it is very demanding on hardware - too much, IMO. When things get too heavy or complicated, if you fly mostly singleplayer (not multiplayer) and don't use recently released modules and maps, then I'd suggest trying DCS 2.5.6 (link in my signature). It's a three-plus year older version of DCS with simpler shaders and without the new clouds system. Much lighter on resources (about 30% less GPU usage, with similar reductions in RAM and VRAM usage). Made all the difference for me (butter smooth, which it never was since 2.7, and still isn't with the latest 2.9) and, as I fit that "offline player, non-recent modules" profile, I never looked back. Sure, it sucks to miss newer and upcoming modules that do interest me, but at least in VR it all finally works smoothly (I can even crank up details and resolution) and I don't need to upgrade the PC, nor worry about another game update breaking this or affecting that - I finally just enjoy it. And if you have friends using it too, it also runs great online.
  5. You can never know for sure, it may or may not happen. That said, I notice a lot of people overstressing and getting anxiety over this; it's out of proportion, IMO. You have a device (whatever hardware part), use it and enjoy it - that's what its purpose is, after all. It has a warranty, use it if needed - if it breaks and it isn't your fault, the RMA must be honored with a replacement. Basically what I said in my previous reply, and quoting:
  6. Nice numbers there! "Lets goooooo, get to da chopaaa"
  7. That really looks like a superb monitor, but yeah.... the subpixel layout is RWBG (some color fringing around text). But, even so, it may be such a small issue that it makes no difference, if that's the monitor you're really looking for. I like the approach and format these guys use in their reviews (best in the biz, IMO) - and the text-clarity section is always important to look at: https://www.rtings.com/monitor/reviews/asus/rog-strix-oled-xg27aqdmg Hope it helps in any way.
  8. I understand, but if this business ALSO depends on hobbyists (and increasingly so), then this connector is a resounding failure. Because it needs to be 100% safe AND idiot proof. And it is neither. ....not funny dealing with this after paying $1000 plus. Perhaps I should leave that sort of opinion to someone who is (I believe) one of the few yutuberzz to be 99.9% correct in whatever PC matters... If you're short on time or patience, skip to the 13:27 mark of the video: ^^ ....I rest my case.
  9. Thread derailed somewhat (again!) but hey Not sure how accustomed you are to electronics on motorcycles, but it's very(!) often the case that they "budget" the voltage regulators and wire gauge. They crap out at about 20,000 km or so, and that's on a very large number of motorcycles, higher-end models included. And it has always been so, even though you can buy aftermarket equivalent parts that are far better (and reliably long-lasting), which should have been there right from the start. If renowned manufacturers with multimillion budgets do this on vehicles and still get away with it (knowing full well it's like that, with decades of bad experiences), then so will PC hardware manufacturers - believe it. It's the triumph of the bean counters....
  10. The "conspiracy theory" is how EVGA built their prototype is also how they adviced it to be, period, but it was more expensive to manufacturer. They decided on a much bigger cooler to circumvent the 450W+ resultant heat, and that would get in the way of construction costs, when it's already a very expensive GPU. So, silly monumental size won and "no no, no can do on connector in that place... eff off". The fact that it's a much bigger problem in the 4090s than it is in the 4080s (not even 10% versus of the RMA for that issue, according to my sources) also exhacerbates how badly it was planned and done. IMO, it should never been more than 300W in that 12VHPWR connector (total power includes PCIe slot socket) yet they went ahead anyway with a single one even on models getting close to 600W - (IMO) should have been two connectors, not just one, but I guess even that would get in the way of lucrative returns... lol
  11. If the connector pointed up/down (or had an "L" adapter like those currently sold by 3rd parties) that wouldn't have been a problem.... Being far from the "hot zone" (and usually near where the intake fans of most ATX cases are) would have prevented a pretty big part of the "heat" issues....
  12. Not entirely disagreeing, but my point is - there was absolutely nothing wrong with the PCIe 6+2 connectors. How many issues have you seen with them? (I've never seen any in the 20+ years that connector has been used.) Also, the location of the 12VHPWR on the GPU itself, for ALL of the RTX4090s on the market, is wrong. It should never have been within/below/above the PCB, but at the end (i.e., at the side) of the PCB. Because the connector, as it currently sits, is being heated by the fans' exhaust in addition to all the massive electric current already going through that single connector. If only EVGA had never pulled the "we quit" move, and gone ahead with their own idea of how an RTX4090 should be (my guess is they knew all along), I think half of the issues would never have occurred.... That's their RTX4090 FTW3 prototype, which never went into production. (they decided to quit right before the RTX4000 series launch) Picture taken from the video: youtu.be/tYzJf71WUcM
  13. If black screens occur with the undervolt then, yeah, it's a matter of optimizing the curve (a little less clock, perhaps). But if 90% is doing ok, then whatever works best for you. And yes, Cultist's PSU Tier List and especially anything HWBusters publishes is reliable (as good as the good old JonnyGuru website, IMO). So, any recommendations there can be trusted. BTW, just my opinion but, if you ever consider replacing the MSI 1000G (already a great PSU), then you might as well go the extra mile and pick a good 1250W+ PSU. Pretty expensive and somewhat overkill, yes, but with a high-end system like yours, I personally think it pays off in the longer term (it may even be reused on the next system). https://hwbusters.com/best_picks/best-atxv3-pcie5-ready-psus-picks-hardware-busters/7/ All that said, I do agree with @kksnowbear above. Also, it looks to me as well that the PSU cable has a bit of a tight bend there, right before the GPU connector. Could be it(?). As I read that you've ordered the Thermal Grizzly WireView (nice one!), it will perhaps also help alleviate that common issue. As a side note, I know it's not everybody having the problem but... if it's enough of an issue for so many, then it shows it's an issue with the concept (not with user handling). Not so sure I'm alone when I say that the 12VHPWR connector is among the worst things Nvidia has done in many years...
  14. My experience with AMD in VR is only with older models, the RX5700XT and RX6900XT, but I have to say that it was not a good one ("tear" artifacts, bad frametimes, so many darn issues, which back then immediately went away with an RTX3060Ti). Nvidia has been and still is ahead in the VR camp (IMO), noticeably so in my experience (zero issues). But if there's no VR in the plans, and it's for a single 1440P monitor (not really meant for 4K res., but they can do that too), then I'd say the 7900GRE and 4070 Super are both very good. In that case, and again like I said in my previous post, you have to ask yourself "that" question........ PS: $70 becomes pocket change if it's something you expect to last more than a year in use.
  15. Personally, I don't see the point in paying so much (550,00€ in my area) for the RX 7800 XT, which in practice gives the exact same performance as the older RX 6800 XT. So, for me, it's automatically excluded. Between the RX 7900 GRE 16GB and the RTX 4070 Super 12GB, now that's not as easy to decide. If it's not for VR, and if it's a 1440P monitor, both will work really well, even with DCS. Personally, even after a heavy bias towards ATI/AMD for many years*, my preference is Nvidia. (*and boy, how I stuck with the RX5700XT through that very dark first year until its drivers were sorted!) But these days even FSR3+ is close enough to DLSS, and the Adrenalin drivers are okay. I think you should ask yourself this question... Do you want an experience very familiar to what you've had with the RTX2070, usage-wise, but better and with a LOT more performance? If so, get the RTX 4070 Super 12GB. Do you feel like changing to something different, with no problem quickly adapting to unfamiliar things? If so, the RX 7900 GRE 16GB may be right for you.
  16. Huh, what?? Are you crazy? Would I recommend an Intel i7 12700K in 2024 for $200, brand new?
     ...which works with Z690 and Z790 motherboards, available for both DDR5 and DDR4 RAM? (i.e., you can reuse your older RAM!)
     ...which is faster most of the time than the newer AMD AM5 Ryzen 7700X, 7800X and 9700X, for considerably less money?
     ...a 170W Intel "K" 12-core (8P/16T + 4E) that also overclocks like a champ, even with a simple $40.00 dual-tower air cooler? (TR Peerless Assassin, Phantom Spirit, etc)
     ...and doesn't have any of this recent degradation BS?
     Frak yeah, of course I do!!! A million times! Best processor for the money, by far (it's not even close!). This is coming from someone who has repeatedly done top builds with the ultra-hyped 13900K, 13700K, 14700K, 5800X3D and 7800X3D, among others - the i7 12700K is an absolute gem, and the most overlooked and underrated CPU in this "yutuberzz-influencerzz" biased market. It makes the 5800X3D look absolutely atrocious in price/performance. And similar can be said for the i9 12900K (another gem, though that one is still noticeably dearer than the i7 12700K). So much so that I put my money where my mouth is, and bought one last year for myself ($170 from the used market, and a Z690 TUF D4 for $140). So good, in fact, that I don't really see any point in upgrading yet.
  17. Looks good, but............ there are three items I'd personally recommend changing. One because it's better to avoid it (and worth spending just a little more), and two others where I just think there are as good or better alternatives for a similar or lower price.
     1. GPU (graphics card). For the GPU you've chosen the RTX 4080 Super (very good choice). But they're not all the same; the quality of internal components and cooling varies quite a bit between models. The MSI Ventus 3X OC is one of the models to avoid (yes, the price looked good, I know). While it's not "ooh, it's really bad", there's much better for just a little more money. For example, three specific models that I've used and found worth recommending are the Gigabyte Aorus Master, Aero OC, and Gaming OC:
     Gigabyte GAMING OC GeForce RTX 4080 SUPER 16 GB (black color only, $1050.00): https://pcpartpicker.com/product/mXNYcf/gigabyte-gaming-oc-geforce-rtx-4080-super-16-gb-video-card-gv-n408sgaming-oc-16gd
     Gigabyte AERO OC GeForce RTX 4080 SUPER 16 GB (white/silver color only, $1100.00): https://pcpartpicker.com/product/94hv6h/gigabyte-aero-oc-geforce-rtx-4080-super-16-gb-video-card-gv-n408saero-oc-16gd
     Gigabyte AORUS MASTER GeForce RTX 4080 SUPER 16 GB (black/silver color only, $1200.00): https://pcpartpicker.com/product/8ppQzy/gigabyte-aorus-master-geforce-rtx-4080-super-16-gb-video-card-gv-n408saorus-m-16gd
     2. Storage (NVMe). You picked the Samsung 980 Pro 2TB as the main drive, and the WD SN770 2TB as a complementary storage drive. Nothing wrong with that (I think I even recommended it?). But, since my last reply, I've built a system with a newer Gen4 NVMe (with DRAM) from Solidigm, the P44 Pro, and it impressed me a lot. The performance is absolutely top notch and the temperatures are lower (better!) than what I've seen from top competitors, a huge plus. Prices vary a lot from place to place, but there are really good promos on Amazon. And that's why I'd recommend you pick two units of this instead:
     SOLIDIGM P44 PRO 2TB (from $139.00): https://pcpartpicker.com/product/X8nypg/solidigm-p44-pro-2-tb-m2-2280-pcie-40-x4-nvme-solid-state-drive-ssdpfkkw020x7x1
     3. PC (ATX) Case. The Fractal Torrent is a nice case, but a tad overrated and (IMO) a bit too expensive. I also think it's too big for that type of system. There are great alternatives at far more affordable prices, available in black or white, some even with ARGB fans+controller (also available without them, if preferred). For example, check the two below, with linked youtube reviews to get an idea of their details, and see if they interest you.
     LIAN LI LANCOOL 216 (ARGB fan controller included) -> M.U. REVIEW
     - Black ($120.00): https://pcpartpicker.com/product/PG88TW/lian-li-lancool-216-rgb-wcontroller-atx-mid-tower-case-lancool-216rc-x
     - White ($125.00): https://pcpartpicker.com/product/JG88TW/lian-li-lancool-216-rgb-wcontroller-atx-mid-tower-case-lancool-216rc-w
     MONTECH AIR 903 MAX (ARGB fan controller included) -> M.U. REVIEW
     - Black ($70.00): https://pcpartpicker.com/product/2MwmP6/montech-air-903-max-atx-mid-tower-case-air-903-max-b
     - White ($80.00): https://pcpartpicker.com/product/bQGhP6/montech-air-903-max-atx-mid-tower-case-air-903-max-w
  18. @nephilimborn Looks like a power delivery issue, either from the PSU or the GPU cable connector, or the cable itself (?). I can't comment on the CableMod 12VHPWR connectors as I don't have experience with those, though I've repeatedly seen that the latest revised models are much better quality. I see people mentioning the Corsair GPU Power Bridge and the Thermal Grizzly WireView GPU as well; they may be good alternatives. Again, I have no experience with those. The problems still reported with burning connectors are, I think, increasingly related to the design of the connector on the RTX4090 itself. (IMO, it should have been two connectors, not one!) Meanwhile, try undervolting the RTX4090; you get at least a 20% reduction in power consumption for only a ~2% reduction in performance (a very good trade-off). There are various tutorials on youtube. Among plenty of others, these two for example (there's also a rough command-line sketch below):
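     Separately, if you just want a quick, coarse way to check and cap the board power from the command line (not the same thing as the proper voltage/frequency curve undervolt those tutorials cover), a minimal sketch like this works, assuming an NVIDIA driver install and admin rights; the 360W figure is just an example value (~20% under a 450W stock limit):

       # Coarse alternative sketch: query and cap GPU board power via nvidia-smi.
       # Not a curve undervolt - just a power limit. Run with administrator/root rights.
       import subprocess

       # Show the current power draw and the default/enforced power limits
       subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

       # Cap the board power limit to 360 W (example value, ~20% under a 450 W stock limit)
       subprocess.run(["nvidia-smi", "-pl", "360"], check=True)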
  19. Buildzoid! He may sound like a rambling nerd, but his videos often show very interesting facts from his experiments (e.g., the latest oscilloscope videos on Intel K chips). Once you sort those issues, and if not done already, consider stopping the single/dual core boost from happening, because of its 1.5v+ voltage spikes (one of the main culprits of the current 13th/14th gen degradation issues). The easiest way to do this is by syncing (locking) your P-Cores all at the same max possible clock (close to what the "all P-Core max clock" is out of the box). Even better if the CPU core voltage (Vcore) is limited to lower values, at around 1.35v (or below). You can set a voltage limit in the BIOS setting "IA VR Voltage Limit", with a value between 1350 and 1400 mV. Or you can manually adjust the CPU core voltage (Vcore), either by "fixed" or by "offset" voltage adjustment (whichever way you prefer). One way to look at this is like the undervolt that so many also do on high-end GPUs. It prolongs the part's life by lowering the voltage and temps. In this particular case with Intel 13th/14th gen, it's (IMO) a very good procedure to drastically mitigate the possible degradation, and it doesn't really affect general performance.
  20. The CPU degradation can appear in somewhat different ways from one machine to another. It can manifest as Windows closing applications in the background by itself, or BSODs (blue screen crashes), or system lock-ups (freezes), black screens, general instability, etc. "Black screen, fans at 100 percent, hard reset or power cycle is the only option to restart" is one of the reported symptoms. Though it could also be many unrelated things (corrupted Windows and/or application files or drivers, a faulty PSU, or motherboard, or GPU, or drive, etc). That's why troubleshooting with stress-testing is important, to try to determine what's causing it. I maintain my previous suggestion.
  21. I was reading the OP and thinking the same, that it could be early signs of the now well-known 13th/14th gen CPU degradation.... @nephilimborn have you updated your motherboard BIOS to the latest version, for the new Intel microcode? (if you haven't, you should) Not sure if you're willing to try something with the CPU settings in your motherboard BIOS, just as a test.... If you are, then try to sync (i.e., lock) all your P-Cores, and use a manual clock value that is the same for all P-Cores (5.3 GHz is the stock all-P-Core maximum clock for the i7 13700K). Repeat the testing; if it still does the same thing, go back to the BIOS and reduce that all-P-Core clock by 100 MHz (so, now to 5.2 GHz) and try again. ...and repeat, and so on (the step-down logic is sketched below).... If at some point the problem stops happening (by lowering the P-Core clock), then you may have a CPU that has started to degrade, no longer able to reach the stock ultra-high "boost" clocks at stock voltages. (note: do not increase the CPU core voltage to reach the stock boost clocks, as that just makes things worse!) Also, you mentioned not having disabled C-States yet, which is good, because you should never disable those. What you can do, if you want, is reduce the C-State limit (for example, to "C3" instead of "Auto", which can go up to the deepest "C10" savings state) - Intel C-States explained.
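     Just to make that step-down procedure explicit, here's a tiny illustrative sketch of the logic only (the actual clock changes happen in the BIOS and the stress test is run manually; the 5.3 GHz start, 100 MHz step and 4.8 GHz floor are example values assumed for an i7 13700K):

       # Illustrative only: the sequence of all-P-Core clocks to try, stepping down
       # 100 MHz at a time from the stock all-P-Core maximum. Set each value in the
       # BIOS, stress test, and stop when the crashes/black screens disappear.
       def clocks_to_try(start_mhz=5300, step_mhz=100, floor_mhz=4800):
           clock = start_mhz
           while clock >= floor_mhz:
               yield clock
               clock -= step_mhz

       for mhz in clocks_to_try():
           print(f"Set all P-Cores to {mhz} MHz in BIOS, then run the stress test again.")
       # If stability only returns well below the stock clock, the CPU may have started to degrade.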
  22. Performance is not a problem with the new microcode. Nor temperatures. It's the voltages. The frigging 1.5v+ voltage spikes b!tch slapping the poor processor so hard, to the point of slow, unpredictable degradation. They're still there. That's the price to pay for overambitious boosts on 13th/14th gen => stupid stock high voltages and spikes that slowly eat away your processor's life. Even the reliable 12th gen i9 and i7, which aren't affected at all by these degradation issues, will also slowly degrade if you force stupid high voltages on them. People with 13th and 14th gen 65W+ CPUs, even after installing the new microcode, have two choices:
     Keep things stock on clocks, boosts and voltages, and enjoy it as it is. The new microcode at least ensures a slower degradation than before. But it will very likely still occur, sooner or later (in six months? five years? ...who knows?). If it indeed goes "kaput", well... let's hope it's within the RMA period and a replacement is accepted.
     or
     Lock the P-Cores (i.e., limit them all to the same max possible clock), so that there is no single/dual-core boost. Reduce the CPU core voltage to about 1.35v (or lower). Worst case scenario, you might have to set the all-P-Core clock a hundred MHz lower than stock. Which won't make any difference in practice (unless it's competitive benchmarking?) and the degradation is drastically mitigated.
     ....as should've been done immediately when this problem appeared (IMO).
  23. The issues with single/dual core voltage spikes and ~1.55v Vcore can't be addressed, unless it's done by the user. They can't bring the voltages down to "normal" levels, because that wouldn't allow boosts to go as high as marketed (as in the spec sheets). Just like they can't disable the single/dual core boost (which would immediately resolve most of the problem!). Because that would mean changing a product "after the fact", when it was already presented, marketed, and sold as such. The single/dual core boost and high "guaranteed" clocks are features that they marketed and spec'ed for the product. It would be admitting a grave mistake, almost like "it's a scam product", and admitting defeat (lawsuits and indemnifications would immediately go through the roof!). Unfortunately, the possible solutions/mitigations are something that you'll have to do on your own, for your own. "Dew it!!"
  24. That's a tough one to know for sure. It'll depend on how the motherboard sensors are being read by each different monitoring software. IMO, HWiNFO is usually more accurate than HWMonitor but, again, it also depends on the motherboard sensors, not just the software. For some reason that I haven't understood, one that usually shows a fairly accurate reading of the Vcore is CPU-Z (even though it's not really monitoring software). You might have noticed (as said before in this thread, videos showing it and all) that the single/dual core voltage spikes (always going 1.5v+) have not been resolved by the new Intel microcode. The main problem still exists. So, if it's fully back to stock BIOS values, then it'll be reaching ~1.55v when it boosts, regardless of what you see in any monitoring software. And that will still keep causing degradation, albeit at a slower pace than with the previous microcode.
  25. Nice one. Yeah, so long as you don't mind the 60Hz limitation, the LG UR and UT 7000/8000 are among the affordable and decent 4K TVs that can be used for such a purpose (flight sims). I'm pretty sure you immediately understood why the picture size (both vertical and horizontal) provided by a 16:9 big screen makes all the sense for flight-simming purposes. It's just the way everything becomes much more "real", scale-wise. There's no way one goes back to a smaller screen after the experience (the very old 22'' I had around really looked like a tablet in comparison!). At the right (somewhat close) distance and with some headtracking, it's a great alternative to VR (if that's not an option) and far, far easier to run.