Everything posted by LucShep
-
@RealDCSpilot thanks for the impressions and details. Going by your feedback, as well as the initial reviews on YouTube, this does seem like a valid VR headset for PCVR, maybe also for newcomers. It actually looks like the new alternative to the well-known (but discontinued) HP Reverb G2. From what I gather from first impressions across the web:
- Similar resolution to the HP Reverb G2, but with better FOV and better colors (OLED!)
- DisplayPort connection (so no image compression or performance hit from wireless encoding/decoding, and no batteries)
- Comfortable enough "out of the box" for most people (halo-type design, doesn't require aftermarket straps, mods or hacks)
- Somewhat acceptable price (60€ for the PC adapter + 600€ for the VR headset --- used ones can be found at half that price)
- Wide availability in the big marketplaces
-
NOOOOOOOOO You should have told her to get her own moto. LOL Congratulations on at least having had it for a while - that's a friggin unicorn right there. I still dream of owning a 1990s 2-stroke 250cc GP replica like that (RGV, NSR, TZR, KR1S, RS, etc). Unfortunately, it's near impossible these days, as they're as rare as hen's teeth and worth a lot of money now if in pristine condition. That and the old 500cc MXers - even if obviously impractical, they're the ones on my mind every time I watch some nostalgia video of 80s and 90s GP and MX heroes. Speaking of which.... here's Eddie "Steady" Lawson in the beautiful Cagiva C591 back in its day:
-
GTX 1080Ti or RTX 2080Ti. Which to choose?
LucShep replied to Holbeach's topic in PC Hardware and Related Software
Oh but there is.... I'm sorry to say you're misinterpreting those figures. Those are unusually bad temperatures for an RTX 2080 Ti Strix - it's most likely suffocating in there. IIRC, the RTX 2080 Ti Strix was/is one of the coolest (literally) models of such a "hot" Nvidia GPU, usually high 60s for max core temps with the fans only slightly ramped up. Yours is not even close to that. If you're already hitting 85°C (?!?) on the core with an unlocked framerate (quote, "GPU 85ºC at 99% continuous"), then your memory junction and hotspot may be going over 95°C (you always want to keep those under 90°C). It's cooking in there, even with a CPU+RAM combo nowhere near matched to it... You want to keep the temperature reading you're seeing on that GPU below the mid/low 70s, never above that.

A 60~70 FPS limit seems a better strategy for now, yes. And a more aggressive fan curve on the GPU as well. Keep the old Antec case somewhere for nostalgia's sake if desired, but please think about a new modern case with big airflow - the lowest-hanging fruit for you at this point. You don't want to risk degrading that nice GPU now that you've just invested in it.

PS: look at the top of that RTX 2080 Ti Strix - there is a switch for two BIOS profiles, "Quiet" (Q MODE) and "Performance" (P MODE). Perhaps it's set to the "Quiet" (Q MODE) profile. Please make sure it is set to the "Performance" (P MODE) profile (note: turn off the computer before flipping that switch). The only change is the fan curve, which ramps up higher (much cooler temps, but louder) when the GPU's "Performance" BIOS is enabled.
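If you want to keep an eye on those numbers while you test a new fan curve or frame cap, you can poll nvidia-smi from a small script. Below is a minimal sketch (my own example, nothing specific to the Strix) - it assumes the NVIDIA driver is installed, nvidia-smi is on the PATH, and a single GPU; the 75°C threshold is just the rule of thumb mentioned above.

```python
# Minimal GPU core temp / fan / power logger using nvidia-smi (assumes NVIDIA driver installed).
# One sample every 5 seconds; stop with Ctrl+C. Assumes a single GPU in the system.
import subprocess
import time

QUERY = "temperature.gpu,fan.speed,power.draw"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    temp, fan, power = [field.strip() for field in out.splitlines()[0].split(",")]
    print(f"core: {temp} C | fan: {fan} % | power: {power} W")
    if float(temp) >= 75:  # the "keep it below the mid/low 70s" rule of thumb
        print(">>> GPU core is running hotter than it should - check airflow / fan curve!")
    time.sleep(5)
```
-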
If it's only for DCS, then 4K 60Hz will be OK. But if it's "multi use", then I'd recommend spending on the higher refresh rate - it makes more of a difference in racing sims and regular desktop use than you'd think (not so much with flight sims). I decided to spend the least possible (went 4K 60Hz again) and I now kind of regret not spending more for the SONY X85J/K 50''. I'd probably go 55'' if there were no other choice, and just sit a little further away from it (one gets used to it anyway). Don't forget, a higher refresh rate doesn't mean games always have to run at high FPS. You can always lock the framerate with software (RivaTuner, etc.) and use VRR (FreeSync or G-Sync, depending on the model), or change the refresh rate if desired. Depending on model and size, it's ~9 kg for 43'', ~12 kg for 50'', ~16 kg for 55''. I now have a deeper desk, close to the wall, with the monitor on top. But before that I was using this stand for the 46'', 50'' and 55''. Does the job really well - solid, with plenty of vertical adjustment. I'd recommend something like that if your setup requires it. https://www.amazon.es/dp/B087F8S5C2?psc=1&ref=ppx_yo2ov_dt_b_product_details
-
I've used TVs as monitors for my PC gaming (flight, racing and mil sims, RTS, and all sorts of RPG and action games) for some fifteen years now. First 1080p (four of them), and 4K in the last four or five years (went through four of them as well). I've used Sony, LG, Samsung, Toshiba, Philips. While generally you'll be fine, you need to watch out for a few things.

For example, there are many 4K TVs now with a BGR subpixel layout, and this affects text on screen (blurry and/or aliased text). Unimportant if it's just for gaming and movies, but it may be annoying if the TV is to be used like a regular monitor, with lots of text as common usage. Try to get one with an RGB subpixel layout.

Next, the default color and brightness settings, which will usually be all over the place, with over-brightness and over-saturation. Any and all will need you to calibrate them for your own needs. One trick that works with most, as a starting point, is to use the "movie mode" settings for everything related to image color/contrast (usually the better image mode), but manually applied to the "game mode", which is the one you'll want to use for PC gaming (for best latency). Check for good reviews of a TV - you'll probably find they list specific calibrated settings (done with professional tools) as recommendations for you to apply (a good option).

Then there's the size, which is personal preference. I've had 32, 37, 43, 46, 50 and 55 inch panels, flat and curved. For me, 50 inch is about the perfect size (or 48 if it's an OLED), and it's what I currently use. More seems too big, and less feels somewhat small. FWIW, my eyes are about one meter (39.4 inches), give or take, away from the screen.

The problem, like you describe, is that the "affordable" ones with 120Hz+ VRR panels are hard to find at 50'' or below. I've only found two options, both from SONY, 50'' size, 120Hz panel:
SONY X85J 50'' 120Hz (2021 model)
SONY X85K 50'' 120Hz (2022 model)
I've only tested the "K" but both are practically the same thing. Pretty darn good, I almost went for it, but at the last minute decided it's too expensive (yeah, the tag price seems to include the posh "SONY tax"....).

Other than those, I found only 55'' (so, bigger) at affordable(ish) prices, such as:
Hisense U7 and E7 lines in 55'' size with high-refresh panels:
U7H 120Hz (2022 model)
U7K 144Hz (2023 model)
E7K 144Hz (2023 model)
E7N 144Hz (2024 model)
TCL C7 and C8 lines in 55'' size with high-refresh panels:
C835K 120Hz (2022 model)
C741K 144Hz (2023 model)
There may be other 4K TVs with 120/144Hz VRR panels at affordable prices in Europe that I don't know about, but those seem good options. Of course, there are the 48'' OLEDs, but those are considerably more expensive....

Lastly, I'd say a deep desk is recommended, or placing the TV on a support (wall mount or stand type) in front of your desk if it's a short one. That is, if you're not already using a sim-rig cockpit prepared for it (most these days can be made to fit big screens).
-
Even at idle, there are dozens of very light background tasks running. There always are. You'll find it's impossible to see the processor continuously at 0% usage with all the power locked at minimum. It's always being triggered a little, every second or so. Even the friggin mouse. Try, for instance, just moving your mouse around frantically, and watch the CPU usage and power increase a tiny bit in reaction.... Many things happen in the background, and that's why that oscilloscope video from Buildzoid is so interesting and important: it shows the kind of voltage oscillations and spikes that occur in different situations, which you may otherwise never see in your regular monitoring software. And it's also why syncing (limiting) all the P-cores to the same clock, to stop this 1.5V+ single/dual-core boost BS, is so important.

All that said, no need to overstress yourself with the issue. Because, at some point, there is nothing more one can do. And it comes down to pure luck whether your, his or her processor is more or less degraded (if at all), thanks to Intel. Everybody (those aware of the issue, that is!) with 13th and 14th gen CPUs above 65W is waiting for the Intel microcode to be released. All will need to install it, no ifs or buts. Supposedly the degradation will be stopped with it. But if there are signs of damage already, then no microcode will save it, and it's RMA time for the CPU. And, sadly, that's about it.
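If you want to see that idle "noise" for yourself, a few lines of Python with the psutil library make it visible - a minimal sketch, assuming psutil is installed (pip install psutil); the numbers never sit at a flat 0% for long, and they jump when you wiggle the mouse.

```python
# Sample per-core CPU usage once per second to show the constant background activity,
# even at an "idle" desktop. Assumes the psutil package is installed (pip install psutil).
import psutil

print("Per-core CPU usage (%), one sample per second - wiggle the mouse and watch it react:")
for _ in range(15):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for ~1 s per sample
    print(" ".join(f"{p:4.1f}" for p in per_core), f"| busiest core: {max(per_core):.1f}%")
```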
-
GTX 1080Ti or RTX 2080Ti. Which to choose?
LucShep replied to Holbeach's topic in PC Hardware and Related Software
+1, Ditto. -
Very good, he gets it too! Baron is also another of the recently afflicted (LOL), and his is the special "SPS" (1996-2000 homologation series, few units produced) - the pinnacle of the Desmoquattro, and the most addictive of the 916 series. If you're saving to get one (...and for its respective maintenance...) then that SPS is the "unicorn" you should aim for. That's as good as it gets:
-
Z790 Aorus Master manual https://download.gigabyte.com/FileList/Manual/mb_manual_intel700series-bios_e.pdf?v=aceb9fb3f69cc73ea6b2fddd6a6f34ed

I'm mostly used to ASUS and MSI for OC settings... I see it's very different with Gigabyte in this respect. I'm also not seeing a direct option to sync the cores(?), nor where to place a single value for all of them at once. LOL (maybe someone else can chime in?) I do see this in your board's manual, so interpreting it as I read it......... if you enter the BIOS and go to:

"Advanced mode" (TWEAKER)
...
"Advanced CPU Settings" (scroll down)
Turbo Per Core Limit Control ---- AUTO >>>> MANUAL
...
Turbo P-Core 0 Ratio Limit ---- Auto >>>> value of clock per core // for example "56" for 5.6GHz, the all P-core max clock of an i9 13900KS, as said previously
Turbo P-Core 1 Ratio Limit ---- same as above
Turbo P-Core 2 Ratio Limit ---- same as above
Turbo P-Core 3 Ratio Limit ---- same as above
Turbo P-Core 4 Ratio Limit ---- same as above
Turbo P-Core 5 Ratio Limit ---- same as above
Turbo P-Core 6 Ratio Limit ---- same as above
Turbo P-Core 7 Ratio Limit ---- same as above

Again, this is how I interpret it while skimming the BIOS manual. But I could be wrong(!), it could be other settings... I don't want to make you do something wrong! Again, anyone who knows this particular motherboard's settings, please chime in.

PS: my dislike for Gigabyte boards just increased even more! LOL (Go ASUS and MSI !!)
-
If you haven't disabled the CPU power savings (SpeedStep, Speed Shift, and all the C-States), then you're just telling the motherboard "I want those P-cores to all be limited to X GHz at maximum". It'll still downclock like always, power savings the same, everything the same, when you're not pushing the processor. So no, it won't be forced to always stay there. You're just limiting it - placing a ceiling, if you will.

And that's where the benefit comes from. Because (if you run it stock) the voltages are "tabled" according to clocks (the higher the clocks, the higher the voltage; lower clocks translate to lower voltages, etc). Taking an i9 13900KS as an example... it boosts to 6.0GHz on a single core (the "Max Turbo Frequency"). To get there, it requires more voltage and, even if it's just for one core, all of that power is offered, at once. If instead you sync all the P-cores and set them to 5.6GHz (which is its stock "out of the box" all P-core clock), it won't boost there anymore and will not reach such silly high voltages (plus, you also get lower temperatures). It's how it should have been (IMO) - like correcting it (and restricting that). And you don't actually get lower performance in gaming, or in most things really.

Not saying that this will save your CPU forever, or that it won't degrade ever again (given the latest news, it seems deeper than that). But, no doubt, you're already cutting out the worst and biggest offender, and easing things a whole lot. Check Buildzoid's video, and how that single-core boost goes over 1.5V... (insane how it happens, and that's supposedly "normal" for Intel 13th/14th gen!!).... Picking it up at 12:13 in the video, just keep watching:
-
Didn't even know this thread existed. It seems there are others afflicted with the moto-nerd virus here!! My companions on "the path":

Honda CRM 50 1993
Lots of memorable rides and adventures in this gutless POS (but a great school for anyone starting to ride).

Suzuki GSXR 600 SRAD 1997
A friggin banshee, screaming out of the corners. A product aimed at fellas in their early twenties, and I too took the bait (was 23 then). Definitely not the most pleasant ride in slow traffic (needs moooaar revsss maaan). A nice toy that ended up never seeing a race track like I once planned.

Ducati Monster 900 "Dark" 1999
Huge booming soundtrack, torquey "rounded" engine, huge fun. It didn't last more than two months - destroyed in a collision with a reckless driver doing a U-turn at the worst moment possible. Such a lovely bike. And such a shame.

Yamaha TRX 850 1998
This was kind of an oddball motorcycle, rare too, that I immediately wanted to have as soon as I could. A Japanese sportbike with Ducati styling and a trellis frame? A parallel-twin engine with a 270° crank? (so it works and sounds like a V-twin) An interesting bike that, unfortunately, too often felt like a mongrel of parts (and technically it is).

Ducati 750 SuperSport 1993
Old, traditional, agricultural, so many of the "ugly headlight looks like a brick" comments (yeah, yeah.. whatever). Huge character, a rounded engine with a nice soundtrack, and probably the sweetest ride in the whole list. Sometimes less is more, and simple is everything. If you ever owned one, you're probably nodding and smiling (yep, you get it).

Triumph Daytona 955i (T595i) 1999
A great triple-cylinder engine like no other, but an odd chassis geometry that I ended up not getting along with. Always felt curious about it, got it in a deal that seemed too good to pass up. A rare and great machine that simply took too long for me to gel with.

Ducati 748E 2000
A dream come true, and the best motorcycle I've ever had, by far (and away) - and there's a very trick 916 engine under those fairings (the original 748 went kaput). Warming it up was like a ritual, and it was the most beautiful ride, fully in control. Feeling every little thing, and that deep sound and those engine vibes - it's very visceral. Whatever good things you read about these, I assure you they're all that, and more. It's "the whole package", everything in the right dose. Shame about the maintenance (done frequently, sometimes laborious) which gets expensive, but I'd immediately own another if I could (yes, I dislike modern literbikes). One of the very few pics I have of it (oddly, only a handful through those years), with my ugly head contemplating this marvel of design and engineering.

Suzuki GSXR 1100 WP 1993
Nostalgia hit me really bad. At a time when I could no longer afford maintaining the yellow Duc, I decided to trade it for something else (my biggest regret, ever). Went for this icon of my youth (at least for me). And I can confirm it's as mad as all the stories you may have heard about these. Immaculate for its age (they don't build them like this anymore). With modern tyres and a well set-up suspension, despite the age and old tech, it is still a really good motorcycle. But..... just not for me.

Aprilia RSV 1000 Tuono Fighter 2004
Went looking again for a sporty V-twin, specifically the Aprilia RSV Mille (Gen 1, early 2000s). The problem is my age, the body aches, no disposition for a race-rep machine. The RSV Tuono is the same bike as the RSV Mille, minus the fairings, with a handlebar instead of clip-ons. It was an easy decision after a run around the block. I adore this motorcycle - the ride and feel, the strong engine, the chassis and build quality, the sound, the details and the look (even though I wouldn't call it "pretty"). Currently at 77,000 km and still going strong as ever. The guy who said "Italian motorcycles can't do the miles" doesn't have a clue.

The year for each one mentioned is the model year, not necessarily the year I acquired it. They're listed in chronological order, over 30-odd years of riding up to the present day. I've never had a brand-new, zero-mileage motorcycle, and I don't think I will at this point. I value motorcycles very much for the emotions and the memories they create and, for that, they don't need to be brand spanking new, or the latest and greatest. They just need "to be right".
-
Problem on Intel 13th & 14th Gen CPUs.
LucShep replied to Devrim's topic in PC Hardware and Related Software
Quoting myself from another thread (pardon that), but.... Just by activating "Sync All Cores" and placing a value (for all the cores) that is the same as, or as close as possible to, the "out of the box" (stock) all-core clock for the specific processor, you immediately fix the worst part of the problem. What degrades these CPUs faster is when a single core, even for a background task, asks to boost to whatever outrageous amount of "single/dual core boost". Even idling on the desktop you can see this destructive behaviour, unknowingly doing its own suicidal thing. It happens very, very frequently during any kind of use. When it does that, it is given 1.50V+ depending on the individual CPU and motherboard + BIOS (some people even mention 1.60V at times). That is one major reason why these chips are dying. Even though the chip has power limits, when only one core wants to boost, ALL of the power is offered, at once. The most insane part is that this seems to be a feature just so Intel (and AMD has it too) can get a good Cinebench score from reviewers. Lock those cores and you'll be a LOT better off. -
Just by activating "Sync All Cores" for your P-cores and placing a value (for all the cores) that is the same as, or as close as possible to, the "out of the box" (stock) all-core clock for the specific processor, you immediately fix the worst part of the problem. What degrades these CPUs faster is when a single core, even for a background task, asks to boost to whatever outrageous amount of "single/dual core boost". Even idling on the desktop you can see this destructive behaviour, unknowingly doing its own suicidal thing. It happens very, very frequently during any kind of use. When it does that, it is given 1.50V+ depending on the individual CPU and motherboard + BIOS (some people even mention 1.60V at times). That is one major reason why these chips are dying. Even though the chip has power limits, when only one core wants to boost, ALL of the power is offered, at once. The most insane part is that this seems to be a feature just so Intel (and AMD has it too) can get a good Cinebench score from reviewers. Lock those cores and you'll be a LOT better off.
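After changing that in the BIOS, it's worth sanity-checking that the single/dual-core boost really is capped now. On Windows, one rough way is the per-core "% Processor Performance" counter (the same metric Task Manager uses), which reports each core's clock as a percentage of its base clock. A hedged sketch below, shelling out to the built-in typeperf tool - note the counter name is localized on non-English Windows, and values above 100% simply mean a core is boosting above its base clock.

```python
# Rough check (Windows): sample the per-core "% Processor Performance" counter a few times.
# The values are a percentage of the CPU's base clock - e.g. ~175% on a 3.2 GHz base clock
# works out to roughly 5.6 GHz. Assumes an English-language Windows install (counter names
# are localized) and uses typeperf.exe, which ships with Windows.
import subprocess

COUNTER = r"\Processor Information(*)\% Processor Performance"
result = subprocess.run(
    ["typeperf", COUNTER, "-sc", "5"],  # 5 samples, CSV output on stdout
    capture_output=True, text=True, check=True,
)
print(result.stdout)
print("Multiply the highest percentage by your base clock to estimate the peak boost clock.")
```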
-
GTX 1080Ti or RTX 2080Ti. Which to choose?
LucShep replied to Holbeach's topic in PC Hardware and Related Software
Those old OCZ ZX PSUs were really, really good. But that was in their day, a long time ago (circa 2011).... Not so sure it'll do so well with a modern 270W GPU, but then again it was SLI-ready in its day, and it's pushing the GTX 1070 after all.

The one thing that you really need to change urgently is that PC case (holy old box, Batman!). It's absurdly claustrophobic for your new hot GPU - that won't go well. There are really good budget ATX cases now, absolutely worth the effort of transplanting the components.

PS: Take a good look at the Montech AIR 903 MAX (great ATX case, reviewed by TPU), which costs just £60.00:
https://www.scan.co.uk/products/montech-air-903-max-black-mid-tower-chassis-w-tempered-glass-3x-140mm-argb-fans-usb-ype-c-e-atx-atx
https://www.overclockers.co.uk/montech-air-903-max-midi-tower-tempered-glass-black-cas-mon-01229.html
IMHO, there's no excuse not to do it right away with the new GPU coming. More than giving that system a modern new look, it'll be a humongous jump in internal space, layout, and especially airflow (4x 140mm fans that plug into a SATA power connector on the PSU and a single PWM header on the motherboard!). -
Absolutely. It's a total <profanity> show. This stuff can be pretty aggravating, as many users have kept working from home since the pandemic and are oblivious to this crap. Not funny if your daily working tool uses one of these Intel chips and goes "kaput". I'm not so sure about DCS users - how many visit this section anyway? There may be many who are also oblivious to this. At the very least, each of us should give some kind of heads-up to friends, who can pass it on to their friends, and so on, also in other communities. So many people should be alerted. Meanwhile, another act of the freak show has already started... all of the 12th gen "K" CPUs have started to slowly go up in price (at least where I'm looking).
-
Yes, your 14900KS will need the microcode update, that's for sure. Don't trust lady luck on this. Everybody with 13th and 14th gen CPUs above 65W needs to do it once it's out (no ifs or buts). Get the friggin microcode update installed as soon as it's released. And consider RMAing your CPU if there are signs of damage already.
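If you want to check which microcode revision your system is currently running (and later confirm that the BIOS update actually changed it), Windows reports it in the registry. A minimal sketch below, assuming the usual registry location and the "Update Revision" value that is typically present on Intel systems; the value is a raw binary blob, so just compare its hex dump before and after flashing the new BIOS.

```python
# Read the CPU name and the microcode "Update Revision" blob that Windows reports for core 0.
# Compare the hex dump before and after the BIOS update to confirm the new microcode loaded.
import winreg

KEY_PATH = r"HARDWARE\DESCRIPTION\System\CentralProcessor\0"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    cpu_name, _ = winreg.QueryValueEx(key, "ProcessorNameString")
    revision, _ = winreg.QueryValueEx(key, "Update Revision")  # REG_BINARY blob

print(f"CPU: {cpu_name.strip()}")
print(f"Reported microcode 'Update Revision' (raw hex): {revision.hex()}")
```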
-
Not sure how many will raise their hand to confirm, but...... the truth is that this is really problematic, even more so than expected. And yes, there will be fatalities among DCS users' Intel 13th/14th gen CPUs at some point, if there haven't been already. If there's one techtuber you should take seriously, it's Roman "der8auer" Hartung, an extreme overclocker who is also a mechatronics engineer. He did what most should have done, which is take his time and only release an opinion once he was sure of it and it was ready to be released (contrary to most, who ran for the clicks). This is what he has to say (at 11:02 in the video): It seems it affects all 13th and 14th gen CPUs above 65W.
-
GTX 1080Ti or RTX 2080Ti. Which to choose?
LucShep replied to Holbeach's topic in PC Hardware and Related Software
@Holbeach Not much to add to what @kksnowbear just said above, but "is there going to be an improvement or not" is not even a question. The RTX 2080 Ti is at least 50% faster than a GTX 1070, and in practical terms it can double the performance in heavy scenarios. Your system will hold it back quite a bit (being outdated for it), but you'll still see enough gains not to be disappointed. The RTX 2080 Ti Strix was among the best models, and it's still a very good GPU today. Just please consider upgrading your system ASAP (discussed already in this thread), because you won't see its full capabilities until then. PS: the RTX 2080 Ti Strix is somewhat power hungry (it can eat up to 270W, and requires 2x 8-pin PCIe connectors from the PSU). I see "1kW PSU" in your signature... but what brand and model of PSU is it? -
Last week I was in a group chat where this stuff was being discussed, and a video was posted there about a degraded i9 13900KS. Absolutely worth the watch, more so if you've already watched Buildzoid's oscilloscope video (the one I'm quoting) where the crazy single-core boost voltage problems are shown. If all you watch are the glorified techtubers in the techspace, this guy will look obnoxious (ignore that). The point here is that he's actually correct, and his approach here makes complete sense. :WARNING: swearing
-
Thanks for the heads-up, I didn't even remember to check Buildzoid's channel. Watching the video right now, and there is exactly what I was saying before (crazy single-core boosts + voltages), check at about 12:13 onwards: ...see what I mean? It's insane how this happens. These 13th and 14th gen i9s (and i7s?) all seem to run at stock well over what is considered normal, reaching and surpassing 1.5V(!!). 12th gen i9s and i7s were already starting to slowly degrade if you were using over 1.4V on long-term overclocks. I just wonder if it has to do with Intel probably using their own motherboards or BIOS settings for stability and long-term reliability testing, settings not used by any other motherboard manufacturer (which could explain why they seem dumbfounded - it would be tremendously incompetent, but not outside the realm of possibility).
-
Alder Lake is an all-new design compared to Rocket Lake - completely different architectures. That's not the case with Raptor Lake. We can throw around all the objective and non-objective reasons we like until the cows come home; my point remains. Of course there are differences in the internal voltages of Raptor Lake, and they must have to do with the Raptor Cove changes. The higher clocks on the E-cores aren't helping either, for sure. The core voltages and VID tables are different, even the system agent voltages are different (probably offset so as not to cook the processor even more). I'm expecting a much bigger hit than a percent or so. But, as long as it stops the problem, it won't be a big deal. At this moment, it's understandable that AMD users are rejoicing in their choice. I am as well, with my Intel 12th gen being totally fine and exempt from these issues. But AMD is not exactly exempt from the "single-core boost high voltages" either, or from problems on their own platforms, you know...
-
Sure, and I'm not contradicting you. But if you're a potential victim in this strange (still ongoing) process, where frak knows what the exact fix will be, you might as well do something to avoid it happening, or to mitigate the issue if it has already started to happen. If I were in that scenario, that's what I'd do.
-
Oh, but they do have their share of guilt, make no mistake. Imagine buying a top-end system that, with default BIOS settings, pumps even more voltage/amps into an already very voracious, hot chip. If you're pushing that system for months on end, and we now know that these chips have some problem, then some sort of degradation accelerated by questionable BIOS settings is not unexpected. And if the degradation had already started to occur at that point....

Overclocking and tweaking has many forms. At one end you have the excessive "extreme" enthusiast level, raising clocks and voltages (even more) to insane levels with these chips. At the other end you have the mild OC tunings with a locked clock on all cores, or the overclock/undervolt approaches that are far less severe (and probably better than stock for lifespan). It may take years, but yes, I too suspect it'll only be a matter of time until it starts happening to every i9 13900K/14900K (and also some i7 13700K/14700K) running at the current stock clocks/voltages. At stock, the 13th/14th gen K processors push too high in single/dual-core boost, and that takes voltages/temps that (IMO) are better avoided. That's why I wrote this above, quoting: