

Posted
15 minutes ago, Hiob said:

As long as they keep polishing 10 year old turds to make them appear shiny again or try to muscle their way through by means of sheer voltage

😄 😄 😄 Holy sh*t, I actually LOL'd, and I've not heard a more eloquent summary of this mess since it started.

10 year old turds....indeed LOL


Free professional advice: Do not rely upon any advice concerning computers from anyone who uses the terms "beast" or "rocking" to refer to computer hardware.  Just...don't.  You've been warned.

While we're at it, people should stop using the term "uplift" to convey "increase".  This is a technical endeavor, we're not in church or at the movies - and it's science, not drama.

Posted

😈 That's what it is: the Boeing of the semiconductor industry. You can only crash so many 737s before you have to realise that a REAL game changer must be designed from scratch.


"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"

Posted (edited)
1 hour ago, Hiob said:

I‘d put it this way: Intel needs to come out with something really new, really convincing to win me over again. As long as they keep polishing 10 year old turds to make them appear shiny again or try to muscle their way through by means of sheer voltage - I stay clear.

That said, my current system (luckily) won‘t be due for an upgrade for at least another two years. Plenty of time for Intel to gain pace (or AMD to screw up).😁

I'll give AMD credit for putting effort into something that has "gaming win" plastered all over it: the 3D V-Cache of the AM4 5X00X3D and AM5 7XX0X3D chips. 
A really phenomenal idea to do it on chips with a single CCD 👍 (the only really good ones for gaming, as the latency issues between CCDs are inherent to the design).

I've had countless Intel and AMD processors over the last 25 years (overclocked most of them, I think) and can say for certain there were times when Intel was really dragging their feet (again, credit to AMD for spicing things up and making them work). And Intel surely deserves the current public backlash over this degradation crap.

But then to call modern Intel processors "10 year old turds" is right on the verge of blatant ignorance. 😐
 
I had a 10700K before the 12700K I currently own. That older one was no slouch, but the jump in performance with Alder Lake was huge: depending on the application, ~60% in MT and ~40% in ST. Even more once tuned and with a mild overclock (another ~7% performance, for free). That launched in late 2021 and was, in fact, an all-new design.
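For what it's worth, those two gains compound rather than add. A toy calculation using the rough percentages above (these are the poster's estimates, not benchmarks):

```python
# Toy arithmetic: combining the rough generational gain with the
# tuning/overclock gain. Percentages are the estimates quoted above.
gen_gain_mt = 1.60   # ~60% multi-threaded increase (10700K -> 12700K)
tuning_gain = 1.07   # ~7% extra from tuning and a mild overclock

total_mt = gen_gain_mt * tuning_gain
print(f"combined MT gain: ~{(total_mt - 1) * 100:.0f}%")  # ~71%, not 67%
```

The point being that percentage speedups multiply, so the tuned chip ends up ~71% ahead, a bit more than naively adding 60 + 7.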

People can trash talk all they want about E-Cores being useless for gaming but, for me, they're a godsend. 🙂 
I can put every little extra app running in the background on the E-Cores (Discord, HWiNFO, peripherals software, etc., even the AV!) while I'm gaming with all 8 P-Cores unaffected by those programs, and games set on them (Process Lasso FTW!). 😉 Smooth as butter, stutter free, ZERO (AM)Dip in all my games.
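A minimal sketch of the idea behind that affinity setup. Process Lasso automates this on Windows with persistent per-app rules; the stdlib call below, `os.sched_setaffinity`, is the Linux-only equivalent, and the "background" core set is a runtime placeholder, not a real 12700K P/E-core map:

```python
# Sketch of pinning a process to a chosen set of logical CPUs, which is
# essentially what Process Lasso automates on Windows. This uses the
# Linux-only stdlib call os.sched_setaffinity; the "background" core set
# is a placeholder picked at runtime, not a real 12700K P/E-core layout.
import os

# Pick one CPU we are allowed to run on, as a stand-in "E-core":
BACKGROUND_CORES = {min(os.sched_getaffinity(0))}

def pin(pid: int, cpus: set[int]) -> set[int]:
    """Restrict a process to the given logical CPUs; return the new mask."""
    os.sched_setaffinity(pid, cpus)
    return os.sched_getaffinity(pid)

# Demo on the current process (pid 0 means "this process"):
print(pin(0, BACKGROUND_CORES))
```

In practice you'd look up the PIDs of Discord, HWiNFO, etc., and pin them to the E-core set once at startup; the games keep the P-cores to themselves.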

Edited by LucShep
spelling(?)

CGTC - Caucasus retexture  |  A-10A cockpit retexture  |  Shadows Reduced Impact  |  DCS 2.5.6 - a lighter alternative 


Win10 Pro x64  |  Intel i7 12700K (OC@ 5.1/5.0p + 4.0e)  |  64GB DDR4 (OC@ 3700 CL17 Crucial Ballistix)  |  RTX 3090 24GB EVGA FTW3 Ultra  |  2TB NVMe (MP600 Pro XT) + 500GB SSD (WD Blue) + 3TB HDD (Toshiba P300) + 1TB HDD (WD Blue)  |  Corsair RMX 850W  |  Asus Z690 TUF+ D4  |  TR PA120SE  |  Fractal Meshify-C  |  UAD Volt1 + Sennheiser HD-599SE  |  7x USB 3.0 Hub |  50'' 4K Philips PUS7608 UHD TV + Head Tracking  |  HP Reverb G1 Pro (VR)  |  TM Warthog + Logitech X56 

 

Posted
3 hours ago, LucShep said:

Dude, sometimes I read these threads about such issues and just feel like jumping to the other side of the screen: "LET ME MESS WITH THAT COMPUTER!" 😂🤣 LOL

:lol:

3 hours ago, LucShep said:

I know it's your PC and your call but... an i7 14700KF at 4.3Ghz is not "ok" at all, IMO. 😕 

I completely understand what you mean, but it's not me, it's Intel. :)))

3 hours ago, LucShep said:

I'm pretty certain at this point that you can not adjust the CPU Core Ratio (B760 mobo....).

I can share all the screenshots with you. You know the MSI BIOS in its advanced mode: there's a huge OC button on the left side, you can set P and E core ratios separately, etc., and I can see the results in HWMonitor, CPU-Z or even Task Manager/Performance.

3 hours ago, LucShep said:

But, IIRC, you've disabled the Intel Turbo Boost and Boost Technology rows on the BIOS, right?

No, not right now. That was one of the temporary solutions. If I do that --you know this better than me-- there's no need for those ratio or voltage tweaks. The PC works at base frequency, 3.6GHz, I see around 1.01V, and everything is nice. Even in DCS I get good performance. While I was setting/struggling with those adjustments I mentioned in the previous post, I enabled the boost options again.

3 hours ago, LucShep said:

The black screen and freezes are due to the inability of the motherboard to feed the CPU (when boosting) - the B760 can't handle that i7 14700K when going full tilt.
That's why it now works "ok" with boosts disabled (less effort for the motherboard VRM), at the cost of a very(!) sedated 14700KF.

You're damn right man. I got it. 😕


Intel i7-14700@5.6GHz | MSI RTX4080 SuperSuprimX | Corsair V. 64GB@6400MHz. | Samsung 1TB 990 PRO SSD (Win10Homex64)
Samsung G5 32" + Samsung 18" + 2x8"TFT Displays | TM Warthog Stick w/AVA Base | VPC MongoosT-50CM3 Throttle | TM MFD Cougars | Logitech G13, G230, G510, PZ55 & Farming Sim Panel | TIR5 & M.Quest3 VR
>>MY MODS<< | Discord: Devrim#1068

Posted
17 minutes ago, LucShep said:


But then to call modern Intel processors "10 year old turds" is right on the verge of blatant ignorance. 😐

 

Well, obviously I was being hyperbolic to make a point and just threw in 10 years. It could as easily be 7 or 8.

But it has a core of truth. Intel never made a big jump in anything; they just took little incremental baby steps. Their IPC never improved in a meaningful way, and the e-core shenanigans (even if somewhat matured by now) were just a way to increase their wafer yield. Now they have e-cores and their CPUs are still huge energy hogs. Every new „high performance“ CPU sat well beyond the peak of the energy/performance curve - in other words, bought its performance at huge energy cost…. and so on and so forth. The micro-architecture at its core IS in fact in many ways unchanged….. not to mention how long they were stuck on 14 and 10 nm.

That all mainly hurt their high-end CPUs; the mid-tier is mostly ok, but the high end is where all the spotlight is. AMD with Ryzen, and particularly since Ryzen 3000, was the force that really pushed the boundaries and made Intel move. They had their own issues of course - lower clock speeds, for example - but they improved and found ways around that.

Anyway. I was just talking about how I feel towards Intel (currently). May not resonate with everybody. That‘s fine.
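The energy/performance-curve point has a simple first-order form: dynamic CMOS power scales roughly as P ≈ C·V²·f, so buying clock speed with voltage gets expensive fast. A toy illustration (the capacitance, voltages and clocks below are invented, not real Intel figures):

```python
# Toy illustration of "muscling through by sheer voltage": dynamic CMOS
# power scales roughly as P = C * V^2 * f. All numbers are invented.
def dynamic_power(c_farads: float, volts: float, hertz: float) -> float:
    return c_farads * volts ** 2 * hertz

base = dynamic_power(1e-9, 1.20, 5.0e9)  # 5.0 GHz at 1.20 V (made up)
push = dynamic_power(1e-9, 1.50, 5.7e9)  # 5.7 GHz at 1.50 V (made up)

print(f"clock: +{(5.7 / 5.0 - 1) * 100:.0f}%  power: +{(push / base - 1) * 100:.0f}%")
# clock: +14%  power: +78%
```

A ~14% clock bump bought with a 0.3 V bump costs ~78% more power in this toy model, which is the "well beyond the peak of the curve" complaint in a nutshell.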


Posted
1 minute ago, Hiob said:

Intel never made a big jump in anything


except the jump to the 386 chips, and the one to the first generation Core i7 … but I agree that nowadays we only get incremental jumps. 

 

For work: iMac mid-2010 of 27" - Core i7 870 - 6 GB DDR3 1333 MHz - ATI HD5670 - SSD 256 GB - HDD 2 TB - macOS High Sierra

For Gaming: 34" Monitor - Ryzen 3600 - 32 GB DDR4 2400 - nVidia RTX2080 - SSD 1.25 TB - HDD 10 TB - Win10 Pro - TM HOTAS Cougar

Mobile: iPad Pro 12.9" of 256 GB

Posted (edited)
9 minutes ago, Rudel_chw said:


except the jump to the 386 chips, and the one to the first generation Core i7 … but I agree that nowadays we only get incremental jumps. 

Well, that is not really the time frame we‘re talking about, is it?😅

Back in the 386DX33 (my first x86 CPU), 486 and early Pentium days, Intel actually was at the top of their game. Even Celerons, Core 2 Duos and the 4-digit „i“ CPUs were good. They started stagnating (relatively) with the 5-digit „i“s.

In fact, the sheer strength they had in those days made them complacent. By the time ARM CPUs rose and AMD finally got their stuff together, they had something like 90% market share. They felt invulnerable.

Edited by Hiob

Posted
3 minutes ago, Hiob said:

Well that is not really the time frame we‘re talking, is it?


that’s the problem with terms like never, or always … I wasn’t aware of the timeframe for this thread, I just replied because the expression struck me 🙂 


Posted

My last (private) Intel was an i7 4790K. It lasted me 5-6 years until it was replaced by a Ryzen 3900X.


Posted (edited)
14 minutes ago, Rudel_chw said:


that’s the problem with terms like never, or always, … I wasn’t aware of the timeframe for this thread, just replied because the expression struck me 🙂 

Well, I literally started the whole rant by specifying a ~10 year (give or take) time frame. But let’s stop nitpicking. I think my message is clear, and it's just my personal opinion. Nothing written in blood. 🤗😊

And just in case: I‘m in no way, shape or form giving anybody who runs an Intel a hard time. In fact, when I retired the 4790, it was basically a coin toss between Intel and AMD. It could easily have gone the other way and I'd probably have stayed on the Intel road until today. It's just that, following the events since then, I‘m quite happy with my choice. And right NOW, I certainly won’t switch back. We‘ll talk again in two years or so…..😅

Edited by Hiob

Posted (edited)
4 hours ago, Hiob said:

Well, obviously I was being hyperbolic to make a point and just threw in 10 years. It could as easily be 7 or 8.

(...)

That's because you haven't had direct experience with Intel for ages (4790K? 😄 ...that launched in Q2 2014, over a decade ago!).

Maybe if you'd built and tested systems recently with both Intel and AMD, side by side, you'd see the differences more clearly, and where you're talking nonsense.

Their (Intel's) IPC didn't improve? 😐
The jump in IPC was friggin monstrous when Alder Lake came out in late 2021 (not 10, not 7 or 8 years ago), immediately rendering everything (including AMD AM4) obsolete. 
The 5800X3D chip was really AMD's saving grace. 

After messing with both Intel and AMD, if it weren't for this degradation BS, I'd honestly consider the Intel "K" chips from 12th, 13th and 14th gen to be the better chips today.
Better than even the very latest AMD equivalents (yes, including the just-released and, it seems, underwhelming 9000 series).

Far better memory compatibility (don't even get me started on memory speed with 4 sticks of RAM on AM5), far better behaviour across different motherboard manufacturers' models (much easier to predict when building a new system), far less fuss, headaches (AM4's USB issues) or "fafo" once you've dealt with the outrageous (yes, they all are) stock BIOS settings.
All the E-Cores that are ignorantly criticized are easily dealt with via Process Lasso, and then become a benefit for gaming (as I described above).
The list goes on once you start handling both old and newer apps and games.

Power hogs (yes, the latest ones are), space heaters (yes, the latest ones are), call them whatever you want... performance and stability wise, they've been the better "total package", IMHO.
Until this degradation crap started.

Yes, the AM4 5800X3D and AM5 7800X3D are better gaming chips (though not always the best chip there), but they're a one-trick pony that is nothing special for anything other than gaming.  And, sure, I expect the 9800X3D to be the next success. But the rest of this 9000 series? :dunno: Honestly, it doesn't really look all that good to me....  

AMD is "winning" huge market share this very day, not because they're really better, but because their competitor did a ridiculous own goal... 🤦‍♂️ (nice one Intel!!).

Edited by LucShep

Posted

FYI, Intel offered to replace my CPU when I opened a ticket; they didn't bother me with too much troubleshooting. I noticed some weird symptoms a month ago and opened a ticket. I can't say for sure it's the CPU, but it's known to be defective and I can't be bothered to diagnose it definitively. 

Posted
13 hours ago, LucShep said:

Yes, the AM4 5800X3D and AM5 7800X3D are better gaming chips (though not always the best chip there), but they're a one-trick pony that is nothing special for anything other than gaming.

Nothing wrong with that, when most gamers do little heavy processing other than gaming. Being great at what really matters, and perfectly fine at everything else, for a great price, makes a good product.

Posted (edited)
15 hours ago, LucShep said:

That's because you haven't had direct experience with Intel for ages (4790K? 😄 ...that launched in Q2 2014, over a decade ago!).

(...)

 

Of course, a lot of what you say is true.  I just got a laugh from the 'polishing a turd' remark; didn't intend to step on toes.  I think I "got the joke" that @Hiob intended, maybe others not so much.

I also do both Intel and AMD machines - but you know, here's a sort of telling (albeit anecdotal) metric:  I haven't done many new Intel machines at all, mostly AMD.  Of course, I do a lot of used stuff, and less new stuff in general.  But still, it seems to reflect what's actually taken place:  AMD is doing a lot better of late than Intel.  I'd probably say that, with the possible exception of the 12000s, Intel has been sort of slacking since the 10000s.

It seems you've had problems getting memory to work in AM5 builds; I don't share that experience.  I have used fully-populated slots on all the AM5 builds I've done so far, and all work exactly as expected.   And that's with 6000 MT/s modules at a decent CAS below 40, without having to pay so much for CL30.  It works, predictably and reliably.  People have problems, IMHO, when they push harder than what AMD has already specified.  Keep it reasonable = no problems, at least for me.

As far as the e-cores go, I personally don't endorse the idea of adding (yet another) piece of software to get hardware to behave.  I don't have to do that on any of the AM5s I've built, either.  I think the e-cores concept is more akin to a marketing gimmick than brilliant engineering design.  More cache works way better, and I think that speaks for itself - and loudly so.

And that's the thing:  With the extra cache/X3D parts, I believe AMD has actually 'changed the game' (pun completely intended).  Where the Intel lineup...well, not so much.  Higher clocks, more heat, more power.  (Hey... that sounds sort of familiar for some reason...)

If we're being honest - and I do have some training and background in this area - the whole power and heat thing is basically why CPU makers started making chips with more cores to begin with: faster and faster clocks were rapidly becoming untenable (back in the day).  You can't dissipate heat as well while the package keeps getting smaller... (ever wonder why the IHS came about?  Or why the TIM was such a big deal back in the Kaby Lake/Coffee Lake units?)

It honestly seems like Intel has been trying to just go back to faster clocks, more power, more heat...which, again, simply becomes untenable after a while.  Hell, one could argue that this is more like what was being done far more than 10 years ago.

And where does it lead?  Well, to the degradation issues, that's where.  This isn't a new problem.  Overclocking CPUs has generally always involved more voltage and thus more heat.  Get too crazy and eventually the chip can't run stable without still more voltage.  Then even that isn't enough... and eventually it won't run stable (if at all) without lowering the speed significantly.  Electromigration is real, and it's not new: >LINK<

A quote: "As the structure size in electronics such as integrated circuits (ICs) decreases, the practical significance of this effect increases."

Sound familiar?  Wow.

(Yes, Intel also had some 'corrosion' issues, but even they say that's a separate matter, and it certainly doesn't explain what's happening now.)
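For the curious, electromigration has a classic quantitative model, Black's equation, MTTF = A·J⁻ⁿ·exp(Ea/(kT)): median lifetime falls with current density and exponentially with temperature. A sketch where every constant and operating point is a made-up placeholder, not measured data for any real CPU:

```python
# Illustrative use of Black's equation for electromigration lifetime:
#   MTTF = A * J**-n * exp(Ea / (k * T))
# Every constant below is a made-up placeholder, not real silicon data.
import math

K_BOLTZMANN = 8.617e-5  # Boltzmann constant in eV/K

def mttf(j: float, t_kelvin: float, a: float = 1.0, n: float = 2.0,
         ea: float = 0.7) -> float:
    """Relative median time-to-failure per Black's equation."""
    return a * j ** -n * math.exp(ea / (K_BOLTZMANN * t_kelvin))

nominal = mttf(j=1.0, t_kelvin=358)  # ~85 C at nominal current density
pushed = mttf(j=1.2, t_kelvin=373)   # +20% current density, ~100 C

print(f"relative lifetime when pushed: {pushed / nominal:.2f}x")
```

With these toy numbers, the pushed part keeps only about a quarter of the nominal lifetime, which is the "more voltage, more heat, faster wear" spiral in equation form.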

So I don't think it's really fair to say AMD is only succeeding lately because Intel has sh*t the bed.   Intel has indeed screwed up, no doubt.  But at the same time (recent 9000 flops laid aside; a different discussion), AMD has done something inside the CPU that actually made a big difference, and it wasn't the "brute force" approach that's now everybody's buzzword for what Intel has done.

It's not just Intel losing ground...it's also AMD gaining ground.

I should point out here, for context, that I've been a lifelong Intel user since I built my first PC c.1990; it was a big deal when I got a 387 chip (yes, 387). But when the 10th gen Intel CPUs came out, I was done. AMD was already clearly pulling ahead at that point, and I had seen all I needed to see.  Next upgrade, I built a 5800X3D system and never looked back.

Anyway, probably a bit off topic now, for which I apologize....but I just wanted to say I simply 'got the joke' earlier.

Edited by kksnowbear

Posted (edited)

 

5 hours ago, kksnowbear said:

And that's the thing:  With the extra cache/X3D parts, I believe AMD has actually 'changed the game' (pun completely intended).  Where the Intel lineup...well, not so much.  Higher clocks, more heat, more power.  (Hey... that sounds sort of familiar for some reason...)
(...)

It honestly seems like Intel has been trying to just go back to faster clocks, more power, more heat...which, again, simply becomes untenable after a while.  Hell, one could argue that this is more like what was being done far more than 10 years ago.
(...)

Anyway, probably a bit off topic now, for which I apologize....but I just wanted to say I simply 'got the joke' earlier.


Ok, I see now. It was about the backwards philosophy, the overall approach, and not really focused on each generation's architectural innovations per se. 
My apologies to @Hiob then, as I didn't "get the joke".

I agree with all of that.  As I said, that's where I give all the credit to AMD for the 3D V-Cache on the X3D chips. 
It was, IMO, the biggest innovation I've seen in a long, long time. It makes sense and it absolutely works for its purpose (gaming).

But then, as always, it's AMD with the missed opportunities - which is why I think they never were, aren't, and probably never will be as proficient as Intel (or Nvidia, for that matter).
By not immediately extending their X3D lineup to their lower-end products (sub-$220) in AM5, after its earlier success in AM4's 5800X3D, they missed a huge opportunity.
For example, the 6-core Ryzens (so, say, 7500X3D and 7600X3D, 9500X3D and 9600X3D, etc.), as that's the segment the bulk of the gaming community focuses on (so, much bigger sales numbers).  This is where the X3D approach would make all the sense to be, too.
 

5 hours ago, kksnowbear said:

As far as the e-cores go, I personally don't endorse the idea of adding (yet another) piece of software to get hardware to behave.  I don't have to do that on any of the AM5s I've built, either.  I think the e-cores concept is more akin to a marketing gimmick than brilliant engineering design.  More cache works way better, and I think that speaks for itself - and loudly so.

You don't "have to". But you should to get the most out of them.
I'll agree that needing extra software, like Process Lasso (or others like it, so specialized that it really works) isn't ideal. 
Maybe Intel should provide a very basic version of it, with some very simple guidelines for most regular users.

My take on this is perhaps a bit different from most, as I personally don't mind messing with this stuff myself (and it's only done once, after all).
But yeah.... I can't see a computer-illiterate person working around this the way I do.

If I only used my PC for gaming, perhaps I'd have got an AMD X3D processor.
But since I require more than a mere gaming platform from my computer, a processor with E-Cores actually became useful, almost like a "second processor" assisting the "main processor" (the P-Cores). 

I can do all my stuff (work or hobby) in any non-gaming apps with the added power and core count of the E-Cores on top of the P-Cores (so, using them all). 
Or....
I can use the E-Cores for all the extra background apps (excluding all that from the P-Cores), so they never interfere with the games (which I set on the P-Cores only). 

So, the best of both worlds, depending on the situation. 🙂 And it's all a matter of setting such rules once; from then on it's automated. 
It's separating tasks or not (in the P-Cores/E-Cores environment) depending on the purpose and, effectively, getting close to the most out of the system, in my opinion. 
But that's not something the system can really guess or decide for me, which is why I like it this way (or don't mind having it this way).

Anyways, sorry all for the off-topic, but this subject is interesting to me and I can easily get carried away. 🫤

Edited by LucShep
spelling(?)

Posted

Don't give it too much weight. I didn't mean to give a proper deep analysis of the past "whatever" years, just my general perception of what Intel did and didn't do. I mean, you can't deny that at least twice they just increased power consumption and clocks and counted it as a new generation.

In my(!) perception, the innovations on Intel's side are sparse and mostly driven by competitors catching up and stealing market share. A sleeping giant got startled, so to speak.

Also, AMD isn't safe from dropping the ball either, are they? Judging from the latest reviews of the new 9000 series, the advancements seem not that impressive.


Posted
20 minutes ago, Hiob said:

Don't give it too much weight. I didn't mean to give a proper deep analysis of the past "whatever" years, just my general perception of what Intel did and didn't do. I mean, you can't deny that at least twice they just increased power consumption and clocks and counted it as a new generation.

(...)

Well said.


Posted

Still, after all those words: I don't regret a single one of the Intel machines I haven't built since Ryzen emerged. 100% AMD, and they're all happy, from ol' Granny to my friend's son's gaming rig.

I personally got fed up with Intel systems after the myriad of patches, BIOSes and IME firmware upgrades to remedy the Meltdown and Spectre disaster. 

 

For most users - I dare say 80+% - it doesn't matter whether their PC scores 27135 points, 29k points or just 21k; they hardly ever need that much raw computing power.

What matters more is TCO, and that is directly connected to the amount of headache and grey hair a system causes... thank you, Intel..... ok... I can skip the hairdresser now, LoL


Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted (edited)

Hi again. It's me.

I've updated my MSI B-serie mobo. It has now x129 microcode patch.
Returned to all default settings. No any voltage tweaks applied. Only Intel Turbo Boost thing enabled.

I currently see 1.5+ V in the CPU-Z and HWMonitor readings,
but 1.2~1.3 V in HWiNFO (update: and I see similar values in AIDA64 too).

The "max" is 1.6V in HWMonitor, but only 1.334V in HWiNFO.

I'm not looking at the VID values; I'm checking core voltages.
Why is there so much difference? Which software is correct?

Thx.

Edited by Devrim


Posted

The voltage spikes can be very short; depending on the polling rate of the monitoring software, it may or may not pick them up.

At least that's my guess.
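That guess is easy to sanity-check with a toy Monte Carlo: if a spike lasts a few milliseconds and the monitor takes one instantaneous sample per second, any single sample almost never lands inside it (all durations below are invented for illustration):

```python
# Toy Monte Carlo: chance that a slow-polling monitor's instantaneous
# sample lands inside a short voltage spike. Durations are invented.
import random

def catch_probability(spike_ms: float, poll_interval_ms: float,
                      trials: int = 100_000) -> float:
    """One spike per polling interval at a random moment; one sample per
    interval at a random moment; return the fraction of samples that hit."""
    hits = 0
    for _ in range(trials):
        spike_start = random.uniform(0, poll_interval_ms)
        sample_time = random.uniform(0, poll_interval_ms)
        if spike_start <= sample_time < spike_start + spike_ms:
            hits += 1
    return hits / trials

# A 5 ms spike vs. a 1000 ms polling interval: caught roughly 0.5% of the time.
print(catch_probability(spike_ms=5, poll_interval_ms=1000))
```

Which would explain why two tools polling at different rates (or reading different sensors) can report wildly different "max" voltages for the same chip.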

"Muß ich denn jedes Mal, wenn ich sauge oder saugblase den Schlauchstecker in die Schlauchnut schieben?"

Posted (edited)
1 hour ago, Devrim said:

Hi again. It's me.
(...)

 

That's a tough one to know for sure. It depends on how each monitoring program reads the motherboard sensors.

IMO, HWiNFO is usually more accurate than HWMonitor but, again, it also depends on the motherboard sensors, not just the software.
For some reason I haven't figured out, the one that usually shows a fairly accurate Vcore reading is CPU-Z (even though it's not really monitoring software).

You might have noticed (as mentioned before in this thread, with videos showing it and all) that the single/dual core voltage spikes (always going over 1.5v) have not been resolved by the new Intel microcode. The main problem still exists.
So, if it's fully back to stock BIOS values, it'll still reach ~1.55v when it boosts, regardless of what you see in any monitoring software.
And that will keep provoking degradation, albeit at a slower pace than with the previous microcode.

Edited by LucShep
  • Thanks 1

CGTC - Caucasus retexture  |  A-10A cockpit retexture  |  Shadows Reduced Impact  |  DCS 2.5.6 - a lighter alternative 

DCS terrain modules_July23_27pc_ns.pngDCS aircraft modules_July23_27pc_ns.png 

Spoiler

Win10 Pro x64  |  Intel i7 12700K (OC@ 5.1/5.0p + 4.0e)  |  64GB DDR4 (OC@ 3700 CL17 Crucial Ballistix)  |  RTX 3090 24GB EVGA FTW3 Ultra  |  2TB NVMe (MP600 Pro XT) + 500GB SSD (WD Blue) + 3TB HDD (Toshiba P300) + 1TB HDD (WD Blue)  |  Corsair RMX 850W  |  Asus Z690 TUF+ D4  |  TR PA120SE  |  Fractal Meshify-C  |  UAD Volt1 + Sennheiser HD-599SE  |  7x USB 3.0 Hub |  50'' 4K Philips PUS7608 UHD TV + Head Tracking  |  HP Reverb G1 Pro (VR)  |  TM Warthog + Logitech X56 

 

Posted

So Intel have proven incapable (at this point) of writing a piece of software (microcode patch) that addresses the problem for their own hardware. That's genuinely breathtaking. They wrote it, tested it, saw there were still dangerous spikes but went ahead and released it anyway. Just...WTF?!

  • Like 1
Posted (edited)
3 hours ago, Panzerlang said:

So Intel have proven incapable (at this point) of writing a piece of software (microcode patch) that addresses the problem for their own hardware. That's genuinely breathtaking. They wrote it, tested it, saw there were still dangerous spikes but went ahead and released it anyway. Just...WTF?!

The issues with single/dual core voltage spikes and ~1.55v Vcore can't be addressed, unless it's done by the user.

They can't bring down the voltages to "normal" levels, because that wouldn't allow boosts to go as high as marketed (as in the spec sheets).
Just like they can't disable the single/dual core boost (which would immediately resolve most of the problem!).

Because that would mean changing a product "after the fact", when it was already presented, marketed, and sold as such.
The single/dual core boost and the high "guaranteed" clocks are features that they marketed and spec'd for the product.

It would mean admitting a grave mistake, almost like saying "it's a scam product", and conceding defeat (lawsuits and indemnities would immediately go through the roof! 🤣).

Unfortunately, the possible solutions/mitigations are something that you'll have to do on your own, for your own. 🤷‍♂️ "Dew it!!"

Edited by LucShep
  • Like 2

CGTC - Caucasus retexture  |  A-10A cockpit retexture  |  Shadows Reduced Impact  |  DCS 2.5.6 - a lighter alternative 

DCS terrain modules_July23_27pc_ns.pngDCS aircraft modules_July23_27pc_ns.png 

Spoiler

Win10 Pro x64  |  Intel i7 12700K (OC@ 5.1/5.0p + 4.0e)  |  64GB DDR4 (OC@ 3700 CL17 Crucial Ballistix)  |  RTX 3090 24GB EVGA FTW3 Ultra  |  2TB NVMe (MP600 Pro XT) + 500GB SSD (WD Blue) + 3TB HDD (Toshiba P300) + 1TB HDD (WD Blue)  |  Corsair RMX 850W  |  Asus Z690 TUF+ D4  |  TR PA120SE  |  Fractal Meshify-C  |  UAD Volt1 + Sennheiser HD-599SE  |  7x USB 3.0 Hub |  50'' 4K Philips PUS7608 UHD TV + Head Tracking  |  HP Reverb G1 Pro (VR)  |  TM Warthog + Logitech X56 

 

Posted (edited)

^ Exactly. They have a choice between a class-action lawsuit over false advertising and trying to delay the degradation long enough that the CPU fails after the extended warranty, so they can deny your claim.

They chose the latter.

Edited by Aapje
  • Like 2
Posted

Fry that thing with prime95 asap and teach Intel a lesson, that's my 1st idea...and the one I would choose.

Then, sell the new CPU that you get in an unopened box, sell the board...and say THANK YOU to Intel.

Actually, I would try to get a refund rather than another doomed CPU, so you only have to do something meaningful with the board, maybe a sub-65W box for a NAS or the living room.

 

Handling it this way is outright wrong, and I think some people will sue Intel for this, sooner rather than later.

 

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 
