Everything posted by LucShep
-
If you haven't disabled the CPU power savings (Speedstep, Speedshift, and all the C-States), then you're just telling the motherboard "I want those P-cores limited to X GHz at maximum". It'll still downclock like always, same power savings, everything the same, when you're not pushing the processor. So no, it won't be forced to always sit there. You're just limiting, placing a ceiling if you will. And that's where the benefit comes from. Because (if you run it stock) the voltages are "tabled" according to clocks (the higher the clocks, the higher the voltage; lower clocks translate to lower voltages, and so on). Taking an i9 13900KS as an example... it boosts to 6.0GHz single-core (the "Max Turbo Frequency"). To get there, it requires more voltage and, even if it's just for one core, all of that power is offered, at once. If instead you sync all the P-cores and set them to 5.6GHz (which is its stock "out of the box" all-P-core clock), it won't boost past that anymore and won't reach such a silly high voltage (plus, you also get lower temperatures). It's how it should have been (IMO), like correcting it (and restricting that). And you don't actually get lower performance in gaming, or in most things really. Not saying that this will save your CPU forever, or that it won't degrade ever again (given the latest news, it seems deeper than that). But, no doubt, you're already cutting out the worst and biggest offender, and easing things a whole lot. Check Buildzoid's video, and how that single-core boost goes over 1.5V... (insane how it happens, and that's supposedly "normal" for Intel 13th/14th gen!!)... Pick it up at 12:13 in the video and just keep watching:
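To make the voltage/clock relationship concrete in numbers, here's a toy sketch. The V/F pairs are made-up, hypothetical values (not real 13900KS fused tables), but the shape of the argument is the same: voltage is tabled against clock, so removing the top clock also removes the peak voltage that comes with it.

```python
# Toy V/F curve with made-up numbers: voltage is tabled against clock,
# so capping the top clock also caps the peak voltage the CPU requests.
vf_curve = [(4.0, 1.10), (5.0, 1.25), (5.6, 1.35), (6.0, 1.52)]  # (GHz, volts), hypothetical

def voltage_for(clock_ghz, curve=vf_curve):
    """Return the tabled voltage for the highest curve point at or below the clock."""
    return max(v for ghz, v in curve if ghz <= clock_ghz)

print(voltage_for(6.0))  # stock single-core boost -> 1.52 V (hypothetical)
print(voltage_for(5.6))  # synced all-core cap     -> 1.35 V (hypothetical)
```

The real tables live in the CPU and BIOS, not in a Python list, but this is exactly why syncing all P-cores to the stock all-core clock chops off the scariest voltage without touching anything below it.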
-
Didn't even know this thread existed. It seems there are others afflicted with the moto-nerd virus here!! My companions on "the path":

Honda CRM 50 1993
Lots of memorable rides and adventures in this gutless POS (but a great school for anyone starting to ride).

Suzuki GSXR 600 SRAD 1997
A friggin banshee, screaming out of the corners. A product aimed at fellas in their early twenties, and I too took the bait (was 23 then). Definitely not the most pleasant ride in slow traffic (needs moooaar revsss maaan). Nice toy that ended up never seeing a race track like I once planned.

Ducati Monster 900 "Dark" 1999
Huge booming soundtrack, torquey "rounded" engine, huge fun. It didn't last more than two months, destroyed in a collision with a reckless driver doing a U-turn at the worst moment possible. Such a lovely bike. And such a shame.

Yamaha TRX 850 1998
This was kind of an oddball motorcycle, rare too, that I immediately wanted to have as soon as I could. A Japanese sportbike with Ducati styling and a trellis frame? A parallel twin engine with a 270º crank? (so it works and sounds like a V-twin) Interesting bike that, unfortunately, too many times felt like a mongrel of parts (and technically it is).

Ducati 750 SuperSport 1993
Old, traditional, agricultural, so many of the "ugly headlight looks like a brick" comments (yeah, yeah.. whatever). Huge character, rounded engine with a nice soundtrack, and probably the sweetest ride in the whole list. Sometimes less is more, and simple is everything. If you ever owned one, you're probably nodding and smiling (yep, you get it).

Triumph Daytona 955i (T595i) 1999
Great triple-cylinder engine like no other, but an odd chassis geometry that I never quite got along with. I'd always felt curious about it, and got it in a deal that seemed too good to pass up. A rare and great machine that simply took too long for me to gel with.

Ducati 748E 2000
A dream come true, and the best motorcycle I've ever had, by far (and away) - and there's a very trick 916 engine under those fairings (the original 748 went kaput). Warming it up was like a ritual, and it was the most beautiful ride, fully in control. Feeling every little thing, that deep sound and those engine vibes, it's very visceral. Whatever good things you read about these, I assure you they're all that, and more. It's "the whole package", everything in the right dose. Shame about the maintenance (done frequently, sometimes laborious) that gets expensive, but I'd immediately own another if I could (yes, I dislike modern literbikes). One of the very few pics I have of it (oddly, only a handful through those years), with my ugly head contemplating this marvel of design and engineering.

Suzuki GSXR 1100 WP 1993
Nostalgia hit me really bad. At a time when I could no longer afford maintaining the yellow Duc, I decided to trade it for something else (my biggest regret, ever). Went for this youth icon (of mine, at least). And I can confirm it's as mad as all the stories you may have heard about these. Immaculate for its age (they don't build them like this anymore). With modern tyres and a well set up suspension, despite the age and old tech, it is still a really good motorcycle. But..... just not for me.

Aprilia RSV 1000 Tuono Fighter 2004
Went looking again for a sporty V-twin, specifically the Aprilia RSV Mille (Gen 1, early 2000s). The problem is my age, the body aches, no disposition for a race-rep machine. The RSV Tuono is the same bike as the RSV Mille, minus the fairings, with a handlebar instead of clip-ons. It was an easy decision after a run around the block. I adore this motorcycle: the ride and feel, the strong engine, the chassis and build quality, the sound, the details and the looks (even though I wouldn't call it "pretty"). Currently at 77000 KMs, still going as strong as ever. The guy who said "Italian motorcycles can't do the miles" does not have a clue.

The year for each one is the model year, not necessarily the year of my acquisition. They're placed in chronological order, over 30-odd years of riding to the current day. Never had a "brand new, zero mileage" motorcycle, and I don't think I will at this point. I value motorcycles very much for the emotions and the memories they create and, for that, they don't need to be brand spanking new, or the latest and greatest. They just need "to be right".
-
Problem on Intel 13th & 14th Gen CPUs.
LucShep replied to Devrim's topic in PC Hardware and Related Software
Quoting myself from another thread (pardon that), but.... Just by activating "Sync All Cores" and setting a value (for all the cores) that is the same as, or as close as possible to, the "out of the box" (stock) all-core clock for your specific processor, you immediately fix the worst part of the problem. What degrades these CPUs faster is when a single core, even for a background task, asks to boost to whatever outrageous amount of "single/dual core boost". Even idling on the desktop you can see this destructive behaviour, doing its own unknowingly suicidal thing. It happens very, very frequently during any kind of use. When it does that, it is given 1.50V+ depending on the individual CPU and motherboard+BIOS (some people even mention 1.60V at times). That is one major reason why these chips are dying. Even though the chip has power limits, when only one core wants to boost, ALL of the power is offered, at once. The most insane part is that this seems to be a feature just so Intel (and AMD also has it) can get a good Cinebench score from reviewers. Lock those cores and you'll be a LOT better off. -
Just by activating "Sync All Cores" on your P-cores and setting a value (for all the cores) that is the same as, or as close as possible to, the "out of the box" (stock) all-core clock for your specific processor, you immediately fix the worst part of the problem. What degrades these CPUs faster is when a single core, even for a background task, asks to boost to whatever outrageous amount of "single/dual core boost". Even idling on the desktop you can see this destructive behaviour, doing its own unknowingly suicidal thing. It happens very, very frequently during any kind of use. When it does that, it is given 1.50V+ depending on the individual CPU and motherboard+BIOS (some people even mention 1.60V at times). That is one major reason why these chips are dying. Even though the chip has power limits, when only one core wants to boost, ALL of the power is offered, at once. The most insane part is that this seems to be a feature just so Intel (and AMD also has it) can get a good Cinebench score from reviewers. Lock those cores and you'll be a LOT better off.
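If you want to catch that idle single-core spiking yourself, before and after locking the cores, here's a minimal sketch. It assumes Linux with the psutil package installed; on Windows, psutil may only report a single package-wide frequency, and HWiNFO does this (and much more) anyway.

```python
# Poll per-core clocks for ~30 s at idle to catch single-core boost spikes.
# Assumes Linux with psutil installed (pip install psutil).
import time
import psutil

for _ in range(30):
    freqs = psutil.cpu_freq(percpu=True)   # per-core current/min/max in MHz
    peak = max(f.current for f in freqs)
    print(f"peak core clock: {peak:.0f} MHz")
    time.sleep(1)
```

At stock settings you'll typically see the peak jump to the max boost clock every few samples even with nothing running; with all cores synced, it should plateau at the value you set.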
-
GTX 1080Ti or RTX 2080Ti. Which to choose?
LucShep replied to Holbeach's topic in PC Hardware and Related Software
Those old OCZ ZX PSUs were really, really good. But that was in their day, a long time ago (circa 2011).... Not so sure it'll do so well with a modern 270W GPU, but then again it was SLI-ready in its day, and it's pushing the GTX1070 after all. The one thing that you really need to change urgently is that PC case (holy old box, Batman!). It's absurdly claustrophobic for your hot new GPU; that won't go well. There are really good budget ATX cases now, absolutely worth the effort of transplanting the components into one. PS: Take a good look at the Montech AIR 903 MAX (great ATX case, reviewed by TPU), which costs just £60.00: https://www.scan.co.uk/products/montech-air-903-max-black-mid-tower-chassis-w-tempered-glass-3x-140mm-argb-fans-usb-ype-c-e-atx-atx https://www.overclockers.co.uk/montech-air-903-max-midi-tower-tempered-glass-black-cas-mon-01229.html IMHO, there's no excuse not to do it right away with the new GPU coming. More than giving that system a modern new look, it'll be a humongous jump in internal space, layout, and especially airflow (4x 140mm fans that plug into a SATA power connector from the PSU and one single PWM header on the motherboard!). -
Absolutely. It's a total <profanity> show. This stuff can be pretty aggravating, as many users have kept working from home since the pandemic, and are oblivious to this crap. Not funny if your daily working tool uses one of these Intel chips and goes "kaput". I'm not so sure about DCS users; how many visit this section anyway? There may be many who are also oblivious to this. At least some kind of heads-up to friends, then from them to their friends and so on, also in other communities, should be done by each of us. So many people should be alerted. Meanwhile, another section of the freak show has already started... all of the 12th gen "K" CPUs have already started to slowly go up in price (at least where I'm looking).
-
Yes, your 14900KS will need the microcode update, that's for sure. Don't trust lady luck on this. Everybody with 13th and 14th gen CPUs that are over 65W needs to do it once it's out (no ifs or buts). Get the friggin microcode update installed once it's out. And consider RMAing your CPU if there are signs of damage already.
-
Not sure how many will raise their hand to confirm, but...... The truth is that this is really problematic, even more so than expected. And, yes, there'll be fatalities among DCS users' Intel 13th/14th gen CPUs at some point, if not already. If there's one techtuber you should take seriously, it's Roman "der8auer" Hartung, an extreme overclocker who is also a mechatronics engineer. He did what most should have done, which is take his time and only release an opinion once he was sure of it and it was ready to be released (contrary to most, who ran for the clicks). This is what he has to say (at 11:02 in the video): It seems it affects all 13th and 14th gen CPUs that are above 65W.
-
GTX 1080Ti or RTX 2080Ti. Which to choose?
LucShep replied to Holbeach's topic in PC Hardware and Related Software
@Holbeach Not much to add to what @kksnowbear just said above, but "is there going to be an improvement or not" is not even a question. The RTX2080Ti is at least 50% faster than a GTX1070, and in practical terms it can double the performance in heavy scenarios. Your system will slow it down quite a bit (being outdated for it), but you'll still see enough gains not to be disappointed. The RTX2080Ti Strix was among the best models, and it's still a very good GPU today. Just please consider upgrading your system ASAP (discussed already in this thread), because you won't see its full capabilities until then. PS: the RTX2080Ti Strix is somewhat power hungry (it can eat up to 270W, 2x 8-pin PCIe from the PSU required). I see "1kW PSU" in your signature... but what brand and model of PSU is it? -
Last week I was in a group chat where this stuff was being discussed, and a video was posted there about a degraded i9 13900KS. Absolutely worth the watch, more so if you've already watched Buildzoid's oscilloscope video (the one I'm quoting) where the crazy single-core boost voltage problems are shown. If all you watch are the glorified techtubers in the techspace, this guy will look obnoxious (ignore that). The point here is, he's actually correct and his approach here makes all the sense. :WARNING: swearing
-
Thanks for the heads up, I didn't even remember to check Buildzoid's channel. Watching the video right now, and there is exactly what I was saying before (crazy single-core boosts + voltages), check at about 12:13 and on: ...see what I mean? It's insane how this happens. These 13th and 14th gen i9s (and i7s?) all seem to run at stock at well over what is considered normal, reaching and surpassing 1.5V(!!). 12th gen i9s and i7s were already starting to slowly degrade if you were using over 1.4V on long-term overclocks. I just wonder if it has to do with Intel probably using their own motherboards or BIOS settings for stability and long-term reliability testing, settings not used by any other motherboard manufacturer (which could explain why they seem dumbfounded, and would be tremendously incompetent, but not outside the realm of possibility).
-
Alder Lake was an all-new design compared to Rocket Lake, completely different architectures. Not the case with Raptor Lake. We can throw around all the objective and non-objective reasons we like till the cows come home. My point remains. Of course there are differences in the internal voltages of Raptor Lake, and they must have to do with the Raptor Cove changes. Higher clocks on the E-cores are not helping either, for sure. The core voltages and VID tables are different, even the system agent voltages are different (probably offset so as not to cook the processor even more). I'm expecting a much bigger hit than a percent or so. But, so long as it stops the problem, it won't be a big deal. At this moment, it's understandable that AMD users are rejoicing in their choice. I am as well, with my Intel 12th gen being totally fine and exempt from these issues. But AMD is not exactly exempt from the "single-core boost high voltages" either, or from problems in their own platforms, you know...
-
Sure, and I'm not contradicting you. But if you're a potential victim in this strange (still ongoing) process, with frak knows what the exact fix will be, you might as well do something to avoid it happening, or to mitigate the issue if it has already started. If I were in that scenario, that's what I'd do.
-
Oh, but they do have their share of guilt, make no mistake. Imagine buying a top-end system that, with default BIOS settings, is pumping even more voltage/amps into an already very voracious, hot chip. If you're pushing that system for months on end, and we now know that these chips have some problem, then some sort of degradation accelerated by questionable BIOS settings is not unexpected. And if the degradation had already started to occur at that point.... Overclocking and tweaking takes many forms. On one end you have the excessive "extreme" enthusiast level, raising (even more) clocks and voltages to insane levels with these chips. On the other end you have the mild OC tunings with a locked clock on all cores, or the overclocking/undervolting ones, which are far less severe (and probably better than stock for lifespan). It may take years, but yes, I too suspect it'll be only a matter of time until it starts happening to every i9 13900K/14900K (and also some i7 13700K/14700K) at the (currently in use) stock clocks/voltages. At stock, the 13th/14th gen "K" processors push too high in single/dual-core boost, and that takes voltages/temps that (IMO) are better avoided. That's why I wrote this above, quoting:
-
Yes, there's the Intel baseline profile in the newest BIOS releases from this past April, but I haven't seen a single motherboard with it loaded by default (you have to load that profile yourself, after updating the BIOS with it). How many users have really done that on their own computers? Then there's this issue now, with a gazillion users mentioning problems even after loading this profile and rectifying things, and still having processors go faulty, system lock-ups, bluescreens and whatnot. By that point, if the CPU had already started to degrade, due to previous excessive voltages and removed limits, then of course that's not going to help it. A degraded CPU is one that now needs more voltage for the same specific clock, not less. With all this happening, it's a fair sentiment. And one that is spreading among many, many users, believe it.
-
GTX 1080Ti or RTX 2080Ti. Which to choose?
LucShep replied to Holbeach's topic in PC Hardware and Related Software
I passed to a family member a much older system with an Intel i7 2600K, ASUS P8P67, and 24GB of DDR3 1600MHz 8-8-8-24 RAM (4+8+4+8 GB sticks, all G.Skill RipjawsX). It's been overclocked @4.3GHz (all-core) with a simple Hyper 212 (two fans in push-pull) for many, many years without any issues or degradation. I'd strongly recommend doing it. Like yours, it's using a GTX1070 (Gigabyte G1 Gaming). The most you'll get with that CPU, without bottlenecking the GPU, is an RTX2070 Super or GTX1080Ti. And notice: overclocked, not stock. That doesn't mean you shouldn't get the RTX2080Ti if it's a good deal (if it is, then get it!), but you'll be hostage to whether games are purely GPU dependent or not. When games are CPU dependent, that RTX2080Ti will be severely underused, because that 2600K CPU + DDR3 RAM cannot keep up with that GPU. This will certainly be the case with more complex missions in DCS, in single and multiplayer. Honestly, I'd really consider upgrading the CPU+motherboard+RAM (and cooler) if you're getting a more potent GPU. Something like an Intel i5 12600KF, ASRock Z690 Pro RS, 64GB DDR4 (2x 32GB 3200 CL16 G.Skill RipjawsV, for example), and a Peerless Assassin 120 SE air cooler; that combination would be an absolutely tremendous jump in processing power. While not exactly chump change, it can be had for not too much money (about $400 for all those parts new) - probably the best bang for the buck right now. -
There. We just have a different perspective. The tick-tock model is the perfect example of what I'm trying to convey. You're looking at it from a purely technical perspective, I'm looking at it from a practical perspective. Just because a new "gen" chip with a few changes is launched doesn't mean it's an all-new design. I'm sure you can agree that the tick-tock model was the definition of Intel's stagnation (generally accepted as such) during the last decade. Exactly because of the lack of innovation. Nothing different here in this case, with evolution/revision changes on the P-cores, from "Golden Cove" to "Raptor Cove". So, no, it is not incorrect to say that 13th and 14th gen "K" CPUs are in essence 12th gen CPUs "cranked up" (evolved, revised and pushed further), because that's what they really are. Didn't see this before replying, sorry. Again, it has mostly to do with voltages and huge clock boosts. Which were never that high in 12th gen. If you overclock a 12th gen CPU to ambitious levels that require those kinds of voltages, the ones you see in 13th and especially 14th gen single/dual-core boosts, they'll also degrade at some point. The microcode bug that sends excess voltage to the 13th/14th gen CPUs, which supposedly will be fixed in August according to Intel, will lower that once the fix comes out. And I bet that it will penalize performance.
-
I guess we have different definitions of what a "new chip design" is, then. Raptor Lake (13th/14th gen) from Alder Lake (12th gen) is as much of a new chip design as Rocket Lake (11th gen) was from Comet Lake (10th gen). It could be that I look at it from an old man's perspective, but a new chip design (and not a revision/evolution) is composed of architectural and process node changes.
-
The problem is really the voltages and temperatures (especially noticeable with single and dual core clock boosts, IMO). But I agree it could be something else that was never right. The part that really disappoints the "long time Intel fan" in me is knowing that this has only one of two possibilities... Either it's 1) the result of incompetence, or 2) they knew and risked it anyway, hiding the potential issues from the public. Regardless, it smells like greed and desperation to get on top of a competitor who has made significant strides within the last decade.
-
No, it isn't. Raptor Lake (13th/14th gen) is "Alder Lake 2.0". There was no fundamental change in design whatsoever; it's basically a refinement of the very same heterogeneous CPU architecture, to achieve higher clocks, with larger L2 and L3 caches, revised P-cores and more E-cores added, all with an "optimized" (LOL) voltage/frequency curve. Of course, all at the inevitable cost of higher power consumption and temperatures, pushing the silicon and circuitry to previously unseen levels. I'm sure we can all agree that AnandTech is an absolutely valid reference in the tech reviewing space, and I'll quote from their 13th-gen review:
-
Well, it took a while to see the problem exposed. It had been pointed out since 13th gen came out in late 2022 that there would be issues. 13th and 14th gen "K" CPUs are in essence 12th gen CPUs "cranked up", overvolted to the moon and running melting hot. A really bad idea, when it's a given that motherboard manufacturers, in their turn, then unlock all sorts of stuff on top of that (so, even more voltage, things like "Multi Core Enhancement" and all limits removed), just to extract that little bit more (even more) to beat the next competitor. It was a lose-lose bet from the start and, sadly, the injured party is really the users, not the manufacturers. Now people have degraded CPUs, and I'm not sure how many of the affected users will win the battle of RMA-hell. The other problem (if not the main one) is this obsession with "single-core benchmark scores", which most modern tech reviewers use as a yardstick, and which then influences people to choose this and that. Of course both Intel and AMD are watching, and then it's "oh, that's what they want, and they're using it to compare us with our rivals, OK... we'll ramp that up for the next releases".... When that started to happen, you started to see hilarious voltages for insane single-core clock boosts, both from Intel and AMD. Which in the end matters close to nothing for your gaming and real-life usage... unless (yep!) you're running the silly single-core benchmark! If you're running a stock 13th/14th gen i7 or i9 CPU (not overclocked), then perhaps consider locking all your P-cores to the same clock, the same value that is listed as the max "all core" clock out of the box for your processor. And, after that, also use an offset voltage to readjust the CPU's bottom voltage (more or less, it depends on the individual CPU). By doing that, you're basically attacking the problem on two fronts: 1) you stop those one or two cores from boosting way too high, which pushes stupid voltages and degrades things further, and 2) with the offset you move the bottom voltage just a little bit, to make it all work reliably with whatever application. Run a stress test (with AVX2) using Intel XTU, or continuous multi-core runs of Cinebench, while monitoring (with HWinfo, for example) your CPU package temps, power, VIDs, etc. Just with that changed, you'll immediately notice a considerable reduction in CPU power consumption and temperatures, while fixing crashes, and possibly the further degradation.
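For the monitoring side, here's a rough logging sketch you could run alongside the stress test. A sketch only: it assumes Linux with the psutil package, and the "coretemp" sensor name is a Linux assumption; on Windows, HWiNFO's own logging covers this instead.

```python
# Log max core clock and CPU package temperature once per second to a CSV,
# to review after a Cinebench/XTU stress run. Linux + psutil assumed.
import csv
import time
import psutil

with open("cpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t_s", "max_core_mhz", "pkg_temp_c"])
    start = time.time()
    for _ in range(300):  # ~5 minutes of samples
        mhz = max(c.current for c in psutil.cpu_freq(percpu=True))
        temps = psutil.sensors_temperatures().get("coretemp", [])
        pkg = next((t.current for t in temps if "Package" in (t.label or "")), None)
        writer.writerow([round(time.time() - start, 1), round(mhz), pkg])
        time.sleep(1)
```

Comparing a log taken before and after the all-core lock + offset makes the drop in peaks (clock, temperature) easy to see at a glance.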
-
My perspective is that of someone who has dealt with all the problems that you, clearly, have no idea even exist. How about if I tell you that (over?) a third of the game's total size is comprised of badly oversized and badly formatted textures? That the game is unable to fit those inside the 8GB of VRAM of the GPUs most people are still using? Do you even understand what happens when the VRAM of a GPU fills up and it spills into RAM and, when that's filled to the top, then into the pagefile, which is slow (AF!) on a SATA SSD or budget NVMe? Try this... put a three or four year old used 8GB VRAM GPU (RX6600XT, etc) in your system, then take a stick of RAM out of your PC; cap that processor to, say, 2 P-cores only, and then get into that ECW or 4YA fully populated server again and, sure, go ahead and set that screen to 1080P. (and you'll still have more performance than the average guy out there!) Then get back to us with the story of a perfectly smooth, undemanding experience......... LMAO Regarding VR, that's your opinion, based on speculation only. DCS could run pretty decently in VR and, indeed, it once did. It comparatively ran like heaven in 2.5.6, on systems less capable than mine. The question should be "why can it not run like that today with 2.9x", if it was more capable years ago? (and note, it was already demanding then) Because, in various aspects, we have clearly had the opposite of progress.
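As a back-of-the-envelope illustration of the VRAM point (the sizes and counts below are made-up, illustrative numbers, not measured from actual DCS assets):

```python
# Rough arithmetic: why oversized textures overflow an 8 GB VRAM budget.
# All numbers are illustrative, not taken from actual DCS assets.
def texture_mb(width, height, bytes_per_pixel=4, mip_overhead=1.33):
    """Approximate in-memory size of an uncompressed RGBA texture with mipmaps."""
    return width * height * bytes_per_pixel * mip_overhead / 2**20

per_texture = texture_mb(4096, 4096)  # ~85 MB each, uncompressed
print(f"{per_texture:.0f} MB per 4096x4096 RGBA texture")
print(f"{per_texture * 100 / 1024:.1f} GB for 100 of them")  # ~8.3 GB, past an 8 GB card
```

Once that budget is blown, every extra texture fetch is a round trip through system RAM or, worse, the pagefile, and that's exactly where the stutter comes from.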
-
"Yes it does. I'm sure there are many players who run this without trouble."

My previous point, which was in the post you quote that from, was also to demonstrate that not everyone will be able to see these problems. Because we all have different perceptions, sensibilities, or even tolerances to the problems that do occur, frequently or not.

"That is certainly true but look what I'm running, 4K 120Hz at all the highest settings. You don't need such hardware to run the game in 1080p 60Hz at medium settings. DCS really is not a particularly demanding game provided you run settings compatible with your hardware."

Not a particularly demanding game? You jest, surely. There are only a handful of games that I can recall being as harsh (or as badly optimized) as DCS, to demand the sort of hardware specs that people feel compelled to invest in, to brute-force around all the problems. I think you're totally oblivious to the reality. We're not talking scenarios like a free flight of an Su-25T over Caucasus with a handful of tanks, because that surely cannot be considered the example these days. Have you even entered a fully populated server, popular ones like ECW or 4YA, with, say, an F-14A/B module on the Syria map? Very few people (if any at all?) will be able to say that they have the perfect performance experience you describe having there. What's the average "new build for DCS" these days? Ryzen 7800X3D, RTX4070Ti Super or RTX4080 Super (both 16GB VRAM), 64GB DDR5 (6000 C30) RAM, 2TB or 4TB high-end NVMe... something like that, right? And then what are the "official" hardware requirements that are listed, again? Does that make any sense? Are you even aware of what the average gaming PC is these days? I'd wager it's something like a Ryzen 3600X or i5 10600K, an RTX3060 or RX6650XT, 16GB of DDR4 RAM (32GB with luck), and a 1TB SATA SSD or ultra-budget NVMe, along those lines. I'm very convinced that it's those systems that compose a very considerable part (if not most) of the userbase, because that's also what I'm most frequently asked to assist with. Even at 1080P, those will increasingly have to run the game at near-Minecraft image quality to enter and compete on those servers. And stuttering galore, oh I assure you.
-
That is beside the point. But, in any case, the reality is that the sort of machine that the newest DCS 2.9x version requires to be absolutely stutter-free still does not exist. This comes from someone who builds and assists with gaming systems, as a hobby, across a variety of budgets and parts (used and new). Some of which I actually built for DCS... There are inherent optimization problems within the game itself (in so many ways) that even the devs recognize require intervention - and that's why Vulkan (among other things) is being developed. "Stuttering", "hitching", "juddering" are things that are perceived differently, greatly or not, and easily tolerated or not, from one individual to another. 2D or VR, no matter. For example, with a 4K screen at 60Hz refresh (S-Sync 60FPS, no VRR, as is my preference), you absolutely need a frametime below 16.6ms at all times, regardless of the maximum framerate achievable. Anything over that frametime value will create a "stuttering/hitching/juddering" effect (or sensation) in the image motion. Some people are not sensitive to it, others are (like me). For instance, I can (unfortunately!) even feel the frametime going below and over 8.0ms at 4K 60Hz, locked at 60FPS, as a slight fluctuation, almost like "hitching". Yes, even when it's within that optimal sub-16.6ms frametime. I've had a friend right beside me while it was occurring and he would not see/feel/sense any difference, no perception of it at all. I think you may be like him. Your yardstick for "smooth performance" (your own PC, looking at its specs in your signature) is currently beyond the capabilities of, I'd say, the majority of active DCS users. I'm of the firm belief that ED (and 3rd parties) keep making the same grave mistake by insisting on making more things "prettier" but heavier. It gets harder and harder to keep (let alone increase) game settings, with an ideal image and game performance that will only be seen by the fortunate ones who can spend so much money on PC hardware. And even those fortunate ones get performance problems, looking at these forums.... Nothing to do with the ability to enjoy the sim/game as is (performance problems or not), but it's the truth.
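The 16.6ms figure is just the frame budget at 60Hz (1000ms / 60). Here's a toy sketch of the same arithmetic, with made-up frametime samples, for anyone who wants to sanity-check a frametime log of their own (e.g. one exported from a capture tool):

```python
# Frame budget check: at a fixed refresh rate, any frame slower than
# 1000/refresh ms misses a refresh and reads as a stutter/hitch.
refresh_hz = 60
budget_ms = 1000.0 / refresh_hz  # 16.67 ms at 60 Hz

frame_times_ms = [15.8, 16.1, 24.9, 15.9, 33.4, 16.0]  # made-up sample log
spikes = [t for t in frame_times_ms if t > budget_ms]
print(f"budget: {budget_ms:.2f} ms, missed frames: {len(spikes)} -> {spikes}")
```

This is also why average FPS can look fine while the experience still feels rough: a couple of spikes ruin the motion even when most frames are comfortably under budget.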