
LucShep

Members
  • Posts: 1688
  • Joined
  • Last visited
  • Days Won: 2

Everything posted by LucShep

  1. 32'' is definitely a noticeable increase coming from 27''. ...can you imagine downgrading to a 22''? (those tiny little things, heh) Now compare your 27'' to a 34'' UltraWide. If your desk is shallow and you're forced to sit very close to the monitor, then of course a 42'' is probably out of the equation, but just look at the scale... The in-game cockpits (everything, really) then start to feel a LOT more "real-life like" (the LG 42'' OLED C2 or C3 are really, really good, I tell ya). PS: size comparisons from https://www.displaywars.com/
  2. It depends on various things...

It depends on the system you'll be using DCS with on that new screen. You seem to have a 2560x1440 27'' screen. How much GPU usage are you seeing? If you're already over 80% GPU usage at that resolution, then at 4K you'll certainly hit 100% with the same settings, forcing you to lower them. On the other hand, with the recent inclusion of DLAA, TAA, and DLSS, you'll certainly manage a compromise in settings that can leave you really satisfied.

It depends what you value most, more resolution or more framerate. The jump in clarity and definition from 1440P to 4K is as big - if not bigger - than 1080P to 1440P. It's that good. But it's still very intensive after all these years; it does push the GPU to its limits at a certain point... forget 90+ FPS everywhere and all the time in DCS at 4K, it won't happen. That is to say, if you're the type of person for whom "it always has to be 120+ FPS, all the time, period", then DCS in 4K is probably not for you. Especially if not using the very best hardware.

It depends what you value most, vertical field of view or horizontal width. Widescreen format, either 21:9 (ultra-wide) or 32:9 (super-ultra-wide), really is an acquired taste. You should definitely try it first; it may or may not work for you. Some swear by it, and it does have its pluses depending on the application. For example, for sim-racing the big curved 49'' super-ultra-wide monitors are excellent. But then, we're talking flight-sims + TrackIR here... and in this particular scenario (DCS and others) I personally don't think it works all that well. You lose the immersive vertical depth that a regular 16:9 screen provides, and that's an important factor you'll be missing. Personally, I felt like "something is missing", as if the top of the screen was chopped off (I regretted it... never again).

In regards to spotting in DCS, even with the latest updates, it's still mediocre regardless of resolution (IMO). I do not think it should be a deciding factor for your new screen. The immersion, clarity, definition, motion handling, crispness of colors, overall image quality - now those certainly are.

I'm sure opinions will vary but I'd really recommend two potential paths in 16:9 format, first because they are safe bets for DCS and, second, because in your case it's basically taking what you already seem to like and making it "bigger, more better":

A good 32'' 2560x1440 (1440P) monitor is fairly affordable now. Yes, the pixel density is lower compared to your 27'' with the same resolution (equivalent to that of a 24'' 1080P screen), but it's still very good on a bigger 1440P screen that is also easy to run (same as yours in that respect). There are OLED panels at 32'', but they're too pricey for the size (IMO). Models with IPS panels are affordable and the better choice for this size, with good overall quality and no ghosting/smearing issues with fast-moving images (likely to happen with VA panels, and why some avoid those).
Examples: Gigabyte M32Q, Asus TUF VG32AQL1A, LG 32GP750 and 32GP850 (these are all IPS)

A good 43'' or 42'' 3840x2160 (4K) monitor can be expensive and a little harder to run but is, most likely, the choice that will knock your socks off. An IPS panel would be good, but I don't know of any high-refresh model (the LG 43UN700-B is 60Hz only - OK but not "wow"). There are some with VA panels (Samsung, Asus, etc.). An OLED panel is definitely the best of all (excellent) and well worth a look - if it's for pure gaming use (burn-in risk factor).
Examples: LG C2 42'' and C3 42'' (OLED, 120Hz)

I'd even recommend a 48'' 4K OLED, but then we're entering a whole 'nother level of "bigger, more better"...
  3. Yep, it's far from a perfect solution, and that's the main problem with DCS 2.5.6. Any of today's content that was already available then (March 2021), as of that version's release, runs much better there (in my experience and that of many others), even online if the other people are using that same version. But you're stuck with what already existed back then, as explained in its download page details. That is to say, if you're an advocate of MP public servers, and/or of the latest updates/fixes, and/or of the most recent content (modules, campaigns, etc.), then it will not serve you. There was loads of content already (actually most of today's) and plenty of campaigns available at the time for a plethora of modules. It might be the case for yours (check in the DLC page which version it requires). If you're intrigued, and if you have the drive space, my suggestion is to give it a good go and see how it works for you. For me, I'm finally enjoying DCS in VR - no more frustration with framerate, frametimes, stuttering and whatnot - and I can even run higher settings. And as much as I was anticipating upcoming content (MiG-17F, A-7E, Kola), my conclusion is that the performance benefits, as of today, are too good with 2.5.6 to ignore. Not going to suck it up with the issues and/or upgrade perfectly capable HW just to keep up with the Joneses. Not again.
  4. Hey, call it what you will. I just believe that ED should solve this versus everyone just having to throw hardware at it.
  5. Not sure I understand... and pardon my rant here if I misunderstand. So we should start accepting 56GB+ of RAM and 24GB+ of VRAM usage in-game, in 2023? And even worse in the next two or three years? If I've read it correctly, and if that's what it means, then I'm sorry, but that's stupid. On a hobby that is already pretty demanding and expensive as is. I was watching that video from Jabbers mesmerized, thinking "how did we get here?!" - he seems to have a top-of-the-line PC, and the OSD figures there are alarming. I'm currently running DCS 2.5.6 (just my preference) and, with the same modules and maps, I cannot see what justifies the much(!) higher RAM, VRAM, CPU and storage usage in version 2.9, to warrant the "step up" - it's not even a different, brand new DCS game. IMHO, it should be contested and criticized by everyone (we're all paying customers after all). But, instead, it seems people prefer to pony up for overkill hardware, in a desperate attempt to disguise issues that should not exist.
  6. Perhaps this might help alleviate what may be oversaturation of IO operations, due to too many threads running at the same time. It's worth a try. In any case, it does show DCS is able to exhaust the VRAM even on a 24GB GPU (and that seems to be 1440P, not even 4K, not VR), with the system exceeding 59GB of RAM usage(!?!). It just looks silly that a sim/game that was already a resource-eating monster a few years ago got to be this demanding. But at least it seems ED is on it (fingers crossed). @Steel Jaw I'd suggest trying Taz's Optimized Textures. It'll alleviate the outrageous VRAM consumption a bit (and it's how the textures should have been from the start).
  7. I've seen 42GB online quite often and already found it absurd, but 63+GB ?? That's certainly not normal! Your RTX3060Ti is a bit short with its 8GB of VRAM, and DCS easily consumes upwards of 14GB VRAM if online. Then it starts to eat RAM and pagefile to compensate. That said, I'd take a screenshot and present it as a bug in the respective section. No sim/game should ever consume that much RAM.
  8. No, no need to wait or to get DP2.1 hardware for 4K in DCS. DCS is not the kind of game where you expect super high framerates (actually the opposite), so the bigger bandwidth of the newer standard won't make any difference for it. Some rough numbers below.
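As a back-of-the-envelope illustration (my own arithmetic, ignoring blanking intervals; the payload figures for DP1.4/DP2.1 are the commonly quoted ones), here's why DP1.4 already covers the framerates DCS realistically reaches at 4K:

```python
# Rough signal bandwidth arithmetic (blanking ignored), just to put numbers on it.
# DP1.4 (HBR3) payload is ~25.9 Gb/s; DP2.1 (UHBR20) payload is ~77.4 Gb/s.
def gbit_per_s(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(gbit_per_s(3840, 2160, 120, 24))  # ~23.9 Gb/s -> 4K 120Hz 8-bit roughly fits DP1.4 (DSC gives extra margin)
print(gbit_per_s(3840, 2160, 240, 30))  # ~59.7 Gb/s -> 4K 240Hz 10-bit is where DP2.1 starts to matter
```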
  9. I found the X56 throttle to be actually decent (the stick not so much); it can be a good purchase as these days it's found fairly cheap on the used market. Its known issues can easily be remedied if you take the following as rules for it: keep the LEDs always OFF in the Logitech software - pretty much a requirement to avoid issues. Also make sure to connect it through a self-powered USB hub. No need for expensive ones; cheap ones like the TP-Link UH720 ($25.00) are absolutely fine. And do the following for both the throttle and stick, to solve the ghosting and jittering problems (usual on the throttle and rotaries, and they also happen with the X52 and X52 Pro), by using vJoy + Joystick Gremlin, following this tutorial:
  10. I agree with @BitMaster. Even though DDR5 is better in this regard than DDR4 was before, it is clear that the rule still persists - two sticks of RAM are always better than just one. The main advantages are higher memory bandwidth, higher access efficiency, and lower latency. It's all better with two sticks (instead of just one) - see the quick arithmetic below. DDR5 is much cheaper than before, so there is no reason not to get a dual kit (say, 2x 32GB or 2x 48GB) that is meant to be installed and run in that configuration. All these Intel and AMD processor based systems that we go for are designed to run ideally in dual-channel memory configuration. Especially if you're already spending on a system, not doing it is counter-intuitive. It's neglecting a particular part of the system performance that most people here crave. There are plenty of articles and videos explaining this with a more scientific approach (and with tests). This one, for example, is a bit old (it dates from when DDR5 was out but too expensive, and DDR4 mobos for 12th gen were a better compromise), but have a listen from 3:19 onwards:
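For a very rough idea of what dual-channel buys you in raw bandwidth (back-of-the-envelope numbers; DDR5-6000 is just an assumed example kit):

```python
# Theoretical peak bandwidth, one stick vs two (DDR5-6000 assumed as the example kit).
transfers_per_s = 6000e6      # 6000 MT/s
bytes_per_transfer = 8        # one 64-bit channel
single_channel = transfers_per_s * bytes_per_transfer / 1e9   # ~48 GB/s with one stick populated
dual_channel = single_channel * 2                             # ~96 GB/s with two sticks in dual-channel
print(single_channel, dual_channel)
```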
  11. I don't have a Ryzen 7950X3D, but its issues are well known in the HW community. You may already be very familiar with the following but, in any case... The AMD 7950X3D is a dual-chiplet CPU. It gets more complicated because it has 16 cores, but only 8 of them make use of the 3D V-Cache. IIRC, CCD 1 (chiplet one) is the CCD with the 3D V-Cache, so make sure that the game is only worked on (CPU affinity) by those 8 cores with 3D V-Cache, and not on CCD 2 (chiplet two). I presume that would mean setting the affinity to logical cores 0-15, the ones with the 3D V-Cache (16-31 are without it) - see the rough sketch below. Something like Process Lasso may be the better way to create settings that automate affinities; very useful with a 7950X3D.
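Just to illustrate the idea, a minimal one-shot sketch, assuming logical CPUs 0-15 really are the V-Cache chiplet on your system (do verify that first; Process Lasso does the same thing persistently and more comfortably):

```python
# Hypothetical sketch: pin a running DCS.exe to the 3D V-Cache chiplet.
import psutil

VCACHE_CPUS = list(range(16))  # logical processors 0-15 (assumption - check which CCD has the V-Cache)

for proc in psutil.process_iter(['name']):
    if (proc.info['name'] or '').lower() == 'dcs.exe':
        proc.cpu_affinity(VCACHE_CPUS)   # restrict the game to those cores only
        print(f"Pinned PID {proc.pid} to CPUs {VCACHE_CPUS}")
```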
  12. Yes, pre-rendered frames depend on the CPU, but it actually works the opposite way - more pre-rendered frames alleviates the CPU. Allowing 3 (and up to 4) pre-rendered frames lets your GPU stack them up, at a small cost in latency (unnoticeable in a flight sim, i.e. no downside in this aspect). So if your CPU ever struggles to keep up (a likely case in DCS), you can tell the GPU to let X amount of frames pre-render. A pre-rendered value of 1 would be ideal in a world where this sim never, ever stutters or hitches. But alas... "DCS". Pretty much everybody in DCS VR is, unfortunately, running the game smoothly only part of the time, with the odd hitch or stutter always creeping in. Therefore, try setting it to either 2, 3 or 4 frames to help smooth those over. I think the benefits vary between values, and can be debatable (through empirical and quantitative testing, etc.). The best way is to try it out yourself and see what looks/feels better. For me, it certainly makes a difference, for the better (I prefer it set at 3). It could be different with a 13900K + RTX4090, but it won't necessarily work better.
As for Terrain Shadows: "Default" adds a performance penalty, being more stressful due to its higher projected accuracy, even with lower cascades (mod in my sig). The odd thing for me is that, with game versions up to 2.5.6, going from OFF to FLAT used to have virtually zero GPU frametime cost. But it can cost about 15%~25% ever since version 2.7 came out, and it remains like that with 2.9 today (and 2.8 before). That's why I always use them OFF, and recommend so, for VR. ...one reason (among various others) why I went back to version 2.5.6 for VR and never looked back - Terrain Shadows (Default or FLAT) can be used there with far fewer issues.
  13. No problem. If you just want to get back to 2.8, then maybe just download and install the current Stable version - that's still 2.8. If you're looking to roll back your current installation to a specific OpenBeta version, then it's a little more involved... I covered that in another thread recently; the principle is the same as in HERE. You just look for the desired version ID and follow those same guidelines for it.
  14. I concur. I still adore my RTX3090 24GB, but wouldn't mind swapping it for a good RTX4080 16GB if the opportunity were given. It has less VRAM (16GB vs 24GB) but that's plenty, and it's overall faster (+25%) while being more efficient (~50W less at peak consumption, ~320W vs ~370W). Very expensive, but it may be worth it for someone coming from an older GPU, considering the current sad state of the HW market. Regardless of choice, and if you're spending for DCS VR, avoid any GPU with less than 12GB of VRAM, and better if it's 16GB or more. If you're OK with used hardware, there may be some deals worth looking at (eBay, etc.) for the RTX3080Ti 12GB and RTX3090 24GB. Otherwise, if it really has to be brand new, the RTX4070Ti 12GB or RTX4080 16GB would be my pick. PS: rumour has it that NVIDIA will be releasing the RTX 4000 Super series refresh from the start of next year (2024). You may want to wait (or not) and see if the RTX 4070Ti Super 16GB is a reality, like the info leaks suggest.
  15. Well, you've got a beastly machine, that's for sure. But optimization issues within DCS itself, especially in VR, cannot be cured by hardware horsepower.
One small mistake that I see already - you should never, ever enforce Antialiasing mode and settings in the NVIDIA driver profile for DX11 games (like DCS is). You set that to "Enhance The Application Setting" and the following one to "2x". That gives no benefit, it creates a conflict with the in-game settings, and it impacts performance. Always leave Antialiasing mode and setting at "Application Controlled" for DX11 games (different story for older DX9 games, but that's irrelevant here). The sole exceptions are "Gamma Correction", "Line Gamma" and "MFAA", which can be enabled/disabled to your own preference.
Anyways... you've got nothing to lose, might as well experiment a little. We'll go in parts. First, the NVIDIA profile settings. Second, after that, I'll give a few notes about in-game settings that you may want to experiment with as well.
One important setting in the NVIDIA Global Settings seems to be already okay (the Shader Cache Size at 10GB is a "must do"). The thing is, you should customize the specific profile for DCS, in the NVIDIA per-game settings. So, let's go in steps (please bear with me)...
First of all, open the NVIDIA control panel and, once in "Manage 3D Settings":
- Leave the "Global Settings" as is for now.
- Go to the "Program Settings" (immediately to the right of the Global Settings).
- Search for and then select "Digital Combat Simulator: Black Shark (dcs.exe)" (this is the profile that NVIDIA identifies and applies things to for DCS World).
- Click "Restore", so that it reverts things to a clean sheet for the specific profile of DCS (it'll do so only for the selected game profile).
Now, try my settings as in the image below, exactly as they are (obviously, feel free to experiment after). Do pay attention to the following - my global settings may be different from yours. So, when you see "Use Global Setting" in my settings there, pay special attention to what appears inside the parentheses right after it (that's what the setting is at). Change accordingly. Once you finish with the changes, click "Apply" at the bottom.
NOTE: If at some point you wish to revert to the default DCS profile settings and start all over again (for whatever reason), then click "Restore" and you'll have a DCS profile clean sheet again.
(click on image to enlarge it)
Finally, looking at your specs, and taking your current DCS System settings (the game options) as a basis, try changing these and see how it goes:
"Visib Range" ---> MEDIUM (good enough detail for VR that doesn't bog down performance)
"Shadows" ---> MEDIUM (good enough detail for VR that doesn't bog down performance)
"Lens Effects" ---> Flare (clean and natural sun effects, with no BS camera lens bubbles)
"Clutter/Grass" ---> 1100 (nice enough detail for VR that doesn't bog down performance)
"Forest Visibility" ---> 100% (at maximum, this will balance and somewhat mask the tree-popping caused by the limited Visib Range)
"Forest Details Factor" ---> 0.4 (this is a LOD switch for details, avoid more than 0.5)
"Scenery Details Factor" ---> 0.4 (this is a LOD switch for details, avoid more than 0.5)
"Preload Radius" ---> 75000 (even with 64GB of RAM, above 90000 the loading times become long and it impacts RAM + pagefile)
"Chimney Smoke Density" ---> 1 (this is the number of chimneys in a radius; the minimum is also best for a lower repetition pattern)
"Anisotropic Filtering" ---> 8x (you don't need more than this in DCS, be it for VR or a 2D screen, and more can impact performance)
"Terrain Object Shadows" ---> OFF (this is important to avoid stuttering in VR; "Flat" or "Default" only when VR performance is really good)
Anyways, that's what works out for me, for DCS in VR. Not sure (no guarantees) but it may be useful for you and others. And if nothing else seems to work... heck, revert to a previous version of DCS that works better for you, if you do feel that's better. I actually ended up reverting to a much older version - the one in my sig - and it's been the best DCS VR experience yet for me.
  16. DCS is DX11 and, AFAIK, DX11 draw calls are made on the CPU. It also uses a deferred rendering mode, which is less optimized (rather than forward rendering, which would be ideal, especially for VR). So, yeah, the CPU is still kind of a big deal. @markturner1960 His performance problem may have to do with the fact that the AMD 7950X3D has 16 cores, but only 8 of them actually make use of the 3D V-Cache. IIRC, CCD 1 is the CCD with the 3D V-Cache, so make sure that the game is only worked on (CPU affinity) by those 8 cores with 3D V-Cache, and not on CCD 2. I presume that would be logical cores 0-15 with the 3D V-Cache (and 16-31 without). Lastly, the blurriness. Make sure to use MSAA x2 (or x4, if performance is really good) instead of DLSS/DLAA or TAA - those are awful in VR; even with added sharpness, cockpits get very blurry. And get back to 150% HMD resolution once performance allows it (never less than 100%, even in the worst scenario, IMO). Bit of a side note but, for me, DCS 2.9 image quality suffered a bit in VR compared to previous versions (very foggy, lots of haze) with a Reverb G1. But it's fine on a 2D monitor. I know it's a bit controversial in a community where the latest is always "the best evaah" but, for me, it stays like that no matter what I mess with in the settings (so, again, I returned to good old DCS 2.5.6 for VR - it's all perfect there).
  17. When you say AIO version, I presume you mean a watercooled GPU with its own radiator? If so, yes, there are a few RTX 4080 models with their own radiator:
Gigabyte RTX 4080 16GB Aorus Xtreme WaterForce (360 radiator)
Colorful iGame RTX 4080 16GB Neptune OC-V (360 radiator)
Inno3D RTX 4080 16GB iChill Black (240 radiator)
Be careful with the prices that some sellers are putting on these (the top-performance and coolest RTX 4080s). They are usually around $1550.00, but some are asking even higher prices that get too close to RTX 4090 money... Also note, the RTX 4080 is not a problematic GPU in regards to temperatures or quality, and pretty much any air-cooled model is good (and less expensive than those). Matter of opinion but, I'd even say to pick the brand you like/trust the most, and/or go for the one you like the most aesthetically. In any case, here's an RTX 4080 buying guide that may help:
  18. I presume that monitor is 3440 x 1440, 144Hz+ (?). A graphics card upgrade would be good there. An excellent choice to go with that monitor is the RTX 4080 16GB.
  19. No problem, glad if it helps. Regardless of the motherboard you end up getting, if the number of M.2 drives is important, then you always have the option of a PCIe expansion card to fit a bunch of them. For instance, among others, there's the ASUS Hyper M.2 X16 PCIe 4.0 X4: it supports up to 4x NVMe Gen4/Gen3 drives and sells for about £75.00. Note though that, as I understand it, it works by splitting the slot's lanes (PCIe bifurcation, e.g. x4/x4/x4/x4), so the motherboard needs to support that for more than one or two drives to be usable.
  20. First of all, congrats! I'm pretty sure you're happy with the experience. Those positive first performance tests, after you brought the thing up yourself, are so rewarding. A pre-built PC acquisition just can't match that sense of accomplishment and ownership.
OK, now for the high CPU temps. Intel 13th and 14th gen are supposed to be fine up to 100°C, but keeping it under 85°C is ideal. From what I gather you're well aware of the case fan + radiator fan orientations for thermal dynamics, and that you've got the rear fan in order (that one is vital, to get hot air out). Please bear with me on the following points:
Make sure that you really did peel the sticker off the cooler's coldplate (the flat thing that actually touches the processor). I know it sounds silly (looks too obvious), and pardon the whole repeat of cleaning and reapplying thermal paste, etc., but it happens more often than you'd think!
Get the Thermalright Anti-Bending Buckle for LGA1700. Although not a necessity, it's a cheap solution that is worth the time (it's not complicated, watch the videos). The stock ILM on Z790 (and the previous Z690) motherboards, unfortunately, doesn't ensure the best contact between the cooler's coldplate and the processor, and that thing solves that problem.
ASUS is one of the motherboard manufacturers that uses an "enhancement" setting that they leave ON (on "Auto") by default. In my opinion, it should always be set to OFF (as "Disabled"), because leaving it enabled actually brings more trouble than benefits. That specific setting is "MCE" (MultiCore Enhancement), probably shown there as "ASUS MultiCore Enhancement". What this setting does is remove Intel's safety limits and apply even more voltage, to enforce turbo clocks on all cores (hence why many call it an irresponsible cheat). As far as long-term goes, MCE won't overvolt anything to dangerous levels, but it does make the CPU operate at (even) higher wattage and temperatures. The point of relevance here is that it provokes what the CPU isn't supposed to do from the factory. It's found in the motherboard BIOS, in Advanced mode, under the "AI TWEAKER" or "EXTREME TWEAKER" tab (one name or the other, depending on model). Just set that thing to OFF (as "Disabled"), save and exit the BIOS... and off you go. IIRC, on the TUF boards it's under the "AI TWEAKER" tab, while on the ROG boards it's under the "EXTREME TWEAKER" tab (like the image below). Different names but the same thing. Regardless, it should look similar to this:
  21. The problem is that, when using VR for DCS, the engine renders the entire scene twice, once for each eye. Not sure how far Vulkan will help in the future - we'll see later, once it's in DCS. What has become clear is that DLSS is definitely not a decisive solution for VR, because ghosting issues occur and the P.D. (or the resolution per eye) has to be increased to disguise the heavy blurriness - losing whatever gains you'd otherwise obtain with it. If you consider that DCS is already utterly demanding on a single screen (be it 1440P or 4K), now imagine rendering it twice (rough numbers below)... Hence the brutally overkill PC specs you often see with fellow DCS VR users on newer headsets (Pimax Crystal, Varjo Aero, Bigscreen, Quest 3 and Pro, Pico 4, etc.).
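To put very rough numbers on "rendering it twice" (my own arithmetic; the 2160x2160 per-eye panel and the ~1.4x per-axis render-target headroom are assumptions based on a Reverb-class headset, not exact values for every HMD):

```python
# Rough pixel-count comparison: 4K monitor vs a Reverb-class VR headset.
def megapixels(width, height):
    return width * height / 1e6

monitor_4k = megapixels(3840, 2160)                  # ~8.3 MP, rendered once
hmd_panels = 2 * megapixels(2160, 2160)              # ~9.3 MP across both eyes (panel resolution)
hmd_render = 2 * megapixels(2160 * 1.4, 2160 * 1.4)  # ~18 MP actually rendered, with ~1.4x per-axis lens-correction headroom
print(monitor_4k, hmd_panels, hmd_render)
```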
  22. You're not flying in VR... that's a completely different kettle of fish compared to 2D (i.e. with a monitor or TV panel). DCS in VR is waaaaaaay more demanding hardware-wise, and more problematic optimization-wise.
  23. I agree with what @kksnowbear posted (some good advice there).
The i9 9900K (or 9900KS or 9900KF) upgrade could be a mid/short-term stop-gap solution; it's a direct replacement in Z370 motherboards (as is your case) with the newest BIOS. Note that they're still in high demand - lots of people are deciding to upgrade their older 8th gen based systems with one - and it overclocks nicely to 5.0+GHz (all-core OC). I'd only get one from the used market (eBay, etc.), where it can be found for around $250. The thing is, you mention VR and, especially with DCS, that is the harshest form of gaming/simming for PC hardware that you'll probably encounter. The i9 9900K, while still plenty good if using a 2D monitor or TV (at any resolution), doesn't really cut it for VR in DCS at a certain point. That said, not even the most potent current hardware can solve all of DCS's big optimization issues, especially in VR...
If VR is your goal, then in your case I'd vote for the CPU + motherboard + DDR5 RAM upgrade, and keep (re-use) all of the remaining components you have (they're still great). Then sell that 8700K, motherboard and DDR4 RAM on the used market (eBay, etc.) or to a friend. In regards to the CPU + motherboard + RAM, I'd change your upgrade specs and go instead for something like this:
CPU: Intel Core i7 14700K / KF (or Intel Core i7 13700K / KF as an alternative)
You definitely don't need the i9 14900K (way too hot, too inefficient); I'd even say to avoid it at any cost. The i7 14700K (or the i7 13700K as a less expensive alternative) does sim/gaming just as well, with less heat and power consumption, and for less money as well.
Motherboard: ASUS TUF Z790 Gaming PLUS WiFi (DDR5)
If your motherboard "has to be an ASUS", then let me tell you that you don't need a ROG MAXIMUS Z790 HERO. The TUF Z790 Gaming PLUS WiFi (DDR5 version) is great and does the same job reliably, for half(!) the price. Please note - Z790 motherboards support the newest Intel 14th gen but require the latest BIOS update for it. There are some new ASUS models not requiring it, but they are expensive.
Memory (RAM): 64GB (2x 32GB) DDR5 6400 CL32
64GB (2x 32GB) is highly recommended for DCS (it devours memory, as you may have noticed). For "speed/latency vs price", DDR5 6400 CL32 is the sweet spot for Intel 13th/14th gen. It's a similar price to DDR5 6000 CL30 and is better (quick arithmetic below). A couple of examples:
- https://www.gskill.com/product/165/374/1665644504/F5-6400J3239G32GX2-TZ5RK
- https://www.gskill.com/product/165/377/1677726067/F5-6400J3239G32GX2-RS5K
(note: I prefer G.Skill memory kits, but other brands should also have it)
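On the 6400 CL32 vs 6000 CL30 point, the quick arithmetic (my own numbers) is that the absolute CAS latency works out the same, so the faster kit simply adds bandwidth on top:

```python
# First-word CAS latency in nanoseconds: CL cycles divided by the actual clock (half the MT/s rate).
def cas_latency_ns(mt_per_s, cl):
    return cl / (mt_per_s / 2) * 1000

print(cas_latency_ns(6400, 32))   # 10.0 ns
print(cas_latency_ns(6000, 30))   # 10.0 ns -> same latency, but the 6400 kit has more bandwidth
```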
  24. That's a good point but, the thing is, you got onto the platform on the right foot, with the 2nd best in the line-up (and a great chip, btw). Now imagine that you'd only managed a 12600KF back at the time of launch (must have happened to someone). As good as that still is, wouldn't you feel better now knowing that you can put a 14700K or 14900K into your system later? That's a decent upgrade, with just a BIOS update, without having to change anything other than the thermal paste and probably the cooler (not even a new Windows install). Similar to the common "AMD superior life span" argument then, after all. That was my point.
What I still don't get with the whole "AMD AM4 platform long life span" that some people hold on to like a flag... why does it even matter, if the initial products (and for years) were not even worth buying? For instance, take that example, R7 1700 to R7 5800... remember how bad the Ryzen 1000 series was for gaming, and how hard techtubers tried to convince us otherwise? ...barely able to keep up with a stock i7 3770K launched five years prior. The 2000 series was a bit better, as was the 3000 series again later, but they all suffered from Infinity Fabric issues (poor performance, stuttering galore). Those in that scenario, who bought the R7 1700 for gaming, got quite a few hair-pulling issues for over three years until, finally, upgrading to the first good line-up of AM4 CPUs (but the last to be released for it), the 5000 series. Was the whole thing a worthy investment and experience? Wouldn't it have been better to spend all those "suck it up" years instead with the i7 8700K (awesome, stock or OC'd), with zero issues of any kind, performance or otherwise, right from the start, from day one and for years, probably to this day? Platform life span is all good and great on paper, but in practice...? I believe instead that buying RIGHT the very first time makes up for any lack of platform continuity. No one should ride on promises - much less on a platform which doesn't even work right with four sticks of RAM, featured on every expensive mobo they sell (a scam). If you get a gaming PC to enjoy yourself, and pay through the nose for it, then make sure you get what is proven to make you the happiest from the very first moment. Forget whatever hasn't even been launched yet.
  25. You're totally correct that Intel Z790 is a dead end, while AMD AM5 isn't.
About the life span of Intel's chipset platforms, as a side note and pardon my rant, people often extrapolate things and it has been overblown (IMO). Take, for example, the Intel Z690 motherboards. Across different manufacturers and all price segments, most of these now support (with a BIOS update) three generations of Intel chips - 12th, 13th and 14th gen. So, someone who bought an i5 12600K and a Z690 motherboard in 2021 is able to upgrade, over two and a half years later, to, say, the new i9 14900K, as a direct CPU swap with a simple BIOS update (so, just like with the respective AMD AM4 CPUs). And this backward/forward compatibility has happened with every Intel "Z" chipset for years now:
Z170 (2015) and Z270 (2017) both support 6th and 7th gen.
Z370 (2017) and Z390 (2018) both support 8th and 9th gen.
Z490 (2020) and Z590 (2021) both support 10th and 11th gen.
Z690 (2021) and Z790 (2022) both support 12th, 13th and 14th gen.
You won't see any mention of this from any techtuber/influencer or website, because it goes against the recent narrative and agenda - too controversial for some minds. Then the profitable "niceties" and connections from that other side could end, and the AMDrones (the fanboys, like football hooligans) would flood the gates to harass/cancel you and your platform, and/or poison the comments section till the end of days, like always. And so, along with marketing and herd mentality, things go as they go...