kksnowbear

  1. Hopefully that's just a typo. Again, unless you want to drop the GPU slot to 8 lanes, you need to avoid M2_2 and M2_3. As I described above, those two share lanes with the GPU, while M2_4 and M2_5 use chipset lanes which aren't shared with anything else. Also, if you want to install an additional M2 drive, you can get a PCIe M2 adapter card (~$15) and install it in the bottom PCIe slot. That slot uses four chipset lanes at PCIe 4, so you'd get the full speed of any PCIe 4 drive. It requires you to buy the adapter card, but it would bring the total to three PCIe 4 drives (one each at M2_4 and M2_5, plus one on the adapter card) and one PCIe 5 drive (at M2_1), all running at full rated speed with no sharing. And you'd still have access to all four SATA ports as well, if you want/need to use them. Not as fast as M2 drives, obviously, but still perfect for older/slower storage like HDDs and SATA SSDs you might want to reuse.
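FWIW, here's a rough tally of that layout, just a sketch: the per-lane throughput figures are approximate PCIe 4.0/5.0 numbers, and the slot names simply follow the manual's labels.
```python
# Rough tally of the suggested layout: M2_1 on CPU PCIe 5.0 x4, M2_4/M2_5 on
# chipset PCIe 4.0 x4 each, plus a ~$15 M.2 adapter card in the bottom PCIe 4.0 x4
# slot. Per-lane figures are approximate (PCIe 4.0 ~2 GB/s, PCIe 5.0 ~4 GB/s).
slots = {
    "M2_1 (CPU, PCIe 5.0 x4)":             4 * 4.0,  # Gen5 drive, full speed
    "M2_4 (chipset, PCIe 4.0 x4)":         4 * 2.0,  # Gen4 drive, full speed
    "M2_5 (chipset, PCIe 4.0 x4)":         4 * 2.0,  # Gen4 drive, full speed
    "Adapter card (chipset, PCIe 4.0 x4)": 4 * 2.0,  # third Gen4 drive, full speed
}

for name, gbps in slots.items():
    print(f"{name}: ~{gbps:.0f} GB/s")
print("GPU slot stays at x16, and all four SATA ports remain free.")
```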
  2. Nothing wrong with using SATA SSDs. In fact, for boards where there's only one M2 slot, I take the advice above a step further by recommending booting from a SATA SSD, using other SATA drives (even conventional HDDs) as needed for general storage/games that aren't as demanding, and reserving the M2 slot for games that benefit most from the additional performance. As far as the fans/AIO mounting... Brilliant. AIO radiators should never be used as intakes; people usually do this for one reason (and it's not a very good reason). As with the Gen5 drives above, there will invariably be those who come along and dispute this. Their opinions are at odds with fact.
  3. Right. M2_1 will run at full PCIe 5 speed regardless of which other slots you populate (this is because it uses 4 dedicated CPU lanes). Unless you're willing to split PCIe lanes with the GPU to get more PCIe 5-speed storage, avoid M2_2 and M2_3. Use (up to two) PCIe 4 drives at M2_4 and M2_5. On that point, let me strongly recommend that you use a smaller, less expensive PCIe 4 drive to boot from (i.e., separate boot and game drives). Even if you boot from a considerably slower drive, you're still far better off not running the game from the same drive you boot/run the OS from. Just looking at the pic above: please tell me you do not intend to mount that AIO CPU cooler as an intake...
  4. Actually it is documented fairly well, in several places in the manual. It is also not entirely accurate to say that there is no benefit to using PCIe 5.0 storage. In the simplest of terms, the faster a storage subsystem fulfills a request for data, the more responsive the system. This can benefit games beyond just decreased loading times. There will invariably be those who come along and dispute this. Their opinions are at odds with fact. It is true Gen5 storage generally costs more per unit of storage. This is as simple as "you get what you pay for". It's nowhere near as bad as some people make it out to be: the speed of PCIe 5 storage (for a good drive) is doubled, but the cost is ~20% more, depending on the drive size etc. In the case of a 2TB drive, a very good PCIe 5 drive costs around $30 more than a 2TB 990 Pro, arguably the best PCIe 4 drive available. That $30 difference represents far less than 1% (maybe even as little as 0.5%... yes, one half of one percent) of a build like the one being discussed in this thread. That should make sense to anyone who spends several thousand on a gaming PC... particularly anyone who bought a 5090. Whether it's "worth it" or "makes sense" is entirely up to the individual; the fact that it can improve performance is a separate matter. (Hint: if someone's idea of "improved performance" is based strictly on frame rates, they are already missing the point.)
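To put the numbers in perspective (the $30 delta is from the example above; the build cost is just a placeholder for "a several-thousand-dollar build", so plug in your own):
```python
# Illustrative only: the $30 delta is the example above; the build cost is a
# placeholder - substitute your own figure.
gen5_premium = 30.0    # approx. price difference vs a 2TB 990 Pro
build_cost = 4000.0    # hypothetical total build cost
print(f"Premium as a share of the build: {gen5_premium / build_cost:.2%}")  # ~0.75%

# Speed vs. cost, roughly: sequential throughput doubles for ~20% more money.
speed_ratio, cost_ratio = 2.0, 1.2
print(f"~{speed_ratio:.0f}x the speed for ~{(cost_ratio - 1):.0%} more cost")
```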
  5. Pretty sure that is not accurate. The M.2 slot next to the RAM (M2_1) uses four PCIe 5 lanes directly from the CPU, and the GPU slot uses 16 PCIe lanes directly from the CPU. Neither of these is affected by the other, provided that only those two are populated. M2_2 and M2_3 use lanes shared with the primary GPU slot. This allows a bifurcated configuration which splits 16 of the CPU's PCIe 5 lanes into x8/x4/x4 if desired (8 for the GPU, 4 each for those two M.2 slots). Populating those M.2 slots drops the GPU slot to 8 lanes - by design. This is for people who prefer more storage, considering that current GPUs aren't really affected by running at 8 PCIe 5 lanes instead of 16. M2_4 and M2_5 use four chipset PCIe 4.0 lanes each, which are not shared at all. You can run two Gen4 drives and one Gen5 drive, all at 4 lanes each (i.e. full speed). This is no different than an X670 chipset/board. Use M2_1, _4 and _5. The only reason the board supports bifurcation is to allow more storage (five slots, at the expense of sharing CPU PCIe lanes). Note this assumes a supported CPU is used: 9000 and 7000 series CPUs support this; 8000 series do not. All of this is covered in the manual and on the Asus website.
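A minimal sketch of the sharing rule described above (slot names as in the manual; this just encodes the x8/x4/x4 behavior for illustration, not anything the board firmware actually exposes):
```python
def gpu_link_width(populated_m2_slots):
    """Return the GPU slot's link width given which M.2 slots are populated.

    M2_2 and M2_3 borrow CPU PCIe 5.0 lanes from the x16 GPU slot (x8/x4/x4
    bifurcation); M2_1, M2_4 and M2_5 don't touch the GPU lanes at all.
    Assumes a 7000/9000-series CPU (8000-series doesn't support this).
    """
    shared_with_gpu = {"M2_2", "M2_3"}
    return 8 if shared_with_gpu & set(populated_m2_slots) else 16

print(gpu_link_width({"M2_1", "M2_4", "M2_5"}))  # 16 - recommended layout
print(gpu_link_width({"M2_1", "M2_2"}))          # 8  - GPU drops to x8 by design
```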
  6. @Vitamin_J94 Here's something that might be worth a look: https://www.newegg.com/samsung-ls57cg952nnxza-57-duhd-240-hz-odyssey-neo-g9-va-white/p/N82E16824027274?Item=N82E16824027274&SoldByNewegg=1 (Just happens to be a NewEgg link; I don't make anything from any source for it, and it's also available from Amazon and other places too.) It caught my eye since it's the same type of Samsung VA panel as discussed above, at 4k in the vertical (50% more pixels than your current setup)... but with *twice*(!) the horizontal pixels of 4k (7680, the same horizontal FOV as you're accustomed to). So you gain in the vertical without losing in the horizontal. If I'm looking at things correctly, it would be roughly the same physical height as your 32s; something like ~16" (and actually a bit more, if I'm not mistaken). And with that magnificent 1000R curve. Immersion like no other. Maybe worth considering. Of course, if you just prefer to go to a 16:9 aspect ratio and still get the 4k vertical pixel count, there's always the Ark 55 as above. Best of luck to you, whatever you choose.
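Quick numbers behind that, assuming (as discussed earlier in the thread) that your current setup is three 2560x1440 panels side by side:
```python
# Assumes the current setup is three 2560x1440 monitors side by side.
current_w, current_h = 3 * 2560, 1440      # 7680 x 1440 total
neo_g9_57_w, neo_g9_57_h = 7680, 2160      # 57" Odyssey Neo G9 (dual UHD)
uhd_w = 3840                               # single 4k panel, horizontal

print(f"Vertical gain:  {neo_g9_57_h / current_h - 1:.0%}")                    # +50%
print(f"Horizontal:     {neo_g9_57_w / current_w:.0%} of what you have now")   # 100%
print(f"...which is also {neo_g9_57_w / uhd_w:.0f}x the horizontal pixels of 4k")
```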
  7. Yup. And this is why I strongly prefer curved over flat panels (even though I still think the 21:9s and 32:9s are more immersive than the 16:9s that are available). The Ark 55 gets points for sheer size and a 1000R curve; if you do the math, a 55" panel is comparable even to the G9 in physical FOV... plus it's 4k, and yes, a very good panel even if it's VA... I often think I should've jumped at the $1450 mark a few months ago... but it may come back around in time. What I really wish is that the Ark 55 were an OLED panel... lol, but then it would cost like 8 billion dollars and I couldn't afford it anyhow.
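For anyone curious about "do the math": treating both as flat panels for simplicity (ignoring the curve), the physical widths work out roughly like this.
```python
import math

def panel_width_height(diagonal_in, aspect_w, aspect_h):
    """Approximate width/height of a flat panel from its diagonal and aspect ratio."""
    diag_units = math.hypot(aspect_w, aspect_h)
    return (diagonal_in * aspect_w / diag_units,
            diagonal_in * aspect_h / diag_units)

ark_w, ark_h = panel_width_height(55, 16, 9)   # Ark 55, 16:9
g9_w, g9_h = panel_width_height(49, 32, 9)     # Odyssey G9 49, 32:9

print(f"Ark 55: ~{ark_w:.1f}\" wide x ~{ark_h:.1f}\" tall")  # ~47.9" x ~27.0"
print(f"G9 49:  ~{g9_w:.1f}\" wide x ~{g9_h:.1f}\" tall")    # ~47.2" x ~13.3"
```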
  8. This is something that I've been considering for quite some time. I just wish it offered a better panel than VA (if I'm not mistaken), plus, even though at 4k it's an increase in vertical resolution, I'd actually be giving up 25% of my horizontal resolution compared to my current G9 (3840 vs 5120). But I may do it yet (and even though it's 55", my measurements indicate conclusively that it will fit quite nicely over my desktop once mounted properly, yet never come close to my knees, and still put my eye line at/near the top edge of the monitor, as industry experts actually recommend, rather than at the center...)
  9. It may be true that ultrawides are more appropriate to simracing environments, given their extra FOV in the horizontal. I'm just not a fan of claims that ultrawides "lose" something in the vertical. As I illustrated earlier, they don't lose anything provided you're comparing apples to apples at the same vertical resolution (and why would anyone compare anything else?). That claim is misguided, ignores empirical data, and is more of an optical illusion than it is anything close to fact. An ultrawide monitor looks like it's not as tall because of its aspect ratio, but resolution is what determines what you see in-game. For example, my G9 shows me exactly as much of the game as any 1440p 16:9 monitor in the vertical - and it shows twice as much in the horizontal. In fact, 33% more than even a 4k monitor in the horizontal (5120 vs 3840). Consider that there's nowhere in the game where you enter your monitor's physical size. That's because the game doesn't know or care what size monitor you have - all it knows is how many pixels high by how many pixels wide. That pixel count determines what you see (without turning your head); the FOV or viewport, if you will. If you feel you'd be happy with a 16:9 aspect ratio, then there's little doubt that a 4k OLED display (size as you prefer/budget allows) is the way to go. However, if you want something curved and larger than, say, ~48", OLED isn't likely an option. Be aware also that it's misleading for anyone to suggest that you can't use monitors >48" in a flight sim environment - even on a desktop. Military aircraft commonly feature displays, gauges, and controls at and even below knee level, and obviously what a pilot sees goes well below typical desktop height - a fact that monitor stands like yours (and other creative approaches) can be set up to reflect nicely, especially if one doesn't have the physical displays etc... so buy whatever size your budget permits.
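The apples-to-apples comparison in numbers (pixel counts only, since that's all the game sees):
```python
# What the game actually sees is a pixel grid, not a physical size.
g9 = (5120, 1440)         # 49" Odyssey G9, 32:9
qhd_16x9 = (2560, 1440)   # any 1440p 16:9 monitor
uhd_16x9 = (3840, 2160)   # 4k 16:9

print(f"Vertical:   G9 vs 1440p 16:9 -> {g9[1]} vs {qhd_16x9[1]} (identical)")
print(f"Horizontal: G9 vs 1440p 16:9 -> {g9[0] / qhd_16x9[0]:.0f}x as wide")        # 2x
print(f"Horizontal: G9 vs 4k 16:9    -> {g9[0] / uhd_16x9[0] - 1:.0%} more pixels")  # ~33%
```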
  10. Your "in-game" vertical field of vision is precisely the same as anyone running a 1440p monitor. It's a myth that you're "losing" anything by running an ultrawide aspect ratio. Moreover, since you're already at 1440p, the only way to increase resolution is to go to 4k (or some variant of a 2160 "short side") - and, as above, if you want to keep a similar aspect ratio, the resulting resolution will obviously demand much more of your GPU (though I don't know that anyone even makes such a monitor bigger than 32"). Just something to think about (and I speak from first-hand experience). You can go to 4k at 16:9, but you'd be sacrificing the 300% horizontal FOV you paid to get from three monitors - and only gaining 50% in the vertical. (FWIW I use a Samsung Odyssey G9 49" 32:9/5120x1440 monitor.)
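Spelled out (again assuming three 2560x1440 panels as the starting point):
```python
# Assumes three 2560x1440 monitors vs. a single 3840x2160 (4k 16:9) display.
triple_w, triple_h = 3 * 2560, 1440
uhd_w, uhd_h = 3840, 2160

print(f"Horizontal pixels: {triple_w} -> {uhd_w} "
      f"({uhd_w / triple_w:.0%} of the triple-screen width)")  # 50%
print(f"Vertical pixels:   {triple_h} -> {uhd_h} "
      f"(+{uhd_h / triple_h - 1:.0%})")                        # +50%
```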
  11. Ah... so it sounds as if you're interested in 4k resolution (that is, 2160 on the vertical or "short side")... correct? Do you intend to try to get close to a similar aspect ratio? If so, it's probably important to keep in mind that the total resolution will increase a *lot*. And while the 4090 is a magnificent card (I use one and have owned three), there's going to be a fairly big resulting hit on performance.
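Roughly how much bigger the render target gets if you keep a similar aspect ratio but move to 2160 on the short side (the starting point here is assumed to be the triple-1440p setup):
```python
# Assumes the starting point is three 2560x1440 panels (7680x1440 total) and the
# target keeps roughly the same width at a 2160-pixel vertical ("4k short side").
current_px = 7680 * 1440   # ~11.1 million pixels
target_px = 7680 * 2160    # ~16.6 million pixels

print(f"{current_px / 1e6:.1f} MP -> {target_px / 1e6:.1f} MP "
      f"(+{target_px / current_px - 1:.0%} more pixels per frame for the GPU)")
```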
  12. Can you specify the resolution of your current monitors, please? What model are they (or what type of panels: VA, IPS, etc.)? Also, it would be prudent to know whether your intended result is the same, higher, or lower resolution than what your current arrangement provides.
  13. lmao I give up...you can lead a horse...but a monkey still isn't a carpenter just because he's got a hammer lol...
  14. What you're describing is because of what's called memory "training". Typical behavior (especially for recent gen AMD boards), depending on BIOS settings - as you've seen. The OP is discussing a different typical behavior, whereby a machine will do as he's described *if* the PSU is turned off after shutdown - as he has acknowledged above: it only happens when he switches off the PSU.
  15. Thanks for clarifying. Yes, I would say this is normal - as I mentioned above, some BIOS settings (obviously, enabling the XMP profile in your case) will cause a shutdown-restart even if the machine is already running (once you save/exit, that is). For example, if you're in the BIOS and change certain settings, then save/restart, the machine will just restart and boot. Certain other settings, however (usually involving memory, like XMP in your case), will cause the machine to shut down completely and then restart after you save/exit. I've always associated this with (what I call) 'initialization', where memory training occurs later in the POST sequence. In any case, I believe what you're seeing is normal behavior. Glad if it helps. FWIW, the only time I turn off the PSU switch is when I'm working on stuff (swapping guts) or moving a machine. But I do have several UPS units, basically one for each bench/machine plus a couple of others for the monitors etc., so I don't worry about leaving them switched on.