kksnowbear

Members
  • Posts

    881
  • Joined

  • Last visited

5 Followers

Recent Profile Visitors

4055 profile views
  1. You don't specify the motherboard model; that's where to start. Detail is needed on the computer: self-built or prebuilt? Is it new, by chance? (In January this year you indicated having a Z790/14900K system with a 4090... so it appears you've changed since then.) Download and run CPU-Z to determine the current BIOS version: https://www.cpuid.com/softwares/cpu-z.html It's a very reputable, safe app. There are other ways to check the BIOS version, but CPU-Z offers additional information that will be helpful as well. Post a screenshot of the first three tabs (see example below). The machine likely needs a BIOS update; this behavior could easily be due to an out-of-date BIOS that doesn't properly support the CPU. Are you running AMD's Ryzen Master, by any chance? CPU-Z tabs: (P.S. Also need info on what the 3060 GPU is doing there... are you trying to use it as a PhysX device? Or maybe for the Lossless Scaling app? I'd try running without it, just as a test.)
  2. OP: "I am linking to the part where they turn a 4090 24gb into a 4090 48gb." A 24G 4090 cannot be turned into a 48G 4090. Period. Inaccurate is inaccurate; the concept isn't complicated.
  3. First of all, I didn't comment so that you can approve or understand. It doesn't matter whether you approve or understand; the reality is what it is. As usual, you're arguing with me just to be arguing with me. The title of the post states very plainly that an "rtx 4090 becomes rtx 4090 48gb". This is not accurate. By the time that shop produces a 48G 4090, it has little to do with the card they started with. With appropriate mods, labor, and materials, I can swap the wheels off a Countach onto a freaking Toyota Yaris... that doesn't make it a Lamborghini. And I'll say it again: a 24G 4090 cannot "become" a 48G unit; the substrate assemblies are not the same, period. Physically not possible, end of discussion. Cannibalizing a few parts off a beer truck won't give a honeywagon "the ability to turn into" something that carries beverages (at least nothing most people would drink). In reality, this 'news' is relevant to neither a gaming discussion forum nor a "tech media" website that purports to be about gamers. Although many places could potentially do the physical work, there are good reasons that I and others similarly situated aren't already doing it, and have zero interest in it. A listing on eBay for a piece of modified hardware that is unsupported (and which actually violates eBay's policies, at that) doesn't change anything.
  4. They don't "magically" turn a 24G 4090 into a 48G 4090. It says so in the video: at 2:38, he says, "Because Nvidia doesn't make a 48G 4090, the repair shop has to source its own PCB" (see image below). They're not adding VRAM to a 24G 4090. A 24G 4090 doesn't have the electrical features needed to do this, so you couldn't turn a 24G card into a 48G card no matter what. He states, "it's a custom board, just for this." At that point, it's no longer a 24G 4090. They simply use the GPU and other 'donor' components from the original card. The electronics work necessary to do this also is not magic. Anyone with the right tools and training can do it. I have some of this equipment and military training, and I regularly work with surface-mount devices in my shop. Yet I have zero interest in doing something like this, because there's no real market for it, considering the costs. The video says a 48G card sells for $2800 US, and from that you have to subtract the cost of the donor card plus additional components (including the custom PCB), as well as labor and the tools/equipment they're using (and that equipment isn't cheap, I can promise you). Even if it's worth it to the shop in the video, it's only because part of that is earned back/financed by their other operations. It's not worth it otherwise, simply because there's no sustainable market for a 4090 that costs $2800 just because it's got 48G - especially considering that 'standard' 24G 4090s are selling for record-high prices right now anyway. (To be clear, I've sold 4090s recently, but unlike the b@stards who 'scalp', I sold units at/near MSRP.) Nvidia doesn't care, because it's not even remotely a credible threat to any concern of theirs. The costs involved don't balance against the market.
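The economics argument above can be sketched as simple arithmetic. Only the $2800 sale price comes from the video; the donor, parts, and labor figures below are hypothetical placeholders, not actual costs:

```python
# Rough margin sketch for the 48G rebuild described above.
# Only sale_price is from the video; the rest are hypothetical assumptions.
sale_price = 2800           # quoted US price for the finished 48G card
donor_card = 1800           # assumed cost of a used 24G 4090 donor (hypothetical)
custom_pcb_and_parts = 300  # assumed custom PCB + VRAM modules (hypothetical)
labor = 200                 # assumed skilled rework labor (hypothetical)

gross_margin = sale_price - (donor_card + custom_pcb_and_parts + labor)
print(gross_margin)  # 500, before amortizing the (expensive) rework equipment
```

Even under these generous placeholder numbers, the margin left over has to cover the rework station, which is the point being made about sustainability.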
  5. Hopefully that's just a typo. Again, unless you want to drop the GPU slot to 8 lanes, you need to avoid M2_2 and M2_3; as I described above, those two share CPU lanes with the GPU slot, while M2_4 and M2_5 use chipset lanes that aren't shared with anything else. Also, if you want to install an additional M.2 drive, you can get a PCIe M.2 adapter card (~$15) and install it in the bottom PCIe slot. That slot uses four chipset lanes at PCIe 4.0, so you'd get the full speed of any PCIe 4 drive. It requires buying the adapter card, but this would bring the total to three PCIe 4 drives (one each at M2_4, M2_5, and one on the adapter card) and one PCIe 5 drive (at M2_1), all running at full rated speed with no sharing. And you'd still have access to all four SATA ports as well, if you want/need to use them. Not as fast as M.2 drives, obviously, but still perfect for older/slower storage like HDDs and SATA SSDs you might want to reuse.
  6. Nothing wrong with using SATA SSDs. In fact, for boards where there's only one M.2 slot, I extend the advice above by recommending booting from a SATA SSD, using other SATA drives (even conventional HDDs) as needed for general storage/games that aren't as demanding, and reserving the M.2 slot for games that benefit most from the additional performance. As far as the fans/AIO mounting... Brilliant. AIO radiators should never be used as intakes; people usually do this for one reason (and it's not a very good reason). As above with the Gen5 drives, there will invariably be those who come along and dispute this. Their opinions are at odds with fact.
  7. Right. M2_1 will run at full PCIe 5 speed regardless of which other slots you populate (because it uses four dedicated CPU lanes). Unless you're willing to split PCIe lanes to the GPU to get more PCIe 5-speed storage, avoid M2_2 and M2_3. Use (up to two) PCIe 4 drives at M2_4 and M2_5. On that point, let me strongly recommend that you use a smaller, less expensive PCIe 4 drive to boot from (i.e., separate boot and game drives). Even if you boot from a considerably slower drive, you're still far better off not running the game from the same drive you boot/run the OS from. Just looking at the pic above... please tell me you do not intend to mount that AIO CPU cooler as an intake...
  8. Actually, it is documented fairly well, in several places in the manual. It is also not entirely accurate to say that there is no benefit to using PCIe 5.0 storage. In the simplest terms, the faster a storage subsystem fulfills a request for data, the more responsive the system. This can benefit games beyond just decreased loading times. There will invariably be those who come along and dispute this. Their opinions are at odds with fact. It is true that Gen5 storage generally costs more per unit of storage. This is as simple as "you get what you pay for," and it's nowhere near as bad as some people make it out to be: the speed of PCIe 5 storage (for a good drive) is doubled, but the cost is ~20% more, depending on drive size etc. In the case of a 2TB drive, a very good PCIe 5 drive costs around $30 more than a 2TB 990 Pro, arguably the best PCIe 4 drive available. That $30 difference represents far less than 1% (maybe even as little as 0.5%... yes, one half of one percent) of a build like the one being discussed in this thread. That should make sense to anyone who spends several thousand on a gaming PC, particularly anyone who bought a 5090. Whether it's "worth it" or "makes sense" is entirely up to the individual; the fact that it can improve performance is a separate matter. (Hint: if someone's idea of "improved performance" is based strictly on frame rates, they're already missing the point.)
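The cost claim above is easy to check. The $30 delta is from the post; the total build cost is a placeholder assumption for a 5090-class PC:

```python
# The $30 Gen5-vs-Gen4 price delta as a share of a whole build.
delta = 30          # 2TB PCIe 5 drive vs. 2TB 990 Pro (figure from the post)
build_total = 4500  # assumed cost of a 5090-class build (hypothetical)
share = delta / build_total * 100
print(round(share, 2))  # well under one percent of the build
```

At any plausible total for a build like this, the delta stays under 1%, which is the point being made.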
  9. Pretty sure that is not accurate. The M.2 slot next to the RAM (M2_1) uses four PCIe 5 lanes directly from the CPU, and the GPU slot uses 16 PCIe lanes directly from the CPU. Neither of these is affected by the other, provided that only those two are populated. M2_2 and M2_3 use lanes shared with the primary GPU slot. This allows a bifurcated configuration which splits 16 of the CPU's PCIe 5 lanes to x8/x4/x4 if desired (8 for the GPU, 4 each for those two M.2 slots). Populating these M.2 slots would drop the GPU slot to 8 lanes - by design. This is for people who prefer more storage, considering that current GPUs aren't really affected by running at 8 PCIe 5 lanes instead of 16. M2_4 and M2_5 use four chipset PCIe 4.0 lanes each, which are not shared at all. You can run two Gen4 drives and one Gen5 drive, all at 4 lanes each (i.e., full speed). This is no different than an X670 chipset/board. Use M2_1, _4, and _5. The only reason the board supports bifurcation is to support more storage (five slots, at the expense of sharing CPU PCIe lanes). Note this assumes a supported CPU is used: 9000- and 7000-series CPUs support this; 8000-series do not. All of this is covered in the manual and on the Asus website.
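The x8/x4/x4 split described above is just the same 16 CPU lanes re-budgeted, which a quick sketch makes explicit (lane counts as stated in the post):

```python
# CPU PCIe 5.0 lane budget for the GPU slot group, as described above.
default    = {"GPU": 16}                       # M2_2 and M2_3 left empty
bifurcated = {"GPU": 8, "M2_2": 4, "M2_3": 4}  # x8/x4/x4 split
# Both configurations spend exactly the same 16 lanes.
assert sum(default.values()) == sum(bifurcated.values()) == 16
# M2_1's four CPU lanes are a separate allocation, unaffected either way.
print(bifurcated["GPU"])  # 8
```

Nothing is gained or lost in total; bifurcation only trades GPU link width for two more full-speed M.2 slots.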
  10. @Vitamin_J94 Here's something that might be worth a look: https://www.newegg.com/samsung-ls57cg952nnxza-57-duhd-240-hz-odyssey-neo-g9-va-white/p/N82E16824027274?Item=N82E16824027274&SoldByNewegg=1 (It just happens to be a Newegg link; I don't make anything from any source for it, and it's also available from Amazon and other places too.) It caught my eye since it's the same type of Samsung VA panel as discussed above, at 4k in the vertical (50% more pixels than your current setup)... but with *twice*(!) the horizontal pixels of 4k (7680, the same horizontal FOV as you're accustomed to). So you gain in the vertical without losing in the horizontal. If I'm looking at things correctly, it would be roughly the same physical height as your 32s; something like ~16" (actually a bit more, if I'm not mistaken). And with that magnificent 1000R curve. Immersion like no other. Maybe worth considering. Of course, if you'd just prefer to go to a 16:9 aspect ratio and still get a 4k vertical pixel count, there's always the Ark 55 as above. Best of luck to you, whatever you choose.
  11. Yup. And this is why I strongly prefer curved over flat panels (even though I still think the 21:9s and 32:9s are more immersive than the 16:9s that are available). The Ark 55 gets points for sheer size and a 1000R curve; if you do the math, a 55" panel is comparable to even the G9 in physical FOV... plus it's 4k, and yes, a very good panel even if it's VA. I often think I should've jumped at the $1450 mark a few months ago... but it may come back around in time. What I really wish is that the Ark 55 were an OLED panel... lol, but then it would cost like 8 billion dollars and I couldn't afford it anyhow.
  12. This is something I've been considering for quite some time. I just wish it offered a better panel than VA (if I'm not mistaken), plus even though at 4k it's an increase in vertical resolution, I'd actually be losing a quarter of my horizontal FOV compared to my current G9 (3840 vs 5120 pixels). But I may do it yet (and even though it's 55", my measurements indicate conclusively that it will fit quite nicely over my desktop once mounted properly, yet never come close to my knees, and still put my eye line at/near the actual industry-expert-recommended top edge of the monitor vs. the center...).
  13. It may be true that ultrawides are more appropriate to sim-racing environments, given their extra FOV in the horizontal. I'm just not a fan of claims that ultrawides "lose" something in the vertical. As I illustrated earlier, they don't lose anything provided you're comparing apples to apples at the same vertical resolution (and why would anyone do anything but apples to apples?). That claim is misguided, ignores empirical data, and is more of an optical illusion than it is anything close to fact. An ultrawide monitor looks like it's not as tall because of its aspect ratio, but resolution is what determines what you see in-game. For example, my G9 shows me exactly as much of the game as any 1440p 16:9 monitor in the vertical - and it shows twice as much in the horizontal. In fact, 33% more than even a 4k monitor in the horizontal (5120 vs 3840). Consider that there's nowhere in the game where you enter your monitor's physical size. That's because the game doesn't know or care what size monitor you have - all it knows is how many pixels high times how many pixels wide. That pixel count determines what you see (without turning your head); the FOV or viewport, if you will. If you feel you'd be happy with a 16:9 aspect ratio, then there's little doubt that a 4k OLED display (size as you prefer/budget allows) is the way to go. However, if you want something curved and larger than, say, ~48", OLED isn't likely an option. Be aware also that it's misleading for anyone to suggest you can't use monitors >48" in a flight-sim environment - even on a desktop. Military aircraft commonly feature displays, gauges, and controls at and even below knee level, and obviously what a pilot sees goes well below typical desktop height - a fact that monitor stands like yours (and other creative approaches) can be set up to reflect nicely, especially if one doesn't have the physical displays, etc. So buy whatever size your budget permits.
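The pixel-count argument above can be checked directly. No physical sizes enter the comparison, which is exactly how the game sees it:

```python
# What each display feeds the game: pixels wide x pixels high.
monitors = {
    "16:9 1440p": (2560, 1440),
    "16:9 4k":    (3840, 2160),
    "G9 32:9":    (5120, 1440),
}
g9_w, g9_h = monitors["G9 32:9"]
# Identical vertical pixel count to any 1440p 16:9 panel...
assert g9_h == monitors["16:9 1440p"][1]
# ...twice the horizontal of 1440p, and ~33% more than 4k.
print(g9_w / monitors["16:9 1440p"][0])                  # 2.0
print(round((g9_w / monitors["16:9 4k"][0] - 1) * 100))  # 33
```

The 32:9 panel "loses" nothing vertically at the same vertical resolution; it only adds horizontal viewport.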
  14. Your "in-game" vertical field of vision is precisely the same as anyone running a 1440p monitor; it's a myth that you're "losing" anything by running an ultrawide-type aspect ratio. Moreover, since you're already at 1440p, the only way to increase resolution is to go to 4k (or some variant of a 2160 "short side") - which, as above, if you want to keep a similar aspect ratio, would obviously demand much more of your GPU (though I don't know that anyone even makes such a monitor bigger than 32"). Just something to think about (and I speak from first-hand experience). You can go to 4k at 16:9, but you'd be sacrificing the 300% horizontal FOV you paid to get from three monitors - and only gaining 50% in the vertical. (FWIW, I use a Samsung Odyssey G9 49" 32:9/5120x1440 monitor.)
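The trade-off above works out in plain pixel arithmetic (assuming the triple setup is three 2560x1440 panels, as the 1440p discussion implies):

```python
# Hypothetical comparison: triple 2560x1440 panels vs one 3840x2160 16:9 panel.
triple_w, triple_h = 3 * 2560, 1440  # three 1440p monitors side by side
uhd_w, uhd_h = 3840, 2160            # single 4k 16:9

print(triple_w / 2560 * 100)         # 300.0 -> the "300%" horizontal FOV
print((uhd_h / triple_h - 1) * 100)  # 50.0  -> only +50% in the vertical
```

Going from the triple setup to a single 4k 16:9 trades two-thirds of the horizontal viewport for a 50% vertical gain.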
  15. Ah... so it sounds as if you're interested in 4k resolution (that is, 2160 on the vertical or "short side"), correct? Do you intend to try for something approaching a similar aspect ratio? If so, it's probably important to keep in mind that the total resolution will increase a *lot*. And while the 4090 is a magnificent card (I use one and have owned three), there's going to be a fairly big resulting hit on performance.