kksnowbear

  1. I'd be strongly inclined to agree with your suspicion re: the backlighting. A little inline USB meter is nice to have if you're doing anything involving USB power - it's helped me out more than once, for sure. Not terribly expensive (I paid ~$20, but you could probably do better): MakerHawk USB Multimeter USB Voltmeter Ammeter Load Tester USB Voltage Current PD Battery Power Capacity Charger Type C Meter Tester LCD Display Cable Resistance QC2.0/3.0/4.0 N10 1.44 Inch Screen https://a.co/d/2n5GSFx HTH...best of luck to you
  2. To be fair to ED, I don't think it's their fault that Intel developed this goofy-a$$ed "P and E core" nonsense, nor that any individual decided to buy into it. Just saying.
  3. Having thought about this more: I can't say I've ever tried it, but there might be a small chance you could install two Nvidia drivers, one each for the 4090 and the 730. I'm not sure it would work, TBH. Something else you could try is setting the 730 (via Device Manager > Properties) to use a generic/'standard' Microsoft driver, which might work alongside the Nvidia 4090 driver. (These drivers are provided by Nvidia to Microsoft anyway, but you never know what may or may not work.) Another idea is to use DP-HDMI cables. I have some and they work, but they're only 'one way' (i.e. they convert DP output to HDMI, but won't convert HDMI output to DP); still, that could work for connecting your 4090's DP outs to HDMI inputs. This would mean your 4090 is driving all the displays (not optimal IMHO, but it could work). Not sure about your 4090; mine can only drive 4 displays simultaneously, though it does have 5 outputs (2xHDMI, 3xDP). Finally, you don't mention what motherboard/CPU you use...but one other possibility is using the CPU's iGPU via the motherboard's video output to gain one more display. That, combined with the DP>HDMI cables above, might yield the 5 displays you want. It's probably better IMHO to just go with a 1030 card...but these other ideas might be of some help. Good luck.
  4. Oh, incidentally, the Armoury Crate Uninstall Tool should be with your motherboard drivers on the Asus website. Asus makes a number of different X670E boards; you don't specify which one you have, but when I look at mine (X670E-F), it's there.
  5. I think it's likely this is a driver issue. 4080s aren't supported by drivers that also support a GT730, and vice versa: drivers new enough to include support for 4080s (since ~Nov 2022) don't appear to also include support for GT730 cards. I went back to find the oldest 4080 driver after its November 2022 release, and that driver doesn't include support for GT730 cards. I could be wrong, but I think this is accurate. You'd need a card like a GT1030, which is supported in drivers that also support the 4080.
  6. Interesting development, though I'm not at all surprised AURA would be causing problems. It's known for being problematic; as I said, I finally just all but quit using it after quite some time, in part due to problems like this.
     The USB thing seems unrelated to the RGB issue. It may have been going on before the RGB thing; you'd know before I would. What I can tell you is this: USB ports limit the total current that can be drawn from the port. For USB 2.0 that limit is 500mA (1/2 amp), which might not be enough when too many things are connected - it depends on the sum total of what each device draws (usually this can be determined from documentation; I have an inline meter that tells me how much current is being drawn from a USB port). This was one of the major changes in the next USB revision: the limit was increased to 900mA for USB 3.x ports.
     The 127 devices you mentioned is the limit of separate devices that can be connected to a single port, but that's not considering power. (Mathematically, a USB 2.0 port could only power 127 devices if the average current draw per device were under ~4mA (500 ÷ 127), and I'm not sure that's very likely, TBH.) See the rough budget sketch below.
     I would tend to agree about powered hubs, however it can be tricky; I believe not all 'powered' hubs are created equal. But in general, yes, I would think a powered hub would itself draw less than the allowed 500mA on a USB 2 port, provided that's all that was plugged in (no other devices drawing USB port power). Also, the 'downstream' side of any hub cannot exceed what *those* ports are capable of - both the per-port USB limit *and* the limit of the hub's power supply. Again, you'd have to add up all the devices to make sure you weren't drawing too much.
     Also, it could be that the hub you're using has issues, or that it just doesn't work well with your motherboard ports. It's not impossible; people take for granted that USB "just works", but sometimes USB can be a pain in the ass, particularly when using lots of devices from different manufacturers. USB is not the 'universal fix-all' people tend to see it as - in spite of the name. BTW, I suspect this is more about the increased odds of conflict or overload as you add more and more devices, rather than a specific rule that says you can't plug in X devices. Obviously, one device is far less likely to cause problems than 10; 10 less than 50, and 50 less than 120.
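     A quick back-of-the-envelope way to check a port's power budget (a minimal sketch in Python - the device names and current draws are made-up examples, so substitute real figures from your devices' documentation or from an inline meter):
     ```python
     # Sum hypothetical device draws against a USB port's current limit.
     # All device figures below are illustrative, not measured values.

     USB2_LIMIT_MA = 500   # USB 2.0 per-port limit (mA)
     USB3_LIMIT_MA = 900   # USB 3.x per-port limit (mA), for comparison

     devices_ma = {        # hypothetical devices on one USB 2.0 port
         "keyboard": 100,
         "mouse": 100,
         "headset": 250,
         "RGB hub": 150,
     }

     total = sum(devices_ma.values())
     print(f"Total draw: {total} mA of {USB2_LIMIT_MA} mA available")
     if total > USB2_LIMIT_MA:
         print("Over budget - use a powered hub or spread devices across ports.")
     ```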
  7. Absolutely - but I'm referring more to the known issues with uninstalling the various RGB control apps from different mfrs. If it gets to the point that Asus themselves provide a special cleanup app for removing their lighting software...well, that's a sort of tacit admission in itself, IMHO. Good luck - it could still be a hardware issue. TBH I'm not sure whether the multiple headers on a board all go back to one hardware controller or not. Wouldn't surprise me if they do, but wouldn't surprise me if they don't, either.
  8. Oh, my apologies, I thought you were saying you'd go to 64G by adding his 32G set to yours. If you're talking about just swapping your 32G 3200 set for his 4000 set, then no, not worth it IMHO. It's not gonna do crap for frame rates and would hardly even be noticeable (if at all).
     As I said, memory capacity by itself will not directly affect frame rate (in any game) and, within the same general range, speed won't either. RAM just won't affect frame rate that much (despite what thousands of idiots on Reddit and YouTube might try to claim). It's certainly not going to make enough difference that you could even tell, if you're already getting 90FPS per your first post, assuming nothing else in the system changes.
     For example: if we assume RAM speed contributes something on the order of 5% to overall performance on an otherwise identical system - which is fair and reasonable - and you then increase the RAM's speed (MT/s) by even 25%, that's a 25% increase in a 5% factor: the change in overall system performance would be around 1.25% (in FPS terms, from 90 to ~91). If we assign generous values and say RAM speed is a 10% factor (double the example above), then a 25% speed increase equates to about 2.5% overall, or a couple more frames (~92). See the quick arithmetic sketch below.
     I doubt you or anyone else can reliably tell the difference between 90 and 92 FPS (and although at certain key thresholds with VR a small change can make a big difference, I don't think that applies here). Mind you, these are oversimplifications, strictly intended to illustrate the point about RAM's effect on overall performance. It might vary a bit in one game or another, or depending on settings, resolution, etc., blah, blah, freakin' blah. Either way, the general idea still applies. LOL. Nobody would pay hundreds (or thousands) for a GPU if $100 worth of RAM made a 20% difference in frame rate.
     As for 32G vs 64G: you might not encounter the same RAM limits others have noted. It depends on a number of factors, but it is widely reported that people see more than 32G used on certain maps, missions, MP, etc.
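     The arithmetic above, spelled out (a minimal sketch in Python; the 5%/10% 'RAM contribution' shares are the rough assumptions from the post, not measured values):
     ```python
     # How much a 25% RAM speed bump could plausibly move FPS,
     # under assumed shares of overall performance owed to RAM speed.

     base_fps = 90.0
     ram_speedup = 0.25                 # e.g. 3200 -> 4000 MT/s is ~25% faster

     for ram_share in (0.05, 0.10):     # assumed RAM contribution to overall perf
         overall_gain = ram_share * ram_speedup
         new_fps = base_fps * (1 + overall_gain)
         print(f"RAM share {ram_share:.0%}: {base_fps:.0f} -> {new_fps:.1f} FPS "
               f"(+{overall_gain:.2%} overall)")
     ```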
  9. I've used Asus AURA on a number of RGB builds over some time (maybe 20 builds in 2 years). It can be very finicky and is generally junky, IMHO. I moved to SignalRGB more recently, having done 5-6 builds with that now. It's a better way, IMHO. Among other things, it's not proprietary - and if you know RGB in general, proprietary lock-in is the biggest problem: Asus software doesn't want to work with eVGA GPUs (or whatever). It's not perfect, in that some things aren't supported (most notably, AMD GPUs from any manufacturer weren't supported yet last I looked, which is really unfortunate), and your specific RGB controller may not be supported (which is why I asked what model). Here: https://www.signalrgb.com/devices/ Can't hurt to check. Maybe try it. It's free (with ads, but you can pay to get rid of them). It's actually not bad and, in my experience, behaves better by far and uses fewer system resources than Asus AURA. I've had very good experiences with it controlling GPUs from one mfr on a motherboard from another, along with a few different RAM and fan mfrs as well. It helps get away from the proprietary sh*t a lot of manufacturers try to force on you so that you have to buy all your RGB components from them (**TOTAL** BS in my opinion). PS: Pretty sure there's an 'AURA clean up tool' or some such...strongly recommended. These stupid lighting control apps are notorious for leaving sh*t all over a system.
  10. So to me it really does more or less sound like something with the ARGB controls from the motherboard (including software, although hardware is not impossible), as opposed to a power issue. Are you determined to use Asus' software, or could you consider something else? Have you tried removing/reinstalling the Asus software?
  11. Many people here will agree DCS can use more than 32G, and therefore 64G has become kind of a standard of sorts. It does depend on the circumstances, but if you can afford it, it can help avoid the really bad things that happen when memory capacity is inadequate. Increased memory capacity alone will not directly affect frame rate, though it can affect how "smooth" things feel by avoiding the problems I mentioned above. Mixing memory modules is usually just not a good idea; you MUST be very careful with different modules, as you have indicated (your 32G set isn't the same as your friend's). Generally this is not recommended at all, and it can definitely cause problems. So unless you're absolutely sure, I'd consider another way. One alternative is to find someone who's willing to take your 32G in trade toward a 64G set; there are reputable sellers who offer this. So that I'm not accused of posting just to try to sell stuff (utter BS regardless), PM me if this interests you at all. Good luck
  12. A few questions:
     • Where does the hub get its ARGB control input? Does this mean the AIO and the Thermalright hub are both (separately) connected to the motherboard, as in two separate motherboard ARGB headers?
     • What is the exact model of the Thermalright hub, please?
     • When the blinking occurs, is it *all* the LEDs, only those connected to the hub, or only those on the AIO?
     If they're all doing it *and* they're all ultimately controlled by/connected to the motherboard ARGB headers, I'd suspect something going on with the motherboard ARGB, and yes, software can be a problem (Asus AURA can be particularly stupid at times; there are better choices in some cases). See the sketch below for the logic.
     Power might be an issue, but that would depend entirely on how everything is wired up. If the AIO and fan hub get power from separate points, I'd think you'd have other power-related issues (if it were a PSU problem, for example). If both are *powered* (never mind PWM and ARGB for a moment) by the same cable, for instance, then yes, a single bad connector might cause it. (Note that since PWM fans don't control speed by varying DC voltage like older fans do, it's entirely possible to have ARGB and PWM connected to the motherboard and still have a separate power connector as well.)
     It would be very helpful to have a complete and accurate diagram of exactly how all this is put together. Did you build the machine yourself?
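     The diagnostic logic, as a toy sketch (Python; the wiring below is entirely hypothetical, not your actual build): map each lit component to its power source and its ARGB control source, then see what the blinking devices have in common.
     ```python
     # Hypothetical wiring map - replace with how YOUR system is actually
     # connected. If all blinking devices share exactly one power or
     # control source, that shared link is the prime suspect; if they
     # share nothing, suspect something upstream of all of them
     # (e.g. the motherboard ARGB controller or its software).

     components = {
         "AIO pump LEDs":  {"power": "SATA cable A", "control": "mobo ARGB header 1"},
         "Hub fan 1 LEDs": {"power": "SATA cable B", "control": "mobo ARGB header 2"},
         "Hub fan 2 LEDs": {"power": "SATA cable B", "control": "mobo ARGB header 2"},
     }

     blinking = ["AIO pump LEDs", "Hub fan 1 LEDs", "Hub fan 2 LEDs"]

     for link in ("power", "control"):
         shared = {components[name][link] for name in blinking}
         if len(shared) == 1:
             print(f"All blinking devices share one {link} source: {shared.pop()}")
         else:
             print(f"No single shared {link} source: {sorted(shared)}")
     ```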
  13. There are far more people who find tech like DLSS/FSR/RSR beneficial, and the vast majority consider that the gains outweigh the drawbacks. This 'level of approval' is also reflected in its growing adoption throughout the industry. This isn't (and shouldn't be turned into) a commentary on any one sim. But if that's the angle, well, all I can say is ED chose to implement it for some reason. Proof of the pudding, as it were. And I really don't know where you got the idea about a "hype train promoting the idea that it has no downside"...no one who knows anything about it has ever remotely tried to claim that. If they claimed that, it's a dead giveaway they don't know cr*p about it. It's a compromise - but a damn good one for many people who can't afford to spend thousands on a GPU but still want substantially improved frame rates. The choice regarding quality sacrifice is, once again, entirely subjective and a matter of individual opinion. As always, what's 'worth it' depends entirely on personal perspective.
  14. Maybe what I'm saying isn't clear; let me try again. I'm not at all saying that you can't use DLSS at 4K, nor that it won't work or whatever. What I'm saying is that a 4090 at 4K res is not the "showcase" situation that technology like DLSS (or the AMD equivalents FSR and RSR) is focused on. Pretty sure the tech came about as a way to allow higher frame rates at a given resolution, even on lesser GPUs: yes, you have a 4K monitor, but the game renders at a lower resolution to cut the workload while still getting the 'pixel sharpness' of the 4K display/panel (see the pixel-count sketch below). Works great/been there/done that.
     All the points you make about 4K are perfectly valid - what I'm saying is that these technologies are far more impressive for what they can do on lower-end hardware, which is far more readily available/affordable than a 4090/4K arrangement. For every one person running a 4090 at 4K who could already get very good performance regardless, these technologies now make it possible for 100 or 1000 people to get similar levels of performance without paying the same price (though I would absolutely maintain it's not without compromise). Put another way: a 4090 (already very good, and thus doesn't 'need' DLSS to perform well) vs. lower-end GPUs (which were struggling in certain circumstances, but can now approach double their performance). Of the two cases, I think the latter has far more to gain, where the former was fine even without DLSS.
     I think the 'saltiness' you observed yourself (as in, not me) is due to a related-but-different cause: among other things, this tech (further) diminishes the validity of simple FPS numbers being proportional to the cost of the hardware.
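     The pixel arithmetic behind that (a quick Python sketch; the per-axis scale factors are the commonly cited ratios for DLSS's quality modes, so treat them as approximate):
     ```python
     # Internal render resolution vs. native 4K: how much of the pixel
     # workload each upscaling mode actually rasterizes before upscaling.

     native_w, native_h = 3840, 2160          # 4K panel

     modes = {                                # approximate per-axis scale factors
         "Native":      1.0,
         "Quality":     2 / 3,                # ~2560x1440 internal
         "Balanced":    0.58,
         "Performance": 0.5,                  # ~1920x1080 internal
     }

     native_px = native_w * native_h
     for name, s in modes.items():
         w, h = int(native_w * s), int(native_h * s)
         print(f"{name:12s} {w}x{h} -> {w * h / native_px:.0%} of native pixel workload")
     ```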