Everything posted by EightyDuce

  1. If you're getting performance issues, I would make sure your RAM is in fact stable (Linpack, y-cruncher, SuperPi, P95, TestMem5). Memory instability presents itself in strange ways. Also, before you go on your venture, just know that the benefit from bringing CAS latency to 30 will have a nearly imperceptible impact on performance unless you are running benchmarks (from where you're at with BZ settings). For example, you probably already went from high 50s GB/s read/copy to mid-to-high 60s GB/s and dropped latency from the 70s ns to the low 60s or high 50s ns. It will also potentially require a voltage increase or a drop in frequency. DDR5 greatly benefits from secondary and tertiary timing tuning along with frequency, followed by Infinity Fabric. Instead of focusing on primary timings, I would see if you can push your kit to 6200 or 6400 MT/s; if you can't get 6400 or 6200, see how much you can push IF. That being said, from reading your post, and this is not meant as an insult, it appears you are very inexperienced when it comes to RAM tuning. Unless you are serious about learning at least some aspects of memory timings and their interaction, I would just stick to BZ timings and move on with your life. You can quickly go down a rabbit hole for very little gain beyond what you currently have.
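     To put CAS latency in absolute terms, here's a back-of-the-napkin sketch (the DDR5 speeds and CLs below are illustrative examples, not anyone's specific kit): first-word latency in nanoseconds is roughly CL x 2000 / transfer rate, so a few CL steps only move things by a couple of nanoseconds.

         # Rough CAS latency in nanoseconds: CL * 2000 / transfer rate (MT/s).
         # Example DDR5 configurations; swap in your own kit's numbers.
         def cas_ns(cl, mts):
             return cl * 2000 / mts

         for cl, mts in [(38, 6000), (30, 6000), (30, 6200)]:
             print(f"DDR5-{mts} CL{cl}: {cas_ns(cl, mts):.2f} ns")
         # DDR5-6000 CL38: 12.67 ns
         # DDR5-6000 CL30: 10.00 ns
         # DDR5-6200 CL30: 9.68 ns

     That couple of nanoseconds is why secondaries/tertiaries, frequency and IF are the bigger levers.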
  2. Unless you're looking for something like a Strix or the Liquid Suprim from MSI, all 4090s perform virtually the same. Some will have higher power limits, but the gains aren't really worth the power/heat increase. Buy the one with the best warranty unless you're taking the cooler off to put a waterblock on it.
  3. Didn't even know this was an issue, at least in DCS.... I guess ignorance is bliss.
  4. It's a P3 Kill A Watt. Can't remember where I bought it, probably Newegg or Amazon, but it's been close to 15 years. The benchmark is Cyberpunk's built-in benchmark, which runs for the same amount of time on each run; I couldn't tell you the exact length right now as I'm at work, but off the top of my head it's about 1-2 minutes long.
  5. Just got the UPS hooked up. Running the same Cyberpunk 2077 Overdrive benchmark, it was pretty much locked at 486W output (similar to what the smart plug reported), with one momentary peak to 513W. It appears the Kill A Watt is the outlier.
  6. Posted for awareness; I haven't personally tested it. I'm on a 4090.
  7. AMD released new drivers that are supposed to address VR performance. https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-10-01-41-vlk-extn
  8. New drivers released that supposedly address VR performance... https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-10-01-41-vlk-extn
  9. I thought it was an APC-branded UPS; however, it's actually a "CyberPower 1500VA / 900Watts True Sine Wave Uninterruptible Power Supply (UPS)". It's just going to have the Synology NAS, UniFi Dream Machine SE and two switches on it. We don't typically lose power for long, but recently, with the weather, we've had frequent quick outages and the NAS hasn't been happy about it.
  10. I have an APC battery backup UPS coming Thursday for my server room, but I'll hook it up to my PC and see what it reads. I'm not sure how accurate the Sengled plug is. The Kill A Watt was, and from a quick Google still is, a popular measuring tool that folks report as accurate, but there's no telling if it drifted out of calibration at some point during its 15 years of sitting in a drawer. Unfortunately, until the UPS gets here I have no other way to measure, but the readings have been surprising if nothing else... especially the disparity between the Kill A Watt and the smart plug.
  11. PSU is an MSI MPG A1000G, so I don't think it's the issue. I'd be more inclined to think that the 15-year-old P3 Kill A Watt may be the issue. Unfortunately I don't have another way to measure other than in-Windows reporting and napkin math. Edit: I just thought of something and grabbed one of the smart plugs that records power use/draw that I had on my 3D printer. The Cyberpunk 2077 RT Overdrive benchmark indicates 463-480W draw at the wall. No idea how accurate these are (Sengled smart plug), but it's a heck of a spread from the P3 Kill A Watt.
  12. To be clear, the 880-917W (Cyberpunk 2077 RT Ultra) was from the wall while benching (built-in benchmark). The same bench in RT Overdrive was pulling 940-947W. DCS Marianas F-18 free flight in VR was pulling 610-642W. Diablo 4 in-game was pulling 520-560W-ish. If all you're doing is DCS, you have headroom, but some things may be more demanding. I'm actually surprised Cyberpunk 2077 RT Overdrive pulled that much power... makes me think I should have gone with a 1200W+ PSU instead of a 1000W lol. @some1 Do you by chance have Cyberpunk 2077, and could you run the built-in benchmark to see if your power draw spikes?
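     For context on the headroom question, a rough sketch: the Kill A Watt reads AC power at the wall, so the DC load the PSU actually delivers is lower. The ~90% efficiency figure below is an assumption for a Gold-class unit at high load, not a measured number for this specific MSI PSU.

         # Rough conversion of wall (AC) draw to the DC load the PSU supplies.
         # efficiency=0.90 is an assumed Gold-class figure, not a spec for the MPG A1000G.
         def dc_load(wall_watts, efficiency=0.90):
             return wall_watts * efficiency

         for wall in (917, 947):
             print(f"{wall} W at the wall -> ~{dc_load(wall):.0f} W DC load")
         # 917 W -> ~825 W; 947 W -> ~852 W, so a 1000 W unit still has some margin.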
  13. Just the tower and anything that's plugged into it.
  14. 917W at the wall plug on the Kill A Watt while benching. That's with everything in the signature + EK D5 pump, 6x 140mm fans, 3x NVMe SSDs.
  15. When I get home I'll check my Kill A Watt to get you the power draw with a 7800X3D and a 4090. ... As a bonus, I'll do it without digging into my dusty pockets and bringing up a 5800X3D and DDR4 for some irrelevant reason.
  16. Yeah, I'm not really a hat guy. So it's either the clip or the hat... Neither is perfect. That being said, can't argue with results. If Tobii tracking was 1:1 to TIR5, it would have been a slam dunk. Maybe with Tobii 6?
  17. After close to 10 years of using TIR5, I was excited to try Tobii. After about a month of trying it out, I could never get it to the same level of smoothness/sensitivity that I had with TIR5. I tried with and without eye tracking. In the end I sold it on ebay and went back to TIR5. Now I'm almost exclusively VR. To me, the only downside to TIR5 is the janky LED pro clip they sell.
  18. That has been the hope for years. The closest thing that came to it in recent history was the 6800/6900/6950. People thought the 7900 XTX was going to be a killer product up until the day reviews went live, and even when that balloon was popped, people still thought it was going to get better with drivers (it has, slightly). I really hope AMD gets back to the days of the 290X and 5870; I loved those cards and they were fantastic performers. But my last two purchases were Nvidia (1080 Ti and now 4090) just because they provide a superior, no-compromise solution that will tick away for at least two generations.
  19. To be honest, unless you have a board that allows for an external clock generator, any overclocking features on the 7800X3D are pointless, as its fused maximum boost is 5 GHz. PBO will be of no benefit, and neither is Curve Optimizer, unless your goal is efficiency and cooler temps. The only way to make the 7800X3D go above 5 GHz is with BCLK/ECLK modification. The reason Curve Optimizer works on other SKUs is that those SKUs are almost always thermally limited: with a CO negative offset there's less heat and thus more room to boost to higher speeds. On the 7800X3D you're stuck at 5 GHz. As far as memory is concerned, how much benefit you see will depend on the application and whether it's bandwidth/latency starved, but realistically the best case with an X3D chip is 7-10%. Arguably, that's one part of the appeal of an X3D chip... you can drop it in a budget motherboard and it will go about as fast as it would in a $500 board.
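     A quick illustration of why only a base-clock change moves a ratio-capped X3D past 5 GHz (the 50x multiplier reflects the 5.0 GHz cap at the standard 100 MHz base clock; the 103 MHz value is just an example, not a recommended setting):

         # Core clock = multiplier x base clock. With the ratio fused at its cap,
         # raising BCLK/ECLK is the only variable left that increases core frequency.
         def core_clock_mhz(multiplier, bclk_mhz=100.0):
             return multiplier * bclk_mhz

         print(core_clock_mhz(50))          # 5000.0 MHz -- stock, fused ratio cap
         print(core_clock_mhz(50, 103.0))   # 5150.0 MHz -- only reachable via a higher base clock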
  20. Asus + 7800X3D happily ticking away. Not really loyal to any brand, just my wallet and needs. Next go-around, if Asus has better boards/features for the price, I'll stay with Asus; otherwise, Gigabyte is my runner-up/tied for first place. IMO this thing got spun up by the internet from just a couple of incidents (the cause of some of which isn't even confirmed) to the point where common sense exited the chat. Internet is gonna internet.
  21. While RAM specs don't make as much of a difference with the X3D, the difference in overall performance is still there when comparing potato RAM settings to EXPO/XMP to a manual tune. EXPO timings are very loose overall. On my 7700X I got close to a 10-13% improvement with a memory overclock and timing tune (beyond EXPO); EXPO2 got me roughly 5-7%. On my 7800X3D I'm at roughly a 7-10% improvement, game dependent. In DCS I'm on the lower end, with about an 8% improvement going from EXPO/XMP DDR5-6000 32-38-38-96 at Auto IF to 6200 30-36-36-46 (plus tuned secondary/tertiary timings) and IF at 2167. In VR this may mean the difference between dropping down into a lower reprojection bracket and cruising at 90 FPS. TLDR: even with the X3D, there is still a significant difference between running bad/"stock" memory settings and a tuned setup. Just some quick 5-minute tuning brings a good uptick; I'm not even talking about min-maxing.
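     Why a single-digit-percent uplift can matter in VR, as a rough sketch: a 90 Hz headset gives about 11.1 ms per frame, and missing that budget usually drops you into a 45 FPS reprojection bracket. The 12 ms starting frame time below is an illustrative assumption, not a DCS measurement.

         # Frame-time budget for a 90 Hz headset vs. an ~8% improvement from memory tuning.
         # The 12 ms "before" frame time is a made-up example, not a measured value.
         budget_ms = 1000 / 90            # ~11.11 ms per frame at 90 FPS
         before_ms = 12.0
         after_ms = before_ms * (1 - 0.08)
         print(f"budget {budget_ms:.2f} ms, before {before_ms:.2f} ms, after {after_ms:.2f} ms")
         # 12.00 ms misses the budget (reprojection kicks in); 11.04 ms just makes 90 FPS.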
  22. The Asus cluster aside, the 7800X3D failures seem to be much more of a nothing burger than originally let on. By GN's own admission, they had to really work to make it fail. From the very few reported incidents it's hard to tell the nature, whether user error/manipulation (attempted OC), a straight-up hardware defect, failure of the OVP/OCP mechanisms, aggressive voltage being set by board manufacturers, or a combination of all of them. Asus is definitely not helping AMD's mindshare, that's for sure.
  23. Not warrantied because it is a BETA release. If you wait a couple of days, the stable version should be out. It shouldn't explode your stuff any more than anything else, but I would wait for a stable release.
  24. Good video. It shows how sloppily some protections were handled. It also shows how unlikely this is to actually occur, since it takes a cascade of issues culminating in a catastrophic failure. As for countering the issue... I've run a manual SOC voltage (1.2-1.25 V) since day one due to manually tuning memory. Other than that, update the BIOS and carry on with your life.