
Panzerlang

Members
  • Posts

    479
  • Joined

  • Last visited

Everything posted by Panzerlang

  1. Brunner are fraudulent advertisers too, in my opinion; not by what they claim but by what they don't say. I bought their MK1 base and it reduced and then cut power in under a minute of sustained-turn dogfighting. It's now an obscenely expensive waste of cupboard space.
  2. DCS running on the monitor (MiG-21 over Caucasus), full-screen at 3840x2160, the CPU runs at full speed (5.5GHz) but only 23% utilization. GPU (4090) is at 40% utilization. The CPU has gone back to idle after shutting down the game though.
  3. ...aaand...at idle, power is around 25w (browser open), temp fluctuates around 40c, CPU utilization around 3%. Something is bent. The CPU should return to idle after a task is finished but it doesn't; it maintains whatever state it was last in. Maybe it's broken or the BIOS is fubar.
  4. Still weird behaviour. At idle, both CPU-Z and HWMonitor show the CPU min and max speed at 5.5GHz and wattage around 20w. When I run Cinebench (supposedly a hard-core tester) the values fluctuate between min 4.8GHz and max 5.2GHz, with power at 207w. Maybe I have a false memory, but back when I used to OC my rigs a test (Prime-95 mostly) would always push the CPU to the max speed I'd set it at in the BIOS. Cinebench is a 10-minute test. On the CPU multicore test it scored 1831. Max temp reached was 83c. I'm completely unfamiliar with a CPU that idles at 5.5GHz with a power-draw of around 20w but will then run at a max of 5.2GHz and a min of 4.8GHz under test. On top of that, even though the test is finished (app window still open), the CPU is still running at 207w and 83c. Now I've closed the app window, no change. I'm going to reboot the PC...
  5. I have it set at max 5.5GHz, temps have gone no higher than 85c with tests pushing the clock to 5.3GHz (neither test will push it to run faster) and power at 217w (max is 253w). I'm happy with that. Thanks for the input guys. Going to give Cinebench a go next.
  6. So...don't update either app and continue to use them. In other news...ready-meals bought from Tesco may not be cooked in Brand-X microwaves, due to Brand-X not getting Tesco's permission. Or sumfink like dat. And ED makes more money because a percentage of users buy it due to 3rd-party products that significantly enhance it. It's a two-way deal, in which one might wonder who should be paying who. Answer...nobody, due to mutual benefit.
  7. Good video. If we apply the 'Previous Pimax Experience Filter' however, I'm confident nobody will be getting a Super for xmas. Lol. And that's the pre-orders. Maybe summer of next year they might be readily available (in stock) on outlets like Amazon. I hope to be proved wrong of course.
  8. Yeah, I figure P95 isn't as brutal as CPU-Z. "Performance CPU Clock Ratio" at 57 obviously limits (limits?! Lol) the speed to 5700 on the base clock of 100. No hiccups in either test with power limited to 253w, but the package temp still hits 95c. I think I'll reduce it to 55, though I doubt DCS would push usage that hard anyway. What I was able to find was a post asking what "Sync All Cores" on ASUS BIOS translated into on Gigabyte BIOS, and the answer was "Enhanced Multi-Core Performance". The overall thrust of the stuff in the original post was to stop motherboards (all brands apparently) from allowing unlimited wattage AND voltage-spiking single cores in turbo mode.
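The ratio-to-clock relationship mentioned above is just multiplication; here's a minimal sketch, assuming the standard 100 MHz BCLK the post refers to (the function name is mine, purely illustrative):

```python
# Core clock = CPU clock ratio (multiplier) x base clock (BCLK).
# The 100 MHz BCLK default is an assumption; some boards let you tweak it.
def core_clock_mhz(ratio: int, bclk_mhz: float = 100.0) -> float:
    """Effective core clock in MHz for a given multiplier."""
    return ratio * bclk_mhz

print(core_clock_mhz(57))  # ratio 57 -> 5700.0 MHz (5.7 GHz)
print(core_clock_mhz(55))  # ratio 55 -> 5500.0 MHz (5.5 GHz)
```

Which is why dropping the ratio from 57 to 55 caps the cores at 5.5 GHz.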
  9. And now, running Prime-95 again, they max out at 4900MHz. What fckery is this? Lol. Well, none of this makes any sense. CPU-Z test pushes the cores to 5700 and 95c, Prime-95 tops out at 4900 and 82c. Max power in both cases is 253w however. When this Intel finally blows up I'll be going AMD for the first time in my 30-year journey with PCs. Lol.
  10. Running the all-core CPU-Z stress-test it shows the cores running at 5200 to 5300 MHz. They hit 87c.
  11. Well, I don't have a clue how to "sync all cores" on my Gigabyte Z790. I've googled it, nada. Edit: I believe it's the "Enhanced Multi-Core Performance" in my BIOS, already set to "Disabled" per the list above. I guess disabling it locks/syncs the cores.
  12. Doesn't locking the cores also force them to run at max speed permanently?
  13. I've done this to my 13900k. It now maxes at 4.9ghz instead of 5.6ghz (Prime 95 stress-test) but no longer hits 97c (maxes around 80c). Gigabyte Aorus Master Z790, F12 BIOS.
      • Package Power Limit1 - TDP (Watts) > 253 (Holy grail, page 98, table 17, 8P+16E Core 125W Extreme Config)
      • Package Power Limit1 Time > 56 (Holy grail, page 98, table 17, 8P+16E Core 125W Extreme Config)
      • Package Power Limit2 - (Watts) > 253 (Holy grail, page 98, table 17, 8P+16E Core 125W Extreme Config)
      • Core Current Limit(Amps) > 307 (Holy grail, page 184, table 77, S-Processor Line (125W) 8P+16E)
      • Enhanced Multi-Core Performance > Disabled (OP recommendation)
      • Performance CPU Clock Ratio > 57 (I mentioned I changed this to 56x using XTU, but OP told me to go back to 57 when doing the changes above)
      Source: Optimizing Stability for Intel 13900k and 14900k CPU's : r/overclocking (reddit.com). Still won't do XMP though, totally spazzes out the system. Might be because I have x4 16GB though.
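As a rough illustration of how those PL1/PL2/Tau settings interact: this is a hedged sketch only (real Intel RAPL enforcement uses an exponentially weighted moving average over the Tau window, not a hard cutoff, and the function name is mine):

```python
def allowed_package_power(requested_w: float, sustained_load_s: float,
                          pl1_w: float = 253.0, pl2_w: float = 253.0,
                          tau_s: float = 56.0) -> float:
    """Crude model: PL2 caps short bursts; after roughly tau_s seconds of
    sustained load the cap drops to PL1. With PL1 == PL2 == 253 W, as in
    the settings above, the cap is effectively constant."""
    limit = pl2_w if sustained_load_s < tau_s else pl1_w
    return min(requested_w, limit)

# A CPU "wanting" 300 W gets clamped to 253 W whether the load is short or long:
print(allowed_package_power(300.0, 10.0))   # 253.0
print(allowed_package_power(300.0, 120.0))  # 253.0
# With a lower PL1 (e.g. a stock 125 W config), sustained load throttles harder:
print(allowed_package_power(300.0, 120.0, pl1_w=125.0))  # 125.0
```

The point of pinning both limits at 253 W instead of leaving them unlimited (as many boards ship) is that the package can never draw the runaway wattage that was producing the 97c spikes.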
  14. Yeah, I know, Bob Hope and no hope but worth a shot. Lol.
  15. I had a lot of fun with the BK but it was crude. And it blew the stock amp eventually. I now have two LFEs running from a 700w amp with SimHaptic and it's a superb setup. More context...the BK was like trying to race a Lada around Silverstone, the LFEs are like trying to race a Ferrari on a go-kart track. Lol. My wife thinks I'll put the ceiling through if I'm not careful.
  16. All I remember, vaguely, is the 20-series ran stupidly hot for not a lot more performance than the 1080Ti. I ran my 1080Ti until I got a 3080Ti (which now sits in an unused PC).
  17. It might already have been posted but if not:
  18. I tried the Leap Motion 2 but it didn't work when plugged directly into the Crystal; it gave two warnings: 1) Not enough port power. 2) USB 2.0 port detected, unit will not run at optimal configuration. As I understand it, the Pimax module uses an older version of the Leap PCB and I guess works correctly. If you're going to sell a headset with the facility for optional add-ons, you're pretty much obliged to maintain supply of them for the service-life of the headset; otherwise you run into the territory of false advertising.
  19. I was wondering more about what accessories will be offered and then go out of stock, never to return.
  20. The 'Room Setting' centre is the difference between XRNS working or not working. But it's a calibration process (centre and floor height), so I'm not sure it could be bound to a key.
  21. Transparency, never a bad thing. Cynicism loves a vacuum.
  22. It does speak to the quality though, because it all flows from the same mindset. Eg, the quality of the software:
      • Pass-through turns itself off randomly. Turning it back on forces a headset reboot which sometimes fails and necessitates a PC reboot. Still in "Beta".
      • Restart Headset (main panel button) does the square root of SFA.
      • Getting the headset started from being turned off takes upwards of five minutes. I have to leave mine on permanently (green light on) to avoid the grief.
      • Render Quality setting randomly goes south.
      • FOV on standard lenses (35PPD), massively over-promised/stated, we all know what we actually got.
      • Wide-FOV lenses, never got mine, got sent the worthless plastic 45PPD ones instead.
      Etc. I wouldn't trust this company as far as I could throw it now.
  23. The general expression for that kind of behaviour is "Leaving your customers hanging out to dry". What confidence can we have therefore that you won't do the same with the Crystal Light and Crystal Super? It's not like the Crystal is even remotely close to being 'End of Life'. Or is it? Just over a year after first release.
  24. Yeah, that's pretty much what I've read, though there's vague stuff about hacking global software into them. Too much aggro for me. Pico really did miss an opportunity there though.