Everything posted by LucShep

  1. You need to wait for this...
  2. A lot of people are waiting for the AM5 Ryzen 7000X3D chip and, because of that, are passing on current deals for the AM4 Ryzen 5800X3D. I think some things should be realized before deciding to wait for that upcoming chip:

     The new AMD Ryzen 7000X3D chip won't be unveiled before CES 2023 (January 5th-8th), and no one knows how long after that it'll be on the market (at full price... $500+?). https://wccftech.com/amd-ryzen-7000-x3d-zen-4-3d-v-cache-cpus-ces-2023-unveil/

     The current AM4 Ryzen 5800X3D is among the best (if not the best) all-around gaming CPUs today, and is currently being sold at >25% discounts (under $350 right now). It also runs cooler (than the AM5 7600X or 7700X, the AM4 5800X, and equivalent Intel chips) and therefore is not as picky about cooling. For example, a Thermalright Peerless Assassin 120 is a great option at $45, with room to mess about with OC, PBO, etc. on that CPU, if ever desired.

     Motherboard prices... There are plenty of good (AM4 platform) B550 motherboards for the AMD Ryzen 5800X3D, all of which are far more affordable than the newer (AM5 platform) B650 motherboards meant for the new Ryzen 7000 chips. For example, you can go as low as the MSI B550-A Pro, currently selling at $115, and there are plenty of other great sub-$200 B550 motherboards to choose from... The same can be said of the higher-end (AM4) X570 motherboards vs (AM5) X670 motherboards.

     DCS wants 64GB of RAM... DDR4 3600 CL16 (B-Die) is the most recommended for the (AM4) Ryzen 5800X3D, and you can now get 64GB kits (4x 16GB) of that memory under $330. DDR5 6000 CL30 is the most recommended for the (AM5) Ryzen 7000X chips, and 64GB kits (2x 32GB) of that memory are pretty hard to get under $500.

     Yes, the upcoming (AM5) Ryzen 7000X3D should be faster than the current (AM4) Ryzen 5800X3D but, as you'd imagine, the price will rise accordingly, plus you'll have to wait. The thing is, there's always something newer and better around the corner, such is the world of PC hardware. Something faster than the 7000X3D will be out just months later, and there it goes all over again... So, at some point, one has to draw the line and decide when and how much is enough. It depends on whether you're on a budget or not, and whether you want to wait or not.

     Just my opinion but, considering the costs of everything for a new system, little by little on this and that, it all adds up (see the rough tally below). In the end, there can be considerable savings by going with the current Ryzen 5800X3D (a killer deal right now, IMO). And that may also make that painfully expensive new GPU purchase a bit more digestible...
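     As a rough tally of the figures quoted above (a minimal sketch in Python; the B650 board price is my own placeholder, since none was quoted in the post):

         # Rough AM4 vs AM5 platform cost comparison, using the post's own figures.
         am4 = {"Ryzen 5800X3D": 350, "MSI B550-A Pro": 115, "64GB DDR4 3600 CL16": 330}
         am5 = {"Ryzen 7000X3D (est.)": 500, "B650 board (assumed)": 250, "64GB DDR5 6000 CL30": 500}

         am4_total = sum(am4.values())   # 795
         am5_total = sum(am5.values())   # 1250
         print(f"AM4 total: ${am4_total}, AM5 total: ${am5_total}")
         print(f"Savings by going AM4 now: ${am5_total - am4_total}")  # ~$455, before the GPU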
  3. I've been a big advocate of the Intel i7 1*700KF series for years but, have to say, I think the Intel i5 13600KF and AMD 5800X3D are definitely the best purchases of the moment. 64GB of DDR4 (3600 CL18, for example) is still far more affordable than the equivalent DDR5 - any 64GB kit better than DDR5 5600 CL40 will be a lot more expensive. Most gaming rigs are currently being built around 32GB, but for DCS you really should go for 64GB, which makes things a little more complicated (different) for final prices, and most tech reviews and techtubers are not considering this. All of which makes a very compelling case for the AMD 5800X3D + B550 mobo + 64GB DDR4 RAM choice, over the latest AMD Ryzen 7600X or the Intel i5 13600K.
  4. Yes, I'm located in Portugal and it's the same problem here (as I think it'll be for most of the E.U.), so what you describe makes perfect sense. That's the same as new, and therefore not a bad deal at all. EVGA is worth it in the case of the RTX 3000 series because of the best warranties and after-sales service in the GPU business (for already-produced cards), even if they're no longer producing new Nvidia GPUs.
  5. Coreteks seems to go against the current AMD hype train and presents some interesting ramblings...
  6. Sure thing. I'll agree to disagree.
  7. The RTX 3090 is what I'd recommend in your case, because it is noticeably better than the RX 6900XT at higher resolutions (VR and 4K), and the driver software will be immediately familiar to you (no adaptation issues in transitioning). The former is worth a higher price than the latter. Not sure where you're located, but the used prices you mention seem a bit inflated. Check local used markets but, most of all, pay attention to EBAY listings for those GPU models, from top-rated sellers (important there). Some are accepting offers below what you mention, and there are also auctions running (yes, they're a drag, but worth it if you're active and patient).
  8. Semantics, and irrelevant for the topic. The AMD R9 295X2 combined two R9 290X graphics processors. It was two of their fastest GPU chips at the time slapped on the same PCB. Tomato, tomato.

     The thing is, you seem to conveniently miss that there's a very important thing called a "standard" for the global market. Each of those horizontal resolution terms (8K, 4K, 2K) has a dominant resolution "standard", which is what the general public perceives for the matter, and what manufacturers base themselves on. AMD and Nvidia know it. You know it. I know it.
     The dominant standard for 2K is 1920x1080, it's not 1920x720.
     The dominant standard for 4K is 3840x2160, it's not 3840x1440.
     The dominant standard for 8K is 7680×4320, it's not 7680x2160.*
     *Notice how the ultrawide vertical resolution here is 1/2 of the wide one, and this is what AMD used for their "8K" performance charts and argument (cheeky, eh?). Maybe because 16588800 total pixels versus 33177600 total pixels is quite a big difference (it's double the pixel count! See the quick check below.)

     FWIW, Steve's video at 6:05... Huh? No, it's me who doesn't know where and how you got those prices (??). The MSI R7970 LIGHTNING MSRP was $599 at launch, and that one was $50 more than the reference AMD R7970 model MSRP ($549) at launch...

     Oh, I did see that, a good stab from AMD at GeForce Experience... if a bit unfortunate, because no one seems to be using that optional POS anyway? I've used Nvidia (alternating with the odd ATi/AMD) for some 25 years (since the Riva TNT) and, to this day, I've never had to create an account, or be tracked, to access and tune all my Nvidia GPU driver settings. The regular NV Panel is fine and reliable, and all one needs. But if so inclined, there's NV Profile Inspector (a free 3rd-party app) which unlocks every single possible Nvidia graphics setting to be tuned (combined or not, general or specific) for the more advanced type of user. And it's been like that for well over a decade. Fun fact: to this day there's nothing like it for AMD, and yes, it's badly needed (and no, Adrenalin doesn't even come close, not even the defunct A.T.T. or R.P. 3rd-party apps). And let's not even get into the headaches the fancy AMD Adrenalin drivers give at times (instability, green screens, CTDs, BSoDs, performance issues)... much less seeing even more stuff getting added to it now, likely presenting (even more) trouble at some point. Hopefully they get it right, and it's all smooth sailing for every new customer of these new AMD GPUs when they launch.

     What I'm upset with is, 1) the messing about with the (re)nomenclature of GPUs, an evident price-hike manipulation on the product line-up below the top model - which we'll surely see across all subsequent models - and, 2) how desperate people seem to be that they eat this horse manure this easily. Nvidia withdrew the RTX 4080 12GB, and we all know it wasn't "because it was confusing" like they said (on the contrary, it was very clear what they did). It's because it became evident it was an outrageously overpriced "70-class" model (an RTX 4070 in reality), creating gargantuan backlash all around the web. For which AMD could maybe present a more affordable alternative, and they sensed it. AMD pulled a similar (re)nomenclature and price trick. In their case, and this time around, the AMD x900XT is now the x900XTX model, and the x800XT model is now the x900XT. But hey... *shrugs*
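     For reference, the pixel-count difference called out above is trivial to verify (a quick sketch in Python):

         # True 8K vs the "8K" ultrawide used in AMD's charts.
         true_8k   = 7680 * 4320   # 33,177,600 pixels (the dominant 8K standard)
         ultrawide = 7680 * 2160   # 16,588,800 pixels (half the vertical resolution)
         print(true_8k, ultrawide, true_8k / ultrawide)   # ratio is exactly 2.0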
  9. No, that's not the same category. The R9 295X2 (which was actually two cards on one PCB, like the GTX 690 before it) was, like the original Titan, the tip of the spear, a halo product. None of those was ever meant to sell in numbers, and they're definitely not high-end, they're above that, in a class of their own. They're the so-called "enthusiast class". A bit like the RTX 4090 may be considered right now, or previously the RTX 3090Ti, or RX 6950XT, or RTX 2080Ti. A high-end GPU is something like an Nvidia GTX 980, or RTX 2080, or RTX 3080. And the AMD HD7950, or R9 290, or RX 6800XT. Historically and categorically, no high-end GPU had passed the $650 MSRP before the pandemic. "We're all living in Amerika"
  10. The marketing and buzzwords perhaps, but the misleading claims are quite egregious.

      The "8K" they presented is an ultrawide resolution (much shorter vertical area) that is not even close to the 8K pixel count, therefore misleading. The "we use the old/regular PCI-E adapter/cable, therefore with ours you can use your old PSU" was either misinformed or misleading on purpose, as more than one brand is presenting cables/adapters meant for current (non-ATX 3.0) PSUs to actually use them with the RTX 4000 series. It's also quite in bad taste to present a problem that just appeared 2 or 3 weeks ago on the competitor's product as "Hey! Ours didn't have it from concept because... errr, well, hmm... we knew better and it was exactly intentional! ...right??"

      Then you were either robbed, or bought an unnecessary uber-OC model from one of the AIBs. The MSRP of the GTX 1080Ti FE was $699 and, for most of the time, it actually sold at or below that price worldwide! Considering continents like Europe now never get prices below MSRP+22% (and actually more for the RTX 4090), how much do you think these RX 7900s will actually sell for?

      Also, one could argue that, while the RX 7900 XTX is "fairly priced" (judging by where it slots in performance - between the RTX 4090 and RTX 4080), the RX 7900 XT is really badly priced, because it actually replaces the RX 6800XT (which was $649 MSRP). In two generations we're paying double the price for the same segment of products. The HW market never worked on a basis of "new product faster than old means the price increases accordingly", which is what these manufacturers are pulling on us - and with which you seem to actually agree (??). It never worked like this; that is not progress. There is no more lockdown and mining crisis to justify the price gouging we had for nearly two years. No one should bite that bait anymore, as it will only justify these ludicrous prices - as we're seeing now.

      The 7700XT is only speculation. It won't come anytime soon and it's not even known if it will feature a chiplet design. You can now find used RTX 3090s and RX 6900XTs for around $650 (+/-), and prices will only decrease - that is fairly good value, and you can get it right now.

      You're missing the global picture. These ludicrous prices for new high-end GPUs mean that mid-range segment prices will increase accordingly (and again, MSRP being a pipe dream) - paying what you would have for high-end just a little over two years ago. PC gaming is doomed as in, it's now a hobby for the rich, like it was before the late '90s. The fellas paying for the RTX 4090s and RX 7900XTXs are the minuscule elite of gaming, and never meant that much in the bigger picture, because they matter very little for the sales numbers of games (simulations or otherwise, AAA or otherwise) and therefore to the real development of such products - stagnation, as well as the failure rate of otherwise good projects, may increase. The bulk of the ROI of game development (simulations included) sits within the mid-range, where the bigger numbers of potential customers are. New tech and HW horsepower only matter if the widest public can reach it - if we're only left with these "small elites" (because that's what they are), then even products like DCS will fade and die, because the number of customers (old and new, who can no longer afford this pace) will dwindle on and on.
  11. Quite frankly, very disappointed. Lots of BS marketing, buzzwords, and misleading arguments (8K that is not 8K, the old PCI-E connector now being a feature(!) used as a defensive low blow at Nvidia, big etc.). The worst offender was pricing. It was kind of predictable that AMD would inflate prices right at the very end, taking advantage of the momentum from Nvidia's recent failures, and sadly that is what happened. If it now means accepting a $1000 MSRP for a GPU as "competitive pricing" (and we all know that in reality MSRP is a pipe dream), then PC gaming is doomed in the mid to long term. FWIW, two years ago the price of the RX 6900XT ($1000) was ridiculed and the product got totally overlooked (if it weren't for miners), exactly because it was utterly expensive... Also, the $100 price difference between the two presented products, when the performance difference is rumored to be about 20%, is certainly grounds for criticism. In the end, these new Nvidia or AMD GPUs are more a matter of "the lesser of evils", rather than real value and price/performance progress, as it used to be just over two years ago. All hail the used market, with RTX 3090s, 3080Tis and 3080s at far, far more affordable prices. That's where the proper performance for the money is at right now, and what everybody should be getting for a GPU (for DCS, but not only), IMHO. Not these new overpriced, exploitative, over-marketed products. Looking at local used markets and Ebay as I'm writing this...
  12. OC3D article: https://www.overclock3d.net/reviews/cpu_mainboard/intel_13600k_and_13900k_ddr4_vs_ddr5_showdown/1 OC3D video of the same article:
  13. Backing up the Saved Games directory is important - all the settings and options (graphics, audio, game, controls, etc.) are there, as are your custom missions and add-on mods. If you have a bunch of modules, it can be a royal PITA to reconfigure all the controls. So, yep, "backup, backup, backup" (see the sketch below). I back up the install directory for particular reasons. I mostly use DCS 2.5.6 (and not really the latest versions), which I've widely modified - probably months' worth of continuous modifications. Every version after that (since 2.7 came out) has been worse and worse performance-wise, so losing this 2.5.6 version that I've got would probably dictate retiring from DCS.
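      For the "backup, backup, backup" part, even a tiny script run now and then does the job. A minimal sketch in Python (the destination path is just an example; adjust both paths to your own system):

          import shutil
          from datetime import date
          from pathlib import Path

          # Default DCS settings location on Windows -- adjust if yours differs.
          src = Path.home() / "Saved Games" / "DCS"
          # Hypothetical backup destination, dated so older backups are kept.
          dst = Path("D:/Backups") / f"DCS_SavedGames_{date.today()}"

          shutil.copytree(src, dst)   # settings, controls, missions, mods
          print(f"Backed up {src} -> {dst}")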
  14. Do not worry, NVIDIA will come up with a redesigned replacement cable soon.........!!! Now more seriously... If NVIDIA is briefing all board partners with urgency and treating the damage as an absolute priority, then the matter is pretty darn serious. It all comes down to this - while most RTX 4090 users may not have this issue with the cable right now, they might have it at some point... (not good!). Let's just hope that an easy and practical solution will be found and implemented, for the peace of mind of both future and existing RTX 4000 GPU users. The writing was on the wall since day one. And, as always, early adopters suffer the most (this time on a ~2000€ product, not sold at MSRP). A bit of a sad realization that what most of us have been saying in these forums - "be patient, wait a bit more, then decide on your next high-end GPU" - actually makes all the sense, more so now, seeing this problem. NVIDIA scored an own goal. AMD may become the #1 choice by default. ...I just hope AMD doesn't play dirty and perceive all this as motivation to over-inflate prices at the last minute, because these new RX 7000 GPUs will sell like hot cakes from day one.
  15. Buildzoid may have the longest ramblings on PC hardware, but what he says always has good, valid reasoning behind it. There was nothing really wrong with how PCI-E connectors have been in recent years; it was unnecessary to adopt that stupid new connector. At the very least, it really needs improvement, period.
  16. Yep, happens with some TVs and monitors (I have that problem with mine too). Easily fixed with some adhesive velcro tape, very effective. There are many other similar products but, for example, something like this: https://www.amazon.de/Klettband-Selbstklebend-Doppelseitig-Klettverschluss-Selbstklebendes/dp/B085NZ5JVT/ref=sr_1_11?__mk_de_DE=ÅMÅŽÕÑ&crid=3BMTPAAKZU9ZW&keywords=velcro+tape+adhesive&qid=1666474837&qu=eyJxc2MiOiIyLjUzIiwicXNhIjoiMC44MSIsInFzcCI6IjEuMDAifQ%3D%3D&sprefix=velcro+tape+adhesive%2Caps%2C78&sr=8-11
  17. More news/rumours on RDNA3: https://wccftech.com/amd-radeon-rx-7900-xt-rdna-3-graphics-card-to-pack-20-gb-gddr6-memory/
      -------------- \\ --------------
      AMD Radeon RX 7900 XT To Feature RDNA 3 "Navi 31" GPU Core & 20 GB GDDR6 VRAM

      The AMD RDNA 3 GPU lineup will kick off first with high-end offerings based on the Navi 31 MCM chip. The chip will be featured in enthusiast-class Radeon RX 7000 graphics cards, including the RX 7900 XT, which we have received new information about from our sources. As per the details, the AMD Radeon RX 7900 XT graphics card won't be the top model, but it will feature 20 GB of GDDR6 VRAM.

      Our source reports that AMD's Radeon RX 7000 "RDNA 3" GPUs may exceed expectations and, while there are conflicting rumors (here and here), it looks like the end result might be a far better product than anticipated. We have also come to know that the AMD Radeon RX 7900 XT originally featured 24 GB of memory capacity before being downgraded to 20 GB. The full-fat 24 GB and top Navi 31 bin will be aimed at NVIDIA's full-fat Ada die (the RTX 4090 Ti). AMD seems very confident that, with 20 GB and a slightly cut-down MCM chip, they will sit in a comfortable position against the RTX 4090, and may even outperform it in pure rasterization performance while bringing a big jump in RT performance versus the existing RDNA 2 GPUs.

      With the Radeon RX 7900 XT utilizing 20 GB of memory, it would indicate that the rumors regarding a higher-end SKU seem to be true. AMD already started using the *950 XT convention in its last RDNA 2 generation of GPUs, so an RX 7950 XT is surely happening. Angstronomics also reported similar details a while back in their juicy article over here, which you should definitely check out.

      The Navi 31 GPU will also carry six MCDs, which will feature 16 MB of Infinity Cache per die and are also likely to carry the 64-bit (32-bit x2) memory controllers that will provide the chip with a 384-bit bus interface. While this equals 96 MB of Infinity Cache, which is lower than the 128 MB featured on the current Navi 21 GPUs, there's also a 3D-stacked solution in the works, which was pointed out recently, and that would double the Infinity Cache with 32 MB (16 MB 0-hi + 16 MB 1-hi) capacities for a total of 192 MB of cache. This is a 50% increase versus the current Navi 21 design, and it also makes Navi 31 the first GPU with both chiplet and 3D-stacked designs. These chiplets, or MCDs, will be fabricated on TSMC's 6nm process node and measure 37.5mm2 each.

      Now, this is going to result in a higher power draw, and AMD seems to have confirmed as much: their next-generation graphics card lineup will feature higher power consumption, but will still be a more efficient option than what NVIDIA has to offer. The AMD Radeon RX 6950 XT already has a TBP of 335W, so, for a >2x performance gain, a higher figure is to be expected. The cards are expected to retain their dual 8-pin plug input for power and feature an updated triple-fan cooling design, which is slightly taller than the one currently in use.

      AMD will be unveiling its RDNA 3 GPU architecture and the Radeon RX 7000 graphics cards on the 3rd of November. They have a full livestream planned, which you can read more details about here.
  18. The AMD 7000X3D will be unveiled at CES 2023 (January, I think), which means February 2023 is when it might be out (we'll see). It should be really good. The problem, other than the wait (and availability), is the price, which should be pretty salty (over 500€ expected). The X670, B650 and B650E motherboards are (currently) also not really affordable, and the same goes for 64GB kits of DDR5 (what AMD recommends is DDR5 6000 CL30, check the prices...). Reviews are out, and the i7 13700KF and i5 13600K are definitely very competitive and actually the value kings, considering that you can still get them running with DDR4 (rather than DDR5) with excellent performance. And that's possible with the least expensive Z790 DDR4 motherboards - which can take DDR4 RAM you may already have from a previous system or, if not, the far more affordable 64GB kits of DDR4 3600 CL18 RAM (now going for 230€, give or take). Really, if it's a matter of budget (also because of the GPU and PSU prices involved in a new system), I think the CPU+mobo+RAM combo here is a no-brainer.
  19. Have you tried reducing the RAM speed a step below (for example, as it's 3200, reduce to 3000)? That can help immensely with stability, regardless. I initially thought you had a problem of mismatched timings and/or voltage, but it could be a temperature (overheating) problem. Even if it were the two different dual-kits conflicting, I agree that 45 mins of flight on an MP server (with high RAM consumption) should have shown problems long before that (IMO). Fast low-latency memory such as B-Die (like you have there) can be sensitive to temperature. But, have to say, it does look a little odd to me to see ~60ºC (max) with just XMP loaded. (...is DRAM voltage at 1.35v or 1.40v, not more? ...maybe a restrictive case with very little airflow? ...something's definitely playing up there.) If you look around (google something like "B-die temp stability", or a related subject), you'll see a few people running them fine in the mid 50ºC range, but most cannot run them stable at all above 45ºC. That said, this temperature problem usually applies to people overclocking and running tweaked (tighter) timings... you're not really pushing that memory, as you're just using XMP settings (i.e., "stock") with fairly loose timings. Right now, see if you can mount a 120mm or 140mm fan pointing air directly at that RAM (maybe using some contraption with zip-ties, like this fella here) and see if it helps. Maybe not urgent, but I'd recommend investing in good airflow for that system before upgrading anything else.
  20. Yep, for DCS, I think so. i5 13600KF (or i7 13700KF) + least expensive kit of 64GB DDR4 3600 CL18 RAM + most affordable Z790 DDR4 motherboard... That will probably be (again) the best value base combo for new DCS systems, IMHO.
  21. AMD RDNA 3 "Navi 31" rumors: Radeon RX 7000 flagship already with AIBs, 2x faster rasterization and over 2x ray tracing improvement: https://wccftech.com/amd-rdna-3-radeon-rx-7000-gpu-rumors-2x-raster-over-2x-rt-performance-amazing-tbp-aib-testing/
      ------------------ \\ ------------------
      AMD RDNA 3 "Navi 31" Rumors: Radeon RX 7000 Flagship With AIBs, 2x Faster Raster & Over 2x Ray Tracing Improvement

      The latest rumors come from Greymon55, who suggests that AIBs already have AMD Radeon RX 7000 GPUs based on the RDNA 3 architecture being tested in their labs. The leaker didn't name partners, but did give us some early performance estimates which sound really good. We do know that AMD is working on its Navi 31 flagship GPU for launch next month on the 3rd of November, so it is likely that these are the flagship chips being tested at the moment to tackle NVIDIA's GeForce RTX 4090 & RTX 4080 graphics cards.

      AMD RDNA 3 "Radeon RX 7000" GPU With 2x Raster & Over 2x RT Performance?

      As per the leaker, the AMD RDNA 3 GPUs featured on the Radeon RX 7000 series graphics cards are delivering up to a 2x performance increase in pure rasterization workloads and over 2x gains in ray tracing workloads. It is not mentioned whether the RDNA 2 GPU used for comparison is the RX 6900 XT or the RX 6950 XT but, even if we look at the 6900 XT, the RDNA 2 chip offered superior raster performance vs the RTX 3090 and came pretty close to the RTX 3090 Ti, while the RX 6950 XT excelled over it. A 2x improvement in this department would mean that AMD could easily compete with, and even surpass, the performance of the RTX 4090 in a large selection of games. In ray tracing, a gain of over 2x means that AMD might end up close to, or slightly faster than, the RTX 30 series "Ampere" graphics cards, depending on the title. The Ada Lovelace RTX 40 series cards do offer much faster ray tracing capabilities, with close to 2x gains in ray tracing performance over the RTX 30 lineup. So ray tracing will see a major improvement, but it may not be able to close the gap with the RTX 40 series. There's no word on new AI-assisted capabilities featured on RDNA 3 GPUs to help with upscaling technologies such as FSR.

      Reference TBP of AMD RDNA 3 GPUs Reportedly Looks "Amazing"

      Lastly, the leaker states that the reference TBP looks great, though we don't know if that's comparing it against the RTX 40 series or the RDNA 2 lineup. AMD has already said that the RDNA 3 "Radeon RX 7000" GPU lineup will have much lower power figures than the competition. AMD confirmed that its RDNA 3 GPUs will be coming later this year with a huge performance uplift. The company's Senior Vice President of Engineering, Radeon Technologies Group, David Wang, said that the next-gen GPUs for the Radeon RX 7000 series will offer an over 50% performance-per-watt uplift vs the existing RDNA 2 GPUs. AMD's SVP & Technology Architect, Sam Naffziger, has highlighted that the next-generation RDNA 3 GPUs, featured on the Radeon RX 7000 GPUs and next-gen iGPUs, are going to offer a range of new technologies, including a refined adaptive power management tech to set workload-specific operating points, making sure that the GPU only utilizes the power required for the workload. The GPUs will also feature next-gen AMD Infinity Cache, which will offer higher-density, lower-power caches and reduced power needs for the graphics memory.

      The AMD Radeon RX 7000 "RDNA 3" GPU lineup based on the Navi 3X GPUs is expected to launch later this year, with reports putting the flagship Navi 31 launch first, followed by the Navi 32 and Navi 33 GPUs. A recent rumor also highlighted that the graphics cards will hit retail shelves in December.
  22. No, that's not heat. I'm seeing "TridentZ" and "TridentZ Royal" mixed there. That's not an exact model match, as you suggested in your previous post. While from the same manufacturer and (apparently, according to you) with the same speed and timings, the components and XMP settings (sub-timings, etc.) can vary wildly between the two models, and that could very well be why it's crashing when pushed. The problem is that you're trying to run unmatched dual kits of RAM with the XMP settings of one applied to the other. I still think that "32 + 16" mix of different-model RAM you've got there can work fine, but it's not a "plug n' play" one-click-button solution. You can try the following in the BIOS for the memory settings (a compact checklist follows below):
      - Instead of using XMP settings, use "Auto" settings for the RAM. Once that's done, manually insert the speed (3200, was it?)*
      - After that, manually insert the main timings (usually in the advanced memory settings), which are tCL, tRCD, tRP and tRAS (16-16-16-32, was it?)*. Leave all the other sub-timings on Auto.
      - Increase the DRAM VOLTAGE to 1.40v (a small +0.05v increase over what you'd see with XMP). This +0.05v increase for the RAM is strongly recommended for stability with four sticks, and it's completely harmless. No overheating, nothing, no problem whatsoever.
      * I have no idea what specific memory models you have there, so you'll have to figure out their speed and timings.
      If all that fails, then you can try reducing the RAM speed one step below (for example, if it's 3200, reduce to 3000). It'll almost certainly work fine. And yes, using a 64GB dual kit (2x 32GB), or adding a second 2x 16GB dual kit (as long as it's the exact same model you already have), are better solutions.
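      And the compact checklist mentioned above, written out as a small Python snippet purely for readability (the speed, timings and voltage are the thread's examples, not universal values):

          # Manual BIOS memory settings, summarized from the steps above.
          manual_ram_settings = {
              "XMP": "Disabled (set everything below manually)",
              "DRAM frequency": "3200 MT/s",   # drop to 3000 if still unstable
              "Primary timings (tCL-tRCD-tRP-tRAS)": "16-16-16-32",
              "Sub-timings": "Auto",
              "DRAM voltage": "1.40v",         # +0.05v over the usual 1.35v XMP value
          }
          for setting, value in manual_ram_settings.items():
              print(f"{setting}: {value}")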
  23. ........... LOL https://www.nvidia.com/en-us/geforce/news/12gb-4080-unlaunch/ That's it, the RTX 4080 12GB is cancelled. The huge (justified) backlash all over the tech communities - "the 4080 12GB is just a rebranded 4070 to justify its stupidly inflated price" - was just too big. Now, imagine all the AIBs producing their versions of this model, as Nvidia was not producing an FE version of it. Which means this only hurts the AIB partners, not so much Nvidia themselves. Makes you wonder how much the EVGA story could have had to do with it......
  24. No harm in trying. Intel 10th gen (with Z490) was the last CPU line-up that, in my experience, permitted that sort of RAM goofing around. Just make sure that they are indeed the same make and model, with the same exact XMP settings (speed and timings, etc.), before going for that mix-remix of RAM capacity. Place the four sticks. Start the PC. Then get into the BIOS and load the XMP. Afterwards, still in the BIOS, search for the DRAM VOLTAGE setting and increase it by +0.05v (the stock value from XMP should be 1.35v, so the tweaked voltage value should be 1.40v). This very slight increase in RAM voltage is usually recommended when running four sticks (whether of the same capacity or not), and it's completely harmless. SAVE and EXIT the BIOS. Boot the PC into Windows, do some tests (DCS or whatever you feel is right to stress-test it), and come back here later to tell us the story of how it went...
  25. I don't have Win11 (using Win10 Pro 64-bit and will keep on using it), so I'm not sure if the following applies - also not sure if you have tried it already. It could be as simple as reinstalling the software (to clear any corrupted files). Afterwards, you may also wish to try some different compatibility settings on the program executable: right-click it, then click "Properties", then "Compatibility", and set that to "Windows 7" (or "Windows 8" instead). You may also wish to try setting it to "Run as Administrator" as well (if you'd rather script it, see the sketch below). Other than this, I'm currently out of ideas. If all else fails, perhaps contact the TrackIR developers for assistance and/or ask in their public forums: https://www.trackir.com/help
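      If you'd rather script those compatibility settings than click through the dialog, Windows keeps per-user compatibility flags in the registry under AppCompatFlags\Layers. A minimal sketch in Python (the TrackIR install path is an assumption, adjust it to your system):

          import winreg

          # Assumed TrackIR install path -- verify yours before running.
          exe = r"C:\Program Files (x86)\TrackIR5\TrackIR5.exe"

          key_path = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"
          with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
              # "~" prefix, then the layers: Windows 7 compatibility + run as administrator.
              winreg.SetValueEx(key, exe, 0, winreg.REG_SZ, "~ WIN7RTM RUNASADMIN")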