Everything posted by LucShep
-
Please stop. No, it's you who isn't understanding the product, its production volume and respective sales, the target customer, the competition, and the potential scale of this issue.

1) Overclockers is ONE hardware shop worldwide among hundreds of thousands (counting online businesses along with brick-and-mortar stores). So, no, they're not representative of anything.

2) This same reference design is sold not just by AMD themselves, but also by PowerColor, Sapphire, XFX, ASRock, Asus and Gigabyte. Only MSI is not selling it. The problem is humongous because these reference-design models are the most important ones - they're the bulk of production and of the sales/profit of AMD's RX7900XTXs. The AIB models with custom coolers, boards and power stages are better products but less lucrative, not expected to sell in the same numbers, which is why their production runs are smaller - they're necessarily more expensive and in line with the prices of reference-based RTX4080 models, the aimed competitor. The "better value" marketing argument no longer works there (foot, meet mouth), so most of the attention is on the reference-design models sold by all these brands. That is why this issue is kind of a big deal.

3) Considering the previous points, perceived image is important and damage is already being done to the product, even if this turns out to be a simple bad batch. And if it turns out to be more than that (oh boy!), then we could be talking about tens (hundreds?) of thousands of GPUs with the problem, likely requiring RMA - so far denied (sheer incompetence, lack of QC, and anti-consumerism from AMD? I think so). Otherwise, users raging on the internet and in the stores by the hundreds, if not thousands, may follow. It could become devastating for logistics, costs, public image and brand reputation, even for big companies like NVIDIA (remember the RTX4090's 12-pin plug cable issue - which the RTX4080 does not suffer from); imagine that with a far smaller company like AMD.

4) Again, Roman Hartung aka "der8auer" is anything but a "clown" as you said above (FFS...). Once again, he's providing a public service (an outstanding one, IMO), giving us and also the manufacturers a heads-up, with proof, about "lower quality and/or issues" in expensive hardware products, through his own research.
-
OMG... Roman Hartung aka "der8auer" is a "clown"?? Really? ...One of the most proficient overclockers and most hardware-knowledgeable people on the planet, directly assisting brands with products that we all use? What's next, Vince Lucido aka "Kingpin" is a clown?? Seriously, who are you? No, really... WHO ARE YOU?

Comparing RX7900XTX prices with RTX3090 prices? LOL ...Troll much? Fun fact: I bought an RTX3090 (and an FTW Ultra at that) just a couple of months ago for less than half (half!!) of the price you paid for your overpriced 7900XTX Nitro+, when the performance difference is about 23%. Or less after an undervolt, which gives it a wee bit more performance and nearly 100W less consumption (compared to factory defaults).

BTW, that RX7900XTX Nitro+ is an AIB model that sells for 1400,00+ Euros. The hilarious bit (that is very likely going over your head) is that's the same price as an RTX4080 - of which every single model is utterly excellent (if overpriced as well), contrary to what can be said about the RX7900XTX. The point of an RX7900XTX over an RTX4080 is that the former is a cheaper alternative to the latter - but not at close to or the same price (unless you're a gullible AMD fanboy?!).

Who in his right mind would come into these forums to derail others with utter BS and write such nonsense? (preferring instead to take clickbait 'tubers as references! OMG) Who in his right mind pretends to know stuff, then buys an RX7900XTX Nitro+ when there are RTX4080 models available that are, as a matter of fact, a superior product (far better drivers, therefore better game support and features, better cooling and temps, better memory (GDDR6X), better construction, better power consumption, better overclocking, far bigger userbase and shared knowledge) at pretty much the same price? ...It's beyond belief... *collective facepalm*
-
This recent vid from Frame Chasers was an amusing one (and maybe not too far-fetched)... EDIT: warning! - explicit language
-
YAY!!! I was just reading the initial DCS A-7 Corsair II thread, thinking "oh no, there's still no progress since August/22", wondering if the development might have stalled. Then I noticed FlyingIron Simulations has its own dedicated section in these forums! Such an awesome development progress report. That's exactly what we all want to see, from time to time, until the module is released. The A-7 Corsair II is an aircraft that my country (Portugal) proudly used for many years, and one I saw flying around since I was a kid. Later on, I even watched (mesmerized!) a pair of them in ground combat simulations, during exercises back in my days in military service - impossible to forget those memories. So, it's no wonder it's a module I've wished to have in DCS for years now. Your progress reports make that desire even bigger - eagerly waiting for the A-7 Corsair II module! Thank you, team!
-
I've used an RTX3060Ti 8GB for some 10 months, which is basically a slightly lower-specced RTX3070 8GB. Once undervolting + memory OC is done on it, it matches the performance of a stock RTX3070 8GB. So, I feel qualified to comment.

You will definitely be bottlenecked by the 8GB of VRAM at some point (stutters ensue then), especially with the more detailed-textured modules (F-14, F-16, more so if on Syria, for example), but performance will generally be "OK" for ~60FPS at 2560x1440. The RTX3070 8GB is about 20% to 25% faster than a GTX1080Ti 11GB but, while you can notice it, that's too little of an improvement to feel like a true upgrade over it.

So, perhaps your interest in getting a brand-new RTX3070 8GB is due to budget constraints. If that's the case, I'd really consider instead a used (2nd hand) RTX3080 10GB, which should be fine with that 750W PSU you list in your specs. It'll surely feel like an upgrade over the GTX1080Ti 11GB (it's ~50% faster!). The RTX3080 10GB has a LOT more "oomph" than the RTX3070 8GB, higher framerates are guaranteed, and it will definitely not bottleneck as soon - not only is there more VRAM, it's better and faster VRAM too (GDDR6X).

Don't let the "ooh it's used!! ...oooh it may have been used for mining!!!" thoughts get to you because, while there may exist some bad apples, there are really good deals happening all the time (on eBay, etc). There are lots of people upgrading to the (horribly overpriced!) newest higher-end models and selling theirs to soften the financial hit of that novelty upgrade. Take advantage of it while you can. The RTX 3080, and also the 3080Ti and RTX 3090, will still be great GPUs for a long, long time, but I don't think paying stupid prices for a brand-new one is worth it when you can get 2nd hand ones in mint condition for a lot less. Hence my suggestion.

Lastly, as a side note: if you're using an NVIDIA graphics card and a PC monitor (or TV) without any VRR features (so, without FreeSync or G-Sync), please don't feel forced to use V-Sync "ON" just because you too can't stand screen tearing. Instead, perhaps have a go with Scanline-Sync (tutorial in HERE). With it, you use V-Sync OFF in the game options, and Vertical Sync at "OFF" or at "Fast Sync" (depends on the game; DCS prefers the latter for Scanline-Sync) in the NVIDIA Control Panel. No tearing, runs smooth - and DCS runs better in comparison.
-
Yep! I also prefer 4K/60Hz TVs (QLED or OLED, preferably) for flight/racing/mil simulations. For anything fluid, or less frantically paced than the "chicken-with-no-head" pace of esports games, you really don't need uber-refresh gaming monitors (expensive!) at all. Image quality and size (bigger resolution and size = more real estate) are far more important, IMO. You can use Scanline-Sync, and you won't even miss the FreeSync/G-Sync VRR features (lacking on most TVs). Here, you use V-Sync OFF in the game options, and Vertical Sync at "OFF" or at "Fast Sync" (depends on the game; DCS prefers the latter for Scanline-Sync) in the NVIDIA Control Panel. No tearing and just as smooth (like butter).
-
RTX 4080 & 7900 XTX/XT benchmarks DCS
LucShep replied to xoxen's topic in PC Hardware and Related Software
That's a fair point, but then one can say the same about the "Plazma Torture Map test" used so often in this thread - it'll show CPU limitations far, far sooner than anything GPU-wise, won't it? One would think that using 4K or VR, in MP, with the Supercarrier + F-14A/B or F/A-18C, or the F-16C on Syria, or the P-47 on Normandy, would be a far more sensible test for GPU and VRAM stress/utilization aspects (framerate, frametime, temps, wattage, bandwidth, etc), for comparison purposes with these higher-end GPUs. Those not owning such modules may be able to trial them, then use them for such a "benchmark".
-
RTX 4080 & 7900 XTX/XT benchmarks DCS
LucShep replied to xoxen's topic in PC Hardware and Related Software
I understand the purpose of benchmarks, but I notice that most (all?) are testing with FC3 aircraft (Su-27, A-10A, etc), which have much lower 3D detail in the cockpit and far lower-res textures. Same thing for the map: Caucasus is simpler and performs much better than pretty much any other DLC map. I'd say to try it with higher-detail modules, such as the F-14A/B Tomcat, F-16C Viper or AH-64 Apache (among others) and on, say, the Syria or The Channel maps, to get a better representation of the real capability of the GPU under test.
-
Yep, this! ^^ Even better if you can test before committing to it. I'm living proof of it - I've "downgraded" from a beautiful 55'' curved Samsung NU8500 to a flat 43'' Toshiba QLED, exactly because (unfortunately) it was simply too big for desktop use (it was awesome for a simpit though). 42'' to 48'' is the min-to-max range for me but, well, of course everyone has his/her own preferences.
-
In my experience, the image will be great on both, provided it's a good monitor and a good TV being compared. The advantages of a PC monitor are not many, but there are some. One is the use of DisplayPort instead of HDMI; for example, it's useful in some specific situations, like using an Nvidia GPU with FreeSync if that's what the monitor features (you can't do that over HDMI). The others are lower response times and input lag, higher refresh rates, and better motion handling - though this is no longer an issue with mid- to high-range 4K TVs (those of renowned brands, mind), thanks to console gaming (Playstation and Xbox) lately forcing a big raise in their gaming standards.

Especially the latest OLED TVs from LG (the C1 and C2) are pretty much perfect in everything (image, response, latency, refresh, whatever) - they're truly amazing. The only downsides of OLED TVs are image burn-in with prolonged display of static images (for example, overlays and HUDs in games, etc) causing issues in the long term and, of course, the high price. And here is where QLED comes in: it's plenty good, more affordable, and there are no burn-in issues with long-term usage, in whatever usage scenario. Problem is, a good 120Hz QLED TV may be too close in price to one of those two 43'' PC monitors you list (and both are good).

I look at it from the two extreme opposite ends... If you want to spend the least, and if you're OK with being limited to 60Hz, the TCL 6-series C635 (or R635) 43'' 4K QLED TV is a great "budget" option to use as a big PC monitor. But, that said, if you can afford either of those two monitors you mentioned (ACER Predator CG437K or AOC G4309VX/D), then certainly go for either 43'' PC monitor.
-
Getting a PC monitor is really a matter of preference, use-case scenario, system and budget. So take my opinion with a grain of salt.

Personally, I've tried plenty of ultrawide 21:9 monitors (QHD+), curved and non-curved, and find it really is an acquired taste - you either love it or hate it. After the initial novelty moments of being impressed with "oh, you can really see more to the sides when in the cockpit!", it ends up feeling like the top of the screen has been chopped off. The vertical depth is completely lost (IMO) and with flight sims (DCS and others) it does not work as well as 16:9 - for me, that's still the perfect format. The upside of 3840x1600 is that, not having as many vertical pixels as 3840x2160, your graphics card needs less work to display the same settings (see the quick pixel-count comparison below). And that is (IMHO) the only downside of a 3840x2160 (4K) screen - it can push the graphics card quite a lot.

Coming from your 2560x1440 27'' monitor, you'll have your jaw dropped once you look at a big 43''(+) 4K 3840x2160 screen, that's for sure. It may even feel a little "too much" in the first days (maybe distance yourself a bit more from the screen) but you'll get used to it and, trust me, afterwards nothing smaller will ever get the same appreciation. All of a sudden the cockpits and everything else in the sim/game look nearly "real-life size" - it's amazing (at least in my experience).

You've got a pretty good system, so I don't envision it having any performance problems at 3840x2160 (4K) at this moment but, as you know, DCS is always adding detail and getting heavier, so be well aware of that for the future. All of the monitors you list are very good, so I think it's just a matter of your preference and budget, really. I'm currently biased towards a good QLED or OLED 4K TV - also great options at the moment if you want a big screen (42'' and over) for gaming/simming, some at lower prices than 43'' 4K PC monitors. Maybe also check those from LG, Samsung, TCL, Hisense and also Toshiba, if interested in that route.
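For the curious, that pixel-count difference is easy to put a number on - a quick back-of-the-envelope check (plain arithmetic, nothing sim-specific):

```python
# Pixels the GPU has to render per frame: 21:9 "4K-wide" vs real 4K
uw = 3840 * 1600   # ultrawide: 6,144,000 pixels
uhd = 3840 * 2160  # 4K UHD:    8,294,400 pixels
print(f"ultrawide renders {uw / uhd:.0%} of 4K's pixels")  # -> 74%
```

So roughly a quarter fewer pixels per frame at 3840x1600, which is why it's noticeably lighter on the graphics card at the same settings.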
-
Yes, rumours for yet-to-be-announced future GPUs have started already...

NVIDIA RTX 50 Series 'Blackwell'
Source: https://wccftech.com/rumor-nvidia-rtx-50-series-blackwell-gpus-will-bring-biggest-performance-leap-in-nvidias-history/
"Previous reports have indicated that Blackwell GPUs will be fabricated on TSMC's 3nm process. The rumor comes from RedGamingTech and they have received some new information regarding the architecture. Firstly, Blackwell will feature an entirely new SM structure. Considering the underlying microarchitecture is shifting to an MCM design with Blackwell - this is not surprising. Also, Blackwell will leverage a hyperspeed bus that will interlink the various SMs and chiplets. A denoising accelerator will also be part of the ray tracing pipeline (modern path tracing setups don't actually trace the full sequence, they do it partially and a denoiser handles the rest), which should result in significantly improved RT performance. There still seems to be no word on specifications, although the source notes that there are various Blackwell GPUs being considered and a lot of the binning will depend on how AMD's current RDNA3 and future RDNA4 offerings perform. Finally, RGT leaves us with the following teaser: "biggest perf leap in NVIDIA's history". NVIDIA Hopper was the world's fastest 4nm GPU at launch and the world's first with HBM3 memory. It featured higher specifications than even the NVIDIA RTX 4090 (which contains 16,384 CUDA cores) at a net total of 18,432 CUDA cores. Blackwell will provide a significant generational improvement over Hopper (as has always been the case). Four NVIDIA Blackwell GPUs have already been confirmed in a prior leak."

AMD RDNA4
Source: https://wccftech.com/amd-rdna4-rumored-specifications-monstrous-129-fp32-tflops-gpu-gddr7-3-5-ghz-clock-rates-and-more/
"One of the things that was lacking with AMD's RDNA3 architecture was the ray tracing and AI/ML performance, and AMD is looking to fix that with RDNA4. Caches will get big upgrades and a 3rd Generation Infinity Cache is going to be rolled out along with a new data prefetch system. WMMA (Wave Matrix Multiply-Accumulate) V2 will also enable 2x the performance per CU for (presumably) matrix instructions, allowing full SIMD lane usage. Even though the total increase in CUs is around 50%, the performance increase generation over generation is 100% - which is superb to hear. Given below are the separate dies:

AMD Navi 41 - 144 CUs
Here is where things get even crazier: Navi 41 is touted to have up to a massive 32 GB of VRAM - probably of the GDDR7 variety. A second configuration also exists with 24 or 48 GB of GDDR7 and 6 MCDs (the 32 GB variant has 4 MCDs). The Navi 41 will have 144 CUs which, clocked at 3.5 GHz and at a calculation of 128 ALUs per CU, should yield a whopping 129 TFLOPs of FP32 performance. Needless to say, that is a Godzilla level of performance right there.

AMD Navi 42 - 96 CUs
Navi 42 will be slightly muted, with a net total of 12,288 ALUs (96 CUs) which, clocked at 3.5 GHz, should yield around 86 TFLOPs of FP32 performance. 96 CUs is also exactly the count of the RDNA3 flagship, the Radeon 7900 XTX, so this SKU will represent the apples-to-apples generation-over-generation comparison between RDNA3 and RDNA4 (or, more accurately, as apples-to-apples as it's ever going to get). As RGT notes in their slides, specifications like these *can* and do change before the final revision.

AMD Navi 43 - 48 CUs
Finally, we have Navi 43, which is going to have 48 CUs. Clocked at around 3.5 GHz, this will yield around 43 TFLOPs of FP32 performance. This will probably be the lower end of AMD's RDNA4 lineup, and it just goes to show how much performance is being packaged into this upcoming RDNA4 architecture, where the "lower end" easily exceeds 40 TFLOPs. The memory standard is still going to be GDDR7 and it is going to come with 2x MCDs. This is also the only SKU where RGT notes that the GPU might not be of an MCM design - although they are being told this is currently the case."
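Side note: those TFLOPs figures check out against the usual FP32 formula (each ALU does one FMA = 2 floating-point ops per clock). A quick sanity check, assuming the article's 128 ALUs per CU and the rumored 3.5 GHz clock:

```python
# FP32 TFLOPs = CUs x ALUs/CU x 2 ops (FMA) x clock(GHz) / 1000
ALUS_PER_CU = 128   # per the quoted article
CLOCK_GHZ = 3.5     # rumored clock

for name, cus in [("Navi 41", 144), ("Navi 42", 96), ("Navi 43", 48)]:
    tflops = cus * ALUS_PER_CU * 2 * CLOCK_GHZ / 1000
    print(f"{name}: {tflops:.0f} TFLOPs FP32")
# -> Navi 41: 129, Navi 42: 86, Navi 43: 43 (matching the article's numbers)
```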
-
RTX 4080 & 7900 XTX/XT benchmarks DCS
LucShep replied to xoxen's topic in PC Hardware and Related Software
Yep, that was my conclusion. Last month I anticipated that it'd be this sh!tshow and grabbed a used EVGA RTX3090 FTW Ultra from eBay, a year old and in mint condition, for 700€ (shipping costs included). As the old adage says, "search and you will find". DCS at 4K/60Hz cranked to high+ settings (finally!) and not a single hitch; undervolted, it runs very cool (max temps don't go over 59ºC core / 71ºC mem / 65ºC hotspot) and rarely passes 265W consumption (peak so far was 308W, in the Heaven benchmark only). Couldn't be happier with the purchase. These newest higher-end cards, available only at stupidly high prices that could instead get me a nice 2nd hand motorcycle or car? No thanks, you can have them.
-
Hello Razbam, pardon the following wall of text...

I admire the attention to detail and the relevance given to "real thing" fidelity - vital for any top module in DCS. But, with all due respect, I think I'm not alone in feeling that some criticism of your sound work is applicable. Most of all, I wish Razbam gave more prominence to the internal "in-cockpit" sounds - this is where we spend 99% of the time in your planes (not outside of them).

I've got a bunch of DCS modules, and have trialed pretty much every one out there. I find the Mirage 2000C and the MiG-19 to have some of the worst sound design among all DCS modules - again, no disrespect intended. I can't hear what the engine (and throttle/thrust) is doing; I can only hear wind sound. For the few things I can hear, I don't get any stereo separation for higher audio quality (from within the samples themselves, not from effect triggers). Actually, I can't understand the profusion of mono-channel samples after looking into the sound files (when stereo is widely supported? - a quick script like the one below will show the channel counts). Or the utterly muted/muffled sounds, when that low-pass filtering is supposed to be done by DCS's own sound engine, more so when selecting the game's "Hear like in helmet" option.

As it is, and as much as I love the M-2000C, I'm put off it after flying just a few minutes. I feel... nothing. And I go back to another module. All in all, the immersion and feedback I get from sound when "driving" the aircraft is practically nil, no matter what version or revision of the sounds you've done so far for the M-2000C. You may say "but that's how it is IRL!", though I'd find that debatable. The thing is, without the "seat-of-the-pants" feel (not being in the real aircraft), and being restricted to visuals and audio, some compromises are necessary to balance and compensate, for immersion/pleasure/feedback during gameplay. And here I think other 3rd-party developers, as well as ED, interpret this far better. Take as mere "good" examples the F-16C, F-14A/B, F/A-18C, A-10C or even the FC3 modules. Loads of audio feedback, truly immersive, a delight to the ears. These have stellar work in the sound department and are, perhaps, a benchmark you could base your "in-cockpit" sounds on (at least for this module).

People give a lot of value to "eye candy", but "ear candy" is just as important in a simulation. Please take this as constructive criticism. I am (or rather, was) a sound designer for almost twenty years (work in racing-simulation games, and a few side projects in military/flight-sim mods), these days merely as a side hobby. I know the ins and outs of the craft, and I know it's hard and complex. I also know (past lessons) that when customers criticize, it's also a sign that they care. Cheers!
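(For anyone wanting to check their own modules, here's a minimal sketch of the mono/stereo tally I did. The folder path is illustrative - adjust to your install - and it only reads uncompressed .wav samples, not .ogg ones.)

```python
# Count mono vs stereo WAV samples under a module's sounds folder.
import wave
from pathlib import Path

sound_dir = Path(r"C:\DCS World\Mods\aircraft\M-2000C\Sounds")  # hypothetical path

mono = stereo = 0
for f in sound_dir.rglob("*.wav"):
    try:
        with wave.open(str(f), "rb") as w:
            if w.getnchannels() == 1:
                mono += 1
            else:
                stereo += 1
    except wave.Error:
        pass  # skip non-PCM or malformed files

print(f"mono: {mono}, stereo: {stereo}")
```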
-
Yep! Very disappointing, and this is what is holding me back from upgrading from BS2 to BS3. How you can have one without the other is kind of baffling. It seems obvious a lot of people want both the cockpit and the HUD fully in English. That was possible before, but not now, it seems. Be it by official update or by a mod from some talented fellow member, I prefer to wait and see how it goes before March, when the upgrade becomes less accessible price-wise.
-
I'd definitely recommend instead a used (as in, second-hand) Nvidia RTX3080 10GB or AMD RX6800XT 16GB, while you can still find deals on them at decent prices. Either can be bought used for around the same price as those two you mentioned there. Be patient and look around for ones from trusted sellers at the right price. Plenty are found on eBay, most in auctions/bidding, but some sellers are accepting offers and this may be your chance ("buy it now" prices are still a bit inflated).
-
RTX 4080 & 7900 XTX/XT benchmarks DCS
LucShep replied to xoxen's topic in PC Hardware and Related Software
WOW If true, that's a severe mishap by AMD, and some pretty heavy discounts on the RX7900 series are in order... which would actually make it better! LOL Meanwhile, an unlikely turn of events has just happened (maybe due to the underwhelming reception of the AMD RX7900 series)... NVIDIA GeForce RTX 4080 takes up the best-selling GPU spot at Newegg: https://wccftech.com/nvidia-geforce-rtx-4080-becomes-neweggs-best-seller-rtx-4090-takes-3rd-spot/ These are strange times, when an overpriced $1600+ GPU that was destined to rot on the shelves just a week ago becomes the best-selling GPU of them all.
-
AMD rDNA 3 PRESS CONFERENCE
LucShep replied to SkateZilla's topic in PC Hardware and Related Software
... -
Friggin awesome tool to create missions! Have been using this lately for some fun co-op missions with friends, it's perfect! Many, many thanks for creating this, also for continuing development with fixes and updates.
-
RTX 4080 & 7900 XTX/XT benchmarks DCS
LucShep replied to xoxen's topic in PC Hardware and Related Software
Nicely written. That's a very fair assessment of the situation and, whichever way you look at it, all these new GPU prices are FUBAR (like the ones before, during the pandemic/mining chaos).

AMD RX 7900XTX vs Nvidia RTX4080? The former looks better for the money (note: if at MSRP!) if rasterization is your priority, though we're yet to see how it performs in VR (edit: see post below, it's worse in VR). But then one realizes what 200 bucks more buys: a better cooler, upscaling/frame generation, RT performance, power efficiency, thermals, and not having to deal with AMD jank. The RTX4080 becomes a no-brainer - a harsh realization, I know. Plus, one will probably make most of that money back in electricity savings (rough numbers below), due to the RTX4080 consuming less power on load and not having any of the idling issues of the RX 7900XTX and XT (due to AMD's multi-monitor power-consumption problem).

What needs to happen is for AMD to learn to price their cards competitively, and a ~17% discount for a crappy version of an RTX4080 isn't that. Then the other AMD GPU, the RX 7900XT, is a stupid joke, selling at just $100 less than the RX 7900XTX. AMD is not a charity, and people should stop treating them like "oh, the poor underdog needs support, take my moneyz" and donating to them via purchases of somewhat inferior products.

I just talked with old connections working in PC stores here in Portugal (so, in the EU); they all suggest the RX 7900XTX will be available here near Christmas, priced at 1250€ and over, and way more expensive than that for the improved models produced by AIBs (Powercolor, Sapphire, Asus, MSI, etc)... That's not good at all.

So, there isn't an ideal solution on the horizon for those looking to upgrade to a new high-end GPU. You either suck it up, or go the alternative route - which IMHO is the better solution - skip this generation and/or get a 2nd hand Nvidia RTX 3080, 3080Ti, 3090 or 3090Ti on the used market, maybe even an AMD RX6800XT, 6900XT or 6950XT if at lower prices. That is, while you can still get them at fairly decent prices. For these, be patient and look for ones at the right price from trusted sellers; you'll eventually find one (I did this past month, in just three days).
-
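(The electricity-savings bit, with very rough numbers - all three figures are assumptions for illustration, not measurements: ~100W average draw difference including the multi-monitor idle issue, 6 hours of use per day, and ~0.25€/kWh as a ballpark EU tariff.)

```python
# Back-of-the-envelope: yearly cost of a ~100W average power-draw difference.
WATT_DIFF = 100       # assumed average difference (load + idle), in watts
HOURS_PER_DAY = 6     # assumed daily usage
EUR_PER_KWH = 0.25    # assumed electricity price

kwh_per_year = WATT_DIFF / 1000 * HOURS_PER_DAY * 365
print(f"~{kwh_per_year:.0f} kWh/year -> ~{kwh_per_year * EUR_PER_KWH:.0f}€/year")
# -> ~219 kWh/year -> ~55€/year
```

Change any of those three numbers to your own situation; at heavier usage or higher tariffs it adds up quickly over a card's lifetime.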
AMD rDNA 3 PRESS CONFERENCE
LucShep replied to SkateZilla's topic in PC Hardware and Related Software
It seems the "up to 70% faster" was a bit of an exaggeration... The next fly in the ointment will be the true, real price in stores. (inb4 another news episode of "scalpers purchasing batches at once, indirectly provoking price gouging", à la RTX4090 right after launch).
-
Realistic, honest hardware requirements for DCS
LucShep replied to LucShep's topic in DCS Core Wish List
I don't know... pretty much everybody that has normal standards for the current day? 60fps is widely known as the threshold where image/motion becomes far more enjoyable in PC and console gaming - and it's what pretty much every developer has been aiming at (if not more) for quite some time. Can it be done at 30fps? ...Sure, just like you can do it at 800x600 resolution like in the old days... but who wants to use that crap anymore?

Just to run it? What do you mean? At what visual standard? 720p/30fps? 480p/15fps? If so, who spends 80€ on a painstakingly detailed module to run it like that? Does that make any sense?

Me too. A proper in-game benchmark is long awaited. But, not having any at this point, we need a reference from the developer, so that it can be confirmed by the user base - instead of guesstimating the required hardware (check the hardware section, littered with such questions, among other sections in these forums). For example, the list of requirements could have an asterisk note, like "using ABC module on DEFGH map in IJKL123 mission, at XYZ resolution/refresh"; that way we would have pointers, a reference. I mentioned the most popular and hi-fi content (module and map) exactly because - being popular - there would be enough of a userbase to confirm or judge those requirements as correct or incorrect, using content with higher standards than the free stuff (which doesn't really represent the average performance with most paid content).

As it is, it's pure fantasy land - no one can confirm whether the requirements are correct or trustworthy, because there's no idea what they refer to (e.g. ...is it at 720p/30fps with the game at very low settings? ...is it using the free Su-25T? ...is it a complex mission or a free flight? etc).
-
Realistic, honest hardware requirements for DCS
LucShep replied to LucShep's topic in DCS Core Wish List
I'm referring to both, minimum AND recommended. Mentioning "LOW" or "HIGH" is meaningless in the current day when resolution and framerate are not mentioned. That could be at 720p/30fps for all we know. 1080p/60fps with a GTX970 was easy with the older 1.5.x versions but not after that, and certainly not with the latest versions. (FWIW, I had a Xeon W3690@4.6GHz, 24GB, GTX970, game on a 500GB SATA3 SSD when 2.5 came out in January 2018.)

Notice that 2.5 was already consuming over 5.5GB of VRAM right away, at the time of its launch (and the GTX970 has only 4GB - actually 3.5GB+512MB!). Now imagine the newer 2.8, with the dynamic clouds system and its even bigger impact on performance and GPU usage (but not only that), surpassing 8GB of VRAM almost immediately with any DLC terrain and newer module, plus the insistence on oversized textures and formats on everything, especially on specular and bump maps... (If you want to watch this on your own system, see the quick VRAM logger below.)
-
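A minimal sketch for logging VRAM usage while DCS runs, using the nvidia-smi tool that ships with NVIDIA's drivers (assumes it's on your PATH; on an AMD card you'd need a different tool):

```python
# Log total GPU memory usage once per second (Ctrl+C to stop).
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(1)
```

Run it, then load into a mission; the jump in "memory.used" as textures stream in is exactly the 8GB+ behavior described above.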
Realistic, honest hardware requirements for DCS
LucShep replied to LucShep's topic in DCS Core Wish List
A regular quick mission with any of the latest modules (or the most popular ones: F/A-18, F-16, Apache, etc) on any of the popular DLC maps (PG or Syria, for example) would be a great benchmark for that, I think? The current suggested hardware requirements don't seem at all realistic even with the free content, on a free-flight mission, at 1080p/60fps.