Everything posted by kksnowbear

  1. Not if it occurs simultaneously, it won't. The drive queuing mechanism will make one or the other wait - entirely. At that point, it's not half the speed of the bus, it's zero.
  2. Well, everyone has their opinion about that - but I only mentioned the PCIe 5.0 drive to address the question of 'future proofing'. If you can make the argument today that buying into PCIe 4.0 makes sense in terms of future proofing, when you only have a PCIe 3.0 board...then... (To which I suppose the obvious reply is that a PCIe 5.0 drive costs more than a PCIe 4.0 drive... ...but I got the T700 for less than what a 2TB 990 Pro would cost, so...)
  3. Well, I am blessed to have a wife who loves me and has the funds for really nice Christmas presents (though believe me, she got some pretty nice stuff too lol) That, and the fact that I do a lot of work 'in the industry', as it were, so I don't usually buy until/unless I manage to find exceptional deals. I was able to get the very same T700 you mentioned (a 2TB model) for <$217 'all in'. Yes, the newest stuff is always stupid expensive. And I usually strenuously argue against paying the premium associated with it. But, I only go for it when I can find really good prices; otherwise I wait it out. It's over-priced (especially when I know what the Gen 3.0 and 4.0 stuff is going for)...but I gotta be honest, it's nice having a new build like that, and I feel I deserve it at this point in my life
  4. Of course. But even then, you're still better off letting the game have storage access that isn't "shared" (as much as possible). After all, the amount of storage access used by all those other things combined will pale by comparison to what the game will use. Right...but I don't make a habit of recommending el-cheapo drives for performance storage. I use cheap drives all the time...for bench fodder, scratch installs, etc. But when I outfit a system to perform, I don't use cheaper/slower drives. One exception might be a customer on a very tight budget. I might go with a slower, cheaper drive - but I'd still avoid the slowest, cheapest units for anything I'd sell. In many cases, we're looking at people who are only upgrading storage, and (I'd bet) the majority won't be upgrading the whole system for at least a year or more. Just what my gut says. You have a very good point about the PCIe 4.0 drives, of course And if someone is in the position to buy storage now with the intent of moving it to an upgrade later, then of course it makes sense that way. I just get the gut feeling that at least some of the folks posting/reading aren't necessarily doing that. I personally consider the notion of "future proofing" as a fool's errand, TBH. But, if we're going down *that* road...well, then, it justifies the PCIe 5.0 drive I just bought (though to be accurate I am using it in a PCIe 5.0 slot). Even if you say "absurd" I see 12500 reads as "bad ass"
  5. In cases like this, IMHO you're better off running the OS from a SATA SSD, saving the one M.2 slot for performance storage, where you install games. Of course, I realize the M.2 device will be faster than a SATA storage device. But two things: 1. You're not (necessarily) using the OS drive a lot, once the system is booted and running. There are times when paging file access might come into play, but even then you're better off if these occur on a separate drive (see next point). 2. When (and note I say 'when', not 'if') the contention for storage access occurs (as it most definitely will)...any advantage the faster M.2 drive has is automatically mitigated by two processes trying to get data to/from the single drive at the same time. By comparison, although a SATA drive might not be as fast, the two storage devices being physically separate will eliminate this contention (as much as possible in any typical PC). LOL, I know what you mean. But then, how often does one actually move the install from one drive to another? And, even if that scenario does favor the SN770...I would think that, in most cases, we're talking about performance while playing the game, not moving it from one drive to another
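The contention point above can be sketched as a toy model (not a benchmark — all numbers are purely illustrative): when two jobs share one drive, the drive's queue serializes them, so the later job's completion time includes the earlier job's service time; on separate drives they finish independently.

```python
# Toy model of drive contention (illustrative numbers only, not a benchmark).
# Each job needs 1.0 s of drive service time.
def finish_times(shared: bool, job_secs=(1.0, 1.0)):
    if shared:
        # One shared drive: requests queue up and serialize (worst case),
        # so each job waits for all jobs ahead of it.
        t, done = 0.0, []
        for secs in job_secs:
            t += secs
            done.append(t)
        return done
    # Separate physical drives: each job runs on its own device.
    return list(job_secs)

print(finish_times(shared=True))   # [1.0, 2.0] -> second job waits, done at 2.0 s
print(finish_times(shared=False))  # [1.0, 1.0] -> both done at 1.0 s
```

The takeaway matches the post: while contention lasts, the queued job's effective throughput is zero, regardless of how fast the drive is on paper.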
  6. You know, it's rare when I find that I disagree with you. But I happen to have a Z390 board on one of my bench setups right now...and I also just happen to have not only a Samsung 970 EVO Plus (my go-to for PCIe 3.0 NVMe storage), but also a WD SN770 (as you recommended). Here's a sample from the test data I acquired just now (EVO Plus on the left, SN770 on the right): The Samsung drive is for all intents and purposes as fast as the SN770 - and in some data sizes faster, illustrating that the SN770 is not running the PCIe bus at its 'full speed'. (If it were, then the Samsung would never be able to run any faster.) In fact, neither drive will ever run at the full theoretical speed of the bus...but even accounting for that, there are times when the Samsung runs faster. And to be clear, the SN770 is rated to run *much* faster than PCIe 3.0 speeds. The bus itself has overhead, and other factors (firmware, cache, data size) will all have a part in actual transfer rates. Something else of note: The Samsung is consistently faster in writes. I'm assuming this is because it has its own cache RAM, but either way, it's faster by a bit. Of course there are other tools for measuring drive speeds. And to be accurate, the SN770 is cheaper - but that can vary a lot, depending on when/where you buy. I've bought 970 EVO Plus drives that were actually cheaper than SN770s of the same size. I'm aware, of course, that the data size plays a major part in "real-world" performance, and these charts show a few data sizes where the SN770 has a small edge. But the majority of the data sizes still read faster on the Samsung drive. Also, something to consider: DRAM-less drives (as the SN770 is) use HMB (Host Memory Buffer) for caching data - and, in short, "host" means they take system memory - something most people pay to get more of, not less. I do realize it's typically not much RAM, but still. Basically, for a little less cost on the drive itself, you're 'selling' system RAM capacity.
For me, I'd pay a little more and buy a drive with its cache RAM on it - where it belongs
  7. 12 inch fans? 8 inch bays? Wow. Hmm. As mentioned above, the reasoning for HDD is cost-per-unit of storage, and (as I also said) it depends. Sometimes you can find sales etc and wind up with an SSD that is competitive with a HDD for cost/unit. But, with 4TB HDDs out there for <$50, I'm not sure how often you'll actually find bulk storage cheaper. Many 1TB SSDs are going to go for ~$50...so the HDD is 4x the storage for the same cost. Noise might be a factor for some - but, for me, there's a lot of other ambient noise in the room so at 'productivity' loads the noise isn't bothersome. Gaming loads will kick up fans of course, but (especially as I have worked in data centers etc most of my life) that fan noise is a good thing. In fact I get a little worried when I *don't* hear fans. On the speed question: Even a 5400RPM SATA HDD is adequate for streaming video locally - even at 4k. Obviously, things like video editing, or multiple streams perhaps, etc, would require higher transfer rates - but then, there would almost certainly be other considerations in those cases (network speed etc...). And if you're doing work that's sensitive to speed/time, then obviously your requirements are higher (like I said, it depends). But for bulk storage (documents, 'slow' games, even movies and audio storage, as well as backups and other utility purposes) the cost/unit of HDDs is a much better proposition.
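The cost-per-unit argument above works out as simple arithmetic, using the approximate prices from the post (a ~$50 4TB HDD vs. a ~$50 1TB SSD — both assumed, sale-dependent figures):

```python
# Cost-per-terabyte comparison using the approximate prices from the post.
def cost_per_tb(price_usd: float, capacity_tb: float) -> float:
    return price_usd / capacity_tb

hdd = cost_per_tb(50, 4)  # 4 TB HDD at ~$50 -> 12.5 $/TB
ssd = cost_per_tb(50, 1)  # 1 TB SSD at ~$50 -> 50.0 $/TB
print(ssd / hdd)          # the HDD gives 4x the storage per dollar
```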
  8. Was going to say, reading what you wrote before, it sounds as if you'd wind up with only one drive that contains both the OS (Windows) and DCS...that is, if I'm following your references to "C drive" indicating your OS install. If that's the case, then I'd say no, you shouldn't have your games (any games, to include DCS) on the same drive as the OS/Windows if it's possible to avoid. What I typically recommend is a two- or three-drive arrangement: 1. A small SSD (120-250G) for the OS (M.2/PCIe is preferred, SATA is acceptable) 2. Larger SSD, 500G+ (strong preference for M.2/PCIe if supported by system). 3. (if desired) A conventional SATA hard disk, 500G+ for older, slower games, movies, personal file storage. (Can use a SATA SSD here as well, depends on cost). Having game(s) on separate drives is recommended because it eliminates 'resource contention' that is caused by having the game and OS trying to access the same physical drive at the same time (obviously someone winds up waiting even if only for a split second).
  9. Generally, yes you should be fine
  10. Well, for one, if your sig is correct/current, then your Z390 motherboard is not (as in cannot be) running at the speed of any of the replacement drives you cited. Z390 boards support PCIe 3.0. Max transfer rate there is going to be ~3500 MB/s. The drives you listed (SN850X/990 Pro, SN770/980 Pro) are all PCIe 4.0 drives - great performers for the most part, but all of them are PCIe 4.0 drives (max transfer ~7000 MB/s) and beyond the capability of your board. So not a lot of reason to spend the extra for speed you can't use. On Z390 boards, I usually recommend Samsung 970 EVO Plus drives. Cost-effective, with performance as good as or better than anything else. BTW, SATA and NVMe are not counterparts: NVMe is a protocol (like a language), whereas SATA is an interface. The counterpart to SATA is PCI Express (PCIe for short)...your motherboard has M.2 slots supporting drives that are either SATA or PCIe (see below). Note that one of the M.2 sockets ONLY runs in PCIe mode. Also, commonly, populating M.2 slots will disable PCIe expansion slots and/or SATA ports on the board - check the manual p. 32 for specifics.
  11. This is actually a clever approach... ...but not as clever as strings and cans
  12. Don't own the card personally - but, based on what I see about it, it would be a fairly significant step DOWN from the 3060Ti mentioned in your signature (I am assuming you still have that GPU, and were considering the A770 in your post). The info I've seen suggests an A770 16G card performs in the same range as a 2070 Super or 1080 Ti (as always, depending on game, settings, resolution etc etc etc). FWIW, if it competed with the likes of some of the higher-end AMD and Nvidia units, I think it would probably also cost close to some of those. Not likely to spend $425 and wind up with a card that's as good as one that goes for $1000 (even if you factor in ridiculous pricing). My $0.02
  13. Yes, but he doesn't actually say you can't see the difference...least I don't think so. But more importantly, if that's what you're taking from the video then you clearly don't understand the point he's making. It's not that he's not "just looking at the screen"...he does that, but the video goes into much more detail. Yes, anyone can look at the screen and say 4k is impressive - I've already acknowledged it, and no one's questioning that. I own several; they're nice. But at the end of the day I've built quite a few gaming machines in the past few years, and not one - exactly zero - has been for gaming at 4k. In fact, of all the ones I've worked on (including at least a few that were for DCS players from this very forum) only one was actually even 1440 (not counting my own, which is currently 5120x1440). By saying 'typical egghead' I infer you mean that he's attempting to make a more empirical, more organized approach to data collection and measurement, in order to quantify the basis for a conclusion. This is in contrast to typical fanboy, bragging rights type of anecdotal conclusion like "It looks very nice" without any real substantiation. Don't let all the facts and figures tell you anything...just look at the screen lol And anyone who doesn't agree obviously needs their eyes checked. He's making an extensive effort to take the subjective nature out of his analysis, and to be honest, as an industry professional of over 40 years, I think he does a fairly good job with it. It isn't just entertaining, it's informative. You're not fond of the conclusion because it doesn't agree with your own perspective - which is perfectly understandable. (Funny, every time I bring this up with someone who dropped a ton on a 4k setup, they don't like it very much either......lol). Earlier you made the comment "PC gaming is the only media that hangs onto lower resolution anymore. 
It’s funny to mention 4K as if it’s “new” when it’s been around for over 10 years everywhere but in PC gaming" I'm simply trying to illustrate there are reasons for it being the way it is - perfectly legitimate, valid reasons. It's not incidental or accidental; factually, it is the way it is for good reason: As nice as it may look, the overwhelming majority of gamers do not prefer 4k gaming, and/or cannot afford the costs. It was true then...and it's still true now. In other words, they're not choosing to "hang onto lower resolution" because they can't see that 4k is impressive. They're making a choice to 'hang on' to their cash because it's not worth the difference in cost to them. It's just another perspective. You know, the kind of thing discussion forums are (supposedly) about. You spent your money the way you wanted, and so did I. All good. (EDIT: It's worth mentioning here that the video is some 5 years old, and factually some things have changed which do alter the conclusion: Mostly, that higher-refresh 4k monitors are widely available now, where part of the problems illustrated in the video revolved around the fact that - at the time - 4k monitors were still almost entirely 60Hz. And gamers tend to prefer higher refresh rates over higher resolution. This doesn't change the cost considerations, but it is a more recent difference than what the video examines)
  14. Admittedly I've not watched the whole thing through in a long time... ...but does he actually say that?
  15. Right...but you're clearly among those fortunate enough to be able to afford 4k gaming (and 4k in general for years now). Again, that's just not the majority - which is why there's nothing remarkable at all about why PC gaming isn't "there yet" in the majority. And, TBH, most people don't consider it just a "bit more" when cost is doubled, increasing cost by $1000 or more. "Very nice" doesn't really enter into it when you just cannot afford it. And, at least in my experience, that's the majority of gamers. You're blessed (me too) and good for us! But if you wonder why not everyone has the same setup...well, cost. Same as for cars, houses etc etc.
  16. There's a perfectly legitimate reason that "PC gaming is the only media that hangs onto lower resolution", and there's nothing really all that amazing about why it's that way. It's simple: Cost. It's (comparatively) easy to display 4k when all you're doing is basically 'playing back' what's being delivered...but that's not at all what computers have to do. The gaming PC must read the 'raw' data from a storage media, move it through the system, calculate tons of results from millions of variables, render the graphical components, add post-processing effects, and so on... A 4k TV doesn't have to do any of that. (Consoles can do it, but are often constrained in ways that don't apply to PCs, else they couldn't do it either). One of my favorite online articles (video) is from about 5 years ago, entitled 4K Gaming Is Dumb (see below). 4K has grown in popularity, certainly - as have other resolutions like 1440p. However, last I checked - and I think this is still accurate - 1080p was, far and away, more common than any other resolution for gaming (at the time, more so than all others combined). I build a fair number of machines every year, almost all of them for gamers. And there are almost none who want 4k systems. In fact, a lot of them actually don't want anything higher than 1080p because of the performance hit it will involve, unless/until they throw cash at it - which they don't want to do, because it basically doesn't improve their scores on some stupid leaderboard or whatever. In my shop I have three 4k monitors, a handful of 1080p units, and a couple ultrawides (my own is a G9 32:9 1440). I understand the difference in quality, and there's no doubt it's a stunning difference. OLED displays are even more remarkable (I don't own one of those yet)...but therein lies the rub: Cost. Cost, cost, cost. Everyone I have visit my shop and sees the 4k monitor in action will agree it's beautiful. 
But almost no one is keen on the cost of the monitor plus the machine it takes to get good performance from the monitor. Most will go for 1080p to get (much) higher frame rates, and only recently is there some interest in 1440p, among ~10% of clients. The thing is, you have to consider the additional cost of the monitor, but (far more significantly) the cost of the hardware that can drive that sort of resolution and still offer decent performance. It's only been in the past few years we've seen GPUs that are actually capable of delivering 4k at reasonable performance - and, during this exact same period, everyone knows what's happened to the price of these higher-end GPUs. The combined impact on overall cost can more than double the cost of a system, easily adding more than $1000 to a setup that's already over $1000 to begin with. Not everyone can afford that. In fact, based on builds I've done for gamers over at least the past 10 years, I'd have no reservation at all saying that the vast majority is not interested in - and cannot afford - the added cost. In many ways, that video still applies: It's easy to show all that
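The performance-cost argument above has a simple basis in pixel counts: a 4k frame contains several times the pixels of the lower resolutions the post mentions, and the GPU has to compute every one of them, every frame.

```python
# Pixel-count ratios behind the "4k costs more hardware" argument.
def pixels(width: int, height: int) -> int:
    return width * height

ratio_4k_1080p = pixels(3840, 2160) / pixels(1920, 1080)  # 4.0x the pixels
ratio_4k_1440p = pixels(3840, 2160) / pixels(2560, 1440)  # 2.25x the pixels
```

This is why 1080p players see (much) higher frame rates on the same hardware: the GPU is shading a quarter of the pixels per frame.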
  17. Well, as stupid as that is...I have to acknowledge it behaves as it says. Looking back, it seems I bought a license because the Superposition benchmark free version doesn't loop ("stress test"). Apparently the Heaven benchmark is different. My apologies. To the OP: It would seem this makes Heaven a good choice after all - sorry if my prior comments caused confusion.
  18. Not according to Unigine, it doesn't: EDIT: Not sure why they say it doesn't loop, but I just checked a Basic installation and, as Hiob says above, it does appear to run (and load the GPU) constantly. If that's not considered 'looping', I wonder what they refer to in the chart above....
  19. Of course; there are several free alternatives I'm aware of. Problem with the Unigine benchmarks (Heaven, Superposition etc) is that the free versions don't allow "looping" the tests. (This applies to several other free utilities I know of as well.) "The real problem is, is that it doesn't happen every time. I cannot reproduce it reliably." Since the OP has stated the issue is intermittent in nature, it's likely problematic to use something that can't be run at length. You would probably find, for instance, things appear to function normally during these short test runs, which would obviously be misleading. Won't hurt anything, but may not help with determining if there's something wrong with the GPU. The only one I can think of that is free and still allows long run stress testing is FurMark, and some people think of it as dangerous (though it's probably OK if used properly). As for using AfterBurner to manually control fans for testing, the problem there is that you'd be adding a source for controlling the fans that isn't usually present, thus it's not actually testing the normal state of the system. Might work just fine; as above, it won't hurt anything...but might not help actually identifying whether there's a problem as indicated by the OP. TBH, although many choose to do so for 'custom' controls, there is no need to use a third-party utility to control the fans adequately; the card should be more than capable of that on its own - which kinda takes us back to what the OP asked for: a utility for testing, as opposed to something to replace the built-in fan controls which appear to be malfunctioning. To the OP: There might not be anything that suits your purpose, is free, and doesn't require Steam...you may need to reconsider your criteria there. As the saying goes, "Beggars can't be choosers" Best of luck.
  20. No disrespect intended, but that's a largely misguided line of reasoning. Software people (game developers and programmers) tend to say that, but it's absolutely not reality. Hardware people say the same thing sometimes, and they're no more correct than the software people. It's just not a concrete fact, either way. For one thing, you can't say with any real certainty that the software is seeing the exact same set of conditions every time and thus always responds the same. It could be seeing factor(s) you're unable to monitor and therefore it will behave differently without your seeing an apparent reason. Another thing is that software can absolutely have 'bugs' that do not occur predictably and reliably with the exact same apparent conditions. I've worked in computerized systems maintenance for over 40 years, and I can promise you that neither software nor hardware is bound to the 'rule' you're trying to apply. That said, I might be inclined to agree with your conclusion, but not for the reason you've cited, and not without further details (among them, where your measurement data is coming from). Something else worth considering: Are you including the GPU BIOS among 'software'? That's what it is, and there have been instances where it caused fans to misbehave. So if you haven't thought about that, it might not be a bad idea. You don't mention your GPU manufacturer, but some use Windows utilities that can update the GPU BIOS, and it might've been changed without your necessarily realizing what happened. As far as utilities go, I use 3DMark FireStrike; it's a reasonable stress test - there are others, of course. But 3DMark in its basic form is free, is well-known and highly regarded, and it's safe for testing - arguably, some other tools are not as safe (FurMark comes to mind here).
  21. High refresh and variable refresh rates are not just good ideas for competitive shooters. And, again, the bigger the monitor (at a given resolution), the bigger the pixels are going to be and thus the picture cannot be as sharp as a smaller screen with the same number of pixels (just plain physically impossible). These are the biggest (and most important) differences between monitors and TVs. As I said above, the only real reason for buying/using a TV as a monitor is cost. However, and speaking of cost - the LG linked above is huge, so the pixel count is still a factor - but it is a proper monitor with a high refresh and variable refresh support. And just as discussed above, those two features make a huge difference. And it doesn't seem like a bad price, either
  22. Broadly and generally speaking, TVs are made for viewing from across a room, where monitors are designed to be viewed much closer (desktop). Generally, the only real reason people buy/use TVs as monitors is because it's cheaper - and there's a reason. That said, there are some TVs out there with very high image quality. The distinction at that point will be features; for instance, good gaming monitors are going to support features suitable to gaming more so than TVs will. Me personally I'd go with a monitor. HTH Edit: Further to the above comments, I found specs for the Samsung TV mentioned. It's a 4k 60Hz model with no variable refresh rate (this is among the 'features' I alluded to above, and where gaming is concerned it's an important feature at that). At 43", it's also much larger than the 32" monitor - and that's not necessarily a good thing when viewing up close. Since the resolution is also 4k, it has the same number of pixels horizontally and vertically as the Dell. And because those pixels have to fill a larger space on the Samsung, the pixels themselves will be larger. So, up close, it won't look nearly as sharp as the smaller screen will. Of the two choices given, I'd strongly prefer the monitor.
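The "larger pixels" point above can be made concrete with pixel density (PPI): same 3840x2160 pixel grid, different diagonal, so the 43" TV spreads those pixels over more area than the 32" monitor.

```python
import math

# Pixel density (pixels per inch) for the two 4k screens discussed:
# same pixel count, larger panel -> larger pixels -> lower PPI.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

tv = ppi(3840, 2160, 43)       # ~102 PPI on the 43" Samsung TV
monitor = ppi(3840, 2160, 32)  # ~138 PPI on the 32" Dell monitor
```

Up close, the ~35% higher density of the smaller screen is exactly why it looks sharper.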
  23. The former. The reason for this is the extra work the GPU must do in order to improve the visuals, by doing things like sampling edges 2-8 times more (for AA) etc. Thus, the GPU takes more time to produce a screen full (frame) of data, and more time per frame equals fewer frames per unit of time (i.e., "FPS" or frame rate). HTH
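The frame-time/frame-rate relationship above is just a reciprocal; a minimal sketch (the 5 ms of extra anti-aliasing work is an assumed, illustrative figure, not a measured one):

```python
# Frame rate is the reciprocal of frame time: more GPU work per frame
# -> more milliseconds per frame -> fewer frames per second.
def fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

base = fps(16.7)          # ~60 FPS at a 16.7 ms frame time
with_aa = fps(16.7 + 5.0) # ~46 FPS if extra AA work adds 5 ms per frame
```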
  24. Just to clarify: You mean the Performance Tab in the Task Manager window, is that correct?
  25. Out of sheer curiosity, where are you monitoring the RAM usage/with what tool(s)?