Everything posted by EtherealN

  1. You know a product is good when the founder tried to make them stop using his name on it. EDIT: Actually, that video was so filled with expletives from the founder of McAfee that I'll just ask you to find it for yourself. :P Basically, it's a false positive. :)
  2. Here's the thing regarding updating to Windows 10 for free: boom, everyone is quickly on a system that runs the latest Windows Store (like an app store etc). It's a method for them to expose as many people as possible to their sales channel as quickly and easily as possible. That's the "Windows as a service" thing; it includes the sales channel integrated into the OS. (Compare with Google Play etc.)
  3. I'll add one note as well: the wattage rating of a PSU is the total it can produce across all rails/voltages. Having components that "only" want 400W does not necessarily mean a 650W PSU can supply them, if they all want the same voltages (rough sketch below). (This is one of the reasons why, when giving build advice, I always caution against going cheap on power supplies. Purchasing a 100-150€ PSU might seem excessive when there are 40€ PSUs out there with the same rating, but through a lot of frustration and tears I've learned to respect the PSU decision. (And getting an upmarket one is usually fairly fine anyway, since mine came with a 7-year warranty, meaning it'll serve multiple computers without going out of warranty.))
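A rough sketch of that rail-budget idea in code (every number here is made up purely for illustration; check the label on your actual PSU for real per-rail limits):

```python
# Hypothetical example: the total wattage looks fine, but one rail is overloaded.
psu_total_watts = 650
rail_limits_watts = {"+12V": 480, "+5V": 100, "+3.3V": 70}    # per-rail maximums
component_draw_watts = {"+12V": 500, "+5V": 40, "+3.3V": 20}  # 560W total draw

total_draw = sum(component_draw_watts.values())
print(f"Total: {total_draw}W of {psu_total_watts}W rated -> "
      f"{'OK' if total_draw <= psu_total_watts else 'overloaded'}")

for rail, draw in component_draw_watts.items():
    limit = rail_limits_watts[rail]
    print(f"{rail}: {draw}W of {limit}W -> "
          f"{'OK' if draw <= limit else 'overloaded'}")
```

Here the build draws 560W in total, comfortably under 650W, yet the +12V rail is over its 480W limit.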
  4. One thing you could try (depending on your level of confidence with overclocking): UNDERCLOCK the CPU and GPU. The idea is to make them "top out" at a lower power draw than normal. If everything works fine at that point, but restarts at stock settings, you've got a fairly solid case for either the PSU or the mobo power caps being on their last legs (the most likely candidate would be the PSU). One important point about underclocking diagnostically: it is not enough to simply lower multipliers etcetera, you also want to find a stable underclock that involves lowered voltages, to make really certain that you are drawing less power than normal (see the sketch below for why the voltage matters so much). Another possibility, depending on your exact CPU and mobo chipset ("i5 quad core" unfortunately doesn't give enough details on that), is to switch the game over to using the integrated graphics and then try running it. Obviously it will probably not be "playable", but this is something you should be able to do without much fiddling at all. Again the idea is to ensure reduced power draw (by not using the GPU). If that works, it would be either PSU or graphics card - but the latter should cause a BSOD, not a restart/shutdown, meaning we again make a fairly good case for the PSU. The CPU and graphics card should throttle themselves if temperature were the issue (unless this has been turned off on purpose in BIOS/driver), and the end result should be a BSOD, not a shutdown.
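To illustrate why the voltage matters so much: dynamic CPU/GPU power scales roughly with V² × f (the standard CMOS approximation; leakage power is ignored and the 10% figures below are hypothetical), so undervolting cuts draw much harder than dropping the clock alone:

```python
# Back-of-envelope dynamic power scaling: P is roughly proportional to V^2 * f.
# Static/leakage power is ignored; the 10% reductions are hypothetical.

def relative_power(voltage_ratio: float, freq_ratio: float) -> float:
    """Power draw relative to stock, with V and f given as fractions of stock."""
    return voltage_ratio ** 2 * freq_ratio

# Lowering only the multiplier by 10%:
print(f"f -10% only:     {relative_power(1.00, 0.90):.0%} of stock power")
# Lowering the multiplier AND the voltage by 10%:
print(f"f -10%, V -10%:  {relative_power(0.90, 0.90):.0%} of stock power")
```

That prints roughly 90% versus 73% of stock power: the multiplier-only underclock barely moves the needle compared to one that also drops the voltage.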
  5. Just in case, since this is a common mistake: are you attempting to use the transaction number, rather than the serial number?
  6. Mini-JSOW, pretty much. Not the most obvious weapon to put on a straight-winged turboprop, but... It is kinda cool. :D
  7. Black Sea gets the benefits inherent to the graphics API switch, but not the benefits inherent to new methods afforded by designing a map with the new API in mind, since that would literally require the map to be rebuilt from scratch. And as most people would agree, that time is probably better spent making entirely new maps rather than a slightly prettier version of an old one. ;) EDIT: Example: Bad Company 2 used both Dx9 and Dx10 (or 11? Was a while ago, not sure.) It used the two codepaths to render the exact same picture, so the latter did not give prettier graphics, but it did give better performance. That's what you can _probably_ expect for Black Sea. Taking full advantage of the Dx11 features would require Black Sea to be redone from nothing, which is energy (and money) probably better spent giving you new locations to fly and fight in. :)
  8. Not really a case of "keeping" Dx10, since DCS has never used Dx10. As for what Dx11 does: it not only offers potential performance boosts, especially for systems like yours that don't meet the minimum system specification ( ;) ), it also gives access to new techniques that COMBINE "gain graphics awesomeness" with "gain performance"; that is, with techniques such as clipmaps and others, you have additional tools to make things pretty where it matters, which would not be possible (or as feasible) in Dx9. A Dx11-compatible graphics card is, however, fairly cheap, and supporting older standards when even Dx11 is not too far away from being replaced by Dx12 doesn't make sense, not when other components already put the DCS minimum system specification beyond what a typical Dx9-limited system would have anyway.
  9. Sticky: http://forums.eagle.ru/showthread.php?t=94531 Does that solve the issue?
  10. Not fully sure I understand the questions. Steam-purchased keys work on the ED distribution just as if they were purchased directly from ED. But yes, to use the ED distribution you would indeed have to download it, same as you would have to download the Steam distribution if you wanted to use that one. The difference is that with a Steam purchase, you DO have the option of purchasing on Steam but NOT downloading through Steam, instead using said key with the ED distribution of the DCS World platform. "ED downloads" being available on Steam is a question I don't quite understand either. If you mean the modules themselves - then most likely yes, most products will also be released on Steam, but a key purchased on the ED site will not be recognized by the Steam platform, so you will not be able to download them through Steam. Note that you have no risk of losing any disks or anything. Indeed, you can download pretty much everything from the DCS website without even logging in! It is "simply" activation that requires the purchased serial number. (I do not know exactly how this will apply to things like maps and campaigns, however.)
  11. Just to be sure: you are saying you tried the directions given in the post linked below, but they did not resolve your problem? http://forums.eagle.ru/showthread.php?t=94531
  12. Well, the microstuttering is a bit of a driver issue (and varies depending on game, since this has to be solved in part on a per-game basis by the driver vendor). I know AMD used to have the issue to a much greater extent than is present now. (Sadly I don't have the time right now to dig up the relevant articles, but techreport and a few other hardware sites did some in-depth investigation of this issue.)
  13. Please file a support ticket here: http://www.digitalcombatsimulator.com/en/support/ Your issue will be looked at and hopefully resolved promptly.
  14. Cheers. Tiling makes a lot of sense (and I'm reading Scissor as being similar to tiling in "logic", just different in implementation). The logic used in these seems to be similar to what 3D rendering software often does. Seems quite smart indeed, if the implementation manages proper load balancing. (Note that CFX, at the time of publication of that one and until at least fairly recently (late last year), had very high microstutter frequencies in those modes - causing jerky animation in spite of nominally high "framerates". I'm guessing the load balancer was at fault. Several benchmark sites started recording and displaying frametime results rather than "framerates" or "FPS" due to this, because the "framerates" were very misleading whenever this problem manifested. A toy illustration of that below.)
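A toy illustration of why average "framerates" hide microstutter (the frame times are entirely made up): two runs with nearly the same average FPS, one evenly paced and one with a badly delayed frame every third frame:

```python
# Average FPS vs frame pacing: both runs average close to 60 FPS, but one is jerky.
# All frame times (in milliseconds) are made-up illustration data.

smooth  = [17, 17, 16, 17, 17, 16, 17, 17, 16, 17]   # evenly paced
stutter = [7, 7, 40, 7, 7, 40, 7, 7, 40, 7]          # every third frame delayed

for name, frame_times in (("smooth", smooth), ("stutter", stutter)):
    avg_ms = sum(frame_times) / len(frame_times)
    print(f"{name}: avg {1000 / avg_ms:.0f} FPS, worst frame {max(frame_times)} ms")
```

Both come out around 59-60 FPS on average, but the second run's worst frame takes 40 ms instead of 17 ms - which is exactly what the frametime-based benchmarks were built to expose.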
  15. This makes it sound as if you're saying both cards render the same frame. That is at least typically not the case, and could even reduce performance compared to the "standard" method - the problem being that the cards would then have to partition meshes, effects, postprocessing etcetera between themselves even within the same frame. An easier method is to have (for example) a framebuffer of 3, and have the stronger card do 2 frames while the weaker does 1, then rinse and repeat (a toy sketch of that split below). There are tricks that go the split-frame route though (esp. with Mantle and potentially Dx12, I hear, where you can obtain similar effects as CFX/SLI without data bridges etc), and I might have missed a specific implementation that gets around the standard "problems" with that route. (Would love links to papers/in-depth articles on it in that case! Sadly work has kept me busy enough lately that I haven't been able to read up as much on this stuff as I used to.) But there are issues present that are not negligible: for example, many lighting and shadowing effects require the rendering of a frame to be aware of everything that is in the frame. (And outside of the frame, too...) Having part of the frame done by an independent unit is then problematic - though I guess it could be solved by having the routines be aware of this, with a sort of pre-processing stage that ensures the necessary information is passed. (Again, sort of what I think AMD is trying to do with Mantle, and what Microsoft at least is looking at "enabling" people to do in Dx12.)
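A toy sketch of that 2:1 split (purely illustrative scheduling, not how any actual driver implements it):

```python
# Weighted alternate frame rendering, as described above: in each window of
# three frames, the stronger card renders two and the weaker card renders one.
from itertools import cycle

assignment = cycle(["strong card", "strong card", "weak card"])  # 2:1 split

for frame in range(9):
    print(f"frame {frame}: rendered by {next(assignment)}")
```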
  16. No. SLI/Crossfire uses a separate bridge for mirroring, and has one "master" card that handles interaction with the rest of the system. The memory bus here is between GPU and vRAM, not between GPU and the rest of the system. (EDIT: That is PCI-E territory.) What happens is this (very simplified, it's more complex than this of course, esp. with the PCI-E interface):

     Card0: master
     Card1: slave
     Card0: gets data/instructions
     Card1: receives data/instructions from Card0
     Card0: does work
     Card1: does work
     Card0: receives results from itself and Card1
     Card1: sends results to Card0

     Then the results go out through the framebuffer to the monitors. Now, this is not necessarily "bad". Drivers can make optimisations that ensure the cards handle workload and data optimally for a specific title. It's just one of the technical finicky things that had to be done to make it work (and one of the reasons why there used to be an absolute requirement for the cards to be identical). But if there are no driver optimisations going on, they have to use a fallback mode, which would typically be alternate frame rendering: one card renders one frame, the second card the next one, then back to the first (unless there are three cards). This however requires that a buffer of information for each of those frames is already prepared and well synchronised, which didn't always work that well (esp. in titles with no or bad driver support), leading to microstutters - where overall FPS would be good, but at intervals a frame gets severely delayed.
  17. What kind of research, though? This would only be the case if you were previously bottlenecked specifically at the GPU. If you are bottlenecked anywhere other than raw GPU computation (which excludes things like GPU memory/memory bus etc), adding a second card might actually decrease performance, since you'll retain your bottleneck but add a new layer to the graphics pipeline. Also note that performance of SLI and Xfire solutions is heavily driver-dependent, in the sense that the vendors (AMD and nVidia) have to put specific support routines for the specific game into their drivers; otherwise you will see less of an increase than what might be benchmarked on Triple-A titles that get a lot of support from the vendor.
  18. Only if you were previously memory-starved. An example I could imagine would be a 1GB card being asked to Eyefinity three 1080p displays, though I haven't tested that myself. (And, of course, if the card in question has a crap memory bus, then it still won't really help.) What I would suggest, in general, is to simply keep in mind what amount of vRAM is on the "reference design" versions of a given graphics card. The bus will be tailored to fit that, and you'll have a guarantee of non-issue. Sometimes you do have considerably more margin from the bus, but you'd want to read comparative tests fairly closely to ascertain that. Also remember that adding a second card (Xfire, SLI) does NOT give you more vRAM to work with, since all cards will "mirror" the same data. (Unless Mantle and Dx12 change this in the near future.)
  19. Multimonitor setups, yes, you'll typically want more memory. But do remember that there's more to video memory than just the amount of it and its "clock frequency"; just as important (sometimes even more so) is the speed of the memory bus. Having extra vRAM on a card doesn't help if the bus isn't fast enough to keep that I/O going - sadly something many vendors trick you on when they make versions of a GPU with "extra" vRAM. It gets a bit more complicated since you want actual I/O throughput to get a reasonable metric; there are two variables there - the "width" of the memory bus, and the "speed" (a back-of-envelope calculation below). Some cards will have a narrower bus, but compensate with a faster one. (Think of it like a highway: you have X lanes, with cars driving at Y speed. X-1 lanes can still perform better if the cars drive faster.) For 1080p, I'd say 1GB is actually plenty as long as the bus is up to snuff. For more, you'd probably want to look at increasing the memory - it depends on the effects used etcetera - though I've used a 2-screen setup with DCS on a 1GB card with no issue; that was one monitor for DCS and one for Windows, though, so that is a bit different. But don't blindly compare two versions of the same GPU where the only difference is the amount of vRAM; you might get disappointed.
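The back-of-envelope calculation: peak bandwidth is bus width (in bytes) times the effective per-pin data rate. Both cards below are hypothetical, just to show a narrower-but-faster bus winning:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate per pin.
# Both example cards are hypothetical; the point is that width alone doesn't decide it.

def bandwidth_gb_per_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"256-bit bus @ 4.0 Gbps: {bandwidth_gb_per_s(256, 4.0):.0f} GB/s")  # 128 GB/s
print(f"192-bit bus @ 6.0 Gbps: {bandwidth_gb_per_s(192, 6.0):.0f} GB/s")  # 144 GB/s
```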
  20. DCS Xwing

      To be fair... The original movies had a high level of Pac-Man going on as well. :P
  21. Not like Russians don't have to put up with MiG MiG MiGs CONSTANTLY being the enemy in every movie, game and joke... ;)
  22. Also, I just realised I was mixing up "Black Friday" with what in English is called "Good Friday". Up until now, I did not know there was such a thing as "Black Friday". :P (Thank you, Wikipedia.)
  23. I'd have to look at Wikipedia to know their dates any closer than "somewhere around these months". So yeah... :P EDIT: A corollary to that: how many Westerners know that Christmas occurs in January for a considerable portion of "Christianity"? ;)