Everything posted by Pilotasso

  1. I've tested FreeSync on my 1080 Ti with the Gigabyte AORUS AD27QD monitor, and it has been smooth sailing. :)
  2. I play other games as well. I'm setting up the monitor as we speak.
  3. They will likely see FreeSync 1 support, as it is a universal interface (i.e. no proprietary chip inside the monitor).
  4. FreeSync 2 is specifically tied to AMD GPUs, just as G-Sync is to NVIDIA's. Only FreeSync 1 is universal right now.
  5. That's a killer system there.
  6. So I decided to buy a new high-refresh-rate monitor. My current unit (see below) has GREAT color reproduction (it used to be for photo work), but 60 Hz doesn't feel like enough anymore. My eyes. :cry: My current monitor is a second-hand ASUS PB278Q IPS QHD (1440p) 60 Hz from 2014. Now that both AMD and NVIDIA cards can use adaptive sync (FreeSync, not FreeSync 2 nor G-Sync, since those are tied to their respective GPU brands), a 144 Hz refresh rate with low framerate compensation and great color is where I want to go, so I came up with this at the local stores: Gigabyte AORUS AD27QD, IPS, 27", QHD, 16:9, 144 Hz, FreeSync, 1 ms response. There are better monitors with true HDR out there, but they are either G-Sync or FreeSync 2, which would lock my future upgrades to either AMD or NVIDIA, and I want some flexibility in choosing between the two, so I decided to compromise with fake HDR and FreeSync 1. BTW, the Pixio PX277h is not sold here. It has an equivalent panel without the RGB bling for $100 less, but that's not an option for me, so... what do you think?
  7. Some partner cards are also dying because they use the exact same reference PCB with just a different cooler, so go for fully custom cards... if you really need the 2080 Ti.
  8. I expect the tensor cores will be used for other things in the near future, like DLSS or advanced AI for games, instead of ray tracing.
  9. I got a 14% bump in FPS when I changed the CPU from a 2500K to a 1700X with a GTX 970, and then a 68% improvement when my current 1080 Ti was installed. I had a thread with screenshots; one of them is below. Original thread: https://forums.eagle.ru/showthread.php?t=185226&page=6
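      Just to show how I'm computing those percentages, here is a minimal sketch; the FPS values in it are placeholders chosen only to reproduce the two steps, not my actual measurements:

      ```python
      def uplift(old_fps: float, new_fps: float) -> float:
          """Percentage FPS improvement going from old_fps to new_fps."""
          return (new_fps / old_fps - 1.0) * 100.0

      # Hypothetical frame rates, for illustration only:
      print(f"2500K -> 1700X (GTX 970): {uplift(50.0, 57.0):+.0f}%")  # +14%
      print(f"GTX 970 -> 1080 Ti:      {uplift(57.0, 95.8):+.0f}%")  # +68%
      ```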
  10. AMD used to suck, but not anymore: their current processors are competitive and they run DCS great. The next generation will perhaps top the benchmark charts, and anyone with an AMD board can make this upgrade cheaply. Also, from my testing, DCS is no longer a single-core affair; the GPU matters more on any decent modern CPU.
  11. Last week's AMD announcement was on the server side. Of course that's good news, because eventually it will trickle down to consumer chips, but we don't have any concrete data on it yet. Yeah, Intel is in deep trouble. Waiting for news with anticipation (I intend to change CPUs next year).
  12. You won't notice a difference unless you're doing heavy video editing for hours.
  13. Take that test with a grain of salt. It's rendering mostly grass and trees. Try Nellis downtown.
  14. At the moment, the feature with the more immediate effect is DLSS, and that is the more interesting part compared to ray tracing (they will keep adding ray samples and throwing FPS down the drain each generation). DLSS boosts FPS considerably, and for games that support it, a 4K monitor is finally viable at high FPS. That being said, by the time it's widespread, the 3080 Ti will probably be out.
  15. I don't mind running the pagefile off my SSDs. I had the pagefile running off my old Vertex 3 SATA drive for 7 years; in that period most people would have upgraded twice anyway. SSDs are fast enough to handle the pagefile seamlessly. My old computer's limiting factor was in fact the CPU, a 2500K. It was getting maxed out on all of its paltry 4 threads.
  16. 32 GB. See my sig.
  17. Yes, 4. See my sig. Several others hit this, and even 3600 in some cases. What kits have you got? High frequency with 4 sticks requires kits with Samsung B-die ICs. With 3200 kits of Samsung B-die you can dare to go up to 3400; with 3600 kits you can try every frequency the memory controller can handle. Also, Google Ryzen DRAM Calculator 1.4.0.1; it does all your homework for you. EDIT: here you go: https://www.guru3d.com/files-details/download-ryzen-dram-calculator.html Make sure you select the proper CPU and memory type, then hit XMP to load your memory's default timings into the tool, and it re-calculates everything for different frequencies.
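      For a rough feel of what those frequencies are worth on paper, a minimal sketch of theoretical dual-channel DDR4 peak bandwidth (my own back-of-envelope math, not something the calculator outputs):

      ```python
      def ddr4_peak_gbs(mt_per_s: int, channels: int = 2) -> float:
          """Theoretical peak in GB/s: transfers/s * 8 bytes per 64-bit channel."""
          return mt_per_s * 8 * channels / 1000

      for speed in (3200, 3400, 3533, 3600):
          print(f"DDR4-{speed}: {ddr4_peak_gbs(speed):.1f} GB/s peak, dual channel")
      ```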
  18. Also, they have 2 tiers for the 2080 Ti. The only difference between them is binning and the factory clocks (stock vs. "overclocked", whatever that means). This probably accounts for the cheaper models. Personally I would go with the cheaper one, as the $400 premium for superclocked variants getting you maybe 3 FPS more is a sad joke.
  19. Not strictly out of the question; I got mine to 3533 MHz.
  20. Browse through the product catalog page, and if you have bonus points you'll see how much each product can be discounted.
  21. I had my former 1700X run 4x8 GB @ 3200 and upgraded to a 2700X that runs the same kits @ 3533. The official spec is not indicative of the speed you might actually get; that depends on the kits listed for your board, the CPU silicon lottery, and the quality of the BIOS. It doesn't matter if the board is X370 or X470 (provided it's from the same brand); the biggest factors were already mentioned above.
  22. I've seen the same benchmarks you did on launch day. They pit factory-overclocked, dual-fan 2080 Founders Edition designs against the stock, single-fan design of the 1080 Ti FE. Meh. That was very shady but smart NVIDIA marketing. When partner 1080 Tis were used, the small difference in games running at 100+ FPS all but evaporated. See Gamers Nexus' and Hardware Unboxed's later post-launch reviews. The 1080 Ti will die at the same time the 2080 does, when the 3080 is launched; DLSS and ray tracing won't arrive in time to change that. If it weren't for dwindling stock, the 1080 Ti would probably outsell the 2080 through both of their end-of-life.
  23. You're a bit mistaken: the 1080 Ti's VRAM peaks at 484 GB/s, while the 2080's peaks at 448 GB/s (quick math below). The 1080 Ti also has more CUDA cores and higher clocks; the only advantage the 2080 has is its new architecture, and some of that is for deep learning, which DCS doesn't use. I'm not saying either GPU is faster than the other, because I have not tested or seen tests in DCS with both; however, in multiplayer you might run into VRAM starvation symptoms. I have seen VRAM usage get over 10.5 GB on mine. Also, in most other games the 1080 Ti is functionally equivalent to the 2080 when VRAM is not an issue.
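      Those two figures fall straight out of the published bus widths and per-pin data rates; a minimal sketch of the math:

      ```python
      def vram_peak_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
          """Peak VRAM bandwidth in GB/s: per-pin data rate * bus width in bytes."""
          return gbps_per_pin * bus_width_bits / 8

      print(f"1080 Ti, 352-bit GDDR5X @ 11 Gbps: {vram_peak_gbs(352, 11.0):.0f} GB/s")  # 484
      print(f"2080,    256-bit GDDR6  @ 14 Gbps: {vram_peak_gbs(256, 14.0):.0f} GB/s")  # 448
      ```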