

Posts posted by Tj1376
-
Page 344 of the manual has the BK90 release parameter envelope on it - nothing extends past 7km.
Cheers
TJ
-
In the release version, in either user-made missions or the practice missions that came with the Viggen, the RB-75 works, period. In the OB version, the RB-75 will not lock in the crosshairs, and thus will not fire. The crosshairs do not open, close, and lock a target.
The practice missions with the RB-75 - one with enemy tanks, and the practice mission called Beach something, where you hit ships and targets on a beach. Basically, with the practice missions you can just set the weapon selector to RB-75 and fix the target to lock it. Not much to do, as it is already set up as an air start; all that has to be done is select the weapon.
Works just fine in the release version, same as every other weapon. But not so in the OpenBeta.
I uninstalled the OpenBeta Viggen module, copied over files and folders, and compared the files line by line. I give up. I can't do anything else short of nuking the whole computer and starting over. Some file I am unaware of has a problem. I am giving up on this issue. Maybe it will be OK, or our squad will go back to the release version, or I will just not use this weapon. I have a zillion other aircraft to fly.
I am not saying it has a bug, I am just saying something in my computer is porked (only in the DCS OpenBeta Viggen module) and I don't know why or where. I just wanted to make that clear.
I don't have any mods installed. I have also run repair for the Viggen module and OpenBeta.
Nothing helped with this issue.
The crosshairs used to open with a large gap to lock a target with T1 Fix. Then the gap shrank to a tiny gap during some patch maybe 6 months ago. I think it still worked back then, but shortly after that the issues started.
Oh yeah, I tried different versions of the RB-75. Tried 2 instead of 4 on the pylons. Tried only the weapons with no extra fuel loaded. No mixing of different weapons.
:joystick:
The Swedes should have bought our F-16s :doh:
You do not use T1 fix to lock the target. Use TV fix. This will probably fix your issue. :)
TJ
-
I haven't tried on the new Open Beta, but the RB-75 was working fine before it. I'd assume you have something wrong in your procedures.
What doesn't work? The sight doesn't light up? You can't lock a target through the sight? The weapons won't fire? The weapons won't hit the target?
Run us through your startup and on target procedures, as that will help us help you figure out where the issue lies.
TJ
-
Hi team! I'd love to learn how to be more effective in CAP.
Are there any good missions where I can spawn CAP AI of different degrees of difficulty? The instant missions seem to always think 1v4 is a good idea. I'm looking more for 1v1 to start with, and will add in a buddy as I learn some of the maneuvers.
I've played with MIST but I don't really understand how to use it, so I was hoping someone had a basic mission with MIST (or MOOSE) that would spawn fighters when the user pressed F10 and selected the option to spawn - something like 1 for Average, 2 for High, 3 for Excellent. And then show me how to adjust their loadouts (this will be Red vs. Blue, and I'd like to vary between the F-18 and F-15, but with specific loadouts - no 120Cs).
I want to learn the editor in the process of doing this, so I'd appreciate comments on how to do it, but I am also a learn-by-example kind of guy, so if you have a mission and wouldn't mind walking me through how to accomplish this, I'd love it.
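For anyone reading this with the same question, the F10-menu spawn described above can be sketched in MOOSE roughly like this. This is a sketch, not a tested mission: the template group names ("RED_F15_Average", etc.) are hypothetical late-activated groups you would place in the mission editor, with skill level and loadout set per template; check the MOOSE documentation for the exact class usage.

```lua
-- Sketch only: assumes MOOSE is loaded in the mission, and that
-- late-activated template groups named "RED_F15_Average", "RED_F15_High"
-- and "RED_F15_Excellent" (hypothetical names) exist in the editor.
-- Skill and loadout (e.g. no AIM-120Cs) are set on each template group.
local spawners = {
  { label = "1. Average",   spawn = SPAWN:New("RED_F15_Average")   },
  { label = "2. High",      spawn = SPAWN:New("RED_F15_High")      },
  { label = "3. Excellent", spawn = SPAWN:New("RED_F15_Excellent") },
}

-- Root F10 menu entry visible to the blue coalition.
local menu = MENU_COALITION:New(coalition.side.BLUE, "Spawn CAP bandit")

-- One sub-entry per skill level; picking one spawns a copy of that template.
for _, item in ipairs(spawners) do
  MENU_COALITION_COMMAND:New(coalition.side.BLUE, item.label, menu,
    function() item.spawn:Spawn() end)
end
```

The same pattern extends to a second set of F/A-18 templates; since SPAWN copies the template group, varying loadouts is just a matter of editing each template's payload in the mission editor.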
Appreciate all the help this fantastic community provides!
Thx
TJ
-
Performance Chart For RTX2080TI / I74790K CPU
Can you repost all your pics? They are broken links for me. I am using Tapatalk so let me know if forum users can see them and I’ll log on when I get back home.
I'm a big "a 4790K shouldn't be getting a 2080 Ti" guy when running 1080p monitors (I still believe that 1080p triple monitors can easily be managed by a 1080). I'm really interested in your results with VR - and specifically in whether increasing PD significantly increased your cockpit visibility. Are gauges clear to read? What about plane spotting? Is it a night-and-day difference with visuals now? I read the below from your post, but I couldn't tell if you thought it was a great improvement or if it was maybe slightly better with just solid frame rates.
Really interesting stuff. Thanks for posting.
TJ
Currently have PD at 2.3 and SSAA at 2.0. Resolution of cockpit slightly better, can read the time and distance to waypoint text on the top right of the Hornet MFCD without leaning forward.
Medium range much sharper. Buildings not blurred at 10,000 ft. Runway lights seen at 20,000 ft on a night mission. Individual streetlights apparent at 8,000 ft.
Runway lights on approach seen approx. 2-3x further out.
Sent from my iPhone using Tapatalk
-
Never mind - you can ignore this thread.
But for others that might find this neat little tool handy - "Load Profile" in the top right hand corner of the keybind page solves this problem. Just navigate to the old Config location, find the right file and load the profile.
TJ
-
As the thread states - I tried just moving the Config folder over, but all my devices picked up new IDs, so that didn't work.
I then tried to rename the files in the Config folder to the new device ID names; that also didn't work. :(
Any ideas on what I can do to try and copy the bindings over? I have quite a few modules in DCS.
TJ
-
:joystick:
I can only give that much and I guess ED asks for a lot more than what I do and can do.
:pilotfly:
Did you ever figure out how to fix this? I had 16 GB and noticed I was running out. I had 8 GB of dissimilar RAM lying around and threw it in to see if it would boot. It did, and now DCS eats 23 GB of RAM.
I fly with others who say DCS on high textures only uses 12 GB, and here I am using twice that. :(
TJ
Sent from my iPhone using Tapatalk
-
Looks like that new rig was the best bet.
-2 to +9% performance gain old model to new model. (That’s comparing the 1080TI to the 2080. Remember that nvidia changed their model line up for the 20xx series. The 2080 is the old 1080ti. The 2080ti is the old Titan. The Titan RTX is the old Quadro.)
Even if you compare model to model 1080ti to 2080ti, you are seeing a 70% price increase for a roughly 20-25% performance improvement.
Good luck with your new rig!
TJ
Sent from my iPhone using Tapatalk
-
How to attack AA defended targets (Blue Flag)?
But how can you hit anything then? The exact locations of the various targets are usually not known... The idea with the BK90 is that it is a cluster bomb and will take out targets fairly close to the target coordinates.
Join the server. Then leave the server. You’ll now have the mission file on your hard drive. Reverse engineer the mission file to give exact coordinates for every objective.
You’d almost always have this intel in real life, so we don’t consider it cheating.
TJ
Edit: scroll up a few posts and you’ll find a MIZ file I created. That’s exactly how I created it... reverse engineered the MIZ file you download when you join the BF server.
Sent from my iPhone using Tapatalk
-
I was not aware that the BK90 was working as it should on multiplayer servers... Is it dispersing correctly online?
Negative. It’s just a big bomb.
Very useful in testing against the fuel depots. :)
TJ
Sent from my iPhone using Tapatalk
-
I highly doubt we will see ray tracing in DCS any time soon, especially in VR, considering they could barely achieve 50-60 fps on an HD monitor for highly optimised games. Since this was the main focus of the Nvidia presentation, I doubt raw power will be that much higher compared to the 1080 Ti. I bet on 10-15% in benchmarks, which for DCS will be even less.
And I will get the RTX later and sell the 1080 if it's worth it.
Sent from my Redmi 4 using Tapatalk
That’s my plan. 1080s here continue to fall in price. Only question is if I should snap one up before the RTX launch or wait a bit longer. I’m actually nervous that people won’t sell their 1080s once RTX goes live, so prices might actually go up. So much speculation!
TJ
Sent from my iPhone using Tapatalk
-
my 1080Ti (OC) is @99%-100% (Task Manager) most of the time while flying in VR...
...I'm curious what this would be on the RTX 2080Ti with the same settings.
None of us will know until Sept 17th.
Rumors go anywhere from only a 6% increase in rasterization to 30-35%. It also depends on whether you are taking into account that Nvidia changed the model numbers - so a 2080 is comparable to a 1080 Ti, a 2080 Ti is a Titan, etc. The high rumors compare straight model numbers (2080 Ti to 1080 Ti), but I don't think this is accurate. The 2080 Ti costs what the Titan cost at launch, so I prefer to compare the 2080 to the 1080 Ti (which is where the smaller 6% rumors come from).
Once the NDA is lifted for the first reviewers on Sept 17th, we will all find out.
TJ
Sent from my iPhone using Tapatalk
-
No, I'm simply telling you that the size is irrelevant, as is the pixel density (since that's a function of size) — only the resolution matters.
My point was that his resolution was already significant — indeed higher than the one it was compared against — and making his graphics card tremble wouldn't exactly improve his FPS. In addition, if taxing the GPU was the end goal, then based on what we've seen so far, DCS isn't really the right software to do that in because something else is holding him back. And in other games, he can achieve the same effect simply by ramping up graphics quality until that resolution becomes an issue for the rendering pipeline.
My other point was that, if you're going to suggest things that get more out of the graphics card, then something as irrelevant as pixel density isn't a good choice because, again, it's not a factor.
Agree to disagree.
TJ
Sent from my iPhone using Tapatalk
-
Man, 2nd-gen drivers are from 2010 and 2013. I promise you, there's at least one that causes just about any process to spike to 99% CPU usage when it normally wouldn't be a heavy load.
I've had to hunt down drivers to fix that often enough to know. The main culprits, again, were SATA controller drivers and network drivers.
Try a fresh Windows install without installing any of the drivers for your motherboard at all. You might get lucky and Windows 10 has them in its database. But if not - you can watch the CPU go to 99% generally just by moving your mouse.
I'm not completely ruling out the 1080 Ti being CPU-bottlenecked @ 4.3 GHz, but at the same time - his 1060 should not have been at his high resolution, though it would struggle with it, which would at least give SOME increase upon swapping to a more powerful GPU, up to the point of the CPU not being fast enough. I've had to troubleshoot a 2nd-gen Intel system enough to say: check for driver faults with Poolmon. It's pretty informative. You might find an older driver works better than the latest one.
Yeah, but in your scenario I imagine he'd have a horrible experience with his machine in general. I doubt he would have dropped all that coin to upgrade to a 1080 Ti if his overall machine was horrible. Instead, it's exactly this that I've seen time and time again on these forums: 2600K folks upgrading to a 1080 or 1080 Ti and not understanding why they don't see a difference in performance.
This screen grab of the 2600K CPU running DCS is exactly why the 1080 Ti makes no (or little) difference. There is no headroom in single-thread CPU performance, hence the CPU bottleneck when running DCS.
Sent from my iPhone using Tapatalk
-
Of course you can, because monitor size simply does not matter. Pixel density is a physical characteristic of the monitor build — stuff the graphics card couldn't care less about.
If you push an image of a given resolution to a 10" screen, it will require exactly the same amount of GPU power as if the same resolution was displayed on a 100" screen, irrespective of the much lower pixel density of the latter. If you pushed a 25% larger image to a 25% larger screen, you'd need 25% more GPU power, even though the pixel density would be exactly the same.
But this isn’t even the argument we are making....
We are discussing two monitors (well in this case six monitors) of exactly the same size. Three of those monitors are 1080p, three are 1440p.
If you are going to sit there and tell me that the 1440p monitors won’t tax the gpu harder, then there really isn’t anything I’m going to be able to say to convince you otherwise.
My post here was telling the OP that if he wants to see that 1080 Ti tremble at the knees, he'd have to increase his resolution, because there is nothing his CPU will be able to do to increase FPS in any significant way.
Close, but not quite. If you move to a higher resolution, you tax the GPU more, but again, monitor size and pixel density are not factors in that increased load. Only the resolution itself matters.
Exactly. The whole 1080 vs 1440 is not really what matters — it's the total resolution, and his setup means that he's already running at a resolution that taxes the GPU pretty hard. Further increasing this resolution might tax the card more — no surprise there — but that doesn't really help resolve anything.
Yes, CPU limitations are a likely candidate for explaining the lack of improvement, just as the screwy multi-monitor rendering suggested by toutenglisse, or any of a number of driver issues as Headwarp suggests. That was never really the question — just that even with his current setup, he is pushing the card harder than the 1440 setup that it was compared against and that you can't just look at vertical resolution and conclude which one is more punishing for the graphics card.
I agree with all of this except the second sentence of the last paragraph. I was suggesting increasing monitor resolution by moving to 1440 - the OP agreed, with his "I'd need a loan" comment. This wouldn't tax the CPU anywhere near as much as it would hurt the GPU. (Of course, I also don't recommend DCS in 1440p - plane spotting is hard enough in 1080! That increase in pixel density makes it much harder for your physical eye to see the speck on the horizon!)
Cheers- I’m off to the day job.
TJ
Sent from my iPhone using Tapatalk
-
Dude, if his drivers are faulty and causing high CPU usage, cleaning them up will increase performance in a CPU-hogging game by multitudes. I've experienced the issue and what it can do to gaming. I never once mentioned going from 1080p to 1440p - and 3440x1440 isn't 1440p, lol. What I did say is that modern cards are powerful enough that at 1080p, CPUs under a certain clock speed are bottlenecked.
Even a faulty network driver can hog CPU cycles and drastically reduce performance on a machine. In fact, the only differences between his rig and my 2500K are hyperthreading, resolution (of which he has more), and his GPUs. With that res his 1060 should have been pegged, but the fact that NO fps increase came from the upgrade shows that it wasn't.
Again - if it's JUST his CPU, increasing clock speed would net FPS increases.
You've not really proven a thing. There are multitudes of things that can cause high CPU usage, and it takes more than looking at Task Manager and coming to a conclusion. The fact that I didn't get an increase going from a 4.3 GHz 2500K to a 4.7 GHz 8700K tells me that I could downclock to a lower CPU speed and not take an fps hit, because my GPU is maxed. If he were to plug his 1060 back in and it not be pegged at 99% at that resolution, I'd bet money it was one of the janky drivers I was talking about.
I'd be willing to bet that if you overclocked that 3770K of yours to 4.5 GHz or more, you'd see increases in fps without paying for more than a new CPU cooler, provided you have the best working drivers for your system.
Show me a driver issue that nets you even a ten percent change in performance and I’ll be amazed. This isn’t an AMD platform we are talking about here. :)
TJ
Sent from my iPhone using Tapatalk
-
I'm simply looking at the context of that quote and what Sub2k was responding to. It seemed like you were suggesting that the difference Headwarp saw when he went to 3440×1440 was somehow due to the change in vertical resolution, whereas (by the sound of it) Sub2k should not have seen a change because he remained at 1080. The exchange I followed was:
I'm saying that the vertical resolution isn't what matters — it's the sum total of the amount of pixels that need to be pushed that does. The reason Headwarp's card is taxed is because he's pushing out almost 5 Mpx per frame, which is (apparently) more than the card can handle. Sub2k is pushing out even more — 6.2 Mpx. That's 69% more than you have to feed a regular 2560×1440 display; it's 26% more than what's needed for the 3440×1440 that Headwarp was talking about; it is (obviously) 3× more than a regular 1920×1080 display.
So the 1440p vs 1080p comparison is a red herring. Between the 5760×1080 and 3440×1440 setups, the former — the 1080p setup — is the more taxing one.
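For reference, the per-frame pixel totals behind those percentages work out as follows:

```latex
\begin{aligned}
5760 \times 1080 &= 6{,}220{,}800 \approx 6.2\ \text{Mpx}\\
3440 \times 1440 &= 4{,}953{,}600 \approx 5.0\ \text{Mpx}\\
2560 \times 1440 &= 3{,}686{,}400 \approx 3.7\ \text{Mpx}\\
1920 \times 1080 &= 2{,}073{,}600 \approx 2.1\ \text{Mpx}\\
6.22 / 3.69 &\approx 1.69\ (+69\%), \quad
6.22 / 4.95 \approx 1.26\ (+26\%), \quad
6.22 / 2.07 = 3.0\ (3\times)
\end{aligned}
```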
More to the point, while it might not max out a 1080 Ti, it should be a severe strain on a 1060, and thus the upgrade from the latter to the former should have made a difference. But it didn't. So either a 1060 is more than enough to process that huge resolution (and the Ti much more so), but in that case he shouldn't suffer such low fps on either card; or he was pushing the limit of the 1060 (which would explain the low fps there), in which case we have to wonder why he did not see an improvement with a graphics card that isn't tortured by that resolution — some other bottleneck is keeping it down, and it's not resolution-bound.
No. Pixel density is just a matter for the monitor manufacturer to worry about in terms of maintaining build and image quality. It makes zero difference for the graphics card because the graphics card does not care about the monitor size — it just pushes pixels.
It takes the exact same amount of work to push 3.7 Mpx to a 24" display (122 px/in) as it does to push 3.7 Mpx to a 30" display (98 px/in), even though the former has 25% higher pixel density.
I just can't, man. I don't even know where to begin. You obviously can't compare Mpx across two different monitor sizes! Sure, a triple-monitor setup will require MORE PIXELS to compute, which requires more GPU. Also, if you increase the pixel density with the same size monitor (à la moving from 1080 to 1440), you'll tax the GPU much harder. Again - more pixels per square inch (say, two 27-inch monitors, one at 1080 and one at 1440) means the GPU pushes roughly 78% more pixels on each frame for the 1440 (2560×1440 vs. 1920×1080), due to the increase in pixel density per square inch.
These are basic fundamental facts.
His CPU (as posted in a screenshot earlier) is the cause of his poor performance. It's old and showing its age in its single-thread performance. Sure, he might clean some drivers up and get a small increase, but an old Sandy Bridge won't power his 1060, let alone the 1080 Ti - again, at 1080. Increase your resolution and you can tax that GPU much harder - say by moving to three same-size 1440p monitors, or adding a fourth 1080p... I didn't realize Nvidia allows four monitors on the 10xx series.
I’m going to let this die. It’s clear from your long post (and the multiple edits that took place while I wrote this quick reply) that you have time to argue and debate. I’m not interested. I was here to help a guy understand how his 2600k was his bottleneck and what he might do to improve it. I’ve proven that point and will move on.
Good day.
TJ
Sent from my iPhone using Tapatalk
-
@TJ,
If I'm gonna upgrade my three 1080 monitors to three 1440..... I'm gonna need a loan!... ;)
#TruthFact! I’m still nursing a 780Ti for a similar reason!
TJ
Sent from my iPhone using Tapatalk
-
You're confusing a couple of things here.
You're right that sheer amount of pixels matters, but screen size or pixel density do not. Nor does vertical resolution take any kind of precedence over horizontal — it's the product of the two that matters, so increasing one is exactly the same as increasing the other. Just because he's running 1080 vertical resolution does not mean he's not putting a lot of strain on the graphics card — he is, just on the horizontal axis instead. In fact, he's doing a lot more so with his setup than what you're suggesting.
At 5760×1080, he's pushing 6.2 Mpx compared to merely 3.7 Mpx for a regular 1440p display — almost 70% more. He has already put a lot more strain on the graphics card than just going from a 1080p display to a 1440p one.
We were comparing two setups from two people, both with triple monitors. One went to 1440p, the other stayed at 1080p, and the two posters were trying to figure out why the 1080p setup wasn't taxing the GPU like the 1440p setup. My post still stands.
The only way he is going to tax the 1080 Ti further is to increase resolution. And since he already has three screens, that means more pixel density. 1440p it is!
Also - pixel density is everything. If I can cram 25% more pixels into the same space, the GPU will work harder to render that frame. The first sentence of your second paragraph is factually incorrect. Although I do agree with the rest of your post - I think you just lost the context of the conversation.
TJ
Sent from my iPhone using Tapatalk
-
@TJ,
Actually, since I'm using 3 x 1080p monitors, I'm at 5760.
You're talking width, which does tax the GPU harder, but that isn't the same as the last number (which in your screenshot is still 1080). Once you move to 1440 resolution, you'll have roughly 78% more pixels per square inch of monitor space. Or, put another way: of two exact same-size monitors, one running 1080p and the other 1440p, the 1440p will require roughly 78% more GPU power, as it has 78% more pixels in the same area as the 1080p monitor (2560×1440 = 3.7 Mpx vs. 1920×1080 = 2.1 Mpx). The GPU has to render more pixels in the same space as that 1080p monitor, and it's much tougher on the GPU.
Hence - if you really want to see that 1080 Ti pushed to the limit on your setup, trade those 1080p panels for 1440p panels.
TJ
Sent from my iPhone using Tapatalk
-
Much better video card = same frame rate...
But I do have to wonder how you could've been CPU-bottlenecked with a 1060 when my GPU was, and still is, a bottleneck at lower resolutions with a more powerful card.
That being said - my experience ends at the 980 Ti; I might learn some things when I do upgrade, hopefully this month. For the life of me, the fact that, with more pixels, your 1080 Ti is giving the exact same framerate as your 1060 - as if you were CPU-bound on both cards - while my 980 Ti has been at 99% ever since I got a 3440x1440 monitor on the i5 counterpart of your exact build, has me baffled.
Resolution is the reason. You went from 1080p to 1440p- original poster did not (he is still 1080p.) Increasing resolution puts relatively little strain on CPU but tortures the GPU.
This is also why most folks don’t recommend a 1080ti for 1080p gaming- you just aren’t going to see any major improvements because at 1080p there still isn’t enough CPU power to push that card.
TJ
Sent from my iPhone using Tapatalk
-
@ TJ,
Actually, that screencap was taken over Dubai.
Also, I have my pagefile set to 16 GB. Could that be the difference?
No - you'll only hit the page file if you run out of memory. And you'll feel it as a hard stutter.
My guess is that DCS is loading most of the terrain into the 11 GB of onboard memory that GPU has, and that's why you're seeing low DDR3 memory usage. This is a great thing - video memory is much faster.
It's easy to test if you still have that 1060 lying around. Just swap them and see what your physical memory does (I'd expect it to go up quite a few GB). If Newegg and Amazon keep teasing me with super-low 1080 prices, I might try this test in the next day or two. :)
TJ
Sent from my iPhone using Tapatalk
-
Exactly like my setup.
One logical core maxed out while the other 7 are not doing much...and the GPU at 60%.
I’d say you’re CPU-bound, just like me.
We don't have too many options beyond cranking down CPU-heavy settings (trees, civ traffic, etc.) and hoping the Vulkan revamp kicks ass and allows for better multi-core performance.
Yes - the screenshots clearly show a CPU-bound scenario. The primary core is maxed and the GPU is only at 60%. About all you can do from here is increase resolution (which taxes the GPU and not the CPU). However, 1440 monitors aren't cheap, and plane spotting at that resolution isn't exactly easy.
TJ
Sent from my iPhone using Tapatalk
RB-15 Mavericks don't work for me in Open Beta
in DCS: AJS37 Viggen
Uhh. The key binds changed in the open beta (thanks, ED! /sarcasm).
Most likely, when you upgraded to OB your key binds got hosed by ED's update. Check your binds in the sim in the OB; you'll probably find them jacked.
TJ
Sent from my iPhone using Tapatalk