
Recommended Posts

Posted (edited)
I have Network speed set to Lan but I am seeing aircraft appear and then disappear. So what setting is best to use now? I have 60GB fibre optic cable. Thanks!

That's fast!

I guess you mean 60 Mb/s. Anyway, the warping you're seeing might not be on your end, or your ping to the server might be too high.

Edited by Cyb0rg


Asteroids

____________________________________________

Update this

 

:D
Posted

1024 kbps, for MP.

Windows 10 Pro, Ryzen 2700X @ 4.6Ghz, 32GB DDR4-3200 GSkill (F4-3200C16D-16GTZR x2),

ASRock X470 Taichi Ultimate, XFX RX6800XT Merc 310 (RX-68XTALFD9)

3x ASUS VS248HP + Oculus HMD, Thrustmaster Warthog HOTAS + MFDs

Posted

Actually to add to that it's also pretty well accepted that if there is substantial ground action or CA involvement 2048 should be the ceiling of your connection. Depends on what servers you tend to join- but if they're more adversarial / A2A then 1024 should be fine. More A2G related missions should be 2048.

 

It does make a difference in terms of the load the servers can handle and I think the general message out there is "don't just keep it on LAN because that's the way it's always been and it's fine." Things change and as the platform changes, we all need to make adjustments.

 

The ONLY time I've found (personally) that 1024 didn't work well for me was when I was actually hooked up via LAN (the server comp sits 2 feet away from my client comp). I experienced substantial warping. However, with that being said now that I'm typing this the mission being run at the time (the 127 stress test mission) was running enormous amounts of AI in the air and on the ground and that may have been contributing.

 

Worth some thought either way.

"ENO"

Type in anger and you will make the greatest post you will ever regret.

 

"Sweetest's" Military Aviation Art

Posted

As a rule of thumb:

 

UPLOAD bandwidth of the server, minus 20%, divided by the number of clients, is the max that you should dial in.
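That rule of thumb fits in a few lines of Python (a sketch for illustration only; the function name and the 20% headroom default are mine):

```python
def max_client_rate_kbit(upload_kbit: int, clients: int, headroom: float = 0.20) -> int:
    """Rule of thumb: reserve some headroom on the server's upload
    bandwidth, then split the rest evenly among connected clients."""
    usable = upload_kbit * (1.0 - headroom)
    return int(usable // clients)

# A 20 Mbit upload line shared by 16 clients:
print(max_client_rate_kbit(20_000, 16))  # 1000 kbit/s per client
```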

 

Most data centre servers nowadays feature a 1 Gbit connection to the TOR (Top of Rack) switch; those are usually connected with 10 or 40 Gbit links to the backbone appliances.

 

With a 1 Gigabit connection you can run roughly 80-90 clients at 10 Mbit; the bandwidth needed by the server OS is marginal and can be tuned by the admin of said server (automatic virus-pattern updates, other automated update procedures, etc.).

 

The game is totally different if you run the server off an asymmetric home line that has FAR more download bandwidth than upload bandwidth, like ADSL (Asymmetric Digital Subscriber Line).

Company connections usually run SDSL (Symmetric Digital Subscriber Line), with upload and download at the same speed; the catch is that such a line costs a few hundred Euros or Dollars per month.

 

Most home connections don't have more than 2 Mbit upload, and most have less. Only in well-connected areas and smaller, highly developed countries can end users get a 100/100 Mbit line or even faster at a fair, affordable family price.

 

Take a 20Mbit UPLOAD connection, subtract 20% = 18 Mbit left to share among clients.

with 1024, up to 18 clients should be able to connect; at 2048 it's half that, and at 10 Mbit the second player will kill it all.

 

This not only applies to game servers; any other server role has the same boundaries and limitations, e.g. backup servers, file servers, etc.

 

 

Bit

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Sapphire  Nitro+ 7800XT - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus XG27ACG QHD 180Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 

Posted (edited)

 

Take a 20Mbit UPLOAD connection, subtract 20% = 18 Mbit left to share among clients.

with 1024, up to 18 clients should be able to connect; at 2048 it's half that, and at 10 Mbit the second player will kill it all.

 

This not only applies to game servers; any other server role has the same boundaries and limitations, e.g. backup servers, file servers, etc.

 

 

Bit

 

Not trying to nit-pick, but 20% of 20 MBit is 4 MBit, which leaves 16 MBit :)
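With the corrected figure, the head-count math goes like this (a small Python sketch; the function name is made up for illustration):

```python
def max_clients(upload_mbit: float, per_client_kbit: int, headroom: float = 0.20) -> int:
    """How many clients fit on the server's upload line at a given
    per-client bandwidth setting, keeping 20% headroom for the OS."""
    usable_kbit = upload_mbit * 1000 * (1.0 - headroom)
    return int(usable_kbit // per_client_kbit)

print(max_clients(20, 1024))  # 15 clients at the 1024 setting
print(max_clients(20, 2048))  # 7 clients at the 2048 setting
```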

 

But I see what you're saying, and it's what I've said to others and what Server Ops try to explain to clients.

 

The server load with clients on LAN is crazy (Eno can show the docs to prove it); server load with ADSL2048 is significantly less, and with ADSL1024 less than that.

 

 

Some Observations I've seen:

 

With everyone on LAN:

5-6 pilots will be smooth; as you approach 10 pilots, warping sets in, and if you keep pushing it, you get a ping timeout from the server.

Users with bad upload bandwidth will warp, and as the mission progresses and groups spawn and move, the chances of a ping timeout from the server increase.

 

With everyone on ADSL2048, we had up to 12 pilots smooth, and ground units smooth, in massive missions.

 

With everyone on ADSL1024, we had some issues with heavy missions.

 

 

 

 

So heavy missions with air and ground units, JTAC, etc. should probably use ADSL2048.

 

Servers that run a simple A2A mission with just player spawn points can definitely run with ADSL1024 settings, as the amount of data being transmitted over the network will be less and won't need a larger connection setting.

Edited by SkateZilla


Posted

In addition:

 

Servers with many clients connected certainly have more data to pump out than a server running the exact same mission with only ⅓ of the clients connected, at the same connection speed.

 

That means you can connect with LAN settings and no more data will be sent than at 1024 if the server simply has no data to send you. The game changes drastically once enough clients have connected and the amount of data that must be sent to each client goes up considerably.

That is when the bandwidth cap comes into play (causing other headaches). The server will/must decide what it sends to you and what it doesn't (if DCS works this way, I honestly don't know how this is handled). There is a maximum amount of data that can be sent to you in time to arrive before it is too late; it's a kind of streaming, and if the packets arrive too late the show is already over.

 

If the server has more data to pump out than it physically can, all connected players get issues.

Packets arrive too late or not at all (dropped by the server), and general IP issues show up if you clog the interface: ping goes sky high, dropped packets, overflows, etc.

 

If, on the other hand, all players settle for the lowest setting that still works for the number of players connected, the total must somehow match the overall amount of data to be sent for that many players in that given mission. It isn't easy, and it changes with the mission, the number of clients, and client-side bandwidth settings that are too high in general or too low for the number of players connected. "Too high" won't matter until a certain number of users connect and use up the bandwidth for themselves rather than sharing fair amounts with everyone.

 

If your bandwidth setting is too low for the given number of players and mission, you will experience basically the same issues as when some players' settings are too high and they use up all the available bandwidth: packets meant for you get lost or delayed, and both are critical issues.

 

I strongly suspect the game server uses the UDP protocol for its raw game data, since it carries less overhead. UDP doesn't care about delays; if packets arrive out of order, the ones that come too late get tossed. Pretty simple, and effective at keeping processing costs to a minimum.

Streaming video or music mostly uses UDP as well: if the song has already played 2 seconds further, the missed bit of audio that took another route and arrived late is of no use to the stream anymore and is tossed overboard. Arrive in time or be dropped is the name of the game in time-critical transmissions. TCP couldn't fix this either; it would just require far more processing power, and you can't cheat the flow of time.
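The toss-late-packets behaviour can be illustrated with sequence numbers (a generic sketch of the idea, not DCS's actual protocol):

```python
class StaleFilter:
    """Discard UDP-style packets that arrive after a newer one has
    already been processed; a late state update is useless."""
    def __init__(self):
        self.latest_seq = -1

    def accept(self, seq: int) -> bool:
        if seq <= self.latest_seq:
            return False        # out of date: toss it
        self.latest_seq = seq
        return True

f = StaleFilter()
print([f.accept(s) for s in [1, 2, 4, 3, 5]])  # [True, True, True, False, True]
```

Packet 3 arrives after packet 4 has already been seen, so it is dropped; the game state has moved on.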


Posted

Masterarms.se runs all missions with the LAN setting. We experience warping if we run with lower settings.

 

At least in Sweden fibre is common; I pay €38/month for 100/100 Mbit.

i7 8700K | GTX 1080 Ti | 32GB RAM | 500GB M.2 SSD | TIR5 w/ Trackclip Pro | TM Hotas Warthog | Saitek Pro Flight Rudder

 

http://www.132virtualwing.org

 

Posted
Masterarms.se runs all missions with the LAN setting. We experience warping if we run with lower settings.

 

At least in Sweden fibre is common; I pay €38/month for 100/100 Mbit.

 

 

As I said, a small, highly developed country; NL also has such high-speed connections for home users.

 

Where I come from there is no such thing :( We pay €39.90 for a 32/2 Mbit cable line!!

 

 

You are a lucky bunch of guys up there ;)

 

 

Bit


Posted
Masterarms.se runs all missions with the LAN setting. We experience warping if we run with lower settings.

 

At least in Sweden fibre is common; I pay €38/month for 100/100 Mbit.

 

Yes - the servers are still encouraged to use LAN. And 100/100... Drool.


Posted

30€ for >32MBit is common in Germany today with VDSL or cable. But I can remember times when we had 64 kBit ISDN and in the US there was this mysterious "cable" technology providing >1 MBit...;)

Posted (edited)

From what I have read, it obviously depends on how many players are connected then, is that right? So if that's the case, why do servers allow so many people to connect when it only ruins the experience? I mean, nobody likes to see aircraft warping all over the show, do they?

Edited by Dudester22
Posted

Servers can see how much data they send to each player and weigh it against how much upload bandwidth they have, so they have a good idea how many people they can host while keeping performance up.


Posted

How come the max connection speed is a client setting instead of the server choosing the max speed for each client? The value depends heavily on the mission and the connections of the server and clients, so the client should just pick the max allowed.

DCS Finland: Suomalainen DCS yhteisö -- Finnish DCS community

--------------------------------------------------

SF Squadron

Posted
How come the max connection speed is a client setting instead of the server choosing the max speed for each client? The value depends heavily on the mission and the connections of the server and clients, so the client should just pick the max allowed.

 

 

This is a good point. If you are required to change from 1024 to 2048 every time you join a server, how is the client going to know what to set? Is this mission going to require 2048 because it is a busy mission with plenty of ground action, or is it a simple air-to-air mission that only requires 1024?

 

Although I have seen plenty of warping on the VA server when they reach 30-40 clients, even with no ground action going on in that mission.

 

It would be nice if it could be set in the mission editor, if it was possible.

Posted (edited)

Hmm, I dunno. The client-side bandwidth setting is for the receiving side. Wouldn't it just tell the server how much data the client can take in a given time frame, WHEN the server has data to send? It sounds more like there are spikes of data to be sent, and it's a lottery who receives all the packets when the server's bandwidth limit is shared across all connected clients.

It might already be auto-throttled server-side, but spikes will be spikes. Once a run of spikes starts, it cascades.

 

For me it doesn't make sense that a game with a network focus has had, for many years, a client setting that can blow up a server's connection at any given time. I even wonder whether that setting has any effect at all when running a client instead of a server.

Edited by BRooDJeRo
Posted (edited)

The basic problem is still the same: too many servers run a scenario that they shouldn't run with that many players, considering the server's upload capability. You can do the math back and forth, but if the upload bandwidth is too low in general, the server admin should set a player limit that fits. Anything else causes problems.

 

The spikes could be explained like this (at least this is how I explain it logically to myself):

Some players fly over terrain with heavy ground battles and get more data than others sitting in their shelters revving up the turbines, or flying over the sea at 15k with nothing in sight but the sky and blue sea below. Now imagine mixing this up a bit: more players enter the heavy ground-battle area, two Su's take off and zoom into a heavy A2A area, and suddenly the server has a lot more data to pump out to certain players for some moments (or longer), thus producing spikes. If the spikes stay below the max upload bandwidth, all is OK; if they exceed the limit, some algorithm must decide what has higher priority, i.e. what to drop and what to send under all circumstances.

 

The core problem is not the game code; it's the limited bandwidth some servers have.

Anything below 100 Mbit is not suited for 30+ players, I guess.

 

OK, one could optimize the code and compress the data into smaller packets, but in general that is not the fix. The fix is a 1 Gbit dedicated server in a data centre with multiple backbone links at 10+ Gbit near your clients' location. This is the same across all multiplayer games, whether first-person shooter, F1 racing, or our beloved flight sim.

 

If you can't send your data in time, most packets in the queue are lost, because the ongoing gameplay renders them obsolete before they can be used. Some packets may still make sense when sent a second later, like "Player XYZ joined at Batumi with A-10C"; that is not a time-critical packet. But with two choppers shooting Vikhrs at the same tank, each wanting the kill, it does matter which Vikhr hits the tank and which one hits a wreck... or dogfight maneuvers, the nightmare for ping latency...

 

Bit

Edited by BitMaster


Posted

Another thing that could potentially cause warps and delays in sending many packets:

 

IF you call a 100/100 Mbit line your own, like our Swedish and Dutch friends, it also plays a BIG role what kind of router you have once you go beyond a certain number of connected players.

 

Usually, typical home-user routers are not capable of handling many packets and connections at the same time; their internal CPU is just too weak, causing processing delays that you can't fix unless you replace that router with a more powerful model.

 

With a standard Home-Box I wouldn't see a problem up to 10 clients connected.

 

For those interested in upgrading their routing power, have a look at this project:

http://www.ipcop.org. It's free and fairly easy to install on an older PC; 500 MHz is plenty of routing power for up to 30 players, my guess. If you have a 1-2 GHz Celeron or similar, it will let you connect more players than you otherwise could from your line, guaranteed.

Install 1 Gbit cards if possible (their latency is much lower than 100 Mbit cards), have 1 GB of RAM or more, and you are good to go.

 

You must have your own DSL or cable modem to make use of this. Just having a DMZ in your home router that connects to your ISP and placing the Linux IPCop in the DMZ won't fix this.

You can, if needed for VoIP or other services, set it up the other way round:

 

MODEM<-->Interface-0<>IPCop<>Interface-1<-->DCS-Server

 

plus

 

MODEM<-->Interface-2<>IPCop<>Interface-3<-->Home-Box-Router<--> Your Home LAN

 

You can contact me via this thread or PM if you need more help on this topic.

 

Bit

