Hello all,
I'm a blind gamer who would *love* to dive into DCS World. I'm prepared to invest in real hardware (HOTAS, cockpit panels, head tracking), study the systems, and take this simulation as seriously as anyone else here.
But right now, **DCS is effectively impossible to play without vision.**
The good news?
It doesn’t have to stay that way.
---
**Why DCS Is So Promising for Accessibility**
Unlike many other flight sims, DCS already has a modular, data-rich cockpit interface via [DCS-BIOS](https://github.com/DCS-Skunkworks/dcs-bios). This ecosystem exposes:
* Cockpit states
* Gauges and indicator lights
* Radios and communication panels
* Weapon systems
* MFDs, HUDs and more
(and [MFD label support is actively being developed](https://github.com/DCS-Skunkworks/dcs-bios/issues/1208))
These values can be used to generate **spoken output, sonification, or braille** — but in practice, that potential is locked behind huge technical barriers.
---
**Where Accessibility Hits a Wall**
Right now, the available tools are unusable without vision:
* Web-based interfaces like `dcs-bios-webinterface` are entirely graphical and not usable with screen readers like NVDA or JAWS
* The raw data (e.g. `0x3B7F: 65534`) is meaningless without deep knowledge of each aircraft’s internals
* Python connectors exist, but are poorly documented or incomplete
* Input/output mapping must be configured *per aircraft*, often manually, with little abstraction
Even as a tech-savvy user, I simply can't build a usable interface from scratch without sight.
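To make the barrier concrete, here is a minimal sketch of the kind of translation layer that is currently missing: turning a raw DCS-BIOS `(address, value)` pair into something a screen reader can speak. The addresses, masks, and formatters below are hypothetical placeholders; real ones would come from each aircraft's DCS-BIOS control reference.

```python
# Sketch: raw DCS-BIOS export words -> speech-ready text.
# All addresses/masks here are HYPOTHETICAL examples, not real aircraft data.

ADDRESS_MAP = {
    # address: (name, bit mask, shift, value formatter)
    0x3B7F: ("Master Caution", 0x0001, 0, lambda v: "on" if v else "off"),
    0x43A0: ("Radio 1 frequency", 0xFFFFFFFF, 0, lambda v: f"{v / 1000:.3f} MHz"),
}

def describe(address: int, raw_value: int) -> str:
    """Translate one exported word into a human-readable sentence."""
    if address not in ADDRESS_MAP:
        return f"Unknown address {address:#06x}: {raw_value}"
    name, mask, shift, fmt = ADDRESS_MAP[address]
    value = (raw_value & mask) >> shift
    return f"{name}: {fmt(value)}"
```

With a per-aircraft schema, a table like `ADDRESS_MAP` could be generated automatically instead of hand-built by someone who cannot see the cockpit.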
---
**What Would Make a Real Difference**
This isn’t a request to redesign DCS itself — just to improve how data is exposed via DCS-BIOS and related tools.
**Example: Accessible MFD Output**
Thanks to the work now underway by the DCS-BIOS team, we may soon be able to retrieve MFD button labels dynamically.
If exposed in structured JSON:
```json
{
  "mfd_left": {
    "B1": "NAV",
    "L3": "WP1",
    "R4": "TGT ACQ"
  }
}
```
This could be spoken aloud or navigated via keyboard — unlocking full MFD usage, including targeting and nav functions.
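As a sketch of how little glue code that would take: assuming the structured JSON format proposed above (the display and button names are illustrative), a flattening step turns it into one spoken line per labelled button, which any TTS engine or braille display could then consume.

```python
# Sketch of a screen-reader-friendly MFD label browser, assuming the
# structured JSON label format proposed above (names are illustrative).
import json

def mfd_lines(payload: str) -> list[str]:
    """Flatten the JSON into one spoken line per labelled button."""
    data = json.loads(payload)
    lines = []
    for display, buttons in data.items():
        for button, label in sorted(buttons.items()):
            lines.append(f"{display}, button {button}: {label}")
    return lines

sample = '{"mfd_left": {"B1": "NAV", "L3": "WP1", "R4": "TGT ACQ"}}'
for line in mfd_lines(sample):
    print(line)  # each line could be handed to a TTS engine instead
```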
**Example: Radio Panel Readout**
Instead of:
```
0x43A0: 121500
```
We could have:
```json
{ "radio1_frequency": 121.5, "unit": "MHz", "mode": "AM" }
```
Spoken as:
> "Radio 1: 121.5 Megahertz AM."
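The conversion itself is trivial once the encoding is documented. A sketch, assuming the raw value is the frequency in kilohertz (that scale factor is my assumption, not a documented fact):

```python
# Sketch: raw exported integer -> spoken radio readout.
# ASSUMPTION: the raw value is the frequency in kHz (121500 -> 121.5 MHz).

def speak_radio(raw: int, radio: int = 1, mode: str = "AM") -> str:
    mhz = raw / 1000
    return f"Radio {radio}: {mhz:g} Megahertz {mode}."

print(speak_radio(121500))  # Radio 1: 121.5 Megahertz AM.
```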
**Example: Status and Warnings**
```json
{
  "master_caution": true,
  "gear_down": false,
  "fuel": 35.2
}
```
Spoken as:
> "Warning: Gear is up. Fuel at 35 percent."
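A sketch of the formatting step, using the field names from the JSON example above (the wording and rounding choices are illustrative):

```python
# Sketch: structured status JSON -> spoken status line.
# Field names match the example above; phrasing is illustrative.

def speak_status(state: dict) -> str:
    parts = []
    if state.get("master_caution"):
        parts.append("Warning:")
    parts.append("Gear is down." if state.get("gear_down") else "Gear is up.")
    parts.append(f"Fuel at {round(state['fuel'])} percent.")
    return " ".join(parts)

print(speak_status({"master_caution": True, "gear_down": False, "fuel": 35.2}))
# -> Warning: Gear is up. Fuel at 35 percent.
```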
**Input Mapping**
* DCS-BIOS already supports input via serial or UDP
* Physical buttons or keyboard keys can be mapped (e.g. G = Gear Toggle)
* This could be defined in JSON configs without needing to write custom code
---
**Audio Interaction Already Works — Just Not in DCS**
Blind players already enjoy:
* **MSFS / FSX** using TTS plug-ins
* **Elite Dangerous** with audio-based MFD mods
* **X-Plane** via structured plugin data
* Even **Star Trek bridge simulators** with full voice interfaces
But DCS has the **best cockpit data access of all** — and we're not using it for accessibility.
---
**What I'm Asking**
1. Expose DCS-BIOS output in structured, documented JSON
2. Provide a schema per aircraft — so values have context
3. Create a CLI or screen reader–friendly monitor for DCS-BIOS
4. Involve accessibility-focused testers — I’m happy to help
---
**Why It Matters**
DCS is one of the most realistic combat simulators ever created. With only minimal changes in how cockpit data is structured and surfaced, it could become **the most accessible one, too**.
The technology is already here.
What’s missing is **visibility and intent**.
Let’s fix that — together.
> PS: This is also posted to the DCS-BIOS GitHub:
> [https://github.com/DCS-Skunkworks/dcs-bios/discussions/1207](https://github.com/DCS-Skunkworks/dcs-bios/discussions/1207)
---
**One More Thing: `Export.lua` – An Untapped Ally**
While DCS-BIOS is an excellent bridge to the cockpit, it's worth noting that `Export.lua` can be used **alongside it** to expose additional flight data:
* Speed, altitude, heading
* Aircraft state, sensor values
* Cockpit lights, warnings, and targeting info
* Even some data **not available through DCS-BIOS**
This makes it possible to build **custom assistive tools**, voice-controlled UI layers, or sonified interfaces — all without changing the simulator itself.
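Since `Export.lua` sends whatever your script writes, the receiving side can be as simple as a small parser plus a TTS call. A sketch, assuming a self-defined `name=value;name=value` line format (the format and field names are my own convention, not anything DCS prescribes):

```python
# Sketch: parsing telemetry from a custom Export.lua script.
# The "name=value;name=value" line format is an ASSUMPTION -- Export.lua
# sends whatever your Lua script writes, so you define the format yourself.

def parse_telemetry(line: str) -> dict[str, float]:
    """Parse one semicolon-separated key=value line into floats."""
    fields = {}
    for pair in line.strip().split(";"):
        if "=" in pair:
            key, value = pair.split("=", 1)
            fields[key.strip()] = float(value)
    return fields

def speak_telemetry(fields: dict[str, float]) -> str:
    return (f"Altitude {fields['alt_ft']:.0f} feet, "
            f"heading {fields['hdg_deg']:.0f} degrees.")

sample = "alt_ft=12500.0;hdg_deg=271.4"
print(speak_telemetry(parse_telemetry(sample)))
# -> Altitude 12500 feet, heading 271 degrees.
```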
If anyone in the community has explored this route — or is working on accessibility, speech interfaces, or similar solutions — I'd love to connect.
Let’s share knowledge, collaborate, and push this forward.