Dataaja95 Posted Monday at 02:18 PM (edited)

Hello all,

I'm a blind gamer who would *love* to dive into DCS World. I'm prepared to invest in real hardware (HOTAS, cockpit panels, head tracking), study the systems, and take this simulation as seriously as anyone else here.

But right now, **DCS is effectively impossible to play without vision.** The good news? It doesn't have to stay that way.

---

**Why DCS Is So Promising for Accessibility**

Unlike many other flight sims, DCS already has a modular, data-rich cockpit interface via [DCS-BIOS](https://github.com/DCS-Skunkworks/dcs-bios). This ecosystem exposes:

* Cockpit states
* Gauges and indicator lights
* Radios and communication panels
* Weapon systems
* MFDs, HUDs and more (and [MFD label support is actively being developed](https://github.com/DCS-Skunkworks/dcs-bios/issues/1208))

These values can be used to generate **spoken output, sonification, or braille** — but in practice, that potential is locked behind huge technical barriers.

---

**Where Accessibility Hits a Wall**

Right now, the available tools are unusable without vision:

* Web-based interfaces like `dcs-bios-webinterface` are entirely graphical and not usable with screen readers like NVDA or JAWS
* The raw data (e.g. `0x3B7F: 65534`) is meaningless without deep knowledge of each aircraft's internals
* Python connectors exist, but are poorly documented or incomplete
* Input/output mapping must be configured *per aircraft*, often manually, with little abstraction

Even as a tech-savvy user, I simply can't build a usable interface from scratch without sight.

---

**What Would Make a Real Difference**

This isn't a request to redesign DCS itself — just to improve how data is exposed via DCS-BIOS and related tools.

**Example: Accessible MFD Output**

Thanks to the work now underway by the DCS-BIOS team, we may soon be able to retrieve MFD button labels dynamically.
If exposed as structured JSON:

```json
{
  "mfd_left": {
    "B1": "NAV",
    "L3": "WP1",
    "R4": "TGT ACQ"
  }
}
```

This could be spoken aloud or navigated via keyboard — unlocking full MFD usage, including targeting and nav functions.

**Example: Radio Panel Readout**

Instead of:

```
0x43A0: 121500
```

we could have:

```json
{
  "radio1_frequency": 121.5,
  "unit": "MHz",
  "mode": "AM"
}
```

Spoken as:

> "Radio 1: 121.5 Megahertz AM."

**Example: Status and Warnings**

```json
{
  "master_caution": true,
  "gear_down": false,
  "fuel": 35.2
}
```

Spoken as:

> "Warning: Gear is up. Fuel at 35 percent."

**Input Mapping**

* DCS-BIOS already supports input via serial or UDP
* Physical buttons or keyboard keys can be mapped (e.g. G = Gear Toggle)
* This could be defined in JSON configs without needing to write custom code

---

**Audio Interaction Already Works — Just Not in DCS**

Blind players already enjoy:

* **MSFS / FSX** using TTS plug-ins
* **Elite Dangerous** with audio-based MFD mods
* **X-Plane** via structured plugin data
* Even **Star Trek bridge simulators** with full voice interfaces

But DCS has the **best cockpit data access of all** — and we're not using it for accessibility.

---

**What I'm Asking**

1. Expose DCS-BIOS output as structured, documented JSON
2. Provide a schema per aircraft — so values have context
3. Create a CLI or screen reader–friendly monitor for DCS-BIOS
4. Involve accessibility-focused testers — I'm happy to help

---

**Why It Matters**

DCS is one of the most realistic combat simulators ever created. With only minimal changes to how cockpit data is structured and surfaced, it could become **the most accessible one, too**.

The technology is already here. What's missing is **visibility and intent**. Let's fix that — together.
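To show how little glue code the spoken readouts above would need, here is a minimal Python sketch. This is my own illustration, not an existing tool; the keys `radio1_frequency`, `master_caution`, `gear_down`, and `fuel` are the hypothetical JSON fields from the examples above.

```python
def radio_to_speech(state: dict) -> str:
    """Format a radio-state dict as a spoken sentence."""
    # Expand unit abbreviations so TTS engines pronounce them correctly.
    units = {"MHz": "Megahertz", "kHz": "Kilohertz"}
    unit = units.get(state.get("unit", ""), state.get("unit", ""))
    return f"Radio 1: {state['radio1_frequency']} {unit} {state.get('mode', '')}."


def status_to_speech(state: dict) -> str:
    """Format warnings and status flags as a spoken sentence."""
    parts = []
    if state.get("master_caution"):
        parts.append("Master caution.")
    # Speak gear state as a warning when the gear is up.
    parts.append("Gear is down." if state.get("gear_down") else "Warning: Gear is up.")
    if "fuel" in state:
        parts.append(f"Fuel at {round(state['fuel'])} percent.")
    return " ".join(parts)


print(radio_to_speech({"radio1_frequency": 121.5, "unit": "MHz", "mode": "AM"}))
# Radio 1: 121.5 Megahertz AM.
print(status_to_speech({"master_caution": False, "gear_down": False, "fuel": 35.2}))
# Warning: Gear is up. Fuel at 35 percent.
```

The resulting strings could then be handed to any TTS engine, or pushed to a screen reader such as NVDA.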
> PS: This is also posted to the DCS-BIOS GitHub:
> [https://github.com/DCS-Skunkworks/dcs-bios/discussions/1207](https://github.com/DCS-Skunkworks/dcs-bios/discussions/1207)

---

**One More Thing: `Export.lua` – An Untapped Ally**

While DCS-BIOS is an excellent bridge to the cockpit, it's worth noting that `Export.lua` can be used **alongside it** to expose additional flight data:

* Speed, altitude, heading
* Aircraft state, sensor values
* Cockpit lights, warnings, and targeting info
* Even some data **not available through DCS-BIOS**

This makes it possible to build **custom assistive tools**, voice-controlled UI layers, or sonified interfaces — all without changing the simulator itself.

If anyone in the community has explored this route — or is working on accessibility, speech interfaces, or similar solutions — I'd love to connect. Let's share knowledge, collaborate, and push this forward.
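To make the `Export.lua` route concrete: the export environment offers functions such as `LoGetAltitudeAboveSeaLevel()` and `LoGetIndicatedAirSpeed()`, and a small Lua exporter could serialize a handful of those values as one JSON line per update and send it over UDP. Below is a Python sketch of the receiving side. It is my own illustration, not an existing protocol; the field names `alt_m`, `ias_ms`, `hdg_deg` and port 7779 are assumptions.

```python
import json
import socket


def decode_export_line(line: bytes) -> str:
    """Turn one JSON line from a (hypothetical) Export.lua sender
    into a speakable sentence."""
    d = json.loads(line)
    knots = d["ias_ms"] * 1.94384  # assuming airspeed arrives in m/s
    return (f"Altitude {round(d['alt_m'])} meters, "
            f"speed {round(knots)} knots, "
            f"heading {round(d['hdg_deg'])} degrees.")


def listen(port: int = 7779) -> None:
    """Receive UDP datagrams and print (or speak) each update."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        print(decode_export_line(data))  # hand off to a TTS engine instead
```

Because the decode step is separate from the network loop, the same function could just as easily consume data replayed from a log file for testing.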
razo+r Posted Monday at 02:36 PM

Out of curiosity, and sorry for going off-topic, but how does one fly an aircraft if you cannot see? Would you have the software read out one instrument after another the whole time, or how does that work when you fly? I mean, having working vision is one of the basic requirements for flying a real aircraft, so... this request leaves me with more questions than answers.
Dataaja95 (Author) Posted Monday at 03:19 PM

> 42 minutes ago, razo+r said:
> Out of curiosity, and sorry for going off-topic, but how does one fly an aircraft if you cannot see? Would you have the software read out one instrument after another the whole time, or how does that work when you fly?

Great question — and no worries at all, it's not off-topic.

You're absolutely right that in real-world aviation, vision is a strict requirement. But in a simulation, things change — the limitations are different, and so are the opportunities.

So how do blind players fly aircraft in other sims like MSFS, X-Plane, or Elite Dangerous? By translating **instrument and status data into speech** or sound cues. Here's how it usually works:

* **TTS (Text-to-Speech)** reads out current altitude, speed, heading, vertical speed, etc.
* **Audio menus** allow selecting things like waypoints, radio channels, or even MFD pages.
* Some systems use **positional audio** (e.g., a warning comes from the left speaker if you're banking left too hard).
* In combat sims, events like "target locked" or "missile away" can be spoken aloud automatically.

The goal isn't to read out every gauge constantly — that would be overwhelming — but to give relevant, contextual feedback, just like you'd get from scanning your instruments visually.

This kind of setup is already working in:

* **MSFS with Talking Flight Monitor**
* **X-Plane with SonarPM plugins**
* **Elite Dangerous** using custom talking interfaces
* Even some **multicrew Star Trek sims** where blind players act as tactical officers

So the idea is to build a layer on top of DCS-BIOS (and possibly `Export.lua`) that turns cockpit data into useful speech feedback and control mappings. We don't need DCS itself to change — just access to the data in a structured way.
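The positional-audio idea is simpler than it might sound: map bank angle to a stereo pan value and play the warning tone through the corresponding channel. A minimal sketch of that mapping (the 60-degree full-scale value is an arbitrary choice of mine, not taken from any existing tool):

```python
def bank_to_pan(bank_deg: float, full_scale_deg: float = 60.0) -> float:
    """Map bank angle to a stereo pan value in [-1.0, 1.0].

    Negative values pan the cue to the left speaker, positive to the
    right, so a hard left bank (negative degrees) is heard on the left.
    """
    pan = bank_deg / full_scale_deg
    # Clamp so extreme bank angles saturate at full left/right.
    return max(-1.0, min(1.0, pan))


print(bank_to_pan(30.0))   # halfway toward the right speaker
print(bank_to_pan(-90.0))  # fully left: past full scale, clamped
```

Any audio library that accepts a per-channel gain could consume this value directly.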
It's not about making DCS *easy*. It's about making it *possible*.

Let me know if you're curious to hear more — happy to share how others are already flying like this!
Tom Kazansky Posted Monday at 07:57 PM

Hi, I got very curious about how this works, too. For those who want to see and hear how it works in a civil sim, here is a YouTube video. It's interesting right from the start, but if you want fast access, you may want to jump to the 28-minute mark, where the aircraft is lined up on the runway.
ShuRugal Posted yesterday at 06:38 PM

Man, I clicked on this thinking it was going to be focused on contrast/visibility options for the color-blind. But what's shown here, with other sims converting visual cues to audio cues, is wild. I never would have even imagined this.
Dataaja95 (Author) Posted 12 hours ago

> 13 hours ago, ShuRugal said:
> Man, I clicked on this thinking it was going to be focused on contrast/visibility options for the color-blind. But what's shown here, with other sims converting visual cues to audio cues, is wild. I never would have even imagined this.

Really appreciate the open-mindedness here. The blind sim community is small — but incredibly dedicated.

What's great is that **none of this requires changes to the DCS engine itself**. Everything I'm suggesting can be done using **existing public interfaces** like `DCS-BIOS` and `Export.lua`. We're not asking ED to rewrite core systems or overhaul the sim — just to help expose cockpit data in a way that tools can turn into audio or tactile output.

If I were a developer myself, I would've already built a full accessibility layer on top of DCS-BIOS. But my own background is more in **Linux systems, Proxmox, and backend admin work** — so I rely on collaboration with devs who know these tools inside out. Still, I'm ready to test, document, and support any progress in this area.

The potential is all there — just waiting to be unlocked.