Tekkx Posted December 4, 2016

Hello, dear community. Today I made my first little steps with a Teensy 3.1; I ordered two of them a few weeks ago. After solving a few starter problems (typical, as I see on this forum) I got the thing up and running. I expect more performance compared to a "common" Arduino, and the "uncounted" available I/Os make it very appetizing :) My hope is to (partially) solve a display problem. Just for fun I uploaded a random DCS-BIOS slave sketch (one that works on an UNO) and got countless error messages. So my initial questions are:

- Is it possible, for a noob like me, to adapt DCS-BIOS without diving too deep into the libraries?
- Will this become a bottomless pit?
- Is it worth giving it a try?

With respect, Tekkx
FSFIan Posted December 5, 2016 (edited)

It should work in DCSBIOS_DEFAULT_SERIAL mode (make sure to use v0.2.6 or later of the Arduino library; earlier versions had a nasty bug that affected DEFAULT_SERIAL mode).

I don't think I have properly explained the trade-offs of DEFAULT_SERIAL vs. IRQ_SERIAL anywhere, so I'll do it here by explaining the reason IRQ_SERIAL was introduced.

In the beginning, there was only DEFAULT_SERIAL mode. In that mode, all communication is done using Serial.read() and Serial.write(). The control flow inside loop() looked like the following pseudocode:

(1) while (Serial.available()) { process(Serial.read()); }
(2) for each input: if it has changed, use Serial.write() to send a command to the PC

Outputs would be updated inside that process() call as soon as new information for them had been received.

To understand the problems with this approach, it is important to understand what happens when a new byte of data comes in from the PC. Incoming data is put into a receive buffer. In step (1), the data in that buffer is processed until the buffer is empty. By default, that buffer is only 64 bytes long on devices with 2K of RAM. That means that if more than 64 bytes arrive during step (2), or while outputs are updated during step (1), all hell breaks loose because incoming data is dropped on the floor.

DCS-BIOS will try to send 30 updates per second, sending every piece of information that has changed since the last update. For the A-10C, each update results in a data burst of about 120 bytes; the very first data burst is about 700 bytes long.

Step (2) usually executes very fast, as it only involves a few digitalRead() calls. Sometimes a short message has to be sent to the PC, but that's fine as well -- in the time it takes to send, say, 20 bytes to the PC, at most 20 bytes can be received, and those fit in the receive buffer. So unless there are lots of inputs that need to be checked, or there is a high likelihood that two or three inputs have changed at the same time (*cough* jittery potentiometers *cough*), even a 64-byte buffer is enough to handle step (2).

For most people, the problems started when they had outputs that took a long time to update because their drivers deliberately included calls to delayMicroseconds() to wait for the hardware to catch up. The usual culprits were (multiple) displays using the LiquidCrystal library.

So how can we fix that problem? One solution would be a receive buffer that is large enough to hold the largest expected update. But a 700-byte receive buffer is not a good idea on a device with only 2048 bytes of RAM total.

Another solution (the one implemented in IRQ_SERIAL mode) is to implement the serial communication ourselves. When the program is interrupted because a new byte of data has arrived, that byte is not only put into a receive buffer; everything in that buffer is processed before control is handed back to the main program. "Processing" also no longer means "write the data to a display"; it only means "ignore all data that no output is interested in; store every interesting piece of info and update any connected LEDs, displays, etc. later". IRQ_SERIAL mode can also send and receive data at the same time.

On the Teensy 3.1, you have 64K of RAM, so if you can figure out how to increase the size of the receive buffer of the standard Serial library to, say, 1024 bytes, you should be fine.
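To make the pseudocode above concrete, here is a minimal sketch of the DEFAULT_SERIAL control flow in plain Arduino C++. process() and pollInputs() are illustrative placeholders, not the actual library internals, and 250000 is the baud rate DCS-BIOS uses by default:

void setup() {
  Serial.begin(250000);  // DCS-BIOS default baud rate
}

// Placeholder: parse one byte of the export stream and update outputs.
// If this ends up calling a slow display driver (LiquidCrystal etc.),
// the 64-byte RX buffer can overflow while we are stuck in here.
void process(char c) {
  // ... parser logic ...
}

// Placeholder for step (2): scan inputs and report changes to the PC.
void pollInputs() {
  // e.g. if (digitalRead(pin) != lastState) { Serial.write(...); }
}

void loop() {
  // Step (1): drain the receive buffer.
  while (Serial.available()) {
    process(Serial.read());
  }
  // Step (2): must finish before more than 64 new bytes arrive,
  // which is roughly 2.5 ms at 250000 baud.
  pollInputs();
}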
Or you could implement IRQ_SERIAL mode for the Teensy's ARM processor (instantiate a ProtocolParser, set up the UART; when a byte is received, pass it to parser.processCharISR() from your interrupt service routine; implement DcsBios::sendDcsBiosMessage() to send data to the PC). Increasing the RX buffer size should be easy enough (maybe the Teensy even uses a larger one by default?). Implementing IRQ_SERIAL would involve delving into the datasheet of whatever ARM CPU the Teensy uses (but you could look at the code of the HardwareSerial implementation on the Teensy to get an idea of how to do it).
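A very rough outline of that approach might look like the sketch below. It makes several assumptions that would need to be verified against the library source: that ProtocolParser and processCharISR() compile and work standalone on ARM, that sendDcsBiosMessage() takes a message/argument pair and terminates with a newline, and that Teensyduino's serialEvent1() hook is an acceptable stand-in for a true UART ISR (it is called from yield() rather than interrupt context, so a real port would hook the UART interrupt directly):

#include "DcsBios.h"  // assumes the parser classes build on the Teensy's ARM core

DcsBios::ProtocolParser parser;

void setup() {
  Serial1.begin(250000);  // DCS-BIOS default baud rate
}

// Called by the Teensyduino core when data has arrived on Serial1.
void serialEvent1() {
  while (Serial1.available()) {
    parser.processCharISR(Serial1.read());
  }
}

namespace DcsBios {
  // Assumed contract: send "message argument\n" back to the PC.
  void sendDcsBiosMessage(const char* msg, const char* arg) {
    Serial1.write(msg);
    Serial1.write(' ');
    Serial1.write(arg);
    Serial1.write('\n');
  }
}

void loop() {
  // Poll inputs here, e.g. DcsBios::PollingInput::pollInputs()
  // (name taken from the AVR implementation; verify on ARM).
}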
WhoMadeWho Posted December 7, 2016

DEFAULT_SERIAL vs. IRQ_SERIAL

[FSF]Ian said:
It should work in DCSBIOS_DEFAULT_SERIAL mode (make sure to use v0.2.6 or later of the Arduino library; earlier versions had a nasty bug that affected DEFAULT_SERIAL mode). (...) On the Teensy 3.1, you have 64K of RAM, so if you can figure out how to increase the size of the receive buffer of the standard Serial library to, say, 1024 bytes, you should be fine. Or you could implement IRQ_SERIAL mode for the Teensy's ARM processor (...).
This is good info on IRQ_SERIAL! Thanks for sharing.

Somewhat related: any issue with multiple Arduino or similar devices running at the same time? For example, my cockpit build will utilize 40+ Arduino Nanos, each on its own COM port with a dedicated instance of socat.exe hooked to each COM port. I've tested 15 Nanos simultaneously and didn't note any issues at all (performance or otherwise). Thanks!!
FSFIan Posted December 7, 2016 (edited)

WhoMadeWho said:
For example, my cockpit build will utilize 40+ Arduino Nanos, each on its own COM port with a dedicated instance of socat.exe hooked to each COM port. I've tested 15 Nanos simultaneously and didn't note any issues at all (performance or otherwise).

Performance-wise, I see no reason why this shouldn't work. However, you will likely run into some limitations with that many USB devices. bnepethomas has mentioned before that he ran into problems with 'only' 20 devices and had to install another USB controller card.

Handling 40 concurrent instances of socat is also very, very annoying. This will eventually get less annoying when I get around to writing DCS-BIOS 2.0, but you will still have to configure a list of 40 COM ports to use, which might change from time to time just because Windows feels like it (i.e. it thinks the USB-to-serial converter is plugged in somewhere else, doesn't recognize it because it does not have a unique serial number, and then assigns it a new COM port number). Before I switched to Windows 10, the COM port numbers that my Windows 7 install assigned to the three Arduino boards I used for testing had gotten up to COM 46...

The official solution to that is to use an RS-485 bus. That way, you will only need one or two COM ports to talk to the RS-485 bus masters, and maybe a few more if you have panels that cannot use the RS-485 bus for some reason (for example, because you used an Arduino-compatible board with a fast ARM CPU, where IRQ_SERIAL mode is not supported, to drive a graphic display for the CDU). The RS-485 code is in DCS-BIOS, although there is no official documentation yet. Several people have gotten it to work, but the whole thing is still in beta status and we don't have a good set of troubleshooting steps yet.

Switching a sketch from IRQ_SERIAL mode to RS-485 is easy, as long as you have one free digital I/O pin to use for the TX_ENABLE signal; see the sketch below.
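For reference, a minimal RS-485 slave sketch with the current library looks something like the following. The slave address and pin number are example values, and since the RS-485 support is still in beta as described above, check the example sketches shipped with the library for the authoritative version:

#define DCSBIOS_RS485_SLAVE 1   // this device's address on the RS-485 bus (example)
#define TXENABLE_PIN 2          // free digital pin wired to the transceiver's
                                // driver-enable (TX_ENABLE) input (example)
#include "DcsBios.h"

// Declare switches, LEDs, etc. here exactly as in an IRQ_SERIAL sketch;
// only the defines above change.

void setup() {
  DcsBios::setup();
}

void loop() {
  DcsBios::loop();
}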