How to sync microcontroller to MIDI controller output

I am looking to receive MIDI messages to control a microcontroller based synthesizer and I am working on understanding the MIDI protocol so I may implement a MIDI handler. I've read MIDI is transmitted at 31.25kHz without a dedicated clock line - must I sample the line at 31.25kHz with the microcontroller in order to receive MIDI bytes?

The MIDI specification says:
The hardware MIDI interface operates at 31.25 (+/- 1%) Kbaud, asynchronous, with a start bit, 8 data bits (D0 to D7), and a stop bit. […] Bytes are sent LSB first.
This describes a standard UART protocol; you can simply use the UART hardware that most microcontrollers have built in. (The baud rate of 31250 was chosen because it is easily derived from a 1 MHz clock, or a multiple of it.)
If you really wanted to implement the receiver in software, you would sample the input signal at a higher rate to be able to reliably detect the level in the middle of each bit; for details, see What exactly is the start bit error in UART? and How does UART know the difference between data bits and start/stop bits?
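Once the UART delivers complete bytes, the MIDI handler only has to group them into messages (a status byte has its MSB set; Note-On/Off messages carry two data bytes). A minimal parser sketch, with names of my own choosing:

```c
#include <stdint.h>
#include <stddef.h>

/* Minimal MIDI parser sketch: collects a status byte plus its two data
 * bytes for channel-voice messages such as Note-On (0x9n) and
 * Note-Off (0x8n). Status bytes have the MSB set; data bytes do not. */
typedef struct {
    uint8_t status;   /* last status byte seen (0 = none yet) */
    uint8_t data[2];  /* data bytes collected so far */
    size_t  count;    /* how many data bytes collected */
} midi_parser;

/* Feed one received byte; returns 1 when a complete 3-byte message is
 * available in status/data, 0 otherwise. */
int midi_feed(midi_parser *p, uint8_t byte)
{
    if (byte & 0x80) {            /* status byte: start a new message */
        p->status = byte;
        p->count = 0;
        return 0;
    }
    if (p->status == 0)           /* data byte with no status: ignore */
        return 0;
    p->data[p->count++] = byte;
    if (p->count == 2) {          /* Note-On/Off carry two data bytes */
        p->count = 0;             /* running status: keep p->status */
        return 1;
    }
    return 0;
}
```

Feeding it 0x90, 0x3C, 0x7F would yield one complete Note-On message on the third byte; thanks to running status, further pairs of data bytes reuse the last status byte.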

Related

Implement a virtual Instrument (MIDI OUT) on STM32, Note-On not works

I'm implementing a virtual instrument on an STM32 (STM32F103x). As a MIDI "hello world", I tried to start with the simplest MIDI message: sending a NOTE-ON message in a loop to see if it works.
I send a MIDI NOTE-ON message every 500 ms, only the NOTE-ON message in an endless loop, but it doesn't work. Is there any other MIDI message that must be sent to make MIDI work, like some initialisation message?
Based on the MIDI 1.0 spec, I finished my code on the STM32, sending the MIDI message via the USART interface.
Then, using a logic analyzer, I confirmed that my MIDI message fully matches the MIDI 1.0 spec. At least I believe it is correct, with no issues: 31250 baud, 1 start bit, 1 stop bit, one status byte 0x92 followed by two data bytes 0x48 and 0x7F, LSB first.
Below is the NOTE-ON message captured by the logic analyzer on the TX line of the UART interface. I keep sending the same 3 bytes every 500 ms.
The total time for those 3 bytes is 960 microseconds, which also matches the value mentioned in the MIDI 1.0 spec.
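As a sanity check on that figure: at 31250 baud each bit lasts 32 µs, and a UART frame is 10 bits (start + 8 data + stop), so three bytes take 3 × 320 µs = 960 µs. A minimal sketch:

```c
#include <stdint.h>

/* Time on the wire for n MIDI bytes: each byte is a 10-bit UART frame
 * (1 start + 8 data + 1 stop) at 31250 baud, i.e. 32 us per bit. */
static uint32_t midi_bytes_time_us(uint32_t nbytes)
{
    const uint32_t bits_per_byte = 10;             /* start + 8 data + stop */
    const uint32_t bit_time_us = 1000000 / 31250;  /* = 32 us exactly */
    return nbytes * bits_per_byte * bit_time_us;
}
```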
Then, based on "(CA-033) MIDI 1.0 Electrical Specification Update [2014]", I bought a 5-pin MIDI OUT jack, connected the UART TX line to pin 5 through a 10 Ω resistor, connected 3.3 V power to pin 4 through a 33 Ω resistor, and connected pin 2 and the jack shield to GND. The other optional parts (buffer for Rc, ferrite beads) are not used.
However, when I use a USB-MIDI cable to connect my MIDI OUT jack to my PC, nothing happens. The USB-MIDI cable has two LED indicators, one for power and another for MIDI signal. After connecting, the power LED lights up, but the MIDI signal LED never does.
I also used the logic analyzer on pin 5 of my MIDI OUT jack: the MIDI message bytes are exactly the same as on the UART TX line, with one NOTE-ON message every 500 ms. However, it never works.
I also tried to build my own PCB with a Dream SAM2695, following the SAM2695 evaluation board design, and connected the UART TX directly to the MIDI_IN pin of the SAM2695; still no response. Since I have little confidence in my hand-soldered PCB, I'm not sure whether the PCB itself has an issue that causes the lack of response. So I bought a USB-MIDI cable, but as mentioned above, still no response.
======= Apr.26.2021 Update =========
Based on the comments, I have tried to check the output of the optocoupler.
Before doing this check, I bought several BSS138s to level-shift the MIDI signal to 5 V using the circuit below (and of course changed the resistors near the MIDI jack pins to 220 Ω per the spec).
After this change I measured the voltage of the 5 V UART TX, which shows 4.8+ V, and the logic analyzer shows the correct MIDI NOTE-ON message on this new 5 V MIDI signal. However, it still doesn't work.
The only troubleshooting step left for me is to measure the output of the optocoupler in this USB-MIDI cable. But I couldn't find an optocoupler on the cable's PCB. There isn't even a single component with 4 pins on the PCB (to my understanding, an optocoupler needs at least 4 pins).
There is only one main chip on the PCB. Is it possible that the optocoupler is embedded in that chip? The pins are too small, so I failed to connect my logic analyzer to them to check.

Profibus synchronisation using Linux (Raspberry Pi)

I am planning to develop a simple Profibus master (FDL level) in Linux, more specifically on a Raspberry Pi. I have an RS485 transceiver based on a MAX 481. The master must work on a bus where there are multiple masters.
According to the Profibus specification, you must count the number of '1' bits on the bus to determine when it is time to rotate the access token. Specifically, after 11 consecutive '1' bits the next frame starts; 11 bits is also exactly one character frame.
In Linux, how can I detect these 11 '1' bits? They won't be registered by the driver as there is no start bit. So I need a stream of bits, instead of decoded bytes.
What would be the best approach?
Unfortunately, making use of microcontroller/microprocessor UART is a BAD choice.
You can get the 11-bit characters by setting START_BIT, STOP_BIT, and PARITY_BIT (even) in your microcontroller's UART peripheral. Maybe you will be lucky and receive the whole bytes of a datagram without losses.
However, a PROFIBUS DP datagram is up to 244 bytes, and PROFIBUS DP requires NO idle bits between bytes during datagram transmission. You need UART hardware or a UART microcontroller peripheral with a FIFO or register that supports up to 244 bytes, which is very uncommon, since this requirement is specific to PROFIBUS.
Another aspect is the compatibility of baud rates. Usually, the full range of PROFIBUS DP baud rates is not available on common microcontroller UARTs.
My suggestions:
Implement this UART part on an FPGA and interface it with the Raspberry Pi using e.g. SPI. You can decide how much of the PROFIBUS stack to 'outsource' to the FPGA and which part to keep on the RPi.
Use an ASIC (maybe the ASPC2, although it is outdated) and add a compatible processor to implement the deterministic portion of the stack. You can then interface this processor with your RPi.
Implement it on a processor dedicated to industrial communication (like the TI Sitara AM335x).
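For reference, the 11-bit PROFIBUS UART character discussed above (start bit 0, 8 data bits LSB first, even parity, stop bit 1) can be sketched as follows; the function name is mine:

```c
#include <stdint.h>

/* Build one 11-bit PROFIBUS UART character: start(0), 8 data bits LSB
 * first, even parity, stop(1). Bit 0 of the result is sent first, so
 * the start bit occupies bit 0 and the stop bit occupies bit 10. */
static uint16_t profibus_char(uint8_t data)
{
    uint8_t parity = 0;
    for (int i = 0; i < 8; i++)
        parity ^= (data >> i) & 1;     /* even parity over the data bits */

    uint16_t frame = 0;                /* start bit = 0 (bit 0) */
    frame |= (uint16_t)data << 1;      /* data bits, LSB first (bits 1..8) */
    frame |= (uint16_t)parity << 9;    /* parity bit (bit 9) */
    frame |= (uint16_t)1 << 10;        /* stop bit = 1 (bit 10) */
    return frame;
}
```

An idle line, by contrast, is a run of consecutive '1' bits, which is what the 11-'1'-bits sync detection in the question looks for.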

The relationship between RGMII to MDI in Ethernet communication

Let's say I am talking to a PHY chip via RGMII.
What is the relationship between the serial information transmitted on the RGMII to the signals that go out to the MDI?
I understood from the RGMII timing diagram that 4 bits are transferred on the rising edge and 4 bits on the falling edge. So each clock cycle carries 8 bits.
For 100Mbps, the clock required is 25MHz. So for every 25MHz clock cycle, 8 bits are transmitted.
Does the PHY chip simply send each 8 bits over the MDI immediately?
If that is the case, then how do I correctly package these serial 8 bits of data into a proper ethernet frame?
I am trying to troubleshoot a piece of hardware where the PHY does not work properly, but the only way to troubleshoot it is if I can control the RGMII. However, I do not understand the relationship between the RGMII and how it affects the MDI.
I presume that if I look at wireshark, it will not show any packets of information unless I send a string of serialized data in a proper Ethernet frame.
The PHY should have some documentation and sample code. Without it, finding out exactly how it works can be a very tedious task.
You can find the general RGMII description here: https://web.archive.org/web/20160303212629/http://www.hp.com/rnd/pdfs/RGMIIv1_3.pdf
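Regarding packaging the bits into a proper Ethernet frame: broadly, the MAC side of the RGMII supplies the complete frame (preamble/SFD, destination and source MAC addresses, EtherType, payload, and the CRC-32 FCS), and the PHY only encodes those bits onto the medium. A minimal sketch of the FCS computation, the reflected CRC-32 with polynomial 0xEDB88320, initial value and final XOR 0xFFFFFFFF, as used for the Ethernet FCS:

```c
#include <stdint.h>
#include <stddef.h>

/* Ethernet FCS: reflected (LSB-first) CRC-32, polynomial 0xEDB88320,
 * initial value 0xFFFFFFFF, final XOR 0xFFFFFFFF. */
static uint32_t eth_crc32(const uint8_t *data, size_t len)
{
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            /* shift right; XOR in the polynomial when the LSB was 1 */
            crc = (crc >> 1) ^ (0xEDB88320u & (uint32_t)-(int32_t)(crc & 1));
    }
    return crc ^ 0xFFFFFFFFu;
}
```

The standard CRC-32 check value applies: the bytes "123456789" hash to 0xCBF43926, which is a quick way to validate an implementation. Wireshark will only show a packet once all of these frame fields are present and the FCS is correct.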

How can computers receive data quickly?

In computer networks, we keep trying to increase the transmission speed of data. Since data is nothing but electrical signals, how can these electrical signals be converted into bits so quickly? If this conversion is done by an ADC/DAC, and we can't control the computation speed of the ADC, how can we translate the electrical signals to bits so quickly? Also, is this ADC integrated into our computer chipset?
Does this mean that every peripheral has an ADC? For example, does a NIC have an ADC? Is the information carried in LAN cables like CAT 5 and CAT 6 analog in nature?
You clock bits in by detecting a rising edge on a signal wired to one of the pins of your chip. The high level then lasts for a certain period of time, a small fraction of a millisecond. There's a bit of tolerance so sender and receiver don't have to be exactly synchronised. The chip then transfers the bit to a buffer in very low-level code. When it has a byte, slightly higher-level code transfers the byte to another buffer; the next level up is user level, where we see a stream of input bytes.
Whilst the wire is of course analogue, that is not analogue to digital conversion. Analogue to digital conversion is where we measure the signal, quantise it, then create a binary representation in place value notation.
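The distinction can be sketched in code: digital reception needs only one threshold comparison per bit, while analogue-to-digital conversion quantises the voltage into many levels. The voltage, reference, and threshold values below are purely illustrative:

```c
/* Digital receive: one comparison per bit -- no ADC involved. */
static int slice_bit(double volts, double threshold)
{
    return volts > threshold;   /* threshold sits between the two logic levels */
}

/* ADC: quantise a voltage in [0, vref) to an n-bit code. */
static int adc_quantise(double volts, double vref, int nbits)
{
    int levels = 1 << nbits;
    int code = (int)(volts / vref * levels);
    if (code < 0) code = 0;                 /* clamp out-of-range inputs */
    if (code >= levels) code = levels - 1;
    return code;
}
```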

PC receives wrong data using high baud rate of USART

I wanted to use the 4 Mb baud rate of the STM32F103 USART. How can I check that the data received on the PC is correct? I used HyperTerminal, but its settings offer no 4 Mb baud rate, and when I run my code I receive wrong characters. At low baud rates like 115200, the data is received correctly.
If the transmitter and receiver are not sending at the same speed, the receiver will erroneously read the data. Each byte has a start bit that synchronizes the receiver, and the remaining bits are determined by time.
Typical PC RS-232 serial ports only go up to 115200 bps. It is likely that your PC can't handle the 4 Mbps rate. I would recommend using 115200 or lower speed.
If you are communicating between devices and need higher speed, and are just using the PC to debug, you could lower the speed for debug purposes and set it faster once your comms are working. Alternatively, you could use a logic analyzer. This could be tedious to do manually, but some have functions to decode serial data.
If you have two of the STM32F103 modules, connect them using the USART at 4 Mb, and then send a block of data with a checksum (or even a hardcoded block that you can compare). On the receiving unit, either confirm the checksum or compare the data to see if the link works.
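The loopback test described above can be sketched like this; the simple additive checksum here is my own choice (a CRC would work just as well):

```c
#include <stdint.h>
#include <stddef.h>

/* Sender side: 8-bit two's-complement checksum, so that the data bytes
 * plus the trailing checksum byte sum to zero modulo 256. */
static uint8_t block_checksum(const uint8_t *buf, size_t len)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += buf[i];
    return (uint8_t)(~sum + 1);
}

/* Receiver side: returns 1 if the block (data + trailing checksum
 * byte) arrived intact, 0 otherwise. */
static int block_ok(const uint8_t *buf, size_t len)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += buf[i];
    return sum == 0;
}
```

The sender appends `block_checksum(data, n)` as byte n + 1; the receiver runs `block_ok` over all n + 1 bytes, and any corruption from a baud-rate mismatch shows up as a failed check.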