What does the vendor-specific OUI type in 802.11 beacon frames mean?

I'm working on IEEE 802.11.
In the captured beacon frames I found a vendor-specific OUI type byte coming after the 3-byte OUI. Can you explain what this means?
802.11 Beacon Frame:
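For reference, here is the layout of the vendor-specific information element as I understand it (a sketch in C; the field names are mine). The byte after the 3-byte OUI is a vendor-assigned type that selects which of the vendor's payload formats follows.

    /* Sketch of a vendor-specific IE (element ID 221 / 0xDD) in the
       beacon's tagged parameters; field names are illustrative. */
    #include <stdint.h>

    #pragma pack(push, 1)
    typedef struct {
        uint8_t element_id;  /* 0xDD = vendor specific */
        uint8_t length;      /* number of bytes that follow */
        uint8_t oui[3];      /* IEEE-assigned vendor OUI, e.g. 00:50:F2 */
        uint8_t oui_type;    /* vendor-assigned type byte after the OUI */
        uint8_t payload[];   /* vendor-defined data, (length - 4) bytes */
    } vendor_specific_ie;
    #pragma pack(pop)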

Related

Implement a virtual instrument (MIDI OUT) on STM32, Note-On does not work

I'm implementing a virtual instrument on an STM32 (STM32F103x). As a kind of "hello world", I tried to start with the simplest MIDI message, sending a NOTE-ON message in a loop to see if it works.
I send a MIDI NOTE-ON message every 500 ms, only NOTE-ON messages in a forever loop, but it does not work. Is there any other MIDI message that must be sent to make MIDI work, like some initialization message?
Based on the MIDI 1.0 spec, I finished my code on the STM32, sending MIDI messages via the USART interface.
Then, using a logic analyzer, I confirmed that my MIDI message fully matches the MIDI 1.0 spec; at least I believe it is correct: 31250 baud, 1 start bit, 1 stop bit, one status byte 0x92 followed by two data bytes 0x48 and 0x7F, LSB first.
Below is the NOTE-ON message captured by the logic analyzer on the TX line of the UART interface. I keep sending the same 3 bytes every 500 ms.
The total time for those 3 bytes is 960 microseconds, which also matches the value given in the MIDI 1.0 spec.
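For reference, a minimal sketch of my sending loop (assuming the STM32 HAL, with huart1 already configured for 31250 baud, 8 data bits, no parity, 1 stop bit):

    #include "stm32f1xx_hal.h"

    extern UART_HandleTypeDef huart1;  /* USART set up for 31250 baud, 8N1 */

    void midi_note_on_loop(void)
    {
        /* Note-On, channel 3 (status 0x92), note 0x48, velocity 0x7F */
        uint8_t note_on[3] = { 0x92, 0x48, 0x7F };

        for (;;) {
            HAL_UART_Transmit(&huart1, note_on, sizeof note_on, HAL_MAX_DELAY);
            HAL_Delay(500);  /* one message every 500 ms */
        }
    }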
Then, based on "(CA-033) MIDI 1.0 Electrical Specification Update [2014]", I bought a 5-pin MIDI OUT jack, connected the UART TX line to pin 5 through a 10 Ohm resistor, connected 3.3 V power to pin 4 through a 33 Ohm resistor, and connected pin 2 and the jack shield to GND. The other optional parts (buffer for Rc, ferrite beads) are not used.
However, when I use a USB-MIDI cable to connect my MIDI OUT jack to my PC, nothing happens. The USB-MIDI cable has two LED indicators, one for power and another for MIDI signal. After connecting, the power LED lights up, but the MIDI signal LED never does.
I also used the logic analyzer to check pin 5 of my MIDI OUT jack: the MIDI message bytes are exactly the same as on the UART TX line, with one NOTE-ON message every 500 ms. However, it never works.
I also tried to build my own PCB with a Dream SAM2695, following the SAM2695 evaluation boards, and connected the UART TX directly to the MIDI_IN pin of the SAM2695; there was still no response. Since I have no confidence in my hand-soldered PCB, I am not sure whether the PCB itself has an issue that causes the lack of response. That is why I bought the USB-MIDI cable, but the result, as mentioned above, is still no response.
======= Apr 26, 2021 Update =======
Based on the comments, I have tried to check the output of the optocoupler.
Before doing this check, I bought several BSS138s to shift the MIDI signal to 5 V using the circuit below (and of course changed the resistors near the MIDI jack pins to 220 Ohm, per the spec).
After this change I measured the voltage of the 5 V UART TX; it shows 4.8+ V, and the logic analyzer shows a correct MIDI NOTE-ON message on this new 5 V MIDI signal. However, it still does not work.
The only troubleshooting step left for me is to measure the output of the optocoupler in this USB-MIDI cable. But I did not find an optocoupler on the cable's PCB; there is not even a single component with 4 pins on it (to my understanding, an optocoupler needs at least 4 pins).
There is only one main chip on the PCB. Is it possible that the optocoupler is embedded in that chip? The pins are too small, so I failed to connect my logic analyzer to them to check.

Profibus synchronisation using Linux (Raspberry Pi)

I am planning to develop a simple Profibus master (FDL level) in Linux, more specifically on a Raspberry Pi. I have an RS485 transceiver based on a MAX481. The master must work on a bus where there are multiple masters.
According to the Profibus specification, you must count the number of '1' bits on the bus to determine when it is time to rotate the access token; specifically, after 11 '1' bits the next frame starts. 11 bits is also exactly one character frame.
In Linux, how can I detect these 11 '1' bits? They won't be registered by the driver as there is no start bit. So I need a stream of bits, instead of decoded bytes.
What would be the best approach?
Unfortunately, making use of a microcontroller/microprocessor UART is a BAD choice.
You can generate the 11-bit character by setting START_BIT, STOP_BIT, and PARITY_BIT (even) in your microcontroller's UART peripheral (see the termios sketch after the suggestions below for what this character format looks like on Linux). Maybe you will be lucky and receive whole bytes from a datagram without losses.
However, a PROFIBUS DP datagram is up to 244 bytes, and PROFIBUS DP requires NO idle bits between bytes during datagram transmission. You would need UART hardware or a UART microcontroller peripheral with a FIFO or register that supports up to 244 bytes, which is very uncommon, since this requirement is specific to PROFIBUS.
Another aspect is the compatibility of baud rates. Usually, the whole range of PROFIBUS DP baud rates is not fully available on common microcontroller UARTs.
My suggestions:
Implement this UART part in an FPGA and interface it with the Raspberry Pi using e.g. SPI. You can decide how much of the PROFIBUS stack to 'outsource' to the FPGA and which part to keep on the RPi.
Use an ASIC (maybe the ASPC2, though it is outdated) and add a compatible processor to implement the deterministic portion of the stack. Later you can interface this processor with your RPi.
Implement it using a processor dedicated to industrial communication (like the TI Sitara AM335x).
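For reference, the 11-bit PROFIBUS character itself (1 start + 8 data + even parity + 1 stop) maps onto a standard Linux termios configuration, as in this sketch (the device path and the 19200 baud rate are assumptions; note that this only gets you framed characters, not the raw bit stream or idle-time detection discussed above):

    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    /* Open a serial port with the PROFIBUS character format:
       8 data bits, even parity, 1 stop bit. */
    int open_profibus_port(const char *dev)  /* e.g. "/dev/serial0" (assumed) */
    {
        int fd = open(dev, O_RDWR | O_NOCTTY);
        if (fd < 0)
            return -1;

        struct termios tio;
        if (tcgetattr(fd, &tio) < 0) {
            close(fd);
            return -1;
        }
        cfmakeraw(&tio);                             /* raw bytes, no line discipline */
        tio.c_cflag = (tio.c_cflag & ~CSIZE) | CS8;  /* 8 data bits */
        tio.c_cflag |= PARENB;                       /* parity enabled ... */
        tio.c_cflag &= ~PARODD;                      /* ... and even */
        tio.c_cflag &= ~CSTOPB;                      /* 1 stop bit */
        tio.c_iflag |= INPCK;                        /* check received parity */
        cfsetispeed(&tio, B19200);                   /* a standard PROFIBUS rate */
        cfsetospeed(&tio, B19200);
        if (tcsetattr(fd, TCSANOW, &tio) < 0) {
            close(fd);
            return -1;
        }
        return fd;
    }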

How to sync microcontroller to MIDI controller output

I am looking to receive MIDI messages to control a microcontroller-based synthesizer, and I am working on understanding the MIDI protocol so I may implement a MIDI handler. I've read that MIDI is transmitted at 31.25 kbaud without a dedicated clock line. Must I sample the line at 31.25 kHz with the microcontroller in order to receive MIDI bytes?
The MIDI specification says:
The hardware MIDI interface operates at 31.25 (+/- 1%) Kbaud, asynchronous, with a start bit, 8 data bits (D0 to D7), and a stop bit. […] Bytes are sent LSB first.
This describes a standard UART protocol; you can simply use the UART hardware that most microcontrollers have built in. (The baud rate of 31250 Hz was chosen because it is easily derived from a 1 MHz clock, or a multiple of it.)
If you really wanted to implement the receiver in software, you would sample the input signal at a higher rate to be able to reliably detect the level in the middle of each bit; for details, see What exactly is the start bit error in UART? and How does UART know the difference between data bits and start/stop bits?
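For the common case of using the built-in UART, receiving and parsing a Note-On might look like this sketch (get_uart_byte() and handle_note_on() are hypothetical stand-ins for your UART driver and synthesizer code; running status and other message types are ignored here):

    #include <stdint.h>

    /* Hypothetical blocking read from a UART configured for 31250 baud, 8N1. */
    extern uint8_t get_uart_byte(void);

    /* Hypothetical handler called with a complete Note-On message. */
    extern void handle_note_on(uint8_t channel, uint8_t note, uint8_t velocity);

    void midi_receive_loop(void)
    {
        for (;;) {
            uint8_t status = get_uart_byte();
            if ((status & 0xF0) == 0x90) {           /* Note-On, any channel */
                uint8_t note     = get_uart_byte();  /* data bytes have MSB = 0 */
                uint8_t velocity = get_uart_byte();
                handle_note_on(status & 0x0F, note, velocity);
            }
            /* other status bytes are ignored in this sketch */
        }
    }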

How to use Opus DTX with the open-source opus_demo binary

I want help with Opus DTX:
1. How does Opus DTX work?
2. How do the Opus encoder and decoder work with DTX?
3. Is there any bit representation for DTX in Opus?
The RFC says that when DTX is enabled, only one frame is encoded every 400 milliseconds, but that didn't work for me with the opus_demo binary.
Please help with Opus DTX.
When using Opus over a network, using a protocol such as RTP where the packets are timestamped, DTX may be enabled if you want to reduce the packets sent during periods where there is no voice activity. A packet would still be sent about every 400 ms, updating background noise. Using the packet timestamps the receiver can determine the duration of any gaps and fill them in with the background noise to keep it sounding natural.
Enable DTX in the encoder using opus_encoder_ctl(enc, OPUS_SET_DTX(1));, or with the -dtx option on opus_demo. Then, simply do not send any packets produced by the encoder with a length of 2 bytes or less. (These "DTX packets" are just zero-length frames, with a normal 1- or 2-byte packet header (TOC), and do not contain any audio data.) Packets with a length larger than 2 bytes should be sent as usual.
The receiver should use normal packet loss concealment to handle missing packets, in the same manner as it would handle packet loss. In particular it can call opus_decode() with data = NULL, len = 0, and frame_size equal to the size of the missing frame(s), and the decoder will generate audio data to conceal the missing frame(s).
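To make that concrete, a sketch of both sides using the libopus calls mentioned above (send_rtp_packet() is a hypothetical stand-in for your transport, and the 20 ms frame size at 48 kHz is an assumption):

    #include <opus/opus.h>

    #define FRAME_SIZE 960  /* 20 ms at 48 kHz (an assumed frame size) */

    /* Hypothetical transport function standing in for your RTP stack. */
    extern void send_rtp_packet(const unsigned char *data, opus_int32 len);

    /* Send side: with DTX enabled via opus_encoder_ctl(enc, OPUS_SET_DTX(1)),
       drop the "DTX packets" of 2 bytes or less instead of sending them. */
    void encode_and_send(OpusEncoder *enc, const opus_int16 *pcm)
    {
        unsigned char packet[4000];
        opus_int32 len = opus_encode(enc, pcm, FRAME_SIZE, packet, sizeof packet);
        if (len > 2)
            send_rtp_packet(packet, len);
        /* len <= 2: header-only DTX packet, nothing is sent */
    }

    /* Receive side: conceal a gap of n_frames missing frames. */
    void conceal_gap(OpusDecoder *dec, opus_int16 *pcm_out, int n_frames)
    {
        for (int i = 0; i < n_frames; i++)
            opus_decode(dec, NULL, 0, pcm_out + i * FRAME_SIZE, FRAME_SIZE, 0);
    }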

Is there an Ethernet header in IEEE 802.11?

I have been capturing some packets over Wi-Fi using Wireshark for analysis, capturing IEEE 802.11 frames on an interface in monitor mode. If I capture a packet on an open network without encryption, I cannot see any Ethernet headers. However, if I capture the same packets on a usual interface (not in monitor mode), then I can see Ethernet headers. I was not able to decrypt WPA packets captured in monitor mode for further analysis. So is there actually an Ethernet layer when an IEEE 802.11 packet is transmitted? Or is it added by the driver before being delivered to applications listening on the upper layers?
Here is a packet missing the Ethernet layer.
Ethernet is defined by IEEE 802.3, not IEEE 802.11 (Wi-Fi), so, no, there is no Ethernet header in 802.11 frames; they are different network types, and IEEE 802.11 has its own frame format and headers. It is the same with any of the IEEE 802.x LANs; for instance, IEEE 802.5 (Token Ring) has a different frame and header format, too. When you capture on a regular (managed-mode) interface, the driver has already translated each 802.11 data frame into an Ethernet-style frame before handing it to the network stack, which is why you see Ethernet headers there.
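To make the difference concrete, a sketch of the two header layouts (simplified; the 802.11 header shown is the common three-address data-frame form, and QoS or four-address variants are longer):

    #include <stdint.h>

    #pragma pack(push, 1)
    /* Ethernet II header: 14 bytes. */
    typedef struct {
        uint8_t  dst[6];
        uint8_t  src[6];
        uint16_t ethertype;
    } ethernet_hdr;

    /* IEEE 802.11 data-frame MAC header (three-address form): 24 bytes.
       The payload then starts with an LLC/SNAP header carrying the
       EtherType, which the driver uses to rebuild an Ethernet-style
       frame for the network stack. */
    typedef struct {
        uint16_t frame_control;
        uint16_t duration_id;
        uint8_t  addr1[6];   /* receiver */
        uint8_t  addr2[6];   /* transmitter */
        uint8_t  addr3[6];   /* meaning depends on the To-DS/From-DS bits */
        uint16_t seq_ctrl;
    } ieee80211_hdr;
    #pragma pack(pop)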