How do I send MIDI events to a software synth in C++?

Hi, I'm looking for some advice. I am writing some music composition software, and I have cobbled together tools to read, write, and send MIDI data; they work fine. However, I'm stumped on the following: I'm trying to send MIDI events to a software synthesizer on the computer.
I can control the software synth from an external keyboard, and I can control the keyboard from my own software on the computer. But how do I get my software to send MIDI to the software synth on the same computer?
Also, I'm trying to do this in a platform-independent way if possible.
Thanks!
VMan
My question wasn't clear, so let me add some detail.
I'm currently running on Win7. Cross-platform support is a priority, but not for the first prototype.
Specifically, my problem is with accessing Kontakt Player (v2): it works within its own environment and with MIDI I/O, but I can't access it from within my own software.
midiOutGetNumDevs returns just one device, and it's the Microsoft GS Wavetable Synth.
I'm confused that I can send MIDI to the Kontakt Player via a MIDI/USB cable, yet it doesn't show up as a MIDI device.
What am I missing?
Thanks

Which API/OS are you using? Which SoftSynth?
Short answer: you might try "PortMidi".
http://portmedia.sourceforge.net/
Long(er) answer:
I haven't found any cross-platform MIDI lib able to speak to every kind of MIDI sink. In practice, it depends on what the softsynth uses for receiving MIDI events.
1) On Linux, you can use ALSA to speak to the ALSA synths. A softsynth can register itself as an ALSA sink. You can either:
* use the ALSA library to send MIDI events to this sink;
* or register your application as an ALSA MIDI source and use another program (aconnectgui, qjackctl, patchage) to connect it to any sink.
http://www.alsa-project.org/alsa-doc/alsa-lib/rawmidi.html
http://www.alsa-project.org/alsa-doc/alsa-lib/seq.html
Downside: Linux-specific.
2) You can use JACK for MIDI. As with ALSA MIDI, an application can register MIDI sources and sinks. The softsynth can register as a JACK MIDI sink; you then need to make your app a JACK MIDI source and connect them with another program (qjackctl, patchage).
http://jackaudio.org/files/docs/html/index.html
Downside: requires installing, configuring, and launching JACK.
3) You have two ways to expose the ALSA sinks/sources as JACK sinks/sources:
* either use the built-in functionality of JACK (command-line option -Xseq);
* or use "a2jmidid".
4) On MacOS you can do MIDI with CoreMIDI (part of Core Audio). I don't know anything about it though.
5) On Windows, I guess you use the midi* functions in the multimedia API (mmsystem.h, winmm).
6) Use OSS on some other OSes
7) Communicate with the synth using sockets/protocols.
You could make your software send MIDI events using RTP-MIDI or MIDI-over-UDP so you don't have to care about the driver/OS. Most softsynths don't speak any of those directly, so you need a program to do the bridging (qmidinet or others).
8) PortMidi is a cross-platform lib for MIDI (a minimal sending example is sketched after this list). However, it doesn't seem to be able to use JACK as a backend directly (you can, however, make the ALSA devices available in JACK as explained above).
http://portmedia.sourceforge.net/
For example, on Linux, FluidSynth can use ALSA, OSS, and JACK for MIDI input, while TiMidity++ can use ALSA and the Windows API.
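To make the PortMidi suggestion concrete, here is a minimal C++ sketch that lists the outputs and sends a note through the first one it finds. This is a sketch under assumptions (first output taken blindly, fixed note and velocity), not production code:
#include <portmidi.h>
#include <cstdio>
#include <chrono>
#include <thread>
int main() {
    Pm_Initialize();
    // List every MIDI output PortMidi can see and remember the first one.
    int out_id = -1;
    for (int i = 0; i < Pm_CountDevices(); ++i) {
        const PmDeviceInfo *info = Pm_GetDeviceInfo(i);
        if (info->output) {
            printf("%d: %s (%s)\n", i, info->name, info->interf);
            if (out_id < 0) out_id = i;
        }
    }
    if (out_id < 0) { printf("no MIDI output found\n"); return 1; }
    PortMidiStream *out = nullptr;
    if (Pm_OpenOutput(&out, out_id, nullptr, 0, nullptr, nullptr, 0) == pmNoError) {
        Pm_WriteShort(out, 0, Pm_Message(0x90, 60, 100));   // note-on, middle C
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
        Pm_WriteShort(out, 0, Pm_Message(0x80, 60, 0));     // note-off
        Pm_Close(out);
    }
    Pm_Terminate();
    return 0;
}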

OK, the answer to this problem is to use a virtual MIDI driver. I found a free one here:
http://nerds.de/en/loopbe1.html
This creates a MIDI output device that shows up via midiOutGetNumDevs, that I can stream my MIDI notes to, and that other MIDI software on the same computer can use as a MIDI input device.
Works on Win7, and apparently it also has Mac support. My problem is solved.
"LoopBe1 is an internal MIDI device for transferring MIDI data between computer programs. Basically LoopBe1 is an "invisible cable" to connect a MIDI outport of an application to any other application's MIDI inport."

Agreed that a virtual MIDI driver is the easiest solution.
Another one that used to be very popular (not sure how it performs on Win7, though) is MIDI Yoke. http://www.midiox.com/
If you are looking to do something commercial, you may consider writing your own virtual MIDI driver to make your software controller (the sender) a virtual MIDI source. Only on Windows will this be a lot of work.
On Mac and Linux your job is much easier: as @ysdx said, using CoreMIDI on Mac and ALSA/JACK on Linux, it is very easy to create a virtual MIDI port in your application and connect it to destinations (see the sketch below).
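On Linux that amounts to a few ALSA sequencer calls. A minimal sketch (client and port names are arbitrary) that registers a virtual source, visible in aconnect/qjackctl/patchage, and emits one note to whoever subscribes:
#include <alsa/asoundlib.h>
#include <unistd.h>
int main() {
    snd_seq_t *seq = nullptr;
    if (snd_seq_open(&seq, "default", SND_SEQ_OPEN_OUTPUT, 0) < 0) return 1;
    snd_seq_set_client_name(seq, "My Controller");   // arbitrary client name
    // A readable, subscribable port: this is the "virtual MIDI source".
    int port = snd_seq_create_simple_port(seq, "out",
            SND_SEQ_PORT_CAP_READ | SND_SEQ_PORT_CAP_SUBS_READ,
            SND_SEQ_PORT_TYPE_MIDI_GENERIC | SND_SEQ_PORT_TYPE_APPLICATION);
    if (port < 0) return 1;
    snd_seq_event_t ev;
    snd_seq_ev_clear(&ev);
    snd_seq_ev_set_source(&ev, port);
    snd_seq_ev_set_subs(&ev);                 // deliver to all subscribers
    snd_seq_ev_set_direct(&ev);               // bypass the event queue
    snd_seq_ev_set_noteon(&ev, 0, 60, 100);   // channel 0, middle C
    snd_seq_event_output(seq, &ev);
    snd_seq_drain_output(seq);
    sleep(5);                                 // keep the port alive briefly
    snd_seq_close(seq);
    return 0;
}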

Related

How to flash without ST-LINK

My ST-LINK/V2 is not working anymore; it's no longer detected by Linux, having failed after the first successful flash. I ordered a new one, but it will take 60+ days to arrive. Meanwhile, I have heard on YouTube that you can program Blue Pills directly by connecting a cut-open USB cable to certain pins and then using a jumper. But I cannot find any precise information on this. Is this really possible, and how?
You should use the embedded bootloader; you can flash through several of the chip's interfaces. Look at AN2606; maybe you can find an already-written flasher. STM32CubeProgrammer handles it. Good luck!
If you intend to program it through USB, look also at AN3156. All the protocol documents are referenced in chapter 2 of AN2606.
Those aren't cut-open USB cables; they are USB-to-serial adapters for Arduino's bootloader.
The problem is that this approach requires the Arduino STM32 bootloader to already be flashed onto the board.
Another option is to use STM32CubeProgrammer. This program allows you to program your STM32 over:
* Serial (UART)
* SPI
* I2C
* USB
You'll need to set the BOOT0 and BOOT1 pins to the correct values (HIGH/LOW) so the chip enters the bootloader during boot; on a Blue Pill that means BOOT0 = 1, BOOT1 = 0 for the system bootloader.
Here is a semi-outdated tutorial which covers most of the steps to program an STM32 over serial (the Flash Loader Demonstrator it uses is outdated; use STM32CubeProgrammer instead).
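For the curious, the serial half of that protocol (AN3155) is simple enough to poke at by hand: the bootloader expects 8 data bits, even parity, 1 stop bit, and answers a 0x7F sync byte with an ACK (0x79). A minimal POSIX C++ sketch; the device path and baud rate are assumptions for a typical USB-to-serial adapter:
#include <cstdio>
#include <cstdint>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
int main() {
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY);  // adjust for your adapter
    if (fd < 0) { perror("open"); return 1; }
    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B115200);     // bootloader autobauds from the sync byte
    cfsetospeed(&tio, B115200);
    tio.c_cflag |= PARENB;          // AN3155: 8 data bits, even parity, 1 stop
    tio.c_cflag &= ~(PARODD | CSTOPB);
    tio.c_cc[VMIN] = 0;             // let read() time out instead of blocking
    tio.c_cc[VTIME] = 10;           // 1-second timeout
    tcsetattr(fd, TCSANOW, &tio);
    // With BOOT0 high, reset the board, then send the sync byte.
    const uint8_t sync = 0x7F;
    write(fd, &sync, 1);
    uint8_t reply = 0;
    if (read(fd, &reply, 1) == 1 && reply == 0x79)      // 0x79 = ACK
        printf("bootloader ACKed; ready for AN3155 commands\n");
    else
        printf("no ACK (got 0x%02X); check wiring and BOOT pins\n", reply);
    close(fd);
    return 0;
}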

Named-pipe/FIFO on USB Mass Storage Gadget to Stream Audio for Car, Docks etc.

Many devices (cars, TVs, iPod docks, AV receivers, etc.) have the facility to access class-compliant USB mass storage devices and play wav files etc. stored upon them.
I understand I can use a small Linux system with an appropriate dual-role USB port (e.g. a BeagleBone Black) to emulate a FAT32 mass storage device (a Linux 'gadget') that can be plugged into a car and used as if it were a dumb memory stick: 'g_mass_storage'.
http://www.linux-usb.org/gadget/file_storage.html
For static files this works fine. However, I would like to have the board run a Bluetooth receiver, decode the stream into PCM, and then pipe this into a dummy.wav file that can be read (indefinitely) by the car (iPod dock, etc.).
E.g.
[Android or iPhone] --> [bluetooth a2dp] --> [beagleboard/ small linux system] --> [PCM audio]* --> [ g_mass_storage].'dummy.wav' --> [car's USB host]
The steps up to the * are straightforward, but I can't work out how to pipe data into a dummy.wav file: FAT32 doesn't support pipes, and yet it is typically the only filesystem supported by cars etc.
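For what it's worth, the file-format half of the trick looks simple: a WAV header may declare a far larger data chunk than has actually been written, so a dummy.wav can be made to look effectively endless. A conceptual C++ sketch (44.1 kHz, 16-bit stereo PCM and a little-endian host assumed; this says nothing about the harder g_mass_storage caching problem):
#include <cstdint>
#include <cstdio>
#include <cstring>
// Emit a 44-byte PCM WAV header whose chunk sizes are pinned at the maximum,
// so players treat the file as (nearly) endless.
void write_endless_wav_header(FILE *f, uint32_t rate = 44100,
                              uint16_t channels = 2, uint16_t bits = 16) {
    uint32_t byte_rate = rate * channels * bits / 8;
    uint16_t block_align = channels * bits / 8;
    uint32_t riff_size = 0xFFFFFFFF;              // bogus "maximum" size
    uint32_t fmt_size = 16, data_size = 0xFFFFFFFF - 44;
    uint16_t pcm = 1;                             // format tag 1 = PCM
    uint8_t h[44] = {0};
    memcpy(h, "RIFF", 4);           memcpy(h + 4, &riff_size, 4);
    memcpy(h + 8, "WAVEfmt ", 8);
    memcpy(h + 16, &fmt_size, 4);   memcpy(h + 20, &pcm, 2);
    memcpy(h + 22, &channels, 2);   memcpy(h + 24, &rate, 4);
    memcpy(h + 28, &byte_rate, 4);  memcpy(h + 32, &block_align, 2);
    memcpy(h + 34, &bits, 2);
    memcpy(h + 36, "data", 4);      memcpy(h + 40, &data_size, 4);
    fwrite(h, 1, sizeof h, f);
}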
It seems something like this is at least conceptually possible:
http://www.dension.com/products/dbu
and 'cubund' on Indiegogo seems to be following the same principle (sorry, I can't paste a second link, as Stack Exchange won't let me).
I would have bought one if it had got off the ground!
Any ideas?
Thanks,
Thomas
P.S. The first part of the chain (i.e. the phone via Bluetooth) could be any mechanism and isn't particularly interesting. The challenge is to provide a virtual file that would enable 'streaming' of Google Music/web radio etc. to devices only capable of reading files from a mass storage device.
This already exists on the market for Bluetooth: check on eBay for a "Bluetooth audio receiver". It is a little dongle that you put in the USB port and can use like a regular USB drive.
Best regards,
Romeo

Writing USB Device Driver in Linux

ALSA and the libusb API are the two choices, and both are new to me. It's been years since I wrote a device driver, and that was for Unix back in the '80s, but I know I can figure it out once I know which tools to use (I'm guessing both still use C). I have looked at the libusb API, and it's very nice, but I have no idea about the ALSA project; it seems geared toward getting modules into the kernel.
This is a generic question, but the device I'm interested in is the Roland GR-55: it carries MIDI and audio over the same USB connection, and it has Windows and Mac drivers but no Linux driver.
Which libraries or tools do you prefer to use?
Should I write a device driver or a loadable kernel module (LKM)?
libusb is useful and easy to get up and running. I would suggest that you start there, especially if you haven't written Linux drivers in a while. Use libusb to understand what the signalling protocol is for the Roland GR-55 and do some experiments.
USB supports several types of logical connections over the same physical wire. There will likely be DATA and CONTROL pipes available from the device and you will need to map that out before starting on a proper driver.
As I said, libusb is easy to get going, and you could have something useful in a few days if you just want to write a control interface and store raw data from the device. However, ALSA is the way to go if you want to use the device with existing music software. Also, by updating ALSA you would be supporting the larger community, because you could merge your work into the ALSA project.
Writing a kernel module from scratch could be fun, but USB is a bit of a beast and would probably not be an ideal one to start with. With ALSA you will have a framework to guide you and won't need to worry so much about defining your own APIs.
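To give a feel for the libusb route, here is a minimal libusb-1.0 sketch. 0x0582 is Roland's USB vendor ID, but the product ID and endpoint below are placeholders; look up the real values for the GR-55 with lsusb -v:
#include <cstdio>
#include <libusb-1.0/libusb.h>
int main() {
    libusb_context *ctx = nullptr;
    if (libusb_init(&ctx) != 0) return 1;
    // Vendor 0x0582 (Roland); 0xFFFF is a placeholder product ID.
    libusb_device_handle *h = libusb_open_device_with_vid_pid(ctx, 0x0582, 0xFFFF);
    if (!h) { fprintf(stderr, "device not found\n"); libusb_exit(ctx); return 1; }
    libusb_set_auto_detach_kernel_driver(h, 1);   // unbind any kernel driver
    if (libusb_claim_interface(h, 0) == 0) {      // interface 0 assumed
        unsigned char buf[64];
        int got = 0;
        // 0x81 = bulk IN endpoint 1 (placeholder); dump whatever arrives.
        int rc = libusb_bulk_transfer(h, 0x81, buf, sizeof buf, &got, 1000);
        printf("rc=%d, got %d bytes\n", rc, got);
        libusb_release_interface(h, 0);
    }
    libusb_close(h);
    libusb_exit(ctx);
    return 0;
}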
As commented above, check out Linux Device Drivers (http://lwn.net/Kernel/LDD3/); chapter 13 talks specifically about USB drivers. For development, it's easier to write your driver as a kernel module, because then you don't need to recompile the kernel when you make a change.
No need to write a new driver: ALSA has had support for this device for a while now. See commit https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux.git/commit/?id=0ef283247a0cf0fd2e8370ee467030292eb3129e
Since you presumably want to hear sound from your sound card, you should opt for ALSA; that way, once you are done, you are done. If you write a libusb driver, you would then have to write your own user-space tools to feed the card audio for playback.

Publishing a MIDI source as a Bonjour service

I have written a VST/AU/RTAS synthesiser plugin for OSX and Windows that also has an iPhone equivalent. I would like to allow the two to communicate with each other over a local area network so that the iPhone app can be used to send MIDI controller data to the plugin. I plan to create a MIDI source on the iPhone and publish it as a Bonjour service so that the plugin running on OSX or Windows can find it and receive midi from it.
I have a couple of questions to ask about this:
1) Do I actually have to publish the MIDI source as a Bonjour service, or does a CoreMIDI host (running on the iPhone) automatically publish itself?
2) Are there any code examples available that show how to do this sort of thing?
I have seen the following post, but the answer to it only covers the client side (finding a Bonjour service), not the publishing side; it transmits MIDI via OSC; and it only covers OSX, not Windows. (I know, I'm not asking much! ;) )
How to send MIDI or OSC signals to a Mac application from my iOS application?
Cheers,
John.
AFAIK you'll have to publish the service yourself. NSNetService and NSNetServiceBrowser are the classes you need. Check out the companion guide. I found this article on Cocoa for Scientists particularly helpful in getting started. Both have some decent code samples. The Bonjour Browser is useful for testing.
The list of Bonjour service types already has "apple-midi" and "imidi". But I think it's best to make up your own application-specific type name unless your app is plug-compatible with one of these services.
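If you'd rather stay off Objective-C (and cover Windows via the Bonjour SDK), the dns_sd C API registers a service in one call. A sketch, using a made-up application-specific type _mysynth._udp on an arbitrary port:
#include <cstdio>
#include <arpa/inet.h>   // htons; on Windows use winsock2.h instead
#include <dns_sd.h>
int main() {
    DNSServiceRef ref = nullptr;
    DNSServiceErrorType err = DNSServiceRegister(
        &ref,
        0, 0,               // default flags, all interfaces
        "My iPhone Synth",  // service name shown in browsers
        "_mysynth._udp",    // made-up application-specific type
        nullptr, nullptr,   // default domain and host
        htons(5004),        // port in network byte order
        0, nullptr,         // no TXT record
        nullptr, nullptr);  // no callback needed for a simple register
    if (err != kDNSServiceErr_NoError) return 1;
    printf("service published; press Enter to quit\n");
    getchar();
    DNSServiceRefDeallocate(ref);   // unregisters the service
    return 0;
}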

Wireless communication: AVR based embedded system and iPhone

What is the best way to realize wireless communication between an embedded system (based on an AVR controller) and the iPhone? I think there are only two options: WiFi or Bluetooth. Range is not really a problem, since both devices will stay in the same room.
I have no idea whether there are any useful WiFi boards that can be connected to an AVR-based microcontroller system (or any small microcontroller); any hints would be highly welcome.
I guess the better solution would be Bluetooth, but there are two questions: which Bluetooth board is best suited for attachment to an AVR system, and is it possible to use the iPhone's Bluetooth stack for (serial) communication with the AVR device?
I hope somebody has already built such a system and can give some helpful tips...
You can get modules for both WiFi and Bluetooth that will connect to an embedded system through a UART interface; however, a WiFi module will have far more processing power than your AVR microcontroller, often with spare capacity and I/O to execute additional user code, so connecting one to an AVR may be somewhat redundant in many cases.
Bluetooth modules are simpler, less expensive, and their data rate is better matched to the AVR's capabilities; for example, these Parani modules. I have used them between an embedded system and a laptop PC's Bluetooth, so given appropriate communications software there is no technical reason why they could not be used with an iPhone, I think. However, this may be the flaw: on the PC the device was recognised as a virtual serial port, and I don't know whether the iPhone supports 'legacy' serial communications in quite the same way.
For comparison, a WiFi solution
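Whichever module you pick, the AVR side is plain UART I/O. A minimal polled-UART echo sketch for an ATmega328P at 16 MHz (device, clock, and the module's default 9600 baud are assumptions; register names differ on other AVRs):
#define F_CPU 16000000UL          // assumed clock; match your board
#include <avr/io.h>
static void uart_init(void) {     // USART0, 9600 baud, 8N1
    const uint16_t ubrr = F_CPU / 16 / 9600 - 1;
    UBRR0H = (uint8_t)(ubrr >> 8);
    UBRR0L = (uint8_t)ubrr;
    UCSR0B = (1 << TXEN0) | (1 << RXEN0);       // enable TX and RX
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);     // 8 data bits, 1 stop bit
}
static void uart_put(uint8_t c) {
    while (!(UCSR0A & (1 << UDRE0))) {}         // wait for empty TX register
    UDR0 = c;
}
int main(void) {
    uart_init();
    for (;;)                                    // echo as a simple link test
        if (UCSR0A & (1 << RXC0))
            uart_put(UDR0);
}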
From what I know, Bluetooth is very limited on the iPhone: only very few Bluetooth profiles are implemented, and even if they can be extended on a jailbroken iPhone, I doubt that is easy to use from the application layer.
On the other side, transferring via WiFi requires a lot of processing power and memory, since many more things have to be implemented before you can even start transferring data: 802.11, CSMA/CA, ARP, TCP. That's a big task.
Is it an option to build a hardware extension for the iPhone? You might be able to get a serial connection and power out of the dock connector. Then even ZigBee could be very helpful.
Here's an article you might find helpful. I would lean toward a WiFi solution just because of the added flexibility available.
http://www.embedded.com/design/networking/215801088
-t
Some of the other people at the office have done AVR <-> Bluetooth <-> Symbian and AVR <-> Bluetooth <-> PC solutions without trouble. There is lots of info, reference designs, and source available. I have no idea how hard it would be to use Bluetooth on an iPhone.
The exact module is probably also not important, as long as it has some type of serial interface (I2C, SPI, UART) to interface with the AVR, and some source code showing how to use it.
Is it an 8-bit or 32-bit AVR? For the AVR32 processors there's support for WiFi in the Atmel 1.5.0 Software Framework using SD-card-mounted WiFi modules from HD Wireless (http://www.hd-wireless.se), including an IP stack (lwIP). Be aware that you need Ad-Hoc (IBSS) support to connect directly to the iPhone.
There is the WiSnap kit. It can connect directly to a standard RS232 interface, or to embedded processors through a TTL UART interface. We are planning to use it in our project. It also has Ad-Hoc support.
There are some usage examples and an iPhone application for connection setup.
http://serialio.com/products/mobile/wifi/WiSnapKit1.php
What are you trying to communicate between your AVR and the iPhone? The iPhone is made for the web, along with everything Apple (which AVRs are decidedly not). So what works well is an embedded device that exposes a web interface, like the Transmission BitTorrent client on Linux. Nowadays many low-power, small form-factor Linux platforms exist that will allow you to do this.
For instance, Gumstix has an ARM-based platform that runs Linux and includes WiFi (the Overo Fire).