How to read current MIDI status for AKAI MIDIMIX?

As per these three questions, there is no standard way to force a MIDI controller to send the current state of its sliders and knobs. However, all the answers point to the possible existence of a manufacturer-specific SysEx command to do so.
Is there such a SysEx command for the AKAI MIDIMIX controller?
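If such a command exists, sending it from macOS or iOS is just a matter of pushing a SysEx packet out through CoreMIDI. A minimal Swift sketch follows; 0x47 is Akai's manufacturer ID, but the remaining payload bytes are placeholders that would have to come from Akai documentation or from sniffing the traffic of an editor application that can read the hardware state.

    import CoreMIDI

    // Sketch: transmit a manufacturer SysEx message to a MIDI destination.
    // The payload is a placeholder; the MIDIMIX-specific "send me your state"
    // bytes (if they exist at all) are exactly what this question is asking for.
    func sendSysEx(to destination: MIDIEndpointRef) {
        var client = MIDIClientRef()
        var outPort = MIDIPortRef()
        MIDIClientCreate("SysExClient" as CFString, nil, nil, &client)
        MIDIOutputPortCreate(client, "SysExOut" as CFString, &outPort)

        // F0 47 ... F7 -- 0x47 is Akai's manufacturer ID, the rest is made up.
        let sysex: [UInt8] = [0xF0, 0x47, 0x00, 0x00, 0xF7]

        var packetList = MIDIPacketList()
        let packet = MIDIPacketListInit(&packetList)
        _ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                              packet, 0, sysex.count, sysex)
        MIDISend(outPort, destination, &packetList)
    }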

Related

Understanding the role of time in a AVCaptureSession regarding CMSampleBuffers

I recently started programming in Swift as I am trying to work out an iOS camera app idea I've had. The main goal of the project is to save the prior 10 seconds of video before the record button is tapped. So the app is actually always capturing and storing frames, but also discarding the frames that are more than 10 seconds old if the app is not 'recording'.
My approach is to output video and audio data from the AVCaptureSession using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput respectively. Via captureOutput() I receive a CMSampleBuffer for both video and audio, which I store in separate arrays. I would like those arrays to later serve as input for the AVAssetWriter.
This is where I'm unsure about the role of time and timing in the sample buffers and the capture session in general, because in order to present the sample buffers to the AVAssetWriter as input, I believe I need to make sure my video and audio data are the same length (duration-wise) and synchronized.
I currently need to figure out what rate the capture session is running at, or how I can set that rate. Ideally I would have one audioSampleBuffer for each videoSampleBuffer, with both representing exactly the same duration. I don't know what realistic values are, but in the end my goal is to output 60 fps, so it would be perfect if each videoSampleBuffer contained one frame and each audioSampleBuffer represented 1/60th of a second. I could then easily append the newest sample buffers to the arrays and drop the oldest.
I've of course done some research regarding my problem, but wasn't able to find what I was looking for.
My initial thought was that I had to make the capture session run at some sort of set timescale, but I didn't see such an option in the AVFoundation documentation. I then looked into Core Media to see if there was a way to set the clock the capture session uses, but I couldn't find a way to tell the session to use a different CMClock (with properties I know), so I gave up on that route. I still wasn't sure about the internal mechanics and timing of the capture session, so I tried to find more information about it, without much luck. I've also stumbled upon the synchronizationClock property of AVCaptureSession, but I couldn't figure out how to use it or find an example.
At this point my best guess is that with every step in time (represented by a timestamp) a new sample buffer is created for both video and audio, which would be a good thing. But I have a feeling this is just wishful thinking, and even then I still wouldn't know what duration each buffer represents.
Could anyone point me in the right direction and help me understand how time works in a capture session, and how to get or set the duration of sample buffers?
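Not a full answer, but one thing worth knowing: every CMSampleBuffer the session delivers already carries a presentation timestamp and duration on the session's clock, so a rolling 10-second window can be trimmed by timestamp instead of assuming a fixed rate. A rough sketch, where the class and property names are purely illustrative:

    import AVFoundation

    // Sketch: keep only the most recent 10 seconds of buffers, using the
    // timestamps the capture session stamps on every CMSampleBuffer.
    final class RollingRecorder: NSObject,
        AVCaptureVideoDataOutputSampleBufferDelegate,
        AVCaptureAudioDataOutputSampleBufferDelegate {

        private var recentBuffers: [CMSampleBuffer] = []

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            let duration = CMSampleBufferGetDuration(sampleBuffer)  // audio buffers usually span many frames
            print("buffer at \(pts.seconds)s, duration \(duration.seconds)s")

            recentBuffers.append(sampleBuffer)

            // Drop everything older than 10 seconds relative to the newest buffer.
            let cutoff = CMTimeSubtract(pts, CMTime(seconds: 10, preferredTimescale: 600))
            recentBuffers.removeAll {
                CMSampleBufferGetPresentationTimeStamp($0) < cutoff
            }
        }
    }

Note that the audio output generally does not deliver exactly one buffer per video frame, so aligning by timestamp rather than by count tends to be the safer approach.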

Continuous data streaming from NFC to iPhone in Swift?

I have an NFC tag that has integrated environmental sensors inside (MLX90129 to be exact). I would like to make an iPhone app that can read the realtime data from the tag multiple times per second and graph them. I'm not looking for background tag reading, and you can assume that the app will be open and the phone is near the tag at all times.
From what I can see on Apple documentation and other sources, the Swift support for NFC tags is mostly built for single session interrogation. Has anyone succeeded in getting continuous and repeated NFC tag reading for this type of purpose?
As you pointed out, making continuous and repeated NFC readings is not the intended functionality.
While I think you can sort that part out, there's another thing that could be a headache: making multiple readings per second runs directly up against the current implementation of NFC tag reading in iOS.
Every time you start a reading, iOS shows the native sheet informing the user that an NFC reading is in progress. Part of this process is user interaction, and it is exactly that part that imposes a time constraint. Even if no interaction from the user is needed, there is an animation, and that animation has its own lifecycle events (start reading, reading, OK, KO, close...).
As far as I know you can't bypass that animation, which can easily take a couple of seconds even in the best case.
With that said, you should keep a few things in mind if you still want to try:
NFCTagReaderSession can only have one active reading at a time, and when that reading ends (OK/KO), it should be invalidated. So if you want to make another reading, you'll need to create and configure a new instance.
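What you can do, at least for ISO 15693 tags like the MLX90129, is keep the one session you have connected and issue repeated block reads inside it until the session times out or is invalidated. A rough sketch, where the block number and the polling interval are placeholders (the sensor's datasheet defines the real register layout):

    import CoreNFC

    // Sketch: poll one ISO 15693 block repeatedly inside a single reader session.
    // Block 0 and the 0.2 s interval are placeholders; check the MLX90129
    // datasheet for the actual sensor registers.
    final class SensorPoller: NSObject, NFCTagReaderSessionDelegate {
        private var session: NFCTagReaderSession?

        func start() {
            session = NFCTagReaderSession(pollingOption: .iso15693, delegate: self)
            session?.begin()
        }

        func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}

        func tagReaderSession(_ session: NFCTagReaderSession,
                              didInvalidateWithError error: Error) {}

        func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
            guard let first = tags.first, case let .iso15693(tag) = first else { return }
            session.connect(to: first) { error in
                guard error == nil else { return }
                self.poll(tag, in: session)
            }
        }

        private func poll(_ tag: NFCISO15693Tag, in session: NFCTagReaderSession) {
            tag.readSingleBlock(requestFlags: [.highDataRate], blockNumber: 0) { data, error in
                if error == nil {
                    print("sensor block:", [UInt8](data))
                }
                // Keep polling while the tag stays in the field and the session is alive.
                DispatchQueue.global().asyncAfter(deadline: .now() + 0.2) {
                    self.poll(tag, in: session)
                }
            }
        }
    }

This still presents the system NFC sheet for the whole session, and reader sessions are time-limited (on the order of a minute), after which a new session, and a new sheet, is needed.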

Get current MIDI timecode from stopped device

Is there a way to ask a MIDI device for its current timecode value while it is stopped? Specifically, I want to poll Pro Tools for its current MTC value (via the macOS Audio MIDI Setup Utility, IAC bus). The only way I've been able to come up with is to send a play command, immediately followed by a stop command. But I'd like to find a way to do it without moving the bus. I've tried sending "pause", "reset", "shuttle", and "chase" commands, but nothing will get Pro Tools to send the current MTC time value besides "play." Hoping to not have to use the old HUI protocol (if it even works with PT anymore). Thanks
The MTC specification says that Full Time Code messages are sent "when equipment needs to be fast-forwarded or rewound, located or cued to a specific time". This implies that no messages are sent if the time has not changed.
So there is no standard way.
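For reference, the Full Time Code message the spec describes is a fixed 10-byte universal SysEx, so when Pro Tools does emit one (for example after the play/stop workaround or after a locate), decoding it is straightforward. A sketch of the standard layout:

    import Foundation

    // Sketch: decode an MTC Full Frame (Full Time Code) SysEx message.
    // Layout per the MTC spec: F0 7F <device> 01 01 hr mn sc fr F7,
    // where the top two rate bits of the hr byte encode the frame rate
    // and its low five bits encode the hours.
    func parseFullFrame(_ bytes: [UInt8]) -> String? {
        guard bytes.count == 10,
              bytes[0] == 0xF0, bytes[1] == 0x7F,
              bytes[3] == 0x01, bytes[4] == 0x01,
              bytes[9] == 0xF7 else { return nil }

        let rates = ["24", "25", "30 drop", "30 non-drop"]
        let rate = rates[Int((bytes[5] >> 5) & 0x03)]
        let hours = Int(bytes[5] & 0x1F)
        return String(format: "%02d:%02d:%02d:%02d (%@ fps)",
                      hours, Int(bytes[6]), Int(bytes[7]), Int(bytes[8]), rate)
    }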

iPhone Remote IO Issues

I've been playing around with the SDK recently, and I had an idea to just build a personal autotuner (because I am just as awesome as T-Pain).
Intro aside, I wanted to attach a high-quality microphone to the headphone jack, and I wanted my audio to be processed in a callback and then copied to the output buffer. This has several implications:
When my audio-in is being routed through the built-in microphone, I need to be able to process this input, and send it once my input has stopped (this works).
When my audio-in is being routed through the microphone-in input from the headset jack, I want the output to be sent immediately.
Routing, however, doesn't seem to work properly when using AudioSession modes and overrides, which technically should let you reroute output to the iPhone speakers no matter where the input is coming from. This is documented to work, but in practice it doesn't.
Remote IO, however, is not documented at all. Can anyone with experience using Remote IO audio units give me a reasonable high-level overview of how to do this properly? I have been using the aurioTouch example code, but I keep running into error codes like -50 and -10863, neither of which is documented.
Thanks in advance.
The aurioTouch example implements Remote IO play-through: it simply calls AudioUnitRender in the output render callback, and you can modify the samples before passing them on. Note that this trick does not seem to work if you port the code to OS X-style Core Audio. There, 99% of the time, you need to create two AUHALs (Remote IO look-alikes) and pass the samples between them.
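To make that concrete, here is a stripped-down sketch of such a play-through render callback in Swift; the Remote IO unit setup, audio session configuration, and error handling are omitted, and the usual convention of bus 1 = microphone input, bus 0 = speaker output is assumed.

    import AudioToolbox

    // Sketch: Remote IO play-through. The output render callback pulls the
    // latest microphone samples from input bus 1 straight into the output
    // buffer list; any per-sample processing goes between the render call
    // and the return. The callback's refCon is assumed to point at the
    // already-configured Remote IO AudioUnit.
    let playThrough: AURenderCallback = { (inRefCon, ioActionFlags, inTimeStamp,
                                           _, inNumberFrames, ioData) -> OSStatus in
        let unit = inRefCon.assumingMemoryBound(to: AudioUnit.self).pointee
        return AudioUnitRender(unit, ioActionFlags, inTimeStamp,
                               1,              // bus 1 = microphone input
                               inNumberFrames,
                               ioData!)
    }

    // Installed roughly like this (keep the unit in stable storage so the
    // refCon pointer stays valid):
    // var callback = AURenderCallbackStruct(inputProc: playThrough,
    //                                       inputProcRefCon: &remoteIOUnit)
    // AudioUnitSetProperty(remoteIOUnit, kAudioUnitProperty_SetRenderCallback,
    //                      kAudioUnitScope_Input, 0, &callback,
    //                      UInt32(MemoryLayout<AURenderCallbackStruct>.size))

As for the error codes: -50 is the generic paramErr (a bad argument to a call such as AudioUnitRender, typically a format, bus number, or buffer list mismatch), and -10863 should be kAudioUnitErr_CannotDoInCurrentContext.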

XG MIDI File Format

I have a Yamaha MIDI guitar that, when I play a MIDI file encoded using the XG MIDI standard, causes certain lights on the guitar to turn on and off. I am trying to determine the MIDI event that causes this so that I can programmatically send the same event without the use of a MIDI file (the same way I can send a Note On (144) or Note Off (128) command).
However, while I have been able to locate a copy of the MIDI protocol, I have not been able to locate the XG MIDI protocol. Is there a way, beyond trying to send all possible commands to the device until I locate the appropriate command, to determine what the MIDI event is that is causing the lights to change state? Or is there somewhere that I can get a copy of the XG MIDI protocol?
The Yamaha manuals for their products detail the information you are looking for. The XG commands are device specific. Some XG commands give direct access to the device memory, and my manual for the MU2000 tone generator warns that "you can damage the unit by sending incorrect data".
Two things:
XG is a semantic extension of the MIDI protocol. It doesn't change anything in the structure of the MIDI file. The only thing is that if you use an XG-compatible instrument to record, say, changes to the resonance of the filter, it will cause the same effect on any other XG instrument. But at the MIDI protocol level, you will still have the CC (Control Change) message #71 (IIRC).
The MIDI protocol is very extensible and leaves a lot of room for manufacturers. Not only can you use CC messages, but also Registered Parameter Numbers (RPNs) and Non-Registered ones (NRPNs). On top of that you have System Exclusive (SysEx) messages, and I would bet that an appropriately crafted SysEx message could change the lights on the guitar. Try to get the so-called "Data List" for your instrument; it should include all the information about the MIDI messages that are sent and received by your guitar.
Wikipedia: "In 1999, the official GM [General MIDI] standard was updated to include more controllers, patches, RPNs and SysEx messages, in an attempt to reconcile the conflicting and proprietary Roland GS and Yamaha XG additions." This was called General MIDI 2.
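To make the CC/RPN/NRPN point concrete: these are just short channel messages, so they are cheap to experiment with from code. A Swift/CoreMIDI sketch, where the output port and destination are assumed to exist already and the NRPN number is a made-up placeholder:

    import CoreMIDI

    // Sketch: send a plain Control Change and an NRPN selection/data pair.
    // `outPort` and `dest` are assumed to come from MIDIOutputPortCreate /
    // MIDIGetDestination; the NRPN number 0x01/0x00 is a placeholder.
    func send(_ bytes: [UInt8], via outPort: MIDIPortRef, to dest: MIDIEndpointRef) {
        var packetList = MIDIPacketList()
        let packet = MIDIPacketListInit(&packetList)
        _ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                              packet, 0, bytes.count, bytes)
        MIDISend(outPort, dest, &packetList)
    }

    func experiment(outPort: MIDIPortRef, dest: MIDIEndpointRef) {
        // CC #71 (resonance, per the answer above) on channel 1, value 100.
        send([0xB0, 71, 100], via: outPort, to: dest)

        // NRPN: select parameter MSB/LSB, then send a data value.
        send([0xB0, 99, 0x01], via: outPort, to: dest)  // NRPN MSB (CC 99)
        send([0xB0, 98, 0x00], via: outPort, to: dest)  // NRPN LSB (CC 98)
        send([0xB0, 6, 64],    via: outPort, to: dest)  // Data Entry MSB (CC 6)
    }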
I recommend looking into what Java (javax.sound.midi) has to offer (C# seems to be lacking a solid MIDI library). Read up on MetaMessage, ShortMessage, SysexMessage, and Patch. From what I understand, special system messages are sent through SysexMessage (the lighting data might be here).
If you need some sample code look at Java Sound Resources.
Other links I found:
Working with XG SYSEX on the Yamaha QY70
Win32API::MIDI::SysEX::Yamaha
For a managed .NET Midi Library look for the C# Midi Toolkit on codeproject.com.
I'm using the codeproject midi toolkit by Leslie Sanford to communicate with the guitar.
http://www.codeproject.com/KB/audio-video/MIDIToolkit.aspx
Everything you need to know about the guitar's communications is in the manual, on a single page near the back.
Here is a video of an editor I built - it features full communications with the guitar.
YouTube Video of Guitar Program
Ultimately, you'll need to find that information from the manufacturer. It's likely a sysex message, although it could also be a controller.
Walking through all the controllers is pretty simple in software, so you could try that if you wanted. But the chances of stumbling upon the right sysex message by accident or exhaustive search are vanishingly small; the search space is astronomical.
Dig through the back of your manuals. It might be in there. If not, google for the sysex for your device. Otherwise you'll need to ask Yamaha for the info.