Does flutter_ogg_piano audio plugin work in Android 12? - flutter

I've found flutter_ogg_piano to be the lowest-latency audio plugin (compared to audioplayers, soundpool, and flutter_midi) since it uses the Oboe library, but it doesn't seem to work on Android 12. I tried it on a Samsung Galaxy S22 Ultra and it produces no sound. I see that Android 12 made some audio changes, and I'm assuming these somehow affect flutter_ogg_piano. Any insights greatly appreciated!

Related

Catalina Beta 5: QuickTime Audio Recording Not Working on 2018 MacBook Pros

Starting a QuickTime audio recording with Catalina Dev Beta 5 on 2018 or later MacBook Pros outputs files with no sound (MacBook Pro Microphone selected). Example file here: https://www.dropbox.com/s/ib67k0vg8cm93fn/test_no_audio%20%281%29.aifc?dl=0
During the recording, Console shows this error:
"CMIO_Unit_Converter_Audio.cpp:590:RebuildAudioConverter AudioConverterSetProperty() failed (1886547824)"
We have an application that records the screen and audio at the same time using AVFoundation, and the resulting video files also have no audio. However, when inspecting the CMSampleBuffers, they seem fine: https://gist.github.com/paulius005/faef6d6250323b7d3386a9a70c08f70b
Is anyone else experiencing this issue, or does anyone have more visibility into whether it's something Apple is working on?
Is there anything else I should be looking at to tackle the issue?
Yes, Apple is changing a lot of things related to the audio subsystem layer in Catalina. I am aware that various audio applications are being rewritten for Catalina. Also, since beta 2, each new beta release has come with some deprecations, but also with some new implementations [in the new macOS audio layer].
Current Beta 5 Audio Deprecations:
The OpenAL framework is deprecated and remains present for compatibility purposes. Transition to AVAudioEngine for spatial audio functionality.
AUGraph is deprecated in favor of AVAudioEngine.
Inter-App audio is deprecated. Use Audio Units for this functionality.
Carbon component-based Audio Units are deprecated and support will be removed in a future release.
Legacy Core Audio HAL audio hardware plug-ins are no longer supported. Use Audio Server plug-ins for audio drivers.
About AVFoundation [which you are using]:
Deprecated on Beta 5:
The previously deprecated 32-bit QuickTime framework is no longer available in macOS 10.15.
The symbols for QTKit, which relied on the QuickTime framework, are still present but the classes are non-functional.
About the above item: Apple shipped the symbols for QTKit in Catalina Beta 5, but they are nulled out and non-functional. This means an application will run, but will produce no result if it uses those classes. (I don't know whether these deprecations directly or indirectly affect your program, but they are about AVFoundation.)
I think they will be fully removed in upcoming betas, but for now they are nulled and non-functional; otherwise many audio/AV applications that tried to load them would instantly crash. This seems to be a step-by-step migration from beta to beta, giving developers time to rewrite their audio apps for the new audio subsystem.
You can find more details in the release notes [along with links to documentation for some new classes and functions that replace the deprecated ones], but the documentation is not very rich yet:
https://developer.apple.com/documentation/macos_release_notes/macos_catalina_10_15_beta_5_release_notes
PS: About the opinions, points of view, and information written here: I am a senior macOS developer, but not in the AV/audio/media subsystem; my area is kernel/networking/security. However, I have been closely following all the changes happening to macOS in each Catalina beta release since the first, and the changes Apple is making to the audio subsystem are significant.
I cannot specifically help you with the audio programming issue, but you asked if it could be something Apple is working on, and yes, it is.
I hope this information helps you find what you need to solve your application issue.

Unity3D: Cross Platform Video Streaming?

We are working on a prototype application using Unity3D. Our goal is to create a fluid and fun-to-use cross-platform app.
The problem we are facing right now is streaming (H.264/MP4) video content over the web. This will be a major feature of our app.
I have already tried MovieTextures and the WWW class, but it seems the files must be in Ogg format, which we cannot provide. On the other hand, Handheld.PlayFullScreenMovie seems to be an Android- and iOS-only feature which uses the built-in video player. This would be great if it were supported on other platforms (e.g. Windows Phone 8) as well.
Is there another cross-platform option to stream (H.264/MP4) video content over the web and display it in full screen or as a GUI object? Or are there any plans to support something like this in the near future? Or is there a stable plugin for such a task?
Thanks
As of Unity 5, Handheld.PlayFullScreenMovie supports Windows Phone and Windows Store, as per http://docs.unity3d.com/ScriptReference/Handheld.PlayFullScreenMovie.html
On Windows Phone 8, Handheld.PlayFullScreenMovie internally uses Microsoft Media Foundation for movie playback. On this platform, calling Handheld.PlayFullScreenMovie with full or minimal control mode is not supported.
On Windows Store Apps and Windows Phone 8.1, Handheld.PlayFullScreenMovie internally uses XAML MediaElement control.
On Windows Phone and Windows Store Apps, there generally isn't a movie resolution or bitrate limit; however, higher-resolution or higher-bitrate movies will consume more memory for decoding, and weaker devices will start skipping frames much sooner at extremely high resolutions. For example, a Nokia Lumia 620 can only play videos smoothly up to 1920x1080. For these platforms, you can find the list of supported formats here: Supported audio and video formats on Windows Store
MP4 is not a streamable container. If you read the ISO specification, you will see that MP4 cannot be streamed. This is because the moov atom cannot be written until all frames are known and accounted for, which is 100% incompatible with live video. There are supersets of MP4 used in DASH that make this possible: essentially, they create a little MP4 file (called a fragment) every couple of seconds. Alternatively, you can use a container designed for streaming, such as FLV or TS.
You will probably need to step outside the Unity SDK a bit to enable this.
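To make the moov-atom limitation concrete, here is a minimal sketch (in Python, with synthetic bytes rather than a real file) that walks the top-level boxes of an MP4 stream. The box layout follows the ISO base media file format: each box starts with a 4-byte big-endian size and a 4-byte type. The example file contents are invented for illustration.

```python
import struct

def top_level_boxes(data: bytes):
    """Walk the top-level boxes of an ISO base media (MP4) byte stream.

    Each box starts with a 4-byte big-endian size followed by a 4-byte type.
    Yields (type, offset, size) tuples. The size==0 and size==1 (64-bit
    length) variants are not handled in this sketch.
    """
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        yield box_type.decode("ascii"), offset, size
        if size < 8:
            break
        offset += size

def box(box_type: bytes, payload: bytes = b"") -> bytes:
    """Build a box: 4-byte size (header included) + 4-byte type + payload."""
    return struct.pack(">I", 8 + len(payload)) + box_type + payload

# A typical non-fragmented MP4 layout: the moov index is written after the
# media data (mdat), so a player must have the whole file before it can
# locate any frame -- this is why plain MP4 cannot be live-streamed.
regular_mp4 = (
    box(b"ftyp", b"isom\x00\x00\x02\x00")
    + box(b"mdat", b"\x00" * 32)
    + box(b"moov")
)

layout = [name for name, _, _ in top_level_boxes(regular_mp4)]
print(layout)  # ['ftyp', 'mdat', 'moov'] -- the index comes last
```

Fragmented MP4 (as used by DASH) instead interleaves small `moof`+`mdat` pairs every few seconds, so playback can begin before the file is complete.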

How to access multiple built-in microphones on iPhone 4S (5, 5S)?

Recently, I've been working on a project that uses the built-in microphones to record stereo sound and then does some signal processing. However, there seems to be no specific solution to this question.
There is a link showing that stereo recording using only the built-in microphones is quite feasible:
https://audioboo.fm/boos/1102187-recording-in-stereo-from-the-iphone-5#t=0m20s
However, I still do not know how to do it! Has anyone solved this problem?
There are some resources showing how to access the different built-in mics:
use rear microphone of iphone 5
Also, it may be quite easy to implement this project on an Android phone:
How to access the second mic android such as Galaxy 3 ,
How can I capture audio input from 2 mics of my android phone real time and simultaneously

Video on Android Virtual Devices?

I have installed Android SDK Manager to test web sites on Android Virtual Devices.
I have a problem playing videos on it:
I hear the audio normally, but I see only black instead of the video (the control bar is OK).
The video is encoded as H.264/AAC at 1 Mbps and plays well in browsers and in the iPhone emulator.
Do I have to add new hardware properties ?
You can see my configuration here
Thanks for your help
Paul
No, there are no specific hardware properties required for video playback. The emulator will only play a limited subset of video encodings, while a real device will support a more comprehensive set. GPU emulation might help if your emulator is too slow to play back the video, but it is not required.
Make sure your video is encoded using a baseline profile that can be played back in the Android emulator. This answer might help: What formats of video will play in emulator?

Simple embeddable MidiSynth for iOS?

I have a guitar diagram app for Android that I am porting to iOS. Android has an embedded MIDI synthesizer (Sonivox), so I can generate MIDI files and let Android handle the playback.
Is there a way to do this on iOS? Or are there very lightweight embeddable synths for iOS?
Since iOS 5 there's the AUSampler audio unit, into which you can load a sound bank (Apple presets and SoundFonts) and then control the sampler/synth via MIDI messages.
See this example app:
https://developer.apple.com/library/ios/#samplecode/LoadPresetDemo/Listings/ReadMe_txt.html
It works great!
Update: My answer is outdated. #lukebuehler's answer is more appropriate.
If you don't mind a non-open-source solution, try FMOD. Being a commercial audio engine for games, FMOD includes a simple MIDI synth. I've tried the free evaluation version; it plays GM MIDI files correctly on my iPhone 3G.
If what you want is not just an SMF file player but a full-function GS/GM softsynth that can respond to individual MIDI events in real time, you can try the MidiSynth from CrimsonTech. Its license fee is fair. CrimsonTech provides several demo apps in the App Store, and also offers a free evaluation SDK, so you don't need to pay a penny for the license until you're actually going to publish your app.
I don't think the MIDI support in iOS 4.2 allows playback of MIDI data on the phone itself; it is merely for sending and receiving MIDI commands to and from other MIDI devices.
From the recent iOS 4.2 docs, it seems that you can use the MIDI support to send MIDI commands to other devices for playback. You can also receive commands from other devices, make changes to them, or save them to a file. However, I can't find any support for actually playing a MIDI file on the phone directly. Someone please correct me if I'm wrong (I wish I were wrong!!).
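For context, the "MIDI commands" being sent and received here are just short byte messages defined by the MIDI 1.0 specification. A sketch of how a channel-voice note-on/note-off pair is laid out on the wire (the channel, note, and velocity values below are arbitrary examples, not anything from the iOS docs):

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Build a MIDI note-on message: status byte 0x90 | channel, then note, velocity."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def note_off(channel: int, note: int) -> bytes:
    """Build a MIDI note-off message: status byte 0x80 | channel, velocity 0."""
    assert 0 <= channel < 16 and 0 <= note < 128
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) on channel 0 at velocity 100 -- three bytes total:
msg = note_on(0, 60, 100)
print(msg.hex())  # 903c64
```

These three-byte messages are what an external synth consumes; actually rendering them to audio on the device is the part that needs a synthesizer such as AUSampler or the other options discussed above.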
There is MIDI support in iOS 4.2. If it is the same as what OS X provides, then there will also be a basic synth included. Check it out.