AUTimePitch or AUPitch in iOS 5

I'm poking around the iOS documentation at the moment, in the Core Audio section.
I'm trying to do pitch shifting without having to write my own processing callback, and I'm confused by the documentation telling me one thing and the headers saying another.
My first question is about kAudioUnitSubType_Pitch.
First, in the iOS section of the docs here, the pitch unit is listed, but when I try to use it in code it doesn't appear in code completion, and the audio unit header says it is desktop-only. Is it possible to use this in iOS 5 at all, or am I looking at the wrong docs?
Second, also in the iOS section of the docs here, I'm interested in kAudioUnitSubType_TimePitch. It is listed, but it states "iOS 2.0 through iOS 2.0". Does this mean you can't use it in iOS 5?
Could somebody give me some clarity on the issue?

Apple's time pitch modification AU is currently not available in iOS 5; it exists only in desktop Mac OS X. The early AU docs have been corrected. Current iOS 5.x only supports a pitch resampler unit.
But there appear to be one or more commercial iOS library solutions for audio time pitch modification, if you don't want to roll your own.
ADDED later: Only the 1st-gen iPad is limited to iOS 5.x. iOS 7 includes the NewTimePitch Audio Unit, but this audio unit is (currently) lower in quality than the OS X TimePitch audio unit.
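For anyone on iOS 7 or later, a minimal sketch of creating that NewTimePitch unit might look like this (error checking omitted; the pitch value is just an example):

#import <AudioToolbox/AudioToolbox.h>

// Describe the NewTimePitch format-converter unit (iOS 7+).
AudioComponentDescription desc = {0};
desc.componentType = kAudioUnitType_FormatConverter;
desc.componentSubType = kAudioUnitSubType_NewTimePitch;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

AudioComponent comp = AudioComponentFindNext(NULL, &desc);
AudioUnit timePitchUnit = NULL;
AudioComponentInstanceNew(comp, &timePitchUnit);

// Pitch is in cents: +1200 is one octave up. 300 here is an arbitrary example.
AudioUnitSetParameter(timePitchUnit, kNewTimePitchParam_Pitch,
                      kAudioUnitScope_Global, 0, 300.0, 0);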

Related

Catalina Beta 5: QuickTime Audio Recording Not Working on 2018 MacBook Pros

Starting a QuickTime audio recording with Catalina Dev Beta 5 on 2018 or later MacBook Pros outputs files with no sound (MacBook Pro Microphone selected). Example file here: https://www.dropbox.com/s/ib67k0vg8cm93fn/test_no_audio%20%281%29.aifc?dl=0
During the recording, Console shows this error:
"CMIO_Unit_Converter_Audio.cpp:590:RebuildAudioConverter AudioConverterSetProperty() failed (1886547824)"
We have an application that records the screen and audio at the same time using AVFoundation, and the resulting video files also have no audio. However, when inspecting the CMSampleBuffers, they seem fine: https://gist.github.com/paulius005/faef6d6250323b7d3386a9a70c08f70b
Is anyone else experiencing this issue, or does anyone have more visibility into whether it's something Apple is working on?
Is there anything else I should be looking at to tackle the issue?
Yes, Apple is changing a lot of things related to the audio subsystem layer in Catalina. I am aware that various audio applications are being rewritten for Catalina. Also, since beta 2, each new beta release has come with some deprecations, but also with some new implementations [in the new audio layer of macOS].
Current Beta 5 Audio Deprecations:
The OpenAL framework is deprecated and remains present for compatibility purposes. Transition to AVAudioEngine for spatial audio functionality.
AUGraph is deprecated in favor of AVAudioEngine (a minimal sketch follows this list).
Inter-App Audio is deprecated. Use Audio Units for this functionality.
Carbon component-based Audio Units are deprecated and support will be removed in a future release.
Legacy Core Audio HAL audio hardware plug-ins are no longer supported. Use Audio Server plug-ins for audio drivers.
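For anyone migrating off AUGraph, a minimal AVAudioEngine graph looks roughly like this (a sketch assuming a simple player-to-output chain, not a full port of a real graph):

#import <AVFoundation/AVFoundation.h>

// Minimal AVAudioEngine graph: player -> main mixer -> output.
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];

[engine attachNode:player];
[engine connect:player to:engine.mainMixerNode format:nil];

NSError *error = nil;
if (![engine startAndReturnError:&error]) {
    NSLog(@"Engine failed to start: %@", error);
}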
About AVFoundation [which you are using]:
Deprecated in Beta 5:
The previously deprecated 32-bit QuickTime framework is no longer available in macOS 10.15.
The symbols for QTKit, which relied on the QuickTime framework, are still present but the classes are non-functional.
The above item: Apple shipped the symbols for QTKit in Catalina Beta 5, but they are nulled and non-functional. This means an application will run, but will produce no result if it uses those classes. (I don't know whether those deprecations directly or indirectly affect your program, but they are about AVFoundation.)
I think they will be fully removed in the next betas, but for now they are nulled and non-functional; otherwise many audio/AV applications that tried to load them would crash instantly. This seems to be a step-by-step migration from beta to beta, to give developers time to rewrite their audio apps for the new audio subsystem.
You can find more details in the release notes [along with links to documentation for some of the new classes and functions that replace the deprecated ones], but the documentation is not very rich yet.
https://developer.apple.com/documentation/macos_release_notes/macos_catalina_10_15_beta_5_release_notes
PS: About my opinions, point of view, and the information written here: I am a senior macOS developer, but not in the AV/audio/media subsystem; my area is kernel/networking/security. But I have been closely following all the changes happening to the macOS operating system in each Catalina beta release since the first, and the changes Apple is making to the audio subsystem are significant.
I cannot specifically help you with the audio programming issue, but you asked whether it could be something Apple is working on, and yes, it is.
I hope this information helps you gather complementary information to solve your application issue.

Is there a way to turn off the automatic low-frequency filtering of the audio input on iOS 6.0?

I am working on an app that analyzes incoming audio from the built-in microphone on iPhone/iPad using the iOS 6.0 SDK.
I have been struggling for some time with very low levels in the lower frequencies (i.e., below 200 Hz), and I have found others on the web reporting the same problem without any answers.
Various companies working with audio tools for iOS state that (prior to iOS 6.0) there was a built-in low-frequency rolloff filter causing these low levels at the lower frequencies, BUT those sources also state that starting with iOS 6.0 it should be possible to turn off this automatic low-frequency filtering of the input audio signals.
I have gone through the audio unit header files, the audio documentation in Xcode, and audio-related sample code without success. I have played with the different parameters and properties of the AudioUnit (which mention low-pass filters and such) without solving the problem.
Does anybody know how to turn off the automatic low-frequency rolloff filter for RemoteIO input in iOS 6.0?
Under iOS 6.0 you can set the current AVAudioSession to AVAudioSessionModeMeasurement, like this:
[[AVAudioSession sharedInstance] setMode: AVAudioSessionModeMeasurement error:NULL];
This removes the low frequency filtering.
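For context, a fuller session setup might look something like this (the PlayAndRecord category is an assumption; pick whatever fits your app):

NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];

// Configure for simultaneous input/output, then switch to measurement mode,
// which disables Apple's input processing (including the low-frequency rolloff).
BOOL ok = [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error] &&
          [session setMode:AVAudioSessionModeMeasurement error:&error] &&
          [session setActive:YES error:&error];
if (!ok) {
    NSLog(@"Audio session setup failed: %@", error);
}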
Link:
http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVAudioSession_ClassReference/Reference/Reference.html
I hope this helps.
I'm not sure if there is any way to ever accomplish this on these devices. Most microphones have difficulty with frequencies below 200 Hz (and above the 20 kHz range as well). In fact, a lot of speakers can barely play audio in that range either. To get a clean signal below 200 Hz you would need good enough hardware, which I think is a bit beyond the capabilities of the built-in microphones of the iPhone/iPad. That's probably why Apple has filtered out these low-frequency sounds: they cannot guarantee a good enough recording, OR a good enough playback. Here's a link describing the situation better for the older devices (iPhone 4, iPhone 3GS, and iPad 1).
Apple is also very picky about what they will and won't let you play with. Even if you do find out where this filtering takes place, interrupting that code will most likely result in your app being rejected from the App Store. And due to hardware limitations, you probably wouldn't be able to achieve what you want anyway.
Hope that helps!

Record iPhone app video (without simulator)

How can I record a video of an iPhone app? I can't use the Simulator because the application is very OpenGL-heavy and uses the accelerometer/gyroscope.
You can have the iPhone output video and capture it on another device: How can I use MPTVOutWindow iPhone undocumented class?
One of the links in that answer says it doesn't work on iOS 4+, but on a project I worked on less than one month ago we used that feature from an iPhone 4 to present, so I would challenge that (unless the developer who handled that portion used another approach).
I'm not sure there is a "native" solution here, short of building video capture into your actual app.
The cleanest way of handling this, assuming your game/app has a cleanly designed input pipeline, is probably to mock the input:
Put in (debug-only) code that lets you "record" all the raw input events.
Using the device, play out the demo session to create a "recording".
Run the app in the simulator, and feed it the input "recording" you made on the device.
The simulator will run GL stuff just fine, and probably at a higher framerate than your device will.
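A rough sketch of the record/replay idea, assuming your pipeline funnels input through one handler (all names here are illustrative, not a real API):

#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h> // CACurrentMediaTime()

// Debug-only input recorder: capture timestamped events, then replay them later.
// An "event payload" is whatever dictionary your input pipeline already consumes.
@interface InputRecorder : NSObject
@property (nonatomic, strong) NSMutableArray *events;
@end

@implementation InputRecorder

- (instancetype)init {
    if ((self = [super init])) {
        _events = [NSMutableArray array];
    }
    return self;
}

- (void)record:(NSDictionary *)payload {
    [self.events addObject:@{ @"t": @(CACurrentMediaTime()), @"payload": payload }];
}

// Replays events into your input handler with their original relative timing.
- (void)replayInto:(void (^)(NSDictionary *payload))handler {
    NSTimeInterval start = [[self.events firstObject][@"t"] doubleValue];
    for (NSDictionary *event in self.events) {
        NSTimeInterval delay = [event[@"t"] doubleValue] - start;
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delay * NSEC_PER_SEC)),
                       dispatch_get_main_queue(), ^{
            handler(event[@"payload"]);
        });
    }
}

@end

Serialize the events array to disk on the device, copy it off, and feed it back into the simulator build.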

Does the iPhone support QuickTime VR?

I have a 10 MB QuickTime VR file and I was wondering if it is possible to play it on an iPod/iPhone/iPad.
I've seen multiple messages on the subject around, but nobody could give a straight answer as to whether the iPhone fully supports this format, partly supports it, or doesn't support it at all. If this format is supported, which OS version supports it?
Nope, I don't have an iPhone at my disposal to check this myself, unbelievable right?
Gilad.
It's not possible to use QTVR at all; Apple never brought it to the iPhone.
But there are some other, similar objects you can use.
Take a look at my old answer to a similar question:
How to rotate QTVR image 360 degree in iPhone?
There's an app called iPano which views QTVRs, both panoramas and objects. It's ideal for viewing them, and it lets you keep a collection of them on your phone. And it's really neat on the iPad too!

Use images from camera without taking a photo

Is it possible to analyze images on the iPhone without taking a photo?
I want to analyze some matrix codes without taking any photo. I saw this on some Nokia models and it's quite impressive: really fast!
On the iPhone, I've only seen codes analyzed after taking a snapshot (or photo), and it's slower.
Also, the iPhone camera is not as good as some other mobile cameras.
I'm referring to the iPhone 3G/3GS.
thanks,
r.
Yes.
Under iOS 4.x, you can use the new AVFoundation framework to access camera pixel buffers on the 3G and 3GS, as well as the newer iPhone 4, without taking a photo/snapshot. The newer devices have higher-resolution cameras.
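A minimal sketch of that approach (the queue label and pixel format are just example choices, and the delegate is assumed to be the enclosing class):

#import <AVFoundation/AVFoundation.h>

// Stream live frames from the camera; no photo is ever taken.
// In real code, keep a strong reference to the session (e.g., a property).
- (void)startCameraStream {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("frames", NULL)];
    [session addOutput:output];
    [session startRunning];
}

// Each frame arrives here as a pixel buffer you can scan for matrix codes.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... run your code-detection algorithm on pixelBuffer ...
}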
I don't know if it's possible, but I think you should take a look at Apple's AVCamDemo code from WWDC 2010. I think it could help you, even though I didn't read much of the code (I just compiled the Xcode project and tried it).
(Sorry, but I can't find the link again.)
You should be able to find the code in your emails if you are registered as an iPhone developer, or maybe on the developer.apple.com/iphone/ website.
By the way, don't think of doing it on the iPhone 3G (impossible, I think); the iPhone 3GS should be fast enough for this kind of app.
Good luck!