Audio/Voice Visualization - iPhone

Hey you Objective-C bods.
Does anyone know how I would go about changing (transforming) an image based on the input from the Microphone on the iPhone?
i.e. When a user speaks into the Mic, the image will pulse or skew.
[edit] Anyone have any ideas? I have (what is basically) a voice recording app; I just wanted something to change as the voice input is provided. I've seen it in a sample project, but that wasn't with a UIImage. [/edit]
Thanking you!!

Apple put together some great frameworks for this! The AVFoundation framework and CoreAudio framework will be the most useful to you.
To get audio level information, AVAudioRecorder is useful. Although it is made for recording, it also provides levels data for the microphone. This would be useful for deforming your image based on how loud the user is shouting at their phone ;)
Here is the Apple documentation for AVAudioRecorder: AVAudioRecorder Class Reference
A bit more detail:
// You will need an AVAudioRecorder object
AVAudioRecorder *myRecorderObject;
// To be able to get levels data from the microphone you need
// to enable metering for your recorder object and start recording
[myRecorderObject prepareToRecord];
myRecorderObject.meteringEnabled = YES;
[myRecorderObject record];
// Refresh the meter values, then poll the microphone for levels data
[myRecorderObject updateMeters];
float peakPower = [myRecorderObject peakPowerForChannel:0];
float averagePower = [myRecorderObject averagePowerForChannel:0];
If you want to see a great example of how an AVAudioRecorder object can be used to get levels data, check out this tutorial.
As far as deforming your image goes, that would be up to an image library. There are a lot to choose from, including some great ones from Apple, but I'm not familiar enough with them, so that might be for someone else to answer.
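That said, for a simple pulse effect you may not need an image library at all. Here is a minimal, untested sketch (the levelTimer, pulseImageView, and myRecorderObject property names are just illustrative) that polls the meter on a timer and scales a UIImageView with a Core Animation transform:
// Poll the recorder's levels ~20 times a second and pulse the image view.
- (void)startLevelMonitoring {
    self.levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.05
                                                       target:self
                                                     selector:@selector(updatePulse:)
                                                     userInfo:nil
                                                      repeats:YES];
}

- (void)updatePulse:(NSTimer *)timer {
    [self.myRecorderObject updateMeters];
    // averagePowerForChannel: returns decibels, roughly -160 dB (silence) up to 0 dB (full scale)
    float power = [self.myRecorderObject averagePowerForChannel:0];
    float level = powf(10.0f, power / 20.0f);     // convert dB to a 0..1 linear amplitude
    CGFloat scale = 1.0f + level * 0.5f;          // pulse the image up to 50% larger
    self.pulseImageView.transform = CGAffineTransformMakeScale(scale, scale);
}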
Best of luck!

You may try using gl-data-visualization-view extensible framework in order to visualize your sound levels.

Related

Recording Audio and Video using AVFoundation frame by frame

How to record audio and video using AVFoundation frame by frame in iOS4?
The AVCamDemo you mention is close to what you need to do, and you should be able to use it as a reference. These are the classes you need in order to achieve what you are trying to do (all of them are part of AVFoundation):
AVCaptureVideoDataOutput and AVCaptureAudioDataOutput - use these classes to get raw samples from the video camera and the microphone
AVAssetWriter and AVAssetWriterInput - use these classes to encode the raw samples into a file. The following sample Mac OS X project shows how to use them (the sample should work for iOS too); however, it uses an AVAssetReader for input (it re-encodes a movie file) instead of the camera and microphone... You can use the outputs mentioned above as the input in your case to write what you want.
That should be all you need in order to achieve what you want to do...
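As a rough, untested sketch of the capture side (the variable names and queue label are illustrative, and error handling is omitted), wiring up the two data outputs might look something like this:
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPreset640x480;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDevice *mic    = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

NSError *error = nil;
[session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:&error]];
[session addInput:[AVCaptureDeviceInput deviceInputWithDevice:mic error:&error]];

// Raw video and audio sample buffers are delivered to 'self' on this queue;
// 'self' must adopt AVCaptureVideoDataOutputSampleBufferDelegate and
// AVCaptureAudioDataOutputSampleBufferDelegate.
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
AVCaptureAudioDataOutput *audioOut = [[AVCaptureAudioDataOutput alloc] init];
dispatch_queue_t queue = dispatch_queue_create("capture.queue", NULL);
[videoOut setSampleBufferDelegate:self queue:queue];
[audioOut setSampleBufferDelegate:self queue:queue];

[session addOutput:videoOut];
[session addOutput:audioOut];
[session startRunning];

// In captureOutput:didOutputSampleBuffer:fromConnection: append each
// CMSampleBufferRef to the matching AVAssetWriterInput.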
Here's a link showing how to use VideoDataOutput
Hope it helps
If you are a registered developer, look at the videos from the 2011 WWDC (which you can find by searching in the developer portal). There are two sessions relating to AVFoundation. There was also some sample code from one of the WWDC sessions, which was extremely useful.

playing audio and recording video on iPhone simultaneously

I need to play audio and record video from the camera at the same time in my app.
How can I do that?
There are many ways you can do that. Have you tried anything so far? Some documents of interest:
- AVAudioPlayer
- AVCaptureSession
- AVCaptureVideoPreviewLayer
As well as a plethora of other related classes. It's impossible to be more precise without knowing the specifics of your requirements, though.
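For example, here is a hedged sketch that plays a bundled audio file with AVAudioPlayer while an AVCaptureSession records video to a file (the AVCaptureMovieFileOutput, the audio session category, and the file names are assumptions on my part, not from your question):
#import <AVFoundation/AVFoundation.h>

// Let playback and recording coexist.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];

// 1. Start audio playback.
NSURL *songURL = [[NSBundle mainBundle] URLForResource:@"backing_track" withExtension:@"m4a"];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:songURL error:nil];
[player prepareToPlay];
[player play];

// 2. Record video from the camera at the same time.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:nil]];

AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
[session addOutput:movieOutput];
[session startRunning];

NSURL *movieURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]];
// 'self' must adopt AVCaptureFileOutputRecordingDelegate.
[movieOutput startRecordingToOutputFileURL:movieURL recordingDelegate:self];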

MPMusicPlayerController Speed Adjustment

Hi,
Is there any way to adjust the speed of the song being played by MPMusicPlayerController?
I've searched everywhere but haven't found anything useful. If there is no way to do it, where can I find an example that does it with other components? Some say OpenAL, but I can't find any clear way to use the iPod library with it and change the speed of the song...
Mainly the thing I need is:
The user chooses a song from the iPod library through MPMusicPlayerController
You have 2 buttons: Slow Down & Speed Up
If the user presses "Slow Down", the speed of the song is slowed down by, let's say, 5% or so; "Speed Up" works vice versa.
I really hope someone can help me with this!
Thanks in advance!
You can do this on iOS 3.2 and later:
[musicPlayer play];
[musicPlayer setCurrentPlaybackRate:2.0];
Ref: https://developer.apple.com/library/ios/documentation/mediaplayer/reference/MPMediaPlayback_protocol/Reference/Reference.html#//apple_ref/occ/intfp/MPMediaPlayback/currentPlaybackRate
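For the two buttons, a minimal sketch (assuming musicPlayer is your MPMusicPlayerController instance and a song is already playing) could simply nudge the rate in 5% steps:
- (IBAction)slowDown:(id)sender {
    // 1.0 is normal speed; subtracting 0.05 slows playback by five percentage points.
    self.musicPlayer.currentPlaybackRate -= 0.05f;
}

- (IBAction)speedUp:(id)sender {
    self.musicPlayer.currentPlaybackRate += 0.05f;
}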
You can use AVAudioPlayer or AVPlayer for the above purpose.
They have a rate property which sets the playback speed.
[player setRate:1.25]; // 25% faster (player is an AVPlayer instance)
However, AVPlayer cannot change the speed in 5% steps, only 25% steps, and AVAudioPlayer's rate property works on iOS 5 and above.
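As a hedged AVAudioPlayer sketch (iOS 5 and later; songURL is a placeholder - it could, for example, be the asset URL read from the chosen MPMediaItem via MPMediaItemPropertyAssetURL):
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:songURL error:nil];
audioPlayer.enableRate = YES;   // must be enabled for the rate property to take effect
[audioPlayer prepareToPlay];
audioPlayer.rate = 0.95f;       // 5% slower; AVAudioPlayer supports roughly 0.5x to 2.0x
[audioPlayer play];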
If you wish to go for another alternative, then DIRAC is the best option. Try DIRAC 3 LE, which is free.
Use this link to get an idea of how to use DIRAC. For more information, let me know.
#girish_vr
No, you can't do this using only the MPMusicPlayerController.
If you don't mind the pitch going up and down proportional to the speed changes, then you can use the OpenAL resampler after converting your sound file to raw PCM sample files or buffers.
If you don't want the pitch to change, what you are looking for is DSP time pitch stretching/modification/correction techniques. OpenAL can't do this, but there are commercial DSP libraries (DIRAC may be one) available that can do this on iOS. You could also try writing your own phase vocoder, but this is non-trivial digital signal processing code.

Detecting sound on iPhone

Can anyone tell me how to detect a sound on the iPhone?
Please help... Please provide any source code or a link if possible.
You can use AVAudioRecorder
http://mobileorchard.com/tutorial-detecting-when-a-user-blows-into-the-mic/
That tutorial works for any sound, really... it goes through the steps of applying a low-pass filter to the level readings to try to detect only blowing. You can change the values to make it more or less sensitive to sound.
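A hedged sketch of the approach from that tutorial (the recorder and lowPassResults properties and the timer callback name are illustrative): record to /dev/null with metering enabled, then smooth the peak level on a timer and compare it to a threshold.
- (void)startListening {
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithFloat:44100.0],                 AVSampleRateKey,
        [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
        [NSNumber numberWithInt:1],                         AVNumberOfChannelsKey,
        [NSNumber numberWithInt:AVAudioQualityMax],         AVEncoderAudioQualityKey,
        nil];

    // Record to /dev/null: we only want the level metering, not the file.
    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:NULL];
    self.recorder.meteringEnabled = YES;
    [self.recorder record];

    [NSTimer scheduledTimerWithTimeInterval:0.03
                                     target:self
                                   selector:@selector(levelTimerCallback:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)levelTimerCallback:(NSTimer *)timer {
    [self.recorder updateMeters];
    const double alpha = 0.05;   // smoothing constant for the low-pass filter
    double peak = pow(10, [self.recorder peakPowerForChannel:0] / 20.0);   // dB -> linear
    self.lowPassResults = alpha * peak + (1.0 - alpha) * self.lowPassResults;
    if (self.lowPassResults > 0.95) {
        NSLog(@"Sound detected");
    }
}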

iPhone video capture (best method)

I am curious about the new APIs for iPhone iOS: AVCapture...
Does this include a documented way to grab a screenshot of the camera preview? The doc seems a bit confusing to me, and since it is out of NDA now, I thought I would post my question here.
Many thanks,
Brett
With AVFoundation you can grab photos from the camera session. The way it works is that you use one of the subclasses of AVCaptureOutput to get what you need. For still images you are going to want the AVCaptureStillImageOutput subclass; here is a link: AVCaptureStillImageOutput ref. Besides that, you also have AVCaptureMovieFileOutput, which is used to record a QuickTime movie from the capture session to a file, and AVCaptureVideoDataOutput, which allows you to intercept uncompressed individual frames from the capture session. There are audio outputs you can use as well... hope this helps
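For example, a hedged sketch of grabbing a still (stillImageOutput is assumed to be an AVCaptureStillImageOutput already added to a running capture session):
AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            // Convert the captured sample buffer to JPEG data, then to a UIImage.
            NSData *jpegData = [AVCaptureStillImageOutput
                jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:jpegData];
            // Use the image (save it, show it in a UIImageView, etc.).
        }
    }];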