I am currently recording video using the AVFoundation API and have specified a file URL to write to:
NSURL *fileUrl = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"]];
However, right now it's recording to a temp directory. How do I write this file to the camera roll? What's the camera roll directory?
Thanks!
After recording to the disk do the following to copy the video to Camera Roll
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outFileName)) {
    UISaveVideoAtPathToSavedPhotosAlbum(outFileName, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
}
Note that UIVideoAtPathIsCompatibleWithSavedPhotosAlbum() will succeed only if the recorded video is in a proper QuickTime format.
I found some documentation on Apple's site in the AVCamDemo sample code.
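For reference, the selector named above maps to a callback you implement yourself; a minimal sketch (the logging is just an example):

// Called by UIKit when UISaveVideoAtPathToSavedPhotosAlbum finishes.
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Could not save video to the Camera Roll: %@", error);
    } else {
        NSLog(@"Video saved to the Camera Roll: %@", videoPath);
    }
}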
Related
I have some WAV files and they work fine with MPMoviePlayer, but not with AVPlayer. For some reason I have changed my player to AVPlayer, and after the change I found that some of my audio files are not working with AVPlayer. I have used Apple's sample code for AVPlayer. Any suggestion would be appreciated.
Do other formats work? Perhaps it's a coding problem.
Anyway, if your WAV files are OK, import them into iTunes, convert them once (right-click on the file in the library), and convert the converted file back to WAV (same method). You should then have an iPhone-compatible WAV file.
If that does not work, that should be a coding problem :-)
Try:
NSString *path = [[NSBundle mainBundle] pathForResource:soundfileName ofType:fileType];
if ([[NSFileManager defaultManager] fileExistsAtPath:path] == NO) NSLog(@"No file");
AVPlayer *player = [AVPlayer playerWithURL:[NSURL fileURLWithPath:path]];
[player play];
I want to play a movie in my app and the video is stored in my iPad library.
Can anyone provide me the necessary guidance for it?
1. Get the MPMediaItem of the video you want using MPMediaQuery.
2. Get the asset URL like this:
NSURL *videoURL = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
3. Instantiate an MPMoviePlayerViewController using videoURL as the content URL.
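Putting those steps together, a minimal sketch (assuming iOS 5+ media types; DRM-protected items can return a nil asset URL, and memory management is omitted):

// Query the library for movie items, then play the first match.
MPMediaQuery *query = [[MPMediaQuery alloc] init];
[query addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:[NSNumber numberWithInteger:MPMediaTypeMovie]
                                                           forProperty:MPMediaItemPropertyMediaType]];
NSArray *items = [query items];
if ([items count] > 0) {
    MPMediaItem *mediaItem = [items objectAtIndex:0];
    NSURL *videoURL = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];   // nil for protected content
    if (videoURL) {
        MPMoviePlayerViewController *playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:videoURL];
        [self presentMoviePlayerViewControllerAnimated:playerVC];
    }
}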
If the video is stored in your App's bundle then do this to get the URL:
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"nameoffile" withExtension:@"mp4"];
See a sample of a custom view/view controller combo in Pragmatic iPad Programming; check the source code for chapter 8 (a free download from that page). They use a video provided as a file inside the project. By the way, if the video file appears red in Xcode, you'll have to remove it and re-add it; I think the project definition is a bit broken.
I have an app that plays video clips through MPMoviePlayer. These clips are in MP4 format and everything works dandy. I want to take that same clip and save it into the photo album. This works if I manually sync the video from a computer through iTunes to the phone; it appears to transcode the video file and store it in .MOV format.
However, when I try to save the video from within the app via code, I get a video format error. So my question is: how do I get my video to save in the photo album? If this is not possible with MP4, how do I transcode (in app) to .MOV?
Here is the code:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:moviePlayerController.contentURL])
{
    NSURL *clipURL = moviePlayerController.contentURL;
    [library writeVideoAtPathToSavedPhotosAlbum:clipURL completionBlock:^(NSURL *assetURL, NSError *error)
    {
        if (error)
            [ErrorAlertView showError:error];
        else
            [ErrorAlertView showErrorTitle:@"Success" message:@"Your video clip is saved"];
    }];
}
[library release];
Is your contentURL a file path URL or a web URL? For the writeVideoAtPathToSavedPhotosAlbum method, in my testing you actually need a file path URL created (for example) in this way:
NSString *pathString = [[NSBundle mainBundle] pathForResource:@"filename" ofType:@"mp4"];
NSURL *pathURL = [NSURL fileURLWithPath:pathString isDirectory:NO];
This means you need to first download the movie from the web, if you're using a web URL. I recommend ASIHTTPRequest, using the setDownloadDestinationPath: method to set the download directory.
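A rough sketch of that download step, assuming ASIHTTPRequest (a third-party library); the URL and destination path below are placeholders:

// Download the movie to a local file before saving it to the photo album.
NSURL *movieURL = [NSURL URLWithString:@"http://example.com/clip.mp4"];   // placeholder
NSString *destPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"clip.mp4"];

__block ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:movieURL];
[request setDownloadDestinationPath:destPath];
[request setCompletionBlock:^{
    // destPath is now a local file you can hand to the photo album APIs.
}];
[request setFailedBlock:^{
    NSLog(@"Download failed: %@", [request error]);
}];
[request startAsynchronous];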
In general, mp4 files should work if they're the right resolution for the right (retina, non-retina, iPad) device (see my tests of supported video files here).
If the video still gives a NO response on videoAtPathIsCompatibleWithSavedPhotosAlbum: after making absolutely sure the file path URL is correct, then you'll need to use AVAssetExportSession (with AVAssetExportPresetLowQuality, AVAssetExportPresetMediumQuality, or AVAssetExportPresetHighestQuality) to get a device-appropriate file that you can then save to the Photo Album.
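If you do end up needing the export route, here is a rough sketch; the preset, temp path, and sourceURL are placeholders, and memory management is omitted:

// Re-export the clip to a QuickTime (.mov) file the photo album will accept.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];   // sourceURL: your local file URL
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                  presetName:AVAssetExportPresetMediumQuality];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"export.mov"]];
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        // Retry videoAtPathIsCompatibleWithSavedPhotosAlbum: and writeVideoAtPathToSavedPhotosAlbum: with exporter.outputURL.
    } else {
        NSLog(@"Export failed: %@", exporter.error);
    }
}];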
I want to upload a video to a web server. I can upload the video, but the problem is how I should pick the video. I know there is a default UIImagePickerController that I can use to pick an image; is there anything similar for picking movies on the iPhone?
Hope you are getting my problem.
Thanks
Check out this SO entry. It has a similar discussion.
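In short, UIImagePickerController can pick movies too if you restrict its media types. A minimal sketch (assumes your view controller adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate; the method name pickVideo is just an example):

#import <MobileCoreServices/MobileCoreServices.h>   // for kUTTypeMovie

// Present a picker restricted to movies from the photo library.
- (void)pickVideo
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
}

// Delegate callback with the picked movie's local file URL.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    [self dismissModalViewControllerAnimated:YES];
    // Upload the file at videoURL to your server here.
}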
NSString *path = [[NSBundle mainBundle] pathForResource:@"ddd" ofType:@"avi"];
NSData *data = [NSData dataWithContentsOfFile:path];
The Assets Library is another option: you can fetch all photos and videos programmatically, but in that case you have to build your own picker view controller; a rough sketch of the enumeration follows.
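This sketch uses the Assets Library framework's group and filter constants; error handling is minimal:

// Enumerate all videos in the Saved Photos group.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group setAssetsFilter:[ALAssetsFilter allVideos]];
    [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
        if (asset) {
            NSURL *assetURL = [[asset defaultRepresentation] url];
            NSLog(@"Found video: %@", assetURL);   // feed these into your custom picker UI
        }
    }];
} failureBlock:^(NSError *error) {
    NSLog(@"Could not enumerate assets: %@", error);
}];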
Use AVFoundation to capture video and upload it using MKNetworkKit.
I'm using the following to play an m4a file:
NSString *path = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent: fileName];
SystemSoundID soundID;
NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO];
AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID);
AudioServicesPlaySystemSound(soundID);
It works fine on the simulator but I hear nothing on the device. The sound files I'm using are all in the bundle. Here is what filePath looks like from the device:
file://localhost/var/mobile/Applications/418945F3-3711-4B4D-BC65-0D78993C77FB/African%20Adventure.app/Switch%201.m4a
Is there an issue with the file path or any thing different I need to do for the device?
Just as a sidenote - I was having the exact same problem and spent probably close to an hour on converting files to the correct format, etc.. Yet the problem was the "mute" switch on the iPad. So even though the volume was up, and I could hear other sounds on the iPad, because the mute switch was turned on, it wasn't playing system sounds.
To add to the confusion, this app uses text-to-speech, and the volume of the spoken output was perfectly fine; it was only the sounds coming from AudioServicesPlaySystemSound() that weren't being played.
I had trouble with this too. Finally I realised it was because AudioServices can only play audio that meets the following constraints.
Sound files that you play using this function must be:
- No longer than 30 seconds in duration
- In linear PCM or IMA4 (IMA/ADPCM) format
- Packaged in a .caf, .aif, or .wav file
From Apple docs: http://developer.apple.com/library/ios/#documentation/AudioToolbox/Reference/SystemSoundServicesReference/Reference/reference.html
You might want to use the AVAudioPlayer instead of AudioServices.
The following code will take an audio file (.m4a) and play it once. Don't forget to release audioPlayer when you're done with it.
NSString *urlAddress = [[NSBundle mainBundle] pathForResource:@"filename" ofType:@"m4a"];
NSURL *url = [NSURL fileURLWithPath:urlAddress];
NSError *error;
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
if (audioPlayer == nil)
{
    NSLog(@"%@", [error description]);
}
else
{
    audioPlayer.numberOfLoops = 0;
    [audioPlayer play];
}
Hope this example helps you with playing audio on the actual device. It might also be a good idea to turn up the device volume while the file is playing.
Note: you will need to add the AVFoundation framework to your project if you have not already done so, as well as import the header file:
#import <AVFoundation/AVFoundation.h>
Update:
From Apple's Core Audio Overview Document
Audio Session Services
Audio Session Services lets you manage audio sessions in your application—coordinating the audio behavior in your application with background applications on an iPhone or iPod touch. Audio Session Services consists of a subset of the functions, data types, and constants declared in the AudioServices.h header file in AudioToolbox.framework.
The AVAudioPlayer Class
The AVAudioPlayer class provides a simple Objective-C interface for playing sounds. If your application does not require stereo positioning or precise synchronization, and if you are not playing audio captured from a network stream, Apple recommends that you use this class for playback. This class is declared in the AVAudioPlayer.h header file in AVFoundation.framework.
Start by error-checking your returns. Is filePath nil? Do either of the AudioServices functions return an error? The most likely cause is case sensitivity: the iPhone filesystem is case-sensitive while the Mac's is not. But the first step in debugging is to look at the errors the system is providing.
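As a sketch of that checking (the status constant and types are from AudioToolbox; fileName is assumed to be defined as in the question):

// Verify the file exists (case-sensitive on the device!) and check the OSStatus result.
NSString *path = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:fileName];
if (![[NSFileManager defaultManager] fileExistsAtPath:path]) {
    NSLog(@"No such file (check the case of the name): %@", path);
}
NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO];
SystemSoundID soundID = 0;
OSStatus status = AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID);
if (status != kAudioServicesNoError) {
    NSLog(@"AudioServicesCreateSystemSoundID failed with status %d", (int)status);
} else {
    AudioServicesPlaySystemSound(soundID);
}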
The simulator uses regular QuickTime for playback, so it's easy to have media assets which work in the sim but fail on the device due to missing or unsupported codecs. The test is whether you can play the file at all on the device, e.g. through Safari or the iPod app.