Playing a video on iPad / iPhone

I want to play a movie in my app; the video is stored in my iPad library.
Can anyone provide the necessary guidance for this?

1. Get the MPMediaItem of the video you want using an MPMediaQuery.
2. Get the asset URL like this:
NSURL *videoURL = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
3. Instantiate an MPMoviePlayerViewController using videoURL as the content URL (see the sketch below).
If the video is stored in your app's bundle instead, get the URL this way:
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"nameoffile" withExtension:@"mp4"];
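Putting those first steps together, here's a minimal sketch of playing a library video (the query below grabs an arbitrary video item just for illustration; filter it however you need, and note that videoURL can come back nil for protected content):

// Find a video item in the media library (arbitrary choice, for illustration only).
MPMediaQuery *query = [[MPMediaQuery alloc] init];
[query addFilterPredicate:[MPMediaPropertyPredicate
    predicateWithValue:[NSNumber numberWithInteger:MPMediaTypeAnyVideo]
           forProperty:MPMediaItemPropertyMediaType]];
MPMediaItem *mediaItem = [[query items] lastObject];
[query release];

// Get the asset URL and hand it to MPMoviePlayerViewController.
NSURL *videoURL = [mediaItem valueForProperty:MPMediaItemPropertyAssetURL];
if (videoURL) {
    MPMoviePlayerViewController *playerVC =
        [[MPMoviePlayerViewController alloc] initWithContentURL:videoURL];
    [self presentMoviePlayerViewControllerAnimated:playerVC];
    [playerVC release];
}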

See a sample of a custom view/view-controller combo in Pragmatic iPad Programming. Check the source code for chapter 8 (a free download from that page); they use a video provided as a file inside the project. By the way, if the video file appears red in Xcode, you'll have to remove it and re-add it; I think the project definition is a bit screwed up.

Save MP4 into iPhone photo album

I have an app that plays video clips through MPMoviePlayer. The clips are in MP4 format and everything works dandy. I want to take the same clip and save it into the photo album. This works if I manually sync the video from a computer to the phone through iTunes; that appears to transcode the video file and store it in .MOV format.
However, when I try to save the video via code while in the app, I get a video format error. So my question is: how do I get my video to save into the photo album? If this is not possible with MP4, how do I transcode (in app) to .MOV?
Here is the code:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:moviePlayerController.contentURL])
{
    NSURL *clipURL = moviePlayerController.contentURL;
    [library writeVideoAtPathToSavedPhotosAlbum:clipURL completionBlock:^(NSURL *assetURL, NSError *error)
    {
        if (error)
            [ErrorAlertView showError:error];
        else
            [ErrorAlertView showErrorTitle:@"Success" message:@"Your video clip is saved"];
    }];
}
[library release];
Is your contentURL a file path URL or a web URL? For the writeVideoAtPathToSavedPhotosAlbum method, in my testing you actually need a file path URL created (for example) in this way:
NSString *pathString = [[NSBundle mainBundle] pathForResource:@"filename" ofType:@"mp4"];
NSURL *pathURL = [NSURL fileURLWithPath:pathString isDirectory:NO];
This means you need to first download the movie from the web, if you're using a web URL. I recommend ASIHTTPRequest, using the setDownloadDestinationPath: method to set the download directory.
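For example, a rough sketch of downloading with ASIHTTPRequest (the URL, destination path, and completion handling here are placeholders, not code from the question):

// Download the movie to a local file so we have a file URL to save from.
NSString *destPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"clip.mp4"]; // hypothetical destination
__block ASIHTTPRequest *request =
    [ASIHTTPRequest requestWithURL:[NSURL URLWithString:@"http://example.com/clip.mp4"]];  // placeholder URL
[request setDownloadDestinationPath:destPath];
[request setCompletionBlock:^{
    NSURL *fileURL = [NSURL fileURLWithPath:destPath isDirectory:NO];
    // ...now run videoAtPathIsCompatibleWithSavedPhotosAlbum: / writeVideoAtPathToSavedPhotosAlbum: with fileURL...
}];
[request setFailedBlock:^{
    NSLog(@"Download failed: %@", [request error]);
}];
[request startAsynchronous];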
In general, MP4 files should work if they're the right resolution for the device in question (Retina, non-Retina, iPad); see my tests of supported video files here.
If the video still gives a NO response from videoAtPathIsCompatibleWithSavedPhotosAlbum: after you've made absolutely sure the file URL is correct, then you'll need to use AVAssetExportSession (with AVAssetExportPresetLowQuality, AVAssetExportPresetMediumQuality, or AVAssetExportPresetHighestQuality) to produce a device-appropriate file that you can then save to the photo album.
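If you do end up re-encoding, a minimal sketch with AVAssetExportSession might look like this (sourceFileURL, the output path, and the preset choice are assumptions to adapt):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceFileURL options:nil]; // sourceFileURL: your local file URL
AVAssetExportSession *exportSession =
    [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
NSString *outPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"export.mov"]; // hypothetical output path
exportSession.outputURL = [NSURL fileURLWithPath:outPath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // Now try writeVideoAtPathToSavedPhotosAlbum: with exportSession.outputURL.
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
    [exportSession release];
}];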

playing a sound doesn't work on iPhone

I've added a sound to my app.
In my .h I've added:
CFURLRef soundFileURL;
SystemSoundID soundFile;
In viewDidLoad in my .m:
soundFileURL = CFBundleCopyResourceURL(CFBundleGetMainBundle(),
                                       CFSTR("sound"),
                                       CFSTR("mp3"),
                                       NULL);
AudioServicesCreateSystemSoundID(soundFileURL, &soundFile);
And lastly I've added a -playSound method:
- (void)playSound {
    NSLog(@"playSound");
    AudioServicesPlayAlertSound(soundFile);
}
It works fine on the iPhone Simulator, but when I build the app on my iPhone, the console says the sound was played yet I hear nothing.
I've read that many others have had this problem too, but I didn't find any solutions.
What's wrong?
It could be your encoding: the audio codecs on the simulator are quite different from those provided by the actual iPhone hardware. Try re-encoding.
Also, you don't want to use an MP3 for a short sound effect; use an uncompressed format (I suggest AIFF, from experience), because there isn't dedicated decoding hardware to support the kind of sound decoding you're trying to do. (There is for playing music, which is what MP3 is recommended for.)
I would recommend AVAudioPlayer in the AVFoundation framework. It has a simple asynchronous interface for sound playback on iOS.
Check out the Apple programming guide here.
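For reference, a minimal sketch (it assumes a sound.mp3 resource in the bundle and a retained audioPlayer property, since the player has to outlive the method that starts it):

NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"sound" withExtension:@"mp3"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&error];
if (player == nil) {
    NSLog(@"Could not create player: %@", error);
} else {
    self.audioPlayer = player;   // hypothetical retained property; keeps the player alive during playback
    [player prepareToPlay];
    [player play];
    [player release];            // the property retains it (pre-ARC)
}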
I had a similar issue. It turned out I was referencing "Horn.caf" with
NSURL *hornSound = [[NSBundle mainBundle] URLForResource:@"horn" withExtension:@"caf"];
Note the difference in case; my OS X Lion install is on a case-insensitive file system.
Changing the code to the following fixed the issue.
NSURL *hornSound = [[NSBundle mainBundle] URLForResource:@"Horn" withExtension:@"caf"];
Hope that helps others.

How to upload a Video in iPhone SDK

I want to upload a video to a web server. I can upload the video, but the problem is how to pick one in the first place. I know there is a default UIImagePickerController that I can use to pick an image; is there anything similar for picking movies on the iPhone?
I hope you're getting my problem.
Thanks
Check out this SO entry. It has a similar discussion.
NSString *path = [[NSBundle mainBundle] pathForResource:@"ddd" ofType:@"avi"];
NSData *data = [NSData dataWithContentsOfFile:path];
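If you want to let the user pick a movie from the photo library (as the question asks), one common approach is UIImagePickerController restricted to movies. A minimal sketch, assuming self adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate:

#import <MobileCoreServices/MobileCoreServices.h>   // for kUTTypeMovie

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
picker.delegate = self;
[self presentModalViewController:picker animated:YES];
[picker release];

// The chosen movie's file URL arrives in the delegate callback:
// - (void)imagePickerController:(UIImagePickerController *)picker
//         didFinishPickingMediaWithInfo:(NSDictionary *)info
// {
//     NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
//     [self dismissModalViewControllerAnimated:YES];
//     // upload videoURL's contents from here
// }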
The Assets Library is another option: you can fetch all photos and videos programmatically, but in that case you have to build your own picker view controller.
Use AVFoundation to capture video and upload it using MKNetworkKit.

How do you implement seeking using the AudioStreamer sample application as a base?

I need to play some streaming audio content on the iPhone, but there are several options and problems I can't resolve:
1. http://cocoawithlove.com/2009/06/revisiting-old-post-streaming-and.html
This is the library provided by Matt Gallagher, but I saw its limited scope, and I want to play a single song, not a radio station.
2. The suggestion from zonble:
NSURL *URL = [NSURL URLWithString:@"http://zonble.net/MIDI/orz.mp3"];
NSData *data = [NSData dataWithContentsOfURL:URL];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithData:data error:nil];
But I need streaming for each single song, not a full download first.
3. Using MPMoviePlayerController as a wrapper. Since the class has gained new methods, I declared:
@interface MyAudioStreamer : UIViewController {
    MPMoviePlayerController *mp;
}
Instead of [self addSubview:mp.view] I use my own xib file to implement the play and stop methods, but in this case I have no idea how to implement seekForward or seekBackward.
Is there any good way to build this audio streamer?
==== Update ====
After googling 'AVPlayer', I went with option 3 (wrapping MPMoviePlayerController) to implement an AudioStreamer-like player. A few things to share:
1. The 'playback did finish' state: there are two conditions, either a song finishes and the next song starts, or a song is interrupted because the user pressed stop or an exception occurred. I used an enum to tell the two apart.
2. Multitasking (keeping the song playing in the background): see https://devforums.apple.com/message/264397. If the iOS SDK is updated, the solution might change because methods can be deprecated, so I also suggest reading the library provided by Matt Gallagher.
Does anyone know what happens to the player when the library has no codec matching the downloaded item (for example, my item is encoded as 128 kbps AAC and the library supports at most 64 kbps AAC)?
You can either parse the ICY metadata yourself, or, if you're targeting iOS 4.0, you can play back using an AVPlayer and observe its timed metadata to discover the song boundaries and titles.
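As for seekForward / seekBackward in the MPMoviePlayerController wrapper: the player adopts the MPMediaPlayback protocol, so one simple approach (a sketch; the 10-second offset is arbitrary) is to nudge currentPlaybackTime:

// Skip forward/backward by a fixed offset (10 seconds, chosen arbitrarily).
- (IBAction)seekForward:(id)sender {
    NSTimeInterval target = mp.currentPlaybackTime + 10.0;
    if (mp.duration > 0 && target > mp.duration)
        target = mp.duration;
    mp.currentPlaybackTime = target;
}

- (IBAction)seekBackward:(id)sender {
    NSTimeInterval target = mp.currentPlaybackTime - 10.0;
    mp.currentPlaybackTime = MAX(target, 0.0);
}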

Sound on simulator but not device

I'm using the following to play an m4a file:
NSString *path = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent: fileName];
SystemSoundID soundID;
NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO];
AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID);
AudioServicesPlaySystemSound(soundID);
It works fine on the simulator but I hear nothing on the device. The sound files I'm using all live in the bundle. Here is what filePath looks like on the device:
file://localhost/var/mobile/Applications/418945F3-3711-4B4D-BC65-0D78993C77FB/African%20Adventure.app/Switch%201.m4a
Is there an issue with the file path, or anything different I need to do for the device?
Just as a side note: I was having the exact same problem and spent probably close to an hour converting files to the correct format, etc. Yet the problem was the mute switch on the iPad. So even though the volume was up, and I could hear other sounds on the iPad, system sounds weren't playing because the mute switch was on.
To add to the confusion, this app uses text-to-speech, and the volume coming from the dictation was perfectly fine; it was only the sounds coming from AudioServicesPlaySystemSound() that weren't being played.
I had trouble with this too. Finally I realised it was because AudioServices can only play audio within the following constraints.
Sound files that you play using this
function must be:
- No longer than 30 seconds in duration
- In linear PCM or IMA4 (IMA/ADPCM) format
- Packaged in a .caf, .aif, or .wav file
From Apple docs: http://developer.apple.com/library/ios/#documentation/AudioToolbox/Reference/SystemSoundServicesReference/Reference/reference.html
You might want to use the AVAudioPlayer instead of AudioServices.
The following code will take an audio file (.m4a) and play the audio file 1 time. Don't forget to release "audioPlayer" when you're done with it.
NSString *urlAddress = [[NSBundle mainBundle] pathForResource:@"filename" ofType:@"m4a"];
NSURL *url = [NSURL fileURLWithPath:urlAddress];
NSError *error = nil;
AVAudioPlayer *audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
audioPlayer.numberOfLoops = 0;
if (audioPlayer == nil)
{
    NSLog(@"%@", [error description]);
}
else
{
    [audioPlayer play];
}
Hope this example helps you play audio on the actual device. It might also be a good idea to increase the device volume while the file is playing.
Note: You will need to add the AVFoundation framework to your project if you have not already done so, as well as import its header file:
#import <AVFoundation/AVFoundation.h>
Update:
From Apple's Core Audio Overview Document
Audio Session Services
Audio Session Services lets you manage audio sessions in your application—coordinating the audio behavior in your application with background applications on an iPhone or iPod touch. Audio Session Services consists of a subset of the functions, data types, and constants declared in the AudioServices.h header file in AudioToolbox.framework.
The AVAudioPlayer Class
The AVAudioPlayer class provides a simple Objective-C interface for playing sounds. If your application does not require stereo positioning or precise synchronization, and if you are not playing audio captured from a network stream, Apple recommends that you use this class for playback. This class is declared in the AVAudioPlayer.h header file in AVFoundation.framework.
Start by error-checking your returns. Is filePath nil? Do either of the AudioServices functions return an error? The most likely cause is case sensitivity: the iPhone file system is case-sensitive while the Mac's is not. But the first step in debugging is to look at the errors the system is providing.
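For example, a quick check (a sketch only) of both the URL and the OSStatus results:

NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO];
NSLog(@"filePath = %@", filePath);   // check this isn't nil and points where you expect

SystemSoundID soundID = 0;
OSStatus createStatus = AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID);
NSLog(@"AudioServicesCreateSystemSoundID returned %ld", (long)createStatus); // 0 (kAudioServicesNoError) on success
if (createStatus == kAudioServicesNoError) {
    AudioServicesPlaySystemSound(soundID);
}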
The simulator uses regular QuickTime for playback, so it's easy to have media assets that work in the simulator but fail on the device due to missing or unsupported codecs. A quick test is whether you can play the file at all on the device, e.g. through Safari or the iPod app.