Reference link:
http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios
I'm selecting the video and storing it using the delegate method below (didFinishPickingMediaWithInfo):
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // 1 - Get media type
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    // 2 - Dismiss image picker
    [self dismissModalViewControllerAnimated:NO];
    // Handle a movie capture
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        // 3 - Play the video
        MPMoviePlayerViewController *theMovie = [[MPMoviePlayerViewController alloc]
            initWithContentURL:[info objectForKey:UIImagePickerControllerMediaURL]];
        [self presentMoviePlayerViewControllerAnimated:theMovie];
        // 4 - Register for the playback finished notification (posted by the embedded MPMoviePlayerController)
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(myMovieFinishedCallback:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:theMovie.moviePlayer];
        // 5 - Save the trimmed movie to the Saved Photos album if it is compatible
        NSString *moviePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
            UISaveVideoAtPathToSavedPhotosAlbum(moviePath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
    }
}
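The two callbacks registered above are just minimal stubs on my side, roughly:
- (void)myMovieFinishedCallback:(NSNotification *)notification {
    // Stop observing and dismiss the movie player once playback ends.
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMoviePlayerPlaybackDidFinishNotification
                                                  object:notification.object];
    [self dismissMoviePlayerViewControllerAnimated];
}

- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Finished saving video with error: %@", error);
}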
I want to know how the video is trimmed and stored locally on the device. The file is stored at the path below; I'm getting these values from the info dictionary's media URL.
Path ::: {
UIImagePickerControllerMediaType = "public.movie";
UIImagePickerControllerMediaURL = "file://localhost/private/var/mobile/Applications/3175A4BD-F24F-4745-B7AE-FBA4B9EBE90D/tmp//trim.zuyfI0.MOV";
UIImagePickerControllerReferenceURL = "assets-library://asset/asset.MOV?id=740204CB-EE25-4A83-992A-F46115FC5B9F&ext=MOV";
}
Before storing the video I would like to change the currentPlaybackRate. Can anybody provide some idea of how to implement this?
You can do theMovie.moviePlayer.currentPlaybackRate = /* your value */; if you just want to change the playback speed during playback (currentPlaybackRate lives on the embedded MPMoviePlayerController, not on the view controller). If you want to actually save the video with a different time scale, then you'll have to dive into AV Foundation.
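For the second case, here is a minimal sketch of the AV Foundation route (sourceURL, outputURL and the 0.5 factor are placeholders you would fill in; it needs #import <AVFoundation/AVFoundation.h>): copy the trimmed movie into a mutable composition, scale its time range, then export it. Note that this scales audio and video together.
// Assumption: sourceURL is the trimmed movie from the picker, outputURL is where the result should go.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
AVMutableComposition *composition = [AVMutableComposition composition];
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                     ofAsset:asset
                      atTime:kCMTimeZero
                       error:nil];
// Halving the duration plays the material back at roughly 2x speed.
CMTime newDuration = CMTimeMultiplyByFloat64(asset.duration, 0.5);
[composition scaleTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) toDuration:newDuration];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                       presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // outputURL.path can now be saved with UISaveVideoAtPathToSavedPhotosAlbum as above.
    }
}];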
I'm trying to get some data about a video I chose in the UIImagePicker.
When the UIImagePickerController delegate method (below) is called, I understand I need to use the UIImagePickerControllerReferenceURL key from the info dictionary.
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
    NSLog(@"%@", url);
    [library addAssetURL:[info objectForKey:UIImagePickerControllerReferenceURL] toAlbum:@"Compedia" withCompletionBlock:^(NSError *error) {
        if (error != nil) {
            NSLog(@"Big error: %@", [error description]);
        }
    }];
}
The problem is that the URL comes back nil.
I printed the description of info and it looks like this:
Printing description of info:
{
UIImagePickerControllerMediaType = "public.movie";
UIImagePickerControllerMediaURL = "file://localhost/private/var/mobile/Applications/6630FBD3-1212-4ED0-BC3B-0C23AEEFB267/tmp/capture-T0x1f57e880.tmp.Ulfn5o/capturedvideo.MOV";
}
After doing some research I found out that if I configure the picker with kUTTypeMovie it should give me both the media URL and the reference URL.
This is how I set up my camera:
cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraUI.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
cameraUI.allowsEditing = YES;
cameraUI.delegate = delegate;
cameraUI.showsCameraControls = NO;
cameraUI.cameraOverlayView = [self getCustomToolBar];
Is there something I'm missing?
Thanks,
Nearly went insane on this one. Everywhere I looked, everyone seemed convinced that UIImagePickerControllerReferenceURL was always there.
Anyway, the issue is that if your delegate is called after you have taken a picture with the camera, then the UIImagePickerControllerReferenceURL object will not be there: the image has not been saved to the camera roll yet, so there is no reference URL. It is only there when the delegate is called after selecting an image from the camera roll.
So my way around this is to use ALAssetsLibrary to save the image to the camera roll and then read back the URL.
Here's my solution:
First, check the picker's sourceType to see whether it was the camera or the photo library/saved photos album. If it was the camera, use the asset library's writeImageToSavedPhotosAlbum:orientation:completionBlock: to write the image to the camera roll; it hands back the image's URL in the completion block.
If it was not the camera, then it was either the photo library or the saved photos album, and in both cases [info objectForKey:UIImagePickerControllerReferenceURL] returns a valid URL:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Dismiss the image picker view
    [self dismissViewControllerAnimated:YES completion:nil];
    if ([picker sourceType] == UIImagePickerControllerSourceTypeCamera)
    {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        ALAssetsLibrary *library = [Utils defaultAssetsLibrary];
        [library writeImageToSavedPhotosAlbum:image.CGImage
                                  orientation:(ALAssetOrientation)image.imageOrientation
                              completionBlock:^(NSURL *assetURL, NSError *error)
        {
            // Here is your URL: assetURL
        }];
    }
    else
    {
        // Else this is valid: [info objectForKey:UIImagePickerControllerReferenceURL]
    }
}
Hope that helps.
ALAssetsLibrary is deprecated; use the Photos framework (e.g. PHFetchOptions) instead.
Refer to the answer below:
https://stackoverflow.com/a/44328085/5315917
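For reference, a minimal sketch of the Photos-framework equivalent of the ALAssetsLibrary approach above (iOS 8+, #import <Photos/Photos.h>; image is assumed to be the picked UIImage, and photo library authorization is assumed to have been granted): save the image, remember the placeholder's local identifier, then fetch the PHAsset back instead of using a reference URL.
__block NSString *localIdentifier = nil;
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Create the asset and keep the identifier of the placeholder it returns.
    PHAssetChangeRequest *request = [PHAssetChangeRequest creationRequestForAssetFromImage:image];
    localIdentifier = request.placeholderForCreatedAsset.localIdentifier;
} completionHandler:^(BOOL success, NSError *error) {
    if (success) {
        // Fetch the saved asset by its local identifier; the PHAsset replaces the old reference URL.
        PHFetchResult *result = [PHAsset fetchAssetsWithLocalIdentifiers:@[localIdentifier] options:nil];
        PHAsset *savedAsset = result.firstObject;
        NSLog(@"Saved asset: %@", savedAsset);
    } else {
        NSLog(@"Error saving image: %@", error);
    }
}];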
I have an app for iPhone and iPad that plays an audio stream using AVPlayer. I am using the same player as the Apple StitchedStreamPlayer sample, but I made some changes to play music instead of video.
When I run the app, I can listen for a few seconds, then the device restarts and the following error is displayed:
Terminating in response to SpringBoard's termination.
(When I run it from Xcode on the device it plays for some minutes, but when I unplug the device and run the app again, the app crashes.)
I am using an iPhone 4 and an iPad mini for testing; neither is jailbroken and both run iOS 6.
The code is quite big, but here are some parts:
Header:
@interface NewPlayer : NSObject <AVAudioSessionDelegate>
@property (strong) AVPlayer *player;
@property (strong) AVPlayerItem *playerItem;
Some important methods from the implementation:
-(void)play:(NSString *)audio
{
/* Has the user entered an audio URL? */
NSURL *audioUrl = [NSURL URLWithString:audio];
if ([audioUrl scheme]) /* Sanity check on the URL. */
{
/*
Create an asset for inspection of a resource referenced by a given URL.
Load the values for the asset keys "tracks", "playable".
*/
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:audioUrl options:nil];
NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];
/* Tells the asset to load the values of any of the specified keys that are not already loaded. */
[asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:
^{
dispatch_async( dispatch_get_main_queue(),
^{
/* IMPORTANT: Must dispatch to main queue in order to operate on the AVPlayer and AVPlayerItem. */
[self prepareToPlayAsset:asset withKeys:requestedKeys];
});
}];
}
}
- (void)prepareToPlayAsset:(AVURLAsset *)asset withKeys:(NSArray *)requestedKeys
{
/* Make sure that the value of each key has loaded successfully. */
for (NSString *thisKey in requestedKeys)
{
NSError *error = nil;
AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
if (keyStatus == AVKeyValueStatusFailed)
{
[self assetFailedToPrepareForPlayback:error];
return;
}
/* If you are also implementing the use of -[AVAsset cancelLoading], add your code here to bail
out properly in the case of cancellation. */
}
/* Use the AVAsset playable property to detect whether the asset can be played. */
if (!asset.playable)
{
/* Generate an error describing the failure. */
NSString *localizedDescription = NSLocalizedString(@"Item cannot be played", @"Item cannot be played description");
NSString *localizedFailureReason = NSLocalizedString(@"The assets tracks were loaded, but could not be made playable.", @"Item cannot be played failure reason");
NSDictionary *errorDict = [NSDictionary dictionaryWithObjectsAndKeys:
                           localizedDescription, NSLocalizedDescriptionKey,
                           localizedFailureReason, NSLocalizedFailureReasonErrorKey,
                           nil];
NSError *assetCannotBePlayedError = [NSError errorWithDomain:@"StitchedStreamPlayer" code:0 userInfo:errorDict];
/* Display the error to the user. */
[self assetFailedToPrepareForPlayback:assetCannotBePlayedError];
return;
}
/* At this point we're ready to set up for playback of the asset. */
/* Stop observing our prior AVPlayerItem, if we have one. */
if (self.playerItem)
{
/* Remove existing player item key value observers and notifications. */
[self.playerItem removeObserver:self forKeyPath:kStatusKey];
[[NSNotificationCenter defaultCenter] removeObserver:self
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.playerItem];
}
/* Create a new instance of AVPlayerItem from the now successfully loaded AVAsset. */
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
/* Observe the player item "status" key to determine when it is ready to play. */
[self.playerItem addObserver:self
forKeyPath:kStatusKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:MyStreamingAudioViewControllerPlayerItemStatusObserverContext];
/* When the player item has played to its end time we'll toggle
the movie controller Pause button to be the Play button */
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.playerItem];
/* Create new player, if we don't already have one. */
if (![self player])
{
/* Get a new AVPlayer initialized to play the specified player item. */
[self setPlayer:[AVPlayer playerWithPlayerItem:self.playerItem]];
/* Observe the AVPlayer "currentItem" property to find out when any
AVPlayer replaceCurrentItemWithPlayerItem: replacement will/did
occur.*/
[self.player addObserver:self
forKeyPath:kCurrentItemKey
options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
context:MyStreamingAudioViewControllerCurrentItemObservationContext];
}
/* Make our new AVPlayerItem the AVPlayer's current item. */
if (self.player.currentItem != self.playerItem)
{
/* Replace the player item with a new player item. The item replacement occurs
asynchronously; observe the currentItem property to find out when the
replacement will/did occur*/
[[self player] replaceCurrentItemWithPlayerItem:self.playerItem];
[self syncPlayPauseButtons];
}
}
- (void)observeValueForKeyPath:(NSString*) path
ofObject:(id)object
change:(NSDictionary*)change
context:(void*)context
{
/* AVPlayerItem "status" property value observer. */
if (context == MyStreamingAudioViewControllerPlayerItemStatusObserverContext)
{
[self syncPlayPauseButtons];
AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
switch (status)
{
/* Indicates that the status of the player is not yet known because
it has not tried to load new media resources for playback */
case AVPlayerStatusUnknown:
{
NSLog(#"desconhecido");
}
break;
case AVPlayerStatusReadyToPlay:
{
/* Once the AVPlayerItem becomes ready to play, i.e.
[playerItem status] == AVPlayerItemStatusReadyToPlay,
its duration can be fetched from the item. */
NSLog(#"ready to play");
[player play];
[self.delegate tocandoMusica];
}
break;
case AVPlayerStatusFailed:
{
AVPlayerItem *thePlayerItem = (AVPlayerItem *)object;
[self assetFailedToPrepareForPlayback:thePlayerItem.error];
NSLog(#"falhou");
[self.delegate acabouMusica];
}
break;
}
}
/* AVPlayer "rate" property value observer. */
else if (context == MyStreamingAudioViewControllerRateObservationContext)
{
//[self syncPlayPauseButtons];
}
/* AVPlayer "currentItem" property observer.
Called when the AVPlayer replaceCurrentItemWithPlayerItem:
replacement will/did occur. */
else if (context == MyStreamingAudioViewControllerCurrentItemObservationContext)
{
AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
/* New player item null? */
if (newPlayerItem == (id)[NSNull null])
{
//[self disablePlayerButtons];
//[self disableScrubber];
}
else /* Replacement of player currentItem has occurred */
{
/* Specifies that the player should preserve the video’s aspect ratio and
fit the video within the layer’s bounds. */
[self syncPlayPauseButtons];
}
}
/* Observe the AVPlayer "currentItem.timedMetadata" property to parse the media stream
timed metadata. */
else if (context == MyStreamingAudioViewControllerTimedMetadataObserverContext)
{
//NSArray* array = [[player currentItem] timedMetadata];
//for (AVMetadataItem *metadataItem in array)
//{
//}
}
else
{
[super observeValueForKeyPath:path ofObject:object change:change context:context];
}
return;
}
If you want to take a deeper look, just look at the StitchedStreamPlayer sample. I have no idea what is wrong. I have looked at:
Failed to play audio file using AVPlayer in iPhone
memory leak in AudioToolbox library AVAudioPlayer
AudioToolBox leak in iOS6?
and many others.
I have also tried dropping all of this implementation and using just:
player = [AVPlayer playerWithURL:[NSURL URLWithString:url]];
[player play];
but it crashes!
Any ideas?
EDIT
I have tried MPMoviePlayerController but the same thing happened: the music started and then the device restarted.
This is the code I used:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:NULL];
player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:[[arrRadios objectAtIndex:indexPath.row] objectForKey:@"url"]]];
[player play];
I'm writing a test app where I want to receive a notification when the iPod player changes the now-playing item (song). The test works fine while the app is in the foreground, but as soon as the app goes to the background it stops getting notifications, which is OK. When I tap the app again (it comes to the foreground) I get all the notifications for the times the now-playing item changed while the app was in the background, but every one of them reports the same song information. How can I get the correct song information for each notification?
This is the test I did, in the AppDelegate:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];
[notificationCenter addObserver:self
selector:@selector(nowPlayingItemChanged:)
name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
object:player];
[player beginGeneratingPlaybackNotifications];
return YES;
}
-(void) nowPlayingItemChanged:(NSNotification *)notification {
MPMusicPlayerController *player = (MPMusicPlayerController *)notification.object;
MPMediaItem *song = [player nowPlayingItem];
if (song) {
NSString *title = [song valueForProperty:MPMediaItemPropertyTitle];
NSString *album = [song valueForProperty:MPMediaItemPropertyAlbumTitle];
NSString *artist = [song valueForProperty:MPMediaItemPropertyArtist];
NSNumber *playCount = [song valueForProperty:MPMediaItemPropertyPlayCount];
NSLog(@"title: %@", title);
NSLog(@"album: %@", album);
NSLog(@"artist: %@", artist);
NSLog(@"playCount: %@", playCount);
}
}
See this post; your options in the background are pretty restricted:
StackOverFlow Post
And the Apple documentation on background execution states that it is not really possible:
Apple Documentation on Background states
Be sure to remove the observer when going into the background:
[[NSNotificationCenter defaultCenter] removeObserver:self name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification object:player];
[player endGeneratingPlaybackNotifications];
Add it again when entering the foreground.
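For example, a minimal sketch of how that could look in the app delegate, assuming the MPMusicPlayerController from didFinishLaunchingWithOptions: is kept in a property named player:
- (void)applicationDidEnterBackground:(UIApplication *)application {
    // Stop listening while backgrounded; the notifications are not delivered reliably there anyway.
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
                                                  object:self.player];
    [self.player endGeneratingPlaybackNotifications];
}

- (void)applicationWillEnterForeground:(UIApplication *)application {
    // Re-register, then re-read the now-playing item so the UI reflects the current song.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(nowPlayingItemChanged:)
                                                 name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
                                               object:self.player];
    [self.player beginGeneratingPlaybackNotifications];
    [self nowPlayingItemChanged:[NSNotification notificationWithName:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
                                                               object:self.player]];
}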
I have a video playing app which displays nothing in the AVPlayerLayer after repeatedly presenting and hiding the modal view which contains it. If I dismiss the modal view when this happens, the next load usually displays fine (?!!). The black screen issue happens roughly 20% of the time.
I build an AVMutableComposition to make the AVPlayerItem, but this bug happens even if there's only a single sample involved.
The issue can also be reproduced with a lot of app switching and turning music on and off. I do include music controls in my app (along with a simple view which displays the currently playing iTunes track).
This only happens on iOS 4. It used to happen on iOS 5 as well, but when I started recycling the view which contains the AVPlayerLayer, it worked fine. The only things I don't recycle are the AVPlayer and the relevant AVPlayerItem.
Here's how I load the assets and build a player:
- (void)loadAssetsFromFiles:(id)sender {
NSLog(#"loadAssetsFromFiles called");
assert ([assetURL2clipID count] > 0);
self.isWaitingToLoadAssets = NO;
composition = [AVMutableComposition new];
videoComposition = [AVMutableVideoComposition new];
[self releaseAssets];
//We're going to add this asset to a composition, so we'll need to have random access available
//WARNING: This can cause slow initial loading, so consider loading files later and as needed.
NSDictionary *assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
//iterate through the asset urls we know we need to load
for (NSURL *fileURL in [assetURL2clipID allKeys])
{
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:assetOptions];
assert(asset);
//index assets by clipID
[assets setObject:asset forKey:[assetURL2clipID objectForKey:fileURL]];
NSString *tracksKey = @"tracks";
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
^{
NSLog(#"an asset completed loading values for keys.");
NSLog(#"Tracks loaded:");
[asset.tracks enumerateObjectsUsingBlock:
^(AVAssetTrack *obj, NSUInteger index, BOOL *stop)
{
NSLog(#"\n mediaType: %#\n trackID: %d\n", obj.mediaType, obj.trackID);
}];
NSArray *metadata = [asset commonMetadata];
for ( AVMetadataItem* item in metadata ) {
NSString *key = [item commonKey];
NSString *value = [item stringValue];
NSLog(#" metadata key = %#, value = %#", key, value);
}
if (!viewIsActive)
{
NSLog(#"An asset finished loading while the player view was inactive! Did you make sure cancelLoading called on this asset?");
}
// Completion handler block.
NSError *error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
if (status == AVKeyValueStatusLoaded && error == nil) {
//if we've loaded all of our assets, it's time to build the composition and prepare the player!
loadedAssets++;
if (loadedAssets == [assets count])
{
CGSize videoSize = [asset naturalSize];
//every video composition needs these set
videoComposition.renderSize = videoSize;
videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps. TODO: Set this to the framerate of one of the assets
//using the assets we've already got
[self buildCompositions];
self.playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.playerItem.videoComposition = videoComposition;
//TODO: Adding observer stuff should be on the main thread to prevent a partial notification from happening
[playerItem addObserver:self forKeyPath:@"status"
options:0 context:&ItemStatusContext];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:playerItem];
self.player = [AVPlayer playerWithPlayerItem:playerItem];
[playerView setPlayer:player];
[self.player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
self.isObservingPlayerStatus = YES;
}
} else if (error != nil) {
// Deal with the error appropriately.
NSLog(#"WARNING: An asset's tracks were not loaded, so the composition cannot be completed. Error:\n%#\nstatus of asset: %d", [error localizedDescription], status);
}
else
{
//There was no error but we don't know what the problem was.
NSLog(#"WARNING: An asset's tracks were not loaded, so the composition cannot be completed. No error was reported.\nstatus of asset: %d", status);
}
}];
}
}
That [self buildCompositions] call builds an AVVideoComposition to do opacity ramps, but I tried bypassing it and got the same problem.
When profiling the program, CoreAnimation reports a framerate of ~45 FPS when everything is working correctly, and 0-4 FPS when the blank screen rears its presumably ugly head.
This guy seems to have had a similar problem, but for me recycling the views really only fixed things for iOS 5:
Playing many different videos on iphone using AVPlayer
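For context, the player view I recycle follows the standard AVPlayerLayer-backed pattern, roughly like this (a sketch of the usual setup, not necessarily my exact class):
@interface PlayerView : UIView
@property (nonatomic, strong) AVPlayer *player;
@end

@implementation PlayerView
// Back the view with an AVPlayerLayer so the AVPlayer renders directly into it.
+ (Class)layerClass {
    return [AVPlayerLayer class];
}
- (AVPlayer *)player {
    return [(AVPlayerLayer *)self.layer player];
}
- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)self.layer setPlayer:player];
}
@end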
I used the following code to record video.
UIImagePickerController *m_objpicker = [[UIImagePickerController alloc] init];
m_objpicker.sourceType = UIImagePickerControllerSourceTypeCamera;
m_objpicker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
// hide the camera controls
//picker.showsCameraControls=NO;
m_objpicker.delegate = self;
//picker.allowsImageEditing = NO;
m_objpicker.allowsEditing=NO;
// and put our overlay view in
//picker.cameraOverlayView=m_objOverlayView;
[self presentModalViewController:m_objpicker animated:YES];
When we finish recording:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{
NSURL *m_objMediaURL=[info objectForKey:UIImagePickerControllerMediaURL];
[m_objpicker dismissModalViewControllerAnimated:YES];
}
My question is: how do I save the captured video to a location I specify? Also, how do I use
UISaveVideoAtPathToSavedPhotosAlbum?
What do I need to change in my code so that I can save the video to a specified location?
Thanks,
If you would like to save to the Camera Roll photos/videos album on the phone:
Definition:
void UISaveVideoAtPathToSavedPhotosAlbum (
NSString *videoPath,
id completionTarget,
SEL completionSelector,
void *contextInfo
);
Where and how to execute it:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
.... code here .....
NSString *m_objMediaPath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
//remember to test that the video is compatible for saving to the photos album
UISaveVideoAtPathToSavedPhotosAlbum(m_objMediaPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
.... code here .....
}
There is no control over the location beyond saving to the Camera Roll, unless you save the file into your app's own sandbox (for example its Documents directory); I do not recommend trying to write into the application bundle itself.
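If saving into the app's sandbox is what you want, a minimal sketch (the destination file name is just an example) is to copy the temporary file the picker hands you into the Documents directory, inside didFinishPickingMediaWithInfo:
NSURL *mediaURL = [info objectForKey:UIImagePickerControllerMediaURL];
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *destinationPath = [documentsDir stringByAppendingPathComponent:@"myVideo.mov"]; // example file name
NSError *copyError = nil;
if (![[NSFileManager defaultManager] copyItemAtPath:[mediaURL path] toPath:destinationPath error:&copyError]) {
    NSLog(@"Could not copy video: %@", copyError);
}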
You can save video to a location that you specify; please check the following link on how to capture video on iPhone.
To save the video to the Photos album:
-(void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *tempFilePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
    if (_newMedia) {
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(tempFilePath)) {
            // Copy it to the camera roll.
            UISaveVideoAtPathToSavedPhotosAlbum(tempFilePath, self, @selector(video:didFinishSavingWithError:contextInfo:), (__bridge void *)(tempFilePath));
        }
    }
}

-(void)video:(NSString *)videoPath
didFinishSavingWithError:(NSError *)error
 contextInfo:(void *)contextInfo {
    NSLog(@"Finished saving video with error: %@", error);
}