Fast batch saving of photos to camera roll - iPhone

As we all know, saving a photo, be it a UIImage or raw NSData, to iOS' Camera Roll takes some amount of time. But I have gone even further: not only do I save the photo to the Camera Roll, I then move it to the designated ALAssetsGroup:
[assetsLibrary writeImageToSavedPhotosAlbum:img
                                orientation:(ALAssetOrientation)image.imageOrientation
                            completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error.code == 0) {
        NSLog(@"saved image completed:\nurl: %@", assetURL);
        // try to get the asset
        [assetsLibrary assetForURL:assetURL resultBlock:^(ALAsset *asset) {
            // assign the photo to the album
            [[groups objectAtIndex:page] addAsset:asset];
        } failureBlock:nil];
    }
}];
I'm calling this method on a background thread, but it still takes a considerable amount of time to complete - about 3-4 seconds.
What I want is to get as close as possible to the default iOS camera's "batch-shooting" behavior - the image is saved immediately after it has been taken, then the next one, and the next one, and so on.
How should I approach this? Using NSOperationQueue might be an option, but will it be considerably faster?
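For reference, here is a rough sketch of the approach I'm considering - pushing each save onto a serial queue so the shutter isn't blocked (the queue label and method name are just placeholders):
// Rough sketch only: serialize the saves so capture isn't blocked.
// Assumes the same assetsLibrary and groups objects as above.
- (void)queueSaveOfImage:(UIImage *)image toAlbumAtPage:(NSUInteger)page
{
    static dispatch_queue_t saveQueue;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        saveQueue = dispatch_queue_create("com.example.photosave", DISPATCH_QUEUE_SERIAL);
    });
    dispatch_async(saveQueue, ^{
        [assetsLibrary writeImageToSavedPhotosAlbum:image.CGImage
                                        orientation:(ALAssetOrientation)image.imageOrientation
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error == nil) {
                // Defer the album assignment; the next shot can start saving immediately.
                [assetsLibrary assetForURL:assetURL resultBlock:^(ALAsset *asset) {
                    [[groups objectAtIndex:page] addAsset:asset];
                } failureBlock:nil];
            }
        }];
    });
}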
Any solutions that work on iOS 6 and up are highly appreciated.

Related

Hardware accelerated h.264 decoding to texture, overlay or similar in iOS

Is it possible, and supported, to use the iOS hardware accelerated h.264 decoding API to decode a local (not streamed) video file, and then compose other objects on top of it?
I would like to make an application that involves drawing graphical objects in front of a video, and using the playback timer to synchronize what I am drawing on top with what is being played in the video. Then, based on the user's actions, change what I am drawing on top (but not the video).
Coming from DirectX, OpenGL and OpenGL ES for Android, I am picturing something like rendering the video to a texture, and using that texture to draw a full screen quad, then use other sprites to draw the rest of the objects; or maybe writing an intermediate filter just before the renderer, so I can manipulate the individual output frames and draw my stuff; or maybe drawing to a 2D layer on top of the video.
It seems like AV Foundation, or Core Media may help me do what I am doing, but before I dig into the details, I would like to know if it is possible at all to do what I want to do, and what are my main routes to approach the problem.
Please refrain from "this is too advanced for you, try hello world first" answers. I know my stuff, and just want to know if what I want to do is possible (and, most importantly, supported, so the app won't eventually get rejected), before I study the details by myself.
edit:
I am not knowledgeable in iOS development, but professionally do DirectX, OpenGL and OpenGL ES for Android. I am considering making an iOS version of an Android application I currently have, and I just want to know if this is possible. If so, I have enough time to start iOS development from scratch, up to doing what I want to do. If it is not possible, then I will just not invest time studying the entire platform at this time.
Therefore, this is a technical feasibility question. I am not requesting code. I am looking for answers of the type "Yes, you can do that. Just use A and B, use C to render into D and draw your stuff with E", or "No, you can't. The hardware accelerated decoding is not available for third-party applications" (which is what a friend told me). Just this, and I'll be on my way.
I have read the overview of the video technologies on page 32 of the iOS Technology Overview. It pretty much says that I can use Media Player for the simplest playback functionality (not what I'm looking for), UIKit for embedding videos with a little more control over the embedding, but not over the actual playback (not what I'm looking for), AVFoundation for more control over playback (maybe what I need, but most of the resources I find online talk about how to use the camera), or Core Media for full low-level control over video (probably what I need, but extremely poorly documented, and even more lacking in resources on playback than AVFoundation).
I am concerned that I may dedicate the next six months to learn iOS programming full time, only to find at the end that the relevant API is not available for third party developers, and what I want to do is unacceptable for iTunes store deployment. This is what my friend told me, but I can't seem to find anything relevant in the app development guidelines. Therefore, I came here to ask people who have more experience in this area, whether or not what I want to do is possible. No more.
I consider this a valid high-level question, which can be misunderstood as an I-didn't-do-my-homework-plz-give-me-teh-codez question. If my judgement here was mistaken, feel free to delete or downvote this question to your heart's content.
Yes, you can do this, and I think your question was specific enough to belong here. You're not the only one who has wanted to do this, and it does take a little digging to figure out what you can and can't do.
AV Foundation lets you do hardware-accelerated decoding of H.264 videos using an AVAssetReader, at which point you're handed the raw decoded frames of video in BGRA format. These can be uploaded to a texture using either glTexImage2D() or the more efficient texture caches in iOS 5.0. From there, you can process for display or retrieve the frames from OpenGL ES and use an AVAssetWriter to perform hardware-accelerated H.264 encoding of the result. All of this uses public APIs, so at no point do you get anywhere near something that would lead to a rejection from the App Store.
However, you don't have to roll your own implementation of this. My BSD-licensed open source framework GPUImage encapsulates these operations and handles all of this for you. You create a GPUImageMovie instance for your input H.264 movie, attach filters onto it (such as overlay blends or chroma keying operations), and then attach these filters to a GPUImageView for display and/or a GPUImageMovieWriter to re-encode an H.264 movie from the processed video.
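As a rough usage sketch of that pipeline: inputMovieURL, outputMovieURL, and filterView are assumed to exist here, the sepia filter is an arbitrary choice, and exact class or property names may differ between GPUImage versions.
// Rough usage sketch of the GPUImageMovie -> filter -> view/writer pipeline.
// Check the GPUImage headers for the exact classes available in the version you use.
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:inputMovieURL];
GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];
[movieFile addTarget:filter];

// Display the filtered frames on screen.
[filter addTarget:filterView]; // filterView is a GPUImageView in the view hierarchy

// Optionally re-encode the processed frames as a new H.264 movie.
GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outputMovieURL
                                                                             size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];

[movieWriter startRecording];
[movieFile startProcessing];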
The one issue I currently have is that I don't obey the timestamps in the video for playback, so frames are processed as quickly as they are decoded from the movie. For filtering and re-encoding of a video, this isn't a problem, because the timestamps are passed through to the recorder, but for direct display to the screen this means that the video can be sped up by as much as 2-4X. I'd welcome any contributions that would let you synchronize the playback rate to the actual video timestamps.
I can currently play back, filter, and re-encode 640x480 video at well over 30 FPS on an iPhone 4 and 720p video at ~20-25 FPS, with the iPhone 4S being capable of 1080p filtering and encoding at significantly higher than 30 FPS. Some of the more expensive filters can tax the GPU and slow this down a bit, but most filters operate in these framerate ranges.
If you want, you can examine the GPUImageMovie class to see how it does this uploading to OpenGL ES, but the relevant code is as follows:
- (void)startProcessing;
{
NSDictionary *inputOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *inputAsset = [[AVURLAsset alloc] initWithURL:self.url options:inputOptions];
[inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{
NSError *error = nil;
AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error];
if (tracksStatus != AVKeyValueStatusLoaded)
{
return;
}
reader = [AVAssetReader assetReaderWithAsset:inputAsset error:&error];
NSMutableDictionary *outputSettings = [NSMutableDictionary dictionary];
[outputSettings setObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (NSString*)kCVPixelBufferPixelFormatTypeKey];
// Maybe set alwaysCopiesSampleData to NO on iOS 5.0 for faster video decoding
AVAssetReaderTrackOutput *readerVideoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[[inputAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] outputSettings:outputSettings];
[reader addOutput:readerVideoTrackOutput];
NSArray *audioTracks = [inputAsset tracksWithMediaType:AVMediaTypeAudio];
BOOL shouldRecordAudioTrack = (([audioTracks count] > 0) && (self.audioEncodingTarget != nil) );
AVAssetReaderTrackOutput *readerAudioTrackOutput = nil;
if (shouldRecordAudioTrack)
{
audioEncodingIsFinished = NO;
// This might need to be extended to handle movies with more than one audio track
AVAssetTrack* audioTrack = [audioTracks objectAtIndex:0];
readerAudioTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
[reader addOutput:readerAudioTrackOutput];
}
if ([reader startReading] == NO)
{
NSLog(#"Error reading from file at URL: %#", self.url);
return;
}
if (synchronizedMovieWriter != nil)
{
__unsafe_unretained GPUImageMovie *weakSelf = self;
[synchronizedMovieWriter setVideoInputReadyCallback:^{
[weakSelf readNextVideoFrameFromOutput:readerVideoTrackOutput];
}];
[synchronizedMovieWriter setAudioInputReadyCallback:^{
[weakSelf readNextAudioSampleFromOutput:readerAudioTrackOutput];
}];
[synchronizedMovieWriter enableSynchronizationCallbacks];
}
else
{
while (reader.status == AVAssetReaderStatusReading)
{
[self readNextVideoFrameFromOutput:readerVideoTrackOutput];
if ( (shouldRecordAudioTrack) && (!audioEncodingIsFinished) )
{
[self readNextAudioSampleFromOutput:readerAudioTrackOutput];
}
}
if (reader.status == AVAssetReaderStatusCompleted) {
[self endProcessing];
}
}
}];
}
- (void)readNextVideoFrameFromOutput:(AVAssetReaderTrackOutput *)readerVideoTrackOutput;
{
if (reader.status == AVAssetReaderStatusReading)
{
CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer];
if (sampleBufferRef)
{
runOnMainQueueWithoutDeadlocking(^{
[self processMovieFrame:sampleBufferRef];
});
CMSampleBufferInvalidate(sampleBufferRef);
CFRelease(sampleBufferRef);
}
else
{
videoEncodingIsFinished = YES;
[self endProcessing];
}
}
else if (synchronizedMovieWriter != nil)
{
if (reader.status == AVAssetReaderStatusCompleted)
{
[self endProcessing];
}
}
}
- (void)processMovieFrame:(CMSampleBufferRef)movieSampleBuffer;
{
CMTime currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(movieSampleBuffer);
CVImageBufferRef movieFrame = CMSampleBufferGetImageBuffer(movieSampleBuffer);
int bufferHeight = CVPixelBufferGetHeight(movieFrame);
int bufferWidth = CVPixelBufferGetWidth(movieFrame);
CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
if ([GPUImageOpenGLESContext supportsFastTextureUpload])
{
CVPixelBufferLockBaseAddress(movieFrame, 0);
[GPUImageOpenGLESContext useImageProcessingContext];
CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, movieFrame, NULL, GL_TEXTURE_2D, GL_RGBA, bufferWidth, bufferHeight, GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);
if (!texture || err) {
NSLog(#"Movie CVOpenGLESTextureCacheCreateTextureFromImage failed (error: %d)", err);
return;
}
outputTexture = CVOpenGLESTextureGetName(texture);
// glBindTexture(CVOpenGLESTextureGetTarget(texture), outputTexture);
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
for (id<GPUImageInput> currentTarget in targets)
{
NSInteger indexOfObject = [targets indexOfObject:currentTarget];
NSInteger targetTextureIndex = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];
[currentTarget setInputSize:CGSizeMake(bufferWidth, bufferHeight) atIndex:targetTextureIndex];
[currentTarget setInputTexture:outputTexture atIndex:targetTextureIndex];
[currentTarget newFrameReadyAtTime:currentSampleTime];
}
CVPixelBufferUnlockBaseAddress(movieFrame, 0);
// Flush the CVOpenGLESTexture cache and release the texture
CVOpenGLESTextureCacheFlush(coreVideoTextureCache, 0);
CFRelease(texture);
outputTexture = 0;
}
else
{
// Upload to texture
CVPixelBufferLockBaseAddress(movieFrame, 0);
glBindTexture(GL_TEXTURE_2D, outputTexture);
// Using BGRA extension to pull in video frame data directly
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bufferWidth, bufferHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(movieFrame));
CGSize currentSize = CGSizeMake(bufferWidth, bufferHeight);
for (id<GPUImageInput> currentTarget in targets)
{
NSInteger indexOfObject = [targets indexOfObject:currentTarget];
NSInteger targetTextureIndex = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];
[currentTarget setInputSize:currentSize atIndex:targetTextureIndex];
[currentTarget newFrameReadyAtTime:currentSampleTime];
}
CVPixelBufferUnlockBaseAddress(movieFrame, 0);
}
if (_runBenchmark)
{
CFAbsoluteTime currentFrameTime = (CFAbsoluteTimeGetCurrent() - startTime);
NSLog(#"Current frame time : %f ms", 1000.0 * currentFrameTime);
}
}

Release textures (GLKTextureInfo objects) allocated by GLKTextureLoader

I'm new to developing on iOS, and in particular the new OpenGL-related features in iOS 5, so I apologize if any of my questions are too basic.
The app I am working on is designed to receive camera frames and display them on screen via OpenGL ES (the graphics folks will take over this and add the actual OpenGL drawing, about which I know very little). The application is developed in Xcode 4, and the target is an iPhone 4 running iOS 5. For the moment, I used ARC and the GLKit functionality, and everything works fine except for a memory leak when loading images as textures. The app receives a "memory warning" very soon.
Specifically, I would like to ask how to release the textures allocated by
@property(retain) GLKTextureInfo *texture;
-(void)setTextureCGImage:(CGImageRef)image
{
NSError *error;
self.texture = [GLKTextureLoader textureWithCGImage:image options:nil error:&error];
if (error)
{
NSLog(#"Error loading texture from image: %#",error);
}
}
The image is a Quartz image built from the camera frame (sample code from Apple). I know the problem is not in that part of the code, since if I disable the assignment, the app does not receive the warning.
Super hacky solution I believe, but it seems to work:
Add the following before the assignment:
GLuint name = self.texture.name;
glDeleteTextures(1, &name);
If there's a more official way (or if this is the official way), I would appreciate if someone could let me know.
Not a direct answer, but something I noticed that won't really fit in a comment.
If you're using GLKTextureLoader to load textures in the background to replace an existing texture, you have to delete the existing texture on the main thread. Deleting a texture in the completion handler will not work.
AFAIK this is because:
Every iOS thread requires its own EAGLContext, so the background queue has its own thread with its own context.
The completion handler is run on the queue you passed in, which is most likely not the main queue. (Else you wouldn't be doing the loading in the background...)
That is, this will leak memory.
NSDictionary *options = @{GLKTextureLoaderOriginBottomLeft: @YES};
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
[self.asyncTextureLoader textureWithContentsOfFile:@"my_texture_path.png"
options:options
queue:queue
completionHandler:^(GLKTextureInfo *texture, NSError *e){
GLuint name = self.myTexture.name;
//
// This delete textures call has no effect!!!
//
glDeleteTextures(1, &name);
self.myTexture = texture;
}];
To get around this issue you can either:
Delete the texture before the upload happens. Potentially sketchy depending on how your GL is architected.
Delete the texture on the main queue in the completion handler.
So, to fix the leak you need to do this:
//
// Method #1, delete before upload happens.
// Executed on the main thread so it works as expected.
// Potentially leaves some GL content untextured if you're still drawing it
// while the texture is being loaded in.
//
// Done on the main thread so it works as expected
GLuint name = self.myTexture.name;
glDeleteTextures(1, &name)
NSDictionary *options = #{GLKTextureLoaderOriginBottomLeft:#YES};
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
[self.asyncTextureLoader textureWithContentsOfFile:@"my_texture_path.png"
options:options
queue:queue
completionHandler:^(GLKTextureInfo *texture, NSError *e){
// no delete required, done previously.
self.myTexture = texture;
}];
or
//
// Method #2, delete in completion handler but do it on the main thread.
//
NSDictionary *options = @{GLKTextureLoaderOriginBottomLeft: @YES};
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
[self.asyncTextureLoader textureWithContentsOfFile:@"my_texture_path.png"
options:options
queue:queue
completionHandler:^(GLKTextureInfo *texture, NSError *e){
// you could potentially do non-gl related work here, still in the background
// ...
// Force the actual texture delete and re-assignment to happen on the main thread.
dispatch_sync(dispatch_get_main_queue(), ^{
GLuint name = self.myTexture.name;
glDeleteTextures(1, &name);
self.myTexture = texture;
});
}];
Is there a way to simply replace the contents of the texture at the same GLKTextureInfo.name handle? When using glGenTextures you can use the returned texture handle to load new texture data using glTexImage2D. But with GLKTextureLoader it seems that glGenTextures is being called every time new texture data is loaded...
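For comparison, here is a minimal sketch of the plain-GL approach being described: one name generated up front, with new pixel data re-uploaded into the same handle (width, height, and pixelData are assumed to describe a tightly packed RGBA buffer).
// Minimal sketch: generate a single texture name once, then re-upload new
// RGBA pixel data into that same handle whenever the content changes.
static GLuint reusableTexture = 0;
if (reusableTexture == 0)
{
    glGenTextures(1, &reusableTexture);
    glBindTexture(GL_TEXTURE_2D, reusableTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
glBindTexture(GL_TEXTURE_2D, reusableTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixelData);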

Weird popping noise when playing different sounds with different volumes set through OpenAL on the iPhone

I'm using OpenAL sound framework on the iPhone, and I'm setting different volumes on individual sounds. I'm running into a problem where I'm hearing an initial popping/clicking noise when switching from one sound to the next.
It's really noticeable when I have one sound that's got a high volume (1.0) and a second sound that has a low one (0.2). When I hit the loud sound and then hit the soft sound, I hear the pop/click. But when I go from the soft sound to the loud one, I don't notice anything. So the pop/click really happens when switching from loud to soft sounds.
Here's the init sound method:
- (id) initWithSoundFile:(NSString *)file doesLoop:(BOOL)loops
{
self = [super init];
if (self != nil)
{
if(![self loadSoundFile:file doesLoop:loops])
{
debug(#"Failed to load the sound file: %#...", file);
[self release];
return nil;
}
self.sourceFileName = file;
//temporary sound queue
self.temporarySounds = [NSMutableArray array];
//default volume/pitch
self.volume = 1.0;
self.pitch = 1.0;
}
return self;
}
and here's the play function:
- (BOOL) play
{
if([self isPlaying]) //see if the base source is busy...
{
//if so, create a new source
NSUInteger tmpSourceID;
alGenSources(1, &tmpSourceID);
//attach the buffer to the source
alSourcei(tmpSourceID, AL_BUFFER, bufferID);
alSourcePlay(tmpSourceID);
//add the sound id to the play queue so we can dispose of it later
[temporarySounds addObject: [NSNumber numberWithUnsignedInteger:tmpSourceID]];
//a "callback" for when the sound is done playing +0.1 secs
[self performSelector:@selector(deleteTemporarySource)
withObject:nil
afterDelay:(duration * pitch) + 0.1];
return ((error = alGetError()) != AL_NO_ERROR);
}
//if the base source isn't busy, just use that one...
alSourcePlay(sourceID);
return ((error = alGetError()) != AL_NO_ERROR);
}
And here's the function where I set the volume for each sound immediately after playing (I've tried setting it before playing too):
- (void) setVolume:(ALfloat)newVolume
{
volume = MAX(MIN(newVolume, 1.0f), 0.0f); //cap to 0-1
alSourcef(sourceID, AL_GAIN, volume);
//now set the volume for any temporary sounds...
for(NSNumber *tmpSourceID in temporarySounds)
{
//tmpSourceID is the source ID for the temporary sound
alSourcef([tmpSourceID unsignedIntegerValue], AL_GAIN, volume);
}
}
Any help is greatly appreciated as I've tried everything I can think of. I would be so grateful.
All I had to do was use calloc instead of malloc to allocate memory for the OpenAL buffer.
Or you could zero the memory with memset.
The weird popping noise went away. It was due to junk memory in my case; that's why it was random, too. Hope this helps.
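For example, where the PCM data buffer for the OpenAL buffer is allocated (the variable names here are just placeholders):
// Zero-initialized allocation, so OpenAL never sees junk memory.
void *pcmData = calloc(1, dataSize);          // instead of malloc(dataSize)

// Equivalent alternative: keep malloc but clear the block explicitly.
// void *pcmData = malloc(dataSize);
// memset(pcmData, 0, dataSize);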
This problem is caused by not calling alSourceStop.
The documentation doesn't really state this, but alSourceStop must be called on a sound source before it can be reused even if the sound had already finished and the AL_SOURCE_STATE parameter of the source is not AL_PLAYING.
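Applied to the play method above, that would mean explicitly stopping the base source before reusing it, roughly like this (a guess at placement, not the asker's final code):
// Ensure the source is fully stopped before it is reused, even if it already
// appears to have finished playing.
alSourceStop(sourceID);
alSourcePlay(sourceID);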
I randomly got to this unanswered question and, finding that the problem was not solved, I'll try to give my answer, even if a long time has passed.
I don't know OpenAL, but this sounds like a purely audio problem. It is normal to hear short clicks when you suddenly change the audio level, especially from a high value to a low one. For example, if you map the volume directly to a slider whose value is updated every few ms, you can easily hear clicks and pops when sliding the control quickly. What audio software developers do is smooth the parameter changes with a low-pass filter.
In your case, I would suggest stopping the clip after fading it out, and starting the new clip by fading it in. The fade time can be as short as 2 ms: it's not audible, and the sound will still play just fine.
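As a rough illustration of the fade-out idea (not from the original answer; the step count and sleep interval are arbitrary, and it needs <unistd.h> for usleep plus the OpenAL headers):
// Ramp the source's gain to zero in small steps (~2 ms total) before stopping
// it, so there is no audible discontinuity. Values are illustrative only.
static void FadeOutAndStop(ALuint sourceID, ALfloat currentGain)
{
    const int steps = 8;
    for (int i = steps - 1; i >= 0; i--)
    {
        alSourcef(sourceID, AL_GAIN, currentGain * ((ALfloat)i / steps));
        usleep(250); // 8 * 250 us = 2 ms total
    }
    alSourceStop(sourceID);
}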
I wonder if (some versions of) OpenAL can automatically deal with this issue.

UIImagePickerController for movie Item

I am writing a simple video uploader application on the iPhone 3GS where I first direct the user to the Photos album and then let them select the video to share or upload. I am using the UIImagePickerController in the following way:
videoPickerCtrl = [[UIImagePickerController alloc] init];
videoPickerCtrl.delegate = self;
videoPickerCtrl.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
videoPickerCtrl.mediaTypes = [UIImagePickerController availableMediaTypesForSourceType:videoPickerCtrl.sourceType];
videoPickerCtrl.allowsImageEditing = NO;
videoPickerCtrl.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
[window addSubview:videoPickerCtrl.view];
But I can see that once the controller is invoked, there is a disturbing video trimming interface that is presented. Once I press "choose", the video is always trimmed no matter whether I touch the trimming controls or not. Is there any way to get around this trimming interface and directly get the path of the video file ?
You should set allowsEditing = NO; instead of allowsImageEditing = NO; (which has been deprecated in 3.1). Then, the trimming interface should not appear unless the selected movie is longer than 10 minutes (from the docs: "Maximum movie duration is 10 minutes. If a user picks a movie that is longer in duration than 10 minutes, they are forced to trim it before saving it.").
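In code, that's just a property change on the picker from the question (assuming the same videoPickerCtrl instance):
// Replaces the deprecated allowsImageEditing assignment.
videoPickerCtrl.allowsEditing = NO;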
Interesting problem. This is just for information should anyone else be looking at this. On iPad OS 3.2 I have found some problems retrieving video, although the picker works and I can select video from albums and not just from the camera roll.
Here's my working code fragment.
The call
NSArray *mediaTypesAllowed = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypePhotoLibrary];
[picker setMediaTypes:mediaTypesAllowed];
picker.delegate = self;
picker.allowsEditing = NO;
picker.wantsFullScreenLayout = YES;
if(!IsEmpty(self.editBackgroundPopover)){
[self.editBackgroundPopover setContentViewController:picker animated:YES];
}
And here is the delegate method
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
[self.editBackgroundPopover dismissPopoverAnimated:true];
NSString* mediaType = [info objectForKey:UIImagePickerControllerMediaType];
//not production code, do not use hard coded string in real app
if ( [ mediaType isEqualToString:#"public.image" ]) {
NSLog(#"Picked a photo");
}
//not production code, do not use hard coded string in real app
else if ( [ mediaType isEqualToString:@"public.movie" ]){
NSLog(@"Picked a movie at URL %@", [info objectForKey:UIImagePickerControllerMediaURL]);
NSURL *url = [info objectForKey:UIImagePickerControllerMediaURL];
NSLog(@"> %@", [url absoluteString]);
}
[[picker self] dismissModalViewControllerAnimated:YES];
}
However the video URL which I retrieve from the picker has the form
file:/localhost/private/var/mobile/Applications/C6FAC491-D27D-45A6-B805-951727ED2CEC/tmp/-Tmp-/trim.KOzqps.MOV
So it looks to me that the Video might be being processed through the trimming code even if I'm selecting the video as a whole. Note also that the movie, originally of type m4v when I loaded it through iTunes is of type MOV, which is of course unplayable on the device! I did try playing the URL but I received an alert saying "This kind of movie can't be played"
I don't quite understand what Apple is playing at here, the API appears not to really be usable as a way of loading and playing video from the photo library.
Hopefully iOS 4 will be more forthcoming, but for my iPad app, that's still months away.
OK, so I got it working a while back after carefully looking at the SDK docs. I am able to get videos from the Camera Roll directory on my 3GS. But I cannot find any way in which UIImagePickerController can choose video from directories other than the Camera Roll (for instance, the device's Photo Library, where the user syncs videos through iTunes). Is there any standard way in the SDK to do that?

iPhone SDK > 3.0. Video Thumbnail?

From what I have read, Apple doesn't expose an API that allows developers to get a thumbnail of a movie using the current SDK.
Can anyone share some code showing how they are going about this?
I have heard you can access the camera picker view and find the image view just before the UIImagePicker is closed. This seems kind of ugly.
I've also considered using ffmpeg to grab a frame out of the movie, but I wouldn't have a clue how to compile it as a library for the iPhone. Any pointers would be greatly appreciated.
I hope my code can help you guys. It's ugly; I think Apple should open up this kind of API.
Of course, all NSLog() should be removed. It's just for demonstration.
alvin
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{
// e.g.
NSString *tempFilePath = [(NSURL *)[info valueForKey:UIImagePickerControllerMediaURL] absoluteString];
NSLog(#"didFinishPickingMediaWithInfo: %#",tempFilePath);
// e.g. /private/var/mobile/Applications/D1E784A4-EC1A-402B-81BF-F36D3A08A332/tmp/capture/capturedvideo.MOV
tempFilePath = [[tempFilePath substringFromIndex:16] retain];
NSLog(#"didFinishPickingMediaWithInfo: %#",tempFilePath);
NSLog(#"===Try to save video to camera roll.===");
NSLog(#"UIVideoAtPathIsCompatibleWithSavedPhotosAlbum: %#",UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(tempFilePath)? #"YES":#"NO");
// Check if the video file can be saved to camera roll.
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(tempFilePath)){
// YES. Copy it to the camera roll.
UISaveVideoAtPathToSavedPhotosAlbum(tempFilePath, self, @selector(video:didFinishSavingWithError:contextInfo:), tempFilePath);
}
[self dismissModalViewControllerAnimated:YES];
}
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(NSString *)contextInfo{
NSLog(#"didFinishSavingWithError--videoPath in camera roll:%#",videoPath);
NSLog(#"didFinishSavingWithError--videoPath in temp directory:%#",contextInfo);
// The thumbnail jpg should located in this directory.
NSString *thumbnailDirectory = [[contextInfo stringByDeletingLastPathComponent] stringByDeletingLastPathComponent];
// Debug info. list all files in the directory of the video file.
// e.g. /private/var/mobile/Applications/D1E784A4-EC1A-402B-81BF-F36D3A08A332/tmp/capture
NSLog([contextInfo stringByDeletingLastPathComponent]);
NSLog([[[NSFileManager defaultManager] contentsOfDirectoryAtPath:[contextInfo stringByDeletingLastPathComponent] error:nil] description]);
// Debug info. list all files in the parent directory of the video file, i.e. the "~/tmp" directory.
// e.g. /private/var/mobile/Applications/D1E784A4-EC1A-402B-81BF-F36D3A08A332/tmp
NSLog(thumbnailDirectory);
NSLog([[[NSFileManager defaultManager] contentsOfDirectoryAtPath:thumbnailDirectory error:nil] description]);
///////////////////
// Find the thumbnail for the video just recorded.
NSString *file, *latestFile = nil;
NSDate *latestDate = [NSDate distantPast];
NSDirectoryEnumerator *dirEnum = [[NSFileManager defaultManager] enumeratorAtPath:[[contextInfo stringByDeletingLastPathComponent]stringByDeletingLastPathComponent]];
// Enumerate all files in the ~/tmp directory
while (file = [dirEnum nextObject]) {
// Only check files with jpg extension.
if ([[file pathExtension] isEqualToString: @"jpg"]) {
NSLog(@"***latestDate:%@",latestDate);
NSLog(@"***file name:%@",file);
NSLog(@"***NSFileSize:%@", [[dirEnum fileAttributes] valueForKey:@"NSFileSize"]);
NSLog(@"***NSFileModificationDate:%@", [[dirEnum fileAttributes] valueForKey:@"NSFileModificationDate"]);
// Check if current jpg file is the latest one.
if ([(NSDate *)[[dirEnum fileAttributes] valueForKey:@"NSFileModificationDate"] compare:latestDate] == NSOrderedDescending){
latestDate = [[dirEnum fileAttributes] valueForKey:@"NSFileModificationDate"];
latestFile = file;
NSLog(@"***latestFile changed:%@",latestFile);
}
}
}
// The thumbnail path.
latestFile = [NSTemporaryDirectory() stringByAppendingPathComponent:latestFile];
NSLog(#"****** The thumbnail file should be this one:%#",latestFile);
// Your code ...
// Your code ...
// Your code ...
}
There is a jpg in the same folder as the movie that is the thumbnail. I have only tested it with video recorded from the phone, but it works fine. It's not named the same as the movie, so get the directory path the movie is in, iterate for a .jpg, and away you go.
Best method I've found... MPMoviePlayerController thumbnailImageAtTime:timeOption
NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
UIImage *imageSel = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
[player stop];
[player release];
Here's a blog post about extracting frames from movies using ffmpeg:
http://www.codza.com/extracting-frames-from-movies-on-iphone
The corresponding project on github: iFrameExtractor
Just wanted to provide an update on this. Things seem to work in terms of getting the thumbnail when you've set your picker to UIImagePickerControllerSourceTypeCamera. However, when you try to pick from the existing library (i.e. UIImagePickerControllerSourceTypePhotoLibrary), the .jpg for the thumbnail never gets created. I even tried re-saving with UISaveVideoAtPathToSavedPhotosAlbum(...) and still no dice. It seems like the thumbnail is only created when the video is captured; that is your only opportunity to "grab" it. Afterwards it gets moved into a subdirectory of /var/mobile/Media/DCIM/. While you can actually list the thumbnail image and see that it is there, the file is unreadable and uncopyable, as I believe that directory is protected.
If anyone has a workaround for this I would greatly appreciate it as in my case I need to grab the thumbnail of existing video after choosing it from the library.
If you're using UIImagePickerController to get the image, then a JPG would not be stored in the tmp directory. I'm not sure of the answer for this, but ffmpeg, or more specifically the libraries behind it, may be your option.