Creating 16:9 Video Rather than 4:3 with AVAssetWriter - iPhone

I am using the code below to make a video from a static 16:9 image using AVAssetWriter. The problem is that, for some reason, the video it produces is in 4:3 format.
Can anyone suggest a way that I can either amend the code to produce a 16:9 video, or alternatively, how I can convert the 4:3 video to 16:9.
Thank you
- (void) createVideoFromStillImage
{
//Set the size according to the device type (iPhone or iPad).
CGSize size = CGSizeMake(screenWidth, screenHeight);
NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/IntroVideo.mov"];
NSError *error = nil;
unlink([betaCompressionDirectory UTF8String]);
//----initialize compression engine
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
if(error)
NSLog(#"error = %#", [error localizedDescription]);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: AVVideoCodecH264,AVVideoCodecKey,
[NSNumber numberWithInt:size.height], AVVideoWidthKey,
[NSNumber numberWithInt:size.width], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
if ([videoWriter canAddInput:writerInput])
NSLog(#"I can add this input");
else
NSLog(#"i can't add this input");
[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
//CGImageRef theImage = [finishedMergedImage CGImage];
CGImageRef theImage = [introImage CGImage];
//dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
int __block frame = 0;
//Calculate how much progress % one frame completion represents. Maximum of 75%.
float currentProgress = 0.0;
float progress = (80.0 / kDurationOfIntroOutro);
//NSLog(#"Progress is %f", progress);
for (int i=0; i<=kDurationOfIntroOutro; i++) {
//Update our progress view for every frame that is generated.
[self updateProgressView:currentProgress];
currentProgress +=progress;
//NSLog(#"CurrentProgress is %f", currentProgress);
frame++;
[NSThread sleepForTimeInterval:0.05]; //Delay to allow buffer to be ready.
CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
if (buffer) {
if (adaptor.assetWriterInput.readyForMoreMediaData)
{
if(![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
NSLog(#"FAIL");
else
NSLog(#"Success:%d", frame);
CFRelease(buffer);
}
}
}
[writerInput markAsFinished];
[videoWriter finishWriting];
[videoWriter release];
//NSLog(#"outside for loop");
//Grab the URL for the video so we can use it later.
NSURL *url = [self applicationDocumentsDirectory:kIntroVideoFileName];
[assetURLArray setObject:url forKey:kIntroVideo];
}
- (CVPixelBufferRef )pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);
// CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}

So that this can be closed out, I'll restate what I did above. The videoSettings dictionary should use the target dimensions of your video, but you're passing in the dimensions of your view. Unless that's what you want to record, you'll need to change the values passed in for AVVideoWidthKey and AVVideoHeightKey to the correct output sizes. (Note that in the code above the two are also swapped: the height is passed for AVVideoWidthKey and the width for AVVideoHeightKey.)
Given that iOS device screens have aspect ratios close to 4:3, that mismatch is probably what was producing the 4:3 ratio in the recorded video.
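For illustration, a minimal sketch of corrected settings, assuming a hypothetical 1280×720 target (the exact 16:9 output size is your choice; only the keys and codec come from the original code). Note the width and height are no longer swapped:
//Hedged sketch: use the intended 16:9 output size, not the screen size.
CGSize outputSize = CGSizeMake(1280, 720); //illustrative target, not from the original code
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:outputSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:outputSize.height], AVVideoHeightKey, nil];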

Related

iPhone - AVAssetWriter - Error creating movie from photos at 1920×1080 pixels

I am trying to create a movie from some pictures. It works just fine with HD pictures ({720, 1280}) or lower resolutions. But when I try to create the movie with full HD pictures, {1080, 1920}, the video is scrambled. Here is a link to see how it looks: http://www.youtube.com/watch?v=BfYldb8e_18. Do you have any ideas what I may be doing wrong?
- (void) createMovieWithOptions:(NSDictionary *) options
{
@autoreleasepool {
NSString *path = [options valueForKey:@"path"];
CGSize size = [(NSValue *)[options valueForKey:@"size"] CGSizeValue];
NSArray *imageArray = [options valueForKey:@"pictures"];
NSInteger recordingFPS = [[options valueForKey:@"fps"] integerValue];
BOOL success=YES;
NSError *error = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(assetWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithFloat:size.width], AVVideoWidthKey,
[NSNumber numberWithFloat:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
// Configure settings for the pixel buffer adaptor.
NSDictionary* bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
sourcePixelBufferAttributes:bufferAttributes];
NSParameterAssert(videoWriterInput);
NSParameterAssert([assetWriter canAddInput:videoWriterInput]);
videoWriterInput.expectsMediaDataInRealTime = NO;
[assetWriter addInput:videoWriterInput];
//Start a session:
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//convert uiimage to CGImage.
int frameCount = 0;
float progress = 0;
float progressFromFrames = _progressView.progress; //only for create iflipbook movie
for(UIImage * img in imageArray)
{
if([[NSThread currentThread] isCancelled])
{
[NSThread exit];
}
[condCreateMovie lock];
if(isCreateMoviePaused)
{
[condCreateMovie wait];
}
uint64_t totalFreeSpace=[Utils getFreeDiskspace];
if(((totalFreeSpace/1024ll)/1024ll)<50)
{
success=NO;
break;
}
// #autoreleasepool {
NSLog(#"size:%#",NSStringFromCGSize(img.size));
buffer = [[MovieWritter sharedMovieWritter] pixelBufferFromCGImage:[img CGImage] andSize:size];
BOOL append_ok = NO;
int j = 0;
while (!append_ok && j < 60)
{
if(adaptor.assetWriterInput.readyForMoreMediaData)
{
CMTime frameTime = CMTimeMake(frameCount, recordingFPS);
append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
CVPixelBufferRelease(buffer);
[NSThread sleepForTimeInterval:0.1];
if(isCreatingiFlipBookFromImported)
progress = (float)frameCount/(float)[imageArray count]/2.0 + progressFromFrames;
else
progress = (float)frameCount/(float)[imageArray count];
[[NSNotificationCenter defaultCenter] postNotificationName:@"movieCreationProgress" object:[NSNumber numberWithFloat:progress]];
}
else
{
[NSThread sleepForTimeInterval:0.5];
}
j++;
}
if (!append_ok)
{
NSLog(#"error appending image %d times %d\n", frameCount, j);
}
frameCount++;
[condCreateMovie unlock];
}
//Finish the session:
[videoWriterInput markAsFinished];
[assetWriter finishWriting];
NSDictionary *dict = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:success], @"success",
path, @"path", nil];
[[NSNotificationCenter defaultCenter] postNotificationName:@"movieCreationFinished" object:dict];
}
}
Edit: Here is the code for [[MovieWritter sharedMovieWritter] pixelBufferFromCGImage:]
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size
{
@autoreleasepool {
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
size.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
&pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
size.height, 8, 4*size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
}
I had the same problem, and this answer resolved it: the size of the video must be a multiple of 16.
Pretty sure that this is either a hardware limitation or a bug. Please file a Radar.
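For illustration, a hedged sketch of how you might enforce that constraint before creating the writer input (the helper name is mine, not from any answer here). Note that 1080 itself is not a multiple of 16, while 1088 is, which fits the symptom above:
//Hedged sketch: round each dimension down to a multiple of 16.
static CGSize encoderFriendlySize(CGSize requested)
{
return CGSizeMake(floorf(requested.width / 16.0f) * 16.0f,
floorf(requested.height / 16.0f) * 16.0f);
}
//e.g. encoderFriendlySize(CGSizeMake(1080, 1920)) yields {1072, 1920}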
How about something like this to get the pixel buffer:
//you could use a CGImageRef here instead
CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(imageView.image.CGImage));
NSLog (@"copied image data");
cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
FRAME_WIDTH,
FRAME_HEIGHT,
kCVPixelFormatType_32BGRA,
(void*)CFDataGetBytePtr(imageData),
CGImageGetBytesPerRow(imageView.image.CGImage),
NULL,
NULL,
NULL,
&pixelBuffer);
NSLog (#"CVPixelBufferCreateWithBytes returned %d", cvErr);
CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
NSLog (#"elapsedTime: %f", elapsedTime);
CMTime presentationTime = CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE);
// write the sample
BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
CVPixelBufferRelease(pixelBuffer);
CFRelease(imageData);
if (appended) {
NSLog (#"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
} else {
NSLog (#"failed to append");
[self stopRecording];
self.startStopButton.selected = NO;
}
You may also want to set the capture session preset, although High is usually suitable and it is the default.
/*
Constants to define capture setting presets using the sessionPreset property.
NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
*/
//set it like this
self.captureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
//or like this when you define avcapturesession
[self.captureSession setSessionPreset:AVCaptureSessionPreset1920x1080];

NSArray of UIImages to video - distortion in the output

I am relatively new to programming, and although I am OK with normal functions, I am completely new to video editing.
So I have managed to find some code online to do the job shown below:
- (void)writeImagesAsMovie:(NSArray *)array {
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentDirectory = [paths objectAtIndex:0];
NSString *saveLocation = [documentDirectory stringByAppendingString:@"/temp.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:saveLocation]) {
[[NSFileManager defaultManager] removeItemAtPath:saveLocation error:NULL];
}
UIImage *first = [array objectAtIndex:0];
CGSize frameSize = first.size;
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:saveLocation] fileType:AVFileTypeQuickTimeMovie
error:&error];
if(error) {
NSLog(#"error creating AssetWriter: %#",[error description]);
}
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:frameSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:frameSize.height], AVVideoHeightKey,
nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init];
[attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.width] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.height] forKey:(NSString*)kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:attributes];
[videoWriter addInput:writerInput];
// fixes all errors
writerInput.expectsMediaDataInRealTime = YES;
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[first CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
if (result == NO) //fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
if(buffer) {
CVBufferRelease(buffer);
}
//int reverseSort = NO;
NSArray *newArray = array;
int fps = 10;
int i = 0;
for (UIImage *image in newArray)
{
[NSThread sleepForTimeInterval:0.02];
if (adaptor.assetWriterInput.readyForMoreMediaData) {
i++;
CMTime frameTime = CMTimeMake(1, fps);
CMTime lastTime = CMTimeMake(i, fps);
CMTime presentTime = CMTimeAdd(lastTime, frameTime);
UIImage *imgFrame = image;//[UIImage imageWithContentsOfFile:filePath] ;
buffer = [self pixelBufferFromCGImage:[imgFrame CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
if (result == NO) //fails on 3GS, but works on iPhone 4
{
NSLog(@"failed to append buffer");
NSLog(@"The error is %@", [videoWriter error]);
[NSThread sleepForTimeInterval:0.5];
}
if(buffer) {
CVBufferRelease(buffer);
}
} else {
NSLog(#"error");
i--;
}
}
//Finish the session:
[writerInput markAsFinished];
[videoWriter finishWriting];
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
NSLog(#"Movie created successfully");
}
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, CGImageGetWidth(image),
CGImageGetHeight(image), kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
&pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, CGImageGetWidth(image),
CGImageGetHeight(image), 8, 4*CGImageGetWidth(image), rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
// CGAffineTransform flipVertical = CGAffineTransformMake(
// 1, 0, 0, -1, 0, CGImageGetHeight(image)
// );
// CGContextConcatCTM(context, flipVertical);
// CGAffineTransform flipHorizontal = CGAffineTransformMake(
// -1.0, 0.0, 0.0, 1.0, CGImageGetWidth(image), 0.0
// );
//
// CGContextConcatCTM(context, flipHorizontal);
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
But the problem I am having is that the output video is somehow corrupted (it does play, although it has funny lines; screenshot omitted).
I would be so grateful for any help.
Many thanks,
Thomas
I have been seeing problems with the H264 video encoding hardware where it can corrupt input that does not match a known aspect ratio. For example, my testing shows that if one video dimension is smaller than 128 pixels, the video will not encode.
What I have seen working is 128x128, 192x128, 240x160, 480x320, and others.
See this page on aspect ratios
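As a hedged sketch of a guard based on those observations (the 128-pixel floor comes from the testing described above, and the 16-pixel alignment echoes the multiple-of-16 answer earlier on this page; neither is an official spec):
//Hedged sketch: reject sizes the hardware encoder has been seen to mishandle.
static BOOL sizeLooksHardwareEncodable(CGSize size)
{
BOOL bigEnough = size.width >= 128.0f && size.height >= 128.0f;
BOOL aligned = ((int)size.width % 16 == 0) && ((int)size.height % 16 == 0);
return bigEnough && aligned;
}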
P.S.
You will likely want to use AVAssetWriterInputPixelBufferAdaptor, since it contains a pixel buffer pool that you can use via CVPixelBufferPoolCreatePixelBuffer(). Also, you will want to assert(adaptor.pixelBufferPool); after calling startSessionAtSourceTime: to ensure that your adaptor can write to the writer, as sketched below.
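A hedged sketch of that P.S., reusing the adaptor and videoWriter names from the question (illustrative only; the pool is only available once writing has started):
//Hedged sketch: draw buffers from the adaptor's pool after the session starts.
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
assert(adaptor.pixelBufferPool != NULL);
CVPixelBufferRef pooledBuffer = NULL;
CVReturn poolStatus = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
adaptor.pixelBufferPool, &pooledBuffer);
if (poolStatus == kCVReturnSuccess) {
//...lock the buffer, draw the CGImage into it, unlock, then append...
CVPixelBufferRelease(pooledBuffer);
}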

iOS5 AVFoundation image to video

I'm trying to create a video from a single image and save it to my photo library. I've been googling around for ages and cannot find a solution.
I have this code:
@autoreleasepool {
NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/movie2.mp4"]];
UIImage *img = [UIImage imageWithData:[[[self imageDataArrya] objectAtIndex:0] imageData]];
[self writeImageAsMovie:img toPath:path size:CGSizeMake(640, 960) duration:10];
UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
}
I call the above-mentioned method on a background thread. This is the code for 'writeImageAsMovie':
- (void)writeImageAsMovie:(UIImage*)image toPath:(NSString*)path size:(CGSize)size duration:(int)duration {
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
[self setInput:[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings]];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
sourcePixelBufferAttributes:nil];
[videoWriter addInput:input];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage];
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(duration-1, 2)];
[input markAsFinished];
[videoWriter endSessionAtSourceTime:CMTimeMake(duration, 2)];
[videoWriter finishWriting];
}
The utility method for converting an Image to a CVPixelBufferRef:
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image {
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
self.view.frame.size.width,
self.view.frame.size.height,
kCVPixelFormatType_32ARGB,
(__bridge CFDictionaryRef) options,
&pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, self.view.frame.size.width,
self.view.frame.size.height, 8, 4*self.view.frame.size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
Now if I try to run the code from the Simulator, it gives me an error saying that the data is corrupt.
If I run it on my device, it saves a 2-second video to my photo library, but it's only green; my image isn't in there.
Any help will be appreciated :)
I totally got this working - sorry I didn't see your reply before today.
This is what I used:
Create a Temp File
NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/flipimator-tempfile.mp4"]];
//overwrites it if it already exists.
if([fileManager fileExistsAtPath:path])
[fileManager removeItemAtPath:path error:NULL];
Call the export images method to save images to the temp file:
[self exportImages:frames
asVideoToPath:path
withFrameSize:imageSize
framesPerSecond:fps];
Save the temp file to the photo album:
UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
NSLog(@"Finished saving video with error: %@", error);
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Done"
message:@"Movie successfully exported."
delegate:nil
cancelButtonTitle:@"OK"
otherButtonTitles:nil, nil];
[alert show];
}
Code for the exportImages method:
- (void)exportImages:(NSArray *)imageArray
asVideoToPath:(NSString *)path
withFrameSize:(CGSize)imageSize
framesPerSecond:(NSUInteger)fps {
NSLog(#"Start building video from defined frames.");
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:imageSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:imageSize.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(videoWriterInput);
NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//convert uiimage to CGImage.
int frameCount = 0;
for(UIImage * img in imageArray) {
buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:imageSize];
BOOL append_ok = NO;
int j = 0;
while (!append_ok && j < 30) {
if (adaptor.assetWriterInput.readyForMoreMediaData) {
//print out status::
NSString *border = @"**************************************************";
NSLog(@"\n%@\nProcessing video frame (%d,%d).\n%@", border, frameCount, (int)[imageArray count], border);
CMTime frameTime = CMTimeMake(frameCount,(int32_t) fps);
append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
if(!append_ok){
NSError *error = videoWriter.error;
if(error!=nil) {
NSLog(#"Unresolved error %#,%#.", error, [error userInfo]);
}
}
}
else {
printf("adaptor not ready %d, %d\n", frameCount, j);
[NSThread sleepForTimeInterval:0.1];
}
j++;
}
if (!append_ok) {
printf("error appending image %d times %d\n, with error.", frameCount, j);
}
frameCount++;
}
//Finish the session:
[videoWriterInput markAsFinished];
[videoWriter finishWriting];
NSLog(#"Write Ended");
}
Parameters to the method:
imageArray : NSArray of UIImage.
path : Temporary path to write to while you process (temp defined above).
imageSize : The size of the video in pixels (width, and height).
fps : How many images should be displayed per second in the video.
Hope it helps!
Sorry about the formatting - I'm still very new to StackOverflow.com.
This is where I used the code: http://www.youtube.com/watch?v=DDckJyF2bnA

Using AVAssetWriter to create a movie from images is not working as expected on a 3GS device

The call to appendPixelBuffer is returning NO on a 3GS device (iOS 4.1), but is working well on iPhone 4 devices.
The following call to appendPixelBuffer is the source of the problem:
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[[UIImage imageNamed:@"frame1.png"] CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
if (result == NO) //fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
}
Full Code:
-(void)writeImagesAsMovie:(NSArray *)array toPath:(NSString*)path {
NSLog(@"%@", path);
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
if(error) {
NSLog(#"error creating AssetWriter: %#",[error description]);
}
The error (only on the 3GS; iPhone 4 is fine) is:
Error Domain=AVFoundationErrorDomain
Code=-11800 "The operation couldn’t be
completed. (AVFoundationErrorDomain
error -11800.)" UserInfo=0x4970530
{NSUnderlyingError=0x496d2c0 "The
operation couldn’t be completed.
(OSStatus error -12908.)"}
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:frameSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:frameSize.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
NSMutableDictionary *attributes = [[NSMutableDictionary alloc]init];
[attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.width] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.height] forKey:(NSString*)kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:attributes];
[videoWriter addInput:writerInput];
// fixes all errors
writerInput.expectsMediaDataInRealTime = YES;
//Start a session:
BOOL start = [videoWriter startWriting];
NSLog(#"Session started? %d", start);
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[[UIImage imageNamed:@"frame1.png"] CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
if (result == NO) //fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
if(buffer)
CVBufferRelease(buffer);
[NSThread sleepForTimeInterval:0.05];
//for (int i = 1;i<[array count]; i++)
for (int i = 1;i<20; i++)
{
if (adaptor.assetWriterInput.readyForMoreMediaData)
{
NSLog(#"inside for loop %d",i);
CMTime frameTime = CMTimeMake(1, 15);
CMTime lastTime=CMTimeMake(i, 15);
CMTime presentTime=CMTimeAdd(lastTime, frameTime);
NSString *imgName = [NSString stringWithFormat:@"frame%d.png", i];
UIImage *imgFrame = [UIImage imageNamed:imgName] ;
buffer = [self pixelBufferFromCGImage:[imgFrame CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
if (result == NO) //fails on 3GS, but works on iPhone 4
{
NSLog(@"failed to append buffer");
NSLog(@"The error is %@", [videoWriter error]);
}
if(buffer)
CVBufferRelease(buffer);
[NSThread sleepForTimeInterval:0.05];
}
else
{
NSLog(#"error");
i--;
}
}
//Finish the session:
[writerInput markAsFinished];
[videoWriter finishWriting];
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
[videoWriter release];
[writerInput release];
NSLog(#"Movie created successfully");
}
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, self.view.frame.size.width,
self.view.frame.size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
&pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, self.view.frame.size.width,
self.view.frame.size.height, 8, 4*self.view.frame.size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
Check the console log in Organizer for the device.
I got the same error on an iPod touch 2nd gen, and the console log reported:
com.apple.mediaserverd[18] : VTSelectAndCreateVideoEncoderInstance: no video encoder found for 'avc1'
Exactly what that means I'm still working out; however, it points me in a direction.

AVAssetWriterInput for making video from UIImages on iPhone - issues

I tried the following two methods of appending a UIImage's pixel buffer to an AVAssetWriterInput. Everything looks good except there's no data in the video file. What's wrong?
1. The adaptor class:
AVAssetWriterInputPixelBufferAdaptor * avAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:NULL];
[avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(1, 10)];
2. Making the sample buffer directly:
// Create sample buffer.
CMSampleBufferRef sampleBuffer = NULL;
result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &timing, &sampleBuffer);
// Ship out the frame.
NSParameterAssert(CMSampleBufferDataIsReady(sampleBuffer));
NSParameterAssert([writerInput isReadyForMoreMediaData]);
BOOL success = [writerInput appendSampleBuffer:sampleBuffer];
I found that for some reason I needed to append the buffer more than once. The timing in this example from a test app I made might not be proper, but since it works it should give you a good idea.
+ (void)writeImageAsMovie:(UIImage*)image toPath:(NSString*)path size:(CGSize)size duration:(int)duration
{
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
//Write samples:
CVPixelBufferRef buffer = [Utils pixelBufferFromCGImage:image.CGImage size:size];
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(duration-1, 2)];
//Finish the session:
[writerInput markAsFinished];
[videoWriter endSessionAtSourceTime:CMTimeMake(duration, 2)];
[videoWriter finishWriting];
}
This method is not required, but is used here as an example of a pixel buffer source:
+ (CVPixelBufferRef) pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
&pxbuffer);
status=status;//Added to make the stupid compiler not show a stupid warning.
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
size.height, 8, 4*size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
//CGContextTranslateCTM(context, 0, CGImageGetHeight(image));
//CGContextScaleCTM(context, 1.0, -1.0);//Flip vertically to account for different origin
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
I've had a bit of a problem with this code. It gave me a skewed image as a result.
Changing:
CGContextRef context = CGBitmapContextCreate(pxdata,
size.width,
size.height,
8,
4 * size.width,
rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
To:
CGContextRef context = CGBitmapContextCreate(pxdata,
size.width,
size.height,
8,
CVPixelBufferGetBytesPerRow(pxbuffer),
rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
helped.
Hold on: though the answer given by @Peter DeWeese is the direction to follow, the code has two big issues. First, you need to wait until the system is ready to append new media; second, you've got a serious memory leak, as you need to release the buffer after it has been appended to the video writer.
This is true in your case, but even more so in the general case, where you want to loop over multiple frames, as follows:
NSInteger i = 0;
for (; i<n; i++) {
image = allImages[i];
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage cropFrame:frame];
// wait for more media data is ready
while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE) {
NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
[[NSRunLoop currentRunLoop] runUntilDate:maxDate];
}
NSLog(#"Panorama: appending frame %ld out of %ld", (long)i, (long)n);
// Append data
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, freq)]; // was kCMTimeZero
// release the buffer
CVPixelBufferRelease(buffer);
}
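One last hedged note on the loop above: it stops at the append stage. Assuming the writer objects are named as in the earlier answers, you would still close out the session once all frames are in:
//Hedged continuation: finish the session after the loop.
[adaptor.assetWriterInput markAsFinished];
[videoWriter finishWriting]; //later SDKs deprecate this in favor of
//finishWritingWithCompletionHandler:, but it matches the era of this code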