I have tried the following two methods of appending a UIImage's pixel buffer to an AVAssetWriterInput. Everything looks good except there's no data in the video file. What's wrong?
1. Adaptor class
AVAssetWriterInputPixelBufferAdaptor * avAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:NULL];
[avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMake(1, 10)];
2. Making the CMSampleBuffer
// Create sample buffer.
CMSampleBufferRef sampleBuffer = NULL;
result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &timing, &sampleBuffer);
// Ship out the frame.
NSParameterAssert(CMSampleBufferDataIsReady(sampleBuffer));
NSParameterAssert([writerInput isReadyForMoreMediaData]);
BOOL success = [writerInput appendSampleBuffer:sampleBuffer];
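For reference, the videoInfo and timing values used above would have been created along these lines (a sketch; the duration values here are assumptions):
CMVideoFormatDescriptionRef videoInfo = NULL;
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo);
// duration, presentation timestamp, decode timestamp
CMSampleTimingInfo timing = { CMTimeMake(1, 10), CMTimeMake(1, 10), kCMTimeInvalid };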
I found that for some reason I needed to append the buffer more than once. The timing in this example from a test app I made might not be proper, but since it works it should give you a good idea.
+ (void)writeImageAsMovie:(UIImage*)image toPath:(NSString*)path size:(CGSize)size duration:(int)duration
{
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
//Write samples:
CVPixelBufferRef buffer = [Utils pixelBufferFromCGImage:image.CGImage size:size];
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(duration-1, 2)];
//Finish the session:
[writerInput markAsFinished];
[videoWriter endSessionAtSourceTime:CMTimeMake(duration, 2)];
[videoWriter finishWriting];
}
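For reference, a call to this method might look like the following; the class name MovieWriter and the image, path, size, and duration values are illustrative only:
UIImage *image = [UIImage imageNamed:@"photo.png"]; // illustrative image
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"out.mov"];
[MovieWriter writeImageAsMovie:image toPath:path size:CGSizeMake(480, 320) duration:10];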
This method is not required, but is used here as an example of a pixel buffer source:
+ (CVPixelBufferRef) pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
&pxbuffer);
(void)status; // silence the unused-variable warning when NSParameterAssert compiles out
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
size.height, 8, 4*size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
//CGContextTranslateCTM(context, 0, CGImageGetHeight(image));
//CGContextScaleCTM(context, 1.0, -1.0);//Flip vertically to account for different origin
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
I've had a bit of a problem with this code. It gave me a skewed image as a result.
Changing:
CGContextRef context = CGBitmapContextCreate(pxdata,
size.width,
size.height,
8,
4 * size.width,
rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
To:
CGContextRef context = CGBitmapContextCreate(pxdata,
size.width,
size.height,
8,
CVPixelBufferGetBytesPerRow(pxbuffer),
rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
helped. Core Video may pad each row of the buffer for alignment, so the actual bytes-per-row can be larger than 4 * size.width; drawing with the hardcoded stride skews the image.
Hold on: though the answer given by @Peter DeWeese is the direction to follow, the code has two huge issues. Firstly, you need to wait until the system is ready to append new media, and secondly, you've got a big memory leak, as you need to release each buffer after it has been appended to the video writer.
This is true in your case, but even more so in the general case, where you want to loop over multiple frames as follows:
NSInteger i = 0;
for (; i<n; i++) {
image = allImages[i];
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage cropFrame:frame];
// wait for more media data is ready
while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE) {
NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
[[NSRunLoop currentRunLoop] runUntilDate:maxDate];
}
NSLog(#"Panorama: appending frame %ld out of %ld", (long)i, (long)n);
// Append data
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, freq)];//kCMTimeZero];
// release the buffer
CVPixelBufferRelease(buffer);
}
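After the loop, the writer still has to be finished, same as in the answer above (assuming writerInput and videoWriter are the objects the adaptor was built from):
[writerInput markAsFinished];
[videoWriter finishWriting];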
Related
I am relatively new to programming, and although I am OK with normal functions, I am completely new to video editing.
So I have managed to find some code online to do the job, shown below:
- (void)writeImagesAsMovie:(NSArray *)array {
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentDirectory = [paths objectAtIndex:0];
NSString *saveLocation = [documentDirectory stringByAppendingPathComponent:@"temp.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:saveLocation]) {
[[NSFileManager defaultManager] removeItemAtPath:saveLocation error:NULL];
}
UIImage *first = [array objectAtIndex:0];
CGSize frameSize = first.size;
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:saveLocation] fileType:AVFileTypeQuickTimeMovie
error:&error];
if(error) {
NSLog(#"error creating AssetWriter: %#",[error description]);
}
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:frameSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:frameSize.height], AVVideoHeightKey,
nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings];
NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init];
[attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.width] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.height] forKey:(NSString*)kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:attributes];
[videoWriter addInput:writerInput];
// fixes all errors
writerInput.expectsMediaDataInRealTime = YES;
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[first CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
if (result == NO) // fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
if(buffer) {
CVBufferRelease(buffer);
}
//int reverseSort = NO;
NSArray *newArray = array;
int fps = 10;
int i = 0;
for (UIImage *image in newArray)
{
[NSThread sleepForTimeInterval:0.02];
if (adaptor.assetWriterInput.readyForMoreMediaData) {
i++;
CMTime frameTime = CMTimeMake(1, fps);
CMTime lastTime = CMTimeMake(i, fps);
CMTime presentTime = CMTimeAdd(lastTime, frameTime);
UIImage *imgFrame = image;//[UIImage imageWithContentsOfFile:filePath] ;
buffer = [self pixelBufferFromCGImage:[imgFrame CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
if (result == NO) // fails on 3GS, but works on iPhone 4
{
NSLog(@"failed to append buffer");
NSLog(@"The error is %@", [videoWriter error]);
[NSThread sleepForTimeInterval:0.5];
}
if(buffer) {
CVBufferRelease(buffer);
}
} else {
NSLog(#"error");
i--;
}
}
//Finish the session:
[writerInput markAsFinished];
[videoWriter finishWriting];
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
NSLog(#"Movie created successfully");
}
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, CGImageGetWidth(image),
CGImageGetHeight(image), kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
&pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, CGImageGetWidth(image),
CGImageGetHeight(image), 8, 4*CGImageGetWidth(image), rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
// CGAffineTransform flipVertical = CGAffineTransformMake(
// 1, 0, 0, -1, 0, CGImageGetHeight(image)
// );
// CGContextConcatCTM(context, flipVertical);
// CGAffineTransform flipHorizontal = CGAffineTransformMake(
// -1.0, 0.0, 0.0, 1.0, CGImageGetWidth(image), 0.0
// );
//
// CGContextConcatCTM(context, flipHorizontal);
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
But the problem that I am having is that the output video is somehow corrupted: it does play, but with funny lines across the frame (screenshot omitted).
I would be so grateful for any help
Many Thanks
Thomas
I have been seeing problems with the H264 hardware video encoder where it can corrupt input that does not match a known aspect ratio. For example, my testing shows that if one video dimension is smaller than 128 pixels, the video will not encode.
What I have seen working is 128x128, 192x128, 240x160, 480x320, and others.
See this page on aspect ratios
P.S.
You will likely want to use AVAssetWriterInputPixelBufferAdaptor, since it contains a pixel buffer pool that you can draw from via CVPixelBufferPoolCreatePixelBuffer(). Also, you will want to call assert(adaptor.pixelBufferPool); after calling startSessionAtSourceTime: to ensure that your adaptor can write to the writer.
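A minimal sketch of that pattern, assuming the adaptor was created with a non-nil sourcePixelBufferAttributes dictionary:
assert(adaptor.pixelBufferPool); // per the note above, check after startSessionAtSourceTime:
CVPixelBufferRef pixelBuffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &pixelBuffer);
if (status == kCVReturnSuccess) {
    // ... render into pixelBuffer and append it via the adaptor ...
    CVPixelBufferRelease(pixelBuffer); // hands the buffer back to the pool
}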
I am using the code below to make a video from a static 16:9 image using AVAssetWriter. The problem is that for some reason the video that is produced is in a 4:3 format.
Can anyone suggest a way that I can either amend the code to produce a 16:9 video or, alternatively, convert the 4:3 video to 16:9?
Thank you
- (void) createVideoFromStillImage
{
//Set the size according to the device type (iPhone or iPad).
CGSize size = CGSizeMake(screenWidth, screenHeight);
NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/IntroVideo.mov"];
NSError *error = nil;
unlink([betaCompressionDirectory UTF8String]);
//----initialize compression engine
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
if(error)
NSLog(#"error = %#", [error localizedDescription]);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: AVVideoCodecH264,AVVideoCodecKey,
[NSNumber numberWithInt:size.height], AVVideoWidthKey,
[NSNumber numberWithInt:size.width], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
if ([videoWriter canAddInput:writerInput])
NSLog(#"I can add this input");
else
NSLog(#"i can't add this input");
[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
//CGImageRef theImage = [finishedMergedImage CGImage];
CGImageRef theImage = [introImage CGImage];
//dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
int __block frame = 0;
//Calculate how much progress % one frame completion represents. Maximum of 75%.
float currentProgress = 0.0;
float progress = (80.0 / kDurationOfIntroOutro);
//NSLog(#"Progress is %f", progress);
for (int i=0; i<=kDurationOfIntroOutro; i++) {
//Update our progress view for every frame that is generated.
[self updateProgressView:currentProgress];
currentProgress +=progress;
//NSLog(#"CurrentProgress is %f", currentProgress);
frame++;
[NSThread sleepForTimeInterval:0.05]; //Delay to allow buffer to be ready.
CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
if (buffer) {
if (adaptor.assetWriterInput.readyForMoreMediaData)
{
if(![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
NSLog(#"FAIL");
else
NSLog(#"Success:%d", frame);
CFRelease(buffer);
}
}
}
[writerInput markAsFinished];
[videoWriter finishWriting];
[videoWriter release];
//NSLog(#"outside for loop");
//Grab the URL for the video so we can use it later.
NSURL *url = [self applicationDocumentsDirectory:kIntroVideoFileName];
[assetURLArray setObject:url forKey:kIntroVideo];
}
- (CVPixelBufferRef )pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);
// CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
So that this can be closed out, I'll restate what I said above. The videoSettings dictionary should use the target dimensions of your video, but you're passing in the dimensions of your view. Unless that's what you want to record, you'll need to change the values passed for AVVideoWidthKey and AVVideoHeightKey to the correct output sizes.
Given that iOS device screens have aspect ratios close to 4:3, this is probably what was producing that ratio in the recorded video.
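For example, a 16:9 output would pass the target dimensions directly (1280x720 here is purely illustrative):
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:1280], AVVideoWidthKey,
[NSNumber numberWithInt:720], AVVideoHeightKey, nil];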
Please tell me where the leak is in this code...
// here I make a video from images in the Documents directory
- (void) testCompressionSession:(NSString *)path
{
if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
[[NSFileManager defaultManager] removeItemAtPath:path error:nil];
}
NSArray *array = [dictInfo objectForKey:@"sortedKeys"];
NSString *betaCompressionDirectory = path;
NSError *error = nil;
unlink([betaCompressionDirectory UTF8String]);
NSLog(@"array = %@", array);
NSData *imgDataTmp = [NSData dataWithContentsOfFile:[projectPath stringByAppendingPathComponent:[array objectAtIndex:0]]];
NSLog(@"link : %@", [projectPath stringByAppendingPathComponent:[array objectAtIndex:0]]);
CGSize size = CGSizeMake([UIImage imageWithData:imgDataTmp].size.width, [UIImage imageWithData:imgDataTmp].size.height);
//----initialize compression engine
NSLog(@"size : w : %f, h : %f", size.width, size.height);
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
if(error)
NSLog(#"error = %#", [error localizedDescription]);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
if ([videoWriter canAddInput:writerInput])
NSLog(#"I can add this input");
else
NSLog(#"i can't add this input");
[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
//BOOL isEffect = NO;
int i = 0;
float totalTime = 0.0f;
float nextTime = 0;
if ([writerInput isReadyForMoreMediaData]) {
while (1)
{
if (i <= [array count] && i > 0) {
nextTime = [[dictInfo objectForKey:[array objectAtIndex:i-1]] floatValue];
}
totalTime += i == 0 ? 0 : nextTime;
CMTime presentTime=CMTimeMake(totalTime, 1);
printf("presentTime : %f ",CMTimeGetSeconds(presentTime));
if (i >= [array count])
{
NSData *imgData = [NSData dataWithContentsOfFile:[projectPath stringByAppendingPathComponent:[array objectAtIndex:i-1]]];
UIImage* tmpImg = [UIImage imageWithData:imgData];
tmpImg = [self imageWithImage:tmpImg scaledToSize:size];
while (!writerInput.readyForMoreMediaData)
{
[NSThread sleepForTimeInterval:0.01]; // sleep() takes whole seconds, so sleep(0.01) would not wait at all
}
CVPixelBufferRef buffer = NULL;
buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(totalTime-nextTime+(nextTime/2.0), 1)];
NSLog(#"%f",totalTime-nextTime+(nextTime/2.0));
[writerInput markAsFinished];
[videoWriter finishWriting];
//CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
[videoWriter release];
break;
} else {
NSData *imgData = [NSData dataWithContentsOfFile:[projectPath stringByAppendingPathComponent:[array objectAtIndex:i]]];
UIImage* tmpImg = [UIImage imageWithData:imgData];
//tmpImg = [self imageWithImage:tmpImg scaledToSize:size];
//UIImageWriteToSavedPhotosAlbum(tmpImg, nil, nil, nil);
while (!writerInput.readyForMoreMediaData) // adaptor.assetWriterInput is the same object as writerInput
{
[NSThread sleepForTimeInterval:0.01]; // sleep(0.01) truncates to zero seconds
}
CVPixelBufferRef buffer = NULL;
buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
if (buffer)
{
if(![adaptor appendPixelBuffer:buffer withPresentationTime:presentTime])
NSLog(#"FAIL");
else
NSLog(#"Success:%d",i);
CVPixelBufferRelease(buffer);
}
}
i++;
}
}
}];
}
// and here I make the CVPixelBufferRef from a CGImageRef
- (CVPixelBufferRef )pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
The leak log is:
CVObject CFRetain 00:37.957.985 2 0x1ecae0 0 CoreVideo CVPixelBufferPool::createPixelBuffer(__CFAllocator const*, __CFDictionary const*, int*)
Malloc 96 Bytes Malloc 00:40.015.872 1 0x1f0750 96 CoreVideo CVBuffer::init()
CVPixelBuffer Malloc 00:40.969.716 1 0x1f2570 96 CoreVideo CVObject::alloc(unsigned long, __CFAllocator const*, unsigned long, unsigned long)
Look here:
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
CVPixelBufferLockBaseAddress(buffer, 0);
buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
First a pixel buffer gets created and its address is put into the buffer variable; then the same variable gets overwritten by pixelBufferFromCGImage, so its previous content can no longer be released.
EDIT
You've just removed the code I was referring to, so my answer is no longer applicable.
Now this part:
CVPixelBufferRef buffer = NULL;
buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(totalTime-nextTime+(nextTime/2.0), 1)];
NSLog(#"%f",totalTime-nextTime+(nextTime/2.0));
...
You have a commented-out CVPixelBufferPoolRelease(adaptor.pixelBufferPool), which is okay, since in this version you have no pixel buffer pool, but a call to CVPixelBufferRelease(buffer) is missing here.
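A corrected version of that fragment, with the missing release added, would look like this:
CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(totalTime-nextTime+(nextTime/2.0), 1)];
CVPixelBufferRelease(buffer); // release what pixelBufferFromCGImage created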
The call to appendPixelBuffer: returns NO on a 3GS device (iOS 4.1) but works well on iPhone 4 devices.
The following call to appendPixelBuffer: is the source of the problem:
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[[UIImage imageNamed:@"frame1.png"] CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
if (result == NO) // fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
Full Code:
-(void)writeImagesAsMovie:(NSArray *)array toPath:(NSString*)path {
NSLog(@"%@", path); // pass the path as an argument, not as the format string
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
if(error) {
NSLog(#"error creating AssetWriter: %#",[error description]);
}
The error (only on 3GS; iPhone 4 is fine) is:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation couldn't be completed. (AVFoundationErrorDomain error -11800.)" UserInfo=0x4970530 {NSUnderlyingError=0x496d2c0 "The operation couldn't be completed. (OSStatus error -12908.)"}
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:frameSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:frameSize.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
NSMutableDictionary *attributes = [[NSMutableDictionary alloc]init];
[attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.width] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:frameSize.height] forKey:(NSString*)kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:attributes];
[videoWriter addInput:writerInput];
// fixes all errors
writerInput.expectsMediaDataInRealTime = YES;
//Start a session:
BOOL start = [videoWriter startWriting];
NSLog(#"Session started? %d", start);
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[[UIImage imageNamed:@"frame1.png"] CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
if (result == NO) // fails on 3GS, but works on iPhone 4
NSLog(@"failed to append buffer");
if(buffer)
CVBufferRelease(buffer);
[NSThread sleepForTimeInterval:0.05];
//for (int i = 1;i<[array count]; i++)
for (int i = 1;i<20; i++)
{
if (adaptor.assetWriterInput.readyForMoreMediaData)
{
NSLog(#"inside for loop %d",i);
CMTime frameTime = CMTimeMake(1, 15);
CMTime lastTime=CMTimeMake(i, 15);
CMTime presentTime=CMTimeAdd(lastTime, frameTime);
NSString *imgName = [NSString stringWithFormat:@"frame%d.png", i];
UIImage *imgFrame = [UIImage imageNamed:imgName] ;
buffer = [self pixelBufferFromCGImage:[imgFrame CGImage]];
BOOL result = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
if (result == NO) // fails on 3GS, but works on iPhone 4
{
NSLog(@"failed to append buffer");
NSLog(@"The error is %@", [videoWriter error]);
}
if(buffer)
CVBufferRelease(buffer);
[NSThread sleepForTimeInterval:0.05];
}
else
{
NSLog(#"error");
i--;
}
}
//Finish the session:
[writerInput markAsFinished];
[videoWriter finishWriting];
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
[videoWriter release];
[writerInput release];
NSLog(#"Movie created successfully");
}
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, self.view.frame.size.width,
self.view.frame.size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
&pxbuffer);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, self.view.frame.size.width,
self.view.frame.size.height, 8, 4*self.view.frame.size.width, rgbColorSpace,
kCGImageAlphaNoneSkipFirst);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}
Check the Console log for the device in Xcode's Organizer.
I got the same error on an iPod touch 2nd gen, and the console log reported:
com.apple.mediaserverd[18] : VTSelectAndCreateVideoEncoderInstance: no video encoder found for 'avc1'
Exactly what that means I'm still working out; however, it points me in a direction.
I have successfully created a video from images using the following code:
-(void)writeImageAsMovie:(NSArray *)array toPath:(NSString*)path size:(CGSize)size duration:(int)duration
{
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* writerInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:0] CGImage]];
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
//Write samples:
for (int i = 0;i<[array count]; i++)
{
if([writerInput isReadyForMoreMediaData])
{
NSLog(#"inside for loop %d",i);
CMTime frameTime = CMTimeMake(1, 20);
CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 24 of the loop above
CMTime presentTime=CMTimeAdd(lastTime, frameTime);
buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:i] CGImage]];
[adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
}
else
{
NSLog(#"error");
i--;
}
}
NSLog(#"outside for loop");
//Finish the session:
[writerInput markAsFinished];
[videoWriter finishWriting];
}
Here I have used CVPixelBufferRef directly. Instead, I want to use a CVPixelBufferPoolRef in conjunction with AVAssetWriterInputPixelBufferAdaptor.
Can anybody provide an example which I can debug and use?
You are passing nil for sourcePixelBufferAttributes, which is why the pixel buffer pool does not get created:
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
Instead pass some attributes, for example:
NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
Then you can use the pool to create the pixel buffers, like:
CVPixelBufferPoolCreatePixelBuffer (NULL, adaptor.pixelBufferPool, &pixelBuffer);
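Putting it together, a sketch of a frame loop built on the pool; the drawing step is elided, and adaptor, writerInput, and presentTime are as in the surrounding answers:
while (writerInput.readyForMoreMediaData) {
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pixelBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    // ... draw the next frame into CVPixelBufferGetBaseAddress(pixelBuffer) ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentTime];
    CVPixelBufferRelease(pixelBuffer); // hand the buffer back to the pool
    // break out once all frames have been appended
}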
@Atulkumar V. Jain: great! Good luck ^^
@Brian: you are right, thanks. I corrected it and got it working. Here is the working code, in case someone else needs it :-)
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:[[imagesArray objectAtIndex:0] CGImage]];
// Note: don't also call CVPixelBufferPoolCreatePixelBuffer() into the same variable here;
// that would overwrite (and leak) the buffer just created from the first image.
[adaptor_ appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
CVPixelBufferRelease(buffer); // release the first-frame buffer once it has been appended
__block UInt64 convertedByteCount = 0;
dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
static int i = 1;
int frameNumber = [imagesArray count];
[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
while (1){
if (i == frameNumber) {
break;
}
if ([writerInput isReadyForMoreMediaData]) {
CVPixelBufferRef sampleBuffer = [self pixelBufferFromCGImage:[[imagesArray objectAtIndex:i] CGImage]];
NSLog(#"inside for loop %d",i);
CMTime frameTime = CMTimeMake(1, 20);
CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 19 of the loop above
CMTime presentTime=CMTimeAdd(lastTime, frameTime);
if (sampleBuffer) {
[adaptor_ appendPixelBuffer:sampleBuffer withPresentationTime:presentTime];
i++;
CFRelease(sampleBuffer);
} else {
break;
}
}
}
NSLog (#"done");
[writerInput markAsFinished];
[videoWriter finishWriting];
CVPixelBufferPoolRelease(adaptor_.pixelBufferPool);
[videoWriter release];
[writerInput release];
[imagesArray removeAllObjects];
}];
Instead of using a "for" loop, use this code:
dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
// Create and append the first frame; don't overwrite this buffer with a pool buffer afterwards,
// or the image buffer will leak and a blank frame will be appended instead.
CVPixelBufferRef buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:0] CGImage]];
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
CVPixelBufferRelease(buffer);
int i = 1;
while (writerInput.readyForMoreMediaData) {
NSLog(#"inside for loop %d",i);
CMTime frameTime = CMTimeMake(1, 20);
CMTime lastTime=CMTimeMake(i, 20); //i is from 0 to 19 of the loop above
CMTime presentTime=CMTimeAdd(lastTime, frameTime);
if (i >= [array count]) {
buffer = NULL;
}else {
buffer = [self pixelBufferFromCGImage:[[array objectAtIndex:i] CGImage]];
}
//CVBufferRetain(buffer);
if (buffer) {
// append buffer, then release the one we created
[adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
CVPixelBufferRelease(buffer);
i++;
} else {
// done!
//Finish the session:
[writerInput markAsFinished];
[videoWriter finishWriting];
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
[videoWriter release];
[writerInput release];
NSLog (#"Done");
[imageArray removeAllObjects];
break;
}
}
}];
I got it all working!
Here is the sample code link: git@github.com:RudyAramayo/AVAssetWriterInputPixelBufferAdaptorSample.git
Here is the code you need:
- (void) testCompressionSession
{
CGSize size = CGSizeMake(480, 320);
NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
NSError *error = nil;
unlink([betaCompressionDirectory UTF8String]);
//----initialize compression engine
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
if(error)
NSLog(#"error = %#", [error localizedDescription]);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
if ([videoWriter canAddInput:writerInput])
NSLog(#"I can add this input");
else
NSLog(#"i can't add this input");
[videoWriter addInput:writerInput];
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
//---
// insert demo debugging code to write the same image repeated as a movie
CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage];
dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
int __block frame = 0;
[writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
while ([writerInput isReadyForMoreMediaData])
{
if(++frame >= 120)
{
[writerInput markAsFinished];
[videoWriter finishWriting];
[videoWriter release];
break;
}
CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
if (buffer)
{
if(![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
NSLog(#"FAIL");
else
NSLog(#"Success:%d", frame);
CFRelease(buffer);
}
}
}];
NSLog(#"outside for loop");
}
- (CVPixelBufferRef )pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);
// CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}