Hold multiple frames in memory before sending them to AVAssetWriter - iPhone

I need to hold some video frames from a captureSession in memory and write them to a file when 'something' happens.
Similar to this solution, I use this code to put a frame into an NSMutableArray:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    //...
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    NSData *rawFrame = [[NSData alloc] initWithBytes:(void *)baseAddress length:(height * bytesPerRow)];
    [m_frameDataArray addObject:rawFrame];
    [rawFrame release];
    //...
}
And this to write the video file:
-(void)writeFramesToFile
{
    //...
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithInt:640], AVVideoWidthKey,
                                    [NSNumber numberWithInt:480], AVVideoHeightKey,
                                    AVVideoCodecH264, AVVideoCodecKey,
                                    nil];
    AVAssetWriterInput *bufferAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    AVAssetWriter *bufferAssetWriter = [[AVAssetWriter alloc] initWithURL:pathURL fileType:AVFileTypeQuickTimeMovie error:&error];
    [bufferAssetWriter addInput:bufferAssetWriterInput];
    [bufferAssetWriter startWriting];
    [bufferAssetWriter startSessionAtSourceTime:startTime];
    for (NSInteger i = 1; i < m_frameDataArray.count; i++){
        NSData *rawFrame = [m_frameDataArray objectAtIndex:i];
        CVImageBufferRef imgBuf = [rawFrame bytes];
        [pixelBufferAdaptor appendPixelBuffer:imgBuf withPresentationTime:CMTimeMake(1,10)]; //<-- EXC_BAD_ACCESS
        [rawFrame release];
    }
    //... (finishing video file)
}
But something is wrong with the imgBuf reference. Any suggestions? Thanks in advance.

You're supposed to lock the base address before accessing the imageBuffer's pixel data:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); // so the length below is well defined
size_t height = CVPixelBufferGetHeight(imageBuffer);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
NSData *rawFrame = [[NSData alloc] initWithBytes:(void *)baseAddress length:(height * bytesPerRow)];
[m_frameDataArray addObject:rawFrame];
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

This is pretty old, but to help those who come after, there are a few issues to fix:
1. Lock/unlock the base address when you copy the bytes out, as suggested in Alex's answer.
2. CVImageBufferRef is an abstract base class type. You want to use CVPixelBufferCreateWithBytes to make an instance, not just typecast the raw pixel bytes. (The system needs to know the size/format of those pixels.)
3. You should create and store the new CVPixelBuffer directly from the original one's data instead of using an intermediary NSData for storage. That way you only have to do one copy instead of two. A rough sketch of points 2 and 3 follows below.
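A minimal sketch of points 2 and 3, assuming 32BGRA capture output (m_pixelBufferArray is a made-up name for the storage array; CVPixelBufferCreate plus an explicit row copy is used here so the stored buffer owns its own memory):
// Inside captureOutput:didOutputSampleBuffer:fromConnection:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);

size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

// Make a real CVPixelBuffer instance; the system now knows size and format.
CVPixelBufferRef copy = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA, NULL, &copy);

// Deep-copy row by row (the new buffer may use a different row stride).
CVPixelBufferLockBaseAddress(copy, 0);
uint8_t *src = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(copy);
size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(copy);
for (size_t row = 0; row < height; row++) {
    memcpy(dst + row * dstBytesPerRow, src + row * srcBytesPerRow, MIN(srcBytesPerRow, dstBytesPerRow));
}
CVPixelBufferUnlockBaseAddress(copy, 0);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

// NSArray retains/releases CF objects, so the array keeps the buffer alive.
[m_pixelBufferArray addObject:(id)copy];
CVPixelBufferRelease(copy);
When writing, each stored buffer can then go straight to appendPixelBuffer:withPresentationTime:. Note that the presentation time must increase per frame, e.g. CMTimeMake(i, 10) for frame i, rather than a constant CMTimeMake(1, 10).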

Related

Make UIImage from byte array returned by a .NET web service

My problem is the same as Converting byte array coming from Web service to UIImage iPhone. I am storing these bytes in an NSMutableArray, but the method
NSData *data = [NSData dataWithBytes:YOUR_BYTE_ARRAY length:ARRAY_LENGTH];
takes a C array of bytes as its parameter. So can anyone tell me how to convert my NSMutableArray into a byte array? I searched a lot but was unable to find anything relevant.
Not sure how you're getting the mutable array to begin with. If you're using NSURLConnection, the delegate already receives NSData, so you needn't use a mutable array at all. Consider getting the data with the asynchronous block-based connection method, like this:
NSURLRequest *myRequest = // the request you've already got working to get image data
[NSURLConnection sendAsynchronousRequest:myRequest queue:[NSOperationQueue mainQueue] completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (!error) {
        // image from data with no intermediate mutable array or byte array
        UIImage *image = [UIImage imageWithData:data];
    }
}];
Thanks for your co-operation, but it did not help me. After a long search I found my solution. I am sharing it so that others can get the right answer.
NSArray *byteArray = [[NSArray alloc] init]; // store all the data coming from the server here, as strings of byte values
unsigned c = [byteArray count];
uint8_t *bytes = malloc(sizeof(*bytes) * c);
unsigned i;
for (i = 0; i < c; i++)
{
    NSString *str = [byteArray objectAtIndex:i];
    int byte = [str intValue];
    bytes[i] = (uint8_t)byte;
}
NSData *data = [[NSData alloc] initWithBytes:bytes length:c];
free(bytes); // initWithBytes: copies the buffer, so it can be freed here
UIImage *image = [UIImage imageWithData:data];
NSLog(@"image %@", image);

H.264 hardware encoding/decoding for iOS (iPhone/iPad)?

I have done real-time video processing in iOS using the AVFoundation framework, with the help of this link, and tested that it works fine. Now I want to use H.264 encoding and decoding [before drawing]. I tried to get H.264-encoded data from the AVCaptureSession, so I set AVVideoCodecH264 in the capture output's video settings before starting the capture:
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               nil];
[captureOutput setVideoSettings:videoSettings];
The code above doesn't produce any change in the output buffer; I get the same buffer format as before. How can I accomplish this? Is it even possible? If so, please help me get started with H.264 on iOS.
There's no direct access to encoded keyframes. You may find some insightful comments about that here: https://stackoverflow.com/questions/4399162/how-to-stream-live-video-from-iphone-to-a-media-server-like-wowza
The AVVideoCodecH264 / AVVideoCodecKey pair may, however, be used in the outputSettings of an AVAssetWriterInput:
NSError *error = nil;
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                AVVideoCodecH264, AVVideoCodecKey,
                                [NSNumber numberWithInt:640], AVVideoWidthKey,   // width and height are
                                [NSNumber numberWithInt:480], AVVideoHeightKey,  // required alongside the codec
                                nil];
AVAssetWriterInput *assetWriterInput = [[AVAssetWriterInput alloc]
                                        initWithMediaType:AVMediaTypeVideo
                                        outputSettings:outputSettings];
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:fileURL
                                                       fileType:AVFileTypeMPEG4
                                                          error:&error];
if (!error && [assetWriter canAddInput:assetWriterInput])
    [assetWriter addInput:assetWriterInput];
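To actually drive the H.264 encode, the writer has to be started and fed sample buffers from the capture callback. A minimal sketch, assuming the assetWriter and assetWriterInput from above and that this runs inside captureOutput:didOutputSampleBuffer:fromConnection::
// Start the writer once, using the first buffer's timestamp as the session start.
if (assetWriter.status == AVAssetWriterStatusUnknown) {
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
}
// Append raw capture buffers; they are encoded to H.264 as the file is written.
if (assetWriterInput.readyForMoreMediaData) {
    [assetWriterInput appendSampleBuffer:sampleBuffer];
}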

Cannot load image with the path URL returned by ALAssets

I am writing an image on the iPad using ALAssets. When it finishes, I try to create a UIImage with the returned URL, but it won't load. This is the code:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[anImage CGImage] orientation:(ALAssetOrientation)[anImage imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error){
    if (!error) {
        CGImageSourceRef src = CGImageSourceCreateWithURL((CFURLRef)[NSURL fileURLWithPath:[assetURL absoluteString]], NULL);
        //...
    }
}];
My purpose is to save an image to the device, then convert it to another format using ImageIO and finally send it to a web service. The CGImageSourceRef is NULL; I also tried with a standard UIImage, with the same result.
What am I doing wrong here?
EDIT: The problem is when creating the CFURLRef.
If I do
CGImageSourceCreateWithURL((CFURLRef) assetURL, NULL);
I got this error
ImageIO: CGImageSourceCreateWithURL CFURLCreateDataAndPropertiesFromResource failed with error code -11.
But if I try to convert the URL with
[NSURL fileURLWithPath:[assetURL absoluteString]]
the path is changed to
assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
I cannot find how to properly create the CFURLRef needed by the method. I tried printing all the conversions I could think of, and these are the results:
[assetURL relativePath]   -> /asset.JPG
[assetURL relativeString] -> assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG
[assetURL absoluteURL]    -> assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG
[assetURL absoluteString] -> assets-library://asset/asset.JPG?id=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG
[NSURL fileURLWithPath:[assetURL relativePath]]   -> file://localhost/asset.JPG
[NSURL fileURLWithPath:[assetURL relativeString]] -> assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
[NSURL fileURLWithPath:[assetURL absoluteString]] -> assets-library:/asset/asset.JPG%3Fid=57BBBA99-E7BF-4DB7-839E-F915005E6DFA&ext=JPG -- file://localhost/
Help please, I am stuck with this :-(
This is what I did for my case:
UIImage *anImage; // this is the original image
NSData *imgData = UIImageJPEGRepresentation(anImage, 0.7);
CGImageSourceRef src = CGImageSourceCreateWithData((CFDataRef)imgData, NULL);
NSMutableData *data = [NSMutableData data];
CFStringRef imageType = CFSTR("com.microsoft.bmp");
CGImageDestinationRef myImageDest = CGImageDestinationCreateWithData((CFMutableDataRef)data, imageType, 1, nil);
// Convert!
CGImageDestinationAddImageFromSource(myImageDest, src, 0, NULL); // no extra options needed here
CGImageDestinationFinalize(myImageDest);
// Freeing things
CFRelease(myImageDest);
CFRelease(src);
But this just converts the image; it doesn't store it in any file, so I'm not sure this should be an answer to the original question.
If you already have an ALAsset and your goal is a CGImageRef, you can do something like this:
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSDictionary *options = [[NSDictionary alloc] initWithObjectsAndKeys:
                         (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailWithTransform,
                         (id)kCFBooleanTrue, (id)kCGImageSourceCreateThumbnailFromImageAlways,
                         (id)[NSNumber numberWithDouble:400], (id)kCGImageSourceThumbnailMaxPixelSize,
                         nil];
CGImageRef image = [rep CGImageWithOptions:options];
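The step the question is missing is that an assets-library:// URL is not a file URL and can't be opened with CGImageSourceCreateWithURL at all; it has to be resolved back through ALAssetsLibrary first. A minimal sketch (asynchronous, so the image must be consumed inside the result block):
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    // Full-resolution CGImage of the saved photo; hand it to ImageIO from here.
    CGImageRef fullImage = [[asset defaultRepresentation] fullResolutionImage];
    // ... e.g. add it to a CGImageDestination for format conversion ...
} failureBlock:^(NSError *error) {
    NSLog(@"couldn't load asset: %@", error);
}];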

Why won't NSMutableDictionary write to a file?

- (void)viewDidLoad
{
    [super viewDidLoad];
    if ([[NSFileManager defaultManager] fileExistsAtPath:pathString])
    {
        infoDict = [[NSMutableDictionary alloc] initWithContentsOfFile:pathString];
    }
    else
    {
        infoDict = [[NSMutableDictionary alloc] initWithObjects:[NSArray arrayWithObjects:@"BeginFrame", @"EndFrame", nil] forKeys:[NSArray arrayWithObjects:[NSNumber numberWithBool:YES], [NSNumber numberWithBool:YES], nil]];
        if ([infoDict writeToFile:pathString atomically:YES])
        {
            NSLog(@"Created");
        }
        else
        {
            NSLog(@"Is not created");
            NSLog(@"Path %@", pathString);
        }
    }
}
This is my code. I check whether the file exists; if not, I create an NSMutableDictionary and write it to a file at that path, but the writeToFile method returns NO. Where is the problem? If I create the file with NSFileManager it works, but not when I want to write the dictionary.
writeToFile:atomically: only works if the dictionary you call it on is a valid property list object (see the docs).
For an NSDictionary to be a valid property list object, among other things, its keys must all be strings, but in your example the keys are NSNumber instances; a corrected version is sketched below.
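A minimal sketch of the corrected initialization, assuming the intent was to map the frame names to boolean flags (note that the original call also had the objects and keys swapped):
infoDict = [[NSMutableDictionary alloc] initWithObjects:[NSArray arrayWithObjects:[NSNumber numberWithBool:YES], [NSNumber numberWithBool:YES], nil]
                                                forKeys:[NSArray arrayWithObjects:@"BeginFrame", @"EndFrame", nil]];
// NSNumber values are fine in a property list; it's non-string keys that make writeToFile: return NO.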
Sometimes you cannot control the content you are going to write. For example, you can't avoid a null value when writing a JSON object that came from a server.
NSData is compatible with these "invalid" values, so converting the NSArray or NSDictionary to NSData first is an ideal workaround in such cases.
write:
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:jsonObject];
[data writeToFile:path atomically:YES];
read:
NSData *data = [NSData dataWithContentsOfFile:path];
NSDictionary *jsonObject = [NSKeyedUnarchiver unarchiveObjectWithData:data];

How to copy songs from iPod Library to app and play with AVAudioPlayer?

Though this is a duplicate, I want to play songs from the iPod library using AVAudioPlayer. So is it possible to copy iPod library songs into the app and play them from there?
I have code which converts a song to CAF format, and then I am able to play it using AVAudioPlayer, but the conversion takes too long. Is there any other way I can do this?
The following code will read an iPod library song from its URL and copy it to disk for use.
Note this code won't work with M4As, because the iPod library URLs for M4As don't contain headers; for those you will need AVAssetExportSession to write the header and music data to disk (see the sketch after the code).
-(void)exportMP3:(NSURL*)url toFileUrl:(NSString*)fileURL
{
    AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:url options:nil] autorelease];
    AVAssetReader *reader = [[[AVAssetReader alloc] initWithAsset:asset error:nil] autorelease];
    NSMutableArray *myOutputs = [[[NSMutableArray alloc] init] autorelease];
    for (id track in [asset tracks])
    {
        AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track outputSettings:nil];
        [myOutputs addObject:output];
        [reader addOutput:output];
    }
    [reader startReading];
    NSFileHandle *fileHandle;
    NSFileManager *fm = [NSFileManager defaultManager];
    if (![fm fileExistsAtPath:fileURL])
    {
        [fm createFileAtPath:fileURL contents:[[[NSData alloc] init] autorelease] attributes:nil];
    }
    fileHandle = [NSFileHandle fileHandleForUpdatingAtPath:fileURL];
    [fileHandle seekToEndOfFile];
    AVAssetReaderOutput *output = [myOutputs objectAtIndex:0];
    int totalBuff = 0;
    while (TRUE)
    {
        CMSampleBufferRef ref = [output copyNextSampleBuffer];
        if (ref == NULL)
            break;
        // copy this buffer's data to the file, then read the next one
        AudioBufferList audioBufferList;
        NSMutableData *data = [[NSMutableData alloc] init];
        CMBlockBufferRef blockBuffer;
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(ref, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);
        for (int y = 0; y < audioBufferList.mNumberBuffers; y++)
        {
            AudioBuffer audioBuffer = audioBufferList.mBuffers[y];
            Float32 *frame = audioBuffer.mData;
            [data appendBytes:frame length:audioBuffer.mDataByteSize];
        }
        totalBuff++;
        CFRelease(blockBuffer);
        CFRelease(ref);
        [fileHandle writeData:data];
        [data release];
    }
    [fileHandle closeFile];
}
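For the M4A case mentioned above, a rough sketch using AVAssetExportSession (songURL would come from the item's MPMediaItemPropertyAssetURL; destinationPath is an illustrative path in the app sandbox):
// Export an iPod library item to the app sandbox as .m4a, headers included.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:songURL options:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetAppleM4A];
exporter.outputFileType = AVFileTypeAppleM4A;
exporter.outputURL = [NSURL fileURLWithPath:destinationPath];
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        // The file at destinationPath is now playable with AVAudioPlayer.
    }
}];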
You could just use MPMusicPlayerController, which is built for providing an iPod-style interface within an app. See the docs: http://developer.apple.com/library/ios/#documentation/mediaplayer/reference/MPMusicPlayerController_ClassReference/Reference/Reference.html.
If you really need to use MPMediaPickerController (which is what I'm assuming you are using to select songs from the user's library), see this post for how to play back the songs:
Using MPMediaItems with AVAudioPlayer
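For reference, a minimal sketch of playing a picked collection with MPMusicPlayerController (mediaItemCollection would come from the picker's mediaPicker:didPickMediaItems: delegate callback):
MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
[player setQueueWithItemCollection:mediaItemCollection];
[player play];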