ALAssets get video data - iPhone

I am trying to access video data from the ALAssets library using the code below:
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc(rep.size);
    NSError *error = nil;
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:&error];
    NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
It works fine for small videos as well as for pictures, but if I try to get a large video, the code crashes saying:
    *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSConcreteData initWithBytes:length:copy:freeWhenDone:bytesAreVM:]: absurd length: 4294967295, maximum size: 2147483648 bytes'
I have no idea what's going on. Does anyone have any thoughts?
Thanks in advance!

I found the solution. I guess the crash was caused by a huge memory spike when uploading large files, because I was buffering the whole file. Now I read the file data in 5 MB chunks, and this fixed the crash. I am pasting my code below.
    - (NSData *)getDataPartAtOffset:(NSInteger)offset {
        NSData *chunkData = nil;
        if (fileAsset_) {
            static const NSUInteger BufferSize = PART_SIZE; // 5 MB chunk; PART_SIZE is defined elsewhere
            ALAssetRepresentation *rep = [fileAsset_ defaultRepresentation];
            uint8_t *buffer = calloc(BufferSize, sizeof(*buffer));
            NSUInteger bytesRead = 0;
            NSError *error = nil;
            @try {
                bytesRead = [rep getBytes:buffer fromOffset:offset length:BufferSize error:&error];
                chunkData = [NSData dataWithBytes:buffer length:bytesRead];
            }
            @catch (NSException *exception) {
                chunkData = nil;
                // Handle the exception here...
            }
            free(buffer); // freed exactly once, whether or not an exception was thrown
        } else {
            NSLog(@"failed to retrieve asset");
        }
        return chunkData;
    }
And I call this function like so:
    NSInteger offset = 0; // offset that keeps track of the chunk position
    do {
        @autoreleasepool {
            NSData *chunkData = [self getDataPartAtOffset:offset];
            if (!chunkData || ![chunkData length]) { // finished reading data
                break;
            }
            // do your stuff here
            offset += [chunkData length];
        }
    } while (1);

chilitechno's bit here worked for me.

Related

How to read bytes from NSData

Can anyone suggest a method to read bytes from NSData (like the read function in @interface NSInputStream : NSStream)?
The question How to read binary bytes in NSData? may help you:
    NSString *path = @"…put the path to your file here…";
    NSData *fileData = [NSData dataWithContentsOfFile:path];
    const char *fileBytes = (const char *)[fileData bytes];
    NSUInteger length = [fileData length];
    NSUInteger index;
    for (index = 0; index < length; index++) {
        char aByte = fileBytes[index];
        // Do something with each byte
    }
You can also create an NSInputStream from an NSData object, if you need the read interface:
    NSData *data = ...;
    NSInputStream *readData = [[NSInputStream alloc] initWithData:data];
    [readData open];
However, you should be aware that initWithData copies the contents of data.
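Once the stream is open, bytes come out through read:maxLength:; a minimal sketch of that loop:
    uint8_t buf[1024];
    NSInteger bytesRead;
    while ((bytesRead = [readData read:buf maxLength:sizeof(buf)]) > 0) {
        // process the first bytesRead bytes of buf here
    }
    [readData close];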
One of the simplest ways is to use NSData getBytes:range:.
    NSData *data = ...;
    char buffer[numberOfBytes];
    [data getBytes:buffer range:NSMakeRange(position, numberOfBytes)];
where position is the offset in the NSData you want to read from, and numberOfBytes is how many bytes you want to read. No copy is needed.
Alex already mentioned NSData getBytes:range: but there is also NSData getBytes:length: which starts from the first byte.
    NSData *data = ...;
    char buffer[numberOfBytes];
    [data getBytes:buffer length:numberOfBytes];
My way of doing it (don't forget to free the byte array after use):
    NSData *dat = ...; // your code
    NSLog(@"Receive from Peripheral: %@", dat);
    NSUInteger len = [dat length];
    Byte *bytedata = (Byte *)malloc(len);
    [dat getBytes:bytedata length:len];
    int p = 0;
    while (p < len)
    {
        printf("%02x", bytedata[p]);
        if (p != len - 1)
        {
            printf("-");
        } // printf("%c", bytedata[p]);
        p++;
    }
    printf("\n");
    // byte array manipulation
    free(bytedata);
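For example, a four-byte payload 0xDE 0xAD 0xBE 0xEF would print as de-ad-be-ef.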

Stuck/leak when allocating data for NSData?

I'm stuck at this method and I don't know why!
Can anyone point me to some source code?
Thank you so much!
This is my source code:
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    .......
    readData = [readFileHandle readDataOfLength:defaultLen];
    NSData *decryptedData = nil;
    // check if this is the last buffer of the file
    NSUInteger exLen = [readData length] % 16;
    NSUInteger decryptionLen = [readData length] - exLen;
    NSData *dataForDecryption = nil;
    if (exLen != 0)
    {
        stuck at here -> [readData getBytes:dataForDecryption length:decryptionLen];
        //
        decryptedData = [dataForDecryption AES256DecryptWithKey:KEY];
        self.isEndOfFile = YES;
    }
    else
        decryptedData = [readData AES256DecryptWithKey:KEY];
    [readFileHandle closeFile];
    .......
    [pool drain];
I've tried functions such as:
    NSData *dataForDecryption = [[[NSData alloc] initWithBytes:readData length:decryptionLen] autorelease];
    NSData *dataForDecryption = [NSData dataWithBytes:readData length:decryptionLen];
but I get the same error.
When I use
    dataForDecryption = [readFileHandle readDataOfLength:decryptionLen];
it gets stuck at the position above, and the size read is 0, although it's not EOF.
Thanks
stuck at here -> [readData getBytes:dataForDecryption length:decryptionLen];
You're passing dataForDecryption, which is an NSData *, but the parameter is supposed to be a buffer, i.e. a void *. If you want an NSData *, you should instead use a method like subdataWithRange:.
    dataForDecryption = [readData subdataWithRange:NSMakeRange(0, decryptionLen)];
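Alternatively, if you do want the getBytes:length: route, pass a real byte buffer instead of an NSData pointer; a sketch reusing decryptionLen from above:
    uint8_t *buf = malloc(decryptionLen);
    [readData getBytes:buf length:decryptionLen];
    // dataWithBytesNoCopy takes ownership of buf and frees it for you
    NSData *dataForDecryption = [NSData dataWithBytesNoCopy:buf length:decryptionLen freeWhenDone:YES];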

Skip over frames while processing video on iOS

I'm trying to process a local video file and simply do some analysis on the pixel data. Nothing is being output. My current code iterates through each frame of the video but I'd actually like to skip ~15 frames at a time to speed things up. Is there a way to skip over frames without decoding them?
In Ffmpeg, I could simply call av_read_frame without calling avcodec_decode_video2.
Thanks in advance! Here's my current code:
    - (void)readMovie:(NSURL *)url
    {
        [self performSelectorOnMainThread:@selector(updateInfo:) withObject:@"scanning" waitUntilDone:YES];
        startTime = [NSDate date];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:
         ^{
             dispatch_async(dispatch_get_main_queue(),
             ^{
                 AVAssetTrack *videoTrack = nil;
                 NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
                 if ([tracks count] == 1)
                 {
                     videoTrack = [tracks objectAtIndex:0];
                     videoDuration = CMTimeGetSeconds([videoTrack timeRange].duration);
                     NSError *error = nil;
                     // _movieReader is a member variable
                     _movieReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
                     if (error)
                         NSLog(@"%@", error.localizedDescription);
                     NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
                     NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8Planar];
                     NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
                     AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput
                                                         assetReaderTrackOutputWithTrack:videoTrack
                                                         outputSettings:videoSettings];
                     output.alwaysCopiesSampleData = NO;
                     [_movieReader addOutput:output];
                     if ([_movieReader startReading])
                     {
                         NSLog(@"reading started");
                         [self readNextMovieFrame];
                     }
                     else
                     {
                         NSLog(@"reading can't be started");
                     }
                 }
             });
         }];
    }
    - (void)readNextMovieFrame
    {
        //NSLog(@"readNextMovieFrame called");
        if (_movieReader.status == AVAssetReaderStatusReading)
        {
            //NSLog(@"status is reading");
            AVAssetReaderTrackOutput *output = [_movieReader.outputs objectAtIndex:0];
            CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
            if (sampleBuffer)
            {   // I'm guessing this is the expensive part that we can skip if we want to skip frames
                CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
                // Lock the image buffer
                CVPixelBufferLockBaseAddress(imageBuffer, 0);
                // Get information about the image
                uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
                size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
                size_t width = CVPixelBufferGetWidth(imageBuffer);
                size_t height = CVPixelBufferGetHeight(imageBuffer);
                // do my pixel analysis
                // Unlock the image buffer
                CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
                CFRelease(sampleBuffer);
                [self readNextMovieFrame];
            }
            else
            {
                NSLog(@"could not copy next sample buffer. status is %d", _movieReader.status);
                NSTimeInterval scanDuration = -[startTime timeIntervalSinceNow];
                float scanMultiplier = videoDuration / scanDuration;
                NSString *info = [NSString stringWithFormat:@"Done\n\nvideo duration: %f seconds\nscan duration: %f seconds\nmultiplier: %f", videoDuration, scanDuration, scanMultiplier];
                [self performSelectorOnMainThread:@selector(updateInfo:) withObject:info waitUntilDone:YES];
            }
        }
        else
        {
            NSLog(@"status is now %d", _movieReader.status);
        }
    }
    - (void)updateInfo:(id)message
    {
        NSString *info = [NSString stringWithFormat:@"%@", message];
        [infoTextView setText:info];
    }
If you want less accurate frame processing (not frame by frame), you should use AVAssetImageGenerator.
This class returns a frame for a specified time you ask for.
Specifically, build an array of times spanning the clip's duration, with a 0.5 s step between entries (the iPhone films at roughly 30 fps, so every 15 frames works out to about one frame every 0.5 s), and let the image generator return your frames, as in the sketch below.
For each frame you can compare the time you requested with the actual time of the frame. The default tolerance is around 0.5 s from the time you asked for, but you can change that via the properties requestedTimeToleranceBefore and requestedTimeToleranceAfter.
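A minimal sketch of that approach, assuming an AVAsset named asset and a 0.5 s step (the step, tolerances, and names are illustrative):
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    // Tighten the tolerances if you need frames close to the requested times;
    // looser tolerances let the generator pick a cheaper nearby frame.
    generator.requestedTimeToleranceBefore = CMTimeMakeWithSeconds(0.1, 600);
    generator.requestedTimeToleranceAfter = CMTimeMakeWithSeconds(0.1, 600);
    NSMutableArray *times = [NSMutableArray array];
    Float64 duration = CMTimeGetSeconds(asset.duration);
    for (Float64 t = 0; t < duration; t += 0.5) {
        [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(t, 600)]];
    }
    [generator generateCGImagesAsynchronouslyForTimes:times
                                    completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                                                        AVAssetImageGeneratorResult result, NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            // run your pixel analysis on image here; actualTime tells you which frame you got
        }
    }];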
I hope I answered your question,
Good luck.

App crash when getting fullResolutionImage

I'm using ALAssetRepresentation in my app, and when I loop over the images there are a couple of images that crash the app:
    for (ALAsset *asset in _assets) {
        NSMutableDictionary *workingDictionary = [[NSMutableDictionary alloc] init];
        [workingDictionary setObject:[asset valueForProperty:ALAssetPropertyType] forKey:@"UIImagePickerControllerMediaType"];
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        if (!representation) {
            [workingDictionary release];
            continue;
        }
        CGImageRef imageRef = [representation fullResolutionImage]; // here the app crashes
        UIImage *img = [UIImage imageWithCGImage:imageRef];
        if (!img) {
            [workingDictionary release];
            continue;
        }
        [workingDictionary setObject:img forKey:@"UIImagePickerControllerOriginalImage"];
        [workingDictionary setObject:[asset valueForProperty:ALAssetPropertyOrientation] forKey:@"orientation"];
        [returnArray addObject:workingDictionary];
        [workingDictionary release];
    }
On this line I get a crash without any message:
    CGImageRef imageRef = [representation fullResolutionImage];
This is the crash message:
Program received signal: “0”.
Data Formatters temporarily unavailable, will re-try after a 'continue'. (Unknown error loading shared library "/Developer/usr/lib/libXcodeDebuggerSupport.dylib")
That is most likely due to running out of memory. How big are the images that cause the crash?
I had a similar problem, and after hours of looking for a solution I found this, the best workaround for the too-large-asset bug:
    // For details, see http://mindsea.com/2012/12/18/downscaling-huge-alassets-without-fear-of-sigkill
    #import <AssetsLibrary/AssetsLibrary.h>
    #import <ImageIO/ImageIO.h>

    // Helper methods for thumbnailForAsset:maxPixelSize:
    static size_t getAssetBytesCallback(void *info, void *buffer, off_t position, size_t count) {
        ALAssetRepresentation *rep = (__bridge id)info;
        NSError *error = nil;
        size_t countRead = [rep getBytes:(uint8_t *)buffer fromOffset:position length:count error:&error];
        if (countRead == 0 && error) {
            // We have no way of passing this info back to the caller, so we log it, at least.
            NSLog(@"thumbnailForAsset:maxPixelSize: got an error reading an asset: %@", error);
        }
        return countRead;
    }

    static void releaseAssetCallback(void *info) {
        // The info here is an ALAssetRepresentation which we CFRetain in thumbnailForAsset:maxPixelSize:.
        // This release balances that retain.
        CFRelease(info);
    }

    // Returns a UIImage for the given asset, with size length at most the passed size.
    // The resulting UIImage will be already rotated to UIImageOrientationUp, so its CGImageRef
    // can be used directly without additional rotation handling.
    // This is done synchronously, so you should call this method on a background queue/thread.
    - (UIImage *)thumbnailForAsset:(ALAsset *)asset maxPixelSize:(NSUInteger)size {
        NSParameterAssert(asset != nil);
        NSParameterAssert(size > 0);
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        CGDataProviderDirectCallbacks callbacks = {
            .version = 0,
            .getBytePointer = NULL,
            .releaseBytePointer = NULL,
            .getBytesAtPosition = getAssetBytesCallback,
            .releaseInfo = releaseAssetCallback,
        };
        CGDataProviderRef provider = CGDataProviderCreateDirect((void *)CFBridgingRetain(rep), [rep size], &callbacks);
        CGImageSourceRef source = CGImageSourceCreateWithDataProvider(provider, NULL);
        CGImageRef imageRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef) @{
            (NSString *)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
            (NSString *)kCGImageSourceThumbnailMaxPixelSize : [NSNumber numberWithInt:size],
            (NSString *)kCGImageSourceCreateThumbnailWithTransform : @YES,
        });
        CFRelease(source);
        CFRelease(provider);
        if (!imageRef) {
            return nil;
        }
        UIImage *toReturn = [UIImage imageWithCGImage:imageRef];
        CFRelease(imageRef);
        return toReturn;
    }
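Usage is then a one-liner (a sketch; since the method reads the asset synchronously, call it from a background queue):
    UIImage *thumb = [self thumbnailForAsset:asset maxPixelSize:2048]; // 2048 is an example size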

iPhone OS 4.0: NSFileHandleDataAvailableNotification not providing callback at file-end

I'm a bit new to iPhone development, so be gentle! I'm supporting an app which loads a wav file from a URL file-stream, and plays it back through an AudioQueue.
We run a continual loop in another thread, and stop the Queue if we detect that it has no buffers in use, and the input FileStream has reached its end. In turn, we detect if the FileStream has ended within the waitForDataInBackgroundAndNotify callback on the NSFileHandleDataAvailableNotification for the stream, by checking whether the availableData has length 0.
This works under iOS 3.0 - we get a notification of 0 available data at the end of the file - but on iOS 4.0, we don't seem to receive the callback at file end. This happens on an OS 4.0 device, regardless of the target OS version.
Has the API changed between the two versions? How can I detect the end of the file now?
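For context, the callback below is driven by the standard notification pattern; a sketch of the registration side (not shown in my code, where readFileHandle is assumed to be the stream's NSFileHandle):
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(readFileData:)
                                                 name:NSFileHandleDataAvailableNotification
                                               object:readFileHandle];
    [readFileHandle waitForDataInBackgroundAndNotify];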
Hopefully-relevant code:
data-available callback:
    - (void)readFileData:(NSNotification *)notification
    {
        @try
        {
            NSData *data = [[notification object] availableData];
            if ([data length] == 0 && self.audioQueueState != AQS_END)
            {
                /***********************************************************************/
                /* We've hit the end of the data but it's possible that more may be   */
                /* appended to the file (if we're still downloading it) so we need to */
                /* wait for the availability of more data.                            */
                /***********************************************************************/
                [self setFileStreamerState:FSS_END];
                [[notification object] waitForDataInBackgroundAndNotify];
            }
            else if (self.audioQueueState == AQS_END)
            {
                TRC_DBG(@"ignore read data as ending");
            }
            else
            {
                NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
                TRC_DBG(@"Read %d bytes", [data length]);
                [self setFileStreamerState:FSS_DATA];
                if (discontinuous)
                {
                    TRC_DBG(@"AudioFileStreamParseBytes %d bytes, discontinuous", [data length]);
                    err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], kAudioFileStreamParseFlag_Discontinuity);
                    discontinuous = NO;
                }
                else
                {
                    TRC_DBG(@"AudioFileStreamParseBytes %d bytes, continuous", [data length]);
                    err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], 0);
                }
                /***********************************************************************/
                /* If error then get out, otherwise wait again for more data.         */
                /***********************************************************************/
                if (err != 0)
                {
                    [self failWithErrorCode:AS_FILE_STREAM_PARSE_BYTES_FAILED];
                }
                else
                {
                    [[notification object] waitForDataInBackgroundAndNotify];
                }
                [pool release];
            }
        }
        @catch (NSException *exception)
        {
            TRC_ERR(@"Exception: %@", exception);
            TRC_ERR(@"Exception reason: %@", [exception reason]);
            //[self failWithErrorCode:AS_FILE_AVAILABLE_DATA_FAILED];
        }
    }