I'm a bit new to iPhone development, so be gentle! I'm supporting an app which loads a wav file from a URL file-stream, and plays it back through an AudioQueue.
We run a continual loop in another thread and stop the queue once we detect that it has no buffers in use and the input file stream has reached its end. We detect that the file stream has ended inside the callback for the stream's NSFileHandleDataAvailableNotification (registered via waitForDataInBackgroundAndNotify), by checking whether availableData has length 0.
This works under iOS 3.0 (we get a notification of 0 available bytes at the end of the file), but on iOS 4.0 we never seem to receive the callback at end of file. This happens on an iOS 4.0 device, regardless of the deployment target.
Has the API changed between the two versions? How can I detect the end of the file now?
Hopefully-relevant code:
data-available callback:
- (void)readFileData:(NSNotification *)notification
{
    @try
    {
        NSData *data = [[notification object] availableData];
        if ([data length] == 0 && self.audioQueueState != AQS_END)
        {
            /***********************************************************************/
            /* We've hit the end of the data, but it's possible that more may be   */
            /* appended to the file (if we're still downloading it), so we need to */
            /* wait for the availability of more data.                             */
            /***********************************************************************/
            [self setFileStreamerState:FSS_END];
            [[notification object] waitForDataInBackgroundAndNotify];
        }
        else if (self.audioQueueState == AQS_END)
        {
            TRC_DBG(@"ignore read data as ending");
        }
        else
        {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            TRC_DBG(@"Read %d bytes", [data length]);
            [self setFileStreamerState:FSS_DATA];
            if (discontinuous)
            {
                TRC_DBG(@"AudioFileStreamParseBytes %d bytes, discontinuous", [data length]);
                err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], kAudioFileStreamParseFlag_Discontinuity);
                discontinuous = NO;
            }
            else
            {
                TRC_DBG(@"AudioFileStreamParseBytes %d bytes, continuous", [data length]);
                err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], 0);
            }
            /***********************************************************************/
            /* If there was an error then get out; otherwise wait for more data.   */
            /***********************************************************************/
            if (err != 0)
            {
                [self failWithErrorCode:AS_FILE_STREAM_PARSE_BYTES_FAILED];
            }
            else
            {
                [[notification object] waitForDataInBackgroundAndNotify];
            }
            [pool release];
        }
    }
    @catch (NSException *exception)
    {
        TRC_ERR(@"Exception: %@", exception);
        TRC_ERR(@"Exception reason: %@", [exception reason]);
        //[self failWithErrorCode:AS_FILE_AVAILABLE_DATA_FAILED];
    }
}
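For completeness, the notification registration (not shown above) looks roughly like this; this is a simplified sketch, with fileHandle standing in for the NSFileHandle we stream from:
- (void)startReading
{
    // Observe data-available notifications for the file handle, then ask it
    // to check for data in the background; readFileData: above fires each
    // time data becomes available (and, on iOS 3.x, once at end of file).
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(readFileData:)
                                                 name:NSFileHandleDataAvailableNotification
                                               object:fileHandle];
    [fileHandle waitForDataInBackgroundAndNotify];
}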
I am trying to access video data from the ALAssets library using the code below:
ALAssetRepresentation *rep = [asset defaultRepresentation];
Byte *buffer = (Byte *)malloc(rep.size);
NSError *error = nil;
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:&error];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
It works fine for small videos as well as pictures, but if I try to get a large video the code crashes, saying:
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSConcreteData initWithBytes:length:copy:freeWhenDone:bytesAreVM:]: absurd length: 4294967295, maximum size: 2147483648 bytes'
I have no idea what's going on. Anyone have any thoughts?
Thanks in advance!
I found the solution. I guess the crash was caused by a huge memory spike when uploading large files, because I was buffering the whole file at once. I now read the file data in 5 MB chunks, and this fixed the crash. I am pasting my code below.
- (NSData *)getDataPartAtOffset:(NSInteger)offset {
    __block NSData *chunkData = nil;
    if (fileAsset_) {
        static const NSUInteger BufferSize = PART_SIZE; // 5 MB chunk
        ALAssetRepresentation *rep = [fileAsset_ defaultRepresentation];
        uint8_t *buffer = calloc(BufferSize, sizeof(*buffer));
        NSUInteger bytesRead = 0;
        NSError *error = nil;
        @try
        {
            bytesRead = [rep getBytes:buffer fromOffset:offset length:BufferSize error:&error];
            // dataWithBytesNoCopy:...freeWhenDone:NO wraps the buffer without
            // copying; dataWithData: then makes a copy, so the buffer can be
            // freed once below.
            chunkData = [NSData dataWithData:[NSData dataWithBytesNoCopy:buffer length:bytesRead freeWhenDone:NO]];
        }
        @catch (NSException *exception)
        {
            chunkData = nil;
            // Handle the exception here... (buffer is freed once, below;
            // freeing it here as well would be a double free)
        }
        free(buffer);
    } else {
        NSLog(@"failed to retrieve asset");
    }
    return chunkData;
}
And I call this function like so:
int offset = 0; // offset that keeps track of where we are in the chunked data
do {
    @autoreleasepool {
        NSData *chunkData = [self getDataPartAtOffset:offset];
        if (!chunkData || ![chunkData length]) { // finished reading data
            break;
        }
        // do your stuff here
        offset += [chunkData length];
    }
} while (1);
chilitechno's answer here worked for me.
I was working on YouTube resumable uploads: https://developers.google.com/youtube/2.0/developers_guide_protocol_resumable_uploads#Sending_a_Resumable_Upload_API_Request
I use ASIHTTPRequest to upload the videos.
For a direct upload, I can use this method:
- (void)appendPostDataFromFile:(NSString *)file
But to resume an upload, I can't append the post data from the whole video file.
Maybe I could write a method like this:
- (void)appendPostDataFromFile:(NSString *)file offset:(long long)offset
But I don't know how to make it work. Any help will be appreciated!
And here is the code from ASIHTTPRequest:
- (void)appendPostDataFromFile:(NSString *)file
{
    [self setupPostBody];
    NSInputStream *stream = [[[NSInputStream alloc] initWithFileAtPath:file] autorelease];
    [stream open];
    NSUInteger bytesRead;
    while (!_isTryCanceled && [stream hasBytesAvailable]) {
        unsigned char buffer[1024*256];
        bytesRead = [stream read:buffer maxLength:sizeof(buffer)];
        if (bytesRead == 0) {
            break;
        }
        if ([self shouldStreamPostDataFromDisk]) {
            [[self postBodyWriteStream] write:buffer maxLength:bytesRead];
        } else {
            [[self postBody] appendData:[NSData dataWithBytes:buffer length:bytesRead]];
        }
    }
    [stream close];
}
As the code above shows, if I could get a seekable NSInputStream, the problem would be solved. Is it possible to do so?
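The closest I've come is to sidestep NSInputStream entirely and use NSFileHandle, which does support seeking. Here is a rough, untested sketch of the appendPostDataFromFile:offset: method proposed above (the 256 KB chunk size mirrors the original):
- (void)appendPostDataFromFile:(NSString *)file offset:(long long)offset
{
    [self setupPostBody];
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:file];
    [handle seekToFileOffset:offset]; // jump straight to the resume position
    NSData *chunk;
    while ((chunk = [handle readDataOfLength:1024*256]) && [chunk length] > 0) {
        if ([self shouldStreamPostDataFromDisk]) {
            [[self postBodyWriteStream] write:(const uint8_t *)[chunk bytes] maxLength:[chunk length]];
        } else {
            [[self postBody] appendData:chunk];
        }
    }
    [handle closeFile];
}
Would something like that work, or is there a proper way to get a seekable stream?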
I have been playing with GCDAsyncSocket (MRC / iOS 5.1) for a while, especially with "large" files (5 to 10 MB). Unfortunately, sometimes the read stream never completes (it gets stuck) just a few bytes from the end of the stream; didReadPartialDataOfLength: stops giving me information and didReadData: is never fired at all.
Here's some of my code (for both the writing and reading examples, the connection between the host and client has already been established):
WRITING
#define kHeadTag 0
#define kDataTag 1

typedef struct {
    NSUInteger size;
} head_t;

- (void)sendData:(NSData *)__data {
    _sendData = [__data retain];
    head_t header;
    header.size = __data.length;
    [_socket writeData:[NSData dataWithBytes:&header length:sizeof(header)]
           withTimeout:-1
                   tag:kHeadTag];
}

- (void)socket:(GCDAsyncSocket *)sock didWriteDataWithTag:(long)tag {
    if (tag == kHeadTag) {
        [sock writeData:_sendData withTimeout:-1 tag:kDataTag];
        [_sendData release];
        _sendData = nil;
    }
}
READING
- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag {
    switch (tag) {
        // ------------------------------------- HEAD
        case kHeadTag: {
            head_t head;
            [data getBytes:&head length:sizeof(head)];
            _readHead = head;
            NSLog(@"Received header (total size = %lu bytes)", (unsigned long)head.size);
            [sock readDataToLength:head.size withTimeout:-1 tag:kDataTag];
            break;
        }
        // ------------------------------------- BODY
        case kDataTag: {
            NSLog(@"Data received with %lu bytes", (unsigned long)data.length);
            [sock readDataToLength:sizeof(head_t) withTimeout:-1 tag:kHeadTag];
            break;
        }
    }
}

- (void)socket:(GCDAsyncSocket *)sock didReadPartialDataOfLength:(NSUInteger)partialLength tag:(long)tag {
    if (tag == kDataTag) {
        NSLog(@"Data read with %lu bytes", (unsigned long)partialLength);
    }
}
I hope this is enough code to see what I'm doing wrong, or whether this is simply bad practice for writing/reading large chunks of data.
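One detail that may matter: GCDAsyncSocket only delivers data for reads that have been explicitly queued, so the receiving side kicks off the first header read when the connection comes up, roughly like this (simplified from my connect callback):
- (void)socket:(GCDAsyncSocket *)sock didConnectToHost:(NSString *)host port:(uint16_t)port {
    // Queue the initial header read; without a pending read request,
    // didReadData:withTag: is never called.
    [sock readDataToLength:sizeof(head_t) withTimeout:-1 tag:kHeadTag];
}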
I added
[self.partnerSocket writeData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
at the end of the sendData: method, and it works fine for me. You just have to append separator characters to the end of the data.
I can transfer around 48 MB files from one iDevice to another.
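Note that if you use a separator, the reading side has to match it, for example by reading up to the delimiter instead of a fixed length (a sketch; keep in mind the delivered data includes the trailing CRLF):
[sock readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:kDataTag];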
I have a problem with NSInputStream. Here is my code:
case NSStreamEventHasBytesAvailable:
    printf("BYTE AVAILABLE\n");
    int len = 0;
    NSMutableData *data = [[NSMutableData alloc] init];
    uint8_t buffer[32768];
    if (stream == iStream)
    {
        printf("Receiving...\n");
        len = [iStream read:buffer maxLength:32768];
        [data appendBytes:buffer length:len];
    }
    [iStream close];
I try to read small data and it works perfectly on the simulator and on a real iPhone.
If I try to read large data (more than 4 kB or maybe 5 kB), the real iPhone reads just 2736 bytes and stops.
Why is that? Please help!
Thanks in advance!
Your data object needs to be external to your stream handler. It is often the case that when large amounts of data are coming in, you get them in chunks and not all at once. Just keep appending to the data object until you receive bytesRead == 0; then you can close your stream and use the data.
case NSStreamEventHasBytesAvailable: {
    NSInteger bytesRead;
    uint8_t buffer[32768];
    // Pull some data off the network.
    bytesRead = [self._networkStream read:buffer maxLength:sizeof(buffer)];
    if (bytesRead == -1) {
        [self _stopReceiveWithFailure];
    } else if (bytesRead == 0) {
        [self _stopReceiveWithSuccess];
    } else {
        // Append however many bytes were actually read.
        [data appendBytes:buffer length:bytesRead];
    }
} break;
It looks like you're creating a new data object every time; instead, you should create and retain it once (for example, as a property) and append to it as above.
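A minimal sketch of that (the class and property names are illustrative, MRC-style):
@interface Receiver : NSObject <NSStreamDelegate>
@property (nonatomic, retain) NSMutableData *receivedData; // one buffer, kept across events
@end

- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventOpenCompleted:
            self.receivedData = [NSMutableData data]; // created once per transfer
            break;
        case NSStreamEventHasBytesAvailable: {
            uint8_t buffer[32768];
            NSInteger bytesRead = [(NSInputStream *)aStream read:buffer maxLength:sizeof(buffer)];
            if (bytesRead > 0) {
                [self.receivedData appendBytes:buffer length:bytesRead];
            }
            break;
        }
        case NSStreamEventEndEncountered:
            [aStream close]; // close only when the stream itself says it is done
            // self.receivedData now holds the complete payload
            break;
        default:
            break;
    }
}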
I'm working with the EXIF library at http://code.google.com/p/iphone-exif/ and I have come across a real head-scratcher of a bug. When I implement the library in a debug build everything works beautifully, but when I compile for ad hoc beta testing, the app crashes hard.
I'm getting the following error:
Exception Type: EXC_BAD_ACCESS (SIGBUS)
Exception Codes: KERN_PROTECTION_FAILURE at 0x00000000
Crashed Thread: 5
With Thread 5:
0 Gaia GPS 0x000494e4 -[EXFJpeg scanImageData:] (EXFJpeg.m:372)
1 Gaia GPS 0x0000524c -[MyAppDelegate saveImage:] (MyAppDelegate.m:935)
2 Foundation 0x317fef32 0x317ad000 + 335666
3 Foundation 0x317ae09a 0x317ad000 + 4250
4 libSystem.B.dylib 0x329c892a 0x329a4000 + 149802
It is my suspicion that there is something different about the way the debug build handles memory vs. the way the ad hoc build handles memory. From the error, it seems that this code is trying to access memory it does not have permission to touch, and when it does, the ad hoc iPhone OS shuts the process down.
What would cause this behavior in the ad hoc distribution but not in the debug build? The debug build works fine, even when the phone is disconnected and the debugger is off.
Many thanks in advance.
The code:
My code implementing the library (line 935 is the first line of this block, the alloc statement):
EXFJpeg *jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData:imgData];
CLLocation *location = self.gps.lastReading ? self.gps.lastReading : [self.gps.locationManager location];
[location retain];
NSMutableArray *locArray = [self createLocArray:location.coordinate.latitude];
EXFGPSLoc *gpsLoc = [[EXFGPSLoc alloc] init];
[self populateGPS:gpsLoc :locArray];
[jpegScanner.exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLatitude]];
[gpsLoc release];
locArray = [self createLocArray:location.coordinate.longitude];
gpsLoc = [[EXFGPSLoc alloc] init];
[self populateGPS:gpsLoc :locArray];
[locArray release];
[jpegScanner.exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLongitude]];
[gpsLoc release];
NSString *ref = (location.coordinate.latitude < 0.0) ? @"S" : @"N";
[jpegScanner.exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLatitudeRef]];
ref = (location.coordinate.longitude < 0.0) ? @"W" : @"E";
[jpegScanner.exifMetaData addTagValue:ref forKey:[NSNumber numberWithInt:EXIF_GPSLongitudeRef]];
[jpegScanner.exifMetaData addTagValue:@"Apple" forKey:[NSNumber numberWithInt:EXIF_Make]];
[jpegScanner.exifMetaData addTagValue:@"iPhone" forKey:[NSNumber numberWithInt:EXIF_Model]];
[jpegScanner.exifMetaData addTagValue:[NSNumber numberWithInt:0] forKey:[NSNumber numberWithInt:EXIF_GPSAltitudeRef]];
NSArray *arr = [[NSArray alloc] initWithObjects:[NSNumber numberWithInt:0], [NSNumber numberWithInt:0], [NSNumber numberWithInt:2], [NSNumber numberWithInt:2], nil];
[jpegScanner.exifMetaData addTagValue:arr forKey:[NSNumber numberWithInt:EXIF_GPSVersion]];
[arr release];
long numDenumArray[2];
long *arrPtr = numDenumArray;
[EXFUtils convertRationalToFraction:&arrPtr :[NSNumber numberWithDouble:location.altitude]];
EXFraction *fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
[jpegScanner.exifMetaData addTagValue:fract forKey:[NSNumber numberWithInt:EXIF_GPSAltitude]];
NSMutableData *newData = [[NSMutableData alloc] init];
[jpegScanner populateImageData:newData];
[jpegScanner release];
And last but not least, the function from the library itself:
- (void)scanImageData:(NSData *)jpegData {
    Debug(@"Starting scan headers");
    // pointer to the end of the EXIF data and the start of the rest of the image
    ByteArray *endOfEXFPtr;
    imageLength = CFDataGetLength((CFDataRef)jpegData);
    // CFRetain(&imageLength);
    Debug(@"Length of image %i", imageLength);
    imageBytePtr = (UInt8 *)CFDataGetBytePtr((CFDataRef)jpegData);
    imageStartPtr = imageBytePtr;
    // check if a valid jpeg file
    UInt8 val = [self readNextbyte];
    if (val != M_BEG) {
        Debug(@"Not a valid JPEG File");
        return;
    }
    val = [self readNextbyte];
    if (val != M_SOI) {
        Debug(@"Not a valid start of image JPEG File");
        return;
    }
    // increment this to position after second byte
    BOOL finished = FALSE;
    while (!finished) {
        // increment the marker
        val = [self nextMarker];
        Debug(@"Got next marker %x at byte count %i", val, (imageBytePtr - imageStartPtr));
        switch (val) {
            case M_SOF0:  /* Baseline */
            case M_SOF1:  /* Extended sequential, Huffman */
            case M_SOF2:  /* Progressive, Huffman */
            case M_SOF3:  /* Lossless, Huffman */
            case M_SOF5:  /* Differential sequential, Huffman */
            case M_SOF6:  /* Differential progressive, Huffman */
            case M_SOF7:  /* Differential lossless, Huffman */
            case M_SOF9:  /* Extended sequential, arithmetic */
            case M_SOF10: /* Progressive, arithmetic */
            case M_SOF11: /* Lossless, arithmetic */
            case M_SOF13: /* Differential sequential, arithmetic */
            case M_SOF14: /* Differential progressive, arithmetic */
            case M_SOF15: /* Differential lossless, arithmetic */
            {
                // Remember the kind of compression we saw
                int compression = *imageBytePtr; // <----------- LINE 372
                self.exifMetaData.compression = compression;
                // Get the intrinsic properties of the image
                [self readImageInfo];
                break;
            }
            case M_SOS: /* stop before hitting compressed data */
                Debug(@"Found SOS at %i", imageBytePtr - imageStartPtr);
                // [self skipVariable];
                // Update the EXIF
                // updateExif();
                finished = TRUE;
                break;
            case M_EOI: /* in case it's a tables-only JPEG stream */
                Debug(@"End of Image reached at %i", imageBytePtr - imageStartPtr);
                finished = TRUE;
                break;
            case M_COM:
                Debug(@"Got com at %i", imageBytePtr - imageStartPtr);
                break;
            case M_APP0:
            case M_APP1:
            case M_APP2:
            case M_APP3:
            case M_APP4:
            case M_APP5:
            case M_APP6:
            case M_APP7:
            case M_APP8:
            case M_APP9:
            case M_APP10:
            case M_APP11:
            case M_APP12:
            case M_APP13:
            case M_APP14:
            case M_APP15:
            {
                // Some digital camera makers put useful textual
                // information into APP1 and APP12 markers, so we print
                // those out too when in -verbose mode.
                Debug(@"Found app %x at %i", val, imageBytePtr - imageStartPtr);
                NSData *commentData = [self processComment];
                NSNumber *key = [[NSNumber alloc] initWithInt:val];
                // add comments to dictionary
                [self.keyedHeaders setObject:commentData forKey:key];
                [key release];
                // will always mark the end of the app_x block
                endOfEXFPtr = imageBytePtr;
                // we pass a pointer to the NSData pointer here
                if (val == M_APP0) {
                    Debug(@"Parsing JFIF APP_0 at %i", imageBytePtr - imageStartPtr);
                    [self parseJfif:(CFDataRef *)&commentData];
                } else if (val == M_APP1) {
                    [self parseExif:(CFDataRef *)&commentData];
                    Debug(@"Finished App1 at %i", endOfEXFPtr - imageStartPtr);
                } else if (val == M_APP2) {
                    Debug(@"Finished APP2 at %i", imageBytePtr - imageStartPtr);
                } else {
                    Debug(@"Finished App %x at %i", val, imageBytePtr - imageStartPtr);
                }
                break;
            }
            case M_SOI:
                Debug(@"SOI encountered at %i", imageBytePtr - imageStartPtr);
                break;
            default: // Anything else just gets skipped
                Debug(@"Not handled %x, skipping at %i", val, imageBytePtr - imageStartPtr);
                [self skipVariable]; // we assume it has a parameter count...
                break;
        }
    }
    // add in the bytes after the exf block
    NSData *theRemainingdata = [[NSData alloc] initWithBytes:endOfEXFPtr length:imageLength - (endOfEXFPtr - imageStartPtr)];
    self.remainingData = theRemainingdata;
    [theRemainingdata release];
    endOfEXFPtr = NULL;
    imageStartPtr = NULL;
    imageBytePtr = NULL;
}
I ran into this same problem a while ago and found two solutions (or workarounds):
1. Use the precompiled library from http://code.google.com/p/iphone-exif/downloads/list instead of compiling from the source.
2. Change line 1270 of EXFMetaData.m to:
CFDataGetBytes(*exifData, CFRangeMake(6,2), order);
as suggested here: http://code.google.com/p/iphone-exif/issues/detail?id=4&can=1
Patched it: write this code in the file EXFJpeg.m at row 330,
if (!imageBytePtr)
    return;
just before
UInt8 val = [self readNextbyte];
That's all!