I'm working with the EXIF library at http://code.google.com/p/iphone-exif/ and I have come across a real head scratcher of a bug. When I implement the library in a debug build everything works beautifully, but when I compile for ad hoc beta testing the app crashes hard.
I'm getting the following error:
Exception Type: EXC_BAD_ACCESS (SIGBUS)
Exception Codes: KERN_PROTECTION_FAILURE at 0x00000000
Crashed Thread: 5
With Thread 5:
0 Gaia GPS 0x000494e4 -[EXFJpeg scanImageData:] (EXFJpeg.m:372)
1 Gaia GPS 0x0000524c -[MyAppDelegate saveImage:] (MyAppDelegate.m:935)
2 Foundation 0x317fef32 0x317ad000 + 335666
3 Foundation 0x317ae09a 0x317ad000 + 4250
4 libSystem.B.dylib 0x329c892a 0x329a4000 + 149802
My suspicion is that the debug build handles memory differently from the ad hoc build. From the error it looks like this code is accessing memory blocks it does not have access to, and when it does, the ad hoc iPhone OS shuts the process down.
What would cause this behavior in the ad hoc distribution, but not in the debug build? The debug build works fine even when the phone is disconnected and the debugger is off.
Many thanks in advance.
The code:
My code that uses the library (line 935 is the first line of this block, the alloc statement):
EXFJpeg* jpegScanner = [[EXFJpeg alloc] init];
[jpegScanner scanImageData:imgData];
CLLocation *location = self.gps.lastReading ? self.gps.lastReading : [self.gps.locationManager location];
[location retain];
NSMutableArray* locArray = [self createLocArray:location.coordinate.latitude];
EXFGPSLoc* gpsLoc = [[EXFGPSLoc alloc] init];
[self populateGPS: gpsLoc :locArray];
[jpegScanner.exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLatitude]];
[gpsLoc release];
locArray = [self createLocArray:location.coordinate.longitude];
gpsLoc = [[EXFGPSLoc alloc] init];
[self populateGPS: gpsLoc :locArray];
[locArray release];
[jpegScanner.exifMetaData addTagValue:gpsLoc forKey:[NSNumber numberWithInt:EXIF_GPSLongitude]];
[gpsLoc release];
NSString *ref = (location.coordinate.latitude < 0.0) ? @"S" : @"N";
[jpegScanner.exifMetaData addTagValue: ref forKey:[NSNumber numberWithInt:EXIF_GPSLatitudeRef] ];
ref = (location.coordinate.longitude < 0.0) ? @"W" : @"E";
[jpegScanner.exifMetaData addTagValue: ref forKey:[NSNumber numberWithInt:EXIF_GPSLongitudeRef]];
[jpegScanner.exifMetaData addTagValue: @"Apple" forKey:[NSNumber numberWithInt:EXIF_Make]];
[jpegScanner.exifMetaData addTagValue: @"iPhone" forKey:[NSNumber numberWithInt:EXIF_Model]];
[jpegScanner.exifMetaData addTagValue:[NSNumber numberWithInt:0] forKey:[NSNumber numberWithInt:EXIF_GPSAltitudeRef] ];
NSArray *arr = [[NSArray alloc] initWithObjects:[NSNumber numberWithInt:0], [NSNumber numberWithInt:0], [NSNumber numberWithInt:2], [NSNumber numberWithInt:2], nil];
[jpegScanner.exifMetaData addTagValue: arr forKey:[NSNumber numberWithInt:EXIF_GPSVersion] ];
[arr release];
long numDenumArray[2];
long* arrPtr = numDenumArray;
[EXFUtils convertRationalToFraction:&arrPtr: [NSNumber numberWithDouble:location.altitude]];
EXFraction *fract = [[EXFraction alloc] initWith:numDenumArray[0] :numDenumArray[1]];
[jpegScanner.exifMetaData addTagValue:fract forKey:[NSNumber numberWithInt:EXIF_GPSAltitude] ];
NSMutableData *newData = [[NSMutableData alloc] init];
[jpegScanner populateImageData:newData];
[jpegScanner release];
And last but not least, the function from the library itself:
-(void) scanImageData: (NSData*) jpegData {
Debug(#"Starting scan headers");
// pointer to the end of the EXIF Data and the start of the rest of the image
ByteArray* endOfEXFPtr;
imageLength = CFDataGetLength((CFDataRef)jpegData);
// CFRetain(&imageLength);
Debug(#"Length of image %i", imageLength);
imageBytePtr = (UInt8 *) CFDataGetBytePtr((CFDataRef)jpegData);
imageStartPtr = imageBytePtr;
// check if a valid jpeg file
UInt8 val = [self readNextbyte];
if (val != M_BEG){
Debug(#"Not a valid JPEG File");
return;
}
val = [self readNextbyte];
if (val != M_SOI){
Debug(#"Not a valid start of image JPEG File");
return;
}
// increment this to position after second byte
BOOL finished =FALSE;
while(!finished){
// increment the marker
val = [self nextMarker];
Debug(#"Got next marker %x at byte count %i", val, (imageBytePtr - imageStartPtr));
switch(val){
case M_SOF0: /* Baseline */
case M_SOF1: /* Extended sequential, Huffman */
case M_SOF2: /* Progressive, Huffman */
case M_SOF3: /* Lossless, Huffman */
case M_SOF5: /* Differential sequential, Huffman */
case M_SOF6: /* Differential progressive, Huffman */
case M_SOF7: /* Differential lossless, Huffman */
case M_SOF9: /* Extended sequential, arithmetic */
case M_SOF10: /* Progressive, arithmetic */
case M_SOF11: /* Lossless, arithmetic */
case M_SOF13: /* Differential sequential, arithmetic */
case M_SOF14: /* Differential progressive, arithmetic */
case M_SOF15: /* Differential lossless, arithmetic */
// Remember the kind of compression we saw
{
int compression = *imageBytePtr; // <-----------LINE 372
self.exifMetaData.compression = compression;
// Get the intrinsic properties of the image
[self readImageInfo];
}
break;
case M_SOS: /* stop before hitting compressed data */
Debug(#"Found SOS at %i", imageBytePtr - imageStartPtr);
// [self skipVariable];
// Update the EXIF
// updateExif();
finished = TRUE;
break;
case M_EOI: /* in case it's a tables-only JPEG stream */
Debug(#"End of Image reached at %i ", imageBytePtr - imageStartPtr);
finished =TRUE;
break;
case M_COM:
Debug(#"Got com at %i",imageBytePtr - imageStartPtr);
break;
case M_APP0:
case M_APP1:
case M_APP2:
case M_APP3:
case M_APP4:
case M_APP5:
case M_APP6:
case M_APP7:
case M_APP8:
case M_APP9:
case M_APP10:
case M_APP11:
case M_APP12:
case M_APP13:
case M_APP14:
case M_APP15:
// Some digital camera makers put useful textual
// information into APP1 and APP12 markers, so we print
// those out too when in -verbose mode.
{
Debug(#"Found app %x at %i", val, imageBytePtr - imageStartPtr);
NSData* commentData = [self processComment];
NSNumber* key = [[NSNumber alloc]initWithInt:val];
// add comments to dictionary
[self.keyedHeaders setObject:commentData forKey:key];
[key release];
// will always mark the end of the app_x block
endOfEXFPtr = imageBytePtr;
// we pass a pointer to the NSData pointer here
if (val == M_APP0){
Debug(#"Parsing JFIF APP_0 at %i", imageBytePtr - imageStartPtr);
[self parseJfif:(CFDataRef*)&commentData];
} else if (val == M_APP1){
[self parseExif:(CFDataRef*)&commentData];
Debug(#"Finished App1 at %i", endOfEXFPtr - imageStartPtr);
} else if (val == M_APP2){
Debug(#"Finished APP2 at %i", imageBytePtr - imageStartPtr);
}else{
Debug(#"Finished App &x at %i", val, imageBytePtr - imageStartPtr);
}
}
break;
case M_SOI:
Debug(#"SOI encountered at %i",imageBytePtr - imageStartPtr);
break;
default: // Anything else just gets skipped
Debug(#"NOt handled %x skipping at %i",val, imageBytePtr - imageStartPtr);
[self skipVariable]; // we assume it has a parameter count...
break;
}
}
// add in the bytes after the exf block
NSData* theRemainingdata = [[NSData alloc] initWithBytes:endOfEXFPtr length:imageLength - (endOfEXFPtr - imageStartPtr)];
self.remainingData = theRemainingdata;
[theRemainingdata release];
endOfEXFPtr = NULL;
imageStartPtr = NULL;
imageBytePtr = NULL;
}
I got this same problem a while ago, and found 2 solutions (or workarounds):
Use the precompiled library from http://code.google.com/p/iphone-exif/downloads/list instead of compiling from the source
Change line 1270 of EXFMetaData.m to:
CFDataGetBytes(*exifData, CFRangeMake(6,2), order);
as suggested here: http://code.google.com/p/iphone-exif/issues/detail?id=4&can=1
Patched it:
Write this code in the file EXFJpeg.m at row 330:
if (!imageBytePtr)
return;
just before
UInt8 val = [self readNextbyte];
That's all.
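Assembled, the top of scanImageData: then reads roughly like this (a sketch based on the code quoted above; only the two guard lines are new):
imageBytePtr = (UInt8 *) CFDataGetBytePtr((CFDataRef)jpegData);
imageStartPtr = imageBytePtr;
// guard against a NULL byte pointer before the first dereference,
// which would fault at address 0x0 like the crash log shows
if (!imageBytePtr)
return;
// check if a valid jpeg file
UInt8 val = [self readNextbyte];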
I'm new to Objective-C coding and to smart card programming as well. I'm developing an application for iPhone. I'm having difficulty reading a file from a smart card using iSmartSDK: it returns NULL, even though selecting the applet succeeds.
Has anyone faced this problem, or does anyone know anything about it? Thanks.
t0Command.classByte = 0;
t0Command.instructionByte = 0xCB;
t0Command.parameter1Byte = 0x3F;
t0Command.parameter2Byte = 0xFF;
t0Command.dataBytes = nil;
t0Command.lengthByte = t0Command.dataBytes.length;
t0Command.expectedResponseLengthByte = 0x7F;
t0Command.dataBytes = [self hexStringToBytes:@"5C035FC105"];
t0Command.commandId = READ_WRITE_CPU_CARD_COMMAND_ID;
resultCommand = [iSmartInstance sendCommandToCPUCard:t0Command];
if (resultCommand == nil) {
[self displayMessage:#"Read Binary Failed"];
}
else{
[self displayMessage:[NSString stringWithFormat:#"Read Status: %02x%02x", ((T0Command*)resultCommand).status1Byte, ((T0Command*)resultCommand).status2Byte]];
[self displayMessage:[NSString stringWithFormat:#"Response Length: %d", ((T0Command*)resultCommand).dataBytes.length]];
NSString *readBytes = [[[NSString alloc] initWithBytes:((T0Command*)resultCommand).dataBytes.bytes
length:((T0Command*)resultCommand).dataBytes.length
encoding:NSASCIIStringEncoding] autorelease];
[self displayMessage:[NSString stringWithFormat:#"Read Value: %# \n Bytes: %#", readBytes, ((T0Command*)resultCommand).dataBytes.bytes]];
}
First time asking a question here. I'm hoping the post is clear and sample code is formatted correctly.
I'm experimenting with AVFoundation and time lapse photography.
My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a timelapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter and AVAssetWriterInput.
The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, the playback of each frame is the length of time between the original input frames, i.e. a frame rate of roughly 1 fps. I'm looking to get 30 fps.
I've tried using CMSampleBufferCreateCopyWithNewTiming(), but then after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface is active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want it, 30 fps, but it only has those 13 frames.
How can I accomplish my goal of 30fps playback?
How can I tell where the app is getting lost and why?
I've placed a flag called useNativeTime so I can test both cases. When set to YES, I get all frames I'm interested in as the callback doesn't 'get lost'. When I set that flag to NO, I only ever get 13 frames processed and am never returned to that method again. As mentioned above, in both cases I can playback the video.
Thanks for any help.
Here is where I'm trying to do the retiming.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
BOOL useNativeTime = NO;
BOOL appendSuccessFlag = NO;
//NSLog(#"in captureOutpput sample buffer method");
if( !CMSampleBufferDataIsReady(sampleBuffer) )
{
NSLog( #"sample buffer is not ready. Skipping sample" );
//CMSampleBufferInvalidate(sampleBuffer);
return;
}
if (! [inputWriterBuffer isReadyForMoreMediaData])
{
NSLog(#"Not ready for data.");
}
else {
// Write every first frame of n frames (30 native from camera).
intervalFrames++;
if (intervalFrames > 30) {
intervalFrames = 1;
}
else if (intervalFrames != 1) {
//CMSampleBufferInvalidate(sampleBuffer);
return;
}
// Need to initialize start session time.
if (writtenFrames < 1) {
if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
else imageSourceTime = CMTimeMake( 0 * 20 ,600); //CMTimeMake(1,30);
[outputWriter startSessionAtSourceTime: imageSourceTime];
NSLog(#"Starting CMtime");
CMTimeShow(imageSourceTime);
}
if (useNativeTime) {
imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTimeShow(imageSourceTime);
// CMTime myTiming = CMTimeMake(writtenFrames * 20,600);
// CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried but has no effect.
appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
}
else {
CMSampleBufferRef newSampleBuffer;
CMSampleTimingInfo sampleTimingInfo;
sampleTimingInfo.duration = CMTimeMake(20,600);
sampleTimingInfo.presentationTimeStamp = CMTimeMake( (writtenFrames + 0) * 20,600);
sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
OSStatus myStatus;
//NSLog(#"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer) );
myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
sampleBuffer,
1,
&sampleTimingInfo, // maybe a little confused on this param.
&newSampleBuffer);
// These confirm the good health of our newSampleBuffer.
if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus);
if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");
// No effect.
//myStatus = CMSampleBufferMakeDataReady(newSampleBuffer); // How is this different; CMSampleBufferSetDataReady ?
//if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus);
imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
CMTimeShow(imageSourceTime);
appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
//CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe what this does. Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
//CFRelease(sampleBuffer); // - Not surprisingly - “EXC_BAD_ACCESS”
}
if (!appendSuccessFlag)
{
NSLog(#"Failed to append pixel buffer");
}
else {
writtenFrames++;
NSLog(#"writtenFrames: %i", writtenFrames);
}
}
//[self displayOuptutWritterStatus]; // Expect and see AVAssetWriterStatusWriting.
}
My setup routine.
- (IBAction) recordingStartStop: (id) sender
{
NSError * error;
if (self.isRecording) {
NSLog(#"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
self.isRecording = NO;
[recordingStarStop setTitle: #"Record" forState: UIControlStateNormal];
//[self.captureSession stopRunning];
[inputWriterBuffer markAsFinished];
[outputWriter endSessionAtSourceTime:imageSourceTime];
[outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
NSLog(#"finished CMtime");
CMTimeShow(imageSourceTime);
// Really, I should loop through the outputs and close all of them or target specific ones.
// Since I'm only recording video right now, I feel safe doing this.
[self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]];
[videoOutput release];
[inputWriterBuffer release];
[outputWriter release];
videoOutput = nil;
inputWriterBuffer = nil;
outputWriter = nil;
NSLog(#"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
NSLog(#"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
NSLog(#"filePath: %#", [projectPaths movieFilePath]);
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
NSLog(#"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, #selector(video:didFinishSavingWithError: contextInfo:), nil);
}
NSLog(#"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
}
else {
NSLog(#"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
projectPaths = [[ProjectPaths alloc] initWithProjectFolder: #"TestProject"];
intervalFrames = 30;
videoOutput = [[AVCaptureVideoDataOutput alloc] init];
NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
[cameraVideoSettings setValue: value forKey: key];
[videoOutput setVideoSettings: cameraVideoSettings];
[videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
[videoOutput setAlwaysDiscardsLateVideoFrames: YES];
queue = dispatch_queue_create("cameraQueue", NULL);
[videoOutput setSampleBufferDelegate: self queue: queue];
dispatch_release(queue);
NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
[outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
[outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming
[outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey];
NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
[compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey];
//[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey];
[outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];
inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
[inputWriterBuffer retain];
inputWriterBuffer.expectsMediaDataInRealTime = YES;
outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
[outputWriter retain];
if (error) NSLog(#"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:");
if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];
else NSLog(#"can not add input");
if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(#"ouptutSettings are NOT supported");
if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput];
else NSLog(#"could not addOutput: videoOutput to captureSession");
//[self.captureSession startRunning];
self.isRecording = YES;
[recordingStarStop setTitle: #"Stop" forState: UIControlStateNormal];
writtenFrames = 0;
imageSourceTime = kCMTimeZero;
[outputWriter startWriting];
//[outputWriter startSessionAtSourceTime: imageSourceTime];
NSLog(#"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
NSLog (#"recording to fileURL: %#", [projectPaths movieURLPath]);
}
NSLog(#"isRecording: %#", self.isRecording ? #"YES" : #"NO");
[self displayOuptutWritterStatus];
}
OK, I found the bug in my first post.
When using
myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
sampleBuffer,
1,
&sampleTimingInfo,
&newSampleBuffer);
you need to balance that with a CFRelease(newSampleBuffer);
The same idea holds true when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. You would use CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.
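For clarity, a minimal sketch of the balanced pattern (untested; variable names taken from the code above):
CMSampleBufferRef newSampleBuffer = NULL;
OSStatus myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
sampleBuffer,
1,
&sampleTimingInfo,
&newSampleBuffer);
if (myStatus == noErr && newSampleBuffer != NULL) {
appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
// Balance the Create with a release; otherwise the retained copies pile up
// and the capture callback eventually stops being called.
CFRelease(newSampleBuffer);
}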
Hope this is helpful to someone else.
With a little more searching and reading I have a working solution. I don't know that it is the best method, but so far, so good.
In my setup area I've set up an AVAssetWriterInputPixelBufferAdaptor. The code addition looks like this.
inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput: inputWriterBuffer
sourcePixelBufferAttributes: nil];
[inputWriterBufferAdaptor retain];
For completeness to understand the code below, I also have these three lines in the setup method.
fpsOutput = 30; //Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97;
cmTimeSecondsDenominatorTimescale = 600 * 100000; //To more precisely handle 29.97.
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
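// e.g. with fpsOutput = 30: 600 * 100000 = 60,000,000 ticks per second, and 60,000,000 / 30 = 2,000,000 ticks per frame, i.e. exactly 1/30 s.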
Instead of applying a retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Notice the withPresentationTime parameter on the adaptor; by passing my custom value to it, I get the correct timing I'm seeking.
CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer( sampleBuffer );
imageSourceTime = CMTimeMake( writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime];
Use of the AVAssetWriterInputPixelBufferAdaptor.pixelBufferPool property may have some gains, but I haven't figured that out.
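In case it is useful to someone, a rough sketch of what using the pool might look like (untested; it assumes sourcePixelBufferAttributes were supplied when the adaptor was created, since with nil attributes the pool can be NULL, and that writing has already started):
CVPixelBufferRef pooledBuffer = NULL;
CVReturn poolStatus = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
inputWriterBufferAdaptor.pixelBufferPool,
&pooledBuffer);
if (poolStatus == kCVReturnSuccess && pooledBuffer != NULL) {
// ... copy or render the frame into pooledBuffer here ...
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer:pooledBuffer withPresentationTime:imageSourceTime];
CVPixelBufferRelease(pooledBuffer); // balance the pool's Create
}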
I am trying to add data to CoreData. It works fine when I build from Xcode to the phone, but when I start the app directly on the iPhone it crashes on the first save to the context.
I read a text file that is synced via iTunes File Sharing; the file is pretty big (~350,000 lines). The values I get from the file are added to two different arrays (barcodes and productNames). The arrays are later batched through and then sent to the function where I save the data.
From the array loop:
[...]
words = [rawText componentsSeparatedByString:@";"];
int loopCounter = 0;
int loopLimit = 20000;
int n = 0;
int wordType;
NSEnumerator *word = [words objectEnumerator];
NSLog(#"Create arrays");
while(tmpWord = [word nextObject]) {
if ([tmpWord isEqualToString: #""] || [tmpWord isEqualToString: #"\r\n"]) {
// NSLog(#"%#*** NOTHING *** ",tmpWord);
}else {
n++;
wordType = n%2;
if (wordType == kBarcode) {
[barcodes addObject: tmpWord];
}else if (wordType == kProduct) {
[productNames addObject: tmpWord];
}
// Send to batch //
loopCounter ++;
if (loopCounter == loopLimit) {
loopCounter = 0;
NSLog(#"adding new batch");
[self addBatchOfData];
[barcodes release];
[productNames release];
barcodes = [[NSMutableArray arrayWithCapacity:20000] retain];
productNames = [[NSMutableArray arrayWithCapacity:20000] retain];
}
}
[...]
And then the save-function:
-(void)addBatchOfData {
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
NSError *error;
NSUInteger loopLimit = 5000;
NSUInteger loopCounter = 0;
NSString *ean;
NSString *designation;
for (int i=0; i<[barcodes count];i++ ) {
ean = [barcodes objectAtIndex:i];
designation = [productNames objectAtIndex:i];
Product *product = (Product *)[NSEntityDescription insertNewObjectForEntityForName:@"Product" inManagedObjectContext:importContext];
[product setDesignation:designation];
[product setBarcode:ean];
loopCounter ++;
if (loopCounter == loopLimit) {
NSLog(#"Save CoreData");
[importContext save:&error];
[importContext reset];
[pool drain];
pool = [[NSAutoreleasePool alloc] init];
loopCounter = 0;
}
}
// Save any remaining records
if (loopCounter != 0) {
[importContext save:&error];
[importContext reset];
}
[pool drain];
}
It's really irritating that it works fine when I build from Xcode. Hopefully there is a setting that I missed or something...
EDIT: Forgot to mention that I don't get past the Default screen and I don't have any logs. Could it have something to do with the provisioning?
Offload your file loading to a background thread and let the phone start up your main window and view. iOS will kill your app if you do not present a view in a timely manner (this is what you are seeing).
I have to do something like this for my xml -> CoreData converter code. I just present the user with a view notifying them of what is going on and a progress bar (I use https://github.com/matej/MBProgressHUD).
something like:
self.hud = [[MBProgressHUD alloc] initWithView:window];
// Set determinate mode
hud.mode = MBProgressHUDModeDeterminate;
hud.delegate = self;
hud.labelText = #"Converting Data File";
[self.window addSubview:hud];
// Show the HUD while the provided method executes in a new thread
[hud showWhileExecuting:@selector(convertToCoreDataStoreTask) onTarget:self withObject:nil animated:YES];
You just have to make sure that you use a separate NSManagedObjectContext in the new thread.
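A minimal sketch of what that background task could look like (the method name matches the selector above; the persistentStoreCoordinator accessor is an assumption, and only the coordinator is shared between threads, never the context):
- (void)convertToCoreDataStoreTask
{
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
// Each thread gets its own context; sharing one NSManagedObjectContext across threads is unsafe.
NSManagedObjectContext *importContext = [[NSManagedObjectContext alloc] init];
[importContext setPersistentStoreCoordinator:[self persistentStoreCoordinator]]; // assumed accessor
[importContext setUndoManager:nil]; // a one-way import doesn't need undo, and this saves memory
// ... parse the file and insert objects into importContext here, saving in batches ...
NSError *error = nil;
if (![importContext save:&error]) {
NSLog(@"Import save failed: %@", error);
}
[importContext release];
[pool release];
}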
I would suggest that you implement this delegate method and then try to see what is going on with memory.
When running in the simulator you have no memory constraints, but when running on the phone you do:
- (void)applicationDidReceiveMemoryWarning:(UIApplication *)application
{
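// For example, log your batch counters or release caches here, so you can see how far the import gets before the device runs low on memory.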
}
I think I found the solution to my question.
What I was doing was starting all the heavy data crunching in "- (void) viewDidLoad {". When I changed it to start the crunching after I tapped a button in the app, it worked just fine.
Right now it's just a matter of finding out where to start the data crunching; any suggestions?
I have two problems:
1) My app works fine on the device the first few times it's run. Then it crashes after the first screen (tab bar) pops up. If I connect the device to my Mac and then run the app on the device, it works (not debug mode).
I checked the crash logs: it crashed because of "EXC_BAD_ACCESS (SIGSEGV)" and thread 0 crashed. The error was that NSAutorelease released a deallocated object.
2) I ran the app using Instruments on the simulator. It shows a lot of leaks on this function call.
Here is a sample code. When I run it using Instruments it shows a leak on the "setObject" line.
//class A- subclass of NSObject
+(NSMutableDictionary *)Hello {
NSMutableDictionary *dctONE = [[NSMutableDictionary alloc]initWithCapacity:0];
NSMutableArray *arrKeys = [[NSMutableArray alloc] initWithCapacity:0];
[arrKeys addObject:#"Object1"];
[arrKeys addObject:#"Object2"];
[dctONE setObject:[NSString stringWithFormat:#"dsfsdf"] forKey:[arrKeys objectAtIndex:0]];
[dctONE setObject:[NSString stringWithFormat:#"dsfsdf"] forKey:[arrKeys objectAtIndex:1]];
[arrKeys release];
return dctONE;
}
/// class B
-(void)some_Function {
NSMutableDictionary * dct = [A Hello]; //all declarations are done
//do stuff with dct
[dct release];
}
Why does it leak at "setObject"? I am releasing everything properly, right? The only thing is the [NSString stringWithFormat:], but that is autoreleased, right?
This is driving me crazy.
Are the two problems related?
PS: It doesn't crash on the simulator, and strangely doesn't crash even when I connect my device to my Mac and then test it on the device (not debugging, launching the app directly on the device).
EDIT:
-(NSMutableDictionary *) ExecuteDataSet:(NSString *)strQuery
{
NSMutableDictionary *dctResult = [[[NSMutableDictionary alloc] init] autorelease];
// BOOL isSucess = FALSE;
const char *sql = [strQuery UTF8String];
sqlite3_stmt *selectStatement;
//prepare the select statement
int returnValue = sqlite3_prepare_v2(database, sql, -1, &selectStatement, NULL);
if(returnValue == SQLITE_OK)
{
sqlite3_bind_text(selectStatement, 1, sql, -1, SQLITE_TRANSIENT);
//loop all the rows returned by the query.
NSMutableArray *arrColumns = [[NSMutableArray alloc] init];
for (int i=0; i<sqlite3_column_count(selectStatement); i++)
{
const char *st = sqlite3_column_name(selectStatement, i);
[arrColumns addObject:[NSString stringWithCString:st encoding:NSUTF8StringEncoding]];
}
int intRow =1;
while(sqlite3_step(selectStatement) == SQLITE_ROW)
{
NSMutableDictionary *dctRow = [[NSMutableDictionary alloc] init];
for (int i=0; i<sqlite3_column_count(selectStatement); i++)
{
int intValue = 0;
const char *strValue;
switch (sqlite3_column_type(selectStatement,i))
{
case SQLITE_INTEGER:
intValue = (int)sqlite3_column_int(selectStatement, i);
[dctRow setObject:[NSString stringWithFormat:#"%d",intValue] forKey:[arrColumns objectAtIndex:i]];
break;
case SQLITE_TEXT:
strValue = (const char *)sqlite3_column_text(selectStatement, i);
[dctRow setObject:[NSString stringWithCString:strValue encoding:NSUTF8StringEncoding] forKey:[arrColumns objectAtIndex:i]];
break;
default:
strValue = (const char *)sqlite3_column_value(selectStatement, i);
[dctRow setObject:[NSString stringWithCString:strValue encoding:NSUTF8StringEncoding] forKey:[arrColumns objectAtIndex:i]];
break;
}
}
[dctResult setObject:[dctRow retain] forKey:[NSString stringWithFormat:@"Table%d",intRow]];
intRow ++;
[dctRow release];
}
[arrColumns release];
}
return dctResult;
}
The leak appears to be in this line:
[dctResult setObject:[dctRow retain] forKey:[NSString stringWithFormat:@"Table%d",intRow]];
setObject will call retain on the object it stores, but you are actually retaining it manually, so its retain count will be 2 instead of 1, and when dctResult gets released, it won't be removed from memory.
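Dropping the manual retain (the dictionary already retains what it stores) should fix that line:
[dctResult setObject:dctRow forKey:[NSString stringWithFormat:@"Table%d",intRow]];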
You're doing a whole bunch of unnecessary work in +Hello. My first step at debugging is always to remove unnecessary complexity. Try it like this:
+(NSMutableDictionary *)Hello {
NSMutableDictionary *dctONE = [NSMutableDictionary dictionaryWithObjectsAndKeys:
#"dsfsdf", #"Object1", #"dsfsdf", #"Object2", nil];
return dctONE;
}
That NSMutableDictionary dictionaryWithObjectsAndKeys: method takes a nil-terminated list of arguments that goes object, key, object, key. It returns an autoreleased NSMutableDictionary object, which is what you want to be returning.
There are no guarantees in this world, but I can darn near promise you that method won't leak.
I'm a bit new to iPhone development, so be gentle! I'm supporting an app which loads a wav file from a URL file-stream, and plays it back through an AudioQueue.
We run a continual loop in another thread, and stop the Queue if we detect that it has no buffers in use, and the input FileStream has reached its end. In turn, we detect if the FileStream has ended within the waitForDataInBackgroundAndNotify callback on the NSFileHandleDataAvailableNotification for the stream, by checking whether the availableData has length 0.
This works under iOS 3.0 - we get a notification of 0 available data at the end of the file - but on iOS 4.0, we don't seem to receive the callback at file end. This happens on an OS 4.0 device, regardless of the target OS version.
Has the API changed between the two versions? How can I detect the end of the file now?
Hopefully-relevant code:
data-available callback:
- (void)readFileData:(NSNotification *)notification
{
@try
{
NSData *data = [[notification object] availableData];
if ([data length] == 0 && self.audioQueueState != AQS_END)
{
/***********************************************************************/
/* We've hit the end of the data but it's possible that more may be */
/* appended to the file (if we're still downloading it) so we need to */
/* wait for the availability of more data. */
/***********************************************************************/
[self setFileStreamerState:FSS_END];
[[notification object] waitForDataInBackgroundAndNotify];
}
else if (self.audioQueueState == AQS_END)
{
TRC_DBG(#"ignore read data as ending");
}
else
{
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
TRC_DBG(#"Read %d bytes", [data length]);
[self setFileStreamerState:FSS_DATA];
if (discontinuous)
{
TRC_DBG(#"AudioFileStreamParseBytes %d bytes, discontinuous", [data length]);
err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], kAudioFileStreamParseFlag_Discontinuity);
discontinuous = NO;
}
else
{
TRC_DBG(#"AudioFileStreamParseBytes %d bytes, continuous", [data length]);
err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], 0);
}
/***********************************************************************/
/* If error then get out, otherwise wait again for more data. */
/***********************************************************************/
if (err != 0)
{
[self failWithErrorCode:AS_FILE_STREAM_PARSE_BYTES_FAILED];
}
else
{
[[notification object] waitForDataInBackgroundAndNotify];
}
[pool release];
}
}
@catch (NSException *exception)
{
TRC_ERR(@"Exception: %@", exception);
TRC_ERR(@"Exception reason: %@", [exception reason]);
//[self failWithErrorCode:AS_FILE_AVAILABLE_DATA_FAILED];
}
}