ALAssetsLibrary and iOS 5.0 - iPhone

I have coded a simple method to fetch data from the assets library, which works fine on iOS 4.3 but is noticeably slow at fetching images on iOS 5. What should I do to speed up the fetching process on iOS 5?
-(void)setImages
{
    int count = 0;
    int photoNumber = [[templateDictionary objectForKey:@"ElementsOnPage"] intValue];
    for (int i = currentCount; count < photoNumber; i++) {
        [self data:count+1 count:i];
        count++;
    }
}

-(void)data:(int)photoNumber count:(int)currentCount
{
    NSURL *url;
    UIImageView *firstImageView = [[UIImageView alloc] init];
    CGFloat x = 0, y = 0, wid = 0, h = 0;
    float ang = 0; // initialized so the default switch case cannot leave it undefined
    if (currentCount >= [ImageURLArray count]) {
        [firstImageView release];
        return;
    }
    else
    {
        url = [NSURL URLWithString:[ImageURLArray objectAtIndex:currentCount]];
        switch (photoNumber) {
            case 1:
            {
                x   = [[templateDictionary objectForKey:@"FirstElement_X"] floatValue];
                y   = [[templateDictionary objectForKey:@"FirstElement_Y"] floatValue];
                wid = [[templateDictionary objectForKey:@"FirstElement_Width"] floatValue];
                h   = [[templateDictionary objectForKey:@"FirstElement_Height"] floatValue];
                ang = [[templateDictionary objectForKey:@"FirstElement_Angle"] floatValue];
                firstImageView.tag = 1 + 10;
                // FirstImage
            }
                break;
            case 2:
            {
                x   = [[templateDictionary objectForKey:@"SecondElement_X"] floatValue];
                y   = [[templateDictionary objectForKey:@"SecondElement_Y"] floatValue];
                wid = [[templateDictionary objectForKey:@"SecondElement_Width"] floatValue];
                h   = [[templateDictionary objectForKey:@"SecondElement_Height"] floatValue];
                ang = [[templateDictionary objectForKey:@"SecondElement_Angle"] floatValue];
                firstImageView.tag = 2 + 10;
                // SecondImage
            }
                break;
            case 3:
            {
                x   = [[templateDictionary objectForKey:@"ThirdElement_X"] floatValue];
                y   = [[templateDictionary objectForKey:@"ThirdElement_Y"] floatValue];
                wid = [[templateDictionary objectForKey:@"ThirdElement_Width"] floatValue];
                h   = [[templateDictionary objectForKey:@"ThirdElement_Height"] floatValue];
                ang = [[templateDictionary objectForKey:@"ThirdElement_Angle"] floatValue];
                firstImageView.tag = 3 + 10;
                // ThirdImage
            }
                break;
            default:
                break;
        }
        [firstImageView setFrame:CGRectMake(x, y, wid, h)];
        dispatch_async(dispatch_get_main_queue(), ^
        {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            // Fetch the asset for this URL and set its full-resolution image.
            [library assetForURL:url
                     resultBlock:^(ALAsset *asset)
                     {
                         UIImage *img = [UIImage imageWithCGImage:[asset.defaultRepresentation fullResolutionImage]];
                         [firstImageView setImage:img];
                     }
                    failureBlock:^(NSError *error)
                    {
                        NSLog(@"error requesting asset");
                    }];
            [library release];
            [pool release];
        });
        if ([[contentType objectAtIndex:currentCount] isEqualToString:@"1"]) {
            UIButton *videoImage = [[UIButton alloc] initWithFrame:CGRectMake((firstImageView.frame.size.width / 2) - 25, (firstImageView.frame.size.height / 2) - 25, 50, 50)];
            videoImage.transform = CGAffineTransformMakeRotation(ang * (M_PI / 180));
            [videoImage setBackgroundImage:[UIImage imageNamed:@"videothumb.png"] forState:UIControlStateNormal];
            [videoImage addTarget:self action:@selector(PlayMusicOnClickofButton:) forControlEvents:UIControlEventTouchUpInside];
            [firstImageView addSubview:videoImage];
            videoImage.tag = currentCount + 1000;
            [videoImage release];
        }
    }
    firstImageView.transform = CGAffineTransformMakeRotation(ang * (M_PI / 180));
    firstImageView.userInteractionEnabled = YES;
    [coverImageView addSubview:firstImageView];
    [coverImageView setImage:[UIImage imageNamed:innerBackground]];
    [firstImageView release];
}

I already reported a bug to Apple (following a request from an Apple employee in the dev forums) about the performance degradation of assetForURL: in iOS 5.
Background: The asset library was refactored and is now based on Core Data; on each assetForURL: call, the SDK actually opens a new SQLite connection (BAH...), causing a significant performance hit.
Temp solution: In my app I need to load 200 pics using assetForURL:. In iOS 4 it took about 100 ms; in iOS 5, around 5+ seconds. I found that enumerating the entire library (around 1500 pics) and caching it in a URL --> asset dictionary takes around 3 seconds, so I'm using that technique for now. Watch out for stale assets if you hold on to them and changes to the library occur.
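A minimal sketch of that caching workaround, in the same manual-retain-release style as the code above; the assetCache and library ivars and the completion block are illustrative assumptions, not part of the original answer:

// Enumerate every group and asset once, caching assets keyed by their asset URL.
// NOTE: ALAssets are only valid while their ALAssetsLibrary is alive, so
// `library` must be an ivar that outlives the cache.
- (void)buildAssetCacheWithCompletion:(void (^)(void))completion
{
    assetCache = [[NSMutableDictionary alloc] init]; // NSURL -> ALAsset
    library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupAll
                           usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
                               if (group == nil) { // nil group marks the end of enumeration
                                   if (completion) completion();
                                   return;
                               }
                               [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
                                   if (asset == nil) return;
                                   NSURL *assetURL = [asset valueForProperty:ALAssetPropertyAssetURL];
                                   if (assetURL) [assetCache setObject:asset forKey:assetURL];
                               }];
                           }
                         failureBlock:^(NSError *error) {
                             NSLog(@"library enumeration failed: %@", error);
                         }];
}

Lookups then become a dictionary hit ([assetCache objectForKey:url]) instead of a fresh assetForURL: round trip.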


App crashes after download of 6th image

I have a problem and need help.
I have a table; each cell contains a horizontal scroll view with images, which are downloaded from the internet.
When I download the 6th image, my app crashes.
For async download I use https://github.com/rs/SDWebImage
-(void)fastCreateImage
{
    int tempID = self.currentPageNow;
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, 1.2f * NSEC_PER_SEC), dispatch_get_current_queue(), ^{
        if (tempID == self.currentPageNow)
        {
            NSUInteger objIdx = [self.imageViews indexOfObject:[NSNumber numberWithInt:tempID]];
            if (objIdx != NSNotFound) {
                NSLog(@"WAS CACHED!!!!!!");
            }
            else
            {
                UIImageView *myImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 193.5f)];
                NSString *urlInString = [NSString stringWithFormat:@"%@/uploads/gallery/hotels/%@", webSite, [self.urlGarbage objectAtIndex:self.currentPageNow]];
                SDWebImageManager *manager = [SDWebImageManager sharedManager];
                [manager downloadWithURL:[NSURL URLWithString:urlInString]
                                delegate:self
                                 options:0
                                 success:^(UIImage *image, BOOL cached)
                                 {
                                     myImageView.image = image;
                                     [[self.views objectAtIndex:tempID] addSubview:myImageView];
                                     [self.imageViews addObject:[NSNumber numberWithInt:tempID]];
                                     NSLog(@"LOADED IMG");
                                 }
                                 failure:nil];
                [myImageView release];
            }
        }
    });
}
Since you're getting a memory warning, I guess your images are too large to fit in memory. A decoded image uses up to width * height * 4 bytes (depending on color depth), regardless of its compressed file size. Calculate this memory requirement for each of your images to see whether this is the problem.
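As a quick illustration of that rule of thumb (the dimensions here are made-up examples, not taken from the question):

// A 1024 x 768 image decodes to roughly 3 MB, so even a handful of photos
// can add up to tens of megabytes on a device of that era.
size_t bytesPerImage = 1024 * 768 * 4; // = 3,145,728 bytes
NSLog(@"decoded size: %.1f MB", bytesPerImage / (1024.0 * 1024.0));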

iPhone UIImageView delays display when setImage

I am working on displaying several images in a scroll view, loaded from a map server which returns an image of a map.
What I did is create 4 UIImageViews and put them in an NSMutableDictionary.
When scrolling to the desired image, it starts loading the data from the URL asynchronously:
first I display a UIActivityIndicatorView, then it loads the data, and at the end it hides the UIActivityIndicatorView and displays the UIImageView.
Everything works more or less fine, except that it takes very long until the image displays, although the image is not that big. I have a log message indicating that the end of the function was reached; this message appears immediately, but the image still does not show.
If I request the URL in a web browser, the image is shown immediately.
Below you see my piece of code.
- (void)loadSRGImage:(int)page {
    UIImageView *currentSRGMap = (UIImageView *)[srgMaps objectForKey:[NSString stringWithFormat:@"image_%i", page]];
    UIActivityIndicatorView *currentLoading = (UIActivityIndicatorView *)[srgMaps objectForKey:[NSString stringWithFormat:@"loading_%i", page]];
    // if the image has been loaded already, do not load again
    if (currentSRGMap.image != nil) return;
    if (page > 1) {
        MKCoordinateSpan currentSpan;
        currentSpan.latitudeDelta = [[[srgMaps objectForKey:[NSString stringWithFormat:@"span_%i", page]] objectForKey:@"lat"] floatValue];
        currentSpan.longitudeDelta = [[[srgMaps objectForKey:[NSString stringWithFormat:@"span_%i", page]] objectForKey:@"lon"] floatValue];
        region.span = currentSpan;
        region.center = mapV.region.center;
        [mapV setRegion:region animated:TRUE];
        //[mapV regionThatFits:region];
    }
    srgLegende.hidden = NO;
    currentSRGMap.hidden = YES;
    currentLoading.hidden = NO;
    [currentLoading startAnimating];
    NSOperationQueue *queue = [NSOperationQueue new];
    NSInvocationOperation *operation = [[NSInvocationOperation alloc]
                                        initWithTarget:self
                                              selector:@selector(loadImage:)
                                                object:[NSString stringWithFormat:@"%i", page]];
    [queue addOperation:operation];
    [operation release];
}
- (void)loadImage:(NSInvocationOperation *)operation {
    // NOTE: this runs on a background operation queue; the UIKit calls at the
    // bottom of this method therefore execute off the main thread.
    NSString *imgStr = [@"image_" stringByAppendingString:(NSString *)operation];
    NSString *loadStr = [@"loading_" stringByAppendingString:(NSString *)operation];
    WGS84ToCH1903 *converter = [[WGS84ToCH1903 alloc] init];
    CLLocationCoordinate2D coord1 = [mapV convertPoint:mapV.bounds.origin toCoordinateFromView:mapV];
    CLLocationCoordinate2D coord2 = [mapV convertPoint:CGPointMake(mapV.bounds.size.width, mapV.bounds.size.height) toCoordinateFromView:mapV];
    int x1 = [converter WGStoCHx:coord1.longitude withLat:coord1.latitude];
    int y1 = [converter WGStoCHy:coord1.longitude withLat:coord1.latitude];
    int x2 = [converter WGStoCHx:coord2.longitude withLat:coord2.latitude];
    int y2 = [converter WGStoCHy:coord2.longitude withLat:coord2.latitude];
    NSString *URL = [NSString stringWithFormat:@"http://map.ssatr.ch/mapserv?mode=map&map=import/dab/maps/dab_online.map&mapext=%i+%i+%i+%i&mapsize=320+372&layers=DAB_Radio_Top_Two", y1, x1, y2, x2];
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:URL]];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    [imageData release];
    UIImageView *currentSRGMap = (UIImageView *)[srgMaps objectForKey:imgStr];
    UIActivityIndicatorView *currentLoading = (UIActivityIndicatorView *)[srgMaps objectForKey:loadStr];
    currentSRGMap.hidden = NO;
    currentLoading.hidden = YES;
    [currentLoading stopAnimating];
    [currentSRGMap setImage:image]; //UIImageView
    [image release];
    NSLog(@"finished loading image: %@", URL);
}
I had a similar thing in my app, and I used SDWebImage from https://github.com/rs/SDWebImage. It provides a category on UIImageView: you give it a placeholder image and a URL, it displays the image once downloaded, and it maintains a cache of downloaded images, avoiding repeated server calls.
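A minimal usage sketch of that category (the method name matches the 2011-era SDWebImage API; later releases renamed it sd_setImageWithURL:, and the placeholder file name is just an example):

#import "UIImageView+WebCache.h"

// The category downloads asynchronously, caches the result, and sets the
// image on the main thread when it arrives.
UIImageView *mapImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 372)];
[mapImageView setImageWithURL:[NSURL URLWithString:URL]
             placeholderImage:[UIImage imageNamed:@"map_placeholder.png"]];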

AVFoundation - Retiming CMSampleBufferRef Video Output

First time asking a question here. I'm hoping the post is clear and sample code is formatted correctly.
I'm experimenting with AVFoundation and time lapse photography.
My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a timelapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter and AVAssetWriterInput.
The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, each frame plays back for the length of time between the original input frames, i.e. at a frame rate of about 1 fps. I'm looking to get 30 fps.
I've tried using CMSampleBufferCreateCopyWithNewTiming(), but then after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface is still active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want, at 30 fps, but it only has those 13 frames.
How can I accomplish my goal of 30fps playback?
How can I tell where the app is getting lost and why?
I've placed a flag called useNativeTime so I can test both cases. When set to YES, I get all frames I'm interested in as the callback doesn't 'get lost'. When I set that flag to NO, I only ever get 13 frames processed and am never returned to that method again. As mentioned above, in both cases I can playback the video.
Thanks for any help.
Here is where I'm trying to do the retiming.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;
    //NSLog(@"in captureOutput sample buffer method");
    if (!CMSampleBufferDataIsReady(sampleBuffer))
    {
        NSLog(@"sample buffer is not ready. Skipping sample");
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }
    if (![inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write every first frame of n frames (30 native from camera).
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }
        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake(0 * 20, 600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime:imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }
        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20, 600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20, 600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake((writtenFrames + 0) * 20, 600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;
            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer));
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);
            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i", myStatus);
            if (!CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");
            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer); // How is this different; CMSampleBufferSetDataReady ?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i", myStatus);
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe action. WTF does it do? Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); // - Not surprisingly - "EXC_BAD_ACCESS"
        }
        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }
    //[self displayOuptutWritterStatus]; // Expect and see AVAssetWriterStatusWriting.
}
My setup routine.
- (IBAction)recordingStartStop:(id)sender
{
    NSError *error;
    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle:@"Record" forState:UIControlStateNormal];
        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);
        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput:[[self.captureSession outputs] objectAtIndex:0]];
        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;
        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder:@"TestProject"];
        intervalFrames = 30;
        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary *cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue:value forKey:key];
        [videoOutput setVideoSettings:cameraVideoSettings];
        [videoOutput setMinFrameDuration:CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames:YES];
        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);
        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue:AVVideoCodecH264 forKey:AVVideoCodecKey];
        [outputSettings setValue:[NSNumber numberWithInt:1280] forKey:AVVideoWidthKey]; // currently assuming
        [outputSettings setValue:[NSNumber numberWithInt:720] forKey:AVVideoHeightKey];
        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue:AVVideoProfileLevelH264Main30 forKey:AVVideoProfileLevelKey];
        //[compressionSettings setValue:[NSNumber numberWithDouble:1024.0*1024.0] forKey:AVVideoAverageBitRateKey];
        [outputSettings setValue:compressionSettings forKey:AVVideoCompressionPropertiesKey];
        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;
        outputWriter = [AVAssetWriter assetWriterWithURL:[projectPaths movieURLPath] fileType:AVFileTypeQuickTimeMovie error:&error];
        [outputWriter retain];
        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:]");
        if ([outputWriter canAddInput:inputWriterBuffer]) [outputWriter addInput:inputWriterBuffer];
        else NSLog(@"can not add input");
        if (![outputWriter canApplyOutputSettings:outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");
        if ([captureSession canAddOutput:videoOutput]) [self.captureSession addOutput:videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");
        //[self.captureSession startRunning];
        self.isRecording = YES;
        [recordingStarStop setTitle:@"Stop" forState:UIControlStateNormal];
        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime:imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog(@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }
    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");
    [self displayOuptutWritterStatus];
}
OK, I found the bug in my first post.
When using

myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                 sampleBuffer,
                                                 1,
                                                 &sampleTimingInfo,
                                                 &newSampleBuffer);

you need to balance it with a CFRelease(newSampleBuffer);.
The same idea holds true when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance: you would call CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.
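Putting both release rules together, the corrected tail of the retiming branch might look like this (a sketch reusing the variable names from the question's code):

// CMSampleBufferCreateCopyWithNewTiming follows the Create rule, so we
// own newSampleBuffer and must CFRelease it once the writer has consumed it.
if (myStatus == noErr && CMSampleBufferIsValid(newSampleBuffer)) {
    appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
    CFRelease(newSampleBuffer);
}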
Hope this is helpful to someone else.
With a little more searching and reading I have a working solution. I don't know whether it is the best method, but so far, so good.
In my setup area I've set up an AVAssetWriterInputPixelBufferAdaptor. The code addition looks like this:
inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
                            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:inputWriterBuffer
                                                       sourcePixelBufferAttributes:nil];
[inputWriterBufferAdaptor retain];
For completeness, to help understand the code below, I also have these three lines in the setup method:
fpsOutput = 30; // Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97.
cmTimeSecondsDenominatorTimescale = 600 * 100000; // To more precisely handle 29.97.
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
Instead of applying a retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Notice the withPresentationTime: parameter of the adaptor: by passing my custom value to it, I get the correct timing I'm seeking.
CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer(sampleBuffer);
imageSourceTime = CMTimeMake(writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer:myImage withPresentationTime:imageSourceTime];
Use of the AVAssetWriterInputPixelBufferAdaptor.pixelBufferPool property may yield some gains, but I haven't figured that out.
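For anyone who wants to experiment with the pool, here is a hedged sketch of drawing a buffer from it (untested here; note the pool is only non-nil once writing has started, and you would still have to fill the buffer with the frame's pixel data):

// CVPixelBufferPoolCreatePixelBuffer follows the Create rule, so the
// buffer must be released after the adaptor has consumed it.
CVPixelBufferRef poolBuffer = NULL;
CVReturn cvErr = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                    inputWriterBufferAdaptor.pixelBufferPool,
                                                    &poolBuffer);
if (cvErr == kCVReturnSuccess && poolBuffer != NULL) {
    // ... copy the frame's pixel data into poolBuffer here ...
    appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer:poolBuffer
                                               withPresentationTime:imageSourceTime];
    CVPixelBufferRelease(poolBuffer);
}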

UIImage problem

I am loading images of size 450 KB into a UIImageView and then adding it to a UIScrollView. While scrolling through the 30 images continuously, the app crashes. What may be the reason? Is this a memory-leak issue, or is the image size the problem? Thanks in advance.
Here is my code:
@try {
    NSAutoreleasePool *pool;
    pool = [[NSAutoreleasePool alloc] init];
    //NSArray *array = [global_ContentString componentsSeparatedByString:@"###"];
    NSArray *array1 = [catalogURL componentsSeparatedByString:@"&"];
    //**NSLog(@"array1****** = %@", array1);
    NSLog(@"loading catalog image (method: loadCatalogImage).......%@%@", baseURL, [[[array1 objectAtIndex:0] componentsSeparatedByString:@"##"] objectAtIndex:0]);
    //NSLog(@"baseURL = %@", baseURL);
    NSLog(@"loading catalog image.......%@%@", baseURL, [[[array1 objectAtIndex:0] componentsSeparatedByString:@"##"] objectAtIndex:0]);
    zoomedImageURL = [NSString stringWithFormat:@"%@%@", baseURL, [[[array1 objectAtIndex:0] componentsSeparatedByString:@"##"] objectAtIndex:1]];
    [zoomedImageURL retain];
    NSLog(@"aaaaaaaaaaaaaa = %@", zoomedImageURL);
    //UIImage *img = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"%@%@", baseURL, [[[array1 objectAtIndex:0] componentsSeparatedByString:@"##"] objectAtIndex:0]]]]];
    UIImage *img = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"%@", zoomedImageURL]]]];
    imgView.contentMode = UIViewContentModeScaleAspectFit;
    imgView.image = img; //[GPSTripTracking generatePhotoThumbnail:img :109];
    [pool release];
    [global_imgProgress stopAnimating];
}
@catch (NSException *e) {
    [global_imgProgress stopAnimating];
    NSLog(@"Exception....");
}
@finally {
}
I am releasing my imgView in the dealloc method.
I implemented the following code in scrollViewDidScroll:
- (void)scrollViewDidScroll:(UIScrollView *)sender {
    // We don't want a "feedback loop" between the UIPageControl and the scroll delegate in
    // which a scroll event generated from the user hitting the page control triggers updates from
    // the delegate method. We use a boolean to disable the delegate logic when the page control is used.
    if (pageControlUsed) {
        // do nothing - the scroll was initiated from the page control, not the user dragging
        //pageText.text = [NSString stringWithFormat:@"%d/%d", (pageControl.currentPage + 1), pageControl.numberOfPages];
        pageText.text = [NSString stringWithFormat:@"%d/%d", (pageControl.currentPage), pageControl.numberOfPages];
        //NSLog(@"not scrolling page....");
        return;
    }
    // Switch the indicator when more than 50% of the previous/next page is visible
    CGFloat pageWidth = scrollView.frame.size.width;
    int page = floor((scrollView.contentOffset.x - pageWidth / 2) / pageWidth) + 1;
    pageControl.currentPage = page;
    // load the visible page and the page on either side of it (to avoid flashes when the user starts scrolling)
    [self loadScrollViewWithPage:page - 1];
    [self loadScrollViewWithPage:page];
    [self loadScrollViewWithPage:page + 1];
    //NSLog(@"scrolling page....%d", page);
    // A possible optimization would be to unload the views+controllers which are no longer visible
}
And my code for loadScrollViewWithPage is:
- (void)loadScrollViewWithPage:(int)page
{
    //page--;
    if (page < 0) return;
    if (page >= numberOfPages) return;
    if (!isViewCatalog && searchId == 1)
    {
        //NSLog(@"curre page = %d", pageControl.currentPage);
        NSArray *array1 = [global_ContentString componentsSeparatedByString:@"###"];
        if (searchInCatalogFlag == 1)
        {
            pageControl.currentPage = 0;
            NSArray *urlArray = [[array1 objectAtIndex:pageControl.currentPage] componentsSeparatedByString:@"##"];
            //NSLog(@"url array** = %@", urlArray);
            headerText.text = [NSString stringWithString:[urlArray objectAtIndex:0]];
            pageText.text = [NSString stringWithFormat:@"%d/%d", pageControl.currentPage, (pageControl.numberOfPages - 1)];
        }
        else
        {
            NSArray *urlArray = [[array1 objectAtIndex:pageControl.currentPage] componentsSeparatedByString:@"##"];
            //NSLog(@"url array** = %@", urlArray);
            headerText.text = [NSString stringWithString:[urlArray objectAtIndex:0]];
            pageText.text = [NSString stringWithFormat:@"%d/%d", pageControl.currentPage, (pageControl.numberOfPages - 1)];
        }
        if (page == selectedPage && ![global_imgProgress isAnimating])
            [global_imgProgress startAnimating];
    }
    else
    {
        headerText.text = [NSString stringWithString:global_SelectedCatalogName];
        pageText.text = [NSString stringWithFormat:@"%d/%d", (pageControl.currentPage + 1), (pageControl.numberOfPages - 1)];
        if (page == selectedPage + 1 && ![global_imgProgress isAnimating])
            [global_imgProgress startAnimating];
        // NSLog(@"header text = %@", headerText.text);
        //headerText.text = [NSString stringWithString:[urlArray objectAtIndex:0]];
    }
    FullPageView *controller = [viewControllers objectAtIndex:page];
    if ((NSNull *)controller == [NSNull null]) {
        //NSLog(@"Loading page =========== %d, %d", page, selectedPage);
        //voucherPageNo = page;
        //[voucherImage retain];
        if (universalApp == 2)
        {
            controller = [[FullPageView alloc] initWithNibName:@"FullPageView_iphone" bundle:nil]; //:page];
            [controller.view setFrame:CGRectMake(0, 0, 320, 332)];
        }
        else
        {
            controller = [[FullPageView alloc] initWithNibName:@"FullPageView" bundle:nil]; //:page];
            [controller.view setFrame:CGRectMake(0, 192, 768, 691)];
        }
        //[controller.view setFrame:CGRectMake(0, 0, 320, 480)];
        //[controller.view setFrame:CGRectMake(0, 192, 768, 691)];
        if ((!isViewCatalog && searchId < 2 && searchInCatalogFlag == 0)) // || searchInCatalogFlag == 1)
        {
            // NSLog(@">>>>>>>>>>>>>>>>>> LOADING IMAGE >>>>>>>>>>>>>>>>>>>>");
            [controller setPageNo:page];
            // if (page >= selectedPage - 1)
            [NSThread detachNewThreadSelector:@selector(loadImage) toTarget:controller withObject:nil];
        }
        else //if ((page >= (selectedPage - 1) && page <= (selectedPage + 1)) || !isFirstTimeLoading)
        {
            NSLog(@"Loading CATALOG IMAGE = %d, %d, %@", page, selectedPage, (isFirstTimeLoading ? @"YES" : @"NO"));
            [controller setCatalogURL:[NSString stringWithFormat:@"%@", [catalogArray objectAtIndex:page + (searchId < 2 && !isViewCatalog && searchInCatalogFlag == 0 ? 0 : 1)]]];
            NSLog(@"loading image ipad= %@", [catalogArray objectAtIndex:page + (searchId < 2 && !isViewCatalog && searchInCatalogFlag == 0 ? 0 : 1)]);
            // if (page >= selectedPage - 1)
            [NSThread detachNewThreadSelector:@selector(loadCatalogImage) toTarget:controller withObject:nil];
            // if (page == (selectedPage + 1))
            //isFirstTimeLoading = NO;
        }
        [viewControllers replaceObjectAtIndex:page withObject:controller];
        [controller release];
    }
    // add the controller's view to the scroll view
    if (nil == controller.view.superview)
    {
        // NSLog(@"Voucher view added at page..... %d", page);
        CGRect frame = scrollView.frame;
        frame.origin.x = frame.size.width * page;
        frame.origin.y = 0;
        controller.view.frame = frame;
        // NSLog(@">>>>>>>>> %f, %f", frame.size.width, frame.origin.x);
        [scrollView addSubview:controller.view];
    }
    //if (page == pageControl.currentPage)
    //[imgProgress startAnimating];
    //else
    //pageControlUsed = YES;
}
Where could the problem be?
450 KB is the compressed image size. When an image is loaded into memory, it is uncompressed.
A rule of thumb for how much memory an uncompressed image occupies is:
width * height * 4 bytes
For example, a 1600 x 1200 photo decodes to roughly 7.3 MB, so 30 such images come to well over 200 MB.
With 30 images it is very likely that you're running out of memory.
You should write your code to keep images in memory only while they are visible on screen.
It is very likely that your program gets terminated by iOS because it consumes too much memory. Start it from Xcode and look at the console - it will probably print that it is receiving memory warnings.
You will have to load the images on demand, i.e. only when the user gets close to seeing them, and release the ones that move out of view again. To that end, implement the
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
delegate method, look at the scroll view's content offset, and load/release the appropriate images, as sketched below. Alternatively, you can choose not to release them there but instead wait for a memory warning. To do that, implement the
- (void)didReceiveMemoryWarning
method in your view controller.
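A minimal sketch of the on-demand approach, assuming the page-based layout from the question; unloadScrollViewPage: is a hypothetical helper that would remove a page's image view and release its image:

// Keep only the current page and its two neighbours in memory.
- (void)scrollViewDidScroll:(UIScrollView *)sender {
    CGFloat pageWidth = scrollView.frame.size.width;
    int page = floor((scrollView.contentOffset.x - pageWidth / 2) / pageWidth) + 1;
    for (int i = 0; i < numberOfPages; i++) {
        if (abs(i - page) <= 1) {
            [self loadScrollViewWithPage:i]; // visible or adjacent: make sure it is loaded
        } else {
            [self unloadScrollViewPage:i]; // hypothetical helper: release the page's image
        }
    }
}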
This might be a memory issue.
Try replacing this line:
UIImage *img = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"%@", zoomedImageURL]]]];
with this, so you control exactly when the image is released instead of leaving it in the autorelease pool:
UIImage *img = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[NSString stringWithFormat:@"%@", zoomedImageURL]]]];
<<<<<YOUR CODE>>>>
[img release];
I hope this helps.

Adding large numbers of properties in Core Data, crashing when starting from phone but not from Xcode

I am trying to add data to Core Data. It works fine when I build from Xcode to the phone, but when I start the app directly on the iPhone, it crashes on the first save to the context.
I read a text file that is synced via iTunes File Sharing; the file is pretty big (~350,000 lines). The values I get from the file are added to two different arrays (barcodes and productNames). The arrays are later batched through and then sent to the function where I save the data.
From the array loop:
[...]
words = [rawText componentsSeparatedByString:@";"];
int loopCounter = 0;
int loopLimit = 20000;
int n = 0;
int wordType;
NSEnumerator *word = [words objectEnumerator];
NSLog(@"Create arrays");
while (tmpWord = [word nextObject]) {
    if ([tmpWord isEqualToString:@""] || [tmpWord isEqualToString:@"\r\n"]) {
        // NSLog(@"%@ *** NOTHING *** ", tmpWord);
    } else {
        n++;
        wordType = n % 2;
        if (wordType == kBarcode) {
            [barcodes addObject:tmpWord];
        } else if (wordType == kProduct) {
            [productNames addObject:tmpWord];
        }
        // Send to batch //
        loopCounter++;
        if (loopCounter == loopLimit) {
            loopCounter = 0;
            NSLog(@"adding new batch");
            [self addBatchOfData];
            [barcodes release];
            [productNames release];
            barcodes = [[NSMutableArray arrayWithCapacity:20000] retain];
            productNames = [[NSMutableArray arrayWithCapacity:20000] retain];
        }
    }
[...]
And then the save-function:
-(void)addBatchOfData {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSError *error;
    NSUInteger loopLimit = 5000;
    NSUInteger loopCounter = 0;
    NSString *ean;
    NSString *designation;
    for (int i = 0; i < [barcodes count]; i++) {
        ean = [barcodes objectAtIndex:i];
        designation = [productNames objectAtIndex:i];
        Product *product = (Product *)[NSEntityDescription insertNewObjectForEntityForName:@"Product" inManagedObjectContext:importContext];
        [product setDesignation:designation];
        [product setBarcode:ean];
        loopCounter++;
        if (loopCounter == loopLimit) {
            NSLog(@"Save CoreData");
            [importContext save:&error];
            [importContext reset];
            [pool drain];
            pool = [[NSAutoreleasePool alloc] init];
            loopCounter = 0;
        }
    }
    // Save any remaining records
    if (loopCounter != 0) {
        [importContext save:&error];
        [importContext reset];
    }
    [pool drain];
}
It's really irritating that it works fine when I build from Xcode. Hopefully there is a setting that I missed or something...
EDIT: Forgot to mention that I don't get past the Default screen and I don't have any logs. Could it have something to do with the provisioning?
Offload your file loading to a background thread and let the phone start up your main window and view. iOS will kill your app if you do not present a view in a timely manner (this is what you are seeing).
I have to do something like this for my xml -> CoreData converter code. I just present the user with a view notifying them of what is going on, plus a progress bar (I use https://github.com/matej/MBProgressHUD).
Something like:
self.hud = [[MBProgressHUD alloc] initWithView:window];
// Set determinate mode
hud.mode = MBProgressHUDModeDeterminate;
hud.delegate = self;
hud.labelText = @"Converting Data File";
[self.window addSubview:hud];
// Show the HUD while the provided method executes in a new thread
[hud showWhileExecuting:@selector(convertToCoreDataStoreTask) onTarget:self withObject:nil animated:YES];
You just have to make sure that you use a separate NSManagedObjectContext in the new thread.
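A hedged sketch of that pattern in the pre-iOS-5 thread-confinement style this answer assumes; the persistentStoreCoordinator accessor is an assumption about your Core Data stack, and convertToCoreDataStoreTask is the selector passed to the HUD above:

- (void)convertToCoreDataStoreTask
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // Each thread gets its own context; they share one coordinator.
    NSManagedObjectContext *threadContext = [[NSManagedObjectContext alloc] init];
    [threadContext setPersistentStoreCoordinator:[self persistentStoreCoordinator]]; // assumed accessor

    // ... insert objects and save in batches against threadContext ...

    NSError *error = nil;
    if (![threadContext save:&error]) {
        NSLog(@"background save failed: %@", error);
    }
    [threadContext release];
    [pool drain];
}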
I would suggest that you implement this delegate method and then watch what is going on with memory: when running in the simulator you have no memory constraints, but when running on the phone you do.
- (void)applicationDidReceiveMemoryWarning:(UIApplication *)application
{
    // Log or set a breakpoint here to see whether memory pressure precedes the crash.
}
I think I found the solution to my question.
I was starting all the heavy data crunching in viewDidLoad. When I changed it to start the crunching after a button tap in the app, it worked just fine.
Now it's just a question of where best to start the data crunching; any suggestions?