I have a sequence of images (a PNG sequence) that needs to be displayed in a short time. There are 31 PNGs in total, each about 45KB in file size. I have already loaded them with the following code:
imgArray = [[NSMutableArray alloc] init];
for (int i = 0; i <= 30; i++) {
    NSString *filename = [NSString stringWithFormat:@"img_000%.2d.png", i];
    UIImage *temp = [UIImage imageNamed:filename];
    [imgArray addObject:temp];
    // Note: imageNamed: returns an autoreleased object, so it must not be released here.
}
I use the following code to display the images:
CGImageRef image = [(UIImage *)[imgArray objectAtIndex:imgFrame] CGImage];
imgLayer.contents = (id)image;
if (imgFrame < 30) {
    imgFrame++;
} else {
    imgFrame = 0;
    imgLayer.hidden = YES;
    [imgTimer invalidate];
}
where imgLayer is a CALayer. (imgTimer is a repeating timer with interval 0.03s)
But I found that when I bring the images up, the animation is very laggy the first time. After the first appearance, subsequent runs have no problem.
Is this related to preloading the images? Or are my image files too big?
The reason for your lag is hard to tell without profiling data. But here is a trick that might help: join all your images into one large file, arranged as a grid (maybe 6x6 in your case, or 4x8). Then load this single file and crop each frame out for display (i.e. create a display image with the size of one tile and copy the tiles from the big image into it one after the other).
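A minimal sketch of that cropping step, assuming a hypothetical 4x8 sprite sheet named "sheet.png" with tiles laid out left to right, top to bottom (names and sizes are illustrative, not from the question):

```objc
// Crop each frame out of one large sprite sheet.
UIImage *sheet = [UIImage imageNamed:@"sheet.png"];
CGFloat tileW = 100.0f, tileH = 100.0f;  // size of one frame; adjust to your images
NSMutableArray *frames = [NSMutableArray array];
for (int row = 0; row < 8; row++) {
    for (int col = 0; col < 4; col++) {
        if (frames.count == 31) break;  // only 31 of the 32 grid cells are used
        CGRect tile = CGRectMake(col * tileW, row * tileH, tileW, tileH);
        CGImageRef cg = CGImageCreateWithImageInRect(sheet.CGImage, tile);
        [frames addObject:[UIImage imageWithCGImage:cg]];
        CGImageRelease(cg);
    }
}
```

The point of the trick is that the expensive PNG decode happens once, for the single big file, instead of 31 times.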
Images are decoded lazily, when they are first used and displayed.
If you use the Time Profiler in Instruments, you will see the lag you're experiencing. If you then zoom in on that lag and look at what is causing it, you will usually find that "copyImageBlockSetPNG" is the function taking the time, right before "inflate".
What you need is a way to create your images and force-decode them before you need them. That's another story, apparently.
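One common trick for that force-decode step (my suggestion, not from the question) is to draw each image into a throwaway bitmap context right after loading it, which triggers the PNG decompression up front:

```objc
// Force decompression of a lazily-decoded UIImage by drawing it once
// into a tiny offscreen context. Call this per image, right after loading.
- (void)forceDecode:(UIImage *)image {
    UIGraphicsBeginImageContext(CGSizeMake(1, 1));
    [image drawAtPoint:CGPointZero];  // this is what triggers the actual decode
    UIGraphicsEndImageContext();
}
```

Doing this for all 31 images at load time moves the "inflate" cost away from the first playback.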
I am working on a picture/video filter-effect project. For the effects I am using the GPUImage project. It works fine for pictures. Now I need to apply the same effect to videos too. I grab images from the video at 30 frames per second, so for a 1 min video I filter about 1800 images. For each image I allocate GPUImagePicture and GPUImageSepiaFilter objects and release them manually. But these allocations are not freed, and after processing about 20 seconds of video the application crashes with a memory warning.
Is it possible to allocate the GPUImagePicture and filter objects once and use them to filter all the images? If yes, how?
Please tell me; it would be very helpful.
Here is my code, showing what I am actually doing...
The getImage method is called by an NSTimer 30 times per second; it fetches an image from the app's documents directory and sends it for filtering to the - (void)imageWithEffect:(UIImage *)image method.
- (void)getImage
{
    float currentPlayBacktime = slider.value;
    int frames = currentPlayBacktime * 30;
    NSString *str = [NSString stringWithFormat:@"image%i.jpg", frames + 1];
    NSString *fileName = [self.imgFolderPath stringByAppendingString:str];
    if ([fileManager fileExistsAtPath:fileName])
        [self imageWithEffect:[UIImage imageWithContentsOfFile:fileName]];
}
- (void)imageWithEffect:(UIImage *)image
{
    GPUImagePicture *gpuPicture = [[GPUImagePicture alloc] initWithImage:image];
    GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];
    [gpuPicture addTarget:filter];
    [gpuPicture processImage];
    playerImgView.image = [filter imageFromCurrentlyProcessedOutputWithOrientation:0];
    [gpuPicture removeAllTargets];
    [filter release];
    [gpuPicture release];
}
Why are you processing a movie as a series of still UIImages? Why are you allocating a new filter for each image? That's going to be incredibly slow.
Instead, if you need to process a movie file, use a GPUImageMovie input, and if you need access to the camera, use a GPUImageVideoCamera input. Both of these are tuned to provide fast video feeds through the filter pipeline. There's a lot of wasted processing cycles in converting still frames to and from UIImages.
Also, only set up a filter once, and reuse it as necessary. There's significant overhead in the creation of a GPUImageFilter (setting up an FBO, etc.), so you only want to create it once and then attach inputs and outputs as they change.
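A minimal sketch of that movie-based setup, assuming GPUImage's GPUImageMovie and GPUImageView classes (the movie URL and the view are placeholders, not from the question):

```objc
// Set up the filter pipeline once and let GPUImage pull frames through it,
// instead of allocating a new picture + filter per frame.
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"]; // hypothetical file
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImageView *filterView = (GPUImageView *)self.view; // a GPUImageView in the view hierarchy

[movieFile addTarget:sepiaFilter];     // movie frames feed the filter
[sepiaFilter addTarget:filterView];    // filtered frames feed the on-screen view
[movieFile startProcessing];           // frames stay on the GPU end to end
```

The key design point is that the filter and its FBO are created once; only the frame data flows through per frame.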
I am creating an app in which I load 800 JPG images for a UIImageView. When different events occur, different sets of images are loaded for animation. On the first event it works fine, but on the second event it crashes on the iPhone. Any help?
My code is as below:
for (int aniCount = 1; aniCount < 480; aniCount++) {
    UIImage *frameImage = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"ani_soft_whimper_%i", aniCount] ofType:@"jpg"]];
    [_arr_ImagesSoftWhimper addObject:frameImage];
}
imageView.animationImages = _arr_ImagesSoftWhimper;
and I am getting a crash for this set of images.
It is possible that you are running out of memory. But this is just a guess. Please provide some code.
1. If you are downloading from a server, use an asynchronous call.
2. Check your image dimensions and file size; if the dimensions are large compared to the UIImageView's frame, first resize or crop the image and then set it on the image view.
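A minimal resize sketch for that second point (the method name and target size are my own; pick a size that matches your image view's frame):

```objc
// Downscale an image to a target size before handing it to a UIImageView,
// so UIKit doesn't keep full-resolution bitmaps in memory.
- (UIImage *)resizedImage:(UIImage *)image toSize:(CGSize)size {
    // Scale 0.0 uses the device's screen scale so the result stays sharp.
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resized;
}
```

With 800 frames, the memory saved per frame multiplies quickly, which is likely what makes the difference between the first and second animation set.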
I'm attempting to save some images to the camera roll; all the images are in an array.
if (buttonIndex == 1) {
    int done = 0;
    NSMutableArray *copyOfImages = [[NSMutableArray alloc] initWithArray:saveImagesToArray];
    while (!done) {
        if (copyOfImages.count == 0) {
            done = 1;
        }
        if (copyOfImages.count > 0) {
            [copyOfImages removeLastObject];
        }
        UIImage *image = [copyOfImages lastObject];
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }
}
Because I don't know how many images there can be, I use a while loop.
I'm testing this with 8 images; the array shows a count of 8, so that's good.
When I try saving the images, this comes up in the console:
-[NSKeyedUnarchiver initForReadingWithData:]: data is NULL
Out of the 8 images I'm trying, only 5 show up in the camera roll.
Any help would be great.
Thanks :)
Why are you calling [copyOfImages removeLastObject]?
Every time you go through that loop you are destroying the last object, which is strange because you haven't added it to the roll yet. Take out that line and see if your loop works.
Also, rather than using a while loop with manual bookkeeping, use the following pattern:
for (id object in array) {
    // do something with object
}
This will enumerate through the objects in the array, stopping when it reaches the end. Just be sure not to modify the array while you are doing this.
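Applied to the code in the question, the whole while loop collapses to something like this (a sketch using the asker's array name):

```objc
// Save every image in the array; no counters or removeLastObject needed.
for (UIImage *image in saveImagesToArray) {
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}
```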
I had this same issue and I resolved it by ensuring that the images are saved sequentially. I think there may be some kind of race condition going on.
Basically, I did:
UIImageWriteToSavedPhotosAlbum([self.images objectAtIndex:self.currentIndex], self,
                               @selector(image:didFinishSavingWithError:contextInfo:), nil);
Then in image:didFinishSavingWithError:contextInfo:, I increment currentIndex and try the next image, if there are any left.
This has worked for me in every case so far.
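A sketch of that completion chain, assuming hypothetical `images` and `currentIndex` properties as in the snippet above:

```objc
// Kick things off with [self saveNextImage]; each completion saves the next image.
- (void)saveNextImage {
    if (self.currentIndex >= [self.images count]) return;  // all done
    UIImageWriteToSavedPhotosAlbum([self.images objectAtIndex:self.currentIndex], self,
                                   @selector(image:didFinishSavingWithError:contextInfo:), nil);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) NSLog(@"Save failed: %@", error);
    self.currentIndex++;
    [self saveNextImage];  // serialize: only start the next write after this one finishes
}
```

Serializing the writes this way avoids hitting whatever internal limit causes some of the parallel saves to be dropped.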
I had the same exact problem. No matter how many pictures I added to the share activityController, a maximum of 5 were saved.
I found that the fix was to pass the actual UIImage objects and not URLs:
NSMutableArray *activityItems = [[NSMutableArray alloc] init];
for (UIImage *image in imagesArray) {
    [activityItems addObject:image];
}
UIActivityViewController *activityController = [[UIActivityViewController alloc] initWithActivityItems:activityItems applicationActivities:nil];
To expand on Bill's answer: UIImageWriteToSavedPhotosAlbum seems to do its writing asynchronously on some unknown thread, and there also appears to be a hidden limit on the number of images it can write at once. You can even tease out write-busy-type errors if you dig in deep.
Tons more info here:
Objective-C: UIImageWriteToSavedPhotosAlbum() + asynchronous = problems
I also agree with Bill that serializing your writes is the only reasonable/reliable answer I have seen.
I'm having trouble with the way iOS handles animated GIFs. I know that you cannot use an animated GIF directly in a UIImageView and have to supply the frames as a custom animation.
But..
I have a Java server that sends GIF images through a socket stream. The iOS (iPhone) app receives that stream and converts it into NSData. I've succeeded in capturing and displaying this image in a UIImageView, but as many of you already know, it only displays the first frame.
I also found some code to decode a GIF into separate images, but that code works from a GIF file, not from NSData.
Question: How do I convert the NSData into separate images and place them in an NSArray to use as an animation?
Note: the received NSData contains both an image and some text separated by a rare character, so the NSData looks like this: [image] [separator] [text].
Hope somebody can give me some pointers or samples to work with.
Thanks in advance; I will keep searching until you or I find an answer :)
Unless you're targeting devices before iOS 4, use ImageIO.
NSMutableArray *frames = nil;
CGImageSourceRef src = CGImageSourceCreateWithData((CFDataRef)data, NULL);
if (src) {
    size_t l = CGImageSourceGetCount(src);
    frames = [NSMutableArray arrayWithCapacity:l];
    for (size_t i = 0; i < l; i++) {
        CGImageRef img = CGImageSourceCreateImageAtIndex(src, i, NULL);
        if (img) {
            [frames addObject:[UIImage imageWithCGImage:img]];
            CGImageRelease(img);
        }
    }
    CFRelease(src);
}
I made a wrapper that also handles animation timing, based on the code from Anomie:
https://gist.github.com/3894888
OK, I have the following very simple animation composed of 25 frames in PNG format. Each frame is 320 × 360 and about 170KB in size. Here is the code I use.
.h:
IBOutlet UIImageView *Animation_Normal_View;
In Interface Builder I have a UIImageView with a referencing outlet pointing to this. All my images are named normal_000_crop.png, normal_001_crop.png, normal_002_crop.png,...
.m:
Animation_Normal = [[NSMutableArray alloc] initWithCapacity:25];
for (int i = 0; i < 25; i++)
{
    [Animation_Normal addObject:[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"normal_%03d_crop.png", i] ofType:nil]]];
}
Animation_Normal_View.animationImages = Animation_Normal;
Animation_Normal_View.animationDuration = 1; // seconds
Animation_Normal_View.animationRepeatCount = 0; // 0 = loops forever
[Animation_Normal release];
[self.view addSubview:Animation_Normal_View];
[Animation_Normal_View startAnimating];
On the simulator everything looks good: the visual animation starts as soon as startAnimating is issued.
But on an iPhone 3G running iOS 4.0.2, the visual animation starts a good 2 to 3 seconds after startAnimating is issued.
I have tried about every technique I could find in blogs or forums that should solve this, to no avail.
Any hints appreciated, even a completely different way to do a PNG-based animation.
Thanks.
This is a good question and I will address it here with a few thoughts.
First, you are loading a series of graphics that total around 4MB. This may take a moment, especially on slower (older) devices.
In the @interface block of your .h file you may want to declare two properties such as:
IBOutlet UIImageView *animationViewNormal;
NSMutableArray *animationViewNormalImages;
The first is the UIImageView that you already have (just renamed per best practices) and the second is a mutable array to hold the stack of images for the image view. Note that the name "normal" suggests a state; for clarification, are you loading additional sets of images for different states?
In your .m file's @interface (class extension), declare the following method:
- (void)loadAnimationImages;
This method will load the image stack into the mutable array declared in the header.
In the same .m file, in the @implementation, you'll want the following:
- (void)loadAnimationImages {
    for (NSUInteger i = 0; i < 25; i++) {
        NSString *imageName = [NSString stringWithFormat:@"normalCrop%03u", i];
        UIImage *image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:imageName ofType:@"png"]];
        if (image) {
            [animationViewNormalImages addObject:image];
        }
    }
}
As you can see, I renamed the PNG files from normal_%03u_crop to normalCrop%03u, as it is best practice to put the index at the end of the file name (most tools also output content this way). The loop loads an image, checks that it is valid, and then adds it to the "image stack" in the mutable array.
In the init() you'll need the following:
- (id)init {
...
animationViewNormalImages = [[NSMutableArray alloc] init];
...
}
This allocates the (animationViewNormalImages) mutable array to hold your stack of images for the image view.
We'll now move on to the code for the viewDidLoad():
- (void)viewDidLoad {
[super viewDidLoad];
...
[self loadAnimationImages];
[animationViewNormal setAnimationImages:animationViewNormalImages];
[animationViewNormal setAnimationDuration:1.1f];
[animationViewNormal setAnimationRepeatCount:0]; // 0=infinite loop
...
}
We load the stack of images into the mutable array then set the properties of our imageView with the image stack, duration and repeat count.
Next in the viewDidAppear() we start the image view animating:
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
...
[animationViewNormal startAnimating];
...
}
Once the image view is animating in an infinite loop, we need to handle leaving the view in viewWillDisappear():
- (void)viewWillDisappear:(BOOL)animated {
[super viewWillDisappear:animated];
...
[animationViewNormal stopAnimating];
...
}
Last (and this should really be the second thing we add to the .m file), we clean up the mutable array in dealloc():
- (void)dealloc {
...
[animationViewNormalImages release];
[super dealloc];
}
This is how we handle it and works for us, but then again, we're normally not loading 4MB of images into memory to animate.
The .PNG files are compressed when building the app, and I am not sure whether they are decompressed on the fly when loading the images out of the resource bundle. This is a boolean value in the Build Settings (COMPRESS_PNG_FILES).
For performance you may want to consider the following:
Mark opaque views as such: compositing a view whose contents are opaque requires much less effort than compositing one that is partially transparent. To make a view opaque, the contents of the view must not contain any transparency and the opaque property of the view must be set to YES.
Remove alpha channels from opaque PNG files: if every pixel of a PNG image is opaque, removing the alpha channel avoids the need to blend the layers containing that image. This simplifies compositing of the image considerably and improves drawing performance.
Furthermore, you may find it's better to make one large image containing all 25 frames (each offset by the width of an individual frame) and then load it once. Then, using Core Graphics with CGContextClipToRect, just offset the image context for each frame. This means more code, but may be faster than the standard stack method.
Lastly, you may want to consider converting the .PNG files into .PVR (PVRTC) files. More information can be found here: Apple Tech QA, Apple Docs, and Sample Code.
I hope this helps and please vote it up if it does.
Best,
Kevin
Able Pear Software
imageWithContentsOfFile: tends to take a long time to process, especially if there are lots of files (25 is kind of a lot) and/or they're big.
One thing you can try is to switch it out for imageNamed:, i.e.
[UIImage imageNamed:[NSString stringWithFormat:@"normal_%03d_crop.png", i]]
imageNamed: is generally much faster, but tends to cache images more or less indefinitely.
If loading the images into memory and keeping them around throughout the whole app is unacceptable, you may need to do some tweaky things to load them in at an appropriate time and to unload them after they've been used. That stuff is always tricky, and requires multithreading to not block the main UI while loading. But doable. And there are examples.
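A sketch of that background-loading approach using GCD (iOS 4+); the view and file names follow the question's snippet and are assumptions:

```objc
// Load and decode the frames off the main thread, then hand them
// to the image view on the main queue before starting the animation.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:25];
    for (int i = 0; i < 25; i++) {
        NSString *path = [[NSBundle mainBundle] pathForResource:
                          [NSString stringWithFormat:@"normal_%03d_crop", i] ofType:@"png"];
        UIImage *image = [UIImage imageWithContentsOfFile:path];
        if (image) [images addObject:image];
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit must only be touched on the main thread.
        Animation_Normal_View.animationImages = images;
        [Animation_Normal_View startAnimating];
    });
});
```

This keeps the main UI responsive while the 4MB of frames are read and decoded; the animation simply starts once they're ready.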
Load those images before calling the startAnimating method. I can suggest an easier trick: call startAnimating in applicationDidEnterForeground, but don't forget your UIImageView's alpha property. If you set imageView.alpha = 0.01, startAnimating runs but the user can't see the animation, so there will be no lag later when you actually show it.