I have a UIImageView that plays an array of PNG images as an animation. There are about 200 frames and the total size is about 8 MB. The animation works fine in the simulator and on an iPhone 4, but when I test on an iPhone 3GS, the app crashes because of the animation.
I've tried using UIImage imageNamed:, but I read that using imageWithData might be faster, so I have this:
NSString *imageName = [NSString stringWithFormat:@"fishBg_%i.png", i];
NSString *fileLocation = [[NSBundle mainBundle] pathForResource:imageName ofType:nil];
NSData *imageData = [NSData dataWithContentsOfFile:fileLocation];
[animationArray addObject:[UIImage imageWithData:imageData]];
What could my problem be? When I reduce the number of frames to about 100, the animation plays and the app doesn't crash, but when I bring the frame count back up to 200, it crashes. What's a better way to do this? The animation is a PNG sequence of transparent images, so I'm not sure I could convert it to a video and keep its transparency while placing other images under it.
Since we need to conserve as much memory as possible here (assuming that’s why you’re crashing), try managing memory more explicitly:
NSString *imageName = [[NSString alloc] initWithFormat:@"fishBg_%i.png", i];
NSString *fileLocation = [[NSBundle mainBundle] pathForResource:imageName ofType:nil];
[imageName release];
UIImage *theImage = [[UIImage alloc] initWithContentsOfFile:fileLocation];
[animationArray addObject:theImage];
[theImage release];
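If you build the whole array in one loop under manual reference counting, it can also help to drain an autorelease pool on every iteration, so the autoreleased path strings and other temporaries don't pile up while you load 200 frames. A rough sketch of that idea, assuming frames named fishBg_0.png onward and a count of 200 (adapt both to your assets):

NSMutableArray *animationArray = [[NSMutableArray alloc] initWithCapacity:200];
for (int i = 0; i < 200; i++) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    NSString *imageName = [[NSString alloc] initWithFormat:@"fishBg_%i.png", i];
    NSString *fileLocation = [[NSBundle mainBundle] pathForResource:imageName ofType:nil];
    [imageName release];

    UIImage *theImage = [[UIImage alloc] initWithContentsOfFile:fileLocation];
    if (theImage != nil) {
        [animationArray addObject:theImage];
        [theImage release];
    }

    [pool drain]; // free the autoreleased path string and other temporaries before the next frame
}

Even with this, 200 fully decoded frames is a lot of memory on a 3GS, so you may still need to shrink the frames or split the animation.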
I took a 13.3 MB image and added it to Xcode. I'm aware that, when compiling, Xcode performs some tricks to bring down the file size. To test how big the image was after being converted into data, I did this:
UIImage *image = [UIImage imageNamed:@"image.jpg"];
NSData *data = UIImageJPEGRepresentation(image, 1.0);
NSLog(@"length: %i", data.length);
The length I got back was 26758066. If that's in bytes, it reads to me as 26.7 MB. How did the image suddenly get so big? Is there another way to get the image in data form without going through UIImage first?
EDIT: Further testing shows that this works and gives a data length of ~13.3 MB, the expected amount:
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"jpg"];
NSData *data = [NSData dataWithContentsOfFile:filePath];
NSLog(@"length: %i", data.length);
What your code is doing is decompressing the image into memory and then recompressing as JPEG, with the highest quality ratio (q=1.0). That’s why the image is suddenly so big.
If you want to check up on the file as stored in the resource bundle, ask NSBundle for the full file path and use NSFileManager to read the file size. You can do the same thing by hand on your Mac, just take a look into the BUILD_PRODUCTS_DIR for your project.
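For example, something along these lines (a quick sketch using the file name from the question) reports the on-disk size without decoding the image at all:

NSString *filePath = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"jpg"];
NSDictionary *attrs = [[NSFileManager defaultManager] attributesOfItemAtPath:filePath error:NULL];
NSLog(@"on-disk size: %llu bytes", [attrs fileSize]); // fileSize is the NSDictionary file-attributes convenience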
I set the backgroundImageView to a very small PNG, and I want to save the background image view out as a picture. On the iPhone Simulator screen the edges of the image look normal, but the edges of the saved picture (savePic) are not sharp; you can see the difference between the on-screen image and savePic in the linked screenshots. Can anybody tell me how to save a high-definition picture?
- (void)viewDidLoad
{
    [super viewDidLoad];
    // _backImgView is 320*480
    self.backImgView.image = [UIImage imageNamed:@"Purple.png"];
    [self performSelector:@selector(savePic) withObject:nil afterDelay:0.1];
}

- (void)savePic
{
    BOOL isDir;
    NSString *dirPath = [NSHomeDirectory() stringByAppendingPathComponent:@"/Documents/Pic"];
    NSFileManager *fm = [NSFileManager defaultManager];
    if (![fm fileExistsAtPath:dirPath isDirectory:&isDir]) {
        BOOL bo = [fm createDirectoryAtPath:dirPath withIntermediateDirectories:YES attributes:nil error:nil];
        if (!bo) {
            NSLog(@"-->Create CardSlide Dir Fail!!!");
            return;
        }
    }

    UIGraphicsBeginImageContext(_backImgView.bounds.size);
    //UIGraphicsBeginImageContextWithOptions(_backImgView.bounds.size, _backImgView.opaque, 2.0);
    CGContextSetAllowsAntialiasing(UIGraphicsGetCurrentContext(), YES);
    [_backImgView drawRect:_backImgView.bounds];
    //[_backImgView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *bgImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSString *path = [NSString stringWithFormat:@"save_%@.jpg", @"demo"];
    NSString *jpgPath = [dirPath stringByAppendingPathComponent:path];
    BOOL isSucc = [UIImageJPEGRepresentation(bgImg, 1.0) writeToFile:jpgPath atomically:YES];
    NSLog(@"%@ write photo %@!", jpgPath, isSucc ? @"SUCC" : @"FAIL");
}
Starting from a small image, it is impossible to scale it up to a bigger one cleanly; you will always end up with blurry, noisy results. BUT if your image has stretchable, repetitive parts, you can stretch the interior of the image without creating a new one. There are two APIs in UIImage for this:
– resizableImageWithCapInsets: (iOS 5 only)
– stretchableImageWithLeftCapWidth:topCapHeight: (deprecated in iOS 5.0)
If you find the correct insets you can avoid creating a new image; it is basically the same system used to draw the bubbles in the Messages app. The balloon image is really small, but it has a stretchable area that maintains the aspect ratio of the borders.
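For example, a rough sketch of the iOS 5 API, reusing the Purple.png image from your code (the inset values are only illustrative; you would pick insets that frame the non-repeating borders of your own artwork):

UIImage *small = [UIImage imageNamed:@"Purple.png"];
UIImage *stretchable = [small resizableImageWithCapInsets:UIEdgeInsetsMake(10.0, 10.0, 10.0, 10.0)];
_backImgView.image = stretchable; // the area inside the insets is stretched/tiled to fill the image view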
Since your image is really simple, you could also consider drawing it with Quartz functions by creating a rect path; paths are "almost" resolution independent, you just need to scale them to the correct size.
I am new to this and have searched a lot this week. I am trying to set a background image from a file I have downloaded from a URL. I need to save the image for later, so that I can still show it even when I am not connected to the internet.
I have verified, by printing out the contents of the directory, that the file exists. This is one of many pieces of code I have tried to make the image appear in the background:
NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *documentPath = [docDir stringByAppendingFormat:@"/mainback.png"];
// If you go to the folder below, you will find those pictures
NSLog(@"%@", documentPath);
if ([[NSFileManager defaultManager] fileExistsAtPath:documentPath]) {
    NSData *data = [NSData dataWithContentsOfFile:documentPath];
    UIImage *image = [[UIImage alloc] initWithData:data];
    NSLog(@"WHAT - %f,%f", image.size.width, image.size.height);
    UIColor *background = [[UIColor alloc] initWithPatternImage:[UIImage imageNamed:documentPath]];
    self.view.backgroundColor = background;
    [background release];
}
The code finds the image in the directory, but I cannot set the image as the background. Please help.
Thanks
Use a UIImageView (see Apple's documentation) instead of doing what you are trying to do with UIColor.
It'll be a lot more straightforward and less problematic.
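Roughly like this, reusing the data you already load from documentPath (just a sketch; the frame and contentMode choices are up to you):

UIImage *image = [[UIImage alloc] initWithData:data]; // same data read from documentPath
UIImageView *backgroundView = [[UIImageView alloc] initWithImage:image];
backgroundView.frame = self.view.bounds;
backgroundView.contentMode = UIViewContentModeScaleAspectFill;
[self.view addSubview:backgroundView];
[self.view sendSubviewToBack:backgroundView]; // keep it behind the rest of the view's content
[backgroundView release];
[image release];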
I am trying to load a saved image, but when I check the UIImage it comes back as nil. Here is the code:
UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"/var/mobile/Applications/B74FDA2B-5B8C-40AC-863C-4030AA85534B/Documents/70.jpg" ofType:nil]];
I then check img to see if it is nil, and it is. Listing the directory shows the file, so what am I doing wrong?
You need to point to the Documents directory within your app, like this:
- (NSString *)applicationDocumentsDirectory
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
    return basePath;
}
Usage:
UIImage *img = [UIImage imageWithContentsOfFile:[NSString stringWithFormat:@"%@/70.jpg", [self applicationDocumentsDirectory]]];
First, you are using pathForResource: wrong; the correct way would be:
[[NSBundle mainBundle] pathForResource:@"70" ofType:@"jpg"]
The whole idea of bundling is to abstract the resource path so that it is always valid, no matter where in the system your app resides. But if all you want to do is load that image, I would recommend you use imageNamed:, since it automatically handles Retina (high-resolution) display detection on the iPhone for you and loads the appropriate resource "automagically":
UIImage *img = [UIImage imageNamed:@"70.jpg"];
To easily support both regular and Retina resolutions you would need two resources in your app bundle, 70.jpg and 70@2x.jpg, with the @2x resource having double the width and height.
Try loading a UIImage with:
[UIImage imageNamed:@"something.png"]
It looks for an image with the specified name in the application's main bundle. Also nice: it automatically chooses the Retina (xyz@2x.png) or non-Retina (xyz.png) version.
Your path simply won't work because your app is in a sandbox and you are trying to use an absolute path.
You should be using the following instead:
UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"70" ofType:@"jpg"]];
or you can use this, although it is slower than the above:
UIImage *img = [UIImage imageNamed:@"70.jpg"];
I am attempting to save and load a UIImage to and from the iPhone documents directory after the image is picked from the iPhone Photo Library. It is able to do so, but for some reason, when I load the image, it is rotated 90 degrees counterclockwise. Here are my saveImage and loadImage methods:
Save Image:
- (void)saveImage:(UIImage *)image {
    if (image != nil)
    {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                             NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *path = [documentsDirectory stringByAppendingPathComponent:
                          [NSString stringWithString:@"lePhoto.png"]];
        //NSData* data = UIImagePNGRepresentation(image);
        //[data writeToFile:path atomically:YES];
        [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
    }
}
Load Image:
- (NSData *)loadImage {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:
                      [NSString stringWithString:@"lePhoto.png"]];
    NSData *data = [[NSData alloc] initWithContentsOfFile:path];
    //UIImage *image = [[UIImage alloc] initWithData:data]; //my second try
    //UIImage* image = [UIImage imageWithContentsOfFile:path]; //first try
    //return image;
    return [data autorelease]; // autorelease so the caller gets an object it doesn't own; a release placed after return would never run
}
And I am now loading the image like this, to see if it would, for some crazy reason, solve my problem (it didn't, obviously):
UIImage* theImage = [[UIImage alloc] initWithData:[self loadImage]];
I have done some testing with two UIImageViews side by side: one on the left that doesn't save and load the image before display (it just straight-up shows the image once a photo is picked from the image picker), and one on the right that displays the image AFTER the save/load occurs. The one on the right is the only one rotated 90 degrees CCW; the non-save/load picture displays correctly. This only occurs on the actual iPhone; the simulator shows both images correctly. I have spent many, many hours attempting to figure it out, but to no avail. Any ideas as to what could be causing this?
EDIT: It should also be known that the image always says it's oriented up, even when it obviously isn't! It only rotates after ALL of the code has been run. I've tried doing a force redraw in between my code to see if that changes things, but it doesn't (though it is possible I'm even doing that wrong. Who knows, anymore).
Also, it does not rotate photos that are screenshots taken on the iPhone, only photos taken by the camera. [I haven't tried downloaded photos.] It's the weirdest thing...
And if you're wondering exactly how I'm pulling my picture, here's the method that will at least shed light on what I'm doing (it's not every method needed to do this function, obviously, but it gives insight on how I've written my code):
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    theimageView.image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    [self saveImage:[info objectForKey:@"UIImagePickerControllerOriginalImage"]];
    [picker dismissModalViewControllerAnimated:YES];
}
Any advice is greatly appreciated. And I do know this much: if there's a way to do something crazy to get some stupid, never-before-heard-of problem, chances are, I'll find it, apparently :P
Ok. I figured it out. Apparently when you save things with PNG representation, and not JPEG representation, the image orientation information is not saved with it. Because of this, every image loaded will default to showing orientation up. So, the easiest way is to do this:
NSData* data = UIImageJPEGRepresentation(image,0.0);
[data writeToFile:path atomically:YES];
instead of this:
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
And, of course, change all the picture names from .png to .jpg. The 0.0 in the code above is the JPEG compression quality (0.0 is maximum compression/lowest quality, 1.0 is best quality); note that JPEG does not keep alpha transparency. Hope this helps people in the future!
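If you need to stay with PNG (say, to keep transparency), one alternative, sketched here, is to redraw the image into a graphics context before saving; drawInRect: honors the orientation flag, so the saved pixels come out upright and no orientation metadata is needed:

UIImage *normalized = image;
if (image.imageOrientation != UIImageOrientationUp) {
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0.0, 0.0, image.size.width, image.size.height)]; // drawing applies the orientation
    normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
[UIImagePNGRepresentation(normalized) writeToFile:path atomically:YES];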
I assume you're using the Assets Library Framework to get the image. If so, you'll want to get the orientation of the image in your results block, and manipulate the UIImage to adjust. I have something like the following:
ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset* asset)
{
ALAssetRepresentation* rep = [asset defaultRepresentation];
_orientation = rep.orientation;
...
}
Aside: What a great type name, huh? Holy moly, does Objective-C need namespaces!
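Once you have the orientation from the representation, one option (a sketch, relying on the fact that ALAssetOrientation values are declared to match UIImageOrientation) is to rebuild the UIImage with it inside the same result block:

ALAssetRepresentation *rep = [asset defaultRepresentation];
UIImage *oriented = [UIImage imageWithCGImage:[rep fullResolutionImage]
                                        scale:[rep scale]
                                  orientation:(UIImageOrientation)[rep orientation]];
// 'oriented' now carries the orientation flag, so drawing it or saving it as JPEG keeps the photo upright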
Thanks so much...
This code has worked and saved several hours of work for me.
NSData* data = UIImageJPEGRepresentation(image,0.0);
[data writeToFile:path atomically:YES];
in the place of
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];