I am using the new Retina display support in iOS 4 in my iPhone application.
I added the higher-resolution images to my existing image folder, using the naming convention image@2x.png.
E.g., I am adding an image in the following way:
UIImageView *toolBarBg = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 88)];
NSString *toolBarBgImgPath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"bg_btm_bar.png"];
[toolBarBg setImage:[UIImage imageWithContentsOfFile:toolBarBgImgPath]];
I have the image named "bg_btm_bar@2x.png" in my image folder as well.
But when I run my application, it does not pick up the higher-res image.
I don't understand how to make the application use the higher-res image.
Please help me out!
Use imageNamed: instead of imageWithContentsOfFile:. On a Retina display, imageNamed: automatically substitutes the @2x version of the file.
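Applied to the code above, the fix is a one-line change (a minimal sketch reusing the question's file name):
UIImageView *toolBarBg = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 88)];
// imageNamed: searches the main bundle and picks up bg_btm_bar@2x.png
// automatically on a Retina display.
[toolBarBg setImage:[UIImage imageNamed:@"bg_btm_bar.png"]];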
I'm sorry for my English.
I'm new to iPhone development, and something strange is happening to me.
I have a set of JPG images to show in a table view. When I test the app in the iPhone Simulator everything is OK and works properly, but when I build and run the same code on an iPhone test device, the same images aren't displayed.
Another strange behavior: a set of PNG images, instead of JPG, is shown perfectly in the Simulator as well as on the iPhone test device.
Can anyone suggest a solution?
I read the name of the image to load from a JSON file. This is the code that I use to show the image:
UIImageView *immaginePiadina = [[UIImageView alloc] initWithFrame:immagineValueRect];
[immaginePiadina setImage:[UIImage imageNamed:[item objectForKey:@"immagine"]]];
where [item objectForKey:@"immagine"] is an element of my JSON file, like this:
"nome": "66",
"immagine": "pianetapiadabufalavesuviani",
"prezzo": "€ 7,50",
"nomeingrediente": [
"bufala",
"vesuviani"
]
As you can see, I refer to the image with only the name of the file, without the file extension. I did it this way to support Retina images; is that wrong?
I can rule out a case-sensitivity mistake in the name, because the PNG set works properly.
Thanks a lot!!
There are possibly two separate issues here that need separating; here is how to isolate and solve your problem.
First, write a method using NSFileManager that, given a file path, verifies the file exists and has a size greater than zero. Insert a call to it everywhere you fail to open an image. If you use imageNamed:, get the path to the bundle and construct the full path yourself. If this method fails to find an image, fix that first.
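A minimal sketch of such a check (the helper name verifyFileAtPath: is my own, not part of any framework):
// Returns YES only if a file exists at the given path and is non-empty.
- (BOOL)verifyFileAtPath:(NSString *)path
{
    NSFileManager *fm = [NSFileManager defaultManager];
    if (![fm fileExistsAtPath:path]) {
        NSLog(@"Missing file: %@", path);
        return NO;
    }
    NSDictionary *attrs = [fm attributesOfItemAtPath:path error:NULL];
    if ([attrs fileSize] == 0) {
        NSLog(@"Zero-length file: %@", path);
        return NO;
    }
    return YES;
}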
Second, image decoding is different on the simulator and the device (the simulator uses the full OS X libraries). So take one image that fails to open and copy it somewhere. Open it in Preview and export it under the same name but in PNG+alpha format. Change your code to expect a PNG instead of a JPG, and retest. Keep using the first method to ensure the file is actually there.
Once you get one success you can try other options, such as Preview-exported JPEGs. The reason this should work is that all of these image formats permit a huge range of encoding options, not all of which iOS supports.
If the current format turns out to be the problem, you can script the conversion using the "sips" program, which is what Preview uses.
I solved my problem using the following code:
NSString *path = [[NSBundle mainBundle] pathForResource:[item objectForKey:@"immagine"] ofType:@"jpg"];
NSLog(@"%@", path);
UIImage *theImage = [[UIImage alloc] initWithContentsOfFile:path];
I don't know the reason, but this way I have no problem displaying JPGs on my iPhone test device. Maybe it's because NSBundle also specifies the extension of the file.
Thanks @David-h for pointing me in the right direction.
Check the extensions of your images. If you write .PNG, it is OK in the simulator but not on the device, because the device's file system is case-sensitive.
I am using imageNamed: to set images in my app. The app correctly picks up the @2x images when I specify the image name directly, i.e.
[UIImage imageNamed:@"abc.png"]; // works fine on iPhone 4 and 3G(S)
But if I specify the image name in a variable, the @2x images are not picked up, i.e.
NSString *imageNameVar = @"abc.png";
[UIImage imageNamed:imageNameVar]; // does not work
Can someone please help me out?
Thanks.
Try NSString *imageNameVar = [NSString stringWithString:@"abc.png"];
What is the best pattern for displaying images on both iOS 3 and iOS 4?
For example, here's code for a custom activity animation:
activityImageView.animationImages = [NSArray arrayWithObjects:
    [UIImage imageNamed:@"s01"],
    [UIImage imageNamed:@"s02"],
    [UIImage imageNamed:@"s03"],
    [UIImage imageNamed:@"s04"],
    [UIImage imageNamed:@"s05"],
    nil];
On low-res and high-res devices running iOS 4, the images load as expected. On iOS 3 the images cannot be found, because there the .png extension cannot be omitted.
What's the right way to deal with this situation? Do I need to scatter hacky conditionals throughout my code, or is there a cleaner method?
Is something from Matt Gallagher's article a good approach?
http://cocoawithlove.com/2010/07/tips-tricks-for-conditional-ios3-ios32.html
You still get the automatic '@2x' loading behavior in iOS 4 even if you use the file extension, so [UIImage imageNamed:@"s01.png"] is fine (it will find "s01@2x.png" if you're running on a Retina display with OS 4), plus it's backwards compatible with iOS 3, where it will always find the low-res version of the file.
See the "Loading Images into Your Application" section here.
Is there any way to test the iPhone camera in the simulator without having to deploy on a device? This seems awfully tedious.
There are a number of device-specific features that you have to test on the device, but doing so is no harder than using the simulator. Just build a debug target for the device and leave it attached to the computer.
List of actions that require an actual device:
the actual phone
the camera
the accelerometer
real GPS data
the compass
vibration
push notifications...
I needed to test some custom overlays for photos. The overlays needed to be adjusted based on the size/resolution of the image.
I approached this in a way similar to the suggestion from Stefan: I decided to code up a "dummy" camera response.
When the simulator is running, I execute this dummy code instead of the standard captureStillImageAsynchronouslyFromConnection:.
In this dummy code, I build up a "black photo" of the necessary resolution and then send it through the pipeline to be treated like a normal photo, essentially providing the feel of a very fast camera.
// Match the still-image resolution of the device camera for the current orientation.
CGSize sz = UIDeviceOrientationIsPortrait([[UIDevice currentDevice] orientation]) ? CGSizeMake(2448, 3264) : CGSizeMake(3264, 2448);
UIGraphicsBeginImageContextWithOptions(sz, YES, 1);
[[UIColor blackColor] setFill];
UIRectFill(CGRectMake(0, 0, sz.width, sz.height));
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Encode the dummy photo as JPEG data, just as a real capture would be delivered.
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
The image above is equivalent to the 8 MP photos that most current devices produce. Obviously, to test other resolutions you would change the size.
I never tried it myself, but you can give it a try!
iCimulator
Nope (unless they've added a way to do it in 3.2, haven't checked yet).
I wrote a replacement view to use in debug mode. It implements the same API and makes the same delegate callbacks. In my case I made it return a random image from my test set. Pretty trivial to write.
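A minimal sketch of such a stand-in, assuming a pre-ARC project (the class name and the test-image naming scheme are my own inventions):
#import <UIKit/UIKit.h>

// Debug-only stand-in for the camera picker: fires the same delegate
// callback as UIImagePickerController, but with a random bundled image.
@interface FakeCameraController : UIViewController
@property (nonatomic, assign) id<UIImagePickerControllerDelegate, UINavigationControllerDelegate> delegate;
@end

@implementation FakeCameraController
@synthesize delegate;

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // Pick a random image from a bundled test set, e.g. test0.png ... test4.png.
    NSString *name = [NSString stringWithFormat:@"test%u.png", arc4random() % 5];
    UIImage *image = [UIImage imageNamed:name];
    NSDictionary *info = [NSDictionary dictionaryWithObject:image
                                                     forKey:UIImagePickerControllerOriginalImage];
    [self.delegate imagePickerController:(UIImagePickerController *)self
           didFinishPickingMediaWithInfo:info];
}
@end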
A common reason for needing the camera is to take screenshots for the App Store.
Since the camera is not available in the simulator, a good trick (the only one I know) is to resize your view to the size you need, just long enough to take the screenshots. You will crop them later.
Of course, you need to have the device with the bigger screen available.
The iPad is perfect for testing layouts and taking snapshots for all devices.
Screenshots for the iPhone 6+ will have to be stretched a little (scaled by 1.078125, not a big deal…).
A good quick reference for iOS device resolutions: http://www.iosres.com/
Edit: In a recent project that uses a custom camera view controller, I replaced the AV preview with a UIImageView in a target that I only use to run in the simulator. This way I can automate screenshots for iTunes Connect upload. Note that the camera control buttons are not in an overlay, but in a view over the camera preview.
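A minimal sketch of that kind of swap, assuming ARC and using the TARGET_IPHONE_SIMULATOR macro (captureSession and the placeholder image name are my own assumptions):
#import <AVFoundation/AVFoundation.h>
#import <TargetConditionals.h>

- (void)setUpPreviewInView:(UIView *)container
{
#if TARGET_IPHONE_SIMULATOR
    // Simulator: show a canned still where the live preview would be.
    UIImageView *fakePreview = [[UIImageView alloc] initWithFrame:container.bounds];
    fakePreview.image = [UIImage imageNamed:@"fake_camera_preview.png"];
    [container addSubview:fakePreview];
#else
    // Device: attach the real capture preview layer.
    AVCaptureVideoPreviewLayer *preview =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    preview.frame = container.bounds;
    [container.layer addSublayer:preview];
#endif
}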
The @Craig answer below describes another method that I found quite smart. Unlike mine, it also works with a camera overlay.
Just found a repo on GitHub that helps simulate camera functions in the iOS Simulator with images, videos, or your MacBook camera.
Repo
I'm building an app that includes a number of image sequences (5 sequences of about 80 images each). It runs nicely in the iPhone Simulator, but causes my iPhone to reboot when I test it on the device. By the way, each PNG image is about 8 KB in size.
Has anyone successfully built a similar app?
Am I using too many resources for the iPhone to handle?
Anyone?
UPDATE:
Thanks to all for your answers! I've modified my code to use [UIImage imageWithContentsOfFile:] instead of [UIImage imageNamed:].
However, I'm still unable to prevent the app from crashing my iPhone.
(Please note that my PNGs are not that big: about 400x400 px / 8 KB.)
Does anyone have any suggestions?
Here's my code:
// code snippet:
myFrames = [[NSMutableArray alloc] initWithCapacity:maxFrames];
NSMutableString *curFrame;
num = 0;

// loop (maxFrames = 80)
for (int f = 1; f < maxFrames + 1; f++)
{
    // build a zero-padded frame name, e.g. tName001 ... tName080
    curFrame = [NSMutableString stringWithString:tName];
    if (f < 10) [curFrame appendString:[NSString stringWithFormat:@"00%i", f]];
    else if (f > 9 && f < 100) [curFrame appendString:[NSString stringWithFormat:@"0%i", f]];
    else [curFrame appendString:[NSString stringWithFormat:@"%i", f]];

    // imageWithContentsOfFile: returns an autoreleased image
    UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:curFrame ofType:@"png"]];
    if (img) [myFrames addObject:img];
}

// animate the images!
self.animationImages = myFrames;
self.animationDuration = (maxFrames * .05); // seconds
[self startAnimating];
The best way to find out is to run the application under Instruments using the Leaks or ObjectAlloc instrument. If you see an upward trend that keeps rising, you might have a leak.
If you're using [UIImage imageNamed:], be aware that it pre-caches an optimized version of the image, which takes up more memory than [UIImage imageWithContentsOfFile:]. Additionally, before iPhone OS 3.0, the cache created by [UIImage imageNamed:] didn't get released when there was a memory warning.
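For reference, the two loading styles being contrasted, as a minimal sketch (the frame name is illustrative):
// Cached by the system; before iPhone OS 3.0 the cache was not
// purged on memory warnings.
UIImage *cached = [UIImage imageNamed:@"frame001.png"];

// Not cached: decoded fresh from disk each time it is called.
NSString *path = [[NSBundle mainBundle] pathForResource:@"frame001" ofType:@"png"];
UIImage *uncached = [UIImage imageWithContentsOfFile:path];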
The current-generation iPhone only has 128 MB of RAM, some of which is used by the OS itself. A 320x480 image, fully uncompressed with an alpha channel, can take 614 KB (320 × 480 × 4 bytes). If you have 400 unique full-screen images, that's well over 128 MB of RAM, assuming they are all loaded and cached uncompressed.
The number one reason why an app would run in the simulator but crash on the phone is memory.
In the iPhone Simulator, AFAIK, memory is not limited to 128 MB, while on the iPhone the app is terminated once it reaches that limit. So check your memory usage in the simulator. You have to change the way you are loading the images and/or check for leaks. Also check whether you're getting low-memory warnings, by implementing didReceiveMemoryWarning in your view controllers.
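A minimal sketch of such a check in a view controller (the log message is my own):
// Called by the system when memory runs low; logging it helps correlate
// warnings with what the app was doing at the time.
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    NSLog(@"Low-memory warning received");
}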
I've seen apps run in the simulator but not on the phone because of improperly formatted PNGs (even a single improperly formatted image can cause this crash). Check that the format of your images matches that of the PNG files provided by Apple in its example apps.
That being said, 400 full-screen images would easily cause the app to run out of memory, since in memory they occupy far more than 8 KB each. I'm not sure how big those images are, but if they're all in memory at once, they will need to be very, very small on the iPhone.
As the first answer to your question states, while your PNGs may take up only 8 KB on disk, that is the compressed on-disk form. When an image is loaded into memory, it is decompressed and much larger than 8 KB: at 32 bits per pixel, a 400x400 image takes 640 KB.
Even without the alpha channel, you're looking at 480 KB. At 480 KB × 80 frames, that is 38.4 MB per sequence, which is definitely more memory than the iPhone can give your app at once. Here is an article about some of the troubles of obtaining a substantial amount of memory from iPhone OS.