Has anyone seen the problem of [UIColor initWithPatternImage:] ignoring the alpha values of a PNG? If so, what's the best fix?
I am using a PNG as a background layer for my array of button objects; the image has per-pixel alpha values baked in and is meant to serve as a texture background. It loads fine as a pattern/texture color, but every area that should be transparent comes up as opaque black.
Getting the right alpha values matters so that the button images show correctly. The button frames do not include the alpha shadows from the background, since that is not the "clickable" portion of the button. My button objects' images and background images also use transparency, so there really needs to be a clear background directly behind each button to let the true current color show through (the lowest-layer UIView will have its background color set to the current user's selected color). Setting a single alpha value on the UIView layer containing this texture does not work for my needs either.
Any help would be appreciated. My current workaround is a fully blown, repeatedly programmed layout of several UIImageViews using the PNG, instead of a single UIView with the pattern fill.
Here is a snippet of code, but it's pretty standard for turning a UIImage into a UIColor for use as a pattern/texture color:
UIView *selectorView = [[UIView alloc] initWithFrame:CGRectMake(0,0,320,320)];
UIColor *background = [[UIColor alloc] initWithPatternImage:[UIImage imageNamed:@"SelectorViewBackground.png"]];
selectorView.backgroundColor = background;
[mainView addSubview:selectorView]; // pattern background layer. Add UIButtons on top of this selectorView layer
[self addSubview:mainView]; // current user selected color set in mainView.
[selectorView release];
[background release];
I had the same problem setting a background with some transparency on a UIView.
This is how I solved it:
theView.backgroundColor = [UIColor clearColor];
theView.layer.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"the_image_with_transparancy.png"]].CGColor;
This is probably related to:
Tiled Background Image: Can I do that easily with UIImageView?
Basically, try setting:
[view setOpaque:NO];
[[view layer] setOpaque:NO];
No, I've never had an issue with this. I use code like the above all the time in apps (though often in conjunction with a layer instead of a view, which shouldn't make a difference for transparency being recognized) and transparency has always worked fine.
I'd look into your PNG file. I've noticed iOS can be finicky with certain PNG options/types (like an 8-bit PNG with 8-bit transparency). Make sure your PNG is saved as 24-bit color with 8-bit transparency (32 bits total).
Also, a stupid question, but have you verified there isn't anything black in the view/layer hierarchy behind your PNG? Sometimes it's the stupid things like that.
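If it helps to rule out the asset itself, here is a minimal sketch (it reuses the file name from the question; everything else is just illustrative) that checks whether the loaded image actually carries an alpha channel, via CGImageGetAlphaInfo:
// Check whether UIKit sees an alpha channel in the loaded pattern image.
UIImage *patternImage = [UIImage imageNamed:@"SelectorViewBackground.png"];
CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(patternImage.CGImage);
BOOL hasAlpha = (alphaInfo != kCGImageAlphaNone &&
                 alphaInfo != kCGImageAlphaNoneSkipFirst &&
                 alphaInfo != kCGImageAlphaNoneSkipLast);
NSLog(@"Pattern image has alpha channel: %@", hasAlpha ? @"YES" : @"NO");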
For those who might need the workaround code where the background pattern is laid out as rows in a UIScrollView, here it is (adjusted to remove confidentiality concerns; it should work if the variables are set properly before the call).
Note that there are probably ways to avoid allocating a separate UIImageView for every row (to save memory or load time), but time-to-market is my No. 1 driver right now. Enjoy the journey :)
UIImageView *selectorView;
for (int i = 0; i < numRows; ++i) {
selectorView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"SelectorViewBackground.png"]];
selectorView.frame = CGRectMake(0, i * patternHeight, patternWidth, patternHeight);
[mainView addSubview:selectorView];
[selectorView release];
}
I've literally plowed through hundreds of Google searches :(. I can't seem to figure out what I'm doing wrong when I create an image in Photoshop (960 x 600, minus 40 for the status bar). The image comes out wrong instead of looking like the original full-size artwork.
(Note the attached screenshots are crappy thumbnail versions, not the actual size :P. The size is as stated above.)
This is my code:
self.view.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"MenuBkground.png"]];
Am I doing something wrong when I make the image? Is it in the code? Any ideas?
You're using colorWithPatternImage, which does basically what it says: the image repeats itself if the space is not entirely covered by it. If you want a true background image, you should add the image as a subview instead.
UIImage *image = [UIImage imageNamed:@"MenuBkground.png"];
UIImageView *background = [[UIImageView alloc] initWithImage:image];
[self.view addSubview: background];
Another way, if you're using Interface Builder:
Drag an image view onto your view controller.
Assign it MenuBkground.png in the inspector (the first drop-down box).
I have a UITableView with three UIImageView views per cell and three cells visible at a time (for a total of nine UIImageView views on screen). Think of it as a bookshelf. Sometimes I can have as many as 500 books.
I've added a shadow to each UIImageView with code like this:
UIImageView *itemImageView = [[UIImageView alloc] initWithFrame:CGRectMake(25, 7, 65, 75)];
itemImageView.contentMode = UIViewContentModeScaleAspectFit;
itemImageView.tag = 6;
itemImageView.layer.shadowColor = [UIColor blackColor].CGColor;
itemImageView.layer.shadowOffset = CGSizeMake(3, -1);
itemImageView.layer.shadowOpacity = 0.7;
itemImageView.layer.shadowRadius = 3.0;
itemImageView.clipsToBounds = NO;
[cell.contentView addSubview:itemImageView];
When I add the shadow code seen above, scrolling performance is completely killed and becomes choppy. Each image has a different rect, so the shadow has to be created for each item as it scrolls. Does anyone have any tips on how to add shadows to my images in a UITableView without running into this issue?
You may see a performance improvement if you add
itemImageView.layer.shadowPath =
[UIBezierPath bezierPathWithRect:itemImageView.layer.bounds].CGPath;
But in general, layer operations like this will kill performance in a table view. I experienced exactly the same issue, and we just eliminated the shadow effect. It wasn't worth it.
You should review the WWDC videos on Core Animation best practices. You can request that a rasterized copy of the layer (shadow included) be cached in memory. Core Animation is definitely fast enough to render these shadows on the fly without falling back to a pre-rendered image.
Example:
itemImageView.layer.shouldRasterize = YES;
I also up-voted the answer regarding UIBezierPath. This is mentioned in the best-practices material as well: setting the shadow path of a CALayer is a huge performance boost. You can also create some interesting special effects by manipulating the shadow path.
Shadows are expensive and will kill your performance.
A better approach is to render the shadowed image in the background, cache/save it, and display it in the view when it's ready.
Edit: You may wish to look at the Core Graphics / CGImage routines. Specifically, CGContextSetShadowWithColor will draw a shadow for you.
http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Reference/CGContext/Reference/reference.html
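As a rough illustration of that approach, here is a minimal sketch (the method name, padding, and shadow values are made up for the example, not from the original post) that bakes the shadow into a new image with CGContextSetShadowWithColor so it can be rendered once, cached, and reused while scrolling:
// Pre-render an image with a baked-in shadow using Core Graphics.
- (UIImage *)shadowedImageForImage:(UIImage *)image {
    CGFloat padding = 6.0f; // extra room around the image for the shadow
    CGSize size = CGSizeMake(image.size.width + 2 * padding,
                             image.size.height + 2 * padding);

    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0f);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // The shadow is drawn into the bitmap, so scrolling never recomputes it.
    CGContextSetShadowWithColor(context, CGSizeMake(3, -1), 3.0f,
                                [UIColor colorWithWhite:0 alpha:0.7].CGColor);
    [image drawInRect:CGRectMake(padding, padding,
                                 image.size.width, image.size.height)];

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}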
I have the same problem adding shadows to labels inside a UITableViewCell. I tried laying out the elements inside cellForRowAtIndexPath: only when the cell is first created:
if (cell == nil) {
    // lay out the cell's subviews here
}
This optimized the scrolling a little bit, but it's still quite choppy.
To keep your picture quality when you activate shouldRasterize, you should also set the rasterizationScale:
aLabel.layer.shouldRasterize = YES;
aLabel.layer.rasterizationScale = [[UIScreen mainScreen] scale];
Maybe somebody has some ideas on how to optimize the code to get normal iOS scrolling performance. Thanks.
Have you enabled cell reuse? If not, the cells are rebuilt every time the view changes (e.g., during scrolling), which certainly eats a lot of performance.
Check out my same question here: App running slowly because of UIImageViews
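For reference, a minimal reuse pattern looks something like this (the identifier and layout steps are placeholders, not the asker's actual code):
// Standard cell reuse: create subviews once, then only reconfigure them per row.
static NSString *cellIdentifier = @"BookCell";
UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:cellIdentifier];
if (cell == nil) {
    cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                   reuseIdentifier:cellIdentifier] autorelease];
    // Create and add the image views / labels here, exactly once per cell.
}
// Update the existing subviews with this row's content here.
return cell;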
I would use this code:
imageView.layer.masksToBounds = NO;
UIBezierPath *path = [UIBezierPath bezierPathWithRect:imageView.bounds];
imageView.layer.shadowPath = path.CGPath;
OK, so I have this code, which lets me put a background image in:
I would love to know how to size this so that on the iPhone 4 I get a 320x480 view while still looking nice with an image of 570x855.
self.view.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"background_stream-570x855.jpg"]];
I have tried this:
UIImageView *image = [[UIImageView alloc]
initWithImage:[UIImage imageNamed:@"background_stream-570x855.jpg"]];
[self.view sendSubviewToBack:streamBG];
image.frame = CGRectMake(0, 0, 320, 480);
This works; however, the problem is that it's behind the view, so I can't see it. I could make the view clear, but it has objects on it that need to be displayed.
Any help would be much appreciated.
There are multiple options for putting a view at the desired position in the view hierarchy:
[self.view sendSubviewToBack:streamBG];   // moves the subview behind all of its siblings
[self.view bringSubviewToFront:streamBG]; // moves the subview in front of all of its siblings
[self.view insertSubview:streamBG atIndex:yourDesiredIndex]; // inserts it at the desired index
This is not a direct answer to your question, but it may help you put your image view at the desired position.
Hope it helps.
To answer the sizing part of your question: you need two different images if you want the full resolution of the Retina display (iPhone 4). Provide a 320x480 image (i.e. myImage.png) for older iPhones and one at twice the resolution, 640x960, for the iPhone 4. Use the exact same name for the high-res version but add "@2x" to the end of the name (i.e. myImage@2x.png). In your code you only ever refer to the base name (i.e. myImage.png); the app will choose the correct image based on the hardware it's running on. That will give your app the best experience.
On the other part about the background view visibility: you could set the background color to clear ([UIColor clearColor]) on the view that is blocking it. That would leave that view's content visible while the view itself becomes transparent. Alternatively, you could insert the background at a specific index, as @Jennis has suggested, instead of forcing it to the back.
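Putting those two suggestions together, a minimal sketch might look like this (contentView stands in for whatever view is currently blocking the background; it is not a name from the question):
// Insert the background image view at the very back (index 0) and let the
// view above it show through with a clear background.
UIImageView *background = [[UIImageView alloc]
    initWithImage:[UIImage imageNamed:@"background_stream-570x855.jpg"]];
background.frame = self.view.bounds;
background.contentMode = UIViewContentModeScaleAspectFill;
[self.view insertSubview:background atIndex:0];
[background release];

contentView.backgroundColor = [UIColor clearColor]; // placeholder for the view in front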
If I create a PNG image in Photoshop and lower the opacity so it's 85% opaque, how can I maintain that same level of transparency when I add it to my iOS app?
I'm trying to use this image as the background of a UILabel, but the background shows up fully opaque in my iOS app. Here's my code...
[lbl setBackgroundColor:[UIColor colorWithPatternImage:[UIImage imageNamed:@"labelBackground.png"]]];
Am I missing something or is this not possible?
Thanks so much for your wisdom!
If I were you, I'd put a UIImageView containing the image behind your UILabel, then make sure the backgroundColor of both the UILabel and the UIImageView is set to [UIColor clearColor]. That's how I do it, anyway.
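A minimal sketch of that arrangement (the frame handling is an assumption; lbl is the label from the question):
// Put the semi-transparent PNG in an image view behind a clear label.
UIImageView *backgroundImageView = [[UIImageView alloc]
    initWithImage:[UIImage imageNamed:@"labelBackground.png"]];
backgroundImageView.frame = lbl.frame;
backgroundImageView.backgroundColor = [UIColor clearColor];
[lbl.superview insertSubview:backgroundImageView belowSubview:lbl];
[backgroundImageView release];

lbl.backgroundColor = [UIColor clearColor]; // lets the PNG's 85% opacity show through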
Have you tried lbl.opaque = NO;?
From my experience, iOS preserves opacity for PNG images.
I think I have an idea of what MAY be wrong (and pardon me for making any wrong assumptions)...
In Photoshop, when saving a PNG image, there's an option to "Save Transparency" or something like that. Make sure it is checked before you save the PNG.
If this is not the problem, you can always set the view's alpha directly:
yourImageView.alpha = 85.0f / 100.0f; // UIView exposes layer opacity as the alpha property
Let me know if this solves your problem :)
I ran into some background issues just like you, but I learned that most UIView subclasses have a backgroundView property which is accessed like this:
[aView backgroundView];
or
aView.backgroundView = someView;
UIImageViews keep the opacity of images. With these two things in mind you can just do:
UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"YourImage.png"]];
myLabel.backgroundColor = [UIColor clearColor];
myLabel.backgroundView = imageView;
I hope you find this useful.
With regards to the iPhone SDK:
I have a 512 x 512 pixel .png image, and I want to display it within a 200 x 200 pixel UIImageView WITH cropping (at native resolution, without zooming, fitting, or rotation). Is there a way to determine the image's coordinates that are visible inside the UIImageView?
For example, if the image is exactly centered in the UIImageView, then I want to know which image coordinates correspond to the UIImageView's (0,0) and (200,200) points; in that case they would be (156,156) and (356,356), respectively.
I plan on implementing "pan" and "zoom" functions at some point, but I want to know how to address this issue first. The UIImageView class reference did not seem helpful... I was hoping to avoid using a UIView, but if that is necessary I will go that route.
Thanks!
You're right about what you've seen in the UIImageView reference: there is no easy way to do this with UIImageView alone. You would need to take into account things like the contentMode of the UIImageView, or limit yourself to calculating the position for a single UIViewContentMode.
However, when you eventually implement pan and zoom, you'll be using a UIScrollView with a UIImageView as its subview. At that point, the sanest approach is to size your UIImageView to match whatever image it is displaying (which can be dynamic). You can then determine where the image is by determining where the image view is, which you can do much more simply and with just a bit of math.
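For example, once the image view's frame matches the image size, a point visible in the scroll view maps back to image coordinates roughly like this (a sketch; it assumes myScrollView and the image view are set up as described, with the image view's origin at (0,0) in the content area):
// Map the scroll view's visible region back to image pixel coordinates.
CGFloat zoom = myScrollView.zoomScale;
CGPoint visibleOrigin = myScrollView.bounds.origin; // equivalent to contentOffset
CGPoint imageTopLeft = CGPointMake(visibleOrigin.x / zoom,
                                   visibleOrigin.y / zoom);
CGPoint imageBottomRight = CGPointMake((visibleOrigin.x + myScrollView.bounds.size.width) / zoom,
                                       (visibleOrigin.y + myScrollView.bounds.size.height) / zoom);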
Thanks for your help. Working out the pan/zoom features became trivial after implementing the UIScrollView. It let me determine the coordinates I needed from the myScrollView.bounds.origin.x and myScrollView.bounds.origin.y values... zooming still requires some calculation, as you stated above, but I am on the right track now.
Here is the code I used to set up the UIScrollView (based on multiple sources available on the net and this site; Apple's ScrollViewSuite sample was particularly helpful). It goes in viewDidLoad; hopefully others can find it useful:
UIImageView *tempImageView = [[UIImageView alloc] initWithImage:[imageArray objectAtIndex:0]]; //imageArray is an array/stack of images...
[self setMyImage:tempImageView]; //myImage is of type UIImageView
[myImage setTag:100]; //set this to an 'arbitrary' value so that when I update the scrollview I can remove the current image...
[tempImageView release];
myScrollView.contentSize = CGSizeMake(myImage.frame.size.width, myImage.frame.size.height);
myScrollView.maximumZoomScale = 10.0;
myScrollView.minimumZoomScale = 0.15;
myScrollView.clipsToBounds = YES;
myScrollView.delegate = self;
[myScrollView addSubview:myImage];
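One thing worth adding that is not in the snippet above: for pinch-zooming to actually work, the scroll view's delegate has to return the view to zoom. A minimal sketch, assuming the myImage property used above:
// UIScrollViewDelegate: tell the scroll view which subview to scale when zooming.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return myImage;
}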