I have defined a UIImageView in my nib. After the app launches, I display an image in that UIImageView. Maybe I am going about this the wrong way, but what I want to do is the following:
The image I load into the view is bigger than the view itself. I want it displayed at its original size, with the overflow hidden. Then I want to slowly move that image around inside the view in random directions.
Maybe you know HTML div containers with background images: there you can set the position of the background image and change that position with JavaScript. I need something similar on the iPhone.
Maybe a UIImageView is not the right thing for this? Or should I set the UIImageView to the full size of the image and then move the UIImageView around slowly? Can it be bigger than the iPhone's screen?
You need to crop the image. See Lounges' answer to this question.
This is the gist of my answer to pretty much the same question:
There isn't a simple class method to do this, but there is a function that you can use to get the desired results: CGImageCreateWithImageInRect(CGImageRef, CGRect) will help you out.
Here's a short example using it:
CGImageRef imageRef = CGImageCreateWithImageInRect([largeImage CGImage], cropRect);
[imageView setImage:[UIImage imageWithCGImage:imageRef]];  // imageView: your UIImageView instance
CGImageRelease(imageRef);
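As a side note, for the original goal of slowly panning an oversized image around, another option (just a sketch, not part of the answer above; the container size, duration, and surrounding view controller are assumed) is to keep the UIImageView at the image's full size, put it inside a container view that clips to its bounds, and animate the image view's position. A view can be larger than the screen without any problem.
// Container that hides whatever falls outside its bounds.
UIView *container = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 200)];
container.clipsToBounds = YES;
[self.view addSubview:container];

// Image view at the image's natural (larger) size.
UIImageView *imageView = [[UIImageView alloc] initWithImage:largeImage];
[container addSubview:imageView];

// Drift to a random offset; negative origins expose a different region of the image.
CGFloat maxX = imageView.frame.size.width  - container.bounds.size.width;
CGFloat maxY = imageView.frame.size.height - container.bounds.size.height;
CGRect frame = imageView.frame;
frame.origin = CGPointMake(-(CGFloat)(arc4random() % ((uint32_t)maxX + 1)),
                           -(CGFloat)(arc4random() % ((uint32_t)maxY + 1)));

[UIView beginAnimations:@"pan" context:NULL];
[UIView setAnimationDuration:5.0];
imageView.frame = frame;
[UIView commitAnimations];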
I am trying to tile an image for an app I am making. I am making a rope whose length I can set. To do this, I have a function that takes my rope length, sets the frame size accordingly, and then sets the background color like this:
(picture is a UIView)
picture.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"Rope.png"]];
The view is blank. I tested this earlier by making picture a UIImageView and displaying an image in it, which worked, but nothing shows up anymore.
EDIT: I have now tried a few different methods of tiling; the best I can get is a stretched image. I need a working way to tile an image and get the output into a UIImage, UIImageView, or UIView.
Not sure what I did wrong but it is working now.
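For anyone running into the same problem: one way to get the tiled result into a standalone UIImage is to fill a bitmap image context with a pattern color (a sketch, assuming Rope.png from the question; the width and the ropeLength variable are illustrative):
// Render the repeating tile into a UIImage of the desired size.
CGSize targetSize = CGSizeMake(20.0, ropeLength);   // ropeLength: your computed length
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0);
[[UIColor colorWithPatternImage:[UIImage imageNamed:@"Rope.png"]] setFill];
UIRectFill(CGRectMake(0.0, 0.0, targetSize.width, targetSize.height));
UIImage *tiledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// tiledImage can now be assigned to a UIImageView or drawn into a UIView.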
In my view I have a picture. The picture has a 3:2 aspect ratio and is huge, resolution-wise.
I'm having a lot of trouble initializing a UIImageView with the image at its original size and then zooming out so that the entire image is visible within the scroll view.
The app only runs in portrait orientation.
And no, please don't say to just use UIWebView. UIWebView doesn't let me set the content size during view load; instead, it just zooms out of the image by some arbitrary scale, and I couldn't figure out a way to adjust that scale value (I don't think it's possible).
Thanks for any responses! I really appreciate them :D
Here's an example of placing an image that responds to pinch-to-zoom. Basically, you place the UIImageView in a UIScrollView and change some settings.
UIImageView *myImage;
UIScrollView *myScroll;

- (void)viewDidLoad {
    [super viewDidLoad];
    myImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 360, 450)];
    [myImage setImage:[UIImage imageNamed:@"coolpic.png"]];
    myImage.contentMode = UIViewContentModeScaleAspectFit;
    [myScroll addSubview:myImage];
    [myScroll setContentSize:CGSizeMake(myImage.frame.size.width, myImage.frame.size.height)];
    [myScroll setMinimumZoomScale:1.0];
    [myScroll setMaximumZoomScale:4.0];
}

- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return myImage;
}
Of course, set all your delegates and IB hooks properly.
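(For instance, the scroll view's delegate has to point at this controller, either wired up in Interface Builder or with something like myScroll.delegate = self; in viewDidLoad, otherwise viewForZoomingInScrollView: will never be called.)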
Edit: I just reread your question. The portion of the example that answers your question is the frame specification of the UIImageView and its contentMode setting.
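For the zooming-out part of the question specifically, here is a sketch (not part of the example above; it assumes the image view's frame is set to the image's full pixel size rather than the fixed 360x450 frame): compute the minimum zoom scale from the scroll view's bounds so the entire image is visible at launch, while still allowing zooming in to full resolution.
CGSize imageSize  = myImage.image.size;
CGSize boundsSize = myScroll.bounds.size;
CGFloat scaleToFit = MIN(boundsSize.width  / imageSize.width,
                         boundsSize.height / imageSize.height);

myScroll.contentSize      = imageSize;      // content size at zoomScale 1.0
myScroll.minimumZoomScale = scaleToFit;     // fully zoomed out: whole image visible
myScroll.maximumZoomScale = 1.0;            // 1.0 == the image's native resolution
myScroll.zoomScale        = scaleToFit;     // start zoomed all the way out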
You almost certainly don't want to load your "huge, resolution wise" image all at once on load, regardless of its scale. I'd suggest checking out some of Apple's sample code on this stuff (starting with ScrollViewSuite would be good, I'd say).
There was also a recent video released from WWDC where they implement this sort of thing live (they have more of a photo viewing app, but you could pretty easily do what they show with just one image too) so take a look for that too.
Go to the Attributes tab for the UIImageView and set the Mode to "Aspect Fill" or "Aspect Fit". If you use "Aspect Fill", you might not show the whole image. I just use "Aspect Fit" and make the background black.
I have a UIImageView. I get a UIImage from a URL. The image displays in the UIImageView, but I can't get it to center correctly. The UIImage is 80 x 68 pixels. The UIImageView's size is 90 x 90. When I display the UIImage in the UIImageView, the UIImage is shrunken so that it fits, even though it is already smaller than the UIImageView. I have tried all the content modes in IB. Some of them shift the image up so that it is no longer in the UIImageView. None of this really makes sense to me. It seems that the UIImage should display fine inside the UIImageView if it is already within the required size. Any help would be great.
With Mode of Center on the UIImageView:
(screenshot: http://img406.imageshack.us/img406/3696/screenshot20090915at447.png)
With Mode of Aspect Fit on the UIImageView:
(screenshot: http://img406.imageshack.us/img406/9373/screenshot20090915at447e.png)
With Center, the size of the image seems correct, but you can see that it goes outside the UIImageView (there is a red UIView underneath it that is actually 2 px larger than the image view on each side). With Aspect Fit, the image is made smaller so that it fits inside. But I don't understand why it would be made smaller if it is already small enough to fit.
Sorry for wasting anyone's time. Turns out I didn't try all of the content modes. Seems that Content Mode of Bottom top works. This is backwards from how I understood it. Thanks for all your help.
Try this: change your UIImageView to be the same size as the UIImage that you are going to load, then position the UIImageView so that it's centered. This is the approach I've taken, since I find UIImageView a bit funny to work with. Or, if you prefer, skip the UIImageView and just draw the UIImage using its - (void)drawInRect:(CGRect)rect method.
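If you go the drawing route, a minimal sketch of that custom view might look like this (it assumes a UIView subclass with a UIImage property named image, which is not something UIView provides itself):
// Draws the image centered at its natural size inside the view's bounds.
- (void)drawRect:(CGRect)rect {
    CGSize imageSize = self.image.size;
    CGRect target = CGRectMake((self.bounds.size.width  - imageSize.width)  / 2.0,
                               (self.bounds.size.height - imageSize.height) / 2.0,
                               imageSize.width, imageSize.height);
    [self.image drawInRect:target];   // no scaling: drawn at the image's own size
}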
Sounds like your image contains transparent areas, at least at the bottom, which shift the image off center.
Edit: Also, you should check that the resolution of the image is set to 72dpi.
I have a custom UIImageView class which I use to handle multi-touch events on the UIImageView. When the user touch begins, I want to increase the UIImageView's frame but keep the UIImage size fixed.
I tried changing the UIImageView's frame and then calling UIImage's drawInRect: method to keep the UIImage's size fixed, but this is not working.
The contentMode of the UIImageView is set to ScaleAspectFit, and as soon as I increase the frame size of the UIImageView, the UIImage grows with it (and is not affected by the drawInRect: call).
Can someone please tell me how I can achieve this?
Thanks.
Adding more details
What I am trying to do is this:
- Place a UIImageView on the screen with the same size as the image.
- When the user selects the image, wherever he touches, the image should respond as if he were multi-touching the image itself.
If I increase the size of the image view to detect touches anywhere, the image size also increases... Hope that makes things clearer!
Thanks
There may well be other ways to do it, but I think the UIImageView is doing what it's intended to do here.
What do you want the area of the view not covered by the image to look like? Be transparent? Have a solid colour?
Why do you want to do this? Is it to capture touch events from a wider area than that under the image itself?
If you have a good reason for needing to do this I would create a new view, probably just a plain UIView (with background set to transparent colour), and add the UIImageView to that. Make the plain view the one you resize.
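A rough sketch of that approach (variable names are illustrative; largerFrame stands in for whatever bigger hit area you want, and imageView is assumed to already have the image's natural size):
// Transparent container that receives touches over a larger area.
UIView *touchArea = [[UIView alloc] initWithFrame:largerFrame];
touchArea.backgroundColor = [UIColor clearColor];

// Keep the image view at its own size, centered in the container.
imageView.center = CGPointMake(CGRectGetMidX(touchArea.bounds),
                               CGRectGetMidY(touchArea.bounds));
[touchArea addSubview:imageView];
[self.view addSubview:touchArea];
// Resize touchArea to change the touch target; imageView's frame never changes.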
I haven't specifically tried this, but I think it would work:
imageView.contentMode = UIViewContentModeCenter;
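As a follow-up to that suggestion: with UIViewContentModeCenter the image is drawn at its natural size no matter how large the view's frame becomes, so growing the frame only grows the touch area, which is what the question is after.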
Setting a UIImageView's image property to a UIImage does not add the image to the view; however, if I also set the image view's frame property, it works. Can anyone tell me why this happens?
Also, what happens when I call the UIImage instance method drawInRect: with a rect that is larger than the view's frame? I have tried it, but nothing happens. What does this method actually do?
Generally, how it works is: you add an image to a UIImageView, and then add the UIImageView to a UIView to display it on screen. You can do this programmatically or using Interface Builder. Optionally, you can create a CGRect (based on a frame, if you like) and use it as the frame for the UIImageView (or for the enclosing UIView). There are several ways to do what you want to do.
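For instance, the programmatic route might look roughly like this (a sketch; the file name is illustrative):
UIImage *image = [UIImage imageNamed:@"photo.png"];
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
// initWithImage: sizes the frame to match the image; set it explicitly if you prefer:
// imageView.frame = CGRectMake(0.0, 0.0, image.size.width, image.size.height);
[self.view addSubview:imageView];   // self.view: whatever parent UIView you are using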
It's hard to tell without seeing your code; it could be a number of things. You could be inserting the image view behind the current view, or you may not be retaining the UIImageView. We can't necessarily read minds here (sometimes we do), but you'll need to give us more to go on if you want a solid answer.
Or just take a look at this:
UIImageView Docs