I have an array of images; these images can be taken from the camera or the photo library, so they can be of different sizes.
The images are displayed as thumbnails, as in the Photos app, and my goal is to show a single image at screen size once it is tapped, then move to the next image with a gesture, again like the Photos app.
My main problem is that I don't know what size to make the image view so the images fit on screen. I tried to scale them, but since the images come in several sizes I can't find a single scale factor (images of different sizes need to be scaled differently).
So, how can I scale these images to fit the screen proportionally to their size (again, like the Photos app)?
On UIImageView you can set a scaling mode (contentMode) that fits the image onto the screen as nicely as possible. Essentially you want to "fit" the image onto the screen by either padding the short sides with empty space, or clipping the edges of the long sides.
Keep imageView.frame the same for all images and try both of these options to see which works best for your needs:
/* no clipping - empty space on sides */
imageView.contentMode = UIViewContentModeScaleAspectFit;
/* clipping - no empty space */
imageView.contentMode = UIViewContentModeScaleAspectFill;
An answer (scroll down) here illustrates the difference with a nice diagram:
How to scale a UIImageView proportionally?
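Putting it together, a minimal Objective-C sketch of the full-screen setup (assuming `image` is the tapped UIImage and this runs inside a view controller):

```objc
// Full-screen image view that scales each photo proportionally.
UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
imageView.contentMode = UIViewContentModeScaleAspectFit; // or UIViewContentModeScaleAspectFill
imageView.clipsToBounds = YES; // needed with AspectFill so the image doesn't spill past the frame
imageView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
imageView.image = image;
[self.view addSubview:imageView];
```

Because the scaling is handled by contentMode, the same frame works for every image regardless of its original dimensions.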
Have you already looked at UICollectionView? That kind of container is a good fit for your application:
http://developer.apple.com/library/ios/#documentation/WindowsViews/Conceptual/CollectionViewPGforIOS/Introduction/Introduction.html
Another way would be to implement your own logic: create several UIImageViews, then detect the smallest one. Based on that smallest frame, resize the other views until they match the height of the smallest UIImageView.
But I think it's much more comfortable to use the collection view. :D
In addition, here is a tutorial on how to handle it if you don't want to read the full Apple documentation (though I recommend reading it anyway):
http://www.raywenderlich.com/22324/beginning-uicollectionview-in-ios-6-part-12
This tutorial was created by Brandon Trebitowski.
I'm trying to use Auto Layout in Xcode to position these two images as they are in the screenshot, but when I run the simulator they're in the wrong position. Here is basically what I want to do:
scale the images up if the screen size goes up (and vice versa)
have the dog picture just above the middle of the screen
have the loading picture a certain distance below the dog (i.e. loading.y = dog.y + screenHeight/4)
I have not gotten the hang of Auto Layout yet, so any direction on what constraints to use would be appreciated.
Image of View Controller
You would really benefit from spending some time reading through Auto Layout tutorials, articles, discussions, documentation, etc.
Here is an example of what you are describing.
Using these two images:
You can lay out your image views in Storyboard like this:
The dog imageView has:
Centered horizontally
Bottom constrained 90% to the CenterY position (just above the middle)
Width is 50% of the safe-area width
Height determined by Aspect Ratio of 236:175 (the original size of the image)
The loading imageView has:
Centered horizontally
CenterY constrained 1.75% to the CenterY position (effectively screenHeight / 4)
Width is 75% of the safe-area width
Height determined by Aspect Ratio of 368:76 (the original size of the image)
and here is how it looks on 3 different device sizes:
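If you'd rather build the same constraints in code than in Storyboard, here is a sketch of the dog image view's constraints (the view names are assumptions; the loading view is analogous):

```objc
// Assumes dogView is a UIImageView already added to self.view.
dogView.translatesAutoresizingMaskIntoConstraints = NO;
[NSLayoutConstraint activateConstraints:@[
    // Centered horizontally
    [dogView.centerXAnchor constraintEqualToAnchor:self.view.centerXAnchor],
    // Width is 50% of the safe-area width
    [dogView.widthAnchor constraintEqualToAnchor:self.view.safeAreaLayoutGuide.widthAnchor
                                      multiplier:0.5],
    // Height from the image's 236:175 aspect ratio
    [dogView.heightAnchor constraintEqualToAnchor:dogView.widthAnchor
                                       multiplier:175.0 / 236.0],
]];
// Anchors can't apply a multiplier to centerY, so use the older API
// for "bottom at 90% of the superview's centerY":
[NSLayoutConstraint constraintWithItem:dogView
                             attribute:NSLayoutAttributeBottom
                             relatedBy:NSLayoutRelationEqual
                                toItem:self.view
                             attribute:NSLayoutAttributeCenterY
                            multiplier:0.9
                              constant:0].active = YES;
```

Because everything is expressed relative to the safe area and center, the layout scales with the screen size, which covers the first bullet point of the question.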
The best way to learn this is to just work on it.
I'm trying to create an app for children. It will have different images, and those images will contain closed shapes like stars, balls, or other similar figures.
I am aware of how to draw on iPhone: I know how to select a color, how to change the brush size, etc.
What I want to know is how, after selecting a color and touching the image, to flood-fill the closed area around the touched coordinate. Is it possible? How can I do it?
(For example, if I touch inside the ball, the ball must be filled with the chosen color.)
Check out this flood-fill library:
http://gauravstomar.blogspot.com/2014/09/uiimage-fill-color-objective-c.html
Hope it helps.
I think you need to use blending for that; see this answer:
Iphone App: How to fill an image with empty areas with coregraphics?
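Beyond the links above, the core of a flood fill is not complicated once you have the image's pixels in a buffer (e.g. by drawing the UIImage into a CGBitmapContextCreate context). A simplified sketch on a 32-bit RGBA buffer (the function name and exact-match comparison are mine; a real coloring app would likely want a color tolerance):

```objc
// Fill every pixel connected to (startX, startY) that matches the start pixel's color.
static void FloodFill(uint32_t *pixels, int width, int height,
                      int startX, int startY, uint32_t newColor) {
    uint32_t target = pixels[startY * width + startX];
    if (target == newColor) return; // nothing to do, and avoids an infinite loop

    NSMutableArray *stack = [NSMutableArray arrayWithObject:@(startY * width + startX)];
    while (stack.count > 0) {
        int idx = [stack.lastObject intValue];
        [stack removeLastObject];
        if (pixels[idx] != target) continue; // already filled or outside the shape
        pixels[idx] = newColor;

        int x = idx % width, y = idx / width;
        if (x > 0)          [stack addObject:@(idx - 1)];      // left
        if (x < width - 1)  [stack addObject:@(idx + 1)];      // right
        if (y > 0)          [stack addObject:@(idx - width)];  // up
        if (y < height - 1) [stack addObject:@(idx + width)];  // down
    }
}
```

After the fill you recreate a UIImage from the modified bitmap context and put it back in the image view.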
I am developing an iPhone application that lets me modify an image by dragging other images over it, along with functionality that lets me pick a specific color from the image and replace it with another shade I choose. If anyone has a clue how to actually implement this, or some code, that would be a great help in proceeding in the right direction.
Thanks in advance :)
If you just want to overlay images, you can simply drag UIImageViews around and position them within the view hierarchy such that the appropriate images are overlaid one on top of the other.
For generating a new image out of those composited images, you may want to use something like CGContextDrawImage() within a UIView subclass's -drawRect: method or in a custom CGContext. You can also refer to these similar questions:
How can we combine images and get image object from this two images in iphone apps
Combining images
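A sketch of that compositing step (the image variables and `overlayFrame` are assumptions standing in for wherever the user dragged the overlay):

```objc
// Flatten a base image and an overlay into a single new UIImage.
UIGraphicsBeginImageContextWithOptions(baseImage.size, NO, 0.0);
[baseImage drawInRect:(CGRect){CGPointZero, baseImage.size}];
[overlayImage drawInRect:overlayFrame]; // position of the dragged overlay, in base-image coordinates
UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```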
As far as selectively replacing a specific color within a UIImage, that's a commonly asked question here:
In a UIImage (or its derivatives), how can I replace one color with another?
How to make one color transparent on a UIImage?
I am implementing a fun application in which I want to animate some area (portion) of an image.
I don't know how this is possible. If you have any idea, please help me.
If you're looking to animate a portion of an image you could just create a smaller animation and put it at the right relative location inside a larger image.
If you want to know how to animate images in general, that's a different question entirely.
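For the first approach, UIImageView has built-in frame animation, so a small animated view can simply be positioned over the larger image. A sketch (the frame names and image names are placeholders):

```objc
// Small looping animation placed at the right spot inside a larger image view.
UIImageView *animView = [[UIImageView alloc] initWithFrame:portionFrame];
animView.animationImages = @[[UIImage imageNamed:@"frame1"],
                             [UIImage imageNamed:@"frame2"],
                             [UIImage imageNamed:@"frame3"]];
animView.animationDuration = 0.5;  // one full cycle, in seconds
animView.animationRepeatCount = 0; // 0 = loop forever
[backgroundImageView addSubview:animView];
[animView startAnimating];
```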
I have a PNG image file that is partly opaque and partly transparent. I display it in a UIImageView as a mask of sorts over another UIImageView layered behind it (as a sibling subview of a common superview). It gives me perfect borders around something painted using a finger on the lower UIImageView in my stack of UIImageViews. Perhaps there are better ways to do this, but I am new-ish, and this is the best way I came up with thus far.
Nonetheless, my app is in the App Store and now I want to enhance it to provide more images to use as the mask over the finger painting. But I don't want to bloat my bundle size by adding more static mask images as I did for the initial implementation, not to mention I don't want to spend lots of time in Photoshop making 100 masks. I'd rather programmatically change the color of the mask without affecting the clear portion in the middle, which is not a simple rectangle or circle, but rather a complex shape.
So my question is this: how can I change the colored portion of my loaded image without affecting the clear portion in the middle? Is there a reasonably easy way to do this? Essentially I want to do what is described in this post (How would I tint an image programmatically on the iPhone?) without affecting the clear portion of my image. Thanks for any insights.
Have a look at the Tinted Image sample project. Try out the different modes until you get the effect you want.
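The mode you most likely want is kCGBlendModeSourceAtop: it recolors only the pixels that are already opaque and leaves the transparent hole alone, which is exactly the requirement here. A sketch:

```objc
// Recolor the opaque part of a mask image, preserving its transparent region.
- (UIImage *)tintedImage:(UIImage *)mask withColor:(UIColor *)color {
    UIGraphicsBeginImageContextWithOptions(mask.size, NO, mask.scale);
    CGRect rect = CGRectMake(0, 0, mask.size.width, mask.size.height);
    [mask drawInRect:rect];                               // draw the original mask
    [color setFill];
    UIRectFillUsingBlendMode(rect, kCGBlendModeSourceAtop); // tint only the opaque pixels
    UIImage *tinted = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return tinted;
}
```

With this, one shipped mask asset can produce any number of color variants at runtime, so the bundle stays small.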