I am developing an application in which I have one image, and I need to specify a location in that image. I only have the image's left and top values, but I need x, y, width, and height. How can I find these values from what I have?
The left and top values are the x and y positions. The width and height are the amount of space your image needs to cover in the view (its frame size). For example, if your image has to cover the full screen of an iPhone, you would specify the frame as:
image.frame = CGRectMake(0, 0, 320, 480);
I have a UIImageView with a frame of (0, 0, 568, 300) containing a UIImage with a native size of 512x384 pixels. The contentMode is set to aspect fit.
If the user double-taps on the view, I change the size of the UIImageView with the following code:
self.imageViewerViewController.imageView.frame = CGRectMake(0, -63, 568, 426);
The result is that the right edge of the image is distorted; it does not scale properly to the new size.
Attached is an image with a black-and-white matrix; the distortion is on the right.
It looks as if the rightmost column of pixels is repeated out to the right edge of the view.
Can anyone help?
I have changed the creation of the affected UIImageView from a xib file to creating it in code, using - (id)initWithFrame:(CGRect)aRect with the maximum size of the scaled image as aRect. Now the image is scaled properly.
Evidently the UIImageView in the xib file had been initialized with a 512x384 pixel frame.
I am simply using this code to transform an imageView to double or triple scale:
imageView.transform = CGAffineTransformScale(CGAffineTransformIdentity, scale, scale);
I am drawing lines on this imageView in drawRect. After scaling, everything gets scaled, including the lines. What I want is for the lines I draw to keep the same size and width after scaling. To achieve that, I redraw the lines on the scaled imageView with proportionally smaller height and width, but the result is that the lines look blurred/shady. I think the pixel size of the imageView also increases when it is transformed. Is there any way to draw clean lines, without any blurring, on a transformed imageView?
Draw the lines AFTER you scale the image, and change the line width by the same scale factor (i.e. if the image is 4x, make the line width 4x).
I want to resize images while keeping the width/height ratio. What I do is first check which side (width or height) is longer. If the width is longer, I set the width to 150 and resize the height proportionally, so the shape of the image is not distorted, and vice versa. I pass the resized image to an edge-detection algorithm, and the output binary image is sent to a neural network, which requires a constant number of inputs. In this case, one side of the image (width or height) is 150 and the other side is less than 150 (it varies from image to image). I want to pad the shorter side with black until it is also 150, so I can send 150*150 inputs to the neural network.
Question: how can I pad the shorter side (less than 150) with black until its size is 150?
Thanks in advance
http://www.mathworks.com/help/toolbox/images/ref/imresize.html
http://www.mathworks.com/help/toolbox/images/ref/padarray.html
newim = imresize(im, 150 / max(size(im)));
paddedim = padarray(newim, 150 - size(newim), 0, 'post');
Alternatively, create a 150-by-150 matrix of zeros, calculate the position of the top-left pixel, and then copy your image into that matrix by slicing from that top-left pixel.
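As a language-neutral sketch of that zero-matrix approach (shown here in Python on a grayscale image represented as nested lists; the function name and the choice to center the image are illustrative, not from the original answer):

```python
def pad_to_square(im, size=150, fill=0):
    """Embed a 2D grayscale image (list of rows) into a size x size
    canvas of `fill` values. The image is centered, so the shorter
    side gets equal black padding on both ends."""
    h, w = len(im), len(im[0])
    canvas = [[fill] * size for _ in range(size)]
    top = (size - h) // 2   # row where the image's top-left pixel lands
    left = (size - w) // 2  # column where the image's top-left pixel lands
    for r in range(h):
        for c in range(w):
            canvas[top + r][left + c] = im[r][c]
    return canvas
```

For a 150x100 image, this places the image at column 25, leaving 25 columns of zeros on each side, and the result is always 150x150, ready for the fixed-size network input.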
I'm making a simple application to learn gesture recognizers. I have created four views in my window, but when I load the application, their sizes are completely different from how I had arranged them. Any ideas why, or how I can fix this?
I am importing some coordinates in my .h file for pinching; could this be why?
thanks
Make sure you are setting the views' frames correctly. For example, if you want your view to be 100 pixels wide, 50 pixels tall, and be in the top-left corner of the window, you'd write the following:
myView.frame = CGRectMake(0, 0, 100, 50);
If you wanted to push that view over 10 pixels to the right and 15 pixels down (but with the same dimensions), you'd write the following instead:
myView.frame = CGRectMake(10, 15, 100, 50);
For more information on view frames, read through all the attributes in the "Configuring the Bounds and Frame Rectangles" section on Apple's UIView Class Reference.
I use a full screen imageView to display the image, as follows:
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
imageView.contentMode=UIViewContentModeScaleAspectFit;
imageView.image=srcImage;
As the srcImage size differs, its position (frame) within the imageView differs. How can I get the position of the image inside the imageView?
thanks a lot.
You have to do the math yourself: compare the aspect ratio of your image with the aspect ratio of the image view's bounds to work out the rectangle the image actually occupies.