PIL Image.rotate with center=(0, 0) - python-imaging-library

I rotate the image 45 degrees around the top-left corner using Image.rotate, but the image disappears beyond the border of the frame. How can I fix this?
im2 = self.image
rot = im2.rotate(45, expand=True, center=(0, 0))
self.image = rot
Originally the image looks like this:
[screenshot of the original image]
After a 45-degree rotation it looks like this:
[screenshot of the rotated image]

Edit: Judging from your new picture, center=(0, 0) must mean the origin point of the frame, not of the image.
To rotate with the top-left of the image fixed, try this:
rot = im2.rotate(45, expand=True, center=(im2.left, im2.top))
This is assuming that the image has properties that tell you its position within the frame.
You are rotating around the top-left corner, center=(0, 0), the origin point for this image. Try using the image center (width/2, height/2) as your center of rotation.
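To illustrate the suggestion above, here is a minimal Pillow sketch. It assumes a throwaway 200x100 test image; when center is omitted, rotate() pivots around the image center by default, and expand=True grows the canvas so nothing is clipped:

```python
from PIL import Image

# A sample image; any image works the same way.
im = Image.new("RGBA", (200, 100), (255, 0, 0, 255))

# With center omitted, rotate() pivots around the image center
# (width/2, height/2). expand=True enlarges the output canvas so
# the rotated image is not cut off at the original borders.
rot = im.rotate(45, expand=True)
print(im.size, "->", rot.size)
```

Because the canvas expands, the rotated image stays fully visible instead of swinging out of the frame as it does when pivoting on (0, 0).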

Related

UIImage in an UIImageView rendering distorted

I have a UIImageView three levels deep in a super view. (The white region on top of the gray rectangle is my superview; the gray elongated rectangle is one subview; the black square is another subview of the elongated gray rectangle; the cog is the image of the UIImageView, which is a subview of the black square.) The frame rectangle of the UIImageView is calculated as follows, where _normalImage is a UIImage object. I do this inside the subclass that represents the black square:
CGFloat xPoint = self.bounds.size.width/2 - _normalImage.size.width/2;
CGFloat yPoint = self.bounds.size.height/2 - _normalImage.size.height/2;
CGRect frameRect = CGRectMake(xPoint, yPoint, _normalImage.size.width, _normalImage.size.height);
self.imageHolder = [[UIImageView alloc] initWithFrame:frameRect];
The _normalImage is 26x26 and should be a perfect square. However, the image renders distorted, as if there is an aspect-ratio loss.
What's wrong?
Try using the floorf() function on your coordinates so they don't contain a fractional part.
The image is stretched because by default the view uses the UIViewContentModeScaleToFill content mode.
self.imageHolder.contentMode = UIViewContentModeScaleAspectFit;
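The floorf() suggestion can be checked with a small Python sketch of the same centering arithmetic (the function name and the 45x45 container are illustrative, not part of any API). A fractional origin like 9.5 is what causes the view to render on half-pixel boundaries:

```python
import math

def centered_frame(bounds_w, bounds_h, img_w, img_h):
    """Center an img_w x img_h rect inside bounds_w x bounds_h,
    flooring the origin so the view lands on whole pixels.
    Fractional origins are what cause the blurry/distorted look."""
    x = math.floor(bounds_w / 2 - img_w / 2)
    y = math.floor(bounds_h / 2 - img_h / 2)
    return (x, y, img_w, img_h)

# A 26x26 image inside a 45x45 container: the raw origin would be
# (9.5, 9.5); flooring gives (9, 9), and the size stays exactly
# 26x26, so there is no aspect-ratio loss.
print(centered_frame(45, 45, 26, 26))
```

Note the floor is applied only to the origin, never to the size, so the square stays a square.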

How to get coordinates of a view according to the coordinates of the uiimage on which it is placed in ios sdk?

I have an app where the user takes a picture with the camera and it is placed on a full-screen image view. The user can then draw rectangular/square views on it, and the app sends the coordinates/size of the overlaid view to the server. The problem is that I am getting the coordinates of the view according to the screen size, but I want the size according to the image in the image view, because an image from the camera normally has a larger resolution than the screen.
Please provide me a way to get the size of the overlaid view according to the UIImage and not the screen.
EDIT: For example, if the user draws a view on a human face in the image and sends the coordinates of the crop view to the server, it will be difficult to synchronize the coordinates with the real image if they are relative to the screen and not to the UIImage itself.
The answer is simple:
You have the frame of your UIImageView and the frame of your drawn square (both relative to self.view).
You only need to find the origin of your square, relative to the UIImageView.
Just subtract:
//get the frame of the square
CGRect correctFrame = square.frame;
//edit square's origin
correctFrame.origin.x -= imageView.frame.origin.x;
correctFrame.origin.y -= imageView.frame.origin.y;
Now correctFrame is the frame of your square relative to the imageView, while square.frame is still relative to self.view (as we didn't change it).
To get the frame according to the image resolution, do exactly the same as above, then:
float xCoefficient = imageResolutionWidth / imageView.frame.size.width;
float yCoefficient = imageResolutionHeight / imageView.frame.size.height;
correctFrame.origin.x *= xCoefficient;
correctFrame.origin.y *= yCoefficient;
correctFrame.size.width *= xCoefficient;
correctFrame.size.height *= yCoefficient;
Why: the image resolution is much greater than the imageView.frame, so you need to calculate a coefficient that will adjust your square.
That's all!
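The two steps above (subtract the image view's origin, then scale by the resolution coefficient) can be sketched in Python; all names here are illustrative, not part of any iOS API:

```python
def view_rect_to_image_rect(square, image_view_origin, image_view_size, image_size):
    """square: (x, y, w, h) in screen coordinates.
    Returns the same rect expressed in the full-resolution
    image's pixel coordinates."""
    x, y, w, h = square
    ivx, ivy = image_view_origin
    ivw, ivh = image_view_size
    img_w, img_h = image_size
    # Step 1: make the rect relative to the image view.
    x -= ivx
    y -= ivy
    # Step 2: scale by the resolution / view-size coefficient.
    kx = img_w / ivw
    ky = img_h / ivh
    return (x * kx, y * ky, w * kx, h * ky)

# A full-screen 320x480 image view showing a 640x960 photo:
# every coordinate simply doubles.
print(view_rect_to_image_rect((10, 20, 30, 40), (0, 0), (320, 480), (640, 960)))
```

Note this assumes the image fills the image view exactly; with aspect-fit letterboxing you would also have to subtract the letterbox insets before scaling.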

Changing the Bounding box for UIImageview When Rotating

I have an image that I rotate to face the touchLocation of the user, but the bounding box of the UIImageView gets larger when rotated, and this is ruining some collision detection.
My only plan is to code a new bounding-box system: get each of the 4 points of the bounding box, rotate them myself, and then write collision-check code for that.
But before I do that, is there an easier way?
My rotate code:
- (void)ObjectPointAtTouch{
//Get the angle
objectAngle = [self findAngleToPoint:Object :touchLocation];
//Convert to radians, +90 for image alignment
double radian = (90 + objectAngle) * M_PI / 180;
//Transform by radian
Object.transform = CGAffineTransformMakeRotation(radian);
}
I figured it out: as seen in other tutorials, they also rescale the bounding box to fix this problem. I think it is CGAffineTransformScale.
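The "rotate the 4 points myself" plan from the question is straightforward to sketch in Python (names are illustrative); the resulting corners can feed a point-in-polygon collision test instead of the enlarged axis-aligned box:

```python
import math

def rotated_corners(cx, cy, w, h, angle_deg):
    """Return the four corners of a w x h rect centered at
    (cx, cy) after rotating it by angle_deg around its center."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    corners = []
    # Corner offsets from the center, in drawing order.
    for dx, dy in [(-w/2, -h/2), (w/2, -h/2), (w/2, h/2), (-w/2, h/2)]:
        # Standard 2D rotation of each offset, then translate back.
        corners.append((cx + dx * cos_a - dy * sin_a,
                        cy + dx * sin_a + dy * cos_a))
    return corners
```

Checking collisions against this quadrilateral keeps the hit area tight, whereas the view's frame grows to the axis-aligned bounds of the rotated rect.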

UIImage animation

I have 6 uiimage plates which are placed on the view. Now I am rotating the images 90 degrees on double tap by using the below code:
CGAffineTransform transform = CGAffineTransformMakeRotation(M_PI * 0.5);
plate.transform = transform;
On the first double tap it rotates 90 degrees, but on the second double tap it doesn't rotate.
Am I missing something? Thanks in advance.
An affine transform stores the full transformation. So if you scale/rotate/move the view, the transform keeps that state.
To rotate 90 degrees on each tap, you have to read the current value and modify it, or keep a variable holding your current angle and change that variable.
It's because you are already there; you have already rotated by those radians. Try increasing the rotation value every time the user taps the image.
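Both answers amount to the same fix: accumulate the angle yourself and rebuild the transform from it, instead of assigning the same fixed 90-degree transform on every tap. A toy Python model of that state (class and method names are illustrative):

```python
import math

class Plate:
    """Toy model of the double-tap handler: keep the accumulated
    angle and derive the transform from it each time, rather than
    always setting a fixed 90-degree rotation."""
    def __init__(self):
        self.angle = 0.0  # radians

    def double_tap(self):
        self.angle += math.pi / 2  # add 90 degrees per tap
        # In the iOS code, this value would be passed to
        # CGAffineTransformMakeRotation(self.angle).
        return self.angle
```

After two taps the stored angle is pi (180 degrees), so the second tap visibly rotates, unlike the original code where both taps produced the identical 90-degree transform.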

how to apply the center of image view for another image view center

I am developing an image-redraw app. I know the center of one image view, which has scaling applied, i.e. its frame and center can change. That center is to be applied to another view's center. When I apply the center to the newly created image view, it gives the wrong center, because the previous image view is scaled. How can I get the exact center for my new image view from the previous image view?
Your view or image is width x height, so its center should always be at position (width/2, height/2), whether the image is scaled or not.
Just recalculate your center after the scale if you need the "scaled" center, or keep the original center position in memory if you don't.
Pseudocode:
getCenter(w, h) {
    pos[0] = w / 2;
    pos[1] = h / 2;
    return pos;
}

calc(image) {
    c = getCenter(image.w, image.h);
    scaled = image.scale(80); // that is, 80% of the original
    d = getCenter(scaled.w, scaled.h);
    if (something) return c;
    else return d;
}
Second explanation after discussion (read comments):
Let's assume you have a 640x480 image and you create a view of 320x240 (a quarter of it), and you move THIS view 100px right and 50px down from position (0,0), which is usually the top-left corner of your image. Then:
your new center of the VIEW will be as usual in position (160,120) of the VIEW
the original center of the ORIGINAL image will remain at its position (320,240), which happens to coincide numerically with the bottom-right corner of your VIEW.
IF you want to know WHERE the original center of the ORIGINAL image ended up AFTER the movement and "cropping", you just have to know where you moved the VIEW:
100px right becomes (original position - relative movement) (320 - 100) = 220
50px down becomes (original position - relative movement) (240 - 50) = 190
So your ORIGINAL center will be in position (220,190) of the new VIEW.
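The worked example above reduces to one subtraction per axis; a short Python sketch of it (the function name is illustrative):

```python
def original_center_in_view(image_size, view_offset):
    """Where the ORIGINAL image's center lands inside a view
    whose top-left corner was moved view_offset pixels
    right/down from the image's top-left corner."""
    w, h = image_size
    dx, dy = view_offset
    # original position minus relative movement, per axis
    return (w / 2 - dx, h / 2 - dy)

# 640x480 image, view moved 100px right and 50px down:
# (320 - 100, 240 - 50)
print(original_center_in_view((640, 480), (100, 50)))  # (220.0, 190.0)
```

This matches the (220, 190) result derived step by step above.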