Retrieve image from iPhone library - iPhone

I use this code to get an image from a Facebook profile and show it in a UIImageView:
// Get the profile image
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[result objectForKey:@"pic"]]]];

// Resize and crop the image to make sure it is square and renders
// well on a Retina display
float ratio;
float delta;
float px = 100; // Double the pixels of the UIImageView (to render on Retina)
CGPoint offset;
CGSize size = image.size;
if (size.width > size.height) {
    ratio = px / size.width;
    delta = (ratio * size.width - ratio * size.height);
    offset = CGPointMake(delta / 2, 0);
} else {
    ratio = px / size.height;
    delta = (ratio * size.height - ratio * size.width);
    offset = CGPointMake(0, delta / 2);
}
CGRect clipRect = CGRectMake(-offset.x, -offset.y,
                             (ratio * size.width) + delta,
                             (ratio * size.height) + delta);
UIGraphicsBeginImageContext(CGSizeMake(px, px));
UIRectClip(clipRect);
[image drawInRect:clipRect];
UIImage *imgThumb = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); // balance UIGraphicsBeginImageContext
[imgThumb retain];
imageFb = imgThumb;
[profilePhotoImageView setImage:imgThumb];
I want to add a button that, when tapped, lets the user replace this image with one from the iPhone's photo library (photos they took with the iPhone, etc.). I have no idea how to proceed; please help.

Access to the user's photo library is most easily mediated through UIImagePickerController. You create and configure an instance, set a delegate, and present it modally. Implement -imagePickerController:didFinishPickingMediaWithInfo: to be notified when the user selects an image. You can then persist this image to the user's documents directory.
UIImagePickerController Class Reference
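A minimal sketch of that flow, using era-appropriate manual retain/release as in the code above (`changePhotoTapped:` and `profilePhotoImageView` are placeholder names; the presenting controller must adopt UINavigationControllerDelegate and UIImagePickerControllerDelegate):

// Present the photo library picker when the button is tapped.
- (IBAction)changePhotoTapped:(id)sender
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker release]; // the presentation retains the picker
}

// Called when the user picks an image.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *picked = [info objectForKey:UIImagePickerControllerOriginalImage];
    [profilePhotoImageView setImage:picked];
    [self dismissModalViewControllerAnimated:YES];
}

// Called when the user cancels.
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self dismissModalViewControllerAnimated:YES];
}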

Related

How do I save a section of my screen to the user's images (in Swift)?

I want my user to be able to upload some images into a little square, and then I want all of them to be saved into one image on the user's iPhone.
I'm basically making an app that combines the user's pictures beside each other (there are a ton of apps like that, but I want to learn how they work), and then saves the whole thing as one image on their phone.
Save all your images in an array (arrImages) and use the following method to merge them:
- (UIImage *)mergeImages:(NSArray *)arrImages
{
    float width = 2024;  // set the width of the merged image
    float height = 2024; // set the height of the merged image
    CGSize mergedImageSize = CGSizeMake(width, height);
    float x = 0;
    float y = 0;
    UIGraphicsBeginImageContext(mergedImageSize);
    for (UIImage *img in arrImages) {
        // Each image gets an equal tile; advancing both x and y lays the tiles
        // along the diagonal. Drop the y increment (and use the full height)
        // to place them side by side instead.
        CGRect rect = CGRectMake(x, y, width / arrImages.count, height / arrImages.count);
        [img drawInRect:rect];
        x = x + (width / arrImages.count);
        y = y + (height / arrImages.count);
    }
    // Returns an image based on the contents of the current bitmap-based graphics context
    UIImage *mergedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return mergedImage;
}
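Since the question also asks to save the result on the phone, a short hedged follow-up using UIKit's UIImageWriteToSavedPhotosAlbum (`saveMergedImage:` is a placeholder name; the callback selector signature is the one UIKit documents):

// Merge the images and write the result to the device's photo album.
- (void)saveMergedImage:(NSArray *)arrImages
{
    UIImage *merged = [self mergeImages:arrImages];
    UIImageWriteToSavedPhotosAlbum(merged, self,
        @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

// Completion callback; check the error before assuming the save succeeded.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Saving failed: %@", error);
    }
}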

Not able to get the exact cropped portion of an image using OpenGL on iPad

In my iPad app I'm using the OpenGL framework to crop an image; I'm not very familiar with OpenGL. I need to crop some portion of an image and display it in an image view, and I use OpenGL for the cropping effect. The cropping itself works fine, but when I assign the result to the image view, a completely black image is displayed instead of the cropped one. I can't see where I've gone wrong. If anyone has worked on this, please guide me, and please post/suggest links related to this requirement.
I'm using this code (_sourceImage is the captured image):
-(void)showResult
{
    UIImage *imageCrop;
    float scaleCrop;
    if (_sourceImage.size.width >= IMAGEWIDTH)
    {
        scaleCrop = IMAGEWIDTH / _sourceImage.size.width;
        imageCrop = [ImageCropViewController scaleImage:_sourceImage with:CGSizeMake(_sourceImage.size.width * scaleCrop, _sourceImage.size.height * scaleCrop)];
    }
    else
    {
        scaleCrop = 1;
        imageCrop = _sourceImage;
    }
    float scale = _sourceImage.size.width / resizeImage.size.width * 2;
    IplImage *iplImage = [ImageCropViewController CreateIplImageFromUIImage:imageCrop];
    Quadrilateral rectan;
    rectan.point[0].x = _touchLayer.rectan.pointA.x * scale * scaleCrop;
    rectan.point[0].y = _touchLayer.rectan.pointA.y * scale * scaleCrop;
    rectan.point[1].x = _touchLayer.rectan.pointB.x * scale * scaleCrop;
    rectan.point[1].y = _touchLayer.rectan.pointB.y * scale * scaleCrop;
    rectan.point[2].x = _touchLayer.rectan.pointC.x * scale * scaleCrop;
    rectan.point[2].y = _touchLayer.rectan.pointC.y * scale * scaleCrop;
    rectan.point[3].x = _touchLayer.rectan.pointD.x * scale * scaleCrop;
    rectan.point[3].y = _touchLayer.rectan.pointD.y * scale * scaleCrop;
    IplImage *dest = cropDoc2(iplImage, rectan);
    IplImage *image = cvCreateImage(cvGetSize(dest), IPL_DEPTH_8U, dest->nChannels);
    cvCvtColor(dest, image, CV_BGR2RGB);
    cvReleaseImage(&dest);
    UIImage *tempImage = [ImageCropViewController UIImageFromIplImage:image withImageOrientation:_sourceImage.imageOrientation];
    [self crop:tempImage];
    cvReleaseImage(&image);
}
-(void)crop:(UIImage *)image
{
    // Adjust the image size, scaling the image to a width of 1000
    float targetWidth = 1000.0f;
    float scale = targetWidth / image.size.width;
    float scaleheight = image.size.height * scale;
    UIImage *imageToSent = [ImageCropViewController scaleImage:image with:CGSizeMake(targetWidth, scaleheight)];
    // Image data with compression
    imageData = UIImageJPEGRepresentation(imageToSent, 0.75);
    NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0];
    NSString *caldate = [now description];
    appDelegate.imagefilePath = [NSString stringWithFormat:@"%@/%@.jpg", DOCUMENTS_FOLDER, caldate];
    [imageData writeToFile:appDelegate.imagefilePath atomically:YES];
    appDelegate.cropimage = imageToSent;
}
Black usually means you failed to pull the data out of OpenGL, and you ended up with a blank image.
How are you getting the image back from OpenGL? OpenGL doesn't work with normal images - you have to issue custom OpenGL calls to pull the image out.
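For reference, a minimal sketch of the usual read-back path, modeled on Apple's documented glReadPixels snapshot technique (the method name and width/height parameters are assumptions; it must run while the GL context whose framebuffer you rendered into is current):

// Read the currently bound framebuffer back into a UIImage.
- (UIImage *)snapshotOfFramebufferWithWidth:(int)width height:(int)height
{
    // Read raw RGBA pixels out of OpenGL.
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength);
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    // Wrap the pixel buffer in a CGImage.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorSpace,
                                    kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                    provider, NULL, true, kCGRenderingIntentDefault);

    // Draw into a UIKit context; OpenGL rows are bottom-up, and drawing with
    // CGContextDrawImage in the UIKit-flipped context cancels that out.
    UIGraphicsBeginImageContext(CGSizeMake(width, height));
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(context, kCGBlendModeCopy);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), iref);
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CGImageRelease(iref);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    free(data); // safe: kCGBlendModeCopy already copied the pixels out
    return image;
}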

Change UIImage ratio to Facebook cover on iPhone programmatically

I am picking an image with the image picker, and then I want to change that image's aspect ratio to match a Facebook cover. Suppose I have an image with a resolution of 640 wide by 480 high, and I want to change it to Facebook cover dimensions (851 pixels wide and 315 pixels tall). How can I do that programmatically on the iPhone?
Check this link for cover picture details.
Thanks.
Use this method to resize the image according to the given content mode, taking into account the image's orientation:
- (UIImage *)resizedImageWithContentMode:(UIViewContentMode)contentMode
                                  bounds:(CGSize)bounds
                    interpolationQuality:(CGInterpolationQuality)quality
{
    CGFloat horizontalRatio = bounds.width / self.size.width;
    CGFloat verticalRatio = bounds.height / self.size.height;
    CGFloat ratio;
    switch (contentMode) {
        case UIViewContentModeScaleAspectFill:
            ratio = MAX(horizontalRatio, verticalRatio);
            break;
        case UIViewContentModeScaleAspectFit:
            ratio = MIN(horizontalRatio, verticalRatio);
            break;
        default:
            [NSException raise:NSInvalidArgumentException format:@"Unsupported content mode: %d", contentMode];
    }
    CGSize newSize = CGSizeMake(self.size.width * ratio, self.size.height * ratio);
    return [self resizedImage:newSize interpolationQuality:quality];
}
and go to this link for more detail: Resizing a UIImage the right way.
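A hedged usage sketch for the cover case, assuming the method above lives in a UIImage category (as in the linked article) and that `pickedImage` stands in for the image from the picker: aspect-fill into the 851x315 canvas, then crop the centered cover rect out of the overflow.

// Scale so the image fills 851x315 (one dimension will overflow).
UIImage *filled = [pickedImage resizedImageWithContentMode:UIViewContentModeScaleAspectFill
                                                    bounds:CGSizeMake(851, 315)
                                      interpolationQuality:kCGInterpolationHigh];
// Crop the centered 851x315 rect out of the aspect-filled image.
CGRect coverRect = CGRectMake((filled.size.width - 851) / 2,
                              (filled.size.height - 315) / 2,
                              851, 315);
CGImageRef croppedRef = CGImageCreateWithImageInRect(filled.CGImage, coverRect);
UIImage *coverImage = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);

For a 640x480 source this scales to roughly 851x638 and then crops the vertical overflow evenly from top and bottom.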

How to set the image at the top, avoiding space, in a UIImageView

I have a UIImageView with content mode Aspect Fit and size 220x155. I'm dynamically inserting different images at different resolutions, all larger than the UIImageView. Because the content mode is set to Aspect Fit, each image is scaled proportionally to fit the UIImageView.
My problem is that if, for instance, the image inside the UIImageView is scaled to 220x100, I would like the UIImageView itself to shrink from a height of 155 to 100 as well, to avoid space between my elements.
How can I do this?
I wrote this method to get the frame of the image view once it has loaded an image.
The requirements for me were the same as in your case:
1) an image view with Aspect Fit content mode
2) get the exact frame of the image (this way you can re-position the image view)
Hope this helps:
- (CGRect)getFrameOfImage:(AsyncImageView *)imgView
{
    if (!imgView.loaded)
        return CGRectZero;
    CGSize imgSize = imgView.image.size;
    CGSize frameSize = imgView.frame.size;
    CGRect resultFrame;
    if (imgSize.width < frameSize.width && imgSize.height < frameSize.height)
    {
        resultFrame.size = imgSize;
    }
    else
    {
        float widthRatio = imgSize.width / frameSize.width;
        float heightRatio = imgSize.height / frameSize.height;
        float maxRatio = MAX(widthRatio, heightRatio);
        NSLog(@"widthRatio = %.2f , heightRatio = %.2f , maxRatio = %.2f", widthRatio, heightRatio, maxRatio);
        resultFrame.size = CGSizeMake(imgSize.width / maxRatio, imgSize.height / maxRatio);
    }
    resultFrame.origin = CGPointMake(imgView.center.x - resultFrame.size.width / 2, imgView.center.y - resultFrame.size.height / 2);
    return resultFrame;
}
I am using AsyncImageView here, but it will work just as well with UIImageView. The important thing to remember is to call this method AFTER the image has loaded.
Cheers!
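For example, a small hedged usage sketch (assuming an `imgView` outlet whose image has already loaded):

// Shrink the image view to the exact frame of its displayed image.
CGRect fitted = [self getFrameOfImage:imgView];
if (!CGRectIsEmpty(fitted)) {
    imgView.frame = fitted; // removes the empty space around the image
}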
It's very simple: you just need to get the image's actual size, which can be done with
UIImage *image = [UIImage imageNamed:@""];
and then you just need to set the frame, like:
imageView.frame = CGRectMake(0.0, 0.0, image.size.width, image.size.height);
Hope this helps you.
Once the image view's image is set to the new image (and thus scaled), you can get the height of the image inside the image view (imageView.image.size.height) and set the image view's height (frame) accordingly.
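A minimal sketch of that last suggestion, assuming Aspect Fit with the width as the limiting dimension (`imageView` is a placeholder name):

// Compute the displayed height of the aspect-fitted image and
// shrink the image view's frame to match it.
CGFloat displayScale = imageView.bounds.size.width / imageView.image.size.width;
CGRect frame = imageView.frame;
frame.size.height = imageView.image.size.height * displayScale;
imageView.frame = frame;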

Copy part of scaled UIImage

I want to be able to scale and position an image, and then save just part of that image. I currently have a UIImageView inside a UIScrollView. After zooming and positioning the image, I hit a button and then run the following code.
// imageScale = current scale of UIView
// get the new width and height of the scaled UIView
float scaledImageWidth = [scrollView viewWithTag:1].frame.size.width;
float scaledImageHeight = [scrollView viewWithTag:1].frame.size.height;
// get starting X and Y coords of target in relation to UIView
// target is a box in the middle of the screen, 210x255px
float imageY = scaledImageHeight / 2 - (scrollView.contentOffset.y * imageScale);
imageY = (imageY < 0) ? (140 * imageScale) + ((imageY * -1) + (scrollView.contentOffset.y * imageScale)) : (140 * imageScale) - imageY;
float imageX = scaledImageWidth / 2 - (scrollView.contentOffset.x * imageScale);
imageX = (imageX < 0) ? (56 * imageScale) + ((imageX * -1) + (scrollView.contentOffset.x * imageScale)) : (56 * imageScale) - imageX;
// image = original unscaled UIImage
// create a new UIImage that matches the size of the scaled UIView we have been working with
UIGraphicsBeginImageContext(CGSizeMake(scaledImageWidth, scaledImageHeight));
[image drawInRect:CGRectMake(0, 0, scaledImageWidth, scaledImageHeight)];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// now with the UIImage that is the size we want, copy a piece of the image
CGImageRef imageRef = CGImageCreateWithImageInRect([newImage CGImage], CGRectMake(imageX, imageY, 210, 255));
UIImage *myThumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
This seems to work mostly OK. The problem I see is that when I make the final copy of just a piece of the image, the copy (myThumbnail) isn't scaled. However, the source (newImage) appears to scale without problems. Does anyone know what I'm missing, or whether there's a different approach to this problem?
Edit:
OK, I was a little off. The copy is scaling. The problem I'm having is that its position is off. If the image is positioned too far in one direction, the new copy won't cover the right area. For example, if I move the image so that I'm cropping the bottom left, it might give me a sliver of the right side instead of that bottom-left portion of the image.
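One common approach, offered here as a hedged sketch rather than a verified fix for the code above, is to skip the intermediate redraw and map the target box straight into the coordinates of the original image using the scroll view's zoomScale; `targetBox` mirrors the 56,140 origin and 210x255 size used above.

// Convert the on-screen target box into the unscaled image's coordinates.
CGRect targetBox = CGRectMake(56, 140, 210, 255);
CGFloat zoom = scrollView.zoomScale;
CGRect cropRect = CGRectMake((scrollView.contentOffset.x + targetBox.origin.x) / zoom,
                             (scrollView.contentOffset.y + targetBox.origin.y) / zoom,
                             targetBox.size.width / zoom,
                             targetBox.size.height / zoom);
// Crop directly from the original image.
CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
UIImage *thumbnail = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);

Because the crop rect is derived from contentOffset and zoomScale rather than hand-tuned offsets, it stays correct at any pan position, including the bottom-left case described above.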