How to resize an image from the library of UIImagePickerController - Swift

How do I resize an image from UIImagePickerController? If the width, the height, or both exceed 1500, both dimensions must be reduced to 65% of their original size. For example, if the photo is 2000x3000, after resizing both dimensions should be multiplied by 0.65, giving 1300x1950.
If neither the width nor the height exceeds 1500, the photo should remain unchanged.

Here is code that resizes based on a maximum size and a scale factor:
extension UIImage {
    func resizeImage(maxSize: CGFloat, scale: CGFloat) -> UIImage {
        let size = self.size
        // Only scale down if either dimension exceeds the maximum.
        if size.width > maxSize || size.height > maxSize {
            let newSize = CGSize(width: size.width * scale, height: size.height * scale)
            let rect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height)
            UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
            self.draw(in: rect)
            let newImage = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return newImage!
        }
        return self
    }
}

let resizedImage = yourImage.resizeImage(maxSize: 1500, scale: 0.65)
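For context, here is a sketch of calling this from the picker delegate. The delegate method itself is standard UIKit, but `MyViewController` and its `imageView` are illustrative assumptions, not part of the original answer:
extension MyViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        defer { picker.dismiss(animated: true) }
        guard let picked = info[.originalImage] as? UIImage else { return }
        // 1500 and 0.65 come from the question's requirement.
        let resized = picked.resizeImage(maxSize: 1500, scale: 0.65)
        imageView.image = resized // `imageView` is assumed to exist on MyViewController
    }
}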

Related

Creating thumbnail -> ugly quality (Swift - preparingThumbnail)

This is how I create a thumbnail from Data:
let image = UIImage(data: data)!
    .preparingThumbnail(of: .init(width: size, height: size))!
try image.pngData()!.write(to: url)
The data variable contains the original image. That looks good, but I want to create thumbnails to display in lists.
The size variable holds a value equal to the height of my Image in SwiftUI. The problem is that the result looks horrible:
Thumbnail:
Original:
The 'thumbnail' is the same size as the image above; it really looks that bad on the device, it is not stretched out. What is the correct way to create a thumbnail of the same quality in iOS 15.0+?
Have you tried taking the aspect ratio into account as well, instead of just the size? Pass in your image (your let image = UIImage(data: data)!) and see if that works:
func resizeImageWithAspect(image: UIImage, scaledToMaxWidth width: CGFloat, maxHeight height: CGFloat) -> UIImage? {
    let oldWidth = image.size.width
    let oldHeight = image.size.height
    // Scale by the dominant dimension so the aspect ratio is preserved.
    let scaledBy = (oldWidth > oldHeight) ? width / oldWidth : height / oldHeight
    let newHeight = oldHeight * scaledBy
    let newWidth = oldWidth * scaledBy
    let newSize = CGSize(width: newWidth, height: newHeight)
    // Use the screen scale so the thumbnail is rendered at full pixel density.
    UIGraphicsBeginImageContextWithOptions(newSize, false, UIScreen.main.scale)
    image.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
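A possible way to wire this into the question's code, assuming the same `data`, `size`, and `url` variables; the integration itself is a sketch, not code from the original answer:
// Build the thumbnail from the original Data instead of preparingThumbnail.
if let original = UIImage(data: data),
   let thumbnail = resizeImageWithAspect(image: original, scaledToMaxWidth: size, maxHeight: size) {
    try thumbnail.pngData()?.write(to: url)
}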

Get preview image from dae file, to display in UIImage

I'm building an augmented reality app and would like to show preview images of the available DAE files.
Is it possible to use Swift to extract image data from a DAE file and load that into a UIImage?
Out of the box there is no function you can call that will pull a screenshot from a .dae file. That's not to say there isn't a way to do this on device.
To understand what you are asking, let's consider the different components.
The .dae file is loaded into an ARSCNView, which can display .dae content. The ARSCNView is traditionally added as a subview of your view controller. In this instance you need the view to render the 3D file offscreen, so create an ARSCNView but do not add it as a subview of the view controller (I would add it during initial setup to make sure it all loads, but then comment that line out).
Once your 3D model is loaded into the ARSCNView, use the function below to capture the view's contents into an image:
let viewSnapshot = takeSnapshotOfView(view: yourARSCNView)
and to capture the contents and resize the result:
import UIKit
import CoreGraphics

func takeSnapshotOfView(view: UIView) -> UIImage? {
    UIGraphicsBeginImageContext(CGSize(width: view.frame.size.width, height: view.frame.size.height))
    view.drawHierarchy(in: CGRect(x: 0.0, y: 0.0, width: view.frame.size.width, height: view.frame.size.height), afterScreenUpdates: true)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    guard let snapshot = image else { return nil }
    // Resize to a 300-point-wide preview before returning, keeping the aspect ratio.
    let previewHeight = snapshot.size.height / snapshot.size.width * 300
    return snapshot.resize(CGRect(x: 0, y: 0, width: 300, height: previewHeight))
}

extension UIImage {
    func resize(_ toSize: CGRect) -> UIImage {
        let size = self.size
        let widthRatio = toSize.width / size.width
        let heightRatio = toSize.height / size.height
        // Use the smaller ratio so the resized image fits inside toSize.
        var newSize: CGSize
        if widthRatio > heightRatio {
            newSize = CGSize(width: size.width * heightRatio, height: size.height * heightRatio)
        } else {
            newSize = CGSize(width: size.width * widthRatio, height: size.height * widthRatio)
        }
        let rect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height)
        UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
        self.draw(in: rect)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}
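Putting it together, here is a sketch of the offscreen setup described above; the 600x600 frame and the "model.dae" file name are placeholders, not from the original answer:
import ARKit
import SceneKit

// Create the ARSCNView but do not add it as a subview; it only renders the
// scene so it can be snapshotted.
let sceneView = ARSCNView(frame: CGRect(x: 0, y: 0, width: 600, height: 600))
if let scene = SCNScene(named: "model.dae") {
    sceneView.scene = scene
}
let previewImage = takeSnapshotOfView(view: sceneView)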

Resize UIImage Keep Image Quality in Swift

I have created an image from a UIView using UIGraphicsGetCurrentContext. It works fine, but when I resize that image to a larger size it becomes blurred and the quality is bad. Is there any way to keep the quality when resizing? I have tried many ways but nothing works.
The code that captures the image:
func image(with view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.isOpaque, 0.0)
    defer { UIGraphicsEndImageContext() }
    if let context = UIGraphicsGetCurrentContext() {
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
        context.setAllowsAntialiasing(true)
        context.setShouldAntialias(true)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        return image
    }
    return nil
}
The code I use to resize, as an extension of UIImage:
func resizedImage(newSize: CGSize) -> UIImage {
    // Guard newSize is different
    guard self.size != newSize else { return self }
    let aspectRatio = self.size.width / self.size.height
    let finalSize = CGSize(width: newSize.height * aspectRatio, height: newSize.height)
    UIGraphicsBeginImageContextWithOptions(finalSize, false, 0.0)
    self.draw(in: CGRect(x: 0, y: 0, width: finalSize.width, height: finalSize.height))
    let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return newImage
}
And here is the example image:
Thanks
Try this:
extension UIImage {
    func resizeImage(targetSize: CGSize) -> UIImage {
        let size = self.size
        let widthRatio = targetSize.width / size.width
        let heightRatio = targetSize.height / size.height
        let newSize = widthRatio > heightRatio
            ? CGSize(width: size.width * heightRatio, height: size.height * heightRatio)
            : CGSize(width: size.width * widthRatio, height: size.height * widthRatio)
        let rect = CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height)
        UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
        self.draw(in: rect)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}
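For example, a usage sketch that combines the two pieces above; `someView` and the 2x target size are assumptions for illustration, not from the original answer:
// Capture the view, then scale the snapshot up to twice its size (sketch only).
if let snapshot = image(with: someView) {
    let doubled = CGSize(width: snapshot.size.width * 2, height: snapshot.size.height * 2)
    let enlarged = snapshot.resizeImage(targetSize: doubled)
    // `enlarged` can then be displayed or saved.
}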

Keep the same image quality after converting in Swift

I am referring to ZImageCropper for my project to crop an image. However, it reduces the quality of the image during conversion.
I know one of the reasons the quality of the image is reduced is the call to UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, 1).
I have tried changing it to UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, 0) and UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, UIScreen.main.scale).
I based my code on "Resize and Crop 2 Images affected the original image quality". It does help, but when I check the quality of the image it is still reduced. Is there any way to prevent the reduction in image quality?
In iOS 10 and above you can resize any UIImage without quality loss. Hope this helps:
extension UIImage {
    func resize(targetSize: CGSize) -> UIImage {
        if #available(iOS 10.0, *) {
            // UIGraphicsImageRenderer picks the device scale automatically.
            return UIGraphicsImageRenderer(size: targetSize).image { _ in
                self.draw(in: CGRect(origin: .zero, size: targetSize))
            }
        } else {
            return resizeImage(maxSize: targetSize.width)
        }
    }

    func resizeImage(maxSize: CGFloat) -> UIImage {
        var newWidth = size.width
        var newHeight = size.height
        if size.width >= maxSize {
            newWidth = min(maxSize, size.width)
            newHeight = maxSize * size.height / size.width
        } else if size.height >= maxSize {
            newHeight = min(maxSize, size.height)
            newWidth = maxSize * size.width / size.height
        }
        UIGraphicsBeginImageContext(CGSize(width: newWidth, height: newHeight))
        draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}
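A possible call site, assuming `croppedImage` is the image that came back from ZImageCropper; the name and the 1080x1080 target are illustrative:
// Resize the cropped image with the renderer-based path above.
let resized = croppedImage.resize(targetSize: CGSize(width: 1080, height: 1080))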

How to take a snapshot of the UIImage of a UIImageView that is in scaleAspectFit mode?

I have an image in a UIImageView:
imageView.contentMode = .scaleAspectFit
imageView.backgroundColor = .red
Because of .scaleAspectFit the image view has some red borders, and that's OK:
The user can add some UIViews, such as labels or images, over the imageView.
In the final step I use the following code to save the edited image so the user can share it or save it to the photo library:
private func generateImage() -> UIImage? {
    var finalImage: UIImage?
    UIGraphicsBeginImageContextWithOptions(CGSize(width: imageView.frame.size.width, height: imageView.frame.size.height), true, 0)
    imageView.drawHierarchy(in: CGRect(x: 0, y: 0, width: imageView.frame.size.width, height: imageView.frame.size.height), afterScreenUpdates: true)
    finalImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return finalImage
}
The problem is that the finalImage still has the red borders from imageView.
You can get the CGRect of the UIImage displayed in a UIImageView in aspect-fit content mode. Create an extension of UIImageView like this:
extension UIImageView {
    var contentClippingRect: CGRect {
        guard let image = image else { return bounds }
        guard contentMode == .scaleAspectFit else { return bounds }
        guard image.size.width > 0 && image.size.height > 0 else { return bounds }
        // Aspect fit uses the smaller of the two ratios so the whole image fits.
        let scale = min(bounds.width / image.size.width, bounds.height / image.size.height)
        let size = CGSize(width: image.size.width * scale, height: image.size.height * scale)
        let x = (bounds.width - size.width) / 2.0
        let y = (bounds.height - size.height) / 2.0
        return CGRect(x: x, y: y, width: size.width, height: size.height)
    }
}
You can now use imageView.contentClippingRect to read the position and size of the image inside the image view.
You have to make minor changes in your method: draw using the appropriate bounds, i.e. contentClippingRect.
Let me know in case of any queries.
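For example, here is a sketch of how the question's generateImage() could use contentClippingRect; this integration is my assumption, not code from the original answer:
private func generateImage() -> UIImage? {
    // Render only the area where the image is actually displayed,
    // so the red aspect-fit borders are left out.
    let clippingRect = imageView.contentClippingRect
    UIGraphicsBeginImageContextWithOptions(clippingRect.size, true, 0)
    imageView.drawHierarchy(in: CGRect(x: -clippingRect.origin.x,
                                       y: -clippingRect.origin.y,
                                       width: imageView.frame.size.width,
                                       height: imageView.frame.size.height),
                            afterScreenUpdates: true)
    let finalImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return finalImage
}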
UPDATE
Please try this UIImageView+Extension; it might help you. It is Objective-C code, so convert it to Swift.
You can try this as well:
import AVFoundation

let image = #imageLiteral(resourceName: "Cat03")
let rect: CGRect = AVMakeRect(aspectRatio: image.size, insideRect: imageView1.frame)
print(rect)
The code above gives you the size and position of the displayed image.