Swift 5: Resize image to pixel size before upload

I have a web service that I send images to from iOS. Before sending them, I want to reduce the images to a maximum width of 1024 px to save bandwidth.
I have tried:
let size = CGSize(width: 1024, height: 768)
let renderer = UIGraphicsImageRenderer(size: size)
// https://nshipster.com/image-resizing/
let smallImage = renderer.image { _ in
    uiImage.draw(in: CGRect(origin: .zero, size: size))
}
but that always produces different output pixel sizes. What do I have to do to make sure that the output image does not exceed a width of 1024 px?

This seems to work fine; you need to account for the screen scale, since UIGraphicsImageRenderer renders at the device's scale by default:
let scale = UIScreen.main.scale
// The renderer multiplies point sizes by the screen scale,
// so divide the target pixel size by the scale to end up at 1024x768 pixels.
let size = CGSize(width: 1024 / scale, height: 768 / scale)
let renderer = UIGraphicsImageRenderer(size: size)
let smallImage = renderer.image { _ in
    uiImage.draw(in: CGRect(origin: .zero, size: size))
}
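An alternative sketch (not from the original answer, though it uses the same UIKit API): pin the renderer format's scale to 1 so that points map 1:1 to pixels, independent of the device:
let format = UIGraphicsImageRendererFormat()
format.scale = 1  // 1 point == 1 pixel in the output, on any device
let size = CGSize(width: 1024, height: 768)
let renderer = UIGraphicsImageRenderer(size: size, format: format)
let smallImage = renderer.image { _ in
    uiImage.draw(in: CGRect(origin: .zero, size: size))
}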

Related

UIGraphicsImageRenderer producing Larger file size for Smaller image?

I am resizing an image, and in doing so I find that the reduced image produces a larger file than the original, for the same source image.
func imageResizedForSMS() -> UIImage? {
    guard let image = self.image else {
        print("No image to resize")
        return nil
    }
    let maxDimension: CGFloat = 1280
    guard max(image.size.width, image.size.height) > maxDimension else {
        print("Original image within proper sizing")
        return image
    }
    let scaleRatio = min(image.size.width, image.size.height) / max(image.size.width, image.size.height)
    let scaledTarget = maxDimension * scaleRatio
    let targetSize = CGSize(
        width: image.size.width >= image.size.height ? maxDimension : scaledTarget,
        height: image.size.height >= image.size.width ? maxDimension : scaledTarget
    )
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    let scaledImage = renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
    let imageData = scaledImage.jpegData(compressionQuality: 1.0)
    let imageSize = imageData?.count ?? 0
    print("Size of resized image = \(Double(imageSize) / 1000.0) KB")
    let imageData2 = image.jpegData(compressionQuality: 1.0)
    let imageSize2 = imageData2?.count ?? 0
    print("Size of original image = \(Double(imageSize2) / 1000.0) KB")
    return scaledImage
}
The original image dimensions are 4032×3024 and the resized dimensions are 1280×960. However, the resized file size is 11733.6 KB while the original file size is 7088.8 KB.
From the debugger log:
Size of resized image = 11733.636 KB
Size of original image = 7088.865 KB
Printing description of image:
<UIImage:0x283978b40 anonymous {4032, 3024} renderingMode=automatic>
Printing description of scaledImage:
<UIImage:0x28394d320 anonymous {1280, 960} renderingMode=automatic>
How is it that a reduction in image dimensions produces a much larger file size?
As usual, I figured it out shortly after posting the question, no matter how much time was spent prior to posting...
The issue appears to be the display scale. If I set a trait with a displayScale of 1 and then use the code above, the file size is indeed much reduced: it drops from the previous 11733 KB to 1474 KB, which is much smaller than the original 7088 KB.
The updated portion of the code above that fixes this issue:
let format = UIGraphicsImageRendererFormat(for: UITraitCollection(displayScale: 1))
let renderer = UIGraphicsImageRenderer(size: targetSize, format: format)
let scaledImage = renderer.image { _ in
    image.draw(in: CGRect(origin: .zero, size: targetSize))
}
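A side note beyond the original answer: compressionQuality 1.0 produces a near-lossless JPEG, which is typically heavier per pixel than a camera original. If the goal is a smaller file, a lower quality is usually worth trying; the 0.8 below is an assumed value to tune:
// 0.8 is an assumed quality, not from the original post
let uploadData = scaledImage.jpegData(compressionQuality: 0.8)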

Creating thumbnail -> ugly quality (Swift - preparingThumbnail)

This is how I create a thumbnail from Data:
let image = UIImage(data: data)!
    .preparingThumbnail(of: .init(width: size, height: size))!
try image.pngData()!.write(to: url)
The data variable contains the original image, which looks good, but I want to create thumbnails to show in lists.
The size variable holds a value equal to the height of my Image in SwiftUI. The problem is that the result looks horrible:
Thumbnail: (screenshot omitted)
Original: (screenshot omitted)
The 'thumbnail' is the same size as the image above; it really looks that bad on the device, and it is not stretched out. What is the correct way to create a thumbnail of the same quality in iOS 15.0+?
Have you tried taking the aspect ratio into account as well, instead of just the size? Pass in your image (the let image = UIImage(data: data)! from above) and see if that works:
func resizeImageWithAspect(image: UIImage, scaledToMaxWidth width: CGFloat, maxHeight height: CGFloat) -> UIImage? {
    let oldWidth = image.size.width
    let oldHeight = image.size.height
    // Scale by whichever dimension is larger, so the result fits within width x height
    let scaledBy = (oldWidth > oldHeight) ? width / oldWidth : height / oldHeight
    let newHeight = oldHeight * scaledBy
    let newWidth = oldWidth * scaledBy
    let newSize = CGSize(width: newWidth, height: newHeight)
    // Render at screen scale so the thumbnail stays sharp on the device
    UIGraphicsBeginImageContextWithOptions(newSize, false, UIScreen.main.scale)
    image.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
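Another option, beyond the answer above (a sketch; the thumbnail(from:maxPixelSize:) helper name is an assumption): ImageIO can decode a downsampled thumbnail directly from the encoded data, which tends to preserve quality better than resizing a fully decoded UIImage:
import ImageIO
import UIKit

// Sketch: build a thumbnail straight from encoded image data via ImageIO.
// `maxPixelSize` is the longest side of the result, in pixels.
func thumbnail(from data: Data, maxPixelSize: Int) -> UIImage? {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,  // honor EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}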

Unable to change image size in my draw function

In my meme application, I want the user to be able to put text on an image. To make this possible I am using UIGraphics. In my scene, I have an image drawn in the background and text showing in the foreground.
Here is my code for generating said image:
func GenerateMemeFrom(image: UIImage, topText: NSString, bottomText: NSString) -> UIImage {
    let scale: CGFloat = 4
    let size = CGSize(width: UIScreen.main.bounds.size.width * scale, height: UIScreen.main.bounds.size.height * scale)
    UIGraphicsBeginImageContext(size)
    let textAttributes: [NSAttributedString.Key: Any] = [
        .font: UIFont(name: "HelveticaNeue-CondensedBlack", size: 200)!,
        .foregroundColor: UIColor.white,
    ]
    let ratio = UIScreen.main.bounds.size.width / image.size.width
    let imageSize = CGSize(width: UIScreen.main.bounds.size.width * scale, height: image.size.height * ratio * scale)
    image.draw(in: CGRect(origin: .zero, size: imageSize))
    let point = CGPoint(x: UIScreen.main.bounds.size.width * scale / 2, y: 100)
    let rect = CGRect(origin: point, size: image.size)
    topText.draw(in: rect, withAttributes: textAttributes)
    let generatedImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return generatedImage!
}
Here is what I have on my screen: (screenshot omitted)
But whether I set scale to 2 or 4, nothing changes in the application. I want to be able to change the size based on the screen width. Furthermore, I want the image to be centered. Is there any way to do so?
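One possible direction (a sketch, not an answer from the thread; the makeMeme name, font, and text rect are assumptions): render with UIGraphicsImageRenderer, aspect-fit and center the image on the canvas, and center the text with a paragraph style:
import UIKit

// Sketch: draw `image` centered on a canvas and center `topText` across the full width.
func makeMeme(image: UIImage, topText: NSString, canvasSize: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: canvasSize)
    return renderer.image { _ in
        // Aspect-fit the image, then center it on the canvas
        let ratio = min(canvasSize.width / image.size.width, canvasSize.height / image.size.height)
        let drawSize = CGSize(width: image.size.width * ratio, height: image.size.height * ratio)
        let origin = CGPoint(x: (canvasSize.width - drawSize.width) / 2,
                             y: (canvasSize.height - drawSize.height) / 2)
        image.draw(in: CGRect(origin: origin, size: drawSize))

        // Center the text horizontally via a paragraph style
        let paragraph = NSMutableParagraphStyle()
        paragraph.alignment = .center
        let attributes: [NSAttributedString.Key: Any] = [
            .font: UIFont.boldSystemFont(ofSize: 48),  // assumed font and size
            .foregroundColor: UIColor.white,
            .paragraphStyle: paragraph
        ]
        topText.draw(in: CGRect(x: 0, y: 20, width: canvasSize.width, height: 120),
                     withAttributes: attributes)
    }
}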

How to resize/scale a UIImageView in a text view like scaleAspectFit in Swift?

Hey, I create a text view and I can add images to it. The image's width equals the text view's width, but I want to give this image view a maximum height and show the image like content mode scaleAspectFit. Instead, it shows up stretched (compressed, aspect-fill). How can I solve this? Code below:
let image = UIImageView()
image.contentMode = .scaleAspectFit
let imageAttachment = NSTextAttachment()
let newImageWidth = self.textView.bounds.width
let newImageHeight = 200
imageAttachment.bounds = CGRect(x: 0, y: 0, width: Int(newImageWidth), height: newImageHeight)
imageAttachment.image = image.image
This is how you would calculate the new height for an aspectFit ratio:
// don't use "image" ... that's confusing
let imageView = UIImageView()

// assuming you set the image here
imageView.image = UIImage(named: "myImage")

guard let imgSize = imageView.image?.size else {
    // this will happen if you haven't set the image of the imageView
    fatalError("Could not get size of image!")
}

let imageAttachment = NSTextAttachment()
let newWidth = self.textView.bounds.width

// get the scale of the difference in width
let scale = newWidth / imgSize.width

// multiply image height by scale to get aspectFit height
let newHeight = imgSize.height * scale

imageAttachment.bounds = CGRect(x: 0, y: 0, width: newWidth, height: newHeight)
imageAttachment.image = imageView.image
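To actually display it (a usage sketch, assuming textView is the UITextView from the question): wrap the attachment in an attributed string and append it to the text view's storage:
let attachmentString = NSAttributedString(attachment: imageAttachment)
textView.textStorage.append(attachmentString)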

White Space Around Resized Image

I'm trying to resize an image while preserving its aspect ratio, but there is white space surrounding the resized image.
extension NSImage {
    func resizeTo(width: CGFloat, height: CGFloat) -> NSImage {
        let ratioX = width / size.width
        let ratioY = height / size.height
        let ratio = ratioX < ratioY ? ratioX : ratioY
        let newHeight = size.height * ratio
        let newWidth = size.width * ratio
        let canvasSize = CGSize(width: width, height: CGFloat(ceil(width / size.width * size.height)))
        let img = NSImage(size: canvasSize)
        img.lockFocus()
        let context = NSGraphicsContext.current
        context?.imageInterpolation = .high
        draw(in: NSRect(origin: .zero, size: NSSize(width: newWidth, height: newHeight)),
             from: NSRect(origin: .zero, size: size),
             operation: .copy,
             fraction: 1)
        img.unlockFocus()
        return img
    }
}
What am I doing wrong?
First, you are picking the smaller of ratioX and ratioY, but later you create the canvas using only ratioX (a canvas of size width × ceil(width / size.width * size.height)). You need to create the canvas using newWidth and newHeight instead:
let canvasSize = CGSize(width: newWidth, height: newHeight)
Additionally, be aware that this code will resize in both directions, i.e. it will also enlarge images smaller than the target size.
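A consolidated sketch of the fix (the resizedToFit name and the min(..., 1) clamp against upscaling are additions, not from the thread):
import AppKit

extension NSImage {
    // Aspect-fit resize onto a canvas that matches the scaled image exactly,
    // so no white space is left around it.
    func resizedToFit(width: CGFloat, height: CGFloat) -> NSImage {
        let ratio = min(width / size.width, height / size.height, 1)  // never upscale (assumption)
        let newSize = NSSize(width: size.width * ratio, height: size.height * ratio)
        let img = NSImage(size: newSize)
        img.lockFocus()
        NSGraphicsContext.current?.imageInterpolation = .high
        draw(in: NSRect(origin: .zero, size: newSize),
             from: NSRect(origin: .zero, size: size),
             operation: .copy,
             fraction: 1)
        img.unlockFocus()
        return img
    }
}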