How to reduce the size of an image captured from the camera - iPhone

The picture taken from the iPhone camera is nearly 2.5 MB. How can I reduce this size? I have tried
UIImageJPEGRepresentation(image, 0.1f), but it does not affect the size.

You really can't reduce the size an image takes up in memory.
When an image is loaded into a UIImage object, its size will be width x height x 4 bytes. That is the space an uncompressed image takes up in memory.
Even though you can use compressed image files, every image, once loaded into a UIImage, will be uncompressed.
If you really need to save some memory, save the image to disk and create a thumbnail to use in your app. Then, when needed, you can load the larger image and use it.
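A minimal sketch of that approach (the file name, JPEG quality, and thumbnail size here are placeholder choices, not from the answer):
NSData *jpegData = UIImageJPEGRepresentation(fullImage, 0.8);  // compressed copy for disk
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"full.jpg"];
[jpegData writeToFile:path atomically:YES];

// Keep only a small thumbnail resident in memory.
CGSize thumbSize = CGSizeMake(100.0, 100.0);
UIGraphicsBeginImageContextWithOptions(thumbSize, NO, 0.0);
[fullImage drawInRect:CGRectMake(0.0, 0.0, thumbSize.width, thumbSize.height)];
UIImage *thumbnail = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();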

Try using the resize methods in UIImage+Resize.h:
https://github.com/AliSoftware/UIImage-Resize
[aImgView setImage:[ImageObjectFromPicker resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                               bounds:YourSize
                                                 interpolationQuality:kCGInterpolationHigh]];

Related

How to shrink or manage an image's size in bytes

Python 3.6.6, Pillow 5.2.0
The Google Vision API has a size limit of 10485760 bytes.
When I'm working with a PIL Image and save it to bytes, it is hard to predict what the size will be. Sometimes when I resize it to a smaller height and width, the image's size in bytes actually gets bigger.
I've tried experimenting with modes and formats to understand their impact on size, but I'm not having much luck getting consistent results.
So I start out with rawImage, which is bytes obtained from a user uploading an image (meaning I don't know much about what I'm working with yet):
import io
import sys
from PIL import Image

rawImageSize = sys.getsizeof(rawImage)
if rawImageSize >= 10485760:
    imageToShrink = Image.open(io.BytesIO(rawImage))
    ## do something to the image here to shrink it
    # ... mystery code ...
    ## ideally, the minimum amount of shrinkage necessary to get it under 10485760
    rawBuffer = io.BytesIO()
    # possibly convert to RGB first
    shrunkImage.save(rawBuffer, format='JPEG')  # PNG files end up bigger after this resizing (!?)
    rawImage = rawBuffer.getvalue()
    print(sys.getsizeof(rawImage))
To shrink it I've tried getting a shrink ratio and then simply resizing it:
shrinkRatio = 10485760.0 / float(rawImageSize)
imageWidth, imageHeight = imageToShrink.size
shrunkImage = imageToShrink.resize((int(imageWidth * shrinkRatio),
                                    int(imageHeight * shrinkRatio)), Image.LANCZOS)
Of course, I could use a sufficiently small and somewhat arbitrary thumbnail size instead. I've thought about iterating over thumbnail sizes until one takes me below the maximum byte-size threshold. I'm guessing the byte size varies based on the color depth, mode, and format of whatever I got from the end user who uploaded the original image. And that brings me to my questions:
Can I predict the size in bytes a PIL Image will be before I convert it for consumption by Google Vision? What is the best way to manage that size in bytes before I convert it?
First of all, you probably don't need to go right up to the 10 MB limit imposed by the Google Vision API. In most cases a much smaller file will work just fine, and faster.
In addition, keep in mind that the aspect ratio might lead to different results. See https://www.mlreader.com/prepare-image-for-google-vision-api
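As for managing the byte size: compressed size depends on the image content, so it can only be measured after encoding, not predicted up front. A workable approach is to re-encode at decreasing JPEG quality and check the actual encoded bytes, falling back to halving the dimensions. A rough sketch under those assumptions (the function name and the quality/scale steps are mine, not from the post):
import io
from PIL import Image

MAX_BYTES = 10485760  # Google Vision API limit

def shrink_to_limit(raw_image, max_bytes=MAX_BYTES):
    # Hypothetical helper: encode, measure, and step quality/dimensions
    # down until the payload fits under the limit.
    image = Image.open(io.BytesIO(raw_image)).convert('RGB')
    while True:
        for quality in (95, 85, 75, 65, 50):
            buffer = io.BytesIO()
            image.save(buffer, format='JPEG', quality=quality)
            data = buffer.getvalue()
            # len() measures the payload itself; sys.getsizeof() adds object overhead
            if len(data) <= max_bytes:
                return data
        # Even quality 50 is too big: halve the dimensions and try again.
        width, height = image.size
        image = image.resize((max(1, width // 2), max(1, height // 2)), Image.LANCZOS)
In other words, you can't reliably predict the encoded size up front, but you can converge on it in a few iterations.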

memory issues with UIImage when we load 2000 images

I'm trying to run an animation by changing the images of my UIImageView. I need about 200 images of 24 KB each to create a 5-second animation. I am able to load all the images into memory (into an NSArray), but when I start the animation (switching the UIImage of the UIImageView), after about 60 images I get a memory warning, and if I continue displaying images the app crashes.
Just because your image files are 24 KB on disk doesn't mean that is the amount of memory they will take up.
If you have an image that is 480x960 with 1 byte per pixel, it may have only a small file size due to compression (JPEG, for example), but once it is in memory in your app it will be 450 KB. Multiply that by 60 (the point at which you get the memory warning) and you get roughly 27 MB.
If your images are larger, or have a greater colour depth, then obviously they will consume more memory. I think I read once that iOS gives you a memory warning when you hit 22 MB, but that includes other memory allocated to your app for other things as well.
And just because your app "loads" the images into the array doesn't mean they are actually decoded into memory; that doesn't happen until each one is really needed.
So, to calculate how much memory your images are going to use, don't look at the file size; work it out from the image dimensions instead.
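For example, assuming the common 4-bytes-per-pixel (RGBA) case:
// Rough decoded footprint: pixel width * pixel height * 4 bytes (RGBA assumed).
CGFloat scale = image.scale;
size_t decodedBytes = (size_t)(image.size.width * scale)
                    * (size_t)(image.size.height * scale)
                    * 4;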

Major speed issues with imageWithContentsOfFile

In my application I'm creating a large image dynamically and then loading it up for display in my image explorer class. Because I can't add new images to the bundle at run time, it seems I have to use imageWithContentsOfFile - however, this gives me major speed issues further down the line.
The way my image explorer works is that it takes in an image, splits it into tiles, caches those tiles, and then loads into memory only the tiles that need to be shown on screen. Using a bunch of NSLogs, I've managed to find out where all the slowdown is. It's not in the imageWithContentsOfFile function itself; it's when I call this line:
CGContextDrawImage(context_ref,
                   CGRectMake(0, 0, imgWidth, imgHeight), tileImage);
This is when I'm writing the tile to the cache file. tileImage is a CGImageRef that is returned from CGImageCreateWithImageInRect, which is how I get subsets of my larger image to save separately.
The odd thing is that splitting up a large image this way takes about 45 seconds (!), but when I split up an image from the bundle using imageNamed rather than imageWithContentsOfFile, it takes only about 2 seconds.
Anyone have any ideas? Thanks in advance :)
I think you should split up your image.
CGContextDrawImage takes the fully loaded tileImage: if tileImage is 8 MB, your app must load all 8 MB into memory, which takes a long time and can cause memory issues.
If you want to use a single big image and you can wait for it to load, one solution is to do the loading on another thread.
That avoids locking the UI while the big image loads.
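A sketch of that idea with GCD (the queue choice and variable names are assumptions, not from the answer):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Decode the big image off the main thread so the UI stays responsive.
    UIImage *bigImage = [UIImage imageWithContentsOfFile:path];
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = bigImage;  // UIKit calls belong on the main thread
    });
});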
An 8 MB JPEG will use well over 8 MB of memory once decoded; UIImage keeps an uncompressed representation for fast drawing.
imageNamed uses caching, and may reduce the amount of scaling work.
UIImage is immutable. imageNamed can take advantage of this and return a reference to a cached image rather than loading and creating a new one... wherever you load your image.
If you create images yourself, you can set up your own in-memory caching scheme and pass references around in many cases, then purge the cache when you receive a memory warning.
If you need to scale the image and the size is static, determine the size to draw and create a UIImage using imageWithCGImage:scale:orientation: -- or you can approach the problem in a similar way using Core Graphics APIs directly.
Beyond that, hold onto and reuse what you need, and use a profiler to balance your allocations and to measure timings.
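For instance, a sketch of the imageWithCGImage:scale:orientation: suggestion (the scale factor here is illustrative):
// Re-wrap an existing CGImage at a different scale; this changes the size UIKit
// reports and draws at, without allocating a new bitmap.
UIImage *scaled = [UIImage imageWithCGImage:original.CGImage
                                      scale:2.0
                                orientation:original.imageOrientation];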

UIImage allocates more memory

In my view controller I created a UIImageView and assigned an image in Interface Builder. While checking in Instruments, I see a 600 KB malloc whose responsible library is ImageIO_Malloc. But the size of my image is 37 KB. I don't know why it allocates 600 KB.
I have also tried assigning the image in code with UIImage imageNamed. Still no good.
Do you have any idea why?
600 kB is really not much to allocate for an image. Your 37 kB is just the size of the compressed image file. When the image needs to be displayed, the image view must allocate a backing store for it so it can be represented in an uncompressed format internally.
An image with dimensions of 640x480 pixels contains 307,200 pixels, each of which needs R, G, B, and possibly alpha values - 3-4 bytes per pixel, or roughly 1 MB decoded. So an allocation in the order of 600 kB for an even smaller image is entirely plausible.

iPhone: reduce image file size

Is there any way to reduce the image file size or the raw RGB buffer?
Actually, I have an RGB buffer of about 500 KB for a 320x420 image. I tried saving it to disk using UIImage and it comes to 240 KB.
I want the image to be less than 50 KB or so (losing quality is OK).
Is it possible ?
Thanks,
Raghu
See Trevor Harmon's excellent post on the subject.
The 240 KB size suggests the image is being saved with little or no compression (for reference, raw 320x420 RGB data is about 400 KB uncompressed). Doesn't the iPhone have a PNG or JPEG exporter? The usual advice is to use UIImageJPEGRepresentation or UIImagePNGRepresentation together with NSData's writeToFile:atomically:.
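Along those lines, a sketch that lowers the JPEG quality until the data fits the ~50 KB budget (the quality steps and file name are my own choices):
// Try decreasing JPEG quality until the encoded data fits under 50 KB.
NSData *jpegData = nil;
for (CGFloat quality = 0.7; quality >= 0.1; quality -= 0.1) {
    jpegData = UIImageJPEGRepresentation(image, quality);  // lower quality -> smaller file
    if (jpegData.length <= 50 * 1024) break;
}
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"small.jpg"];
[jpegData writeToFile:path atomically:YES];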