I'm trying to apply the cameraViewTransform from my UIImagePickerController to the UIImagePickerControllerOriginalImage (the UIImage that comes back from the camera). The only way I have managed to do this is by rendering the UIImage into a UIImageView and then using UIGraphicsBeginImageContext() to take a screenshot of that UIImageView.
The problem is that rendering the UIImagePickerControllerOriginalImage into a UIImageView, zooming it, and taking a screenshot takes a long time to process and also freezes the iPhone. (That's because the UIImage is at full resolution, and I want it to stay at full resolution.)
My question is: is there a better way to apply the cameraViewTransform to the UIImagePickerControllerOriginalImage? Or is there a faster way to render my image into the image view?
Thank you so much guys for your time, I really appreciate it!!! :)
Related
I am trying to create an animation using a UIImageView with an array of images. It works great, except that I need a different delay between some of the frames. In an animated GIF you can set the delay between frames, but every example I see of an animated UIImageView has a fixed delay between images.
Does anyone know how I can set a different delay between frames? Or, is there an example of such?
Maybe UIImageView is not the right thing to use, so if anyone has an alternative please let me know.
I have posted a similar question in other forums and no one seems to be able to answer this. It seems like it should be doable, since .gif images have had this forever.
Thanks
It may seem hacky, but:
From UIImageView documentation:
animationImages
An array of UIImage objects to use for an animation.
@property(nonatomic, copy) NSArray *animationImages
Discussion
The array must contain UIImage objects. You may use the same image object more than once in the array. Setting this property to a value other than nil hides the image represented by the image property. The value of this property is nil by default.
So to give one frame a longer delay, add that UIImage to the array several times.
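For example, a minimal sketch, assuming imageView is an existing UIImageView and the frame image names are hypothetical: repeating the middle frame keeps it on screen longer than the others.

// Hypothetical frame images; imageView is an existing UIImageView.
UIImage *frame1 = [UIImage imageNamed:@"frame1.png"];
UIImage *frame2 = [UIImage imageNamed:@"frame2.png"];
UIImage *frame3 = [UIImage imageNamed:@"frame3.png"];

// frame2 appears three times, so it stays on screen three times as long.
imageView.animationImages = [NSArray arrayWithObjects:
                             frame1, frame2, frame2, frame2, frame3, nil];
// The duration is split evenly across the 5 entries (0.2 s each here),
// so frame2 is effectively shown for 0.6 s.
imageView.animationDuration = 1.0;
imageView.animationRepeatCount = 0; // 0 means loop forever
[imageView startAnimating];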
UIImageView does not support different delays between frames. You will have to use multiple UIImageViews and manage the transitions between them manually if you want different delays between frames (a rough sketch of driving frames manually follows the next paragraph).
Another approach would be to use a UIWebView which is sized exactly to fit a .gif image that has the delay settings you want.
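As a rough illustration of managing the frames manually yourself (this drives a single UIImageView rather than several, and the frames, delays, and property names are hypothetical):

// self.frames is an NSArray of UIImage; self.delays is an NSArray of NSNumber (seconds).
// Each frame schedules the next one after its own delay.
- (void)showFrameAtIndex:(NSNumber *)indexNumber {
    NSUInteger index = [indexNumber unsignedIntegerValue];
    self.imageView.image = [self.frames objectAtIndex:index];

    NSUInteger nextIndex = (index + 1) % [self.frames count];
    NSTimeInterval delay = [[self.delays objectAtIndex:index] doubleValue];
    [self performSelector:@selector(showFrameAtIndex:)
               withObject:[NSNumber numberWithUnsignedInteger:nextIndex]
               afterDelay:delay];
}

Start it with [self showFrameAtIndex:[NSNumber numberWithUnsignedInteger:0]]; and cancel the pending perform when the view goes away.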
You would have to use separate UIImageViews to get this working. Have a look at this tutorial; it might be helpful for you: http://www.raywenderlich.com/2454/how-to-use-uiview-animation-tutorial
If you would like to try my animation library in your iOS app, it offers a complete solution to this problem. For example, take a look at the APNG app, a free app in the iTunes store that displays animated APNG files. That app was created with my AVAnimator library, which contains APNG and GIF decoding support. The APNG format works just like a GIF in the sense that each frame can have its own delay time; you would use existing software to create the APNG in this case. But it is easier to just encode a new .mvid file (a custom video file format) from a series of PNG images. To create a longer delay, you simply repeat frames that do not change in the input PNG image series. Either approach can be used to create an input movie that displays with a variable amount of time between specific frames (the overall FPS stays the same, but specific frames can simply repeat so that they do not change for, say, N frames in a row). You can also implement specific user interactions such as starting an animation, pausing on a specific frame, and then starting again from that same frame.
Is there a way to apply adjustments like brightness, contrast, sharpness, hue, saturation, and exposure to UIImages? For example, I have a UIImageView and inside that I have a UIImage. I want to manipulate its brightness, contrast, or sharpness with the help of a UISlider.
Have a look at the sample app GLImageProcessing. It uses OpenGL ES to do the image processing, but it is really fast.
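If you would rather not drop down to OpenGL, a minimal Core Image sketch (iOS 5 and later) wired to a UISlider might look like the following; CIColorControls and its input keys are the real Core Image API, but the originalImage/imageView properties and the slider wiring are assumptions:

// Assumes CoreImage.framework is linked; originalImage and imageView are hypothetical properties.
- (void)brightnessSliderChanged:(UISlider *)slider {
    CIImage *input = [CIImage imageWithCGImage:self.originalImage.CGImage];

    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
    [filter setValue:input forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:slider.value] forKey:@"inputBrightness"]; // roughly -1 .. 1
    [filter setValue:[NSNumber numberWithFloat:1.0f] forKey:@"inputContrast"];
    [filter setValue:[NSNumber numberWithFloat:1.0f] forKey:@"inputSaturation"];

    // In a real app you would create the CIContext once and reuse it;
    // recreating it on every slider change is slow.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:filter.outputImage
                                       fromRect:[filter.outputImage extent]];
    self.imageView.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}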
I have a 5 MP image coming from the back camera. I want to downsize it to put it into a UIImageView without having to wait too long (it takes a long time for a UIImageView to display a 5 MP picture). So I have tried many methods to resize/resample the image to fit the screen's Retina resolution, but they all take around 1 second to downsize the image.
Do you know of an optimised way to resize this 5 MP image to the Retina 960 x 640 resolution as fast as possible, and at least in under 0.7 s on an iPhone 4?
This is my favorite blog post and code to resize images. I haven't benchmarked it but I know of no faster way.
http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
You're displaying an image in a UIImageView, and I understand you want a version that matches the size of the screen. The way I would do it: display the original image in the UIImageView directly; this shouldn't take a noticeable amount of time.
Then in a background thread resize the same image down to the size you want and update the UIImageView as soon as that is done.
This gives you something on screen immediately while you generate an optimized version at the same time.
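A rough sketch of that pattern, assuming originalImage is the 5 MP UIImage and the target size is a placeholder (aspect-ratio handling omitted for brevity):

// Show the full-resolution image immediately.
self.imageView.image = originalImage;

// Resize off the main thread; UIKit bitmap-context drawing is thread-safe on iOS 4 and later.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    CGSize targetSize = CGSizeMake(640.0f, 960.0f); // placeholder: screen size in pixels

    UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0f);
    [originalImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Swap in the screen-sized copy on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = resized;
    });
});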
I don't know if you are using the original image for something else, but if you really only want the screen sized image it might be an idea to use an AVCaptureSession with a preset of AVCaptureSessionPresetHigh (see AVCaptureSession sessionPreset). This setting results in a more reasonably sized image, which might better suit your needs.
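A minimal sketch of that capture setup (device selection and error handling trimmed; the preset name and classes are the standard AVFoundation API of that era):

#import <AVFoundation/AVFoundation.h>

// Capture stills at AVCaptureSessionPresetHigh instead of the full-resolution photo preset.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) {
    [session addInput:input];
}

AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
[session addOutput:stillOutput];
[session startRunning];
// Grab a photo with captureStillImageAsynchronouslyFromConnection:completionHandler: when needed.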
I am using the following code to resize an image, and it all works well and as expected...
Resize UIImage the right way
I use kCGInterpolationLow as the interpolation quality, and UIImageJPEGRepresentation(image, 0.0) to get the NSData for that image.
The problem is that the image is still fairly large, at around 100 KB.
My question is: can I reduce it further?
The images originate from the iPhone Photo Album and are selected via a UIImagePickerController.
Many thanks,
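For reference, a minimal sketch of the setup the question describes, with an illustrative target size and a hypothetical pickedImage from the picker:

// pickedImage is the UIImage from UIImagePickerController (hypothetical name).
CGSize targetSize = CGSizeMake(320.0f, 480.0f); // illustrative target size

UIGraphicsBeginImageContextWithOptions(targetSize, YES, 1.0f);
CGContextSetInterpolationQuality(UIGraphicsGetCurrentContext(), kCGInterpolationLow);
[pickedImage drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// 0.0 is already the most aggressive JPEG compression; reducing the pixel
// dimensions further is usually what shrinks the resulting NSData the most.
NSData *jpegData = UIImageJPEGRepresentation(resized, 0.0f);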
I am trying to improve the scrolling performance of a UITableView. I am loading the images using imageNamed: and dispatch_async, and scrolling is smooth once the images have been loaded into the cells. I would like to fade the images in only if they are not already in the system cache, to reduce the jarring effect of the images "popping" into view.
Is there a way to know if an image is already in the system cache?
There is no documented way to look inside the UIImage to check such things.
I think the only way to know for sure that the image is available immediately is to force the UIImage to be loaded. This can be done in the background by creating the UIImage and accessing its pixels using CGImage functions. If you ensure that no rescaling is needed (i.e. don't put a 3000x2000 image in a 30x20 space), then it should display without a glitch.
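A rough sketch of that forced decode, run off the main thread (the method name and completion block are illustrative):

// Force decoding by drawing the image into a bitmap context on a background queue.
- (void)loadDecodedImageNamed:(NSString *)name completion:(void (^)(UIImage *decoded))completion {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        CGImageRef cgImage = [UIImage imageNamed:name].CGImage;
        size_t width = CGImageGetWidth(cgImage);
        size_t height = CGImageGetHeight(cgImage);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
                                                     kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Host);
        CGColorSpaceRelease(colorSpace);

        // Drawing decompresses the PNG/JPEG data here, off the main thread,
        // instead of on the main thread when the image first appears on screen.
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
        CGImageRef decodedRef = CGBitmapContextCreateImage(context);
        UIImage *decoded = [UIImage imageWithCGImage:decodedRef];
        CGImageRelease(decodedRef);
        CGContextRelease(context);

        dispatch_async(dispatch_get_main_queue(), ^{
            completion(decoded); // hand the ready-to-display image back to the caller
        });
    });
}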