Is there a way to manipulate UIImage in iPhone SDK - iphone

Is there a way to apply functions like brightness, contrast, sharpness, hue, saturation, and exposure to UIImages? For example, I have a UIImageView and inside that I have a UIImage. I want to manipulate its brightness, contrast, or sharpness with the help of a UISlider.

Have a look at Apple's sample app GLImageProcessing. It uses OpenGL to do the image processing, but it is really fast.
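If you can require iOS 5 or later, Core Image's CIColorControls filter is another option (not mentioned in the answer above, so treat this as an alternative sketch). It exposes brightness, contrast, and saturation as inputs you could drive from a UISlider. A minimal sketch, with the method name being a hypothetical helper:

#import <CoreImage/CoreImage.h>

// Sketch: adjust brightness (and optionally contrast/saturation) with CIColorControls (iOS 5+).
- (UIImage *)adjustedImage:(UIImage *)sourceImage brightness:(CGFloat)brightness
{
    CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
    [filter setValue:input forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:brightness] forKey:kCIInputBrightnessKey]; // e.g. slider.value
    [filter setValue:[NSNumber numberWithFloat:1.0f] forKey:kCIInputContrastKey];         // 1.0 = unchanged
    [filter setValue:[NSNumber numberWithFloat:1.0f] forKey:kCIInputSaturationKey];       // 1.0 = unchanged

    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *output = [filter outputImage];
    CGImageRef cgImage = [context createCGImage:output fromRect:[output extent]];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}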

UIImageView Animation delay between frames?

I am trying to create an animation using a UIImageView with an array of images. It works great, except that I need a different delay between some of the frames. In an animated GIF you can set the delay between frames, but every example I see of an animated UIImageView has a fixed delay between images.
Does anyone know how I can set a different delay between frames? Or, is there an example of such?
Maybe UIImageView is not the right thing to use, so if anyone has an alternative please let me know.
I have posted a similar question in other forums and no one seems to be able to answer it. It seems like it should be doable, since .gif images have had this forever.
Thanks
It may seem hacky, but:
From UIImageView documentation:
animationImages
An array of UIImage objects to use for an animation.
@property(nonatomic, copy) NSArray *animationImages
Discussion
The array must contain UIImage objects. You may use the same
image object more than once in the array. Setting this property to
a value other than nil hides the image represented by the image
property. The value of this property is nil by default.
So to get a longer delay on a particular frame, you add the same UIImage to the array several times.
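A minimal sketch of that trick, assuming hypothetical frame1.png and frame2.png images and an existing imageView; with animationDuration set to 0.1 seconds per array entry, repeating frame1 three times keeps it on screen for roughly 0.3 seconds:

// Repeat a frame in the array to fake a longer per-frame delay.
UIImage *frame1 = [UIImage imageNamed:@"frame1.png"];
UIImage *frame2 = [UIImage imageNamed:@"frame2.png"];
NSArray *frames = [NSArray arrayWithObjects:frame1, frame1, frame1, frame2, nil];

imageView.animationImages = frames;
imageView.animationDuration = 0.1 * [frames count];   // ~0.1 s per array entry
imageView.animationRepeatCount = 0;                    // 0 = loop forever
[imageView startAnimating];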
UIImageView does not support a different delay between different images. You will have to use multiple UIImageViews and manage the transition between them manually if you want to have different delays between frames.
Another approach would be to use a UIWebView which is sized exactly to fit a .gif image that has the delay settings you want.
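A minimal sketch of the UIWebView approach, assuming a hypothetical animation.gif in the app bundle and a 100x100 point display size:

// Let the web view play the GIF, which keeps the per-frame delays stored in the file.
NSString *path = [[NSBundle mainBundle] pathForResource:@"animation" ofType:@"gif"];
NSData *gifData = [NSData dataWithContentsOfFile:path];

UIWebView *gifView = [[UIWebView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
gifView.userInteractionEnabled = NO;
gifView.scalesPageToFit = YES;
[gifView loadData:gifData
         MIMEType:@"image/gif"
 textEncodingName:@"utf-8"
          baseURL:[[NSBundle mainBundle] bundleURL]];
[self.view addSubview:gifView];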
You have to use different UIImageViews to get this working. This tutorial might be helpful for you: http://www.raywenderlich.com/2454/how-to-use-uiview-animation-tutorial
If you would like to try my animation library in your iOS app, it offers a complete solution to this problem. For example, take a look at the APNG app, a free app in the iTunes store that displays animated APNG files. It was created using my AVAnimator library, which contains APNG and GIF decoding support. The APNG format works just like a GIF in the sense that each frame can have a delay time; in that case you would use existing software to create the APNG.

But it is easier to just encode a new .mvid file (a custom video file format) from a series of PNG images. To create a longer delay, you simply repeat frames that do not change in the input PNG image series. Either approach can produce an input movie that displays with a variable amount of time between specific frames (the overall FPS stays the same, but specific frames simply repeat so that the picture does not change for, say, N frames in a row). You can also implement specific user interactions like starting an animation, pausing on a specific frame, and then starting again from that same frame.

Best way to apply cameraViewTransform to UIImagePickerControllerOriginalImage? iPhone SDK

I'm trying to apply the cameraViewTransform from my UIImagePickerController to the UIImagePickerControllerOriginalImage (the UIImage coming from the camera). The only way I have managed to do this is by putting the UIImage into a UIImageView and then using UIGraphicsBeginImageContext() to take a screenshot of that UIImageView.
The problem is that drawing the UIImagePickerControllerOriginalImage into a UIImageView, zooming, and then taking a screenshot takes a long time to process and also freezes the iPhone. (That's because the UIImage is at full resolution, and I also want it to stay at full resolution.)
My question is: is there a better way to apply the cameraViewTransform to the UIImagePickerControllerOriginalImage? Or is there a faster way to draw my image into the image view?
Thank you so much guys for your time, I really appreciate it!!! :)
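For reference, here is a rough sketch of the screenshot idea described in the question, done off-screen with Core Graphics rather than through an on-screen UIImageView. The method name and output size are assumptions, and this is not a verified fix for the performance problem:

// Draw the full-resolution image into an offscreen context with the transform applied.
- (UIImage *)imageByApplyingTransform:(CGAffineTransform)transform
                              toImage:(UIImage *)image
                           outputSize:(CGSize)outputSize
{
    UIGraphicsBeginImageContextWithOptions(outputSize, YES, image.scale);
    CGContextRef context = UIGraphicsGetCurrentContext();

    CGContextConcatCTM(context, transform);   // e.g. picker.cameraViewTransform
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}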

How to perform Image Editing in iPhone application?

I have to perform some image editing operations like cropping, grayscale, rotating the image, Polaroid effects, etc.
I have found some of this functionality, like cropping and grayscale, but each effect has to be managed in its own separate function. Is there any valid API or library available for iPhone to perform image processing?
I use the following function for cropping:
// Crops the given image to the specified rect (in the image's pixel coordinates).
- (UIImage *)imageByCropping:(UIImage *)imageToCrop toRect:(CGRect)rect
{
    CGImageRef imageRef = CGImageCreateWithImageInRect([imageToCrop CGImage], rect);
    UIImage *cropped = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return cropped;
}
For this I have no idea how to set the cropping rectangle on my actual image, or how to resize that rectangle.
Is there any way on iPhone, like on Android, to crop the image directly after capturing it, by setting some property?
For grayscale I found something here, but if we also want to perform a Polaroid effect we have to manage separate enums for it.
I found the ImageMagick API for this, but many people said Apple rejects apps due to private API usage, and I also couldn't figure out how to use it.
If this or any other API is available, please tell me about it and point me to a tutorial.
Check the QuartzDemo code sample by Apple, especially QuartzClippingView and QuartzMaskingView.
If you want to learn Core Graphics basics for image editing, a quick look at the Quartz 2D Programming Guide and the Drawing and Printing Guide would really help.
Also, if you are planning to go for better cropping, check the Cropped Image sample. This is for Mac, but it can be useful for creating a similar implementation for iPhone by using the Cocoa Touch counterparts of the same classes; e.g. instead of NSBezierPath on Mac, use UIBezierPath on iPhone.
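As a rough sketch of that idea (the oval path is just an example, not part of the original answer), you can clip a drawing context to a UIBezierPath and redraw the image:

// Crop an image to an arbitrary UIBezierPath by clipping the drawing context.
- (UIImage *)imageByClipping:(UIImage *)image toPath:(UIBezierPath *)path
{
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [path addClip];                      // only pixels inside the path are drawn
    [image drawAtPoint:CGPointZero];
    UIImage *clipped = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return clipped;
}

// Example: crop to an oval that fills the image.
// UIBezierPath *oval = [UIBezierPath bezierPathWithOvalInRect:
//     CGRectMake(0, 0, image.size.width, image.size.height)];
// UIImage *result = [self imageByClipping:image toPath:oval];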
No, using ImageMagick is absolutely valid.
Please follow this discussion.

Is it possible to know if an image is in the iPhone system cache?

I am trying to improve the scrolling performance of a UITableView. I am loading the images using imageNamed: and dispatch_async, and scrolling is very smooth once the images have been loaded into the cells. I would like to fade in the images only if they are not in the system cache, to reduce the jarring effect of the images "popping" into view.
Is there a way to know if an image is already in the system cache?
There is no documented way to look inside the UIImage to check such things.
I think the only way to know for sure that the image is available immediately is to force the UIImage to be loaded. This can be done in the background by creating the UIImage and accessing its pixels using CGImage functions. If you ensure that no rescaling is needed (i.e. don't put a 3000x2000 image in a 30x20 space), then it should display without a glitch.
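A minimal sketch of forcing the load on a background queue, as suggested above; the method name, image name, and completion block are assumptions:

// Force decompression by drawing the image into a tiny bitmap context, then hand it back.
- (void)loadDecodedImageNamed:(NSString *)name completion:(void (^)(UIImage *image))completion
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *image = [UIImage imageNamed:name];

        UIGraphicsBeginImageContextWithOptions(CGSizeMake(1, 1), YES, 1.0);
        [image drawInRect:CGRectMake(0, 0, 1, 1)];   // touches the pixel data, forcing a decode
        UIGraphicsEndImageContext();

        dispatch_async(dispatch_get_main_queue(), ^{
            completion(image);   // e.g. set cell.imageView.image here, without the fade
        });
    });
}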

How to create iOS image buttons that scale well across multiple resolutions?

I've run into the issue of using a UIBarButtonItem with a custom color. Everything out on the 'net seems to indicate that the only way around this lack of official API support revolves around the use of images. This is all fine and dandy when developing for pre-iOS 4 devices, except when using the new iPhone 4. Creating an image for iPad and pre-iOS 4 devices is straightforward enough, but the images developed for those devices look absolutely horrid on iPhone 4. I suspect that this problem will be exacerbated further with the introduction of next generation devices.
Consider the example below. Notice how the default colored button is nice and smooth, but the iPhone 3GS image looks terrible. It does not seem very scalable (pun intended) to have to include multiple images for different resolution devices.
In the absence of an official API for changing the color of a UIBarButtonItem, what strategies are out there for creating images that scale well against differing resolution devices? This problem is hardly unique to UIBarButtonItems, how is the community adapting to other UI elements that are bitmapped? Is there a better solution for this particular case than using an image (such as using Quartz to draw it)?
If at all possible, please offer concrete code examples.
You can list any image as Image@2x.png along with Image.png and the system will select the appropriate image at runtime.
If you look at the source for Three20 you can see how they draw custom buttons and shapes that will scale well, regardless of resolution.
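This is not Three20's code, but a rough sketch of the underlying idea: draw the artwork with Quartz at the device's screen scale, so the result is sharp on both standard and Retina displays (the color, corner radius, and size are just examples):

// Draw a rounded, tinted button image at the current device scale.
- (UIImage *)buttonImageWithColor:(UIColor *)color size:(CGSize)size
{
    // A scale of 0.0 means "use the main screen's scale" (1.0 on a 3GS, 2.0 on an iPhone 4).
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);

    UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(0, 0, size.width, size.height)
                                                     cornerRadius:6.0];
    [color setFill];
    [path fill];

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}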
Give Opacity (for Mac) a try. Draw your button in it with vector elements and effects, and it'll spit out the necessary Quartz code to reproduce it, drawing natively in your iOS application. You get Retina (@2x) support automatically.
Been over a year since I posted this question, but ran into a use case where I wanted to be able to do this, so instead of having to draw or otherwise create the buttons, I decided to write an open source application to create them. This application uses private APIs to change the colors of the UIBarButtonItem objects and then uses a graphics context to save them to a determined location on your computer's file system. This way you can have pixel perfect UIBarButtonItem images to use in your UIToolbars.
The app creates both the standard and @2x resolution images.
UIBarButtonItem-Generator @ GitHub
Any vector drawing app may work, but I would also consider POV-Ray, which lets you build images in a C-like 3D scripting language and then render them at any pixel size you choose.
http://povray.org
I had the same problem with the navigation bar and solved it as follows:
First I subclassed my navigation bar, and inside this class:
- (void)drawRect:(CGRect)rect
{
    UIImage *image = [UIImage imageNamed:@"MyImage.png"];
    self.frame = CGRectMake(0, 0, image.size.width, image.size.height);
    self.backgroundImage = image;
}
Finally, save the same image at double resolution with @2x at the end of the file name.