I am working on an application whose job is to build an image (JPEG) that is a collage of selected images from the gallery. I can crop the gallery images to the needed size using the technique specified in the question here.
However, I want to create a collage that is 2400x1600 pixels (configurable) and arrange the cropped images on a white background.
I couldn't find a good example of creating a canvas and setting its background color. I believe I need to create a Core Graphics context, create a canvas, set its background to white, save it as an image, and work on that image object. However, I am not able to find the right way to do it. I'd appreciate any help.
Edit:
Found this code to save view to image. Now the problem is reduced to creating a view that has a canvas of 2400x1600.
- (UIImage *)makeImage {
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return viewImage;
}
You should look up the methods in your example code. self.view.bounds.size is a CGSize, so if you replace the call to UIGraphicsBeginImageContext with the following, it'll get you an image of the size you want:
UIGraphicsBeginImageContext(CGSizeMake(2400.0,1600.0));
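To get the white background as well, you can fill the context before drawing the cropped images into it. A minimal sketch (the method name and layout below are mine, not from your post):
- (UIImage *)makeCollageWithImages:(NSArray *)images {
    CGSize canvasSize = CGSizeMake(2400.0, 1600.0);
    UIGraphicsBeginImageContext(canvasSize);
    // Paint the whole canvas white first.
    [[UIColor whiteColor] setFill];
    UIRectFill(CGRectMake(0.0, 0.0, canvasSize.width, canvasSize.height));
    // Then draw each cropped image wherever your layout wants it.
    CGFloat x = 0.0;
    for (UIImage *image in images) {
        [image drawInRect:CGRectMake(x, 0.0, 800.0, 1600.0)]; // example layout only
        x += 800.0;
    }
    UIImage *collage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return collage;
}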
Good luck!
In my app I designed a frame for the camera viewer; when I tap the capture button, I should get a single image with my frame merged over the captured photo.
What I did to achieve something like this was basically take a programmatic screenshot of the area. You could take the picture first, apply the frame over it, and then use the following code to take a screenshot. Make sure both the image and the frame are subviews of a single UIView; in the example below, both of them would need to be subviews of saveView.
UIGraphicsBeginImageContext(saveView.bounds.size);
[saveView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
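Assembling saveView might look something like this (capturedPhoto and frameImage are placeholder names, not from your project):
UIImageView *photoView = [[UIImageView alloc] initWithImage:capturedPhoto];
UIImageView *frameView = [[UIImageView alloc] initWithImage:frameImage];
photoView.frame = saveView.bounds;
frameView.frame = saveView.bounds; // frame drawn on top of the photo
[saveView addSubview:photoView];
[saveView addSubview:frameView];
Once both are in place, the snapshot above renders them merged into finalImage.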
I'm using the UIGraphicsGetImageFromCurrentImageContext() function to capture the screen contents into a UIImage object (previously rendered into an image context). This works great on both the simulator and a real device; however, on the device the resulting image has a few pixels with distorted colors, as seen here:
http://img300.imageshack.us/img300/2788/screencap.png
Please notice the few fuchsia pixels in the top navigation bar, on both sides of the search field and to the right of the button. There are also such pixels to the right of the bottom-left button.
The code I'm using to capture the screen view into a UIImage object is pretty straightforward:
UIGraphicsBeginImageContext(self.view.window.frame.size);
[self.view.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *anImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
One thing to note is that all the graphics that get distorted belong to custom PNG files, used to draw the search field background as well as the button backgrounds.
Does anyone know what could possibly be causing this strange color distortion?
Best regards,
Just checked my own code that is doing the same thing you are. Yours is nearly identical to mine, except that I am asking the view's layer to render instead of the window's, i.e.:
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
I don't know why that would make a difference, but it's worth a try.
Solved it by using the just-approved private function UIGetScreenImage().
For more info, please check http://www.tuaw.com/2009/12/15/apple-relents-and-is-now-allowing-uigetscreenimage-for-app-st/ and https://devforums.apple.com/message/149553
Regards,
This article explains the issue with image corruption (caused by partially transparent pixels) and provides a workaround which corroborates Chris's comment:
http://www.kaoma.net/iphone/?p=9
UIGetScreenImage() is quite annoying when you just want to capture a view.
I found a nice trick: just re-save all your PNG images in TIFF format using Preview.app :)
I'm trying to resize an image using stretchableImageWithLeftCapWidth:. It works on the simulator, but on the device, vertical green bars appear.
I've tried using imageNamed, imageWithContentsOfFile and imageWithData to load the image; it makes no difference.
UIImage *bottomImage = [[UIImage imageWithData:
                         [NSData dataWithContentsOfFile:
                          [NSString stringWithFormat:@"%@/bottom_part.png",
                           [[NSBundle mainBundle] resourcePath]]]]
                        stretchableImageWithLeftCapWidth:27 topCapHeight:9];
UIImageView *bottomView = [[UIImageView alloc] initWithFrame:CGRectMake(10, 200+73, 100, 73)];
[self.view addSubview:bottomView];
UIGraphicsBeginImageContext(CGSizeMake(100, 73));
[bottomImage drawInRect:CGRectMake(0, 0, 100, 73)];
UIImage *bottomResizedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
bottomView.image = bottomResizedImage;
See the result: the green bars shouldn't be there; they don't appear on the simulator.
http://www.quicksnapper.com/files/5161/96232162149F1C14751048_m.png
Did you figure this out?
Seems like it might be a bug with UIGraphicsGetImageFromCurrentImageContext. If I draw an image that has transparency, I get red/green artifacts. These only appear on the device (not in the sim). Also, if I remove the transparency from the image, the artifacts go away.
Update: Some more weirdness. I was using PNGs before, so I tried a transparent GIF instead. Using a GIF, the artifact problem shows up in the sim as well.
Victory! Found a solution:
Turn off 'Compress PNG Files' in the build settings for your project. Disabling this makes the PNG transparency work without any artifacts.
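(If you manage build settings as text, this checkbox corresponds to the COMPRESS_PNG_FILES key, so COMPRESS_PNG_FILES = NO in an .xcconfig file should have the same effect.)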
It seems like you're writing a lot of unnecessary code, but perhaps that's just because it's out of context and there's more to it that I'm missing.
To get the image:
[[UIImage imageNamed:@"bottom_part.png"] stretchableImageWithLeftCapWidth:27 topCapHeight:9];
Where in your code are you displaying this image, i.e. in what method are you drawing the images above?
Why not just use a UIImageView and put the image in there? Then you don't need to do any of the image-context drawing.
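For example, reusing the frame from your snippet, something like this should be all that's needed:
UIImage *bottomImage = [[UIImage imageNamed:@"bottom_part.png"]
                        stretchableImageWithLeftCapWidth:27 topCapHeight:9];
UIImageView *bottomView = [[UIImageView alloc] initWithFrame:CGRectMake(10, 273, 100, 73)];
bottomView.image = bottomImage; // the view stretches the caps for you
[self.view addSubview:bottomView];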
In my case I had a UIImageView which I stretched horizontally, and I found a strange vertical white line in the stretched image.
The solutions above didn't work for me. However, the following did:
Cast the width value to an int value:
myNewViewFrame.size.width = (int)newWidth;
myView.frame = myNewViewFrame;
Hope it works for you guys as well...
I've spent 5 hours debugging this last night, and disabling PNG compression in Xcode didn't do anything for me.
For anyone else running into vertical green bars/lines/artifacts with stretchableImageWithLeftCapWidth (in the UIImage API), this post is for you.
I have a 21x30 image for a custom bar-button background that I want to widen, but I got the same green stripes as the OP. I found a PNG created with Photoshop that worked fine; mine are made with GIMP.
Anyway, stripping ALL chunks from the files (except the three essential ones) made no difference. Neither did disabling PNG compression.
What did work for me was this:
Add ONE empty (transparent) row of pixels at the top of my image (which is now 21x31).
Set topCapHeight:0 when creating the scaled image.
When I use my scaled image, I draw it into a CGContext, which is later used to make a UIImage. I use this to draw:
[image drawInRect:CGRectMake(0,-2,width,32)];
This makes the problem go away (for me).
I assume the bug/issue has to do with not scaling vertically when drawing, so I force scaling of the first source-image row (into two rows), which is drawn outside my composition.
I hope this helps someone save 5 hours.
/Krisb
I stopped using stretchableImageWithLeftCapWidth in favor of UIView's much better-behaved contentStretch property. I've never seen the green bars you are describing, but in general I would recommend using contentStretch instead of stretchableImageWithLeftCapWidth.
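For reference, contentStretch is a CGRect in unit coordinates (fractions of the view's bounds), so cap sizes have to be converted. A sketch, assuming the same 27/9 caps used earlier in this thread:
UIImageView *stretchView = [[UIImageView alloc] initWithImage:image];
// Stretch only a thin band starting 27 points in and 9 points down,
// expressed as fractions of the image size.
stretchView.contentStretch = CGRectMake(27.0 / image.size.width,
                                        9.0 / image.size.height,
                                        1.0 / image.size.width,
                                        1.0 / image.size.height);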
I tried to disable PNG compression like Nate proposed, but this wouldn't work for me (iOS 3.1.3). I then tried using TIFF images instead of PNG, which works.
I have found similar issues with contentStretch (on any UIView that has drawn content) when using a value of (0.5, 0.5, 0, 0), i.e. stretching on the center pixel.
I have found that only the iPhone 3G (possibly the 2G) exhibits this problem; the iPhone 4 and 3GS are OK. So I assume this is a problem with the older graphics hardware.
A way I found around the problem was to stretch on a slightly larger center area, e.g. (0.4, 0.4, 0.1, 0.1).
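In code that's simply (myView being whatever view you're stretching):
myView.contentStretch = CGRectMake(0.4, 0.4, 0.1, 0.1);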
I am developing a game in which I want to magnify the part of the image where a magnifier image is placed.
For that I am using masking. After masking I zoom the image, but it looks blurry, and I want the image to be clearer, as if we were really looking through a rifle magnifier. If anyone has a solution, kindly reply.
Are you sure that the problem is the masking?
Perhaps your resources are too low-resolution? High-resolution images scaled down always look better than low-resolution images scaled up.
Maybe you need to look at the problem backwards, so that your image, when viewed through the rifle magnifier [scope?], is viewed at 1:1 resolution, and when not viewed through the scope it is zoomed out (1:2 resolution?). That way your 'normal' mode is the zoomed-out mode and the 'magnified' view is actually just the image at 1:1.
If you have a UIImage whose size is 293x184 but you create a UIImageView with an initial size of 40x30, the iPhone SCALES the UIImage to fit according to the contentMode property. The default contentMode is UIViewContentModeScaleToFill, which scales your image.
So even though you started with a large image, it is now only 40x30 and rendered at 40x30. When you zoom, it is STILL 40x30, but rendered at some larger size, which is what causes the blur.
One solution would be to replace the image after the zoom; then you would have a completely new UIImage at full resolution:
// assumes self.view is a UIImageView showing the zoomed image
[self.view setFrame:reallyBigFrame];
[self.view setImage:newUIImage];
Another would be to initially place the UIImage in a full-size 293x184 UIImageView, then use an affine transform to scale it down:
view.transform = CGAffineTransformScale(view.transform, 0.25, 0.25);