Setting frame of CCTexture2D to fit into a predefined CCSprite frame - iPhone

I am downloading an image from a server and displaying it on the game scene. I am able to get the CCTexture2D of the downloaded image and display it. The problem is that the server image may vary in size, but I have to display it in a CCSprite with a predefined frame.
CCSprite *temp = [CCSprite spriteWithTexture:[[CCTexture2D alloc] initWithImage:[UIImage imageWithData:data] resolutionType:kCCResolutioniPhoneFourInchDisplay]];
CCRenderTexture *test = [CCRenderTexture renderTextureWithWidth:70 height:70]; // set proper width and height
[test begin];
[temp draw];
[test end];
UIImage *img = [test getUIImageFromBuffer];
sprite_Temp = [CCSprite spriteWithCGImage:img.CGImage key:@"1"];
sprite_Temp.tag = K_TagUserImage;
sprite_Temp.scale = 1;
sprite_Temp.position = ccp(432, 273);
[self addChild:sprite_Temp z:1];
I am using this code to resize the CCTexture2D to the predefined CCSprite frame, but the image gets cropped to the frame, which is not what I want. Can someone tell me how to fit the original server image into the desired frame without cropping? Thanks.

Try:
CCSprite *temp = [CCSprite spriteWithTexture:[[CCTexture2D alloc] initWithImage:[UIImage imageWithData:data] resolutionType:kCCResolutioniPhoneFourInchDisplay]];
float scaleX = 70./temp.contentSize.width;
float scaleY = 70./temp.contentSize.height;
// if you want to preserve the original texture's aspect ratio
float scale = MIN(scaleX,scaleY);
temp.scale = scale;
// or if you want to 'stretch-n-squeeze' to 70x70
temp.scaleX = scaleX;
temp.scaleY = scaleY;
// then add the sprite *temp
Usual disclaimer: not tested, done from memory; beware of divide by zero :)
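Putting the pieces together, a minimal untested sketch (it assumes data holds the downloaded image bytes, reuses the alloc/init call from the question, and guards the divide-by-zero case mentioned above):
UIImage *downloaded = [UIImage imageWithData:data];
CCTexture2D *texture = [[[CCTexture2D alloc] initWithImage:downloaded resolutionType:kCCResolutioniPhoneFourInchDisplay] autorelease];
CCSprite *temp = [CCSprite spriteWithTexture:texture];
// Guard against a zero-sized texture before dividing.
if (temp.contentSize.width > 0 && temp.contentSize.height > 0) {
    float scaleX = 70.0f / temp.contentSize.width;
    float scaleY = 70.0f / temp.contentSize.height;
    temp.scale = MIN(scaleX, scaleY); // aspect-fit inside the 70x70 frame
    temp.position = ccp(432, 273);    // position from the question
    [self addChild:temp z:1];
}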

Related

How to Define UIImageView size as UIImage resolution?

I have a scenario in which I am getting images via a web service, and the images come in different resolutions. I want to know the resolution of each image so I can size the UIImageView accordingly and prevent my images from getting blurred.
For example, if the image resolution is 326 pixels/inch, the image view should be sized so that the image can be displayed fully without any blur.
UIImage *img = [UIImage imageNamed:@"foo.png"];
CGRect rect = CGRectMake(0, 0, img.size.width, img.size.height);
UIImageView *imgView = [[UIImageView alloc] initWithFrame:rect];
[imgView setImage:img];
Image size IS its resolution.
Your problem might be the Retina display!
Check for the Retina display and, if present, make the UIImageView width/height half the image's pixel size (so that each UIImageView point consists of four UIImage pixels on a Retina display).
How to check for retina display:
https://stackoverflow.com/a/7607087/894671
How to check image size (without actually loading image in memory):
NSString *mFullPath = [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject]
stringByAppendingPathComponent:@"imageName.png"];
NSURL *imageFileURL = [NSURL fileURLWithPath:mFullPath];
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)imageFileURL, NULL);
if (imageSource == NULL)
{
// Error loading image ...
}
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:NO], (NSString *)kCGImageSourceShouldCache, nil];
CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (CFDictionaryRef)options);
NSNumber *mImgWidth;
NSNumber *mImgHeight;
if (imageProperties)
{
//loaded image width
mImgWidth = (NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelWidth);
//loaded image height
mImgHeight = (NSNumber *)CFDictionaryGetValue(imageProperties, kCGImagePropertyPixelHeight);
CFRelease(imageProperties);
}
if (imageSource != NULL)
{
CFRelease(imageSource);
}
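Wrapped as a reusable helper, under the same assumptions (the function name is hypothetical; it returns CGSizeZero on failure and needs the ImageIO framework, like the code above):
static CGSize imagePixelSizeAtPath(NSString *path)
{
    CGSize size = CGSizeZero;
    CGImageSourceRef source = CGImageSourceCreateWithURL((CFURLRef)[NSURL fileURLWithPath:path], NULL);
    if (source == NULL) return size; // error loading image
    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO]
                                                        forKey:(NSString *)kCGImageSourceShouldCache];
    CFDictionaryRef properties = CGImageSourceCopyPropertiesAtIndex(source, 0, (CFDictionaryRef)options);
    if (properties) {
        NSNumber *width = (NSNumber *)CFDictionaryGetValue(properties, kCGImagePropertyPixelWidth);
        NSNumber *height = (NSNumber *)CFDictionaryGetValue(properties, kCGImagePropertyPixelHeight);
        size = CGSizeMake([width floatValue], [height floatValue]);
        CFRelease(properties);
    }
    CFRelease(source);
    return size;
}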
So, for example (assuming mImgWidth and mImgHeight were obtained as above):
UIImageView *mImgView = [[UIImageView alloc] init];
[mImgView setImage:[UIImage imageNamed:@"imageName.png"]];
[[self view] addSubview:mImgView];
if ([UIScreen instancesRespondToSelector:@selector(scale)])
{
CGFloat scale = [[UIScreen mainScreen] scale];
if (scale > 1.0)
{
//iphone retina screen
[mImgView setFrame:CGRectMake(0,0,[mImgWidth intValue]/2,[mImgHeight intValue]/2)];
}
else
{
//iphone screen
[mImgView setFrame:CGRectMake(0,0,[mImgWidth intValue],[mImgHeight intValue])];
}
}
Hope that helps!
You can get the image size using the following code. So, first calculate the downloaded image's size and then make the image view according to that.
UIImage *Yourimage = [UIImage imageNamed:@"image.png"];
CGFloat width = Yourimage.size.width;
CGFloat height = Yourimage.size.height;
Hope this will help you.
UIImage *oldimage = [UIImage imageWithContentsOfFile:imagePath]; // or load from a URL via NSURL/NSData
CGSize imgSize = [oldimage size];
imgview.frame = CGRectMake(10, 10, imgSize.width,imgSize.height);
[imgview setImage:oldimage];
100% working ....
To solve this problem, we need to take care of the device's display resolution.
For example, say you have an image with a resolution of 326 ppi, which is the same as the iPhone 4, iPhone 4S, and iPod touch 4th gen. Then you can simply use the solutions suggested by @Nit and @Peko. But for other devices (or for images with a different resolution on these devices) you will need some maths to calculate the size for best display.
Now suppose you have a 260 ppi image (with dimensions W x H) and you wish to display it on an iPhone 4S. Since it carries less information per inch than the iPhone's display resolution, we need to scale its dimensions by 260/326, so the size for the image view is:
imageViewWidth = W * (260.0 / 326.0);   // floating-point literals avoid integer division
imageViewHeight = H * (260.0 / 326.0);
In general:
resizeFactor = imageResolution/deviceDisplayResolution;
imageViewWidth = W*resizeFactor;
imageViewHeight = H*resizeFactor;
Here I am assuming that when we set an image in the image view and resize it, pixels are neither added to nor removed from the image.
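As a concrete sketch of that formula (the 260/326 values are just the example's assumptions, and image/imageView stand in for your own variables):
CGFloat imageResolution = 260.0;          // ppi of the downloaded image
CGFloat deviceDisplayResolution = 326.0;  // ppi of the device screen, e.g. iPhone 4S
CGFloat resizeFactor = imageResolution / deviceDisplayResolution;
imageView.frame = CGRectMake(0.0, 0.0,
                             image.size.width * resizeFactor,
                             image.size.height * resizeFactor);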
Let the UIImageView do the work by utilizing the contentMode property to do your image resizing for you.
You probably want to display your UIImageView with a static frame that represents the maximum size of the image you want to display, and allow each image to resize within that frame according to its own size and aspect ratio. You can let the UIImageView do the heavy lifting of dealing with different-sized images by mastering the contentMode property. It has many different settings; one of them, UIViewContentModeScaleAspectFit, will downsize your image as necessary to fit within the UIImageView, and if the image is smaller it will simply be displayed centered. You can play with the settings to get the results you want.
Note that with this approach, there is nothing special you need to do to deal with scaling issues associated with a Retina display.
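For example, a minimal sketch (the frame values are arbitrary and downloadedImage is assumed to hold the web-service image):
UIImageView *imageView = [[[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 300.0, 200.0)] autorelease];
imageView.contentMode = UIViewContentModeScaleAspectFit; // downsize to fit, preserving aspect ratio
imageView.clipsToBounds = YES;                           // optional: clip anything outside the frame
imageView.image = downloadedImage;
[self.view addSubview:imageView];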
As per the requirement stated in the question body, I believe you need not change the UIImageView size.
The image can be displayed fully, without any blur, using this line of code:
imageView.contentMode = UIViewContentModeScaleAspectFit;

Core Plot Library - Extract entire graph image

I am creating an image of a graph using this code:
UIImage *newImage = [graph imageOfLayer];
NSData *newPNG = UIImageJPEGRepresentation(newImage, 1.0);
NSString *filePath = [NSString stringWithFormat:@"%@/graph.jpg", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject]];
if ([newPNG writeToFile:filePath atomically:YES])
    NSLog(@"Created new file successfully");
But I only get the visible area (320x460) in the image. How can I get the whole graph image, including the axes?
Please provide a code snippet showing how to do this with Core Plot.
Thanks in advance...
Make a new graph the size of the desired output image. It doesn't have to be added to a hosting view; that's only needed for displaying it on screen.
CPTXYGraph *graph = [[CPTXYGraph alloc] initWithFrame:desiredFrame]; // desiredFrame is the output image size
// set up the graph as usual
UIImage *newImage = [graph imageOfLayer];
// process output image
My approach to this problem was to create a method to extract the image.
In that method I momentarily make the hosting view, scroll view, graph bounds, and plot area frame bounds bigger, and then convert the graph to an image.
I then remove the hosting view from its container to take it off screen, and call the doPlot method to reinitialise the plot and its data. My code is below.
There might be a visual jitter while this is carried out, but you could always disguise it with an alert, either one asking for an email address for the export or a simple one saying the image was exported.
//=============================================================================
/**
Gets the image of the chart to export via email.
*/
//=============================================================================
-(UIImage *) getImage
{
// Temporarily make the plot bigger.
// CGRect rect = self.hostingView.bounds;
CGRect rect = self.scroller.bounds;
rect.size.height = rect.size.height -100;
rect.origin.x = 0;
rect.size.width = rect.size.width + [fields count] * 100.0;
[self.hostingView setBounds:rect];
[scroller setContentSize: hostingView.frame.size];
graph.plotAreaFrame.bounds = rect;
graph.bounds = rect;
UIImage * image =[graph imageOfLayer];//get image of plot.
// Redraw the plot at its normal size.
[self.hostingView removeFromSuperview];
self.hostingView = nil;
[self doPlot];
return image;
}//============================================================================
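Typical usage, in sketch form (assuming the usual MessageUI flow for the email export mentioned above):
UIImage *exportImage = [self getImage];
NSData *exportData = UIImageJPEGRepresentation(exportImage, 0.8f);
// attach exportData to an MFMailComposeViewController, write it to disk, etc.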

UIGraphicsBeginImageContext leads to memory (leak) overflow

In my code I stretch an image to a specified size. The code works fine so far.
The problem is that UIGraphicsBeginImageContext() does not seem to release the memory of the new image, so the memory fills up after about 10 minutes and the app is terminated by iOS.
Does anyone have a solution to this problem?
- (CCSprite *)createStretchedSignFromString:(NSString *)string withMaxSize:(CGSize)maxSize withImage:(UIImage *)signImage
{
// Create a new image that will be stretched with 10 px cap on each side
UIImage *stretchableSignImage = [signImage stretchableImageWithLeftCapWidth:10 topCapHeight:10];
// Set size for new image
CGSize newImageSize = CGSizeMake(260.f, 78.0f);
// Create new graphics context with size of the answer string and some cap
UIGraphicsBeginImageContext(newImageSize);
// Stretch image to the size of the answer string
[stretchableSignImage drawInRect:CGRectMake(0.0f, 0.0f, newImageSize.width, newImageSize.height)];
// Create new image from the context
UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
// End graphics context
UIGraphicsEndImageContext();
// Create new texture from the stretched
CCTexture2D *tex = [[CCTexture2D alloc] initWithImage:resizedImage];
CCSprite *spriteWithTex = [CCSprite spriteWithTexture:tex];
[[CCTextureCache sharedTextureCache] removeTexture:tex];
[tex release];
// Return new sprite for the sign with the texture
return spriteWithTex;
}
Called by this code:
// Create image from image path
UIImage *targetSignImage = [UIImage imageWithContentsOfFile:targetSignFileName];
// Create new sprite for the sign with the texture
CCSprite *plainSign = [self createStretchedSignFromString:answerString withMaxSize:CGSizeMake(260.0f, 78.0f) withImage:targetSignImage];
Thank you so far.
I've found the solution to my problem.
First of all, the code shown above is correct and without leaks.
The problem was caused by the removal of the sprite that has plainSign as a child. The sprite is removed by a timer that runs on a different thread, and therefore with a different NSAutoreleasePool.
[timerClass removeTarget:targetWithSign] released into an empty pool.
[timerClass performSelectorOnMainThread:@selector(removeTarget:) withObject:targetWithSign waitUntilDone:NO]; released into the correct pool, which contains the target sprite and its child plainSign.
Thanks to SAKrisT and stigi for your suggestions.

Overlaying a UIImageView over a UIImageView and saving

I'm trying to merge two UIImageViews. The first UIImageView (theimageView) is the background, and the second UIImageView (Birdie) is an image overlaying the first UIImageView. You can load the first UIImageView from a map or take a picture. After this you can drag, rotate and scale the second UIImageView over the first one. I want the output (saved image) to look the same as what I see on the screen.
I got that working, but I get borders and the quality and size are bad. I want the size to be the same as that of the image which is chosen, and the quality to be good. Also I get a crash if I save it a second time, right after the first time.
Here is my current code:
//save actual design in photo library
- (void)captureScreen {
UIImage *myImage = [self addImage:theImageView toImage:Birdie];
[myImage retain];
UIImageWriteToSavedPhotosAlbum(myImage, self, @selector(imageSavedToPhotosAlbum:didFinishSavingWithError:contextInfo:), self);
}
- (UIImage*) addImage:(UIImage*)theimageView toImage:(UIImage*)Birdie{
CGSize size = CGSizeMake(theimageView.size.height, theimageView.size.width);
UIGraphicsBeginImageContext(size);
CGPoint pointImg1 = CGPointMake(0,0);
[theimageView drawAtPoint:pointImg1 ];
CGPoint pointImage2 = CGPointMake(0, 0);
[Birdie drawAtPoint:pointImage2 ];
UIImage* result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return result;
}
But I only get errors with this code!
Thanks in advance!
Take a look at Drawing a PNG Image Into a Graphics Context for Blending Mode Manipulation
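As a starting point, a minimal untested sketch of the merge itself (the method name is hypothetical; both parameters are plain UIImages, and the overlay's on-screen transform is ignored):
- (UIImage *)imageByCompositing:(UIImage *)overlay overImage:(UIImage *)background
{
    // Render at the background's own size and scale to preserve quality.
    UIGraphicsBeginImageContextWithOptions(background.size, NO, background.scale);
    CGRect frame = CGRectMake(0.0, 0.0, background.size.width, background.size.height);
    [background drawInRect:frame];
    [overlay drawInRect:frame];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}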

How do I rotate a UIImageView by 90 degrees inside a UIScrollView with the correct image size and scrolling?

I have an image inside a UIImageView which is within a UIScrollView. What I want to do is rotate this image 90 degrees so that it is in landscape by default, set the initial zoom of the image so that the entire image fits into the scroll view, and then allow it to be zoomed up to 100% and back down to the minimum zoom again.
This is what I have so far:
self.imageView.transform = CGAffineTransformMakeRotation(-M_PI/2);
float minimumScale = scrollView.frame.size.width / self.imageView.frame.size.width;
scrollView.minimumZoomScale = minimumScale;
scrollView.zoomScale = minimumScale;
scrollView.contentSize = CGSizeMake(self.imageView.frame.size.height,self.imageView.frame.size.width);
The problem is that if I set the transform, nothing shows up in the scroll view. However, if I comment out the transform, everything works except that the image is not in the landscape orientation I want!
If I apply the transform and remove the code that sets the minimumZoomScale and zoomScale properties, the image shows up in the correct orientation, but with the wrong zoomScale, and the contentSize property doesn't seem to be set correctly either: the view doesn't scroll all the way to the edge of the image in the left/right direction, while the top and bottom scroll well past the edge.
NB: image is being loaded from a URL
Maybe rotating the image itself fits your needs:
UIImage* rotateUIImage(const UIImage* src, float angleDegrees) {
    // Use a throwaway view to compute the bounding box of the rotated image.
    UIView* rotatedViewBox = [[UIView alloc] initWithFrame:CGRectMake(0, 0, src.size.width, src.size.height)];
    float angleRadians = angleDegrees * ((float)M_PI / 180.0f);
    rotatedViewBox.transform = CGAffineTransformMakeRotation(angleRadians);
    CGSize rotatedSize = rotatedViewBox.frame.size;
    [rotatedViewBox release];
    UIGraphicsBeginImageContext(rotatedSize);
    CGContextRef bitmap = UIGraphicsGetCurrentContext();
    // Move the origin to the center of the canvas, rotate, and flip the
    // y-axis, since CGContextDrawImage draws in flipped coordinates.
    CGContextTranslateCTM(bitmap, rotatedSize.width / 2, rotatedSize.height / 2);
    CGContextRotateCTM(bitmap, angleRadians);
    CGContextScaleCTM(bitmap, 1.0, -1.0);
    // Draw the source image centered on the new origin.
    CGContextDrawImage(bitmap, CGRectMake(-src.size.width / 2, -src.size.height / 2, src.size.width, src.size.height), [src CGImage]);
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
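Typical usage (assuming the image arrives in portrait and you want landscape):
UIImage *landscapeImage = rotateUIImage(portraitImage, -90.0f);
self.imageView.image = landscapeImage;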
I believe the easiest way (and thread safe too) is to do:
//assume that the image is loaded in landscape mode from disk
UIImage * LandscapeImage = [UIImage imageNamed: imgname];
UIImage * PortraitImage = [[UIImage alloc] initWithCGImage: LandscapeImage.CGImage
scale: 1.0
orientation: UIImageOrientationLeft];
Any calculations that you do based on the imageView's frame should probably be done before you apply any transformations to it. But I would actually suggest doing those calculations based on the size of the UIImage, not the UIImageView. Then set both the UIImageView's frame and the UIScrollView's contentSize based on that.
Max's suggestion is a good one, although with a larger image it could be a performance killer. Are you displaying this image from your app's resources? If so, why not just rotate the images before you even build the app?
There's a much easier solution that is also faster. Just do this (RADIANS is the usual degrees-to-radians macro, defined here for completeness):
#define RADIANS(degrees) (((degrees) * M_PI) / 180.0)
- (void)imageRotateTapped:(id)sender
{
[UIView animateWithDuration:0.33f animations:^()
{
self.imageView.transform = CGAffineTransformMakeRotation(RADIANS(self.rotateDegrees += 90.0f));
self.imageView.frame = self.imageView.superview.bounds; // change this to whatever rect you want
}];
}
When the user is done, you will need to actually create a new rotated image, but that is very easy to do.
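For example, you could reuse the rotateUIImage() helper from the earlier answer once the animation is done (assuming imageView still holds the unrotated original):
// rotateUIImage() is defined in the earlier answer above.
UIImage *rotated = rotateUIImage(self.imageView.image, self.rotateDegrees);
self.imageView.image = rotated;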
I was using the accepted answer for a while until we noticed that non-square rotations of images taken directly from the camera seemed stretched (they were rotated as desired; the frame width/height just wasn't adjusted).
There's a great explanation/post from Trevor here: http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
In the end, it was a very simple import of Trevor's code, which uses categories to add a resizedImage:interpolationQuality: method to UIImage. So yeah, user beware: if the accepted answer still works for you, great. But if it doesn't, I'd take a look at that library instead.