How to save an animated image to iPhone's photos album?

I created an animated image, and it animates properly in a UIImageView:
- (void)viewDidLoad
{
[super viewDidLoad];
UIImage *image1 = [UIImage imageNamed:@"apress_logo"];
UIImage *image2 = [UIImage imageNamed:@"Icon"];
UIImage *animationImage = [UIImage animatedImageWithImages:[NSArray arrayWithObjects:image1, image2, nil] duration:0.5];
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
imageView.image = animationImage;
[self.view addSubview:imageView];
}
But when I save it to the photos album, it doesn't animate again:
UIImageWriteToSavedPhotosAlbum(animationImage,
self,
@selector(imageSavedToPhotosAlbum:didFinishSavingWithError:contextInfo:),
nil);
So is there any way to save an animated image to the photos album?
Special thanks!

The Photos album is an application as well, just like the one you are making, except that to animate the image you wrote the corresponding code:
UIImage *animationImage = [UIImage animatedImageWithImages:[NSArray arrayWithObjects:image1, image2, nil] duration:0.5];
UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
imageView.image = animationImage;
But you assumed your image would be treated the same way inside the Photos app, and that is not the case. The Photos app most likely just reads an image from disk and shows it inside a UIImageView without any animation, because there is no animation code on its side :)
That does leave an opportunity to build your own photo album app that shows animated images.
I have seen programs like Quartz Composer create a .mov file from several images and animate it, but that is a tool professionally built by Apple. I mention it because I think it is possible to create a .mov through the API, but you might have to dig deeper, as I have no idea how to.
The closest reference I can find is this:
http://www.cimgf.com/2008/09/10/core-animation-tutorial-rendering-quicktime-movies-in-a-caopengllayer/
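One alternative worth sketching (not the .mov route above): build an animated GIF from the frames with ImageIO and write the raw data to the album with ALAssetsLibrary. This is only a minimal sketch, assuming ARC and iOS 4.1+, and the built-in Photos app may still show the GIF as a static frame even though the animation data is preserved.
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import <AssetsLibrary/AssetsLibrary.h>
// Build GIF data from the two frames used above (image1 and image2 from the question).
NSArray *frames = [NSArray arrayWithObjects:image1, image2, nil];
NSMutableData *gifData = [NSMutableData data];
CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)gifData, kUTTypeGIF, frames.count, NULL);
// Loop forever (loop count 0) and show each frame for 0.25 s (half of the 0.5 s total duration).
NSDictionary *fileProperties = @{(__bridge id)kCGImagePropertyGIFDictionary: @{(__bridge id)kCGImagePropertyGIFLoopCount: @0}};
NSDictionary *frameProperties = @{(__bridge id)kCGImagePropertyGIFDictionary: @{(__bridge id)kCGImagePropertyGIFDelayTime: @0.25}};
CGImageDestinationSetProperties(destination, (__bridge CFDictionaryRef)fileProperties);
for (UIImage *frame in frames) {
    CGImageDestinationAddImage(destination, frame.CGImage, (__bridge CFDictionaryRef)frameProperties);
}
CGImageDestinationFinalize(destination);
CFRelease(destination);
// Write the raw GIF bytes to the saved photos album.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:gifData
                                 metadata:nil
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    NSLog(@"Saved %@ (error: %@)", assetURL, error);
}];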

Related

I want to get the imageView position vertically upside down

In the image below there are two image views: one is the body imageView and the other is the black tattoo imageView.
I am getting the tattoo imageView's position with the code below:
appDelegate.xFloat= self.imgView.frame.origin.x;
appDelegate.yFloat= self.imgView.frame.origin.y;
appDelegate.widthFloat= self.imgView.frame.size.width;
appDelegate.heightFloat= self.imgView.frame.size.height;
Now I need to put the tattoo image in another view controller, as we see in the image (here the car is in the reversed position). I set the tattoo imageView's frame using appDelegate.xFloat, appDelegate.yFloat, appDelegate.widthFloat and appDelegate.heightFloat.
But I am getting the image as shown below in the other view.
I need to place the car image reversed, as seen in the first image.
Please guide me.
My requirement is not only rotation; the image may be in any position, like below.
It is strange that one image rotates and the other one does not. I am assuming that one of the images is a background image or similar, but you can use the transform property to change the rotation. Something similar to the code below should show you the same image in two orientations.
UIImage *image = [UIImage imageNamed:@"image.png"];
UIImageView *iv1 =[[UIImageView alloc] initWithImage:image];
[self.view addSubview:iv1];
UIImageView *iv2 = [[UIImageView alloc] initWithImage:image];
[iv2 setFrame:CGRectMake(100,100,image.size.width,image.size.height)];
[iv2 setTransform:CGAffineTransformMakeRotation(-M_PI)];
[self.view addSubview:iv2];
You must obtain the appDelegate before using it:
appDelegate = (AppDelegate *) [[UIApplication sharedApplication] delegate];
Babul, it looks like you are trying to create an image. For that you need to use something called an image context. I have given some code below, where you draw an image into a context and then get the resulting image out.
UIGraphicsBeginImageContext(CGSizeMake(300,300));
UIImage *image = [UIImage imageNamed:@"breakpoint place.png"];
[image drawAtPoint:CGPointMake(0,0)];
UIImage *image2 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageView *iv = [[UIImageView alloc] initWithImage:image2];
[iv setFrame:CGRectMake(100,100,image2.size.width,image2.size.height)];
[self.view addSubview:iv];

How to take a screenshot of a video playing through MPMoviePlayerController on iPhone?

Can anybody help me with this? I want to get a screenshot of a video playing through MPMoviePlayerController. What I have tried so far is:
-(void)viewDidLoad
{
NSURL *tempFilePathURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"LMFao" ofType:@"mp4"]];
player = [[MPMoviePlayerController alloc] initWithContentURL:tempFilePathURL];
[player setControlStyle:MPMovieControlStyleFullscreen];
//Place it in subview, else it won’t work
player.view.frame = CGRectMake(0, 0, 320, 400);
[self.view addSubview:player.view];
}
-(IBAction)screenshot
{
CGRect rect = [player.view bounds];
UIGraphicsBeginImageContext(rect.size);
CGContextRef context = UIGraphicsGetCurrentContext();
[player.view.layer renderInContext:context];
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
imgView = [[UIImageView alloc] initWithFrame:CGRectMake(150, 150, 100, 100)];
[imgView setImage:image];
imgView.layer.borderWidth = 2.0;
[self.view addSubview:imgView];
}
And this is what I've got now:
What is happening now is that my player's buttons get captured in the screenshot, but the video frame itself does not. I'm using this context method to capture the image because I have an overlay on my video; with the thumbnail method I can't capture the video together with the overlay, but I want the video frame with the overlay.
EDIT:
What I want is a screenshot of both the video and the drawn overlay, as shown in this image.
But with one method (the thumbnail method provided by MPMoviePlayerController) I get only the video and not the overlay in my screenshot, and with the second method (the context method, rendering the rect into a graphics context) I get only the overlay and not the video. Now I think the solution may be to combine both images, so please tell me whether merging the two images will work for me or not.
Please help. Thanks :)
You just have to call a single method on your MPMoviePlayerController object:
- (UIImage *)thumbnailImageAtTime:(NSTimeInterval)playbackTime timeOption:(MPMovieTimeOption)option
This method returns a UIImage object. All you have to do is:
UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
For more details, look at MPMoviePlayerController.
Hope this helps
EDIT: In your method you can first hide the player's controls, then capture the image, and after that show the controls again. You can achieve this using the controlStyle property of MPMoviePlayerController.
Not sure this is what you are looking for, but it is what I understood. If you are looking for something different, let me know.
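A minimal sketch of that edit, assuming player and imgView are the ivars from the question (renderInContext: may still not capture the actual video frames, which is why combining it with the thumbnail method is discussed below):
-(IBAction)screenshot
{
    // Temporarily hide the player controls so they are not captured.
    MPMovieControlStyle savedStyle = player.controlStyle;
    player.controlStyle = MPMovieControlStyleNone;
    UIGraphicsBeginImageContext(player.view.bounds.size);
    [player.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *capture = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Restore the controls afterwards.
    player.controlStyle = savedStyle;
    imgView = [[UIImageView alloc] initWithFrame:CGRectMake(150, 150, 100, 100)];
    [imgView setImage:capture];
    [self.view addSubview:imgView];
}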
You can merge two images this way. Please have a look at the code below:
- (UIImage*) maskImage:(UIImage *)image withMask:(UIImage *)maskImage {
MaskView=[[UIView alloc] init];
UIImageView *image1=[[UIImageView alloc]init];
UIImageView *image2=[[UIImageView alloc]init];
MaskView.frame=CGRectMake(0, 0,500,500);
image1.frame=CGRectMake(0,0,160, 160);
image2.frame=CGRectMake(60,54,40,40);
image1.image=image;
image2.image=maskImage;
[MaskView addSubview:image1];
[MaskView addSubview:image2];
UIGraphicsBeginImageContext(CGSizeMake(160,160));
[MaskView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *finishedPic = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return finishedPic;
}
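A hedged usage sketch for the thumbnail-plus-overlay case in the question, assuming overlayImage holds the overlay captured with the context method and player is the movie player:
// Grab a frame from the movie and merge the overlay on top of it with the helper above.
UIImage *videoFrame = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
UIImage *combined = [self maskImage:videoFrame withMask:overlayImage];
UIImageView *resultView = [[UIImageView alloc] initWithImage:combined];
resultView.frame = CGRectMake(150, 150, 100, 100);
[self.view addSubview:resultView];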
This may help
- (UIImage *)thumbnailImageAtTime:(NSTimeInterval)playbackTime timeOption:(MPMovieTimeOption)option
Get the thumbnail image at the given time and scale that image.
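For example, a minimal sketch of scaling the returned thumbnail (the 100 x 100 target size is just an assumption):
UIImage *thumb = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
CGSize targetSize = CGSizeMake(100, 100);
// Redraw the thumbnail into a smaller bitmap context to scale it.
UIGraphicsBeginImageContext(targetSize);
[thumb drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
UIImage *scaledThumb = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();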

Saving two Overlapping UIImage

I am trying to build a photo frame application on iPhone. I made the frame as a transparent PNG, and the chosen photo is placed behind the frame layer in Interface Builder.
In Interface Builder they are placed and fit well. Now my problem is how to save them into one picture.
Here is the code I have, but the saving part keeps crashing.
-(IBAction) saveImage:(id)sender{
imagefront.backgroundColor = [UIColor clearColor]; // This sets your background to transparent.
imagefront.opaque = NO;
[imageView bringSubviewToFront:imagefront];
UIImage *overlappedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(overlappedImage, self, @selector(imageSavedToPhotosAlbum:didFinishSavingWithError:contextInfo:), nil);
}
imagefront is the photo frame, while imageView is the photo.
Thank you.
Your current approach is incorrect. You will need to do this to get the image.
UIGraphicsBeginImageContext(imageView.frame.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This is assuming that imageView has imageFront as its subview as suggested by the code you've posted.
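Putting it together, a sketch of the whole saveImage: method under that assumption (imagefront already added as a subview of imageView):
-(IBAction) saveImage:(id)sender{
    // Render the photo plus the transparent frame overlay into one bitmap.
    UIGraphicsBeginImageContext(imageView.frame.size);
    [imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *overlappedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(overlappedImage, self, @selector(imageSavedToPhotosAlbum:didFinishSavingWithError:contextInfo:), nil);
}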

iPhone taking a picture when saving always landscape

I have this iPhone application that lets users take a picture and save it in an online database. My problem is that every time a user takes a picture and saves it, the picture comes out in landscape, even though it was taken in portrait mode. This results in the portrait picture being stretched.
This is the code I use when taking a picture:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
[picker dismissModalViewControllerAnimated:YES];
NSData *imgData=UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"],1);
UIImage *img=[[UIImage alloc] initWithData:imgData];
if(img.size.width < img.size.height){
NSLog(#"width < height");
imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 460)];
}
else{
NSLog(#"width > height");
imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 460, 320)];
}
imageView.image = img;
[img release];
[[self view] fillPreviewWithImg: imageView];
[[self view] setImage: imageView.image];
}
Basically what I do is take the picture, create a UIImage, check if it's portrait or landscape, create the corresponding UIImageView and then set the image into the UIImageView. After that I just call a couple of methods to set up the image in the main view.
I believe the stretching problem is not tied to the PHP but to the Objective-C code, but I can't really see how or why this behavior happens.
Does anyone of you have an idea?
Thanks,
Masiar
Have a look at the imageOrientation property of the UIImage class. You can have an image that is 320 wide and 480 high, but with an orientation of Landscape. This is contained in the EXIF information and it is up to the viewing program to use this orientation information to rotate the image appropriately. Just checking the width and height of the image is not sufficient to know the orientation and this is causing your stretching that you are seeing.
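If you need the pixel data itself to come out the way the user saw it (for example before uploading to the database), one common approach is to redraw the image once; drawInRect: honours imageOrientation, so the result ends up with an "up" orientation. A minimal sketch (UIGraphicsBeginImageContextWithOptions needs iOS 4.0+):
- (UIImage *)normalizedImage:(UIImage *)img {
    if (img.imageOrientation == UIImageOrientationUp) {
        return img; // already upright, nothing to do
    }
    // Redraw into a new context; drawInRect: applies the orientation for us.
    UIGraphicsBeginImageContextWithOptions(img.size, NO, img.scale);
    [img drawInRect:CGRectMake(0, 0, img.size.width, img.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}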

addSubview does not show view on iOS 3.1.3

I have a scroll view which contains one UIView holding the content. I am adding a UIImageView as a subview of the UIScrollView (so it should be on top of the content container). This works on iOS 3.2+, but on an iPhone running 3.1.3 the image does not show up above the container. My code is something like this:
// add the content container
UIView *contentContainer = [[UIView alloc] init];
[scrollView addSubview:contentContainer];
// add content, etc
// this works in 3.2+
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
[scrollView addSubview:imageView];
// tried adding this for 3.1, but still didn't work
[scrollView bringSubviewToFront:imageView];
[imageView setFrame:CGRectMake(point.x, point.y, image.size.width, image.size.height)];
Is there something else that I'm missing? Thanks!
Are you also actually setting the contentSize property of the UIScrollView?
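For example (assuming contentContainer has been given a frame that encloses all of the content):
// The scroll view needs to know how big its content is in order to scroll.
scrollView.contentSize = contentContainer.frame.size;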
Found the issue: when I specified the image name, I was not specifying the image extension:
UIImage *image = [UIImage imageNamed:#"myImage"];
If I change this to the following, it works:
UIImage *image = [UIImage imageNamed:#"myImage.png"];
Does iOS still know to look for the "@2x" version, if available, when the extension is present, by chance?