Why can't I capture a screenshot of MPMoviePlayerController? - iPhone

I need to capture a screenshot of a video playing in MPMoviePlayerController, but all I get is a red screen (I made the coverView with a red background and 0.5 alpha).
Here is the code:
NSArray *windows = [[UIApplication sharedApplication] windows];
if ([windows count] > 1)
{
    UIWindow *moviePlayerWindow = [[UIApplication sharedApplication] keyWindow];
    UIView *coverView = [[UIView alloc] initWithFrame:moviePlayerWindow.bounds];
    [mainController.moviePlayer pause]; // Without this it won't work either!
    coverView.backgroundColor = [UIColor redColor];
    coverView.alpha = 0.5;
    [moviePlayerWindow addSubview:coverView];
    UIGraphicsBeginImageContext(coverView.bounds.size);
    [coverView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(screenShot, self, nil, nil);
}
Any ideas???
Thanks

I was curious about this myself. I believe there are similar issues with taking screenshots of OpenGL content too.
If you look at this blog post http://getsetgames.com/2009/07/30/5-ways-to-take-screenshots-of-your-iphone-app/, they have a method that works for an EAGLView; it might be worth giving it a go for your issue.

Follow-up to the question:
Recently I had a project that required the functionality asked about in this thread again, and I am glad to say there is now an Apple-provided solution. I am posting this so that people who visit this thread can get an answer.
MPMoviePlayerController now has a method that returns a UIImage of the moment you want to capture.
- (UIImage *)thumbnailImageAtTime:(NSTimeInterval)playbackTime timeOption:(MPMovieTimeOption)option
Just pass in the playbackTime you want to capture and, bingo, you get a UIImage.
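A minimal usage sketch (assuming player is your existing MPMoviePlayerController instance):
UIImage *frame = [player thumbnailImageAtTime:player.currentPlaybackTime timeOption:MPMovieTimeOptionNearestKeyFrame];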
Looking at your code, I would advise looking for an alternative way of getting the image rather than grabbing it from the MPMoviePlayerController window itself. First, the method you are using relies on private framework code, and second, taking screenshots off a playing video is highly prone to crashes.
Instead of getting a screenshot from the video, how about tagging the video beforehand? The code looks highly vulnerable to an app rejection by Apple.

Sounds like you're taking a shot of the video overlay mask
http://en.wikipedia.org/wiki/Hardware_overlay

Related

GIF Support for iPhone

My Sires, My Dames (ah.. just another knightly thing,...)
I have a question I have been battling with for days; here's my scenario:
I have an HTTP response which contains a link to a GIF image; the GIF is then converted into NSData and later used to populate a UIImageView. Expectedly or unexpectedly, only the first frame/transition of the GIF is shown. Simply put, the image loads but doesn't animate.
What I want to know is whether there is native support for GIF on the iPhone. I also found an extended class here, SCGifExample,
which seems to work only when I bundle the GIF image within the app, but not when the image comes from a URI->NSData.
I don't want to create multiple images and then build an array of images, because that won't work with my setup; it would put a lot of load on both the server and the client.
I've searched here at SO, but I didn't find any substantial result that would stop me from posting my own question; as a matter of fact, I found that many questions here related to iPhone+GIF are left unanswered.
I do have a suspicion, and I was hoping somebody could help me figure it out: I suspect that the GIF animation/transitions are "lost" during the conversion from image to NSData, but I don't have any concrete evidence to back this up.
Here's how I convert the image into NSData:
NSString *contentURI = @"http://gta.champion.com/content?cmsFileId=513af9e9-a96d-4a56-8239-92be273393e0&mt=image/gif"; // let's just assume that, when viewed in a browser, this displays the image
NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:contentURI]];
If somebody else has come out the victor of this plight, please help me to conquer such a feat also.
Yes, there is an alternative, but it depends on your case. Let me share the idea with you, because using a UIWebView solved my problem, since I just needed to animate a single image, i.e. an advertisement.
Here is the code snippet:
UIWebView *webV = [[[UIWebView alloc] initWithFrame:appDel.isiPad == TRUE ? CGRectMake(260, 35, 500, 100) : CGRectMake(40, 18, 250, 60)] autorelease];
webV.delegate = self;
webV.multipleTouchEnabled = NO;
[webV setBackgroundColor:[UIColor whiteColor]];
webV.userInteractionEnabled = NO;
[webV loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:@"YourImageUrl"]]];
And implement its delegate methods:
- (void)webViewDidStartLoad:(UIWebView *)webView;
- (void)webViewDidFinishLoad:(UIWebView *)webView;
- (void)webView:(UIWebView *)webView didFailLoadWithError:(NSError *)error;
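Since in your case the GIF already arrives as NSData, you can also feed the bytes straight to the web view instead of going back through the URL. A small sketch, assuming imageData is the NSData you built from the response:
[webV loadData:imageData MIMEType:@"image/gif" textEncodingName:@"utf-8" baseURL:[NSURL URLWithString:@"about:blank"]];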
While Apple's APIs do not support animated GIFs out-of-the-box, there is a user-created solution that is just as effective:
LBGIFImage:
https://github.com/larcus94/LBGIFImage
Unfortunately, iOS does not natively support animated GIF images, so you cannot use them for animations. If you use a GIF image, the UIImageView will show the first frame of that image, nothing more.
Please refer to the UIImage documentation for the supported formats.
See what I found out about showing an animated GIF in iOS: http://iosnotestoremember.blogspot.com/2013/01/showing-animated-gif-in-ios.html
I hope this helps.

Getting Blank Image while trying to capture MPMoviePlayerViewController view using UIGraphicsGetImageFromCurrentImageContext()

I have a requirement to capture the iPhone screen while my app is in the foreground. I have used UIGraphicsGetImageFromCurrentImageContext() for this; it works in most scenarios, but fails when a video is playing via MPMoviePlayerViewController or AVPlayer, and gives back a black image with the player controls.
My guess is that MPMoviePlayerViewController renders its frames using OpenGL, and UIGraphicsGetImageFromCurrentImageContext() is not able to capture them?
Am I missing something, or is there an alternative solution available to capture the iPhone screen while the app is in the foreground?
There is no easy solution for this (at least none that I know of), but you can work something out. Here I have described the possible solutions. When you simply render the view into a context, you get a blank area in place of the player while everything else comes out as expected.
Possible solutions that I know of:
Private API
You can use the UIGetScreenImage() function to capture the whole screen of the device, including the player and its controls. This is the easy way to get an image of the player together with the view.
Note: some people say that use of this function may cause rejection of the application (I have never used it in an App Store app, so I don't have much experience with it :)).
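A rough sketch of how it is typically called (the symbol is private, so you have to declare it yourself; not App Store safe):
CGImageRef UIGetScreenImage(void); // private function - declare it yourself
CGImageRef screenRef = UIGetScreenImage();
UIImage *screenImage = [UIImage imageWithCGImage:screenRef];
CGImageRelease(screenRef);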
Second way:
If you want to get an image of the player together with the other things in your main view or on the iPhone screen, the basic idea is to capture two images (one of the movie player and another of the whole screen, as you are doing with UIGraphicsGetImageFromCurrentImageContext) and combine them into one image with some careful calculations. Accuracy therefore depends on your calculations.
Here I prefer to use AVPlayer instead of MPMoviePlayerController; I am not sure I am right, but my feeling is that AVPlayer provides the exact frame at a given time (good accuracy, i.e. the exact image showing on the screen).
Please set the gravity mode of the AVPlayer to AVLayerVideoGravityResizeAspect. This mode preserves the aspect ratio and fits within the layer's bounds (it is the default). You can set it this way:
playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
where playerLayer is your AVPlayerLayer object.
Now get the image of the player's current frame using AVAssetImageGenerator:
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.appliesPreferredTrackTransform = YES;
// AVAssetImageGenerator will scale images so that they fit within the defined bounding box.
// Images will never be scaled up, so this should be the size of the AVPlayer.
imageGenerator.maximumSize = CGSizeMake(400, 400);
CGImageRef imgRef = [imageGenerator copyCGImageAtTime:storedCMTime actualTime:NULL error:NULL];
[imageGenerator release];
Please note that the generated image may not be exactly the size you asked for in maximumSize because, as mentioned in the comment above, the image will never be scaled up. If the image is not the exact size you are looking for, use an image utility function to scale it to the size of your player, but don't spoil the aspect ratio, since we have set the AVPlayer's mode to AVLayerVideoGravityResizeAspect. Store this image temporarily.
Now capture the image of the view using UIGraphicsGetImageFromCurrentImageContext (as you are doing currently). Draw the AVPlayer image we captured and stored exactly over the black area showing on your view (you will need some trial and error to get it exactly right).
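As a rough compositing sketch (viewImage is the screenshot of your view, playerImage is the frame from AVAssetImageGenerator, and playerRect is the black player area in view coordinates; all three names are placeholders):
UIGraphicsBeginImageContextWithOptions(viewImage.size, NO, viewImage.scale);
[viewImage drawInRect:CGRectMake(0, 0, viewImage.size.width, viewImage.size.height)];
[playerImage drawInRect:playerRect]; // draw the player frame over the black area
UIImage *combinedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();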
Here I have tried to cover all the main points that may cause trouble or are not immediately obvious.
Update:
Screen capturing solutions from APPLE: http://developer.apple.com/library/ios/#qa/qa1703/_index.html#//apple_ref/doc/uid/DTS40010193
If you would like to go with MPMoviePlayerController then that's okay. Get the thumbnail using the code shown by @Kuldeep, and combine both the view's image and the thumbnail using image masking, exactly as explained here: Link
I would rather suggest using the snippet below; it may be easier to use. You just need to pass the time at which you want to capture the video.
float timeFactor = 60.0;
CGRect rect = CGRectMake(self.view.center.x, self.view.center.y, 100.0, 100.0);
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
UIImage *thumbnail = [player thumbnailImageAtTime:timeFactor timeOption:MPMovieTimeOptionNearestKeyFrame];
UIImageView *imagV = [[UIImageView alloc] initWithImage:thumbnail];
[imagV setFrame:rect];
Well, it seems I've found a solution to capture frames from an AVPlayer:
- (UIImage *)captureFrame {
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:[self.player.currentItem asset]];
    if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 5) {
        // Zero tolerance gives the exact frame at the requested time (iOS 5+).
        [imageGenerator setRequestedTimeToleranceAfter:kCMTimeZero];
        [imageGenerator setRequestedTimeToleranceBefore:kCMTimeZero];
    }
    CGImageRef ref = [imageGenerator copyCGImageAtTime:self.player.currentItem.currentTime actualTime:nil error:nil];
    UIImage *img = [UIImage imageWithCGImage:ref];
    CGImageRelease(ref);
    return img;
}

AVCaptureVideoPreviewLayer: taking a snapshot

I'm trying to emulate the animation seen in the default camera app, where a snapshot of the camera's viewfinder is animated into the corner of the app's display.
The AVCaptureVideoPreviewLayer object that holds the key to solving this problem isn't very open to these requirements: trying to create a copy of it in a new layer with ..
- (id)initWithLayer:(id)layer
.. returns an empty layer, without the image snapshot, so clearly there is some deeper magic going on here.
Your clues/boos are most welcome.
M.
I'm facing the same woes, from a slightly different angle.
Here are possible solutions, none of which are too great IMO:
You can add both an AVCaptureStillImageOutput and an AVCaptureVideoDataOutput to an AVCaptureSession. When you set the sessionPreset to AVCaptureSessionPresetHigh you'll start getting frames through the API, and when you switch to AVCaptureSessionPresetPhoto you can take real photos. So right before taking the picture, you can switch to video, grab a frame, and then return to the photo preset, as sketched below. The major caveat is that it takes a "long" time (a couple of seconds) for the camera to switch between the video camera and the still camera.
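A rough sketch of that preset switching (assuming session is your AVCaptureSession with both outputs already attached, and that you grab the frame inside the video data output's delegate callback):
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetHigh;   // video frames start arriving at the data output
[session commitConfiguration];
// ... grab one frame in captureOutput:didOutputSampleBuffer:fromConnection: ...
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetPhoto;  // switch back before taking the real still
[session commitConfiguration];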
Another option would be to use only the camera output (AVCaptureStillImageOutput), and use UIGetScreenImage to get a screen capture of the phone. You could then crop out the controls and leave only the image. This gets complicated if you're showing UI controls over the image. Also, according to this post, Apple started rejecting apps that use this function (it was always iffy).
Aside from these I also tried playing with AVCaptureVideoPreviewLayer. There's this post to save a UIView or CALayer to a UIImage. But it all produces clear or white images. I tried accessing the layer, the view's layer, the superlayer, the presentationLayer, the modelLayer, but to no avail. I guess the data in AVCaptureVideoPreviewLayer is very internal, and not really part of the regular layer infrastructure.
Hope this helps,
Oded.
I think you should add an AVCaptureVideoDataOutput to the current session with:
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[session addOutput:videoOutput];
dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);
Then, implement the delegate method below to get your image snapshot:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // Add your code here that uses the image.
    dispatch_async(dispatch_get_main_queue(), ^{
        _imageView.image = image;
    });
}
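imageFromSampleBuffer: is not shown above; a common implementation (essentially the one from Apple's AVFoundation sample code, and assuming the 32BGRA pixel format set earlier) looks roughly like this:
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // This bitmap layout matches the kCVPixelFormatType_32BGRA requested in videoSettings.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return image;
}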
This will consume memory and reduce the performance of the app. To improve things, you can also throttle your AVCaptureVideoDataOutput with:
videoOutput.minFrameDuration = CMTimeMake(1, 15);
You can also use alwaysDiscardsLateVideoFrames.
There are two ways to grab frames of the preview: AVCaptureVideoDataOutput and AVCaptureStillImageOutput :)
If your capture session is set up to grab video frames, make your layer with the CGImage from a chosen frame. If it's set up for stills, wait until you get your still image and make your layer from a scaled-down version of that CGImage. If you don't have an output on your session yet, you'll have to add one, I think.
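For the video-frames case, turning a chosen frame's CGImage into a layer is roughly (previewCopyLayer and frameCGImage are placeholder names):
CALayer *previewCopyLayer = [CALayer layer];
previewCopyLayer.frame = self.view.bounds;
previewCopyLayer.contents = (id)frameCGImage; // use (__bridge id) under ARC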
Starting in iOS 7, you can use UIView's snapshotViewAfterScreenUpdates: to snapshot the UIView wrapping your AVCaptureVideoPreviewLayer. This is not the same as UIGetScreenImage, which will get your app rejected.
UIView *snapshot = [self.containerView snapshotViewAfterScreenUpdates:YES];
Recall the old-school way of turning a view into an image. For some reason it worked on everything except for camera previews:
UIGraphicsBeginImageContextWithOptions(self.containerView.bounds.size, NO, [UIScreen mainScreen].scale);
[self.containerView drawViewHierarchyInRect:self.containerView.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Get what PLCameraView is showing on 3.0

With firmware 2.2 I was able to get a screenshot using the private method _createCGImageRefRepresentationInFrame of UIWindow. In 3.0 it doesn't exist anymore.
I used that method with the PLCameraView over a window to take a small video just by grabbing as many screenshots as possible. Now I have tried the CALayer of the PLPreviewView and the -renderInContext: method, but it always renders the view as if the iris were closed.
How can I take a screenshot of what the camera view is showing?
Thanks
Marco
Try the following code snippet:
CGImageRef imageRef = [[UIApplication sharedApplication] _createDefaultImageSnapshot];
UIImage *img = [UIImage imageWithCGImage:imageRef];
// Now you can save it however you want, for example to the photo album.
// Note that this image is the whole screenshot, so you may want to
// crop out the unwanted parts first.
UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);
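For the cropping step, a rough sketch (the keepRect values are only an example, e.g. dropping a 20px status bar from the top of the snapshot taken above):
CGRect keepRect = CGRectMake(0, 20, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef) - 20);
CGImageRef croppedRef = CGImageCreateWithImageInRect(imageRef, keepRect);
UIImage *croppedImg = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);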

Capture iPhone screen with status bar included?

I am looking for a way to capture a screenshot on the iPhone with the top status bar included. I am currently using the following code:
UIGraphicsBeginImageContext(self.view.bounds.size); //self.view.window.frame.size
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
The above code successfully takes a screenshot of the UIView but does not include the top status bar (in its place is just a blank 20px space).
As of iOS 7, you can do this without the use of private APIs using
UIView *screenshotView = [[UIScreen mainScreen] snapshotViewAfterScreenUpdates:NO];
See my answer to this question:
Moving status bar in iOS 7
You can get the entire contents of the screen by calling the private API UIGetScreenImage. See my previous answer to a similar question for details.
As of Oct 8, 2009, the use of UIGetScreenImage got my app rejected! :( I would not advise using it. I believe Apple is trying to clean up all the apps and make them conform to the new 3.x OS/APIs. I'm looking for an alternative; if anyone has any suggestions, maybe the video API?
Instead of using private API, why not render the entire UIWindow into the image context? It might be enough to replace self.view with self.view.window in your code.
You can also get the current window(s) as a property of the [UIApplication sharedApplication] instance. It's possible the status bar is on a separate window layer and maybe you'll have to render the windows in order.
Something like this:
UIGraphicsBeginImageContext(self.view.window.frame.size);
for (UIWindow *window in [[UIApplication sharedApplication] windows]) {
    [window.layer renderInContext:UIGraphicsGetCurrentContext()];
}
UIImage *screenshotImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(screenshotImage, nil, nil, nil);
At any rate, you probably don't need to resort to private API.