Use UIImagePickerController to capture selected frames from a video

I want to use UIImagePickerController to record a video, then let the user browse the frames using the scrubber built into UIImagePickerController and, while doing so, select a few frames that interest them.
I know I can overlay a control on top of the image picker to trigger the selection.
What I am not sure about is whether I have programmatic access to the frame currently shown by the UIImagePickerController so I can extract an image from it.
Please let me know whether this is doable.
Are there other time-efficient or elegant ways of achieving this?
I had created my own view with video recording and frame selection (using AVCaptureSession and AVCaptureVideoDataOutput), but it would take me quite a bit of time to polish it until it looks as good as UIImagePickerController, and everyone is already familiar with the iPhone's default camera app.
I hope what I want to achieve makes sense.
I know one can roughly achieve the goal by taking a screenshot of the screen, but I want the user to tap a single button to capture the currently shown frame.
Thanks

The screenshot method did not work for a frozen frame from the video recording, so I went with a custom AVFoundation-based frame capture.
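For reference, a minimal sketch of that AVFoundation approach; the class and property names are my own, error handling is omitted, and the Core Image conversion is a shortcut (Apple's QA1702 shows an equivalent CGBitmapContext route):

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

@interface FrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@property (atomic) BOOL captureNextFrame; // set to YES from the selection button
@end

@implementation FrameGrabber

- (void)start {
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
    [self.session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // Ask for BGRA so the frames are easy to turn into images later.
    output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("frame.queue", NULL)];
    [self.session addOutput:output];
    [self.session startRunning];
}

// Called for every frame; keep only the ones the user asked for.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (!self.captureNextFrame) return;
    self.captureNextFrame = NO;

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    UIImage *frame = [UIImage imageWithCIImage:[CIImage imageWithCVPixelBuffer:pixelBuffer]];
    // Hand the frame to the UI on the main thread from here.
}

@end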

Related

Recreating the iTunes (iOS app) tableViewCell which previews songs

I'm creating an app which would provide the user with a list of audio files and let them sample a small piece of audio when they select the song (just like the iTunes iOS app).
I love the way the iTunes iOS App has implemented it (image attached); wherein I can click on a cell and the album cover flips over to show a progress indicator and a stop button. I can select a cell and the sample starts to play, and the moment I stop it, the progress indicator flips back over to show the album/song art.
I'd like to create something like that for my app. Any suggestions on how I can go about it?
Thanks a ton in advance! :D
EDIT: Based on Till's suggestion, I'm adding this edit. What I'm looking for are suggestions on the best approach to the flip animation, showing a custom view in a UITableViewCell's image space. Currently, I'm not worried about playing the audio or displaying playback progress; I simply need suggestions on the best way to perform the flip animation and substitute the image with a custom UIView. :) Thanks again! :D
You will need to create it yourself. I suggest building it as a custom table cell. If you polish it well enough, you can put it on CodeCanyon and make it worth your while.
Also, I can recommend looking at the controls on http://cocoacontrols.com/ - it might not have exactly this one, but it has many interesting controls and inspirations.
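For the flip itself, a minimal sketch of one common approach, assuming a custom cell that owns an artworkView (the album cover) and a playbackView (progress indicator plus stop button) with the same frame; both names are mine:

// Inside the custom UITableViewCell subclass. artworkView starts out as a
// subview of the contentView; transitionFromView:toView: removes the
// from-view and attaches the to-view automatically.
- (void)flipToPlayback {
    [UIView transitionFromView:self.artworkView
                        toView:self.playbackView
                      duration:0.4
                       options:UIViewAnimationOptionTransitionFlipFromRight
                    completion:nil];
}

- (void)flipToArtwork {
    [UIView transitionFromView:self.playbackView
                        toView:self.artworkView
                      duration:0.4
                       options:UIViewAnimationOptionTransitionFlipFromLeft
                    completion:nil];
}

Calling flipToPlayback from tableView:didSelectRowAtIndexPath: gives the iTunes-style cover flip without touching the rest of the cell.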
This is exactly what you are looking for:
https://github.com/marshluca/AudioPlayer
You can also refer to these sources:
https://github.com/lipka/LLACircularProgressView

How to get the color of the tapped point on the iPhone's camera input in real time?

I want to get the color of the point the user is panning over on the live image from the camera, and this needs to happen in real time.
I'm using the UIImagePickerController class with its sourceType property set to UIImagePickerControllerSourceTypeCamera.
So the user opens the video camera, and after the iris animation opens they can tap on the preview.
While they pan over the camera view, I want the application to show the color of the point under their finger, in real time.
Could someone please tell me whether this is possible and how to do it?
I tried to use the code from here:
http://developer.apple.com/library/ios/#qa/qa1702/_index.html
First, I get a warning:
warning: 'CameraViewController' may not respond to '-setSession:'
I also get a lot of errors when trying to compile. In the .h file I included this:
#import <AVFoundation/AVFoundation.h>
Do I have to include more than this?
Also do I still need to use the UIImagePickerController to show the camera?
I'm new to iOS and very confused by this.
OK, I did it using the example from http://developer.apple.com/library/ios/#qa/qa1702/_index.html
The problem I have is that it only works on the iPhone. On the simulator I still get those errors about the frameworks not being recognized.
You will have to use AVFoundation for this. An AVCaptureSession can deliver live video frames to AVCaptureVideoDataOutput's delegate method captureOutput:didOutputSampleBuffer:fromConnection: where you then have to analyze each frame to determine the color at a particular position.
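A rough sketch of that delegate method, assuming the output was configured for kCVPixelFormatType_32BGRA and that a tapPoint property already holds the touch location converted into the pixel buffer's coordinate space (both assumptions are mine, as is the swatchView used for display):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);

    // 4 bytes per pixel, in BGRA order.
    uint8_t *pixel = base + (size_t)self.tapPoint.y * bytesPerRow
                          + (size_t)self.tapPoint.x * 4;
    UIColor *color = [UIColor colorWithRed:pixel[2] / 255.0
                                     green:pixel[1] / 255.0
                                      blue:pixel[0] / 255.0
                                     alpha:1.0];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    dispatch_async(dispatch_get_main_queue(), ^{
        self.swatchView.backgroundColor = color; // hypothetical swatch view
    });
}

Remember that the preview layer's coordinates are not the buffer's; you have to scale the touch location to the buffer's width and height before indexing into it.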

Display two different video streams at the same time

I am trying to display two different HTTP/RTSP video streams at the same time in the same UIView.
So my first thought was to use the views of two MPMoviePlayerController instances.
But the documentation says:
Note: Although you may create multiple MPMoviePlayerController objects and present their views in your interface, only one movie player at a time may play its movie.
Okay!
My second thought was to use a UIWebView.
But that doesn't work either; I can display only one stream.
I hope you can help me.
Best regards.
On the iPhone, as the documentation says, only one video can be shown at any point, and it has to be full screen. Check also:
Customizable non-full screen video player in iPhone

How to modify in real time the video stream from the iPhone camera?

Every time the iPhone's camera captures a new image, I want to modify it and then display it on the iPhone screen. In other words: how do I modify the video stream from the iPhone camera in real time?
I need a method that is called every time a new image comes from the camera.
Thanks for your help! :-)
EDIT: What I want to do is like augmented reality: while I'm recording video, every frame is modified and shown in real time on the iPhone screen.
You could capture an image with UIImagePickerController's takePicture method and draw your modified version onto the cameraOverlayView.
You get the picture recorded as a result of the takePicture message in the UIImagePickerController delegate's imagePickerController:didFinishPickingMediaWithInfo:. From the dictionary supplied to that method you can get the original image, which you modify and draw onto the overlay.
Here is an example of using the cameraOverlayView. You should be able to reuse the captured image from your delegate for drawing your overlay view.
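A minimal sketch of that flow, assuming showsCameraControls is NO with our own shutter button in the overlay; the picker and overlayImageView properties and the applyEffectToImage: filter are hypothetical names of mine:

// Our own shutter button in the cameraOverlayView triggers the capture.
- (void)shutterTapped {
    [self.picker takePicture];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImage *modified = [self applyEffectToImage:original]; // hypothetical filter
    self.overlayImageView.image = modified; // an image view inside the overlay
}

Note this gives you modified stills on demand, not a continuously modified stream.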
Many augmented reality apps do not actually modify the image but just overlay information based on what they think is on the screen from input from the accelerometer and compass. If this is the kind of AR you are looking to do then try looking at ARKit.
You cannot process the camera's image data in real time through UIImagePickerController; there is no API for that. File a request with Apple using their bug tracker. Many of us have done this already; more requests might lead to this being possible.
Oh, yes, just use an overlay view then. You were talking about modifying the video stream, which is clearly not needed.

Modify photo selected by UIImagePicker before it's used in the move-and-scale screen

As you probably know, the UIImagePickerController in the UIKit framework allows the user to take a photo, which is then handed over to my app for further processing.
If the property allowsEditing is set to YES, the user is allowed to move and scale the photo after taking it.
What I'd like to accomplish is to modify the photo in my code BEFORE the user is presented with the move and scale screen. This is because I'd like to add some visual effects to the photo and the user should be able to move and scale with these effects already applied to the photo.
I know there is the cameraOverlayView property, but as far as I can tell it is not useful in my case.
Any ideas?
Thanks, Thomas
The not-so-easy way is to implement the move-and-scale functionality on your own.
Set showsCameraControls to NO. You can then design your own preview screen (with the modified image).
It looks like there are issues with touch handling in the iOS 5 cameraOverlayView, so make sure the above solution also works on iOS 5. :)
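To illustrate that setup, a minimal sketch; the applyEffects: and showMoveAndScaleScreenWithImage: helpers are hypothetical names of mine:

// Skip Apple's controls and its move-and-scale screen entirely.
- (void)presentCustomCamera {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.allowsEditing = NO;        // we provide our own move-and-scale
    picker.showsCameraControls = NO;  // we provide our own shutter button
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *photo = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImage *withEffects = [self applyEffects:photo]; // hypothetical effect pass
    // Present a custom move-and-scale screen, e.g. the modified image in a
    // UIScrollView with zooming enabled and a crop action.
    [self showMoveAndScaleScreenWithImage:withEffects]; // hypothetical
}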