iPhone: get image from camera without UIImagePickerController

Please help me with my question.
Is there any way to get an image from the camera without UIImagePickerController?
I need to render the current camera image into an image on my view and update it with a timer.
Maybe AVCaptureStillImageOutput? I couldn't find any examples.
Any ideas?

Yes, you can do it easily using the AVCamCaptureManager and AVCamRecorder classes. Apple has a demo project named AVCam on its developer site. In simple terms, when you click to open the camera, it calls the classes and methods responsible for opening the iPhone's camera and recording video or capturing audio. These are the same classes that UIImagePickerController calls.
I hope it helps.
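For the AVCaptureStillImageOutput approach the question mentions, here is a minimal sketch (assuming iOS 5 or later for connectionWithMediaType:; session, stillOutput, and self.imageView are placeholder names, not part of any Apple sample):

    #import <AVFoundation/AVFoundation.h>

    // One-time setup (e.g. in viewDidLoad); session and stillOutput are ivars.
    - (void)setupCamera {
        session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetPhoto;

        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (input) [session addInput:input];

        stillOutput = [[AVCaptureStillImageOutput alloc] init];
        stillOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                                 forKey:AVVideoCodecKey];
        [session addOutput:stillOutput];
        [session startRunning];
    }

    // Fired by your NSTimer; grabs one still and pushes it into the view.
    - (void)timerFired:(NSTimer *)timer {
        AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
        [stillOutput captureStillImageAsynchronouslyFromConnection:connection
            completionHandler:^(CMSampleBufferRef imageBuffer, NSError *error) {
                if (!imageBuffer) return;
                NSData *jpeg = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageBuffer];
                // UIKit must be updated on the main thread.
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.imageView.image = [UIImage imageWithData:jpeg];
                });
            }];
    }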

Related

iOS 6 in-app camera

I'm building an app that lets the user record a video (in-app, 15 seconds max) by pressing a button on the main screen. I don't want the user to be taken to the Photos app, because the video will only be viewable inside my app, and I can't quite get it working. Does anyone have code to do this? A good example of what I want the camera to do is the camera in the app Cinemagram. Thanks for any help.
If you plan on saving the movie to the user's photo library, then you can use UIImagePickerController. In particular, you should read the guide that accompanies the class.
However, if you only want the video to be temporary, then you will probably want to use AVFoundation. You would need to configure an AVCaptureSession with an AVCaptureMovieFileOutput to write the video to disk. Then, when you are ready to play the video, create an AVURLAsset with the file URL you just wrote, use it to create an AVPlayer, and add an AVPlayerLayer backed by that player to your view to display the video.
Either way, I would recommend studying the examples that Apple provides. AVCam and AVPlayerDemo should be more than enough to get you started (especially the AVCam example project).
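As a rough sketch of that AVFoundation route (the temporary file name and the 15-second cap are assumptions taken from the question; error handling and audio input are omitted):

    // Recording: an AVCaptureSession writing to a temporary file.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    [session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:&error]];
    // (add an audio device input as well if you need sound)

    AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    movieOutput.maxRecordedDuration = CMTimeMakeWithSeconds(15, 600); // stops at 15 s
    [session addOutput:movieOutput];
    [session startRunning];

    NSURL *fileURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"clip.mov"]];
    [movieOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
    // The delegate must implement
    // captureOutput:didFinishRecordingToOutputFileURL:fromConnections:error:

    // Playback: once the delegate reports the file was written.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:asset]];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];
    [player play];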

How do I create an AVAsset with a UIImage captured from a camera?

I am a newbie trying to capture camera video images using AVFoundation and want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate of display of camera images.
Using other people's code as examples, I can capture images and, using an NSTimer driven by my slider control, define on the fly how often to display them, but I can't convert the image to something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation programming guide, which talks about AVAssets and AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. When I look at the AVFoundation guide and other demos that show how to define an AVAsset, I am only given the choice of using HTTP stream data to create the asset, or using a URL to define an asset from an existing file. I can't figure out how to make my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems, and AVAssetTracks to show the image, with an observeValueForKeyPath: callback checking status and calling [myPlayer play]. (I also studied WWDC session 405, "Exploring AV Foundation," to see how that is done.)
I have tried code similar to WWDC Session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, and the output, and set up a callback function that receives a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it just assumes a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
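For reference, a common way to turn the CMSampleBuffer from the AVCaptureVideoDataOutput callback into a UIImage is the conversion sketched below. It assumes the output's videoSettings request kCVPixelFormatType_32BGRA, and it dispatches the UIImageView update to the main thread, since touching UIKit from the capture queue is a frequent reason nothing shows up:

    // Delegate callback; runs on the queue you passed to setSampleBufferDelegate:queue:.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
            didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
            fromConnection:(AVCaptureConnection *)connection {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        // UIKit must be updated on the main thread, or nothing will appear.
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = image;
        });
    }

    // Standard BGRA-buffer-to-UIImage conversion.
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(base, width, height, 8, bytesPerRow,
            colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);

        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        return image;
    }

To drive the display rate from the slider, you could instead store the most recent converted image in a property and let the NSTimer handler copy it into the image view at whatever interval the slider dictates.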
I have looked at UIImagePickerController as an alternate method of controlling how often I display captured camera images, and I don't see that I can change the display interval on the fly using that controller either.
So, as you can see, I am learning this stuff through the Apple developer forum and documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of going from camera to screen without using AVCaptureVideoPreviewLayer, UIImagePickerController, or an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.

iPhone - Load image from camera roll to UIImageView without UIImagePicker

Is it possible to load the last photo from the camera roll into a UIImageView without using a UIImagePickerController?
I mean retrieve the path to the file or something like that.
Thankz :D
You should be able to do this by using the new AVFoundation framework. You may want to take a look at the WWDC 2010 session videos discussing the framework and the related source code.
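Concretely, the camera roll is usually read with the ALAssetsLibrary class from the companion AssetsLibrary framework (also covered in that WWDC 2010 material). A sketch that loads the most recent photo, assuming self.imageView is your UIImageView:

    #import <AssetsLibrary/AssetsLibrary.h>

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                           usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        if (!group) return; // a nil group marks the end of the enumeration
        [group setAssetsFilter:[ALAssetsFilter allPhotos]];
        // Walk the camera roll backwards so the newest photo comes first.
        [group enumerateAssetsWithOptions:NSEnumerationReverse
                               usingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if (!asset) return;
            self.imageView.image = [UIImage imageWithCGImage:
                [[asset defaultRepresentation] fullScreenImage]];
            *innerStop = YES;
            *stop = YES;
        }];
    } failureBlock:^(NSError *error) {
        NSLog(@"Could not read camera roll: %@", error);
    }];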

How to record a video automatically in an iPhone app without user interaction

I am working on an iPhone app that needs to record a video automatically.
Using the MobileCoreServices framework, I got the app to come up in video mode, and tapping the record option starts capturing video. But I want this to happen automatically; that is, I should be able to record a video without tapping the record option. In other words, when video mode comes up, it should automatically start recording.
Could anyone help?
You can look at UIImagePickerController's startVideoCapture method, which is used to start taking video from the camera. It is meant to be used when you aren't using the standard camera controls and instead provide an overlay view. Here is a reference: the UIImagePickerController class reference. If this is not enough for you, you might want to look into the AVFoundation framework, which gives you a lot more control over the video capturing process... hope that helps.
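A rough sketch of that approach (iOS 5+ for the completion-block presentation; myOverlayView is a hypothetical view of yours):

    #import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.showsCameraControls = NO;          // hide the standard record button
    picker.cameraOverlayView = myOverlayView; // your own UI instead
    [self presentViewController:picker animated:YES completion:^{
        // Start recording as soon as the camera interface is up.
        // In practice a short delay may be needed for the camera to become ready.
        [picker startVideoCapture];
    }];
    // Later, call [picker stopVideoCapture]; and handle the movie in the
    // imagePickerController:didFinishPickingMediaWithInfo: delegate method.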

iPhone. Is it possible to load a video file and select a specific frame?

Is it possible to load a short video file and - once loaded - select a specific frame and display that frame in a view? If there is no native support for this, how about an open source alternative?
Thanks in advance.
-Doug
I think that in iPhone programming you're stuck with the fullscreen video solution proposed by Apple. You could write your own controller to do it differently, but I think it would be difficult to achieve good performance, and you'd be cut out of the App Store for sure.
Edit: it looks like in iPhone SDK 3.2 Apple added something for you:
The MPMoviePlayerController class defines an interface for managing the playback of a movie. Playback occurs either in full-screen mode or in a custom view that is vended by the movie player controller. You can incorporate the view into your own view hierarchies or use a MPMoviePlayerViewController object to manage the presentation for you.
and again
Behavior in iPhone OS 3.1 and Earlier
In iPhone OS 3.1 and earlier, this class implemented a full-screen movie player only. After creating the movie player and initializing it with a single movie file, you called the play method to present the movie. (The definition of the play method has since moved out of this class and into the MPMediaPlayback protocol.) The movie player object itself handled the actual presentation of the movie content.
I haven't tested it yet, but have a look at the official documentation under the MPMoviePlayerController Class Reference; it may help.
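If all you need is a single frame as an image, MPMoviePlayerController (iPhone SDK 3.2 and later) can also vend a thumbnail directly; a sketch, where videoURL and the 2-second mark are placeholders:

    #import <MediaPlayer/MediaPlayer.h>

    // videoURL is a placeholder for the movie file you loaded.
    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:videoURL];

    // Grab the frame nearest to the 2-second mark as a UIImage.
    UIImage *frame = [player thumbnailImageAtTime:2.0
                                       timeOption:MPMovieTimeOptionNearestKeyFrame];
    myImageView.image = frame; // display it in any view you like

MPMovieTimeOptionExact is the alternative time option if a keyframe isn't close enough to the frame you want.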