I would like to experiment a bit with augmented reality. My first problem is how to get the live picture coming from the camera as the background for my view. For what I want to do, I don't need to access the image data; I just need it as a background. I found a few solutions for taking a picture with the camera, but nothing that gives me the live picture coming from the camera.
Thanks.
I would use AVFoundation for that.
You need to set up an AVCaptureSession with an AVCaptureDevice and an AVCaptureDeviceInput. Finally - and this is what you're interested in - set up an AVCaptureVideoPreviewLayer.
Send the startRunning message to your AVCaptureSession object, and you should be good to go.
Docs here: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW14
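A rough sketch of that setup (assuming it lives in a view controller; error handling omitted):

#import <AVFoundation/AVFoundation.h>

// Minimal sketch: live camera feed as a background layer behind your own views
AVCaptureSession *session = [[AVCaptureSession alloc] init];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

// The preview layer renders the live feed; insert it below your other content
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = self.view.bounds;
[self.view.layer insertSublayer:previewLayer atIndex:0];

[session startRunning];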
Another way would be to use UIImagePickerController.
Create an instance, say picker, and set picker.showsCameraControls = NO.
Finally, set cameraOverlayView to your overlay: picker.cameraOverlayView = someViewController.view.
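A minimal sketch of that approach (someViewController is assumed to hold your overlay content):

// Camera picker used purely as a live background with a custom overlay on top
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO;                      // hide the default camera UI
picker.cameraOverlayView = someViewController.view;   // your own content over the feed
[self presentModalViewController:picker animated:NO];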
Hope this helps.
I am using UIImagePickerController with:
[picker setAllowsEditing:YES];
picker.mediaTypes is set to kUTTypeMovie and kUTTypeImage
But I would like to set this option only for movies and not for images. How can I implement this?
You can try making two separate buttons (actions), one for movies and one for images, and disable editing when the user wants to capture an image (see the sketch below). Alternatively, I would go with
Fully-Customized Media Capture and Browsing, which will let you do many more flexible things.
EDIT: cameraOverlayView can also help; you can make a custom view with buttons that handle the events.
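For the two-button idea, a rough sketch (pre-ARC; assumes your controller is the picker's delegate):

#import <MobileCoreServices/MobileCoreServices.h>   // for kUTTypeMovie / kUTTypeImage

// One action per capture mode, toggling allowsEditing as suggested above
- (IBAction)captureMovie:(id)sender {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.allowsEditing = YES;    // editing enabled only for movies
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker release];
}

- (IBAction)captureImage:(id)sender {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeImage];
    picker.allowsEditing = NO;     // no editing for still images
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker release];
}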
I am a newbie trying to capture camera video images using AVFoundation and want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate of display of camera images.
Using other people's code as examples, I can capture images and, with an NSTimer driven by my slider control, define on the fly how often to display them, but I can't convert the image into something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation Programming Guide, which talks about AVAssets, AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. When I look at the AVFoundation guide and other demos that show how to define an AVAsset, they only give me the choice of using HTTP stream data to create the asset, or a URL pointing to an existing file. I can't figure out how to make my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems, and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and doing [myPlayer play]. (I also studied WWDC session 405, "Exploring AV Foundation", to see how that is done.)
I have tried code similar to WWDC session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, the output, and a callback that receives a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it's just assuming a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
I have looked at UIImagePickerController as an alternate way of controlling how often I display captured camera images, and I don't see that I can change the display timing on the fly with that controller either.
So, as you can see, I am learning this stuff with the Apple developer forum, the documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of getting camera output to the screen without using AVCaptureVideoPreviewLayer or UIImagePickerController, or without an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
Please help me with my question.
Is there any way to get an image from the camera without UIImagePickerController?
I need to render the current camera image into an image on my view and update it with a timer.
Maybe AVCaptureStillImageOutput? I didn't find any examples.
Any ideas?
Yes, you can do it easily using the AVCamCaptureManager and AVCamRecorder classes. Apple has a demo program on its developer site, named AVCam. In simple words, when you tap to open the camera, it calls the classes and methods responsible for opening the iPhone's camera and recording video or capturing audio. It uses the same classes that UIImagePickerController calls.
I hope it helps.
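Since the question mentions AVCaptureStillImageOutput: assuming a running AVCaptureSession with a stillImageOutput already added, a timer-driven grab might look roughly like this (imageView is an assumed outlet):

// Fired by a repeating NSTimer: grab a still frame and show it in the image view
- (void)grabFrame:(NSTimer *)timer
{
    // connectionWithMediaType: is iOS 5+; on iOS 4 iterate stillImageOutput.connections instead
    AVCaptureConnection *connection =
        [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                   completionHandler:
        ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (!imageDataSampleBuffer) {
                return;
            }
            NSData *jpegData = [AVCaptureStillImageOutput
                jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:jpegData];
            dispatch_async(dispatch_get_main_queue(), ^{
                self.imageView.image = image;   // update the UI on the main thread
            });
        }];
}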
I am developing an app that plays internet radio. Owing to my lack of skill, I have only used the stock MPMoviePlayerController.
So far it has been able to play a few m3u streams (like the 'feeling floyd' station).
However, I was wondering if there is any way to have this MPMoviePlayerViewController show me extra information, like the song that is playing (information I was able to extract from the metadata).
I can get this information all right, but how do I put it on the screen?
Can I make an overlay or something? (The centre of the MPMoviePlayer view is taken up by the QuickTime background; it would be great if I could use an overlay on this space to show the current music information or whatever.)
Is this overlay thing possible? If not, is there any other way?
Thank you very much!
V
If you want your own custom view over the player, you can add it as a subview of the player's view: [player.view addSubview:yourView];
Get the moviePlayer property of MPMoviePlayerViewController. I think you should be able to do this: [moviePlayer.view addSubview:myView].
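A small sketch of that idea (playerViewController and the label geometry are just assumptions):

// Put a "now playing" label on top of the movie player's view
MPMoviePlayerController *moviePlayer = playerViewController.moviePlayer;

UILabel *nowPlayingLabel = [[UILabel alloc] initWithFrame:CGRectMake(20, 20, 280, 30)];
nowPlayingLabel.backgroundColor = [UIColor clearColor];
nowPlayingLabel.textColor = [UIColor whiteColor];
nowPlayingLabel.text = @"Song title pulled from the stream metadata";   // fill in from your metadata
[moviePlayer.view addSubview:nowPlayingLabel];
[nowPlayingLabel release];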
I want to play videos. I am using MPMoviePlayer, but I don't want to use the controls it provides, so I am trying to create my own custom controls. All the functionality like play, pause, fullscreen, forward, and backward is done. The only problem is the scrubber. I have a UISlider, but I don't know exactly how to work with it. How do I track the currently playing video's time? How do I play the video from the point where I slide the slider's thumb?
If anyone knows how, kindly help me with this.
Thanks in advance.
I was having a similar problem. I figured out how to create custom movie controls and put it up on GitHub. Let me know if that helps. Feel free to ask me any questions if you want details.
First, we should note that all of this is possible in iOS 3.2+, if you are OK with not supporting iOS 3.1.x.
In iOS 3.2+, MPMoviePlayerController implements the MPMediaPlayback protocol, meaning that it responds to play, stop, etc., all the controls you would expect -- sounds like you already have some of this working. Please see the reference for the MPMediaPlayback protocol.
To get the MPMoviePlayerController to stop showing its own controls, do this on initialization:
yourPlayer.controlStyle = MPMovieControlStyleNone;
Finally, to get the scrubber to work, you need to hook up the UISlider's valueChanged: action and update the player's currentPlaybackTime property from it. For example, if you want to seek 10 seconds in:
yourPlayer.currentPlaybackTime = 10;
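A rough sketch of the slider wiring (player, scrubber, and the timer are assumptions; the slider uses its default 0 to 1 range, and duration is available once the movie's metadata has loaded):

// Called for the slider's Value Changed event: seek to the dragged position
- (IBAction)scrubberChanged:(UISlider *)slider
{
    self.player.currentPlaybackTime = slider.value * self.player.duration;
}

// Called by a repeating NSTimer: keep the slider in sync during playback
- (void)updateScrubber:(NSTimer *)timer
{
    if (self.player.duration > 0) {
        self.scrubber.value = self.player.currentPlaybackTime / self.player.duration;
    }
}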