I am new to Xcode. I am trying to develop a custom camera with an overlay view. I am able to load this well and it's working great.
Now I want to add zoom functionality activated with a button on the overlay view.
Can anyone guide me? I'm trying to find out how to zoom the camera, but I haven't been able to find anything.
Thank you
You can use the cameraViewTransform property:
cameraViewTransform
The transform to apply to the camera's preview image.
@property(nonatomic) CGAffineTransform cameraViewTransform
Discussion
This transform affects the live preview image only and does not affect your custom overlay view or the default image picker controls. You can use this property in conjunction with custom controls to implement your own electronic zoom behaviors.
You can access this property only when the source type of the image picker is set to UIImagePickerControllerSourceTypeCamera. Attempting to access this property for other source types results in the throwing of an NSInvalidArgumentException exception.
Availability: Available in iOS 3.1 and later.
Declared in: UIImagePickerController.h
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIImagePickerController_Class/UIImagePickerController/UIImagePickerController.html
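As a rough sketch, a zoom button on the overlay could drive the property like this (the `picker` and `zoomScale` properties and the action name are placeholders, not from the original question):

```objc
// Assumes self.picker is a UIImagePickerController whose sourceType
// is UIImagePickerControllerSourceTypeCamera, and self.zoomScale
// is a CGFloat property starting at 1.0.
- (IBAction)zoomButtonTapped:(id)sender
{
    self.zoomScale = MIN(self.zoomScale + 0.5f, 4.0f);  // cap at 4x

    // This scales the live preview only; the captured photo and the
    // overlay view are unaffected.
    self.picker.cameraViewTransform =
        CGAffineTransformMakeScale(self.zoomScale, self.zoomScale);
}
```

Note the caveat from the documentation above: this is a purely visual "electronic zoom" of the preview, so the image you get back from the picker is not zoomed.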
I want to get the color of the point that the user is panning on the image from the camera.
And this needs to happen in real time.
I'm using UIImagePickerController class with sourceType property set to UIImagePickerControllerSourceTypeCamera.
So the user opens the video camera, and after the iris animation he can tap on the preview.
While he is panning on the camera view, I want the application to show, in real time, the color of the point under his finger.
Could someone please tell me if this is possible and how to do it?
I tried to use the code from here:
http://developer.apple.com/library/ios/#qa/qa1702/_index.html
First I have a warning:
warning: 'CameraViewController' may not respond to '-setSession:'
I get a lot of errors when trying to compile. I included this in the .h file:
#import <AVFoundation/AVFoundation.h>
Do I have to include more than this?
Also do I still need to use the UIImagePickerController to show the camera?
I'm new to iOS and very confused with this.
OK, I did it using the example from http://developer.apple.com/library/ios/#qa/qa1702/_index.html
The problem I have is that it only works on the iPhone. On the Simulator I still get those errors about the frameworks not being recognized.
You will have to use AVFoundation for this. An AVCaptureSession can deliver live video frames to AVCaptureVideoDataOutput's delegate method captureOutput:didOutputSampleBuffer:fromConnection: where you then have to analyze each frame to determine the color at a particular position.
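A rough sketch of that delegate method, following the QA1702 pattern (it assumes the output was configured for BGRA, and `self.touchPoint` is a hypothetical property already converted from view coordinates to pixel coordinates):

```objc
// Elsewhere, when configuring the AVCaptureVideoDataOutput:
// output.videoSettings = [NSDictionary dictionaryWithObject:
//     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
//     forKey:(id)kCVPixelBufferPixelFormatTypeKey];

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // touchPoint is assumed to already be in pixel coordinates.
    size_t x = (size_t)self.touchPoint.x;
    size_t y = (size_t)self.touchPoint.y;
    uint8_t *pixel = base + y * bytesPerRow + x * 4;   // BGRA layout
    uint8_t blue = pixel[0], green = pixel[1], red = pixel[2];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    NSLog(@"color under finger: r=%d g=%d b=%d", red, green, blue);
}
```

In a real app you would forward the color to the main thread for display rather than logging it, since this delegate runs on the capture queue.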
Is it possible to add an overlay image on top of a map in my app? I am using MapKit to show the map of an area. I would like to add an overlay image on top of the map before the pins show up,
i.e. the stacking order should be map -> image overlay -> pins.
Is this possible without digging into the view hierarchy, that is, without getting all subviews of the view and inserting an image just above the map?
Thanks.
I know that you need a solution for a static map, but here's one for a "draggable" one, which should also solve your problem.
You should subclass MKOverlayView and override its (empty by default):
- (void)drawMapRect:(MKMapRect)mapRect zoomScale:(MKZoomScale)zoomScale inContext:(CGContextRef)context
This method plays the role that drawRect: plays in ordinary views.
You should also implement canDrawMapRect:zoomScale:, which should return YES whenever the overlay should be visible on screen (in your case... always?).
In the overridden method, draw your image on top of the map (according to the mapRect and zoomScale, of course), and voilà!
Some more reference :
http://developer.apple.com/library/ios/#documentation/UserExperience/Conceptual/LocationAwarenessPG/AnnotatingMaps/AnnotatingMaps.html#//apple_ref/doc/uid/TP40009497-CH6-SW15
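The subclassing approach above might be sketched like this (class name and `overlayImage` property are illustrative; the coordinate flip is the usual Core Graphics idiom and is simplified here):

```objc
@interface ImageOverlayView : MKOverlayView
@property (nonatomic, retain) UIImage *overlayImage;
@end

@implementation ImageOverlayView

- (BOOL)canDrawMapRect:(MKMapRect)mapRect zoomScale:(MKZoomScale)zoomScale
{
    return YES;   // the image should always be drawn
}

- (void)drawMapRect:(MKMapRect)mapRect
          zoomScale:(MKZoomScale)zoomScale
          inContext:(CGContextRef)context
{
    // Convert the overlay's bounding map rect into drawing coordinates.
    CGRect rect = [self rectForMapRect:self.overlay.boundingMapRect];

    // Core Graphics draws images flipped, so invert the y axis first.
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextTranslateCTM(context, 0.0, -rect.size.height);
    CGContextDrawImage(context, rect, self.overlayImage.CGImage);
}

@end
```

You would then return an instance of this class from your map view delegate's mapView:viewForOverlay: method. Because overlays live in a layer below the annotation views, the pins will draw on top of the image automatically.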
Try taking a look into the MKOverlayView outlined in the MKMapView documentation (see link). In addition, it may be worth reviewing "Apple WWDC Session 127 - Customizing Maps with Overlays".
If you watch that "Apple WWDC Session 127 - Customizing Maps with Overlays" session, there is a part about raster images as overlays. If you download the 2010 WWDC sample code, there is an example named "TileMap" which contains the code for doing that.
I want to create an iPhone app where, when you take a picture with the camera, it automatically saves the photo and lets you shoot again until you tap Done.
Is that possible?
Look at the takePicture method of UIImagePickerController. From the documentation:
Use this method in conjunction with a custom overlay view to initiate the programmatic capture of a still image. This supports taking more than one picture without leaving the interface, but requires that you hide the default image picker controls.
Since you have to have the default controls hidden, you'll also want to look at cameraOverlayView, which lets you provide your own controls to use instead. That's where you can put your start and done buttons.
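Putting those two pieces together, the picker delegate might look something like this (the `doneTapped` flag is a hypothetical property set by the Done button on your overlay):

```objc
// Configure the picker with showsCameraControls = NO and a
// cameraOverlayView containing your own Shoot and Done buttons.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image =
        [info objectForKey:UIImagePickerControllerOriginalImage];

    // Save the shot to the camera roll.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, NULL);

    if (!self.doneTapped) {
        // Stay in the camera interface and capture the next shot.
        [picker takePicture];
    } else {
        [picker dismissModalViewControllerAnimated:YES];
    }
}
```

In this sketch the capture loop is driven entirely by takePicture, so the user never leaves the camera interface between shots; only the Done button ends the session.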
As you probably know, the UIImagePickerController in the UIKit framework allows the user to take a photo, which is then handed over to my app for further processing.
If the property allowsEditing is set to YES, the user is allowed to move and scale the photo after taking it.
What I'd like to accomplish is to modify the photo in my code BEFORE the user is presented with the move and scale screen. This is because I'd like to add some visual effects to the photo and the user should be able to move and scale with these effects already applied to the photo.
I know that there's the cameraOverlayView property, but as far as I can tell it is not useful in my case.
Any ideas?
Thanks, Thomas
The not-so-easy way is to implement the move and scale functionality on your own.
Set showsCameraControls to NO. You can then design your own preview screen (with the modified image).
It looks like there are issues with touches in the iOS 5 cameraOverlayView, so make sure the above solution works for iOS 5 too :).
When I use the standard [view.layer renderInContext:UIGraphicsGetCurrentContext()]; method for turning views into images on the iPhone camera modal view, all I get is the controls, with black where the camera preview was.
Anyone have a solution for this? Speed is not an issue.
Not possible with a documented API, but possible. Find the "PLCameraView" view in the camera's subviews, then call:
CGImageRef img = (CGImageRef)[foundCameraView imageRef];
This returns a reference to the image the camera holds. Keep in mind PLCameraView and imageRef are private, undocumented API.
You can do this with the new AVFoundation stuff in iOS 4.0.
You should be able to call UIGetScreenImage() (which returns a UIImage) to get a current capture of the whole screen, including the preview. That's how all the barcode apps worked before. But supposedly Apple is disallowing that now, only allowing the AVFoundation technique, which only works under 4.0.
The whole reason there's even an issue is that UIGetScreenImage() is not part of the documented API, but Apple made a specific exception for using it. It's not that they are pulling current apps, but they are not allowing new submissions (or updates) that use the older technique.
There is some lobbying on behalf of a number of people to convince Apple to let app developers use the old technique for iOS 3.x only, so send an email to developer relations if you want to use it.
This isn't possible, unfortunately. For performance, the iPhone uses some form of direct rendering to draw the camera preview directly onto the screen instead of compositing it as part of a UIView surface. As a result, when you "capture" the camera view you get only the UIView elements, not the preview image.
(FWIW, this is similar to the reason it's difficult to screen-grab some movie software on Windows/Mac.)
You could try AVCaptureVideoPreviewLayer.
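A minimal setup might look like this (error handling omitted; this replaces the UIImagePickerController preview with one you control, which is what makes snapshotting feasible):

```objc
// Build a capture session showing the back camera's live feed
// in a layer you own instead of the picker's private preview.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
[session addInput:input];

AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];

[session startRunning];
```

Since you also receive the raw frames if you add an AVCaptureVideoDataOutput to the session, you can composite a frame with your overlay views yourself instead of relying on renderInContext:.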