In my app, I want to check whether the iPhone's torch/flashlight is on or off. I know how to set up an AVCaptureSession and all that, but that doesn't give me the information I really want. I want to know whether, at the time my code runs, the phone's torch has been turned on or off from Control Center.
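So far the closest thing I've found is AVCaptureDevice's torchActive property (iOS 6 and later), which is key-value observable. Whether it reliably reflects the flashlight toggled from Control Center isn't documented, so treat the following as a rough sketch rather than a confirmed answer:

#import <AVFoundation/AVFoundation.h>

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (device.hasTorch) {
    // Read the current torch state.
    BOOL torchIsOn = device.torchActive;
    NSLog(@"Torch is %@", torchIsOn ? @"on" : @"off");

    // torchActive is key-value observable, so you can also watch for changes
    // (implement observeValueForKeyPath:ofObject:change:context: to react).
    [device addObserver:self
             forKeyPath:@"torchActive"
                options:NSKeyValueObservingOptionNew
                context:NULL];
}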
My app uses AVCapture to capture images; this was my supervisor's idea. But I have researched on the internet and can't find any information about the difference between AVCapture and the default camera of the iPhone or iPod (tap to focus, camera quality, ...). Please tell me what the advantages of the AVFoundation framework are ...
With AVCaptureSession you can give your recorder a lot more functionality. You can customize nearly every aspect of the recording session, and you can even get the raw data straight from the camera. The code can get quite complex, however, and nothing is taken care of for you.
With the iOS default image capture controller (UIImagePickerController) you are stuck with a few presets and only get a little bit of camera functionality, but it is really simple to implement.
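To illustrate how simple that route is, here is a rough sketch (assuming a view controller that adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate) of presenting the default camera UI:

#import <UIKit/UIKit.h>

- (void)showDefaultCamera {
    // Bail out if there is no camera (e.g. on the Simulator).
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
        return;

    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // The captured photo, ready to use.
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
}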
Updated with a link to Apple's sample code:
If you want to see how to use AVFoundation to do your camera recording, you will probably like this sample app from Apple.
Like I said, you will have to do everything manually, so be prepared for a fair amount of work.
AVCam demo app by Apple
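For contrast, a bare-bones AVFoundation setup, just the session, one input and a preview layer, with no error handling, already looks something like this (a sketch only; the AVCam sample above is the complete version):

#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

// Attach the default camera as an input.
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input])
    [session addInput:input];

// Show the live camera feed in the current view controller's view.
AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
preview.frame = self.view.bounds;
[self.view.layer addSublayer:preview];

[session startRunning];

// ...and you still have to add your own outputs (still image, video data, audio)
// and handle focus, rotation, errors and teardown yourself.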
I'm using ZXing and it is working fine in my new app, but I would like to add the option to use the front or rear camera.
So far the only reference to this I've found is on the Google group,
but it is not very clear what they mean by that,
so any pointers on what I have to do to accomplish this?
Thanks!
ZXWidgetController doesn't provide that functionality and it's not really set up to make it easy to change.
The code that needs to change is in - (void)initCapture. It calls [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]. This returns the default camera, and you don't want the default.
You need code similar to that in - (ZXCaptureDevice*)device in ZXCapture.mm. That code won't work out of the box (it's designed to work with both AVFF and QTKit) but it's the same idea. Instead of using the default video input device, go through the devices and look at device position to find the device you want.
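The core of it is a loop like the following (a sketch of what could replace the defaultDeviceWithMediaType: call in initCapture; use AVCaptureDevicePositionBack for the rear camera):

AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (device.position == AVCaptureDevicePositionFront) {
        captureDevice = device;
        break;
    }
}
if (!captureDevice) {
    // Fall back to the default camera if the requested one isn't available.
    captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}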
It would be nice to port that code to the widget, but that hasn't happened at this point.
As a new app developer, I've never used orientation. So I would like to try to learn with a simple task: I want the screen to fade to black when the device is set face down. Is that something that would be simple to do? Could somebody assist me or point me to helpful information?
Thanks! :D
Your help is appreciated.
You can use orientation for this (checking the accelerometer's XYZ values, or UIDeviceOrientationFaceDown, when the screen is face down). I do not recommend it, because your screen will fade out even if a user is using the app while lying down and staring up at the screen.
There is an easier and cleaner way. Notice how, during phone calls, holding the phone close to your ear blacks out the screen?
You can access that behavior by monitoring the proximityState property of UIDevice. Details here.
Doing something like:
[UIDevice currentDevice].proximityMonitoringEnabled = YES; // must be enabled first, or proximityState is always NO
BOOL closeToUser = [[UIDevice currentDevice] proximityState];
will assign YES to closeToUser when the device is face down on a surface of some kind (or anything else is covering the proximity sensor), and NO when it is not.
If the value is YES, you can invoke code to do whatever you want.
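If you would rather react to changes than poll, here is a sketch (assuming a view controller with a hypothetical full-screen black overlay view called dimView) using the proximity-change notification:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Must be enabled, otherwise proximityState always returns NO. Note that
    // while monitoring is enabled, iOS may also blank the screen itself when
    // the sensor is covered.
    [UIDevice currentDevice].proximityMonitoringEnabled = YES;
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(proximityChanged:)
                                                 name:UIDeviceProximityStateDidChangeNotification
                                               object:nil];
}

- (void)proximityChanged:(NSNotification *)note {
    BOOL covered = [UIDevice currentDevice].proximityState;
    [UIView animateWithDuration:0.3 animations:^{
        self.dimView.alpha = covered ? 1.0 : 0.0; // fade to black when face down / covered
    }];
}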
I want to get the color of the point the user is panning over on the image from the camera,
and this needs to happen in real time.
I'm using the UIImagePickerController class with the sourceType property set to UIImagePickerControllerSourceTypeCamera.
So the user opens the video camera, and after the iris opens he has the possibility to tap over it.
While he is panning over the camera view, I want the application to show the color of the point under his finger, in real time.
Could someone please tell me if this is possible and how to do it?
I tried to use the code from here:
http://developer.apple.com/library/ios/#qa/qa1702/_index.html
First I get a warning:
warning: 'CameraViewController' may not respond to '-setSession:'
and then a lot of errors when trying to compile. In the .h file I included:
#import <AVFoundation/AVFoundation.h>
Do I have to include more than this?
Also, do I still need to use UIImagePickerController to show the camera?
I'm new to iOS and very confused by this.
OK, I did it using the example from http://developer.apple.com/library/ios/#qa/qa1702/_index.html
The problem I have is that it only works on the iPhone itself. In the Simulator I still get those errors about the frameworks not being recognized.
You will have to use AVFoundation for this. An AVCaptureSession can deliver live video frames to AVCaptureVideoDataOutput's delegate method captureOutput:didOutputSampleBuffer:fromConnection: where you then have to analyze each frame to determine the color at a particular position.
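As a rough sketch (assuming a session variable holding your existing AVCaptureSession, BGRA output, and a hypothetical tapPoint property that already holds the touch location converted to pixel coordinates of the buffer):

// When configuring the session, request BGRA frames and set the delegate.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
if ([session canAddOutput:videoOutput])
    [session addOutput:videoOutput];

// Delegate method, called once per frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    size_t x = (size_t)self.tapPoint.x;
    size_t y = (size_t)self.tapPoint.y;
    uint8_t *pixel = base + y * bytesPerRow + x * 4;   // 4 bytes per BGRA pixel
    CGFloat blue  = pixel[0] / 255.0;
    CGFloat green = pixel[1] / 255.0;
    CGFloat red   = pixel[2] / 255.0;

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Update your color swatch with red/green/blue (we are already on the main
    // queue here because of the queue passed to setSampleBufferDelegate:).
}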
I want fine-grained control over the exposure of the iPhone/iPod touch camera. I would prefer to take a series of photos with decreasing exposure times to obtain a sequence of images (for HDR reconstruction). Is this possible?
If not, what's the next best thing? It seems you can set a point of interest in the image for the autoexposure. Perhaps I could search for a dark/light region of the image and then use this exposurePointOfInterest to adjust the exposure, but this seems like a very indirect solution that is also error-prone. If anybody has tried an alternative, such an answer is also desirable.
iOS gives you control over frame durations through the videoMinFrameDuration and videoMaxFrameDuration properties.
Since exposure time depends on the frame rate and frame duration, setting the minimum and maximum frame duration to the same value locks the frame rate, and that in turn affects the exposure times you get.
This is also a very indirect way of controlling exposure, but maybe it helps in your case.
An example would look like this:
// conn is the AVCaptureConnection feeding your video output;
// CAPTURE_FRAMES_PER_SECOND is assumed to be defined elsewhere (e.g. 30).
if (conn.isVideoMinFrameDurationSupported)
    conn.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (conn.isVideoMaxFrameDurationSupported)
    conn.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
Since you would have to decrease the shutter speed of the camera, this unfortunately does not appear to be possible and, more importantly, is against the HIG:
Changing the behavior of iPhone external hardware is a violation of
the iPhone Developer Program License Agreement. Applications must
adhere to the iPhone Human Interface Guidelines as outlined in the
iPhone Developer Program License Agreement section 3.3.7
Related article: Apple Removes Camera+ iPhone App From The App Store After Developer Reveals Hack To Enable Hidden Feature.
If it can be done programmatically, instead of with the hardware, you might have a chance, but then it's just an effect applied to an image, not a true long-exposure picture.
There are some simulated slow-shutter apps that do get approved, like Slow Shutter or Magic Shutter.
Related article: New iPhone Camera App “Magic Shutter” Hits The App Store.
This is supported since iOS 8:
http://developer.xamarin.com/guides/ios/platform_features/intro_to_manual_camera_controls/
Have a look at AVCaptureExposureModeCustom and CaptureDevice.LockExposure...
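In Objective-C the equivalent is setExposureModeCustomWithDuration:ISO:completionHandler: on AVCaptureDevice. A rough sketch (device is your active AVCaptureDevice; the duration and ISO should be clamped to the ranges advertised by device.activeFormat):

NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    if ([device isExposureModeSupported:AVCaptureExposureModeCustom]) {
        CMTime duration = CMTimeMake(1, 60);       // e.g. 1/60 s
        float iso = device.activeFormat.minISO;    // lowest ISO the current format supports
        [device setExposureModeCustomWithDuration:duration
                                              ISO:iso
                                completionHandler:^(CMTime syncTime) {
            // The custom exposure has taken effect once this runs; capture the
            // next frame of your bracket here, then repeat with a new duration.
        }];
    }
    [device unlockForConfiguration];
}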
I tried to do this for my motion-activated camera app (Pocket Sentry) and I found that it is not possible to do this AND get approved in the App Store.
I have been trying to do this myself. I think it's possible only by using the exposurePointOfInterest property. I am detecting the dark and bright spots and then adjusting the point accordingly.
Please refer to: Detecting bright/dark points on iPhone screen
Does anyone know a better way to do this?
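For reference, the point-of-interest approach looks roughly like this (device is the active AVCaptureDevice; the point is in the normalized (0,0) to (1,1) coordinate space that exposurePointOfInterest expects):

CGPoint point = CGPointMake(0.5, 0.5); // e.g. the bright/dark spot you detected, normalized
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    if (device.exposurePointOfInterestSupported &&
        [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        device.exposurePointOfInterest = point;
        // Setting the point alone does nothing; the exposure mode has to be
        // (re)set for the new point to take effect.
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }
    [device unlockForConfiguration];
}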
I am not too sure, but you should try using the AVFoundation classes to build the camera app, following Apple's sample code:
AVCam Sample Code
Then try to leverage the exposureMode property of the class:
exposureMode Class Reference