How can I make the iPhone 4 LED light fire instantly?

I'm currently using the code below to turn my iPhone 4 LED light on and off, and it works great. The only problem is that every time I turn the LED on there is a slight delay, although it turns off instantly. I need it to fire instantly to implement a strobe-like feature, and because it's simply more convenient.
I've noticed that in Apple's camera app, and in many other apps, the LED turns on and off instantaneously when you hit the power button.
I've tried adding some of the objects, like "session" and "device", as instance variables on my view controller so that the iPhone creates them at load time, but I haven't had any luck getting it to work.
I've also tried looking at Apple's WWDC sample code, but I just can't seem to decipher their complex code. Can someone please help me figure this out? I've been trying for about four days to get this to work.
.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface FlashlightViewController : UIViewController {
    AVCaptureSession *torchSession;
}

@property (nonatomic, retain) AVCaptureSession *torchSession;

- (void)toggleTorch;

@end
.m
#import "FlashlightViewController.h"

@implementation FlashlightViewController

@synthesize torchSession;

- (void)dealloc
{
    [torchSession release];
    [super dealloc];
}

- (void)viewDidLoad
{
    [self toggleTorch];
    [super viewDidLoad];
}

- (void)toggleTorch
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash])
    {
        if (device.torchMode == AVCaptureTorchModeOff)
        {
            NSLog(@"It's currently off.. turning on now.");
            AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *session = [[AVCaptureSession alloc] init];
            [session beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [session addInput:flashInput];
            [session addOutput:output];
            [device unlockForConfiguration];
            [output release];
            [session commitConfiguration];
            [session startRunning];
            [self setTorchSession:session];
            [session release];
        }
        else {
            NSLog(@"It's currently on.. turning off now.");
            [torchSession stopRunning];
        }
    }
}

@end

Do everything (all the session and device configuration) except setting the torch/flash mode ahead of time, during app init or view load.
Then just set torch mode on when you want to turn the LED on. Something like:
[self.myDevice lockForConfiguration:nil];
[self.myDevice setTorchMode:AVCaptureTorchModeOn];
[self.myDevice setFlashMode:AVCaptureFlashModeOn];
[self.myDevice unlockForConfiguration];
Make sure that myDevice is a properly configured property during your init.
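For example, here is a minimal sketch of that split (the myDevice and mySession property names are illustrative, not from the original code): build and start the session once in viewDidLoad, then only flip the torch in toggleTorch.

// Sketch only: assumes retained myDevice and mySession properties on the view controller.
- (void)viewDidLoad
{
    [super viewDidLoad];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch] || ![device hasFlash]) return;

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    [session addInput:[AVCaptureDeviceInput deviceInputWithDevice:device error:nil]];
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];
    [output release];
    [session commitConfiguration];
    [session startRunning];   // keep the session running the whole time

    self.myDevice = device;
    self.mySession = session;
    [session release];
}

- (void)toggleTorch
{
    if ([self.myDevice lockForConfiguration:nil]) {
        BOOL on = (self.myDevice.torchMode == AVCaptureTorchModeOff);
        [self.myDevice setTorchMode:on ? AVCaptureTorchModeOn : AVCaptureTorchModeOff];
        [self.myDevice setFlashMode:on ? AVCaptureFlashModeOn : AVCaptureFlashModeOff];
        [self.myDevice unlockForConfiguration];
    }
}

Because the session keeps running the whole time, toggling the torch is just a device configuration change, which is what makes it fire without the startup delay.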

A bit necromantic, but here is a great library for doing this:
LARSTTorch

Related

Toggle Flashlight when Motion is detected

I'm currently working on a small project. The idea of the project/app is that if you shake the device, the flash ignites. If you shake it again, the light turns off!
The first three or four toggles between light on and off work OK, but after that it is impossible to turn off the light, since the device doesn't detect a shake.
Another small problem (since iOS 5.0) is that if a motion is detected, the flash blinks very briefly (about 1 second) and then powers on. This looks like a short flash.
What am I doing wrong?
Detecting the shake:
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (event.type == UIEventSubtypeMotionShake) {
        NSLog(@"shaken, not stirred.");
        [self playSound];
        [self toggleFlashlight];
    }
}
Toggling the flash light:
- (void)toggleFlashlight
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash]) {
        if (device.torchMode == AVCaptureTorchModeOff) {
            NSLog(@"It's currently off.. turning on now.");
            AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *session = [[AVCaptureSession alloc] init];
            [session beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [session addInput:flashInput];
            [session addOutput:output];
            [device unlockForConfiguration];
            [output release];
            [session commitConfiguration];
            [session startRunning];
            [self setAVSession:session];
            [session release];
        }
        else {
            NSLog(@"It's currently on.. turning off now.");
            [AVSession stopRunning];
        }
    }
}
I would really appreciate your expertise in solving the problem for me!
Well, I don't know much about it, but it's possible you're starting/stopping the session too quickly (calling toggleFlashlight too soon after the last call). You might see this code here... It suggests that you do all of that (except for setting the torch/flash mode on) once to create the session, and then just do lock/torchModeOn/flashModeOn/unlock to turn the LED on and lock/torchModeOff/flashModeOff/unlock to turn it off.
Calling your code in
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
instead of motionBegan:withEvent: might solve your problems.
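A rough sketch of that approach, assuming the capture session has already been created and started elsewhere as described above:

// Sketch: trigger on motionEnded: and only flip the torch; never tear the session down.
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
{
    if (motion != UIEventSubtypeMotionShake) return;

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch] || ![device hasFlash]) return;

    BOOL turnOn = (device.torchMode == AVCaptureTorchModeOff);
    if ([device lockForConfiguration:nil]) {
        [device setTorchMode:turnOn ? AVCaptureTorchModeOn : AVCaptureTorchModeOff];
        [device setFlashMode:turnOn ? AVCaptureFlashModeOn : AVCaptureFlashModeOff];
        [device unlockForConfiguration];
    }
}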

How do I keep my iPhone 4 flashlight app from blinking when it turns off?

I'm playing around with a simple little flashlight app that turns on and off the LED flash when you press buttons on my view.
It works just fine, but when I turn off the flash, it blinks once before turning off. Any ideas what's causing this behavior?
Here's the pertinent code:
//
// No_Frills_FlashlightViewController.m
// No Frills Flashlight
//
// Created by Terry Donaghe on 8/9/11.
// Copyright 2011 Tilde Projects. All rights reserved.
//
#import "No_Frills_FlashlightViewController.h"
#implementation No_Frills_FlashlightViewController
#synthesize AVSession;
- (void)didReceiveMemoryWarning
{
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
#pragma mark - View lifecycle
/*
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad
{
[super viewDidLoad];
}
*/
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
// Return YES for supported orientations
return (interfaceOrientation == UIInterfaceOrientationPortrait);
}
- (IBAction)TurnOnLight:(id)sender {
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVSession = [[AVCaptureSession alloc] init];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
[AVSession addInput:input];
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[AVSession addOutput:output];
[AVSession beginConfiguration];
[device lockForConfiguration:nil];
[device setTorchMode:AVCaptureTorchModeOn];
[device setFlashMode:AVCaptureFlashModeOn];
[device unlockForConfiguration];
[AVSession commitConfiguration];
[AVSession startRunning];
[self setAVSession:AVSession];
[output release];
}
- (IBAction)TurnOffLight:(id)sender {
[AVSession stopRunning];
[AVSession release];
AVSession = nil;
}
- (IBAction)DoNothing:(id)sender {
}
#end
AVSession is just a class level AVCaptureSession variable.
And yes, this is code I just found on the internets. I'm just playing and trying to figure things out.
I figured out what was going on, and it had nothing to do with the code. :) ID10T error.
I had copied the "Turn On" button to create the "Turn Off" button. I forgot to unwire the "Turn Off" button's connection to the TurnOnLight method that was there due to the copying.
I simply removed that connection and now the app works perfectly! :)
Lesson Learned: Sometimes it's not your source code that's the problem. :D

Expected "(" before AVCaptureSession

When I try to build my application, Xcode shows me this error:
Expected "(" before AVCaptureSession
Can someone help me fix this warning? Here's my code:
ViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController
{
}

- (IBAction)SwitchON_Flash;
- (void)setTorchSession:(AVCaptureSession *)CaptureSession;

@end
ViewController.m
#import "ViewController.h"

@implementation ViewController

UIAlertView *NoFlash;

- (IBAction)SwitchON_Flash
{
    AVCaptureDevice *Device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([Device hasTorch] && [Device hasFlash])
    {
        if (Device.torchMode == AVCaptureTorchModeOff)
        {
            AVCaptureDevice *Device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            AVCaptureDeviceInput *FlashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *VideoOutput = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *CaptureSession = [[AVCaptureSession alloc] init];
            [CaptureSession beginConfiguration];
            [CaptureSession addInput:FlashInput];
            [CaptureSession addOutput:VideoOutput];
            [CaptureSession commitConfiguration];
            [CaptureSession startRunning];
            [Device lockForConfiguration:nil];
            [Device setTorchMode:AVCaptureTorchModeOn];
            [Device setFlashMode:AVCaptureFlashModeOn];
            [Device unlockForConfiguration];
            [self setTorchSession:CaptureSession];
            [CaptureSession release];
            [VideoOutput release];
        }
        else
        {
            [torchSession stopRunning];
        }
    }
    else
    {
        NoFlash = [[UIAlertView alloc] initWithTitle:@"Uh-Oh"
                                             message:@"Your device doesn't have a flash camera"
                                            delegate:nil
                                   cancelButtonTitle:@"mhmm, OK"
                                   otherButtonTitles:nil];
        NoFlash.delegate = self;
        [NoFlash show];
        [NoFlash release];
    }
}

- (void)setTorchSession:(AVCaptureSession *)CaptureSession    <== /// ERROR HERE ///
{
}
Thanks!
UIViewController does not have that method by default, and you don't have an implementation of it, hence the warning. If you try to run this code, you will get an "unrecognized selector sent to instance" error.
You either have to add a torchSession property to the view controller or implement that method yourself.
The first answer is not necessarily right, as you may have implemented the method but simply not declared it in your header file.
On the line where you get the warning, you are sending a message to self (in this case your view controller) to run the method setTorchSession: with the parameter CaptureSession.
Provided you have implemented the setTorchSession: method in your .m file, all you have to do is declare it in your interface (.h file) by adding the following line under your SwitchON_Flash method:
- (IBAction)SwitchON_Flash;
- (void)setTorchSession:(AVCaptureDeviceSession*)captureSession;
If you haven't got the method in your implementation file, your app will crash with an "unrecognized selector sent to instance" message.
Are you sure the class is AVCaptureDeviceSession? I can't find it in the Xcode help at all.
Maybe you meant AVCaptureSession.
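For completeness, a sketch of the header using AVCaptureSession and a retained property instead of the hand-written setter (so the synthesized setter handles [self setTorchSession:CaptureSession]):

// ViewController.h -- sketch only
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController

@property (nonatomic, retain) AVCaptureSession *torchSession;

- (IBAction)SwitchON_Flash;

@end

With @synthesize torchSession; in the .m file, the empty setTorchSession: stub can then be removed entirely.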

Getting exposure values from camera on iPhone OS 4.0

Exposure values from the camera can be acquired when you take a picture (without saving it to Saved Photos). A light meter application on the iPhone does this, probably by using some private API.
That application only does it on the iPhone 3GS, so I guess it may somehow be related to the EXIF data, which is populated with this information when the image is created.
This all applies to 3GS.
Has anything changed with iPhone OS 4.0?
Is there a regular way to get these values now?
Does anyone have a working code example for taking these camera/photo setting values?
Thank you
If you want realtime* exposure information, you can capture video using AVCaptureVideoDataOutput. Each frame's CMSampleBuffer is full of interesting data describing the current state of the camera.
*up to 30 fps
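As a rough sketch of that idea (it assumes the video frames carry the same kind of EXIF attachment that still-image buffers do, which may depend on the device and OS version), the sample buffer delegate callback could look like:

// AVCaptureVideoDataOutputSampleBufferDelegate callback -- sketch only.
// Whether video frames carry kCGImagePropertyExifDictionary is an assumption here.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFDictionaryRef exif = CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exif) {
        NSLog(@"frame EXIF: %@", (NSDictionary *)exif);
    }
}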
With AVFoundation in iOS 4.0 you can mess with exposure; refer specifically to AVCaptureDevice (here is a link: AVCaptureDevice ref). I'm not sure if it's exactly what you want, but you can look around AVFoundation and probably find something useful.
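For instance, a minimal sketch of what AVCaptureDevice exposes in iOS 4 (exposure modes rather than raw shutter/ISO numbers):

// Sketch: adjusting the exposure mode on the capture device.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    [device unlockForConfiguration];
}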
I think I finally found the lead to the real EXIF data. It'll be a while before I have actual code to post, but I figured this should be publicized in the meantime.
Google captureStillImageAsynchronouslyFromConnection:. It's a method of AVCaptureStillImageOutput, and the following is an excerpt from the (long sought for) documentation:
imageDataSampleBuffer -
The data that was captured.
The buffer attachments may contain metadata appropriate to the image data format. For example, a buffer containing JPEG data may carry a kCGImagePropertyExifDictionary as an attachment. See ImageIO/CGImageProperties.h for a list of keys and value types.
For an example of working with AVCaptureStillImageOutput see WWDC 2010 sample code, under AVCam.
Peace,
O.
Here is the complete solution. Don't forget to import the appropriate frameworks and headers.
In the exifAttachments variable in the captureNow method you'll find all the data you are looking for.
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>

AVCaptureStillImageOutput *stillImageOutput;
AVCaptureSession *session;

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupCaptureSession];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            if (exifAttachments)
            {
                // Do something with the attachments.
                NSLog(@"attachments: %@", exifAttachments);
            }
            else
                NSLog(@"no attachments");

            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
        }];
}

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPreset352x288;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    [device lockForConfiguration:nil];
    device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
    device.focusMode = AVCaptureFocusModeLocked;
    [device unlockForConfiguration];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    // Start the session running to start the flow of data
    [session startRunning];
    [self captureNow];
}
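As a follow-up, the individual exposure values can then be read out of that attachments dictionary by key inside the completion handler, for example (a sketch; the keys come from ImageIO's CGImageProperties.h):

// Inside the captureStillImageAsynchronouslyFromConnection: completion handler.
// Under ARC a __bridge cast is needed when converting the CFDictionaryRef.
NSDictionary *exif = (NSDictionary *)exifAttachments;
NSNumber *exposureTime = [exif objectForKey:(NSString *)kCGImagePropertyExifExposureTime];
NSArray *isoValues = [exif objectForKey:(NSString *)kCGImagePropertyExifISOSpeedRatings];
NSLog(@"exposure time: %@  ISO: %@", exposureTime, [isoValues lastObject]);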

CMMotionManager and the Gyroscope on iPhone 4

I am trying to simply NSLog the output of the new iPhone 4 gyroscope, but after reading the documentation and following the sample code I am getting this error:
ERROR,Time,300635803.946,Function,"CLLoggingSetFile",could not open locations log /var/mobile/Library/Caches/CoreMotion/CoreMotion.log
Even if I just set up my motionManager object with [[CMMotionManager alloc] init]; on its own, with no other code, I still get the error.
Here is my .h file.
#import <UIKit/UIKit.h>
#import <CoreMotion/CoreMotion.h>

@interface GyroTest0ViewController : UIViewController {
    CMMotionManager *motionManager;
    NSOperationQueue *opQ;
}
@end
And here is my .m file.
- (void)viewDidLoad {
    [super viewDidLoad];
    // the error occurs even just with this line on its own
    motionManager = [[CMMotionManager alloc] init];
    if (motionManager.gyroAvailable) {
        motionManager.gyroUpdateInterval = 1.0/60.0;
        [motionManager startGyroUpdates];
        opQ = [[NSOperationQueue currentQueue] retain];
        CMGyroHandler gyroHandler = ^(CMGyroData *gyroData, NSError *error) {
            CMRotationRate rotate = gyroData.rotationRate;
            NSLog(@"rotation rate = [%f, %f, %f]", rotate.x, rotate.y, rotate.z);
        };
    } else {
        NSLog(@"No gyroscope on device.");
        [motionManager release];
    }
}
Any help and/or source code to simply log the iPhone 4 gyroscope data would be much appreciated. Many thanks!
Try this:
motionManager.gyroUpdateInterval = 1.0/60.0;
[motionManager startGyroUpdatesToQueue:[NSOperationQueue currentQueue]
                           withHandler:^(CMGyroData *gyroData, NSError *error)
{
    CMRotationRate rotate = gyroData.rotationRate;
    NSLog(@"rotation rate = [%f, %f, %f]", rotate.x, rotate.y, rotate.z);
}];
For the WWDC sample code:
1. Log in to ADC
2. Click on WWDC 2010 session videos
3. View in iTunes
4. There you'll find the link to the sample code (230 MB)
Any results regarding this issue? I get the same error even when I use the WWDC teapot demo code.
I filed a bug report (8382659).
By the way, I get the same error when I use the push method described by Joshua Weinberg.
Update: Apple confirmed the bug but referred to a duplicate issue, 8173937, that I can't find. Well, let's hope that it gets fixed in the next release.