Expected "(" before AVCaptureSession - iphone

When I try to build my application, Xcode shows me this error:
Expected "(" before AVCaptureSession
Can someone help me fix this warning? Here's my code:
ViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface ViewController : UIViewController
{
}
- (IBAction)SwitchON_Flash;
- (void)setTorchSession:(AVCaptureSession *)CaptureSession;
@end
ViewController.m
#import "ViewController.h"
@implementation ViewController

UIAlertView *NoFlash;

- (IBAction)SwitchON_Flash
{
    AVCaptureDevice *Device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([Device hasTorch] && [Device hasFlash])
    {
        if (Device.torchMode == AVCaptureTorchModeOff)
        {
            AVCaptureDevice *Device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            AVCaptureDeviceInput *FlashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *VideoOutput = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *CaptureSession = [[AVCaptureSession alloc] init];
            [CaptureSession beginConfiguration];
            [CaptureSession addInput:FlashInput];
            [CaptureSession addOutput:VideoOutput];
            [CaptureSession commitConfiguration];
            [CaptureSession startRunning];
            [Device lockForConfiguration:nil];
            [Device setTorchMode:AVCaptureTorchModeOn];
            [Device setFlashMode:AVCaptureFlashModeOn];
            [Device unlockForConfiguration];
            [self setTorchSession:CaptureSession];
            [CaptureSession release];
            [VideoOutput release];
        }
        else
        {
            [torchSession stopRunning];
        }
    }
    else
    {
        NoFlash = [[UIAlertView alloc] initWithTitle:@"Uh-Oh"
                                             message:@"Your device doesn't have a flash camera"
                                            delegate:nil
                                   cancelButtonTitle:@"mhmm, OK"
                                   otherButtonTitles:nil];
        NoFlash.delegate = self;
        [NoFlash show];
        [NoFlash release];
    }
}

- (void)setTorchSession:(AVCaptureSession *)CaptureSession   <== /// ERROR HERE ///
{
}
Thanks!

UIViewController does not have that method by default, and you don't have the implementation of that method, hence it's giving you this warning. If you try to run this code, you will get an "unrecognized selector sent to instance" error.
You either have to add a property for torchSession in your view controller or implement that specific method.
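For the property route, a minimal sketch (reusing the torchSession name your else branch already refers to; @synthesize then generates setTorchSession: for you, so you don't have to write it by hand):

// ViewController.h
@interface ViewController : UIViewController
@property (nonatomic, retain) AVCaptureSession *torchSession;
- (IBAction)SwitchON_Flash;
@end

// ViewController.m
@implementation ViewController
@synthesize torchSession;   // generates -torchSession and -setTorchSession:
// ... SwitchON_Flash stays as it is; [self setTorchSession:CaptureSession] now resolves.
@end

With a retained property in place, the existing [CaptureSession release] right after [self setTorchSession:CaptureSession] is fine, because the property keeps its own reference to the session.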

The first answer is not necessarily right, as you may have implemented the method but simply not declared it in your header file.
On the line where you get the warning, you are sending a message to self (in this case your view controller) to run the method setTorchSession: with the parameter CaptureSession.
Provided you have implemented the setTorchSession: method in your .m file, all you have to do is declare it in your interface (.h file) by adding the following line under your SwitchON_Flash method:
- (IBAction)SwitchON_Flash;
- (void)setTorchSession:(AVCaptureDeviceSession*)captureSession;
If you haven't got the method in your implementation file, your app will crash with an "unrecognised selector sent to instance" message.

Are you sure the class is AVCaptureDeviceSession? I can't find it in the Xcode help at all.
Maybe you meant AVCaptureSession.

Related

Toggle Flashlight when Motion is detected

I'm currently working on a small project: when you shake the device, the flash ignites; if you shake it again, the light turns off!
The first 3-4 toggles between light on and off work OK, but after that it is impossible to turn off the light, since the device doesn't detect a shake.
Another small problem (since iOS 5.0) is that when a motion is detected, the flash blinks very briefly (about 1 sec) and then powers on. This looks like a short flash.
What am I doing wrong?
Detecting the shake:
- (void)motionBegan:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (event.type == UIEventSubtypeMotionShake) {
        NSLog(@"shaken, not stirred.");
        [self playSound];
        [self toggleFlashlight];
    }
}
Toggling the flash light:
- (void)toggleFlashlight
{
    AVCaptureDevice *device =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash]) {
        if (device.torchMode == AVCaptureTorchModeOff) {
            NSLog(@"It's currently off.. turning on now.");
            AVCaptureDeviceInput *flashInput =
                [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output =
                [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *session =
                [[AVCaptureSession alloc] init];
            [session beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [session addInput:flashInput];
            [session addOutput:output];
            [device unlockForConfiguration];
            [output release];
            [session commitConfiguration];
            [session startRunning];
            [self setAVSession:session];
            [session release];
        }
        else {
            NSLog(@"It's currently on.. turning off now.");
            [AVSession stopRunning];
        }
    }
}
I would really appreciate your expertise in solving the problem for me!
Well, I don't know much about it, but it's possible you're starting/stopping the session too quickly (calling toggleFlashlight too soon after last calling it). You might see this code here... It suggests that you do all of that setup (except for the torch/flash mode lines) once to create the session, and then just do lock/torchModeOn/flashModeOn/unlock to turn on and lock/torchModeOff/flashModeOff/unlock to turn off.
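A rough sketch of what that restructured toggle could look like (assuming the session has already been created, configured and started once elsewhere, e.g. at view load, and is kept alive in a property; only the torch/flash mode is touched on each toggle):

- (void)toggleFlashlight
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch] || ![device hasFlash])
        return;
    [device lockForConfiguration:nil];
    if (device.torchMode == AVCaptureTorchModeOff) {
        // Session was configured once at startup; just flip the torch on.
        [device setTorchMode:AVCaptureTorchModeOn];
        [device setFlashMode:AVCaptureFlashModeOn];
    } else {
        [device setTorchMode:AVCaptureTorchModeOff];
        [device setFlashMode:AVCaptureFlashModeOff];
    }
    [device unlockForConfiguration];
}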
Calling your code in
- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event
instead of motionBegan: might solve your problems.
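In other words, something like this (the body is just your existing motionBegan: code moved over, checking the motion subtype that is passed in):

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        NSLog(@"shaken, not stirred.");
        [self playSound];
        [self toggleFlashlight];
    }
}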

How do I keep my iPhone 4 flashlight app from blinking when it turns off?

I'm playing around with a simple little flashlight app that turns on and off the LED flash when you press buttons on my view.
It works just fine, but when I turn off the flash, it blinks once before turning off. Any ideas what's causing this behavior?
Here's the pertinent code:
//
// No_Frills_FlashlightViewController.m
// No Frills Flashlight
//
// Created by Terry Donaghe on 8/9/11.
// Copyright 2011 Tilde Projects. All rights reserved.
//
#import "No_Frills_FlashlightViewController.h"
@implementation No_Frills_FlashlightViewController

@synthesize AVSession;

- (void)didReceiveMemoryWarning
{
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];
    // Release any cached data, images, etc that aren't in use.
}

#pragma mark - View lifecycle

/*
// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.
- (void)viewDidLoad
{
    [super viewDidLoad];
}
*/

- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    // Return YES for supported orientations
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}

- (IBAction)TurnOnLight:(id)sender {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVSession = [[AVCaptureSession alloc] init];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    [AVSession addInput:input];
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [AVSession addOutput:output];
    [AVSession beginConfiguration];
    [device lockForConfiguration:nil];
    [device setTorchMode:AVCaptureTorchModeOn];
    [device setFlashMode:AVCaptureFlashModeOn];
    [device unlockForConfiguration];
    [AVSession commitConfiguration];
    [AVSession startRunning];
    [self setAVSession:AVSession];
    [output release];
}

- (IBAction)TurnOffLight:(id)sender {
    [AVSession stopRunning];
    [AVSession release];
    AVSession = nil;
}

- (IBAction)DoNothing:(id)sender {
}

@end
AVSession is just a class-level AVCaptureSession variable.
And yes, this is code I just found on the internets. I'm just playing and trying to figure things out.
I figured out what was going on, and it had nothing to do with the code. :) ID10T error.
I had copied the "Turn On" button to create the "Turn Off" button. I forgot to unwire the "Turn Off" button's connection to the TurnOnLight method that was there due to the copying.
I simply removed that connection and now the app works perfectly! :)
Lesson Learned: Sometimes it's not your source code that's the problem. :D

Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput

I need to be able to have AVCaptureVideoDataOutput and AVCaptureMovieFileOutput working at the same time. The below code works; however, the video recording does not. The didFinishRecordingToOutputFileAtURL delegate is called directly after startRecordingToOutputFileURL is called. Now if I remove AVCaptureVideoDataOutput from the
AVCaptureSession by simply commenting out the line:
[captureSession addOutput:captureDataOutput];
the video recording works, but then the SampleBufferDelegate is not called (which I need).
How can I go about having AVCaptureVideoDataOutput and AVCaptureMovieFileOutput working simultaneously?
- (void)initCapture {
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:NULL];
    captureDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    m_captureFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureDataOutput setVideoSettings:videoSettings];
    captureSession = [[AVCaptureSession alloc] init];
    [captureSession addInput:captureInput];
    [captureSession addOutput:m_captureFileOutput];
    [captureSession addOutput:captureDataOutput];
    [captureSession beginConfiguration];
    [captureSession setSessionPreset:AVCaptureSessionPresetLow];
    [captureSession commitConfiguration];
    [self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
    [self performSelector:@selector(stopRecording) withObject:nil afterDelay:15.0];
    [captureSession startRunning];
}

- (void)startRecording
{
    [m_captureFileOutput startRecordingToOutputFileURL:[self tempFileURL] recordingDelegate:self];
}

- (void)stopRecording
{
    if ([m_captureFileOutput isRecording])
        [m_captureFileOutput stopRecording];
}

- (NSURL *)tempFileURL
{
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"camera.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
    }
    [outputPath release];
    return [outputURL autorelease];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"start record video");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"end record");
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // do stuff with sampleBuffer
}
I should add I am getting the error:
Error Domain=NSOSStatusErrorDomain Code=-12780 "The operation couldn’t be completed. (OSStatus error -12780.)" UserInfo=0x23fcd0 {AVErrorRecordingSuccessfullyFinishedKey=false}
from
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
Cheers
I have contacted an engineer at Apple's support and he told me that simultaneous AVCaptureVideoDataOutput + AVCaptureMovieFileOutput use is not supported. I don't know if they will support it in the future, but he used the words "not supported at this time".
I encourage you to file a bug report / feature request on this, as I did (bugreport.apple.com), as they measure how much people want something, and perhaps we can see this in the near future.
Still, 9 years later, Apple apparently does not want these two to work together.
But you can easily work around it with AVAssetWriter.
You can't use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time, but you can use AVCaptureVideoDataOutput, analyse or modify the data, and then use AVAssetWriter to write the frames to a file.
Source: https://developer.apple.com/forums/thread/98113
How to save video with output using AVAssetWriter:
Save AVCaptureVideoDataOutput to movie file using AVAssetWriter in Swift
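A minimal Objective-C sketch of that approach (the assetWriter / writerInput ivars, the 640x480 output settings, and the method name setUpWriterWithURL: are illustrative assumptions, not from the linked thread; error handling is omitted):

// Ivars assumed on the capture class: AVAssetWriter *assetWriter; AVAssetWriterInput *writerInput;
- (void)setUpWriterWithURL:(NSURL *)outputURL
{
    NSError *error = nil;
    assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @640,
                                AVVideoHeightKey : @480 };
    writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                                 outputSettings:settings];
    writerInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:writerInput];
}

// Feed it from the sample-buffer callback you already have:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // analyse / modify the frame here, then append it to the writer
    if (assetWriter.status == AVAssetWriterStatusUnknown) {
        [assetWriter startWriting];
        [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (writerInput.readyForMoreMediaData) {
        [writerInput appendSampleBuffer:sampleBuffer];
    }
}
// When recording is done: [writerInput markAsFinished]; then [assetWriter finishWritingWithCompletionHandler:...];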
Although you cannot use AVCaptureVideoDataOutput, you can use AVCaptureVideoPreviewLayer simultaneously with AVCaptureMovieFileOutput. See the "AVCam" example on Apple's Website.
In Xamarin.iOS, the code looks like this:
var session = new AVCaptureSession();
var camera = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
var mic = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Audio);
if (camera == null || mic == null) {
    throw new Exception("Can't find devices");
}
if (session.CanAddInput(camera)) {
    session.AddInput(camera);
}
if (session.CanAddInput(mic)) {
    session.AddInput(mic);
}
var layer = new AVCaptureVideoPreviewLayer(session);
layer.LayerVideoGravity = AVLayerVideoGravity.ResizeAspectFill;
layer.VideoGravity = AVCaptureVideoPreviewLayer.GravityResizeAspectFill;
cameraView = new UIView();
cameraView.Layer.AddSublayer(layer);
var filePath = System.IO.Path.Combine(Path.GetTempPath(), "temporary.mov");
var fileUrl = NSUrl.FromFilename(filePath);
var movieFileOutput = new AVCaptureMovieFileOutput();
var recordingDelegate = new MyRecordingDelegate();
session.AddOutput(movieFileOutput);
movieFileOutput.StartRecordingToOutputFile(fileUrl, recordingDelegate);
Xcode 14.1 already supports it.
Xcode 13.4: does not work
Xcode 14.1: works

How can I make the iPhone 4 LED light fire instantly?

I'm currently using the below code to turn my iPhone 4 LED light on and off, and it's working great, but the only problem is that every time I turn the LED on there is a slight delay. However, it turns off instantly. I need it to fire instantly to implement a strobe-like feature, and because it's just more convenient.
I've noticed that in Apple's camera app and many other apps the LED turns on and off instantaneously when you hit the power button.
I've tried adding some of the objects like "session" and "device" as instance variables to my view controller in order to have the iPhone create those objects at load time, but I haven't had any luck in getting it to work.
I've also tried looking at Apple's WWDC sample code, but I just can't seem to decipher their complex code. Can someone please help me figure this out? I've been trying for about 4 days to get this to work.
.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface FlashlightViewController : UIViewController {
    AVCaptureSession *torchSession;
}

@property (nonatomic, retain) AVCaptureSession *torchSession;

- (void)toggleTorch;

@end
.m
#import "FlashlightViewController.h"
@implementation FlashlightViewController

@synthesize torchSession;

- (void)dealloc
{
    [torchSession release];
    [super dealloc];
}

- (void)viewDidLoad
{
    [self toggleTorch];
    [super viewDidLoad];
}

- (void)toggleTorch
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash])
    {
        if (device.torchMode == AVCaptureTorchModeOff)
        {
            NSLog(@"It's currently off.. turning on now.");
            AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
            AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
            AVCaptureSession *session = [[AVCaptureSession alloc] init];
            [session beginConfiguration];
            [device lockForConfiguration:nil];
            [device setTorchMode:AVCaptureTorchModeOn];
            [device setFlashMode:AVCaptureFlashModeOn];
            [session addInput:flashInput];
            [session addOutput:output];
            [device unlockForConfiguration];
            [output release];
            [session commitConfiguration];
            [session startRunning];
            [self setTorchSession:session];
            [session release];
        }
        else {
            NSLog(@"It's currently on.. turning off now.");
            [torchSession stopRunning];
        }
    }
}
Do everything (all the session and device configuration stuff) except the torch/flash configuration block before you want to turn the flash LED on, i.e. during app init or view load.
Then just set torch mode on when you want to turn the LED on. Something like:
[self.myDevice lockForConfiguration:nil];
[self.myDevice setTorchMode:AVCaptureTorchModeOn];
[self.myDevice setFlashMode:AVCaptureFlashModeOn];
[self.myDevice unlockForConfiguration];
Make sure that myDevice is a properly configured property during your init.
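For example, the one-time setup could look roughly like this (a sketch, assuming torchSession is the retained property from your header and myDevice is an additional retained AVCaptureDevice property you add for the snippet above; both names are illustrative):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // One-time session/device setup; the torch itself is not touched here.
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device hasTorch] && [device hasFlash]) {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        [session beginConfiguration];
        [session addInput:[AVCaptureDeviceInput deviceInputWithDevice:device error:nil]];
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [session addOutput:output];
        [output release];
        [session commitConfiguration];
        [session startRunning];
        self.torchSession = session;    // retained property keeps the session alive
        [session release];
        self.myDevice = device;         // assumed retained AVCaptureDevice property
    }
}

After that, turning the LED on or off is just the lock/setTorchMode/unlock block from the answer, with no session work per toggle.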
A bit necromantic, but here is a great library to do it:
LARSTTorch

CMMotionManager and the Gyroscope on iPhone 4

I am trying to simply NSLog the output of the new iPhone 4 gyroscope, but after reading the documentation and following their sample code I am getting this error.
ERROR,Time,300635803.946,Function,"CLLoggingSetFile",could not open locations log /var/mobile/Library/Caches/CoreMotion/CoreMotion.log
Even if I just set up my motionManager object with [[CMMotionManager alloc] init]; on its own and no other code, I still get the error.
Here is my .h file.
#import <UIKit/UIKit.h>
#import <CoreMotion/CoreMotion.h>
@interface GyroTest0ViewController : UIViewController {
    CMMotionManager *motionManager;
    NSOperationQueue *opQ;
}
@end
And here is my .m file.
- (void)viewDidLoad {
    [super viewDidLoad];
    // the error occurs even just with this line on its own
    motionManager = [[CMMotionManager alloc] init];
    if (motionManager.gyroAvailable) {
        motionManager.gyroUpdateInterval = 1.0/60.0;
        [motionManager startGyroUpdates];
        opQ = [[NSOperationQueue currentQueue] retain];
        CMGyroHandler gyroHandler = ^(CMGyroData *gyroData, NSError *error) {
            CMRotationRate rotate = gyroData.rotationRate;
            NSLog(@"rotation rate = [%f, %f, %f]", rotate.x, rotate.y, rotate.z);
        };
    } else {
        NSLog(@"No gyroscope on device.");
        [motionManager release];
    }
}
Any help and/or source code to simply log the iPhone 4 gyroscope data would be much appreciated. Many thanks!
Try this,
motionManager.gyroUpdateInterval = 1.0/60.0;
[motionManager startGyroUpdatesToQueue:[NSOperationQueue currentQueue]
                           withHandler:^(CMGyroData *gyroData, NSError *error)
{
    CMRotationRate rotate = gyroData.rotationRate;
    NSLog(@"rotation rate = [%f, %f, %f]", rotate.x, rotate.y, rotate.z);
}];
For the WWDC sample code:
Log in to ADC
Click on WWDC 2010 session videos
View in iTunes
There you find the link to sample code (230 MB)
Any results regarding this issue? I get the same error even when I use the WWDC teapot demo code.
I filed a bug report (8382659).
By the way, I get the same error when I use the push method described by Joshua Weinberg.
Update: Apple confirmed the bug but referred to a duplicate issue, 8173937, that I can't find. Well, let's hope that it will be fixed in the next release.