CMMotionManager and the Gyroscope on iPhone 4

I am trying to simply NSLog the output of the new iPhone 4 gyroscope, but after reading the documentation and following Apple's sample code I am getting this error:
ERROR,Time,300635803.946,Function,"CLLoggingSetFile",could not open locations log /var/mobile/Library/Caches/CoreMotion/CoreMotion.log
Even if I just set up my motionManager object with [[CMMotionManager alloc] init]; on its own, with no other code, I still get the error.
Here is my .h file:
#import <UIKit/UIKit.h>
#import <CoreMotion/CoreMotion.h>

@interface GyroTest0ViewController : UIViewController {
    CMMotionManager *motionManager;
    NSOperationQueue *opQ;
}
@end
And here is my .m file:
- (void)viewDidLoad {
    [super viewDidLoad];
    // the error occurs even just with this line on its own
    motionManager = [[CMMotionManager alloc] init];
    if (motionManager.gyroAvailable) {
        motionManager.gyroUpdateInterval = 1.0/60.0;
        [motionManager startGyroUpdates];
        opQ = [[NSOperationQueue currentQueue] retain];
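        // NOTE: this handler is created but never handed to the motion
        // manager, so it never fires; see the answer below.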
        CMGyroHandler gyroHandler = ^(CMGyroData *gyroData, NSError *error) {
            CMRotationRate rotate = gyroData.rotationRate;
            NSLog(@"rotation rate = [%f, %f, %f]", rotate.x, rotate.y, rotate.z);
        };
    } else {
        NSLog(@"No gyroscope on device.");
        [motionManager release];
    }
}
Any help and/or source code to simply log the iPhone 4 gyroscope data would be much appreciated. Many thanks!

Try this:
motionManager.gyroUpdateInterval = 1.0/60.0;
[motionManager startGyroUpdatesToQueue:[NSOperationQueue currentQueue]
                           withHandler:^(CMGyroData *gyroData, NSError *error)
{
    CMRotationRate rotate = gyroData.rotationRate;
    NSLog(@"rotation rate = [%f, %f, %f]", rotate.x, rotate.y, rotate.z);
}];
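One usage note (my addition, not part of the original answer): the CMMotionManager must stay alive for the handler to keep firing, so keep it in an ivar or property as the question does, and stop updates when you are done with them:

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    // Stop gyro updates so the hardware can power down.
    [motionManager stopGyroUpdates];
}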

For the WWDC sample code:
1. Log in to ADC.
2. Click on the WWDC 2010 session videos.
3. View in iTunes.
4. There you'll find the link to the sample code (230 MB).

Any results regarding this issue? I get the same error even when I use the WWDC teapot demo code.
I filed a bug report (8382659).
By the way, I get the same error when I use the push method described by Joshua Weinberg.
Update: Apple confirmed the bug but referred to a duplicate issue, 8173937, that I can't find. Well, let's hope that it will be fixed in the next release.

Related

Leaderboard "No Items"

I'm trying to implement Game Center in my app.
This code is supposed to show the Game Center leaderboard, but it shows me: No Items.
- (IBAction)ShowLeader {
    GKGameCenterViewController *gameCenterController = [[GKGameCenterViewController alloc] init];
    gameCenterController.viewState = GKGameCenterViewControllerStateLeaderboards;
    gameCenterController.gameCenterDelegate = self;
    [self presentViewController:gameCenterController animated:YES completion:nil];
}
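Since the code sets gameCenterDelegate = self, the controller also needs the GKGameCenterControllerDelegate dismissal callback; a minimal version (shown here for completeness) looks like this:

- (void)gameCenterViewControllerDidFinish:(GKGameCenterViewController *)gameCenterViewController
{
    // Required delegate method: dismiss the Game Center UI when the user is done.
    [self dismissViewControllerAnimated:YES completion:nil];
}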
The user is authenticated, and that shows up when I connect.
When I report a score in my GameViewController:
if ([GKLocalPlayer localPlayer].isAuthenticated) {
    GKScore *scoreReporter = [[GKScore alloc] initWithLeaderboardIdentifier:@"GameHighScore"];
    scoreReporter.value = HighScoreNbr;
    scoreReporter.context = 0;
    // NSArray *scores = @[scoreReporter];
    [GKScore reportScores:@[scoreReporter] withCompletionHandler:^(NSError *error) {
        if (error) {
            NSLog(@"error: %@", error);
        }
        printf("no error: ");
    }];
}
This shows me no error, so I suppose it works.
I already tried with 2 accounts, since I saw that suggested in another answer, but it didn't help.
If you need any more info, please comment.
Thanks.
I found it out myself after reading and viewing tons of videos.
In case it helps someone who has the same problem:
you need to add your Bundle ID from iTunes Connect to your Xcode 5 Info.plist.
Then it should work.
Hope it helps ;)
This can also happen if gamekit is missing from Required device capabilities in your plist.
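For reference, that Info.plist entry would look something like this (gamekit is the capability string documented for Game Center):

<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>gamekit</string>
</array>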

Required device capabilities

If I use the accelerometer in my app, should I add Required device capabilities to my Info.plist and enter accelerometer? If yes, what exactly should I add?
You should write code defensively and always "degrade" the app nicely.
For example, you should think about how your app will run if the accelerometer is not present.
Have a look at the CMMotionManager class, in particular at its accelerometerAvailable property, which returns a BOOL. Here is an example:
CMMotionManager *manager = [[CMMotionManager alloc] init];
if (!manager.accelerometerAvailable) {
    NSLog(@"Accelerometer not available");
} else {
    manager.accelerometerUpdateInterval = 1.0;
    NSOperationQueue *motionQueue = [[NSOperationQueue alloc] init];
    [manager startAccelerometerUpdatesToQueue:motionQueue
                                  withHandler:^(CMAccelerometerData *data, NSError *error) {
        NSLog(@"Accelerometer data: %@", [data description]);
    }];
}
You need to use
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>accelerometer</string>
</array>
Further details in the iOS Application Programming Guide (page 89, table 5-1).
You do not need to include this key if your app detects only device orientation changes.
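For example, plain orientation detection works without any capability entry; a minimal sketch using the standard UIDevice notification (orientationChanged: is a hypothetical selector of your own):

// Start receiving orientation notifications; no accelerometer entry
// in UIRequiredDeviceCapabilities is needed for this.
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(orientationChanged:)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];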
No, you don't need to add anything to your Info.plist.
Just make sure that your app delegate implements the UIAccelerometerDelegate protocol, and then in your .m file just do:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    ...
    [[UIAccelerometer sharedAccelerometer] setDelegate:self];
    ...
}

- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // your accelerometer code
}
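If the default reporting rate does not suit you, you can also set the update interval before assigning the delegate (a small sketch; 60 Hz is just an illustrative value):

// Ask for accelerometer callbacks roughly 60 times per second.
[[UIAccelerometer sharedAccelerometer] setUpdateInterval:1.0 / 60.0];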

Expected "(" before AVCaptureSession

When I try to build my application, Xcode shows me this error:
Expected "(" before AVCaptureSession
Can someone help me fix this warning? Here's my code:
ViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController
{
}
- (IBAction)SwitchON_Flash;
- (void)setTorchSession:(AVCaptureSession *)CaptureSession;
@end
ViewController.m
#import "ViewController.h"
#implementation ViewController
UIAlertView *NoFlash;
- (IBAction)SwitchON_Flash
{
AVCaptureDevice *Device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([Device hasTorch] && [Device hasFlash])
{
if (Device.torchMode == AVCaptureTorchModeOff)
{
AVCaptureDevice *Device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *FlashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error: nil];
AVCaptureVideoDataOutput *VideoOutput = [[AVCaptureVideoDataOutput alloc] init];
AVCaptureSession *CaptureSession = [[AVCaptureSession alloc] init];
[CaptureSession beginConfiguration];
[CaptureSession addInput:FlashInput];
[CaptureSession addOutput:VideoOutput];
[CaptureSession commitConfiguration];
[CaptureSession startRunning];
[Device lockForConfiguration:nil];
[Device setTorchMode:AVCaptureTorchModeOn];
[Device setFlashMode:AVCaptureFlashModeOn];
[Device unlockForConfiguration];
[self setTorchSession:CaptureSession];
[CaptureSession release];
[VideoOutput release];
}
else
{
[torchSession stopRunning];
}
}
else
{
NoFlash = [[UIAlertView alloc] initWithTitle:#"Uh-Oh"
message:#"Your device doesn't have a flash camera"
delegate:nil
cancelButtonTitle:#"mhmm, OK"
otherButtonTitles:nil];
NoFlash.delegate = self;
[NoFlash show];
[NoFlash release];
}
}
- (void)setTorchSession:(AVCaptureSession *)CaptureSession <== /// ERROR HERE ///
{
}
Thanks!
UIViewController does not have that method by default, and you don't have an implementation for it, hence the warning. If you try to run this code, you will get an "unrecognized selector sent to instance" error.
You either have to add a torchSession property to your view controller or implement that specific method.
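A minimal sketch of the property route, using the names from the question (MRC-style, to match the question's use of release):

// In ViewController.h -- declaring the property gives you setTorchSession: for free.
@property (nonatomic, retain) AVCaptureSession *torchSession;

// In ViewController.m
@synthesize torchSession;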
The first answer is not necessarily right, as you may have implemented the method but simply not declared it in your header file.
On the line where you get the warning, you are sending a message to self (in this case your view controller) to run the method setTorchSession with the parameter CaptureSession.
Provided you have implemented the setTorchSession method in your .m file, all you have to do is declare it in your interface (.h file) by adding the following line under your SwitchON_Flash method:
- (IBAction)SwitchON_Flash;
- (void)setTorchSession:(AVCaptureDeviceSession *)captureSession;
If you haven't got the method in your implementation file, your app will crash with an "Unrecognised selector sent to instance" message.
Are you sure the class is AVCaptureDeviceSession? I can't find it in the Xcode help at all.
Maybe you meant AVCaptureSession.

How can I make the iPhone 4 LED light fire instantly?

I'm currently using the code below to turn my iPhone 4 LED light on and off, and it's working great; the only problem is that every time I turn the LED on there is a slight delay, whereas it turns off instantly. I need it to fire instantly to implement a strobe-like feature, and because it's just more convenient.
I've noticed that in Apple's camera app, and in many other apps, the LED turns on and off instantaneously when you hit the power button.
I've tried adding some of the objects like "session" and "device" as instance variables to my view controller in order to have the iPhone create those objects at load time, but I haven't had any luck in getting it to work.
I've also tried looking at Apple's WWDC sample code, but I just can't seem to decipher their complex code. Can someone please help me figure this out? I've been trying for about 4 days to get this to work.
.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface FlashlightViewController : UIViewController {
    AVCaptureSession *torchSession;
}
@property (nonatomic, retain) AVCaptureSession *torchSession;
- (void)toggleTorch;
@end
.m
#import "FlashlightViewController.h"
#implementation FlashlightViewController
#synthesize torchSession;
- (void)dealloc
{
[torchSession release];
[super dealloc];
}
- (void)viewDidLoad
{
[self toggleTorch];
[super viewDidLoad];
}
- (void) toggleTorch
{
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device hasTorch] && [device hasFlash])
{
if (device.torchMode == AVCaptureTorchModeOff)
{
NSLog(#"It's currently off.. turning on now.");
AVCaptureDeviceInput *flashInput = [AVCaptureDeviceInput deviceInputWithDevice:device error: nil];
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
[device lockForConfiguration:nil];
[device setTorchMode:AVCaptureTorchModeOn];
[device setFlashMode:AVCaptureFlashModeOn];
[session addInput:flashInput];
[session addOutput:output];
[device unlockForConfiguration];
[output release];
[session commitConfiguration];
[session startRunning];
[self setTorchSession:session];
[session release];
}
else {
NSLog(#"It's currently on.. turning off now.");
[torchSession stopRunning];
}
}
}
Do everything (all the session and device configuration stuff) except the flash configuration block before you want to turn the flash LED on, during app init or view load.
Then just set torch mode on when you want to turn the LED on. Something like:
[self.myDevice lockForConfiguration:nil];
[self.myDevice setTorchMode:AVCaptureTorchModeOn];
[self.myDevice setFlashMode:AVCaptureFlashModeOn];
[self.myDevice unlockForConfiguration];
Make sure that myDevice is a properly configured property during your init.
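A sketch of what that up-front configuration might look like, assuming myDevice and torchSession are retained properties (the names are illustrative, not from the original answer):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Build and start the capture session once, so toggling the torch later
    // is only a device configuration change on an already-running session.
    self.myDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:self.myDevice error:nil];
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    [session addInput:input];
    [session startRunning];
    self.torchSession = session;
    [session release];
}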
A bit necromantic, but here is a great library to do it: LARSTTorch

Getting exposure values from camera on iPhone OS 4.0

Exposure values from the camera can be acquired when you take a picture (without saving it to Saved Photos). A light meter application on the iPhone does this, probably by using some private API.
That application works on the iPhone 3GS only, so I guess it may be somehow related to the EXIF data, which is populated with this information when the image is created.
This all applies to the 3GS.
Has anything changed with iPhone OS 4.0?
Is there a regular way to get these values now?
Does anyone have a working code example for taking these camera/photo setting values?
Thank you
If you want realtime* exposure information, you can capture video using AVCaptureVideoDataOutput. Each frame's CMSampleBuffer is full of interesting data describing the current state of the camera.
*up to 30 fps
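A sketch of that approach (assuming an AVCaptureSession named session is already configured; whether the EXIF dictionary attachment is present on video frames can vary, so the code simply probes for it):

// During session setup: deliver each captured frame to this object.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];

// AVCaptureVideoDataOutputSampleBufferDelegate: called once per frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFDictionaryRef exif = CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exif) {
        NSLog(@"frame metadata: %@", exif);
    }
}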
With AVFoundation in iOS 4.0 you can mess with exposure; refer specifically to AVCaptureDevice (see the AVCaptureDevice reference). Not sure if it's exactly what you want, but you can look around AVFoundation and probably find some useful stuff.
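For instance, a minimal sketch of locking the exposure via AVCaptureDevice (my example, not from the original answer):

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device isExposureModeSupported:AVCaptureExposureModeLocked] &&
    [device lockForConfiguration:nil]) {
    // Freeze exposure so successive readings are comparable.
    device.exposureMode = AVCaptureExposureModeLocked;
    [device unlockForConfiguration];
}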
I think I finally found the lead to the real EXIF data. It'll be a while before I have actual code to post, but I figured this should be publicized in the meantime.
Google captureStillImageAsynchronouslyFromConnection. It's a method of AVCaptureStillImageOutput, and the following is an excerpt from the (long sought for) documentation:
imageDataSampleBuffer -
The data that was captured.
The buffer attachments may contain metadata appropriate to the image data format. For example, a buffer containing JPEG data may carry a kCGImagePropertyExifDictionary as an attachment. See ImageIO/CGImageProperties.h for a list of keys and value types.
For an example of working with AVCaptureStillImageOutput, see the WWDC 2010 sample code, under AVCam.
Peace,
O.
Here is the complete solution. Don't forget to import the appropriate frameworks and headers.
In the exifAttachments variable in the captureNow method you'll find all the data you are looking for.
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>

AVCaptureStillImageOutput *stillImageOutput;
AVCaptureSession *session;

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupCaptureSession];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            if (exifAttachments)
            {
                // Do something with the attachments.
                NSLog(@"attachments: %@", exifAttachments);
            }
            else
                NSLog(@"no attachments");

            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
        }];
}
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPreset352x288;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
    device.focusMode = AVCaptureFocusModeLocked;
    [device unlockForConfiguration];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handling the error appropriately.
    }
    [session addInput:input];

    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    // Start the session running to start the flow of data
    [session startRunning];
    [self captureNow];
}