Using CMDeviceMotion to track iPhone tilt over time

I'm trying to use CMDeviceMotion to track when the iPhone is tilted backwards, and then do something. I've successfully created the CMMotionManager, I'm getting motion data from the system, and I'm filtering for results above a certain acceleration.
What I want to do though, is detect when the device is being tilted backwards or forwards above a certain speed. How do I do that?
Here's the code I have so far:
UPDATE: I think I solved it. I was looking for the rotationRate property, CMRotationRate. I've paired it with the acceleration check; I really only need the x value, so I'll keep working on it. If anyone has tips on the code below, they're much appreciated.
- (void)startMotionManager
{
    if (motionManager == nil) {
        motionManager = [[CMMotionManager alloc] init];
    }
    motionManager.deviceMotionUpdateInterval = 1/15.0;
    if (motionManager.deviceMotionAvailable) {
        NSLog(@"Device Motion Available");
        [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                           withHandler:^(CMDeviceMotion *motion, NSError *error) {
            //CMAttitude *attitude = motion.attitude;
            //NSLog(@"rotation rate = [%f, %f, %f]", attitude.pitch, attitude.roll, attitude.yaw);
            [self performSelectorOnMainThread:@selector(handleDeviceMotion:) withObject:motion waitUntilDone:YES];
        }];
        //[motionManager startDeviceMotionUpdates];
    } else {
        NSLog(@"No device motion on device.");
        [self setMotionManager:nil];
    }
}
- (void)handleDeviceMotion:(CMDeviceMotion *)motion
{
    CMAttitude *attitude = motion.attitude;
    float accelerationThreshold = 0.2; // or whatever is appropriate - play around with different values
    CMAcceleration userAcceleration = motion.userAcceleration;
    float rotationRateThreshold = 7.0;
    CMRotationRate rotationRate = motion.rotationRate;
    if (rotationRate.x > rotationRateThreshold) {
        if (fabs(userAcceleration.x) > accelerationThreshold || fabs(userAcceleration.y) > accelerationThreshold || fabs(userAcceleration.z) > accelerationThreshold) {
            NSLog(@"rotation rate = [Pitch: %f, Roll: %f, Yaw: %f]", attitude.pitch, attitude.roll, attitude.yaw);
            NSLog(@"motion.rotationRate = %f", rotationRate.x);
            [self showMenuAnimated:YES];
        }
    }
    else if (-rotationRate.x > rotationRateThreshold) {
        if (fabs(userAcceleration.x) > accelerationThreshold || fabs(userAcceleration.y) > accelerationThreshold || fabs(userAcceleration.z) > accelerationThreshold) {
            NSLog(@"rotation rate = [Pitch: %f, Roll: %f, Yaw: %f]", attitude.pitch, attitude.roll, attitude.yaw);
            NSLog(@"motion.rotationRate = %f", rotationRate.x);
            [self dismissMenuAnimated:YES];
        }
    }
}

Have you looked at CMAttitude? It sounds like what you need, plus some mathematics, I guess.
EDIT: OK, you did.
Quoting the Apple documentation, from the chapter "Getting the Current Device Orientation":

If you need to know only the general orientation of the device, and not the exact vector of orientation, you should use the methods of the UIDevice class to retrieve that information. Using the UIDevice interface is simple and does not require that you calculate the orientation vector yourself.
Great, they do the math for us. Then I'd guess that tracking acceleration, as you do, plus tracking orientation as the documentation above explains, should do the job, using UIDeviceOrientationFaceUp and UIDeviceOrientationFaceDown?
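For what it's worth, a minimal sketch of that UIDevice route, assuming the two methods below live in your view controller (the names startOrientationTracking and orientationChanged: are placeholders; everything else is standard UIKit):

- (void)startOrientationTracking
{
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(orientationChanged:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];
}

- (void)orientationChanged:(NSNotification *)note
{
    UIDeviceOrientation orientation = [UIDevice currentDevice].orientation;
    if (orientation == UIDeviceOrientationFaceUp) {
        // the device has been tilted onto its back
    } else if (orientation == UIDeviceOrientationFaceDown) {
        // the device has been tilted onto its face
    }
}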
If none of the UIDeviceOrientation values fit your needs, you will need to calculate spatial angles from two CMAttitude references, which provide a rotationMatrix and a quaternion... that school math is long behind me, so I would suggest searching for, or asking, a math-only question about those.
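That said, Core Motion can also hand you a relative attitude without any manual matrix math. A minimal sketch, assuming referenceAttitude is a CMAttitude ivar that starts out nil:

- (void)logTiltSinceStart
{
    CMDeviceMotion *motion = motionManager.deviceMotion;
    if (motion == nil) return;
    if (referenceAttitude == nil) {
        referenceAttitude = motion.attitude; // remember the starting orientation
        return;
    }
    CMAttitude *current = motion.attitude;
    [current multiplyByInverseOfAttitude:referenceAttitude]; // current is now relative to the start
    NSLog(@"pitch has changed %f radians since start", current.pitch);
}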

Related

iOS Accelerometer-Based Gesture Recognition

I want to create a project that reads the user's gesture (accelerometer-based) and recognizes it. I searched a lot, but everything I found was too old. I have problems neither with classification nor with recognition; I will use the $1 recognizer or an HMM. I just want to know how to read the user's gesture using the accelerometer.
Is the accelerometer data (x, y, z values) enough, or should I use other data with it, like attitude data (roll, pitch, yaw), gyro data, or magnetometer data? I don't understand any of them, so explaining what these sensors do would be useful.
Thanks in advance!
Finally I did it. I used the userAcceleration data, which is the acceleration the user imparts to the device, excluding gravity. I found that a lot of people use the raw acceleration data and do a lot of math to remove gravity from it; as of iOS 6 that's already done for you in userAcceleration.
And I used the $1 recognizer, which is a 2D recognizer (i.e. point(5, 10), no Z). Here's a link for the $1 recognizer; there's a C++ version of it in the downloads section.
Here are the steps of my code...
Read the userAcceleration data at a frequency of 50 Hz.
Apply a low-pass filter to it.
Take a point into consideration only if its x or y value is greater than 0.05, to reduce noise. (Note: the next step depends on your code and on the recognizer you use.)
Save the x and y points into an array.
Create a 2D path from this array.
Send this path to the recognizer, either to train it or to recognize it.
Here's my code...
@implementation MainViewController {
    double previousLowPassFilteredAccelerationX;
    double previousLowPassFilteredAccelerationY;
    double previousLowPassFilteredAccelerationZ;
    CGPoint position;
    int numOfTrainedGestures;
    GeometricRecognizer recognizer;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    previousLowPassFilteredAccelerationX = previousLowPassFilteredAccelerationY = previousLowPassFilteredAccelerationZ = 0.0;
    recognizer = GeometricRecognizer();
    // Note: I let the user train his own gestures, so I start up each time with 0 gestures
    numOfTrainedGestures = 0;
}
#define kLowPassFilteringFactor 0.1
#define MOVEMENT_HZ 50
#define NOISE_REDUCTION 0.05
- (IBAction)StartAccelerometer
{
    CMMotionManager *motionManager = [CMMotionManager SharedMotionManager]; // a custom singleton accessor, not a Core Motion API
    if ([motionManager isDeviceMotionAvailable])
    {
        [motionManager setDeviceMotionUpdateInterval:1.0 / MOVEMENT_HZ];
        [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
                                           withHandler:^(CMDeviceMotion *motion, NSError *error)
        {
            // Low-pass filter the userAcceleration to smooth out jitter
            CMAcceleration lowpassFilterAcceleration, userAcceleration = motion.userAcceleration;
            lowpassFilterAcceleration.x = (userAcceleration.x * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationX * (1.0 - kLowPassFilteringFactor));
            lowpassFilterAcceleration.y = (userAcceleration.y * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationY * (1.0 - kLowPassFilteringFactor));
            lowpassFilterAcceleration.z = (userAcceleration.z * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationZ * (1.0 - kLowPassFilteringFactor));
            if (lowpassFilterAcceleration.x > NOISE_REDUCTION || lowpassFilterAcceleration.y > NOISE_REDUCTION)
                [self.points addObject:[NSString stringWithFormat:@"%.2f,%.2f", lowpassFilterAcceleration.x, lowpassFilterAcceleration.y]];
            previousLowPassFilteredAccelerationX = lowpassFilterAcceleration.x;
            previousLowPassFilteredAccelerationY = lowpassFilterAcceleration.y;
            previousLowPassFilteredAccelerationZ = lowpassFilterAcceleration.z;
            // Just showing the points to the user
            self.XLabel.text = [NSString stringWithFormat:@"X : %.2f", lowpassFilterAcceleration.x];
            self.YLabel.text = [NSString stringWithFormat:@"Y : %.2f", lowpassFilterAcceleration.y];
            self.ZLabel.text = [NSString stringWithFormat:@"Z : %.2f", lowpassFilterAcceleration.z];
        }];
    }
    else NSLog(@"DeviceMotion is not available");
}
- (IBAction)StopAccelerometer
{
    [[CMMotionManager SharedMotionManager] stopDeviceMotionUpdates];
    // View all the points to the user
    self.pointsTextView.text = [NSString stringWithFormat:@"%d\n\n%@", self.points.count, [self.points componentsJoinedByString:@"\n"]];
    // There must be more than 2 trained gestures because in recognizing, it gets the closest one in distance
    if (numOfTrainedGestures > 1) {
        Path2D path = [self createPathFromPoints]; // A method to create a 2D path from pointsArray
        if (path.size()) {
            RecognitionResult recognitionResult = recognizer.recognize(path);
            self.recognitionLabel.text = [NSString stringWithFormat:@"%s Detected with Prob %.2f !",
                                          recognitionResult.name.c_str(), recognitionResult.score];
        } else self.recognitionLabel.text = @"Not enough points for gesture !";
    }
    else self.recognitionLabel.text = @"Not enough templates !";
    [self releaseAllVariables];
}

iOS app: check iPhone shaking and then take picture

I am writing an iPhone camera app. When the user is about to take a picture, I would like to check whether the iPhone is shaking, wait for the moment there is no shaking, and then capture the photo.
How can I do it?
Anti-shake is quite a complicated feature to pull off. I reckon it's a combination of some powerful blur detection/removal algorithms and the gyroscope on the iPhone.
You may start by looking into how to detect motion with the iPhone and see what kind of results you can get with that. If that's not enough, start looking into shift/blur direction detection algorithms. This is not a trivial problem, but it is something you could probably accomplish given enough time. Hope that helps!
// Ensures the shake is strong enough on at least two axes before declaring it a shake.
// "Strong enough" means "greater than a client-supplied threshold" in G's.
static BOOL L0AccelerationIsShaking(UIAcceleration* last, UIAcceleration* current, double threshold) {
    double
        deltaX = fabs(last.x - current.x),
        deltaY = fabs(last.y - current.y),
        deltaZ = fabs(last.z - current.z);
    return
        (deltaX > threshold && deltaY > threshold) ||
        (deltaX > threshold && deltaZ > threshold) ||
        (deltaY > threshold && deltaZ > threshold);
}

@interface L0AppDelegate : NSObject <UIApplicationDelegate> {
    BOOL histeresisExcited;
    UIAcceleration* lastAcceleration;
}
@property(retain) UIAcceleration* lastAcceleration;
@end

@implementation L0AppDelegate

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    [UIAccelerometer sharedAccelerometer].delegate = self;
}

- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    if (self.lastAcceleration) {
        if (!histeresisExcited && L0AccelerationIsShaking(self.lastAcceleration, acceleration, 0.7)) {
            histeresisExcited = YES;
            /* SHAKE DETECTED. DO HERE WHAT YOU WANT. */
        } else if (histeresisExcited && !L0AccelerationIsShaking(self.lastAcceleration, acceleration, 0.2)) {
            histeresisExcited = NO;
        }
    }
    self.lastAcceleration = acceleration;
}

// and proper @synthesize and -dealloc boilerplate code

@end
I Googled it and found it at "How do I detect when someone shakes an iPhone?".
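Note that UIAccelerometer has been deprecated since iOS 5, so in newer projects the same hysteresis idea can be built on Core Motion. A minimal sketch, assuming motionManager is a strong property on your class and startShakeDetection is a hypothetical method you call once:

- (void)startShakeDetection
{
    self.motionManager = [[CMMotionManager alloc] init];
    self.motionManager.accelerometerUpdateInterval = 1.0 / 60.0;
    __block CMAcceleration last = (CMAcceleration){0, 0, 0};
    __block BOOL excited = NO;
    [self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                             withHandler:^(CMAccelerometerData *data, NSError *error) {
        if (error) return;
        CMAcceleration a = data.acceleration;
        double dx = fabs(a.x - last.x), dy = fabs(a.y - last.y), dz = fabs(a.z - last.z);
        BOOL shaking = (dx > 0.7 && dy > 0.7) || (dx > 0.7 && dz > 0.7) || (dy > 0.7 && dz > 0.7);
        BOOL settled = !((dx > 0.2 && dy > 0.2) || (dx > 0.2 && dz > 0.2) || (dy > 0.2 && dz > 0.2));
        if (!excited && shaking) {
            excited = YES;
            // shake detected - e.g. wait until it settles before capturing the photo
        } else if (excited && settled) {
            excited = NO;
        }
        last = a;
    }];
}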

iPad 1 Gyroscope: roll, pitch, yaw stay zero

I'm trying to make a simple app utilizing the gyroscope, where a character moves according to the rotation of the iPad 1.
My code is not working, so I tested the values of roll, pitch, and yaw, and they actually stay at zero however I move the device.
I'm sure the iPad 1 supports CMMotionManager, so I'm not sure what's causing it...
My code is as follows:
- (id)init
{
    if ((self = [super init])) {
        self.isTouchEnabled = YES;
        winSize = [[CCDirector sharedDirector] winSize];
        [self createRabbitSprite];
        self.motionManager = [[CMMotionManager alloc] init];
        motionManager.deviceMotionUpdateInterval = 1.0/60.0;
        if (motionManager.isDeviceMotionAvailable) {
            [motionManager startDeviceMotionUpdates];
        }
        [self scheduleUpdate];
        //[self registerWithTouchDispatcher];
    }
    return self;
}

- (void)update:(ccTime)delta
{
    CMDeviceMotion *currentDeviceMotion = motionManager.deviceMotion;
    CMAttitude *currentAttitude = currentDeviceMotion.attitude;
    if (referenceFrame) {
        [currentAttitude multiplyByInverseOfAttitude:referenceFrame];
    }
    float roll = currentAttitude.roll;
    float pitch = currentAttitude.pitch;
    float yaw = currentAttitude.yaw;
    NSLog(@"%.2f and %.2f and %.2f", roll, pitch, yaw);
    rabbit.rotation = CC_RADIANS_TO_DEGREES(yaw);
}
Please help me out, and thanks in advance.
(edit)
Apparently, motionManager.isDeviceMotionAvailable is returning FALSE... which must mean that the iPad 1 doesn't support Core Motion??? Could it be something with the settings?
The first-generation iPad does support CMMotionManager (it has an accelerometer), but it won't return any gyroscopic data: it doesn't have a gyroscope! That is also why deviceMotionAvailable returns FALSE for you, since device motion needs the gyro. You'll need to check the gyroAvailable property of a CMMotionManager instance.
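A minimal sketch of those availability checks (all standard CMMotionManager properties), so a first-generation iPad can fall back gracefully:

CMMotionManager *manager = [[CMMotionManager alloc] init];
if (manager.isGyroAvailable) {
    // gyroscope present: device motion with full attitude (roll/pitch/yaw) works
    [manager startDeviceMotionUpdates];
} else if (manager.isAccelerometerAvailable) {
    // iPad 1: no gyroscope, but tilt can still be estimated from gravity
    [manager startAccelerometerUpdates];
}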

How do you set the framerate when recording video on the iPhone?

I would like to write a camera application where you record video using the iPhone's camera, but I can't find a way to alter the framerate of the recorded video. For example, I'd like to record at 25 frames per second instead of the default 30.
Is it possible to set this frame rate in any way, and if so, how?
You can use AVCaptureConnection's videoMaxFrameDuration and videoMinFrameDuration properties. See http://developer.apple.com/library/ios/#DOCUMENTATION/AVFoundation/Reference/AVCaptureConnection_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40009522
Additionally, there is an SO question that addresses this (with a good code example):
I want to throttle video capture frame rate in AVCapture framework
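For reference, a minimal sketch of that AVCaptureConnection route, assuming movieOutput is your already-configured AVCaptureMovieFileOutput (note these connection properties were later deprecated in iOS 7 in favor of the AVCaptureDevice API shown further down):

AVCaptureConnection *connection = [movieOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoMinFrameDurationSupported) {
    // a minimum frame duration of 1/25 s caps the capture rate at 25 fps
    connection.videoMinFrameDuration = CMTimeMake(1, 25);
}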
As far as I can tell, you can't set the FPS for recording. Look at the WWDC 2010 video on AVFoundation; it seems to suggest that you can, but as far as I can tell that only works for capturing frame data.
I'd love to be proven wrong, but I'm pretty sure you can't. Sorry!
You will need AVCaptureDevice.h.
Here is working code:
- (void)attemptToConfigureFPS
{
    NSError *error;
    if (![self lockForConfiguration:&error]) {
        NSLog(@"Could not lock device %@ for configuration: %@", self, error);
        return;
    }
    AVCaptureDeviceFormat *format = self.activeFormat;
    double epsilon = 0.00000001;
    int desiredFrameRate = 30;
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        NSLog(@"Pre Minimum frame rate: %f Max = %f", range.minFrameRate, range.maxFrameRate);
        if (range.minFrameRate <= (desiredFrameRate + epsilon) &&
            range.maxFrameRate >= (desiredFrameRate - epsilon)) {
            NSLog(@"Setting Frame Rate.");
            self.activeVideoMaxFrameDuration = (CMTime){
                .value = 1,
                .timescale = desiredFrameRate,
                .flags = kCMTimeFlags_Valid,
                .epoch = 0,
            };
            self.activeVideoMinFrameDuration = (CMTime){
                .value = 1,
                .timescale = desiredFrameRate,
                .flags = kCMTimeFlags_Valid,
                .epoch = 0,
            };
            // self.activeVideoMinFrameDuration = self.activeVideoMaxFrameDuration;
            // NSLog(@"Post Minimum frame rate: %f Max = %f", range.minFrameRate, range.maxFrameRate);
            break;
        }
    }
    [self unlockForConfiguration];

    // Audit the changes
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        NSLog(@"Post Minimum frame rate: %f Max = %f", range.minFrameRate, range.maxFrameRate);
    }
}
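Since the method above calls lockForConfiguration: and activeFormat on self, it reads like a category on AVCaptureDevice; note also that activeVideoMinFrameDuration and activeVideoMaxFrameDuration require iOS 7. Hypothetical usage, assuming such a category is declared:

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[camera attemptToConfigureFPS]; // assumes -attemptToConfigureFPS lives in an AVCaptureDevice category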

Emulating the camera app's 'tap to focus'

I am trying hard to emulate the basic functionality of the built-in camera app. So far I have become stuck on the 'tap to focus' feature.
I have a UIView from which I am collecting UITouch events when a single finger taps on the UIView. The following method is called, but the camera focus and exposure are unchanged.
- (void)handleFocus:(UITouch *)touch
{
    if ([camera lockForConfiguration:nil])
    {
        CGPoint location = [touch locationInView:cameraView];
        if ([camera isFocusPointOfInterestSupported])
            camera.focusPointOfInterest = location;
        if ([camera isExposurePointOfInterestSupported])
            camera.exposurePointOfInterest = location;
        [camera unlockForConfiguration];
        [cameraView animFocus:location];
    }
}
'camera' is my AVCaptureDevice & it is non-nil. Can someone perhaps tell me where I am going wrong?
Clues & boos all welcome.
M.
This snippet might help you... There is a CamDemo provided by Apple floating around which allows you to focus, change exposure while tapping, set the flash, swap cameras, and more; it emulates the camera app pretty well. I'm not sure if you'll be able to find it, since it was part of WWDC, but if you leave an email address in the comments I can email you the sample code...
- (void)focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = [[self videoInput] device];
    if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        } else {
            id delegate = [self delegate];
            if ([delegate respondsToSelector:@selector(acquiringDeviceLockFailedWithError:)]) {
                [delegate acquiringDeviceLockFailedWithError:error];
            }
        }
    }
}
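One likely culprit in the question's snippet, by the way: focusPointOfInterest expects a point normalized to the capture device's coordinate space, from (0,0) to (1,1), not view coordinates, and setting the point alone does not trigger a refocus; you also need to set focusMode. A minimal sketch of the fix, assuming previewLayer is an AVCaptureVideoPreviewLayer property (its conversion method requires iOS 6):

- (void)handleFocus:(UITouch *)touch
{
    CGPoint viewPoint = [touch locationInView:cameraView];
    // convert from view coordinates to the normalized (0,0)-(1,1) device space
    CGPoint devicePoint = [self.previewLayer captureDevicePointOfInterestForPoint:viewPoint];
    if ([camera lockForConfiguration:nil]) {
        if (camera.isFocusPointOfInterestSupported &&
            [camera isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            camera.focusPointOfInterest = devicePoint;
            camera.focusMode = AVCaptureFocusModeAutoFocus; // the mode change triggers the refocus
        }
        [camera unlockForConfiguration];
        [cameraView animFocus:viewPoint];
    }
}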
}