I want to create a project that reads the user's gesture (accelerometer-based) and recognizes it. I searched a lot, but everything I found was too old. I have no problems with classification or recognition (I will use the $1 recognizer or an HMM); I just want to know how to read the user's gesture using the accelerometer.
Is the accelerometer data (x, y, z values) enough, or should I use other data along with it, like attitude data (roll, pitch, yaw), gyro data, or magnetometer data? I don't really understand any of them, so an explanation of what these sensors do would be useful.
Thanks in advance!
Finally I did it. I used the userAcceleration data, which is the acceleration the user imparts to the device, excluding gravity. I found that a lot of people use the raw acceleration data and do a lot of math to remove gravity from it; that's already done for you by iOS in userAcceleration.
And I used the $1 recognizer, which is a 2D recognizer (i.e. points like (5, 10), no Z). Here's a link for the $1 recognizer; there's a C++ version of it in the downloads section.
Here are the steps of my code...
Read the userAcceleration data at a frequency of 50 Hz.
Apply a low-pass filter to it.
Take a point into consideration only if its x or y value is greater than 0.05, to reduce noise. (Note: the next steps depend on your code and on the recognizer you use.)
Save the x and y points into an array.
Create a 2D path from this array.
Send this path to the recognizer, to either train on it or recognize it.
Here's my code...
@implementation MainViewController {
double previousLowPassFilteredAccelerationX;
double previousLowPassFilteredAccelerationY;
double previousLowPassFilteredAccelerationZ;
CGPoint position;
int numOfTrainedGestures;
GeometricRecognizer recognizer;
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
previousLowPassFilteredAccelerationX = previousLowPassFilteredAccelerationY = previousLowPassFilteredAccelerationZ = 0.0;
recognizer = GeometricRecognizer();
// Note: I let the user train his own gestures, so I start up each time with 0 gestures
numOfTrainedGestures = 0;
}
#define kLowPassFilteringFactor 0.1
#define MOVEMENT_HZ 50
#define NOISE_REDUCTION 0.05
- (IBAction)StartAccelerometer
{
// SharedMotionManager is assumed to be a custom singleton accessor on CMMotionManager (not part of Core Motion)
CMMotionManager *motionManager = [CMMotionManager SharedMotionManager];
if ([motionManager isDeviceMotionAvailable])
{
[motionManager setDeviceMotionUpdateInterval:1.0/MOVEMENT_HZ];
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
withHandler: ^(CMDeviceMotion *motion, NSError *error)
{
CMAcceleration lowpassFilterAcceleration, userAcceleration = motion.userAcceleration;
lowpassFilterAcceleration.x = (userAcceleration.x * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationX * (1.0 - kLowPassFilteringFactor));
lowpassFilterAcceleration.y = (userAcceleration.y * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationY * (1.0 - kLowPassFilteringFactor));
lowpassFilterAcceleration.z = (userAcceleration.z * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationZ * (1.0 - kLowPassFilteringFactor));
if (lowpassFilterAcceleration.x > NOISE_REDUCTION || lowpassFilterAcceleration.y > NOISE_REDUCTION)
[self.points addObject:[NSString stringWithFormat:@"%.2f,%.2f", lowpassFilterAcceleration.x, lowpassFilterAcceleration.y]];
previousLowPassFilteredAccelerationX = lowpassFilterAcceleration.x;
previousLowPassFilteredAccelerationY = lowpassFilterAcceleration.y;
previousLowPassFilteredAccelerationZ = lowpassFilterAcceleration.z;
// Just showing the points to the user
self.XLabel.text = [NSString stringWithFormat:@"X : %.2f", lowpassFilterAcceleration.x];
self.YLabel.text = [NSString stringWithFormat:@"Y : %.2f", lowpassFilterAcceleration.y];
self.ZLabel.text = [NSString stringWithFormat:@"Z : %.2f", lowpassFilterAcceleration.z];
}];
}
else NSLog(@"DeviceMotion is not available");
}
- (IBAction)StopAccelerometer
{
[[CMMotionManager SharedMotionManager] stopDeviceMotionUpdates];
// View all the points to the user
self.pointsTextView.text = [NSString stringWithFormat:@"%lu\n\n%@", (unsigned long)self.points.count, [self.points componentsJoinedByString:@"\n"]];
// There must be at least two trained gestures, because recognition simply returns the closest template by distance
if (numOfTrainedGestures > 1) {
    Path2D path = [self createPathFromPoints]; // A method to create a 2D path from the points array (sketched below)
    if (path.size()) {
        RecognitionResult recognitionResult = recognizer.recognize(path);
        self.recognitionLabel.text = [NSString stringWithFormat:@"%s detected with probability %.2f!", recognitionResult.name.c_str(),
                                      recognitionResult.score];
    } else self.recognitionLabel.text = @"Not enough points for gesture!";
}
else self.recognitionLabel.text = @"Not enough templates!";
[self releaseAllVariables];
}
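For reference, here is a minimal sketch of what the createPathFromPoints helper might look like. It assumes the file is compiled as Objective-C++ and that Path2D and Point2D are the types from the C++ port of the $1 recognizer (Path2D being a vector of Point2D with an (x, y) constructor); the parsing matches the "x,y" strings stored into self.points above.
- (Path2D)createPathFromPoints
{
    Path2D path;
    for (NSString *pointString in self.points) {
        // Each entry was stored as "x,y" with two decimal places
        NSArray *components = [pointString componentsSeparatedByString:@","];
        if ([components count] == 2) {
            path.push_back(Point2D([[components objectAtIndex:0] doubleValue],
                                   [[components objectAtIndex:1] doubleValue]));
        }
    }
    return path;
}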
Related
I have a problem getting the pitch, roll and yaw angles from the CMAttitude class.
First, I did a normal gyro reading using the CMMotionManager class and its x, y, z attributes, and that worked fine.
Then I tried to use CMAttitude for "absolute angles", but it doesn't work; it seems the data is not updating. The angles are always 0 (but there are no errors or warnings).
I have searched a lot on Stack Overflow and used some of the solutions I found, but I still have the same problem.
Here's my code:
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
motionManager = [[CMMotionManager alloc] init];
CMDeviceMotion *deviceMotion = motionManager.deviceMotion;
CMAttitude *attitude = deviceMotion.attitude;
referenceAttitude = attitude;
[motionManager startGyroUpdates];
timer = [NSTimer scheduledTimerWithTimeInterval:1/30.0
target:self
selector:@selector(doGyroUpdate)
userInfo:nil
repeats:YES];
}
-(void)doGyroUpdate {
// change the reference frame
[motionManager.deviceMotion.attitude multiplyByInverseOfAttitude: referenceAttitude];
double rRotation = motionManager.deviceMotion.attitude.roll*180/M_PI;
double pRotation = motionManager.deviceMotion.attitude.pitch*180/M_PI;
double yRotation = motionManager.deviceMotion.attitude.yaw*180/M_PI;
NSString *myString = [NSString stringWithFormat:@"%f", rRotation];
self.angRoll.text = myString;
myString = [NSString stringWithFormat:@"%f", pRotation];
self.angPitch.text = myString;
myString = [NSString stringWithFormat:@"%f", yRotation];
self.angYaw.text = myString;
}
Thanks a lot! :D
CMMotionManager has four modes: accelerometer, gyroscope, magnetometer and device motion.
Depending on which one you need, you have to start the appropriate mode: startAccelerometerUpdates, startGyroUpdates, startMagnetometerUpdates or startDeviceMotionUpdates.
You are calling startGyroUpdates but reading the deviceMotion property; that way only gyroData will be available.
Do this instead and you will get the deviceMotion data:
[motionManager startDeviceMotionUpdates];
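Putting it together, a minimal sketch (assuming motionManager and referenceAttitude are ivars, as in the question). Note that deviceMotion is nil until updates have actually started, which is why the reference attitude must be captured lazily rather than in viewDidLoad:
[motionManager startDeviceMotionUpdates];
// Later, e.g. at the top of doGyroUpdate:
CMDeviceMotion *deviceMotion = motionManager.deviceMotion;
if (referenceAttitude == nil && deviceMotion != nil) {
    // First tick with real data: remember the starting attitude
    referenceAttitude = deviceMotion.attitude;
}
CMAttitude *attitude = deviceMotion.attitude;
[attitude multiplyByInverseOfAttitude:referenceAttitude];
double rollDegrees = attitude.roll * 180 / M_PI; // now relative to the reference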
I'm using deviceMotion to get userAcceleration (x, y, z). My aim is to create a text file where, on each iteration, my application writes the 3 components in a row.
I'm using the MotionGraphs code sample.
Is it possible to write directly, or is it necessary to create an array first?
And that array; should it be an NSMutableArray of NSNumber objects?
I've been searching for an answer to this and I'm lost. :-(
I'm not an Objective-C expert, but I remember Pascal code where I opened a file and then wrote to it on each iteration; I've checked, though, and that style of programming has changed.
For now we are not taking different filters or a discrimination window into account (for those, I've implemented the Freescale procedure). I'm just looking to save/store the accelerometer data using deviceMotion's userAcceleration.
float minX = 1.0f;
float minY = 1.0f;
float minZ = 1.0f;
NSMutableArray *container; // created elsewhere, e.g. container = [[NSMutableArray alloc] init]; in viewDidLoad

-(void)startUpdatesWithSliderValue:(int)sliderValue
{
    NSTimeInterval delta = 0.005;
    NSTimeInterval updateInterval = deviceMotionMin + delta * sliderValue;
    CMMotionManager *mManager = [(APLAppDelegate *)[[UIApplication sharedApplication] delegate] sharedManager];
    APLDeviceMotionGraphViewController * __weak weakSelf = self;
    if ([mManager isDeviceMotionAvailable]) {
        [mManager setDeviceMotionUpdateInterval:updateInterval];
        [mManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                      withHandler:^(CMDeviceMotion *deviceMotion, NSError *error) {
            // Collect one (x, y, z) triplet per update
            [container addObject:[NSNumber numberWithFloat:deviceMotion.userAcceleration.x]];
            [container addObject:[NSNumber numberWithFloat:deviceMotion.userAcceleration.y]];
            [container addObject:[NSNumber numberWithFloat:deviceMotion.userAcceleration.z]];
        }];
    }
}
// Finally we have to dump the data to a text file; this is the part I don't know how to do correctly.
1 Create NSMutableArray *container = [[NSMutableArray alloc] init]; to be your container.
2 Within the accelerometer delegate method for detected motion, be sure to set a minimum for each of the 3 axes, e.g. float min_X = 1.0f; float min_Y = 1.0f; float min_Z = 1.0f;
-(void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
}
3 Use simple filter logic, as in the following (keep in mind the acceleration maxes out at about ±2.3g, so both positive and negative thresholds need consideration):
if ((acceleration.x > min_X || acceleration.x < -min_X) &&
    (acceleration.y > min_Y || acceleration.y < -min_Y) &&
    (acceleration.z > min_Z || acceleration.z < -min_Z)) {
    [container addObject:[NSNumber numberWithFloat:acceleration.x]];
    [container addObject:[NSNumber numberWithFloat:acceleration.y]];
    [container addObject:[NSNumber numberWithFloat:acceleration.z]];
}
4 The array will then be full of NSNumbers in groups of three (x, y, z).
5 The filter is needed; otherwise the accelerometer can pick up small vibrations even when the device is just sitting on the table.
WARNING: the array will fill up fast, so set the sample rate to an acceptable range based on how long you want to record data.
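As for the final dump to a text file, which the question leaves open: a minimal sketch (dumpContainerToFile is a hypothetical method name; it assumes the flat x, y, z triplets collected in container above) that formats the samples and writes them to the app's Documents directory:
- (void)dumpContainerToFile
{
    NSMutableString *output = [NSMutableString string];
    // Samples were stored as flat triplets: x, y, z, x, y, z, ...
    for (NSUInteger i = 0; i + 2 < [container count]; i += 3) {
        [output appendFormat:@"%.4f,%.4f,%.4f\n",
            [[container objectAtIndex:i] floatValue],
            [[container objectAtIndex:i + 1] floatValue],
            [[container objectAtIndex:i + 2] floatValue]];
    }
    NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [documentsDir stringByAppendingPathComponent:@"acceleration.txt"];
    NSError *error = nil;
    if (![output writeToFile:path atomically:YES encoding:NSUTF8StringEncoding error:&error]) {
        NSLog(@"Failed to write file: %@", error);
    }
}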
I would like to measure the sound volume of the surroundings, and I'm not too sure I am doing the right thing.
I would like to create a VU meter with a range of 0 (quiet) to 120 (very noisy).
I get the peak and average power, but they are very high in a normally quiet environment.
Please give me some pointers.
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
// Creating an audio CAF file in the temporary directory. This isn't ideal, but it's the only way to get this class functioning (the temporary directory is erased once the app quits). Here we are also specifying a sample rate of 44.1 kHz (which can represent sound frequencies up to 22 kHz according to the Nyquist theorem) and 1 channel (we do not need stereo to measure noise).
NSDictionary* recorderSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM],AVFormatIDKey,
[NSNumber numberWithInt:44100],AVSampleRateKey,
[NSNumber numberWithInt:1],AVNumberOfChannelsKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
nil];
NSError* error;
NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recorderSettings error:&error];
if (recorder) {
    [recorder prepareToRecord];
    // enable metering
    recorder.meteringEnabled = YES;
    // tell the recorder to start recording:
    [recorder record];
    levelTimer = [NSTimer scheduledTimerWithTimeInterval:0.01 target:self selector:@selector(levelTimerCallback:) userInfo:nil repeats:YES];
} else {
    NSLog(@"%@", [error description]);
}
}
- (void)levelTimerCallback:(NSTimer *)timer {
[recorder updateMeters];
const double ALPHA = 0.05;
double peakPowerForChannel = pow(10, (0.05 * [recorder averagePowerForChannel:0]));
lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;
NSLog(#"Average input: %f Peak input: %f Low pass results: %f", [recorder averagePowerForChannel:0], [recorder peakPowerForChannel:0], lowPassResults);
float tavgPow =[recorder averagePowerForChannel:0] + 120.0;
float tpPow = [recorder peakPowerForChannel:0] + 120.0;
float avgPow = tavgPow;//(float)abs([recorder averagePowerForChannel:0]);
float pPow = tpPow;//(float)abs([recorder peakPowerForChannel:0]);
NSString *tempAvg = [NSString stringWithFormat:@"%0.2f", avgPow];
NSString *temppeak = [NSString stringWithFormat:@"%0.2f", pPow];
[avg setText:tempAvg];
[peak setText:temppeak];
NSLog(@"Average input: %f Peak input: %f Low pass results: %f", avgPow, pPow, lowPassResults);
}
Apple uses a lookup table in their SpeakHere sample that converts from dB to a linear value displayed on a level meter. This is to save device power (I guess).
I also needed this, but didn't think a couple of float calculations every 1/10s (my refresh rate) would cost so much device power. So, instead of building up a table I moulded their code into:
float level; // The linear 0.0 .. 1.0 value we need.
const float minDecibels = -80.0f; // Or use -60dB, which I measured in a silent room.
float decibels = [audioRecorder averagePowerForChannel:0];
if (decibels < minDecibels)
{
level = 0.0f;
}
else if (decibels >= 0.0f)
{
level = 1.0f;
}
else
{
float root = 2.0f;
float minAmp = powf(10.0f, 0.05f * minDecibels);
float inverseAmpRange = 1.0f / (1.0f - minAmp);
float amp = powf(10.0f, 0.05f * decibels);
float adjAmp = (amp - minAmp) * inverseAmpRange;
level = powf(adjAmp, 1.0f / root);
}
I'm using an AVAudioRecorder, hence you see me getting the dB values with averagePowerForChannel:, but you can plug your own dB value in there.
Apple's example used double calculations, which I don't understand because for audio metering float accuracy is more than sufficient, and costs less device power.
Needless to say, you can now scale this calculated level to your 0 .. 120 range with a simple level * 120.0f.
The above code can be sped up when we fix root at 2.0f, by replacing powf(adjAmp, 1.0f / root) with sqrtf(adjAmp); but that's a minor thing, and a very good compiler might be able to do this for us. And I'm almost sure that inverseAmpRange will be calculated once at compile-time.
The formula for converting a linear amplitude to decibels when you want to use 1.0 as your reference (for 0db), is
20 * log10(amp);
So I'm not sure about the intent from looking at your code, but you probably want
float db = 20 * log10([recorder averagePowerForChannel:0]);
This will go from -infinity at an amplitude of zero, to 0db at an amplitude of 1.
If you really need it to go up to between 0 and 120 you can add 120 and use a max function at zero.
So, after the above line:
db += 120;
db = db < 0 ? 0 : db;
The formula you are using appears to be the formula for converting dB to amplitude, which I think is the opposite of what you want.
Edit: I reread and it seems you may already have the decibel value.
If this is the case, just don't convert to amplitude and add 120.
So Change
double peakPowerForChannel = pow(10, (0.05 * [recorder averagePowerForChannel:0]));
to
double peakPowerForChannel = [recorder averagePowerForChannel:0];
and you should be okay to go.
Actually, the range of decibels is from -160 to 0, though it can reach positive values (see the AVAudioRecorder Class Reference, averagePowerForChannel: method).
It is therefore better to write db += 160; instead of db += 120;. Of course you can also apply an offset to correct it.
I made a regression model to capture the mapping between the WAV data generated from the AVAudioRecorder and the decibel data from averagePowerForChannel:
averagePowerForChannel (dB) = -80 + 6 * log2(wav_RMS)
where wav_RMS is the root mean square of the WAV data over a short window, e.g. 0.1 s.
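For illustration, a sketch of evaluating that fitted mapping, where samples and sampleCount are hypothetical and stand for a short window of normalized PCM floats:
// Compute the RMS of the window, then predict the meter reading
// using the regression above.
float sumOfSquares = 0.0f;
for (NSUInteger i = 0; i < sampleCount; i++) {
    sumOfSquares += samples[i] * samples[i];
}
float wav_RMS = sqrtf(sumOfSquares / sampleCount);
float predictedDb = -80.0f + 6.0f * log2f(wav_RMS);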
Simply set your own maximum and minimum values. Say you are getting a range of 0-120: if you want a range of 0-60, simply halve each value to get the half range, and so on.
I'm trying to implement a gauge animation (with + and - buttons) on iPhone, but I have no idea where to start. Any help is really welcome. See the image below (this is what I'm trying to do). Thanks for your help.
Here is some open source code (with an example) that implements the gauge view. You would of course still need to do the buttons yourself, and possibly add a different visual style.
http://www.cocoacontrols.com/platforms/ios/controls/meterview
You need to rotate the needle based on the angle... here is the logic.
You can refer to my answer here... Rotating a UIImageView around a point over 10 seconds?
fireInterval = 10;
//Adjust starting and ending angle
mStartingAngle = 45;
mEndingAngle = 180;
//Implementation
-(void) startTimer
{
mPreviousTime = [NSDate timeIntervalSinceReferenceDate];
}
In the loop
-(void) updateFunction
{
NSTimeInterval timeNow = [NSDate timeIntervalSinceReferenceDate];
// NewValue = (((OldValue - OldMin) * (NewMax - NewMin)) / (OldMax - OldMin)) + NewMin
// Mapping values between mStartingAngle and mEndingAngle; here OldMax - OldMin
// is (mPreviousTime + fireInterval) - mPreviousTime, which simplifies to fireInterval
mCurrentAngle = (((timeNow - mPreviousTime) * (mEndingAngle - mStartingAngle)) / fireInterval) + mStartingAngle;
if (mPreviousTime + fireInterval <= timeNow)
{
    NSLog(@"10 seconds completed");
    mPreviousTime = timeNow;
}
}
And rotate the needle based on mCurrentAngle....
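To actually apply that angle, something like this on each update should do (needleImageView is a hypothetical UIImageView for the needle; CGAffineTransformMakeRotation expects radians):
// Convert degrees to radians and rotate the needle image
self.needleImageView.transform = CGAffineTransformMakeRotation(mCurrentAngle * M_PI / 180.0);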
I'm trying to use CMDeviceMotion to track when the iPhone is tilted backwards, and then do something. I've successfully created the CMMotionManager, I'm getting motion data from the system, and I'm filtering for results above a certain acceleration.
What I want to do though, is detect when the device is being tilted backwards or forwards above a certain speed. How do I do that?
Here's the code I have so far:
UPDATE: I think I solved it. I was looking for the rotationRate property, CMRotationRate. I've paired them together; I really only need the x value, so I'll keep working on it. If anyone has tips on the code below, they're much appreciated.
- (void)startMotionManager{
if (motionManager == nil) {
motionManager = [[CMMotionManager alloc] init];
}
motionManager.deviceMotionUpdateInterval = 1/15.0;
if (motionManager.deviceMotionAvailable) {
NSLog(#"Device Motion Available");
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
withHandler: ^(CMDeviceMotion *motion, NSError *error){
//CMAttitude *attitude = motion.attitude;
//NSLog(#"rotation rate = [%f, %f, %f]", attitude.pitch, attitude.roll, attitude.yaw);
[self performSelectorOnMainThread:#selector(handleDeviceMotion:) withObject:motion waitUntilDone:YES];
}];
//[motionManager startDeviceMotionUpdates];
} else {
NSLog(#"No device motion on device.");
[self setMotionManager:nil];
}
}
- (void)handleDeviceMotion:(CMDeviceMotion*)motion{
CMAttitude *attitude = motion.attitude;
float accelerationThreshold = 0.2; // or whatever is appropriate - play around with different values
CMAcceleration userAcceleration = motion.userAcceleration;
float rotationRateThreshold = 7.0;
CMRotationRate rotationRate = motion.rotationRate;
if ((rotationRate.x) > rotationRateThreshold) {
if (fabs(userAcceleration.x) > accelerationThreshold || fabs(userAcceleration.y) > accelerationThreshold || fabs(userAcceleration.z) > accelerationThreshold) {
NSLog(#"rotation rate = [Pitch: %f, Roll: %f, Yaw: %f]", attitude.pitch, attitude.roll, attitude.yaw);
NSLog(#"motion.rotationRate = %f", rotationRate.x);
[self showMenuAnimated:YES];
}
}
else if ((-rotationRate.x) > rotationRateThreshold) {
if (fabs(userAcceleration.x) > accelerationThreshold || fabs(userAcceleration.y) > accelerationThreshold || fabs(userAcceleration.z) > accelerationThreshold) {
NSLog(#"rotation rate = [Pitch: %f, Roll: %f, Yaw: %f]", attitude.pitch, attitude.roll, attitude.yaw);
NSLog(#"motion.rotationRate = %f", rotationRate.x);
[self dismissMenuAnimated:YES];
}
}
}
Have you looked at CMAttitude? It sounds like what you need, plus some mathematics, I guess.
EDIT: OK, you did.
Quoting the Apple documentation, from the chapter "Getting the Current Device Orientation":
If you need to know only the general orientation of the device, and not the exact vector of orientation, you should use the methods of the UIDevice class to retrieve that information. Using the UIDevice interface is simple and does not require that you calculate the orientation vector yourself.
Great, they do the maths for us. Then I guess that tracking acceleration like you do, plus tracking orientation as the documentation above explains, should do the job, using UIDeviceOrientationFaceUp and UIDeviceOrientationFaceDown?
If none of the UIDeviceOrientation values fit your needs, you will need to calculate spatial angles from two CMAttitude references, which provide a rotationMatrix and a quaternion... that school math is long behind me, so I would suggest searching for or asking a math-only question about it. A rough sketch of the relative-angle idea follows.
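A sketch under stated assumptions: it reuses the asker's motionManager and a previously stored referenceAttitude, and the threshold is a made-up value in radians:
CMAttitude *current = motionManager.deviceMotion.attitude;
// Express the current attitude relative to the stored reference
[current multiplyByInverseOfAttitude:referenceAttitude];
double tiltThreshold = 0.5; // radians; tune to taste
if (current.pitch > tiltThreshold) {
    // device has tilted backwards past the threshold
} else if (current.pitch < -tiltThreshold) {
    // device has tilted forwards past the threshold
}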