Saving user acceleration from iPhone deviceMotion to a text file - iphone

I'm using deviceMotion to get userAcceleration (x, y, z). My aim is to create a text file where, on each iteration, my application writes the 3 components as a row.
I'm using the MotionGraphs code sample.
Is it possible to write to the file directly, or is it necessary to create an array first?
If an array: should it be an NSMutableArray holding NSNumber objects?
I've been searching for an answer to this question and I'm lost. :-(
I'm not an Objective-C expert, but I remember Pascal code where I opened a file and then wrote to it on each iteration; clearly, programming has changed since then.
For now, let's not take different filters or a discrimination window into account; for those I've implemented a Freescale-style procedure. I'm just looking to save the accelerometer data, i.e. to store the data from deviceMotion's userAcceleration.
float minX = 1.0f;
float minY = 1.0f;
float minZ = 1.0f;
NSMutableArray *container;

- (void)startUpdatesWithSliderValue:(int)sliderValue
{
    NSTimeInterval delta = 0.005;
    NSTimeInterval updateInterval = deviceMotionMin + delta * sliderValue;
    CMMotionManager *mManager = [(APLAppDelegate *)[[UIApplication sharedApplication] delegate] sharedManager];
    APLDeviceMotionGraphViewController * __weak weakSelf = self;
    container = [[NSMutableArray alloc] init];
    if ([mManager isDeviceMotionAvailable]) {
        [mManager setDeviceMotionUpdateInterval:updateInterval];
        // The samples only exist inside the update handler, so the three
        // components are appended to the container there.
        [mManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                      withHandler:^(CMDeviceMotion *deviceMotion, NSError *error) {
            [container addObject:[NSNumber numberWithFloat:deviceMotion.userAcceleration.x]];
            [container addObject:[NSNumber numberWithFloat:deviceMotion.userAcceleration.y]];
            [container addObject:[NSNumber numberWithFloat:deviceMotion.userAcceleration.z]];
        }];
    }
}
// Finally, we have to dump the data to a text file; this is the part I don't know how to do correctly.

1 Create NSMutableArray *container = [[NSMutableArray alloc] init]; to be your container.
2 Within the accelerometer delegate method (accelerometer:didAccelerate:), be sure to set a minimum for each of the 3 axes, e.g. float minX = 1.0f; float minY = 1.0f; float minZ = 1.0f;
-(void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
}
3 Use simple filter logic, as in the following (keep in mind that the acceleration maxes out at about ±2.3g, so both positive and negative thresholds need consideration):
if ((acceleration.x > minX || acceleration.x < -minX) && /* same test for Y */ && /* same test for Z */) {
[container addObject:[NSNumber numberWithFloat:acceleration.x]];
[container addObject:[NSNumber numberWithFloat:acceleration.y]];
[container addObject:[NSNumber numberWithFloat:acceleration.z]];
}
4 The array should be full of NSNumbers in groups of three (x, y, z).
5 The filter is needed; otherwise the accelerometer can pick up small vibrations even while the device is just sitting on a table.
WARNING: The array will fill up fast, so set the sample rate to an acceptable range based on how long you want to record data.
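To cover the dump-to-file step the question asks about, here is a minimal sketch (the method name and file name are illustrative, not from the MotionGraphs sample) that joins the collected NSNumber triples into CSV rows and writes them to the app's Documents directory:
// Illustrative helper: write the container of NSNumbers (x, y, z triples)
// out as comma-separated rows in Documents/acceleration.csv.
- (void)dumpContainerToFile
{
    NSMutableString *csv = [NSMutableString string];
    for (NSUInteger i = 0; i + 2 < container.count; i += 3) {
        [csv appendFormat:@"%@,%@,%@\n",
            [container objectAtIndex:i],
            [container objectAtIndex:i + 1],
            [container objectAtIndex:i + 2]];
    }
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"acceleration.csv"];
    NSError *error = nil;
    if (![csv writeToFile:path atomically:YES encoding:NSUTF8StringEncoding error:&error]) {
        NSLog(@"Could not write file: %@", error);
    }
}
Call this once after stopping the motion updates; writing row by row during updates also works, but buffering in memory and writing once is simpler and avoids file I/O on every sample.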

Related

How to optimize code for drawing map overlay

I draw a path on an MKMapView based on coordinates stored in SQLite on the iPhone.
But now I have 14,000 coordinates (just lat/lng) stored in the database, and when I want to display the overlay path, the application crashes.
My question is: is there any way to optimize this code to be faster?
This is in viewDidLoad:
// ar is an NSMutableArray populated from the database in a few seconds, but the code below crashes my app
for(Path* p in ar)
{
self.routeLine = nil;
self.routeLineView = nil;
// while we create the route points, we will also be calculating the bounding box of our route
// so we can easily zoom in on it.
MKMapPoint northEastPoint;
MKMapPoint southWestPoint;
// create a c array of points.
MKMapPoint* pointArr = malloc(sizeof(MKMapPoint) * ar.count);
for(int idx = 0; idx < ar.count; idx++)
{
Path *m_p = [ar objectAtIndex:idx];
CLLocationDegrees latitude = m_p.Latitude;
CLLocationDegrees longitude = m_p.Longitude;
// create our coordinate and add it to the correct spot in the array
CLLocationCoordinate2D coordinate = CLLocationCoordinate2DMake(latitude, longitude);
MKMapPoint point = MKMapPointForCoordinate(coordinate);
// adjust the bounding box
// if it is the first point, just use them, since we have nothing to compare to yet.
if (idx == 0) {
northEastPoint = point;
southWestPoint = point;
}
else
{
if (point.x > northEastPoint.x)
northEastPoint.x = point.x;
if(point.y > northEastPoint.y)
northEastPoint.y = point.y;
if (point.x < southWestPoint.x)
southWestPoint.x = point.x;
if (point.y < southWestPoint.y)
southWestPoint.y = point.y;
}
pointArr[idx] = point;
}
// create the polyline based on the array of points.
self.routeLine = [MKPolyline polylineWithPoints:pointArr count:ar.count];
_routeRect = MKMapRectMake(southWestPoint.x, southWestPoint.y, northEastPoint.x - southWestPoint.x, northEastPoint.y - southWestPoint.y);
// clear the memory allocated earlier for the points
free(pointArr);
[self.mapView removeOverlays: self.mapView.overlays];
// add the overlay to the map
if (nil != self.routeLine) {
[self.mapView addOverlay:self.routeLine];
}
} // end for (Path *p in ar)
UPDATE
viewDidLoad:
...
[self performSelectorInBackground:@selector(drawPathInBackground) withObject:nil];
...
-(void)drawPathInBackground{
for(int idx = 0; idx < ar.count; idx++)
{ ... }
[self.mapView performSelector:@selector(addOverlay:) onThread:[NSThread mainThread] withObject:self.routeLine waitUntilDone:YES];
}
I did it like this and the UI no longer freezes.
The only thing left is how to draw the MKPolyline after every X points?
Three approaches:
Don't display every point; combine nearby points into a single one. The savings depend on your data and on whether you really need to display everything.
If possible, load the data in a background thread and display it in multiple batches on the main thread (see the sketch after this list); the user will effectively watch the data appear over time.
Load and display the data lazily, meaning: only display those points which are currently visible on screen.
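A minimal sketch of the second approach using GCD (the buildPolylineFromDatabase helper is hypothetical, standing in for the SQLite fetch and MKPolyline construction):
// Build the overlay off the main thread, then add it on the main thread,
// since UIKit and MapKit calls must happen there.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    MKPolyline *line = [self buildPolylineFromDatabase]; // hypothetical helper
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.mapView addOverlay:line];
    });
});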
Do the fetching from the database and the processing in a background thread.
Then reduce the number of coordinates in the path using the Douglas–Peucker algorithm (sketched below),
and cache the results.
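For reference, a hedged sketch of Douglas–Peucker over a C array of MKMapPoints (the function names and the keep-flag representation are illustrative, not from any framework): a point is kept when its perpendicular distance from the chord between the segment's endpoints exceeds the tolerance, and the two halves are then simplified recursively.
#import <MapKit/MapKit.h>

// Perpendicular distance from p to the line through a and b (in map points).
static double PerpendicularDistance(MKMapPoint p, MKMapPoint a, MKMapPoint b)
{
    double dx = b.x - a.x, dy = b.y - a.y;
    double len = sqrt(dx * dx + dy * dy);
    if (len == 0.0) return sqrt(pow(p.x - a.x, 2) + pow(p.y - a.y, 2));
    return fabs(dy * p.x - dx * p.y + b.x * a.y - b.y * a.x) / len;
}

// Mark which points between indices first and last should be kept.
static void SimplifyRange(const MKMapPoint *pts, NSUInteger first, NSUInteger last,
                          double tolerance, BOOL *keep)
{
    double maxDist = 0.0;
    NSUInteger index = first;
    for (NSUInteger i = first + 1; i < last; i++) {
        double d = PerpendicularDistance(pts[i], pts[first], pts[last]);
        if (d > maxDist) { maxDist = d; index = i; }
    }
    if (maxDist > tolerance) {
        keep[index] = YES;
        SimplifyRange(pts, first, index, tolerance, keep);
        SimplifyRange(pts, index, last, tolerance, keep);
    }
}
Preset keep[0] and keep[count - 1] to YES, call SimplifyRange(pts, 0, count - 1, tolerance, keep), then copy the kept points into a smaller array before building the polyline; with 14,000 GPS points, a modest tolerance typically eliminates the vast majority of them.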
If you have an array of coordinates, then use this code
(here routes is an array of CLLocation objects).
NSLog(@"count %lu", (unsigned long)[routes count]);
MKMapPoint* pointArr = malloc(sizeof(MKMapPoint) * [routes count]);
for(int idx = 0; idx < [routes count]; idx++)
{
CLLocation* location = [routes objectAtIndex:idx];
CLLocationCoordinate2D workingCoordinate;
workingCoordinate.latitude=location.coordinate.latitude;
workingCoordinate.longitude=location.coordinate.longitude;
NSLog(#"loop = %f,%f",workingCoordinate.latitude, workingCoordinate.longitude);
MKMapPoint point = MKMapPointForCoordinate(workingCoordinate);
pointArr[idx] = point;
}
// create the polyline based on the array of points.
self.routeLine = [MKPolyline polylineWithPoints:pointArr count:[routes count]];
[mapView addOverlay:self.routeLine];
free(pointArr);
Hope this helps.
Google has an algorithm (the encoded polyline format) that can encode the locations into a string.
For your situation, 14,000 coordinates will be encoded into a string roughly 14,000 characters long.
Then put the string into SQLite.
It will speed up getting the data from the DB.
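For reference, a sketch of that encoding (an illustrative function following Google's published encoded polyline algorithm format, not an Apple API): each coordinate is scaled by 1e5 and rounded, delta-encoded against the previous point, zigzag-encoded so negatives become small positive values, and emitted as 5-bit chunks offset by 63.
#import <CoreLocation/CoreLocation.h>

// Encode an NSArray of CLLocation objects as a Google-style polyline string.
static NSString *EncodePolyline(NSArray *locations)
{
    NSMutableString *encoded = [NSMutableString string];
    long prevLat = 0, prevLng = 0;
    for (CLLocation *loc in locations) {
        long lat = lround(loc.coordinate.latitude * 1e5);
        long lng = lround(loc.coordinate.longitude * 1e5);
        long deltas[2] = { lat - prevLat, lng - prevLng };
        prevLat = lat; prevLng = lng;
        for (int i = 0; i < 2; i++) {
            // Zigzag encoding: shift left, invert the bits if negative.
            unsigned long v = (unsigned long)deltas[i] << 1;
            if (deltas[i] < 0) v = ~v;
            while (v >= 0x20) {
                [encoded appendFormat:@"%c", (char)((0x20 | (v & 0x1f)) + 63)];
                v >>= 5;
            }
            [encoded appendFormat:@"%c", (char)(v + 63)];
        }
    }
    return encoded;
}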

iOS Accelerometer-Based Gesture Recognition

I want to create a project that reads the user's gesture (accelerometer-based) and recognises it. I searched a lot, but everything I found was too old. I have no problems with classification or recognition; I will use the $1 recognizer or an HMM. I just want to know how to read the user's gesture using the accelerometer.
Is the accelerometer data (x, y, z values) enough, or should I use other data with it, such as attitude data (roll, pitch, yaw), gyro data, or magnetometer data? I don't really understand any of them, so an explanation of what these sensors do would be useful.
Thanks in advance!
Finally I did it. I used the userAcceleration data, which is the acceleration of the device excluding gravity. I found that a lot of people use the raw acceleration data and do a lot of math to remove gravity from it; as of iOS 6, that is already done for you in userAcceleration.
And I used the $1 recognizer, which is a 2D recognizer (i.e. point(5, 10), no Z). Here's a link for the $1 recognizer; there's a C++ version of it in the downloads section.
Here are the steps of my code...
Read the userAcceleration data at a frequency of 50 Hz.
Apply a low-pass filter to it.
Take a point into consideration only if its x or y value is greater than 0.05, to reduce noise. (Note: the next step depends on your code and on the recognizer you use.)
Save the x and y points into an array.
Create a 2D path from this array.
Send this path to the recognizer to either train it or recognize it.
Here's my code...
@implementation MainViewController {
double previousLowPassFilteredAccelerationX;
double previousLowPassFilteredAccelerationY;
double previousLowPassFilteredAccelerationZ;
CGPoint position;
int numOfTrainedGestures;
GeometricRecognizer recognizer;
}
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
previousLowPassFilteredAccelerationX = previousLowPassFilteredAccelerationY = previousLowPassFilteredAccelerationZ = 0.0;
recognizer = GeometricRecognizer();
// Note: I let the user train his own gestures, so I start up each time with 0 gestures
numOfTrainedGestures = 0;
}
#define kLowPassFilteringFactor 0.1
#define MOVEMENT_HZ 50
#define NOISE_REDUCTION 0.05
- (IBAction)StartAccelerometer
{
CMMotionManager *motionManager = [CMMotionManager SharedMotionManager];
if ([motionManager isDeviceMotionAvailable])
{
[motionManager setDeviceMotionUpdateInterval:1.0/MOVEMENT_HZ];
[motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue currentQueue]
withHandler: ^(CMDeviceMotion *motion, NSError *error)
{
CMAcceleration lowpassFilterAcceleration, userAcceleration = motion.userAcceleration;
lowpassFilterAcceleration.x = (userAcceleration.x * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationX * (1.0 - kLowPassFilteringFactor));
lowpassFilterAcceleration.y = (userAcceleration.y * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationY * (1.0 - kLowPassFilteringFactor));
lowpassFilterAcceleration.z = (userAcceleration.z * kLowPassFilteringFactor) + (previousLowPassFilteredAccelerationZ * (1.0 - kLowPassFilteringFactor));
if (lowpassFilterAcceleration.x > NOISE_REDUCTION || lowpassFilterAcceleration.y > NOISE_REDUCTION)
[self.points addObject:[NSString stringWithFormat:@"%.2f,%.2f", lowpassFilterAcceleration.x, lowpassFilterAcceleration.y]];
previousLowPassFilteredAccelerationX = lowpassFilterAcceleration.x;
previousLowPassFilteredAccelerationY = lowpassFilterAcceleration.y;
previousLowPassFilteredAccelerationZ = lowpassFilterAcceleration.z;
// Just viewing the points to the user
self.XLabel.text = [NSString stringWithFormat:@"X : %.2f", lowpassFilterAcceleration.x];
self.YLabel.text = [NSString stringWithFormat:@"Y : %.2f", lowpassFilterAcceleration.y];
self.ZLabel.text = [NSString stringWithFormat:@"Z : %.2f", lowpassFilterAcceleration.z];
}];
}
else NSLog(@"DeviceMotion is not available");
}
- (IBAction)StopAccelerometer
{
[[CMMotionManager SharedMotionManager] stopDeviceMotionUpdates];
// View all the points to the user
self.pointsTextView.text = [NSString stringWithFormat:@"%lu\n\n%@", (unsigned long)self.points.count, [self.points componentsJoinedByString:@"\n"]];
// There must be more than one trained gesture, because recognition returns the closest template by distance
if (numOfTrainedGestures > 1) {
Path2D path = [self createPathFromPoints]; // A method to create a 2D path from pointsArray
if (path.size()) {
RecognitionResult recognitionResult = recognizer.recognize(path);
self.recognitionLabel.text = [NSString stringWithFormat:@"%s detected with probability %.2f!", recognitionResult.name.c_str(),
recognitionResult.score];
} else self.recognitionLabel.text = @"Not enough points for gesture!";
}
else self.recognitionLabel.text = @"Not enough templates!";
[self releaseAllVariables];
}
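For completeness, a hedged sketch of the createPathFromPoints helper referenced above, assuming (as in the C++ port of the $1 recognizer) that Path2D is a std::vector of Point2D; it parses each stored "x,y" string back into a point:
// Rebuild a Path2D from the "x,y" strings accumulated in self.points.
// Assumes Path2D is std::vector<Point2D> as in the $1 recognizer's C++ port.
- (Path2D)createPathFromPoints
{
    Path2D path;
    for (NSString *pointString in self.points) {
        NSArray *parts = [pointString componentsSeparatedByString:@","];
        if (parts.count == 2) {
            path.push_back(Point2D([[parts objectAtIndex:0] doubleValue],
                                   [[parts objectAtIndex:1] doubleValue]));
        }
    }
    return path;
}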

Acceleration in a for..... loop

I have a for...loop, and the UI is blocked while the loop runs.
Is the accelerometer also blocked?
I have following code in viewDidLoad
- (void)viewDidLoad{
[super viewDidLoad];
UIAccelerometer *accel = [UIAccelerometer sharedAccelerometer];
accel.delegate = self;
accel.updateInterval = 1.0f/60.0f;
}
and further down
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
acc=acceleration.x;
}
then an IBAction connected to a button which starts a loop
- (IBAction)doSomething {
for (int n = 1; n<=100;n++) {
// do someting in the loop
NSLog(#"x: %g", acc);
}
}
When logging the accelerometer x value, it shows only the first value from when the button was pressed and does not update while the loop is running; the same first value repeats continuously.
Is there a way to log the acceleration while in the loop?
It would seem that UIAccelerometer needs the main thread, so it is getting blocked by the for...loop. Core Motion's CMMotionManager also has an accelerometer property that can report acceleration, and it handles backgrounding better, since it's what Nike+ uses. There are two ways to access it: a pull method where you read the data directly, or a callback. To be honest, I just tried it and couldn't get the callback to work (mostly because it requires an NSOperationQueue, and I don't have much experience with that, so I'm probably putting it on the wrong queue), but if you're OK with pulling data, which your approach seems to need, then this works:
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
[motionManager startAccelerometerUpdates];
for (int i = 0; i < 10000; i++)
{
NSLog(#"Acceleration: %f", motionManager.accelerometerData.acceleration.x);
}
I just tried it on a device and the values reported in the for...loop update with the current acceleration.
Source: UIBackgroundModes and UIAccelerometer
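For completeness, a hedged sketch of the callback variant the answer mentions (this is the standard Core Motion API; the queue choice here is just one reasonable option):
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.accelerometerUpdateInterval = 1.0 / 60.0;
// Deliver samples on a background queue; hop to the main queue for UI work.
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[motionManager startAccelerometerUpdatesToQueue:queue
                                    withHandler:^(CMAccelerometerData *data, NSError *error) {
    if (error) { return; }
    NSLog(@"Acceleration: %f", data.acceleration.x);
}];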

Am I doing the right thing to convert decibels from [-120, 0] to [0, 120]

I would like to measure the sound volume of the surroundings; I'm not too sure if I am doing the right thing.
I would like to create a VU meter with a range of 0 (quiet) to 120 (very noisy).
I get the peak and average power, but they are very high in a normally quiet environment.
Please give me some pointers.
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
// Create an audio CAF file in the temporary directory. This isn't ideal, but
// it's the only way to get this class functioning (the temporary directory is
// erased once the app quits). We also specify a sample rate of 44.1 kHz (which
// can represent sound frequencies up to 22 kHz according to the Nyquist
// theorem) and 1 channel (we do not need stereo to measure noise).
NSDictionary* recorderSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatLinearPCM],AVFormatIDKey,
[NSNumber numberWithInt:44100],AVSampleRateKey,
[NSNumber numberWithInt:1],AVNumberOfChannelsKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
nil];
NSError* error;
NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recorderSettings error:&error];
if (recorder) {
[recorder prepareToRecord];
// Enable metering, then tell the recorder to start recording.
recorder.meteringEnabled = YES;
[recorder record];
levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.01 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
} else
{
NSLog(#"%#",[error description]);
}
}
- (void)levelTimerCallback:(NSTimer *)timer {
[recorder updateMeters];
const double ALPHA = 0.05;
double peakPowerForChannel = pow(10, (0.05 * [recorder averagePowerForChannel:0]));
lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;
NSLog(#"Average input: %f Peak input: %f Low pass results: %f", [recorder averagePowerForChannel:0], [recorder peakPowerForChannel:0], lowPassResults);
float tavgPow =[recorder averagePowerForChannel:0] + 120.0;
float tpPow = [recorder peakPowerForChannel:0] + 120.0;
float avgPow = tavgPow;//(float)abs([recorder averagePowerForChannel:0]);
float pPow = tpPow;//(float)abs([recorder peakPowerForChannel:0]);
NSString *tempAvg = [NSString stringWithFormat:@"%0.2f",avgPow];
NSString *temppeak = [NSString stringWithFormat:@"%0.2f",pPow];
[avg setText:tempAvg];
[peak setText:temppeak];
NSLog(#"Average input: %f Peak input: %f Low pass results: %f", avgPow,pPow , lowPassResults);
}
Apple uses a lookup table in their SpeakHere sample that converts from dB to a linear value displayed on a level meter, presumably to save device power.
I also needed this, but didn't think a couple of float calculations every 1/10 s (my refresh rate) would cost that much device power. So, instead of building up a table, I moulded their code into:
float level; // The linear 0.0 .. 1.0 value we need.
const float minDecibels = -80.0f; // Or use -60dB, which I measured in a silent room.
float decibels = [audioRecorder averagePowerForChannel:0];
if (decibels < minDecibels)
{
level = 0.0f;
}
else if (decibels >= 0.0f)
{
level = 1.0f;
}
else
{
float root = 2.0f;
float minAmp = powf(10.0f, 0.05f * minDecibels);
float inverseAmpRange = 1.0f / (1.0f - minAmp);
float amp = powf(10.0f, 0.05f * decibels);
float adjAmp = (amp - minAmp) * inverseAmpRange;
level = powf(adjAmp, 1.0f / root);
}
I'm using an AVAudioRecorder, hence you see me getting the dB values with averagePowerForChannel:, but you can plug in your own dB value there.
Apple's example used double calculations, which I don't understand, because for audio metering float accuracy is more than sufficient and costs less device power.
Needless to say, you can now scale this calculated level to your 0..120 range with a simple level * 120.0f.
The above code can be sped up, once we fix root at 2.0f, by replacing powf(adjAmp, 1.0f / root) with sqrtf(adjAmp); but that's a minor thing, and a very good compiler might do this for us. And I'm almost sure that inverseAmpRange will be calculated once at compile time.
The formula for converting a linear amplitude to decibels, when you want to use 1.0 as your reference (for 0 dB), is:
20 * log10(amp);
So I'm not sure about the intent from looking at your code, but you probably want
float db = 20 * log10([recorder averagePowerForChannel:0]);
This will go from -infinity at an amplitude of zero, to 0db at an amplitude of 1.
If you really need it to go up to between 0 and 120 you can add 120 and use a max function at zero.
So, after the above line:
db += 120;
db = db < 0 ? 0 : db;
The formula you are using appears to be the one for converting dB to amplitude, which I think is the opposite of what you want.
Edit: I reread, and it seems you may already have the decibel value.
If this is the case, just don't convert to amplitude, and add 120.
So change
double peakPowerForChannel = pow(10, (0.05 * [recorder averagePowerForChannel:0]));
to
double peakPowerForChannel = [recorder averagePowerForChannel:0];
and you should be okay to go.
Actually, the range of decibels is from -160 to 0, but it can go to positive values (AVAudioRecorder Class Reference, averagePowerForChannel: method).
Then it is better to write db += 160; instead of db += 120;. Of course, you can also apply an offset to correct it.
I made a regression model to map the relation between the WAV data generated by AVAudioRecorder and the decibel data from averagePowerForChannel::
averagePowerForChannel (dB) = -80 + 6 * log2(wav_RMS)
where wav_RMS is the root mean square value of the WAV data over a short time window, e.g. 0.1 s.
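A small sketch of that mapping (illustrative: it assumes 16-bit PCM samples and uses the fitted constants above):
#include <math.h>
#include <stddef.h>
#include <stdint.h>

// Estimate an averagePowerForChannel:-style dB value from a short window
// of 16-bit PCM samples, using the fitted model dB = -80 + 6 * log2(RMS).
static double EstimatedDecibels(const int16_t *samples, size_t count)
{
    double sumSquares = 0.0;
    for (size_t i = 0; i < count; i++) {
        double s = (double)samples[i];
        sumSquares += s * s;
    }
    double rms = sqrt(sumSquares / (double)count);
    return -80.0 + 6.0 * log2(rms);
}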
Simply set your maximum and minimum values. Say you are getting a range of 0 to 120 and you want a range of 0 to 60: simply divide the value by two to get half the range, and so on.

iPad 1 Gyroscope: roll,pitch,yaw stay zero

I'm trying to make a simple app utilizing the gyroscope, where a character moves according to the rotation of the iPad 1.
My code is not working, so I tested to see the values of roll, pitch, and yaw,
and they actually stay at zero no matter how I move the device.
I'm sure the iPad 1 supports CMMotionManager, so I'm not sure what's causing this...
My code is as follows:
- (id) init{
if((self=[super init])){
self.isTouchEnabled = YES;
winSize = [[CCDirector sharedDirector] winSize];
[self createRabbitSprite];
self.motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 1.0/60.0;
if(motionManager.isDeviceMotionAvailable){
[motionManager startDeviceMotionUpdates];
}
[self scheduleUpdate];
//[self registerWithTouchDispatcher];
}
return self;
}
-(void)update:(ccTime)delta{
CMDeviceMotion *currentDeviceMotion = motionManager.deviceMotion;
CMAttitude *currentAttitude = currentDeviceMotion.attitude;
if(referenceFrame){
[currentAttitude multiplyByInverseOfAttitude:referenceFrame];
}
float roll = currentAttitude.roll;
float pitch = currentAttitude.pitch;
float yaw = currentAttitude.yaw;
NSLog(#"%.2f and %.2f and %.2f",roll,pitch,yaw);
rabbit.rotation = CC_RADIANS_TO_DEGREES(yaw);
}
Please help me out,
and thanks in advance.
(edit)
Apparently, motionManager.isDeviceMotionAvailable is returning FALSE...
which must mean that the iPad 1 doesn't support Core Motion???
Could it be something with the settings?
The first-generation iPad does support CMMotionManager (it has an accelerometer), but it won't return any gyroscopic data: it doesn't have a gyroscope! You'll need to check the gyroAvailable property of a CMMotionManager instance.
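A short sketch of that capability check (these are standard Core Motion properties; the fallback branch is just one reasonable option):
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
if (motionManager.gyroAvailable) {
    // Gyroscope present (iPhone 4 / iPad 2 and later): device motion with
    // attitude (roll/pitch/yaw) will work.
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;
    [motionManager startDeviceMotionUpdates];
} else if (motionManager.accelerometerAvailable) {
    // No gyroscope (e.g. iPad 1): fall back to raw accelerometer data.
    motionManager.accelerometerUpdateInterval = 1.0 / 60.0;
    [motionManager startAccelerometerUpdates];
}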