How can I track a circular motion in 3D? Currently I'm using an accelerometer and gyroscope with a sensor fusion algorithm to calculate orientation, but I cannot calculate the position accurately and hence am unable to track the motion precisely. Please suggest an algorithm or an alternate method.
I'm working on a robot project. It needs to measure subtle movements in the XY directions while driving in the Z direction.
So I was thinking of using a camera with MATLAB and a blinking LED attached to a wall: using image subtraction I can identify the LED, and with a weight matrix locate the center of the light.
Then, at regular intervals, I can log how many pixels the center has moved left-right or up-down and check the accuracy of the motion.
But when attempting this sensing solution I ran into some challenges I couldn't overcome:
a light source like an LED or laser has soft edges, so the detected center is not accurate
the camera is not calibrated (and I'm not sure how to calibrate it)
Is there another simple solution to this problem?
Note: the measured motion only needs to be proportional (relative), not absolute.
You might be able to improve the accuracy of the LED's location by applying some kind of peak interpolation.
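For example, here is a minimal sketch of one common flavour of this, 1-D parabolic peak interpolation, written in Swift purely for illustration (the same formula translates directly to MATLAB; the function and variable names are made up for the example):

```swift
import Foundation

// Fit a parabola through the brightest pixel and its two neighbours along one
// axis and return the sub-pixel offset of the peak, in the range [-0.5, 0.5].
func subPixelPeakOffset(left y0: Double, center y1: Double, right y2: Double) -> Double {
    let denom = y0 - 2 * y1 + y2
    guard denom != 0 else { return 0 }   // flat neighbourhood: keep the integer peak
    return 0.5 * (y0 - y2) / denom
}

// Example: if the brightest pixel in a row has value 131 and its left/right
// neighbours are 118 and 120, the refined column is
// brightestColumn + subPixelPeakOffset(left: 118, center: 131, right: 120)
```

Doing this once along each axis around the weighted center you already compute should give a noticeably more stable sub-pixel estimate than the raw centroid alone.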
For the calibration: MATLAB offers a Camera Calibrator app; maybe that helps you.
I am working on optical-flow-based vehicle detection and tracking purely in MATLAB.
In my case the camera is in motion and the object is also in motion.
Previously, a lot of work has been done for a stationary camera and a moving object. Optical flow vectors can easily be determined using the Lucas-Kanade method or Horn-Schunck; simply taking two consecutive frames gives the result. I have run tests and achieved this.
There is also a Simulink example, viptrafficof_win, available.
I need to perform optical-flow-based detection and tracking with both the camera and the object in motion. What methodology should I pursue?
If your camera is moving, you would have to separate the camera motion (ego motion) from the motion of the objects. There are different ways of doing that. Here is a recent paper describing an approach using the orientations of optical flow vectors.
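As a very rough illustration of the idea (and explicitly not the method from that paper), one simple baseline is to treat the dominant flow vector as the camera-induced motion and look at the residual flow. A hedged sketch in Swift, with made-up names:

```swift
import Foundation

// Crude ego-motion compensation for a mostly static scene: take the median
// optical-flow vector as the camera-induced motion and flag flow vectors that
// differ strongly from it as belonging to independently moving objects.
struct Flow { var dx: Double; var dy: Double }

func movingObjectMask(flows: [Flow], threshold: Double) -> [Bool] {
    guard !flows.isEmpty else { return [] }
    func median(_ xs: [Double]) -> Double {
        let s = xs.sorted(); let n = s.count
        return n % 2 == 1 ? s[n / 2] : 0.5 * (s[n / 2 - 1] + s[n / 2])
    }
    // Median of each component approximates the dominant (camera) motion.
    let ego = Flow(dx: median(flows.map { $0.dx }), dy: median(flows.map { $0.dy }))
    // Mark vectors whose residual flow (after removing camera motion) is large.
    return flows.map { f in hypot(f.dx - ego.dx, f.dy - ego.dy) > threshold }
}
```

This breaks down when moving objects dominate the frame or when camera rotation produces strongly non-uniform flow, which is why the more principled approaches in the literature model the ego motion explicitly.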
I am using an accelerometer to track the linear motion of an object along its x-axis. I need to precisely find the time instant when the object stops, so that I can integrate the accelerometer data to track the path of the motion from start to finish.
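For reference, one simple way to detect such a stationary instant is to declare the object stopped when the gravity-compensated acceleration stays below a noise threshold for a minimum number of consecutive samples. The threshold and window length in this Swift sketch are assumptions you would tune against your sensor's noise:

```swift
// Detect the first sample at which the x-axis has been "quiet" long enough.
struct StopDetector {
    let threshold: Double       // e.g. a few times the accelerometer noise floor
    let requiredSamples: Int    // how long the signal must stay quiet
    var quietCount = 0

    // Feed one gravity-compensated x-axis acceleration sample; returns true at
    // the first sample where the "object has stopped" condition is met.
    mutating func update(ax: Double) -> Bool {
        if abs(ax) < threshold { quietCount += 1 } else { quietCount = 0 }
        return quietCount == requiredSamples
    }
}
```

Resetting the integrated velocity to zero at each detected stop (a zero-velocity update) is a common way to limit the drift that otherwise accumulates during integration.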
Can I tell from the gyroscope whether I've moved some distance?
To elaborate: can I detect the device's motion if the user moves a couple of feet from the starting position?
Thanks in advance.
Linear motion is detected using the accelerometer. Rotation is detected using the gyroscope.
You can use the CMDeviceMotion class to detect both types of movement. To estimate a change in position you will have to integrate the value of userAcceleration twice over time (acceleration to velocity, then velocity to position).
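As a rough sketch of what that double integration might look like (the update interval and single axis are assumptions, and naive Euler integration like this drifts badly within seconds, so treat it as an illustration rather than a usable tracker):

```swift
import CoreMotion

let motionManager = CMMotionManager()
let dt = 0.01                                   // 100 Hz update interval (assumed)
var velocityX = 0.0                             // m/s
var positionX = 0.0                             // m

motionManager.deviceMotionUpdateInterval = dt
motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let motion = motion else { return }
    // userAcceleration is gravity-free and expressed in g; convert to m/s^2.
    let ax = motion.userAcceleration.x * 9.81
    velocityX += ax * dt                        // first integration: velocity
    positionX += velocityX * dt                 // second integration: position
}
```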
Check out the What's New in Core Motion video from WWDC 2011. You will probably find it helpful.
I'd like to transform the yaw, pitch and roll of the iPhone from the body frame to the world frame, i.e. azimuth, pitch and roll. On Android this is easily done with the
SensorManager.remapCoordinateSystem(), SensorManager.getOrientation methods as detailed here: http://blog.mysticlakesoftware.com/2009/07/sensor-accelerometer-magnetics.html
Are similar methods available for the iPhone, or can someone point me in the right direction on how to do this transformation?
Thanks
The accelerometer is good enough to get the gravity direction vector in the device coordinate system, at least once the device has settled down (is not accelerating).
The next step for full device orientation is to use CLLocationManager to get the true north vector in the device coordinate system.
With the normalized true north vector and the gravity vector you can easily get all other directions using dot and cross products.
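A sketch of that vector math in Swift, assuming you already have the normalized gravity and true-north vectors in device coordinates (the function name and the north-east-down axis convention are my own choices, not from the answer above):

```swift
import simd

// Build a device-to-world rotation from the gravity and true-north directions,
// both expressed in device coordinates.
func deviceToWorldRotation(gravity: simd_double3, trueNorth: simd_double3) -> simd_double3x3 {
    let down = simd_normalize(gravity)
    // East is perpendicular to both "down" and "north".
    let east = simd_normalize(simd_cross(down, simd_normalize(trueNorth)))
    // Re-derive north so the three axes are exactly orthogonal.
    let north = simd_cross(east, down)
    // Rows are the world (north, east, down) axes expressed in device coordinates,
    // so multiplying a device-frame vector by this matrix yields world coordinates.
    return simd_double3x3(rows: [north, east, down])
}
```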
The accelerometer (UIAccelerometer) will give you a vector from the device's accelerometer chip straight down. If you can assume that the device is being held fairly steady (i.e., that you're not reading acceleration from actual movement), then you can use simple trig (acos(), asin()) to determine the device's orientation.
If you're worried that the device might be moving, you can wait for several accelerometer readings in a row that are nearly the same. You can also filter out any vector whose length is more than some TOLERANCE (as you define it) away from 1.0.
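For illustration, a small Swift sketch of that trig plus the length filter, assuming a raw (x, y, z) accelerometer reading in g and an arbitrary 0.1 g tolerance; the sign conventions for pitch and roll depend on how you define them:

```swift
import Foundation

// Estimate tilt from a single accelerometer reading, rejecting readings that
// are clearly contaminated by movement (length far from 1 g).
func tiltAngles(x: Double, y: Double, z: Double) -> (pitch: Double, roll: Double)? {
    let length = sqrt(x * x + y * y + z * z)
    guard abs(length - 1.0) < 0.1 else { return nil }   // device is probably moving
    let pitch = asin(-x / length)    // rotation about the device's y-axis (one convention)
    let roll  = atan2(y, z)          // rotation about the device's x-axis (one convention)
    return (pitch, roll)
}
```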
In more general terms, the device has no way of knowing its orientation, other than by "feeling gravity", which is done via the accelerometer. The challenges you'll have center around the fact that the accelerometer feels all acceleration, of which gravity is only one possible source.
If you're targeting a device with a gyroscope (iPhone 4 at the time of writing), the CoreMotion framework's CMMotionManager can supply you with CMDeviceMotion updates. The framework does a good job of processing the raw sensor data and separating gravity and userAcceleration for you. You're interested in the gravity vector, which can define the pitch and roll with a little trig. To add yaw, (device rotation around the gravity vector) you'll also need to use the CoreLocation framework's CLLocationManager to get compass heading updates.
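Putting the two frameworks together as described, a minimal sketch (the class and property names are mine, and delegate wiring details plus permission handling are omitted):

```swift
import CoreMotion
import CoreLocation

// Pitch and roll from CMDeviceMotion's gravity vector; yaw from the compass
// heading reported by CLLocationManager.
class OrientationTracker: NSObject, CLLocationManagerDelegate {
    private let motionManager = CMMotionManager()
    private let locationManager = CLLocationManager()
    private(set) var pitch = 0.0, roll = 0.0   // radians
    private(set) var yaw = 0.0                 // degrees from true north

    func start() {
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let g = motion?.gravity else { return }
            // Pitch and roll from the gravity vector (one common sign convention).
            self?.pitch = atan2(-g.x, sqrt(g.y * g.y + g.z * g.z))
            self?.roll  = atan2(g.y, g.z)
        }
        locationManager.delegate = self
        locationManager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        // Yaw (rotation about the gravity vector) from the compass.
        yaw = newHeading.trueHeading
    }
}
```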