DMP MPU 6050 using STM32F4 - accelerometer

I am trying to combine data from both the accelerometer and the gyroscope of my MPU 6050 to get pitch, roll, and yaw values. I've been able to read the accelerometer and gyroscope data, but I'm having trouble combining them. I use Keil µVision as my IDE. Has anyone made a library that computes these values using the DMP in Keil?

Yes, there are a few Arduino libraries that do this. You can just translate the code to the STM32F4. The hardest thing to find is a datasheet for the DMP.
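Since the DMP is so poorly documented, if what you mainly need is pitch and roll, a common alternative is to fuse the raw accelerometer and gyroscope readings yourself with a complementary filter. Below is a minimal C sketch; the function names, axis conventions, and scaling (accel in g, gyro in deg/s) are assumptions you would adapt to your own MPU 6050 driver. Note that the accelerometer cannot correct yaw at all, so yaw will drift without the DMP or a magnetometer.

```c
/*
 * Minimal complementary-filter sketch for pitch/roll from MPU 6050 data.
 * Assumes readings are already scaled: accel in g, gyro in deg/s.
 * All names here (cf_update, attitude_t, ...) are illustrative,
 * not from any particular library.
 */
#include <math.h>

#define CF_ALPHA 0.98f   /* trust the gyro 98%, the accelerometer 2% */

typedef struct {
    float pitch;  /* degrees */
    float roll;   /* degrees */
} attitude_t;

void cf_update(attitude_t *att,
               float ax, float ay, float az,  /* accel, g */
               float gx, float gy,            /* gyro, deg/s */
               float dt)                      /* seconds since last call */
{
    /* The accelerometer gives an absolute (but noisy) attitude reference. */
    float acc_pitch = atan2f(-ax, sqrtf(ay * ay + az * az)) * 57.2957795f;
    float acc_roll  = atan2f(ay, az) * 57.2957795f;

    /* Integrate the gyro rate, then pull the estimate toward the accel value. */
    att->pitch = CF_ALPHA * (att->pitch + gy * dt) + (1.0f - CF_ALPHA) * acc_pitch;
    att->roll  = CF_ALPHA * (att->roll  + gx * dt) + (1.0f - CF_ALPHA) * acc_roll;
}
```

Call cf_update at a fixed rate (for example every 10 ms with dt = 0.01f): the gyro term dominates short-term changes while the accelerometer slowly pulls the estimate back, which is what keeps pitch and roll from drifting.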

Related

How to do high-end video encoding on BeagleBone Black

As we know, the BeagleBone Black doesn't have a DSP on its SoC specific for video processing, but is there any way to achieve that by adding an external DSP board?
I mean, the Raspberry Pi has hardware video processing, so has anyone tried to integrate the two boards so that both capabilities work together?
I know it's not the optimal way and the two boards are different, but I only have one BBB and one Raspberry Pi, and I am trying to achieve 1080p video streaming with better quality.
There is no DSP on the BeagleBone Black, so you would need to implement the DSP functionality in software.
If your input is audio, you can use ALSA.
When you say it "doesn't have a DSP on its SoC specific for video processing", I think you mean what is usually called a VPU (Video Processing Unit), and indeed the BeagleBone Black's AM3358 processor doesn't have one (source: http://www.ti.com/lit/gpn/am3358).
x264 has ARM NEON optimizations, so it can encode video reasonably well in software; 640x480 at 30 fps should be fine, but 1920x1080 at 30 fps is likely out of reach (you may get 8-10 fps).
On the Raspberry Pi, you can use GStreamer with omxh264enc to take advantage of the onboard VPU to encode video. I think it is a bit rough (not as solid as raspivid, etc.), but this should get you started: https://blankstechblog.wordpress.com/2015/01/25/hardware-video-encoding-progess-with-the-raspberry-pi/

Can MATLAB do realtime motion tracking?

I tried to track motion in MATLAB using this tutorial (http://www.mathworks.com/help/vision/examples/motion-based-multiple-object-tracking.html), and it works fine, but it uses a video file as its source.
I want to know whether it's possible to track motion with the same tutorial but in real time, using a camera as the source.
Everything is possible, just please try to find some stuff by yourself before asking here.
I think you may find the information you need in this link:
http://www.matlabtips.com/realtime-processing/
Alternatively, you could of course just store the camera output as a (very short) video and continuously analyse that instead.
As of release R2014a, MATLAB includes support for USB webcams. If you have an older version, or if you want to use a high-end camera, you would need the Image Acquisition Toolbox.
Once you are able to get frames from the camera, you can reuse almost all of the code in the multiple object tracking example. You would only need to rewrite the readFrame function with code to get a frame from the camera.

Getting the voltage applied to an iPhone's microphone port

I am looking at a project where we send information from a peripheral device to an iPhone through the microphone input.
In a nutshell, the iPhone would act as a voltmeter. (In reality, the controller we developed will send data encoded as voltages to the iPhone for further processing).
As there are several voltmeter apps on the AppStore that receive their input through the microphone port, this seems to be technically possible.
However, scanning the AudioQueue and AudioFile APIs, there doesn't seem to be a method for directly accessing the voltage.
Does anyone know of APIs, sample code or literature that would allow me to access the voltage information?
The A/D converter on the line-in is a voltmeter, for all practical purposes. The sample values map to the voltage applied at the time the sample was taken. You'll need to do your own testing to figure out what voltages correspond to the various sample values.
As far as I know, it won't be possible to get the voltages directly; you'll have to figure out how to convert them to equivalent 'sounds' such that the iOS APIs will pick them up as sounds, which you can interpret as voltages in your app.
If I were attempting this sort of app, I would hook up some test voltages to the input (very small ones!), capture the sound and then see what it looks like. A bit of reverse engineering should get you to the point where you can interpret the inputs correctly.
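To make that calibration step concrete, here is a minimal C sketch, assuming 16-bit linear PCM samples (e.g. from an AudioQueue buffer) and a hypothetical full-scale voltage that you would determine yourself from those test measurements.

```c
/*
 * Sketch: interpret 16-bit PCM samples from the mic input as voltages.
 * FULL_SCALE_VOLTS is a hypothetical calibration constant; determine it
 * by applying known test voltages and observing the resulting samples.
 */
#include <stdint.h>
#include <stddef.h>

#define FULL_SCALE_VOLTS 0.9f  /* placeholder: replace after calibration */

/* Convert one signed 16-bit sample to an estimated input voltage. */
static float sample_to_volts(int16_t sample)
{
    return ((float)sample / 32768.0f) * FULL_SCALE_VOLTS;
}

/* Average the absolute voltage over a captured buffer of samples. */
float mean_abs_volts(const int16_t *buf, size_t n)
{
    float acc = 0.0f;
    for (size_t i = 0; i < n; i++) {
        float v = sample_to_volts(buf[i]);
        acc += (v < 0.0f) ? -v : v;
    }
    return (n > 0) ? acc / (float)n : 0.0f;
}
```

Keep in mind that microphone inputs are normally AC-coupled, so a steady DC level won't make it through to the samples; the peripheral has to encode its data as an audio-frequency signal, which is exactly the "sounds" interpretation described above.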

What Webcam is Compatible with MATLAB

I am working on a project that requires a webcam and MATLAB. I have a Logitech webcam, and I don't know whether I can talk to it through MATLAB. I'm trying to work with images and image processing, so I just want to know if there is a way to find out whether the webcam is compatible with MATLAB, or whether I need some other type of webcam to get the job done. If I do need another one, it would be helpful if you could suggest one that is cheap and widely available.
Thank you.
I've used several Logitech webcams with the Image Acquisition Toolbox on Windows. You'll find a list of supported hardware here.

Improving iPhone AR (Tool)Kit by using the Gyroscope

I'm using iPhone AR Kit and its fork, iPhone AR Toolkit, but I'm trying to improve the user experience by using the gyroscope when it's available.
For those of you who have used the kits, do you have any idea how to do this? My first thought was to use the gyroscope yaw to get a more precise azimuth value.
So I have two questions:
Has anyone used the AR Kit linked above, and do you have any thoughts on including the gyroscope in it?
Is it a good idea to mix gyroscope and compass data to get a more precise azimuth value?
Gyroscopes measure rotational velocity, so the gyro output will be the change in yaw per second (e.g. rad/s) rather than an absolute yaw. There are various methods for trying to use gyros for "dead reckoning" of orientation, but in practice, while they're very accurate over the short term, integrating gyro read-outs to determine orientation "drifts" significantly, so you have to keep recalibrating against some absolute measure.
It would be fairly trivial to use the gyro to interpolate between compass readings, or to calculate the bearing from the gyro alone during short, fast motions while the compass catches up, but properly fusing the compass and gyro isn't trivial. There's a talk here on integrating sensors on Android that might be a good start. The standard method of fusing sensors is to use a Kalman filter; there's an introduction here. They're fairly involved tools; you need a good model of your sensor errors, for example. A rough sketch of the simple interpolation approach is below.
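To illustrate the "interpolate between compass readings" idea (not the full Kalman treatment), here is a minimal C sketch. The function names, units (degrees and deg/s), and the FUSE_ALPHA weight are assumptions you would tune against your actual sensor update rates.

```c
/*
 * Sketch: complementary filter fusing compass heading with gyro yaw rate.
 * Assumes heading and output in degrees [0, 360), gyro rate in deg/s.
 * This is the simple "gyro for fast motion, compass for drift correction"
 * approach; a Kalman filter handles sensor noise models properly.
 */
#include <math.h>

#define FUSE_ALPHA 0.97f  /* weight on the gyro-propagated estimate */

/* Wrap an angle difference into (-180, 180] degrees. */
static float wrap180(float a)
{
    while (a > 180.0f)   a -= 360.0f;
    while (a <= -180.0f) a += 360.0f;
    return a;
}

float fuse_azimuth(float prev_azimuth,    /* previous fused estimate, deg */
                   float gyro_yaw_rate,   /* deg/s, about the vertical axis */
                   float compass_heading, /* latest compass heading, deg */
                   float dt)              /* seconds since last update */
{
    /* Propagate the previous estimate with the gyro rate... */
    float predicted = prev_azimuth + gyro_yaw_rate * dt;

    /* ...then nudge it toward the compass, taking the short way around. */
    float error = wrap180(compass_heading - predicted);
    float fused = predicted + (1.0f - FUSE_ALPHA) * error;

    /* Normalize back to [0, 360). */
    fused = fmodf(fused, 360.0f);
    if (fused < 0.0f) fused += 360.0f;
    return fused;
}
```

Call it on every gyro sample, passing in the most recent compass heading; a Kalman filter does essentially the same thing but weights the correction according to explicit noise models instead of a fixed alpha.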