Connecting an Arduino, an accelerometer and a camera [closed] - matlab

I am looking at using accelerometer(s) as wearable sensors to track the acceleration of someone's leg while they perform various motions. I would like to take video/photos of the subject whilst the accelerometer(s) are collecting data. Is there some way to sync the camera with the data from the accelerometer, so that the acceleration vectors can be drawn on frames/images from the camera? The camera and accelerometer would therefore have to be synchronised in real time. Could I use MATLAB?

I have actually done something similar in the past, and it might give you a starting point.
I synchronized video from a webcam with accelerometer data from an IMU connected to an Arduino. I ended up programming most of it in Java, but that isn't really necessary; you could probably do it in MATLAB.
Assuming that you have already programmed the Arduino to sample the accelerometer, you can send that data to a PC over a serial connection. Then connect the camera to the same PC and use MATLAB to start recording from both of them simultaneously.
It's far too complicated for me to explain all of the details in this post, but I hope this gives you an idea of how to begin.
Good luck!
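Whatever tool ends up doing the recording, the core of the synchronisation step is to timestamp both streams against the same clock (for example, the PC clock at the moment each serial line or frame arrives) and then pair each video frame with the nearest accelerometer sample. Below is a minimal sketch of that pairing logic only, written in Swift purely for illustration; the types and field names are hypothetical, and the same approach translates directly to MATLAB arrays.

```swift
import Foundation

// Sketch of the alignment step only. Both streams are assumed to already carry
// timestamps taken from the same clock. All types and field names are hypothetical.

struct AccelSample {
    var time: TimeInterval            // seconds since recording started
    var x: Double, y: Double, z: Double
}

struct VideoFrame {
    var time: TimeInterval            // capture time on the same clock
    var index: Int
}

/// For each frame, find the accelerometer sample closest in time, so the
/// acceleration vector can later be drawn onto that frame.
/// Assumes both arrays are sorted by time.
func align(frames: [VideoFrame], samples: [AccelSample]) -> [(VideoFrame, AccelSample)] {
    var pairs: [(VideoFrame, AccelSample)] = []
    var i = 0
    for frame in frames {
        // Advance while the next sample is at least as close to the frame time.
        while i + 1 < samples.count,
              abs(samples[i + 1].time - frame.time) <= abs(samples[i].time - frame.time) {
            i += 1
        }
        if !samples.isEmpty {
            pairs.append((frame, samples[i]))
        }
    }
    return pairs
}
```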

Related

neural network that gets a game turn and game state as input [closed]

I have seen this video: https://youtu.be/v9M2Ho9I9Qo?t=49
It's about creating an AI for Go.
At 0:50 he talks about feeding a neural network a Go game state and a possible move by a player.
My question: what is the best way to feed in the game state and the move? I know I could just feed the neural net the game state after each possible move, but in the video he says he feeds the move together with the board state before the move.
How should it be done?
I think you need to look into policy-based methods. In policy-based methods we try to learn directly the policy function that maps state to action (e.g. policy gradients and actor-critic methods).
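To make that concrete, there are two common ways to present the state and the move, sketched below in plain Swift (the 19x19 board size, the three feature planes and all names are my assumptions, not taken from the video): either feed the board state plus a one-hot encoding of the candidate move and predict a single score for that pair, or feed only the board state and let the network output one probability per legal move, which is what a policy network does.

```swift
// Sketch: two ways to encode a Go position for a neural network.
// Assumptions: 19x19 board, three binary feature planes (own / opponent / empty).

enum Stone: Equatable { case empty, black, white }

let boardSize = 19

/// Flatten the board into three binary feature planes, already normalised to 0/1.
func encodeState(_ board: [[Stone]], blackToMove: Bool) -> [Float] {
    var own = [Float](), opp = [Float](), empty = [Float]()
    for row in board {
        for point in row {
            let isBlack = (point == .black), isWhite = (point == .white)
            own.append(((blackToMove && isBlack) || (!blackToMove && isWhite)) ? 1 : 0)
            opp.append(((blackToMove && isWhite) || (!blackToMove && isBlack)) ? 1 : 0)
            empty.append(point == .empty ? 1 : 0)
        }
    }
    return own + opp + empty              // 3 * 19 * 19 = 1083 inputs
}

/// Option A (state + candidate move in, as the questioner describes from the video):
/// append a one-hot plane marking the move; the network scores that (state, move) pair.
func encodeStateAndMove(_ state: [Float], moveRow: Int, moveCol: Int) -> [Float] {
    var movePlane = [Float](repeating: 0, count: boardSize * boardSize)
    movePlane[moveRow * boardSize + moveCol] = 1
    return state + movePlane              // 1083 + 361 inputs, 1 output
}

// Option B (policy network, as in the answer above): feed only encodeState(...)
// and give the network 361 outputs (+1 for pass), softmaxed into a probability
// per move. No separate move input is needed.
```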

Ideas for a 2D game for a neural network to play [closed]

I am currently trying to implement my own neural-network library, and I would like to test it by letting it (and networks made with other libraries) play a 2D game. The problem is I can't really find a good game for a neural network to play.
Requirements for the game:
It should not involve skills like reaction time or precision; it should instead require some tactical skill.
It should be easily scorable, in order to create an efficient evolutionary algorithm.
It should be relatively simple.
It does not have to be a game that already exists; you can come up with one if you have an idea.
It may be a single-player game (like Mario) or a 1v1 game (like Pong).
It must not be any kind of MMO, RPG, etc. I am looking for a small mini-game.
The game should be well playable by a neural network. This means it should have a fixed number of inputs, each somehow normalizable between 0 and 1. Inputs can be sensors, angles to the closest objects, etc. The inputs should NOT be the pixels of the screen, because 3*1920*1080 is just too much. Up to about 100 inputs are manageable (because I am a beginner and can't afford to let my computer calculate for hours just to evolve one generation).
Also, the game should definitely be a 2D game, since I am going to use an AWT JPanel to draw on.
I'm the main developer of Neataptic.js, basically a neural network library with neuro-evolution built into it. Just to give you some ideas, you might want to look at the following articles of mine:
Agar.io AI
Target-seeking AI
Some other suggestions:
Snake
Flappy bird
Bomberman
Neural networks have been tested on most simple 2D games, so if you're stuck you will always find code that might help you.
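To make the question's requirement of a fixed-size input vector normalised to 0...1 concrete, here is a hedged sketch for a Pong-like game; the field dimensions, the choice of five inputs and all names are illustrative assumptions.

```swift
// Sketch: encoding a Pong-like game state as a small, fixed-size input vector
// with every value normalised into the 0...1 range. All names are assumptions.

struct PongState {
    var fieldWidth: Double        // e.g. 800
    var fieldHeight: Double       // e.g. 600
    var ballX: Double
    var ballY: Double
    var ballVelocityX: Double     // assumed to stay within -maxSpeed...maxSpeed
    var ballVelocityY: Double
    var paddleY: Double           // vertical position of the controlled paddle
    var maxSpeed: Double
}

/// Five normalised inputs: ball position, ball velocity, own paddle position.
func encode(_ s: PongState) -> [Double] {
    func clamp01(_ x: Double) -> Double { min(max(x, 0), 1) }
    return [
        clamp01(s.ballX / s.fieldWidth),
        clamp01(s.ballY / s.fieldHeight),
        clamp01((s.ballVelocityX / s.maxSpeed + 1) / 2),   // map -1...1 onto 0...1
        clamp01((s.ballVelocityY / s.maxSpeed + 1) / 2),
        clamp01(s.paddleY / s.fieldHeight),
    ]
}

// The outputs can be equally small, e.g. [moveUp, moveDown, stay], and the match
// result (points scored, time survived) gives the fitness for the evolutionary loop.
```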

Create Voice Frequency Graph when user records audio? [closed]

I'm building a voice recording app and I would like to show a voice frequency graph similar to the "Voice Memos" app on the iPhone.
I'm not sure exactly where to start building this. Could anyone give me some areas to look into and suggest how to structure it? I'll then go and learn those areas and build it!
Thank you
Great Example Project by Apple:
https://developer.apple.com/library/ios/samplecode/aurioTouch/Introduction/Intro.html
The top chart measures Intensity vs. Time. This is the most intuitive representation of a sound, because a louder voice shows up as a larger spike. Intensity is measured in Percentage of Full-Scale (%FS) units, where 100% corresponds to the loudest sound the device can record.
When a person speaks into a microphone, the microphone's voltage fluctuates up and down over time; that is what this graph represents.
The bottom chart is a Power Spectral Density. It shows where there is most power in the signal. For example, a deep loud voice would appear as a maximum at the lower end of the x-axis, corresponding to the low frequencies a deep voice contains. Power is measured in dB (a logarithmic unit) at different frequencies.
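For the top (intensity vs. time) chart specifically, AVAudioRecorder's metering API is enough, since it exposes smoothed level readings rather than raw samples. A minimal hedged sketch follows; the recorder settings, file location and 0.05 s polling interval are assumptions, and microphone permission is assumed to be granted already.

```swift
import AVFoundation

// Sketch: poll AVAudioRecorder's level meters to drive an intensity-vs-time chart.
// Settings, file location and polling interval are illustrative assumptions.

final class LevelMonitor {
    private var recorder: AVAudioRecorder?
    private var timer: Timer?

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.setActive(true)

        let url = FileManager.default.temporaryDirectory.appendingPathComponent("level.m4a")
        let settings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
        ]

        let recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.isMeteringEnabled = true
        recorder.record()
        self.recorder = recorder

        // Poll the meter ~20 times per second and feed the level into the chart view.
        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            guard let recorder = self?.recorder else { return }
            recorder.updateMeters()
            let db = recorder.averagePower(forChannel: 0)   // roughly -160 dB ... 0 dB
            let level = pow(10.0, Double(db) / 20.0)        // convert to a linear 0...1 value
            print(level)                                    // replace with a chart update
        }
    }

    func stop() {
        timer?.invalidate()
        recorder?.stop()
    }
}
```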
After a bit of Googling and testing, I think AVFoundation doesn't provide access to the raw audio data in real time; it's a high-level API primarily useful for recording to a file and playing it back.
The lower-level Audio Queue Services API seems to be the way to go (although I'm sure there are libraries out there that simplify its complex API).
Audio Queue Services Programming Guide:
https://developer.apple.com/library/mac/documentation/MusicAudio/Conceptual/AudioQueueProgrammingGuide/AboutAudioQueues/AboutAudioQueues.html#//apple_ref/doc/uid/TP40005343-CH5-SW18
DSP in Swift:
https://www.objc.io/issues/24-audio/functional-signal-processing/

Detecting taps on any surface with an iPhone [closed]

I am developing an app in which I need to put my iPhone on a surface and detect taps on that surface. Please provide any links or ideas regarding this.
Thanks.
See sample code out there for using the accelerometer to detect movement.
Then experiment with varying amounts of detected movement and determine what you consider a "tap".
As a suggestion, you could use the accelerometer in the phone to detect vibrations. I'm not sure if it will be accurate enough for you, and you will need to do some processing of the results to filter for the correct type of vibration.
This isn't simple, so if you were hoping for some code or library to do this I don't think you'll find what you need.
If you do manage to do it, you could open source your implementation and help the next person who comes along.
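A minimal hedged sketch of that accelerometer approach using Core Motion; the 100 Hz sample rate and the 0.15 g spike threshold are assumptions that would need tuning for the surface and the phone model.

```swift
import Foundation
import CoreMotion

// Sketch: flag a "tap" when the change in total acceleration between consecutive
// samples exceeds a threshold. Sample rate and threshold are assumptions to tune.

final class TapDetector {
    private let motion = CMMotionManager()
    private let queue = OperationQueue()
    private var lastMagnitude: Double?
    private let threshold = 0.15            // jump in g treated as a tap; tune experimentally

    func start(onTap: @escaping () -> Void) {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 100.0     // 100 Hz

        motion.startAccelerometerUpdates(to: queue) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)

            if let last = self.lastMagnitude, abs(magnitude - last) > self.threshold {
                DispatchQueue.main.async { onTap() }         // sudden spike: report a tap
            }
            self.lastMagnitude = magnitude
        }
    }

    func stop() {
        motion.stopAccelerometerUpdates()
    }
}
```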

Music pitch affecting a game [closed]

In Windows Media Player, you know that music visualization graph that changes based on frequency and pitch? I want to implement something like that in an iPhone game.
I'll try to explain this as well as I can. I will be playing classical music in the game, and I want to use the music's volume/pitch/whatever it is called to affect gameplay. For example, if the volume suddenly rises in the music (not the iPhone's volume, but the loudness of the music itself), it would increase the chance of a spawn or something.
I'm not asking for a guide on how to implement this; I want to know whether there is something that can give me numbers based on the pitch/volume/high and low notes of the song playing in the game.
Oh, and if anyone can tell me the name of the music graph I am describing, it would be greatly appreciated.
This sample shows how to do what you want to do. The visualizer in WMP uses the amplitude (volume) of the signal as well as frequency information (probably obtained with a Fast Fourier Transform) to construct the visualization effect.
You can also use the simpler AVAudioPlayer API if you're only interested in responding to the music's current volume level (and want to skip the frequency analysis part). The API includes metering methods you can poll periodically to read the current audio level.
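A minimal hedged sketch of that AVAudioPlayer route; the file name, the 0.1 s polling interval and the loudness threshold are assumptions to tune for your game.

```swift
import AVFoundation

// Sketch: poll AVAudioPlayer's level meters while the music plays and use the
// current level to drive gameplay. File name, interval and threshold are assumptions.

final class MusicLevelDriver {
    private var player: AVAudioPlayer?
    private var timer: Timer?

    func start() throws {
        guard let url = Bundle.main.url(forResource: "track", withExtension: "mp3") else { return }
        let player = try AVAudioPlayer(contentsOf: url)
        player.isMeteringEnabled = true
        player.play()
        self.player = player

        // Poll ten times per second and react when the music gets loud.
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let player = self?.player else { return }
            player.updateMeters()
            let db = player.averagePower(forChannel: 0)      // roughly -160 dB ... 0 dB
            let level = pow(10.0, Double(db) / 20.0)         // linear 0...1
            if level > 0.5 {                                 // "loud" threshold: tune to taste
                // e.g. increase the spawn chance for this frame
            }
        }
    }

    func stop() {
        timer?.invalidate()
        player?.stop()
    }
}
```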