How to detect music notes and chords programmatically (in iPhone SDK)?

How can I detect music notes and chords programmatically (in the iPhone SDK)?

You would need to do a Fourier transform (usually an FFT) of the incoming sound wave, find the dominant frequency, and then look up that frequency in a table of notes with their corresponding frequencies.
The FFT has been part of iOS since iOS 4; it lives in the Accelerate framework.
Look at this other SO thread for more info and sample code.
Detecting a chord works on the same principle, but it's much trickier, since you need to find all the notes that make up the chord.
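A minimal sketch of the single-note case using Accelerate's vDSP FFT, assuming `samples` is a power-of-two-length buffer of mono Float samples; the MIDI-note mapping at the end is my own addition, not from the linked thread:

```swift
import Foundation
import Accelerate

func dominantNote(samples: [Float], sampleRate: Float) -> (midiNote: Int, frequency: Float)? {
    let n = samples.count
    let log2n = vDSP_Length(log2(Float(n)))
    guard let fft = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return nil }
    defer { vDSP_destroy_fftsetup(fft) }

    var real = [Float](repeating: 0, count: n / 2)
    var imag = [Float](repeating: 0, count: n / 2)
    var mags = [Float](repeating: 0, count: n / 2)

    real.withUnsafeMutableBufferPointer { rp in
        imag.withUnsafeMutableBufferPointer { ip in
            var split = DSPSplitComplex(realp: rp.baseAddress!, imagp: ip.baseAddress!)
            // Pack the real signal into split-complex form, run a real FFT,
            // then take squared magnitudes per frequency bin.
            samples.withUnsafeBytes {
                vDSP_ctoz($0.bindMemory(to: DSPComplex.self).baseAddress!, 2,
                          &split, 1, vDSP_Length(n / 2))
            }
            vDSP_fft_zrip(fft, &split, 1, log2n, FFTDirection(FFT_FORWARD))
            vDSP_zvmags(&split, 1, &mags, 1, vDSP_Length(n / 2))
        }
    }

    // Strongest bin (skipping DC at index 0) gives the dominant frequency.
    guard let peak = mags.indices.dropFirst().max(by: { mags[$0] < mags[$1] }) else { return nil }
    let frequency = Float(peak) * sampleRate / Float(n)

    // Nearest MIDI note: A4 = 440 Hz = note 69, 12 semitones per octave.
    let midiNote = Int((69 + 12 * log2(frequency / 440)).rounded())
    return (midiNote, frequency)
}
```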

Related

Are there open source libraries to build audio visualizations for iOS?

I have an equalizer view with 10 bars in OpenGL ES which can light up and down. Now I'd like to drive this equalizer view from the music playing in the background on iOS.
Someone suggested that what I need is a Fast Fourier Transform to turn the audio data into something usable. But since there are so many audio visualizations floating around, my hope is that there is an open source library or sample project I could start with.
Maybe there are open source iOS projects that do audio visualization?
Yes.
You can try this Objective-C library, which I wrote for exactly this purpose. It gives you an interface for playing files from URLs and getting real-time FFT and waveform data, so you can feed your OpenGL bars (or whatever graphics you're using to visualize the sound). It also tries to deliver very accurate results.
If you want to do it in Swift, take a look at this example, which is cleaner and also shows how to actually draw the equalizer view.
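If you end up wiring FFT output to your bars yourself, the glue step is just collapsing the magnitude spectrum into band averages. A rough sketch; the log-spaced band edges and the 0...1 normalisation are my own illustrative choices, not taken from either library:

```swift
import Foundation

// Collapse an FFT magnitude spectrum into `barCount` bar levels.
func barHeights(magnitudes: [Float], barCount: Int = 10) -> [Float] {
    // Log-spaced band edges (1 ... magnitudes.count) so low
    // frequencies don't dominate every bar.
    let edges = (0...barCount).map { i in
        Int(pow(Float(magnitudes.count), Float(i) / Float(barCount)))
    }
    return (0..<barCount).map { band in
        let lo = edges[band]
        let hi = max(lo + 1, edges[band + 1])
        let slice = magnitudes[lo..<hi]
        let avg = slice.reduce(0, +) / Float(slice.count)
        // Squash onto a rough 0...1 logarithmic scale.
        return min(1, max(0, (log10(avg + 1e-9) + 4) / 4))
    }
}
```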

Acceleration threshold for shake versus typical hand motion?

What acceleration characteristics should be used to differentiate an intentional, single shake-to-do-something gesture from other typical random or unintentional device motions?
Shaking can be detected by the OS itself. There's no need to do this yourself.
More info in this StackOverflow question:
How to use Shake API in iPhone SDK 3.0?
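For reference, the built-in route looks like this in modern Swift; the OS does the accelerometer thresholding for you via UIResponder motion events:

```swift
import UIKit

class ShakeViewController: UIViewController {
    // The responder must be first responder to receive motion events.
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        if motion == .motionShake {
            print("Shake detected")   // do your shake-to-do-something here
        }
    }
}
```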

Measuring distance between two iOS Devices

Yeah, I'm currently wondering about this.
In my use case the devices will be 50cm to 10m apart and I'd like it to be accurate to at least 10 cm. (Therefore GPS is not an option)
Two ways spring to mind:
Sound: I asked about this in the dev forums, and I'm in contact with Laan Labs about the code of their Sonar Ruler.
Picture on one device + camera on the other: seems easier to set up, since my use case involves the user facing one device at 90 degrees anyway. But it would be more work for the user to point the camera in the right direction, and it would not react to a change in distance.
Now the question: is anyone aware of any code that does something like this already? Possibly a non-iPhone, general C project?
Method with camera: we already know the size of each device. You take a picture of the device, calculate its height/width to determine the type of device (iPhone/iPod or iPad), then calculate the distance.
For example, if the device is an iPhone, you know that its size is 115 x 58 mm. In the picture it is N x M pixels. Now you can calculate the distance (the smaller N and M are, the larger the distance).
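A sketch of the arithmetic behind this; `focalLengthPixels` is a hypothetical per-camera calibration constant you would measure once (e.g. by photographing a known object at a known distance), not an SDK value:

```swift
// Pinhole model: distance = f[px] * realSize / sizeInImage[px].
func estimateDistanceMM(realHeightMM: Double,
                        pixelHeight: Double,
                        focalLengthPixels: Double) -> Double {
    return focalLengthPixels * realHeightMM / pixelHeight
}

// Example: a 115 mm tall iPhone spanning 230 px with f ≈ 2800 px
// would be about 1400 mm (1.4 m) away.
let d = estimateDistanceMM(realHeightMM: 115, pixelHeight: 230, focalLengthPixels: 2800)
```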
If you were to use the sound method, one approach would be to have device A emit a sound; device B would be listening for this at all times and, on detection, echo back a secondary sound. This would give you a round-trip time from which you could calculate distance. Don't forget to compensate for the latency between detection and re-emission as well.
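A sketch of the round-trip arithmetic; the timing figures are made-up example values, and on real hardware the re-emission latency would have to be calibrated:

```swift
let speedOfSound = 343.0          // m/s in air at ~20 °C
let roundTripTime = 0.064         // s, measured by device A (example value)
let deviceBLatency = 0.050        // s between B's detection and re-emission (calibrated)

let oneWayTime = (roundTripTime - deviceBLatency) / 2
let distanceMeters = speedOfSound * oneWayTime   // 343 * 0.007 ≈ 2.4 m
```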
I am not sure about this, but it is what I found in one of the answers to this previous SO question: How to measure distance between two iPhone devices using Bluetooth?
Using Bluetooth for localization is a very well-known research field. The short answer is: you can't. Signal strength isn't a good indicator of the distance between two connected Bluetooth devices, because it is too subject to environmental conditions (is there a person between the devices? How is the owner holding his/her device? Is there a wall? Are there any RF-reflecting surfaces?). Using Bluetooth you can at best obtain a distance resolution of a few meters, and you can't calculate the direction, not even roughly.
You may obtain better results by using multiple Bluetooth devices and triangulating the various signal strengths, but even then it's hard to be more accurate than a few meters in your estimates.

How to reduce background noise while recording on iPhone?

I am planning to compare two audio files. I have recorded two voices and compared them using cross-correlation. Because of the background noise present while recording, the resulting correlation value is always near 0.5. If I use recorded waves from the internet, I am able to get the correct value. So how can I reduce the background noise while recording? Please guide me. Thanks.
Is there any possibility of reducing noise from the recorded .wav file?
There's not really a good way to remove background noise while recording other than simply turning down the recording mic. It seems that you're already exporting the data, so instead of attempting to eliminate the noise at the source, it'd probably be easier for you to filter it out afterwards; this is a quick tutorial on how to eliminate background noise using Audacity.
Try using a noise-canceling microphone and an anechoic chamber (or a really quiet studio).
You can also try filtering out all frequency bands that are not of interest (e.g. bands outside the human vocal spectral range required for recognition or for your comparison).
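A minimal sketch of such band-limiting in Swift, using two cascaded one-pole filters; the 80 Hz to 4 kHz voice band and the filter design are my own illustrative choices:

```swift
import Foundation

func bandPass(_ input: [Float], sampleRate: Float,
              lowCut: Float = 80, highCut: Float = 4000) -> [Float] {
    let dt = 1 / sampleRate
    // One-pole coefficients from the RC-filter analogy, RC = 1/(2*pi*fc).
    let aLow = dt / (dt + 1 / (2 * .pi * highCut))   // low-pass smoothing factor
    let rcHigh = 1 / (2 * .pi * lowCut)
    let aHigh = rcHigh / (rcHigh + dt)               // high-pass retention factor

    var out = [Float](repeating: 0, count: input.count)
    var lp: Float = 0           // low-pass state
    var hpPrevIn: Float = 0     // previous input to the high-pass stage
    var hp: Float = 0           // high-pass state
    for i in 0..<input.count {
        lp += aLow * (input[i] - lp)            // low-pass: attenuate hiss above highCut
        hp = aHigh * (hp + lp - hpPrevIn)       // high-pass: attenuate rumble below lowCut
        hpPrevIn = lp
        out[i] = hp
    }
    return out
}
```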
There is a special kind of software to remove background audio noise. Audacity for PC might be the most popular (it is open source, by the way). You could also do it entirely on the iPhone with the Denoise app.
Basically, recorded audio contains a lot of individual sounds, and some of them are harmful (e.g. buzzing lamps, street noises, mic issues, etc.). A very popular approach to suppressing harmful sounds is to calculate the frequency spectrum of the noise and suppress those "harmful" frequencies throughout the whole audio duration. In both Audacity and Denoise you do that by selecting a "noise only" fragment; all sounds in that fragment are considered noise and are suppressed in the whole file.
If you need to incorporate this feature into your app, you could have a look at the Audacity sources. Please let me know if you need more details.
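A high-level sketch of that noise-profile idea: each element of `frames` / `noiseFrames` is one frame's FFT magnitude spectrum; windowing, phase handling, and resynthesis are omitted, and none of this is taken from the actual Audacity source:

```swift
func spectralSubtract(frames: [[Float]], noiseFrames: [[Float]]) -> [[Float]] {
    guard let first = noiseFrames.first else { return frames }
    let bins = first.count

    // 1. Average the "noise only" frames into a noise profile.
    var profile = [Float](repeating: 0, count: bins)
    for frame in noiseFrames {
        for b in 0..<bins { profile[b] += frame[b] / Float(noiseFrames.count) }
    }

    // 2. Subtract the profile from every frame, clamping at zero.
    return frames.map { frame in
        (0..<bins).map { b in max(0, frame[b] - profile[b]) }
    }
}
```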

Motion detection using iPhone

I saw at least 6 apps in the App Store that take photos when they detect motion (i.e. a kind of spy stuff). Does anybody know the general way to do such a thing using the iPhone SDK?
I guess those apps take photos every X seconds and compare the current image with the previous one to determine if there is any difference (read: "motion"). Any better ideas?
Thank you!
You could probably also use the microphone to detect noise. That's actually how many security system motion detectors work - but they listen in on ultrasonic sound waves. The success of this greatly depends on the iPhone's mic sensitivity and what sort of API access you have to the signal. If the mic's not sensitive enough, listening for regular human-hearing-range noise might be good enough for your needs (although this isn't "true" motion-detection).
As for images, look into using some sort of string-edit-distance algorithm, but for images: something that takes a picture every X seconds and compares it to the previous image taken. If the images are too different (the edit distance is too big), then the alarm sounds. This accounts for slow changes in daylight and will probably work better than taking a single reference image at the beginning of the surveillance period and then comparing all other images to that.
If you combine these two methods (image and sound), it may get you what you need.
You could also have the phone detect light changes using the ambient light sensor at the top front of the phone; I just don't know how you would access that part of the phone.
I think you've about got it figured out: the phone probably keeps images where the delta between image B and image A is over some predefined threshold.
You'd have to find an image library written in Objective-C in order to do the analysis.
I have built this kind of application. I wrote a library for Delphi 10 years ago, but the analysis is the same.
The idea is to divide the whole frame into a matrix of cells, e.g. 25 x 25, and compute an average color for each cell. Then compare the R, G, B, H, S, V values of the average colors from one picture to the next and, if the difference is more than a set threshold, you have motion.
In my application I use a fragment shader to show motion in real time. Any questions, feel free to ask ;)
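A minimal sketch of that grid-averaging comparison, operating on raw RGBA pixel buffers; the grid size and threshold are illustrative values (using plain luminance rather than the full R, G, B, H, S, V comparison described above):

```swift
struct Frame {
    let pixels: [UInt8]   // RGBA, row-major
    let width: Int
    let height: Int
}

// Average per-cell brightness over a grid x grid matrix of cells.
func cellAverages(_ f: Frame, grid: Int = 25) -> [Float] {
    var sums = [Float](repeating: 0, count: grid * grid)
    var counts = [Float](repeating: 0, count: grid * grid)
    for y in 0..<f.height {
        for x in 0..<f.width {
            let cell = (y * grid / f.height) * grid + (x * grid / f.width)
            let i = (y * f.width + x) * 4
            // Simple luminance: mean of R, G, B.
            sums[cell] += (Float(f.pixels[i]) + Float(f.pixels[i + 1]) + Float(f.pixels[i + 2])) / 3
            counts[cell] += 1
        }
    }
    return zip(sums, counts).map { sum, count in sum / max(count, 1) }
}

// Motion if the mean per-cell brightness change exceeds the threshold (0...255 scale).
func motionDetected(previous: Frame, current: Frame, threshold: Float = 12) -> Bool {
    let a = cellAverages(previous), b = cellAverages(current)
    let meanDelta = zip(a, b).map { x, y in abs(x - y) }.reduce(0, +) / Float(a.count)
    return meanDelta > threshold
}
```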