Measuring a room with an iPhone

I need to measure a room (if possible) from within an iPhone application, and I'm looking for ideas on how to achieve this. Extreme accuracy is not important, but accuracy to within, say, one foot would be good. Some ideas I've had so far are:
Walk around the room and measure using GPS. Unlikely to be anywhere near accurate enough, particularly for iPod touch users
Emit sounds from the speaker and measure how long they take to return. There are some apps out there that do this already, such as PocketMeter. I suspect this would not be user friendly, and more gimmicky than practical.
Anyone have any other ideas?

You could stand in one corner and throw the phone against the far corner. The phone could begin measurement at a certain point of acceleration and end measurement at deceleration

1) Set the iPhone down on the floor at one wall, with its base against the wall.
2) Mark a line where the top of the iPhone ends.
3) Pick the iPhone up and move its base to the line you just drew.
4) Repeat steps 1-3 until you reach the other wall.
5) Multiply the number of lines it took to reach the other wall by the length of the iPhone to get the final measurement.
=)

I remember seeing programs for realtors that involved holding a reference object up in a picture. The program would identify the reference object and other flat surfaces in the image and calculate dimensions from that. It was intended for measuring the exterior of houses. It could follow connected walls that it could assume were at right angles.
Instead of shipping with a reference object, as those programs did, you might be able to use a few common household objects like a piece of printer paper. Let the user pick from a list of common objects what flat item they are holding up to the wall.
Detecting the edges of walls, and of the reference object, is some tricky pattern recognition, followed by some tricky math to convert the found edges to planes. Still better than throwing your phone at the far wall, though.

Emit sounds from the speaker and measure how long they take to return. There are some apps out there that do this already, such as PocketMeter. I suspect this would not be user friendly, and more gimmicky than practical.
Au contraire, mon frère.
This is the most user friendly, not to mention accurate, way of measuring the dimensions of a room.
PocketMeter measures the distance to one wall with an accuracy of half an inch.
If you use the same formulas to measure distance, but have the person stand near a corner of the room (so that the distances to the walls, floor, and ceiling are all different), you should be able to calculate all three measurements (length, width, and height) with one sonar pulse.
Edited, because of the comment, to add:
In an ideal world, you would get 6 pulses, one from each of the surfaces. However, we don't live in an ideal world. Here are some things you'll have to take into account:
The sound pulse causes the iPhone to vibrate. The iPhone microphone picks up this vibration.
The type of floor (carpet, wood, tile) will affect the time that the sound travels to the floor and back to the device.
The sound reflects off of more than one surface (wall) and returns to the iPhone.
If I had to guess, because I've done something similar in the past, you're going to have to emit a multi-frequency tone, made up of a low frequency, a medium frequency, and a high frequency. You'll have to perform a fast Fourier Transform on the sound wave you receive to pick out the frequencies that you transmitted.
Now, I don't want to discourage you. The calculations can be done. However, it's going to take some work. After all, PocketMeter has been at it for a while, and they only measure the distance to one wall.
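As a rough illustration of the sonar arithmetic, here is a minimal sketch (hypothetical names; this is not PocketMeter's actual method): once you have detected an echo, the one-way distance is half the round-trip delay times the speed of sound.

```swift
import Foundation

// Speed of sound varies with temperature; ~343 m/s at 20 °C is assumed here.
let speedOfSound = 343.0  // metres per second

/// Distance to a surface, given the round-trip time of a detected echo.
func distance(fromEchoDelay delay: TimeInterval) -> Double {
    // The pulse travels to the wall and back, so halve the total path.
    return speedOfSound * delay / 2.0
}

// Example: an echo arriving 23.3 ms after the pulse implies a wall ~4 m away.
let wallDistance = distance(fromEchoDelay: 0.0233)
```

Reliably detecting the echoes in the first place (for example with the multi-frequency tone and FFT approach described above) is the hard part.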

I think an easier way to do this would be to use the Pythagorean theorem. Most rooms are 8 or 10 feet tall, and if the user can guess the height accurately, you can use the camera to do some analysis and crunch the numbers. (You might need some clever way to detect the angle; see the sketch below.)
How to do it
I expect 5 points off of your bottom line for this ;)
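A minimal sketch of the trigonometry this relies on (the names and the exact procedure are my assumptions): if the user supplies the ceiling height and the phone can report the elevation angle from the camera up to the line where the far wall meets the ceiling, the horizontal distance to that wall falls out of a right triangle.

```swift
import Foundation

// Hypothetical sketch: the user enters the ceiling height, and the phone's
// attitude gives the elevation angle to where the far wall meets the ceiling.
// Detecting that angle reliably is the hard, unsolved part of this answer.
func distanceToWall(ceilingHeight: Double,   // e.g. 8 ft
                    cameraHeight: Double,    // height the phone is held at
                    elevationAngle: Double   // radians above horizontal
) -> Double {
    let rise = ceilingHeight - cameraHeight  // vertical leg of the triangle
    return rise / tan(elevationAngle)        // horizontal leg = distance to wall
}

// Example: 8 ft ceiling, phone held at 5 ft, 20° elevation → wall ≈ 8.2 ft away.
let d = distanceToWall(ceilingHeight: 8, cameraHeight: 5, elevationAngle: 20 * .pi / 180)
```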

Let me see if this helps. Take an object of known length, place it against the wall, and take a picture of the wall with the iPhone, including the object you placed against it. Now get the ratio of the wall width to the object width from the image on the iPhone. Since you know the width of the object, you can easily calculate the width of the wall. Repeat this for each wall and you will have the room's measurements.
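A minimal sketch of that ratio calculation, with hypothetical names (measuring the two pixel widths from the photo is left to whatever image analysis you use):

```swift
/// Width of a wall, given the known width of a reference object placed against
/// it and the widths of both as measured in pixels in the photo.
/// Assumes the object lies flat against the wall, facing the camera.
func wallWidth(objectWidth: Double,    // real-world units, e.g. metres
               objectPixels: Double,   // object's width in the image
               wallPixels: Double      // wall's width in the image
) -> Double {
    return objectWidth * (wallPixels / objectPixels)
}

// Example: a 0.9 m door spans 300 px and the wall spans 1400 px → wall ≈ 4.2 m.
let width = wallWidth(objectWidth: 0.9, objectPixels: 300, wallPixels: 1400)
```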

Your users could measure a known distance by pacing it off, and thereby calibrate the length of their pace. Then they could enter the distance of each wall in paces, and the phone would convert it to feet. This would probably be very convenient, and would probably be accurate to within 10%.
If they need more accurate readings, give them the option of entering a measurement from a tape measure.
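A minimal sketch of the calibration arithmetic (hypothetical names):

```swift
/// Calibrate the user's pace against a known distance, then convert
/// paced-off wall lengths to feet.
struct PaceCalibration {
    let feetPerPace: Double

    init(knownDistanceFeet: Double, pacesCounted: Int) {
        feetPerPace = knownDistanceFeet / Double(pacesCounted)
    }

    func feet(forPaces paces: Int) -> Double {
        Double(paces) * feetPerPace
    }
}

// Example: a 30 ft hallway took 12 paces → 2.5 ft per pace; a 6-pace wall ≈ 15 ft.
let calibration = PaceCalibration(knownDistanceFeet: 30, pacesCounted: 12)
let wallLength = calibration.feet(forPaces: 6)
```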

This answer is somewhat similar to Jitendra's answer, but the method he suggests will only work where you can fit the whole wall in a single shot.
Get an object of known size and photograph it held against one wall, with the iPhone held against the opposite wall (two people, or some Blu-Tack, needed). You can then calculate the distance between the walls from the size of the object (in pixels) in the photo. You could use a PDF to make a printed document the object of known size, and use a 2D barcode so the iPhone can pick it out automatically.
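A minimal sketch of the underlying pinhole-camera relationship (hypothetical names; the focal length in pixels is a calibration constant you would have to determine once per device, for example by photographing the printed sheet at a known distance):

```swift
/// Distance from the camera to a flat object of known width, given how wide it
/// appears in the photo. Pinhole model: pixelWidth = focalLengthPixels * realWidth / distance.
func distanceToObject(realWidth: Double,          // e.g. 0.21 m for an A4 sheet
                      pixelWidth: Double,         // measured width in the photo
                      focalLengthPixels: Double   // per-device calibration value
) -> Double {
    return realWidth * focalLengthPixels / pixelWidth
}

// Example: an A4 sheet (0.21 m) spanning 180 px with f ≈ 3000 px → walls ≈ 3.5 m apart.
let roomDepth = distanceToObject(realWidth: 0.21, pixelWidth: 180, focalLengthPixels: 3000)
```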

Related

Area measurement of fluorescent powder from finger contacts - batch processing

I have 120 photos like the one below showing the amount of fluorescent powder deposited onto a surface when it is touched by fingers. The photo is taken under UV light. You can see 5 finger prints and the reflection from the light source.
I'd like to know if there is an automated way of estimating the area of the fluorescent fingerprints in batch mode. We have been using ImageJ to manually select a particular print and estimate the area. Is it possible to automatically recognise the fingerprints in ImageJ and measure all 5 prints on each of the 120 photos?
Note: clearly the print on the right is quite well defined, but the one on the left is quite diffuse.
First, the data is useless without a scale, and the photos will be hard to process without a fixed set-up. I'd spend time making a photo set-up that minimizes glare and doesn't change scale, then try approaching the problem by using the Threshold tool to find the prints, making selections from the resulting mask, and then measuring the area. I'd then create a macro to batch process them.
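For illustration only, here is a sketch of the threshold-then-measure step (written as plain Swift rather than an ImageJ macro; the threshold and the scale are assumptions you would calibrate from your fixed set-up):

```swift
/// Rough area of bright (fluorescent) regions in an 8-bit grayscale image:
/// count pixels above a threshold and apply a known physical scale.
func fluorescentArea(pixels: [UInt8],
                     threshold: UInt8,            // calibrate to your lighting
                     squareMMPerPixel: Double     // from an object of known size
) -> Double {
    let brightPixels = pixels.filter { $0 >= threshold }.count
    return Double(brightPixels) * squareMMPerPixel
}
```

Note this lumps every bright pixel together, including the light-source reflection, so you would still need to segment the five prints (which is what ImageJ's selections or particle analysis give you) before measuring each one.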

Calculate the distance between the iPhone and an object, knowing their physical widths (of the iPhone as well)

If you check this thread that I started (Calculate the distance between the iPhone and a door, knowing their physical widths), I accepted an answer which states, correctly, that if you do not know the focal length data of the iPhone's camera, there is no easy way to calculate the distance between an iPhone and, let's say, a door, knowing only its width.
I have to start another thread now asking:
I know the physical size (not only in pixels) of the iPhone's screen (for the iPhone 5 it is 2.31 inches).
Also I know the width of a door.
Now, if I am in a position where the width of the door fits perfectly within the width of the iPhone itself (not of the camera), and the iPhone is held perfectly vertical, is it possible to know the distance between the iPhone and the door at that moment?
Thank you for your kind help!
I assume you mean that there is some outside image capturing device (be it a human eye or another camera) and the image capturing device, the phone, and the door are all in a line such that the phone appears to be the same width as the door.
In this case, you still need a) the distance between the image capturing device and the phone and b) the optical information (such as focal length) of the image capturing device. Just sit down with a pen and paper and draw out the geometry for a little bit and you'll see that.
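To make the geometry concrete (a sketch of the similar-triangles argument, with symbols introduced here rather than taken from the answer): let the capturing device be the apex, let $w_p$ and $d_p$ be the phone's width and its distance from that device, and let $w_d$ and $d_d$ be the door's width and distance. The phone exactly covering the door means both subtend the same angle, so

```latex
\frac{w_p}{d_p} = \frac{w_d}{d_d}
\quad\Rightarrow\quad
d_d = d_p \frac{w_d}{w_p},
\qquad
\text{phone-to-door distance} = d_d - d_p = d_p\left(\frac{w_d}{w_p} - 1\right).
```

Everything scales with $d_p$, which is exactly why you still need the distance from the capturing device (eye or camera) to the phone.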
This is going to involve a trigonometric calculation. I think you may already have done some R&D on the gyroscope; if not, you should certainly read up on it.
1) Find the angle your phone makes with the ground, e.g. when you point the device's camera at the bottom of the object.
2) You now have one angle, and you are standing at 90 degrees to the ground, so you are basically forming a right-angled triangle, and you have just found the angle at your hand.
3) You can approximate the distance from the ground up to the hand holding your phone. So you have one side of the triangle and one angle, and can therefore find the second side, i.e. the distance between you and the object.
Hope this helps. :)
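A minimal sketch of that triangle, with hypothetical names (in practice the tilt angle would come from Core Motion's attitude, and the hand height is a rough guess entered by the user):

```swift
import Foundation

/// Horizontal distance to the point the camera is aimed at, assuming the phone
/// is held at a known height and tilted down toward the base of the object.
/// tiltFromVertical = 0 means the camera points straight down at the floor.
func horizontalDistance(handHeight: Double,        // e.g. 1.4 m
                        tiltFromVertical: Double   // radians
) -> Double {
    return handHeight * tan(tiltFromVertical)
}

// Example: phone held at 1.4 m, tilted 60° from straight down → object ≈ 2.4 m away.
let distanceToObject = horizontalDistance(handHeight: 1.4, tiltFromVertical: 60 * .pi / 180)
```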

Smallest touch point that can be accurately detected on a smartphone screen?

What is the smallest touch point that can be accurately detected on a typical Android or iOS smartphone screen? Or (to reframe the question a different way), what is the narrowest tip that a smartphone stylus could have?
UPDATE
So I've done some Googling of touchscreen suppliers, and the only possibly relevant spec I could find was touchpoint density, expressed as touchpoints per unit area. However, the numbers are absurdly high, on the order of 100k per sq. inch. This would seem to imply that smartphone screens can detect touches as small as 10^-5 sq. inches, i.e. touches thinner than a hair. Or my understanding of the touchpoint-density unit is flawed.
http://kingtouch.en.made-in-china.com/product/DqOQNaVjferi/China-22-Surface-Capacitive-Touch-Screen-KTT-CA22K-.html
It varies from smartphone to smartphone, obviously. There are many different kinds of touchscreen (capacitive, resistive, etc.), and each of them has its own general precision range. Here's a good PDF for you: (though I don't know who it's by).
The main problem is that this question has too general of an answer. There is no set value, and there is no scale for touchscreen precision (that I'm aware of).
Assuming that the touch sensor is 1-for-1 with the display (I have no idea if it is), just ask the API: http://developer.android.com/reference/android/util/DisplayMetrics.html

Measuring distance with iPhone camera

How can I implement a way to measure distances in real time (with the video camera?) on the iPhone, like this app that uses a card of known size to work out the actual distance?
Are there any other ways to measure distances? Or how to go about doing this using the card method? What framework should I use?
Well, you do have something for reference, hence the use of the card. Having said that, after watching the video for the app, I can't say it seems too user friendly.
So you either need a reference object of some known size, or you need to deduce the size from the image. One idea I just had that might help you do it is to use the iPhone 4's flash (I'm sure it's very complicated, but it might just work for some stuff).
Here's what I think.
When the user wants to measure something, he takes a picture of it, but you're actually taking two separate images, one with the flash on and one with the flash off. Then you can analyze the lighting differences and the flash reflection between the images to determine the scale of the image. This will only work for close and not-too-shiny objects, I guess.
But that's about the only other way I can think of to deduce scale from an image without any fixed objects.
I like Ron Srebro's idea and have thought about something similar -- please share if you get it to work!
An alternative approach would be to use the auto-focus feature of the camera. Point-and-shoot cameras often have a laser range finder that they use to auto-focus. The iPhone doesn't have this, and the f-stop is fixed. However, users can change the focus by tapping the camera screen. The phone can also switch between regular and macro focus.
If the API exposes the current focus settings, maybe there's a way to use this to determine range?
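On more recent iOS versions (this answer predates them) AVFoundation does expose the current focus setting as AVCaptureDevice.lensPosition, a unitless value from 0.0 (nearest) to 1.0 (furthest) that is not calibrated to physical distance, so any mapping to range would have to be determined empirically per device. A minimal observation sketch, assuming a capture session is already configured:

```swift
import AVFoundation

// Hypothetical sketch: watch the lens position while the user taps to focus.
// lensPosition is 0.0...1.0 and device-dependent; converting it to metres
// would require your own calibration experiments for each phone model.
final class FocusWatcher {
    private var observation: NSKeyValueObservation?

    func start(on device: AVCaptureDevice) {
        observation = device.observe(\.lensPosition, options: [.new]) { _, change in
            if let position = change.newValue {
                print("Lens position: \(position)")  // map to distance empirically
            }
        }
    }
}

// Usage (with a running AVCaptureSession elsewhere):
// if let camera = AVCaptureDevice.default(for: .video) {
//     let watcher = FocusWatcher()
//     watcher.start(on: camera)
// }
```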
Another solution may be to use two laser pointers.
Basically you would shine two laser pointers at, say, a wall in parallel. Then, the further back you go, the beams will look closer and closer together in the video, but they will still remain the same distance apart. Then you can easily come up with some formula to measure the distance based on how far apart the dots are in the photo.
See this thread for more details: Possible to measure distance with an iPhone and laser pointer?.
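A minimal sketch of that formula (hypothetical names; the focal length in pixels is a per-device calibration constant, and the dots' pixel separation has to be found by image analysis):

```swift
/// Distance to a wall from two parallel laser dots a known real distance apart,
/// given how far apart they appear (in pixels) in the camera image.
func distanceFromLaserDots(realSeparation: Double,    // e.g. 0.10 m between the pointers
                           pixelSeparation: Double,   // measured in the photo
                           focalLengthPixels: Double  // per-device calibration value
) -> Double {
    return realSeparation * focalLengthPixels / pixelSeparation
}

// Example: dots 0.10 m apart appearing 75 px apart with f ≈ 3000 px → wall ≈ 4 m away.
let range = distanceFromLaserDots(realSeparation: 0.10, pixelSeparation: 75, focalLengthPixels: 3000)
```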

Manipulating Gyroscope / Accelerometer Values obtained from iPhone 4

I'm developing a project for my university to manipulate gyroscope/accelerometer values obtained from the iPhone 4, but I'm stuck on a mathematical issue and I hope you guys can help me out.
I'll give you an example of what's about:
Your iPhone is face up and you move it UP, on the Y axis.
Your iPhone is face right and you move it UP, on the X axis this time (since you rotated the iPhone 90 degrees).
In the second case, the computer interprets that I've moved the iPhone to the RIGHT, but that's wrong. I've moved it up, but my axes were rotated since the iPhone was face right.
What do I need?
I need a way to VIRTUALLY position the iPhone back face up (where the 3 axes are correct) and give each axis its correct movement value.
If the iPhone is turned 90 degrees, then I can easily swap the X and Z axes and it's correct. But I want to make it work for any angle of rotation.
I would be really thankful if anyone could help me with some sort of pseudo-algorithm or mathematical description of what to do.
NOTE: I only need a way to compensate all three axes according to the iPhone's rotation.
Update:
I don't actually need the precise values, since I'm making a graph comparison between all the records I get from the gyroscope. I'll make it clearer.
-> You draw a LETTER just by moving the iPhone in the air, and my application will recognize the letter you just drew. The method I use for recognition is based on the TFT algorithm and a database of sample values recorded from letters I've previously drawn.
My point is: the values I get, or what they represent, don't really matter. All I need is for the graphs to be the same even if the iPhone is in a different orientation. It's quite hard to explain, but if you draw the letter 'P' with the iPhone turned UP, the resulting graph will be different from the one you get drawing the 'P' with the iPhone turned RIGHT.
So I need to compensate the axes back to their original orientation; that way I'll always get similar graphs.
This post was before iOS5 was released. FYI to anyone coming here, DeviceMotion relative to world - multiplyByInverseOfAttitude shows how to transform device-relative acceleration values to earth-relative for iOS5 and above.
So, what you want is to convert from the iPhone's coordinate system (or object/tangent space) to the world coordinate system (or vice versa; whichever way you look at it, it doesn't matter). You know the iPhone's coordinate system because you have the gyroscope data. So what you want is to create the object-to-world transformation matrix and multiply each of the acceleration vectors (from the accelerometer) by it.
Take a look here for a good explanation of tangent space and how to create a tangent-space-to-world transformation matrix. If you aren't familiar with 3D/linear math it might be a bit tricky, but it's worth the trouble.
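A minimal sketch of that transform using Core Motion (iOS 5+), along the lines of the comment above; whether you multiply by the attitude's rotation matrix or by its transpose depends on which frame you want the result in, so treat the convention here as an assumption to verify with a quick test (for example, gravity should end up on the world Z axis):

```swift
import CoreMotion

/// Rotate the device-frame user acceleration into the reference (world) frame
/// using the attitude reported by CMDeviceMotion.
func worldAcceleration(from motion: CMDeviceMotion) -> CMAcceleration {
    let a = motion.userAcceleration          // device frame, gravity already removed
    let R = motion.attitude.rotationMatrix   // device attitude vs. the reference frame
    // Assumed convention: world = R * device. Swap rows and columns
    // (use R.m11, R.m21, R.m31, ...) if your test shows the axes inverted.
    return CMAcceleration(
        x: R.m11 * a.x + R.m12 * a.y + R.m13 * a.z,
        y: R.m21 * a.x + R.m22 * a.y + R.m23 * a.z,
        z: R.m31 * a.x + R.m32 * a.y + R.m33 * a.z
    )
}

// Usage: start device motion updates with a reference frame (e.g. .xArbitraryZVertical),
// then call worldAcceleration(from:) on each sample before building your graphs.
```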