Distance to "point of interest" in iPhone auto focus

On the iPhone, I want to calculate the distance from the camera to the subject.
I was wondering whether the iPhone camera's "active autofocus" could provide me with the distance to the subject (the point of interest)?
Many thanks in advance.
P.S. If you think it is not possible, please let me know ;)

As far as I'm aware, it's not possible - see here for more info on what you can and can't do. You can tell when the camera is autofocusing, but nothing more.
I'm sure there is probably more focus/distance info you could get through a private API call - but unless you don't care about getting on the App Store, that's not really an option.
That doesn't mean there isn't another way, though. There looks to be a good discussion on the topic here: Distance using OpenCV (you can compile OpenCV for iOS!)

I've been investigating the camera's ability to measure short distances, and not having found anything yet, I came up with this shot in the dark. I haven't tried it yet, but the docs make me think it might work:
Create and configure an AVCaptureDevice representing the camera
Through the AVCaptureSession, capture a still image with AVCaptureStillImageOutput
From the AVCaptureStillImageOutput object, check the Exif properties for kCGImagePropertyExifSubjectDistance
Something like the sketch below. I'd love to hear if anyone has been able to use a methodology like this to make accurate (less than 1 foot) distance measurements.
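A minimal, untested Swift sketch of that experiment. It uses AVCapturePhotoOutput, the modern successor to AVCaptureStillImageOutput; whether the camera ever populates the subject-distance tag is exactly the open question:

```swift
import AVFoundation
import ImageIO

// Untested sketch: capture one still and inspect its EXIF metadata for
// kCGImagePropertyExifSubjectDistance.
final class SubjectDistanceProbe: NSObject, AVCapturePhotoCaptureDelegate {
    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    func captureOnce() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.startRunning()
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Dig the EXIF dictionary out of the capture's metadata.
        if let exif = photo.metadata[kCGImagePropertyExifDictionary as String] as? [String: Any],
           let distance = exif[kCGImagePropertyExifSubjectDistance as String] as? Double {
            print("EXIF subject distance: \(distance) m")
        } else {
            print("No subject-distance tag in this capture")
        }
    }
}
```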

Related

Re-creating iPad fling momentum in AIR (AS3)

I'm creating an AIR app, but I realized that it doesn't seem to natively support the "fling" momentum. I thought I'd ask whether anyone out there has created an object or plugin that would put this back in. Currently, on the objects I want to fling, I'm recreating the momentum myself, but it's not perfect yet. Could anyone put me on the right path here?
Thanks!
I ended up creating my own fling, and it works great.
I keep the last 3 touch-move locations and times in memory; then on touch-end, I find the difference between the most recent and oldest touch in memory to get speed and direction. Then I use Actuate to ease it a certain distance past that if necessary (roughly the logic sketched below).
It took a lot of tweaking to make it feel natural, though.
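The answer above is about ActionScript/AIR, but the velocity math is language-agnostic; here is an illustrative sketch in Swift. The three-sample window matches the description, while the decay factor, frame rate, and cutoff are assumptions to tune:

```swift
import Foundation

// Illustrative sketch of the fling math described above (Swift, not AS3).
struct TouchSample {
    let position: Double      // 1-D position along the scroll axis, in pixels
    let time: TimeInterval
}

final class FlingTracker {
    private var samples: [TouchSample] = []

    // Call on every touch-move: remember only the last three samples.
    func record(position: Double, time: TimeInterval) {
        samples.append(TouchSample(position: position, time: time))
        if samples.count > 3 { samples.removeFirst() }
    }

    // Call on touch-end: speed and direction come from the oldest vs.
    // newest retained sample (pixels per second; the sign is the direction).
    func releaseVelocity() -> Double {
        guard let first = samples.first, let last = samples.last,
              last.time > first.time else { return 0 }
        return (last.position - first.position) / (last.time - first.time)
    }

    // How far the fling should coast, applying an exponential decay per
    // frame until the motion is negligible.
    func coastDistance(decayPerFrame: Double = 0.95,
                       framesPerSecond: Double = 60) -> Double {
        var velocity = releaseVelocity() / framesPerSecond  // px per frame
        var distance = 0.0
        while abs(velocity) > 0.1 {
            distance += velocity
            velocity *= decayPerFrame
        }
        return distance
    }
}
```

The per-frame geometric decay is what gives the coast its natural-feeling slowdown; an easing library (Actuate, Greensock) then animates the object over the computed distance.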
Greensock has a great plugin that works just like the iOS native flick and can be customized to work in an array of ways. Performance in AIR is superb even on older iPads. For best results use BitMask.

iPhone, Open custom camera, take picture, save in UITableView with GPS info

OK, I need to open the camera, take a picture, and save it in a UITableView; I also need to save GPS info such as the date, the place, and how many km you are from the place where you took the image.
The image needs to be saved in the UITableView together with the GPS info, and every time you take a new picture the process starts again, saving them one after another in the same UITableView.
Many thanks.
First, welcome to stackoverflow.
Second, this isn't really a question, so I am going to "answer" this the best way possible. It seems like you are trying to just learn how to make an iPhone application, and this is your first stab at it.
I HIGHLY recommend checking out the stackoverflow FAQ first.
Next, check out Apple's Sample Apps and developer library for some help in getting started:
Sample Code
Getting Started
Photo Picker
Photos by Location
Lastly, as you are working through your code, if you stumble onto a problem, please return and ask a more specific question.
Thanks,
Dan
The most important GPS info for photos is:
- time
- latitude
- longitude
Slightly more advanced GPS info, still important for photos:
- heading (v1, or course): the direction in which you are moving or driving; unfortunately, when taking a photo you have usually been standing still for some time, so a valid heading/course is not always available.
- heading (v2): the direction the camera is looking while taking the photo; if GPS is not suitable, you can also use the compass orientation.
Those 3 or 4 attributes are usually stored in JPEGs using EXIF headers, as sketched below. If for some reason you cannot, or don't want to, store that info inside the picture, then you have to store those values in a separate file. A simple CSV file is sufficient.
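For illustration, a hedged Swift sketch of writing those attributes into a JPEG's EXIF/GPS headers with ImageIO; the CGImage, CLLocation fix, and output URL are assumed to come from your own camera and Core Location code:

```swift
import ImageIO
import CoreLocation
import UniformTypeIdentifiers

// Sketch: write time, latitude, longitude, and (optionally) heading into
// a JPEG's EXIF/GPS headers.
func saveJPEG(_ image: CGImage, to url: URL,
              location: CLLocation, heading: CLLocationDirection? = nil) {
    // EXIF stores the GPS timestamp as separate UTC date and time strings.
    let utc = TimeZone(identifier: "UTC")
    let dateFmt = DateFormatter()
    dateFmt.timeZone = utc; dateFmt.dateFormat = "yyyy:MM:dd"
    let timeFmt = DateFormatter()
    timeFmt.timeZone = utc; timeFmt.dateFormat = "HH:mm:ss"

    var gps: [CFString: Any] = [
        kCGImagePropertyGPSLatitude: abs(location.coordinate.latitude),
        kCGImagePropertyGPSLatitudeRef: location.coordinate.latitude >= 0 ? "N" : "S",
        kCGImagePropertyGPSLongitude: abs(location.coordinate.longitude),
        kCGImagePropertyGPSLongitudeRef: location.coordinate.longitude >= 0 ? "E" : "W",
        kCGImagePropertyGPSDateStamp: dateFmt.string(from: location.timestamp),
        kCGImagePropertyGPSTimeStamp: timeFmt.string(from: location.timestamp)
    ]
    if let heading = heading {
        gps[kCGImagePropertyGPSImgDirection] = heading  // compass direction of the shot
    }

    guard let dest = CGImageDestinationCreateWithURL(
        url as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else { return }
    CGImageDestinationAddImage(dest, image,
                               [kCGImagePropertyGPSDictionary: gps] as CFDictionary)
    CGImageDestinationFinalize(dest)
}
```

The "how many km away" figure from the question would then come from CLLocation's distance(from:), which returns meters - divide by 1,000.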

iOS 4.2, looking for a way to manipulate the iPhone 4 camera's focus distance

I am working on an AR project, and we want to manipulate the focus distance of the iPhone 4 camera. Is this even possible? So far, the only options we've found are toggling and autofocusing, listed here: http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html#//apple_ref/occ/instm/AVCaptureDevice/isAdjustingFocus
Thanks in advance for any tips! :)
Regarding the API, it seems that the only supported actions are:
- check whether AF is supported on the device (iPhone 3GS and 4 only, I think)
- enable/disable AF
- set the point of interest, which is NOT a distance but only a point in the camera view
Certainly not what you want to do.
It might be supported in a private API... but that would not pass the validation process.
A workaround might be to watch how much pixels move as the user shifts slightly, to get a sense of how distant different parts of the image are, and then set the AF point to a region of the image that is either closer or further, based on that (setting the point of interest itself looks like the sketch below).
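For reference, a minimal Swift sketch of the one focus control the public API does expose - a normalized point of interest, not a lens distance (coordinates run from (0,0) at the top-left to (1,1) at the bottom-right):

```swift
import AVFoundation
import CoreGraphics

// Sketch: set the autofocus point of interest - the only focus control
// the public API offers; there is no way to request a distance.
func focus(_ device: AVCaptureDevice, at point: CGPoint) {
    guard device.isFocusPointOfInterestSupported,
          device.isFocusModeSupported(.autoFocus) else { return }
    do {
        try device.lockForConfiguration()
        device.focusPointOfInterest = point  // e.g. CGPoint(x: 0.5, y: 0.5)
        device.focusMode = .autoFocus        // trigger a refocus at that point
        device.unlockForConfiguration()
    } catch {
        print("Could not lock camera for configuration: \(error)")
    }
}
```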
But also file a Radar ticket requesting the ability to specify the focus distance - if enough people ask, Apple may add it to the API.

Motion detection using iPhone

I've seen at least 6 apps in the App Store that take photos when they detect motion (i.e. a kind of spy tool). Does anybody know the general way to do such a thing using the iPhone SDK?
I guess those apps take photos every X seconds and compare the current image with the previous one to determine whether there is any difference (read: "motion"). Any better ideas?
Thank you!
You could probably also use the microphone to detect noise. That's actually how many security system motion detectors work - but they listen in on ultrasonic sound waves. The success of this greatly depends on the iPhone's mic sensitivity and what sort of API access you have to the signal. If the mic's not sensitive enough, listening for regular human-hearing-range noise might be good enough for your needs (although this isn't "true" motion-detection).
As for images - look into using some sort of string-edit-distance algorithm, but for images: something that takes a picture every X amount of time and compares it to the previous image taken (see the capture-loop sketch below). If the images are too different (the edit distance is too big), the alarm sounds. This accounts for slow changes in daylight, and will probably work better than taking a single reference image at the beginning of the surveillance period and comparing all later images to that.
If you combine these two methods (image and sound), it may get you what you need.
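A rough Swift skeleton of that periodic capture-and-compare loop. Both helpers are hypothetical placeholders: captureFrame() would pull frames from AVFoundation, and imageDifference(_:_:) is whatever distance metric you choose (e.g. the grid-average comparison sketched in the last answer below):

```swift
import Foundation
import CoreGraphics

// Skeleton of the capture-and-compare loop. The two private helpers are
// placeholders, not a real API.
final class MotionWatcher {
    private var previous: CGImage?
    private var timer: Timer?

    func start(interval: TimeInterval = 2.0,
               threshold: Double = 0.1,
               onMotion: @escaping () -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            guard let self = self, let frame = self.captureFrame() else { return }
            if let prev = self.previous,
               self.imageDifference(prev, frame) > threshold {
                onMotion()  // e.g. save the photo, sound the alarm
            }
            self.previous = frame  // compare to the latest frame, not a fixed reference
        }
    }

    func stop() { timer?.invalidate() }

    // Placeholder: grab the current camera frame (AVCaptureVideoDataOutput).
    private func captureFrame() -> CGImage? { nil }

    // Placeholder: any image-distance metric, normalized to 0...1 here.
    private func imageDifference(_ a: CGImage, _ b: CGImage) -> Double { 0 }
}
```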
You could have the phone detect light changes using the ambient-light sensor at the top front of the phone. I just don't know how you would access that part of the phone.
I think you've just about got it figured out - the phone probably keeps images where the delta between image B and image A is over some predefined threshold.
You'd have to find an image library written in Objective-C in order to do the analysis.
I have built this kind of application. I wrote a library for Delphi 10 years ago, but the analysis is the same.
The point is to divide the whole screen into a matrix, e.g. 25x25, and then compute an average color for each cell. After that, compare the R, G, B, H, S, V values of the average color from one picture to the next, and if the difference is more than a set threshold, you have motion (see the sketch below).
In my application I use a fragment shader to show motion in real time. Any questions, feel free to ask ;)
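Not the author's Delphi code, but an illustrative Swift sketch of that grid-average comparison. Downscaling the frame to a 25x25 bitmap makes each resulting pixel the average color of one grid cell; this version compares only R, G, B, and the grid size and thresholds are assumptions to tune:

```swift
import CoreGraphics

// Each pixel of the downscaled bitmap is the average color of one cell.
let gridSize = 25

func cellAverages(of image: CGImage) -> [UInt8]? {
    guard let ctx = CGContext(data: nil,
                              width: gridSize, height: gridSize,
                              bitsPerComponent: 8, bytesPerRow: gridSize * 4,
                              space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    ctx.interpolationQuality = .high   // the averaging happens in the downscale
    ctx.draw(image, in: CGRect(x: 0, y: 0, width: gridSize, height: gridSize))
    guard let data = ctx.data else { return nil }
    let buffer = data.assumingMemoryBound(to: UInt8.self)
    return Array(UnsafeBufferPointer(start: buffer, count: gridSize * gridSize * 4))
}

// Motion if enough cells changed by more than perChannelThreshold on
// average across R, G, B.
func motionDetected(previous: CGImage, current: CGImage,
                    perChannelThreshold: Int = 24,
                    minChangedCells: Int = 3) -> Bool {
    guard let a = cellAverages(of: previous),
          let b = cellAverages(of: current),
          a.count == b.count else { return false }
    var changed = 0
    for cell in stride(from: 0, to: a.count, by: 4) {  // RGBA per cell
        let delta = abs(Int(a[cell])     - Int(b[cell]))      // R
                  + abs(Int(a[cell + 1]) - Int(b[cell + 1]))  // G
                  + abs(Int(a[cell + 2]) - Int(b[cell + 2]))  // B
        if delta > perChannelThreshold * 3 { changed += 1 }
    }
    return changed >= minChangedCells
}
```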

What's the best way to provide animation grabs, e.g. when asking questions about Core Animation?

This is a little meta, but I'd like to ask a question related to Core Animation on the iPhone, and I think it would really benefit from some kind of movie attachment / link to show what the code does. I'm probably not the only one who'd like to do this.
My question is: what's the best way to illustrate an animation, grab it from the device or simulator, and attach it to an SO question?
This is basically the equivalent of a screenshot, but it needs to show the movement. Would a sequence of screen grabs be enough? Or some kind of .mov file? And what tools exist to capture that with the minimum of pain?
Screenshots should be good enough (particularly if you annotate the movement direction of an animating object). If you wish to grab movies, I highly recommend running the application in the simulator and using ScreenFlow to capture and edit your video. Unfortunately, that may not give a true representation of the animation frame rate you see on an actual device.
I think that a set of screenshots is enough to attach to an SO question - otherwise I think you would need to shoot a video with another device.