Can I set the focal length of the iPhone camera programmatically?

Hey there, I am wondering whether I could fix the focal length of the iPhone camera to a certain value, such as 120mm. Is it possible to achieve this programmatically?
Thank you for your time and help :D

No, the iPhone camera has a fixed focal length (5.4mm for the iPhone 4) and cannot optically zoom. Perhaps you mean the focus distance, which is the distance from the lens at which an image is in focus? If you mean focus distance, then this answer is of no use to you. :-(
If you want to simulate the effect of a different focal length (i.e. zoom), you can simply crop the image, although the cropped image will be of lower resolution than the uncropped original.
The iPhone 4 has a real focal length of 5.4mm, and a full-frame equivalent focal length of 29.4mm. Your desired 120mm focal length is also full-frame equivalent.
120/29.4 is about 4. So crop out the middle 1/4 (25%) of the height and width of the iPhone photo in your application, and you have simulated the field of view of a 120mm lens.
The iPhone 4 takes photos at 2592x1936, so the cropped photo will be 648x484. That's approximately VGA resolution.
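The crop arithmetic above can be sketched as follows. This is a minimal sketch: the 29.4mm equivalent focal length and 2592x1936 resolution are the iPhone 4 figures quoted above, and you would substitute your own device's values. Note the answer rounds the zoom factor to 4 (giving 648x484); the code keeps the exact factor of about 4.08.

```python
def simulated_crop(width, height, native_equiv_mm, target_equiv_mm):
    """Center-crop size (in pixels) that matches the field of view of the
    target full-frame-equivalent focal length."""
    factor = target_equiv_mm / native_equiv_mm  # linear crop/zoom factor
    return round(width / factor), round(height / factor)

# iPhone 4: 2592x1936 native, 29.4mm equivalent, simulating a 120mm lens
print(simulated_crop(2592, 1936, 29.4, 120))  # (635, 474)
```

The exact result (635x474) is slightly smaller than the 648x484 quoted above because 120/29.4 is a bit more than 4.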

Related

Extracting similar image patches captured from 2 different lenses of the same camera

Example Image
I have two images captured from my iPhone 8 Plus, each at a different zoom level of the iPhone camera (one image was captured at 1x zoom and the other at 2x zoom). Now I want to extract the patch of the 1x-zoom picture that corresponds to the zoomed-in image, i.e. the image with 2x zoom. How would one go about doing that?
I understand that SIFT features may be helpful, but is there a way I could use the information from the camera's extrinsic and intrinsic matrices to find the desired region of interest?
(I'm just looking for hints)
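As a first approximation, if you assume the 2x image is a pure center zoom sharing the same optical center as the 1x image, the corresponding patch is just the central region scaled by the zoom ratio. This is only a sketch under that assumption (the 4032x3024 resolution is the usual 12 MP output, used here for illustration); in reality the iPhone 8 Plus's 2x image comes from a separate telephoto lens with its own intrinsics and a small baseline offset, so you would refine this initial guess with feature matching such as SIFT plus a homography.

```python
def patch_in_1x_image(wide_w, wide_h, zoom_ratio):
    """Rectangle (x, y, w, h) in the 1x image covered by the zoomed image,
    assuming both images share the same optical center (pure center zoom)."""
    pw, ph = wide_w / zoom_ratio, wide_h / zoom_ratio
    return (round((wide_w - pw) / 2), round((wide_h - ph) / 2),
            round(pw), round(ph))

print(patch_in_1x_image(4032, 3024, 2.0))  # (1008, 756, 2016, 1512)
```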

Calculate the distance between the iPhone and an object, knowing their physical widths (of the iPhone as well)

If you check this thread I started (Calculate the distance between the iPhone and a door, knowing their physical widths), I accepted an answer that states, correctly, that if you do not know the focal length data of the iPhone's camera, there is no easy way to calculate the distance between an iPhone and, let's say, a door, knowing its width.
I have to start another thread now asking:
I know the physical (not only in pixels) size of the iPhone's screen (for the iPhone 5 it is 2.31 inches).
Also I know the width of a door.
Now, if I am in the position where the width of the door fits perfectly in the width of the iPhone itself (not of the camera), and the iPhone stands perfectly vertical, is it possible to know the distance between the iPhone and the door in that moment?
Thank you for your kind help!
I assume you mean that there is some outside image capturing device (be it a human eye or another camera) and the image capturing device, the phone, and the door are all in a line such that the phone appears to be the same width as the door.
In this case, you still need a) the distance between the image capturing device and the phone and b) the optical information (such as focal length) of the image capturing device. Just sit down with a pen and paper and draw out the geometry for a little bit and you'll see that.
This is going to involve a trigonometric calculation. I think you may already have done some R&D on the gyroscope; if not, you should certainly look into it.
1) Find the angle your phone makes with the ground when you point the device's camera at the bottom of the object.
2) Now you have one angle, and the object makes a 90-degree angle with the ground, so you are forming a right-angled triangle, and you have just found the angle near your hand.
3) You can approximate the height of your phone from the ground to your hand. So you have one side of the triangle and one angle, and thus you can find the second side, i.e. the distance between you and the object.
Hope this helps. :)
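The right-triangle calculation in the steps above can be sketched like this, assuming the angle you read from the gyroscope is the angle between the line of sight and the ground, and the phone height is an estimate you supply (the 1.5 m and 45 degrees below are made-up illustration values):

```python
import math

def ground_distance(phone_height_m, angle_with_ground_deg):
    """Distance along the ground to the object's base: one known side
    (phone height) and one known angle of a right-angled triangle."""
    return phone_height_m / math.tan(math.radians(angle_with_ground_deg))

print(round(ground_distance(1.5, 45), 2))  # 1.5
```

At 45 degrees the distance equals the height; shallower angles give proportionally larger distances.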

Calculate the distance between the iPhone and a door, knowing their physical widths

I have this scenario:
I know the physical (not only in pixel) size of the screen of the iPhone.
Also I know the width of a door.
Now, if I have the iPhone camera on (with UIImagePicker or whatever), and I am in the position where the width of the door fits perfectly in the width of the camera, and the iPhone stands perfectly vertical, is it possible to know the distance between the iPhone and the door?
It would depend on the camera specs, which vary between devices. For this reason I would try to sample some data with a ruler: for instance, take a 3-foot-wide plank, align its edges perfectly with the frame, and measure the distance. Do this with varying widths on different devices and you'll have a formula per device (basic algebra).
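The "basic algebra" here is that for a fixed field of view, the distance at which an object exactly fills the frame is proportional to the object's width, so a single ruler measurement yields the per-device constant. A sketch, with made-up calibration numbers (a 3-foot plank assumed to fill the frame at 1.2 m):

```python
def calibrate_k(known_width_m, measured_distance_m):
    """Per-device constant from one ruler measurement: k = distance / width."""
    return measured_distance_m / known_width_m

def distance_filling_frame(k, object_width_m):
    """Distance at which an object of the given width exactly fills the frame."""
    return k * object_width_m

k = calibrate_k(0.9144, 1.2)  # hypothetical: 3-foot (0.9144 m) plank at 1.2 m
print(round(distance_filling_frame(k, 0.8), 2))  # 1.05
```

Sampling several widths, as the answer suggests, lets you verify the relationship really is linear on your device and average out measurement error.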

What's the difference between position and positionInPixels

In the past I have always seen that the properties position and positionInPixels are the same. This time position * 2 == positionInPixels. Can anyone tell me the difference between these two properties, or when their values will be different?
position is in points and positionInPixels is in pixels. On a non-Retina device, 1 point = 1 pixel. On Retina devices like the iPhone 4/4S and the new iPad, 1 point = 2 pixels.
Per iOS Human Interface Guidelines:
Note: Pixel is the appropriate unit of measurement to use when discussing the size of a device screen or the size of an icon you create in an image-editing application. Point is the appropriate unit of measurement to use when discussing the size of an area that is drawn onscreen.
On a standard-resolution device screen, one point equals one pixel, but other resolutions might dictate a different relationship. On a Retina display, for example, one point equals two pixels.
See "Points Versus Pixels" in View Programming Guide for iOS for a complete discussion of this concept.
position gives you a position in points, whereas positionInPixels gives you the position in pixels. On an iPhone 4, e.g., position can range from (0,0) to (320, 480) (in portrait mode); positionInPixels can range from (0,0) to (640, 960), reflecting the higher resolution of its Retina display.
Basically, they are different on Retina display devices and the same on non-Retina display devices.
Hope this helps...
When you use a Retina display, your canvas still consists of 320x480 points, but each point is composed of 2 pixels; on the standard display each point is one pixel. This is why the Retina display is more detailed: more pixel data can be used in textures. So positionInPixels refers to the position of a specific pixel (i.e. point 0 can be pixel 0 or pixel 1 on a Retina display).
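The point-to-pixel relationship described above is a simple multiplication by the screen's scale factor. A sketch (the function name is illustrative, not an API):

```python
def points_to_pixels(points, scale_factor):
    """scale_factor is 1 on standard displays and 2 on Retina displays."""
    return points * scale_factor

print(points_to_pixels(480, 1))  # 480 on a non-Retina iPhone
print(points_to_pixels(480, 2))  # 960 on the iPhone 4's Retina display
```

This is why position * 2 == positionInPixels on Retina devices, while the two are equal on non-Retina devices.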

What is the minimum size of an image to perform double finger rotation

I am currently trying to perform a double-finger rotation on an image of size 155x75, and it works properly, but when I try to perform the same operation on a 45x25 image it does not work. I think the problem may be due to the small width and height. So what is the minimum size for an image to perform a smooth double-finger rotation?
Apple suggests that a finger is approximately 44 points wide on an iOS device screen. Based on that, to fit two of them you'd need 88 points. I'd say 88x88.
Yes, I agree with Amorya. A few months ago I built an app with several buttons whose sizes were less than 30px, and they became difficult to touch.