Is there any API to find out the room temperature, light intensity etc. - iPhone

Is there any API on the iPhone to measure the room or surrounding temperature and the light intensity programmatically? Is it possible to measure light intensity using the iPhone camera programmatically?
I found this app, http://itunes.apple.com/us/app/realthermo/id432583584?mt=8 , and there are some apps there that measure light and temperature.
Thanks in advance.

No. The iPhone does not contain a thermometer, and there is no public API for accessing the ambient light sensor.
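That said, a later answer in this thread points at a camera-based workaround: reading the EXIF BrightnessValue that the capture pipeline attaches to each video frame. A minimal sketch of that approximation, assuming an already configured AVCaptureSession whose AVCaptureVideoDataOutput delivers frames to this delegate (the class name is illustrative):

```swift
import AVFoundation
import ImageIO

// Illustrative delegate: reads the EXIF BrightnessValue attached to each frame
// delivered by an AVCaptureVideoDataOutput (session setup omitted). This is a
// relative, logarithmic scene-brightness estimate, not a lux reading.
final class BrightnessSampler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let attachments = CMCopyDictionaryOfAttachments(
                allocator: kCFAllocatorDefault,
                target: sampleBuffer,
                attachmentMode: kCMAttachmentMode_ShouldPropagate) as? [String: Any],
              let exif = attachments[kCGImagePropertyExifDictionary as String] as? [String: Any],
              let brightness = exif[kCGImagePropertyExifBrightnessValue as String] as? Double
        else { return }

        print("EXIF BrightnessValue: \(brightness)")
    }
}
```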

Related

How can a light sensor be used to read colors, in particular red?

How can a light sensor be used to read colors, in particular red? I don't have a color sensor, only a light sensor. How can I use a light sensor instead of a color sensor?
With a light sensor, you can place a red color filter in front of the sensor to detect red light only.
Color filters block part of the incoming light, so that only red light reaches the light sensor.
A color filter is usually sold separately from a light sensor, although with some shopping around you can find cheap examples.

Android Long Exposure for Sky Imaging

I am very new to the Android Camera API. My question is a bit similar to this but with different intentions.
In astrophotography, long exposures are very common for capturing faint deep-sky objects (DSOs) like nebulae, galaxies and star clusters. But when we take sky shots with long exposures (say 30 s), the stars appear as lines (star trails) instead of points, due to the continuous rotation of the Earth. Astrophotography therefore depends heavily on tracking mounts for cameras and telescopes, which negate this effect by rotating continuously (after polar alignment) in the direction opposite to the Earth's rotation.
I am trying to find if it's possible to develop an algorithm to achieve this "de-rotation".
Remember that we can record videos for planetary imaging (planets are very bright). Stacking software like RegiStax is then used to stack the good frames and end up with a nice, detailed image. But the same technique cannot be used for DSOs, because they are too faint: at 30 FPS each frame only gets 1/30 of a second of exposure, so the sensor won't record enough photons per frame to distinguish the object from the background glow.
So my question is: can we stream raw data from the sensor using the Android Camera API to a program that takes care of the de-rotation, continuously adding the light information to the SAME pixels, rather than to adjacent pixels as the Earth rotates?
Many Thanks,
Ahmed
(The attached image of Orion and the Pleiades was taken with a 30 s exposure on a Xiaomi Redmi Note 9S.)

iPhone camera Brightness... How is it physically measured?

I hope someone can answer this question; it's been bugging me for a while now and I can't seem to get a solid answer.
How exactly does the iPhone measure scene brightness (or luminance, if you prefer) through its camera? Does it measure it off the pixels on the sensor, or does it have a dedicated in-lens brightness sensor?
Is there a way to read this raw brightness information other than through EXIF data?
Thanks in advance,
Chris
I have determined through experimentation that luminance is measured off the camera sensor and not from a discrete through-the-lens sensor. Thanks to everyone who read and/or answered this question!
Kind Regards,
Chris
I don't know of a public API, but this shows a way to get it:
http://b2cloud.com.au/tutorial/obtaining-luminosity-from-an-ios-camera
The suggestion at the end is to use this API:
https://developer.apple.com/library/mac/#documentation/AVFoundation/Reference/AVCaptureVideoDataOutput_Class/Reference/Reference.html
to capture raw, uncompressed image frames and run the luminosity function on them. To get EXIF data, you need to bring up a camera and have the user take a picture.
If you do this, you'd be measuring pixels on the sensor. I have no idea how Apple does it, but I don't see any mention of a brightness sensor anywhere. There is an ambient light sensor used alongside the proximity sensor, but that's not the same thing. A rough sketch of the frame-based approach is below.
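A minimal sketch of that approach, assuming the AVCaptureVideoDataOutput is configured to deliver kCVPixelFormatType_32BGRA frames; the sampling stride and the Rec. 601 luma weights are choices of this sketch, not from the tutorial:

```swift
import AVFoundation

// Sketch: compute a rough average luminance directly from the raw BGRA frames
// delivered by an AVCaptureVideoDataOutput (session setup omitted).
final class LuminositySampler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let pixels = base.assumingMemoryBound(to: UInt8.self)

        // Sample every 4th pixel in each dimension (1 in 16 pixels); bytes are BGRA.
        var total = 0.0
        var count = 0
        for y in stride(from: 0, to: height, by: 4) {
            for x in stride(from: 0, to: width, by: 4) {
                let offset = y * bytesPerRow + x * 4
                let b = Double(pixels[offset])
                let g = Double(pixels[offset + 1])
                let r = Double(pixels[offset + 2])
                total += 0.299 * r + 0.587 * g + 0.114 * b   // Rec. 601 luma
                count += 1
            }
        }
        let averageLuma = total / Double(count)   // 0 (dark) ... 255 (bright)
        print("Average luma: \(averageLuma)")
    }
}
```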

iPhone image processing

I am building a night-vision application, but I can't find any useful algorithm that I can apply to dark images to make them clearer. Can anyone please suggest a good algorithm?
Thanks in advance
With the size of the iPhone lens and sensor, you are going to have a lot of noise no matter what you do. I would practice manipulating the image in Photoshop first; you'll probably find it useful to select a white point from a sample of the brighter pixels in the image and to apply a curve. You'll probably also need to run a noise-reduction filter and a smoother. Edge detection or condensation may let you bring out some areas of the image. As for specific algorithms to perform each of these filters, there are a lot of computer science books and lists on the subject. Here is one list:
http://www.efg2.com/Lab/Library/ImageProcessing/Algorithms.htm
Many OpenGL implementations can be found once you know the standard name of the algorithm you need.
Real (useful) night vision typically uses an infrared light and an infrared-tuned camera. I think you're out of luck.
Of course using the iPhone 4's camera light could be considered "night vision" ...
Your real problem is the camera, not the algorithm.
You can apply algorithms to clarify images, but they won't turn dark into bright as if by magic ^^
If you want to try some algorithms, you should take a look at OpenCV (http://opencv.willowgarage.com/wiki/); there are iPhone ports, for example http://ildan.blogspot.com/2008/07/creating-universal-static-opencv.html
I suppose there are two ways to refine a dark image: the first is active, which uses infrared; the other is passive, which manipulates the pixels of the image.
The images will be noisy, but you can always try scaling up the pixel values (all of the components in RGB, or just the luminance in HSV; either linearly or with some sort of curve; either globally or locally in just the darker areas) and saturating them, and/or using a contrast edge-enhancement filter; a rough sketch of this appears after this answer.
If the camera and subject matter are sufficiently motionless (tripod, etc.), you could try summing each pixel over several image captures. Or you could do what some HDR apps do and try aligning the images before processing pixels across time.
I haven't seen any documentation on whether the iPhone's camera sensor has a wider wavelength gamut than the human eye.
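Here is a minimal Core Image sketch of the "scale up the pixel values with a curve, then saturate and denoise" idea from the first paragraph above; the curve points and filter settings are arbitrary placeholders, not tuned values:

```swift
import CoreImage
import UIKit

// Brighten a dark image: lift shadows with a tone curve, boost saturation and
// contrast, then apply light noise reduction. All parameter values are guesses.
func brightenDarkImage(_ input: UIImage) -> UIImage? {
    guard let source = CIImage(image: input) else { return nil }

    // Lift the shadows with a tone curve (a gamma-like boost to darker values).
    let curve = CIFilter(name: "CIToneCurve")!
    curve.setValue(source, forKey: kCIInputImageKey)
    curve.setValue(CIVector(x: 0.00, y: 0.00), forKey: "inputPoint0")
    curve.setValue(CIVector(x: 0.25, y: 0.45), forKey: "inputPoint1")
    curve.setValue(CIVector(x: 0.50, y: 0.70), forKey: "inputPoint2")
    curve.setValue(CIVector(x: 0.75, y: 0.88), forKey: "inputPoint3")
    curve.setValue(CIVector(x: 1.00, y: 1.00), forKey: "inputPoint4")

    // Mild saturation and contrast boost on top of the lifted image.
    let color = CIFilter(name: "CIColorControls")!
    color.setValue(curve.outputImage, forKey: kCIInputImageKey)
    color.setValue(1.3, forKey: kCIInputSaturationKey)
    color.setValue(1.1, forKey: kCIInputContrastKey)

    // Light noise reduction, since boosting dark pixels also amplifies sensor noise.
    let denoise = CIFilter(name: "CINoiseReduction")!
    denoise.setValue(color.outputImage, forKey: kCIInputImageKey)
    denoise.setValue(0.02, forKey: "inputNoiseLevel")
    denoise.setValue(0.40, forKey: "inputSharpness")

    guard let output = denoise.outputImage,
          let cgImage = CIContext().createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```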
I suggest conducting a simple test before trying to actually implement this:
Save a photo made in a dark room.
Open in GIMP (or a similar application).
Apply "Stretch HSV" algorithm (or equivalent).
Check if the resulting image quality is good enough.
This should give you an idea as to whether your camera is good enough to try it.

iPhone: Real-time video color info, focal length, aperture?

Is there any way using AVFoundation and CoreVideo to get color info, aperture and focal length values in real-time?
Let me explain: say that while I am shooting video, I want to sample the color in a small portion of the screen and output it as RGB values on the screen. Also, I would like to show what the current aperture is set to.
Does anyone know if it is possible to gather these values? Currently I have only seen that this is possible with still images.
Ideas?
AVCaptureStillImageOutput will get you a real-time still from the video stream, including EXIF data for focal length, aperture, etc. You could calculate the color info yourself from that bitmap.
AVFoundation, CoreVideo, and CoreMedia include support for getting a video bitmap in "real time". From there you can process a portion of the RGB pixels however you want; a rough sketch of this appears after this answer.
I don't know of any current public iOS API to get you the aperture.
The focal length is fixed, but differs between product models. ifixit.com might have that info.
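A rough sketch of sampling the average color of a small region from the live video bitmap, using Core Image's CIAreaAverage filter on each frame (the patch size and the class name are this sketch's choices, and the capture session setup is omitted):

```swift
import AVFoundation
import CoreImage

// Illustrative delegate: averages a small centre patch of each video frame
// down to a single RGB value using CIAreaAverage.
final class RegionColorSampler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)

        // Average a 50x50-pixel patch in the centre of the frame down to 1x1.
        let patch = CGRect(x: image.extent.midX - 25, y: image.extent.midY - 25,
                           width: 50, height: 50)
        guard let filter = CIFilter(name: "CIAreaAverage") else { return }
        filter.setValue(image, forKey: kCIInputImageKey)
        filter.setValue(CIVector(cgRect: patch), forKey: kCIInputExtentKey)
        guard let averaged = filter.outputImage else { return }

        // Render the single averaged pixel into a 4-byte RGBA buffer.
        var rgba = [UInt8](repeating: 0, count: 4)
        context.render(averaged,
                       toBitmap: &rgba,
                       rowBytes: 4,
                       bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                       format: .RGBA8,
                       colorSpace: CGColorSpaceCreateDeviceRGB())
        print("R: \(rgba[0]) G: \(rgba[1]) B: \(rgba[2])")
    }
}
```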
You can get the aperture via camera.lensAperture.
This will give the aperture of whichever iPhone camera you selected.
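For example (the .builtInWideAngleCamera lookup below is just one way to grab a device; lensAperture itself is a read-only AVCaptureDevice property holding the camera's fixed f-number):

```swift
import AVFoundation

// Print the fixed f-number of the back wide-angle camera, if available.
if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    print("Aperture: f/\(camera.lensAperture)")
}
```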