How to implement an iOS metal detector app?

I want to implement a metal detector app that reacts to magnetic fields in the phone's surroundings.
How do I read the magnetic field readings from the magnetometer on the iPhone? Does anyone have sample code for accessing the magnetometer readings directly?
Thanks!

I would recommend you look at the upcoming O'Reilly book iOS Sensor Programming. It has an entire chapter (Chapter 6) on the magnetometer, plus a sample app.
iOS Sensor Programming
Apple Sample Code for Reading the Raw Values from the Magnetometer
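For reference, here is a minimal sketch (untested here; assumes iOS 5+ and the Core Motion framework) of reading raw magnetometer values with CMMotionManager. The update interval is an arbitrary choice:

```objectivec
#import <CoreMotion/CoreMotion.h>

// Keep a strong reference to the manager (e.g. as a property),
// or updates will stop when it is deallocated.
CMMotionManager *motionManager = [[CMMotionManager alloc] init];

if (motionManager.magnetometerAvailable) {
    motionManager.magnetometerUpdateInterval = 1.0 / 30.0; // 30 Hz
    [motionManager startMagnetometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                       withHandler:^(CMMagnetometerData *data, NSError *error) {
        if (error != nil) { return; }
        // Raw, uncalibrated field components in microteslas.
        CMMagneticField field = data.magneticField;
        NSLog(@"x: %f  y: %f  z: %f", field.x, field.y, field.z);
    }];
}
```

Call stopMagnetometerUpdates when you are done. Note these raw values include bias from the device itself; CMDeviceMotion's calibrated magnetic field is usually a better choice for anything beyond experimentation.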

No. You can't.
UPDATE: there are no metal-detecting components in the iPhone. However, a magnetometer has been available since the iPhone 3GS, and the Compass app uses it to determine heading. It reads the direction of magnetic north, not the surrounding magnetic fields, so you can't make a metal detector with it.
Physics fact: metal does not generate magnetic fields. Magnets do.
p.s. correct me if I am wrong.

There is already an app which does this. Check this and this blog.
This is what the description says,
The app has adjustable sensitivity and makes an audible sound
signal when the reading reaches a medium level. The phone needs to
be shaken once to recalibrate if you intend to use the device again.
The app uses the magnetometer on the iPhone to detect interference
with the compass and thereby find metal objects. Just hover the back
of your phone, the area under the camera, over the metal, and the
app should find it with a noticeable increase in the reading.
However, I am not sure whether it is a fake app or whether it can really be done this way. Try downloading it and check for yourself; it is a free app.

Related

Using ARKit to process online video, is it possible?

Apple's ARKit is awesome; however, it looks like it can only take the device's front or back camera as the video source. My question is whether it is possible to use an online video, or some other video stream, as the video source for ARKit.
Welcome! I don't think it is possible, because ARKit uses the accelerometer and gyroscope to track the device's position and map the surrounding world; I don't see how that could work over a video. Also, a basic ARKit requirement is access to the camera.

Is it possible to programmatically capture iPhone 5S slow motion video?

I couldn't find an answer to this question, and looking at Apple's own apps like iMovie on iOS, the video picker does not offer a slow-motion option on the iPhone 5S.
The image picker offers very little control over the video. If you are willing to dive deeper into the APIs you can use AVFoundation to capture your video and manipulate the camera properties as you see fit.
iOS 7 introduces a new AVCaptureDeviceFormat class that gives you the maximum and minimum supported frame rates for the capture device, and you can use these to set a custom frame rate on the camera itself. I don't have an iPhone 5S to hand, so I can't verify whether this API goes all the way up to 120 fps.
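As an untested sketch of that configuration (iOS 7 APIs; error handling trimmed): enumerate the device's formats, pick one whose frame rate ranges reach the rate you want, then pin the active frame duration.

```objectivec
#import <AVFoundation/AVFoundation.h>

// Illustrative helper, not from Apple's samples: selects the first
// format supporting `fps` and locks the device to that rate.
void configureDeviceForFrameRate(AVCaptureDevice *device, Float64 fps) {
    for (AVCaptureDeviceFormat *format in device.formats) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate >= fps) {
                NSError *error = nil;
                if ([device lockForConfiguration:&error]) {
                    device.activeFormat = format;
                    device.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)fps);
                    device.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)fps);
                    [device unlockForConfiguration];
                }
                return;
            }
        }
    }
}
```

You would pair this with an AVCaptureSession and an AVCaptureDeviceInput for the back camera; whether 120 fps appears in videoSupportedFrameRateRanges depends on the hardware.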

html5 and heat sensors

I'm new to HTML5 programming and was thinking of building an HTML5 app for iPhone. I was wondering whether there is anything in the API about sensing temperature on touch. My app depends on the temperature of one's finger when it touches the screen. Is there any way to measure this? If not, is there a way to measure the amount of pressure applied to the screen?
Thanks!
Room temperature: Yes.
The temperature of your thumb on the iPhone screen: no.
The pressure of your thumb on the screen: also no.
Remember: the iPhone touchscreen works on capacitance, not mechanical pressure.

About Ambient Light sensor in iphone

Thanks in advance.
I got information about the iPhone's sensors from http://ipod.about.com/od/ipodiphonehardwareterms/qt/iphone-sensors.htm, but I didn't find anything about how to use the ambient light sensor on the iPhone.
Here's a link to a description of the ambient light sensor hardware with example code that shows how to access it.
It is possible, and the framework is public.
You can activate the connection to the sensor like so :
[[UIDevice currentDevice] setProximityMonitoringEnabled:YES];
BTW, it doesn't seem to be using the light sensor, because light-based proximity sensing would misbehave in a dark room.
However, the API call basically blanks the screen when you hold the phone up to your face. Not useful for interaction, sadly.
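To at least react to the sensor (beyond the automatic screen blanking), you can observe the standard UIKit proximity notification. A sketch, untested here:

```objectivec
#import <UIKit/UIKit.h>

UIDevice *device = [UIDevice currentDevice];
device.proximityMonitoringEnabled = YES;

// The property stays NO on hardware without a proximity sensor.
if (device.proximityMonitoringEnabled) {
    [[NSNotificationCenter defaultCenter]
        addObserverForName:UIDeviceProximityStateDidChangeNotification
                    object:device
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
        // proximityState is YES while something is close to the sensor
        // (at which point the screen is already blanked).
        NSLog(@"proximityState = %d", device.proximityState);
    }];
}
```

Note this is the proximity sensor, not the ambient light sensor; there is no public API for reading the light sensor's value directly.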
Here is a guide about iPhone sensors :
iPhone sensors

Does the iPhone 1/2 have a compass inside?

Can one be simulated by periodically syncing with GPS while working with the accelerometer in the meantime? I know, for example, that the N95 accelerometer is invariant to rotation on the Y axis (while being face up/down).
The original iPhone and the iPhone 3G use GPS to calculate the heading; the iPhone 3GS, however, has a three-dimensional magnetometer (compass) in it.
This can only be done by taking two GPS coordinates (while moving) and determining the direction from point A to point B.
The iPhone doesn't have a built-in compass, but there is one implemented in software. It's called Compass Free and, unsurprisingly perhaps, it's free.
Extra info: the iPhone 1 had neither GPS nor a compass.