iPhone Temperature Sensor

My question is very similar to this one: iPhone Proximity Sensor. There's clearly some manner of thermometer within the iPhone that's readable by the OS. Has anyone uncovered the super-secret undocumented APIs to read this sensor?

I doubt this sensor is for ambient temperature; rather, I suspect it is there to detect overheating of the circuits. If that is all you want, then great, but again, I think it would be useless for ambient temperature.
Just my opinion.

All I could find was CTGetTemperature in CoreTelephony, of all places.

I don't know about previous models, but my iPhone 4 goes from cool-ish to very warm in a matter of minutes depending on radio usage. So unless "good enough" means "within 20 degrees F or so", it's probably not good for ambient measurement.
Unless (maybe you meant this) you also tracked radio usage and subtracted a temperature offset that depends on it. Phew. Complicated. Easier to just query the NWS.

A command to list all the super-secret names related to temperature in the CoreTelephony framework:
nm "/Applications/Xcode463.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport/6.1 (10B141)/Symbols/System/Library/Frameworks/CoreTelephony.framework/CoreTelephony" | grep empera

Related

Difference between MLX90614 and MLX90616 infrared sensors

I tested my code with the MLX90614, but with the MLX90616 the ambient and object temperatures come out almost the same. Has anybody had this problem? What's the solution?
I am testing this code on an LPC1768 using the SMBus protocol.
Excuse my English.
I have had the same problem. After carefully reading the datasheet, it is clear it needs to be calibrated (how, I'm not sure just yet). It's frustrating because Melexis will not provide technical support unless you're buying 10k+ units, and they are very vague about their "calibration software".
The ambient temperature sensor in the MLX90616 is calibrated in the factory... The sensor-lens combination temperature measurement has to be calibrated by the customer.
I spent a couple of weeks before I realized that the reason it was not working was that it was not calibrated. The output appears to be simply the ambient temperature!
If you read the calibration coefficient register 0x0F, you'll see that it is 0x0020.

How accurate is CoreLocation's accuracy measurement

I have been testing the accuracy of various aspects of CoreLocation and have been surprised to find that the accuracy value supplied with new and old locations passed to the CLLocationManagerDelegate method locationManager:didUpdateToLocation:fromLocation: is wildly inaccurate itself.
Walking certain stretches of pavement (along a stretch of road with buildings along one side), I have found that while CLLocationManager claims the CLLocations it is supplying have a horizontal accuracy of 5 meters, in actual fact the position shown by the MKUserAnnotation is anything up to 30 meters away from the position I know I'm in (often showing as the opposite side of a row of buildings). This suggests that the accuracy reading itself is not accurate: CLLocationManager is telling me its coordinates are accurate to within 5 meters when they are actually as much as 30 meters out.
To clarify, this is not a caching/timestamp issue. It is repeatable in certain areas and is not something that improves over time.
Has anyone else encountered this issue?
The essence of the answer to your question is that CoreLocation is not a human and does not know where you "really" are. It is using the sensors on your device and giving you the best answer it has, and the problems with its ability to know the accuracy of its claimed answer reflect the fact that the problem of location is inherently very difficult.
To digress into the location data itself: CoreLocation draws from cell tower triangulation, nearby SSIDs, and finally GPS, to determine the device's location. These do have varying degrees of accuracy, but what they have in common is that if you let the device work for longer, you'll get a more accurate answer. However, because all of the tools that CoreLocation uses are expensive in terms of battery life, it will stop as soon as it has generated an answer that meets your criteria for "good enough."
This is why another answer told you to set your desiredAccuracy to kCLLocationAccuracyBestForNavigation - when you do that, you're basically telling CoreLocation "show me the best you've got," and you've also volunteered for the CPU load, time, and battery drain that come with that. You've also gone off the normal scale that CoreLocation uses to figure out what's "good enough" - instead of telling it ahead of time when it can stop, you've basically said "I'll tell you when it's good enough."
CoreLocation will thus keep on trying to give you better answers until you tell it you're done, and you need to carry out your responsibility of deciding what that point is. This is also where you need to bear in mind that there are limits to how accurate the tools in CoreLocation's repertoire can be: you can get lots of significant digits of latitude and longitude, but by venturing off the scale, you've volunteered for the task of figuring out how trustworthy those numbers are. The pre-defined scales of accuracy are 10 meters, 100 meters, 1 km, and 3 km. The fact that the creators of CoreLocation chose those cut-off points should tell you something about the difficulty of the problem: if better accuracy were easy, it would probably already be in the framework.
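
Concretely, "I'll tell you when it's good enough" might look something like this sketch (using the newer didUpdateLocations callback; the 10 m cutoff and 30 s staleness check are arbitrary values for illustration):

import CoreLocation

// Ask for the best accuracy, apply our own cutoff to each fix, and stop
// updating once a fix is trustworthy enough for our purposes.
final class AccuracyGate: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        // A negative horizontalAccuracy means the fix is invalid.
        guard fix.horizontalAccuracy >= 0 else { return }
        // Ignore stale, cached fixes.
        guard fix.timestamp.timeIntervalSinceNow > -30 else { return }
        if fix.horizontalAccuracy <= 10 {
            manager.stopUpdatingLocation()
            print("Good enough: \(fix.coordinate), +/-\(fix.horizontalAccuracy) m")
        }
    }
}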
So, circling back around: CoreLocation is reporting the accuracy of its answers faithfully; it's just that the data underlying those answers is problematic, and if it knew more about the accuracy of its accuracy claims, it would reflect that in those claims. Consider the hypothetical situation where CoreLocation knows exactly how accurate its accuracy claims are, then the situation where it knows nothing about how accurate its accuracy claims are, and consider how you'd design an API to account for the difference. CoreLocation is as sure as it can reasonably be about how sure it is, and if you can figure out a way, tractable within the limits of the hardware, to be more sure about how sure you are, App Store wealth awaits you.
The hardware takes some time to reach full accuracy. The best accuracy is 65 m, after 2-3 seconds; before that you have 1414 m accuracy.

Transferring data using ultrasound

Yamaha InfoSound and the ShopKick application use technologies that allow data to be transferred using ultrasound, that is, by playing an inaudible signal (>18 kHz) that can be picked up by modern mobile phones (iOS, Android).
What is the approach used in such technologies? What kind of modulation do they use?
I see several problems with this approach. First, 18 kHz is not inaudible. Many people cannot hear it, especially as they age, but I know I certainly can (I do regular hearing tests, work-related). Also, most phones have different low-pass filters on their A/D converters, and many devices, especially older Android ones (I've personally seen it happen), filter out everything above 16 kHz or so. Your app therefore is not guaranteed to work on every device. The iPhone should probably be able to do it.
In terms of modulation, it could be anything really, but I would definitely rule out AM. Sound has next to zero robustness when it comes to volume. If I were to implement something like that, I would go with FSK. I would think that PSK would fail due to acoustic reflections and such. The difficulty is that you're working with non-robust energy transfer within a very narrow bandwidth. I certainly do not doubt that it can be achieved, but I don't see something like this proving reliable. Just IMHO, that is.
Update: Now that I think about it, plain on-off keying would work with a single tone if you're not transferring any data, just some short signals.
I can't speak for Yamaha InfoSound and ShopKick, but what we used in our project was a variation of frequency modulation: the frequency of the carrier is modulated by a digital binary signal, where 0 and 1 correspond to 17 kHz and 18 kHz respectively. As for the demodulator, we tried a heterodyne. You can find more details here: http://rnd.azoft.com/mobile-app-transering-data-using-ultrasound/
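
I'm not affiliated with that project, but a toy version of this scheme, binary FSK with 0 at 17 kHz and 1 at 18 kHz plus a heterodyne-style detector that mixes each bit period with both carriers and compares the baseband energy, might look like the following Swift sketch (all constants, such as the 10 ms bit duration, are illustrative guesses):

import Foundation

let sampleRate = 44_100.0
let bitDuration = 0.01                      // 10 ms per bit (assumed)
let samplesPerBit = Int(sampleRate * bitDuration)

func tone(_ freq: Double, samples n: Int) -> [Double] {
    (0..<n).map { sin(2 * .pi * freq * Double($0) / sampleRate) }
}

// Modulator: map each bit to a tone burst.
func modulate(bits: [Int]) -> [Double] {
    bits.flatMap { tone($0 == 0 ? 17_000 : 18_000, samples: samplesPerBit) }
}

// Detector: mix one bit period with each carrier (in-phase and quadrature,
// so phase doesn't matter) and pick the stronger branch; summing over the
// bit period acts as the low-pass stage of the heterodyne.
func demodulateBit(_ chunk: ArraySlice<Double>) -> Int {
    func energy(at freq: Double) -> Double {
        var i = 0.0, q = 0.0
        for (k, s) in chunk.enumerated() {
            let phase = 2 * .pi * freq * Double(k) / sampleRate
            i += s * cos(phase)
            q += s * sin(phase)
        }
        return i * i + q * q
    }
    return energy(at: 18_000) > energy(at: 17_000) ? 1 : 0
}

let signal = modulate(bits: [1, 0, 1, 1, 0])
let recovered = stride(from: 0, to: signal.count, by: samplesPerBit).map {
    demodulateBit(signal[$0..<min($0 + samplesPerBit, signal.count)])
}
print(recovered)  // [1, 0, 1, 1, 0]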
There's nothing special about it being ultrasound; the principle is the same as data transmission through a modem, so any digital modulation is, in principle, feasible. You only have a specific frequency band (above 18 kHz) and some practical constraints (the medium is very unreliable, I guess) that suggest using a simple, robust scheme with a low bit rate.
I don't know how they do it, but this is how I do it:
If it is a string, then make sure it's not a long one (the longer it is, the higher the error probability). Let's assume we're working with the vital part of the ASCII code, namely up to character number 127; then all you need is 7 bits per character. Transform each character into bits and modulate those bits using QFSK (there are several modulations to choose from; the frequency-shift-based ones have turned out to be the most robust of the conventional ones I've tried, and I've also created my own modulation scheme for this use case). Select the carrier frequencies as 18.5, 19, 19.5, and 20 kHz. If you want to be mathematically strict in your design, select frequency values that assure both orthogonality and phase continuity at symbol transitions; if you can't, a good workaround to avoid abrupt symbol transitions is to multiply your symbols by a window of the same size, e.g. a Gaussian or Bartlett window. In my experience you can move these values in the range from 17.5 to 20.5 kHz: if you go lower, it will start to bother people using your app; if you go higher, the average phone microphone's frequency response will attenuate your transmission and induce unwanted errors. (A sketch of this transmit/receive chain follows the next paragraph.)
On the receiver side, implement a correlation or matched-filter receiver (an FFT receiver works as well, especially a zero-padded one, but it might be a little slower; I wouldn't recommend Goertzel, because frequency shift due to the Doppler effect or speaker-microphone non-linearities could affect your reception). Once you have received the bit stream, assemble the characters from it and you will recover your message.
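
Here is a compact Swift sketch of that transmit/receive chain under the answer's assumptions: 4-FSK on 18.5/19/19.5/20 kHz carriers, each symbol shaped by a Gaussian window, with a correlation (matched-filter) receiver. The 20 ms symbol length and the window width are my own guesses:

import Foundation

let fs = 44_100.0
let symbolLen = 882                          // 20 ms per symbol (assumed)
let carriers = [18_500.0, 19_000.0, 19_500.0, 20_000.0]

// Gaussian window to avoid abrupt symbol transitions.
let window: [Double] = (0..<symbolLen).map {
    let t = (Double($0) - Double(symbolLen - 1) / 2) / (0.4 * Double(symbolLen) / 2)
    return exp(-0.5 * t * t)
}

// Modulator: one windowed tone burst per 2-bit symbol (0...3).
func symbol(_ s: Int) -> [Double] {
    (0..<symbolLen).map { window[$0] * sin(2 * .pi * carriers[s] * Double($0) / fs) }
}

// Matched-filter receiver: correlate against each windowed carrier
// (in-phase and quadrature) and pick the best match.
func detect(_ chunk: [Double]) -> Int {
    var best = 0, bestScore = -1.0
    for (index, f) in carriers.enumerated() {
        var i = 0.0, q = 0.0
        for k in 0..<chunk.count {
            let phase = 2 * .pi * f * Double(k) / fs
            i += chunk[k] * window[k] * cos(phase)
            q += chunk[k] * window[k] * sin(phase)
        }
        let score = i * i + q * q
        if score > bestScore { bestScore = score; best = index }
    }
    return best
}

let tx = [3, 0, 2, 1].flatMap(symbol)        // four symbols
let rx = stride(from: 0, to: tx.count, by: symbolLen).map {
    detect(Array(tx[$0..<($0 + symbolLen)]))
}
print(rx)  // [3, 0, 2, 1]

Note that 500 Hz carrier spacing with a 20 ms symbol gives 10 cycles of separation per symbol, which keeps the carriers roughly orthogonal even after windowing.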
If you face too many transmission errors, try selecting a higher number of samples per symbol, or band-pass filtering each frequency before handing it to the demodulator; using an error-correction code such as BCH or Reed-Solomon is sometimes the only way to assure error-free communication.
One topic everybody always forgets to talk about is synchronization (knowing, on the receiver side, when the transmission has begun). You have to be creative here, and run a lot of tests with a lot of phones, before you can derive an actual detection threshold that works on all of them; note that this might also be distance-dependent.
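
One common way to attack that synchronization problem (a general technique, not necessarily what this answer's author used) is to prepend a known preamble and slide a normalized correlator over the incoming audio until it crosses the threshold you've tuned on real devices:

import Foundation

// Normalized correlation between an incoming window and the known preamble.
func normalizedCorrelation(_ x: ArraySlice<Double>, _ ref: [Double]) -> Double {
    var dot = 0.0, energy = 0.0
    for (k, s) in x.enumerated() { dot += s * ref[k]; energy += s * s }
    let refEnergy = ref.reduce(0) { $0 + $1 * $1 }
    return energy > 0 ? abs(dot) / (energy.squareRoot() * refEnergy.squareRoot()) : 0
}

// Scan for the preamble; the threshold (0.5 here) is an assumed value that
// you would calibrate per device and, as noted above, per distance.
func findPreamble(in samples: [Double], preamble: [Double], threshold: Double = 0.5) -> Int? {
    guard samples.count >= preamble.count else { return nil }
    for start in 0...(samples.count - preamble.count) {
        if normalizedCorrelation(samples[start..<(start + preamble.count)], preamble) > threshold {
            return start  // payload begins at start + preamble.count
        }
    }
    return nil
}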
If you are unfamiliar with these subjects I would recommend a couple of great books:
Digital Modulation Techniques by Fuqin Xiong
Digital Communications: Fundamentals and Applications by Bernard Sklar
Digital Communications by John G. Proakis
You might have luck with a library I created for sound-based modems, libquiet. It gives you a handful of profiles to work from, including a slow "Ultrasonic whisper" profile with spectral content above 19kHz. The library is written in C but would require some work to interface with iOS.

What are the basic concepts for implementing anti-shock and anti-shake algorithms?

I have some animations that happen upon detecting fine accelerations. But when the user sits in a car or is walking, this may get annoying.
Basically, all that stuff has to be disabled automatically as soon as there is too much vibration or shaking. Conceptually, I think it's very hard to filter those vibrations out, since the "vibration phase" changes constantly. I would define "unwanted vibration or shock" as acceleration values that change very quickly over a large interval of values, or a permanently changing accumulated value that does not exceed a specified threshold range within a specified minimum period of time.
I am looking for "proven" concepts before I start reinventing the wheel for a couple of days.
I don't have any concrete answers for you, but you might want to Google band-pass filters or anti-aliasing filters for some ideas on how to approach this. Basically, if you can identify the frequency range of accelerations that you want to consider real, you can filter out frequencies that fall outside this range.
Before you do too much premature optimization, I think you should implement a low-pass filter and see if that does the job. Most iPhone apps effectively use a variation of an LPF to get rid of unwanted accelerometer noise.
You could also go the other way and use a high-pass filter. Once you see a certain power level passing through the HPF, stop processing data.
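
A minimal Swift sketch of that LPF/HPF combination (the smoothing factor and threshold are values you would have to tune by experiment):

import Foundation

// Low-pass the raw acceleration to estimate gravity, subtract it to get
// the high-pass "shake" component, and report a shake while that
// component's magnitude stays above a threshold.
struct ShakeGate {
    private var gravity = (x: 0.0, y: 0.0, z: 0.0)
    private let alpha = 0.1          // LPF smoothing factor (assumed)
    private let threshold = 0.3      // shake threshold in g (assumed)

    mutating func isShaking(x: Double, y: Double, z: Double) -> Bool {
        // Low-pass filter: a slow-moving estimate of gravity.
        gravity.x = alpha * x + (1 - alpha) * gravity.x
        gravity.y = alpha * y + (1 - alpha) * gravity.y
        gravity.z = alpha * z + (1 - alpha) * gravity.z
        // High-pass filter: whatever is left after removing gravity.
        let hx = x - gravity.x, hy = y - gravity.y, hz = z - gravity.z
        return (hx * hx + hy * hy + hz * hz).squareRoot() > threshold
    }
}

You would call isShaking from your accelerometer callback (e.g. with each CMAccelerometerData sample) and disable the animations while it keeps returning true.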

iPhone CoreLocation: How to get the most accurate speed

I'm experimenting with adding GPS functionality to my iPhone app. It's a workout app that will be used while walking or running, so what I want to use GPS for is to show the speed the person is moving at, in mph and minutes/mile.
How should I configure the CLLocationManager so I get the best possible results? What should I set desiredAccuracy and distanceFilter to?
I've tried with:
distanceFilter = 10 and desiredAccuracy = kCLLocationAccuracyNearestTenMeters
and reading the CLLocation.speed property.
Testing while driving around in my car, the accuracy seems good compared to the car's speedometer, although it takes a while to update. I realize that the update delay may very well be the time it takes to query the GPS location, but I'm not sure whether changing the above two parameters would give better results.
Should I use kCLLocationAccuracyBest and some other value for distanceFilter?
I'm interested to hear from others using CoreLocation to get speed. What are you doing to get more accurate results?
For best results, you should use kCLLocationAccuracyBest. What you put into your distance filter depends on which faults you're willing to put up with. Basically, you're going to have to make decisions based on accuracy vs. availability. That is, during periods when a best-accuracy answer is not available, what will you display?
One approach is to let the phone deliver less-accurate answers and, using a projection of what was happening the last time you had best-accuracy information, see if what you have makes sense.
That is, suppose I'm jogging at 6 mph to the north. You plot me along point A, point B, point C... then you get a low-accuracy answer (maybe kCLLocationAccuracyHundredMeters). Look at the spot where it says I am and figure out: "Could I have gotten to that spot from point C if I'd continued along my current path, making reasonable adjustments for possible changes in speed?" If so, then the new point is within the realm of possibility. (If not, toss it out.) Then project from point C at my last-known speed and figure out where you think I probably am, ballistically. Save that as ballistic point D.
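
A sketch of that plausibility test in Swift (the 1.5x slack factor for possible speed changes is an arbitrary allowance of mine):

import CoreLocation

// Given the last trusted fix and the speed at that time, decide whether a
// new (possibly low-accuracy) fix is physically reachable.
func isPlausible(newFix: CLLocation, lastTrusted: CLLocation, lastSpeed: CLLocationSpeed) -> Bool {
    let elapsed = newFix.timestamp.timeIntervalSince(lastTrusted.timestamp)
    guard elapsed > 0 else { return false }
    let traveled = newFix.distance(from: lastTrusted)
    // Reachable distance, padded by the new fix's own error radius.
    let reachable = lastSpeed * elapsed * 1.5 + max(newFix.horizontalAccuracy, 0)
    return traveled <= reachable
}

If the new fix fails the test, toss it out and fall back to the ballistic projection described above.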
Of course, you're using the accelerometer to get some sort of inertial sense of which way I went, right? So, you can't know direction (you don't know what way the phone is pointing), but you can make a reasonable stab at distance.
Using all this information, plot the most likely spot where you think I probably am.
NOTE: When testing, don't just drive in areas with good cell coverage. See how your app performs out in the hills, away from cell towers. A lot of people like to bike and jog in those areas!
Disclaimer: I've only played with CoreLocation a bit, and I've not tested its accuracy very closely.
I'd expect that you'd get the most accurate results by using the defaults for distanceFilter and desiredAccuracy. Less-frequent updates are only going to give you less data to work with.
One issue you're likely to run into is the location fix being lost for a while and then coming back. The naive, connect-the-dots approach to figuring out distance traveled will tend to underestimate the actual speed of the runner. Rather than using CLLocation.speed, you might get better results calculating speed based on some heuristic approximation of the line the runner is actually following.
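
For example, here is a hedged sketch of one such heuristic in Swift: average distance-over-time across the last several sufficiently accurate fixes rather than trusting a single CLLocation.speed reading (the window size and 20 m accuracy cutoff are illustrative):

import CoreLocation

// Estimate speed from a sliding window of recent fixes, ignoring fixes
// whose claimed error is too large. Returns nil if there isn't enough data.
func averageSpeed(over fixes: [CLLocation], maxError: CLLocationAccuracy = 20) -> CLLocationSpeed? {
    let good = fixes.filter { $0.horizontalAccuracy >= 0 && $0.horizontalAccuracy <= maxError }
    guard good.count >= 2, let first = good.first, let last = good.last else { return nil }
    let elapsed = last.timestamp.timeIntervalSince(first.timestamp)
    guard elapsed > 0 else { return nil }
    // Sum leg by leg so turns aren't cut short; note this still
    // underestimates during GPS dropouts, as mentioned above.
    var distance: CLLocationDistance = 0
    for (a, b) in zip(good, good.dropFirst()) { distance += b.distance(from: a) }
    return distance / elapsed
}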