iPhone indoor location-based app

I am researching how to create an app for my work that clients can download (preferably via the App Store) and that uses some sort of WiFi triangulation/fingerprinting to determine their location, essentially for an interactive tour.
Now, my question specifically is: what is the best route to take for the iPhone? None of the clients will be expected to have jailbroken iPhones.
To my understanding this requires the use of WiFi data, which is a private API and therefore does not meet App Store requirements. The biggest question I have is: how does the American Museum of Natural History get away with using the same technology while still being available on the App Store?
If you're unfamiliar with the American Museum of Natural History's interactive tour app, see here:
http://itunes.apple.com/us/app/amnh-explorer/id381227123?mt=8
Thank you for any clarification you can provide.

I'm one of the developers of the AMNH Explorer app you're referencing.
Explorer uses the Cisco "Mobility Services Engine" (MSE) behind the scenes to determine its location. This is part of their Cisco wifi installation. The network itself listens for devices in the museum and estimates their position via Wifi triangulation. We do a bit of work in the app to "ask" the MSE for our current location.
Doing this work on the network side was (and still is) the only available option for iOS since, as you've found, the wifi scanning functions are considered to be private APIs.
If you'd like to build your own system and mobile app for doing something similar, you might start with the MSE.
Alternatively, we've built the same tech from Explorer into a new platform called Meridian which provides location-based services on both iOS and Android. Definitely get in touch with us via the website if you're interested in building on that.
Update 6/1/2017
Thought I would update this old answer - AMNH is no longer using the Wifi-based system I describe above, as of a few years ago. They now use an installation of a few hundred battery-powered Bluetooth Beacons (also provided by Meridian). The device (iOS or Android) scans for nearby beacons and, based on their known locations and RSSI values, triangulates a position. You can read more about it in this article.
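For anyone curious what the device-side beacon scanning looks like, here is a minimal sketch using CoreLocation's beacon ranging. The UUID and identifiers are placeholders, not AMNH's or Meridian's actual values, and turning the RSSI/accuracy readings into a position is left to your own math or SDK:

```swift
import CoreLocation

final class BeaconScanner: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder UUID; a real deployment uses whatever UUID its beacons are configured with.
    private let beaconUUID = UUID(uuidString: "F7826DA6-4FA2-4E98-8024-BC5B71E0893E")!

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()   // needs NSLocationWhenInUseUsageDescription in Info.plist
        manager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: beaconUUID))
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // Each CLBeacon reports an RSSI and a rough accuracy in metres; combining
        // several of these with the beacons' known installed positions is what
        // makes a position estimate possible.
        for beacon in beacons {
            print(beacon.major, beacon.minor, beacon.rssi, beacon.accuracy)
        }
    }
}
```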

Navizon offers an indoor positioning solution that works for iOS as well as any other platform. You can check it out here:
http://www.navizon.com/product-navizon-indoor-triangulation-system
It works by triangulating the WiFi signals transmitted by the device. Since it doesn't require an app to run on the phone, it bypasses the iOS limitations and can locate any other WiFi device for that matter.

Google recently launched an API called Maps Geolocation API. You can use it for indoor tracking of devices, which essentially can be used to achieve something similar to what AMNH's app does.
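As a rough illustration, the Geolocation API is a plain HTTPS POST. Note that a non-jailbroken iPhone cannot scan surrounding access points itself, so the WiFi data below is hypothetical and would have to come from elsewhere (e.g. a server or an Android device); the API key is a placeholder:

```swift
import Foundation

// Hypothetical scan results; on iOS you cannot collect these without private APIs.
let body: [String: Any] = [
    "considerIp": false,
    "wifiAccessPoints": [
        ["macAddress": "01:23:45:67:89:AB", "signalStrength": -65],
        ["macAddress": "01:23:45:67:89:AC", "signalStrength": -71]
    ]
]

var request = URLRequest(url: URL(string:
    "https://www.googleapis.com/geolocation/v1/geolocate?key=YOUR_API_KEY")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try! JSONSerialization.data(withJSONObject: body)

URLSession.shared.dataTask(with: request) { data, _, _ in
    guard let data = data,
          let json = try? JSONSerialization.jsonObject(with: data, options: []),
          let dict = json as? [String: Any]
    else { return }
    // Expected shape: {"location": {"lat": ..., "lng": ...}, "accuracy": ...}
    print(dict)
}.resume()
```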

I would do this using augmented reality. There is a system sort of in place for this already: the idea is that you place physical markers that have virtual information associated with them. The system I saw used a type of barcode; when a user holds up the phone with the app, the app uses the camera to read the code and then displays the associated information. This could easily be used to make a virtual-tour app that is distributable through the App Store and doesn't even require a WiFi or 3G/4G connection, assuming you simply load your information and store it locally with the app. To update it, you just push an update through the App Store.
Another option is to serve the information via a SOAP/REST service, which also avoids private APIs, though it does require some form of internet connection. For that, see a question I asked about this topic a little while ago:
SOAP/XML Tutorials Question
In addition, you could load a map of your tour location and, based on which code is scanned, locate the user on the map and suggest routes based on their interests, etc.
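If you go the marker-scanning route, the QR-reading part needs no third-party library on iOS. A rough sketch with AVFoundation (the view-controller wiring is simplified, and the content lookup is a placeholder for your own logic):

```swift
import AVFoundation
import UIKit

final class QRScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Needs NSCameraUsageDescription in Info.plist.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr]

        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        view.layer.addSublayer(preview)
        DispatchQueue.global().async { self.session.startRunning() }
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let value = code.stringValue else { return }
        // Look up the locally stored tour content keyed by this code and display it.
        print("Scanned marker:", value)
    }
}
```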
I found this tutorial on augmented reality recently; I haven't gone through it, but if it's anything like the rest of Ray's tutorials, it will be extremely helpful.
http://www.raywenderlich.com/3997/introduction-to-augmented-reality-on-the-iphone
I'll stick around to clarify any questions or other concerns you may have with your app.

To augment the original answer for devs who are using Cisco MSE for indoor location: there is now an iOS and Android SDK that enables you to do indoor location using the MSE. A simulator can be used as well to develop the app without putting the infrastructure in place first: https://developer.cisco.com/site/cmx-mobility-services/downloads/

For indoor location you can use Bluetooth LE beacons, since it's a very accessible technology nowadays. There are several methods:
Trilateration: uses 3 beacons, but with the noise and attenuation of Bluetooth signals it gets quite difficult to determine the exact position, and it's not easy to use more than 3 beacons to increase accuracy (see the sketch after this list).
Levenberg-Marquardt method: used to solve non-linear least-squares problems; it has shown good results for indoor positioning.
Dead reckoning: using the device's motion co-processor and a given initial position, you can calculate the device's path as it moves. Not that easy to implement, though.
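A minimal 2D trilateration sketch in Swift, assuming you have already converted RSSI readings into rough distance estimates (the beacon coordinates and distances below are made up):

```swift
import Foundation

struct Beacon {
    let x: Double, y: Double   // known installed position (metres)
    let distance: Double       // distance estimated from RSSI (metres)
}

/// Solves 2D trilateration for exactly three beacons by subtracting the circle
/// equations and solving the resulting 2x2 linear system.
/// Returns nil if the beacons are (nearly) collinear.
func trilaterate(_ b1: Beacon, _ b2: Beacon, _ b3: Beacon) -> (x: Double, y: Double)? {
    let a = 2 * (b2.x - b1.x), b = 2 * (b2.y - b1.y)
    let c = 2 * (b3.x - b1.x), d = 2 * (b3.y - b1.y)
    let e = b1.distance * b1.distance - b2.distance * b2.distance
          - b1.x * b1.x + b2.x * b2.x - b1.y * b1.y + b2.y * b2.y
    let f = b1.distance * b1.distance - b3.distance * b3.distance
          - b1.x * b1.x + b3.x * b3.x - b1.y * b1.y + b3.y * b3.y

    let det = a * d - b * c
    guard abs(det) > 1e-9 else { return nil }
    return ((e * d - b * f) / det, (a * f - e * c) / det)
}

// Made-up example: three beacons at known positions, distances estimated from RSSI.
let position = trilaterate(Beacon(x: 0, y: 0, distance: 5.0),
                           Beacon(x: 10, y: 0, distance: 7.1),
                           Beacon(x: 0, y: 10, distance: 7.1))
print(position ?? "beacons are collinear")
```

In practice the three circles rarely intersect in a single point because of signal noise, which is exactly why the least-squares approaches above tend to give better results.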
I wrote a post on the topic, you can find more info here: http://bits.citrusbyte.com/indoor-positioning-with-beacons/
And you can use this iOS app for your own indoor positioning experiments: https://github.com/citrusbyte/beacons-positioning

I doubt the American Museum is actually using private APIs; you'll probably find that the routers that have been set up serve different responses from one another, so the app can detect its position in the museum.
If you are looking for a cheaper way to do the same task, you could put up signs with QR codes and use an open-source library to let users scan these barcodes as they move through the museum, updating the on-screen content accordingly. On an even lower-tech level, you could just tag each area with a unique number and distinguish locations that way.

Related

Best way to build a proximity app within 200 meter range of the user?

I am using Swift to develop a personal app for my family and have been researching which toolset to use to build an app that allows a user to see other people using the same app within a vicinity of roughly 0-200 meters.
I was looking at Bluetooth and also trying to work out how Tinder finds other users. Do they just use GPS? If so, how would one best implement that?
What would be the most effective way to determine another user's location within 200 meters?
Note: one user would search the surrounding area for any devices that are on the app, and then it would tell the user their location. So it is doing both: finding the distance between the two locations and how you should get to that location. Obviously, as this is for personal use, security issues aren't much of a concern.
As you may have expected, there are many different approaches that will all accomplish your goal. I suggest you start by taking a look at a couple of open-source projects:
PeerKit
LocationChat
Both of these libraries demonstrate a way to transfer a payload of data between devices. In addition, both projects provide very helpful example apps.
Assuming you choose to use PeerKit, each device could be responsible for obtaining its own location (via CoreLocation) and then broadcasting it to other devices (via PeerKit). Then the receiving device will be able to calculate the distance between itself and its nearby peers.
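As a rough sketch of the receiving side, assuming the peer's coordinates have already arrived over PeerKit (or whatever transport you choose; decoding the payload is up to you), the distance math itself is just CoreLocation:

```swift
import CoreLocation

/// Returns the distance in metres between our own fix and a peer's broadcast
/// coordinate, or nil if we don't have a fix of our own yet.
func distanceToPeer(own: CLLocation?, peerLatitude: CLLocationDegrees,
                    peerLongitude: CLLocationDegrees) -> CLLocationDistance? {
    guard let own = own else { return nil }
    let peer = CLLocation(latitude: peerLatitude, longitude: peerLongitude)
    return own.distance(from: peer)
}

// Example with made-up coordinates: our own fix plus a decoded peer broadcast.
let me = CLLocation(latitude: 51.5007, longitude: -0.1246)
if let metres = distanceToPeer(own: me, peerLatitude: 51.5014, peerLongitude: -0.1268),
   metres <= 200 {
    print("Peer is nearby: \(Int(metres)) m away")
}
```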
Note: At the time of writing, I have not contributed to either of the projects suggested above.

Bing Maps GeoLocationProvider service not working

I have recently posted a question regarding getting the user location, which I thought I had solved using geolocation.GeoLocationProvider. However, I am seeing strange behaviour on different devices. On an iPhone 5s, I get the most accurate and smallest circle marking my position. On a Galaxy S3 I get very large circles and it takes a long time to connect. I then connected a Nexus to my mobile over Bluetooth and shared 3G internet; funnily enough, my position did not show at all. In all three cases, I tried going into Bing Maps and Google Maps and they all showed my position very accurately. Is there anything I am missing that would explain this discrepancy between my code and Bing/Google Maps?
Thank you all,
Justin
The Bing Maps site uses the same functionality. All this class does is wrap the different web-based geolocation APIs for different browsers into one easy-to-use class. It pulls the location from the browser's built-in geolocation functionality (older browsers had several different ways of doing this). It will make use of a GPS device if it has access (small circle), and fall back to WiFi or IP address (large circles).
A couple of things worth checking: if this is being used in an app, have you enabled access to the geolocation sensors? If it is being used as a web app through the browser, did you get a prompt to allow access to your location, and did you press Allow? If the device isn't able to access GPS and it's using shared WiFi from another phone, I can see how that might confuse things.
Another option is to use the HTML5 geolocation API: http://www.w3schools.com/HTML/html5_geolocation.asp

Detect other iPhones/iPads in the vicinity

I am thinking about a web app to detect the presence of other iPads/iPhones. This is purely theoretical at the moment; I have no idea how to do this.
My question is: what is the best technology/language for doing this?
Is it going to be Bluetooth or GPS? How does the app Bump work?
Thanks for any suggestions posted...
Bump's FAQs: http://bu.mp/faq
According to their FAQs, when your phone has the Bump app up and ready, the app listens to the accelerometer for a sharp stop (your hand, with your phone in it, stopping when it hits the other person's hand with their phone in it). At that point, exact date/time information as well as GPS position and the characteristics of the bump are sent to Bump's servers, which compare them against information from other Bump users to see which other Bump account shares most of that information. Bluetooth is not used in any capacity to make this happen, neither for the transfer nor for recognizing whom to transfer to.
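For what it's worth, the accelerometer part of that detection is straightforward with CoreMotion. A minimal sketch (the 1.5 g threshold is an arbitrary guess, not Bump's actual heuristic):

```swift
import Foundation
import CoreMotion

// Detects a "sharp stop" as a large jump in acceleration magnitude between samples.
final class BumpDetector {
    private let motion = CMMotionManager()
    private var lastMagnitude: Double?

    func start(onBump: @escaping () -> Void) {
        motion.accelerometerUpdateInterval = 1.0 / 50.0   // sample at 50 Hz
        motion.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            if let last = self.lastMagnitude, abs(magnitude - last) > 1.5 {
                // A sudden change this large is treated as a bump; the real app
                // would now send a timestamp + GPS fix to the matching server.
                onBump()
            }
            self.lastMagnitude = magnitude
        }
    }
}
```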
Your website may have to do the same thing. Have every instance report to your server, and then report back from the server where other people are.
Of course, it goes without saying that I'm sure you were already thinking about privacy settings and other layers of personal security.
Options I'd explore:
Bonjour discovery. In principle, devices that can see each other via Bonjour could actually be on different continents, but usually it means 'same wireless network' and therefore at least 'same building' (see the sketch below).
GameKit. Actually, this uses either Bluetooth or the local network, so it will probably give similar results to Bonjour discovery, but with less code.
I wouldn't try location services like Core Location (actually I would, but only if the above don't work), as the results probably aren't going to be fine enough. Especially indoors: the Maps app on my phone places me in a circle of about 50 metres radius with my actual location being on the outer edge of said circle. Someone on the next street with similar resolution could, as far as the app is concerned, be adjacent to me.
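If you go the Bonjour route, on modern iOS the discovery side can be done with the Network framework. A minimal sketch, where the service type "_mytour._tcp" is a made-up name that each app would register and advertise for itself (via NWListener or NetService on the other devices):

```swift
import Network

// Browse the local network for peers advertising our (made-up) Bonjour service.
// iOS 14+ will prompt for local-network permission; the Info.plist needs
// NSLocalNetworkUsageDescription and an NSBonjourServices entry for this type.
let browser = NWBrowser(for: .bonjour(type: "_mytour._tcp", domain: nil),
                        using: NWParameters())

browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        // Each result is another device on the same network advertising the
        // service, i.e. very likely in the same building.
        print("Found peer:", result.endpoint)
    }
}
browser.start(queue: .main)
```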

which features do you look forward to the most in iPhone SDK 3?

Which of the new features are you looking forward to the most in iPhone SDK 3.0?
Is it one of the main advertised six new things, or something smaller? Something in the "1,000 new APIs", perhaps?
Phone-to-phone communication via Bluetooth seems like it will be terribly useful for some apps I am writing. No longer do you have to input all the data you want to store yourself; you can share some of it with other iPhone users.
Not really a feature, but the best thing about developing the iPhone SDK further is the great frameworks that arise. There are some really, really great frameworks out there already (like the Three20 project) which will become even better with the new 3.0 SDK.
My real excitement will take over once they let us run background processes. Maybe in 4.0?
Video! The ability to write decent tools for mobile video uploads is a big draw.
MapKit by far will bring the biggest change sweeping across the app space.
My personal favorite is that we can finally easily track upload progress of large files (like images).
I really, really want to see fixes in the camera API so that it isn't either broken (2.2.1) or forcing a switch to portrait (3.0).
Apart from that, the most useful features to me are:
push notifications. Great for making an app more sticky - you can let the user know that something of interest to them has happened.
CoreData - I've been using a third-party SQL layer, but it's a little buggy and no longer supported.
Peer-to-peer Bluetooth, as the poster above said, is also useful for local data exchange.
And the least useful? Cut and paste. I actually want to disable it in my app (to discourage people from copying content) - and it doesn't look as though you can (yet).
Bluetooth phone-to-phone communication with GameKit will enable a host of currently impossible applications. Multiplayer games with no WiFi network needed and data exchange between two phones are obvious use-cases.
I'd also like to see - not currently included in the betas - a decent camera API that allowed us to customize the appearance of the capture screen, and as another poster said, have it work properly in landscape and portrait mode.

What is the iPhone SDK Missing?

I've been doing mobile app development for a long time (since 2001?), but the systems we worked with back then were dedicated mobile development environments (Symbian, J2ME, BREW). The iPhone SDK is a curious hybrid of Mac OS X and Apple's take on mobile (Cocoa Touch).
But it is missing some stuff that other mobile systems have, IMO. Specifically:
Application background processing
SMS/MMS application routing (send an SMS to my application in the background)
API for accessing phone functions/call history/call interception
I realize that Apple has perfectly valid reasons for releasing the SDK the way they did. I am curious what people on SO think the SDK is missing and how would they go about fixing/adding it, were they an Engineering Product Manager at Apple.
The biggest shortcoming in my opinion is support for separating licensing from distribution.
What I mean by this is that it should be possible to download a trial version of an application and later purchase a license for that application (from an API call inside the application or from the app store). This would make it much easier to try-before-you-buy and get rid of the current duplicates of many applications with 'lite' versions.
I think lack of push notifications for apps is the big thing we're missing right now. With push, you can register your application to perform a task (like getting the most recent data from a web service) even when it's not running, at a time and frequency the OS decides is best. In an ideal world, along with the existing concept of iPhone apps loading quickly and resuming where you last left off, this solves the problem of not running in the background. I know some tasks will be more difficult or maybe impossible with this strategy, but it's still a pretty good compromise between third party applications and the iPhone's limited hardware.
Originally push was scheduled for last September, but it was removed from the beta SDK and not spoken of since then.
API's I'm personally looking for:
Apple80211 as a public API (private, current API is fine if documented)
Access to Volume buttons (semi-accessible via Celestial, private, needs new API)
Access to Calendar (private, API status unknown)
Access to Bluetooth + SPP profile (status unknown)
Access to Camera (directly, API status unknown)
Access to JavaScript runtime (directly, not through UIWebView, API status unknown)
WebKit access that's lower-level than UIWebView (private, current API is fine)
Access to Music Library (private, current API is fine)
Garbage Collection.
CoreData is missing.
You've mentioned some of the big ones - copy & paste (or in fact any way for apps to collaborate) is another huge omission.
It also seems to lack a desktop synch framework (at least if it exists I can't find it).
Language independence, and especially the lack of scripting, is another pet peeve; Objective-C is all very well, but more languages to choose from would be good.
Inability to dynamically extend apps, via scripts or otherwise, is another big omission. This is partly an SDK/OS issue, partly licensing.
My list ordered by priority:
Mapping abstraction (the MapKit looks awesome), but that would require a new Google Maps TOS
Music library
Camera (photo + video)
Access to more UIViews; Apple designed some pretty nice custom ones for their apps
Better UIWebKit abstraction
The features I see missing that it should have are:
Access to SMS
Direct access to the Google Maps app. You should be able to access this so you could extend your application to use the built-in features provided by Google Maps.
Access to the Bluetooth functionality of the phone.
Access to the Calendar. Why not allow access to simply post a calendar event for the user?
Access to ActiveSync. It would be great if we could access this directly and communicate back to the Exchange Server.
Core Image. They provide Core Animation but Core Image is missing. I hope that this is added to the API soon.
These are some of the features that my clients have had access to in the past and are surprised when they are not available.
We definitely miss a Calendar API and SMS access. So many applications could leverage such APIs. The iPhone allows users to have everything in their pocket, but it's almost useless as long as developers cannot leverage this integration in their apps.
A language with proper namespaces.
A limitation that bugs me is lack of access to system features that require root or setuid. For example: opening privileged IP ports.
I'm not sure there is a good solution to this, as long as Apple's policy is to keep the device locked-down.
Allow programs to set some kind of local timed event for your application to bring up an alert and launch your app if the user agrees (like any calendar app). You could do that with push notifications, but in many cases I'd hate to have to rely on a whole server infrastructure and network connectivity just to do some basic timed thing.
Some idea of what direction the user is facing. I cannot believe the GPS chip the newer iPhones use is not capable of reporting direction.
I would personally love to see
Access to the CoreTelephony framework (currently private), which allows access to all the phone functions (especially sending MMS/SMS).
Some sort of ability to run stuff in the background. Push notifications are OK for most things, but it is a bit hard to leverage CoreLocation that way (i.e. have the app show a notification at a certain location). Of course, this would probably need an on/off switch, or be enabled per-app like push is.
Animation views that would make it easier for developers to build a cool app. Of course, the core business logic still needs careful consideration, but the view layer could be easier to use.