Google Maps and Siri implementation - iPhone

Just wondering, can we integrate Google Maps and Siri together? For example:
I ask Siri, "show nearest Starbucks", and Siri will open the Maps app or Google Maps and show the nearest Starbucks on the map.
Or
I ask Siri, "show me all Apple Stores". Siri will open the map and show all the locations of Apple Stores on the map.
Is this doable?
I haven't found any good tutorial or documentation to study Siri integration in more depth, only news articles. Is there no technical documentation/API available?

This isn't directly supported right now, short of writing a third-party website that Siri can hook into. According to the documentation linked below, from Apple's website, Siri on iOS 6 will support this functionality in at least limited part:
http://www.apple.com/ios/ios6/siri/
Eyes Free
Apple is working with car manufacturers to integrate Siri into select voice control systems. Through the voice command button on your steering wheel, you’ll be able to ask Siri questions without taking your eyes off the road. To minimize distractions even more, your iOS device’s screen won’t light up. With the Eyes Free feature, ask Siri to call people, select and play music, hear and compose text messages, use Maps and get directions, read your notifications, find calendar information, add reminders, and more. It’s just another way Siri helps you get things done, even when you’re behind the wheel.
This encourages me to believe that they will also expose the API to normal API consumers (because someone will ferret it out if it exists) during the iOS 6 lifecycle, probably before iOS 6.1 or with that release.

It is not currently possible through any API that Apple provides. However, there are some third-party APIs that you could use, such as: https://www.ispeech.org/developers/iphone You'd have to use that and then pass the returned data on to the Google Maps API.
Although this approach won't be as intuitive as using Siri, since that is not currently possible, it's the best bet you have for the time being.
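To make that concrete, here is a minimal, hypothetical sketch of the hand-off step. It assumes query is a string you already got back from whatever speech-recognition API you use, and simply opens the built-in Maps app with a search for it:

    // Hand a recognized phrase off to the Maps app as a search.
    // `query` is assumed to come from your speech-recognition callback.
    - (void)showPlacesMatching:(NSString *)query {
        NSString *escaped = [query stringByAddingPercentEscapesUsingEncoding:
                                       NSUTF8StringEncoding];
        NSURL *mapsURL = [NSURL URLWithString:
            [NSString stringWithFormat:@"http://maps.google.com/maps?q=%@", escaped]];
        [[UIApplication sharedApplication] openURL:mapsURL];
    }

On iOS 5 and earlier, a maps.google.com URL like this launches the built-in (Google-backed) Maps app, which will drop pins for nearby matches to a query like "Starbucks".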

Unfortunately, Apple has not yet opened Siri's API to developers, which makes this task impossible. However, Apple will probably open it up soon. If you just want to use it for personal use, check out SiriProxy (https://github.com/plamoni/SiriProxy). SiriProxy lets you do exactly what you asked; however, for it to work you must be on your own wifi network, so it cannot be built into one of your apps. Good luck!

Related

Getting routes and turn-by-turn navigation in an iPhone app

I'm developing an app that will focus heavily on giving users routes and turn-by-turn directions while driving. It is important that they stay within the app during their drive, so I really don't want to make them leave the app and go to the built-in Maps app. I have been doing a lot of research lately on how to include this functionality, and it is widely known that it is not easy, since Apple doesn't include this functionality in the SDK out of the box. It looks like my options are:
For providing routes (and directions) from one place to another
Use a UIWebView and load some web-based maps with JavaScript, and use the JavaScript API to draw a route
Use MapKit or the Google Maps iOS SDK, query for the route separately through an API, and manually draw some kind of path or polyline on top of it (a minimal sketch of this follows the question).
Use a library that costs money (like MTDirectionsKit)
For providing turn-by-turn navigation
Use a proprietary library that comes with its own maps (like CloudMade)
Is there anything I'm missing here? What are the pros and cons of each, and how should I pick a solution?
Your insight is greatly appreciated. Thanks!
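For reference, option 2 above boils down to very little code once you have the route geometry. A minimal MapKit sketch, assuming coords is an array of coordinates you already fetched and decoded from some directions API:

    // Draw a pre-fetched route as a polyline overlay on an MKMapView.
    - (void)drawRoute:(CLLocationCoordinate2D *)coords
                count:(NSUInteger)count
            onMapView:(MKMapView *)mapView {
        MKPolyline *line = [MKPolyline polylineWithCoordinates:coords count:count];
        [mapView addOverlay:line];
    }

    // MKMapViewDelegate method that supplies a view for the overlay.
    - (MKOverlayView *)mapView:(MKMapView *)mapView
                viewForOverlay:(id<MKOverlay>)overlay {
        MKPolylineView *view = [[MKPolylineView alloc]
                                   initWithPolyline:(MKPolyline *)overlay];
        view.strokeColor = [UIColor blueColor];
        view.lineWidth = 4.0;
        return view;
    }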
You say that you don't want to "make" the user leave the app and use the built-in software for turn-by-turn navigation, as though this is a bad user experience. A bad user experience would be being forced to use turn-by-turn navigation from a webView or some API that would not show routing information on my lock screen when I lock the phone, or cut into other applications to display banners telling me a turn is coming up while, say, I'm listening to music. I strongly suggest you launch Apple Maps and let the system handle what it is meant to handle, instead of trying to build your own turn-by-turn navigation and heavily limiting the user's freedom to exit the application during a drive.
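If you take this advice, the hand-off to the built-in app is only a few lines on iOS 6 (on iOS 5 and earlier you would open a maps URL instead). destination and name here stand in for whatever your app already knows about the target:

    // Launch the built-in Maps app in turn-by-turn driving mode (iOS 6+).
    - (void)openDirectionsTo:(CLLocationCoordinate2D)destination
                        name:(NSString *)name {
        MKPlacemark *placemark = [[MKPlacemark alloc] initWithCoordinate:destination
                                                       addressDictionary:nil];
        MKMapItem *item = [[MKMapItem alloc] initWithPlacemark:placemark];
        item.name = name;
        [MKMapItem openMapsWithItems:@[item]
                       launchOptions:@{MKLaunchOptionsDirectionsModeKey:
                                           MKLaunchOptionsDirectionsModeDriving}];
    }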
While on the subject of commercial libraries, you might want to take a look at other options:
Skobbler/Telenav SDK - in-app routing and turn-by-turn navigation based on OpenStreetMap (the same maps used by CloudMade and MapQuest). Check out the demos and the licensing plans and figure out if it's the correct solution for you
(they have a free tier that might be good enough for your app - and if you go above that tier, I think you should be considering monetizing your app)
And that's about it for the time being (besides the options already named). Keep an eye out for Mapbox, as they could provide an iOS SDK with routing and navigation in the near future.
I agree with Kris' response. Turn-by-turn navigation is probably best handled by the built-in app. However, for displaying a route and ETA on a 2D map, I think I'm going to go with the Mapquest iOS API, which offers this functionality for free. I didn't know this solution was available until today.

iPhone indoor location based app

I am researching how to create an app for my work that allows clients to download the app (preferably via the App Store) and, using some sort of wifi triangulation/fingerprinting, determine their location for what is essentially an interactive tour.
Now, my question specifically is: what is the best route to take for the iPhone? None of the clients will be expected to have jailbroken iPhones.
To my understanding, this requires the use of wifi data, which is a private API and therefore doesn't meet the App Store requirements. The biggest question I have is: how does the American Museum of Natural History get away with using the same technology while still being available on the App Store?
If you're unfamiliar with the American Museum of Natural History's interactive tour app, see here:
http://itunes.apple.com/us/app/amnh-explorer/id381227123?mt=8
Thank you for any clarification you can provide.
I'm one of the developers of the AMNH Explorer app you're referencing.
Explorer uses the Cisco "Mobility Services Engine" (MSE) behind the scenes to determine its location. This is part of their Cisco wifi installation. The network itself listens for devices in the museum and estimates their position via Wifi triangulation. We do a bit of work in the app to "ask" the MSE for our current location.
Doing this work on the network side was (and still is) the only available option for iOS since, as you've found, the wifi scanning functions are considered to be private APIs.
If you'd like to build your own system and mobile app for doing something similar, you might start with the MSE.
Alternatively, we've built the same tech from Explorer into a new platform called Meridian which provides location-based services on both iOS and Android. Definitely get in touch with us via the website if you're interested in building on that.
Update 6/1/2017
Thought I would update this old answer - AMNH is no longer using the Wifi-based system I describe above, as of a few years ago. They now use an installation of a few hundred battery-powered Bluetooth Beacons (also provided by Meridian). The device (iOS or Android) scans for nearby beacons and, based on their known locations and RSSI values, triangulates a position. You can read more about it in this article.
Navizon offers an indoor positioning solution that works for iOS as well as any other platform. You can check it out here:
http://www.navizon.com/product-navizon-indoor-triangulation-system
It works by triangulating the WiFi signals transmitted by the device. Since it doesn't require an app to run on the phone, it bypasses the iOS limitations and can locate any other WiFi device for that matter.
Google recently launched the Maps Geolocation API. You can use it for indoor tracking of devices, which essentially can be used to achieve something similar to what AMNH's app does.
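The request side of that API is just an HTTPS POST of the access points you can see. A rough sketch (the MAC addresses and key are made up; note that an App Store app cannot itself scan for access points, so on iOS the list would have to come from the infrastructure side):

    // Ask Google's Geolocation API for a position fix from wifi data.
    NSDictionary *body = @{@"wifiAccessPoints": @[
        @{@"macAddress": @"00:25:9c:cf:1c:ac", @"signalStrength": @(-65)},
        @{@"macAddress": @"00:25:9c:cf:1c:ad", @"signalStrength": @(-71)}]};
    NSURL *url = [NSURL URLWithString:
        @"https://www.googleapis.com/geolocation/v1/geolocate?key=YOUR_API_KEY"];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
    request.HTTPMethod = @"POST";
    [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
    request.HTTPBody = [NSJSONSerialization dataWithJSONObject:body
                                                       options:0 error:NULL];
    [[[NSURLSession sharedSession] dataTaskWithRequest:request
        completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            // Response body: {"location": {"lat": ..., "lng": ...}, "accuracy": ...}
    }] resume];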
I would do this using augmented reality. There is a system sort of in place for this: the idea is that you place physical markers that have virtual information associated with them. I believe the system I saw used a type of bar code. When a user holds up the phone with the app, the app uses the camera to read the code and then displays the associated information. This could easily be used to make a virtual-tour app that is distributable through the App Store and doesn't even require a wifi or 3G/4G connection. That assumes you simply load your information and store it locally with your app; to update it, you just push an update through the App Store. Another solution is to use a SOAP/REST service and provide the information that way. This doesn't use private APIs either, though it does require some form of internet connection. For that, you can see a question I asked about this topic a little while ago:
SOAP/XML Tutorials Question
In addition, you could load a map of your tour location and, based on which code is scanned, locate the user on the map and give suggested routes based on their interests, etc.
I recently found this tutorial on augmented reality. I haven't gone through it, but if it's anything like the rest of Ray's tutorials, it will be extremely helpful.
http://www.raywenderlich.com/3997/introduction-to-augmented-reality-on-the-iphone
I'll stick around to clarify any questions or other concerns you may have with your app.
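If you go the marker-scanning route, note that iOS 7 and later can decode QR codes natively through AVFoundation, with no third-party library. A minimal sketch (your class would conform to AVCaptureMetadataOutputObjectsDelegate, and captureSession is a hypothetical property used to keep the session alive):

    // Start scanning for QR codes with the camera (iOS 7+).
    - (void)startScanning {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
        [session addInput:input];

        AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
        [session addOutput:output]; // add the output before setting its types
        output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];
        [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

        [session startRunning];
        self.captureSession = session; // hypothetical property
    }

    // Called whenever a code is recognized; look it up in your local tour data.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputMetadataObjects:(NSArray *)metadataObjects
           fromConnection:(AVCaptureConnection *)connection {
        AVMetadataMachineReadableCodeObject *code = [metadataObjects firstObject];
        NSLog(@"Scanned marker: %@", code.stringValue);
    }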
To augment the original answer for devs who were using Cisco MSE for indoor location: they now have an iOS and Android SDK which enables you to do indoor location using the MSE. A simulator can be used as well to develop the app without implementing the infrastructure to start with: https://developer.cisco.com/site/cmx-mobility-services/downloads/
For indoor location you can use Bluetooth LE beacons since it's a very accessible technology nowadays, there are several methods:
Trilateration: it uses 3 beacons, but with the noise and attenuation of Bluetooth signals it gets quite difficult to determine the exact position, and it's not easy to use more than 3 beacons to increase accuracy.
Levenberg-Marquardt method: used to solve non-linear least-squares problems; it has shown good results for indoor positioning.
Dead reckoning: using the motion co-processor of the device and a given initial position, you can calculate the device's path as it moves. Not that easy to implement, though.
I wrote a post on the topic, you can find more info here: http://bits.citrusbyte.com/indoor-positioning-with-beacons/
And you can use this iOS app for your own indoor positioning experiments: https://github.com/citrusbyte/beacons-positioning
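For the device side of any of these methods, CoreLocation's beacon ranging (iOS 7+) hands you per-beacon RSSI and a rough accuracy estimate that you can feed into a trilateration or least-squares step. A minimal sketch, with a made-up UUID and identifier (on iOS 8+ you also need to request location authorization first):

    // Range beacons belonging to your deployment's proximity UUID.
    - (void)startRangingBeacons {
        NSUUID *uuid = [[NSUUID alloc]
            initWithUUIDString:@"E2C56DB5-DFFB-48D2-B060-D0F5A71096E0"];
        CLBeaconRegion *region =
            [[CLBeaconRegion alloc] initWithProximityUUID:uuid
                                               identifier:@"my-indoor-tour"];
        self.locationManager = [[CLLocationManager alloc] init];
        self.locationManager.delegate = self;
        [self.locationManager startRangingBeaconsInRegion:region];
    }

    // Delegate callback: beacons arrive sorted by proximity.
    - (void)locationManager:(CLLocationManager *)manager
            didRangeBeacons:(NSArray *)beacons
                   inRegion:(CLBeaconRegion *)region {
        for (CLBeacon *beacon in beacons) {
            NSLog(@"major=%@ minor=%@ rssi=%ld accuracy=%.2fm",
                  beacon.major, beacon.minor,
                  (long)beacon.rssi, beacon.accuracy);
        }
    }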
I doubt the American Museum is actually using private APIs; you'll probably find that the routers that have been set up serve different responses, so the app can detect its position in the museum.
If you are looking for a cheaper way to do the same task, you could put up signs with QR codes and use an open-source library to let users scan these barcodes as they move through the museum, updating the on-screen content accordingly. On an even more low-tech level, you could just tag each area with a unique number and distinguish locations that way.

What does iPhone OS 3.0 need from a programming perspective?

iPhone OS 3.0 is being announced and previewed next week (March 17).
We all know the feature set users want. Copy/paste, MMS, Flash on iPhone, etc.
We'll see about those.
What I'm interested in is what the development community feels the SDK is missing or in need of, to make programming for the platform easier and more productive.
A more complete Interface Builder with support for custom palettes and all sorts of goodies like that.
Better control over the keyboard.
Better unit testing support. (Unit testing can be done, but only on the simulator, and it's very awkward to set up.)
Push notifications. Please.
A more accurate simulator, i.e. one with a more accurate set of frameworks.
The ability to easily build views like the Mail compose window.
For that matter, an in-application compose window.
A better way for apps to share data locally than by invoking URLs.
Access to the calendar, notes, mail (possibly read-only), and bookmarks (again, read-only) databases. Maybe even limited access to the iPod database—even just the ability to read song metadata and access and change the playing song would be helpful.
Some sort of middle ground between UILabel and UIWebView that allows for formatted text without a huge hassle.
More built-in toolbar icons.
The return of the "glass" button style that was in the beta SDK.
A few useful internal views, like UIProgressHUD, exposed.
And last but not least...
A pony.
An easy JavaScript bookmarklet installation method for Mobile Safari. (OpenRadar: 1, 2)
UIWebView needs more of UIScrollView's properties and methods, such as contentOffset.
More configurability on some of the built-in behaviors and views, e.g. the button text on UITableViewCell's "Delete" button, or the styles and text of UIAlertSheet/UIAlertView buttons. (Some of these can be done today with undocumented calls, but I'd rather not rely on those.)
More flexibility from UINavigationController, such as the ability to push/pop views that selectively don't display the navigation bar but using the same animations and stack, or more customizability over the navigation bar button labels and behaviors.
The ability to restrict interface orientation per UIViewController, not just accept/reject changes via shouldAutorotate. E.g. I want my main content view to be autorotatable, but I want my navigation hierarchy and settings screens to always display in portrait, even if the content view was rotated to landscape.
libxml and its handy DOM XML parser instead of the SAX-based NSXMLParser.
libcurl w/SSL, or more options and functionality for NSURLConnection.
Ability to check whether a URL scheme is registered. This could be used for apps to detect whether other specific apps are installed, and enable functionality selectively, e.g. when Instapaper detects Tweetie is installed, it can offer a "Post with Tweetie" button (a sketch of this check appears just below). (Disclaimer: That was a plug. I make Instapaper.)
I'm sure I'll think of more, but overall, I'm very happy developing for the iPhone. I'm amazed at the quality and sophistication of the iPhone OS, the SDK, and the development tools given how incredibly young they all are.
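As it happens, the URL-scheme check in that list maps onto a documented call, -[UIApplication canOpenURL:] (listed as available from iPhone OS 3.0). A sketch, using a hypothetical scheme for the Tweetie example above:

    // Probe for another app by its custom URL scheme.
    NSURL *probe = [NSURL URLWithString:@"tweetie://"]; // hypothetical scheme
    if ([[UIApplication sharedApplication] canOpenURL:probe]) {
        // The app appears to be installed; show the "Post with Tweetie" button.
    }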
I'm surprised no one has mentioned garbage collection yet. Objective-C 2.0 on the Mac supports optional garbage collection. I don't really see any reason it wouldn't work just fine on the iPhone as well and it would eliminate much of the tedium of having to explicitly release objects all over the place.
What I'm hoping most for is to allow iPhones to talk to each other either via Bluetooth or some other means. Granted, they can talk via Bonjour if they are on the same Wi-Fi network, but that's just not convenient enough in 2009. If I'm out with a friend and want to play a multi-player game we first have to find a Starbucks or whatever the heck to get on the same Wi-Fi network. Also, think of the ridiculous amount of social apps you could have if iPhones could talk to each other without needing Wi-Fi. Exchange business cards, flirt with the cute girl over there, etc.
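For what it's worth, iPhone OS 3.0 did go on to add peer-to-peer Bluetooth through GameKit. A minimal sketch of the sending side (the session ID and payload are made up, and a real app would also implement the GKSessionDelegate methods and a data-receive handler):

    // Advertise a peer-to-peer GameKit session over Bluetooth (OS 3.0+).
    GKSession *session =
        [[GKSession alloc] initWithSessionID:@"com.example.cardswap"
                                 displayName:nil
                                 sessionMode:GKSessionModePeer];
    session.delegate = self;
    session.available = YES;

    // Send a small payload (say, a business card) to all connected peers.
    NSData *card = [@"Jane Doe, jane@example.com"
                       dataUsingEncoding:NSUTF8StringEncoding];
    [session sendDataToAllPeers:card withDataMode:GKSendDataReliable error:NULL];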
From a PURE programmer's perspective: make Xcode as helpful an IDE as Eclipse or IntelliJ are in the Java world. There's so much time I waste on stupid stuff that the IDE could have caught for me as I typed it.
I also don't understand why I can't color buttons without having to use images.
Better multitasking is absolutely key at this point. Android's got it, Palm's WebOS has it - both, it seems, in largely unrestricted and well-implemented fashion. Possibilities:
Push notifications with a good UI (message stack in addition to badging/sound/whatever - if they have to have an extra approval step so apps can't be obnoxious, so be it)
Multiple full processes (not possible with current OS, I realize, but then I've never seen a good explanation why the iPhone doesn't support virtual memory)
Smaller "background" versions of apps that can run in the background - no GUI and a significantly tighter memory constraint
A good mapping API. Let us access the Google Maps abstraction that the Maps application uses!
More Interface Builder goodness
Better simulator
Smart inbox. Incoming messages are routed to installed handlers based on type.
Synchronisation framework that simplifies syncing with desktop & Mobile Me.
Decent landscape support, without the multitude of bugs, especially for the camera picker. Better support for rotation and more control of it.
Access to EXIF data on images from the picker, so we can tell their location
Deeper access to the camera API, so that we are not railroaded into the standard photo taker/picker
Push notifications that can launch an application. (In lieu of full multi-tasking, which I don't think we'll get and which could be problematic.)
Better, more intuitive keyboard controls.
API for inter-application messaging.
Access to data from Calendar, iTunes, Mail, Notes and more (with user's permission)
A more accurate simulator, with, for example, ways to limit bandwidth, and use the Mac's camera to actually take a photo.
Phone-phone bluetooth for data exchange
Access to more of the views used by iPhone apps, e.g. the progress HUD, email "blobbing" mechanism for email addresses, thumbnail scrollers, HUD brought up in Photos app, and more.
Less sandboxing. It won't likely happen, but it would always be appreciated for apps to have slightly more power than they currently do (actual filesystem access, for example; even if it were read-only, it would still allow more interesting applications to exist).
EDIT: Also, access to the copy/paste API. But I hope that one is obvious to Apple.
My list:
More full-featured IB support as the Mac has
Inter-app Data transfer mechanism (could be C&P, but does not have to be)
Greatly improved camera API with deeper level of control and more flexibility
SDK access to bluetooth and more support for protocols
Real Objective-C framework around the address book, like the Mac has today.
Warnings similar to the location warning when an app tries to access address book data.
I'm sure whatever they actually have prepared, there will be a few interesting twists.
Ability to send SMS messages without having to launch the SMS client and have the user type the message.
Access to the raw camera data so that things can be done without having to take a picture and wait for it to save (like you can do with Android)
Push notifications so that you can launch tasks... this would need to be user-controllable.
A camera that can focus (I know... have to wait for the next iPhone for that... if they decide to put it in...)
A UIKit level drawing api.
We all know the feature set people want. Copy/Paste, MMS, Flash on iPhone, etc.
I would have thought those specific items were down the SO wish list (although it seems I'm wrong looking at the votes on this comment :-).
MMS is a pretty pointless app when you have email. Flash is not an OS issue - Flash could be delivered today.
I don't even want push notifications - they're just a patch; I want background apps. I also want fixes for all the broken APIs like camera, video and landscape support. Support for CoreImage filters would be nice too, but that's probably too much to wish for.
[[ABAddressBook sharedAddressBook] me] for being able to use the owner's Zip code, phone number, or whatever.
Ability to download files to local storage and sync them back to iTunes or your hard drive
Get EXIF data from photos
Pull all photos at once
Pull all contacts at once
Control screen brightness
Access to music in iPod section
Read access to email and text messages
Access to Safari cookies (so maybe I could make some kind of keep-me-logged-in app.)
Fix table view in landscape mode
New camera API with direct access to the camera
Distribution code signing automatically when uploading to the App Store (instead of code signing in Xcode)
Ability to request more memory so users don't have to reboot their phones to get rid of background apps
A non-Mac-based development environment.

What is the iPhone SDK Missing?

I've been doing mobile app development for a long time (2001?), but the systems we worked with back then were dedicated mobile development environments (Symbian, J2ME, BREW). The iPhone SDK is a curious hybrid of Mac OS X and Apple's take on mobile (Cocoa Touch).
But it is missing some stuff that other mobile systems have, IMO. Specifically:
Application background processing
SMS/MMS application routing (send an SMS to my application in the background)
API for accessing phone functions/call history/call interception
I realize that Apple has perfectly valid reasons for releasing the SDK the way they did. I am curious what people on SO think the SDK is missing and how would they go about fixing/adding it, were they an Engineering Product Manager at Apple.
The biggest shortcoming in my opinion is support for separating licensing from distribution.
What I mean by this is that it should be possible to download a trial version of an application and later purchase a license for that application (from an API call inside the application or from the app store). This would make it much easier to try-before-you-buy and get rid of the current duplicates of many applications with 'lite' versions.
I think lack of push notifications for apps is the big thing we're missing right now. With push, you can register your application to perform a task (like getting the most recent data from a web service) even when it's not running, at a time and frequency the OS decides is best. In an ideal world, along with the existing concept of iPhone apps loading quickly and resuming where you last left off, this solves the problem of not running in the background. I know some tasks will be more difficult or maybe impossible with this strategy, but it's still a pretty good compromise between third party applications and the iPhone's limited hardware.
Originally push was scheduled for last September, but it was removed from the beta SDK and not spoken of since then.
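When push did ship in iPhone OS 3.0, the opt-in ended up being a single call, plus an app delegate callback that hands you the device token to forward to your server:

    // Register for push notifications (iPhone OS 3.0 form).
    [[UIApplication sharedApplication] registerForRemoteNotificationTypes:
        (UIRemoteNotificationTypeAlert | UIRemoteNotificationTypeBadge |
         UIRemoteNotificationTypeSound)];

    // App delegate callback with the token your server needs for APNs.
    - (void)application:(UIApplication *)application
    didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken {
        NSLog(@"APNs device token: %@", deviceToken);
    }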
API's I'm personally looking for:
Apple80211 as a public API (private, current API is fine if documented)
Access to Volume buttons (semi-accessible via Celestial, private, needs new API)
Access to Calendar (private, API status unknown)
Access to Bluetooth + SPP profile (status unknown)
Access to Camera (directly, API status unknown)
Access to JavaScript runtime (directly, not through UIWebView, API status unknown)
WebKit access that's lower-level than UIWebView (private, current API is fine)
Access to Music Library (private, current API is fine)
Garbage Collection.
CoreData is missing.
You've mentioned some of the big ones - copy & paste (or in fact any way for apps to collaborate) is another huge omission.
It also seems to lack a desktop synch framework (at least if it exists I can't find it).
Language independence, and especially the lack of scripting, is another pet peeve - Objective-C is all very well, but more languages to choose from would be good.
Inability to dynamically extend apps, via scripts or otherwise, is another big omission. This is partly an SDK/OS issue, partly licensing.
My list ordered by priority:
Mapping abstraction (the MapKit looks awesome), but that would require a new Google Maps TOS
Music library
Camera (photo + video)
Access to more UIViews; Apple designed some pretty nice custom ones for their apps
Better UIWebKit abstraction
The features I see missing that it should have are:
Access to SMS
Direct access to the Google Maps app. You should be able to access this so you could extend your application to use the built-in features provided by Google Maps.
Access to the Bluetooth functionality of the phone.
Access to the Calendar. Why not allow access to simply post a calendar event for the user?
Access to ActiveSync. It would be great if we could directly access this and communicate back to the Exchange server.
Core Image. They provide Core Animation but Core Image is missing. I hope that this is added to the API soon.
These are some of the features that my clients have had access to in the past and are surprised to find are not available.
We definitely miss a Calendar API and SMS access. So many applications could leverage such APIs. The iPhone allows users to have everything in their pocket, but it's almost useless as long as developers cannot leverage this integration in their apps.
A language with proper namespaces.
A limitation that bugs me is lack of access to system features that require root or setuid. For example: opening privileged IP ports.
I'm not sure there is a good solution to this, as long as Apple's policy is to keep the device locked-down.
Allow programs to set some kind of local timed event that brings up an alert and launches your app if the user agrees (like any calendar app). You could do that with push notifications, but in many cases I'd hate to have to rely on a whole server infrastructure and network connectivity just to do some timed thing.
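For reference, iOS 4 later added exactly this as UILocalNotification; a minimal sketch of the kind of timed alert being asked for:

    // Schedule a local timed alert; no server or network needed (iOS 4+).
    UILocalNotification *note = [[UILocalNotification alloc] init];
    note.fireDate = [NSDate dateWithTimeIntervalSinceNow:60 * 60]; // in one hour
    note.alertBody = @"Meeting starts in 15 minutes";
    note.alertAction = @"View"; // the button that launches your app
    note.soundName = UILocalNotificationDefaultSoundName;
    [[UIApplication sharedApplication] scheduleLocalNotification:note];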
Some idea of what direction the user is facing. I cannot believe the GPS chip the newer iPhones use is not capable of reporting direction.
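Heading actually comes from a magnetometer rather than the GPS chip, and hardware that has one (iPhone 3GS onward) exposes it through CoreLocation. A sketch (the class-method availability check shown is the iOS 4 form):

    // Report which way the device is facing (requires magnetometer hardware).
    if ([CLLocationManager headingAvailable]) {
        self.locationManager = [[CLLocationManager alloc] init];
        self.locationManager.delegate = self;
        [self.locationManager startUpdatingHeading];
    }

    // Delegate callback with the current heading.
    - (void)locationManager:(CLLocationManager *)manager
           didUpdateHeading:(CLHeading *)newHeading {
        NSLog(@"Facing %.0f degrees magnetic", newHeading.magneticHeading);
    }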
I would personally love to see
Access to the CoreTelephony framework (currently private), which allows access to all the phone functions (especially sending MMS/SMS).
Some sort of ability to run stuff in the background. Push notifications are OK for most things, but it is a bit hard to leverage CoreLocation without backgrounding (i.e. have the app show a notification at a certain location). Of course, this would probably need an on/off switch, or per-app control like push has.
An animation view that would reduce the work for a developer to make a cool app. Of course, the core business logic still needs more consideration, but the view layer could be easier to use.

Is it possible to access data from the iPhone's proximity sensor?

When you hold the iPhone to your ear, it detects that something is there (via the proximity sensor) and switches off the display.
Is it possible to access this sensor in an iPhone app?
It is possible via undocumented system calls; this is how Google's voice search on the iPhone starts listening when the phone is close to your ear (or so I'm told). The API isn't publicly exposed though, so although Google got their app on the store, your app might be subject to more scrutiny.
Sorry I can't tell you exactly what the calls are.
I don't think so.
Rather, there aren't any published APIs for it.
Google's voice search uses it, but that caused some fuss as they apparently used some unpublished functions.
http://www.iphonehacks.com/2008/11/iphone-app-news.html
EDIT:
To clarify, there are published APIs allowing you to turn it on and off, but nothing that will allow you to detect when it has been triggered.
I was able to find this functionality in Apple's documentation here, however I haven't tried it yet.
The UIDevice instance also provides access to the proximity sensor state (described by the proximityState property). The proximity sensor detects whether the user is holding the device close to their face.
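That property is public, so watching the sensor takes only a few lines (iOS 3.0+); a minimal sketch:

    // Turn the proximity sensor on and observe its state changes.
    - (void)startWatchingProximity {
        UIDevice *device = [UIDevice currentDevice];
        device.proximityMonitoringEnabled = YES; // stays NO if there is no sensor
        [[NSNotificationCenter defaultCenter]
            addObserver:self
               selector:@selector(proximityChanged:)
                   name:UIDeviceProximityStateDidChangeNotification
                 object:device];
    }

    - (void)proximityChanged:(NSNotification *)note {
        BOOL covered = [UIDevice currentDevice].proximityState;
        NSLog(@"Proximity sensor %@", covered ? @"covered" : @"clear");
    }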