Just wondering, can we integrate Google Maps and Siri together? For example:
I ask Siri, "show nearest Starbucks", and Siri will open the Maps app or Google Maps and show the nearest Starbucks on the map.
Or
I ask Siri, "show me all Apple Stores". Siri will open the map and show all the locations of Apple Stores on the map.
Is this doable?
I haven't found any good tutorial or documentation to study Siri integration in more depth, only articles. Is there really no technical documentation or API available?
This isn't directly supported right now without writing a third-party web service that Siri can hook into. According to the documentation linked below, from Apple's website, Siri on iOS 6 will support this functionality at least in limited part:
http://www.apple.com/ios/ios6/siri/
Eyes Free
Apple is working with car manufacturers to integrate Siri into select voice control systems. Through the voice command button on your steering wheel, you’ll be able to ask Siri questions without taking your eyes off the road. To minimize distractions even more, your iOS device’s screen won’t light up. With the Eyes Free feature, ask Siri to call people, select and play music, hear and compose text messages, use Maps and get directions, read your notifications, find calendar information, add reminders, and more. It’s just another way Siri helps you get things done, even when you’re behind the wheel.
This encourages me to believe that Apple will also expose the API (because someone will ferret it out if it exists) to ordinary API consumers during the iOS 6 lifecycle, probably before iOS 6.1 or with that release.
It is not currently possible through any API that Apple provides. However, there are third-party speech recognition APIs you could use, such as iSpeech: https://www.ispeech.org/developers/iphone. You'd have to use one of those and then pass the returned text on to the Google Maps API.
Although this approach won't be as intuitive as using Siri, since that is not currently possible, it's your best bet for the meantime; a sketch of the hand-off follows.
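As a rough sketch (not Siri itself): once the speech API returns text such as "nearest Starbucks", you can forward it to the built-in Maps app through a maps URL, which is the standard pre-iOS 6 behavior of http://maps.google.com/maps links. The method name showMapForSpokenQuery: is made up for illustration.

```objc
// Hypothetical glue: take the text a speech-recognition API returned
// and open it as a query in the built-in Maps app.
- (void)showMapForSpokenQuery:(NSString *)query {
    // Percent-escape the recognized text so it is safe inside a URL.
    NSString *escaped = [query stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSString *mapsURL = [NSString stringWithFormat:@"http://maps.google.com/maps?q=%@", escaped];
    // On pre-iOS 6 devices this launches the built-in Maps app.
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:mapsURL]];
}
```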
Unfortunately, Apple has not yet opened Siri's API to developers, making this task impossible for now. Apple will probably open it eventually. If you just want this for personal use, check out SiriProxy (https://github.com/plamoni/SiriProxy). SiriProxy lets you do exactly what you asked; however, it only works while you're on your own Wi-Fi network, so it can't ship inside an app. Good luck!
I am looking for a way to enable and disable the default iPhone camera, for example at a certain time or at a certain latitude/longitude. There are some places where cameras are prohibited, so we could disable the camera when someone reaches such a place; that is just one example. I had thought that enabling and disabling the camera on the iPhone was not possible. But when my superior gave me a document and asked me to check it, I found that it is possible to enable/disable the default iPhone camera. The document was an enterprise deployment guide, associated with the Enterprise Program; it describes the new features of the program and mentions enabling and disabling the default iPhone camera. You can find the enterprise deployment guide on the net and see this for yourself. I googled for how to enable/disable the camera, but nothing came up. So can you please tell me how to do this? Is there any tutorial, reference, or sample? I would also like to ask what is different about the Enterprise Program: do we build apps the same way in the Enterprise Program as we do under the general developer program?
So how do I accomplish enabling/disabling the default iPhone camera, and how do I implement it?
Well, I also thought, as you say, that it's not possible. But when I saw the enterprise deployment guide I was shocked.
Here is the link to download the guide; have a look at it and you will also see that this is possible:
Enterprise Deployment Guide
See the section "What's New for the Enterprise in iPhone OS 3.0 and Later"; under it you will find the point that mentions you can enable/disable the camera on the iPhone.
I'm pretty sure you can't disable the camera arbitrarily. Apple is pretty strict about not letting developers mess with things the user expects to work at all times, and the camera would definitely fall under that category.
So, unless there is some documented, Apple-approved way of doing things, your next best bet is probably jailbreaking and using that to somehow interfere with the camera's operation. I'm afraid I don't know how you'd go about doing that, however.
As for enterprise use and whatnot, I'm not sure, but I would guess that while an iPhone is part of an enterprise network/group/what-have-you, the admins can disable the camera. I doubt that is available programmatically either, however, as it doesn't seem like it would be something made available to developers. So, again, if a method to handle it programmatically isn't mentioned in the documentation you received, it probably doesn't exist.
Allow use of camera: When this option is turned off, the camera is completely disabled and its icon is removed from the Home screen. Users are unable to take photographs.
Source: http://manuals.info.apple.com/en_US/Enterprise_Deployment_Guide.pdf
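Note that this is a device-management feature, not an app API: the restriction is delivered in a configuration profile (built with the iPhone Configuration Utility or pushed by a management server), and an app cannot toggle it at runtime by time or location. For reference, here is a sketch of the relevant restrictions payload inside a .mobileconfig; the PayloadType com.apple.applicationaccess and the allowCamera key are the real ones, while the identifier and UUID below are placeholders:

```xml
<!-- Restrictions payload fragment from a configuration profile (sketch).
     Setting allowCamera to false disables the camera and removes its icon. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadIdentifier</key>
    <string>com.example.cameraoff</string> <!-- placeholder -->
    <key>PayloadUUID</key>
    <string>REPLACE-WITH-A-UUID</string> <!-- placeholder -->
    <key>allowCamera</key>
    <false/>
</dict>
```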
I believe you can if you are enrolled in the Enterprise Program.
What are the typical mistakes an iPhone developer unaware of accessibility makes that render their app unusable by customers with common impairments?
What are the first and easiest steps to take when making an iPhone app accessible to the vision impaired (etc.)?
How do I make sure Voice Over says or does something appropriate for all my programmatically created UIButtons, UISliders, etc.?
See the following link from Apple on accessibility on iOS.
http://developer.apple.com/iphone/library/documentation/UserExperience/Conceptual/iPhoneAccessibility/Accessibility_on_iPhone/Accessibility_on_iPhone.html
I strongly suggest using VoiceOver to test your application after you've used the developer tools to check its accessibility. I'm a blind iPod touch user, and I find that VoiceOver has a very low barrier to entry, unlike Windows screen-reading software that requires you to memorize a bunch of keystrokes. Also, realize that your application may be presented to the user in an entirely different way than you expect.
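To address the question about programmatically created controls directly: every UIView responds to the UIAccessibility properties, so you can describe a control at the point where you create it. A minimal sketch (the strings and image name are made up):

```objc
// Make a programmatically created, image-only button speak sensibly in VoiceOver.
UIButton *playButton = [UIButton buttonWithType:UIButtonTypeCustom];
[playButton setImage:[UIImage imageNamed:@"play.png"] forState:UIControlStateNormal];

playButton.isAccessibilityElement = YES;                      // UIButton defaults to YES; shown for clarity
playButton.accessibilityLabel = @"Play";                      // the spoken name; don't append "button"
playButton.accessibilityHint  = @"Plays the selected song.";  // optional, spoken after a pause
// The trait tells VoiceOver to announce it as a button:
playButton.accessibilityTraits = UIAccessibilityTraitButton;
```

For a UISlider, VoiceOver speaks accessibilityValue after the label, so set or override that as well if the default reading (typically a percentage) isn't meaningful for your control.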
I just saw an iPhone application that enhances the built-in Messages application to send emoticons. What this application does is extend the functionality of the keyboard that appears in the Notes or Messages application, so the user can send emoticon icons as messages. The application doesn't work when the emoticons are sent to other mobile devices (it works only iPhone-to-iPhone), but that's not important. What I'm wondering is: how did they do that (extend the built-in keyboard)? Do we have APIs that let you extend functionality like this?
Best regards.
They are part of the font set, called Emoji icons.
Apple's latest iPhone OS update brought cute little "Emoji" icons for Asian markets.
To use them you need the (font) character code for each letter/emoji. I don't know what they are, but a quick google returns, amid a lot of spam, some codes worth trying. A quick script could loop through the character codes incrementally and find them all; a sketch follows.
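A sketch of that brute-force loop: on the iPhone OS of that era the Emoji glyphs lived in a Unicode private use area, and the range used below (starting at U+E001, per the SoftBank-style encoding) is my assumption, so widen or shift it if the output comes up blank.

```objc
// Walk a candidate private-use-area range and log each glyph with its code.
for (unichar code = 0xE001; code <= 0xE05A; code++) {   // range is an assumption
    NSString *glyph = [NSString stringWithCharacters:&code length:1];
    NSLog(@"U+%04X -> %@", code, glyph);
}
```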
iPhone OS 3.0 is being announced and previewed next week (March 17).
We all know the feature set users want. Copy/paste, MMS, Flash on iPhone, etc.
We'll see about those.
What I'm interested in is what the development community feels the SDK is missing, or in need of, to make programming for the platform easier and more productive.
A more complete Interface Builder with support for custom palettes and all sorts of goodies like that.
Better control over the keyboard.
Better unit testing support. (Unit testing can be done, but only on the simulator, and it's very awkward to set up.)
Push notifications. Please.
A more accurate simulator, i.e. one with a more accurate set of frameworks.
The ability to easily build views like the Mail compose window.
For that matter, an in-application compose window.
A better way for apps to share data locally than by invoking URLs.
Access to the calendar, notes, mail (possibly read-only), and bookmarks (again, read-only) databases. Maybe even limited access to the iPod database—even just the ability to read song metadata and access and change the playing song would be helpful.
Some sort of middle ground between UILabel and UIWebView that allows for formatted text without a huge hassle.
More built-in toolbar icons.
The return of the "glass" button style that was in the beta SDK.
A few useful internal views, like UIProgressHUD, exposed.
And last but not least...
A pony.
An easy Javascript bookmarklet installation method for Mobile Safari. (OpenRadar: 1, 2)
UIWebView needs more of UIScrollView's properties and methods, such as contentOffset.
More configurability on some of the built-in behaviors and views, e.g. the button text on UITableViewCell's "Delete" button, or the styles and text of UIAlertSheet/UIAlertView buttons. (Some of these can be done today with undocumented calls, but I'd rather not rely on those.)
More flexibility from UINavigationController, such as the ability to push/pop views that selectively don't display the navigation bar but using the same animations and stack, or more customizability over the navigation bar button labels and behaviors.
The ability to restrict interface orientation per UIViewController, not just accept/reject changes via shouldAutorotate. E.g. I want my main content view to be autorotatable, but I want my navigation hierarchy and settings screens to always display in portrait, even if the content view was rotated to landscape.
libxml and its handy DOM XML parser instead of the SAX-based NSXMLParser.
libcurl w/SSL, or more options and functionality for NSURLConnection.
Ability to check whether a URL scheme is registered. This could be used for apps to detect whether other specific apps are installed, and enable functionality selectively, e.g. when Instapaper detects Tweetie is installed, it can offer a "Post with Tweetie" button. (Disclaimer: that was a plug. I make Instapaper.) A sketch of this kind of check appears below.
I'm sure I'll think of more, but overall, I'm very happy developing for the iPhone. I'm amazed at the quality and sophistication of the iPhone OS, the SDK, and the development tools given how incredibly young they all are.
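For what it's worth, a check like this did become possible with -[UIApplication canOpenURL:] in iPhone OS 3.0. A sketch of the Instapaper/Tweetie case, assuming Tweetie registers the tweetie: URL scheme (the button outlet name is made up):

```objc
// Probe another app's URL scheme to decide whether to show optional integration.
NSURL *probe = [NSURL URLWithString:@"tweetie:"];
if ([[UIApplication sharedApplication] canOpenURL:probe]) {
    // Tweetie is installed -- reveal the "Post with Tweetie" button.
    self.postWithTweetieButton.hidden = NO;   // hypothetical outlet
}
```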
I'm surprised no one has mentioned garbage collection yet. Objective-C 2.0 on the Mac supports optional garbage collection. I don't really see any reason it wouldn't work just fine on the iPhone as well and it would eliminate much of the tedium of having to explicitly release objects all over the place.
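For readers who haven't felt that tedium: without garbage collection, every object allocated on the phone has to be balanced by hand, as in this small illustration:

```objc
// Manual retain/release, as the iPhone currently requires.
NSMutableArray *items = [[NSMutableArray alloc] init];
NSString *name = [[NSString alloc] initWithString:@"Example"];
[items addObject:name];   // the array retains the string
[name release];           // easy to forget; omitting it leaks
// ... use items ...
[items release];          // every alloc/copy/retain needs a matching release
```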
What I'm hoping most for is to allow iPhones to talk to each other either via Bluetooth or some other means. Granted, they can talk via Bonjour if they are on the same Wi-Fi network, but that's just not convenient enough in 2009. If I'm out with a friend and want to play a multi-player game we first have to find a Starbucks or whatever the heck to get on the same Wi-Fi network. Also, think of the ridiculous amount of social apps you could have if iPhones could talk to each other without needing Wi-Fi. Exchange business cards, flirt with the cute girl over there, etc.
From a PURE programmer's perspective: make Xcode as helpful an IDE as Eclipse or IntelliJ are in the Java world. There's so much time I waste on stupid stuff that the IDE could have caught for me as I typed.
I also don't understand why I can't color buttons without having to use images.
Better multitasking is absolutely key at this point. Android's got it, Palm's WebOS has it - both, it seems, in largely unrestricted and well-implemented fashion. Possibilities:
Push notifications with a good UI (message stack in addition to badging/sound/whatever - if they have to have an extra approval step so apps can't be obnoxious, so be it)
Multiple full processes (not possible with current OS, I realize, but then I've never seen a good explanation why the iPhone doesn't support virtual memory)
Smaller "background" versions of apps that can run in the background - no GUI and a significantly tighter memory constraint
A good mapping API. Let us access the Google Maps abstraction that the Maps application uses!
More Interface Builder goodness
Better simulator
Smart inbox. Incoming messages are routed to installed handlers based on type.
Synchronisation framework that simplifies syncing with desktop & Mobile Me.
Decent landscape support, without the multitude of bugs, especially for the camera picker. Better support for rotation and more control of it.
Access to EXIF data on images from the picker, so we can tell their location
Deeper access to the camera API, so that we are not rail-roaded into the standard photo taker / picker
Push notifications that can launch an application. (In lieu of full multi-tasking, which I don't think we'll get and which could be problematic.)
Better, more intuitive keyboard controls.
API for inter-application messaging.
Access to data from Calendar, iTunes, Mail, Notes and more (with user's permission)
A more accurate simulator, with, for example, ways to limit bandwidth, and use the Mac's camera to actually take a photo.
Phone-phone bluetooth for data exchange
Access to more of the views used by iPhone apps, e.g. the progress HUD, email "blobbing" mechanism for email addresses, thumbnail scrollers, HUD brought up in Photos app, and more.
Less sandboxing. It won't likely happen, but it would always be appreciated for an app to have slightly more power than they currently do (actual filesystem access, for example. even if it was read-only access, it would still allow for more interesting applications to exist).
EDIT: Also, access to the copy/paste API. But I hope that one is obvious to Apple.
My list:
More full-featured IB support as the Mac has
Inter-app Data transfer mechanism (could be C&P, but does not have to be)
Greatly improved camera API with deeper level of control and more flexibility
SDK access to bluetooth and more support for protocols
A real Objective-C framework around the address book, like the Mac has today.
Warnings similar to the location warning when an app tries to access address book data.
I'm sure whatever they actually have prepared, there will be a few interesting twists.
Ability to send SMS messages without having to launch the SMS client and have the user type the message.
Access to the raw camera data so that things can be done without having to take a picture and wait for it to save (like you can do with Android)
Push notifications that can launch tasks... would need to be user-controllable.
A camera that can focus (I know... have to wait for the next iPhone for that... if they decide to put it in...)
A UIKit-level drawing API.
We all know the feature set people want. Copy/Paste, MMS, Flash on iPhone, etc.
I would have thought those specific items were down the SO wish list (although it seems I'm wrong looking at the votes on this comment :-).
MMS is a pretty pointless app when you have email. Flash is not an OS issue - Flash could be delivered today.
I don't even want push notifications - they're just a patch, I want background apps. I also want fixes for all the broken APIs like Camera, video and landscape support. Support for CoreImage filters would be nice too but probably too much to wish for.
[[ABAddressBook sharedAddressBook] me] for being able to use the owner's Zip code, phone number, or whatever.
Ability to download files to local storage and sync them back to iTunes or your hard drive
Get EXIF data from photos
Pull all photos at once
Pull all contacts at once
Control screen brightness
Access to music in iPod section
Read access to email and text messages
Access to Safari cookies (so maybe I could make some kind of keep-me-logged-in app.)
Fix table view in landscape mode
New camera API with direct access to the camera
Distribution code signing done automatically when uploading to the App Store (instead of code signing in Xcode)
Ability to request more memory so users don't have to reboot their phones to get rid of background apps
A non-Mac-based development environment.