Searching MKMapView using JSON tends to return no results, or an incorrect one - iPhone

I have an iPhone app (iOS 5) that uses a UISearchBar to search an MKMapView. We used JSON queries, and used the fantastic answers from this question as reference (our code is very similar). The process itself works fine now, but we tend to get no results back from Google when we query them, or just get a really far away and incorrect one. Most of the time we can even search for "McDonald's" or "Subway" and it won't return any results. In general, it rarely gives a good result back unless we're very specific and include the city, state, and everything.
Is there a better way to go about this? Has something been updated since that answer that we should now take into account? The problem doesn't seem to be that the code isn't working, but rather that Google just doesn't handle queries well the way we send them. This seems to be a fairly common use of MKMapView, so I figured there should be an easier, better-working solution.
Any help would be much appreciated.

Here is a very useful list of the parameters that the Google Maps API supports:
http://querystring.org/google-maps-query-string-parameters/
You have a couple of options:
1) Get the user's location from the app and pass it into the search query with the sll parameter, e.g.
this search doesn't include a location:
https://maps.google.com/?q=starbucks
but this one does (I've used San Francisco in this example):
https://maps.google.com/?q=starbucks&sll=37.776414,-122.437477
Then you'll get results for the user's actual location. You'll also need to do something sensible if the user does not permit the app to access their location (in that case you may want to disable search). There's a minimal sketch of building this kind of request after these options.
2) If your app is for a specific place, then you can just add that place on the end of your search string. e.g. my Domesday app is only for England, so I include ",England" on the end of all my search requests, and that works nicely for me.
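To make option 1 concrete, here's a minimal sketch of building that kind of request. The question is from the Objective-C/iOS 5 era, so this is written in modern Swift purely for brevity; buildSearchURL is a hypothetical helper, and the q/sll parameter names come from the querystring list linked above. Your existing JSON-handling code would consume the resulting URL the same way it does now.

```swift
import CoreLocation
import Foundation

// Hypothetical helper: builds a maps.google.com query URL biased to the
// user's location via the `sll` parameter. Falls back to an unbiased query
// if no location is available (e.g. the user denied location permission),
// at which point you may prefer to disable search entirely.
func buildSearchURL(query: String, userLocation: CLLocation?) -> URL? {
    var components = URLComponents(string: "https://maps.google.com/")
    var items = [URLQueryItem(name: "q", value: query)]
    if let coordinate = userLocation?.coordinate {
        items.append(URLQueryItem(name: "sll",
                                  value: "\(coordinate.latitude),\(coordinate.longitude)"))
    }
    components?.queryItems = items
    return components?.url
}

// Example: with a location around San Francisco this produces something like
// https://maps.google.com/?q=starbucks&sll=37.776414,-122.437477
let sanFrancisco = CLLocation(latitude: 37.776414, longitude: -122.437477)
let url = buildSearchURL(query: "starbucks", userLocation: sanFrancisco)
```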

Related

"Okay Google, show pictures of [PARAMETER PHRASE]"

I'm setting up a Google Assistant/Home that should IDEALLY respond to the phrase "Okay Google, show pictures of [PARAMETER PHRASE]" by giving me the parameter phrase. It also HAS to be able to function like a regular Home ("Hey Google, how far away is the moon", "... tell me a joke", etc.) without my having to reimplement all of that functionality (unmatched phrases should fall back to the Google Home).
If I use the Home, I'm afraid I won't be able to avoid "... tell [MY APP NAME] to ...", but it has a great mic and speaker built in.
I'm alternatively looking into a Raspberry Pi solution for the added layer of control, but the Home already has a fantastic mic and speaker. And importantly, I absolutely don't want to recreate the core Google Home features (could I perhaps pass uncaught phrases off to the Google Home backend?).
I could mask some non-parameterized commands with Assistant Shortcuts ("Okay Google, cat time!", "Hey Google, show me cats") to simplify the call phrase, but that doesn't work here because shortcuts aren't parameterizable.
TLDR: I have a setup that needs to 1. work like a normal Google Home, but must 2. have additional functionality that I implement. I would like to 3. avoid having to say "... tell MY TARGET APP to [...]", but I need 4. parameters to be passed to my code, even if completely unparsed.
What are my options?
There are a bunch of possible approaches here, depending on the exact angle you want to take. None is perfect at this time, but since everything is evolving, we'll see what might develop.
It sounds like you're making an IoT picture frame or something like that? And you want to be able to talk to it? If so, you may want to look into the Assistant SDK, which lets you embed the Assistant into your IoT device. This would let you implement some voice commands yourself, but pass other things off to the Assistant to handle.
But this isn't a perfect solution, since it splits where the voice recognition happens from where it is applied, and it may not get you hotword triggering.
It is also still in an early Developer Preview, so things might change, and it may evolve to be something closer to what you want... but it is difficult to tell right now.
Depending on the IoT appliance you're working on, you may be able to leverage the built-in commands by building a Smart Home Action. However, at the moment, these have a fairly limited set of appliance types they can work with. It also sounds like you're trying to deal with media control - which isn't something that Smart Home directly works with, and is (hopefully) a future Action API (there were some hints about this at I/O, with Cast compatibility promised... but no details).
If you really want to build for the Home and Assistant, you'll need to work within the limitations of Actions on Google. And that does include some issues with the triggering name.
However... one good strategy is to pick a name that works well with the prefix phrases that are used. Since "Ask" is a legitimate prefix that Home handles, you could plan for a triggering name such as "awesome photo frame", and make the command "Ask awesome photo frame to show pictures of something".
This is riskier, since it isn't clearly documented, but it seems that some triggering names work without a prefix at all. So if your application is named "fly to the moon", it seems like you can say "Hey Google, fly to the moon" and the action will be triggered. If you can get a name like this registered, it will feel very natural for the user.
Finally, you can pick a reasonable name, but have your users set an alias or shortcut that makes sense to them. I'm not sure how this would fit in with solution (1), but being able to predefine shortcuts for them would make it pretty powerful.
You can't invoke your app without first connecting to it using "Ok Google, talk to my app", because otherwise it would be like talking to the core Assistant, not your app.
Google doesn't allow talking to an app without an explicit app invocation.

GooglePlacesAutocompleteAdapter (Android Places API) returning results outside of boundary

It seems that the sample code from Google that demonstrates the Google Places API for Android is returning results from outside of the given boundary. (https://github.com/googlesamples/android-play-places/).
Searching for 'hardware' shows most results from Sydney (the hardcoded boundary), but also (occasionally) shows results from other cities as far away as Western Australia! I've implemented GooglePlacesAutocompleteAdapter in my own code and have found similar results. It seems that the bounds field is only a guideline for the search; can anyone confirm this?
This may or may not be related, but does anyone also know whether the results returned from Places.GeoDataApi.getAutocompletePredictions are the same results that can be expected from a similar call to the Places API Web Service? With a few tests, it seems the web service call returns better results (closer to the location, more relevant, and more results overall). The API docs do not seem to shed any light on this -- my guess is that the getAutocompletePredictions query is performed on the 'name' of the Place rather than as a 'keyword' search like the web service implementation.
Thanks for the help.
As per the developer docs, the bounds is...
for geographically biasing the autocomplete predictions.
This means exactly what you suggested. Results inside are preferred, but not required.
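If you need a hard boundary rather than a bias, one workaround is to post-filter the predictions yourself once you've resolved each prediction to a coordinate (e.g. via a place details lookup); on Android itself, LatLngBounds.contains() should do that check for you. The adapter in question is Android/Java, but the filtering idea is language-agnostic, so here it is as a rough Swift sketch for consistency with the other examples on this page. Prediction and BoundingBox are hypothetical stand-ins for whatever types your lookup actually returns.

```swift
import CoreLocation

// Hypothetical stand-in for an autocomplete prediction that has already been
// resolved to a coordinate.
struct Prediction {
    let name: String
    let coordinate: CLLocationCoordinate2D
}

// A simple lat/lng box, analogous to the bounds passed to the autocomplete call.
// Note: this naive comparison ignores boxes that cross the 180° meridian,
// which is fine for a Sydney-sized bounds.
struct BoundingBox {
    let southWest: CLLocationCoordinate2D
    let northEast: CLLocationCoordinate2D

    func contains(_ c: CLLocationCoordinate2D) -> Bool {
        return c.latitude >= southWest.latitude && c.latitude <= northEast.latitude
            && c.longitude >= southWest.longitude && c.longitude <= northEast.longitude
    }
}

// Keep only predictions that actually fall inside the box, since the API
// itself only biases towards it rather than restricting to it.
func filter(_ predictions: [Prediction], within box: BoundingBox) -> [Prediction] {
    return predictions.filter { box.contains($0.coordinate) }
}
```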

How does Facebook search work?

More specifically, what factors determine the priorities they assign in response to a given query? I'm looking for answers that address numerous scenarios including queries that...
Specify the "type" of result (objects such as users, posts, pages, etc. or connections like friendships, likes, tags, etc.),
Have authentication tokens as well as ones that don't.
Have conditionals such as "since" and "until."
Don't even specify a type, such as this search for the word query.
I am actually working on an app that uses /search to search for places, and I use a bit of all of these scenarios. I couldn't write down a specific order the results appear in, and to be honest I highly doubt it's anything that simple.
I'm 99% sure it works like the search on Facebook itself does, using user data to bring up the most relevant results. I've been living in Ireland for 2 years now, but while testing the app I constantly receive search results from Romania, actually close to my hometown, which are relevant to me.
Regarding your observations, Facebook's algorithms might take the source of the request into account as well - which would be good; it means results only improve as your app gets more users.

UIPickerView and a Giant Contact List?

I'm new to iOS development and am trying to make an application that essentially sorts through a list of 300 names or so. I've got the drill-down part of the application down, aside from the detailView, but am now faced with a challenge.
What I would like to do is have users select from 3 fields with a UIPickerView to come up with shorter lists for every time a user is looking for a person. I'd like to use a .plist, but I also have an XML feed of the information. Before I waste all of my time structuring these data sources, does anybody have a good overview as to how I should approach this?
Also, I've asked a question like this before, and people tell me to read up on introductory iOS development topics. I understand the mechanics of development, I just can't ever figure out how to approach a task properly. (I'm working on it!)
Thanks in advance. I'd share an image to help clarify, but my rep isn't high enough.
Snip: It looks like I misread your intention, which makes my earlier comments irrelevant. You want to have the user select one of 3 options to shrink the list, if I'm not mistaken.
Some more questions for you: I take it that this XML feed is potentially going to change between times that the user loads up the app? Will it only ever grow, or are those 300 or so names loaded once and set for good? The reason I ask (so you can see my train of thought) is to work out whether using Core Data might be useful. You could easily store your large list locally, save time by not having to reload that large list frequently, and also use the built-in NSFetchedResultsController to search your collection of names. I'll keep thinking about it, and once you get a chance to answer these questions we can continue.
I'll check back for an edit or comment and see if I can give you an approach. Also, maybe edit your question with any of your own approach ideas and we could start from there and refine them if needed.
Edit 2: From the information in the comments, this is one way I could see this being done that makes sense to me:
Since you seem to be able to control the information you receive from the feed, I would set it up to send you only the contacts that need to be added/removed. You could handle this a few ways depending on your deployment intentions, but I would go with the following:
Find a way to signal a first-time run of the application; on that run all contacts would be new, and you could populate your list fully with a slightly longer first-time setup. Any further changes could then be handled quickly with smaller edits to the local list.
You would need to set up Core Data for your application, which should be fairly straightforward in your case. After that you can use a built-in NSFetchRequest to do your searches, which will quickly return a list of narrowed-down contacts (there's a rough sketch of such a fetch below). As for the physical picker, that is just a matter of building the UI, which will require some design on your end since you're the only one who knows what you're going for in that regard. Depending on the complexity of your app and the functionality you want to include, you could get away with 1-2 views that simply display the contacts in a table, with the picker reloading when appropriate.
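For illustration, here's a rough Swift sketch of the kind of fetch I mean (the question is from the Xcode 4/Objective-C era, so treat this purely as a shape, not the exact code you'd ship). The Contact entity and its company/firstName/lastName attributes are assumptions; map them onto however you model the feed, with the three picker fields driving the predicate.

```swift
import CoreData

// Hypothetical example: narrow the contact list down to whatever the user
// picked in the UIPickerView, e.g. everyone at a given company whose last
// name starts with a chosen letter.
func fetchContacts(company: String,
                   lastNameStartsWith letter: String,
                   in context: NSManagedObjectContext) throws -> [NSManagedObject] {
    let request = NSFetchRequest<NSManagedObject>(entityName: "Contact")
    // Case- and diacritic-insensitive prefix match on the last name,
    // restricted to the selected company.
    request.predicate = NSPredicate(
        format: "company == %@ AND lastName BEGINSWITH[cd] %@",
        company, letter)
    request.sortDescriptors = [NSSortDescriptor(key: "lastName", ascending: true),
                               NSSortDescriptor(key: "firstName", ascending: true)]
    return try context.fetch(request)
}
```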
I'm not familiar with the implementation of XML Feeds and receiving data from them, but I have done XML Response parsing into Core Data from a SOAP service before and they shouldn't be terribly different.
Regarding resource to get you started should you need them, I would recommend the following:
eBooks:
http://www.techotopia.com/index.php/Objective-C_2.0_Essentials
http://www.techotopia.com/index.php/IPhone_iOS_4_Development_Essentials_Xcode_4_Edition
Tutorials:
http://www.raywenderlich.com/
The eBooks I have linked are both absolutely fantastic and among the few Xcode 4 books I was able to find that seemed to be of usable quality. They both contain clear, easy-to-follow tutorials on simple and more advanced aspects of programming for iOS.
Ray's site is an immensely helpful resource: it has a very active iOS programming forum as well as a constantly growing tutorial collection, with 4-5 people regularly creating new tutorials that the community votes on and suggests every week. It covers some more advanced topics than the books above, and I'd recommend looking at it after doing a few walkthroughs/tutorials from the books.
I'll stick around if you have any further questions, otherwise you can send me a notification via these comments, or just post another question and someone is bound to help you out!
-Karoly

How to implement a search system in a database for an iPhone application

This is a pretty broad question, but I'm hoping to get a push in the right direction (technologies and methodology).
Ok, I have an iPhone app (which I am developing) that works with a web service (C#) through HTTP requests. The web service connects to the underlying database, extracts the necessary data depending on the request, and feeds it back to the application.
Now, I need to implement a search system in the app. The user searches for some words, and I need to provide the most relevant results. The search must be performed on different tables in the database, and each table can be searched in a number of columns. For example, when searching through the people table I need to search the first name, last name, company, and other fields. Other tables have other important columns.
I have so many questions that I don't even know where to start.
How do I write my SQL queries so the search is still fast enough? Do I need to make some extra tables with indexed content somehow?
How should I add a relevance factor to the results so I can ultimately return only the most relevant ones? For example, if a user searches for Smith, maybe there is a person named Smith or even a company. They should be displayed before any other content that merely has Smith somewhere in its description.
I know the question is a little vague/wide but I can explain more if somebody desires.
Thank you
This kind of depends on which language/RDBMS you are using on your server. You might check out dedicated search solutions like Sphinx, which will do all of that indexing for you and provide a simple search API. Sphinx, for example, allows you to prioritize columns, define character mappings (ß->s, ä->a), etc.
In the end I decided to use Lucene. It's a wonderful piece of technology, and even though I had some doubts in the beginning, after reading three quarters of the book "Lucene in Action" it was clear to me that it had everything I needed (and much more).
I know it's not a fully functional search system (with all the elements needed), but merely a library handling the core of one. It will take some work to integrate it with my application/web service/database. I will let you know how it goes :)
Thanks for your input!