Can a user exclude a keyword in search results? - algolia

For example, could they type "-Adventure" if they want results without the word adventure to appear in their search results?

Sure thing! You need to enable the advancedSyntax feature, and then a - in front of a word will be interpreted as a NOT.
index.search('Crazy -Adventure', { advancedSyntax: true }).then(...);
This will return all objects containing Crazy and not containing Adventure.
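For instance, a minimal sketch with the JavaScript client (the application ID, API key, and index name below are placeholders):

// Hypothetical sketch: exclude the word "Adventure" at query time.
const algoliasearch = require('algoliasearch');

const client = algoliasearch('YourApplicationID', 'YourSearchOnlyAPIKey'); // placeholder credentials
const index = client.initIndex('movies'); // hypothetical index name

index
  .search('Crazy -Adventure', { advancedSyntax: true })
  .then(({ hits }) => {
    // Every hit matches "Crazy"; none of them contain "Adventure".
    console.log(hits);
  });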

Related

Is it possible to use INTENT instead of STRING as List Title in Google Action?

Some Background:
I use Lists a lot for a Google Action with a NodeJS fulfillment backend. The Action is primarily voice-based. The reason for using a List is that I can encode information in each item's key and use it later to make a decision. Another reason is that Google Assistant will try to fuzzy match the user's input with the title of the List's items to find the closest matching option. This is where things get a bit hard for me. Consider the following example:
{
  // Each item's key carries serialized data that can be read back
  // when the user selects that option.
  [JSON.stringify(SOME_OBJECT)]: {
    title: 'Yes'
  },
  [JSON.stringify(ANOTHER_OBJECT)]: {
    title: 'No'
  }
}
Now if I say Yes or No, I can get the user's choice and do something with the information stored as stringified JSON in the choice's key.
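For illustration, here is a rough sketch of how this round trip looks in my NodeJS fulfillment (using the actions-on-google v2 Dialogflow client; the intent names and the two objects are just placeholders):

const { dialogflow, List } = require('actions-on-google');
const app = dialogflow();

const SOME_OBJECT = { decision: 'proceed' };   // placeholder payloads
const ANOTHER_OBJECT = { decision: 'abort' };

app.intent('Ask Choice', (conv) => {
  conv.ask('Should I continue?');
  conv.ask(new List({
    title: 'Your options',
    items: {
      [JSON.stringify(SOME_OBJECT)]: { title: 'Yes' },
      [JSON.stringify(ANOTHER_OBJECT)]: { title: 'No' },
    },
  }));
});

// "Get Option" is a Dialogflow intent mapped to the actions_intent_OPTION event;
// the selected item's key arrives as the third handler argument.
app.intent('Get Option', (conv, params, option) => {
  const choice = JSON.parse(option);
  conv.close(`You picked: ${choice.decision}`);
});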
But users may say Sure or Yup or OK, which basically mean the same thing as saying Yes. Because those words don't match Yes, Google Assistant will ignore the "Yes" option. All of these words, however, belong to the smalltalk.confirmation.yes built-in intent. So if I could use this intent instead of hardcoding the string Yes, I would be able to capture all of the inputs that mean Yes.
I know I could do this with a Synonyms list or Confirmation intent. But they also have some problems.
Using Synonyms would require me to find every similar word. Besides, I would also need to localize these synonyms into every supported language.
With the Confirmation intent, I can't show the user some information before asking them to choose an option. Besides, it also doesn't support encoding data in the options the way a List's keys do.
So, List is a good choice for me in this case.
So, is there any way to leverage the built-in intents for this purpose? What do you do in this situation?

Possible to get words matched fuzzily by MongoDB full text search?

I'm writing a UI that presents the results of a MongoDB full text search query, visually highlighting the matched search terms in each result; this works well enough for full word or phrase matches, but not for partial/fuzzy matches.
For example, if I search for "delete" I will get a search result that contains "deletion", which does not contain the full word "delete" and therefore won't be highlighted if I merely highlight the full search term matches. I do want the partial matches, though.
Is there any way to project the set of matched words/substrings when I execute the query?
I've so far been unable to find anything in the docs that hints at this being possible, but I thought it worth asking around. Any help would be greatly appreciated.
You can use MongoDB Atlas Search, which lets you search your text with the different analyzers that MongoDB provides. You can then run a query like the one below; without the fuzzy object, it would do a full-text-match search.
{
  $search: {
    index: 'analyzer_name_created_from_atlas_search',
    text: {
      query: 'Text to do a full match or fuzzy match with',
      path: 'sentence',
      fuzzy: {
        maxEdits: 2 // a maximum of 2 is allowed
      }
    }
  }
}
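As a quick usage sketch, the $search stage goes first in an aggregation pipeline; the collection name here is hypothetical:

// Hypothetical: run the $search stage from the mongo shell or a Node.js driver.
db.sentences.aggregate([
  {
    $search: {
      index: 'analyzer_name_created_from_atlas_search',
      text: {
        query: 'deletion',
        path: 'sentence',
        fuzzy: { maxEdits: 2 } // tolerate up to 2 character edits
      }
    }
  },
  { $limit: 10 }
]);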

How to allow leading wild cards in custom smart search web part (Kentico 10)

I have a custom index for my products and I am using the Subset Analyzer. This Analyzer works great, but if you do field searches, it does not work.
For example, I have a document with the following fields:
"documentname", "My-Document-Name"
"tags", "1234,5678,9101"
"documentdescription", "This is a great Document, My-Document-Name."
When I just search "name AND tags:(1234)", I get this document in my results because it searches +_content:name.
-- However:
When I search "documentname:(name)^3.0 AND tags:(1234)", I do not get this document in my results.
Of course, when I do "documentname:(*name*)^3.0" I get a parse error saying: '*' or '?' not allowed as first character in WildcardQuery.
How can I enable leading wildcard queries in my custom CMS.Search web part?
First of all, you have to make sure that the field you are checking is in the index under the proper name. documentname might not be in the index; it can be called _title, depending on how your index is set up. Get Luke (lukeall) and inspect your index (it should be in \CMS\App_Data\CMSModules\SmartSearch\YourIndexName). You can use Luke to test your searches as well.
For example, there is no tags field, but there is a documenttags field.
P.S. Wildcards do work, and you are right that you can't use them as the first character by default (the Lucene documentation says: You cannot use a * or ? symbol as the first character of a search), but there is a way to enable this in Lucene.NET, although I don't know if there is a setting for it in Kentico. I don't think you need wildcards here, though, so your query should be (assuming you have documentname and documenttags in the index):
+(documentname:"My-Name" AND documenttags:"tag1")

How can you filter search by matching String to a field in Algolia?

I'm trying to create filters for a search on an Android app where a specific field in Algolia must exactly match the given String in order to come up as a hit. For example if Algolia has a field like "foo" and I only want to return hits where "foo" is equal to "bar", then I would expect that I would have to use a line of code like this:
query.setFilters("foo: \"bar\"");
Any guesses as to why this isn't working the way it does in the examples, or how to do it properly?
Ah, I thought that attributesForFaceting was controlled by setting which attributes are searchable. It is on a different page of the dashboard than the one I was previously using. Thanks #pixelastic.
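For reference, the same setting can also be applied through the API rather than the dashboard; a rough sketch with the JavaScript admin client (credentials and index name are placeholders, and the filterOnly() modifier is optional):

const algoliasearch = require('algoliasearch');

// Placeholder credentials; setSettings requires an admin API key.
const client = algoliasearch('YourApplicationID', 'YourAdminAPIKey');
const index = client.initIndex('your_index');

index
  .setSettings({
    // filterOnly() makes "foo" usable in filters without computing facet counts.
    attributesForFaceting: ['filterOnly(foo)'],
  })
  .then(() => index.search('', { filters: 'foo:"bar"' })) // settings may take a moment to propagate
  .then(({ hits }) => console.log(hits));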

MongoDB - Using regex wildcards for search that properly filter results

I have a Mongo search set up that goes through my entries based on numerous criteria.
Currently the easiest way (I know it's not performance-friendly due to using wildcards, but I can't figure out a better way to do this due to case insensitivity and users not putting in whole words) is to use regex wildcards in the search. The search ends up looking like this:
{ gender: /Womens/i, designer: /Voodoo Girl/i } // Should return ~200 results
{ gender: /Mens/i, designer: /Voodoo Girl/i } // Should return 0 results
In the example above, both searches are returning ~200 results ("Voodoo Girl" is a womenswear label, and all corresponding entries have a gender: "Womens" field). Bizarrely, when I do other searches, like:
{ designer: /Voodoo Girl/i, store: /Store XYZ/i } // should return 0 results
I get the correct number of results (0). Is this an order thing? How can I ensure that my search only returns results that match all of my wildcarded queries?
For reference, the queries are being made in nodeJS through a simple db.products.find({criteria}) lookup.
To answer the aside real fast, something like ElasticSearch is a wonderful way to get more powerful, performant searching capabilities in your app.
Now, the reason that your searches are returning results is that "mens" is a substring of "womens"! You probably want either /^Mens/i and /^Womens/i (if Mens starts the gender field), or /\bMens\b/ if it can appear in the middle of the field. The first form will only match the given field from the beginning of the string, while the second form looks for the given word surrounded by word boundaries (that is, not as a substring of another word).
If you can use the /^Mens/ form (note the lack of the /i), it's advisable, as anchored case-sensitive regex queries can use indexes, while other regex forms cannot.
$regex can only use an index efficiently when the regular expression has an anchor for the beginning (i.e. ^) of a string and is a case-sensitive match.
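A small sketch of the corrected queries, using the collection and fields from the question:

// Anchored, case-insensitive: "Mens" no longer matches inside "Womens".
db.products.find({ gender: /^Womens/i, designer: /^Voodoo Girl/i }); // ~200 results
db.products.find({ gender: /^Mens/i, designer: /^Voodoo Girl/i });   // 0 results

// Word-boundary form, if the value can appear mid-field.
db.products.find({ gender: /\bMens\b/i });

// Anchored and case-sensitive: the only regex form that can use an index on gender.
db.products.find({ gender: /^Mens/ });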