Calling an import.io dataset created with "Chain API" via REST

I created a dataset using two extractors: a "many rows" extractor which is then called by a "just one row" extractor via the "URLs from another API / Chain APIs" option. This has given me the data that I need, and I have saved it as a dataset. Although the dataset is working, I'm lost as to how to use the REST API to retrieve it.
I'm aware that there used to be an "integrate" button on the dataset page which would outline how to use an external client library to get the data, but in its absence I don't know which of the APIs to use or how to use them.
I've attempted to use the "Query Methods" GET call "/store/connector/{id}/_query", but it requires an "id" and I don't know where to find it. I tried the "_connectorVersionGuid" value from the dataset saved as JSON, but that didn't work.
Any help and advice would be much appreciated.
Thanks,
AJ

AJ,
Francesco here, from import.io.
First of all, thanks for formulating the question so clearly.
I have bad news and good news.
The bad news is that at the moment Bulk and Chain are only available as client-side features, so it's not currently possible to call a chain with a single REST call.
The good news is that we are actually working on it :)
Bulk as an API is in beta testing, and I hope to have Chain as an API as well.
A workaround I sometimes use myself is an external integration-as-a-service platform, like Node-RED (http://nodered.org/) or built.io (https://www.built.io/).
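In the meantime, the chaining can be reproduced client-side with two plain REST calls. Below is a minimal Python sketch assuming the legacy "/store/connector/{id}/_query" endpoint mentioned in the question; the GUIDs, the API key, the parameter names, and the "url" column are all placeholders/assumptions, not confirmed import.io specifics.

```python
import requests

# Legacy import.io query endpoint (assumed from the question's
# "/store/connector/{id}/_query" method); all credentials are placeholders.
API_ROOT = "https://api.import.io/store/connector"
USER_GUID = "YOUR_USER_GUID"
API_KEY = "YOUR_API_KEY"
MANY_ROWS_ID = "MANY_ROWS_EXTRACTOR_GUID"   # the "many rows" extractor
ONE_ROW_ID = "JUST_ONE_ROW_EXTRACTOR_GUID"  # the "just one row" extractor

def query(connector_id, url):
    """Run one extractor against one URL and return its result rows."""
    resp = requests.get(
        f"{API_ROOT}/{connector_id}/_query",
        params={"input/webpage/url": url, "_user": USER_GUID, "_apikey": API_KEY},
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

# Step 1: the "many rows" extractor yields rows, each containing a URL...
rows = query(MANY_ROWS_ID, "http://example.com/start-page")

# Step 2: ...each of which is fed to the "just one row" extractor - the
# "chain", done client-side. Assumes the first extractor emits a "url" column.
chained = [hit for row in rows for hit in query(ONE_ROW_ID, row["url"])]
print(chained)
```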

Related

How to connect an API as a data source in Tableau?

I need to use two data sources: one is SQL and the other is the response from a REST API.
I tried to implement a WDC (Web Data Connector), but it needs an HTML page, and the user has to interact with its UI to get the response.
I don't want to create an HTML page.
Is there any way to use an API response as a data source in Tableau?
The short answer is that you cannot use an API directly as a data source; you should build a pipeline that transforms the response into a flat file or populates a database table.
The alternative answer is to use Python to connect to the REST API. You can choose to use TabPy or follow a pre-built solution like this one. Personally, I can't speak to the performance.
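As a rough illustration of the flat-file pipeline, here is a minimal Python sketch; the endpoint URL is hypothetical, and it assumes the API returns a JSON array of flat records.

```python
import csv
import requests

# Hypothetical endpoint returning a JSON array of flat records.
resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()
records = resp.json()

# Write a flat file that Tableau can use directly as a data source.
with open("orders.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=sorted(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
```

Schedule a script like this and point Tableau at the resulting file, or have it populate a database table and connect Tableau to that instead.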

Cakephp 3.1 REST - Filtering data

I have set up my application for REST access as per the documentation. The default routes are working well. I am able to retrieve, update and delete records; however, I am not sure how I can filter data by sending parameters to the controller. I wonder if I can do that using the query string or if there is a better way to accomplish it. Can someone please give me directions?
Read about the Request object in the manual, and use the Search plugin for filtering.
The Search plugin comes with a lot of documentation that explains how to use it as well.
Your question is so generic that a proper answer would end up being a whole article - which I'm obviously not going to write; there is enough information available on HTTP requests and query params. Use Google or read these links:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Messages
https://www.w3.org/Protocols/HTTP/Request.html
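From the client's point of view, filtering via the query string looks like this (a sketch in Python; the endpoint and the "category"/"limit" parameters are made up - the controller would read them from the Request object and hand them to the Search plugin):

```python
import requests

# Hypothetical CakePHP REST endpoint with made-up filter parameters.
resp = requests.get(
    "https://myapp.example.com/articles.json",
    params={"category": "tech", "limit": 10},  # -> ?category=tech&limit=10
)
resp.raise_for_status()
for article in resp.json().get("articles", []):
    print(article["title"])
```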

How do I use wit.ai with existing rows of data?

I have a lot of existing data that I would like to use as training data for a wit.ai chatbot. The data is stored in a CSV file where each row has a statement/question and a response to that statement/question.
I know that wit.ai requires you to assign intents to the utterances it receives, so I'm wondering if there is a way to simply send over the data I have and have the chatbot start learning intents on its own.
Thanks!
Thanks for posting. We know this is not perfect yet, but we released an import/export feature a few days ago. Looking at the structure of the JSON export, one could probably feed it with existing data fairly easily. It would require creating one story per statement/question and response. More info here:
https://wit.ai/docs/recipes#copyexportversion-my-app
"Teaching" Wit.Ai is not exactly what some might think it is.
You will have to create stories for your "user says" column. The replies are irrelevant, to be honest: you can't "teach" wit.ai to reply. Replies are defined in the story or in your code.
What wit.ai might need from your data are keywords and key-phrases which make the entity recognition better for wit.ai.
Here is the simplest example: an entity such as color is recognized based on the keywords listed for it. So if you have a lot of data as examples of user input, you can try to break it down first into "which entities each user input should produce" and then pull the keywords out of those inputs.
Using your data for "teaching" would be a little difficult, since it would require you to create a lot of stories in wit.ai to cover the possible user inputs and entity identification. But you can still do it like this:
(rough example)
Make one story, for example about the user asking for the time.
Mark in the user input which entities should be derived from that input.
Sort the list you have to get all the possible ways of asking for the time:
How late is it?
Can you tell me the time?
I wonder what's the time now?
Use a script (Python) to "shoot" all these user inputs at your story (see the sketch after these steps).
Once done, go to the Understanding tab of wit.ai and go through all the input, correcting/adding the entities you defined.
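A minimal version of that "shooting" script, using Wit's HTTP /message endpoint; the CSV layout (two columns) and the token are placeholders taken from the question, not a confirmed schema:

```python
import csv
import requests

WIT_TOKEN = "YOUR_SERVER_ACCESS_TOKEN"  # placeholder

with open("training_data.csv", newline="", encoding="utf-8") as f:
    # Assumes two columns per row: statement/question, response.
    for statement, _response in csv.reader(f):
        # Send each stored statement through Wit's /message endpoint so its
        # predictions show up under "Understanding" for manual correction.
        r = requests.get(
            "https://api.wit.ai/message",
            params={"q": statement},
            headers={"Authorization": f"Bearer {WIT_TOKEN}"},
        )
        r.raise_for_status()
        print(statement, "->", r.json().get("entities"))
```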
This process will "teach" entities, whether they are keyword-based or use some other matching algorithm.
That's the best I can think of for how to use your existing data. Wit.Ai is different from other language-processing toolsets, and "teaching" it with existing data is somewhat "puzzling" :)

async autocomplete service

Call me crazy, but I'm looking for a service that will deliver autocomplete functionality similar to Google, Twitter, etc. After searching around for 20 min I thought to ask the geniuses here. Ideas?
I don't mind paying, but it would be great if it were free. Also, is there a top-notch NLP service that I can submit strings to and get back states, cities, currencies, company names, establishments, etc.? Basically I need to take unstructured data (a generic search string) and pull out key information with relevant metadata.
Big challenge, I know.
Sharing solutions I found after further research.
https://github.com/haochi/jquery.googleSuggest
http://shreyaschand.com/blog/2013/01/03/google-autocomplete-api/
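For example, the unofficial suggest endpoint that the second link describes can be queried directly; a quick Python sketch (this endpoint is undocumented and may change or break at any time):

```python
import requests

# Unofficial, undocumented Google suggest endpoint - use at your own risk.
resp = requests.get(
    "https://suggestqueries.google.com/complete/search",
    params={"client": "firefox", "q": "autocomp"},
)
resp.raise_for_status()
query, suggestions = resp.json()  # payload shape: [query, [suggestion, ...]]
print(suggestions)
```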
If you don't want to implement it yourself, you can use a service called 'Autocomplete as a Service', which is written specifically for these purposes. You can access it here - www.aaas.io.
You can add metadata with each record, and it returns the metadata along with the matching results. Do check out the demo on the home page. It has a very simple API written specifically for autocomplete search.
It supports large datasets, and you can apply filters while searching as well.
Its usage is simple - add your data and use the API URL as the autocomplete data source.
Disclaimer: I am the founder of it. I will be happy to provide this service to you.

RESTful - GET or POST - what to do?

I'm working on a web service that I want to be RESTful. I know about the CRUD way of doing things, but there are a few things I'm not completely clear on. So this is the case:
I have a tracking service that collects some data in the browser (client) and then sends it off to the tracking server. There are two cases: one where the profile exists and one where it does not. Finally, the service returns some elements that have to be injected into the DOM.
So basically i need 2 web services:
http://mydomain.tld/profiles/
http://mydomain.tld/elements/
Question 1:
Right now I'm only using GET, but I'm rewriting the server to support CRUD. So in that case I have to use POST if the profile does not exist: something like http://mydomain.tld/profiles/ where the POST payload has the information to save. If the profile exists, I use PUT with http://mydomain.tld/profiles// and the PUT payload has the data to save. All good, but the problem is that, as far as I understand, xmlhttp does not support PUT. Is it OK to use POST even though it's an update?
Question 2:
As said, my service returns some elements to be injected into the DOM when a track is made. Logically, to keep it RESTful, I guess I would have to use POST/PUT to update the profile and then GET to fetch the elements to inject. But to save bandwidth and resources on the server side, it makes more sense to return the elements with the POST/PUT to profiles, even though it's a different resource. What is your take on this?
BR/Sune
EDIT:
Question 3:
In some cases I only want to update the profile and NOT receive elements back. Could I still use the same resource, with a payload parameter to specify whether I want elements, e.g. "dont_receive_elements: true"?
On question #1: are you sure that xmlhttp does not support PUT? I just ran http://www.mnot.net/javascript/xmlhttprequest/ in three browsers (Chrome, Firefox, IE), and according to the output, PUT was successful in all of them. Following the information in http://www.slideshare.net/apigee/rest-design-webinar (and I highly recommend checking out the many Apigee videos and slideshows on RESTful API design), PUT is recommended for the use case you mention.
But you may be able to avoid this issue entirely by thinking a little differently about your data. Is it possible to consider that you have a profile, and that each profile has 0 or more sets of payload information? In this model the two cases are:
1. No profile exists: create the profile with a POST on .../profiles/, then add elements/tracking data with POSTs to .../profile/123/tracks/ (or .../profile/123/elements/).
2. The profile exists: just add the elements/tracking data.
(Sorry, without understanding your model in detail it is hard to be very precise.)
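In plain HTTP terms, that model looks roughly like the following Python sketch; the host is taken from the question, while the payloads and the returned "id" field are illustrative assumptions:

```python
import requests

BASE = "http://mydomain.tld"  # host from the question; payloads illustrative

# Case 1: no profile exists - create it with a POST on /profiles/ ...
resp = requests.post(f"{BASE}/profiles/", json={"name": "visitor-42"})
resp.raise_for_status()
profile_id = resp.json()["id"]  # assumes the API echoes back an id

# ... then (case 2, and every later hit) just add elements/tracking data.
resp = requests.post(f"{BASE}/profile/{profile_id}/tracks/",
                     json={"event": "pageview"})
resp.raise_for_status()
```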
As for question #2: going with a data model where a profile has 0 or more elements, you could update the profile (adding the necessary elements) and then return the updated profile (and its full graph of elements), saving you any additional GETs.
More generally on question #2, as the developer of the API you have a fair amount of freedom in the REST world - if you are focused on making it easy and obvious for the consumers of your API, then you are probably fine.
Bottom line: check out www.apigee.com - they know much more than I do.
#Richard - thanks a lot for your links and feedback. The solution I came down to is to make the API simple and clean, as you suggest in your comment, with separate calls to each resource.
Then, to save bandwidth and keep performance up, I made a "non-official" function in the API that works like a proxy internally and is called with a single GET that updates a profile and returns an element. This, I know, is not very RESTful etc., but it handles my situation and is not part of the official API. The reason I need it to support GET is that I need to call it from JavaScript, cross-domain.
I guess I could have solved the cross-domain issue by using JSONP, but I would still have to make the API "unclean" :)