It seems trivial, but I can't get it to work: I'm sending data encoded as JSON objects to API Gateway, which invokes Firehose. The resulting files in S3 contain all the JSON objects on a single line. I've read that it should be possible to add newlines, but whatever I try, there is either an error or it simply doesn't do anything. The mapping template looks like this:
{
  "DeliveryStreamName": "file-datadump",
  "Record": {
    "Data": "$util.base64Encode($input.json('$'))"
  }
}
Any ideas what to do?
The answer is ridiculously easy: simply add a line break in the mapping template, like so.
#set($payload = "$input.json('$')
")
{
  "DeliveryStreamName": "stream-name",
  "Record": {
    "Data": "$util.base64Encode($payload)"
  }
}
NO LONGER CURRENT! - DO NOT USE
I was having the exact same issue. As per this helpful GitHub issue, you can hack around it by appending a base64-encoded newline, for example:
{
  "DeliveryStreamName": "$stageVariables.delivery_stream",
  "Record": {
    "Data": "$util.base64Encode($input.params().querystring)Cg=="
  }
}
It's not ideal, but hope that helps!
How can I write the REST API query for a "LIKE" filter in LoopBack?
According to the LoopBack documentation, I have already tried this:
ProductDealers?filter={"where":{"DealerCode":"T001","Active":"true","SKU":{"like":"1000.*"}}}
but nothing happens. Please help me.
It would be something like:
Post.find({
  where: {
    title: {
      like: 'someth.*',
      options: 'i'
    }
  }
});
and for API calls:
?filter={"where":{"title":{"like":"someth.*","options":"i"}}}
Please take a look at this PR for more info
I found the answer. I don't know why it doesn't work when I use braces "{}", but when I use brackets "[]" it works very well, like:
Products?filter[where][Name][like]=%25" + valFilter + "%25&filter[where][Active]=1&filter[where][Deleted]=0"
Cheers!
A bit late, but I actually found this after looking for something else.
The problem for most people is that there is something called URL encoding.
Read more here:
https://en.wikipedia.org/wiki/Percent-encoding
So if you stringify your JSON filter like in the examples above, make sure you pass the stringified object through a URI encoder, which will make sure you get what you expect and lets you control the encoding of your values:
let t_filter = {
  where: {
    title: {
      like: 'someth.*',
      options: 'i'
    }
  }
};
let result = encodeURI(JSON.stringify(t_filter));
After that, send the result to your API rather than just the stringified object.
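For example, a minimal sketch of the full request (the host, model name, and use of fetch are placeholders, not anything from the question):
let baseUrl = 'http://localhost:3000/api/Posts'; // placeholder host and model

let t_filter = { where: { title: { like: 'someth.*', options: 'i' } } };

// encodeURIComponent is the stricter choice for a single query-string value:
// unlike encodeURI, it also escapes characters such as '&', '=' and '?'.
let url = baseUrl + '?filter=' + encodeURIComponent(JSON.stringify(t_filter));

fetch(url)
  .then(res => res.json())
  .then(posts => console.log(posts));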
Here's a REST API that I am writing to download data as a CSV file.
(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    var data = '\n'; // workaround to separate <xml> start tag on first line
    data += 'Firstname,Lastname,Username' + '\n';
    data += 'Nikhil,vartak,niksofteng' + '\n';
    data += 'Unknown,person,anonymous' + '\n';
    response.setHeader("Content-Disposition", "attachment;filename=Xyz.csv");
    response.setContentType("text/csv");
    response.setBody({'data': data});
})(request, response);
According to the docs, setBody requires a JS object, so if I just pass the data variable I get an error stating that the data cannot be parsed into a ScriptableObject. So with the current code I get the response below:
{
  "result": {
    "data": "\nFirstname,Lastname,Username\nNikhil,vartak,niksofteng\nUnknown,person,anonymous\n"
  }
}
The generated CSV ends up with XML markup wrapped around the data. Any idea how to get rid of the XML markup on the 1st and 5th lines?
The setBody method expects a JavaScript object, which it then serializes to JSON or XML based on what the client requests via the Accept header.
In your case you want to produce your own serialized format, CSV, so instead of using the setBody method, use the stream writer interface to write directly to the response stream:
response.setContentType("text/csv");
response.setStatus(200);
response.setHeader("Content-Disposition", "attachment;filename=Xyz.csv");
var writer = response.getStreamWriter();
writer.writeString('Firstname,Lastname,Username\n');
writer.writeString('Nikhil,vartak,niksofteng\n');
writer.writeString('Unknown,person,anonymous\n');
Note that you will have to handle all the details of the CSV format yourself, including properly escaping special characters, for example if you want a field to contain a comma, as in "Nik,hil".
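As a minimal sketch of that escaping (the helper name csvEscape is just illustrative, following the usual RFC 4180 convention):
// Quote a field when it contains a comma, quote, or newline,
// and double any embedded quotes.
function csvEscape(field) {
    var s = String(field);
    if (/[",\n]/.test(s)) {
        return '"' + s.replace(/"/g, '""') + '"';
    }
    return s;
}
// e.g. writer.writeString([csvEscape('Nik,hil'), 'vartak', 'niksofteng'].join(',') + '\n');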
Cheers,
Silas
I am having some trouble with PUT requests to the Google Sheets API. I have this code:
spreadsheet_inputer := WebClient(`$google_sheet_URI_cells/R3C6?access_token=$accesstoken`)
xml_test := XDoc {
  XElem("entry")
  {
    addAttr("xmlns", "http://www.w3.org/2005/Atom")
    addAttr("xmlns:gs", "http://schemas.google.com/spreadsheets/2006")
    XElem("id") { XText("https://spreadsheets.google.com/feeds/cells/$spreadsheet_id/1/private/full/R3C6?access_token=$accesstoken"), },
    XElem("link") { addAttr("rel", "edit"); addAttr("type", "application/atom+xml"); addAttr("href", "https://spreadsheets.google.com/feeds/cells/$spreadsheet_id/1/private/full/R3C6?access_token=$accesstoken"); },
    XElem("gs:cell") { addAttr("row", "3"); addAttr("col", "6"); addAttr("inputValue", "testing 123"); },
  },
}
spreadsheet_inputer.reqHeaders["If-match"] = "*"
spreadsheet_inputer.reqHeaders["Content-Type"] = "application/atom+xml"
spreadsheet_inputer.reqMethod = "PUT"
spreadsheet_inputer.writeReq
spreadsheet_inputer.reqOut.writeXml(xml_test.writeToStr).close
echo(spreadsheet_inputer.resStr)
Right now it returns
sys::IOErr: No input stream for response 0
at the echo statement.
I have all the necessary data (at least I'm pretty sure), and it works here: https://developers.google.com/oauthplayground/
Just to note, it does not actually update the spreadsheet.
EDIT: I had it return the response code and it was 0. Any pointers on what this means, either from the Google Sheets API or from the Fantom WebClient?
WebClient.resCode is a non-nullable Int, so it is 0 by default; hence the problem is either the request not being sent or the response not being read.
As you are obviously writing the request, the problem should be the latter. Try calling WebClient.readRes() before resStr.
From the docs for readRes():
Read the response status line and response headers. This method may be called after the request has been written via writeReq and reqOut. Once this method completes the response status and headers are available. If there is a response body, it is available for reading via resIn. Throw IOErr if there is a network or protocol error. Return this.
Try this:
echo(spreadsheet_inputer.readRes.resStr)
I suspect the following line will also cause you problems:
spreadsheet_inputer.reqOut.writeXml(xml_test.writeToStr).close
because writeXml() escapes the string to be XML-safe, whereas you'll want to just print the string. Try this:
spreadsheet_inputer.reqOut.writeChars(xml_test.writeToStr).close
A random query like
https://api.soundcloud.com/tracks.json?genre=Rnbhiphop
gives something like
[
  {
    "kind": "track",
    "id": 161532719,
    (...)
    "artwork_url": null,
    (...)
  },
  {
    "kind": "track",
    "id": 161532718,
    (...)
    "artwork_url": null,
    (...)
  },
  (...)
]
In many, many cases, artwork_url is null, although this is not consistent.
However, when looking at the single track id 161532719 (first in list above) with
http://api.soundcloud.com/tracks/161532719.json
we get
{
  "kind": "track",
  "id": 161532719,
  (...)
  "artwork_url": "http://i1.sndcdn.com/artworks-000087026689-ogd56p-large.jpg?e76cf77",
  (...)
}
... which strangely enough reveals that track 161532719 HAS a valid artwork_url. The same is the case with many other tracks.
Is this a bug, or am I doing something wrong here?
It looks like the collection endpoint (in this case filtered by genre) has a backend bug where the artwork is not fetched.
null is not the same as undefined; rather, it is intentionally set. If you encounter a null value, you can take the data gathered from the single-track endpoint and aggregate it into the tracks data from there.
If you choose to do this, I recommend firing the request to the single-track endpoint the second you have the id you need, and updating the entry using that id.
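For example, a minimal sketch of that fallback (fetch and async/await are assumed to be available; depending on your setup the API may also require a client_id parameter):
// For each track whose artwork_url is null, fetch the single-track endpoint
// and take its artwork_url instead.
async function fillMissingArtwork(tracks) {
  return Promise.all(tracks.map(async (track) => {
    if (track.artwork_url !== null) return track;
    const res = await fetch('https://api.soundcloud.com/tracks/' + track.id + '.json');
    const full = await res.json();
    return { ...track, artwork_url: full.artwork_url };
  }));
}
// Usage: fillMissingArtwork(tracksFromGenreQuery).then(tracks => console.log(tracks));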
If the uploader of the track didn't attach an image to it, SoundCloud shows the uploader's avatar instead.
But when you try to get "artwork_url" from the JSONObject of such a track, you will get null.
To fix this issue, and to use the same logic SoundCloud uses, just add:
String artUrl = trackJson.isNull("artwork_url") ? null : trackJson.getString("artwork_url");
if (artUrl == null) {
    // Fall back to the uploader's avatar, as the SoundCloud player does.
    JSONObject user = trackJson.getJSONObject("user");
    artUrl = user.getString("avatar_url");
}
I have created a controller for an "applications" table. The web and REST interfaces are working, but I think the add and edit functions could be better.
When I tested add and edit, I found the data needed to be posted in web form format (not JSON).
I found I needed to use $this->request->input('json_decode') in the save to decode the JSON data. I thought this happened automagically.
This function now works for add (edit is similar) and renders my json/add.ctp so I can return the saved record to the user.
public function add() {
    if ($this->request->is('post')) {
        $this->Application->create();
        // Is the request REST passing a JSON object?
        if (preg_match('/\.json/', $this->request->here)) {
            // This is a REST call
            $this->set('status', $this->Application->save($this->request->input('json_decode')));
        } else {
            // This is an interactive session
            if ($this->Application->save($this->request->data)) {
                $this->Session->setFlash(__('The application has been saved.'));
                return $this->redirect(array('action' => 'index'));
            } else {
                $this->Session->setFlash(__('The application could not be saved. Please, try again.'));
            }
        }
    }
}
I used $this->request->here to see if the URL ends in ".json". Is this the "correct" way to detect the REST call?
There is an entire section in the CakePHP Book for this. I think it will answer your question(s):
http://book.cakephp.org/2.0/en/development/rest.html
The question is: does your action accept both JSON data and form data, or just JSON data?
The .json extension is purely for the output of your data; you can send JSON data with the .xml extension as well, the difference being that once the data is serialized, it will be output as XML.
if ($this->request->is('post')) {
    if (empty($this->request->data)) {
        $data = $this->request->input('json_decode', true);
    } else {
        $data = $this->request->data;
    }
} else {
    $data = $this->params['url'];
}
The above is roughly what you should be doing: check whether the data comes from a form; if not, decode the JSON; and if it is not a POST, use the parameters that were included in the URL.
I am not saying this is the "right" way to do it, but that's probably what you are looking for.