What does ```treat_info_as_error``` do in Alpha Vantage python module - alpha-vantage

For the alpha-vantage python module, does anyone know what the argument treat_info_as_error does?
This is what I found in terms of documentation. It's not much.
TimeSeries
Definition: TimeSeries(key=None, output_format='json', treat_info_as_error=True, indexing_type='date', proxy=None)
This class implements all the api calls to time series

Some Alpha Vantage API calls don't return data; instead, the JSON response contains a key such as "Note" or "Information" with an explanatory message. This parameter controls whether the library treats those responses as errors in your code.
This happens, for example, if you hit an API call threshold, or use the "demo" API key for calls it isn't suited for. For some reference, here is a link to the docs.
Try this API call:
https://www.alphavantage.co/query?function=TIME_SERIES_INTRADAY&symbol=TSLA&interval=5min&apikey=demo
You'll see it says "Information" as the first key.
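A minimal sketch of the difference in code, assuming the alpha_vantage package; as I understand the library, with treat_info_as_error=True it raises a ValueError when the response carries one of those informational keys:
```python
from alpha_vantage.timeseries import TimeSeries

# With treat_info_as_error=True (the default), a "Note"/"Information"
# response is raised as an exception instead of returned as data.
ts = TimeSeries(key="demo", output_format="json", treat_info_as_error=True)

try:
    data, meta = ts.get_intraday(symbol="TSLA", interval="5min")
except ValueError as err:
    # e.g. a rate-limit note, or the "demo" key used on an unsupported call
    print("Informational response treated as error:", err)
```
With treat_info_as_error=False, the same call would hand you the informational JSON back as ordinary data, and you'd have to check for those keys yourself.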

Related

Macros in Datafusion using Argument setter

I want to make the Data Fusion pipeline reusable by supplying parameter values through Argument Setter. As suggested in many other answers, I have tried implementing the reusable pipeline example given in the Google Cloud guide, but I was not able to pass the parameter JSON file. So how do I create the API endpoint for that parameter JSON file stored in Google Cloud Storage? Please explain the values to be passed to Argument Setter, like URL, request, response, etc., if any of you have implemented this in your projects.
Thank you.
The Argument Setter plugin reads from an HTTP endpoint, and that endpoint must be publicly accessible, as described in the GCP documentation. Currently, there is no way to read from a non-public file stored in GCS. This behavior has been reported to CDAP for improvement through this ticket.
Can you please provide what you've tried so far and where you're stuck?
The URL field in Argument Setter would contain the API endpoint you're making a call to. Make sure you include any headers your call needs, like Authorization, Accept, etc.
If you're having issues with Argument Setter, a good check is to use curl or any other tool to make sure you're able to talk to the endpoint you're trying to use.
Here's some documentation about Argument setter: https://github.com/data-integrations/argument-setter
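For example, a quick check from a terminal (the endpoint and header below are placeholders):
```
curl -H "Authorization: Bearer <token>" https://example.com/path/to/arguments.json
```
If curl can fetch the file, Argument Setter should be able to as well.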
Define a JSON file with appropriate name/value pairs. Upload it to a GCS bucket and make it public by changing permissions (add "allUsers" to the permissions list). When you save it, the file will be marked "Public to internet".
Copy the https path to the file and use it in Argument Setter. If you're able to access this path from curl or your browser, Argument Setter will be able to as well.
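For reference, a minimal example of such a file, following the name/value shape documented in the argument-setter README (the argument names and values below are hypothetical):
```json
{
  "arguments": [
    { "name": "input.path", "value": "gs://my-bucket/input" },
    { "name": "output.table", "value": "my_dataset.my_table" }
  ]
}
```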
There are other problems I've encountered while using Argument Setter, though: the pipeline often doesn't let runtime arguments supersede the default values provided by the URL, especially when the pipeline is duplicated.
To make the file public, you have to make your bucket public; currently there is no other way:
gsutil iam ch allUsers:objectViewer gs://BUCKET_NAME

Make my Google Home verify an oral code

I would like to build an app with oral code verification.
I could just set my code in Dialogflow beforehand and then just verify it:
GH: "To continue, give me the code"
Me: "1 2 3 4"
GH: "Access granted" / "Access denied"
But how can I capture that input and check the code in Dialogflow?
First of all - consider if you really want to do this. Having someone say a passcode out loud isn't really very secure and adds very little additional security in a multi-user environment.
There are two stages to this - the first is setting up an Intent to handle this, specifically in the format you want, and the second would be handling and verifying this is the correct code.
Setting up the Intent
We'll need two intents - one that prompts and sets a context so we know we're expecting the validation code, and one that checks for the code.
The prompting intent asks for the code; the notable part is that it sets an output context. We'll see why that matters in a moment.
The intent that handles the numeric input has more going on. First, note that it requires an input context matching the output context from the previous intent, which means it should only match if that context has been set. This lets us talk about numbers elsewhere in the conversation without triggering this validation.
Next, we're looking for sequences of numbers that match the @sys.number-sequence built-in entity type. There are other entity types that may be useful for you - see the documentation for details and pick one that makes sense, or experiment to find what works best in your case.
Finally, we're going to use a webhook for fulfillment to verify whether the code is correct. Which is the next section...
Verifying the code
While there are ways to do the verification without a webhook, this is really the most straightforward way to do it. If you're using Google's library to handle input from Dialogflow, you can get the value with something like
var code = app.getArgument('number-sequence');
using whatever the parameter name is. If you're not using the library, you can find this in the JSON at result.parameters.number-sequence.
You would then verify this code, however you want, and return a message indicating if it is correct or not.
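As an illustration, here's a minimal webhook sketch in Python/Flask, assuming the v1-style payload described above where parameters live under result.parameters (the passcode, route, and response keys are for illustration):
```python
from flask import Flask, request, jsonify

app = Flask(__name__)

EXPECTED_CODE = "1234"  # hypothetical passcode, for illustration only

@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json()
    # Dialogflow (v1-style) puts matched entity values under result.parameters.
    sequence = body.get("result", {}).get("parameters", {}).get("number-sequence", "")
    # Strip spaces so "1 2 3 4" and "1234" compare equal.
    granted = str(sequence).replace(" ", "") == EXPECTED_CODE
    speech = "Access granted" if granted else "Access denied"
    return jsonify({"speech": speech, "displayText": speech})
```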
If you want to use a sequence of numbers as your code, you can use the @sys.number-sequence entity to recognize it and then check the code in your webhook.
Another way would be to simply make a custom entity 'code' that has an entry of '1234'.

Artifactory search: REST API returns results but web search doesn't

Try searching for any classes here, e.g. use the string ReportJobMailNotification*
https://jaspersoft.jfrog.io/jaspersoft/webapp/#/search/archive/
It returns no results. Sometimes (seemingly at random) it comes up with a "Method Not Allowed" warning.
But if you do the same search via REST API you get consistently many results:
https://jaspersoft.jfrog.io/jaspersoft/api/search/archive?name=ReportJobMailNotification*
Any specific reasons why this might happen?
It seems to work now. I guess it was a temporary glitch...

Manipulating path mapping in AWS API gateway integration

I would like to modify a URL parameter /resource/{VaRiAbLe} in an API Gateway to S3 mapping so that it actually points to /my-bucket/{variable}. That is, it accepts mixed-case input and maps it to a lower-case name. Mapping path variables to S3 integrations is simple enough, but I can't seem to get a lower-case mapping working.
Reading through the documentation for mapping parameters, it looks like the path parameters are simple string values (and not templated values), and so defining a mapping as method.request.path.variable.toLowerCase() won't work.
Does anyone have any ideas how to implement this mapping?
- Map path variables to a JSON body, and then call another API method that actually does the S3 call?
- Bite the bullet, and implement a Lambda function to do the S3 GET for me?
- Find another S3 API method that accepts a JSON body that I can use to get the data?
Update: orchestrated calls
Following the info from Jack, I figured I should try the orchestrated call, since the traffic volume is low enough that I'm sure I won't be able to keep the Lambda warm.
As a proof of concept, I added two methods to my resource (sitting at /resource/{variable}): GET and POST. The GET method chains to the POST, which does the actual retrieval of the data.
POST method configuration
This is a vanilla S3 proxying method, where you set the URL Path parameter for {variable} to be method.request.body.variable.
GET method configuration
This is an HTTPS proxying method. You'll need the URL of the POST method, so you'll need to deploy the API to get it. The only other configuration needed here is a body mapping template with content like:
{
    "variable" : "$input.params('variable').toLowerCase()",
    "something" : "$input.params('something')"
}
This should be enough to get this working.
The downside to this looks to be that I'm adding an extra method (POST) to my API for that resource that could confuse consumers of the API. I think it should be possible to make the POST on the /resource resource, which would at least make a bit more sense from an API design standpoint.
Depending on how frequently this API will be called, I'd either go with the Lambda proxy or chaining two API Gateway methods together. If the API is called frequently enough to keep a Lambda function warm (say once a minute), then go with Lambda. If not, go with the orchestrated API call.
The orchestrated API call would be interesting, I'd be happy to help with that if you have questions.
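If you do go the Lambda route, a minimal sketch in Python with the standard Lambda proxy integration might look like this (the bucket name is hypothetical; the event shape is the usual pathParameters one):
```python
import base64
import boto3

s3 = boto3.client("s3")
BUCKET = "my-bucket"  # hypothetical bucket name

def handler(event, context):
    # With Lambda proxy integration, path parameters arrive in event["pathParameters"].
    key = event["pathParameters"]["variable"].lower()
    obj = s3.get_object(Bucket=BUCKET, Key=key)
    return {
        "statusCode": 200,
        "headers": {"Content-Type": obj.get("ContentType", "application/octet-stream")},
        # Base64-encode so binary objects survive the proxy integration.
        "body": base64.b64encode(obj["Body"].read()).decode("utf-8"),
        "isBase64Encoded": True,
    }
```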
As far as I know the only S3 API for getting object data is the GET that is documented in their API reference.

How do I use multiple levels in my REST call?

I'm trying to create a REST service with the following signature for a GET call:
//somesite/api/customer/1/invoices
Of course, using the correct path I can get this to work, but all the documentation I look at for REST tells me how to query .../api/customer or .../api/customer/id; nothing tells me how to define and get to the level after id.
I suspect it will have something to do with the router code, but could use some instruction on how to get to that next level.
Thanks
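For illustration (the question doesn't name the framework), in a minimal Python/Flask service the extra level is just another path segment declared after the id parameter:
```python
from flask import Flask, jsonify

app = Flask(__name__)

# The child resource ("invoices") is just another segment declared
# after the customer id placeholder.
@app.route("/api/customer/<int:customer_id>/invoices", methods=["GET"])
def customer_invoices(customer_id):
    invoices = [{"id": 1, "customer_id": customer_id, "total": 99.50}]  # stub data
    return jsonify(invoices)

if __name__ == "__main__":
    app.run()
```
Most routing frameworks follow the same pattern: the nested collection is defined as a route template with the parent's id parameter in the middle of the path.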