I am new to REST and I am trying to test a REST call to my private Azure blob storage. I downloaded a small REST call testing program that asks for a URL and Headers (both as strings).
I need to list all the blobs in a container using the method described here: List Blobs (REST API)
I am basically wondering how to write the Headers (to include my Key to access the private container).
Thanks
Edit: The program I use to test REST calls is an extension for Chrome named "Simple REST Client"
I think you need to give that Chrome plugin two headers, along these lines:
Authorization: SharedKey accountname:signature
x-ms-date: Thu, 24 Jan 2013 21:33:40 GMT (the current UTC date/time, in RFC 1123 format)
where the signature is not your account key itself, but a Base64-encoded HMAC-SHA256 of a canonicalized description of the request, keyed with your account key.
At least according to this document:
http://msdn.microsoft.com/en-us/library/windowsazure/dd179428.aspx
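As a minimal sketch in Python of how those headers are built for a List Blobs call (the account name, container name and key below are placeholders, and the API version is just an example), you could compute and print the values, then paste them into the REST client:

import base64, hmac, hashlib
from datetime import datetime, timezone

account = "myaccount"                      # placeholder
container = "mycontainer"                  # placeholder
account_key = "<your Base64 account key>"  # placeholder
api_version = "2020-10-02"                 # example version

# x-ms-date must be the current UTC time in RFC 1123 format
now = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")

# String-to-sign for a GET with no body: the 11 standard headers stay empty,
# then the canonicalized x-ms-* headers, then the canonicalized resource
# (query parameters sorted alphabetically, one per line as name:value).
string_to_sign = (
    "GET\n" + "\n" * 11 +
    "x-ms-date:" + now + "\nx-ms-version:" + api_version + "\n" +
    "/" + account + "/" + container + "\ncomp:list\nrestype:container"
)

signature = base64.b64encode(
    hmac.new(base64.b64decode(account_key),
             string_to_sign.encode("utf-8"),
             hashlib.sha256).digest()).decode()

print("URL: https://%s.blob.core.windows.net/%s?restype=container&comp=list" % (account, container))
print("x-ms-date: " + now)
print("x-ms-version: " + api_version)
print("Authorization: SharedKey %s:%s" % (account, signature))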
I'm working in an Asp.Net Core 2 web api for files hosted at Google Cloud Storage. The files hosted there are not public, so I can't use the MediaLink property of the object. I tried to make a download endpoint using MemoryStream but when there are many users downloading large files at once I run into memory issues.
My question is: is there a way to create something like a one-time download link for a file, or something similar?
I'm also trying to implement what's described in this link but I'd need to give the bearer token to the user. I can't do that.
Any tips?
Yes. Google Cloud Storage offers a feature called "signed URLs" that is what you described: a URL that is only good for a short while to download a single file. The idea is that you craft a download URL, then use the private key of a service account to "sign" the URL. Anyone holding that final URL can use it to act as that service account for the purpose of downloading that one object.
Take a look: https://cloud.google.com/storage/docs/access-control/#Signed-URLs
Writing code to generate the signed URL is a bit tricky, but the client libraries provide helper methods in several languages to do it for you. You can also generate one with the gsutil command: gsutil signurl -d 10m privatekey.p12 gs://bucket/foo
There is a code sample for generating the signed URLs programmatically on their GitHub project: Signed URLs
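If you happen to be using the Python client library instead, generating one takes only a few lines. A minimal sketch, assuming placeholder bucket/object names and a service account key file:

from datetime import timedelta
from google.cloud import storage

# The service account JSON key is needed so the client can sign the URL.
client = storage.Client.from_service_account_json("service-account.json")
blob = client.bucket("my-bucket").blob("path/to/file.pdf")

# The URL is valid for 10 minutes and grants GET access to this single object.
url = blob.generate_signed_url(expiration=timedelta(minutes=10), version="v4")
print(url)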
I managed to create it using C#. I'm posting it here because it may be useful to someone else:
1 - Create your private key
2 - Declare a UrlSigner field:
private readonly UrlSigner _urlSigner;
3 - In your class constructor:
using (var stream = File.OpenRead(_googleSettings.StorageAuthJson))
{
_urlSigner = UrlSigner.FromServiceAccountData(stream);
}
_googleSettings.StorageAuthJson has the physical path of the json file you downloaded when creating your key.
4 - Method to get the URL:
public string GetSignedUrl(string bucketName, string objectName, TimeSpan duration) {
    // The fourth argument is the HTTP method; passing null defaults to GET.
    var url = _urlSigner.Sign(bucketName, objectName, duration, null);
    return url;
}
I am working on automating the deployment of my agent, but I'm having trouble doing some steps programmatically.
Dialogflow Fulfillment URL
I was able to get Export/Restore to work using the Dialogflow Enterprise API: https://cloud.google.com/dialogflow-enterprise/docs/reference/rest/v2beta1/projects.agent/export and https://cloud.google.com/dialogflow-enterprise/docs/reference/rest/v2beta1/projects.agent/restore with the agentContent.
But, since the agentContent is an encoded string, there is no way to replace the Fulfillment URL before restoring. Is there a way to update the Fulfillment URL via API?
Dialogflow Google Assistant Integration Settings
Same question with the Google Assistant Integration Settings. Because this is part of the Dialogflow console, I see this as part of the agent. Ideally, we can programmatically create all parts of the agent. Is this available or on the roadmap?
Google Actions: Action Discovery and Update
Lastly, there is the Action Discovery and Update section of the Google Actions console, where we enable intents for push or daily updates. Is there a way to programmatically do this as well?
Thanks.
There is no way to update the fulfillment URL through the API.
agent_content is indeed an encoded byte string of the zip file. But you can programmatically edit the contents of the export, re-zip it, and generate the byte string yourself.
Here's a Python code snippet that may help:
import json
import shutil

# Edit the fulfillment URL inside the exported agent
with open("skeleton_bot/agent.json", "r") as jsonFile:
    data = json.load(jsonFile)
data['webhook']['url'] = "https://yoururl.com"
with open("skeleton_bot/agent.json", "w") as jsonFile:
    json.dump(data, jsonFile)

# Re-zip the edited export and read it back as bytes
shutil.make_archive('skeleton_bot', 'zip', 'skeleton_bot')
with open("skeleton_bot.zip", 'rb') as file_data:
    agent_content = file_data.read()
You can then use this byte string to import/restore the agent into Dialogflow.
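For illustration, here is a rough sketch of posting that content to the v2beta1 restore endpoint, reusing agent_content from the snippet above ("your-project-id" is a placeholder, and I'm assuming application-default credentials for a service account with Dialogflow access; in the REST JSON body the bytes go Base64-encoded):

import base64
import requests
import google.auth
import google.auth.transport.requests

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())

resp = requests.post(
    "https://dialogflow.googleapis.com/v2beta1/projects/your-project-id/agent:restore",
    headers={"Authorization": "Bearer " + credentials.token},
    # bytes fields are Base64-encoded in the REST JSON representation
    json={"agentContent": base64.b64encode(agent_content).decode()})
resp.raise_for_status()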
I am trying to build and test the autocomplete feature on a master item lookup table using Azure Search (for an ASP.NET MVC application). The search index was created with a suggester named SG on ItemDisplayName.
I wanted to test it on the Azure portal first, so that I could then replicate the results in code, because the results I am getting in code are quite unexpected.
As I type a substring of the ItemDisplayName, the expectation was that up to 5 matching names would be displayed.
On the portal, I tried a query string of
search=str&suggesterName=SG
with the base request URL containing the index, the api-version and the suggesterName, but I don't get results for items containing 'str', nor with the fuzziness described below.
Could you please guide me on:
[1] how I can get suggester output in the Azure portal Search explorer
[2] whether I can control fuzziness using queryType and ~1, ~2
I was referring to these 3 links:
1) https://learn.microsoft.com/en-us/rest/api/searchservice/suggestions
and
2) https://channel9.msdn.com/Shows/Azure-Friday/Azure-Search-103-Azure-Search-Suggestions-with-Liam-Cavanagh
3) gunnarpeipman.com/2016/07/azure-search-suggesters/
The Azure Search portal doesn't support the Suggestion API yet. You will need to use an HTTP client like Fiddler or Postman.
Make sure you use the right URL for your Suggest requests:
https://[service name].search.windows.net/indexes/[index name]/docs/suggest
Please use our User Voice page to vote for adding the Suggest API to the Portal: https://feedback.azure.com/forums/263029-azure-search
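As a rough sketch of what such a request looks like from code (the service name, index name, key and api-version below are placeholders; note that the Suggest API's fuzzy parameter is a simple boolean rather than the ~1/~2 syntax):

import requests

url = "https://myservice.search.windows.net/indexes/myindex/docs/suggest"
params = {
    "api-version": "2017-11-11",   # example version
    "search": "str",               # the partial term the user has typed
    "suggesterName": "SG",
    "fuzzy": "true",               # boolean fuzziness, not ~1/~2
    "$top": 5,                     # return at most 5 suggestions
}
headers = {"api-key": "<your query key>"}

resp = requests.get(url, params=params, headers=headers)
resp.raise_for_status()
for item in resp.json()["value"]:
    print(item["@search.text"])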
I have already created triggers and actions for my IFTTT channel. Now I want to create a recipe using these triggers and actions, but I want to do it not through Maker, but using an API call. What would be the format of the API call (behind the Maker UI) to create a recipe? I can't seem to find any documentation or examples.
Click the URL in your maker settings to see IFTTT's description, which reads as follows, and shows your individual API key which must be used in your code:
To trigger an Event, make a POST or GET web request to:
https://maker.ifttt.com/trigger/{event}/with/key/tIpcUAlqRkf8Mls9XepGN
With an optional JSON body of:
{ "value1" : "", "value2" : "", "value3" : "" }
The data is completely optional, and you can also
pass value1, value2, and value3 as query parameters or form variables.
This content will be passed on to the Action in your Recipe. You can
also try it with curl from a command line.
curl -X POST https://maker.ifttt.com/trigger/{event}/with/key/tIpcUAlqRkf8Mls9XepGN
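The same request from Python, as a minimal sketch ("my_event" and the key are placeholders; the JSON body is optional):

import requests

url = "https://maker.ifttt.com/trigger/my_event/with/key/<your key>"
payload = {"value1": "a", "value2": "b", "value3": "c"}  # passed through to the recipe's Action

resp = requests.post(url, json=payload)
print(resp.status_code, resp.text)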
Yes, there is documentation for creating triggers and actions. You need to expose the API from your own server, with endpoints like:
http://api.test.com:8080/ifttt/v1/triggers/{{triggers}}
More information:
Log in at https://developers.ifttt.com/channels/t4/triggers
Click Triggers in the left menu
Create the trigger name
Then give your API URL as the endpoint (a sketch of such an endpoint follows below).
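For illustration only, here is a rough Flask sketch of what such a trigger endpoint could look like. Flask, the field names and the response shape are assumptions based on my reading of the service API docs, so verify them against developers.ifttt.com before relying on this:

import time
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/ifttt/v1/triggers/<trigger_slug>", methods=["POST"])
def trigger(trigger_slug):
    body = request.get_json(silent=True) or {}
    limit = body.get("limit", 50)  # IFTTT asks for at most this many items

    # Each returned item carries your ingredient fields plus a meta block
    # with a unique id and a unix timestamp, newest first.
    items = [{
        "my_ingredient": "example value",
        "meta": {"id": "item-1", "timestamp": int(time.time())},
    }]
    return jsonify({"data": items[:limit]})

if __name__ == "__main__":
    app.run(port=8080)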
There is no public API for this. With the old-style recipes there is an internal API (you can see REST calls like create/api/state while building a recipe), but it may be protected from third-party use, and I did not check the traffic of the new applet maker platform. Note that if you are a partner you can embed your recipes into your apps. Upper-tier customers are allowed to request new features (such as an API or templates).
I was looking for the same, but after wasting hours, no luck. So I decided to create one myself. This might be too late, but here's an IFTTT boilerplate repo (https://github.com/Dipen-Dedania/ifttt-boilerplate) using Node.js and Express to create your own recipe (custom triggers and actions).
If I'm using the GWT FileUpload widget and a FormPanel, can someone explain how to handle the upload to the Blobstore on Google App Engine?
Take a look at gwtupload. There are examples on how to use it with GAE Blobstore.
The Google Blobstore is specifically designed to upload and serve blobs via HTTP. The Blobstore service (obtained using BlobstoreServiceFactory.getBlobstoreService()) generates an HTTP POST action (via createUploadUrl) for you to use in the HTML form. By posting a file to it you upload your blob to the Blobstore. When you generate this action you provide the path to a handler (servlet) where you have access to the uploaded blob key:
Map<String, BlobKey> blobs = blobstoreService.getUploadedBlobs(req);
BlobKey blobKey = blobs.get("data");
Note that "data" is the name of the file field in your form. All you have is a key to the blob (your file). From here you take control: you can save this key for later and/or immediately serve the blob on a page (using the key):
BlobKey blobKey = new BlobKey(req.getParameter("blob-key"));
blobstoreService.serve(blobKey, res);
Of course, for details see Google documentation.
One nice feature of the Blobstore is that it's integrated with the Google Mapper (rudimentary map-reduce) service (a work in progress), which lets you process files uploaded as blobs line by line: http://ikaisays.com/2010/08/