I need to create a SAS (shared access signature) so I can set up an Azure SQL Extended Events session. The session needs a file target backed by storage accessed via SAS, and I can't create one that works. Here's what I've tried:
Identified a general-purpose storage account (not blob-only); I'm pretty sure I need general-purpose so I can create files directly.
Created a file share therein.
Using Azure Storage Explorer, right-clicked that file share and selected "Get Shared Access Signature."
Checked Read, Write, and List, then created it.
This gives me the URL https://mystorageacct.file.core.windows.net/xevents?st=2018-12-25T16%3A29%3A51Z&se=2018-12-29T16%3A29%3A00Z&sp=rwl&sv=2018-03-28&sr=s&sig=mysig
If I just try to follow this URL, or create a CloudFile object with it in code, I get the oft-seen error: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. Signature did not match. String to sign used was rwl 2018-12-25T16:29:51Z 2018-12-29T16:29:00Z /file/cs7f0fbc5104d4ax435dx883/$root 2018-03-28".
Tried adding in comp=list&restype=container as suggested here. No joy.
Ensured I have no access policy in use.
Went to the Azure portal and created a different SAS at the storage-account level (I couldn't see a way to create one on the file share). That gave me this "File service SAS URL": https://mystorageacct.file.core.windows.net/?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-12-30T01:25:16Z&st=2018-12-26T17:25:16Z&spr=https&sig=mysig
If I try that URL I get "Value for one of the query parameters specified in the request URI is invalid." I don't know which parameter is at fault; they all look fine to me. Based on this doc, srt is the resource type, but I don't know what the value sco indicates.
Very confused, looking for suggestions.
For any future readers: Extended Events sessions confusingly require blob containers (even though they write a file), not general-purpose/file/queue storage. At least I could only get them to work that way.
You are probably confused by how the SAS URLs are presented. The SAS URLs you got are just examples of how to use the SAS token; they can't be used directly, which is why you saw those errors.
Service-level SAS URL, i.e. the one you got from Storage Explorer.
It's in the format of fileEndPoint/fileShareName?SASToken. The SAS token gives us permission to operate on all files inside the specified file share. To leverage the token, we need to add a file name to the URL, i.e. fileEndPoint/fileShareName/fileName?SASToken (see the sketch below).
comp=list&restype=container is for listing blobs in a blob container, not for a file share.
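For instance, here is a minimal sketch (JavaScript, using the built-in fetch) of completing the Storage Explorer URL; the share name xevents comes from the question, while the file name mytrace.xel is a placeholder:

// Insert the file name between the share name and the SAS token.
const sasToken = 'st=...&se=...&sp=rwl&sv=2018-03-28&sr=s&sig=...'; // token from Storage Explorer
const fileUrl = 'https://mystorageacct.file.core.windows.net/xevents/mytrace.xel?' + sasToken;
// No Authorization header is needed; the SAS token itself authorizes the request.
fetch(fileUrl).then(res => console.log(res.status)); // expect 200 for a valid token and path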
Account-level SAS URL, the one you got from the Azure portal.
It's in the format of fileEndPoint/?SASToken. Likewise, we need to complete the URL to make it valid, i.e. fileEndPoint/fileShareName/fileName?SASToken. Note that this SAS token has all permissions on all storage resources because all the choices were checked.
srt=sco means we have permission to operate at the service, container, and object levels; it indicates the scope of the permission. Check the doc for details.
I am not familiar with Azure SQL Extended Events sessions, but if you only need to work with files inside one file share, the first option is enough.
I am working on a pipeline which executes an OAuth2 flow in order to access REST API JSON data. Once I have the bearer token, I execute a request which returns the following structure:
As you can see, since the response is quite large, paging is enabled, and as part of the response I get a link to the next page. To get to that resource I also need to present MS-ContinuationToken in the headers. This is how I do it in the config of the Copy activity that I use to get the data from the REST endpoint:
and the issue here is that I only get the first 2000 rows; the next page(s) don't seem to be visited at all. The pipeline executes successfully, but only the first 2000 items are fetched.
NOTE: continuationToken and links.next.headers.value have the exact same values from the initial response.
Even if you fix the other issue, you'll still have a problem with the "next" URL not including "v1". This is a known issue with the Partner Center API team; I've escalated it pretty high, but they don't want to break backwards compatibility by changing the "next" URI to include the v1 or by making it relative. They are considering other options, but I wouldn't hold your breath.
I would ditch the idea of using Data Factory and instead write a .NET console app using the Partner Center SDK.
(You might think to paginate manually with loops etc., but the Copy activity doesn't return e.g. the HTTP headers, so you would need a complex setup that stores the data in a data store and looks up the last page to get the continuation token. I couldn't figure it out.)
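For reference, a hedged sketch of what the manual paging loop looks like outside Data Factory (JavaScript here rather than the suggested .NET, purely for illustration; the response field names items, links.next.uri, and continuationToken are assumptions based on the question's description):

// Hypothetical paging loop; adjust field names to your actual response shape.
async function fetchAll(firstUrl, bearerToken) {
  const items = [];
  let url = firstUrl;
  let continuationToken = null;
  while (url) {
    const headers = { Authorization: 'Bearer ' + bearerToken };
    if (continuationToken) headers['MS-ContinuationToken'] = continuationToken;
    const page = await (await fetch(url, { headers })).json();
    items.push(...(page.items || []));
    if (page.links && page.links.next) {
      // Known issue noted above: the "next" URI omits the /v1 segment,
      // so it has to be re-prefixed before following it.
      url = 'https://api.partnercenter.microsoft.com/v1/' + page.links.next.uri;
      continuationToken = page.continuationToken;
    } else {
      url = null;
    }
  }
  return items;
}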
I have a purchased license from FusionCharts. The FusionCharts documentation here, https://www.fusioncharts.com/dev/upgrading/license-activation, says I have to use this function call to apply the license:
FusionCharts.options.license({
key: 'KEY_GOES_HERE',
creditLabel: false,
});
If I put my key in there, anyone visiting my website can easily take it via the browser's View Source. How do I avoid this? It is a simple HTML website.
You can use any of the following:
Instead of passing the keys directly in JS code, pass them via environment/application variables, and add the env file to .gitignore.
Obfuscate the key and the JS code used for license activation.
Store the value of the key in your database and fetch it at runtime, so that the end user cannot access it as easily.
If you know the domain/sub-domains where the charts will be used, you can restrict the license key to those domains by sending a request to sales@fusioncharts.com.
As far as I know, there is no way to completely hide it from the frontend. Even if the key is hidden in some way, the information has to exist when the request is made, so the key could be found by looking at the request data.
What you wish to protect against determines how you should go about it. If you just want to keep it from being picked up by web scrapers, then I would imagine encrypting it and decrypting it just before use would be sufficient.
If you would like to protect against anyone stealing it, then the only way is to have a server act as a proxy. You would make a call to an API which would then use your API key on the backend, away from the prying eyes of users. Your backend API would then just return to the user the response from fusioncharts.
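A minimal sketch of that proxy idea, assuming a Node.js/Express backend; the route name, upstream URL, and environment variable are all illustrative, not part of FusionCharts' API:

const express = require('express');
const app = express();

// The secret lives only on the server, e.g. in an environment variable.
app.get('/api/data', async (req, res) => {
  const upstream = await fetch('https://example.com/some-api', {
    headers: { Authorization: 'Bearer ' + process.env.SECRET_API_KEY },
  });
  // Relay the upstream response; the key never reaches the browser.
  res.json(await upstream.json());
});

app.listen(3000);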
I am looking for assistance with downloading and then moving a OneDrive file that is accessed through a shared directory, via the shared-link method of sharing.
I have two users:
user 'A', a Microsoft Consumer with a regular OneDrive account, who will host a csv file 'test.csv' in a folder 'toshare'
and user 'B', also a regular Microsoft Consumer, who should use the Graph API to download test.csv and then move the file to a subdirectory /toshare/archive.
Aside: I am currently using the Chrome app "Advanced REST Client" to make the REST calls manually, and am getting authenticated OAuth bearer tokens by inspecting network traffic from Microsoft's online "Graph Explorer" tool. Once we understand the calls, we'll integrate them into our Java app.
I have successfully followed the instructions here:
https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/api/shares_get
to view the folder contents.
To be more explicit, user 'A' went into OneDrive, right-clicked the folder 'toshare', and selected to share a link. I converted the share link to a share token and then used the following Graph API call as user 'B':
GET https://graph.microsoft.com/v1.0/shares/<share-token>/root?$expand=children
This shows me all the files in the directory, including 'test.csv'.
Now, using this information, how can I download test.csv? Assume user 'B' doesn't know the name of the file but can identify it as a .csv file (we can do this in code). There does not appear to be much documentation on how to download files through a share.
The closest I've gotten was to take the "webUrl" attribute of the children object for my file, turn that into a share token, and call
GET https://graph.microsoft.com/v1.0/shares/<child-share-token>/root
This shows me the file metadata. I then try to download it by roughly following the API documentation for downloading content (https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/api/item_downloadcontent):
GET https://graph.microsoft.com/v1.0/shares/<child-share-token>/root/content
This is interesting because it works if I make the call as user 'A', but it does not work for user 'B', who instead gets a 403 in Advanced REST Client. (If I run it in Graph Explorer, I get "The site in the encoded share URI is invalid." instead, which, as I've discovered through other experimentation, really means there's an authorization issue.)
GET https://graph.microsoft.com/v1.0/shares/<share-token>/root:/test.csv:/content
This also does not work; it returns "400 Bad Request" with the message "Resource not found for the segment 'root:'." It seems like path-style file navigation does not work for shared directories?
At this point I'm rather stuck. After downloading the file, I also would like to move it into a subdirectory, denoting that it has already been read in. I'd also like to get this working for OneDrive for Business, but that seems to be another set of challenges that I'll leave for another day.
Any insight would be great thanks,
Jeremy
It's best to consider the shares/{id} segments to be similar to drives/{id}, at which point all of the previous documentation around children access is applicable. Given your scenario I'd use the path syntax:
https://graph.microsoft.com/v1.0/shares/<share-token>/root/children/test.csv
This obviously necessitates knowing the file name, but it sounds like you already have an algorithm to do that.
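As a hedged sketch (JavaScript), the download as user B would then look something like this, reusing the original share-token and appending /content per the download doc linked in the question:

// Download test.csv as user B, reusing the ORIGINAL share-token.
const itemUrl = 'https://graph.microsoft.com/v1.0/shares/<share-token>/root/children/test.csv';
const res = await fetch(itemUrl + '/content', {
  headers: { Authorization: 'Bearer <user-B-token>' },
});
// Graph answers with a 302 to a pre-authenticated download URL; fetch
// follows the redirect, so the body here is the CSV itself.
const csv = await res.text();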
Theoretically your approach of creating a child-share-token would work, but it would then require user B both to provide authentication and to have explicit permissions. Since your share-token came from a sharing link, user B is most likely getting permission by virtue of having the URL, in which case generating a new token probably removes the special token that allows this to work. That's why it's best to always use the original share-token where possible.
Similar rules will apply to moving the file. First off, we'll assume that the sharing link provides the ability to "Edit"; otherwise none of this will work :). Second, we'll assume that the archive folder already exists (if it doesn't, you'd need to create it using a POST to https://graph.microsoft.com/v1.0/shares/<share-token>/root/children that looks like what we've documented here).
To move the file, you'd PATCH https://graph.microsoft.com/v1.0/shares/<share-token>/root/children/test.csv and provide a new parentReference, as documented here. It's always best to use id values if you have them, but you should also be able to provide the path to the parent in the form /shares/<share-token>/root/children/archive.
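A sketch of that move request (the path form of the parentReference follows the suggestion above and is an assumption, not verified behavior):

// Move test.csv into the archive folder by patching its parentReference.
await fetch('https://graph.microsoft.com/v1.0/shares/<share-token>/root/children/test.csv', {
  method: 'PATCH',
  headers: {
    Authorization: 'Bearer <user-B-token>',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    parentReference: { path: '/shares/<share-token>/root/children/archive' },
  }),
});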
I've been trying to get file uploads to work, following the instructions for both Dropbox and S3 but each time I just get this message:
File Upload URL not provided
It doesn't seem to be making any calls to the server. I've found this mention of a bug around file uploads:
https://github.com/formio/ngFormio/issues/322
But I suspect that applies if you're hosting it yourself. I'm using the cloud version.
I've configured it with e.g. the S3 bucket's URL, authentication etc.
What does this error actually mean?
Update: here's the syntax I'm using:
<formio form="https://formview.io/#/xxxxxxxxxxxxxxxxxxx/applicationform" url="'https://formview.io/#/xxxxxxxxxxxxxxxxxxx/applicationform'"></formio>
Thanks
In order to make the uploads work, you need to provide the URL of your form, which is used to generate the upload token to upload the files to the 3rd party providers. This can be done in one of two ways.
<formio src="'https://examples.form.io/example'"></formio>
You would use the above if you wish to render the form from the form's JSON REST API. In many cases, you may wish to provide the actual form object instead (which I suspect is what you are doing), like so:
<formio form="{...}"></formio>
This works fine for rendering the form, but it does not provide the URL context for file uploads. For this reason, we have the url parameter which you can include along with your form object for file uploads to work.
<formio form="{...}" url="'https://examples.form.io/example'"></formio>
Providing the url this way is passive. The form will not try to submit to that url, but rather just use it as the url configuration for file uploads.
I was trying to use the dashboard import API from v1.0, which can be found in the REST API reference. I logged in to http://localhost:8083/dev/api/docs/#/, gave the correct authorization token, a .dash file in the body, and a 24-character importFolder, and hit the Run button to fire the API. It returns HTTP 201, which means the request was successful. However, when I go back to the homepage, I don't see any new dashboard imported into the specified folder. I have tried both cases: where the importFolder already exists (created manually by me), and where it does not exist, in which case I expect the API to create it for me. Neither of these creates/imports the dashboard.
A few comments that should help you resolve this:
When running the command from the interactive API reference (swagger) you don't need the authentication token, because you're already logged in with an active session.
Make sure the JSON of your dashboard is valid by saving it as a .dash file and importing it via the UI.
The folder field is optional - if you leave the field blank, the dashboard is imported to the root of your navigation/folders panel.
If you'd like to import to a specific folder, you need to provide the folder ID, not its name. It can be found several ways, such as using the /api/v1/folders endpoint, where you can provide a name filter field and use the oid property of the returned object as the value for the folder field in the import endpoint (see the sketch after this list).
If you still can't get this to work, use Chrome's developer tools to look at the outgoing request when you import from the UI, and compare that request (headers, body and path) to what you're doing via Swagger in order to find the issue.
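For example, a hedged sketch (JavaScript) of the folder lookup; only the /api/v1/folders endpoint, the name filter, and the oid property come from the notes above, while everything else (auth header, folder name) is an assumption:

// Resolve the folder's oid by name, then use it as the import "folder" value.
const res = await fetch('http://localhost:8083/api/v1/folders?name=MyReports', {
  headers: { Authorization: 'Bearer <api-token>' },
});
const folders = await res.json();
const folderOid = folders[0].oid; // pass this, not the folder name, to the import call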