Import API not working in Sisense - REST

I was trying to use the dashboard import API from v1.0, which can be found in the REST API reference. I logged in to http://localhost:8083/dev/api/docs/#/, gave the correct authorization token, a .dash file in the body, and a 24-character importFolder, and hit the Run button to fire the API. It returns a 201 HTTP response, which means the request was successful. However, when I go back to the homepage, I don't see any new dashboard imported into the said folder. I have tried both cases: where the importFolder already exists (created manually by me), and where it does not exist, in which case I expect the API to create it for me. Neither of these, however, creates/imports the dashboard.

A few comments that should help you resolve this:
When running the command from the interactive API reference (Swagger), you don't need the authentication token, because you're already logged in with an active session.
Make sure the JSON of your dashboard is valid by saving it as a .dash file and importing it via the UI.
The folder field is optional - if you leave the field blank, the dashboard is imported to the root of your navigation/folders panel.
If you'd like to import to a specific folder, you'll need to provide the folder ID, not its name. You can find the ID several ways, for example via the /api/v1/folders endpoint: filter by name and use the oid property of the returned object as the value for the folder field in the import endpoint (see the sketch below).
If you still can't get this to work, use Chrome's developer tools to look at the outgoing request when you import from the UI and compare that request (headers, body and path) to what you're doing via Swagger in order to find the issue.
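If it helps, here is a rough sketch of that flow in Python. The /api/v1/folders lookup matches the endpoint mentioned above; the exact import path and request body are assumptions and should be checked against the v1.0 reference on your installation:

import requests

BASE = "http://localhost:8083"
HEADERS = {"Authorization": "Bearer <your-api-token>"}  # only needed outside the Swagger UI

# 1. Resolve the folder name to its oid via /api/v1/folders (use the oid, not the name)
folders = requests.get(f"{BASE}/api/v1/folders",
                       params={"name": "My Import Folder"},
                       headers=HEADERS).json()
folder_oid = folders[0]["oid"] if folders else None

# 2. POST the dashboard JSON to the import endpoint, passing the oid as importFolder.
#    The path below is a placeholder - copy the exact import path from the API reference.
IMPORT_PATH = "/api/v1/dashboards/import/bulk"
with open("my_dashboard.dash") as f:
    dashboard_json = f.read()

resp = requests.post(f"{BASE}{IMPORT_PATH}",
                     params={"importFolder": folder_oid} if folder_oid else None,
                     headers={**HEADERS, "Content-Type": "application/json"},
                     data=dashboard_json)
print(resp.status_code, resp.text)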

Related

What is the workflow for basic Auth OIDC with Keycloak

I have Keycloak on Docker (v20.0.2), and as you know, some versions change a good part of the UI, so it is hard to follow tutorials around the web...
I am trying to follow this particular tutorial,
https://developers.redhat.com/blog/2020/11/24/authentication-and-authorization-using-the-keycloak-rest-api#keycloak_sso_demo
which seems to be the most up to date. My Keycloak is actually behind Traefik and thomseddon/traefik-forward-auth with a docker-compose file (but the connection through Traefik is fine and I have access to the admin UI).
So at step 10 of the tutorial things change for me; I have to find that particular view like this:
Click on Client Scope in the lateral menu
Click on the Create client scope button
Give the scope a name, and click on the Mapper tab
All mappers are predefined... so there is no "New mapper" option; I don't understand this bit
Then just follow the rest of the tutorial.
With that series of steps I get an error when retrieving the token...
https://keycloak:8443/realms/education/protocol/openid-connect/token
[screenshot of the token request]
(these are fake local data from the realm I created for testing)
It responds with the following, or something similar. I have also tried changing the grant_type to password, and the same thing happens; I cannot get the token:
{
"error": "invalid_client",
"error_description": "Invalid client or Invalid client credentials"
}
But if I do not link a user with a scope/role as the tutorial suggests, then I do get the token; but of course I want to use the role or scope to limit who can see which endpoint and who cannot.
Is there any step I'm missing in this updated UI? Do you get the same error?
Thank you in advance
I have tried to run it with different combinations of options to see if there is a toggle that actually allows me to fetch the token.
Also with different grant_type values.
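For reference, this is roughly the token request I am making (a minimal Python sketch; the realm, client id, and secret are placeholder values from my test setup):

import requests

TOKEN_URL = "https://keycloak:8443/realms/education/protocol/openid-connect/token"

# client_credentials flow for a confidential client (placeholder credentials);
# I have also tried grant_type=password with a username/password
resp = requests.post(
    TOKEN_URL,
    data={
        "grant_type": "client_credentials",
        "client_id": "my-test-client",
        "client_secret": "my-test-secret",
    },
    verify=False,  # self-signed certificate on my local setup
)
print(resp.status_code, resp.json())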
I will build an API in Python (I don't know Java and prefer JSON over XML) that connects to this Keycloak to allow or deny users based on their scope/role/permission.
I need to be able to block users, so that if a Student user tries to access a URL belonging to another Student, they are blocked from that URL. Whether this is based on a role or a scope, I don't know which is preferred or easier to accomplish; the mission is to allow or block users based on some factor in Keycloak that can be used for this.

Can't create working Shared Access Signature for Azure Files

I need to create a SAS so I can create an Azure SQL Extended Event session. The event session needs a file data storage target via SAS and I can't create one that works. Here's what I've tried:
Identified a storage account that's not blob; just general. I'm pretty sure I need general so I can create files directly.
Created a file share therein.
Using Azure Storage Explorer, right-clicked that file share and selected "Get Shared Access Signature."
Checked Read, Write, and List, and created it.
This gives me the URL https://mystorageacct.file.core.windows.net/xevents?st=2018-12-25T16%3A29%3A51Z&se=2018-12-29T16%3A29%3A00Z&sp=rwl&sv=2018-03-28&sr=s&sig=mysig
If I just try to follow this URL or create a CloudFile object with it in code, I get the oft-seen error: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. Signature did not match. String to sign used was rwl 2018-12-25T16:29:51Z 2018-12-29T16:29:00Z /file/cs7f0fbc5104d4ax435dx883/$root 2018-03-28".
Tried adding in comp=list&restype=container as suggested here. No joy.
Ensured I have no access policy in use.
Went to the Azure portal and created a different SAS at the storage account level (couldn't see a way to create it on the file share). That gave me this "File service SAS URL": https://mystorageacct.file.core.windows.net/?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-12-30T01:25:16Z&st=2018-12-26T17:25:16Z&spr=https&sig=mysig
If I try that URL I get "Value for one of the query parameters specified in the request URI is invalid." I don't know which parameter is in question; they look fine to me, but I don't know what the value srt=sco indicates. Based on this doc, srt is the resource type, but I don't know what the value sco indicates.
Very confused, looking for suggestions.
For any future readers, extended event sessions confusingly (because they write a file) require blob containers, not general/file/queue containers. At least I could only get them to work that way.
You are probably confused by how the SAS URLs are presented. In fact, the SAS URLs you got just provide examples of how to use the SAS token; they can't be used directly. Hence the errors you saw.
Service-level SAS URL, i.e. the one you got from Storage Explorer.
It's in the format of fileEndPoint/fileShareName?SASToken. The SASToken gives us permission to operate on all files inside the specified file share. To leverage the token, we need to add fileName in the URL, i.e. fileEndPoint/fileShareName/fileName?SASToken.
comp=list&restype=container is to list blobs in Blob Container, not for File Share.
Account-level SAS URL, i.e. the one you got from the Azure portal.
It's in the format of fileEndPoint/?SASToken. Likewise, we need to complete the URL to make it valid, i.e. fileEndPoint/fileShareName/fileName?SASToken. Note that this SASToken has all permissions on all Storage resources because all the options were checked.
sco means we have permission to operate at the service, container, and object levels, which indicates the scope of the permission; check the doc for details.
I am not familiar with Azure SQL Extended Event sessions, but if you only need to work with files inside one file share, the first option is enough.
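To illustrate the first option, here is a minimal Python sketch of composing a usable request from the service-level SAS (the account, share, and file names below are placeholders):

import requests

# Pieces of the service-level SAS URL from Storage Explorer
file_endpoint = "https://mystorageacct.file.core.windows.net"
share_name = "xevents"
sas_token = "st=...&se=...&sp=rwl&sv=2018-03-28&sr=s&sig=..."  # everything after the '?'

# The SAS URL from Storage Explorer isn't usable on its own; a file name must be inserted:
# fileEndPoint/fileShareName/fileName?SASToken
file_name = "session1.xel"  # placeholder file name
url = f"{file_endpoint}/{share_name}/{file_name}?{sas_token}"

resp = requests.get(url)
print(resp.status_code)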

Calling IBM Function through HTTP

I created a function in IBM Cloud which returns some JSON data when invoked. I am trying to figure out how a client can consume this information. I am unable to find any information on the net. I would preferably like to access the function through an HTTP request or, if that is not possible, through some Python script. Does anyone have more information on how this can be achieved?
Depending on the nature of your action/function, there are different ways to call it. In any case, you can find the required information about the URL (and API key) by clicking the action in the actions panel and selecting 'Endpoints' from there.
If you created a 'plain' action (i.e. one that accepts JSON and returns JSON), you will have to use the API key shown on the panel mentioned above. You can find it -- and the URL to use -- in the 'REST API' section. At the bottom of this page there is also a complete curl command, which you can just copy & paste (you only need to insert the API key).
In case you created a web action (see here for details: https://console.bluemix.net/docs/openwhisk/openwhisk_webactions.html#openwhisk_webactions), you can call it anonymously. The URL for that is different from the one referred to above -- you can find it in the 'web actions' section of the 'Endpoints' tab.
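As an illustration, here is a minimal Python sketch of both variants; the URLs and API key are placeholders that you would copy from the 'Endpoints' panel:

import requests

# Plain action: use the URL and API key from the 'REST API' section.
# The key has the form 'user:password', matching curl's -u option.
api_key = "xxxxxxxx:yyyyyyyy"
user, password = api_key.split(":")
plain_url = "<URL copied from the 'REST API' section>"
resp = requests.post(plain_url, auth=(user, password), json={"name": "World"})
print(resp.status_code, resp.json())

# Web action: no credentials needed, just the URL from the 'web actions' section.
web_url = "<URL copied from the 'web actions' section>"
print(requests.get(web_url).json())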

Downloading and Moving OneDrive files from shared link directory

I am looking for assistance to find out how I can download and move a OneDrive file that is accessed through a shared directory, via the shared link method of sharing.
I have two users:
user 'A' who is a Microsoft Consumer and has a regular OneDrive account and will host a csv file 'test.csv' in a folder 'toshare'
and user 'B' who is also a regular Microsoft Consumer who should use the graph API to download test.csv and then move the file to a subdirectory /toshare/archive
Aside: I am currently using the Chrome app "Advanced REST Client" to manually make the REST calls, and am getting authenticated OAuth bearer tokens by inspecting network traffic from Microsoft's online "Graph Explorer" tool. After we understand the calls, we'll integrate them into our Java app.
I have successfully followed the instructions here:
https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/api/shares_get
to view the folder contents.
To be more explicit, user 'A' went into OneDrive, right-clicked the folder 'toshare', and selected a share link. I have converted the share link to a share token and then used the following API call with the Graph API as user 'B':
GET https://graph.microsoft.com/v1.0/shares/<share-token>/root?$expand=children
This shows me all the files in the directory, which include 'test.csv'.
Now, using this information, how can I download test.csv? Assume user 'B' doesn't know the name of the file, but can identify it by it being a .csv file (we can do this in code; see the sketch below). There does not appear to be much documentation on how to download files through a share.
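For reference, this is roughly how we plan to pick the file out of the children in code (a Python sketch; the bearer token and share token are placeholders):

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <user-B-token>"}
share = "<share-token>"

# List the children of the shared folder and pick out the first .csv file
root = requests.get(f"{GRAPH}/shares/{share}/root?$expand=children", headers=headers).json()
csv_item = next((c for c in root["children"] if c["name"].lower().endswith(".csv")), None)
print(csv_item["name"] if csv_item else "no csv found")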
The closest I've gotten was to take the "webUrl" attribute of the children object for my file, and then turn that into a share token and call
GET https://graph.microsoft.com/v1.0/shares/<child-share-token>/root
This shows me the file metadata, and then I try to download it by roughly following the API documentation for downloading content: https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/api/item_downloadcontent
GET https://graph.microsoft.com/v1.0/shares/<child-share-token>/root/content
This is interesting because it works if I make the call as user 'A', but it does not work for user 'B', who instead gets a 403 in Advanced REST Client. (If I run it in Graph Explorer, I get "The site in the encoded share URI is invalid." instead, which, from other experimentation, really means there's an authorization issue.)
GET https://graph.microsoft.com/v1.0/shares/<share-token>/root:/test.csv:/content
This also does not work; it returns "400 Bad Request" with the message "Resource not found for the segment 'root:'." It seems like path-style file navigation does not work for shared directories?
At this point I'm rather stuck. After downloading the file, I also would like to move it into a subdirectory, denoting that it has already been read in. I'd also like to get this working for OneDrive for Business, but that seems to be another set of challenges that I'll leave for another day.
Any insight would be great, thanks,
Jeremy
It's best to consider the shares/{id} segments to be similar to drives/{id}, at which point all of the previous documentation around children access is applicable. Given your scenario I'd use the path syntax:
https://graph.microsoft.com/v1.0/shares/<share-token>/root/children/test.csv
This obviously necessitates knowing the file name, but it sounds like you already have an algorithm to do that.
Theoretically your approach of creating a child-share-token would work, but it would now require that User B both provide authentication and have explicit permissions. Since your share-token was a sharing link, User B is most likely getting permission by virtue of having the URL, in which case generating a new one probably removes the special token that allows this to work. That's why it's best to always use the original share-token where possible.
Similar rules will apply to move the file. First off, we'll assume that the sharing link provides the ability to "Edit"; otherwise none of this will work :). Second, we'll assume that the archive folder already exists (if it doesn't, you'd need to create it using a POST to https://graph.microsoft.com/v1.0/shares/<share-token>/root/children that looks like what we've documented here).
To move the file you'd want to PATCH to https://graph.microsoft.com/v1.0/shares/<share-token>/root/children/test.csv and provide a new parentReference as documented here. It's always best to use id values if you have them, but you should also be able to provide the path to the parent in the form of /shares/<share-token>/root/children/archive.
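Putting that together, a rough Python sketch of the download and move calls described above (the bearer token, share token, and the archive folder id are placeholders):

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <user-B-token>"}
share = "<share-token>"  # the token derived from the original sharing link

# Download test.csv via the original share token
item_url = f"{GRAPH}/shares/{share}/root/children/test.csv"
content = requests.get(f"{item_url}/content", headers=headers)
with open("test.csv", "wb") as f:
    f.write(content.content)

# Move it into the existing 'archive' subfolder by patching its parentReference
# (using the folder's id is preferred; the path form mentioned above is the alternative)
archive_id = "<id-of-archive-folder>"
move = requests.patch(item_url,
                      headers={**headers, "Content-Type": "application/json"},
                      json={"parentReference": {"id": archive_id}})
print(move.status_code, move.json())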

Unable to set Deezer playlist to private using api

I am working on a project where we create playlists in Deezer using the API. We have been doing this successfully since the call was implemented. However, one of our testers noticed today that the playlists were no longer private. I can verify this is the case in the Deezer API explorer and in my code.
Go to http://developers.deezer.com/api/explorer?url=playlist/4341978# (you need to change the id to a playlist that your account has created and use the getToken feature)
Change the method to POST
Add a parameter public with a value of false
Open the console and watch the response
It will return an "Input Error" message
The exact same thing happens in my code when sending the same kind of request (including the token as a parameter), which has previously definitely worked as intended.
Do you know if the API has changed and I am missing some extra parameters or config? Or is there possibly an error causing this?
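For completeness, this is roughly the call we are making (a minimal Python sketch; the playlist id and token are placeholders):

import requests

playlist_id = "4341978"  # placeholder - a playlist created by our account
resp = requests.post(
    f"https://api.deezer.com/playlist/{playlist_id}",
    params={
        "access_token": "<oauth-token>",  # obtained via the getToken feature in the explorer
        "public": "false",
    },
)
# This used to return true; it now comes back with the "Input Error" response instead
print(resp.status_code, resp.text)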