Bitbucket Server REST API: retrieving commit data from a tag that contains slashes

I am trying to retrieve commit data for a tag that has slashes in its name, using the Bitbucket REST API for Bitbucket Server.
Example tag:
release/2020-09-23-v3.13.4
I am using the following REST URL for a GET request, but I get a 404 error.
https://git.server.com/rest/api/1.0/projects/TEST/repos/test/commits/release/2020-09-23-v3.13.4
Is there a way I can properly format this call to retrieve the commit data for the tag above?
Thank you so much for your help.

According to the docs, this one should work:
https://git.server.com/rest/api/1.0/projects/TEST/repos/test/tags/release/2020-09-23-v3.13.4
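For what it's worth, here is a minimal sketch of that two-step lookup in Python with the requests library. The credentials are placeholders, and the latestCommit field is what the standard Bitbucket Server 1.0 tag resource returns, so verify it against your instance:

    import requests

    BASE = "https://git.server.com/rest/api/1.0/projects/TEST/repos/test"
    auth = ("username", "password")  # placeholder credentials

    # The tags endpoint accepts the slashed tag name directly in the path.
    tag = requests.get(f"{BASE}/tags/release/2020-09-23-v3.13.4", auth=auth)
    tag.raise_for_status()
    commit_id = tag.json()["latestCommit"]  # hash of the commit the tag points at

    # The commits endpoint is then happy with the plain hash.
    commit = requests.get(f"{BASE}/commits/{commit_id}", auth=auth)
    commit.raise_for_status()
    print(commit.json()["message"])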

Related

Data Factory can't download CSV file from web API with Basic Auth

I'm trying to download a CSV file from a website in Data Factory using the HTTP connector as my source linked service in a copy activity. It's basically a web call to a URL that looks like https://www.mywebsite.org/api/entityname.csv?fields=:all&paging=false.
The website uses basic authentication. I have tested manually by entering the URL in a browser along with the credentials, and everything works fine. I have used the REST connector in a copy activity to download the data as a JSON file (same URL, just without the ".csv" in there), and that works fine. But something about the authentication in the HTTP connector is different and causing issues. When I execute my copy activity, it downloads a CSV file that contains the HTML for the login page of the source website.
While searching, I did come across this GitHub issue on the docs that suggests the basic auth header is not sent initially, and that may be causing the problem.
As I have it now, the authentication is defined in the linked service. I'm hoping that maybe I can add something to the Additional Headers or Request Body properties of the source in my copy activity to make this work, but I haven't found the right thing yet.
Suggestions of things to try or code samples of a working copy activity using the HTTP connector and basic auth would be much appreciated.
The HTTP connector sends the initial request without credentials and expects the API to respond with 401 Unauthorized; only then does it retry with the basic auth credentials. If the API doesn't follow this challenge/response pattern, the connector never uses the credentials provided in the HTTP linked service.
If that is the case, go to the copy activity source and, in the Additional headers property, add Authorization: Basic followed by the base64-encoded string of username:password. It should look something like this (where the string at the end is the encoded username:password):
Authorization: Basic ZxN0b2njFasdfkVEH1fU2GM=
It's best not to hard-code that into the copy activity; retrieve it from Key Vault and pass it as secure input to the copy activity instead.
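As a rough illustration of what that header value is (plain Python; the credentials are placeholders), it is simply the base64 encoding of username:password:

    import base64

    username = "myuser"      # placeholder
    password = "mypassword"  # placeholder
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    print(f"Authorization: Basic {token}")
    # -> Authorization: Basic bXl1c2VyOm15cGFzc3dvcmQ=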
I suggest you try the REST connector instead of the HTTP one. It supports Basic as the authentication type, and I have verified it using a test endpoint on HTTPbin.org.
Above is the configuration for the REST linked service. Once you have created a dataset connected to this linked service, you can include it in your copy activity.
Once the pipeline executes, the content of the REST response will be saved in the specified file.
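If you want to sanity-check the credentials outside ADF first, a quick local test against an httpbin.org basic-auth endpoint (user and password are placeholders) shows what a successful basic auth round trip looks like:

    import requests

    user, password = "testuser", "testpass"  # placeholders
    # httpbin.org/basic-auth/{user}/{password} returns 200 plus a small JSON
    # body when the supplied credentials match the ones in the URL.
    resp = requests.get(
        f"https://httpbin.org/basic-auth/{user}/{password}",
        auth=(user, password),
    )
    print(resp.status_code)  # 200
    print(resp.json())       # {'authenticated': True, 'user': 'testuser'}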

Data Factory v2 - connecting using REST

The aim is to connect to a public REST API using ADF. It's my first stab at sending requests to a REST API in ADF. It is the Companies House ('CH') government website's API in England.
I have created an account and obtained a key. Apparently it uses basic authentication: the user name is the API key and the password is ignored (CH note on authentication).
I want to explore the contents of the 'Search all' API (CH note on Search All) and copy the results to Blob Storage.
I therefore set the linked service to use REST as below; the obfuscated user name is the key I obtained from CH, and the password is just the key repeated, as their documentation states they ignore the password:
I then have added a REST dataset referencing this linked service:
And the testing of the connection works fine.
Problems then arise in the copy data task: I get an 'Invalid Authorization Header' error when previewing and also when I attempt a copy to blob:
I'd be grateful for pointers on where I'm going wrong.
I can't reproduce your auth error, but I notice that you are trying to pass the parameters for your GET request in the Request Body.
I think you need to add the parameters in the relativeUrl property instead:
A relative URL to the resource that contains the data. When this property isn't specified, only the URL that's specified in the linked service definition is used. The HTTP connector copies data from the combined URL: [URL specified in linked service]/[relative URL specified in dataset].
I also suggest checking the correct REST API format of the Search API you are using. There are no other special features in the ADF REST connector; just make sure the GET request works locally and then duplicate it.
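To make that concrete, it can help to reproduce the combined [linked service URL]/[relative URL] request locally before wiring it into ADF. A rough sketch (Python requests; the base URL and query parameters are my assumptions about the Companies House 'Search all' endpoint, and the key is a placeholder):

    import requests

    API_KEY = "your-ch-api-key"  # placeholder

    # Companies House basic auth: the API key is the user name, the password is empty.
    resp = requests.get(
        "https://api.company-information.service.gov.uk/search",  # assumed base + relative URL
        params={"q": "example ltd"},
        auth=(API_KEY, ""),
    )
    resp.raise_for_status()
    print(resp.status_code)
    print(resp.json().get("total_results"))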

How can I get a JSON response from Github's REST API?

I'm trying to use this guide to get a list of all issues from a repository. For example, let's look at the facebook/react repository.
When I do a GET request to https://github.com/facebook/react/issues/ it just returns the web page, but what I want is a JSON response with all the issues.
How can I get a JSON response?
You need to use the API's root endpoint, on the api subdomain:
GET https://api.github.com/repos/facebook/react/issues
            ^^^^
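A minimal sketch of the same call from code (Python requests; unauthenticated, so GitHub's anonymous rate limits apply):

    import requests

    resp = requests.get(
        "https://api.github.com/repos/facebook/react/issues",
        headers={"Accept": "application/vnd.github+json"},  # ask for the JSON media type
        params={"state": "open", "per_page": 100},           # optional filters; results are paginated
    )
    resp.raise_for_status()
    for issue in resp.json():
        print(issue["number"], issue["title"])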

Cannot retrieve results from Bamboo using REST API

I am trying to get build result information from an Atlassian-hosted instance of Bamboo using the REST API, but have hit a roadblock almost immediately.
Trying to get the result information for a specific plan, I would expect the following URL to return a result set:
https://mydomain.atlassian.net/rest/api/latest/result/PROJECT-PLAN
Where PROJECT-PLAN has been copied from the UI URL for the given plan:
https://mydomain.atlassian.net/builds/browse/PROJECT-PLAN
This is based on the documentation here:
https://developer.atlassian.com/display/BAMBOODEV/Bamboo+REST+Resources#BambooRESTResources-BuildService—SpecificPlan
However, when I try to retrieve this through my browser (authenticated, with access to the UI URL), I get the following response:
<status>
<status-code>404</status-code>
<message>
null for uri: https://mydomain.atlassian.net/rest/api/latest/result/SFDC-AQB
</message>
</status>
I fear I must be missing something obvious.
Turns out that my base URL was directing me to the JIRA REST API, rather than Bamboo:
Adding "builds" into the URL solves the problem.
https://mydomain.atlassian.net/builds/rest/api/latest/...
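Put together, the corrected call looks something like this (Python requests; the credentials and the expand parameter are placeholders/assumptions, and Accept: application/json just asks Bamboo for JSON instead of the XML shown above):

    import requests

    resp = requests.get(
        "https://mydomain.atlassian.net/builds/rest/api/latest/result/PROJECT-PLAN",
        params={"expand": "results.result"},   # optional: include the individual build results
        headers={"Accept": "application/json"},
        auth=("username", "password"),         # placeholder credentials
    )
    resp.raise_for_status()
    print(resp.json())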

Get raw file content using Stash Rest API

I am able to get raw file content using the Bitbucket REST API, as follows:
https://api.bitbucket.org/1.0/repositories/AccountName/Repo_Slug/raw/master/MyFolder/MyFile.cs
Is there an equivalent way to get it from Stash using the Stash REST API? I couldn't find it here:
https://developer.atlassian.com/static/rest/stash/2.0.1/stash-rest.html#resources
Just specify the file's URL and append ?raw
http://example.com/projects/TES/repos/testrepo/browse/testfile?raw
As I mentioned, that is not a function of either REST API; it is just the full URL of the file.
With version 1.0 of the Stash/Bitbucket Server API, you can use the URL below to get the raw file content:
https://example.com/rest/api/1.0/projects/TES/repos/testrepo/raw/testfile?at=feature/branch
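As a small usage sketch (Python requests; host, project, repo, file path, and credentials are placeholders), the branch ref with a slash goes in the at query parameter:

    import requests

    resp = requests.get(
        "https://example.com/rest/api/1.0/projects/TES/repos/testrepo/raw/testfile",
        params={"at": "feature/branch"},   # branch or tag ref; slashes are fine in a query parameter
        auth=("username", "password"),     # placeholder credentials
    )
    resp.raise_for_status()
    print(resp.text)  # the raw file content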