Access content of artifact inside Artifactory execution user plugin

Is it possible to access the content of an artifact inside an Artifactory execution user plugin?
I'd like to unzip a file and read a nested JSON file.

You can use the afterCreate callback described in the Storage section of the User Plugins documentation: https://www.jfrog.com/confluence/display/RTF/User+Plugins#UserPlugins-Storage
For example, take a look at the remoteBackup plugin: https://github.com/jfrog/artifactory-user-plugins/tree/master/storage/remoteBackup

Related

How to use Bamboo's rest api to download an artifact

I am trying to write a bash script to download an artifact from Bamboo so that it can be used for other operations. I tried following the solution suggested in this post (using bamboo/rest/api/latest/result), but I still just get the XML showing the location and the name of the artifact; the artifact itself is not downloaded. Any suggestions on how to go about this?
Below is the api cmd that I am using:
http://myhost.com:8085/bamboo/rest/api/latest/result/{projectKey}-{buildKey} [GET]
After much tweaking I managed to get it to work. I had messed up the parsing of the artifact location link extracted from the XML. With the correct link obtained from the XML, and the right authentication, the artifact can be downloaded.
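As a rough sketch of that flow, assuming basic auth and the default result XML layout (the plan key, credentials, and the element/attribute names below are assumptions to adjust for your server):

import requests
import xml.etree.ElementTree as ET

BASE = "http://myhost.com:8085/bamboo/rest/api/latest"
AUTH = ("bamboo-user", "bamboo-password")  # placeholder credentials

# Ask for the latest build result, expanding the artifact details.
resp = requests.get(f"{BASE}/result/PROJECTKEY-BUILDKEY/latest",
                    params={"expand": "artifacts"}, auth=AUTH)
resp.raise_for_status()
root = ET.fromstring(resp.text)

# Each <artifact> element should carry a <link href="..."> pointing at the file;
# following that href with the same credentials returns the file itself.
for artifact in root.iter("artifact"):
    link = artifact.find("link")
    if link is None:
        continue
    href = link.get("href")
    download = requests.get(href, auth=AUTH)
    download.raise_for_status()
    with open(href.rsplit("/", 1)[-1], "wb") as fh:
        fh.write(download.content)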

Read file from GH repository (Probot)

I'm using Probot/Octokit to run some code checks; one of the things I'm doing is checking against a list of breaking changes kept in a separate repository when an internal package is updated. Is there an easy way to read a file (.md) from a separate private repository within Probot on a pull request action, or do I need a manual request?
You can use Probot's Octokit instance to interact with the GitHub API.
Checking Octokit's API docs, you can find a way to get the content of a file:
const { data } = await context.octokit.rest.repos.getContent({
  owner,
  repo,
  path,
});
// For a file path, data.content holds the body base64-encoded.
const text = Buffer.from(data.content, "base64").toString("utf8");
I used Octokit's API to read a file in this script, if you want to check an example. Note that to read from a separate private repository, your GitHub App installation also needs access to that repository.

List files of a build artifact

I've seen that in the 5.0 preview of the REST API it seems possible to download a specific file from a build artifact using :
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/artifacts?artifactName={artifactName}&fileId={fileId}&fileName={fileName}&api-version=5.0-preview.5
But how do I list the files of an artifact? I don't know what to use for fileId.
My use case is a folder archived as an artifact during the build. I would like to get download links for each file in the folder.
I found that the API you are using is not completely documented.
I used the URL below to download a specific file from the build artifacts using PowerShell. You can get the container ID from the GET build details call:
https://$collectionurl/tfs/$teamproject/_apis/resources/Containers/$containerID?itemPath=drop%2Ffilename.txt
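The same Containers resource can also list the files of an artifact. A hedged sketch, assuming a PAT and placeholder organization/project/build values (the Containers endpoint is not fully documented, so the response fields used below are assumptions):

import base64
import requests

ORG = "https://dev.azure.com/myorg"   # placeholder organization URL
PROJECT = "myproject"                 # placeholder project
BUILD_ID = 1234                       # placeholder build id
PAT = "personal-access-token"         # placeholder PAT

headers = {"Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode()}

# 1. Get the build's artifacts; for container artifacts, resource.data
#    looks like "#/<containerId>/<artifactName>".
artifacts = requests.get(f"{ORG}/{PROJECT}/_apis/build/builds/{BUILD_ID}/artifacts",
                         params={"api-version": "5.0"}, headers=headers).json()

for artifact in artifacts["value"]:
    container_id = artifact["resource"]["data"].split("/")[1]
    name = artifact["name"]

    # 2. Listing the container under the artifact's folder returns its items.
    items = requests.get(f"{ORG}/_apis/resources/Containers/{container_id}",
                         params={"itemPath": name, "api-version": "5.0"},
                         headers=headers).json()

    # 3. Each file item can be downloaded through its itemPath.
    for item in items.get("value", []):
        if item.get("itemType") == "file":
            print(f"{ORG}/_apis/resources/Containers/{container_id}"
                  f"?itemPath={item['path']}")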

Retrieve a downloadTicket for an artifact in VSTS

I am trying to obtain a "publicly accessible" link for the artifact produced during the build process. The API does reference something called a downloadTicket but the API call doesn't seem to return anything related. I understand that the download would need to provide the downloadTicket through a header, but for now, my question is:
What call do I need to make, either through the REST API or within a build task itself, to get the artifact information, including the downloadTicket?
Or, option two: is there something else I can do to avoid uploading the file to Azure blob storage, etc.?
Why do you have to retrieve the downloadTicket?
If you just want to download the artifacts you can use the REST API (Artifacts - Get) to retrieve the downloadUrl, then you can share and use that URL to download the specific artifact. (Please note that users need the correct permission to view/download build artifacts.)
"downloadUrl": "https://{account}.visualstudio.com/{project}/_apis/build/builds/235/artifacts?artifactName=drop&api-version=5.0-preview.3&%24format=zip"
If you want to download the artifacts in a build/release process, you can use the Download Build Artifacts task.
For your option 2, you can create a share location, then select a file share as the artifact publish location and specify that share. The artifacts will then be published to the shared folder, and you can make that folder "publicly accessible".

Is it possible to export delivery pipeline settings?

I would like to use the same Bluemix delivery pipeline for several apps. Could I export its settings to some template?
Thanks in advance!
From the doc found here,
Section: GENERATING A YAML FILE FROM A PIPELINE
You can generate a YAML file from a pipeline.
Generate the file from an existing pipeline with a URL in this format:
http(s)://<DevOps Services domain>/pipeline/user/project/yaml
This call does not require an accept header. You can use this call from a browser.
Note: For safety reasons, secure-stage environment property values are omitted from generated pipeline YAML files.
To reuse this downloaded template, per the link shared above, simply create a .bluemix folder at the root of each of your other code projects and place the template file inside it. The file should be named pipeline.yml.
How to do it nowadays (March 2020)
To export a pipeline, simply add /yaml to the pipeline URL. Example:
From:
https://cloud.ibm.com/devops/pipelines/<pipeline-guid>?env_id=<ibm_cloud_region>
To:
https://cloud.ibm.com/devops/pipelines/<pipeline-guid>/yaml?env_id=<ibm_cloud_region>
Download the file and store it as pipeline.yml. Put it inside the .bluemix folder at the root of your project, as @Sanjay.Joshi said.
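A small sketch of that flow, assuming an IAM bearer token is accepted for authentication (the pipeline GUID, region, and token below are placeholders; opening the URL in an authenticated browser session also works):

import pathlib
import requests

PIPELINE_YAML_URL = ("https://cloud.ibm.com/devops/pipelines/<pipeline-guid>/yaml"
                     "?env_id=<ibm_cloud_region>")

# Assumption: an IAM bearer token is accepted here.
resp = requests.get(PIPELINE_YAML_URL,
                    headers={"Authorization": "Bearer <iam-token>"})
resp.raise_for_status()

# Store the template where a new project expects it: .bluemix/pipeline.yml
target = pathlib.Path(".bluemix/pipeline.yml")
target.parent.mkdir(exist_ok=True)
target.write_text(resp.text)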