Concourse publishing metadata

Is there any best practice for publishing metadata, like a test coverage report, via Concourse? Zipping it up and pushing it to a git repo is not that nice.

It seems like metadata would fall into the category of build artifacts and could be pushed to an object store via a resource like the included s3 resource or s3-resource-simple.
The docs have an example of using the included s3 resource. These outputs could also be downloaded locally from a job.
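As a sketch, a pipeline using the built-in s3 resource could push a coverage tarball after the test task; the bucket, credentials, paths, and task file below are all made-up placeholders, not part of any real setup:

```yaml
resources:
- name: coverage-report
  type: s3
  source:
    bucket: my-coverage-bucket            # placeholder bucket name
    regexp: coverage/coverage-(.*).tar.gz # versioned object path
    access_key_id: ((aws_access_key))
    secret_access_key: ((aws_secret_key))

jobs:
- name: test
  plan:
  - get: source-code
  - task: run-tests
    file: source-code/ci/run-tests.yml    # assumed to produce coverage-out/coverage-<n>.tar.gz
  - put: coverage-report
    params:
      file: coverage-out/coverage-*.tar.gz
```

The `put` step uploads the matched file to the bucket, and the `regexp` in the source lets the resource track report versions over time.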

Related

Azure Pipeline artifacts do not show under Storage

I have created a C++ pipeline where the output of the build is published to a drop container. The structure is the following:
drop/v1.0.0/Release/MyService.dll
drop/v1.1.0/Release/MyService.dll
My engineers need to browse the drop folder and, depending on which version has to be manually deployed to a client, download the DLL file.
As far as I understand, there is no way to view them under Artifacts (what a shame). I went to the project settings under Storage, but I cannot view them there either. The only place I am able to find them is under the pipeline run, and then I have to work out which run of the pipeline produced a specific service version. This is a maze. We have dozens of C++ projects, and we have to keep track of which pipeline run of each project matches each service version.
Is there any way to be able to access them like in a folder structure?
You could use the Builds - List REST API to get all the builds for a pipeline, then use the Artifacts - List REST API to get all the artifacts for each build. It will list the download URL for every artifact, and you can then download them together or pick only the ones you want.
Besides, you could use the publishLocation argument of the Publish Build Artifacts task to copy the artifacts to a file share (FilePath); the file share must be accessible from the agent running the pipeline. In this way you can publish all your artifacts to a file share of your choosing for easier management.
In addition, you could use the Universal Packages task to publish your artifacts to a feed for easier review.
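The Builds - List / Artifacts - List approach above can be scripted; here is a rough sketch against the Azure DevOps REST API, where the organization, project, definition id, and personal access token are all placeholders:

```python
# Enumerate artifact download URLs across every run of one pipeline definition,
# via the Builds - List and Artifacts - List REST calls.
import base64
import json
import urllib.request

API_VERSION = "6.0"

def builds_url(org, project, definition_id):
    """Builds - List, filtered to a single pipeline definition."""
    return (f"https://dev.azure.com/{org}/{project}/_apis/build/builds"
            f"?definitions={definition_id}&api-version={API_VERSION}")

def artifacts_url(org, project, build_id):
    """Artifacts - List for one build."""
    return (f"https://dev.azure.com/{org}/{project}/_apis/build"
            f"/builds/{build_id}/artifacts?api-version={API_VERSION}")

def get_json(url, pat):
    """GET the URL, authenticating with a personal access token (basic auth)."""
    token = base64.b64encode(f":{pat}".encode()).decode()
    req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def artifact_download_urls(org, project, definition_id, pat):
    """Yield (build number, artifact name, downloadUrl) for every run."""
    for build in get_json(builds_url(org, project, definition_id), pat)["value"]:
        for art in get_json(artifacts_url(org, project, build["id"]), pat)["value"]:
            yield build["buildNumber"], art["name"], art["resource"]["downloadUrl"]

# Hypothetical usage:
# for number, name, url in artifact_download_urls("myorg", "myproj", 42, "my-pat"):
#     print(number, name, url)
```

From there, mapping service versions back to pipeline runs becomes a lookup over the yielded tuples instead of a manual search through the UI.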

Referencing an artifact built by Github Actions

The upload/download artifact documentation implies that one should be able to build into the dist folder. My interpretation of this is that we can then reference this content in, for example, a static site, so that a site auto-builds itself for github pages on master pushes. However, it seems that artifacts are only uploaded to a specific location (i.e. GET /repos/{owner}/{repo}/actions/artifacts ) and can be downloaded only in zipped format, which defeats the purpose.
Is there a way to populate the dist folder of the repo, so that the file that was built becomes publicly and permanently accessible as part of the repo, and I can reference it without having to deploy it elsewhere like S3 etc?
Example
Here's a use case:
- I have a dashboard which parses some data from several remote locations and shows it in charts. The page is deployed from /docs because it's a GitHub Pages hosted page.
- The web page only reads static, cached data from /docs/cache/dump.json.
- The dump.json file is generated via a scheduled GitHub Action which invokes a script that goes to the data sources and generates the dump.
This is how the web page can function quickly without on-page lockups due to lengthy data processing, while the dump generation happens in the background. The web page periodically re-reads the /docs/cache/dump.json file to get new data, which should override old data on every scheduled trigger.
The idea is to have the action run and replace the dump.json file periodically, but all I can do is produce an artifact which I then have to manually fetch and unzip. Ideally, it would just replace the current dump.json file in place.
To persist changes made by a build process, it is necessary to add and commit them like after any change to a repo. Several actions exist for this, like this one.
So you would add the following to the workflow:
- name: Commit changes
  uses: EndBug/add-and-commit@v7
  with:
    author_name: Commitobot
    author_email: my@mail.com
    message: "Updating build result!"
    add: "docs/cache/dump.json"
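For context, the surrounding workflow might look something like this; the workflow name, schedule, and script path are assumptions, not part of the original answer:

```yaml
name: refresh-dump
on:
  schedule:
    - cron: "0 * * * *"   # hourly; adjust as needed
jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Generate dump
        run: ./scripts/generate_dump.sh   # assumed script that writes docs/cache/dump.json
      - name: Commit changes
        uses: EndBug/add-and-commit@v7
        with:
          message: "Updating build result!"
          add: "docs/cache/dump.json"
```

Because the commit lands on the branch GitHub Pages serves from, the page picks up the new dump.json on the next load with no manual fetching or unzipping.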

Retrieve a downloadTicket for an artifact in VSTS

I am trying to obtain a "publicly accessible" link for the artifact produced during the build process. The API does reference something called a downloadTicket but the API call doesn't seem to return anything related. I understand that the download would need to provide the downloadTicket through a header, but for now, my question is:
What call do I need to make, either through the REST API or within a build task itself, to get the artifact information, including the downloadTicket?
Or, option two, is there something else I can do to avoid uploading the file to Azure blob storage, etc.?
Why do you have to retrieve the downloadTicket?
If you just want to download the artifacts, you can use the REST API Artifacts - Get to retrieve the downloadUrl, then share and use that URL to download the specific artifact. (Please note that users need the correct permission to view/download build artifacts.)
"downloadUrl": "https://{account}.visualstudio.com/{project}/_apis/build/builds/235/artifacts?artifactName=drop&api-version=5.0-preview.3&%24format=zip"
If you want to download the artifacts in build/release process, then you can use Download Build Artifacts task.
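In a YAML pipeline, that task could look roughly like this; the artifact name and download path are assumptions for illustration:

```yaml
- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'current'
    downloadType: 'single'
    artifactName: 'drop'
    downloadPath: '$(System.ArtifactsDirectory)'
```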
For your option 2, you can create a shared location, then select a file share as the artifact publish location and specify that location. The artifacts will then be published to the shared folder, which you can set to be publicly accessible.

Downloading TeamCity artifact dependencies using REST

We've got a TeamCity (9.1) build configuration which relies on several snapshot dependencies to build correctly. I'm looking for a convenient way to provide each developer with a way to set up a proper build environment on their desktops. For this, I would like to download all the snapshot dependencies for a given build configuration from the TeamCity server onto the developer's desktop using the REST API.
I'm aware of how to access artifacts using REST. But this would address the artifacts created by a specific build configuration. I'm looking for a way to download all artifacts used by a given configuration specified by the dependencies.
There isn't an easy way to do this, however, it's not impossible. My answer is provided below followed by a possible alternate solution.
Answer:
The artifacts used by your target build are really just the artifacts that were created by its dependencies right?
I think what you are looking for is referenced here where you can query a build for all of its Snapshot Dependencies.
Once you have a list of the dependencies you would then need to query each of them for the artifacts they generated and then you could proceed to download them.
It's not the most straightforward thing and would require some slick Powershell or Python or whatever, but it is doable.
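A Python sketch of that two-step walk might look like this: ask the server which builds a given build snapshot-depends on, then list each dependency's artifacts. The endpoint paths follow the TeamCity REST API, but the server URL, auth scheme (guestAuth assumes anonymous read access), and build id are placeholder assumptions:

```python
# Walk a build's snapshot dependencies and list the artifacts each produced.
import urllib.request
import xml.etree.ElementTree as ET

def snapshot_deps_locator(build_id):
    """Build locator selecting every build the given build snapshot-depends on."""
    return f"snapshotDependency:(to:(id:{build_id}),includeInitial:false)"

def rest_get(server, path):
    # guestAuth works only if anonymous read access is enabled;
    # swap in httpAuth plus credentials otherwise
    with urllib.request.urlopen(f"{server}/guestAuth/app/rest/{path}") as resp:
        return ET.fromstring(resp.read())

def dependency_artifacts(server, build_id):
    """Yield (dependency build id, artifact file name) pairs."""
    deps = rest_get(server, f"builds?locator={snapshot_deps_locator(build_id)}")
    for build in deps.iter("build"):
        dep_id = build.get("id")
        files = rest_get(server, f"builds/id:{dep_id}/artifacts/children")
        for f in files.iter("file"):
            yield dep_id, f.get("name")

# Each listed file can then be downloaded from
#   <server>/guestAuth/app/rest/builds/id:<dep_id>/artifacts/content/<name>
```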
Another Idea:
Have you looked into something like Artifactory? It sounds like what you really need is a binary repository of sorts to track artifacts used, and artifacts created.
Or for small projects, you could probably get away with just using a file share on the network, where the build could "copy" to the share, organizing files into "build" directories of some sort, and developers could "read" from the share.

Set the name of a ZIP downloadable from GitHub or Other ways to enroll Google Transit project on GitHub

I want to start a Google Transit project (a city transport feed for Google Maps) and, for the purpose of collaboration, I want to use GitHub. Now, one great thing is that GitHub offers a ZIP file download that contains your whole repository, and Google wants a ZIP with the required data, but that file should be named google_transit.zip.
So my question is:
Can I somehow give Google a link that will give it a file called google_transit.zip, containing all the stuff that's in the master branch? Maybe this can be done with the standard "download zip" option, or with some hooks or something else…
GitHub will allow you to automatically download a Zip archive of the latest version of a branch using the following url:
https://github.com/:user/:repository/zipball/:branch [GET]
The archive will be given a special name following the git describe command output.
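As a minimal local workaround, the zipball URL can be fetched and saved under whatever name you choose; the user and repository names below are placeholders:

```python
# Fetch a branch's zipball and save it under a fixed name of our choosing.
import urllib.request

def zipball_url(user, repo, branch):
    """The GET URL GitHub serves the branch archive from."""
    return f"https://github.com/{user}/{repo}/zipball/{branch}"

def download_branch_zip(user, repo, branch, dest):
    # urlretrieve saves the response body to dest, ignoring the
    # git-describe-style name GitHub would otherwise suggest
    urllib.request.urlretrieve(zipball_url(user, repo, branch), dest)

# Hypothetical usage:
# download_branch_zip("mycity", "transit-feed", "master", "google_transit.zip")
```

This only fixes the name on your side, though; it does not change the name of the file GitHub itself serves at a public link.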
However, there's one way to achieve what you're after by leveraging the GitHub Repo Downloads API.
Every time your master branch is ready to be published, you'd execute the following steps:
1. If the download resource google_transit.zip already exists, remove it
2. Create a new download resource and name it google_transit.zip
3. Upload the latest zip archive using the information returned by the previous request
There's even a Ruby library (ruby-net-github-upload) that may help you automate this task.