Downloading TeamCity artifact dependencies using REST

We've got a TeamCity (9.1) build configuration which relies on several snapshot dependencies to build correctly. I'm looking for a convenient way to provide each developer with a way to set up a proper build environment on their desktop. For this, I would like to download all the snapshot dependencies for a given build configuration from the TeamCity server onto the developer's desktop using the REST API.
I'm aware of how to access artifacts using REST, but that only covers the artifacts created by a specific build configuration. I'm looking for a way to download all artifacts used by a given configuration, as specified by its dependencies.

There isn't an easy way to do this; however, it's not impossible. My answer is provided below, followed by a possible alternate solution.
Answer:
The artifacts used by your target build are really just the artifacts that were created by its dependencies, right?
I think what you are looking for is referenced here: you can query a build for all of its snapshot dependencies.
Once you have a list of the dependencies, you would then need to query each of them for the artifacts they generated, and then you could proceed to download them.
It's not the most straightforward thing and would require some slick PowerShell or Python or whatever, but it is doable; a rough sketch follows.
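For example, a minimal shell sketch, assuming TeamCity 9.1's REST endpoints, HTTP Basic auth, and jq for JSON parsing; the server, credentials, and build id are placeholders:
SERVER=https://teamcity.example.com
CRED=user:password
BUILD_ID=12345
# 1. Ask the target build for its snapshot dependencies.
DEP_IDS=$(curl -s -u "$CRED" -H "Accept: application/json" \
  "$SERVER/app/rest/builds/id:$BUILD_ID" |
  jq -r '."snapshot-dependencies".build[].id')
# 2. For each dependency, list its artifacts and download each one.
for id in $DEP_IDS; do
  curl -s -u "$CRED" -H "Accept: application/json" \
    "$SERVER/app/rest/builds/id:$id/artifacts/children" |
    jq -r '.file[].content.href' |
    while read -r href; do
      curl -s -u "$CRED" -O "$SERVER$href"   # saved under the artifact's file name
    done
done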
Another Idea:
Have you looked into something like Artifactory? It sounds like what you really need is a binary repository of sorts to track artifacts used, and artifacts created.
Or for small projects, you could probably get away with just using a file share on the network, where the build "copies" its output to the share, organized into "build" directories of some sort, and developers "read" from the share. For example:
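# Sketch of a post-build copy step; the share path and build-number
# variable are placeholders.
mkdir -p "/mnt/buildshare/myproject/build-$BUILD_NUMBER"
cp -r build/output/. "/mnt/buildshare/myproject/build-$BUILD_NUMBER/"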

Related

Azure Pipeline artifacts do not show under Storage

I have created a C++ pipeline where the output of the build is published to a drop container. The structure is the following:
drop/v1.0.0/Release/MyService.dll
drop/v1.1.0/Release/MyService.dll
My engineers will need to view the drop folder and, depending on which version needs to be manually deployed to a client, download the DLL file.
As far as I understand, there is no way to view them under Artifacts (what a shame). I went to the project settings under Storage but I cannot view them there either. The only place I am able to find them is under the pipeline run, and then I have to work out which run of the pipeline produced a specific service version. This is a maze. We have dozens of C++ projects and we have to keep track of which pipeline run of each project matches which service version.
Is there any way to be able to access them like in a folder structure?
You could use the Builds - List REST API to get all the builds for a pipeline, then use the Artifacts - List REST API to get all the artifacts for a build. It lists the download URL for each artifact, so you could download them all together or pick just the one you want.
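A rough shell sketch, assuming a personal access token and jq; the organization, project, definition id, and PAT are placeholders:
ORG=myorg
PROJECT=myproject
PAT=my-personal-access-token
BASE="https://dev.azure.com/$ORG/$PROJECT/_apis/build"
# List the builds of one pipeline (definition 42), then list each build's
# artifacts; downloadUrl points at a zip of the artifact's contents.
curl -s -u ":$PAT" "$BASE/builds?definitions=42&api-version=6.0" |
  jq -r '.value[].id' |
  while read -r id; do
    curl -s -u ":$PAT" "$BASE/builds/$id/artifacts?api-version=6.0" |
      jq -r '.value[] | "\(.name) \(.resource.downloadUrl)"'
  done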
Besides, you could use the publishLocation argument in the Publish Build Artifacts task to copy the artifacts to a file share (FilePath); the file share must be accessible from the agent running the pipeline. In this way you could publish all your artifacts to the file share you want for better management.
In addition, you could also use the Universal Packages task to publish your artifacts to a feed for easier browsing.
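A hedged command-line equivalent, assuming the Azure CLI with the azure-devops extension; the organization, feed, package name, and version are placeholders:
az artifacts universal publish \
  --organization https://dev.azure.com/myorg \
  --feed my-feed \
  --name myservice \
  --version 1.1.0 \
  --path ./drop/v1.1.0/Release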

VSTS Release Phase Condition Based Off From One of Many Builds

First attempt at automated build and continuous deployment so any process suggestions / improvements are welcome.
I have a repository with different build definitions, one for each of the following: database project, API, and web. (Will add more later for ETL / reports.) Each build has a filter so it only builds if code in a specific path has been changed.
Currently I have separate releases using continuous deployment for each build, so when the code changes, it builds and auto-deploys. This works, but really isn't practical because of dependencies.
What I am looking to do is have one release definition that includes all build artifacts. Then have deployment phases that only run conditionally if a specific build artifact was created (something in that project changed). This way all builds / releases don't run every time, but are tied together when there are related changes.
I am going down the path of trying to create a custom condition on the deployment phase, but can't seem to figure out a way to make this work. I appreciate any help with this.
I have a repository with different build definitions. One for each of the following: database project, api, and web. (Will add more later for etl / reports) Each build has a filter so it only builds if code in a specific path has been changed.
Path filters are not to be used in your situation.
Look at Microsoft's git repo: they have all their codebase from the Windows and Devices Group (WDG) in one big repo. Each root folder is a separate product and completely unrelated to the rest (e.g. Xbox, HoloLens, Windows OS, etc.).
Path filters make sense there, because if I git push code to Xbox, I don't want HoloLens code to be built as well.
Web / DB / API projects all need to be built together, packaged together and deployed together.
I am assuming the project uses .NET stack.
Keep the DB, Web and API projects in the same solution. Create a single build definition that builds the solution and creates multiple artifacts (dacpac, web deploy package, etc.) by adding multiple Publish Artifacts steps.
See screenshot of a build with multiple artifacts.
Link the artifacts from this build to the Release Definition and you should be able to deploy.

Using artifact repository for storing full releases?

I've been looking into artifact repositories for something that our release team can use for storing outputs of full builds from multiple projects. From what I've read, artifact repositories are mostly used for storing library files required for a build. My assumption is that their intended use is to ensure developers and build servers are using the exact same binary dependencies during the build process.
Few questions:
Is it possible to store the build output of entire projects into an artifact repository (A full release), a place to store artifacts ready for deployment?
Is this common practice?
Is it possible to have analytics of what was changed since the last build? Ex: can I see which artifacts have changed since the last release?
So, the short answer to your questions are: yes, yes, and mostly yes.
While it is true that binary managers such as Artifactory are used for dependency management, they are also used to host entire builds.
In Artifactory this can be easily achieved through the Build Integration features. If you are not using a CI server such as Jenkins (for example), you can use the JFrog CLI to upload your builds and their corresponding Build Info.
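For example, a sketch with the JFrog CLI; the repository, paths, build name, and build number are placeholders:
# Upload the build output and tag it with a build name/number...
jfrog rt upload "output/*" my-release-repo/myapp/1.2.0/ \
  --build-name=myapp --build-number=42
# ...then publish the collected Build Info to Artifactory.
jfrog rt build-publish myapp 42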
In addition, regarding analytics: not exactly as such, but in Artifactory you have the option to perform a Build Diff and see the changes between builds.
Hope I helped,
Eran
p.s. I work for JFrog
Sonatype Nexus works for what you need. You are able to deploy not just Java artifacts (e.g. .ear, .jar, .war files) but any kind of binary; we are using it for storing reports for Oracle BI Publisher as well as .exe binaries.
Is it possible to store the build output of entire projects into an artifact repository (A full release), a place to store artifacts ready for deployment?
Yes, as I said before, you can store any kind of binaries you want.
Is this common practice?
I don't know if it is a common practice, but in our case it helped us keep things in order. Just evaluate whether it works for you.
Is it possible to have analytics of what was changed since the last build? Ex: can I see which artifacts have changed since the last release?
Sonatype Nexus keeps a version for each artifact (or binary), so you are able to store the entire "history" of your deployments. It can also enforce security policies: for example, you cannot deploy the same binary twice with the same version; it forces you to deploy a new version. This way you can verify when an artifact changed, on what date, and who uploaded it.
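Uploading is just an HTTP PUT; a sketch with curl (the URL layout differs between Nexus 2 and Nexus 3, and the host, repository, and coordinates are placeholders):
curl -u user:password --upload-file MyService-1.1.0.zip \
  "https://nexus.example.com/repository/releases/com/example/myservice/1.1.0/MyService-1.1.0.zip"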

Can I download artifacts built by BuildHive?

I have started using the free Jenkins build service on BuildHive for one of my GitHub projects. This is also my first try doing anything with Maven. I have succeeded in building my project using this script on BuildHive:
cd base_dir
mvn package
The build log shows that the resulting JAR has been built. Now I would like to offer the JAR to my project's users as a download artifact because GitHub has discontinued the feature of manually uploading binaries in a separate download section.
Is there any way I can download an artifact, referencing it by a URL? If so, how do I construct the URL, knowing only the artifact's local path from the build log?
Alternatively, is there a way in which I can push the artifact to another place by adding a command to my build shell script after mvn package? I was thinking of something like a curl or ftpput command.
The best thing I was able to come up with as a quick workaround was to upload the artifacts in question to my FTP server via curl, as suggested in my original question. It works, but the downside is that the FTP credentials appear in the public build log. I have counterbalanced that with a shell script on my DSL router which checks for FTP storage abuse every few minutes.
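The upload itself is a one-liner; a sketch with placeholder host, path, and credentials (which, as noted, are visible in the public build log):
curl -T target/myproject-1.0.jar ftp://ftp.example.com/downloads/ \
  --user ftpuser:ftppass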
As an alternative I found that after creating a free CloudBees account for my little open source project, I got my own Jenkins build configuration as well as my own artifact repository where to deploy my build artifacts. This is much more elegant and does not involve posting any FTP credentials to a public server.
I am still open for BuildHive-only solutions if anyone has a smart idea. :-)

TeamCity: Best Practices to deploy produced installers (artifacts)

We've got a TeamCity server which produces nightly deployable builds. We want our beta testers to have access to these nightly builds.
What are the best practices to do this? The TeamCity server is not public, it is in our office, so I assume the best approach would be pushing artifacts via FTP or something like that.
Also, I have no clue how to trigger a script when an artifact is created successfully. Does TeamCity provide a way to do that?
I don't know of a way to trigger a script, but I wouldn't worry about that. You can retrieve artifacts via a URL. Depending on what makes sense for your project, you could have a script set up on a scheduler (cron or Windows Scheduler) that pulls the artifact and sends it to the FTP site for the beta testers. You can configure it to pull only the latest successful artifact. If you set up the naming right, the beta testers won't notice when a build fails; the new build number just won't be there, and no bad builds will be pushed to them.
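A sketch of such a scheduled script, relying on TeamCity's stable URLs for the latest successful build's artifacts (server, build configuration id, file name, and FTP details are placeholders):
# Pull the newest good build's installer...
curl -u user:password -o MyInstaller.exe \
  "https://teamcity.example.com/repository/download/MyProject_Nightly/.lastSuccessful/MyInstaller.exe"
# ...and push it to the beta FTP site.
curl -T MyInstaller.exe ftp://ftp.example.com/beta/ --user ftpuser:ftppass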
Read the following help page from the documentation. It shows how you send commands from your build script to tell TeamCity to publish the artifacts to a given path.
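In short, the build script emits a service message and TeamCity picks the files up; for example (the source path and target directory are placeholders):
echo "##teamcity[publishArtifacts 'dist/installer.exe => installers']"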
In TeamCity 7.0+ you can use the Deployer plugin. Installation steps can be found here. It also allows uploading artifacts via SMB and SSH.
I suggest you start looking at something like (n)Ant to handle your build process. That way you can handle the entire "build artifacts" -> "publish artifacts" chain in an automated manner. These tools are dependency based, so the artifacts would only be published if the build succeeded.