After a successful build in Unity Cloud Build I would like to send the generated .zip file URL to a Continuous Integration system like Travis CI in order to deploy that file to another server. Right now this requires a custom webhook on my own server that checks (via the API) for the last build link and then starts a Travis CI build via an HTTP POST.
What's the best approach to doing this without my intermediate step? I've tried making this HTTP POST request inside Unity Cloud Build's post-build script, but that seems like a bad workaround.
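For reference, a minimal sketch of what that POST looks like against the Travis CI v3 API, assuming Node 18+ (built-in fetch); the token, repo slug, and ARTIFACT_URL variable name are placeholders, not anything Unity Cloud Build provides:

```typescript
// Trigger a Travis CI build via the v3 API, forwarding the Unity Cloud Build
// artifact URL as a build environment variable.
const TRAVIS_TOKEN = process.env.TRAVIS_TOKEN;          // assumption: API token in env
const repoSlug = encodeURIComponent("my-org/my-repo");  // hypothetical repository

async function triggerTravisBuild(artifactUrl: string): Promise<void> {
  const res = await fetch(`https://api.travis-ci.com/repo/${repoSlug}/requests`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Travis-API-Version": "3",
      Authorization: `token ${TRAVIS_TOKEN}`,
    },
    body: JSON.stringify({
      request: {
        branch: "master",
        // Pass the .zip URL through to the Travis build as an env var.
        config: { env: { global: [`ARTIFACT_URL=${artifactUrl}`] } },
      },
    }),
  });
  if (!res.ok) throw new Error(`Travis API responded ${res.status}`);
}
```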
The Azure Pipelines web interface provides a link to download pipeline artifacts on public pipelines regardless of whether your session is authenticated. However, the API for getting a download link does not work in this case. Oddly, though, the download link produced by that API does work if you copy it into an unauthenticated session. How can I obtain a download URL for a public pipeline artifact without authenticating and without scraping the public web interface?
My use case is a tool for manually debugging CI runs that work locally but fail in CI: it replicates the CI environment in a Docker container and downloads the exact build used in the failing CI run (a pipeline artifact). The interface I would like to provide is a command that takes the URL of a GitHub pull request as an argument, parses the relevant pipeline parameters out of it, and launches the Docker container with the URL of the artifact to download. This should work, and it has worked in the past by scraping the artifact download URL out of the public Azure web interface, but that breaks frequently.
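To illustrate the asymmetry described above, here is a hedged sketch of the authenticated lookup (assuming the standard Build Artifacts endpoint is the API in question; org, project, build id, and the PAT are all placeholders):

```typescript
// Listing artifacts for a public pipeline requires auth, but the
// downloadUrl it returns then works in an unauthenticated session.
const org = "my-org";
const project = "my-project";
const buildId = 12345;
const pat = process.env.AZURE_PAT ?? "";

async function getArtifactDownloadUrl(artifactName: string): Promise<string> {
  const url =
    `https://dev.azure.com/${org}/${project}/_apis/build/builds/${buildId}` +
    `/artifacts?artifactName=${encodeURIComponent(artifactName)}&api-version=6.0`;
  const res = await fetch(url, {
    // Basic auth with a PAT; this is the step the question wants to avoid.
    headers: { Authorization: `Basic ${Buffer.from(`:${pat}`).toString("base64")}` },
  });
  if (!res.ok) throw new Error(`artifact lookup failed: ${res.status}`);
  const body = await res.json();
  return body.resource.downloadUrl; // usable later without authentication
}
```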
Basically, I just started using Google Cloud and I'm looking for a way to trigger a deployment ONLY when a pull request is accepted on a remote GitHub repository.
I'm currently using a Google Cloud Build trigger to execute my cloudbuild.yaml as soon as a push is detected on my master branch, but simply opening a merge request seems to trigger my build process.
This is troublesome, as a merge request will be reviewed by peers, and I don't want my cloud application to rebuild if the merge request ends up being denied after review.
As this feature is still in beta, I assume this is not supported yet and that there is a better way to handle such a task, but when I heard of the trigger feature it seemed like the most straightforward way to connect my GitHub repository to the build process on Google Cloud. Anyway, I hope someone has faced this issue or can help me figure it out.
Thanks!
Based on the documentation, Cloud Build triggers currently only support changes pushed to the build source (a remote GitHub repo in this case). There doesn't seem to be a way, from the Google Cloud Console GUI, to distinguish between a merge resulting from a (remote) pull request and a local one.
However, you are not without options. One alternative is to leverage GitHub's PullRequestEvent webhook and deploy a Google Apps Script (GAS) web app or a Cloud Function to serve as the webhook endpoint. The web app or Cloud Function can then parse the PullRequestEvent payload, and if the pull request is closed and merged, call the Cloud Build service's REST API to start your build.
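A hedged sketch of that endpoint as an HTTP-triggered Cloud Function on the Node runtime; the project id and trigger id are placeholders, and the function's service account is assumed to already have permission to run the trigger:

```typescript
import type { Request, Response } from "express";

export async function onPullRequestEvent(req: Request, res: Response) {
  const event = req.body;
  // Only react to pull requests that were closed *and* merged.
  if (event.action !== "closed" || !event.pull_request?.merged) {
    res.status(204).send("ignored");
    return;
  }
  // Run a pre-configured Cloud Build trigger via the REST API.
  const projectId = "my-gcp-project";          // placeholder
  const triggerId = "my-trigger-id";           // placeholder
  const token = process.env.GCP_ACCESS_TOKEN;  // assumption: OAuth token available
  const r = await fetch(
    `https://cloudbuild.googleapis.com/v1/projects/${projectId}/triggers/${triggerId}:run`,
    {
      method: "POST",
      headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
      body: JSON.stringify({ branchName: event.pull_request.base.ref }),
    }
  );
  res.status(r.ok ? 200 : 502).send(r.ok ? "build started" : "trigger failed");
}
```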
I have some build artifacts containing debug information that I would like to display in a build summary. However, the only API I'm seeing for getting artifacts returns a zip file. If you go to the artifact screen, explore the artifacts, and copy the download URL, you get an API of the form:
https://{account}/_apis/resources/Containers/{container}?itemPath={file_name}
However, I can't seem to find a REST client API for this endpoint that I can use in my extension's TypeScript script.
Any thoughts on how to get at the actual files? The TypeScript script just needs to grab the file and display it in the browser.
There isn't a REST client API for this.
You can call that endpoint directly through an HTTP request. To get the container's ID, use the Get Build Artifacts REST API (the ID is in the resource > data value).
There is an article about making HTTP requests:
5 Ways to Make HTTP Requests in Node.js
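A sketch of the two-step flow from an extension's TypeScript: list the build's artifacts to find the container id, then hit the Containers endpoint with an itemPath. The account, project, and build id are placeholders, and the access token is assumed to come from the extension host (e.g. the VSS SDK):

```typescript
async function fetchArtifactFile(token: string, fileName: string): Promise<string> {
  const base = "https://dev.azure.com/my-org"; // placeholder account
  const headers = { Authorization: `Bearer ${token}` };

  // Step 1: Get Build Artifacts — the container id lives in resource.data,
  // formatted like "#/1234567/artifactName".
  const list = await fetch(
    `${base}/my-project/_apis/build/builds/42/artifacts?api-version=5.0`,
    { headers }
  ).then(r => r.json());
  const containerId = list.value[0].resource.data.split("/")[1];

  // Step 2: download a single file out of the container by itemPath.
  const file = await fetch(
    `${base}/_apis/resources/Containers/${containerId}?itemPath=${encodeURIComponent(fileName)}`,
    { headers }
  );
  return file.text(); // hand the contents to the browser for display
}
```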
On the other hand, you don't need to do this through an extension at all: just emit a logging command during the build/release (##vso[task.uploadsummary]local file path) to add additional information to the build/release summary.
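A minimal sketch of that route, with a placeholder file path; anything a build step prints in this format is picked up by the agent and attached to the summary:

```typescript
// Hypothetical markdown file written earlier in the build.
const summaryPath = "/tmp/debug-info.md";
console.log(`##vso[task.uploadsummary]${summaryPath}`);
```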
If I have a build that is triggered by a GitHub webhook in AWS CodeBuild, is there a way for me to inspect the content of the webhook body that triggered the build from my buildspec.yml file? Or is this content just lost?
In order to trigger a CodeBuild build from GitHub, you need to be able to consume the POST data from GitHub and translate that into a call to CodeBuild. In that translation layer, take what you need from the webhook and apply it to your CodeBuild build environment; one way to pass data into a CodeBuild build is through environment variables. I am not sure how you intend to trigger builds, but I assume you would need a translation layer that consumes the webhook and ultimately launches a build, as in the sketch below.
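One possible shape for that layer is an AWS Lambda behind API Gateway, assuming the @aws-sdk/client-codebuild package and a pre-existing project named "my-project" (placeholder); the payload fields pulled out here are just examples:

```typescript
import { CodeBuildClient, StartBuildCommand } from "@aws-sdk/client-codebuild";

const codebuild = new CodeBuildClient({});

export async function handler(event: { body: string }) {
  const payload = JSON.parse(event.body); // GitHub webhook POST body
  await codebuild.send(
    new StartBuildCommand({
      projectName: "my-project", // placeholder
      environmentVariablesOverride: [
        // Whatever your buildspec needs from the webhook goes here.
        { name: "GH_REF", value: payload.ref ?? "", type: "PLAINTEXT" },
        { name: "GH_COMMIT", value: payload.after ?? "", type: "PLAINTEXT" },
      ],
    })
  );
  return { statusCode: 200, body: "build started" };
}
```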
This is not supported today. Providing the webhook payload as a preconfigured environment variable is a feature request the CodeBuild team is aware of. Relevant forum post: https://forums.aws.amazon.com/thread.jspa?threadID=269699
Outside of buildspec, you could achieve this by looking at the "initiator" field in the CodeBuild console/UI or in the BatchGetBuilds API response. For webhook builds this gives the GitHub hookshot id, which you may then use to look up your webhook payload in GitHub.
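A quick sketch of reading that field with the AWS SDK (assumes @aws-sdk/client-codebuild; the build id is whatever your build passes along, e.g. CODEBUILD_BUILD_ID):

```typescript
import { CodeBuildClient, BatchGetBuildsCommand } from "@aws-sdk/client-codebuild";

const client = new CodeBuildClient({});

async function getInitiator(buildId: string): Promise<string | undefined> {
  const res = await client.send(new BatchGetBuildsCommand({ ids: [buildId] }));
  // For webhook-triggered builds this is the GitHub hookshot id,
  // which can be matched against the webhook deliveries in GitHub.
  return res.builds?.[0]?.initiator;
}
```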
We have a TeamCity server which produces nightly deployable builds. We want our beta testers to have access to these nightly builds.
What are the best practices for this? The TeamCity server is not public; it is in our office. So I assume the best approach would be pushing the artifacts via FTP or something like that.
Also, I have no clue how to trigger a script when an artifact is created successfully. Does TeamCity provide a way to do that?
I don't know of a way to trigger a script, but I wouldn't worry about that. You can retrieve artifacts via a URL. Depending on what makes sense for your project, you could have a script on a scheduler (cron or Windows Task Scheduler) that pulls the artifact and sends it to the FTP site for the beta testers, as in the sketch below. You can configure it to pull only the latest successful artifact, and if you set up the naming right, a failed build just means the new build number won't be there: the beta testers won't notice, and no bad builds get pushed to them.
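A sketch of that scheduled pull-and-push, assuming TeamCity's .lastSuccessful artifact URL pattern and the third-party "basic-ftp" npm package; host names, credentials, build type id, and paths are all placeholders:

```typescript
import { Client } from "basic-ftp";
import { writeFile } from "node:fs/promises";

async function publishNightly() {
  // .lastSuccessful only resolves for the latest *successful* build,
  // so failed builds never ship.
  const url =
    "https://teamcity.internal/httpAuth/repository/download/MyProject_Nightly/.lastSuccessful/app.zip";
  const res = await fetch(url, {
    headers: { Authorization: "Basic " + Buffer.from("user:pass").toString("base64") },
  });
  if (!res.ok) throw new Error(`no successful build artifact: ${res.status}`);
  await writeFile("app.zip", Buffer.from(await res.arrayBuffer()));

  // Push it to the beta testers' FTP area.
  const ftp = new Client();
  await ftp.access({ host: "ftp.example.com", user: "beta", password: "secret" });
  await ftp.uploadFrom("app.zip", "nightly/app.zip");
  ftp.close();
}
```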
Read the help page on build script interaction in the documentation. It shows how to send service messages from your build script to tell TeamCity to publish the artifacts to a given path.
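A minimal sketch of that service-message approach, with a placeholder path; printing this line from any build step tells TeamCity to publish the given path as an artifact:

```typescript
console.log("##teamcity[publishArtifacts 'dist/app.zip']");
```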
In TeamCity 7.0+ you can use the Deployer plugin. Installation steps can be found here. It also allows uploading artifacts via SMB and SSH.
I suggest you start looking at something like (n)Ant to handle your build process. That way you can handle the entire "build artifacts" -> "publish artifacts" chain in an automated manner. These tools are dependency-based, so the artifacts would only be published if the build succeeded.