I am trying to automate a build process by first getting the code from bitbucket as follows.
$output = "E:\FileName.xyz"
$url = 'https://bitbucket.org/WhatEver/WhatEverBranchName/get/master.zip'
$wc = New-Object -TypeName System.Net.WebClient
$wc.Headers.Add('Authorization','token oinksdilikncl--MyAccessToken--ioiwcnoisif89kfg9')
$wc.DownloadFile($url, $output)
When I execute this, the file I receive at FileName.xyz is an HTML file that redirects me to the Bitbucket login page; essentially it's asking for creds, even though I supplied the access token.
Where am I wrong? Are there other ways to do this, say Invoke-WebRequest? Or could someone kindly point me to a code sample?
I have absolutely zero experience in PowerShell, but I tried to do a similar task in Node, and here are my findings.
First you create an "OAuth consumer" in the access management section of your Bitbucket account settings. This gives you a "Key" and a "Secret".
Now, using this Key and Secret, you ask Bitbucket for a token. In my case I made an HTTP request to https://bitbucket.org/site/oauth2/access_token. In your case you should use an equivalent of curl (Invoke-RestMethod, maybe?). The curl command is:
$ curl -X POST -u "yourKeyHere:yourSecretHere" https://bitbucket.org/site/oauth2/access_token -d grant_type=client_credentials
My HTTP request looked like this (using superagent in Node), with Content-Type set to application/x-www-form-urlencoded:
request.post("https://yourKeyHere:yourSecretHere@bitbucket.org/site/oauth2/access_token").send('grant_type=client_credentials');
Now that you have the token, your program or your command can clone a private repo with it. But the URL to your repo should look like this (keep the braces around the token):
https://x-token-auth:{tokenHere}@bitbucket.org/youRepoOwnerHere/RepoNameHere.git
Now you have the whole codebase on your machine. But if you want a single file rather than the whole repo, I refer you to Retrieve a single file from a repository; just remember to use the repo URL above with the token instead of a normal repo URL.
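For the token request itself, a rough PowerShell equivalent of that curl command could look like the following sketch (untested against Bitbucket; yourKeyHere and yourSecretHere are the same placeholders as above):

```powershell
# Ask Bitbucket for an OAuth2 token using the consumer Key/Secret (placeholders)
$pair  = 'yourKeyHere:yourSecretHere'
$basic = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
$resp  = Invoke-RestMethod -Method Post `
    -Uri 'https://bitbucket.org/site/oauth2/access_token' `
    -Headers @{ Authorization = "Basic $basic" } `
    -Body @{ grant_type = 'client_credentials' }
# The token should then be in $resp.access_token
```

Invoke-RestMethod sends a hashtable -Body as application/x-www-form-urlencoded on POST, which matches the Content-Type used in the Node example.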
Actually, at least now (2 years after the original post), things are easier than that, as it's enough to do Basic auth. So, as long as the script is private and thus you have no problem having the creds written in it, the following should do the trick:
Invoke-WebRequest -Uri '<url>' -Headers @{ Authorization = 'Basic <auth_str_b64>' } -OutFile <dest_path>
where:
- url is something like https://bitbucket.org/<account>/<repo_name>/get/<branch_or_tag_or_whatever>.zip, get it from the Downloads page of the desired repository
- auth_str_b64 is the usual <username>:<password> pair base64 encoded
You can use the following to compute the encoded string:
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes('<username>:<password>'))
In order to avoid keeping the creds lying around in the script you could pass them as arguments or prompt them at runtime.
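For example, a minimal sketch of prompting for the creds at runtime (Get-Credential is interactive; <url> and <dest_path> are the same placeholders as above):

```powershell
# Prompt for the username/password pair instead of hard-coding it
$cred = Get-Credential
$pair = '{0}:{1}' -f $cred.UserName, $cred.GetNetworkCredential().Password
$b64  = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
Invoke-WebRequest -Uri '<url>' -Headers @{ Authorization = "Basic $b64" } -OutFile '<dest_path>'
```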
I've solved this problem like this:
# Instanciate the WebClient
$wc = New-Object -TypeName System.Net.WebClient
# Add the base64 encoded credentials
$wc.Headers.Add('Authorization','Basic {0}' -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f '{USERNAME}','{TOKEN}'))))
# Download the file
$wc.DownloadFile( 'https://{BITBUCKET_URL}/projects/{PROJECT}/repos/{REPOSITORY}/raw/{FILE.EXT}?at=refs%2Fheads%2F{BRANCH}', 'C:\file.txt' )
I am assuming you are using a Personal Access Token. Oh, and WebClient is much, much faster than Invoke-WebRequest or Invoke-RestMethod.
wget -N (or, more verbosely, wget --timestamping) has the nice effect that files which have already been downloaded are not downloaded again.
That way you can save time and resources. I'm looking for the equivalent in PowerShell's Invoke-WebRequest.
Is there a way to respect the file's and the server's time stamp in Invoke-WebRequest?
Based on what I can find in the documentation, no, it doesn't appear that Invoke-WebRequest has an option similar to that.
The best I can suggest is to check yourself in a script through conditionals, saving the new file under a different file name. Since you're using Invoke-WebRequest to download a file, I can only assume you're also using -OutFile as an option;
$File1Creation=(Get-ChildItem <PathToFile1> -Force).CreationTime
Invoke-WebRequest -Uri https://website.com -Outfile <PathToFile2>
$File2Creation=(Get-ChildItem <PathToFile2> -Force).CreationTime
if ($File1Creation -eq $File2Creation)
{
#do something here
} else {
#do something else here
}
The biggest problem is that, because Invoke-WebRequest doesn't have such an option, there's no way to check the timestamp before actually downloading the file, unless the file has a timestamp embedded somewhere on its originating web page.
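That said, when the server sends a Last-Modified header you can approximate wget -N yourself: issue a HEAD request first and only download when the remote copy is newer. A sketch, assuming $url and $path are your own values and the server actually sends Last-Modified:

```powershell
$url  = 'https://website.com/file.zip'   # placeholder
$path = 'C:\downloads\file.zip'          # placeholder

# Ask only for the headers, not the body
$head   = Invoke-WebRequest -Uri $url -Method Head
$remote = [DateTime]($head.Headers['Last-Modified'] | Select-Object -First 1)

if (-not (Test-Path $path) -or (Get-Item $path).LastWriteTime -lt $remote) {
    Invoke-WebRequest -Uri $url -OutFile $path
    # Align the local timestamp with the server's, like wget -N does
    (Get-Item $path).LastWriteTime = $remote
}
```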
So I tried to put a docker-compose.yml file in the .github/workflows directory; of course it tried to pick that up and run it, which didn't work. However, now this always shows up as a workflow. Is there any way to delete it?
Yes, you can delete the results of a run. See the documentation for details.
To delete a particular workflow on your Actions page, you need to delete all runs which belong to this workflow. Otherwise, it persists even if you have deleted the YAML file that had triggered it.
If you have just a couple of runs for a particular workflow, it's easier to delete them manually. But if you have a hundred, it might be worth running a simple script. For example, the following Python script uses the GitHub API:
Before you start, you need to install the PyGithub package (like pip install PyGithub) and define three things:
PAT: create a new personal access GitHub token;
your repo name
your workflow file name (even if you already deleted it, just hover over the workflow on the Actions page):
from github import Github
import requests

token = "ghp_1234567890abcdefghij1234567890123456"  # your PAT
repo = "octocat/my_repo"
action = "my_action.yml"

g = Github(token)
headers = {'Accept': 'application/vnd.github.v3',
           'Authorization': f'token {token}'}

for run in g.get_repo(repo).get_workflow(id_or_name=action).get_runs():
    response = requests.delete(url=run.url, headers=headers)
    if response.status_code == 204:
        print(f"Run {run.id} got deleted")
After all the runs are deleted, the workflow automatically disappears from the page.
Yes, you can delete all the workflow runs of the workflow you want to delete; that workflow will then disappear.
https://docs.github.com/en/rest/reference/actions#delete-a-workflow-run
To delete programmatically
Example (from the docs)
curl \
  -X DELETE \
  -H "Authorization: token <PERSONAL_ACCESS_TOKEN>" \
  -H "Accept: application/vnd.github.v3+json" \
  https://api.github.com/repos/octocat/hello-world/actions/runs/42
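A PowerShell version of the same cleanup loop might look like this sketch (repo, workflow file name, and token are placeholders; for more than 100 runs you would have to page through the results):

```powershell
$token    = '<PERSONAL_ACCESS_TOKEN>'
$repo     = 'octocat/hello-world'
$workflow = 'my_action.yml'
$headers  = @{ Accept = 'application/vnd.github.v3+json'; Authorization = "token $token" }

# List the runs of the workflow, then delete them one by one
$runs = Invoke-RestMethod -Headers $headers `
    -Uri "https://api.github.com/repos/$repo/actions/workflows/$workflow/runs?per_page=100"
foreach ($run in $runs.workflow_runs) {
    Invoke-RestMethod -Method Delete -Headers $headers -Uri $run.url
}
```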
I'm on a project that uses TeamCity for builds.
I have a VM, and have written a PowerShell script that backs up a few files, opens a ZIP artifact that I manually download from TeamCity, and then copies it to my VM.
I'd like to enhance my script by having it retrieve the ZIP artifact (which always has the same name).
The problem is that the download path contains the build number which is always changing. Aside from requesting the download path for the ZIP artifact, I don't really care what it is.
An example artifact path might be:
http://{server}/repository/download/{project}/{build_number}:id/{project}.zip
There is a "Last Successful Build" page in TeamCity that I might be able to obtain the build number from.
What do you think the best way to approach this issue is?
I'm new to TeamCity, but it could also be that the answer is "TeamCity does this - you don't need a PowerShell script." So direction in that regard would be helpful.
At the moment, my PowerShell script does the trick and only takes about 30 seconds to run (which is much faster than my peers that do all of the file copying manually). I'd be happy with just automating the ZIP download so I can "fire and forget" my script and end up with an updated VM.
It seems like the smallest knowledge gap to fill, and retrieving changing path info at run time with PowerShell seems like a pretty decent skill to have.
I might just use C# within PS to collect this info, but I was hoping for a more PS way to do it.
Thanks in advance for your thoughts and advice!
Update: It turns out some other teams had been using Octopus Deploy (https://octopus.com/) for this sort of thing so I'm using that for now - though it actually seems more cumbersome than the PS solution overall since it involves logging into the Octopus server and going through a few steps to kick off a new build manually at this point.
I'm also waiting for the TC administrator to provide a Webhook or something to notify Octopus when a new build is available. Once I have that, the Octopus admin says we should be able to get the deployments to happen automagically.
On the bright side, I do have the build process integrated with Microsoft Teams via a webhook plugin that was available for Octopus. Also, the Developer of Octopus is looking at making a Microsoft Teams connector to simplify this. It's nice to get a notification that the new build is available right in my team chat.
You can try to get your artefact from this url:
http://<ServerUrl>/repository/downloadAll/<BuildId>/.lastSuccessful
Where BuildId is the unique identifier of the build configuration.
My implementation for this, in PowerShell:
#
# GetArtefact.ps1
#
Param(
    [Parameter(Mandatory=$false)][string]$TeamcityServer="",
    [Parameter(Mandatory=$false)][string]$BuildConfigurationId="",
    [Parameter(Mandatory=$false)][string]$LocalPathToSave=""
)
Begin
{
    $username = "guest";
    $password = "guest";
    function Execute-HTTPGetCommand() {
        param(
            [string] $target = $null,
            [string] $outFile = $null
        )
        $request = [System.Net.WebRequest]::Create($target)
        $request.PreAuthenticate = $true
        $request.Method = "GET"
        $request.Accept = "*/*"
        $request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
        $response = $request.GetResponse()
        # Copy the response stream to disk as binary; reading it through a
        # StreamReader and Out-File would corrupt the downloaded zip.
        $responseStream = $response.GetResponseStream()
        $fileStream = [System.IO.File]::Create($outFile)
        $responseStream.CopyTo($fileStream)
        $fileStream.Close()
        $responseStream.Close()
        $response.Close()
    }
    Execute-HTTPGetCommand "http://$TeamcityServer/repository/downloadAll/$BuildConfigurationId/.lastSuccessful" $LocalPathToSave
}
And call this with the appropriate parameters.
EDIT: Note that the credential I used here is the guest account. You should check whether the guest account has the permissions to do this, or specify an appropriate account.
Try constructing the URL to download build artifact using TeamCity REST API.
You can get a permanent link using a wide range of criteria like last successful build or last tagged with a specific tag, etc.
e.g. to get last successful you can use something like:
http://{server}/app/rest/builds/buildType:(id:{build.conf.id}),status:SUCCESS/artifacts/content/{file.name}
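Downloading through such a permanent link from PowerShell could then be sketched as follows (server name, build configuration id, and artifact file name are placeholders):

```powershell
$server = 'teamcity.example.com'      # placeholder
$btId   = 'MyProject_MyBuildConfig'   # placeholder for build.conf.id
$file   = 'MyInstaller.msi'           # placeholder artifact name

Invoke-WebRequest `
    -Uri "http://$server/app/rest/builds/buildType:(id:$btId),status:SUCCESS/artifacts/content/$file" `
    -Credential (Get-Credential) `
    -OutFile $file
```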
TeamCity has the capability to publish its artifacts to a built in NuGet feed. You can then use NuGet to install the created package, not caring about where the artifacts are. Once you do that, you can install with nuget.exe by pointing your source to the NuGet feed URL. Read about how to configure the feed at https://confluence.jetbrains.com/display/TCD10/NuGet.
Read the file content of the path in TEAMCITY_BUILD_PROPERTIES_FILE environment variable.
Locate the teamcity.configuration.properties.file row in the file, iirc the value is backslash encoded.
Read THAT file, and locate the teamcity.serverUrl value, decode it.
Construct the url like this:
{serverurl}/httpAuth/repository/download/{buildtypeid}/.lastSuccessful/file.txt
Here's an example (C#):
https://github.com/WideOrbit/buildtools/blob/master/RunTests.csx#L272
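The steps above could be sketched in PowerShell roughly like this (the parsing of the Java properties format is simplified and may miss some escape sequences, and the teamcity.buildType.id property name is an assumption):

```powershell
# Minimal Java-properties parser: key=value pairs, a few backslash escapes
function Parse-Props([string[]]$lines) {
    $h = @{}
    foreach ($line in $lines) {
        if ($line -match '^\s*[#!]' -or $line -notmatch '=') { continue }
        $key, $value = $line -split '=', 2
        $h[$key.Trim()] = $value.Replace('\:', ':').Replace('\=', '=').Replace('\\', '\')
    }
    $h
}

# Step 1: read the build properties file
$props = Parse-Props (Get-Content $env:TEAMCITY_BUILD_PROPERTIES_FILE)
# Steps 2-3: locate and read the configuration properties file
$configProps = Parse-Props (Get-Content $props['teamcity.configuration.properties.file'])
# Step 4: construct the download url
$url = '{0}/httpAuth/repository/download/{1}/.lastSuccessful/file.txt' -f `
    $configProps['teamcity.serverUrl'], $props['teamcity.buildType.id']
```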
I often make the same mistake over and over: in PowerShell, I run
wget http://example.com
instead of
wget http://example.com -OutFile somename
and when the command (wget aka Invoke-WebRequest) is done executing, the downloaded file is stored... apparently, nowhere.
Q: Is there a way to store the downloaded content post-factum?
No. If you don't specify -OutFile, the content is only returned to the pipeline to be used in the next statement.
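One habit that avoids the problem: capture the response in a variable, then you can still save it afterwards (a sketch; for binary downloads write raw bytes instead of text):

```powershell
$response = Invoke-WebRequest http://example.com
# Text content can still be saved after the fact:
$response.Content | Out-File somename
# For binary content (e.g. a zip), write the bytes instead:
# [IO.File]::WriteAllBytes("$PWD\somename.bin", $response.Content)
```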
I want to download a TeamCity artifact via powershell. It needs to be the last successful build of a specific branch.
I've noticed two common url paths to access the artifacts. One seems to be
/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/ARTIFACT_PATH
The problem is that the file name at the end depends on the release version. Within TeamCity there is syntax to specify all files: *.msi. Is there any way to specify an artifact starting with FileName-{version.number}.msi when trying to access this URL?
EDIT:
The other url I noticed is for the REST API.
http://teamcity/guestAuth/app/rest/builds/branch:[BRANCH],buildType:[BUILD TYPE],status:SUCCESS,state:finished/artifacts/[BUILD PATH]
The problem is that I can't download artifacts directly from here. To download an artifact I have to use the actual build id: the URL above returns XML that points to /guestAuth/app/rest/builds/id:[Build ID]/artifacts/content/[Artifact Path] for downloading the artifact.
I can use the first REST url to eventually get the second through the returned xml, but would prefer a more straightforward approach.
Unfortunately as TeamCity artifacts are not browsable the usual workarounds like wget recursive download or wildcards are not applicable.
- Using wildcards in wget or curl query
- How do I use Wget to download all Images into a single Folder
One workaround you could try is formatting the link in the job, saving the link to a text file and storing that as an artifact as well, with a static name. Then you just need to download that text file to get the link.
I found you can format the artifact URL in TeamCity job by doing:
%teamcity.serverUrl%/repository/download/%system.teamcity.buildType.id%/%teamcity.build.id%:id/<path_to_artifact>
In a command line step. You can write this to a file by doing:
echo %teamcity.serverUrl%/repository/download/%system.teamcity.buildType.id%/%teamcity.build.id%:id/myMsi-1.2.3.4.msi > msiLink.txt
Now you have an artifact with a constant name, that points to the installer (or other artifact) with the changing name.
If you use the artifact msiLink.txt you don't need to use the REST interface (it's still two calls, both through the same interface).
You can easily download the latest version from batch/cmd by using:
wget <url_server>/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/msiLink.txt --user #### --password ####
set /P msi_url=<msiLink.txt
wget %msi_url% --user #### --password ####
Hope it helps.
Update:
Sorry I just realized the question asked for PowerShell:
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.Networkcredential("yourUser", "yourPassword")
$WebClient.DownloadFile( "<url_server>/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/msiLink.txt", "msiLink.txt" )
# Trim the trailing newline so the URL is clean
$msi_link = [IO.File]::ReadAllText(".\msiLink.txt").Trim()
$WebClient.DownloadFile( $msi_link, "yourPath.msi" )