I have a Jenkins job that needs to be triggered by Octopus Deploy. For that I used the step template "Jenkins - Queue Job" from the Octopus Library's installed Community Step Templates.
In the template's existing PowerShell script I updated my parameters to run the Jenkins job, but during execution I get the error below:
Exception in jenkins job: The remote server returned an error: (403) Forbidden.
The remote script failed with exit code 1.
I have tried several ways to authenticate Octopus with Jenkins but still couldn't find one that works. Can someone provide input on this? Thanks in advance!
I encountered the same issue just now. I ended up copying their template and making changes. The issue is that the session cookie is not kept across requests. Note: I had to move the authentication code out of the function because the session variable data would not persist for some reason.
$jenkinsServer = $OctopusParameters['jqj_JenkinsServer']
$jenkinsUserName = $OctopusParameters['jqj_JenkinsUserName']
$jenkinsUserPassword = $OctopusParameters['jqj_JenkinsUserPasword']
$jobURL = $jenkinsServer + $OctopusParameters['jqj_JobUrl']
$failBuild = [System.Convert]::ToBoolean($OctopusParameters['jqj_FailBuild'])
$jobTimeout = $OctopusParameters['jqj_JobTimeout']
$buildParam = $OctopusParameters['jqj_BuildParam']
$checkIntervals = $OctopusParameters['jqj_checkInterval']
$jobUrlWithParams = "$jobURL$buildParam"
Write-Host "job url: " $jobUrlWithParams
try {
$params = @{}
if ($jenkinsUserName -ne "") {
$securePwd = ConvertTo-SecureString $jenkinsUserPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($jenkinsUserName, $securePwd)
$head = @{"Authorization" = "Basic " + [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($jenkinsUserName + ":" + $jenkinsUserPassword))}
$params = @{
Headers = $head;
Credential = $credential;
ContentType = "text/plain";
}
}
# If your Jenkins uses the "Prevent Cross Site Request Forgery exploits" security option (which it should),
# when you make a POST request, you have to send a CSRF protection token as an HTTP request header.
# https://wiki.jenkins.io/display/JENKINS/Remote+access+API
try {
$tokenUrl = $jenkinsServer + "crumbIssuer/api/json?tree=crumbRequestField,crumb"
$crumbResult = Invoke-WebRequest -Uri $tokenUrl -Method Get @params -UseBasicParsing -SessionVariable session | ConvertFrom-Json
Write-Host "CSRF protection is enabled, adding CSRF token to request headers"
$params.Headers += @{$crumbResult.crumbRequestField = $crumbResult.crumb}
} catch {
Write-Host $Error[0]
}
Write-Host "Start the build"
$returned = Invoke-WebRequest -Uri $jobUrlWithParams -WebSession $session -Method Post -UseBasicParsing @params
Write-Host "Job URL Link: $($returned.BaseResponse.Headers['Location'])"
$jobResult = "$($returned.BaseResponse.Headers['Location'])/api/json"
$response = Invoke-RestMethod -Uri $jobResult -Method Get @params -WebSession $session
$buildUrl = $Response.executable.url
while ($buildUrl -eq $null -or $buildUrl -eq "") {
$response = Invoke-RestMethod -Uri $jobResult -Method Get @params -WebSession $session
$buildUrl = $Response.executable.url
}
Write-Host "Build Number is: $($Response.executable.number)"
Write-Host "Job URL Is: $($buildUrl)"
$buildResult = "$buildUrl/api/json?tree=result,number,building"
$isBuilding = "True"
$i = 0
Write-Host "Estimate Job Duration: " $jobTimeout
while ($isBuilding -eq "True" -and $i -lt $jobTimeout) {
$i += $checkIntervals
Write-Host "waiting $checkIntervals secs for build to complete"
Start-Sleep -s $checkIntervals
$retryJobStatus = Invoke-RestMethod -Uri $buildResult -Method Get @params -WebSession $session
$isBuilding = $retryJobStatus[0].building
$result = $retryJobStatus[0].result
$buildNumber = $retryJobStatus[0].number
Write-Host "Retry Job Status: " $result " BuildNumber: " $buildNumber " IsBuilding: " $isBuilding
}
if ($failBuild) {
if ($result -ne "SUCCESS") {
Write-Host "BUILD FAILURE: build is unsuccessful or status could not be obtained."
exit 1
}
}
}
catch {
Write-Host "Exception in jenkins job: $($_.Exception.Message)"
exit 1
}
$API_KEY = "xxxxxxxxxx"
# Source image files
$ImageFiles = (Get-ChildItem -Path C:\Users\sam\Desktop\jpeg\* -filter *).Name
$uploadedFiles = @()
try {
foreach ($imageFile in $ImageFiles ) {
# 1a. RETRIEVE THE PRESIGNED URL TO UPLOAD THE FILE.
# Prepare URL for `Get Presigned URL` API call
$query = "https://api.pdf.co/v1/file/upload/get-presigned-url?contenttype=application/octet-stream&name=" + [IO.Path]::GetFileName($imageFile)
$query = [System.Uri]::EscapeUriString($query)
# Execute request
$jsonResponse = Invoke-RestMethod -Method Get -Headers @{ "x-api-key" = $API_KEY } -Uri $query
if ($jsonResponse.error -eq $false) {
# Get URL to use for the file upload
$uploadUrl = $jsonResponse.presignedUrl
# Get URL of uploaded file to use with later API calls
$uploadedFileUrl = $jsonResponse.url
# 1b. UPLOAD THE FILE TO CLOUD.
$r = Invoke-WebRequest -Method Put -Headers @{ "x-api-key" = $API_KEY; "content-type" = "application/octet-stream" } -InFile $imageFile -Uri $uploadUrl
if ($r.StatusCode -eq 200) {
# Keep uploaded file URL
$uploadedFiles += $uploadedFileUrl
}
else {
# Display request error status
Write-Host "$($r.StatusCode) $($r.StatusDescription)"
}
}
else {
# Display service reported error
Write-Host $jsonResponse.message
}
}
if ($uploadedFiles.length -gt 0) {
# 2. CREATE PDF DOCUMENT FROM UPLOADED IMAGE FILES
# Prepare URL for `DOC To PDF` API call
$query = "https://api.pdf.co/v1/pdf/convert/from/image"
# Prepare request body (will be auto-converted to JSON by Invoke-RestMethod)
# See documentation: https://apidocs.pdf.co
$body = @{
"name" = $(Split-Path $DestinationFile -Leaf)
"url" = $uploadedFiles -join ","
} | ConvertTo-Json
# Execute request
$response = Invoke-WebRequest -Method Post -Headers @{ "x-api-key" = $API_KEY; "Content-Type" = "application/json" } -Body $body -Uri $query
$jsonResponse = $response.Content | ConvertFrom-Json
if ($jsonResponse.error -eq $false) {
# Get URL of generated PDF file
$resultFileUrl = $jsonResponse.url;
$DestinationFile = "C:\Users\sam\Desktop\pdf\$($imageFile.Split('.')[0]).pdf"
# Download PDF file
Invoke-WebRequest -Headers @{ "x-api-key" = $API_KEY } -OutFile $DestinationFile -Uri $resultFileUrl
Write-Host "Generated PDF file saved as `"$($DestinationFile)`" file."
}
else {
# Display service reported error
Write-Host $jsonResponse.message
}
}
}
catch {
# Display request error
Write-Host $_.Exception
}
Basically this script converts a bulk number of JPEG images to PDF format. It worked initially, but later executions of the script fail with the error "The underlying connection was closed: An unexpected error occurred on a send." I googled this issue and added these two lines at the beginning of the script:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls -bor [Net.SecurityProtocolType]::Tls11 -bor [Net.SecurityProtocolType]::Tls12
Even after adding these two lines I am getting the same error. Can anyone please help me with this issue?
I am trying to get a target's capabilities under an Azure DevOps deployment group using the PowerShell REST API. However, I am not sure which URL will fetch those capabilities. My PowerShell script works up to fetching the status of targets. Please help if there is anything we can do to fetch capabilities.
Below is my script, which works up to fetching target details:
$projects = "testing"
$projectlist = $projects.split(';')
$PAT = "#######################################33"
$Header = @{Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$($PAT)")) }
foreach($projectName in $projectlist){
write-host "================================================="
$baseURL = "https://dev.azure.com/abc/$($projectName)/_apis/distributedtask/deploymentgroups"
$deploymentgroup=Invoke-RestMethod -Uri "https://dev.azure.com/abc/$($projectName)/_apis/distributedtask/deploymentgroups?api-version=6.0-preview.1" -Method get -Headers $Header
$deploymentgroupsname=$deploymentgroup.value.name
foreach($deploymentgroupname in $deploymentgroupsname){
$deploymentGroupURL = "$($baseURL)?name=$($deploymentgroupname)&api-version=6.0"
try{
$deploymentgroup=Invoke-RestMethod -Uri "$deploymentGroupURL" -Method get -Headers $Header
}catch{
write-host "URL is not accessible - $deploymentGroupURL"
}
$deploymentGroupResponse=$deploymentgroup.value
$deploymentGroupid=$deploymentGroupResponse.id
try{
$targets=Invoke-RestMethod -Uri "https://dev.azure.com/abc/$($projectName)/_apis/distributedtask/deploymentgroups/$($deploymentGroupid)/targets?api-version=6.0-preview.1" -Method get -Headers $Header
}catch{
write-host "URL is not accessible - $deploymentGroupURL"
}
if($null -ne $deploymentGroupId){
$targets.value.agent|select name, status|%{
$hostname=$_.name
$Status=$_.status
if($status -eq "offline"){
$targetURL = "$($baseURL)/$deploymentGroupId/targets?name=$($hostName)&api-version=6.0-preview.1"
try{
$target = (Invoke-RestMethod -Uri $targetURL -Method get -Headers $Header).value
$targetId = $target.id ;
if($null -ne $targetId){
$url = "$($baseURL)/$deploymentGroupId/targets/$($targetId)?api-version=6.0"
try{
write-host "Projectname is : $projectName"
write-host "deploymentGroupname is : $deploymentgroupname"
write-host "Server $hostname is not pingble"
}
catch{
write-host "TARGET DELETE ERROR: $hostName";Write-Error $_.Exception.Message
}
}
else{
write-host "Target $hostName NOT Found in DeploymentGroup $deploymentgroupname."
}
}catch {
write-host "TARGET LIST ERROR";Write-Error $_.Exception.Message
}
}
}
}else{
write-host "DeploymentGroup $deploymentgroupname NOT FOUND in $projectName"
}
}
}
To fetch capabilities of the Deployment Group Targets, you can use the REST API Targets - Get. You need to add the parameter $expand=capabilities to the URL.
GET https://dev.azure.com/{organization}/{project}/_apis/distributedtask/deploymentgroups/{deploymentGroupId}/targets/{targetId}?$expand=capabilities&api-version=6.0-preview.1
Then you will get the capabilities of the Deployment Group Targets.
To use the Rest API URL in the PowerShell, you can use the following format:
$url="https://dev.azure.com/{organization}/{project}/_apis/distributedtask/deploymentgroups/{deploymentGroupId}/targets/{targetId}?`$expand=capabilities&api-version=6.0-preview.1"
Update:
When we add $expand=capabilities to the REST API URL, it returns the capabilities in the API response.
Here is an example:
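A sketch of the full call in PowerShell (the organization, project, IDs and PAT below are placeholders; substitute your own):

```powershell
# Placeholder values - substitute your own organization, project, IDs and PAT
$organization      = "abc"
$projectName       = "testing"
$deploymentGroupId = 1
$targetId          = 2
$PAT               = "xxxxxxxx"

$Header = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$($PAT)")) }

# The backtick before $expand stops PowerShell from expanding it as a variable
$url = "https://dev.azure.com/$organization/$projectName/_apis/distributedtask/deploymentgroups/$deploymentGroupId/targets/$($targetId)?`$expand=capabilities&api-version=6.0-preview.1"

# $target = Invoke-RestMethod -Uri $url -Method Get -Headers $Header
# $target.capabilities    # e.g. Agent.OS, Agent.ComputerName, ...
```

The Invoke-RestMethod call is commented out here because it needs a live organization and PAT; the URL construction is the part that matters.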
I am trying to work out a powershell script that:
retrieves an access token (MSAL) to access (read/write) a SharePoint Online site with subsites and documents. Preferably the Azure app registration's service principal can be granted access to just that site, so it can reach the site and its files without being given consent to the whole SharePoint environment. I don't know if that is currently possible, as I can only grant the application permissions Files.ReadWrite.All and Sites.ReadWrite.All; I do not see anything like Files.ReadWrite.Shared that would grant access only to sites/collections the service principal has access to. Has anyone done this? I currently use the MSAL.PS PowerShell module to get a token using an app registration with the admin-consented ReadWrite.All access, but would like to limit that. The code for this is now:
Import-Module MSAL.PS;
$clientid = "my-appreg-client-id";
$tenantID = 'my-tenant-id';
$thumbPrint = 'certificate-thumbprint';
$ClientCertificate = Get-Item "Cert:\CurrentUser\My\$thumbPrint";
$myAccessToken = Get-MsalToken -ClientId $clientID -TenantId $tenantID -ClientCertificate $ClientCertificate;
The script will read all files and folders from a UNC share and build a collection of the on-prem files. That part of the code is in place, using a Get-ChildItem call against the UNC file tree.
Then, after getting the token, I need to get the current available files in the sharepoint online site document library structure and store that in a variable/hashtable which I can use to perform lookups between the onprem filecollection and the presence of those files and (sub)folders in the sharepoint site. If a folder does not yet exist I need to create that sharepoint folder and if a file is not yet present or the onprem version is newer I need to upload that file into sharepoint.
I have a script that does this using the old sharepoint.client.dll libraries, but those support only basic authentication, which will soon be unavailable for accessing the MS Online environment. So now I am searching for a way to do this using the Microsoft Graph API or another REST API. I am already struggling to get the contents of a site's file collection, so I hope this generic problem description is enough to get some hints, tips, and resources to get going.
Many thanks,
Eric
This is what I use. I'm using powershell in Linux.
## Get the Token
$clientId = "Application (Client) ID"
$clientSecret = "Client secret"
$tenantName = "TenantName.onmicrosoft.com"
$tokenBody = @{
Grant_Type = 'client_credentials'
Scope = 'https://graph.microsoft.com/.default'
Client_Id = $clientId
Client_Secret = $clientSecret
}
$tokenResponse = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$TenantName/oauth2/v2.0/token" -Method POST -Body $tokenBody -ErrorAction Stop
$headers = @{
"Authorization" = "Bearer $($tokenResponse.access_token)"
"Content-Type" = "application/json"
}
## Use the SharePoint groups ObjectID. From this we'll get the drive ID.
$site_objectid = "Groups ObjectID"
## Create all the folders on the SharePoint site first. I've set microsoft.graph.conflictBehavior below to fail because I never want to rename or replace folders.
# Set the base directory.
$baseDirectory = "/test"
$directories = get-childItem -path $baseDirectory -recurse -directory
foreach ($directory in $directories) {
$URL = "https://graph.microsoft.com/v1.0/groups/$site_objectid/sites/root"
$subsite_ID = (Invoke-RestMethod -Headers $headers -Uri $URL -Method Get).ID
$URL = "https://graph.microsoft.com/v1.0/sites/$subsite_ID/drives"
$Drives = Invoke-RestMethod -Headers $headers -Uri $URL -Method Get
$Document_drive_ID = ($Drives.value | Where-Object { $_.name -eq 'Documents' }).id
$createFolderURL = "https://graph.microsoft.com/v1.0/drives/$Document_drive_ID/items/root:{0}:/children" -f $directory.parent.FullName
$file = $directory.Name
$uploadFolderRequestBody = @{
name = "$file"
folder = @{}
"#microsoft.graph.conflictBehavior"= "fail"
} | ConvertTo-Json
invoke-restMethod -headers $headers -method Post -body $uploadFolderRequestBody -contentType "application/json" -uri $createFolderURL
}
## Upload the files. I'm only adding files that are 4 days old or less because I run the script every 3 days for backup.
## These are set in the $sharefiles variable. To upload all files just remove everything after the pipe.
$sharefiles = get-childItem $baseDirectory -recurse | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-4)}
foreach ($sharefile in $sharefiles) {
$Filepath = $sharefile.FullName
$URL = "https://graph.microsoft.com/v1.0/groups/$site_objectid/sites/root"
$subsite_ID = (Invoke-RestMethod -Headers $headers -Uri $URL -Method Get).ID
$URL = "https://graph.microsoft.com/v1.0/sites/$subsite_ID/drives"
$Drives = Invoke-RestMethod -Headers $headers -Uri $URL -Method Get
$Document_drive_ID = ($Drives.value | Where-Object { $_.name -eq 'Documents' }).id
$Filename = $sharefile.Name
$upload_session = "https://graph.microsoft.com/v1.0/drives/$Document_drive_ID/root:{0}/$($Filename):/createUploadSession" -f $sharefile.directory.FullName
$upload_session_url = (Invoke-RestMethod -Uri $upload_session -Headers $headers -Method Post).uploadUrl
## We'll upload files in chunks.
$ChunkSize = 62259200
$file = New-Object System.IO.FileInfo($Filepath)
$reader = [System.IO.File]::OpenRead($Filepath)
$buffer = New-Object -TypeName Byte[] -ArgumentList $ChunkSize
$position = 0
$counter = 0
Write-Host "ChunkSize: $ChunkSize" -ForegroundColor Cyan
Write-Host "BufferSize: $($buffer.Length)" -ForegroundColor Cyan
$moreData = $true
While ($moreData) {
#Read a chunk
$bytesRead = $reader.Read($buffer, 0, $buffer.Length)
$output = $buffer
If ($bytesRead -ne $buffer.Length) {
#no more data to be read
$moreData = $false
#shrink the output array to the number of bytes
$output = New-Object -TypeName Byte[] -ArgumentList $bytesRead
[Array]::Copy($buffer, $output, $bytesRead)
Write-Host "no more data" -ForegroundColor Yellow
}
#Upload the chunk
$Header = @{
'Content-Range' = "bytes $position-$($position + $output.Length - 1)/$($file.Length)"
}
Write-Host "Content-Range = bytes $position-$($position + $output.Length - 1)/$($file.Length)" -ForegroundColor Cyan
#$position = $position + $output.Length - 1
$position = $position + $output.Length
Invoke-RestMethod -Method Put -Uri $upload_session_url -Body $output -Headers $Header -SkipHeaderValidation
#Increment counter
$counter++
}
$reader.Close()
}
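The Content-Range bookkeeping in the upload loop above is the part that most often goes wrong (Graph rejects overlapping or gapped ranges). Here is the same arithmetic on a hypothetical 10-byte file with 4-byte chunks, small enough to check by hand:

```powershell
# Toy model of the chunked-upload ranges: 10-byte file, 4-byte chunks
$fileLength = 10
$chunkSize  = 4
$position   = 0
$ranges     = @()
while ($position -lt $fileLength) {
    # The last chunk may be short, exactly like the $bytesRead check above
    $len = [Math]::Min($chunkSize, $fileLength - $position)
    $ranges += "bytes $position-$($position + $len - 1)/$fileLength"
    $position += $len
}
$ranges   # contiguous ranges ending at byte 9 of 10
```

Each range's end is start + length - 1, and the next range starts where the previous one ended plus one; that is why the script advances $position by $output.Length, not $output.Length - 1.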
Is there a way to get a list of test points data via the Azure DevOps API?
I tried this PowerShell script:
function GetUrl() {
param(
[string]$orgUrl,
[hashtable]$header,
[string]$AreaId
)
# Area ids
# https://learn.microsoft.com/en-us/azure/devops/extend/develop/work-with-urls?view=azure-devops&tabs=http&viewFallbackFrom=vsts#resource-area-ids-reference
# Build the URL for calling the org-level Resource Areas REST API for the RM APIs
$orgResourceAreasUrl = [string]::Format("{0}/_apis/resourceAreas/{1}?api-version=5.0-preview.1", $orgUrl, $AreaId)
# Do a GET on this URL (this returns an object with a "locationUrl" field)
$results = Invoke-RestMethod -Uri $orgResourceAreasUrl -Headers $header
# The "locationUrl" field reflects the correct base URL for RM REST API calls
if ($null -eq $results) {
$areaUrl = $orgUrl
}
else {
$areaUrl = $results.locationUrl
}
return $areaUrl
}
$orgUrl = "https://dev.azure.com/fodservices"
$personalToken = "<my token pat>"
Write-Host "Initialize authentication context" -ForegroundColor Yellow
$token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($personalToken)"))
$header = @{authorization = "Basic $token"}
Write-Host "Demo 3"
$coreAreaId = "3b95fb80-fdda-4218-b60e-1052d070ae6b"
$tfsBaseUrl = GetUrl -orgUrl $orgUrl -header $header -AreaId $coreAreaId
$relDefUrl = "$($tfsBaseUrl)/_apis/testplan/Plans/70152/Suites/70154/TestPoint?api-version=5.0-preview.2"
try {
$output = Invoke-RestMethod -Uri $relDefUrl -Method Get -ContentType "application/json" -Headers $header
}
catch{
Write-Host "StatusCode:" $_.Exception.Response.StatusCode.value__
Write-Host "StatusDescription:" $_.Exception.Response.StatusDescription
}
$output.value | ForEach-Object {
Write-Host $_.id
}
the result is:
Demo 3
StatusCode: 404
StatusDescription: Not Found
Can anyone tell me what I'm doing wrong? I'm new to using PowerShell and the Azure DevOps REST API.
Look at the contents of the $tfsBaseUrl variable: it includes the organization name, but not the project name. The testplan endpoints are project-scoped, so you need to include the project name in the URL. Look at the documentation and compare your URL to the documentation's URL.
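As a sketch, with a placeholder project name, the corrected URL construction would look like this:

```powershell
# $tfsBaseUrl resolves to the organization root, e.g. https://dev.azure.com/fodservices
$tfsBaseUrl  = "https://dev.azure.com/fodservices"
$projectName = "MyProject"   # placeholder - use your real project name

# testplan endpoints are project-scoped, so the project name must sit
# between the organization and /_apis
$relDefUrl = "$($tfsBaseUrl)/$($projectName)/_apis/testplan/Plans/70152/Suites/70154/TestPoint?api-version=5.0-preview.2"

# $output = Invoke-RestMethod -Uri $relDefUrl -Method Get -ContentType "application/json" -Headers $header
```

With the project segment in place, the 404 from the org-level URL should go away (assuming the plan and suite IDs are valid for that project).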
I'm working on a script to login to a sharepoint 2013 site and navigate to a few pages to make sure the site is working after updates and DR drills. I'm calling Invoke-WebRequest like this:
$site = Invoke-WebRequest -uri 'https://spsite' -Credential $(Get-Credential) -SessionVariable s
when I make the call I get a 401 Unauthorized error back. I have tried using basic authentication and building out the headers like this:
$u = 'domain\user'
$p = 'password'
$header = @{ Authorization = "Basic {0}" -f [convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $u,$p))) }
$site = Invoke-WebRequest -uri 'https://spsite' -Headers $header
with the same result. I'm hoping someone could offer another way to make this connection?
So I found a way to make this work in my situation and wanted to post the basics in case someone else runs into this.
I found that when the exception is thrown, you can get the actual response from the web server from the exception object like this:
try{
$site = Invoke-WebRequest -uri 'https://spsite' -Credential $(Get-Credential) -SessionVariable s
}
catch{
$site = $_.Exception.Response
}
after that I was able to manipulate the $site variable to follow the redirection and submit the credentials as needed.
Use Export-PSCredential and Import-PSCredential from WFTools - you'll only have to enter your credentials once per box, and it will last as long as your password doesn't change: https://github.com/RamblingCookieMonster/PowerShell
Install-Module -Name WFTools -RequiredVersion 0.1.44
Import-Module WFTools;
$getCredentialMessage = "Please provide your Windows credentials";
$importedCredential = Import-PSCredential;
if ($importedCredential) {
Write-Host -ForegroundColor Yellow "Imported your cached credential."
$credential = $importedCredential;
while (-not $(Test-Credential -Credential $credential)) {
Write-Host -ForegroundColor Yellow "Your cached credentials are not valid. Please re-enter."
$credential = Get-Credential -Message $getCredentialMessage;
}
}
else {
$credential = Get-Credential -Message $getCredentialMessage;
while (-not $(Test-Credential -Credential $credential)) {
$credential = Get-Credential -Message $getCredentialMessage;
}
Export-PSCredential $credential;
}
# Here is where the magic happens
$site = Invoke-WebRequest -uri 'https://spsite' -Credential $credential