Can we replace the nuget.config file with command parameters?

I am working on an Azure pipeline that runs on a Windows self-hosted agent.
We configured an Azure Artifacts feed with an upstream source to connect to NuGet; as we are behind a firewall, this seems to be the only way to reach NuGet.
My pipeline was working with this nuget.config file:
<packageSources>
  <clear />
  <add key="FeedName" value="https://***.pkgs.visualstudio.com/***/_packaging/FeedName/nuget/v3/index.json" />
</packageSources>
And this YAML:
- task: NuGetAuthenticate@0
- task: CmdLine@2
  inputs:
    script: '"C:\dotnet\dotnet.exe" publish ${{ parameters.solutionToPublishPath }} -c ${{ parameters.buildConfiguration }} -o $(Build.BinariesDirectory)'
The nuget.config file breaks the previous pipelines in TeamCity!
To keep the old pipeline running while I work on the new one, I am looking for a way to move the information from the nuget.config file into the script.
Is that possible?
I tried with this:
- task: CmdLine@2
  inputs:
    script: '"C:\dotnet\dotnet.exe" add "src/project/project.API.csproj" package FeedName -s https://***.pkgs.visualstudio.com/***/_packaging/FeedName/nuget/v3/index.json'
I get this message, which to me indicates that it tried to reach NuGet directly and failed; this is why we use a feed.
error: Unable to load the service index for source https://api.nuget.org/v3/index.json.
error: Response status code does not indicate success: 302 (Moved Temporarily).
Thanks for any help

You may check the Replace Tokens extension to see whether it helps. It can replace tokens in files with pipeline variable values during the run.
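For illustration, a minimal sketch of that approach, assuming the Replace Tokens marketplace task (replacetokens) is installed; the task version, token style, and the FeedUrl variable are assumptions, not part of the original setup:

- task: replacetokens@3
  inputs:
    # Replace #{FeedUrl}# placeholders in nuget.config with the
    # pipeline variable 'FeedUrl' (hypothetical variable name).
    targetFiles: '**/nuget.config'
    tokenPrefix: '#{'
    tokenSuffix: '}#'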

I would not call it a solution, as I can't move the nuget.config information out of the file and into the command line. I'll remove the file to let TeamCity work and put it back when running the Azure pipeline. Thanks.

We are overriding nuget.config in our Azure DevOps pipeline with DotNetCoreCLI@2 and restoreArguments:
- task: DotNetCoreCLI@2
  displayName: Restore
  inputs:
    command: 'restore'
    projects: |
      $(buildProjects)
      !$(testProjects)
    restoreArguments: --source https://api.nuget.org/v3/index.json --source $(Build.SourcesDirectory)/Nugets
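If you would rather keep the plain CmdLine task from the question, the same override should be possible by passing the feed directly to dotnet restore and then publishing with --no-restore. A minimal sketch, assuming NuGetAuthenticate@0 supplies credentials for the private feed (the feed URL placeholder is copied from the question):

- task: NuGetAuthenticate@0
- task: CmdLine@2
  inputs:
    script: |
      rem Restore from the private feed only, ignoring sources in nuget.config.
      "C:\dotnet\dotnet.exe" restore ${{ parameters.solutionToPublishPath }} --source https://***.pkgs.visualstudio.com/***/_packaging/FeedName/nuget/v3/index.json
      rem Publish without triggering another restore.
      "C:\dotnet\dotnet.exe" publish ${{ parameters.solutionToPublishPath }} -c ${{ parameters.buildConfiguration }} -o $(Build.BinariesDirectory) --no-restore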

Related

Checkov scan particular folder or PR custom branch files

I'm trying to run Checkov (for IaC validation) via Azure DevOps YAML pipelines, against ARM template files stored in Azure DevOps version control. The code is below:
trigger: none

pool:
  vmImage: ubuntu-latest

stages:
- stage: 'runCheckov'
  displayName: 'Checkov - Scan ARM files'
  jobs:
  - job: 'RunCheckov'
    displayName: 'Checkov solution'
    steps:
    - bash: |
        docker pull bridgecrew/checkov
      workingDirectory: $(System.DefaultWorkingDirectory)
      displayName: 'Pull bridgecrew/checkov image'
    - bash: |
        docker run \
          --volume $(pwd):/scripts bridgecrew/checkov \
          --directory /scripts \
          --output junitxml \
          --soft-fail > $(pwd)/CheckovReport.xml
      workingDirectory: $(System.DefaultWorkingDirectory)
      displayName: 'Run checkov'
    - task: PublishTestResults@2
      inputs:
        testRunTitle: 'Checkov run results'
        failTaskOnFailedTests: false
        testResultsFormat: 'JUnit'
        testResultsFiles: 'CheckovReport.xml'
        searchFolder: '$(System.DefaultWorkingDirectory)'
        mergeTestResults: false
        publishRunAttachments: true
      displayName: 'Publish Test results'
The problem: how do I change the path/folder of ARM templates to scan? Right now it scans all ARM templates found under my whole repo1, regardless of what directory value I set.
Also, how do I scan the files committed to a custom branch during a PR review, so the build is triggered but scans only the files in that branch? I know how to set up the build trigger via the DevOps repository settings, but again, how do I make sure the pipeline scans only the files of a particular PR commit, not the whole repo1 (and master branch)?
I recommend you use the Docker image bridgecrew/checkov to set up a container job to run the Checkov scan. A container job runs all of the job's tasks inside a Docker container started from that image.
In the container job, you can check out the source repository into the container, then use a script task (such as the Bash task) to run the Checkov CLI and scan the files. On the script task, you can use the 'workingDirectory' option to specify the path/folder where the commands run; normally, the commands will only act on files in the specified directory and its subdirectories.
If you want to scan only the files in a specific branch, you can clone/check out that branch into the working directory of the job in the container, then, as above, use the Checkov CLI to scan files under the specified directory.
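For example, a rough sketch of such a container job (the image tag and the 'arm-templates' folder name are placeholders, not values from the question):

jobs:
- job: RunCheckov
  pool:
    vmImage: ubuntu-latest
  # Every step of this job runs inside the checkov image.
  container: bridgecrew/checkov:latest
  steps:
  - checkout: self
  - bash: |
      # Scan only the ARM templates folder, relative to workingDirectory.
      checkov --directory arm-templates --output junitxml --soft-fail > $(System.DefaultWorkingDirectory)/CheckovReport.xml
    workingDirectory: $(System.DefaultWorkingDirectory)
    displayName: 'Run Checkov scan'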
[UPDATE]
In the pipeline job, you can try calling the Azure DevOps REST API "Commits - Get Changes" to get all the changed files and folders for a particular commit.
Then use the Checkov CLI with the parameter --directory (-d) or --file (-f) to scan the specified file or folder.
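A rough bash sketch of that idea (the organization, project, and repository names are placeholders; the jq filter assumes the documented response shape of "Commits - Get Changes", and the .json filter assumes ARM templates):

# Hypothetical: list files changed in the triggering commit, then scan each one.
# Assumes SYSTEM_ACCESSTOKEN is mapped from $(System.AccessToken) on the task;
# $(Build.SourceVersion) is expanded by the pipeline before bash runs.
CHANGES_URL="https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/MyRepo/commits/$(Build.SourceVersion)/changes?api-version=6.0"
FILES=$(curl -s -u :$SYSTEM_ACCESSTOKEN "$CHANGES_URL" | jq -r '.changes[].item.path' | grep '\.json$')
for f in $FILES; do
  checkov --file ".$f" --soft-fail   # --file (-f) scans a single template
done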

How do you copy Azure repo folders to a folder on a VM in an Environment in a pipeline?

I have an Environment called 'Dev' that has a resource, which is a VM. As part of the 'Dev' pipeline I want to copy files from a specific folder on the develop branch of a specific repo to a specific folder on the VM that's on the Environment.
I've not worked with Environments before, or YAML pipelines much, but I gather I need to use the CopyFiles@2 task.
So I've got an azure pipeline yaml file something like this:
variables:
  isDev: $[eq(variables['Build.SourceBranch'], 'refs/heads/develop')]

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files'
      inputs:
        contents: 'myFolder\**'
        Overwrite: true
        targetFolder: $(Build.ArtifactStagingDirectory)
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: $(Build.ArtifactStagingDirectory)
        artifactName: myArtifact
- stage: Deployment
  dependsOn: Build
  condition: and(succeeded(), eq(variables.isDev, true))
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    pool:
      vmImage: 'windows-latest'
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Foo Bar
The first question is: how do I get this to copy the files to a specific path on the Dev environment?
Is the PublishBuildArtifacts step really needed? The reason I ask is that I want this to copy files every time the pipeline runs, and not error if the artifact already exists.
It also feels a bit dirty to have to check the branch this way. Is there a better way to do it?
The deployment strategy you're using relies on specifying an agent pool, which means it doesn't run on the machines in the environment. If you use a strategy such as rolling, it will run the specified steps on those machines automatically, including any download steps to download artifacts.
Ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops#deployment-strategies
You need to publish artifacts as part of the pipeline if you want them to be automatically available to downstream jobs. Each run gets its own set of artifacts, even if the actual artifact contents are the same.
That said, based on the YAML you posted, you probably don't need to. In fact, you don't need the Build stage at all: you could just add a checkout step to your rolling deployment, and the repo would be cloned on each of the target machines.
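Putting that together, a sketch of what the deployment stage could look like without the Build stage (the 'Dev' environment and 'myFolder' come from the question; the destination path on the VM is a placeholder):

stages:
- stage: Deployment
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/develop')
  jobs:
  - deployment: Deploy
    displayName: Deploy to Dev
    environment:
      name: Dev
      resourceType: VirtualMachine
    strategy:
      rolling:
        deploy:
          steps:
          # The job runs on each VM in the environment, so the repo is cloned there.
          - checkout: self
          - task: CopyFiles@2
            inputs:
              sourceFolder: '$(Build.SourcesDirectory)/myFolder'
              targetFolder: 'C:\deploy\myFolder'  # placeholder destination on the VM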
OK, I worked this out with help from this article: https://dev.to/kenakamu/azure-devops-yaml-release-pipeline-trigger-when-build-pipeline-completed-54d5.
I took Daniel Mann's advice about using the 'rolling' strategy. I then split my pipeline into two pipelines: one for building the artifacts and one for releasing (copying) them.
If you want to download just the particular folders instead of all the source files from the repository, you can try using the REST API "Items - Get" to download each folder individually.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/items?path={path}&download=true&$format=zip&versionDescriptor.version={versionDescriptor.version}&resolveLfs=true&api-version=6.0
For example, suppose the repository contains the folder 'res/TestFolder01' on the main branch. In the YAML pipeline, I just want to download that 'TestFolder01' folder.
jobs:
- job: build
  . . .
  steps:
  - checkout: none  # Do not check out all the source files.
  - task: Bash@3
    displayName: 'Download particular folder'
    env:
      SYSTEM_ACCESSTOKEN: $(System.AccessToken)
    inputs:
      targetType: inline
      script: |
        curl -X GET \
          -o TestFolder01.zip \
          -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'
This will download the 'TestFolder01' folder as a ZIP file (TestFolder01.zip) into the current working directory. You can use the unzip command to decompress it.
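For instance, assuming the unzip utility is available on the agent:

# Extract the archive into a folder named TestFolder01, overwriting existing files.
unzip -o TestFolder01.zip -d TestFolder01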
[UPDATE]
If you want to download the particular folders in a deploy job that targets your VM environment, then yes, the folders will be downloaded into the pipeline working directory on the VM.
You can think of a VM-type environment resource as a self-hosted agent installed on the VM. So, when your deploy job targets the VM environment resource, it runs on that self-hosted agent.
The pipeline working directory is under the directory where you installed the VM environment resource (self-hosted agent). Normally, you can use the variable $(Pipeline.Workspace) to get this path (see here).
stages:
- stage: Deployment
  jobs:
  - deployment: Deploy
    displayName: 'Deploy to Dev'
    environment: 'Dev.VM-01'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: Bash@3
            displayName: 'Download particular folder'
            env:
              SYSTEM_ACCESSTOKEN: $(System.AccessToken)
            inputs:
              targetType: inline
              script: |
                echo "Current working directory: $PWD"
                curl -X GET \
                  -o TestFolder01.zip \
                  -u :$SYSTEM_ACCESSTOKEN 'https://dev.azure.com/MyOrg/MyProj/_apis/git/repositories/ShellScripts/items?path=/res/TestFolder01&download=true&$format=zip&versionDescriptor.version=main&resolveLfs=true&api-version=6.0'

Create a json file during Azure DevOps build pipeline

I have an Azure DevOps build pipeline that runs a Cypress test. In that Cypress test we have a test user log in with an e-mail and password. On my local system I keep the password in a cypress.env.json file.
On the Azure build pipeline I get the message that the password is undefined, which makes sense since we put the cypress.env.json file in .gitignore so as not to expose it in the repo.
I've created an Azure variable to represent the password: $(ACCOUNT_PASSWORD)
So I think I need to create the cypress.env.json file in the build pipeline and use Azure variables for it, but I can't figure out how to create a file during the build step.
I have this task:
- task: CmdLine@2
  displayName: 'run Cypress'
  inputs:
    script: |
      npm run ci
So I need to add a task before this that creates the cypress.env.json file with the variable that represents the password:
{
  "ACCOUNT_PASSWORD": $(ACCOUNT_PASSWORD)
}
You can add a simple PowerShell script task that creates the file:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Quote the expanded variable so the resulting file is valid JSON.
      $json = '{
        "ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)"
      }'
      $json | Out-File cypress.env.json
    workingDirectory: '$(Build.SourcesDirectory)'
    pwsh: true # For Linux
In workingDirectory, set the path where you want the file to be created.
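An alternative sketch that avoids hand-writing JSON: map the secret into an environment variable and let ConvertTo-Json do the quoting and escaping (the task layout mirrors the one above; ConvertTo-Json is standard PowerShell):

- task: PowerShell@2
  env:
    # Secret variables must be mapped into the environment explicitly.
    ACCOUNT_PASSWORD: $(ACCOUNT_PASSWORD)
  inputs:
    targetType: 'inline'
    script: |
      # Build the object in PowerShell so special characters are escaped correctly.
      @{ ACCOUNT_PASSWORD = $env:ACCOUNT_PASSWORD } | ConvertTo-Json | Out-File cypress.env.json
    workingDirectory: '$(Build.SourcesDirectory)'
    pwsh: true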
If you want to create a JSON file using Azure Pipelines' Bash@3 task for Linux environments, you can do:
steps:
- task: Bash@3
  inputs:
    targetType: "inline"
    script: |
      # $(ACCOUNT_PASSWORD) is expanded by the pipeline before bash runs.
      echo '{"ACCOUNT_PASSWORD": "$(ACCOUNT_PASSWORD)"}' > server/cypress.env.json
      cd server
      echo $(ls)
      cat cypress.env.json
- task: Docker@2
  inputs:
    command: buildAndPush
    ...
The cd, echo, and cat commands are not necessary; they are only there to log output to the console so you can see where the file is and what it contains. This task may seem trivial to many, but for someone like me with little bash experience, even something this simple took quite a while to debug and get right.
Azure Pipelines runs this bash script in Build.SourcesDirectory, which is the root of your project. In the example above I have a /server folder, which is where I want to put the .json file.
Later on, I run the Docker@2 task, which builds my server. The Docker@2 task looks at my Dockerfile, which has the instruction COPY . ./. I could be wrong here, being new to Docker, but my assumption is that the agent running the pipeline executes the COPY instruction and copies the file from Build.SourcesDirectory to a destination inside the Docker container.

Task AzureStaticWebApp@0 'could not detect this directory' but it's present

I am working on building a pipeline using Azure DevOps, and I am facing a strange problem.
This is my pipeline:
- stage: 'Test'
  displayName: 'Deploy to the test environment'
  dependsOn: Build
  jobs:
  - job: 'Deploy'
    steps:
    - download: current
      artifact: lorehub-front
    - bash: cd $(Pipeline.Workspace); echo $(ls)
    - bash: cd $(Pipeline.Workspace)/lorehub-front; echo $(ls)
    - task: AzureStaticWebApp@0
      displayName: 'Publish to Azure Static WebApp'
      inputs:
        app_location: $(Pipeline.Workspace)/lorehub-front
        azure_static_web_apps_api_token: xxxx
The first bash step shows that the folder 'lorehub-front' is present.
The second bash step shows that the folder contains the correct files (index.html, etc.).
Script contents:
cd /home/vsts/work/1/lorehub-front; echo $(ls)
android-chrome-192x192.png android-chrome-512x512.png
apple-touch-icon.png css env-config.js favicon-16x16.png
favicon-32x32.png favicon.ico fonts index.html js site.webmanifest
But I am receiving this error:
App Directory Location: '/home/vsts/work/1/lorehub-front' is invalid. Could not
detect this directory. Please verify your deployment configuration
file reflects your repository structure.
App Directory Location: '/home/vsts/work/1/lorehub-front' is invalid.
To resolve this issue, change the path to /lorehub-front:
- task: AzureStaticWebApp@0
  displayName: 'Publish to Azure Static WebApp'
  inputs:
    app_location: /lorehub-front
    azure_static_web_apps_api_token: xxxx
For more detailed info, you could refer to this doc: Tutorial: Publish Azure Static Web Apps with Azure DevOps
Enter / if your application source code is at the root of the repository, or /app if your application code is in a directory called app.
In case anyone else comes across this post from Google or wherever: I had the exact same problem as Andrei above, but for the life of me I couldn't get the accepted solution to work.
No matter what I put in app_location, the task flat out refused to see any files.
Upon further investigation, I found this GitHub issue, which claims the following (emphasis mine):
The AzureStaticWebApp task's app location is relative to the current directory.
Looking at the documentation, we can see that this task uses the default working directory of $(System.DefaultWorkingDirectory), which is different from $(Pipeline.Workspace) for whatever reason.
So, if you find yourself stuck in the same position and cannot get this task to recognise your artifacts, the solution is to add cwd: $(Pipeline.Workspace) to your task, e.g.
strategy:
  runOnce:
    deploy:
      steps:
      - download: current
        artifact: WebApp
      - task: AzureStaticWebApp@0
        inputs:
          app_location: /
          skip_app_build: true
          azure_static_web_apps_api_token: $(deployment_token)
          cwd: $(Pipeline.Workspace)/WebApp

Not found scriptPath in Azure DevOps

I put a shell script file in a folder at my repo root and tried to run it in my DevOps pipeline, but it says that it cannot find the scriptPath:
[error]Not found scriptPath: /home/vsts/work/1/s/pipelines/databricks-cli-config.sh
I am simply creating a task to run the shell script, like this:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
  displayName: "Install and configure the Databricks CLI"
Any idea?
Make sure you check out your code and that you are at the correct level. If you are in a regular job, add the working directory:
- task: ShellScript@2
  inputs:
    scriptPath: 'pipelines/databricks-cli-config.sh'
    args: '$(databricks_host) $(databricks_token)'
    cwd: '$(System.DefaultWorkingDirectory)'
  displayName: "Install and configure the Databricks CLI"
And if you use it in a deployment job: by default, code is not checked out there. So you either need to publish this script as an artifact and then download it in the deployment job (deployment jobs download artifacts by default), or add a
- checkout: self
step to download the code in the deployment job.
I assumed that you use YAML.
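For the deployment-job case, a minimal sketch (the environment name is a placeholder; the checkout step is the important part):

jobs:
- deployment: ConfigureDatabricks
  environment: 'my-environment'  # placeholder
  strategy:
    runOnce:
      deploy:
        steps:
        # Deployment jobs skip checkout by default, so request it explicitly.
        - checkout: self
        - task: ShellScript@2
          inputs:
            scriptPath: 'pipelines/databricks-cli-config.sh'
            args: '$(databricks_host) $(databricks_token)'
            cwd: '$(Build.SourcesDirectory)'
          displayName: "Install and configure the Databricks CLI"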