Missing step to analyse my project in SonarCloud - azure-devops

I have to create an Azure pipeline to install and deploy my Angular project. One of the steps is not working; in other words, the project was not found in SonarCloud.
My pipeline is:
parameters:
- name: name
  default: '[EPP] Front Client'
  type: string
- name: serviceName
  default: 'EPP_Client'
  type: string
- name: version
  default: ''
  type: string

jobs:
- job: Build_push_and_deploy_${{ parameters.serviceName }}
  displayName: Build push and deploy ${{ parameters.name }}
  steps:
  - task: AzureKeyVault@1
    inputs:
      azureSubscription: '$(azureKeyVaultServiceConnection)'
      KeyVaultName: '$(azureKeyVaultName)'
      SecretsFilter: '*'
      RunAsPreJob: false
  - task: Docker@2
    displayName: 'Build front image'
    inputs:
      containerRegistry: '$(containerRegistryServiceConnection)'
      repository: 'front'
      command: 'build'
      sonarQubeRunAnalysis: true
      arguments: >-
        --build-arg ENVFILE=$(environment)
        --cache-from $(azAcr)/front:latest
      addPipelineData: false
      tags: |
        $(tag)
        latest
  - task: Docker@2
    displayName: 'Login on ACR $(containerRegistryServiceConnection)'
    inputs:
      command: login
      containerRegistry: $(containerRegistryServiceConnection)
  - task: Docker@2
    displayName: 'Push front image'
    inputs:
      containerRegistry: '$(containerRegistryServiceConnection)'
      repository: 'front'
      command: 'push'
      tags: |
        $(tag)
  - task: HelmInstaller@0
    displayName: 'Install Helm and Kubectl'
    inputs:
      helmVersion: '$(helmVersion)'
      checkLatestHelmVersion: false
      installKubectl: true
      kubectlVersion: '$(kubectlVersion)'
  - task: HelmDeploy@0
    displayName: 'Deploy EPP-front service'
    inputs:
      connectionType: 'Kubernetes Service Connection'
      kubernetesServiceConnection: '$(kubernetesServiceConnection)'
      namespace: 'epp'
      command: 'upgrade'
      chartType: 'FilePath'
      chartPath: 'helm/chart'
      releaseName: 'epp-front-service'
      overrideValues: 'image.tag=$(tag)'
      valueFile: 'helm/values_$(environment).yaml'
      arguments: >-
        --timeout=15m0s
        $(helmExtraArgs)
  - task: SonarQubePrepare@5
    displayName: 'Prepare analysis configuration'
    inputs:
      SonarQube: 'EPP'
      organization: orga
      scannerMode: 'CLI'
      configMode: 'manual'
      cliProjectKey: 'orga_${{parameters.serviceName}}'
      cliProjectName: 'orga_${{parameters.serviceName}}'
      cliSources: 'src/app'
      extraProperties: |
        sonar.projectKey=orga_${{parameters.serviceName}}
        sonar.projectName=orga_${{parameters.serviceName}}
        sonar.typescript.tsconfigPath=src/app/tsconfig.json
        sonar.sources=src/app
        sonar.test=src/app
        sonar.test.inclusions=src/app/**/*.spec.ts
        sonar.exclusions=**/node_modules/*
        sonar.javascript.lcov.reportPaths=src/app/coverage/lcov.info,src/app/coverage-e2e/lcov.info
  - task: SonarQubeAnalyze@5
    displayName: 'Run Code Analysis'
  - task: SonarQubePublish@5
    displayName: 'Publish results on build summary'
    inputs:
      pollingTimeoutSec: '300'
On the other side, 'orga' is configured as the organization, and I have already analysed other backend projects.
Is there a step that is missing? Or a test step?
PS: all the other steps work properly when running this pipeline.
Error log:
2022-09-29T15:18:35.9334213Z ERROR: Error during SonarScanner execution
2022-09-29T15:18:35.9334796Z ERROR: Could not find a default branch to fall back on.
2022-09-29T15:18:35.9335298Z ERROR:
2022-09-29T15:18:35.9336129Z ERROR: Re-run SonarScanner using the -X switch to enable full debug logging.
2022-09-29T15:18:36.2621538Z ##[error]The process '/SonarQubeAnalyze_6d01813a-9589-4b15-8491-8164aeb38055/5.8.0/sonar-scanner/bin/sonar-scanner' failed with exit code 2
2022-09-29T15:18:36.2678083Z ##[section]Finishing: Run Code Analysis
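For comparison, when the analysis targets SonarCloud rather than a self-hosted SonarQube server, the SonarCloud extension provides dedicated tasks that take the organization explicitly, and the "Could not find a default branch to fall back on" error is typically reported when the scanner cannot match the project key to a project that already exists in the organization. A minimal sketch (the 'EPP' service connection name and the project key are taken from the question; whether the project must be provisioned first depends on the organization's settings):

```yaml
# Sketch of the SonarCloud-specific tasks; verify that the project key
# below matches an existing project in the 'orga' organization.
- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'EPP'                 # SonarCloud service connection (assumed name)
    organization: 'orga'
    scannerMode: 'CLI'
    configMode: 'manual'
    cliProjectKey: 'orga_EPP_Client'  # must exist in SonarCloud (or be auto-provisioned)
    cliSources: 'src/app'
- task: SonarCloudAnalyze@1
- task: SonarCloudPublish@1
  inputs:
    pollingTimeoutSec: '300'
```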


AzureDevops Pipeline Templates From Folder

I am trying to use templates in my pipeline that are in the same project folder, but validating the YAML and running the pipeline gives me an error:
/templates/transform-settings.yml (Line: 1, Col: 1): A sequence was not expected
Here is the relevant part of azure-pipelines.yml and the template:
imagePullSecret: ' fd bfgbgf '
dockerfilePath: '**/Dockerfile'
vmImageName: 'ubuntu-latest'
testVar: 'test'
tag: '$(Build.BuildId)'
ConnectionStrings.DefaultConnection: ""

stages:
- template: templates/transform-settings.yml
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: Docker@2
      inputs:
        command: buildAndPush
        repository: $(imageRepository)
        dockerfile: $(dockerfilePath)
        containerRegistry: $(dockerRegistryServiceConnection)
…and the template:
- stage: TransformFiles
  displayName: TransformFiles
  variables:
  - ${{ if eq(variables['Build.SourceBranchName'], 'development') }}:
    - group: dev-secrets
    - name: testVar
      value: 'dev'
    - name: ConnectionStrings.DefaultConnection
      value: $(psql-conn-str-dev)
  - ${{ if eq(variables['Build.SourceBranchName'], 'qa') }}:
    - group: qa-secrets
    - name: testVar
      value: 'qa'
  jobs:
  - job: Transform_AppSettings
    steps:
    - bash: echo "===== Transforming appsettings.json for $(variables['Build.SourceBranchName']) environment ====="
      displayName: 'File Transform'
    - task: FileTransform@1
      inputs:
        folderPath: '$(System.DefaultWorkingDirectory)'
        fileType: 'json'
        targetFiles: 'appsettings.json'
    - upload: appsettings.json
      artifact: appsettings
/templates/transform-settings.yml (Line: 1, Col: 1): A sequence was not expected
Based on your YAML sample, the issue is related to the format of the template file.
To solve it, add the stages: field at the top of the template YAML file.
For example:
azure-pipelines.yml
stages:
- template: templates/transform-settings.yml
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: Docker@2
      inputs:
        command: buildAndPush
        repository: $(imageRepository)
        dockerfile: $(dockerfilePath)
        containerRegistry: $(dockerRegistryServiceConnection)
transform-settings.yml
stages:
- stage: TransformFiles
  displayName: TransformFiles
  variables:
  - ${{ if eq(variables['Build.SourceBranchName'], 'development') }}:
    - group: dev-secrets
    - name: testVar
      value: 'dev'
    - name: ConnectionStrings.DefaultConnection
      value: $(psql-conn-str-dev)
  - ${{ if eq(variables['Build.SourceBranchName'], 'qa') }}:
    - group: qa-secrets
    - name: testVar
      value: 'qa'
  jobs:
  - job: Transform_AppSettings
    steps:
    - bash: echo "===== Transforming appsettings.json for $(variables['Build.SourceBranchName']) environment ====="
      displayName: 'File Transform'
    - task: FileTransform@1
      inputs:
        folderPath: '$(System.DefaultWorkingDirectory)'
        fileType: 'json'
        targetFiles: 'appsettings.json'
    - upload: appsettings.json
      artifact: appsettings
For more detailed info, you can refer to this doc: YAML template.

Conditional dependson for multiple stage YAML

I have a YAML pipeline which contains some template files.
Within my pipeline, there are 4 stages that run in parallel to apply DSC. I then have a destroy stage which I would like to run only when all 4 stages have run successfully. When I try to add a dependsOn with a list:
dependsOn:
- Stage_A
- Stage_B
- Stage_C
- Stage_D
The error I get is:
The 'dependsOn' parameter is not a valid String.
My template YAML looks like:
...
stages:
...
- template: Apply-DSC.yml
  parameters:
    azureSub: '[sub]'
    AutoAccountResourceGroup: 'rg'
    AutoAccountName: 'aa'
    environment: 'b1'
    stageDependsOn: 'b1_apply'
- template: Destroy-Pipeline.yml
  parameters:
    azureSub: '[sub]'
    terraformStorageAccountResourceGroup: 'rg'
    terraformStorageAccountName: '[]'
    terraformStorageContainerName: '[]'
    terraformStorageRemoteStateKey: '[].tfstate'
    environment: 'b1'
    terraformEnvironmentFileName: 'B01'
    dependsOn: 'Stage_A'
I have 4 stages within my Apply-DSC.yml:
Stage_A
Stage_B
Stage_C
Stage_D
The question is: is it possible for my destroy stage to await a successful deployment of stages A-D when using these stage templates?
Thanks.
Edit: Adding Destroy-Pipeline.yml
# Run & upload Terraform plan
parameters:
- name: azureSub
  type: string
- name: terraformStorageAccountResourceGroup
  type: string
- name: terraformStorageAccountName
  type: string
- name: terraformStorageContainerName
  type: string
- name: terraformStorageRemoteStateKey
  type: string
- name: environment
  type: string
- name: terraformEnvironmentFileName
  type: string
- name: dependsOn
  type: string

stages:
- stage: Destroy_${{ parameters.environment }}
  dependsOn: ${{ parameters.dependsOn }}
  jobs:
  - deployment: '${{ parameters.environment }}_Destroy'
    displayName: '${{ parameters.environment }} Destroy'
    environment: '${{ parameters.environment }} destroy'
    pool:
      vmImage: windows-latest
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: 'drop'
            name: 'Download_Terraform_code'
            displayName: 'Download Terraform code'
          - task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
            inputs:
              terraformVersion: '$(TerraformVersion)'
            displayName: 'Install Terraform'
          - task: TerraformCLI@0
            inputs:
              command: 'init'
              workingDirectory: '$(Pipeline.Workspace)/Drop'
              backendType: 'azurerm'
              backendServiceArm: '${{ parameters.azureSub }}'
              backendAzureRmResourceGroupName: '${{ parameters.terraformStorageAccountResourceGroup }}'
              backendAzureRmStorageAccountName: '${{ parameters.terraformStorageAccountName }}'
              backendAzureRmContainerName: '${{ parameters.terraformStorageContainerName }}'
              backendAzureRmKey: '${{ parameters.terraformStorageRemoteStateKey }}'
              allowTelemetryCollection: false
            displayName: 'Terraform Init'
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                terraform workspace select $(WorkspaceEnvironment)
              workingDirectory: '$(Pipeline.Workspace)/Drop'
            displayName: 'Select Workspace'
          - task: TerraformCLI@0
            inputs:
              command: 'plan'
              environmentServiceName: '${{ parameters.azureSub }}'
              commandOptions: '-destroy -var-file="./environments/${{ parameters.terraformEnvironmentFileName }}.tfvars" -input=false'
              allowTelemetryCollection: false
              workingDirectory: '$(Pipeline.Workspace)/Drop'
            displayName: 'Plan Destroy'
          - task: TerraformCLI@0
            inputs:
              command: 'destroy'
              workingDirectory: '$(Pipeline.Workspace)/Drop'
              environmentServiceName: '${{ parameters.azureSub }}'
              commandOptions: '-var-file="./environments/${{ parameters.terraformEnvironmentFileName }}.tfvars" -input=false'
              allowTelemetryCollection: false
            displayName: 'Run Destroy'
I changed the parameter type from string to object:
parameters:
- name: dependsOn
  type: object
  default: []
Then within my template block I added the object like:
- template: Destroy-Pipeline.yml
  parameters:
    ...
    dependsOn: ['Stage_A', 'Stage_B' ...]
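On the template side, the stage can consume the object parameter directly, since a stage's dependsOn accepts a sequence of stage names. A minimal sketch of the corresponding Destroy-Pipeline.yml fragment (other parameters from the original template omitted):

```yaml
parameters:
- name: environment
  type: string
- name: dependsOn
  type: object
  default: []

stages:
- stage: Destroy_${{ parameters.environment }}
  # Expands to the full list passed in, e.g. [Stage_A, Stage_B, Stage_C, Stage_D],
  # so the stage waits for every listed stage to succeed.
  dependsOn: ${{ parameters.dependsOn }}
```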

set and refer to variables in yaml

I have the following yml code that sets and refers to some variables as follows:
<one.yml>
- task: AzurePowerShell@4
  displayName: 'Copy functions templates'
  inputs:
    azureSubscription: ${{parameters.serviceConnection}}
    ScriptPath: ${{ parameters.root }}/Scripts/ReleaseManagement/CopyChildTemplatesToContainer.ps1
    ScriptArguments: '-resourceGroupName ''${{ parameters.solutionAbbreviation}}-data-${{ parameters.environmentAbbreviation}}'''
  name: copyFunctionsTemplates
- powershell: |
    Write-Host "##vso[task.setvariable variable=data_containerSASToken;isOutput=true]$(copyFunctionsTemplates.containerSASToken)"
    Write-Host "##vso[task.setvariable variable=data_containerEndPoint;isOutput=true]$(copyFunctionsTemplates.containerEndPoint)"
  displayName: 'set data output variables'
  name: dataVariables
<two.yml>
stages:
- stage: ${{ parameters.stageName }}
  dependsOn: ${{ parameters.dependsOn }}
  condition: ${{ parameters.condition }}
  jobs:
  - deployment: ${{ parameters.stageName }}_DeployResources
    displayName: ${{ parameters.stageName }}_DeployResources
    steps:
    - template: one.yml
  - job: ${{ parameters.stageName }}_DeployFunctions
    dependsOn: ${{ parameters.stageName }}_DeployResources
    variables:
      data_containerEndPoint: $[ dependencies.DeployResources.outputs['DeployResources.dataVariables.data_containerEndPoint'] ]
      data_containerSASToken: $[ dependencies.DeployResources.outputs['DeployResources.dataVariables.data_containerSASToken'] ]
    steps:
    - ${{ each func in parameters.functionApps }}:
      - template: three.yml
<three.yml>
steps:
- task: AzureResourceGroupDeployment@2
  displayName: 'deploy ${{ parameters.name }} data resources'
  inputs:
    azureSubscription: ${{parameters.serviceConnection}}
    resourceGroupName: ${{parameters.solutionAbbreviation}}-data-${{parameters.environmentAbbreviation}}
    location: ${{parameters.location}}
    csmFile: ${{ parameters.root }}/functions_arm_templates/${{ parameters.name }}/Infrastructure/data/template.json
    csmParametersFile: ${{ parameters.root }}/functions_arm_templates/${{ parameters.name }}/Infrastructure/data/parameters/parameters.${{parameters.environmentAbbreviation}}.json
    overrideParameters: -environmentAbbreviation "${{parameters.environmentAbbreviation}}"
      -tenantId "${{parameters.tenantId}}"
      -solutionAbbreviation "${{parameters.solutionAbbreviation}}"
      -containerBaseUrl "$(data_containerEndPoint)functions/${{ parameters.name }}/Infrastructure/data/"
      -containerSasToken "$(data_containerSASToken)"
    deploymentMode: 'Incremental'
On enabling the debug mode while running pipeline, I see values printed for data_containerSASToken and data_containerEndPoint from the task 'Copy functions templates' however I see empty values from the task 'deploy data resources'. What am I missing?
Your problem may be in how you retrieve the output from the previous job:
data_containerEndPoint: $[ dependencies.DeployResources.outputs['DeployResources.dataVariables.data_containerEndPoint'] ]
That's looking for a prior job called DeployResources, but the prior job is actually called ${{ parameters.stageName }}_DeployResources.
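A sketch of what the corrected mapping in two.yml might look like, using the parameterized job name on both sides of the dependency (untested; the stageName parameter is taken from the question, and the exact outputs prefix can vary for deployment jobs depending on the strategy used):

```yaml
- job: ${{ parameters.stageName }}_DeployFunctions
  dependsOn: ${{ parameters.stageName }}_DeployResources
  variables:
    # Both the dependencies key and the outputs prefix must use the actual
    # (compile-time expanded) job name, not the bare "DeployResources".
    data_containerEndPoint: $[ dependencies.${{ parameters.stageName }}_DeployResources.outputs['${{ parameters.stageName }}_DeployResources.dataVariables.data_containerEndPoint'] ]
    data_containerSASToken: $[ dependencies.${{ parameters.stageName }}_DeployResources.outputs['${{ parameters.stageName }}_DeployResources.dataVariables.data_containerSASToken'] ]
```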

How to pass a list as build parameter to a YAML template in Azure DevOps Server 2019 (on-prem)?

For example, consider the following template (named xyz.yml, for example):
parameters:
  projects: ['p1', 'p2', 'p3']
steps:
- ${{ each project in parameters.projects }}:
  - task: VSBuild@1
    displayName: Build ${{ project }}
    inputs:
      solution: ${{ project }}.sln
      ...
Now, suppose I have the following azure-pipelines.yml file:
...
steps:
...
- template: xyz.yml
  parameters:
    projects: ???
...
How can I feed the projects template parameter from a build variable? Suppose at the time of the build I want to request building just p1 and p3. How can I do it?
You could try to use a stepList-type parameter and pass the same parameter value to the template.
For example:
main.yaml:
parameters:
- name: mySteplist
  type: stepList
  default:
  - task: CmdLine@2
    inputs:
      script: |
        echo Write your commands here
        echo Hello world1
  - task: CmdLine@2
    inputs:
      script: |
        echo Write your commands here
        echo Hello world2

trigger:
- none

steps:
- template: stepstem.yml
  parameters:
    buildSteps:
    - ${{ parameters.mySteplist }}
# - template: stepstem.yml
#   parameters:
#     buildSteps:
#     - bash: echo Test #Passes
#       displayName: succeed
#     - bash: echo "Test"
#       displayName: succeed
#     - ${{ parameters.mySteplist }}
- task: CmdLine@2
  inputs:
    script: |
      echo Write your commands here
      echo Hello world3
stepstem.yaml:
parameters:
- name: buildSteps  # the name of the parameter is buildSteps
  type: stepList    # data type is StepList
  default: []
steps:
- ${{ parameters.buildSteps }}
- task: CmdLine@2
  inputs:
    script: |
      echo Write your commands here
      echo Hello world tem
- script: echo "hello"
So, you could use VSBuild@1 tasks as the default parameter value and change it when you queue the build.
Check the following example:
#xyz.yml
parameters:
  projects: []
steps:
- ${{ each project in parameters.projects }}:
  - task: VSBuild@1
    displayName: Build ${{ project }}
    inputs:
      solution: ${{ project }}.sln
      ...
...
#azure-pipelines.yml
steps:
- template: xyz.yml
  parameters:
    projects: ["p1", "p3"]
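If the goal is to drive the list from a single string instead of editing YAML, one possible approach is to pass a comma-separated value and split it at template-expansion time. This is an untested sketch: the split template-expression function exists in newer Azure DevOps versions, and its availability on Azure DevOps Server 2019 would need to be verified; projectsCsv is a hypothetical parameter name.

```yaml
# azure-pipelines.yml (sketch, assumes the split() expression function is available)
parameters:
- name: projectsCsv
  type: string
  default: 'p1,p3'

steps:
- template: xyz.yml
  parameters:
    # split() turns 'p1,p3' into the sequence ['p1', 'p3'] at compile time
    projects: ${{ split(parameters.projectsCsv, ',') }}
```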

Parameterized variable names as task input in Azure Pipelines

I've been trying to make a YAML template that first uses the AzureKeyVault@1 task to get the value of some Azure Key Vault secrets, and then uses these secrets for the sqlUsername and sqlPassword in a SqlAzureDacpacDeployment@1 task.
I want to make the names of the Key Vault secrets a parameter, so that this template can be used in many different situations.
I've successfully used this technique before, with an AzureKeyVault@1 task and then an AzurePowerShell@4 task, where the secret gets injected as an environment variable for the PowerShell script.
This is a dressed down version of the working template:
parameters:
- name: subscription
  type: string
- name: keyVaultSecretName
  type: string
- name: keyVault
  type: string

jobs:
- job: Run_PowerShell_With_Secret
  pool:
    name: Azure Pipelines
    vmImage: windows-latest
  steps:
  - task: AzureKeyVault@1
    inputs:
      azureSubscription: ${{ parameters.subscription }}
      keyVaultName: ${{ parameters.keyVault }}
      secretsFilter: ${{ parameters.keyVaultSecretName }}
  - task: AzurePowerShell@4
    inputs:
      azureSubscription: ${{ parameters.subscription }}
      ScriptPath: 'some_script.ps1'
      azurePowerShellVersion: LatestVersion
    env:
      SECRETVALUE: $(${{ parameters.keyVaultSecretName }})
And here is the template where I can't get the same technique to work:
parameters:
- name: subscription
  type: string
- name: keyVault
  type: string
- name: sqlServerName
  type: string
- name: sqlDatabaseName
  type: string
- name: sqlServerAdminSecretName
  type: string
- name: sqlServerPasswordSecretName
  type: string
- name: dacpacName
  type: string
- name: artifactName
  type: string

jobs:
- job: Deploy_SQL_Database
  pool:
    name: Azure Pipelines
    vmImage: windows-latest
  steps:
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: ${{ parameters.artifactName }}_artifacts
  - task: AzureKeyVault@1
    inputs:
      azureSubscription: ${{ parameters.subscription }}
      keyVaultName: ${{ parameters.keyVault }}
      secretsFilter: '${{ parameters.sqlServerAdminSecretName }}, ${{ parameters.sqlServerPasswordSecretName }}'
  - task: SqlAzureDacpacDeployment@1
    inputs:
      azureSubscription: ${{ parameters.subscription }}
      ServerName: ${{ parameters.sqlServerName }}.database.windows.net
      DatabaseName: ${{ parameters.sqlDatabaseName }}
      sqlUsername: $(${{ parameters.sqlServerAdminSecretName }})
      sqlPassword: $(${{ parameters.sqlServerPasswordSecretName }})
      DacpacFile: $(Pipeline.Workspace)\${{ parameters.dacpacName }}.dacpac
I can get the template to work if I hardcode the secret names:
parameters:
- name: subscriptionName
  type: string
- name: keyVault
  type: string
- name: sqlServerName
  type: string
- name: sqlDatabaseName
  type: string
- name: dacpacName
  type: string
- name: artifactName
  type: string

jobs:
- job: Deploy_${{ parameters.sqlDatabaseName }}_Database
  pool:
    name: Azure Pipelines
    vmImage: windows-latest
  steps:
  - checkout: none
  - task: AzureKeyVault@1
    inputs:
      azureSubscription: ${{ parameters.subscriptionName }}
      keyVaultName: ${{ parameters.keyVault }}
      secretsFilter: 'SQLServerAdmin, SQLServerPassword'
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: ${{ parameters.artifactName }}_artifacts
  - task: SqlAzureDacpacDeployment@1
    inputs:
      azureSubscription: ${{ parameters.subscriptionName }}
      ServerName: ${{ parameters.sqlServerName }}.database.windows.net
      DatabaseName: ${{ parameters.sqlDatabaseName }}
      sqlUsername: $(sqlServerAdmin)
      sqlPassword: $(sqlServerPassword)
      DacpacFile: $(Pipeline.Workspace)\${{ parameters.dacpacName }}.dacpac
Although this works for us for now, I find it sub-optimal. Is there any way I can make these parameterized variable names work?
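One pattern worth trying (an untested sketch, not a confirmed fix) is to add a level of indirection with job-level variables: the ${{ }} expressions are expanded at compile time into ordinary $(secretName) macros, so the task inputs can reference fixed variable names while the secret names stay parameterized. The variable names sqlUser and sqlPass below are hypothetical.

```yaml
# Sketch: indirection via job variables (assumed behavior, verify before relying on it)
jobs:
- job: Deploy_SQL_Database
  variables:
    # compile-time expansion leaves e.g. $(SQLServerAdmin) to be resolved at runtime
    sqlUser: $(${{ parameters.sqlServerAdminSecretName }})
    sqlPass: $(${{ parameters.sqlServerPasswordSecretName }})
  steps:
  - task: SqlAzureDacpacDeployment@1
    inputs:
      sqlUsername: $(sqlUser)
      sqlPassword: $(sqlPass)
```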