I have a problem when building a C++ project in an Azure Pipeline: access is denied on some DLL files.
So I need to run a batch script that stops the services using these DLLs.
I tried running my script as a pre-build event in Visual Studio, but it executes after the Initialize Job task, so it doesn't work.
Is there any way to run a script in Initialize Job?
I am afraid there is no such way to run a script in Initialize Job at the moment. Prepare job/Initialize job are part of the predefined work built into the pipeline; we cannot add a custom script in or before those jobs.
So, to resolve this issue, we have to find the reason for the error and address it.
Generally, this error most often appears when your build and release agent goes offline or a build is interrupted by an issue on the machine itself, leaving specific files that were created mid-flight inside the Azure DevOps directory. When Azure DevOps/TFS then re-attempts a new build and tries to write to or recreate files that already exist, it fails and the above error is displayed.
The best resolution is to log in to the agent machine manually, navigate to the affected directory/file (in this example, C:\VSTS\_work\xxx\xx\.tmp) and delete the file/folder in question. Removing the offending items effectively gives the next run of the build definition a "clean slate", and it should then complete without issue.
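If logging in manually every time is not practical, a hedged alternative is to delete the stale files at the start of the pipeline itself. Below is a minimal sketch of a YAML step using the built-in DeleteFiles task; the folder and file pattern are assumptions and should be adjusted to whatever actually gets left behind on your agent:

steps:
- task: DeleteFiles@1
  displayName: Remove stale temp files from the agent work folder
  inputs:
    SourceFolder: $(Agent.BuildDirectory)   # assumed location of the leftover files
    Contents: '**/*.tmp'                    # illustrative pattern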
Hope this helps.
I had to solve this exact same problem. The solution is not ideal, but it works.
I created two pipelines. The first pipeline does any required pre-build steps, like stopping services. The second pipeline is the actual build pipeline, and it gets triggered when the first pipeline finishes. (See the build triggers section in pipeline #2.)
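If you move to YAML, the same chaining can be sketched with a pipeline resource trigger in pipeline #2. The pipeline name and branch below are placeholders for your own:

# Pipeline #2 (the actual build). It starts automatically when pipeline #1 completes.
resources:
  pipelines:
  - pipeline: prebuild                 # local alias for the upstream pipeline
    source: StopServices-PreBuild      # placeholder: the name of pipeline #1
    trigger:
      branches:
        include:
        - main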
Related
Hoping someone could help me out with an issue I'm running into. I have 4 different pipelines set up, with the first triggering the second upon build completion and so on down the line. The triggers are not kicking off after the previous pipeline's build completes as they are supposed to. They're also all on the same branch, so I'm at a loss as to what to do. Any ideas? These are classic pipelines, not YAML.
First, you need to make sure that your MPV Automated Testing Step 1 pipeline runs successfully, because a failed run will not trigger the Build completion trigger.
I tested two pipelines on the same branch. On my side, build completion trigger works well.
In addition, there was a recent availability degradation event for Azure DevOps that could have affected these services; it has since been resolved. If you want to know more, please click here. You can try again and see whether the problem still exists.
I'm running a deployment on Azure Pipelines and the release got stuck while copying files. This resulted in a never-ending process, which is not desirable.
The error I received was:
ERROR 32 (0x00000020) Copying File
C:\azagent\A2_work\r3\a_mach1-light\build\MyProject\HtmlRenderer.dll
The process cannot access the file because it is being used by another
process.
I think this issue is not a permanent one since I've been able to successfully run the same release pipeline several times in the past but I want to figure out why it happens.
I've read several docs about Azure Pipelines and troubleshooting on Microsoft's website, but none has been useful.
I've also tried to rerun the build pipeline so it could create another release but the issue still persists.
I'd be happy to get advice from anyone who has experienced this before.
Thanks
You can try restarting the "Azure Pipelines Agent (MyCompanyName.MyMachineName)" service on the machine that hosts your deployment agent.
For more troubleshooting steps:
Detect files and folders in use
You can use tools like Process Monitor and Process Explorer to find out what process is using this specific file. See this thread.
Anti-virus exclusion
Anti-virus software scanning your files can cause file or folder in use errors during a build or release. Adding an anti-virus exclusion for your agent directory and configured "work folder" may help to identify anti-virus software as the interfering process.
If you use MSBuild in your pipeline:
MSBuild and /nodeReuse:false
MSBuild and /maxcpucount:[n]
Check this document for more information on these troubleshooting steps; a sketch of passing these flags to the build task follows.
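If the build uses the Visual Studio Build task in YAML, the flags could be passed roughly like this (the solution pattern is a placeholder):

- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    msbuildArgs: '/nodeReuse:false /maxcpucount:1'   # avoid lingering MSBuild node processes holding file handles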
I have an Azure DevOps pipeline build that has several steps, and the build is long. Every time there is something wrong with the build, we review the logs and either identify issues or come up with theories. For a theory, we have to insert a diagnostic command line (such as listing a directory or showing the contents of a file) between the steps; for a fix, we add the fix but have to wait for the whole pipeline to rerun to find out whether it worked. This makes fixing build issues very time-consuming.
If we had access to the state of the agent for an unfinished build and could just log on using RDP or any other terminal to check the contents and state of the files on disk, that would save us a lot of hours.
Is there any way with Azure DevOps to do any diagnostic of this type?
No, if you are using a hosted agent. If you are using a self-hosted agent, you can obviously log in to it. You can, however, implement steps that run only when the build has failed and have them capture the information you are interested in (say, publish the state of the build directory).
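As a rough YAML sketch (the artifact name and published folder are illustrative), a final step could publish the working tree only when something earlier in the job failed:

steps:
- publish: $(Build.SourcesDirectory)   # or whatever directory you want to inspect
  artifact: failed-build-state
  condition: failed()                  # runs only if a previous step failed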
If you are using Azure DevOps Services, there is a new REST API version out that will let you do a "preview" run of changes to the YAML definitions: https://learn.microsoft.com/en-us/azure/devops/release-notes/2020/sprint-165-update#preview-fully-parsed-yaml-document-without-committing-or-running-the-pipeline
I have a build process with a couple of test tasks. Some of them become quite time-consuming when they all run, and most of the time most of the tests aren't needed.
Still, I would like to have ALL these tests run on a scheduled trigger.
I know I could simply clone the pipeline and use one for gating with impacted tests only and the other one for the schedule with all tests, but as an OO developer, I don't like this.
I already tried linking the checkbox parameter to a process variable and modifying it using PowerShell, but couldn't get it to work (How can I modify a process variable using Powershell in a Azure build pipeline).
Isn't there any other way of doing this?
You may be able to do this by setting the following condition on the test tasks that you'd only like to run during the scheduled build:
eq(variables['Build.Reason'], 'Schedule')
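In a YAML pipeline that could look roughly like this; the VSTest task and assembly patterns are only placeholders for your long-running test tasks (in the classic editor the same expression goes into the task's Control Options > Custom condition):

- task: VSTest@2
  displayName: Full test suite (scheduled builds only)
  condition: eq(variables['Build.Reason'], 'Schedule')
  inputs:
    testAssemblyVer2: |
      **\*Tests.dll
      !**\obj\**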
See here for a list of predefined variables (search for 'Build.Reason'):
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
See here for more information on expressions:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops
It looks like functionality for this is now built in. According to the docs, a variable can be set that causes all tests to be run:
By setting a build variable. Even after TIA has been enabled in the VSTest task, it can be disabled for a specific build by setting the variable DisableTestImpactAnalysis to true. This override will force TIA to run all tests for that build. In subsequent builds, TIA will go back to optimized test selection
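For instance, in a YAML pipeline the override could simply be declared as a variable (this assumes the VSTest task already has Test Impact Analysis enabled):

variables:
  DisableTestImpactAnalysis: true   # forces TIA to run every test for this build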
How to get the path to a script (not procedure) in Azure DevOps build?
I'm trying to get the path to either my solution or project file to use in Visual Studio database project PostDeployment.
Works locally
In the Script.PostDeployment.sql file in a Visual Studio database project, I have the following code:
DECLARE @solutionDir NVARCHAR(260);
DECLARE @File NVARCHAR(260);
SELECT @solutionDir = REPLACE('$(SolutionPath)', 'MySolution.sln', '');
SET @File = @solutionDir + 'myScript.sql';
-- and here I can use @File
When I do a local publish or build, I can use the variable/macro $(SolutionPath), which gives me the full local path to the solution.
With that I can point to the script I need to access.
Fails in Azure DevOps
But the Azure DevOps build doesn't have the $(SolutionPath) macro, and I get the following error:
SCRIPT.POSTDEPLOYMENT.SQL(17,32,17,32): Build error SQL72008: Variable SolutionPath is not defined.
So I need a way (in AzureDevops) in the build step to get the path to the scripts.
What I have tried
I have tried all kinds of macros that just don't seem to work.
I have tried to work with predefined variables
I tried to use SqlCommandVariableOverride but it seems to have gone missing from the database project some time ago.
Questions
How can I add $(SolutionPath) (or something else) to my build step so this works both locally and in Azure DevOps build?
Is there another way to get a solution or project url to the postdeployment SQLCMD file?
Is MSBuild Extension Pack something I should look at? I will if it is my only option.
P.S.
I also created a ticket for this in Azure Pipelines Tasks, since they don't seem to be that active in answering and I can't wait.
P.P.S.
I asked the same question at developercommunity.visualstudio.com. Let's hope somebody there will be able to answer it; if so, I will update this question with the answer.
So first, double-check that your build is publishing the artifacts; there should be a blue button in the top right corner of your build that says "Artifacts". To have access to the artifact, you need to run an archive task after the build and point it at
$(Build.ArtifactStagingDirectory)
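A minimal YAML sketch of that, using the built-in ArchiveFiles and PublishBuildArtifacts tasks; the folder being archived is a placeholder for whatever your build actually produces:

steps:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: $(Build.BinariesDirectory)   # placeholder: the build output to package
    includeRootFolder: false
    archiveType: zip
    archiveFile: $(Build.ArtifactStagingDirectory)/drop.zip
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: $(Build.ArtifactStagingDirectory)
    ArtifactName: drop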
Then I would recommend not using SQL to make direct references to the build server's file system. If this is absolutely necessary, you should use environment variables.
I would recommend setting up a release pipeline for deployment of the SQL script. Release pipelines are really designed for deployment, whereas builds are for compilation.
To configure the release, point it to the artifact; you can then access that artifact via $(System.DefaultWorkingDirectory)/**/*.zip and run whatever commands you want to process it.