I source multiple artifacts in a release pipeline and apply multiple transformations before publishing the files to a Service Fabric cluster. Is there a way to look at/debug the content just before publishing, so I can verify my transformations are working correctly? I am thinking of connecting to Azure Storage and publishing the files there to have a look. Is there a better way to inspect the content before publishing?
Also, is there a way to look at locked (secured) variable content?
Add a command line / shell script step and run whatever commands you want to investigate the file system.
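For example, a quick inspection step might look like the sketch below (a YAML sketch only; the file path under drop/ is hypothetical, so substitute whatever you actually want to look at):

steps:
- pwsh: |
    # List everything in the working directory so you can verify the transformed output
    Get-ChildItem -Recurse "$(System.DefaultWorkingDirectory)"
    # Dump a transformed file to the log (hypothetical path, replace with your own)
    Get-Content "$(System.DefaultWorkingDirectory)/drop/web.config"
  displayName: 'Inspect files before publishing'

In a classic release pipeline, the same commands go into a PowerShell task placed just before the publish step.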
Related
I have an Azure DevOps pipeline build with several steps, and the build is long. Every time something is wrong with the build, we review the logs and either identify the issue or come up with a theory. For a theory, we have to insert a diagnostic command line (such as listing a directory or showing the contents of a file) between the steps; for a fix, we add the fix but then have to wait for the whole pipeline to rerun to find out. This is causing us to spend a lot of time fixing build issues.
If we had access to the state of the agent for an unfinished build and could just log on using RDP or any other terminal to check the contents and state of the files on disk, that would have saved us a lot of hours.
Is there any way to do diagnostics of this type with Azure DevOps?
No, not if you are using a hosted agent. If you are using a self-hosted agent, you can obviously log in to that one. You can, however, implement steps that run only if the build failed, and those steps can attempt to capture the information you are interested in (say, publish the state of the build directory).
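A minimal sketch of such a capture step, assuming you publish the source directory as an artifact named failure-diagnostics:

steps:
# ... your normal build steps here ...
- task: PublishBuildArtifacts@1
  displayName: 'Capture build directory on failure'
  condition: failed()
  inputs:
    PathtoPublish: '$(Build.SourcesDirectory)'
    ArtifactName: 'failure-diagnostics'

The condition: failed() expression makes the step run only when an earlier step has failed, so it costs nothing on green builds.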
If you are using Azure DevOps Services, there is a new REST API version that lets you do a "preview" run of changes to YAML definitions: https://learn.microsoft.com/en-us/azure/devops/release-notes/2020/sprint-165-update#preview-fully-parsed-yaml-document-without-committing-or-running-the-pipeline
I'm very interested in the options that the az devops CLI opens up for us.
I'd like to be able to generate a YAML file locally and run it from that local file using the "az pipelines run" command. Is this possible?
It would allow very fast iteration when creating pipelines. At present we make updates to a YAML file in the repo, commit, run, and then review (which isn't as smooth a process as it could be).
Thanks
Unfortunately, this is not possible.
There is a suggestion under review on the Visual Studio Developer Community.
How to get the path to a script (not procedure) in Azure DevOps build?
I'm trying to get the path to either my solution or project file to use in a Visual Studio database project post-deployment script.
Works locally
In the Script.PostDeployment.sql file in a Visual Studio Database project I have the following code
DECLARE @solutionDir NVARCHAR(260), @File NVARCHAR(260);
SELECT @solutionDir = REPLACE('$(SolutionPath)', 'MySolution.sln', '');
SET @File = @solutionDir + 'myScript.sql';
-- and here I can use @File
When I do a local publish or build, I can use the variable/macro $(SolutionPath) to get the full local path to the solution.
With that I can point to the script I need to access.
Fails in Azure DevOps
But the Azure DevOps build doesn't have the $(SolutionPath) macro, and I get the following error:
SCRIPT.POSTDEPLOYMENT.SQL(17,32,17,32): Build error SQL72008: Variable SolutionPath is not defined.
So I need a way (in Azure DevOps) in the build step to get the path to the scripts.
What I have tried
I have tried all kinds of macros that just don't seem to work.
I have tried to work with predefined variables
I tried to use SqlCommandVariableOverride but it seems to have gone missing from the database project some time ago.
Questions
How can I add $(SolutionPath) (or something else) to my build step so this works both locally and in an Azure DevOps build?
Is there another way to get a solution or project path into the post-deployment SQLCMD file?
Is MSBuild Extension Pack something I should look at? I will if it is my only option.
P.S.
I also created a ticket for this in Azure Pipelines Tasks, since they don't seem to be very active in answering and I can't wait.
P.P.S.
I asked the same question at developercommunity.visualstudio.com. Let's hope somebody there will be able to answer it; if so, I will update this question with the answer.
First, double-check that your build is publishing the artifacts; there should be a blue button in the top-right corner of your completed build that says "Artifacts". To have access to the artifact, you need to run an archive task after the build and point it at
$(Build.ArtifactStagingDirectory)
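A sketch of that archive-and-publish pair in YAML (the artifact name drop and the temporary zip location are assumptions, not fixed requirements):

steps:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Agent.TempDirectory)/$(Build.BuildId).zip'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Agent.TempDirectory)/$(Build.BuildId).zip'
    ArtifactName: 'drop'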
Then I would recommend not making direct references to the build server's filesystem from SQL. If this is absolutely necessary, pass the path in explicitly, for example via environment variables or a SQLCMD variable override.
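If you deploy the compiled dacpac with SqlPackage.exe, one way to pass the path in is a SQLCMD variable override via the /v: switch. This is only a sketch, under the assumption that SolutionPath is declared as a SQLCMD variable in the .sqlproj; the dacpac name and the DbConnectionString variable are placeholders:

steps:
- script: >
    SqlPackage.exe /Action:Publish
    /SourceFile:"$(Build.ArtifactStagingDirectory)\MyDatabase.dacpac"
    /TargetConnectionString:"$(DbConnectionString)"
    /v:SolutionPath="$(Build.SourcesDirectory)"
  displayName: 'Publish dacpac with SolutionPath override'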
I would recommend setting up a release pipeline for deployment of the SQL script. Release pipelines are really designed for deployment, whereas builds are for compilation.
To configure the release, point it at the artifact; you can then access the artifact via $(System.DefaultWorkingDirectory)/**/*.zip and run whatever commands you need to process it.
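In YAML form, the unzip step could look like this sketch (a classic release configures the same task inputs in the GUI; the unpacked folder name is an assumption):

steps:
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(System.DefaultWorkingDirectory)/**/*.zip'
    destinationFolder: '$(System.DefaultWorkingDirectory)/unpacked'
    cleanDestinationFolder: true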
I'm new to Azure DevOps and I'm trying to understand how to package a release of a PowerShell script project I'm working on.
I'm previously familiar with GitHub and the manual process of drafting a new release of my project repo. I'm now experimenting with Azure DevOps, and what I want to achieve is output similar to GitHub's, where my repo of PowerShell scripts is packaged into a zip file that I can publish as a release.
I'm not familiar with the pipeline process in Azure DevOps, or with YAML, as a newbie to proper release-cycle tools. Previously I've just created scripts and shared them as they are, or dropped them into a GitHub repo and manually packaged a release. I'm not likely to be turning out large numbers of builds, so I've never had to come at this from an automated standpoint, which seems to be the way Azure is driving me, unless I'm missing something?
It's pretty simple. I prefer to do this using the old-fashioned GUI (hint: there is a link when starting a new Build Pipeline that says Use the classic editor), and then convert to YAML after I get my Build Pipeline working.
1) Create your standard Build Pipeline.
2) Add the step that zips your files (an Archive Files task).
3) Set the properties on that Archive step: specify the source to zip and the target where you want the zip file to end up.
4) Lastly, convert that single step to a YAML step by clicking in the upper-right corner on the link View YAML.
There are a lot of steps I am leaving out, but I hope this points you in the right direction.
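For reference, the View YAML output for the archive step comes out roughly like this (the source and target paths are assumptions; point rootFolderOrFile at wherever your scripts live):

- task: ArchiveFiles@2
  displayName: 'Archive PowerShell scripts'
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/scripts.zip'
    replaceExistingArchive: true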
I have a PowerShell script that I want to re-use across multiple build pipelines. My question is, is there a way I can "store" or "save" my PowerShell script at the project or organization scope so that I can use it in my other build pipelines? If so, how? I can't seem to find a way to do this. It would be super handy though.
It is now possible to check out multiple repositories in one YAML pipeline. You could place your script in one repository and check it out in the pipeline of any other repository, then reference the script directly in the pipeline workspace.
More info here.
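A minimal sketch, assuming the shared script lives in an Azure Repos Git repository named shared-scripts in the same project (the repo, project, and script names are placeholders):

resources:
  repositories:
  - repository: scripts          # alias used by the checkout step below
    type: git                    # Azure Repos Git
    name: MyProject/shared-scripts

steps:
- checkout: self
- checkout: scripts
- pwsh: ./shared-scripts/Invoke-Shared.ps1

When multiple repositories are checked out, each lands in a folder named after the repository under $(Build.SourcesDirectory), which is why the script is called via ./shared-scripts/.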
Depending on how big these scripts are, you can create task groups that contain PowerShell tasks with the script as inline PowerShell. But this only works at project scope.
Another approach I'd try would be to create a repo containing your PowerShell scripts, add that repo as a submodule of the repository you are trying to build, and then call the scripts from the submodule folder (sketched below). But this only works when using Git repos.
Or you could create a custom build task that contains your script.
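For the submodule option, a minimal sketch (the submodule path scripts/ and the script name are assumptions):

steps:
- checkout: self
  submodules: true               # also fetch the scripts submodule
- pwsh: ./scripts/Invoke-Shared.ps1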
From what I have seen, no.
A few different options I have explored are:
If using a non-hosted agent, saving the file onto the build server. Admittedly this doesn't scale well, but it is better than copy/pasting the script all over. I was able to put these scripts into version control and deploy them via their own pipeline, so that might be a solution for scaling (if necessary).
Cloning another repository that has these shared scripts during the process.
I've been asking for this feature for a bit, but it seems the Azure DevOps team has higher priorities.
How about putting the PowerShell in a NuGet package and installing that in the depending projects?
I just discovered YAML templates (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azdevops#step-re-use).
I think it may help you in this case (depending on how large your file is): you can put an inline PowerShell script in that template YAML and reuse it from your main YAML.
Documentation is pretty straightforward.
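A minimal sketch, assuming a template file named shared-steps.yml committed alongside the pipeline (the file and step names are placeholders):

# shared-steps.yml -- the reusable template
steps:
- pwsh: |
    Write-Host "Re-usable PowerShell logic goes here"
  displayName: 'Shared PowerShell step'

# azure-pipelines.yml -- the pipeline consuming the template
steps:
- template: shared-steps.yml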