I am using the IIS Deployment template in a Release pipeline to deploy an MVC application to a VM, and that is working fine. After deploying the application, we want to run ad-hoc SQL changes against SQL Server using script files, via the custom task Run SQLCMD Scripts from the VSTS marketplace.
In the Release pipeline the scripts are in a zip file. Can anyone suggest what we should enter in "Path to folder containing SQLCMD script files"?
You can try referencing the variable
$(Build.ArtifactStagingDirectory)
In a Release pipeline, the artifacts are downloaded to the path $(System.ArtifactsDirectory).
According to your screenshot, you are using the "Extract files" task. This task finds the zip files in $(System.ArtifactsDirectory) and extracts them.
The unzipped folder name is set in the "Extract files" task (Destination folder).
So you could try to use the following path:
$(System.ArtifactsDirectory)/Destination folder name
You can also adjust this path according to the actual location of the files.
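For illustration, here is a minimal sketch of that Extract Files step in YAML form (in a classic release the same values go into the task fields); the destination folder name sqlscripts is only an assumed example:

steps:
- task: ExtractFiles@1
  inputs:
    # find the zipped artifact that the release downloaded
    archiveFilePatterns: '$(System.ArtifactsDirectory)/**/*.zip'
    # assumed folder name; use whatever you set as the Destination folder
    destinationFolder: '$(System.ArtifactsDirectory)/sqlscripts'
# "Path to folder containing SQLCMD script files" would then be
# $(System.ArtifactsDirectory)/sqlscripts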
Hope this helps.
When you set up an Azure DevOps agent on a build machine, it has a working folder (by default _work) where it creates subfolders for each pipeline that it has to run.
These folders have integer names like "80" or "29". This makes it hard to troubleshoot issues on a given build machine when you have many pipelines, as you don't know which folder each pipeline relates to.
Is there a way to figure out the mapping from pipeline to folder number, or to name these folders more explicitly?
Renaming the folders is currently not supported in Azure DevOps.
Each pipeline maps to one folder under the agent's _work directory.
1. You could check the pipeline log to figure out which folder is your pipeline's working folder (enable system diagnostics).
2. You could also add a command line task in your pipeline to echo this directory:
echo $(System.DefaultWorkingDirectory)
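For example, here is a minimal sketch of such a step in YAML form (the classic Command Line task runs the same script); it prints the pipeline name next to its numbered work folder:

steps:
- script: |
    echo Pipeline name:             $(Build.DefinitionName)
    echo Agent.BuildDirectory:      $(Agent.BuildDirectory)
    echo Default working directory: $(System.DefaultWorkingDirectory)
  displayName: Print the numbered work folder for this pipeline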
I created a deployment group and was able to create an agent on my server (a Linux machine). The target machine was set. All I needed was to create a release pipeline, so I created one and selected the artifacts from the build (which I had already run).
For the stage I selected an empty job, then added a "Deployment Group Job", added the deployment group I created, and saved the changes.
I added another task, "Copy Files", which copies files from the artifacts to "/var/www/html" on my (Linux) server.
After this I ran the release and deployed it. It succeeded, but what it does is copy the zip file to the path I provided on my server:
/var/www/html/11.zip
That is not what I want. I want it to unzip the artifact that was built and deploy the code to the path I mention, so that I can run my application there.
I may be choosing the wrong task, but all I need is to run my application from my server. If I need to add some different task, what are those?
"If I need to add some different task, what are those?"
Since you need to deploy the unzipped files to the target path, you can use the Extract Files task in place of the Copy Files task in the deployment group job.
Here is an example:
In the Destination folder field, you can enter the target path.
This task will then unzip the zip file and place the unzipped files in the target path.
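For illustration, here is a minimal sketch of the Extract Files task inputs, written in YAML form for compactness (in the classic deployment group job, enter the same values in the task fields); the pattern assumes the zip is the artifact downloaded by the release:

steps:
- task: ExtractFiles@1
  inputs:
    # pick up the zipped artifact downloaded by the release
    archiveFilePatterns: '$(System.ArtifactsDirectory)/**/*.zip'
    # unzip straight into the web root on the target machine
    destinationFolder: '/var/www/html'
    # keep existing files in the web root; set to true to wipe it first
    cleanDestinationFolder: false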
We are using a Microsoft-hosted agent to run a build pipeline for an automated test scenario for our application. What we would like to achieve is to have our automation procedure download a file (from a headless Chrome browser), then navigate to the path where it was downloaded and open it.
How could I find the path where the files are being downloaded inside the Agent?
Here is a quick compilation of the predefined path-related variables for builds on Linux and Windows self-hosted agents, taken from the official documentation.
The one you are looking for is Agent.BuildDirectory or Pipeline.Workspace.
List of predefined variables:
| Variable type | Variable | Description | Example |
| --- | --- | --- | --- |
| Agent | Agent.BuildDirectory | The local path on the agent where all folders for a given build pipeline are created | D:\..\agent\_work\1 |
| Agent | Agent.HomeDirectory | The directory the agent is installed into | C:\agent |
| Agent | Agent.TempDirectory | A temporary folder that is cleaned after each pipeline job | D:\..\agent\_work\_temp |
| Agent | Agent.ToolsDirectory | The directory used by tasks such as Node Tool Installer and Use Python Version to switch between multiple versions of a tool | D:\..\agent\_work\_tool |
| Agent | Agent.WorkFolder | The working directory for this agent | c:\agent\_work |
| Build | Build.SourcesDirectory | The local path on the agent where your source code files are downloaded | c:\agent\_work\1\s |
| Build | Build.ArtifactStagingDirectory | The local path on the agent where any artifacts are copied to before being pushed to their destination. A typical way to use this folder is to publish your build artifacts with the Copy Files and Publish Build Artifacts tasks | c:\agent\_work\1\a |
| Build | Build.StagingDirectory | The local path on the agent where any artifacts are copied to before being pushed to their destination | c:\agent\_work\1\a |
| Build | Build.BinariesDirectory | The local path on the agent you can use as an output folder for compiled binaries | c:\agent\_work\1\b |
| Build | Build.Repository.LocalPath | The local path on the agent where your source code files are downloaded | c:\agent\_work\1\s |
| Build | Common.TestResultsDirectory | The local path on the agent where the test results are created | c:\agent\_work\1\TestResults |
| Pipeline | Pipeline.Workspace | The workspace directory for a particular pipeline | /home/vsts/work/1 |
| System | System.DefaultWorkingDirectory | The local path on the agent where your source code files are downloaded | c:\agent\_work\1\s |
When you install the agent, you specify the work directory. In pipeline tasks, you can find out exactly where within that directory files are staged with variables like $(Agent.BuildDirectory). This might not be the exact location you need, but I think it is in the right direction.
For a complete list of predefined variables, see here: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
You can check this document:
So for Linux, the default download location is /home/<username>/Downloads.
I did the test on a Microsoft-hosted agent (windows-2019) with test C# code (hint from Daniel!) like:
static void Main(string[] args)
{
    // test URL: a small NuGet package to download
    string FILEURI = "https://www.nuget.org/api/v2/package/Cauldron.Newton/2.0.0";
    // launch Chrome and hand it the URL so the file gets downloaded
    System.Diagnostics.Process prozess = new System.Diagnostics.Process();
    prozess.StartInfo.FileName = @"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe";
    prozess.StartInfo.Arguments = "--download " + FILEURI;
    prozess.Start();
    Console.WriteLine("Test starts.");
}
And then I used a command like dir c:\users\VssAdministrator\cauldron.newton.2.0.0.nupkg /s /b to find the location of the downloaded file cauldron.newton.2.0.0.nupkg.
This confirmed that the default download location of Chrome is still C:/Users/{user}/Downloads, the same as when using a self-hosted agent or downloading locally. (VssAdministrator is the user when running on a Windows hosted agent.)
So I think the Linux hosted agent should behave similarly. You can try to find your file in the /home/<username>/Downloads folder. Hope it helps.
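If you want to confirm this in your own pipeline, here is a minimal sketch of a script step for a Linux hosted agent; the fallback message covers the case where the Downloads folder has not been created yet:

steps:
- script: |
    # show the agent user's home directory and list any downloaded files
    echo "HOME is $HOME"
    ls -la "$HOME/Downloads" || echo "No Downloads folder found yet"
  displayName: Show Chrome download location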
Has anyone done a custom build with PowerShell and uploaded the artifact to the VSTS artifact directory?
I have configured the build process on the VSTS agent; the build passes and I have an artifact. We also need to encrypt the app, which is done via a CLI, and I have a PowerShell script that executes those tasks, but I am not able to upload the resulting artifacts to the VSTS directory.
Does anyone have any idea how I can achieve this goal?
The task's (i) button helps you understand which directory a task will be working in.
In the case of the PowerShell task, it runs from $(System.DefaultWorkingDirectory). If the PowerShell script is in version control, use the ... button to select the script you wish to execute.
Once your script executes, you'll probably want a Copy Files task to copy the file you just encrypted to $(Build.ArtifactStagingDirectory), something like the Copy Files step in the sketch below (obviously you'll need to modify the Contents field so it copies the encrypted file produced by the PowerShell step).
Then you're ready to publish to Azure DevOps, taking the contents of $(Build.ArtifactStagingDirectory) and making it an artifact for the build. In the sketch below this artifact is called drop.
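For reference, here is a rough YAML sketch of the three steps described above; the script path scripts/encrypt-app.ps1 and the *.encrypted pattern are placeholders for your own encryption script and its output:

steps:
- task: PowerShell@2
  inputs:
    # placeholder path; point this at the PowerShell script that encrypts the app
    filePath: 'scripts/encrypt-app.ps1'
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)'
    # placeholder pattern; match the encrypted file produced by the script
    Contents: '**/*.encrypted'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'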
Hope that helps.
I've managed to get TeamCity running and connected to Bitbucket, and the final step I'd like is to get the MVC 4 project copied into another folder on the server, ready for an xcopy deployment onto a web host.
I'm using MSBuild as the build runner.
Thanks in advance.
The preferred way is to use publishing targets in MSBuild:
1. Add a new build step with runner type MSBuild.
2. Set Build file path to your web project's .csproj file.
3. Set Targets to Clean;Build;Publish.
4. Set Command line parameters to /p:Configuration=Release;PublishDir=\\your\target\path
Hope this helps.
You could use the Command Line build runner to xcopy.
Personally I would not even copy the result to a different server.
For deployment I would have a deployment project in TeamCity that gets the required artifact via wget from the TeamCity REST API and uploads it to the hosting provider.
This can also be done with the Command Line build runner.
Under General Settings:
Click "Show advanced options".
Under Artifact Paths you can specify what you would like to put under a new folder:
**/* => target_directory
Or you can zip up your files and put them under a new folder like this:
**/* => newfolder/mypackage.zip
See more details here: https://www.jetbrains.com/help/teamcity/2019.2/configuring-general-settings.html#ConfiguringGeneralSettings-ArtifactPaths