Azure DevOps - How can I have my powershell script output data to a text file that is located on the same Repo (git) as my script on Azure - azure-devops

I've got a PowerShell script in an Azure Repos (Git) repository, and I need that script to output data to a text file in the same repo when I run the build. However, when I use the path ./textFile.txt, the text file is created on the VM box rather than in the repo.
How can I have the script output the data to the text file on the repo?

When a build is triggered in Azure DevOps, the Git repo (the specified branch) is cloned to an agent (the VM box you mention). That means the script runs, and writes its output, only on the agent.
Now, to meet the need, you have to perform the same Git actions you would on a local machine: commit the file to a new branch, push it, and continue with a pull request into the target remote branch. After the merge (which you can automate as well with the Azure DevOps REST API), the output is part of the branch that was used in the build process.
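The Git steps above can be sketched as a dry-run script step. This is only a sketch under assumptions: the branch name build-output and the commit message are made up, textFile.txt comes from the question, and the pipeline must have the "Allow scripts to access the OAuth token" option enabled for the push to authenticate.

```shell
#!/bin/sh
# Dry-run sketch of pushing the script's output back to the repo.
# Nothing destructive runs until RUN=1 is set in the pipeline environment;
# otherwise each command is only printed.
run() {
  if [ "$RUN" = "1" ]; then "$@"; else echo "would run: $*"; fi
}

run git checkout -b build-output      # new branch for the generated file
run git add textFile.txt              # file produced by the PowerShell script
run git commit -m "Add build output"
run git push origin build-output
# then open a pull request from build-output into the target branch;
# the PR and the merge can be automated with the Azure DevOps REST API
```

Running it without RUN=1 simply prints the commands, which makes the step safe to test before wiring it into the build.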

Related

Azure DevOps - Pipeline to pull code down to remote server

I am attempting to create a pipeline in our Azure DevOps org that will automatically 'pull' code down to a single remote server whenever a push is made to the master branch of my particular repo. I am having a difficult time understanding the entire process and what I actually need to accomplish this relatively simple pipeline.
Currently, my remote server has a folder on the C: drive with various .ps1 files. I would like to turn this into my repo and install the self-hosted agent on this same server, so that any time I push something to the master branch from my local machine it will automatically be pulled down to my remote server, and the scheduled tasks I have running will be running the most up-to-date code.
I believe what I need to do first is install a self-hosted agent on my remote server. I am not completely sure, though, whether this agent is supposed to be a deployment agent or a build agent, or both. Since I am not technically building a project, but rather simply overwriting .ps1 files, I imagine it should only be given permissions for a deployment agent.
Something else I can't wrap my head around is how I specify the location of my repo on the remote server. Can I define this dynamically, or do I need to specify the target path of that specific repo?
According to your description, you could simplify your requirement to: copy files from a source folder to a target folder on a remote machine over SSH using the Copy Files Over SSH task, and then run the related Git commands, like the following:
cd repo_directory
git add .
git commit -m "update"
git push
This way the remote repo is updated via the SSH Deployment task.
In addition, you need to deploy a self-hosted build agent, because it can be used in the build pipeline.
Finally, configure a scheduled trigger for this build pipeline.
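In a YAML pipeline, the scheduled trigger could look like the following sketch (the cron expression and branch name are assumptions):

```yaml
# Hypothetical scheduled trigger: run the sync nightly against master
schedules:
- cron: "0 3 * * *"
  displayName: Nightly repo sync
  branches:
    include:
    - master
  always: true   # run even if there are no new code changes
```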

How can I check-in files in VSO with azure pipelines

I am trying to automate runtime-generated files through the pipeline. I need to force a check-in to the current DEV branch, but that step should be skipped in the UAT / PROD branches. Is that possible?
What I have done?
I added a command line task in the pipeline that calls an exe file, which generates the class files and copies them into the project directory.
What needs to be done?
The file copied to the destination location should be automatically checked in to the dev branch.
Thanks
First of all, storing files in source control during a pipeline is not recommended; it may pollute your source code.
Build artifacts should be pushed to a build drop or uploaded to an artifacts feed, not put back into source control.
If you insist on this, you could use a PowerShell script to handle the check-in process.
Not sure which kind of version control you are using; kindly check the samples below:
For TFVC:
How can I check-in files in TFS with azure pipelines
For GIT:
Azure DevOps pipeline task to update a file and check-in TFS
check-in current DEV branch but that step should skip in UAT / PROD branches.
You could specify the target branch in your script as needed.
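As a sketch of that branch check (the branch names and the GeneratedFiles directory are assumptions; Azure DevOps exposes Build.SourceBranchName to scripts as the environment variable BUILD_SOURCEBRANCHNAME):

```shell
#!/bin/sh
# Hypothetical sketch: commit generated files only on the DEV branch,
# skip on UAT / PROD and anything else.
should_checkin() {
  case "$1" in
    dev|DEV) return 0 ;;   # check in on DEV
    *)       return 1 ;;   # skip everywhere else
  esac
}

if should_checkin "$BUILD_SOURCEBRANCHNAME"; then
  git add GeneratedFiles/
  git commit -m "Add runtime-generated class files [skip ci]"
  git push origin "HEAD:$BUILD_SOURCEBRANCHNAME"
else
  echo "Skipping check-in for branch: $BUILD_SOURCEBRANCHNAME"
fi
```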

How to use Azure Powershell Datafactory cmdlets against specific branches in Azure DevOps repositories?

I have a Data Factory in Azure DevOps Repos.
I am trying to use Azure PowerShell cmdlets to create a trigger, such as the following:
The problem is, when I run the above command, the changes are not showing on the Az DevOps Repo. So the trigger I just created does not appear in the Repo.
As a workaround, I have been able to manually create JSON files in my local git repo > merge with local master > push to online master and then the trigger works just fine and the JSON file is also visible in the Az DevOps Repo.
But how to instruct the Powershell cmdlets to work on a specific branch? Because when I just use PS cmdlets, whatever is added (trigger/dataset/pipeline) does not show up online in the Azure DevOps Repo.
So, the question is: where do cmdlets like Set-AzDataFactoryV2Trigger operate/make their changes, and can that be changed to work against specific branches in a Git repo?
I believe that when you run PowerShell commands against a Data Factory which is connected to Git, the updates are made to the "live mode" of ADF and not to the Git version of the factory.
Have you tried switching to "live mode" to see if your triggers are there?

How to download files from self hosted VM to VSTS

I have a Python solution which resides in a VSTS repository. Using a build pipeline and a private agent, the source code gets copied to the VM.
After executing the Python files, the output is stored in 3 different files at the source directory level.
I want to download/copy these output files from the privately hosted VM to the VSTS repository.
How can this be achieved?
Thank you
The only way to get something into the repository is by checking it in via source control.
Maybe it's enough for you to just publish these files as a build artifact. You have the option to publish directly to VSTS or to any Windows file share.
If you really want these files in your repository I'd suggest you publish them as build artifacts and check them in with a release pipeline. You could add a new stage in your existing release pipeline or add a new release pipeline that triggers automatically every time your build completes.
You can call git commands to add and push the changes to the repository, for example:
Check the Allow scripts to access the OAuth token option
Add a Command Line task (Tool: git; Arguments: add [file path]; Working folder: $(System.DefaultWorkingDirectory))
Add a Command Line task (Tool: git; Arguments: commit -m "add build result"; Working folder: $(System.DefaultWorkingDirectory))
Add a Command Line task (Tool: git; Arguments: push https://test:$(System.AccessToken)@{account}.visualstudio.com/{project}/_git/{repository} HEAD:master; Working folder: $(System.DefaultWorkingDirectory))
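Note the shape of that push URL: in a Git HTTPS URL, the credentials (here a throwaway user name and the OAuth token) are separated from the host by @. As a small sketch, with {account}, {project} and {repository} still placeholders for your own organization values:

```shell
#!/bin/sh
# Build the authenticated push URL used in the last command-line task.
# $1 = OAuth token, $2 = account, $3 = project, $4 = repository.
# The user name before the token ("test") is arbitrary.
build_push_url() {
  echo "https://test:$1@$2.visualstudio.com/$3/_git/$4"
}

# In the pipeline the token comes from $(System.AccessToken), e.g.:
# git push "$(build_push_url "$SYSTEM_ACCESSTOKEN" myaccount myproject myrepo)" HEAD:master
```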
Related article: Keep Git repository in sync between VSTS / TFS and Git
On the other hand, the better way is to publish the result files as a build artifact through the Publish Build Artifacts task.
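As a sketch, the Publish Build Artifacts step in YAML could look like this (the output path and artifact name are assumptions):

```yaml
# Hypothetical step: publish the three output files as a build artifact
steps:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.SourcesDirectory)/output'
    ArtifactName: 'script-output'
    publishLocation: 'Container'   # store the artifact in VSTS/Azure Pipelines
```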

Do I need to import code to VSTS?

I have my repository at Bitbucket. Now, for CI, I am creating a build definition on VSTS; please note that due to team constraints I need to continue with Bitbucket, so I have configured VSTS to trigger a build when changes are made to the master branch.
Now, for placing files such as .nuspec files and PowerShell scripts (those I need for the build process), do I need to import code from the repository to VSTS? I ask because when I go to Code > Files, it shows "Project is empty. Add some code!"
It's unnecessary to import files/code into VSTS.
If you need to specify files (such as .nuspec or .ps1 etc.) in your build definition, you can specify files from your Bitbucket repo or from the build agent machine.
If the files are already managed in your Bitbucket repo, you can select them directly.
If the files are not managed in your Bitbucket repo, there are two options you can follow:
Option 1: copy the files into build agent machine
If you do not use the Hosted agent for your CI build, you can copy the files to a directory on the build agent machine that the agent can access. For example, copy test.ps1 to C:\test\test.ps1 on the agent machine; then you can specify the PowerShell file with the path C:\test\test.ps1 in your build definition.
Option 2: add the files into your bitbucket repo
You can also commit the files to your Bitbucket repo and then specify them with a relative path. For example, add the file test.ps1 to your Bitbucket repo's mytest directory (rootRepo\mytest\test.ps1); then you can specify the PowerShell file with the path mytest\test.ps1 in your build definition.
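A YAML sketch of option 2, pointing the PowerShell task at the repo-relative path (the task version is an assumption; the directory name follows the example above):

```yaml
# Hypothetical step: run a script checked into the repo's mytest directory
steps:
- task: PowerShell@2
  inputs:
    targetType: 'filePath'
    filePath: 'mytest/test.ps1'
```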