Persisting PowerShell variables between steps - powershell

I am working on a VSTS release task for deploying a web application along with a database. Unfortunately, we are not creating any build definition to produce the drop folder; my client will provide the drop folder for this project. What I need is to copy the files to the VM along with creating a system-timed folder at release level. For that I created a folder with the help of a PowerShell task:
$FileName = (Get-Date).tostring("dd-MM-yyyy-hh-mm-ss")
$Fname = New-Item -itemType Directory -Path C:\Database -Name ("Test "+ $FileName)
Write-Host "##vso[task.setvariable variable=$Fname;]$Fname"
Write-Output ("##vso[task.setvariable variable=$Fname;]UpdatedValueInScript")
But I'm not able to use the above PowerShell script's output variable in the next "Copy Files" task.
Note: For creating the folder in the VM, I followed this link.

Your variable name should be static; only the value should change:
Write-Host "##vso[task.setvariable variable=Fname;]$Fname"

if then else not seeing else argument

I'm trying to teach myself some PowerShell scripting to automate tasks at work.
The latest task I tried to automate was copying user files to a network folder, so that users can easily relocate their files when swapping computers.
The problem is that my script always takes the first branch; it never picks the "else" option.
I'll walk you through part of the script. (I translated some words to make it easier to read.)
#the script asks whether you want to create a copy, or put a copy back
$question1 = Read-Host "What would you like to do with your backup? make/put back"
if ($question1 -match 'put back')
{
    Write-Host ''
    Write-Host 'Checking for backup'
    Write-Host ''
    #check for existing backup
    if (-Not (Test-Path -LiteralPath "G:\backupfolder"))
    {
        Write-Host "no backup has been found"
    }
    elseif (Test-Path -LiteralPath "G:\backupfolder")
    {
        Write-Host "a backup has been found."
        Copy-Item -Path "G:\backupfolder\pictures\" -Destination "C:\Users\$env:USERNAME\ ....
    }
}
Above is the part that runs when a user wants to put a "backup" back.
It checks whether a "backup" exists on the G: drive. If the script doesn't see a backup folder, it says so. If the script DOES see the backup, it should copy the contents of the folders on the G: drive to the similarly named folders in the user's profile folder. The problem: so far it always acts as if there is no G:\backupfolder to be found. It seems I'm doing something wrong with if/then/else.
I tried with if-->Else, and with if-->Elseif, but neither works.
I also thought that it could be the Test-Path, so I tried adding -LiteralPath, but to no avail.
There is more to the script but it's just more if/then/else. If I can get it to work on this part I should be able to get the rest working. What am I not seeing/doing wrong?
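One way to narrow this down is to run the path check on its own, outside the if/else (the path below is the one from the question):
# Run these two lines alone to see what the script actually observes.
# If Test-Path prints False even though the folder exists in Explorer, the G:
# drive mapping is likely not visible to the session running the script.
Test-Path -LiteralPath "G:\backupfolder"
Get-PSDrive -Name G -ErrorAction SilentlyContinue
If Test-Path returns True here but the script still takes the "no backup" branch, the branching logic is at fault; if it returns False, the drive mapping is.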

Azure DevOps Pipeline - PowerShell script, Copy Files using Variables

I am working on an Azure DevOps build pipeline, and one of the tasks is to copy my .dll and .pdb files into a staging folder, for example:
Code
  MyProject
    Bin
      Debug
        MyProject.dll
        MyProject.pdb
Staging
  Client
    Libraries
I want to use a PowerShell script task with an inline script.
When I use the following, it does not work:
Copy-Item $(Build.Repository.LocalPath)\Code\MyProject\Bin\$(DebugBuildConfiguration)
-Destination $(ClientLibrariesFolder)
Below are my variables:
Variable Name              Variable Value
StagingFolder              $(Build.Repository.LocalPath)\Staging
DebugBuildConfiguration    Debug
ClientLibrariesFolder      $(StagingFolder)\Client\Libraries
I do not get any error, but nothing happens.
SOLUTION:
I solved my issue as follows.
I added a new variable:
CodeLocalPath : $(Build.Repository.LocalPath)
I added a PowerShell task to my Azure DevOps build pipeline.
I set its Type to Inline.
In the Script field I used:
$destination = "{0}" -f $env:ClientLibrariesFolder
# Copy MyProject.dll to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.dll" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
# Copy MyProject.pdb to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.pdb" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
"I do not get any error. But nothing happens."
What do you mean by "nothing happens"? Do you mean that no files have been copied into your repo?
If so, that is the correct behavior of DevOps, because uploading files back to your repo is not recommended.
If you set system.debug=true in the Variables tab, you will find a log line like:
##[debug]Copy-Item C:\VS2017Agent\_work\8\s\TestSample\TestSample\Bin\Debug\*.* -Destination C:\VS2017Agent\_work\8\s\TestSample\Staging\Client\Libraries
It will not copy the files to the repo. That should be why nothing seems to happen.
Besides, looking at Microsoft's documentation, the description is as follows:
$(Build.Repository.LocalPath): The local path on the agent where your source code files are downloaded. For example: c:\agent_work\1\s
By default, new build definitions update only the changed files. You can modify how files are downloaded on the Repository tab.
So, the value of this variable points to the agent, not to the repo.
Note: Copy-Item in PowerShell should copy files rather than the folder itself; try using *.* to include all files in the folder.
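A minimal sketch of the corrected inline script (variable names as defined above; the *.* wildcard is the key change):
# Copy every file from the Debug output folder to the staging folder
$source      = "{0}\Code\MyProject\Bin\{1}\*.*" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
$destination = $env:ClientLibrariesFolder
"Source : {0} and Destination : {1}" -f $source, $destination
Copy-Item -Path $source -Destination $destination
The files land in the staging folder on the agent's disk; they will not show up in the repository.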

Make a .json file more flexible with variables for automated deployment

I've got a PowerShell script that creates a VM from an image in Azure, and in this script I reference a .json file (parameters for the VM, etc.). But if I want to create more than one VM, the names of the VM, VNet, etc. cannot be the same for every execution (they have to be in the same resource group).
So my question: how can I insert variables into the .json file to change the name of the VM, etc. for every execution? Or perhaps I have to rethink my approach?
A very basic approach could be something like this:
# Grab the file contents
$contents = Get-Content -Path $templateFile
# Update some tokens in the file contents
$contents = $contents.replace("original value", "new value")
# Push the updated contents to a new file
Set-Content -Path $updatedFile -Value $contents
If you have a value that changes with every deployment, you could also consider using the -TemplateParameterObject parameter with the New-AzureRmResourceGroupDeployment cmdlet. That way, you can generate the values in your PowerShell script without having to output them to a .json file first.
For more details, have a look at the cmdlet specs.
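A rough sketch of that approach (the resource group, template file, and parameter names are all assumptions; match them to your template):
# Generate per-deployment names in the script
$suffix = (Get-Date).ToString("yyyyMMddHHmmss")
$parameters = @{
    vmName   = "myvm-$suffix"      # hypothetical parameter names
    vnetName = "myvnet-$suffix"
}
# Pass the generated values directly; no intermediate .json file needed
New-AzureRmResourceGroupDeployment -ResourceGroupName "MyResourceGroup" `
    -TemplateFile $templateFile `
    -TemplateParameterObject $parameters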

PowerShell Delete Locked File But Keep In Memory

Until recently, we've been deploying .exe applications by simply copying them manually to the destination folder on the server. Often, though, the file was already running at the time of deployment (it is called from a SQL Server job), sometimes even in multiple instances. We don't want to kill the process while it's running, and we can't wait for it to finish because it keeps being invoked, sometimes multiple times concurrently.
As a workaround, we've done a "cut and paste" via Windows Explorer, moving the .exe file into another folder. Apparently this moves the file (effectively a delete) while the running processes keep their loaded image, so they can continue without issues. Then we'd put the new file in its place, which would get picked up by any later invocation.
We've now moved to an automated deploy tool and we need an automated way of doing this.
Stop-Process -name SomeProcess
in PowerShell would kill the process, which I don't want to do.
Is there a way to do this?
(C# would also be OK.)
function moverunningprocess($process, $path)
{
    # Normalize the folder path (strip a trailing backslash)
    if ($path.Substring($path.Length - 1, 1) -eq "\") { $path = $path.Substring(0, $path.Length - 1) }
    $fullpath = $path + "\" + $process
    $movetopath = $path + "--Backups\$(Get-Date -f MM-dd-yyyy_HH_mm_ss)"
    $moveprocess = $false
    # Find running instances of the process and compare their command line
    # to the full path of the file we want to replace
    $runningprocess = Get-WmiObject Win32_Process -Filter "name = '$process'" | Select-Object CommandLine
    foreach ($tp in $runningprocess)
    {
        if ($tp.CommandLine -ne $null)
        {
            $p = $tp.CommandLine.Replace('"', '').Trim()
            if ($p -eq $fullpath) { $moveprocess = $true }
        }
    }
    # Only move if the file is actually in use; the move (like Explorer's
    # cut-and-paste) leaves the running image usable
    if ($moveprocess -eq $true)
    {
        New-Item -ItemType Directory -Force -Path $movetopath
        Move-Item -Path "$path\*.*" -Destination "$movetopath\"
    }
}
moverunningprocess "processname.exe" "D:\Programs\ServiceFolder"
Since you're using SQL Server to call the EXE, why not add a table that contains the path to the latest version of the file and modify the code that fires the EXE? That way, when a new version is rolled out, you can create a new folder, place the file in it, and update the table to point to it. Any still-active threads keep access to the old version, and any new threads pick up the new executable. You can then delete the old file once it's no longer needed. A rough sketch of that rollout is below.
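A PowerShell sketch of the versioned-folder idea (the server, database, table, and file names are all hypothetical; Invoke-Sqlcmd requires the SqlServer module):
# Create a new versioned folder and drop the new build into it
$version = Get-Date -Format "yyyyMMdd_HHmmss"
$newFolder = "D:\Programs\ServiceFolder\$version"
New-Item -ItemType Directory -Path $newFolder | Out-Null
Copy-Item -Path ".\drop\processname.exe" -Destination $newFolder
# Point the lookup table the SQL job reads at the new version;
# running instances keep using the old path until they exit
Invoke-Sqlcmd -ServerInstance "MyServer" -Database "Ops" `
    -Query "UPDATE dbo.ExePath SET Path = '$newFolder\processname.exe';"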

How do I transfer my build output files to an Azure VM using PowerShell DSC?

I've been toying around with DSC and I think it's an awesome platform. I made a few tests to automate the deployment of our TFS build outputs and to automatically install the web applications and configure an environment.
This was relatively easy, as I could pass my drop folder path to the DSC script using a file share on our internal network, and use relative paths inside the configuration to select each of our modules.
My problem now is how to extend this to Azure virtual machines. We want these scripts to deploy automatically to our QA and production servers, which are hosted on Azure. Since they are not in our domain, I can't use the File resource to transfer the files anymore, but I want exactly the same functionality: I'd like to somehow point the configuration to our build output folder and copy the files from there to the virtual machines.
Is there some way I can copy the drop folder files easily from inside a configuration that is run on these remote computers, without sharing the same network and domain? I successfully configured the VMs to accept DSC calls over https using certificates, and I just found out that the Azure PowerShell cmdlets enable you to upload a configuration to Azure storage and run it in the VMs automatically (which seems a lot better than what I did) but I still don't know how I'd get access to my build outputs from inside the virtual machine when the configuration script is run.
I ended up using the Publish-AzureVMDscConfiguration cmdlet to create a local zip file, appending my build outputs to the zip, and then publishing the zip, along these lines:
function Publish-AzureDscConfiguration
{
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory)]
        [string] $ConfigurationPath
    )
    Begin {}
    Process
    {
        $zippedConfigurationPath = "$ConfigurationPath.zip";
        Publish-AzureVMDscConfiguration -ConfigurationPath:$ConfigurationPath -ConfigurationArchivePath:$zippedConfigurationPath -Force
        $tempFolderName = [System.Guid]::NewGuid().ToString();
        $tempFolderPath = "$env:TEMP\$tempFolderName";
        $dropFolderPath = "$tempFolderPath\BuildDrop";
        try
        {
            Write-Verbose "Creating temporary folder and symbolic link to build outputs at '$tempFolderPath' ...";
            New-Item -ItemType:Directory -Path:$tempFolderPath;
            # New-Symlink / Remove-ReparsePoint are not built-in cmdlets
            # (they ship with, e.g., the PowerShell Community Extensions)
            New-Symlink -LiteralPath:$dropFolderPath -TargetPath:$PWD;
            Invoke-Expression ".\7za a $tempFolderPath\BuildDrop.zip $dropFolderPath -r -x!'7za.exe' -x!'DscDeployment.ps1'";
            Write-Verbose "Adding component files to DSC package in '$zippedConfigurationPath'...";
            Invoke-Expression ".\7za a $zippedConfigurationPath $dropFolderPath.zip";
        }
        finally
        {
            Write-Verbose "Removing symbolic link and temporary folder at '$tempFolderPath'...";
            Remove-ReparsePoint -Path:$dropFolderPath;
            Remove-Item -Path:$tempFolderPath -Recurse -Force;
        }
        Publish-AzureVMDscConfiguration -ConfigurationPath:$zippedConfigurationPath -Force
    }
    End {}
}
By using a zip inside the zip used by Azure, I can access the inner contents in the working directory of the PowerShell DSC Extension (in the DSCWork folder). I tried just adding the drop folder directly to the zip (without zipping it first), but then the DSC Extension copies the folder to the modules path, thinking it is a module.
I'm not completely happy with this solution yet, and I'm having a few problems already, but it makes sense in my mind and should work fine.
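For the consuming side, a minimal sketch of how the inner BuildDrop.zip could be expanded on the VM once it lands in the DSC working directory, using the built-in Archive resource (the configuration name, parameter, and destination path are assumptions):
Configuration DeployBuildDrop
{
    param([string] $DropZipPath)   # hypothetical: path to the inner BuildDrop.zip in the DSC working folder
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Node localhost
    {
        # Expand the build outputs onto the VM
        Archive BuildDrop
        {
            Path        = $DropZipPath
            Destination = "C:\Deploy\BuildDrop"   # hypothetical target folder
            Ensure      = "Present"
        }
    }
}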