PowerShell SVN Commit After Delete

I am writing a script to remove all files from a directory in SVN:
$svn = 'C:\Program Files\TortoiseSVN\bin\svn.exe'
$commitmsg = 'C:\TradeSupport\AB Reports\msg.txt'
$Reports = Get-Content -Path 'C:\TradeSupport\AB Reports\AB Reports.csv'
Foreach ($Report in $Reports) {
    & $svn remove --force C:\SVN\Test\$Report
    & $svn commit -F $commitmsg C:\SVN\Test\$Report
}
& $svn update $commitmsg 'C:\SVN\Test'
The files are TestA and TestB. Running the script deletes the files but does not commit the change. No error is thrown, but I have to go back and commit the change manually. What would be the best way to automate this process?
I also tried pointing the commit at the directory itself, outside the ForEach loop, but that did not work either.

I had to change the directory I was working in to C:\SVN\Test
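For reference, a minimal sketch of that fix, assuming the same paths as in the question; Push-Location makes the working copy the current directory, and the stray $commitmsg argument is dropped from the update call (an assumption about intent):
$svn = 'C:\Program Files\TortoiseSVN\bin\svn.exe'
$commitmsg = 'C:\TradeSupport\AB Reports\msg.txt'
$Reports = Get-Content -Path 'C:\TradeSupport\AB Reports\AB Reports.csv'
Push-Location 'C:\SVN\Test'   # svn commands now run inside the working copy
Foreach ($Report in $Reports) {
    & $svn remove --force $Report
    & $svn commit -F $commitmsg $Report
}
& $svn update .
Pop-Location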

Related

if then else not seeing else argument

I'm trying to teach myself some PowerShell scripting to automate some tasks at work.
The latest task I tried to automate was to create a copy of user files to a network-folder, so that users can easily relocate their files when swapping computers.
The problem is that my script always takes the first branch; it never picks the "else" option.
I'll walk you through part of the script. (I translated some words to make it easier to read)
#the script asks whether you want to create a copy, or put a copy back
$question1 = Read-Host "What would you like to do with your backup? make/put back"
if ($question1 -match 'put back')
{
    Write-Host ''
    Write-Host 'Checking for backup'
    Write-Host ''
    #check for existing backup
    if (-Not (Test-Path -LiteralPath "G:\backupfolder"))
    {
        Write-Host "no backup has been found"
    }
    elseif (Test-Path -LiteralPath "G:\backupfolder")
    {
        Write-Host "a backup has been found."
        Copy-Item -Path "G:\backupfolder\pictures\" -Destination "C:\Users\$env:USERNAME\ ....
    }
}
Above is the part where the user wants to put a "backup" back.
It checks if a "backup" exists on the G: drive. If the script doesn't see a backup folder, it says so. If the script DOES see the backup, it should copy the contents of the folders on the G: drive to the similarly named folders in the user profile folder. The problem is that so far it only ever acts as if there is no G:\backupfolder to be found. It seems I'm doing something wrong with if/then/else.
I tried with if-->Else, and with if-->Elseif, but neither works.
I also thought that it could be the Test-Path, so I tried adding -LiteralPath, but to no avail.
There is more to the script but it's just more if/then/else. If I can get it to work on this part I should be able to get the rest working. What am I not seeing/doing wrong?
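For comparison, a minimal sketch of that branch with a single Test-Path check; since if/else already covers both outcomes, the elseif re-test is unnecessary (the destination path below is an assumption, as the original line is truncated):
$backup = 'G:\backupfolder'
if (-not (Test-Path -LiteralPath $backup)) {
    Write-Host 'no backup has been found'
}
else {
    Write-Host 'a backup has been found.'
    # Destination is an assumption; the original command is truncated
    Copy-Item -Path "$backup\pictures" -Destination "C:\Users\$env:USERNAME\Pictures" -Recurse
}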

Azure DevOps Pipeline - PowerShell script, Copy Files using Variables

I am working on an Azure DevOps build pipeline, and one of the tasks is to copy my dll and pdb files into a staging folder, for example:
Code
  MyProject
    Bin
      Debug
        MyProject.dll
        MyProject.pdb
Staging
  Client
    Libraries
I want to use the PowerShell script task with an inline script.
When I use the below, it does not work:
Copy-Item $(Build.Repository.LocalPath)\Code\MyProject\Bin\$(DebugBuildConfiguration)
-Destination $(ClientLibrariesFolder)
Below are my variables
Variable Name             Variable Value
StagingFolder             $(Build.Repository.LocalPath)\Staging
DebugBuildConfiguration   Debug
ClientLibrariesFolder     $(StagingFolder)\Client\Libraries
I do not get any error, but nothing happens.
SOLUTION:
I solved my issue as follows.
I added a new variable:
CodeLocalPath : $(Build.Repository.LocalPath)
I added a PowerShell task to my Azure DevOps build pipeline.
I gave Type as Inline.
In Script I entered the below:
$destination = "{0}" -f $env:ClientLibrariesFolder
# Copy MyProject.dll to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.dll" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
# Copy MyProject.pdb to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.pdb" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
"I do not get any error, but nothing happens."
What do you mean by "nothing happens"? Do you mean that no files have been copied into your repo?
If so, that is the expected behavior of DevOps, because uploading files back to your repo is not recommended.
If you set system.debug=true in the Variables tab, you will find a log line like:
##[debug]Copy-Item C:\VS2017Agent\_work\8\s\TestSample\TestSample\Bin\Debug\*.* -Destination C:\VS2017Agent\_work\8\s\TestSample\Staging\Client\Libraries'
It will not copy the files into the repo. That should be the reason why you see nothing happen.
Besides, looking at Microsoft's documentation, the description is as follows:
$(Build.Repository.LocalPath): The local path on the agent where your source code files are downloaded. For example: c:\agent_work\1\s
By default, new build definitions update only the changed files. You can modify how files are downloaded on the Repository tab.
So, the value of this variable points to the agent, not the repo.
Note: Copy-Item in PowerShell copies files rather than a folder; use *.* to include all files in the folder.
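Putting that note together with the solution above, a minimal sketch of the copy using *.* (variable names as in the SOLUTION):
# Copy every file in the Debug output folder rather than the folder itself
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\*.*" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
Copy-Item -Path $sourcefolder -Destination $env:ClientLibrariesFolder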

Persisting PowerShell variables between steps

I am working on a VSTS release task for deploying a web application along with its database. Unfortunately, we are not creating any build definition to produce a drop folder; my client will provide the drop folder for this project. What I need is to copy the files to the VM along with creating a system-timed folder at the release level. For that, I created a folder with the help of a PowerShell task.
$FileName = (Get-Date).tostring("dd-MM-yyyy-hh-mm-ss")
$Fname = New-Item -itemType Directory -Path C:\Database -Name ("Test "+ $FileName)
Write-Host "##vso[task.setvariable variable=$Fname;]$Fname"
Write-Output ("##vso[task.setvariable variable=$Fname;]UpdatedValueInScript")
But, I’m not able to use that the above PowerShell Script output variable in next “Copy Files” task.
Note: For creating folder in VM, I followed this link
Your variable name should be static; only the value should change:
Write-Host "##vso[task.setvariable variable=Fname;]$Fname"
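A minimal sketch of the corrected step, assuming the downstream task reads the variable as $(Fname):
# PowerShell task: publish the folder path under the static name Fname
$FileName = (Get-Date).ToString('dd-MM-yyyy-hh-mm-ss')
$Fname = New-Item -ItemType Directory -Path 'C:\Database' -Name ("Test " + $FileName)
Write-Host "##vso[task.setvariable variable=Fname;]$Fname"
# Later tasks in the same job can now use $(Fname) in their inputs,
# or $env:FNAME inside another PowerShell script.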

PowerShell Delete Locked File But Keep In Memory

Until recently, we've been deploying .exe applications by simply copying them manually to the destination folder on the server. Often, though, the file was already running at the time of deployment (it is called from a SQL Server job), sometimes even in multiple instances. We don't want to kill the process while it's running, and we can't wait for it to finish because it keeps being invoked, sometimes multiple times concurrently.
As a workaround, we've done a "cut and paste" via Windows Explorer, moving the .exe file into another folder. Apparently this moves the file (effectively a delete) while keeping it in RAM, so the processes using it can continue without issue. Then we'd put the new files in place, to be picked up by any later invocation.
We've now moved to an automated deploy tool and we need an automated way of doing this.
Stop-Process -name SomeProcess
in PowerShell would kill the process, which I don't want to do.
Is there a way to do this?
(C# would also be OK.)
function moverunningprocess($process, $path)
{
    # Trim a trailing backslash from the folder path, if present
    if ($path.substring($path.length - 1, 1) -eq "\") { $path = $path.substring(0, $path.length - 1) }
    $fullpath = $path + "\" + $process
    $movetopath = $path + "--Backups\$(Get-Date -f MM-dd-yyyy_HH_mm_ss)"
    $moveprocess = $false

    # Check whether any running instance of the process was started from this path
    $runningprocess = Get-WmiObject Win32_Process -Filter "name = '$process'" | Select-Object CommandLine
    foreach ($tp in $runningprocess)
    {
        if ($tp.CommandLine -ne $null) {
            $p = $tp.CommandLine.Replace('"', '').Trim()
            if ($p -eq $fullpath) { $moveprocess = $true }
        }
    }

    # If the exe is in use, move the folder's contents to a timestamped backup folder;
    # running processes keep their mapped image, and new files can be dropped in place
    if ($moveprocess -eq $true)
    {
        New-Item -ItemType Directory -Force -Path $movetopath
        Move-Item -Path "$path\*.*" -Destination "$movetopath\"
    }
}
moverunningprocess "processname.exe" "D:\Programs\ServiceFolder"
Since you're using a SQL Server job to call the EXE, why not add a table that contains the path to the latest version of the file and modify the code that fires the EXE? That way, when a new version is rolled out, you can create a new folder, place the file in it, and update the table to point to it. Any still-active threads keep access to the old version, and any new threads pick up the new executable. You can then delete the old file once it's no longer needed.
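A rough sketch of that rollout from the deployment side; the table, column, server, and database names are all hypothetical, and Invoke-Sqlcmd comes from the SqlServer module:
# Drop the new build into a versioned folder (all names below are assumptions)
$version = Get-Date -Format 'MM-dd-yyyy_HH_mm_ss'
$target = "D:\Programs\ServiceFolder\$version"
New-Item -ItemType Directory -Force -Path $target | Out-Null
Copy-Item -Path '.\build\processname.exe' -Destination $target
# Point the hypothetical lookup table at the new copy; the SQL job reads this path
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' -Query "UPDATE dbo.LatestExePath SET ExePath = '$target\processname.exe'"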

Mercurial: Windows script to add subrepository automatically

RyanWilcox posted a script here that uses the following commands to add a subrepository automatically:
$ cd $TOP_OF_HG_PROJECT
$ addsubrepo.sh $URL_TO_HG_PROJECT relative_path/you/want_to/put_the_subrepo
How can I translate it into a Windows batch or PowerShell script?
Here is a quick and dirty translation. I haven't tested it, as I have no Mercurial around. The initial script seems easy enough to translate into PowerShell.
# Project and relative paths as script parameters
param([string]$project, [string]$relPath)
# Let's see if there is an item .hg. If not, report error and quit
if ((Test-Path ".hg") -eq $false) {
    "You MUST run this at the top of your directory structure and use relative paths"
    return
}
# Call Mercurial
& hg clone $project $relPath
# Add data to .hgsub using composite formatting string
add-content -path ".hgsub" -value $("{0} = {1}" -f $relPath, $project)
# Check that .hgsub exists and issue Mercurial commands if it does
if (Test-Path ".hgsub") {
    & hg add .hgsub
    & hg commit
} else {
    "failure, see error messages above"
}
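Saved as, say, addsubrepo.ps1 (the file name is an assumption), it would be invoked much like the original shell script:
cd $TOP_OF_HG_PROJECT
.\addsubrepo.ps1 $URL_TO_HG_PROJECT relative_path/you/want_to/put_the_subrepo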