Mercurial: Windows script to add subrepository automatically - powershell

RyanWilcox posted a script here that adds a subrepository automatically with the following commands:
$ cd $TOP_OF_HG_PROJECT
$ addsubrepo.sh $URL_TO_HG_PROJECT relative_path/you/want_to/put_the_subrepo
How can I translate it into a Windows batch or PowerShell script?

Here is a quick and dirty translation. I haven't tested it as I have no Mercurial around, but the initial script seems easy enough to translate into PowerShell.
# Project and relative paths as script parameters
param([string]$project, [string]$relPath)
# Let's see if there is an item .hg. If not, report error and quit
if ((Test-Path ".hg") -eq $false) {
    "You MUST run this at the top of your directory structure and use relative paths"
    return
}
# Call Mercurial
& hg clone $project $relPath
# Add data to .hgsub using a composite format string
Add-Content -Path ".hgsub" -Value $("{0} = {1}" -f $relPath, $project)
# Check that .hgsub exists and issue Mercurial commands if it does
if (Test-Path ".hgsub") {
    hg add .hgsub
    hg commit
} else {
    "failure, see error messages above"
}

Related

In PowerShell script, how can I restore packages to resolve error: This project references NuGet package(s) that are missing on this computer

I am tasked with writing a PowerShell script to download the latest source code for a given branch, rebuild it and deploy it. The script I've written is able to download projects, and in most cases has been able to rebuild them. But with one web project I get this error:
error : This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is ..\packages\Microsoft.Net.Compilers.2.0.1\build\Microsoft.Net.Compilers.props.
I've researched whether PowerShell has an Update-Package command like the one available in the VS command prompt but have been unable to find an equivalent.
I know there are cmdlets for packages, but from what I've seen they're used to update a specific package...what I need is to have it download/update all packages referenced in the project.
Some points that might be of interest...
When I get latest on the solution to a new empty folder, the only thing in the packages folder is Modernizr.2.6.2. This is the same whether I'm getting latest in VS or in my PowerShell script.
If I open the solution within VS 2017, I am able to rebuild the solution with no problems. It downloads/installs over a dozen other packages...one of which is the Microsoft.Net.Compilers.props package referred to in the error message.
But if I delete everything and re-download the source code and then through my PowerShell script I call MSBuild to rebuild the solution I get the error mentioned above. It never seems to download/install the missing packages.
Anyone have any ideas how I can use MSBuild within my PowerShell script to rebuild the project and have it automatically update/install any packages it needs?
Thanks
I was able to find a solution to my problem on this page: Quickly Restore NuGet Packages With PowerShell
On that page is a script that uses Nuget.exe to download the packages based on the packages.config:
#This will be the root folder of all your solutions - we will search all children of this folder
$SOLUTIONROOT = "C:\Projects\"
#This is where your NuGet.exe is located
$NUGETLOCATION = "C:\Projects\NuGet\NuGet.exe"
Function RestoreAllPackages ($BaseDirectory)
{
    Write-Host "Starting Package Restore - This may take a few minutes ..."
    $PACKAGECONFIGS = Get-ChildItem -Recurse -Force $BaseDirectory -ErrorAction SilentlyContinue | Where-Object { ($_.PSIsContainer -eq $false) -and ( $_.Name -eq "packages.config")}
    ForEach($PACKAGECONFIG in $PACKAGECONFIGS)
    {
        Write-Host $PACKAGECONFIG.FullName
        $NugetRestore = $NUGETLOCATION + " install " + " '" + $PACKAGECONFIG.FullName + "' -OutputDirectory '" + $PACKAGECONFIG.Directory.parent.FullName + "\packages'"
        Write-Host $NugetRestore
        Invoke-Expression $NugetRestore
    }
}
RestoreAllPackages $SOLUTIONROOT
Write-Host "Press any key to continue ..."
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
I modified and added this function to my PS script and call it first to download all the packages and that does the job!
You need to call the restore target of MSBuild to download NuGet packages. You can do that by running something like:
git clone [your repo]
cd [your repo]
msbuild /target:Restore [Your Solution]
msbuild [Your Solution]
function buildVS
{
    param
    (
        [parameter(Mandatory=$true)]
        [String] $path,
        [parameter(Mandatory=$false)]
        [bool] $nuget = $true,
        [parameter(Mandatory=$false)]
        [bool] $clean = $true
    )
    process
    {
        $msBuildExe = 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\MSBuild\Current\Bin\MSBuild.exe'
        if ($nuget) {
            Write-Host "Restoring NuGet packages" -foregroundcolor green
            & "$($msBuildExe)" "$($path)" /p:Configuration=Release /p:platform=x64 /t:restore
        }
        if ($clean) {
            Write-Host "Cleaning $($path)" -foregroundcolor green
            & "$($msBuildExe)" "$($path)" /t:Clean /m
        }
        Write-Host "Building $($path)" -foregroundcolor green
        & "$($msBuildExe)" "$($path)" /t:Build /p:Configuration=Release /p:platform=x64
    }
}
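A call to the buildVS function above might look like this (the solution path is a placeholder):

```powershell
# Restore NuGet packages, skip the clean step, then build Release|x64.
buildVS -path 'C:\Projects\MySolution\MySolution.sln' -nuget $true -clean $false
```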

Azure DevOps Pipeline - PowerShell script, Copy Files using Variables

I am working on an Azure DevOps build pipeline, and one of the tasks is to copy my dll and pdb files into a staging folder, for example:
Code
  MyProject
    Bin
      Debug
        MyProject.dll
        MyProject.pdb
Staging
  Client
    Libraries
I want to use PowerShell script task and I am using inline script.
When I use the following, it does not work:
Copy-Item $(Build.Repository.LocalPath)\Code\MyProject\Bin\$(DebugBuildConfiguration) -Destination $(ClientLibrariesFolder)
Below are my variables
Variable Name            Variable Value
StagingFolder            $(Build.Repository.LocalPath)\Staging
DebugBuildConfiguration  Debug
ClientLibrariesFolder    $(StagingFolder)\Client\Libraries
I do not get any error, but nothing happens.
SOLUTION:
I solved my issue as follows.
I added a new variable:
CodeLocalPath : $(Build.Repository.LocalPath)
I added a PowerShell task to my Azure DevOps build pipeline.
I set Type to Inline.
In Script I entered the following:
$destination = "{0}" -f $env:ClientLibrariesFolder
# Copy MyProject.dll to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.dll" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
# Copy MyProject.pdb to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.pdb" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
I do not get any error, but nothing happens.
What do you mean by "nothing happens"? Do you mean that no files have been copied into your repo?
If yes, that is the correct behavior of DevOps, because it is not recommended to upload any file back to your repo.
If you set system.debug=true in the Variables tab, you will find a log line like:
##[debug]Copy-Item C:\VS2017Agent\_work\8\s\TestSample\TestSample\Bin\Debug\*.* -Destination C:\VS2017Agent\_work\8\s\TestSample\Staging\Client\Libraries'
It will not copy the files to the repo. That should be why you see nothing happen.
Besides, looking at Microsoft's documentation, the description is as follows:
$(Build.Repository.LocalPath): The local path on the agent where
your source code files are downloaded. For example: c:\agent_work\1\s
By default, new build definitions update only the changed files. You
can modify how files are downloaded on the Repository tab.
So, the value of this variable points to a path on the agent rather than to the repo.
Note: Copy-Item in PowerShell copies files rather than a folder; use *.* to include all files in the folder.
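A minimal sketch of that wildcard form, with placeholder paths:

```powershell
# Copy every file (but not subfolders) from the build output into the staging
# folder; -Force overwrites files that already exist at the destination.
# Both paths below are placeholders for illustration.
Copy-Item 'C:\agent\_work\1\s\Code\MyProject\Bin\Debug\*.*' `
    -Destination 'C:\agent\_work\1\s\Staging\Client\Libraries' -Force
```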

How do I set an alias that works only from a particular folder in powershell?

I'm using powershell to manage the building of some .net projects on my machine, and I'd like to create aliases for them. The only trick is that I'd like to only use the aliases when I'm in the folder containing all the code. Is there a way to apply aliases only in a particular folder?
I wouldn't recommend it due to its obscurity, but you could dynamically add and remove aliases or functions via the prompt function that determines the interactive prompt string, because it is called after every command.
Note that PowerShell aliases only allow aliasing command names (or paths); that is, you can't bake arguments into them, which is why the following example uses a function instead (but it would work analogously for aliases):
function prompt {
    # Define function `gs` on demand whenever the current location is in a Git
    # repo folder, and remove it when switching to any other folder.
    if (Test-Path ./.git) { function global:gs { git status $Args } }
    else { Remove-Item -EA Ignore function:global:gs }
    # Define the standard PS prompt string.
    "PS $PWD$('>' * ($nestedPromptLevel + 1)) "
}
To lessen the obscurity, you could modify the prompt string to signal whether or not the folder-specific commands are in effect:
function prompt {
    # Define function `gs` on demand whenever the current location is in a Git
    # repo folder, and remove it when switching to any other folder.
    if (Test-Path ./.git) {
        $indicator = '[repo]'
        function global:gs { git status $Args }
    } else {
        $indicator = ''
        Remove-Item -EA Ignore function:global:gs
    }
    # Define the standard PS prompt string.
    "PS $PWD $indicator$('>' * ($nestedPromptLevel + 1)) "
}
Now your prompt will contain the substring [repo] (e.g., PS /Users/jdoe/Projects/foo [repo]>) whenever the current location is a Git repo folder.

Powershell SVN Commit After Delete

I am writing a script to remove all files from a directory in SVN:
$svn = 'C:\Program Files\TortoiseSVN\bin\svn.exe'
$commitmsg = 'C:\TradeSupport\AB Reports\msg.txt'
$Reports = Get-Content -Path 'C:\TradeSupport\AB Reports\AB Reports.csv'
Foreach ($Report in $Reports){
    & $svn remove --force C:\SVN\Test\$Report
    & $svn commit -F $commitmsg C:\SVN\Test\$Report
}
& $svn update $commitmsg 'C:\SVN\Test'
The files are TestA and TestB. Running the script deletes the files but does not commit the change. No error is thrown, but I have to go back and manually commit the change. What would be the best way to automate this process?
I also had the commit point to the directory itself, while outside the ForEach loop, but that did not work either.
I had to change the directory I was working in to C:\SVN\Test
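In script form, that workaround amounts to changing into the working copy before running the SVN commands; a sketch based on the variables from the original script:

```powershell
# Work from inside the checked-out directory so svn resolves paths relative
# to the working copy. $svn, $commitmsg and $Reports are defined as in the
# question's script.
Set-Location 'C:\SVN\Test'
Foreach ($Report in $Reports) {
    & $svn remove --force $Report
    & $svn commit -F $commitmsg $Report
}
```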

Idiomatic powershell translation of (cd $dir && someCommand)

I have a python script that spits out JSON I'd like to capture with ConvertFrom-Json. Unfortunately, this script requires that I cd to a different directory before I run it. What's the idiomatic powershell way to do this?
This works:
$q = powershell.exe -Command "cd some\other\dir; python JsonMaker.py | ConvertFrom-Json"
As does this:
$cwd=Get-Location
cd some\other\dir
$q=python JsonMaker.py | ConvertFrom-Json
cd "$cwd"
But changing the current working directory seems dicey to me - what if the python script outputs malformed JSON, will I be left in some\other\dir ?
In the unix shell scripting world, I'd obviously do something like
(cd some/other/dir && python JsonMaker.py) | commandThatUsesJson
or read the input in with $(cd some/other/dir && python JsonMaker.py). However, in unix subshells are cheap. In powershell I see a noticeable delay to starting a subshell.
What's the approach long-time Powershell users take to something like this?
I'd probably use pushd/popd:
pushd some\other\dir
$q=python JsonMaker.py | ConvertFrom-Json
popd
Your script looks fine to me. Unless ConvertFrom-Json throws a terminating exception (which I don't think it will), the script will continue and your cd $cwd line will return you back.
You could also use Push-/Pop-Location, but it's basically just a "pretty" way of doing what you already have. Ex.
#Save location
Push-Location
#Script
Set-Location some\other\dir
python JsonMaker.py | ConvertFrom-Json
#Return to previous location
Pop-Location
python.exe JsonMaker.py runs as a child process. Changes made to the current directory in a child process don't affect the parent. ConvertFrom-Json also doesn't affect the current directory. It converts a JSON string to an object representing the JSON data or throws a (non-terminating) error if the JSON string is malformed.
If you want to be on the safe side, run the conversion in a try block and put the statement to return from the temporary working directory after that block:
try {
$q = python JsonMaker.py | ConvertFrom-Json
} catch {
# error handling (optional)
}
cd "$cwd"
or in a finally clause:
try {
$q = python JsonMaker.py | ConvertFrom-Json
} catch {
# error handling (optional)
} finally {
cd "$cwd"
}
As others have already mentioned, I'd use Push-Location and Pop-Location (or their aliases pushd and popd) as a simpler way of changing to a different working directory and returning to the original directory. The cmdlets work similar to the Unix shell commands pushd and popd.
I'd also recommend adding the extension to the executable name (to avoid unintentionally running different executable files with the same basename, e.g. python.cmd or python.com) and using the call operator (&). Running the command in a new powershell.exe process is not necessary, and would also return just a string representation of the object created from the JSON string instead of the object itself, which is probably not what you want.
Modified code:
Push-Location 'D:\some\other\dir'
try {
$q = & python.exe JsonMaker.py | ConvertFrom-Json
} catch {
# error handling (optional)
} finally {
Pop-Location
}
or like this if you want to conditionally run the python script only if changing the directory was successful (thus fully emulating the behavior of &&):
try {
Push-Location 'D:\some\other\dir' -ErrorAction Stop
$q = & python.exe JsonMaker.py | ConvertFrom-Json
} catch {
# error handling (optional)
} finally {
Pop-Location
}