Cake Build: Collecting environment variables from cmd file - cakebuild

I want to build a project with MSBuild. The MSBuild file contains references to environment variables, and there is a cmd file which sets them; I need to call it from my Cake script first.
If I use StartProcess to call this cmd file before starting the compiler, it doesn't work: the variables are set in the child process's environment, so Cake never picks them up.
How do I call the cmd file the correct way?
Contents of the batch-file:
SET BDS=C:\Program Files (x86)\Embarcadero\RAD Studio\12.0
SET BDSINCLUDE=C:\Program Files (x86)\Embarcadero\RAD Studio\12.0\include
SET BDSCOMMONDIR=C:\Users\Public\Documents\RAD Studio\12.0
SET FrameworkDir=C:\Windows\Microsoft.NET\Framework\v3.5
SET FrameworkVersion=v3.5
SET FrameworkSDKDir=
SET PATH=%FrameworkDir%;%FrameworkSDKDir%;C:\Program Files (x86)\Embarcadero\RAD Studio\12.0\bin;C:\Program Files (x86)\Embarcadero\RAD Studio\12.0\bin64;%PATH%
SET LANGDIR=DE

The StartProcess alias has an overload which accepts a ProcessSettings object. ProcessSettings exposes an EnvironmentVariables property, a generic dictionary of string to string, which lets you pass any environment variables you need to the process you start. For example:
StartProcess("cmd", new ProcessSettings{
    Arguments = "/c set",
    EnvironmentVariables = new Dictionary<string, string>{
        { "CI", "True" },
        { "TEMP", MakeAbsolute(Directory("./Temp")).FullPath }
    }
});
The same technique could also be used to call MSBuild directly, rather than using the batch file, as the MSBuildSettings object has the same property.

Related

Powershell ps2exe config variable

I would like to store a variable in a config file for a .ps1 (PowerShell script) converted to an .exe using ps2exe:
$LnkPath = "...\"      # directory path of the .lnk
$LnkFile = "\nnnn.lnk" # name of the .lnk file
Invoke-Item ($LnkPath + $LnkFile)
I was hoping to have $LnkFile and $LnkPath as config-file variables, so that if the version of the .lnk stops working I can just point to a new .lnk.
There is a reason why the version of the .lnk file stops working, but it is complicated, and not worth anyone's time.
edit:
The config file created with the optional -configFile switch isn't meant for use by the wrapped script - it merely contains runtime metadata for the generated executable (in the form of an XML file placed alongside the executable with additional extension .config).
However, you can create your own config file.
While PowerShell has a configuration-data format that uses hashtable-literal syntax, which can be read with Import-PowerShellDataFile, as of PowerShell 7.2.x there is no way to create this format programmatically.
A simple alternative that supports both reading and programmatic creation is to use a JSON file:
The following assumes that your script file is foo.ps1, to be converted to foo.exe, with a configuration file foo.json located in the same directory (which you'll have to bundle with your .exe file when deploying it):
First, create your JSON config file:
@{ LnkPath = '...\'; LnkFile = 'nnnn.lnk' } | ConvertTo-Json > foo.json
Now you can read this file from foo.ps1 / foo.exe as follows:
# Determine this script's / executable's full path.
$scriptOrExePath =
  if ($PSCommandPath) { # Running as .ps1
    $PSCommandPath
  } else {              # Running as .exe
    Convert-Path ([Environment]::GetCommandLineArgs()[0])
  }
# Look for the JSON config file in the same directory as this script / executable, load it, and parse it into an object.
$config =
  Get-Content -Raw ([IO.Path]::ChangeExtension($scriptOrExePath, '.json')) |
    ConvertFrom-Json
# $config is now an object with .LnkPath and .LnkFile properties.
$config # Output for diagnostic purposes.
Note the need to use [Environment]::GetCommandLineArgs() to determine the executable path when running as an .exe file, because the usual automatic variables indicating the script path ($PSCommandPath) and script directory ($PSScriptRoot) aren't available then.
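Since the answer notes that JSON supports programmatic creation as well as reading, here is a minimal round-trip sketch (the file name foo.json and the property values are taken from the example above; the path is assumed, not prescribed):

```powershell
$configPath = 'foo.json'  # assumed to sit next to the script, as above

# Create the config file (same content as in the answer).
@{ LnkPath = '...\'; LnkFile = 'nnnn.lnk' } | ConvertTo-Json | Set-Content $configPath

# Read it back, update one property, and write it out again.
$config = Get-Content -Raw $configPath | ConvertFrom-Json
$config.LnkFile = 'new.lnk'   # point at a new .lnk
$config | ConvertTo-Json | Set-Content $configPath
```

This is how the config would be repointed at a new .lnk without touching the wrapped script itself.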

powershell dot sourced function parameter overwrites local variables

I'm using powershell dot-sourcing to set variables in the current scope and have encountered an interesting feature.
It seems that the parameter of the function will also overwrite any local variable of the same name.
Is this expected?
Should I just use $global:MyVar instead to set variables in the local scope from other scripts?
# Given
function TestX([string]$X)
{
    Write-Host "`$X = $X"
}
# And variable $X
# Note that the variable name is the same as the parameter name in 'TestX'
$X = "MyValue"
PS> TestX $X
$X = MyValue
PS> $X; TestX "123456"
MyValue
$X = 123456
PS> $X; . TestX "123456"
MyValue
$X = 123456
PS> $X; . TestX "123456"
123456
$X = 123456
EDIT:
To expand on what I'm trying to accomplish...
I have a set of scripts used for a build process. These scripts are used to target multiple environments. There are different configurations for each environment (DEV, TEST, QA, PROD) that apply different rules/settings/etc. These configurations are stored in directories. Among these settings are some powershell files that are used to set script-wide settings for that particular environment. For example, target server URL, target server UNC, etc..
Among the build process scripts there is a function Confirm-TargetEnvironmentVariables. As the name implies, it checks to see if the environment variables have been loaded and if not, loads them. This function is sprinkled throughout the various script files/functions to ensure that when a function uses one of these script-wide variables, it has been set.
It was this function that I used to call with dot-sourcing.
function Confirm-TargetEnvironmentVariables([string]$TargetEnvironment)
{
...
}
# Like this..
. Confirm-TargetEnvironmentVariables "PROD"
This all worked just fine, until I had a need to switch between loading variables from more than one environment (for refreshing TEST from PROD, for example, I need variable info from both). And in fact this still works, except that the script calling Confirm-TargetEnvironmentVariables already had a variable called $TargetEnvironment. So I was trying to do this:
$SourceEnvironment = "PROD"
$TargetEnvironment = "TEST"
. Confirm-TargetEnvironmentVariables $SourceEnvironment
# Do stuff with loaded "PROD" variables...
. Confirm-TargetEnvironmentVariables $TargetEnvironment
# Do stuff with loaded "TEST" variables...
But what was happening was this:
$SourceEnvironment = "PROD"
$TargetEnvironment = "TEST"
. Confirm-TargetEnvironmentVariables $SourceEnvironment
# Do stuff with loaded "PROD" variables...
# The value of $TargetEnvironment has been set to "PROD" by dot-sourcing!!
. Confirm-TargetEnvironmentVariables $TargetEnvironment
# Do stuff with loaded... "PROD" variables!!!
So this should provide more context hopefully. But ultimately it still raises the question of why dot-sourcing includes parameter variables when bringing variables into the local scope. Is this by design? I can't think of a scenario where this would be desired behavior though.
Should I just use $global:MyVar instead to set variables in the local scope from other scripts?
I'd recommend you avoid writing functions that either require dot-sourcing to work correctly or that write to global variables.
Instead, use Set-Variable's -Scope parameter to write to a variable in the calling scope:
function Test-SetVariable
{
    param([string]$Name, $Value)

    # '1' means "one level up", so this updates the variable in the caller's scope.
    Set-Variable -Name $Name -Value $Value -Scope 1
}
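A quick usage sketch (the function is repeated so the snippet runs on its own): called normally, with no dot-sourcing, it still updates the caller's variable, and the function's own parameter names do not leak into the calling scope:

```powershell
function Test-SetVariable
{
    param([string]$Name, $Value)
    # '1' = one scope up, i.e. the caller's scope.
    Set-Variable -Name $Name -Value $Value -Scope 1
}

$TargetEnvironment = 'TEST'
Test-SetVariable -Name TargetEnvironment -Value 'PROD'
$TargetEnvironment   # now 'PROD', with no dot-sourcing involved
```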

how to add variables together in VSTS

I want to use a variable which is composed of another VSTS variable plus text, for instance:
vnetname = $vnet_prefix + "vnetid"
However I get an error saying "A positional parameter cannot be found that accepts argument '+'".
Can anyone advise?
If you mean using the variable in build/release processes, then you can define a variable like this:
vnetname = $(vnet_prefix)_vnetid
Then you can use the variable $vnetname or $(vnetname) directly, see Build variables-Format for how to use the variables in different tools.
Alternatively you can pass the value with Logging Commands:
Copy and paste the strings below and save them as a *.ps1 file:
$value = $env:vnet_prefix + "vnetid"
Write-Host "##vso[task.setvariable variable=vnetname]$value"
Check in the PS file.
Add a PowerShell task to run the PS file.
Use the variable $(vnetname) in later steps.
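For completeness: a variable set via ##vso[task.setvariable] is exposed to subsequent tasks as an environment variable, so a later PowerShell task could read it like this (a sketch; vnetname is the variable set above):

```powershell
# In a later pipeline task, the variable set earlier via
# ##vso[task.setvariable] is available as an environment variable.
Write-Host "vnet name: $env:vnetname"
```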

How to Access Custom PowerShell 5.0 Classes from a separate ps1 file

I created a class in a ps1 file that works just fine in the ps1 file itself; I can run various tests with the class and the tests in the same file.
My trouble is that I can't seem to find a way to put my class in one file and the class-usage code in another.
With functions, you can just dot-source any external ps1 files to bring them into your current script. It looks like PowerShell classes do not work this way.
How can I organize code to keep classes in files separate from the executing script?
Do I have to use modules? How do I do that?
In the file Hello.psm1:
class Hello {
    # Property
    [string]$person

    # Default constructor
    Hello(){}

    # Constructor
    Hello([string]$m){
        $this.person = $m
    }

    # Method
    [string]Greetings(){
        return "Hello {0}" -f $this.person
    }
}
In the file main.ps1:
using module .\Hello.psm1

$h = New-Object -TypeName Hello
echo $h.Greetings()

# $hh = [Hello]::new("John")
$hh = New-Object -TypeName Hello -ArgumentList @("Mickey")
echo $hh.Greetings()
And running .\main.ps1:
Hello
Hello Mickey
This may not apply to your case, but I have had nightmares with PowerShell caching information (in my case it was just for modules). Things I'd changed in the module weren't being properly loaded - even after a reboot.
I found deleting the cache stored in a subdirectory of C:\Users\<YourUsername>\AppData\Local\Microsoft\Windows\PowerShell solved my issue.
The simplest solution is to use the using keyword at the very beginning of your script:
using module .\path\to\your\Module.psm1
The path can be relative to the current file.
To prevent caching problems, execute the script with powershell.exe .\script.ps1, even in the PowerShell ISE console.

Is it possible to create a path which consist of an environment variable + the filename

I want to copy an .exe file to AppData\Local\Temp and run the exe afterwards. Instead of hard-coding the full, static file path I would like to use $env:TEMP so the script has no dependency on the user account. Based on $env:TEMP plus the .exe file name I tried to create a new variable, $LocalInstallFile, which I could use later in the script, but it seems not to work.
$LocalInstallFile=$env:TEMP."\Agent.exe"
Later, $LocalInstallFile should be used to run the installation with installation properties:
Invoke-Expression "$LocalInstallFile /DIR=c:\"
Could I instead also use
Invoke-Expression "$env:TEMP\Agent.exe"
The expression
$LocalInstallFile=$env:TEMP."\Agent.exe"
will expand the $env:TEMP environment variable and then try to access a property called "\Agent.exe" on the resulting string object, which of course doesn't exist, so $LocalInstallFile ends up $null. Instead, create the string with string interpolation:
$LocalInstallFile="$env:TEMP\Agent.exe"
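As an aside (a sketch, not part of the original answer): Join-Path builds the same path while handling the separator for you, and Start-Process avoids the string-reparsing pitfalls of Invoke-Expression; /DIR=c:\ is the installer switch from the question, and the Test-Path guard only exists so the snippet is harmless where Agent.exe is absent:

```powershell
# Fallback for environments where TEMP is not defined (e.g. non-Windows).
if (-not $env:TEMP) { $env:TEMP = [IO.Path]::GetTempPath() }

# Join-Path inserts the separator between directory and file name.
$LocalInstallFile = Join-Path $env:TEMP 'Agent.exe'

# Start-Process passes arguments as a list instead of re-parsing a
# single string, which is safer for paths that contain spaces.
if (Test-Path $LocalInstallFile) {
    Start-Process -FilePath $LocalInstallFile -ArgumentList '/DIR=c:\' -Wait
}
```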