Is there a way to get only the locally declared variables in a PowerShell script?
In this snippet, I would want it to return only myVar1, myVar2, and anotherVar:
$myVar1 = "myVar1"
$myVar2 = "myVar2"
$anotherVar = "anotherVar"
Get-Variable -Scope Script
But it instead returns a ton of other local script variables.
The problem I'm trying to solve, and maybe you can suggest another way, is that I have many PowerShell scripts that have a bunch of miscellaneous constants declared at the top.
I want to export them all to disk (xml) for import later.
So to call Get-Variable bla* | Export-Clixml vars.xml, I need to know all of the variable names.
So is there a way I can do something like this:
$allVars = {
$myVar1 = "alex"
$myVar2 = "iscool"
$anotherVar = "thisisanotherVar"
}
Get-Variable allVars | Export-Clixml "C:\TEMP\AllVars.xml"
And then later Import-Clixml .\AllVars.xml | %{ Set-Variable $_.Name $_.Value } ?
So that the rest of the script could still use $myVar1 etc without major changes to what is already written?
The issue is that there are more variables accessible in that scope beyond the ones you declared. One thing you could do is capture the list of variables before you declare yours, capture it again afterwards, and compare the two lists to isolate just yours.
$before = Get-Variable -Scope Local
$r = "Stuff"
$after = Get-Variable -Scope Local
# Get the differences
Compare-Object -ReferenceObject $before -DifferenceObject $after -Property Name -PassThru
The above should spit out the names and values of your variables. If need be, you can easily send that down the pipe to Export-Clixml. If your variables hold complex objects, you might need to increase -Depth so they serialize fully.
Caveat: If you are changing some default variable values the above code currently would omit them since it is just looking for new names.
Also, I'm not sure the values will import back exactly as they were exported; that depends largely on your data types. For simple variables this works just fine.
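Putting the pieces together, a minimal sketch (the path and variable names are illustrative):
$before = Get-Variable -Scope Local
$myVar1 = "alex"
$myVar2 = "iscool"
$anotherVar = "thisisanotherVar"
# Exclude the snapshot variables themselves so they don't register as "new"
$after = Get-Variable -Scope Local -Exclude before, after
# Serialize only the newly created variables
Compare-Object -ReferenceObject $before -DifferenceObject $after -Property Name -PassThru |
    Export-Clixml "C:\TEMP\AllVars.xml"
# Later: rehydrate them into the current scope
Import-Clixml "C:\TEMP\AllVars.xml" |
    ForEach-Object { Set-Variable -Name $_.Name -Value $_.Value }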
I need to know all of the variable names.
The only other way that I am aware of (I never really considered this) would be to change all of the variables to have a prefix like my_, so then you could just have one line for the export:
Get-Variable my_* | Export-Clixml vars.xml
Hopefully this answer isn't above me. I've created a custom object with properties and methods. I create several of them on the fly, depending on what the user selects at the beginning.
So for this example, the script might create $PRD1, $PRD2, $TST1 and $TST4.
$PRD1, $PRD2, $TST1 and $TST4 will have some properties like DebugMode, DisableAppsStartTime, DisableAppsStopTime. They'll have some methods like DisableApps(), EnableApps().
How can I find out which variables the script ended up creating? I can use Get-Variable to know the ones it created (plus I DO still have the initial list of names to create). My issue is that I'm having trouble figuring out how to call the ones I've created, in a manner that allows me to use the methods and properties, without a ridiculous mash-up of nested foreach/if/switch commands.
I certainly hope that made sense.
Thanks in advance,
SS
I DO still have the initial list of names to create
Assuming that $list contains this list, the following creates an (ordered) hash table of those variables that were actually created from that list:
$variableMap = [ordered] @{}
(Get-Variable -ErrorAction Ignore -Scope Local $list).
ForEach({ $variableMap[$_.Name] = $_.Value })
Note: -Scope Local limits the lookup to the current scope[1]; omit it to target all variables visible in the current scope, which includes those from ancestral (parent) scopes.
You can then loop over $variableMap.Keys to process them all, or access one by name selectively, e.g., $variableMap.PRD1 (or $variableMap['PRD1']).
You then use regular dot notation to access properties and methods of these entries; e.g., $variableMap.PRD1.DisableApps().
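For example, a quick sketch that calls one of the question's hypothetical methods on every collected object:
foreach ($name in $variableMap.Keys) {
    # DisableApps() is the method described in the question
    $variableMap[$name].DisableApps()
}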
[1] This includes variables created with the AllScope option, e.g., $HOME, because they are copied to every scope, as the name suggests. You can find all such variables with
Get-Variable | Where-Object Options -match 'AllScope'
I just did this with the Where-Object cmdlet and the -like operator in a foreach loop.
foreach ($var in (Get-Variable | Where-Object { $_.Name -like '*PRD*' -or $_.Name -like '*TST*' })) {
$var
}
So my challenge today.
I have a config file (really just a txt document) that stores variables to store information passed between scripts or to be used after restarts.
I am looking for a more efficient way to read and update the file. Currently I read the file with:
Get-Content $current\Install.cfg | ForEach-Object {
Set-Variable -Name line -Value $_
$a, $b = $line.Split('=')
Set-Variable -name $a -Value $b
}
But to overwrite the contents, I recreate the file with:
ECHO OSV=$OSV >>"$ConfigLoc\tool.cfg"
ECHO OSb=$OSb >>"$ConfigLoc\tool.cfg"
ECHO cNum=$cNum >>"$ConfigLoc\tool.cfg"
ECHO cCode=$cCode >>"$ConfigLoc\tool.cfg"
ECHO Comp=$Comp >>"$ConfigLoc\tool.cfg"
Each time I have added a new saved variable, I have just hardcoded it into both the original config file and the config updater.
As my next update requires an additional 30 variables on top of my current 15, I would like something like:
Get-Content $current\Install.cfg | ForEach-Object {
Set-Variable -Name line -Value $_
$a, $b = $line.Split('=')
ECHO $a=$$a
}
Where $$a uses the variable $a in the loop as the variable name to load the value.
The best example I can show to clarify is:
ECHO $a=$$a (as written in the loop)
ECHO OSV=$OSV (how it should actually expand in the code)
Not sure how to clarify this anymore, or how to achieve it with the variable title also being a variable.
If you want to create a file that has name=value parameters, here's an alternate suggestion. This is a snippet of a real script I use every day. You might modify it so it reads your .csv input and uses it instead of the hard coded values.
$Sites = ("RawSiteName|RoleName|DevUrl|SiteID|HttpPort|HttpsPort", `
"SiteName|Name of role|foo.com|1|80|443" `
) | ConvertFrom-CSV -Delimiter "|"
$site = $sites[0]
Write-Host "RawSiteName =$($site.RawSiteName)"
You might be able to use something similar to $text = Get-Content MyParameters.csv and pipe that to the ConvertFrom-CSV cmdlet. I realize it's not a direct answer to what you are doing but it will let you programmatically create a file to pass across scripts.
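For instance, something along these lines (the file name is illustrative):
# Read the same pipe-delimited parameters from a file instead of hard-coding them
$Sites = Get-Content "MyParameters.csv" | ConvertFrom-Csv -Delimiter "|"
foreach ($site in $Sites) {
    Write-Host "RawSiteName = $($site.RawSiteName)"
}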
Thanks for the help everyone. This is the solution I am going with. Importing and exporting couldn't be simpler. If I have to manually update the XML install default I can with ease which is also amazing. I also love the fact that even if you import as $Test you can still use $original to access variables. I will be creating multiple hashtables to organize the different data I will be using going forward and just import/export it in a $config variable as the master.
$original = @{
OSV='0'
OSb='0'
cNum='00000'
cCode='0000'
Client='Unknown'
Comp='Unknown'
}
$original | Export-Clixml $Home\Desktop\sample.cfg
$Test = Import-Clixml $Home\Desktop\sample.cfg
Write $Test
Write $original.Client
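For example, updating a value later is just a matter of changing the hashtable and re-exporting it (a sketch using the names above; the new value is illustrative):
$original.Client = 'Contoso'                        # change a setting in memory
$original | Export-Clixml $Home\Desktop\sample.cfg  # overwrite the file with the full table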
In essence, you're looking for variable indirection: accessing a variable indirectly, via its name stored in another variable.
In PowerShell, Get-Variable allows you to do that, as demonstrated in the following:
# Sample variables.
$foo='fooVal'
$bar='barVal'
# List of variables to append to the config file -
# the *names* of the variables above.
$varsToAdd =
'foo',
'bar'
# Loop over the variable names and use string expansion to create <name>=<value> lines.
# Note how Get-Variable is used to retrieve each variable's value via its *name*.
$(foreach ($varName in $varsToAdd) {
"$varName=$(Get-Variable $varName -ValueOnly)"
}) >> "$ConfigLoc/tool.cfg"
With the above, the following lines are appended to the output *.cfg file:
foo=fooVal
bar=barVal
Note that you can read such a file back more easily with the ConvertFrom-StringData cmdlet, which outputs a hashtable with the name-value pairs from the file:
$htSettings = Get-Content -Raw "$ConfigLoc/tool.cfg" | ConvertFrom-StringData
Accessing $htSettings.foo would then return fooVal, for instance.
With a hashtable as the settings container, updating the config file becomes easier, as you can simply recreate the file with all settings and their current values:
$htSettings.GetEnumerator() |
ForEach-Object { "$($_.Key)=$($_.Value)" } > "$ConfigLoc/tool.cfg"
Note: PowerShell by default doesn't enumerate the entries of a hashtable in the pipeline, which is why .GetEnumerator() is needed.
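A quick way to see this for yourself (a minimal sketch):
# The hashtable travels the pipeline as a single object...
(@{ a = 1; b = 2 } | Measure-Object).Count                   # -> 1
# ...but its entries enumerate individually via GetEnumerator()
(@{ a = 1; b = 2 }.GetEnumerator() | Measure-Object).Count   # -> 2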
Generally, though, this kind of manual serialization is fraught, as others have pointed out, and there are more robust - though typically less friendly - alternatives.
With your string- and line-based serialization approach, there are two things to watch out for:
All values are saved as strings, so you have to manually reconvert them to the desired data type, if necessary - and even possible, given that not all objects provide meaningful string representations (see the sketch after these caveats).
Generally, the most robust serialization format is the CLIXML format produced by Export-Clixml, but note that it is not a friendly format - be careful with manual edits.
ConvertFrom-StringData will fail with duplicate names in the config file, which means you have to manually ensure that you create no duplicate entries when you append to the file - if you use the above approach of recreating the file from a hashtable every time, however, you're safe.
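To illustrate the string-conversion caveat (a minimal sketch; the setting name is illustrative):
$htSettings = "port=8080" | ConvertFrom-StringData
$htSettings.port.GetType().Name   # String - not Int32
[int]$port = $htSettings.port     # manual reconversion to the desired type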
I have been successfully using [environment]::getfolderpath("ProgramFiles") to get the path to program Files, but now I have a need to also access Program Data, and it looks from this enumeration like ProgramData is not available with this method. Is that true, or am I missing something here?
Use $env: to access environment variables:
$env:ProgramData
The quickest way is to use $env:ProgramData as BenH already pointed out in your question comments.
Using the .NET SpecialFolder enumeration, you would have needed the CommonApplicationData value.
Instead of passing it as a string, though, as in your initial example:
[Environment]::GetFolderPath('CommonApplicationData')
I'd suggest using the enumeration, as you will get the possible enumeration values directly in IntelliSense while developing:
[Environment]::GetFolderPath([System.Environment+SpecialFolder]::CommonApplicationData)
Finally, because you knew the path you were looking for but not the corresponding variable, you could have listed them all neatly using something like:
$SpecialFolders = New-Object -TypeName psobject
[Environment+SpecialFolder]::GetNames([Environment+SpecialFolder]) | sort |
foreach { Add-Member -InputObject $SpecialFolders -Type NoteProperty -Name $_ -Value ([Environment]::GetFolderPath($_)) }
$SpecialFolders | fl
Using that snippet, you could have determined that c:\programdata was a special folder path belonging to CommonApplicationData.
The enumeration can still be handy if a specified folder is not in the $env scope (example: My documents special folder).
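For example, there is no $env: variable for the My Documents folder, so the enumeration is the way to get it:
[Environment]::GetFolderPath([Environment+SpecialFolder]::MyDocuments)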
I am creating my own set of cmdlets. They all need the same state data (like location of DB and credentials for connecting to DB). I assume this must be a common need and wonder what the common idiom for doing this is.
The obvious one is something like:
$db = my-make-dbcreds db=xxx creds=yyyy ...
my-verb1 $db | my-verb2 $db -foo 42...
my-verb8 $db bar wiz
.....
but I was wondering about other ways. Can I silently pipe the state from one to another? I know I can do this if state is the only thing I pipe, but these cmdlets return data.
Or can I set up global variables to use if the user doesn't specify state in the command?
Passing the state information through the pipe is a little lost on me. You could update your cmdlets to return objects that the next cmdlet will accept via ValueFromPipeline. When you mentioned
like location of DB and credentials for connecting to DB
the best thing I could think of for what you want is....
SPLATTING!
Splatting is a method of passing a collection of parameter
values to a command as a unit. Windows PowerShell associates
each value in the collection with a command parameter.
In its simplest form
$params = @{
Filter = "*.txt"
Path = "C:\temp"
}
Get-ChildItem @params
Create a hashtable of parameters and values and splat them to the command. Then you can edit the table as each call to the cmdlet requires.
$params.Path = "C:\eventemperor"
Get-ChildItem @params
I changed the path but left the filter the same. You also don't have to have everything in $params; you can splat it and use other parameters in the same call.
It is just a matter of populating the variables as you see fit and changing them as the case requires.
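For example, splatted and explicit parameters can be mixed freely in one call (a minimal sketch reusing $params from above):
# -Recurse is passed normally alongside the splatted Filter and Path
Get-ChildItem @params -Recurse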
Spewing on the pipeline
Pretty sure that is what it is actually called. If you use advanced function parameters you can chain properties from one cmdlet to the next if you really wanted to. FWIW I think splatting is better in your case, but have a look at the following.
function One{
param(
[parameter(Mandatory=$true,
ValueFromPipeline=$True,
ValueFromPipelineByPropertyName=$true)]
[String[]]
$Pastry
)
write-host "You Must Like $Pastry"
Write-Output (New-Object -TypeName PSCustomObject -Property @{Pastry = $pastry})
# If you have at least PowerShell 3.0
# [pscustomobject]@{Pastry = $pastry}
}
Simple function that writes the variable $pastry to the console but also outputs an object for the next pipe. So running the command
"Eclairs" | One | One | Out-Null
We get the following output
You Must Like Eclairs
You Must Like Eclairs
We need to pipe to Out-Null at the end else you would get this.
Pastry
------
{Eclairs}
Perhaps not the best example but you should get the idea. If you wanted to extract information between the pipe calls you could use Tee-Object.
"Eclair" | One | Tee-Object -Variable oneresults | One | Out-Null
$oneresults
Consider Parameter Default Values
Revisiting this concept after trying to find a better way to pass SQL connection information between many functions working against the same database. I am not sure if this is the best thing to do, but it certainly simplifies things for me.
The basic idea is to add a rule for your cmdlet, or a wildcard rule if your cmdlets share a naming convention. For instance, I have a series of functions that interact with our ticketing system. They all start with Get-Task... and are all configured with SQL connection information.
$invokeSQLParameters = @{
ServerInstance = "serverName"
Username = $Credentials.UserName
Password = $Credentials.GetNetworkCredential().Password
}
$PSDefaultParameterValues.Add("New-Task*:Connection",$invokeSQLParameters)
$PSDefaultParameterValues.Add("Get-Task*:Connection",$invokeSQLParameters)
So now in my functions I have a parameter called Connection that will always be populated with $invokeSQLParameters as long as the above is done before the call. I still use splatting as well
Invoke-Sqlcmd -Query $getCommentsQuery @Connection
You can read up more about this at about_parameters_default_values
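A hypothetical sketch of a function that picks up that default (the function and parameter names are illustrative):
function Get-TaskComment {
    param(
        $Query,
        $Connection   # auto-populated via the Get-Task* rule in $PSDefaultParameterValues
    )
    # Splat the stored connection hashtable onto Invoke-Sqlcmd
    Invoke-Sqlcmd -Query $Query @Connection
}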
Basically I have this code:
$file = $web.GetFile("Pages/default.aspx")
$file.CheckOut()
and I was wondering if there is anyway to use a pipe and the powershell equivalent of this to rewrite it as:
$web.GetFile("Pages/default.aspx") | $this.CheckOut()
When I try this I get the error:
Expressions are only allowed as the first element of a pipeline.
I also tried using $_ instead of $this but got the same error.
Actually there is a $this in a few cases. You can create a ScriptProperty or ScriptMethod and attach it to an object, and $this will be the original object. You can then define these in types files (I'd recommend using the module EZOut; it makes life much easier) so that any time you see that type, you get that method.
For example:
$Web | Add-Member ScriptMethod EditFile { $this.Checkout() }
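Once attached, the method is called like any built-in member:
$Web.EditFile()   # runs the script block with $this bound to $Web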
Hope this helps
What you're looking for is $_ and it represents the current object in the pipeline. However you can only access $_ in a scriptblock of a command that takes pipeline input e.g.:
$web.GetFile("Pages/default.aspx") | Foreach-Object -Process {$_.Checkout()}
However, there are aliases for the Foreach-Object cmdlet (Foreach and %), and -Process is the default parameter, so this can be simplified to:
$web.GetFile("Pages/default.aspx") | Foreach {$_.Checkout()}
One other point, the GetFile call appears to return a single file so in this case, the following would be the easiest way to go:
$web.GetFile("Pages/default.aspx").Checkout()
Of course, at this point you no longer have a variable containing the file object.
$_ is the variable for "current object" in powershell.
However, you aren't passing any data; this is just variable assignment. You can only use the pipeline if you manipulate the actual output of a command and use it as input down the pipeline.
I think what you want can be accomplish with nested parentheses:
($web.GetFile("Pages/default.aspx")).CheckOut()
In PS, anything you put inside parentheses gets treated as its own object, and you can apply methods to that inline without variable reassignment.
Assignment does silence the default output, but it does not prevent an object from being further referenced.
($file = $web.GetFile("Pages/default.aspx")).CheckOut()
Of course, it's much more common to either store the return value in a variable and do stuff with it or chain methods/properties/pipes.