Dot-Sourced Variables vs. Global Variables - PowerShell

I have two ways of referencing script variables from a separate script file. Here are two basic examples:
1. Dot Source
Variables.ps1
$Source = "source"
$Destination = "dest"
Execute.ps1
. .\Variables.ps1
Copy-Item -Path $Source -Destination $Destination -Force
2. Global Variable
Variables.ps1
$Global:Source = "source"
$Global:Destination = "dest"
Execute.ps1
.\Variables.ps1
Copy-Item -Path $Source -Destination $Destination -Force
I have done research but have yet to find a definitive reason as to use one over the other. Are there limitations or cautions I should exercise when using these methods? Any input is greatly appreciated. Thank you for your time.
EDIT:
@mklement0 gave a great answer as to why to use dot-sourcing over global variables. I would still love to keep this discussion open. If there is another point of view, or an explanation as to when using global variables is more beneficial, I would enjoy hearing it and up-voting accordingly. Thank you.

I suggest you use dot-sourcing, without explicit global variables (method 1):
That way, adding the variables to the current scope requires a deliberate act - the dot-sourcing. Note that dot-sourcing adds the variables to the current scope, which may or may not be the current session's global scope (child scopes are created, for instance, by calling scripts without dot-sourcing and by invoking script blocks with &).
By contrast, using global variables (method 2) creates session-global variables irrespective of invocation method, so that even accidental, non-dot-sourced invocations of the script end up altering the global state.
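For example, here's a small sketch of the practical difference, assuming the two Variables.ps1 variants from the question and an interactive session:
# Method 1: Variables.ps1 contains  $Source = "source"
.\Variables.ps1      # plain (non-dot-sourced) call runs in a child scope...
$Source              # -> nothing; the variable did not leak into the session
. .\Variables.ps1    # ...whereas dot-sourcing runs it in the current scope
$Source              # -> source
# Method 2: Variables.ps1 contains  $Global:Source = "source"
.\Variables.ps1      # even a plain, accidental call...
$Source              # -> source; ...has already altered the global state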

Related

Rename files & append Number in Powershell [duplicate]

I've never used PowerShell or any command line to try and rename files, nor do I really know much about script writing in general.
I've already had some success in renaming the files in question but am stuck on the last piece of the puzzle.
Original file names:
NEE100_N-20210812_082245.jpg
NEE101_E-20210812_083782.jpg
NEE102_W-20210812_084983.jpg
I successfully changed those to AT-###-N-......jpg using:
Get-ChildItem *.jpg | Rename-Item -NewName {$_.Name -replace "NEE\d\d\d_", "AT-112-"}
And this is what they looked like after:
AT-112-N-20210812_082245.jpg
AT-112-E-20210812_083782.jpg
AT-112-W-20210812_084983.jpg
Now however, I have a few files that look like this:
AT-112-NewImage-20210812_083782.jpg
AT-112-NewImage-20210812_093722.jpg
and I want to change them to:
AT-112-D1-20210812_083782.jpg
AT-112-D2-20210812_093722.jpg
...and so on.
I've tried a few things here to do that, such as replacing "NewImage" with "D" and then using something like this (not exact, just an example):
$i = 1
Get-ChildItem *.jpg | %{Rename-Item $_ -NewName ('19981016_{0:D4}.jpg' -f $i++)}
But this did not work. I have seen scripts that add sequential numbering as a suffix or a prefix, but I can't figure out how to do this when I want the sequence number in the middle of the name.
Hopefully this makes sense; if more elaboration is needed, let me know. Thanks!
You need to use an expression (inside (...)) as your -replace substitution operand in order to incorporate a dynamic value, such as the sequence number in your case.
In order to use a variable that maintains state across multiple invocations of a delay-bind script block ({ ... }, the one being passed to the -NewName parameter in your first attempt), you need to create the variable in the caller's scope and explicitly reference it there:
This is necessary, because delay-bind script blocks run in a child scope, unfortunately,[1] so that any variables created inside the block go out of scope after every invocation.
Use Get-Variable to obtain a reference to a variable object in the caller's (parent) scope[2], and use its .Value property, as shown below.
$i = 1
Get-ChildItem *.jpg | Rename-Item -NewName {
  $_.Name -replace '-NewImage-', ('-D{0}-' -f (Get-Variable i).Value++)
} -WhatIf
Note: The -WhatIf common parameter in the command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
Note: The above solution is simple, but somewhat inefficient, due to the repeated Get-Variable calls - see this answer for more efficient alternatives.
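One possible shape of such an alternative - a sketch only, not necessarily the exact code in the linked answer - is to look the variable object up a single time, before the pipeline starts, and reuse that object inside the block:
$i = 1
$iVar = Get-Variable i   # fetch the PSVariable object once, up front
Get-ChildItem *.jpg | Rename-Item -NewName {
  $_.Name -replace '-NewImage-', ('-D{0}-' -f $iVar.Value++)
} -WhatIf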
[1] This contrasts with the behavior of script blocks passed to Where-Object and ForEach-Object. See GitHub issue #7157 for a discussion of this problematic discrepancy.
[2] Without a -Scope argument, if Get-Variable doesn't find a variable in the current scope, it looks for a variable in the ancestral scopes, starting with the parent scope - which in this case is the caller's. You can make the call's intent more explicit with -Scope 1, which starts the lookup in the parent scope.

How to create a global variable without using "global:"

How can I create such a variable - one that, just like $null, is available everywhere with that plain notation (inside function bodies, for example) - but without any "global:" prefix? PowerShell is 6.0.3 (Linux).
As far as I know this is not possible, since you need to define the scope of a variable to use it globally within functions.
It might work to put the variable in a separate .ps1 file and dot-source it in the script body and in every function where you want to access its data. But changing the variable's value won't be global, so it would effectively be "read only".
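A rough sketch of that idea - the file and function names here are just placeholders:
# Variables.ps1 - holds the shared, effectively read-only values
$a = 'One'

# Main.ps1
function Show-A {
    . "$PSScriptRoot\Variables.ps1"   # dot-source into the function's own scope
    $a                                # -> One
}
. "$PSScriptRoot\Variables.ps1"       # dot-source into the script scope as well
Show-A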
How about the New-Variable cmdlet?
New-Variable -Scope Global -Name a -Value "One"
Also, check the about_Scopes help topic:
Get-Help about_Scopes
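For illustration, a quick sketch of how such a variable then behaves (Show-A is just a made-up helper function):
New-Variable -Scope Global -Name a -Value 'One'

function Show-A {
    $a        # reads the global variable - no scope prefix needed
}
Show-A        # -> One
Note that assigning to $a inside a function would still create a new local variable; to write to the global one you would need Set-Variable -Name a -Scope Global (or the global: prefix the question is trying to avoid).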

Microsoft's Consistency in PowerShell CmdLet Parameter Naming

Let's say I wrote a PowerShell script that includes this command:
Get-ChildItem -Recurse
But instead I wrote:
Get-ChildItem -Re
To save time. Now suppose that, after some time passes and I upgrade my PowerShell version, Microsoft decides to add a parameter to Get-ChildItem called "-Return" that, for example, returns True or False depending on whether any items are found.
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected? I understand Microsoft's attempt to save my typing time, but this is my concern, and therefore I will probably always try to write the complete parameter name.
Unless of course you know something I don't. Thank you for your insight!
This sounds more like a rant than a question, but to answer:
In that hypothetical scenario, do I have to edit all my former scripts to ensure that they will function as expected?
Yes!
You should always use the full parameter names in scripts (or any other snippet of reusable code).
Automatic resolution of partial parameter names, aliases and other shortcuts are great for convenience when using PowerShell interactively. It lets us fire up powershell.exe and do:
ls -re *.ps1|% FullName
when we want to find the path to all scripts in the profile. Great for exploration!
But if I were to incorporate that functionality into a script I would do:
Get-ChildItem -Path $Home -Filter *.ps1 -Recurse |Select-Object -ExpandProperty FullName
not just for the reasons you mentioned, but also for consistency and readability - if a colleague of mine comes along and maybe isn't familiar with the shortcuts I'm using, he'll still be able to discern the meaning and expected output from the pipeline.
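To make the risk concrete, here is one real-world instance of the pattern the question describes - to the best of my knowledge, Get-ChildItem only gained -FollowSymlink in PowerShell Core 6, so an abbreviation that was unambiguous in Windows PowerShell 5.1 later stopped resolving:
# Windows PowerShell 5.1: -Fo uniquely matches -Force, so this runs fine.
Get-ChildItem -Fo
# PowerShell 6+ / 7+: -FollowSymlink also matches -Fo, so the same call now fails
# with an error along the lines of:
#   "Parameter cannot be processed because the parameter name 'Fo' is ambiguous.
#    Possible matches include: -Force -FollowSymlink."
Get-ChildItem -Fo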
Note: There are currently three open issues on GitHub to add warning rules for this in PSScriptAnalyzer - I'm sure the project maintainers would love a hand with this :-)

Are Powershell Profile scripts dot-sourced?

The Microsoft.PowerShell_profile.ps1 script I am using creates a lot of variables when it runs. I have set all the variables' scope to "Script", but the variables used in the script never go out-of-scope.
I would like the variables to go out-of-scope once the script is done running and control is handed over to me.
If I compare the number of global, local, and script variables I have, I come up with the same number.
Example:
# Profile script does what it does.
Get-Variable -Scope Global | Measure-Object
Get-Variable -Scope Local | Measure-Object
Get-Variable -Scope Script | Measure-Object
Output:
60
60
60
Currently, I am capturing a snapshot of the variables at the beginning of my profile script, then removing any new variables at the end.
Example:
$snapshotBefore = Get-Variable
$profileVar1 = 'some value'
$profileVar2 = 'some other value'
$snapshotAfter = Get-Variable
# Compare before and after, and create list of new variables.
Remove-Variable $variablesToRemove
Yes, PowerShell profiles are dot-sourced by design, because that's what allows the definitions contained in them (aliases, functions, ...) to be globally available by default - which is, after all, the main purpose of profile files.
Unfortunately, there is no scope modifier that allows you to create a temporary scope for variables you only want to exist while the profile is loading - even scope local is effectively global in a profile script; similarly, using scope private is also not an option, because the profile's script scope - due to being dot-sourced - is the global scope.
Generally speaking, you can use & (the call operator) with a script block to create variables inside that block that are scoped to that block, but that is usually at odds with creating globally available definitions in a profile, at least by default.
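For instance, a quick sketch of that block-local scoping:
& {
    $temp = 'only exists inside this block'
    $temp    # -> only exists inside this block
}
$temp        # -> nothing; the variable went out of scope when the block ended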
Similarly, calling another script without dot-sourcing it, as in your own answer, will not make its definitions globally available by default.
You can, however, create global elements from non-dot-sourced script blocks / script by specifying the global scope explicitly; e.g.: & { $global:foo = 'Going global' }, or & { function global:bar { 'global func' } }.
That said, the rationale behind dot-sourcing profiles is likely that it's easier to make all definitions global by default, making the definition of typical elements of a profile - aliases, functions, drive mappings, loading of modules - simpler (no need to specify an explicit scope).
By contrast, global variables are less typical, and to define the typical elements listed above you don't usually need script-level (and thus global) variables in your profile.
If you still need to create (conceptually) temporary variables in your profile (which is not a requirement for creating globally available aliases, functions, ...):
A simple workaround is to use an exotic variable name prefix such as __ inside the profile script to reduce the risk of their getting referenced by accident (e.g, $__profileVar1 = ...).
In other words: the variables still exist globally, but their exotic names will typically not cause problems.
However, your approach, even though it requires a little extra work, sounds like a robust workaround, here's what it looks like in full (using PSv3+ syntax):
# Save a snapshot of current variables.
# * If there are variables that you DO want to exist globally,
# define them ABOVE this command.
# * Also, load MODULES and dot-source OTHER SCRIPTS ABOVE this command,
# because they may create variables that *should* be available globally.
$varsBefore = (Get-Variable).Name
# ... define and use temporary variables
# Remove all variables that were created since the
# snapshot was taken, including $varsBefore.
Remove-Variable (Compare-Object $varsBefore (Get-Variable).Name).InputObject
Note that I'm relying on Compare-Object's default behavior of only reporting differences between objects and, assuming you haven't tried to remove any variables, only the variables added are reported.
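For illustration, with made-up values:
$before = 'a', 'b'
$after  = 'a', 'b', 'c'                        # one new item since the snapshot
(Compare-Object $before $after).InputObject    # -> c; only the difference is reported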
Note that while it can be inferred from the actual behavior of profile files that they are indeed dot-sourced - given that dot-sourcing is the only way to add elements to the current scope (the global scope, in the case of profiles) - this fact is not explicitly documented as such.
Here are snippets from various help topics (as of PSv5) that provide clues (emphasis mine):
From Get-Help about_Profiles:
A Windows PowerShell profile is a script that runs when Windows PowerShell
starts. You can use the profile as a logon script to customize the
environment. You can add commands, aliases, functions, variables, snap-ins,
modules, and Windows PowerShell drives. You can also add other
session-specific elements to your profile so they are available in every
session without having to import or re-create them.
From Get-Help about_Variables:
By default, variables are available only in the scope in which
they are created.
For example, a variable that you create in a function is
available only within the function. A variable that you
create in a script is available only within the script (unless
you dot-source the script, which adds it to the current scope).
From Get-Help about_Operators:
. Dot sourcing operator
Runs a script in the current scope so that any functions,
aliases, and variables that the script creates are added to the current
scope.
From Get-Help about_Scopes:
But, you can add a script or function to the current scope by using dot
source notation. Then, when a script runs in the current scope, any
functions, aliases, and variables that the script creates are available
in the current scope.
To add a function to the current scope, type a dot (.) and a space before
the path and name of the function in the function call.
So it does sound like PowerShell dot-sources the profile. I couldn't find a resource that specifically says that, or other forums that have asked this question.
I have found an answer, and wanted to post it here.
I have changed my profile to only call a script file. The script now has its own scope, and as long as the variables aren't made global, they will go out of scope once the profile finishes loading.
So now my profile has one line:
& (Split-Path $profile -Parent | Join-Path -ChildPath "Microsoft.PowerShell_profile_v2.ps1")
Microsoft.PowerShell_profile_v2.ps1 can now contain properly scoped variables:
$Global:myGlobalVar = "A variable that will be available during the current session"
$Script:myVar = "A variable that will disappear after script finishes."
$myVar2 = "Another variable that will disappear after script finishes."
What this allows is for the profile script to import modules that contain global variables; those variables will continue to exist during the current session.
I would still be curious why Microsoft decided to call the profile in this way. If anyone knows and would like to share, I would love to see the answer here.