I have a problem with a sample of code I am writing. It is a fairly simple question, but it is an issue that has cost me time I do not have. I have already tried the relevant Stack Overflow questions and searches, but did not find anything that helps much.
I have the following code:
#Importing some file from a csv to a variable
$output = import-csv -LiteralPath ...some file imports OK
##
#Copy the layout of the csv imported for further processing..
##
$extraOut = $output.clone()
$extraOut | ForEach-Object {
    $_.innerlinks1 = ""
    $_.innerlinksurl1 = ""
}
When I try to print out the value of $output using $_, I get the empty strings that I previously assigned to $extraOut (which I do not want). Why does this happen?
#Trying to print out $output should have data and not empty strings.
$output | ForEach-Object { Write-Host $_ }
Any help, or code that copies the structure of an Import-Csv result (with or without PSObject cloning), would also be great.
NOTE: After finding a helpful answer, I also think the problem needs more detailed scripting, since there are a lot of empty strings in my file that might cause additional issues in the future.
I use the answer in this link whenever I need to deep copy an array. To quote the original answer:
# Get original data
$data = Import-Csv ...
# Serialize and Deserialize data using BinaryFormatter
$ms = New-Object System.IO.MemoryStream
$bf = New-Object System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
$bf.Serialize($ms, $data)
$ms.Position = 0
$data2 = $bf.Deserialize($ms)
$ms.Close()
# Use deep copied data
$data2
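As an aside: the reason $output.clone() does not behave as expected is that Array.Clone() performs a shallow copy - the cloned array still references the same row objects, so blanking properties on $extraOut also blanks them on $output. If BinaryFormatter is not available (it is marked obsolete in newer .NET versions), copying each row object is a possible alternative. This is only a minimal sketch, assuming the rows hold plain string values as Import-Csv produces (psobject.Copy() is itself a property-level copy):
# Copy each imported row object individually instead of cloning the array.
$extraOut = $output | ForEach-Object { $_.psobject.Copy() }
# Blanking fields on the copies now leaves $output untouched.
$extraOut | ForEach-Object {
    $_.innerlinks1 = ""
    $_.innerlinksurl1 = ""
}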
I'm trying to find an efficient way to read the value of a string variable in a PowerShell .ps1 file and then update the same variable/value in another .ps1 file. In my specific case, I would update a variable for the version # on script one and then I would want to run a script to update it on multiple other .ps1 files. For example:
1_script.ps1 - Script I want to read variable from
$global:scriptVersion = "v1.1"
2_script.ps1 - script I would want to update variable on (Should update to v1.1)
$global:scriptVersion = "v1.0"
I would want to update 2_script.ps1 to set the variable to "v1.1" as read from 1_script.ps1. My current method is using get-content with a regex to find a line starting with my variable, then doing a bunch of replaces to get the portion of the string I want. This does work, but it seems like there is probably a better way I am missing or didn't get working correctly in my tests.
My Modified Regex Solution Based on Answer by @mklement0:
I slightly modified @mklement0's solution because dot-sourcing the first script was causing it to run:
$file1 = ".\1_script.ps1"
$file2 = ".\2_script.ps1"
$fileversion = (Get-Content $file1 | Where-Object {$_ -match '(?m)(?<=^\s*\$global:scriptVersion\s*=\s*")[^"]+'}).Split("=")[1].Trim().Replace('"','')
(Get-Content -Raw $file2) -replace '(?m)(?<=^\s*\$global:scriptVersion\s*=\s*")[^"]+',$fileversion | Set-Content $file2 -NoNewLine
Generally, the most robust way to parse PowerShell code is to use the language parser. However, reconstructing source code, with modifications after parsing, may situationally be hampered by the parser not reporting the details of intra-line whitespace - see this answer for an example and a discussion.[1]
Pragmatically speaking, using a regex-based -replace solution is probably good enough in your simple case (note that the value to update is assumed to be enclosed in "..." - but matching could be made more flexible to support '...' quoting too):
# Dot-source the first script in order to obtain the new value.
# Note: This invariably executes *all* top-level code in the script.
. .\1_script.ps1
# Outputs to the display.
# Append
# | Set-Content -Encoding utf8 2_script.ps1
# to save back to the input file.
(Get-Content -Raw 2_script.ps1) -replace '(?m)(?<=^\s*\$global:scriptVersion\s*=\s*")[^"]+', $global:scriptVersion
For an explanation of the regex and the ability to experiment with it, see this regex101.com page.
[1] Syntactic elements are reported in terms of line and column position, and columns are character-based, meaning that spaces and tabs are treated the same, so that a difference of, say, 3 character positions can represent 3 spaces, 3 tabs, or any mix of it - the parser won't tell you. However, if your approach allows keeping the source code as a whole while only removing and splicing in certain elements, that won't be a problem, as shown in iRon's helpful answer.
To complement the helpful answer from @mklement0: in case you do go for the PowerShell abstract syntax tree (AST) class, you might use the Extent.StartOffset/Extent.EndOffset properties to reconstruct your script:
Using NameSpace System.Management.Automation.Language
$global:scriptVersion = 'v1.1' # . .\Script1.ps1
$Script2 = { # = Get-Content -Raw .\Script2.ps1
    [CmdletBinding()]param()
    begin {
        $global:scriptVersion = "v1.0"
    }
    process {
        $_
    }
    end {}
}.ToString()
$Ast = [Parser]::ParseInput($Script2, [ref]$null, [ref]$null)
$Extent = [Parser]::ParseInput($Script2, [ref]$null, [ref]$null)
$Extent = $Ast.Find(
    {
        $args[0] -is [AssignmentStatementAst] -and
        $args[0].Left.VariablePath.UserPath -eq 'global:scriptVersion' -and
        $args[0].Operator -eq 'Equals'
    }, $true
).Right.Extent
-Join (
    $Script2.SubString(0, $Extent.StartOffset),
    $global:scriptVersion,
    $Script2.SubString($Extent.EndOffset)
) # |Set-Content .\Script2.ps1
First-time poster here, but you have helped me a lot before. I don't know how to ask Google this question.
I have a PowerShell script where, with a foreach command, I check every computer in a .txt file that contains computer names. (Short explanation: it checks BitLocker status, connection availability, etc.) Everything works fine, but since I fell in love with PowerShell recently and keep trying to automate more and more things, I thought I should upgrade this script a little more.
I have foreach ($DestinationComputer in $DestinationComputers), and after I have checked everything I wanted, I want to delete that row from the .txt file.
Can someone help? I am still learning this and got stuck.
Continuing from my comment, I suggest creating a list of the computer names that did not process correctly, while discarding the ones that succeeded.
By doing so, you will effectively remove the items from the text file.
Something like this:
$DestinationComputers = Get-Content -Path 'X:\somewhere\computers.txt'
# create a list variable to store computernames in
$list = [System.Collections.Generic.List[string]]::new()
# loop over the computer names
foreach ($DestinationComputer in $DestinationComputers) {
    # first check: is the machine available?
    $success = Test-Connection -ComputerName $DestinationComputer -Count 1 -Quiet
    if ($success) {
        # do whatever you need to do with that $DestinationComputer
        # if anything there fails, set variable $success to $false
        <YOUR CODE HERE>
    }
    # test if we processed the computer successfully and if not,
    # add the computername to the list. If all went OK, we do not
    # add it to the list, thus removing it from the input text file
    if (-not $success) {
        $list.Add($DestinationComputer)
    }
}
# now, write out the computernames we collected in $list
# computernames that were processed OK will not be in there anymore.
# I'm using a new filename so we don't overwrite the original, but if that is
# what you want, you can set the same filename as the original here.
$list | Set-Content -Path 'X:\somewhere\computers_2.txt'
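If the per-computer work can throw terminating errors, wrapping it in try/catch inside the if ($success) block is one way to flip $success to $false on failure. This is only a minimal sketch; the Get-BitLockerVolume call via Invoke-Command is a stand-in for whatever checks your script actually performs:
try {
    # stand-in for your real per-computer work (<YOUR CODE HERE>)
    $null = Invoke-Command -ComputerName $DestinationComputer -ScriptBlock {
        Get-BitLockerVolume -MountPoint 'C:'
    } -ErrorAction Stop
}
catch {
    # any failure marks this computer as unprocessed, so it stays in the output file
    $success = $false
}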
So, my challenge today:
I have a config file (really just a txt document) that stores variables holding information passed between scripts or used after restarts.
I am looking for a more efficient way to read and update the file. Currently I read the file with:
Get-Content $current\Install.cfg | ForEach-Object {
    Set-Variable -Name line -Value $_
    $a, $b = $line.Split('=')
    Set-Variable -name $a -Value $b
}
But to overwrite the contents, I recreate the file with:
ECHO OSV=$OSV >>"$ConfigLoc\tool.cfg"
ECHO OSb=$OSb >>"$ConfigLoc\tool.cfg"
ECHO cNum=$cNum >>"$ConfigLoc\tool.cfg"
ECHO cCode=$cCode >>"$ConfigLoc\tool.cfg"
ECHO Comp=$Comp >>"$ConfigLoc\tool.cfg"
Each time I have added a new saved variable, I have just hardcoded the new variable into both the original config file and the config updater.
As my next updates require an additional 30 variables on top of my current 15, I would like something like:
Get-Content $current\Install.cfg | ForEach-Object {
    Set-Variable -Name line -Value $_
    $a, $b = $line.Split('=')
    ECHO $a=$$a
}
Where $$a uses the variable $a in the loop as the variable name to load the value.
The best example I can give to clarify is:
ECHO $a=$$a (as written in the loop)
ECHO OSV=$OSV (how it should actually appear in the code)
I'm not sure how to clarify this any further, or how to achieve it with the variable name itself also being a variable.
If you want to create a file that has name=value parameters, here's an alternate suggestion. This is a snippet of a real script I use every day. You might modify it so it reads your .csv input and uses it instead of the hard coded values.
$Sites = ("RawSiteName|RoleName|DevUrl|SiteID|HttpPort|HttpsPort", `
"SiteName|Name of role|foo.com|1|80|443" `
) | ConvertFrom-CSV -Delimiter "|"
$site = $sites[0]
Write-Host "RawSiteName =$($site.RawSiteName)"
You might be able to use something similar to $text = Get-Content MyParameters.csv and pipe that to the ConvertFrom-CSV cmdlet. I realize it's not a direct answer to what you are doing but it will let you programmatically create a file to pass across scripts.
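A minimal sketch of that idea, assuming a pipe-delimited MyParameters.csv whose first line is the same header row as above (the file name and delimiter are placeholders):
# Read the raw lines and convert them into objects with named properties.
$text = Get-Content MyParameters.csv
$sites = $text | ConvertFrom-Csv -Delimiter "|"
# Each row is now an object whose properties you can pass between scripts.
foreach ($site in $sites) {
    Write-Host "RawSiteName = $($site.RawSiteName)"
}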
Thanks for the help, everyone. This is the solution I am going with. Importing and exporting couldn't be simpler, and if I have to manually update the XML install defaults I can do so with ease, which is also great. I also love the fact that even after importing as $Test you can still use $original to access the variables. I will be creating multiple hashtables to organize the different data I will be using going forward, and just import/export them through a master $config variable.
$original = @{
    OSV='0'
    OSb='0'
    cNum='00000'
    cCode='0000'
    Client='Unknown'
    Comp='Unknown'
}
$original | Export-Clixml $Home\Desktop\sample.cfg
$Test = Import-Clixml $Home\Desktop\sample.cfg
Write $Test
Write $original.Client
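As a small follow-up usage sketch (same sample.cfg path as above): once imported, the hashtable can be edited and written straight back out:
# Import, change one setting, then persist the whole table again.
$config = Import-Clixml $Home\Desktop\sample.cfg
$config.cNum = '12345'
$config | Export-Clixml $Home\Desktop\sample.cfg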
In essence, you're looking for variable indirection: accessing a variable indirectly, via its name stored in another variable.
In PowerShell, Get-Variable allows you to do that, as demonstrated in the following:
# Sample variables.
$foo='fooVal'
$bar='barVal'
# List of variables to append to the config file -
# the *names* of the variables above.
$varsToAdd =
'foo',
'bar'
# Loop over the variable names and use string expansion to create <name>=<value> lines.
# Note how Get-Variable is used to retrieve each variable's value via its *name*.
$(foreach ($varName in $varsToAdd) {
"$varName=$(Get-Variable $varName -ValueOnly)"
}) >> "$ConfigLoc/tool.cfg"
With the above, the following lines are appended to the output *.cfg file:
foo=fooVal
bar=barVal
Note that you can read such a file more easily with the ConvertFrom-StringData cmdlet, which outputs a hashtable with the name-value pairs from the file:
$htSettings = Get-Content -Raw "$ConfigLoc/tool.cfg" | ConvertFrom-StringData
Accessing $htSettings.foo would then return fooVal, for instance.
With a hashtable as the settings container, updating the config file becomes easier, as you can simply recreate the file with all settings and their current values:
$htSettings.GetEnumerator() |
ForEach-Object { "$($_.Key)=$($_.Value)" } > "$ConfigLoc/tool.cfg"
Note: PowerShell by default doesn't enumerate the entries of a hashtable in the pipeline, which is why .GetEnumerator() is needed.
Generally, though, this kind of manual serialization is fraught, as others have pointed out, and there are more robust - though typically less friendly - alternatives.
With your string- and line-based serialization approach, there are two things to watch out for:
All values are saved as strings, so you have to manually reconvert them to the desired data type, if necessary - and even possible, given that not all objects provide meaningful string representations.
ConvertFrom-StringData will fail with duplicate names in the config file, which means you have to manually ensure that you create no duplicate entries when you append to the file - if you use the above approach of recreating the file from a hashtable every time, however, you're safe.
Generally, the most robust serialization format is the one used by Export-Clixml, but note that it is not a friendly format - be careful with manual edits.
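To illustrate the first of those caveats, here is a small sketch using the setting names from earlier in this thread - everything read back via ConvertFrom-StringData is a [string], so anything numeric needs an explicit cast:
$htSettings = Get-Content -Raw "$ConfigLoc/tool.cfg" | ConvertFrom-StringData
# All values come back as strings; recast whatever needs another type.
[int]$cNum = $htSettings.cNum   # e.g. the string "00000" becomes the integer 0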
I want to make variables from a particular column in a CSV.
CSV will have the following headers:
FolderName,FolderManager,RoleGroup,ManagerEmail
Under FolderName will be a list of rows with respective folder names such as: Accounts,HR,Projects, etc... (each of these names is a separate row in the FolderName column)
So I would like to create a list of variables to call on in a later stage. They would be something like the following:
$Accounts,
$HR,
$Projects,
I have done a few different scripts based on searching here and google, but unable to produce the desired results. I am hoping someone can lead me in the right direction here to create this script.
Versions of this question ("dynamic variables" or "variable variables" or "create variables at runtime") come up a lot, and in almost all cases they are not the right answer.
This is often asked by people who don't know a better way to approach their problem, but there is a better way: collections. Arrays, lists, hashtables, etc.
Here's the problem: You want to read a username and print it out. You can't write Hello Alice because you don't know what their name is to put in your code. That's why variables exist:
$name = Read-Host "Enter your name"
Write-Host "Hello $name"
Great, you can write $name in your source code, something which never changes. And it references their name, which does change. But that's OK.
But you're stuck - how can you have two people's names, if all you have is $name? How can you make many variables like $name2, $name3? How can you make $Alice, $Bob?
And you can...
New-Variable -Name (Read-Host "Enter your name") -Value (Read-Host "Enter your name again")
Write-Host "Hello
wait
What do you put there to write their name? You're straight back to the original problem that variables were meant to solve. You had a fixed thing to put in your source code, which allowed you to work with a changing value.
and now you have a varying thing that you can't use in your source code because you don't know what it is again.
It's worthless.
And the fix is that one variable with a fixed name can reference multiple values in a collection.
Arrays (Get-Help about_Arrays):
$names = @()
do {
    $name = Read-Host "Enter your name"
    if ($name -ne '')
    {
        $names += $name
    }
} while ($name -ne '')
# $names is now a list, as many items long as it needs to be. And you still
# work with it by one name.
foreach ($name in $names)
{
Write-Host "Hello $name"
}
# or
$names.Count
or
$names | foreach { $_ }
And more collections, like
Hashtables (Get-Help about_Hash_Tables): key -> value pairs. Let's pair each file in a folder with its size:
$FileSizes = @{} # empty hashtable. (aka Dictionary)
Get-ChildItem *.txt | ForEach {
    $FileSizes[$_.BaseName] = $_.Length
}
# It doesn't matter how many files there are, the code is just one block
# $FileSizes now looks like
@{
    'readme' = 1024;
    'test' = 20;
    'WarAndPeace' = 1048576;
}
# You can list them with
$FileSizes.Keys
and
foreach ($file in $FileSizes.Keys)
{
    $size = $FileSizes[$file]
    Write-Host "$file has size $size"
}
No need for a dynamic variable for each file, or each filename. One fixed name, a variable which works for any number of values. All you need to do is "add however many there are" and "process however many there are" without explicitly caring how many there are.
And you never need to ask "now I've created variable names for all my things ... how do I find them?" because you find these values in the collection you put them in. By listing all of them, by searching from the start until you find one, by filtering them, by using -match and -in and -contains.
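For example, a few such lookups against the $names array built above (the test values are made up):
# Does the collection contain a specific name?
$names -contains 'Alice'
# Which names start with 'A'?
$names -match '^A'
# Is a given name one of the collected ones?
'Bob' -in $names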
And yes, New-Variable and Get-Variable have their uses, and if you know about collections and want to use them, maybe you do have a use for them.
But I submit that a lot of people on StackOverflow ask this question solely because they don't yet know about collections.
Dynamic variables in Powershell
Incrementing a Dynamic Variable in Powershell
Dynamic variable and value assignment in powershell
Dynamically use variable in PowerShell
How to create and populate an array in Powershell based on a dynamic variable?
And many more, in Python too:
https://stackoverflow.com/a/5036775/478656
How can you dynamically create variables via a while loop?
Basically, you want to create folders based on the values you are getting from the CSV file (FileName has headers such as FolderName, FolderManager, RoleGroup, ManagerEmail):
$File=Import-csv "FileName"
$Path="C:\Sample"
foreach ($item in $File){
    $FolderName=$item.FolderName
    $NewPath=$Path+"\$FolderName"
    if(!(Test-Path $NewPath))
    {
        New-Item $NewPath -ItemType Directory
    }
}
Hope this helps.
In PowerShell, you can import a CSV file and get back custom objects. The code snippet below shows how to import a CSV to generate objects from it, and then dot-reference the properties on each object in a pipeline to create the new variables (your specific use case here).
PS>cat .\dummy.csv
"foldername","FolderManager","RoleGroup"
"Accounts","UserA","ManagerA"
"HR","UserB","ManagerB"
PS>$objectsFromCSV = Import-CSV -Path .\dummy.csv
PS>$objectsFromCSV | Foreach-Object -Process {New-Variable -Name $PSItem.FolderName }
PS>Get-Variable -name Accounts
Name Value
---- -----
Accounts
PS>Get-Variable -name HR
Name Value
---- -----
HR
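Note that New-Variable without -Value creates the variables with empty values, as the output above shows. If you also want to populate them, one possible tweak (storing the whole row object in each variable is just an example) is:
PS>$objectsFromCSV | Foreach-Object -Process {New-Variable -Name $PSItem.FolderName -Value $PSItem}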
My data file (.txt) has records of 31 fields/columns each, and the fields are pipe-delimited. Somehow, a few records are corrupted (the record is split across multiple lines).
Can anyone guide me in writing a script that reads this input data file and shapes it into a file containing exactly 31 fields in each record?
PS: I am new to PowerShell.
Sample data:
Good data - Whole record shows up in a single line.
Bad data - Record is broken into multiple lines.
Below is the structure of the record.
11/16/2007||0007327| 3904|1000|M1||CCM|12/31/2009|000|East 89th Street|01CM1| 11073|DONALD INC|001|Project 077|14481623.8100|0.0000|1.00000|1|EA|September 2007 Invoice|Project 027||000000000000|1330|11/16/2007|X||11/29/2007|2144.57
Here is what I have tried, and the script hangs:
#Setup paths
$Input = "Path\Input.txt"
$Output = "Path\Output.txt"
#Create empty variables to set types
$Record=""
$Collection = @()
#Loop through text file
gc Path\Input.txt | %{
    $Record = "$Record$_"
    If($Record -Match "(\d{1,2}/\d{1,2}/\d{4}(?:\|.*?){31})(\d{1,2}/\d{1,2}/\d{4}\|.*?\|.*)"){
        $Collection+=$Matches[1]
        $Record=$Matches[2]
    }
}
#Add last record to the collection
$Collection+=$Record
$Collection | Out-File $Output
I see some issues that need to be clarified or addressed. First, I noticed the line $Record=$Matches[2] does not appear to serve a purpose. Second, your regex string appears to have some flaws. I tested your regex against your test data here: http://regex101.com/r/yA9tZ1/1
At least on that site, the forward slashes needed to be escaped. Once I escaped them, the tester threw this error at me:
Your expression took too long to evaluate.
I know the root of that issue comes from this portion of your regex, which is trying to match your passive group with a non-greedy quantifier 31 times: (?:\|.*?){31}
So, taking a guess as to your true intention, I have the following regex string:
(\d{1,2}\/\d{1,2}\/\d{4}.{31}).*?(\d{1,2}\/\d{1,2}\/\d{4}\|.*?\|.*)
You can see the results here: http://regex101.com/r/qY1jZ7/2
While I doubt it is exactly what you wanted, I hope this leads you in the right direction.
I just tried this, and while that solution worked for an extremely similar issue where the user only had 11 fields per record, apparently it's just no good for your 31-field records. I'd like to suggest an alternative using -Split alongside a couple of regex matches. This should work faster for you, I think.
#Create regex objects to match against
[RegEx]$Regex = "(.*?)(\d{2}/\d{2}/\d{4})$"
[RegEx]$Regex2 = "(\d{2}/\d{2}/\d{4}.*)"
#Setup paths
$Input = "Path\Input.txt"
$Output = "Path\Output.txt"
#Create empty variables to set types
$Record=""
$Collection = @()
#Loop through text file
gc $Input | %{
    If($_ -match "^\d{1,2}/\d{1,2}/\d{4}" -and $record.split("|").count -eq 31){$collection+=$record;$record=$_}
    else{
        $record="$record$_"
        if($record.split("|").count -gt 31){
            $collection+=$regex.matches(($record.split("|")[0..30]) -join "|").groups[1].value
            $record=$regex2.matches(($record.split("|")[30..($record.split("|").count)]) -join "|").groups[1].value
        }
    }
}
#Add last record to the collection
$collection+=$record
#Output everything to a file
$collection|out-file $Output