Save state of a PowerShell script to continue it later - powershell

I'm trying to fetch some registry parameters from about 4,000 machines with PowerShell. There's no chance that all machines will be up at the same time, so I'd like to save the list of machines that have already been backed up, so the script only gets the parameters from machines that haven't been backed up yet.
I made a CSV file in the form Machine,Status, where the Machine column stores machine names and Status is 0 if the script hasn't run yet on that machine and 1 if the machine has already been backed up.
I'm already successfully parsing the machine names from the CSV where Status = 0 and running my backup script on all machines that are up at run time, but I can't figure out how to write Status = 1 back to the CSV (or, from what I understood after some reading, to a temporary CSV that later replaces the original one).
My code is like:
$csv = Import-Csv -Path ./list.csv
foreach ($line in $csv) {
    $status = $line.Status
    $machine = $line.Machine
    if ($status -eq "0") {
        [REG BACKUP SCRIPT]
        $status = 1 # Set the machine as done, but how to put it back into the CSV?
        Export-Csv -Path ./list-new.csv
    }
}
I don't know where to put Export-Csv, and my attempts so far haven't worked.
Thanks
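One approach (a minimal sketch, assuming the list.csv layout described above) is to modify the Status property on the imported objects themselves and export the whole collection once, after the loop:
$csv = Import-Csv -Path ./list.csv
foreach ($line in $csv) {
    if ($line.Status -eq "0") {
        # [REG BACKUP SCRIPT] for $line.Machine goes here; only flip the
        # status if the backup actually succeeded for this machine
        $line.Status = "1"   # update the imported object itself, not a copy of the value
    }
}
# Export once after the loop, then replace the original file
$csv | Export-Csv -Path ./list-new.csv -NoTypeInformation
Move-Item -Path ./list-new.csv -Destination ./list.csv -Force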

Related

PowerShell script to copy a file from a user's primary PC to their disaster recovery PC

I am looking to create a PowerShell script to copy a file from a user's primary PC to that user's disaster recovery PC. This is going to be done for over 100 people, so I was going to create a txt file with the users' production PC names and another list with their recovery PC names. I'm not sure how to match the PCs up across the two different lists. The file name always begins with the user's domain account (username.Primary), and I am not sure how to specify the username part since it changes for each user.
$ServerListProd = Get-Content "C:\path\computers.txt"
$ServerlistRecovery = Get-Content "C:\path\computers2.txt"
$SourceFileLocation = "c:\path\username.primary"
$Destination = "c:\path\"
foreach ($_ in $ServerList) {
    Copy-Item $SourceFileLocation -Destination \\$serverlistd\$Destination -Recurse -PassThru
}
Why not put everything in one CSV and then import it into PowerShell?
The CSV could look like this:
Primary;Recovery1;Recovery2;Recovery3
user1.Primary;user1.Recovery1;user1.Recovery2;user1.Recovery3;
user2.Primary;user2.Recovery1;user2.Recovery2;user2.Recovery3;
user3.Primary;user3.Recovery1;user3.Recovery2;user3.Recovery3;
user4.Primary;user4.Recovery1;user4.Recovery2;user4.Recovery3;
user5.Primary;user5.Recovery1;user5.Recovery2;user5.Recovery3;
Now you can import it into PowerShell and have the data in an array:
$pc = Import-Csv "path to CSV" -Delimiter ";"
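From there, a rough sketch of how the matched pairs could drive the copy (the paths below are placeholders for illustration, not taken from the question; adjust them to wherever the file actually lives and however the recovery PC is reached):
foreach ($row in $pc) {
    # Each row exposes the CSV columns as properties, so the primary and
    # recovery entries are already matched up on the same line.
    $source      = "C:\path\$($row.Primary)"      # placeholder source path
    $destination = "C:\path\$($row.Recovery1)"    # placeholder destination path
    Copy-Item -Path $source -Destination $destination -PassThru
}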

How to maintain session info in a PowerShell script

I have a script like the one below:
foreach ($var1 in (gc 1.txt)) {
    # my code logic
}
Here the file 1.txt contains a list of values like abc, xyz, pqr, etc.
If the script is stopped for any reason (a script issue or Ctrl+C), I need to be able to restart it from the last stopped session.
To be clear: if the script stopped while processing 'xyz', then when I restart it, it should continue from 'xyz' and not start over from 'abc'.
Please guide me on how to achieve this.
Thanks in advance,
Pavan kumar D
You need to add some counters and keep track of the last iteration using an index file.
Then make sure that you start reading the file where you were interrupted, using Select-Object -Skip.
<#
.NOTES
The file index.txt is created at the first run and is used to store
the last processed line of the file being worked through.
The index read from index.txt is used to skip the already processed
lines at the start of the foreach.
If the index doesn't need to be stored between sessions, use a
$global:Index variable instead of an index file to speed up the script
(or a RAM drive, not very common any more).
At each successful iteration, the index is incremented and then stored
in index.txt.
At startup the stored index is compared with the number of lines in
the file, and the script throws an error if the index is past the end
of the file.
Make sure to **clear** index.txt before starting the file processing
fresh.
#>
#initialize counters
[int]$StartLine = Get-Content .\index.txt -ErrorAction SilentlyContinue
if (-not $StartLine) { [int]$StartLine = 0 } # First run will have no index file
$Index = $StartLine
[int]$LastLineOfFile = (Get-Content 1.txt).Count - 1 # Arrays start at 0
if ($Index -gt $LastLineOfFile) { # Don't start if the index passed EOF in the last run
    Write-Error "Index passed end of file"
    return
}
foreach ($var in (Get-Content 1.txt | Select-Object -Skip $StartLine)) {
    "New loop: $Index" | Out-Host # starts at the stored index (0 on a fresh run)
    "Processing value: $var" | Out-Host
    ####
    # Processing here
    ####
    "Done processing value: $var" | Out-Host
    $Index++
    $Index > index.txt
    "Index incremented" | Out-Host
}
For a good article on using a RAM drive, see, e.g., How to Create RAM Disk in Windows 10 for Super-Fast Read and Write Speeds.
Windows Server does support RAM disks natively (kind of) by using the iSCSI Target Server.
See How to Create a RAM Disk on Windows Server?

Fastest way to copy files (but not the entire directory) from one location to another

Summary
I am currently tasked with migrating around 6TB of data to a cloud server, and am trying to optimise how fast this can be done.
I would use standard Robocopy to do this usually, but there is a requirement that I am to only transfer files that are present in a filetable in SQL, and not the entire directories (due to a lot of junk being inside these folders that we do not want to migrate).
What I have tried
Feeding in individual files from an array into Robocopy is unfeasibly slow, as Robocopy instances were being started sequentially for each file, so I tried to speed up this process in 2 ways.
It was pointless to have /MT set above 1 if only one file was being transferred, so I attempted to simulate the multithreading feature. I did this by utilising the new ForEach-Object -Parallel feature in PowerShell 7.0 and setting the throttle limit to 4. With this, I was able to pass the array in and run 4 Robocopy jobs in parallel (still starting and stopping for each file), which increased speed a bit.
Secondly, I split the array into 4 equal arrays and ran the above function across each array as a job, which again increased the speed by quite a bit. For clarity, I had 4 equal arrays fed to 4 ForEach-Object -Parallel code blocks, each running 4 Robocopy instances, so a total of 16 Robocopy instances at once.
Issues
I encountered a few problems.
My simulation of the multithreading feature did not behave in the way that the /MT flag works in Robocopy. When examining the processes running, my code executes 16 instances of Robocopy at once, whereas the normal /MT:16 flag of Robocopy would only kick off one Robocopy instance (but still be multithreading).
Secondly, the code causes a memory leak. The memory usage starts to increase while the jobs run and accumulates over time, until a large portion of memory is being utilised. When the jobs complete, the memory usage stays high until I close PowerShell and the memory is released. Normal Robocopy did not do this.
Finally, I decided to compare the time taken for my method, and then a standard Robocopy of the entire testing directory, and the normal Robocopy was still over 10x faster, and had a better success rate (a lot of the files weren’t copied over with my code, and a lot of the time I was receiving error messages that the files were currently in use and couldn’t be Robocopied, presumably because they were in the process of being Robocopied).
Are there any faster alternatives, or is there a way to manually create a multithreading instance of robocopy that would perform like the /MT flag of the standard robocopy? I appreciate any insight/alternative ways of looking at this. Thanks!
# Item(0) is the Source excluding the filename, Item(2) is the Destination, Item(1) is the filename
$robocopy0 = $tables.Tables[0].Rows
$robocopy1 = $tables.Tables[1].Rows
$robocopy0 | ForEach-Object -Parallel {
    robocopy $_.Item(0) $_.Item(2) $_.Item(1) /e /w:1 /r:1 /tee /NP /xo /mt:1 /njh /njs /ns
} -ThrottleLimit 4 -AsJob
$robocopy1 | ForEach-Object -Parallel {
    robocopy $_.Item(0) $_.Item(2) $_.Item(1) /e /w:1 /r:1 /tee /NP /xo /mt:1 /njh /njs /ns
} -ThrottleLimit 4 -AsJob
# *8 for 8 arrays
RunspaceFactory multithreading might be optimally suited for this type of work--with one HUGE caveat. There are quite a few articles out on the net about it. Essentially you create a scriptblock that takes parameters for the source file to copy and the destination to write to, and uses those parameters to execute robocopy. You create individual PowerShell instances to execute each variant of the scriptblock and append them to the RunspaceFactory. The RunspaceFactory will queue up the jobs and work through the (probably millions of) queued jobs X at a time, where X is the number of threads you allocate to the factory.
CAVEAT: First and foremost, to queue up millions of jobs relative to the probable millions of files you have across 6TB, you'll likely need monumental amounts of memory. Assuming an average path length for source and destination of 40 characters (probably very generous) times a WAG of 50 million files is nearly 4GB in memory by itself, which doesn't include object structural overhead, the PowerShell instances, etc. You can overcome this either by breaking up the job into smaller chunks or by using a server with 128GB RAM or better. Additionally, if you don't terminate the jobs once they've been processed, you'll also experience what appears to be a memory leak, but is really just your jobs holding output that you never collect and release when they complete.
Here's a sample from a recent project I did migrating files from an old domain NAS to a new domain NAS -- I'm using Quest SecureCopy instead of RoboCopy but you should be able to easily replace those bits:
## MaxThreads is an arbitrary number I use relative to the hardware I have available to run jobs I'm working on.
$FileRSpace_MaxThreads = 15
$FileRSpace = [runspacefactory]::CreateRunspacePool(1, $FileRSpace_MaxThreads, ([System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()), $Host)
$FileRSpace.ApartmentState = 'MTA'
$FileRSpace.Open()

## The scriptblock that does the actual work.
$sb = {
    param(
        $sp,
        $dp
    )
    ## This is my output object I'll emit through STDOUT so I can consume the status of the job in the main thread after each instance is completed.
    $n = [pscustomobject]@{
        'source'  = $sp
        'dest'    = $dp
        'status'  = $null
        'sdtm'    = [datetime]::Now
        'edtm'    = $null
        'elapsed' = $null
    }
    ## Remove the Import-Module and SecureCopy cmdlet and replace it with the RoboCopy version
    try {
        Import-Module "C:\Program Files\Quest\Secure Copy 7\SCYPowerShellCore.dll" -ErrorAction Stop
        Start-SecureCopyJob -Database "C:\Program Files\Quest\Secure Copy 7\SecureCopy.ssd" -JobName "Default" -Source $sp -Target $dp -CopySubFolders $true -Quiet $true -ErrorAction Stop | Out-Null
        $n.status = $true
    } catch {
        $n.status = $_
    }
    $n.edtm = [datetime]::Now
    $n.elapsed = ("{0:N2} minutes" -f (($n.edtm - $n.sdtm).TotalMinutes))
    $n
}

## The array to hold the individual runspaces and ultimately iterate over to watch for completion.
$FileWorkers = @()
$js = [datetime]::now
log "Job starting at $js"

## $peers is a [pscustomobject] array I pre-create that just contains every source (property 's') and the destination (property 'd') -- modify to suit your needs as necessary
foreach ($c in $peers) {
    try {
        log "Configuring migration job for '$($c.s)' and '$($c.d)'"
        $runspace = [powershell]::Create()
        [void]$runspace.AddScript($sb)
        [void]$runspace.AddArgument($c.s)
        [void]$runspace.AddArgument($c.d)
        $runspace.RunspacePool = $FileRSpace
        $FileWorkers += [pscustomobject]@{
            'Pipe'  = $runspace
            'Async' = $runspace.BeginInvoke()
        }
        log "Successfully created a multi-threading job for '$($c.s)' and '$($c.d)'"
    } catch {
        log "An error occurred creating a multi-threading job for '$($c.s)' and '$($c.d)'"
    }
}

while ($FileWorkers.Async.IsCompleted -contains $false) {
    $Completed = $FileWorkers | ? { $_.Async.IsCompleted -eq $true }
    [pscustomobject]@{
        'Numbers'      = ("{0}/{1}" -f $Completed.Count, $FileWorkers.Count)
        'PercComplete' = ("{0:P2}" -f ($Completed.Count / $FileWorkers.Count))
        'ElapsedMins'  = ("{0:N2}" -f ([datetime]::Now - $js).TotalMinutes)
    }
    $Completed | % { $_.Pipe.EndInvoke($_.Async) } | Export-Csv -NoTypeInformation ".\$($DtmStamp)_SecureCopy_Results.csv"
    Start-Sleep -Seconds 15
}

## This is to handle a race condition where the final job(s) aren't completed before the sleep but are when the while is re-evaluated
$FileWorkers | % { $_.Pipe.EndInvoke($_.Async) } | Export-Csv -NoTypeInformation ".\$($DtmStamp)_SecureCopy_Results.csv"
Suggested strategies, if you don't have a beefy server to queue up all the jobs simultaneously, are either to batch out the files in statically sized blocks (e.g. 100,000 or whatever your hardware can take), or to group files together per scriptblock (e.g. 100 files per scriptblock), which would minimize the number of jobs queued in the runspace factory (but would require some code changes). A rough sketch of the first option follows.
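For illustration only (the batch size and the Invoke-CopyBatch wrapper below are hypothetical, standing in for the queue/drain code shown above):
# Hypothetical sketch: break $peers into fixed-size batches so only one
# batch of jobs is ever queued in the runspace pool at a time.
$BatchSize = 100000
for ($i = 0; $i -lt $peers.Count; $i += $BatchSize) {
    $end   = [math]::Min($i + $BatchSize - 1, $peers.Count - 1)
    $batch = $peers[$i..$end]

    # Invoke-CopyBatch is a hypothetical wrapper around the foreach/while
    # blocks above: queue the batch, wait for completion, EndInvoke and
    # dispose each pipe so the results don't accumulate in memory.
    Invoke-CopyBatch -Peers $batch -Pool $FileRSpace
}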
HTH
Edit 1: To address constructing the input object I'm using:
$destRoot = '\\destinationserver.com\share'
$peers = @()
$children = @()
$children += (Get-ChildItem '\\sourceserver\share' -Force) | Select -ExpandProperty FullName
foreach ($c in $children) {
    $peers += [pscustomobject]@{
        's' = $c
        'd' = "$($destRoot)\$($c.Split('\')[3])\$($c | Split-Path -Leaf)"
    }
}
In my case, I was taking stuff from \\server1\share1\subfolder1 and moving it to something like \\server2\share1\subfolder1\subfolder2. So in essence, all the '$peers' array is doing is pairing the full name of each source target with the corresponding destination path (since the source/dest server names are different, and possibly the share names too).
You don't have to do this; you can dynamically construct the destination and just loop through the source folders. I perform this extra step because it gives me a two-property array that I can verify is pre-constructed accurately, as well as run tests against to ensure things exist and are accessible.
There is a lot of extra bloat in my script due to custom objects meant to give me output from each thread in the multi-threader, so I can see the status of each copy attempt and track things like which folders were successful and how long each individual copy took. If you're using robocopy and dumping the results to a text file, you may not need this. If you want me to pare the script down to its barebones components just to get things multi-threading, I can do that if you like.

Powershell script to detect log file failures

I run perfmon on a server and the log files go into a c:\perfmon folder. The perfmon task restarts each week and the log files just collect in there over time, so there could be a range of CSV files in that folder with different dates.
I would like to write a PowerShell script that will check that folder to make sure there is a file in it with today's modified date. If there isn't one for today, I would like a FAILED email so that I know to look at perfmon for issues.
Has anyone written anything like this? I have tried several of the scripts on here and none do exactly what I want. This is what I have so far, based on other scripts.
It sort of works, but it is checking all files and responding for all files as well. If I had 3 files over the last three days and none for today, I would get 3 emails saying FAIL. If I have one for today and two older ones, I get 3 OK emails. If I have just one file for today, I get one OK email. How do I restrict this to just one email for a fail or a success? There could be 50-100 files in that folder after two years, and I just want a FAIL if none of them were modified today.
Hope that all makes sense. I'm afraid my PowerShell skills are very weak.
$EmailTech = @{
    To = 'a@a.com'
    SmtpServer = 'relayserver'
    From = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)
Get-ChildItem -Path c:\perflogs\logs | ForEach-Object {
    $Files = (Get-ChildItem -Path c:\perflogs\logs\*.csv | Where-Object {$_.LastWriteTime -gt $CompareDate} | Measure-Object).Count
    if ($Files -eq 0)
    {
        $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
        $EmailTech.Body = 'A performance monitor log file was not found on ServerA for today'
        Send-MailMessage @EmailTech
    }
    Else
    {
        # If we found files it's ok and we don't report it
        $EmailTech.Subject = 'Perfmon File ServerA - OK'
        $EmailTech.Body = 'A performance monitor log file was found on ServerA for today'
        Send-MailMessage @EmailTech
    }
}
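One way to get a single message per run (a minimal sketch, reusing the same splat, folder, and date check as above) is to do the count once, outside any per-file loop, and send exactly one email based on the result:
$EmailTech = @{
    To         = 'a@a.com'
    SmtpServer = 'relayserver'
    From       = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)

# Count matching files once, instead of once per file in the folder
$Files = (Get-ChildItem -Path c:\perflogs\logs\*.csv |
          Where-Object { $_.LastWriteTime -gt $CompareDate } |
          Measure-Object).Count

if ($Files -eq 0) {
    $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
    $EmailTech.Body    = 'A performance monitor log file was not found on ServerA for today'
}
else {
    $EmailTech.Subject = 'Perfmon File ServerA - OK'
    $EmailTech.Body    = 'A performance monitor log file was found on ServerA for today'
}

# Exactly one email per run, whichever branch was taken
Send-MailMessage @EmailTech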

Get virtual SCSI hardware on servers using Powershell

I'm trying to use PowerShell to get the SCSI hardware from several virtual servers and to get the operating system of each specific server. I've managed to get the specific SCSI hardware that I want with my code; however, I'm unable to figure out how to properly get the operating system of each of the servers. Also, I'm trying to send all the data that I find to a CSV log file, but I'm unsure how to make a PowerShell script create multiple columns.
Here is my code (almost works but something's wrong):
$log = "C:\Users\me\Documents\Scripts\ScsiLog.csv"
Get-VM | Foreach-Object {
    $vm = $_
    Get-ScsiController -VM $vm | Where-Object { $_.Type -eq "VirtualBusLogic" } | Foreach-Object {
        Get-VMGuest -VM $vm
    } | Foreach-Object {
        Write-Output $vm.Guest.VmName >> $log
    }
}
I don't receive any errors when I run this code; however, whenever I run it I only get the names of the servers and not the OS. Also, I'm not sure what I need to do to make the OS appear in a different column from the name of the server in the CSV log that I'm creating.
What do I need to change in my code to get the OS version of each virtual machine and output it in a different column in my csv log file?
Get-VMGuest returns a VMGuest object, documented here: http://www.vmware.com/support/developer/PowerCLI/PowerCLI51/html/VMGuest.html. The documentation is sparse, but I would guess the OSFullName field gives you the OS version. So you could change the Write-Output line to
add-content $log "$($vm.Guest.VmName) , $($vm.Guest.OSFullName)"
and you'd be on the right track. The comma in the output is what makes the output "comma separated values". A CSV file can optionally have a header. (See Import-CSV / Export-CSV help for details). So you might want to add the following as the second line of your script:
add-content $log "VM Name, OS Name" # add CSV header line
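Put together, the amended script might look something like this (a sketch based on the code above; it assumes, per the linked VMGuest documentation, that the guest object exposes OSFullName):
$log = "C:\Users\me\Documents\Scripts\ScsiLog.csv"
Add-Content $log "VM Name, OS Name"   # CSV header line

Get-VM | ForEach-Object {
    $vm = $_
    Get-ScsiController -VM $vm | Where-Object { $_.Type -eq "VirtualBusLogic" } | ForEach-Object {
        # Two comma-separated columns: machine name, then guest OS
        Add-Content $log "$($vm.Guest.VmName) , $($vm.Guest.OSFullName)"
    }
}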