PowerShell: copy files from different directories

I have a script that copies files from one server to another.
I have three different source locations on one server, and I want to copy each of them to another server, creating a folder for each source location and copying its contents into it.
I got this working, but I had to declare a separate variable for each source and each destination folder.
I want to create one variable that holds every source location, and have the script copy everything to the correct destination automatically.
Would defining $server = "\path1, \path2, \path3" do it, so it could go into a foreach loop, iterate over each path, and copy and paste?
If so, how do I define the destinations if I have one folder with three subfolders, each corresponding to one source?
For example, \path1 should always put items in path1destination, \path2 should always put items in path2destination, and so on. Basically I want to correlate each source path with a specific destination path, using as few variables as possible.
Can anyone provide ideas on how to tackle this? My code works, but I had to define $path1, $path2, $path3 and so on and then loop over each of them, which works but I need to make it cleaner with fewer lines of code.
$server1 = "C:\Users\nicolae.calimanu\Documents\B\"
$server2 = "C:\Users\nicolae.calimanu\Documents\A\" # UNC Path.
$datetime = Get-Date -Format "MMddyyyy-HHmmss"
$server3 = "C:\Users\nicolae.calimanu\Documents\C\" # UNC Path.

foreach ($server1 in gci $server1 -Recurse) {
    Copy-Item -Path $server1.FullName -Destination $server2
}

ForEach ($server2 in $server2) {
    $curDateTime = Get-Date -Format yyyyMMdd-HHmmss
    Get-ChildItem $server2 -Recurse |
        Rename-Item -NewName { $_.BaseName + '_' + $curDateTime + $_.Extension }
}

foreach ($server2 in gci $server2 -Recurse) {
    Move-Item -Path $server2 -Destination "C:\Users\nicolae.calimanu\Documents\C"
}

Use a hashtable to create a key-value store for each source and destination. Like so,
# Create entries for each source and destination
$ht = @{}

$o = New-Object PSObject -Property @{
    from = "\\serverA\source"
    to   = "\\serverB\destination"
}
$ht.Add($o.from, $o)

$o = New-Object PSObject -Property @{
    from = "\\serverC\source"
    to   = "\\serverB\destination2"
}
$ht.Add($o.from, $o)

$o = New-Object PSObject -Property @{
    from = "\\servera\source2"
    to   = "\\serverC\destination"
}
$ht.Add($o.from, $o)

# Iterate the collection. For demo, print the copy commands
foreach ($server in $ht.Keys) {
    $cmd = "copy-item {0} {1}" -f $ht.Item($server).from, $ht.Item($server).to
    $cmd
}
# Sample output
copy-item \\serverA\source \\serverB\destination
copy-item \\servera\source2 \\serverC\destination
copy-item \\serverC\source \\serverB\destination2
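If a full object per entry feels heavier than needed, a plain hashtable mapping each source to its destination also works. Here is a minimal sketch (the paths below are placeholders) that loops over the map, creates the destination folder if needed, and copies the contents:
# Map each source path to its destination (placeholder paths)
$map = @{
    '\\serverA\source'  = '\\serverB\destination'
    '\\serverC\source'  = '\\serverB\destination2'
    '\\serverA\source2' = '\\serverC\destination'
}

foreach ($source in $map.Keys) {
    $destination = $map[$source]

    # Create the destination folder if it does not exist yet
    if (-not (Test-Path -LiteralPath $destination)) {
        New-Item -Path $destination -ItemType Directory | Out-Null
    }

    # Copy the folder contents recursively into the mapped destination
    Copy-Item -Path (Join-Path $source '*') -Destination $destination -Recurse -Force
}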

Related

Zip multiple directories in PowerShell using .NET classes instead of Compress-Archive

I am trying to use .NET classes instead of the native Compress-Archive to zip multiple directories (each containing sub-directories and files), as Compress-Archive gives me an occasional OutOfMemoryException.
Some articles suggest that the .NET classes are a more efficient approach.
My tools directory $toolsDir = 'C:\Users\Public\LocalTools' has more than one directory that needs to be zipped (please note everything is a directory, not a file) - whichever directory matches the regex pattern in the code.
Below is my code:
$cmpname = $env:computername
$now = $(Get-Date -Format yyyyMMddmmhhss)
$pattern = '^(19|[2-9][0-9])\d{2}\-(0?[1-9]|1[012])\-(0[1-9]|[12]\d|3[01])T((?:[01]\d|2[0-3])\;[0-5]\d\;[0-5]\d)\.(\d{3}Z)\-' + [regex]::Escape($cmpname)
$toolsDir = 'C:\Users\Public\LocalTools'
$destPathZip = "C:\Users\Public\ToolsOutput.zip"

Add-Type -AssemblyName System.IO.Compression
Add-Type -AssemblyName System.IO.Compression.FileSystem

$CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
$IncludeBaseDirectory = $false

$stream = New-Object System.IO.FileStream($destPathZip, [System.IO.FileMode]::OpenOrCreate)
$zip = New-Object System.IO.Compression.ZipArchive($stream, 'Update')

$res = Get-ChildItem "${toolsDir}" | Where-Object { $_.Name -match "${pattern}" }
if ($res -ne $null) {
    foreach ($dir in $res) {
        $source = "${toolsDir}\${dir}"
        [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($destPathZip, $source, (Split-Path $source -Leaf), $CompressionLevel)
    }
}
else {
    Write-Host "Nothing to Archive!"
}
The above code gives me this error:
When I researched [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile, I found that it is used to add files to a zip archive that has already been created. Is that the reason for the error I am getting?
I also tried [System.IO.Compression.ZipFile]::CreateFromDirectory($source, $destPathZip, $CompressionLevel, $IncludeBaseDirectory) instead of [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($destPathZip, $source, (Split-Path $source -Leaf), $CompressionLevel).
That gives me a "The file 'C:\Users\Public\ToolsOutput.zip' already exists" error.
How can I change the code to add multiple directories to the zip file?
There are 3 problems with your code currently:
The first argument passed to CreateEntryFromFile() must be the ZipArchive object to which the new entry is added - in your case you'll want to pass the $zip you've already created for this purpose.
CreateEntryFromFile() only creates 1 entry for 1 file per call - to recreate a whole directory substructure you need to calculate the correct entry path for each file, e.g. subdirectory/subsubdirectory/file.exe
You need to properly dispose of both the ZipArchive and the underlying file stream instances in order for the data to be persisted to disk. For this, you'll need a try/finally statement.
Additionally, there's no need to create the file if there are no files to archive :)
$cmpname = $env:computername
$pattern = '^(19|[2-9][0-9])\d{2}\-(0?[1-9]|1[012])\-(0[1-9]|[12]\d|3[01])T((?:[01]\d|2[0-3])\;[0-5]\d\;[0-5]\d)\.(\d{3}Z)\-' + [regex]::Escape($cmpname)
$toolsDir = 'C:\Users\Public\LocalTools'
$destPathZip = "C:\Users\Public\ToolsOutput.zip"

Add-Type -AssemblyName System.IO.Compression
Add-Type -AssemblyName System.IO.Compression.FileSystem

$CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal

$res = Get-ChildItem -LiteralPath $toolsDir | Where-Object { $_.Name -match $pattern }
if ($res) {
    try {
        # Create file + zip archive instances
        $stream = New-Object System.IO.FileStream($destPathZip, [System.IO.FileMode]::OpenOrCreate)
        $zip = New-Object System.IO.Compression.ZipArchive($stream, [System.IO.Compression.ZipArchiveMode]::Update)

        # Discover all files to archive
        foreach ($file in $res | Get-ChildItem -File -Recurse) {
            $source = $file.FullName

            # Calculate the correct relative path for the archive entry
            $relativeFilePath = [System.IO.Path]::GetRelativePath($toolsDir, $source)
            $entryName = $relativeFilePath.Replace('\', '/')

            # Make sure the first argument to CreateEntryFromFile is the ZipArchive object
            [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $source, $entryName, $CompressionLevel)
        }
    }
    finally {
        # Clean up in reverse order
        $zip, $stream | Where-Object { $_ -is [System.IDisposable] } | ForEach-Object Dispose
    }
}
else {
    Write-Host "Nothing to Archive!"
}
Calling Dispose() on $zip causes it to flush any unwritten modifications to the underlying file stream and free any additional file handles it might have acquired, whereas calling Dispose() on the underlying file stream flushes those changes to disk and closes the file handle.
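As a side note, if you don't need the explicit FileStream, [System.IO.Compression.ZipFile]::Open() hands you a ZipArchive directly and leaves only one object to dispose. A minimal sketch of that variant, reusing the $toolsDir, $destPathZip, $pattern and $CompressionLevel values defined above:
Add-Type -AssemblyName System.IO.Compression.FileSystem

$zip = [System.IO.Compression.ZipFile]::Open($destPathZip, [System.IO.Compression.ZipArchiveMode]::Update)
try {
    Get-ChildItem -LiteralPath $toolsDir |
        Where-Object { $_.Name -match $pattern } |
        Get-ChildItem -File -Recurse |
        ForEach-Object {
            # Entry names are relative to the tools directory, with forward slashes
            $entryName = $_.FullName.Substring($toolsDir.Length).TrimStart('\').Replace('\', '/')
            [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $_.FullName, $entryName, $CompressionLevel)
        }
}
finally {
    $zip.Dispose()
}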

Compare directories exactly - including moved files

My aim is to compare two directories exactly - including the structure of the directories and sub-directories.
I need this because I want to monitor whether something in the folder E:\path2 has changed. For this purpose a copy of the full folder is kept in C:\path1. If someone changes something, it has to be done in both directories.
It is important for us because if something in the directory is changed (accidentally or not), it could break other functions in our infrastructure.
This is the script I've already written:
# Compare files for "copy default folder"
# This script compares the files and folders which are synced to every client.
# Source: https://mcpmag.com/articles/2016/04/14/contents-of-two-folders-with-powershell.aspx

# 1. Compare content and name of every file recursively
$SourceDocsHash = Get-ChildItem -Recurse -Path C:\path1 | foreach { Get-FileHash -Path $_.FullName }
$DestDocsHash = Get-ChildItem -Recurse -Path E:\path2 | foreach { Get-FileHash -Path $_.FullName }
$ResultDocsHash = (Compare-Object -ReferenceObject $SourceDocsHash -DifferenceObject $DestDocsHash -Property hash -PassThru).Path

# 2. Compare name of every folder recursively
$SourceFolders = Get-ChildItem -Recurse -Path C:\path1 #| where {!$_.PSIsContainer}
$DestFolders = Get-ChildItem -Recurse -Path E:\path2 #| where {!$_.PSIsContainer}
$CompareFolders = Compare-Object -ReferenceObject $SourceFolders -DifferenceObject $DestFolders -PassThru -Property Name
$ResultFolders = $CompareFolders | Select-Object FullName

# 3. Check if UNC path is reachable
# Source: https://stackoverflow.com/questions/8095638/how-do-i-negate-a-condition-in-powershell
# Print out if the UNC path is not available.
if (-Not (Test-Path \\bb-srv-025.ftscu.be\DIP$\Settings\ftsCube\default-folder-on-client\00_ftsCube)) {
    $UNCpathReachable = "UNC-Path not reachable and maybe"
}

# 4. Count files for statistics
# Source: https://stackoverflow.com/questions/14714284/count-items-in-a-folder-with-powershell
$count = (Get-ChildItem -Recurse -Path E:\path2 | Measure-Object).Count

# FINAL: Print out result for check_mk
if ($ResultDocsHash -Or $ResultFolders -Or $UNCpathReachable) {
    echo "2 copy-default-folders-C-00_ftsCube files-and-folders-count=$count CRITICAL - $UNCpathReachable the following files or folders have been changed: $ResultDocs $ResultFolders (none if empty after ':')"
}
else {
    echo "0 copy-default-folders-C-00_ftsCube files-and-folders-count=$count OK - no files have changed"
}
I know the output is not perfectly formatted, but it's OK. :-)
This script spots the following changes successfully:
create a new folder or new file
rename a folder or file -> it is shown as an error, but the output is empty. I can live with that. But maybe someone sees the reason. :-)
delete a folder or file
change file content
This script does NOT spot the following change:
move a folder or file to another sub-folder. The script still says "everything OK"
I've tried a lot of things, but could not solve this.
Can anyone help me extend the script to spot a moved folder or file?
I think your best bet is to use the .NET FileSystemWatcher class. It's not trivial to implement an advanced function that uses it, but I think it will simplify things for you.
I used the article Tracking Changes to a Folder Using PowerShell when I was learning this class. The author's code is below. I cleaned it up as little as I could stand. (That publishing platform's code formatting hurts my eyes.)
I think you want to run it like this.
New-FileSystemWatcher -Path E:\path2 -Recurse
I could be wrong.
Function New-FileSystemWatcher {
    [cmdletbinding()]
    Param (
        [parameter()]
        [string]$Path,

        [parameter()]
        [ValidateSet('Changed', 'Created', 'Deleted', 'Renamed')]
        [string[]]$EventName,

        [parameter()]
        [string]$Filter,

        [parameter()]
        [System.IO.NotifyFilters]$NotifyFilter,

        [parameter()]
        [switch]$Recurse,

        [parameter()]
        [scriptblock]$Action
    )

    $FileSystemWatcher = New-Object System.IO.FileSystemWatcher

    If (-NOT $PSBoundParameters.ContainsKey('Path')) {
        $Path = $PWD
    }
    $FileSystemWatcher.Path = $Path

    If ($PSBoundParameters.ContainsKey('Filter')) {
        $FileSystemWatcher.Filter = $Filter
    }
    If ($PSBoundParameters.ContainsKey('NotifyFilter')) {
        $FileSystemWatcher.NotifyFilter = $NotifyFilter
    }
    If ($PSBoundParameters.ContainsKey('Recurse')) {
        $FileSystemWatcher.IncludeSubdirectories = $True
    }
    If (-NOT $PSBoundParameters.ContainsKey('EventName')) {
        $EventName = 'Changed', 'Created', 'Deleted', 'Renamed'
    }
    If (-NOT $PSBoundParameters.ContainsKey('Action')) {
        $Action = {
            Switch ($Event.SourceEventArgs.ChangeType) {
                'Renamed' {
                    $Object = "{0} was {1} to {2} at {3}" -f $Event.SourceArgs[-1].OldFullPath,
                    $Event.SourceEventArgs.ChangeType,
                    $Event.SourceArgs[-1].FullPath,
                    $Event.TimeGenerated
                }
                Default {
                    $Object = "{0} was {1} at {2}" -f $Event.SourceEventArgs.FullPath,
                    $Event.SourceEventArgs.ChangeType,
                    $Event.TimeGenerated
                }
            }
            $WriteHostParams = @{
                ForegroundColor = 'Green'
                BackgroundColor = 'Black'
                Object          = $Object
            }
            Write-Host @WriteHostParams
        }
    }

    $ObjectEventParams = @{
        InputObject = $FileSystemWatcher
        Action      = $Action
    }

    ForEach ($Item in $EventName) {
        $ObjectEventParams.EventName = $Item
        $ObjectEventParams.SourceIdentifier = "File.$($Item)"
        Write-Verbose "Starting watcher for Event: $($Item)"
        $Null = Register-ObjectEvent @ObjectEventParams
    }
}
I don't think any example I've found online tells you how to stop watching the filesystem. The simplest way is to just close your PowerShell window. But I always seem to have 15 tabs open in each of five PowerShell windows, and closing one of them is a nuisance.
Instead, you can use Get-Job to get the Id of the registered events, then use Unregister-Event -SubscriptionId n to, well, unregister the event, where n is the number found in the Id property returned by Get-Job.
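For example, to tear down the watchers created by the function above (it registers them with the "File.*" source identifiers), something along these lines should work:
# List the event subscriptions created by New-FileSystemWatcher
Get-EventSubscriber | Where-Object SourceIdentifier -like 'File.*'

# Unregister them and remove the corresponding event jobs
Get-EventSubscriber | Where-Object SourceIdentifier -like 'File.*' | Unregister-Event
Get-Job | Where-Object Name -like 'File.*' | Remove-Job -Force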
So basically you want to synchronize the two folders and track all the changes made to them:
I would suggest you use the
Sync-Folder script
or
FreeFileSync.
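If you would rather stay with the original script's approach, one way to also catch moved files is to compare the relative path together with the file hash, so an identical file sitting in a different sub-folder still shows up as a difference. A rough sketch, assuming the same C:\path1 and E:\path2 folders as above:
$source = 'C:\path1'
$dest   = 'E:\path2'

# Build "relative path + hash" records for both trees
$sourceItems = Get-ChildItem -Path $source -Recurse -File | ForEach-Object {
    [pscustomobject]@{
        RelativePath = $_.FullName.Substring($source.Length)
        Hash         = (Get-FileHash -Path $_.FullName).Hash
    }
}
$destItems = Get-ChildItem -Path $dest -Recurse -File | ForEach-Object {
    [pscustomobject]@{
        RelativePath = $_.FullName.Substring($dest.Length)
        Hash         = (Get-FileHash -Path $_.FullName).Hash
    }
}

# Any difference in either property means a file was changed, added, removed, moved or renamed
Compare-Object -ReferenceObject $sourceItems -DifferenceObject $destItems -Property RelativePath, Hash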

File SDDL not equal although it should be

We're trying to compare NTFS permissions for files or folders using the SDDL attribute. The only thing we're interested in is whether the ACL is equal or not, by using the SDDL and not other methods like AccessToString or simply comparing two plain ACL objects. This is because we had issues in the past with the standard way of doing this.
So we now run into an issue where File1 and File2 have exactly the same permissions when checking the Advanced Permissions tab in Windows. However, the SDDL says they're not equal, even though we strip the owner (O:) part from the SDDL string as indicated here, since the owner doesn't interest us.
The code:
Function Test-ACLequal {
    Param (
        $Source,
        $Target
    )
    $CompParams = @{
        ReferenceObject = Get-Acl -LiteralPath $Source
        PassThru        = $True
    }
    $CompParams.DifferenceObject = Get-Acl -LiteralPath $Target

    $AccessParams = @{
        ReferenceObject  = ($CompParams.ReferenceObject.sddl -split 'G:', 2 | Select -Last 1)
        DifferenceObject = ($CompParams.DifferenceObject.sddl -split 'G:', 2 | Select -Last 1)
        PassThru         = $True
    }
    if (Compare-Object @AccessParams) {
        Write-Verbose 'Test-ACLequalHC: Not equal'
        $false
    }
    else {
        Write-Verbose 'Test-ACLequalHC: Equal'
        $True
    }
}

Test-ACLequal -Source $File1 -Target $File2
You can clearly see there is a difference between both files:
$AccessParams.ReferenceObject
DUD:(A;ID;FA;;;BA)(A;ID;0x1200a9;;;S-1-5-21-1078081533-261478967-839522115-243052)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-280880)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696733)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696745)
$AccessParams.DifferenceObject
DUD:AI(A;ID;FA;;;BA)(A;ID;0x1200a9;;;S-1-5-21-1078081533-261478967-839522115-243052)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-280880)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696733)(A;ID;0x1301ff;;;S-1-5-21-1078081533-261478967-839522115-696745)
Is there a way to compare files by using the SDDL without running into this issue?
Does using .Equals work for you here?
$sourceAcl = Get-Acl $source
$targetAcl = Get-Acl $target

if ($sourceAcl.sddl.Equals($targetAcl.sddl)) {
    # Do something
    ....
}
This includes the owner however. In your example where you're removing it, you're also converting the object to a string, so using Compare-Object isn't really necessary. I'm also not sure how safe the split you're using is. You could also do:
$sourceAcl = Get-Acl $source
$targetAcl = Get-Acl $target

$s = $sourceAcl.sddl -replace "^O:[^:]+:", ""
$t = $targetAcl.sddl -replace "^O:[^:]+:", ""

if ($s -eq $t) {
    # Do something
    ....
}
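For illustration, this is roughly what that replacement strips from an SDDL string (the owner and group values below are made up):
# Hypothetical SDDL: owner BA, group DU, then the DACL
$sample = 'O:BAG:DUD:AI(A;ID;FA;;;BA)'

# Removes the leading "O:<owner>" part together with the "G:" marker,
# leaving the group SID and the DACL
$sample -replace '^O:[^:]+:', ''
# -> DUD:AI(A;ID;FA;;;BA)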

Extract year from a line in a text file and move the torrent into the corresponding year folder

This question is tricky because it is an evolution of my previous question.
To move torrents into folders I use this PowerShell script:
$ToFolder = "$env:USERPROFILE\Desktop\to"
$FromFolder = "$env:USERPROFILE\Desktop\From"

# Create the sample folder on your desktop
# This line can be commented out if your ToFolder exists
New-Item $ToFolder -ItemType directory -Force

GCI -Path $FromFolder *.torrent | % {
    if ($_.Name -match "(19|20)\d{2}") {
        # Check to see if the year folder already exists at the destination
        # If not, then create a folder based on this year
        if (!(Test-Path "$ToFolder\$($Matches[0])")) {
            New-Item -Path "$ToFolder\$($Matches[0])" -ItemType directory
        }
        # Transfer the matching file to its new folder
        # Can be changed to Move-Item if happy with the results
        Move-Item -Path $_.FullName -Destination "$ToFolder\$($Matches[0])" -Force
    }
}
but in my NEW situation I must extract the year from a .txt text file.
Example list of .torrent files inside a folder:
Caccia Spietata.torrent
Caccia Zero terrore del Pacifico.torrent
Caccia.A.Ottobre.Rosso.torrent
Cacciatore Bianco Cuore Nero.torrent
Cacciatore di Ex.torrent
Cacciatori Di Zombie.torrent
Example of the list of strings in the text file:
Caccia grossa a casa di Topolino (2006)
Caccia selvaggia [HD] (1981)
Caccia spietata (2006)
Cacciatori Di Zombie (2005)
What must the script do?
A. Extract the year from the string in the text file (every string is on a single row, because the text file is a list).
N.B. The script should compare the torrent file names with the strings in the text file list.
Caccia spietata (2006)
Extracting the year is possible only for equal text or very, very similar text, like
Caccia Spietata.torrent
Caccia spietata (2006)
If I have
caccia.spietata.torrent
caccia SPiETata (2006)
these are very similar strings to me.
B. Make a folder
2006
C. Move the torrent
Caccia Spietata.torrent
into folder 2006.
I want this solution because I have many .torrent file names without a year, so I must reorder them correctly by year.
Thanks for any help.
The first hurdle is parsing the dates and names out of the text file. You then add them to an array of movie objects.
$movies = @()
Get-Content C:\Path\Test4.txt | ForEach-Object {
    $properties = @{
        date = $_.Substring($_.IndexOf("(") + 1, 4)
        name = $_.Substring(0, $_.IndexOf("("))
    }
    $movies += New-Object PSObject -Property $properties
}
$movies
Once you have the movie names and dates separate, you loop through each movie and create a folder if it does not exist.
foreach ($movie in $movies) {
    $movie.date
    $datePath = "C:\Path\$($movie.date)"
    if (-not (Test-Path $datePath)) {
        New-Item $datePath -ItemType "directory"
    }
After that, you can split the name into key words based on whitespace.
$words = $movie.name -split '\s'
$words
Below is as far as I've gotten during a break of mine. The next step seems a bit complicated, as you then have to match the torrent files to the objects in the array based on keywords. It will be hard to construct such a filter without access to the raw data. My first thought would be to match based on fileName.torrent -like "*word*", but it looks like there are a ton of duplicate words. The next option is to match on multiple words, or maybe only use words that are not common (exclude "caccia", articles, etc.). Either way, that should move you a bit closer to your goal. Maybe someone else can help finish, or I can revisit it during another break.
$movies = @()
Get-Content C:\Path\Test4.txt | ForEach-Object {
    $properties = @{
        date = $_.Substring($_.IndexOf("(") + 1, 4)
        name = $_.Substring(0, $_.IndexOf("("))
    }
    $movies += New-Object PSObject -Property $properties
}
$movies

foreach ($movie in $movies) {
    $movie.date
    $datePath = "C:\Path\$($movie.date)"
    if (-not (Test-Path $datePath)) {
        New-Item $datePath -ItemType "directory"
    }
    $words = $movie.name -split '\s'
    $words
    # this is as far as I got
}
UPDATE
I've added a bit that we talked about in comments. Most of the changes are at the bottom of the script.
$movies = @()
Get-Content $Path\Test4.txt | ForEach-Object {
    $properties = @{
        date = $_.Substring($_.IndexOf("(") + 1, 4)
        name = $_.Substring(0, $_.IndexOf("("))
    }
    Write-Host $properties.date
    Write-Host $properties.name
    $movies += New-Object PSObject -Property $properties
}

# no significant changes were made above this point
$torrentFiles = dir $torrentPath

foreach ($movie in $movies) {
    $datePath = "$Path\$($movie.date)"
    if (-not (Test-Path $datePath)) {
        New-Item $datePath -ItemType "directory"
    }
    $words = ($movie.name -split '\s') | ? { $_.Length -gt 1 }

    # this is as far as I got last time; most of the changes are below, though I did change
    # just a bit above

    # This sets the number of words which needs to match. Currently, it has to match
    # on all words. If you wanted, you could set it to a static number (2)
    # or do something like $words.Count - 1. There is a commented-out example of
    # such a solution.
    $significant = $words.Count
    # if($words.Count -eq 1){ $significant = 1 }
    # else{ $significant = ($words.Count - 1) }

    # Here you loop through the torrent files, finding files whose base names have a
    # significant number of words matching the string
    foreach ($torrentFile in $torrentFiles) {
        $matchingWords = 0
        foreach ($word in $words) {
            if ($torrentFile.BaseName -match $word) {
                $matchingWords += 1
            }
        }
        if ($matchingWords -ge $significant) {
            $torrentFile | Move-Item -Destination $datePath
        }
    }
}
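As a rough sketch of the "ignore common words" idea mentioned above (the stop-word list is only an example), you could filter $words before counting matches:
# Hypothetical list of words too common to be useful for matching
$stopWords = 'caccia', 'di', 'il', 'la', 'the', 'a'

# Keep only the distinctive words of the movie name
$words = ($movie.name -split '\s') |
    Where-Object { $_.Length -gt 1 -and $stopWords -notcontains $_.ToLower() }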

Retrieve Custom Object From Hashtable

I've written a PowerShell function that creates a custom object and stores it in a hashtable. The issue I'm facing is retrieving that object. I need to retrieve it because it contains an array; I need to loop through that array and write it to a text file.
function removeItem {
    <# Mandatory parameters for the function: it takes the path to the files/folders
       to clean up and the path to the hashtable. #>
    Param([Parameter(Mandatory=$True)]
        [string]$path,
        [string]$writetoText,
        [hashtable]$hashWrite = @{}
    )
    <# Begin if statement: Test if path exists #>
    if (Test-Path ($path)) {
        <# Begin if statement: Check if file is a directory #>
        if ((Get-Item $path) -is [System.IO.DirectoryInfo]) {
            $pathObj = [pscustomobject]@{
                pathName = $path
                Wipe     = (Get-ChildItem -Path $path -Recurse)
                Count    = (Get-ChildItem -Path $path -Recurse | Measure-Object).Count
            }
            # Write-Output $pathObj.Wipe

            # Add data to the hashtable
            $hashWrite.Add($pathObj.pathName, $pathObj)

            foreach ($h in $hashWrite.GetEnumerator()) {
                Write-Host "$($h.Name): $($h.Value)"
            }
            <#
            [string[]]$view = $pathObj.Wipe
            for ($i=0; $i -le $view.Count; $i++){
                Write-Output $view[$i]
            }
            #>
            $pathObj.pathName = $pathObj.pathName + "*"
        } <# End if statement: Check if file is a directory #>
    }
}
My function takes 3 arguments: a path, the text file path, and a hashtable. I create a custom object and store the path, the files/folders contained in that path, and the count. Now, my issue is that I want to retrieve that custom object from my hashtable so that I can loop through the Wipe property, because it's an array, and write it to the text file. If I print the hashtable to the screen, it shows the Wipe property as System.Object[].
How do I retrieve my custom object from the hashtable so I can loop through the Wipe property?
Possible Solution:
$pathObj = [pscustomobject]@{
    pathName = $path
    Wipe     = (Get-ChildItem -Path $path -Recurse)
    Count    = (Get-ChildItem -Path $path -Recurse | Measure-Object).Count
}

# Add data to the hashtable
$hashWrite.Add($pathObj.pathName, $pathObj)

foreach ($h in $hashWrite.GetEnumerator()) {
    $read = $h.Value
    [string[]]$view = $read.Wipe
    for ($i = 0; $i -le $view.Count; $i++) {
        Write-Output $view[$i]
    }
}
Is this the ideal way of doing it?
There are uses for GetEnumerator(), but in your case you're better off just looping over the keys of the hashtable:
$hashWrite.Keys | % {
    $hashWrite[$_].Wipe
} | select -Expand FullName
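Since the original goal was to write the Wipe array to a text file, here is a small follow-up sketch, assuming $hashWrite is already populated and $writetoText holds the output path as in the function above:
# Write the full name of every item collected in Wipe to the text file
$hashWrite.Keys | ForEach-Object {
    $hashWrite[$_].Wipe
} | Select-Object -ExpandProperty FullName | Set-Content -Path $writetoText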