PowerShell: recursive copying with a pause after each file is copied

I have the following script to recursively copy data and create a log file of the destination. Could anyone assist, please? I would like to pause for 10 seconds after each file is copied, so that each file is allocated a different creation time stamp.
$Logfile ='File_detaisl.csv'
$SourcePath = Read-Host 'Enter the full path containing the files to copy'
""
""
$TargetPath = Read-Host 'Enter the destination full path to copy the files'
""
#$str1FileName = "File_Details.csv"
Copy-Item -Path $SourcePath -Destination $TargetPath -recurse -Force
Get-ChildItem -Path $TargetPath -Recurse -Force -File |
    Select-Object Name, DirectoryName, Length, CreationTime, LastWriteTime,
        @{N='MD5 Hash';E={(Get-FileHash -Algorithm MD5 $_.FullName).Hash}},
        @{N='SHA-1 Hash';E={(Get-FileHash -Algorithm SHA1 $_.FullName).Hash}} |
    Sort-Object -Property Name, DirectoryName |
    Export-Csv -Path $TargetPath$Logfile

Copy-Item has a -PassThru parameter that outputs each item as it is processed. By piping to ForEach-Object you can add a delay after each file:
Copy-Item -Path $SourcePath -Destination $TargetPath -recurse -Force -PassThru |
Where-Object PSIsContainer -eq $false |
ForEach-Object { Start-Sleep 10 }
The Where-Object is there to exclude folders from the ForEach-Object processing: for folder items the PSIsContainer property is $true, and for files it is $false.
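Putting the pieces together with the asker's logging step, a minimal sketch (reusing the $SourcePath, $TargetPath and $Logfile variables from the question; the 10-second interval is illustrative):
# Copy recursively, pausing after each file so each gets a distinct CreationTime
Copy-Item -Path $SourcePath -Destination $TargetPath -Recurse -Force -PassThru |
    Where-Object PSIsContainer -eq $false |
    ForEach-Object { Start-Sleep -Seconds 10 }
# Log the destination afterwards, as in the original script
Get-ChildItem -Path $TargetPath -Recurse -Force -File |
    Select-Object Name, DirectoryName, Length, CreationTime, LastWriteTime |
    Export-Csv -Path (Join-Path $TargetPath $Logfile) -NoTypeInformation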

You would lose the integrity of the folder structure, but one way to do this is using Get-ChildItem and piping to ForEach-Object, or using a loop to iterate through the items one at a time.
Get-ChildItem -Path $SourcePath -Recurse -Force |
ForEach-Object -Process {
Copy-Item -LiteralPath $_.FullName -Destination $TargetPath -Force
Start-Sleep -Seconds 10
}
The point is to process the files one after another in a loop, so that our Start-Sleep runs after each file has been copied.
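If you do need to preserve the folder hierarchy, a hedged variation of the same loop (assuming the $SourcePath and $TargetPath variables from the question) can rebuild each file's relative path before copying:
# Walk the source tree file by file, keeping the sub folder structure intact
Get-ChildItem -Path $SourcePath -Recurse -Force -File |
    ForEach-Object {
        # Path of the file relative to $SourcePath
        $relative    = $_.FullName.Substring($SourcePath.Length).TrimStart('\')
        $destination = Join-Path $TargetPath $relative
        # Create the destination sub folder if it does not exist yet
        $null = New-Item -ItemType Directory -Path (Split-Path $destination -Parent) -Force
        Copy-Item -LiteralPath $_.FullName -Destination $destination -Force
        Start-Sleep -Seconds 10
    }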

Related

Powershell copying specific files from source directory, excluding several folders, but then recursive with wildcard for files

Here is my current script and it works fine. It's not efficient running the same code twice, but I don't know how to combine the wildcards... anyway, on to the bigger issue.
The below code searches through my $sourceDir, excludes the files listed in $ExclusionFiles, copies all folders and structure as well as any .jpg or any .csv files, then puts them into the $targetDir.
$sourceDir = 'c:\sectionOne\Graphics\Data'
$targetDir = 'C:\Test\'
$ExclusionFiles = @("InProgress.jpg", "input.csv", "PCMCSV2.csv")
# Get .jpg files
Get-ChildItem $sourceDir -filter "*.jpg" -recurse -Exclude $ExclusionFiles | `
foreach{
$targetFile = $targetDir + $_.FullName.SubString($sourceDir.Length);
New-Item -ItemType File -Path $targetFile -Force;
Copy-Item $_.FullName -destination $targetFile
}
# Get .csv files
Get-ChildItem $sourceDir -filter "*.csv" -recurse -Exclude $ExclusionFiles | `
foreach{
$targetFile = $targetDir + $_.FullName.SubString($sourceDir.Length);
New-Item -ItemType File -Path $targetFile -Force;
Copy-Item $_.FullName -destination $targetFile
}
My list of files in the main $sourceDir that I need to exclude is getting longer, and there are folders I want to exclude as well. Can someone tell me how to:
Copy only a list of specific files in the $sourceDir
Exclude certain folders in $sourceDir from copying
Combine the wildcard search for .jpg and .csv into one statement
I'm still learning so any help would be greatly appreciated!
This is a case where a little bit of Regex will go a long way:
You can filter multiple extensions by using a pretty basic match:
$extensions = 'jpg', 'csv'
$endsWithExtension = "\.(?>$($extensions -join '|'))$"
Get-ChildItem -Recurse |
Where-Object Name -Match $endsWithExtension
You can exclude a list of specific files with one more Where-Object and the -NotIn parameter:
$extensions = 'jpg', 'csv'
$endsWithExtension = "\.(?>$($extensions -join '|'))$"
$ExcludeFileNames = @("InProgress.jpg", "input.csv", "PCMCSV2.csv")
Get-ChildItem -Recurse |
Where-Object Name -Match $endsWithExtension |
Where-Object Name -NotIn $ExcludeFileNames
From there on in, your Foreach-Object is basically correct (nice touch making sure the file exists by using New-Item, though I'd personally assign its output to null and -PassThru the Copy-Item; see the sketch after the code below).
Get-ChildItem $sourceDir -Recurse |
Where-Object Name -Match $endsWithExtension |
Where-Object Name -NotIn $ExcludeFileNames |
Foreach-Object {
$targetFile = $targetDir + $_.FullName.SubString($sourceDir.Length);
New-Item -ItemType File -Path $targetFile -Force;
Copy-Item $_.FullName -destination $targetFile
}
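For completeness, a hedged sketch of that suggestion (same variables as above): discard the New-Item output and let Copy-Item itself report each copied file via -PassThru:
Get-ChildItem $sourceDir -Recurse |
    Where-Object Name -Match $endsWithExtension |
    Where-Object Name -NotIn $ExcludeFileNames |
    ForEach-Object {
        $targetFile = $targetDir + $_.FullName.Substring($sourceDir.Length)
        # Pre-create the file (and any missing parent folders), silencing the output
        $null = New-Item -ItemType File -Path $targetFile -Force
        # -PassThru emits the copied item, so the pipeline output is the list of copied files
        Copy-Item $_.FullName -Destination $targetFile -PassThru
    }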

Copy all latest files from folders/sub to the same name folders/sub in powershell

I am trying to copy the latest file from every folder/sub-folder into the same folder structure on a different drive.
The latest file from each source folder should be copied into the correspondingly named destination folder.
The destination folder hierarchy already exists and cannot be copied over or recreated. This and other versions are not behaving. Can anyone help?
$sourceDir = 'test F Drive\Shares\SSRSFileExtract\'
$destDir = 'test X Drive\SSRSFileExtract\'
$date = Get-Date
$list = Get-ChildItem -Path $sourceDir | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
foreach ($item in $list)
{
Copy-Item -Verbose -LiteralPath $item.FullName -Destination $destDir -Force |
Get-Acl -Path $item.FullName | Set-Acl -Path $destDir\$(Split-Path -Path $item.FullName -Leaf)
}
Get-ChildItem -Path $destDir -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-5))} | Remove-Item -Verbose -Recurse -Force
I found a solution for this which copies/moves all the files from all sub folders in to all corresponding sub folders:
Powershell: Move all files from folders and subfolders into single folder
The way your code retrieves the list of files will only ever return a single object because of Select-Object -First 1. Also, because you don't specify the -File switch, Get-ChildItem will return DirectoryInfo objects as well, not just FileInfo objects.
What you could do is get an array of FileInfo objects recursively from the source folder and group them by their DirectoryName property.
Then loop over these groups of files, and from each group select the most recent file and copy that over to the destination folder.
Try:
$sourceDir = 'F:\Shares\SSRSFileExtract'
$destDir = 'X:\SSRSFileExtract'
Get-ChildItem -Path $sourceDir -File -Recurse | Group-Object DirectoryName | ForEach-Object {
# the $_ automatic variable represents one group at a time inside the loop
$newestFile = $_.Group | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
# construct the target sub directory
# you could also use $_.Name (the name of this group) instead of $newestFile.DirectoryName here, because
# we grouped on that DirectoryName file property.
$targetDir = Join-Path -Path $destDir -ChildPath $newestFile.DirectoryName.Substring($sourceDir.Length)
# if you're not sure the targetpath exists, uncomment the next line to have it created first
# if (!(Test-Path -Path $targetDir -PathType Container)) { $null = New-Item -Path $targetDir -ItemType Directory }
# copy the file
$newestFile | Copy-Item -Destination $targetDir -Force -Verbose
# copy the file's ACL
$targetFile = Join-Path -Path $targetDir -ChildPath $newestFile.Name
$newestFile | Get-Acl | Set-Acl -Path $targetFile
}
Apparently you would also like to clean up older files in the destination folder:
Get-ChildItem -Path $destDir -File -Recurse |
Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-5).Date} |
Remove-Item -Verbose -Recurse -Force
Be aware that the final code to remove older files could potentially remove all files from a subfolder, if all of them happen to be older than 5 days.
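If that worries you, a cautious sketch is to preview the deletions first with -WhatIf and only remove the switch once the output looks right:
Get-ChildItem -Path $destDir -File -Recurse |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-5).Date } |
    Remove-Item -Verbose -Force -WhatIf   # drop -WhatIf to actually delete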

Powershell move files and folders based on older than x days

I am new to PowerShell and trying to learn a basic file move from one directory to another. My goal is to move files and folders that are over 18 months old to a cold storage folder, run as a scheduled task. I need to be able to easily modify its directories to fit our needs. It needs to preserve the folder structure and only move files that fit the above parameters. I also need it to log everything it did, so if something is off I know where.
If I run this it just copies everything. If I comment out the %{Copy-Item... part, then it runs, lists only items matching my parameters, and logs it. Where am I going wrong, or am I way off base?
Yes, it would be easy to use robocopy for this, but I want to use PowerShell and learn from it.
#Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear();
#Clear-Host
#Days older than
$Days = "-485"
#Path Variables
$Sourcepath = "C:\Temp1"
$DestinationPath = "C:\Temp2"
#Logging
$Logfile = "c:\temp3\file_$((Get-Date).ToString('MM-dd-yyyy_hh-mm-ss')).log"
#transcript logs all outputs to txt file
Start-Transcript -Path $Logfile -Append
Get-ChildItem $Sourcepath -Force -Recurse |
Where-Object {$_.LastwriteTime -le (Get-Date).AddDays($Days)} |
% {Copy-Item -Path $Sourcepath -Destination $DestinationPath -Recurse -Force}
Stop-Transcript
Problem
Copy-Item -Path $Sourcepath -Destination $DestinationPath -Recurse -Force
You always specify the same path for source and destination. With parameter -recurse you will copy the whole directory $SourcePath for each matching file.
Solution
You need to feed the output of the previous pipeline steps to Copy-Item by using the $_ (aka $PSItem) variable, basically using Copy-Item in single-item mode.
Try this (requires .NET >= 5.0 for the GetRelativePath method):
Get-ChildItem $Sourcepath -File -Force -Recurse |
Where-Object {$_.LastwriteTime -le (Get-Date).AddDays($Days)} |
ForEach-Object {
$relativeSourceFilePath = [IO.Path]::GetRelativePath( $sourcePath, $_.Fullname )
$destinationFilePath = Join-Path $destinationPath $relativeSourceFilePath
$destinationSubDirPath = Split-Path $destinationFilePath -Parent
# Need to create sub directory when using Copy-Item in single-item mode
$null = New-Item $destinationSubDirPath -ItemType Directory -Force
# Copy one file
Copy-Item -Path $_ -Destination $destinationFilePath -Force
}
Alternative implementation without GetRelativePath (for .NET < 5.0):
Push-Location $Sourcepath # Base path to use for Get-ChildItem and Resolve-Path
try {
Get-ChildItem . -File -Force -Recurse |
Where-Object {$_.LastwriteTime -le (Get-Date).AddDays($Days)} |
ForEach-Object {
$relativeSourceFilePath = Resolve-Path $_.Fullname -Relative
$destinationFilePath = Join-Path $destinationPath $relativeSourceFilePath
$destinationSubDirPath = Split-Path $destinationFilePath -Parent
# Need to create sub directory when using Copy-Item in single-item mode
$null = New-Item $destinationSubDirPath -ItemType Directory -Force
# Copy one file
Copy-Item -Path $_ -Destination $destinationFilePath -Force
}
}
finally {
Pop-Location # restore previous location
}
On a side note, $Days = "-485" should be replaced by $Days = -485.
You currently create a string instead of a number and rely on PowerShell's ability to automagically convert string to number when "necessary". This doesn't always work as expected, so it's better to create a variable of the appropriate datatype in the first place.
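For illustration, a strongly typed variable fails at assignment time instead of at the first arithmetic use:
[int]$Days = -485           # typed: assigning a non-numeric string here would throw immediately
(Get-Date).AddDays($Days)   # AddDays receives a real number, no implicit string conversion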

How to search inside three paths and copy the name on a file.list

I am wondering if there is a better way to write a PowerShell script for these instructions:
Search on 3 paths. Ex.
$LOGDIRS="C:\NETiKA\GED\Production\RI\log";"C:\NETiKA\GED\Test\RI\log";"C:\NETiKA\Tomcat-8.0.28\logs"
Find all files that are older than 7 days and write their names to a file that I will call file.list. Ex. C:\Test\file.list
Once the names are in file.list, I need to go through it and delete all the listed files.
Apparently, when you have many thousands of files, this is the
fastest way to delete them.
$LOGDIRS=C:/NETiKA/GED/Production/RI/log;C:/NETiKA/GED/Test/RI/log;C:/NETiKA/Tomcat-8.0.28/logs
$KEEP=-7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
Select-Object FullName > files.list |
Foreach-Object {
if ($_.LastAccessTime -le (get-date).adddays($KEEP)) {
remove-item -recurse -force $_
}
};
Something like this should help you get started.
$path1 = "E:\Code\powershell\myPS\2018\Jun"
$path2 = "E:\Code\powershell\myPS\2018\Jun\compareTextFiles"
$path3 = "E:\Code\powershell\myPS\2018\May"
$allFiles = dir $path1, $path2, $path3 -File
$fileList = New-Item -type file file.list -Force
$keep = -7
$allFiles | foreach {
if ($_.LastAccessTime -le (Get-Date).AddDays($keep)) {
"$($_.FullName) is older than 7 days"
$_.FullName.ToString() | Out-File $fileList -Append
}
else {
"$($_.FullName) is new"
}
}
You can add deletion in the code in the IF block if you wish, or check the file and do it later on. Your code has many issues that are very basic to PowerShell, e.g. once you use Select-Object, the next pipeline step only receives the properties you selected. You tried to use LastAccessTime in a later pipe when you had only selected the FullName property.
Also, redirecting to a file and continuing the pipeline in the same statement looks very messy.
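To make the Select-Object pitfall concrete, here is a small sketch (assuming $LOGDIRS holds the three log directories as an array): filter on LastAccessTime while the FileInfo objects still carry all their properties, then reduce to the names you want to write out:
$KEEP = -7
Get-ChildItem -Path $LOGDIRS -File -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.LastAccessTime -le (Get-Date).AddDays($KEEP) } |
    ForEach-Object { $_.FullName } |
    Set-Content files.list   # one full path per line, ready for later deletion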
Remove-Item accepts piped input, and a Where-Object will filter on the age.
To first check what would be deleted, I appended a -WhatIf to the Remove-Item.
$LOGDIRS="C:\NETiKA\GED\Production\RI\log","C:\NETiKA\GED\Test\RI\log","C:\NETiKA\Tomcat-8.0.28\logs"
$KEEP=-7
Get-ChildItem -Path $LOGDIRS -Recurse -Directory -Force -ErrorAction SilentlyContinue |
    Where-Object LastAccessTime -le ((Get-Date).AddDays($KEEP)) |
    Remove-Item -Recurse -Force -WhatIf

Scan C disk and copy files

I would appreciate some help here.
The Powershell script should close Outlook process which works.
As well as scan the C disk for .pst files, which works.
Copy these files to "\fileserver01\temp\test\"
Export a csv/excel list of where these files were located and their last write time.
Possibly hide error messages for the user when running the script, since it complains about not having full access to a few folders during the scan.
Code:
Get-Process outlook | Foreach-Object { $_.CloseMainWindow() }
Get-ChildItem -path c:\ -recurse -include *.pst | `
Copy-Item -destination "\\fileserver01\temp\test\" | `
Select-object fullname,lastwritetime|export-csv "\\fileserver01\temp\test\"
How should I fix the last couple of things on my list?
Thanks
First you have to use a double backslash for UNC paths.
Second, Copy-Item does not output anything to the pipeline; you have to use the -PassThru parameter.
Get-ChildItem -path z:\ -recurse -include *.pst -PipelineVariable source |
Copy-Item -Destination "\\path\temp" -Verbose -PassThru |
Select-Object #{n="Source";e={$source.versioninfo.filename}},fullname,lastwritetime | export-csv "\\path\temp\copy_result.csv" -Append -Verbose
I believe the issue is that after the files are copied, the object is gone from the pipeline.
This works:
Get-ChildItem -Path C:\ -Include *.pst -Recurse -ErrorAction SilentlyContinue | Select-Object FullName, LastWriteTime | Export-Csv -Path "\\fileserver01\temp\test\MyCSV.csv"
This doesn't directly answer the question you've asked, as @Adamar's answer appears to do just that.
However, your issue could also be resolved by querying ost/pst files from the registry, using a snippet like this:
(Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
which will return all of the ost/pst files the logged in user has open in outlook.
A snippet like this will then copy them all to a network share and print the log to a file.
$FilesToCopy = (Get-ChildItem HKCU:\Software\Microsoft\Office\16.0\Outlook\Search).Property | ? {$_ -match "(o|p)st$"}
$FilesToCopy | ForEach { Copy-Item -Path $_ -Destination "\\network\share" ; $_ | Out-File "\\network\share\log.txt" -Append }
This saves a LOT of time over indexing through the whole C: drive. There's also an issue where very long directory paths (greater than 260 characters) are not indexed properly by Get-ChildItem, making this method a bit more reliable and appealing for a production script.
This is the final code.
Thanks everyone for your support.
#Kill Outlook process
Get-Process outlook -ErrorAction SilentlyContinue | Foreach-Object { $_.CloseMainWindow() }
#Scan for .pst files on the C disk
Get-ChildItem -path c:\ -recurse -include *.pst -ErrorAction SilentlyContinue |
#Copy located .pst files to the destination
Copy-Item -Destination "\\networkpath\home\$env:username\ComputerChange\" -Verbose -PassThru -ErrorAction SilentlyContinue |
#Log where files were located and when they were last written to.
Select-Object fullname,lastwritetime | export-csv \\networkpath\home\$env:username\ComputerChange\PSTlog.csv -Verbose
Write-Host "PST Files have successfully been copied, press any key to close" -ErrorAction SilentlyContinue
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
So I have created a much faster script, as I have excluded some system folders and program files folders where .PST files aren't saved.
I bet some of you experts can find out why this code doesn't work?
#Exclude systemfolders etc
$folders=get-childitem c:\ | where-object{$_.mode -like "d-*" -AND $_.name -notlike "windows" -AND $_.name -notlike "drivers" -AND $_.name -notlike "program files*"}
#Search thru each root folder for PST-files
$allfiles=$folders | ForEach-Object{get-childitem -Path $_.fullname -include "*.pst" -recurse -ErrorAction silentlycontinue};
$env:username
$foldertocreate="\\destination\$env:username\"
#Check if folder with username exists in the \\destination folder otherwise create folder with username.
if((Test-Path -Path $foldertocreate -PathType Container)) {write-host "Folder already created"}
else {write-host "Creating Folder", New-Item -ItemType Directory -Force -Path $foldertocreate }
#Copy .PST files which is in $allfiles to the folder created in fileshare> $foldertocreate.
#Copy .PST files in $allfiles to the destination folder created.
robocopy $allfiles $foldertocreate
Write-Host "Press any key to close" -ErrorAction SilentlyContinue $x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
end