PowerShell script to move files to a new server by last access date - powershell

I'm new to PowerShell. I have 80 servers that I need to connect to and run a PowerShell script on remotely, to find files recursively in one share by last access date and move them to another \\server\share for archiving purposes. I also need the file creation, last accessed, etc. timestamps to be preserved.
I would welcome any help, please. Thank you.

You need to test this thoroughly before actually using it on all 80 servers!
If you want to use PowerShell for this, you could run Invoke-Command against the servers, supplying admin credentials so the script can access both the files to move and the destination archive share.
I would suggest using ROBOCOPY to do the heavy lifting:
$servers = 'Server1', 'Server2', 'Server3'   # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"

$scriptBlock = {
    $SourcePath = 'D:\StuffToArchive'        # this is the LOCAL path on the server
    $TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
    $LogFile    = 'D:\ArchivedFiles.txt'     # write a text file with the full names of all archived files
    $DaysAgo    = 130

    # from a cmd box, type 'robocopy /?' to see all possible switches you might want to use
    # /MINAGE:days specifies the LastWriteTime
    # /MINLAD:days specifies the LastAccessDate
    robocopy $SourcePath $TargetPath /MOVE /MINLAD:$DaysAgo /COPYALL /E /FP /NP /XJ /XA:H /R:5 /W:5 /LOG+:$LogFile
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
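A side note on /MINLAD: anything keyed off last-access dates depends on NTFS actually updating those timestamps, and on many Windows versions last-access updates are disabled by default, so LastAccessDate can be stale. It is worth checking a couple of the servers first with fsutil (0 means updates are enabled, 1 means disabled; newer Windows builds also use system-managed values 2 and 3):

fsutil behavior query disablelastaccess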
If you want to do it all using just PowerShell, try something like this:
$servers = 'Server1', 'Server2', 'Server3'   # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"

$scriptBlock = {
    $SourcePath = 'D:\StuffToArchive'        # this is the LOCAL path on the server
    $TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
    $LogFile    = 'D:\ArchivedFiles.txt'     # write a text file with the full names of all archived files
    $refDate    = (Get-Date).AddDays(-130).Date  # the reference date set to midnight

    # set the ErrorActionPreference to Stop, so exceptions are caught in the catch block
    $OldErrorAction = $ErrorActionPreference
    $ErrorActionPreference = 'Stop'

    # loop through the server's LOCAL path to find old files and move them to the remote archive
    Get-ChildItem -Path $SourcePath -File -Recurse |
        Where-Object { $_.LastAccessTime -le $refDate } |
        ForEach-Object {
            try {
                $target = Join-Path -Path $TargetPath -ChildPath $_.DirectoryName.Substring($SourcePath.Length)
                # create the folder in the archive if it does not already exist
                $null = New-Item -Path $target -ItemType Directory -Force
                $_ | Move-Item -Destination $target -Force
                Add-Content -Path $LogFile -Value "File '$($_.FullName)' moved to '$target'"
            }
            catch {
                Add-Content -Path $LogFile -Value $_.Exception.Message
            }
        }

    $ErrorActionPreference = $OldErrorAction
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
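A note on the timestamp requirement: ROBOCOPY /COPYALL preserves the original timestamps, but a plain Move-Item to another volume is a copy-and-delete, which typically resets CreationTime (LastWriteTime survives). If you go the pure PowerShell route, here is a minimal sketch of re-stamping the archived file, intended to replace the single Move-Item line inside the try block:

$creation   = $_.CreationTime        # capture the original timestamps first
$lastWrite  = $_.LastWriteTime
$lastAccess = $_.LastAccessTime
$_ | Move-Item -Destination $target -Force
# re-apply the timestamps to the moved copy
$moved = Get-Item -LiteralPath (Join-Path -Path $target -ChildPath $_.Name)
$moved.CreationTime   = $creation
$moved.LastWriteTime  = $lastWrite
$moved.LastAccessTime = $lastAccess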

Related

Check if a file/folder exists on a remote system using PowerShell

I already searched for this and found a lot of answers, but none of them seems to work.
I am working on a script which will be used to copy some files from the local machine to remote servers. Before copying the files, I need to check if the file/folder already exists. If the folder doesn't exist, then create a new folder and then copy the files. If the file already exists at the specified location, just overwrite it.
I got the logic on how to do this, but for some reason Test-Path doesn't seem to work.
$servers = #list of servers
$username = #username
$password = #password
$files = #list of file paths to be copied

foreach ($server in $servers) {
    $pw = ConvertTo-SecureString $password -AsPlainText -Force
    $cred = New-Object Management.Automation.PSCredential ($username, $pw)
    $s = New-PSSession -ComputerName $server -Credential $cred
    foreach ($item in $files) {
        $regex = $item | Select-String -Pattern '(^.*)\\(.*)$'
        $destinationPath = $regex.Matches.Groups[1]
        $filename = $regex.Matches.Groups[2]
        # check if the file exists on the local system
        if (Test-Path $item) {
            # check if the path/file already exists on the remote machine
            # First convert the path to UNC format before checking it
            $regex = $item | Select-String -Pattern '(^.*)\\(.*)$'
            $filename = $regex.Matches.Groups[2]
            $fullPath = $regex.Matches.Groups[1]
            $fullPath = $fullPath -replace '(.):', '$1$'
            $unc = '\\' + $server + '\' + $fullPath
            Write-Host $unc
            Test-Path $unc # This always returns false even if the file/path exists
            if (<# path exists #>) {
                Write-Host "Copying $filename to server $server"
                Copy-Item -ToSession $s -Path $item -Destination $destinationPath
            }
            else {
                # create the directory and then copy the files
            }
        }
        else {
            Write-Host "$filename does not exist on the local machine. Skipping this file"
        }
    }
    Remove-PSSession -Session $s
}
The condition to check if the file/path exists on the remote machine always fails, and I'm not sure why.
I tried the following commands manually in PowerShell; they return True on the remote machine and False on the local machine.
On local machine:
Test-Path '\\10.207.xxx.XXX\C$\TEST'
False
On Remote machine:
Test-Path '\\10.207.xxx.xxx\C$\TEST'
True
Test-Path '\\localhost\C$\TEST'
True
So it is clear that the command fails from my local machine, whether I run it manually or through the script, but it passes when run from the remote system or server.
But I need to check if the file exists on the remote machine from the local system.
Am I missing something? Can someone help me understand what's going on here?
Thanks!
First of all, you're not using your PSSession for anything except the final Copy-Item; all the regex and UNC work around it looks redundant.
If your local path is the same as the destination and you're using WMF 5 / PowerShell 5.0 or newer (which introduced Copy-Item's -ToSession parameter), I would suggest that you stop with your regex and UNC paths and do the following, which simplifies and drops most of your code:
$existsOnRemote = Invoke-Command -Session $s -ScriptBlock { param($fullPath) Test-Path $fullPath } -ArgumentList $item.FullName
if (-not $existsOnRemote) {
    Copy-Item -Path $item.FullName -ToSession $s -Destination $item.FullName
}
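If you also need to create the destination directory when it is missing (as the question describes), a sketch along these lines could work. It assumes $files holds FileInfo objects (e.g. from Get-ChildItem) and, again, PowerShell 5.0 or newer for -ToSession:

foreach ($item in $files) {
    # create the remote directory if it does not exist yet
    Invoke-Command -Session $s -ScriptBlock {
        param($dir)
        if (-not (Test-Path -LiteralPath $dir)) {
            $null = New-Item -Path $dir -ItemType Directory -Force
        }
    } -ArgumentList $item.DirectoryName
    # -Force overwrites the file if it already exists on the remote side
    Copy-Item -Path $item.FullName -ToSession $s -Destination $item.FullName -Force
}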

PowerShell SQL Job Step Move-Item not working on 1 server

This identical code has been used on 3 servers, and on only one of them does it silently fail to move the items (it still REMOVES them, but they never appear in the share).
Azure-MapShare.ps1
param (
    [string]$DriveLetter,
    [string]$StorageLocation,
    [string]$StorageKey,
    [string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
    cmd.exe /c "net use ${DriveLetter}: ${StorageLocation} /u:${StorageUser} ""${StorageKey}"""
}
Get-Exclusion-Days.ps1
param (
    [datetime]$startDate,
    [int]$daysBack
)
$date = $startDate
$endDate = (Get-Date).AddDays(-$daysBack)
$allDays =
    do {
        "*" + $date.ToString("yyyyMMdd") + "*"
        $date = $date.AddDays(-1)
    } until ($date -lt $endDate)
return $allDays
Migrate-Files.ps1
param(
    [string]$Source,
    [string]$Filter,
    [string]$Destination,
    [switch]$Remove = $False
)
# Test if the source path exists
if ((Test-Path -Path $Source.Trim()) -ne $True) {
    throw 'Source did not exist'
}
# Test if the destination path exists
if ((Test-Path -Path $Destination.Trim()) -ne $True) {
    throw 'Destination did not exist'
}
# Test if there are no files at the source
if ((Get-ChildItem -Path $Source).Length -eq 0) {
    throw 'No files at source'
}
if ($Remove)
{
    # Move-Item removes the source files
    Move-Item -Path $Source -Filter $Filter -Destination $Destination -Force
} else {
    # Copy-Item keeps a local copy
    Copy-Item -Path $Source -Filter $Filter -Destination $Destination -Force
}
return $True
The job step is type "PowerShell" on all 3 servers and contains this identical code:
# Create mapping if missing
D:\Scripts\Azure-MapShare.ps1 -DriveLetter 'M' -StorageKey "[AzureStorageKey]" -StorageLocation "[AzureStorageAccountLocation]\backup" -StorageUser "[AzureStorageUser]"

# Copy files to Archive
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "D:\Databases\BackupArchive"

# Get date range to exclude
$exclusion = D:\Scripts\Get-Exclusion-Days.ps1 -startDate (Get-Date) -daysBack 7

# Remove items that are not included in the exclusion range
Remove-Item -Path "D:\Databases\BackupArchive\*.bak" -Exclude $exclusion

# Move files to the storage account. The source files will be destroyed
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "M:\" -Remove

# Remove remote backups that are not from today's backup
Remove-Item -Path "M:\*.bak" -Exclude $exclusion
If I run the job step from the SQL Server Agent job, the files get removed but do not appear in the storage account. If I run this code block manually, they get moved.
When I start up PowerShell on the server, I get an error message: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed." However, this does not really impact the rest of the operations (copying the backup files to BackupArchive folder, for instance).
I should mention that Copy-Item also fails to copy across to the share, but succeeds in copying to the BackupArchive folder.
Not sure if this will help you, but you could try using the New-PSDrive cmdlet instead of net use to map your shares:
param (
    [string]$DriveLetter,
    [string]$StorageLocation,
    [string]$StorageKey,
    [string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
    $securedKey = $StorageKey | ConvertTo-SecureString -AsPlainText -Force
    $credentials = New-Object System.Management.Automation.PSCredential ($StorageUser, $securedKey)
    New-PSDrive -Name $DriveLetter -PSProvider FileSystem -Root $StorageLocation -Credential $credentials -Persist
}
Apparently I tricked myself on this one. During testing I must have run the net use command in an elevated command prompt. This hid the mapped drive from non-elevated OS features such as Windows Explorer, and from attempts to view it in non-elevated command prompt sessions. I suppose it was also automatically reconnecting after reboots, because rebooting did not fix it.
The solution was as easy as running the net use m: /delete command from an elevated command prompt.
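If you want to see the session split for yourself, compare what each context reports; drive mappings made with net use belong to a logon session, and elevated and non-elevated prompts use different tokens:

# run these in BOTH an elevated and a non-elevated prompt and compare the lists
net use
Get-SmbMapping   # available on Windows 8 / Server 2012 and later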

PowerShell run script simultaneously

I created a PowerShell script to remove all files and folders older than X days. This works perfectly fine, and the logging is also OK. But because PowerShell is a bit slow, it can take some time to delete these files and folders when large quantities are involved.
My question: how can I have this script run on multiple directories ($Target) at the same time?
Ideally, we would like to have this in a scheduled task on a Windows Server 2008 R2 server and have an input file (txt, csv) to paste new target locations into.
Thank you for your help/advice.
The script
#================= VARIABLES ==================================================
$Target        = "\\share\dir1"
$OlderThanDays = 10
$LogFile       = "$Target\Auto_Clean.log"

#================= BODY =======================================================
# Set start time
$StartTime = (Get-Date).ToShortDateString() + ", " + (Get-Date).ToLongTimeString()
Write-Output "`nDeleting folders that are older than $OlderThanDays days:`n" | Tee-Object $LogFile -Append

Get-ChildItem -Directory -Path $Target |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$OlderThanDays) } | ForEach-Object {
        $Folder = $_.FullName
        Remove-Item $Folder -Recurse -Force -ErrorAction SilentlyContinue
        $Timestamp = (Get-Date).ToShortDateString() + " | " + (Get-Date).ToLongTimeString()
        # If the folder can't be removed
        if (Test-Path $Folder)
        { "$Timestamp | FAILED: $Folder (IN USE)" }
        else
        { "$Timestamp | REMOVED: $Folder" }
    } | Tee-Object $LogFile -Append # Output folder names to console & logfile at the same time

# Set end time & calculate runtime
$EndTime = (Get-Date).ToShortDateString() + ", " + (Get-Date).ToLongTimeString()
$TimeTaken = New-TimeSpan -Start $StartTime -End $EndTime

# Write footer to log
Write-Output ($Footer = @"
Start Time    : $StartTime
End Time      : $EndTime
Total runtime : $TimeTaken
$("-"*79)
"@)

# Append footer to logfile
Out-File -FilePath $LogFile -Append -InputObject $Footer

# Clean up variables at end of script
$Target = $StartTime = $EndTime = $OlderThanDays = $null
One way to achieve this would be to write an "outer" script that passes the directory-to-be-cleaned into the "inner" script as a parameter.
For your "outer" script, have something like this:
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList
foreach ($Directory in $DirectoryList) {
    Start-Process -FilePath powershell.exe -ArgumentList ('-File "{0}\InnerScript.ps1" -Path "{1}"' -f $PSScriptRoot, $Directory)
}
Note: Start-Process kicks off a new process that is, by default, asynchronous. If you use the -Wait parameter, the process runs synchronously. Since you want things to run more quickly and in parallel, omitting -Wait should achieve the desired result.
Invoke-Command
Alternatively, you could use Invoke-Command to kick off a PowerShell script, using the parameters -FilePath, -ArgumentList, -ThrottleLimit, and -AsJob. The Invoke-Command cmdlet relies on PowerShell Remoting, so that must be enabled, at least on the local machine.
Add a parameter block to the top of your "inner" script (the one you posted above), like so:
param (
    [Parameter(Mandatory = $true)]
    [string] $Path
)
That way, your "outer" script can pass in the directory path, using the -Path parameter for the "inner" script.
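A minimal sketch of that Invoke-Command alternative, assuming the inner script is saved as InnerScript.ps1 next to the outer script and PowerShell Remoting is enabled on the local machine (note that -ThrottleLimit only throttles within a single Invoke-Command call, so it has no effect when looping like this):

$DirectoryList = Get-Content -Path $PSScriptRoot\DirList
# each call targets the local machine but runs as a background job,
# so the per-directory cleanups run in parallel
$jobs = foreach ($Directory in $DirectoryList) {
    Invoke-Command -ComputerName localhost -FilePath "$PSScriptRoot\InnerScript.ps1" -ArgumentList $Directory -AsJob
}
# wait for all cleanups to finish and collect their output
$jobs | Wait-Job | Receive-Job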

get script to work on remote servers

I want the code below to work on multiple computers - any idea how to do this? I have the below, but it fails - I think because I never actually target the servers in question.
Thanks,
CODE:
Write-Host "Script to check Storage Write, Read and Delete Times"
Write-Host "`n"
$computer = Get-Content -path d:\temp\servers.txt
$path = "f:\temp\test.txt"
Foreach ($storage in $computer)
{
$date = Get-Date
Write-Host "Script being run on $date"
$write = Measure-Command { new-item -Path $path -ItemType File -Force } | select TotalMilliseconds
Write-Host "Writing file on $storage took $write"
$read = Measure-Command { Get-Content -Path $path } | select TotalMilliseconds
Write-Host "Reading file on $storage took $read"
$delete = Measure-Command {Remove-Item -Path $path -Force } | select TotalMilliseconds
Write-Host "Deleting file on $storage took $delete"
Write-Host "`n"
}
You need to step back for a second and rethink the approach. You are issuing the filesystem commands every time against f:\temp, which is on your local system.
There are two ways to make remote computers perform filesystem tasks. The easiest way is to use UNC paths, that is, the \\server\share format. Assuming you have local admin access:
Foreach ($storage in $computer) {
    $uncpath = $("\\{0}\f`$\temp\test.txt" -f $storage)
    $write = Measure-Command { New-Item -Path $uncpath -ItemType #...
    # rest of the code uses $uncpath for access
}
Mind you, using UNC paths puts some stress on the LAN, so this type of testing might or might not be accurate enough.
The second way would be to use PowerShell Remoting to connect to the remote systems and issue the commands there. Take a look at the New-PSSession, Enter-PSSession and Exit-PSSession cmdlets.
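A minimal sketch of that remoting approach, so the timing happens on the servers themselves rather than over the LAN (assumes PS Remoting is enabled on the targets and that f:\temp exists on each server):

$computers = Get-Content -Path d:\temp\servers.txt
Invoke-Command -ComputerName $computers -ScriptBlock {
    $path = 'f:\temp\test.txt'   # now a LOCAL path on each remote server
    $write  = (Measure-Command { New-Item -Path $path -ItemType File -Force }).TotalMilliseconds
    $read   = (Measure-Command { Get-Content -Path $path }).TotalMilliseconds
    $delete = (Measure-Command { Remove-Item -Path $path -Force }).TotalMilliseconds
    # one result object per server; Invoke-Command adds PSComputerName automatically
    [pscustomobject]@{ WriteMs = $write; ReadMs = $read; DeleteMs = $delete }
}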

Copying files from one source to multiple destinations in parallel

I'm attempting to write a Microsoft PowerShell script which copies files from a single source to multiple destinations in parallel, based on a config file. The config file is a CSV file which looks like this:
Server,Type
server1,Production
server2,Staging
My script is called with one argument (.\myscript.ps1 buildnumber), but it doesn't seem to actually do any deleting or copying of files.
I'm sure my Copy-Item and Remove-Item code works, as I have tested it independently, but I think it's either an issue with how I am using script blocks or perhaps how I am using Start-Job.
Could anyone help me understand why this isn't working?
Thanks
Brad
<#
File Deployment Script
#>
#Requires -Version 2

param($build)

$sourcepath = "\\server\software\$build\*"
$Config = Import-Csv -Path C:\config\serverlist.txt

$scriptblock1 = {
    $server = $args[0]
    $destpath1 = "\\$server\share\Software Wizard\"
    $destpath2 = "\\$server\share\Software Wizard V4.9XQA\"
    Remove-Item "$destpath1\*" -Recurse -Force
    Remove-Item "$destpath2\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath1 -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath2 -Recurse -Force
}

$scriptblock2 = {
    $server = $args[0]
    $destpath = "\\$server\share\Software Wizard\"
    #Remove-Item "$destpath\*" -Recurse -Force
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}
foreach ($line in $Config) {
    $server = $line.Server
    $type = $line.Type
    if ($type -match "Staging") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock2 -ArgumentList $server
    }
    if ($type -match "Production") {
        Write-Host "Kicking job for $server off"
        Start-Job -ScriptBlock $scriptblock1 -ArgumentList $server
    }
}
Your script block doesn't have access to variables declared outside of it when it's called from Start-Job, so $scriptblock1 and $scriptblock2 can't see $sourcepath.
To elaborate on Jamey's answer, you can see that the $sourcepath variable declared in the caller scope is not available within the job by comparing the output of the two calls below:
$sourcepath = 'source path'
$scriptblock = { Write-Host "sourcepath = $sourcepath; args = $args" }
& $scriptblock 'server name'
Start-Job $scriptblock -ArgumentList 'server name' | Wait-Job | Receive-Job
To fix this, simply pass the outer variable as part of the argument list:
$scriptblock2 = {
    param($sourcepath, $server)
    $destpath = ...
    Copy-Item $sourcepath -Destination $destpath -Recurse -Force
}
...
Start-Job -ScriptBlock $scriptblock2 -ArgumentList $sourcepath, $server
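With that in place, the dispatch loop passes $sourcepath along to each job. A sketch of the full loop, assuming both script blocks get the same param($sourcepath, $server) treatment:

foreach ($line in $Config) {
    $server = $line.Server
    # pick the right script block for the server type
    $sb = if ($line.Type -match "Production") { $scriptblock1 } else { $scriptblock2 }
    Write-Host "Kicking job for $server off"
    Start-Job -ScriptBlock $sb -ArgumentList $sourcepath, $server
}
# wait for the copies to finish and surface any output or errors from the jobs
Get-Job | Wait-Job | Receive-Job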