I am currently struggling with a simple PowerShell script to archive files.
I have thousands of old files in a folder, and I want to archive them into archives named "YYYYMM" based on the month/year of their creation date.
I use the code below:
Get-ChildItem -Path $sourcePath -filter $filter |
Where-Object { ($_.CreationTime -le $dateCriteria) -and ($_.psIsContainer -eq $false) } |
ForEach {
    $archive = "{0:yyyy}{0:MM}.7z" -f $_.CreationTime
    $archivePath = Join-Path -Path $destinationFolder -ChildPath $archive
    & "C:\Program Files\7-Zip\7z.exe" a -mx9 -t7z -m0=lzma2 -sdel $archivePath $_.FullName | Out-Null
}
The logic seems fine, as it creates files like
201809.7z
201810.7z
...
in my destination folder.
The problem is that I see errors in the console:
System ERROR:
The file exists
or
System ERROR:
Access denied
or
System ERROR:
The file exists
ERROR: ********\202011.7z
Can not open the file as archive
As a result, in my destination folder, in addition to the expected archive files, I have files like "201810.7z.tmp1".
I changed the working directory to isolate those files by adding -w"{WORK_PATH}" to the command line.
I also added Start-Sleep -Milliseconds 1, as it looked like concurrent access even though my script is single-threaded (maybe 7-Zip doesn't end properly), but it didn't work.
With Start-Sleep -Milliseconds 500 it seems to work, but for obvious reasons I don't want to use that. What would the proper way to do this be?
EDIT 1
Following MisterSmith's answer, I changed my code to:
Get-ChildItem -Path $emplacementSource -filter $filtreNomFichiers |
Where-Object { ($_.CreationTime -le $dernierJour) -and ($_.psIsContainer -eq $false) } |
ForEach {
    $archive = "{0:yyyy}{0:MM}.7z" -f $_.CreationTime
    $cheminArchive = Join-Path -Path $dossierCible -ChildPath $archive
    [Array]$arguments = "a", "-w$workDir", "-mx9", "-t7z", "-m0=lzma2", "-sdel", $cheminArchive, $_.FullName
    $pinfo = New-Object System.Diagnostics.ProcessStartInfo
    $pinfo.FileName = $sevenZip
    $pinfo.RedirectStandardError = $true
    $pinfo.CreateNoWindow = $true
    $pinfo.UseShellExecute = $false
    $pinfo.Arguments = "$arguments"
    $process = New-Object System.Diagnostics.Process
    $process.StartInfo = $pinfo
    $process.Start()
    $output = $process.StandardError.ReadToEnd()
    $process.WaitForExit()
    if (0 -ne $process.ExitCode) {
        Write-Output "$(Get-TimeStamp) Erreur: $output" | Out-File $fichierLogs -Append
    }
}
I still have File exists errors and .tmpX archives in my temp folder.
Instead of using &, use Start-Process and pass the -Wait switch, or use the -PassThru switch and the returned System.Diagnostics.Process to check for yourself whether the process has finished. Either will get the same result as your Start-Sleep -Milliseconds 500 test, but it will only wait as long as 7z.exe actually takes to complete.
Side note - you can add multiple files to an archive at once. That would probably work out quicker overall than adding each file separately.
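For illustration, here is a sketch combining both suggestions, using the question's $sourcePath, $filter, $dateCriteria, and $destinationFolder; treat it as a starting point rather than a drop-in replacement:
# Group files by creation month, then run 7z.exe once per group and wait for it.
Get-ChildItem -Path $sourcePath -Filter $filter |
    Where-Object { ($_.CreationTime -le $dateCriteria) -and (-not $_.PSIsContainer) } |
    Group-Object { $_.CreationTime.ToString('yyyyMM') } |
    ForEach-Object {
        $archivePath = Join-Path -Path $destinationFolder -ChildPath "$($_.Name).7z"
        # Quote each path in case it contains spaces; 7z adds all files in one call.
        $arguments = @('a', '-mx9', '-t7z', '-m0=lzma2', '-sdel', "`"$archivePath`"") +
            ($_.Group | ForEach-Object { "`"$($_.FullName)`"" })
        $process = Start-Process -FilePath 'C:\Program Files\7-Zip\7z.exe' -ArgumentList $arguments -NoNewWindow -Wait -PassThru
        # -Wait guarantees 7z.exe has exited before the next archive is touched.
        if ($process.ExitCode -ne 0) {
            Write-Warning "7z.exe exited with code $($process.ExitCode) for $archivePath"
        }
    }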
Related
I have some code that watches a target folder, waits for a change, and should then move only the most recent files based on their LastWriteTime value. However, every time I change a file within the target directory, nothing is copied over, and Copy-Item resolves against "C:\Users\run" instead. It recognizes that there are files to copy and even states their file names when throwing the error. What can I do in this situation to make sure my Copy-Item command copies from my target directory?
Code for Reference:
$File = "C:\Users\run\Desktop\Target"
$destinationFolder = "c:\users\run\desktop\dest"
$maxDays = "-1"
$maxMins = "20"
$date = Get-Date
Write-Host "Waiting For File To Change in Job Cloud..."
$Action = '
dateChecker
Write-Host "Moving Files From Job Cloud To Server Shares... Please Do Not Disrupt This Service"
write-host "files copied to job cloud..."
exit
'
$global:FileChanged = $false
function dateChecker {
Foreach($File in (Get-ChildItem -Path $File)){
if($File.LastWriteTime -lt ($date).AddMinutes($maxMins)){
Write-Host "Moving Files From Job Cloud To Server Shares... Please Do Not Disrupt This Service"
Copy-Item -Path $File -Destination $destinationFolder -recurs #-ErrorAction #silentlyContinue
}
}
}
while($true) {
function Wait-FileChange {
param(
[string]$File,
[string]$Action
)
$FilePath = Split-Path $File -Parent
$FileName = Split-Path $File -Leaf
$ScriptBlock = [scriptblock]::Create($Action)
$Watcher = New-Object IO.FileSystemWatcher $FilePath, $FileName -Property #{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$onChange = Register-ObjectEvent $Watcher Changed -Action {$global:FileChanged = $true}
while ($global:FileChanged -eq $false){
Start-Sleep -Milliseconds 100
}
& $ScriptBlock
Unregister-Event -SubscriptionId $onChange.Id
}
Wait-FileChange -File $File -Action $Action
}
PowerShell is not switching directories - although I can certainly see why you'd think that based on the behavior. The explanation is closer than you might think though:
The -Path parameter takes a [string] argument.
$File is not a string - it's a [FileInfo] object - and PowerShell therefore converts it to a string before passing it to Copy-Item -Path. Unfortunately, this results in the name of the file (not the full path) being passed as the argument, and Copy-Item therefore has to resolve the full path, and does so relative to the current working directory.
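You can see the conversion at work directly; a quick demo (hypothetical file name), noting that on Windows PowerShell a [FileInfo] returned by Get-ChildItem stringifies to just its name:
$f = Get-ChildItem -Path C:\Users\run\Desktop\Target | Select-Object -First 1
[string]$f    # -> e.g. "report.txt" (name only; this is what -Path receives)
$f.FullName   # -> "C:\Users\run\Desktop\Target\report.txt"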
You can fix this by passing the full path explicitly to Copy-Item -LiteralPath:
Copy-Item -LiteralPath $File.FullName ... |...
or you can let the pipeline parameter binder do it for you by piping the $File object to Copy-Item:
$File |Copy-Item ... |...
Why -LiteralPath instead of -Path? -Path accepts wildcard patterns like filenameprefix[0-9] and tries to resolve it to a file on disk, meaning if you have to operate on files with [ or ] in the name, it'll result in some unexpected behavior :)
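Applied to the dateChecker function above, the fix is a one-line change; a sketch, and either form should work:
function dateChecker {
    Foreach($File in (Get-ChildItem -Path $File)){
        if($File.LastWriteTime -lt ($date).AddMinutes($maxMins)){
            # Pipe the FileInfo so Copy-Item binds its full path via the pipeline...
            $File | Copy-Item -Destination $destinationFolder -Recurse
            # ...or equivalently: Copy-Item -LiteralPath $File.FullName -Destination $destinationFolder -Recurse
        }
    }
}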
I need help with a specific issue in PowerShell.
What I am trying to do is start multiple services one by one and, after each successful start, copy some files from one location to another. These files are created only after the service/app is up, and I need to confirm the start from a string in a text file (like "Service successfully started").
I tried to write a foreach loop, but because the copy location and the text-check location are different, I couldn't manage it. And honestly, I don't know much about nested loops. Maybe you can give me some ideas to make this work.
For example, for one service:
Source file locations:
C:\sourcepath\location1\folder\abc.dat
C:\sourcepath\location1\folder\cde.dat
The txt file to check for the line "Service successfully started" (to confirm the service/app started successfully):
C:\sourcepath\folder1\logs\logfile.txt
Destination file location:
D:\destinationpath\location1\ (the abc.dat and cde.dat files should end up in the same folder)
The flow should be like this:
Start a service.
Make sure it's up by checking the string in the txt file.
After those checks, copy the specified files from the source folder to the destination (creating the destination folder based on the source folder).
Stop the service.
After confirming its status is stopped, start the next service and repeat the same process up to the last service, but with different locations.
For example, location1 becomes location2 and then location3, but the file names stay the same. The destination folder should also be created according to the source folder.
Any direction would be helpful.
Edit1:
So far, I have written this code:
[array]$serviceNames = "lfsvc", "iphlpsvc"
[array]$app = "app1", "app2"
$sourceStart = "C:\Source\"
$destinationStart = "C:\Target\"
$logs = "\logs"
$sourceFull = $sourceStart + $app.Get(0) + "\data"
$destinationFull = $destinationStart + $app.Get(0)
ForEach ($serviceNames in $serviceNames)
{
    Start-Service $serviceNames -ErrorAction SilentlyContinue;
    $text = Select-String -Path $sourceStart+$app.Get(0)+$logs\log.txt -Pattern "Service successfully started"
    if ($text -ne $null)
    {
        md $destination;
        Copy-Item -Path $sourceFull\123.txt -Destination $destinationFull\123.txt
        Copy-Item -Path $sourceFull\456.txt -Destination $destinationFull\456.txt
    }
}
I need to step through the other $app values in turn, matching each with the corresponding $serviceNames value.
I also need the if check to wait until the "Service successfully started" line actually appears in the log.
Thanks
Edit2:
Written the long way, it would be something like this (of course, if I could check the string in a specified text file, it would be great). I need to shorten this code:
[array]$serviceNames = "aService", "bService"
Start-Service $serviceNames[0] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 75;
md "C:\Dest\aService\fld";
Copy-Item -Path "C:\Source\aService\fld\123.txt" -Destination "C:\Dest\aService\fld\123.txt";
Copy-Item -Path "C:\Source\aService\fld\456.txt" -Destination "C:\Dest\aService\fld\456.txt";
Copy-Item -Path "C:\Source\aService\fld\789.txt" -Destination "C:\Dest\aService\fld\789.txt";
Stop-Service $serviceNames[0] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 15;
Start-Service $serviceNames[1] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 75;
md "C:\Dest\bService\fld";
Copy-Item -Path "C:\Source\bService\fld\123.txt" -Destination "C:\Dest\bService\fld\123.txt";
Copy-Item -Path "C:\Source\bService\fld\456.txt" -Destination "C:\Dest\bService\fld\456.txt";
Copy-Item -Path "C:\Source\bService\fld\789.txt" -Destination "C:\Dest\bService\fld\789.txt";
Stop-Service $serviceNames[1] -ErrorAction SilentlyContinue;
Start-Sleep -Seconds 15;
I think I have an idea of what you want.
[array]$serviceNames = "lfsvc", "iphlpsvc"
[array]$apps = "app1", "app2"
$sourceStart = "C:\Source\"
$destinationStart = "C:\Target\"
$logs = "\logs"
# main loop, which iterates over the apps
foreach($app in $apps)
{
    $sourceFull = $sourceStart + $app + "\data"
    $destinationFull = $destinationStart + $app
    # each app will iterate over all of the services
    ForEach ($name in $serviceNames)
    {
        # uses -PassThru to get the service object and pulls its status from that; -ErrorAction Stop makes any error terminate the script
        $status = (Start-Service $name -ErrorAction Stop -PassThru).Status
        # this while loop pauses until the service reaches the "Running" state
        while($status -ne "Running") {Start-Sleep -Seconds 5; $status = (Get-Service $name).Status}
        $text = Select-String -Path $($sourceStart + $app + $logs + "\log.txt") -Pattern "Service successfully started"
        # check whether $text is null or empty; if not, do the copy
        if (![string]::IsNullOrEmpty($text))
        {
            if(!(Test-Path -Path $destinationFull)){New-Item -ItemType Directory -Path $destinationFull}
            Get-Content -Path "$sourceFull\123.txt" | Add-Content -Path "$destinationFull\123.txt"
            Get-Content -Path "$sourceFull\456.txt" | Add-Content -Path "$destinationFull\456.txt"
        }
        # stops the service
        $status = (Stop-Service $name -PassThru).Status
        # pauses until the service has stopped
        while($status -ne "Stopped") {Start-Sleep -Seconds 5; $status = (Get-Service $name).Status}
    }
}
something like this?
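If the intent is instead to pair each service with its matching app (first with first, second with second), a hedged variant using an index loop could replace the two nested loops; this assumes both arrays have the same length and matching order:
for ($i = 0; $i -lt $serviceNames.Count; $i++) {
    $name = $serviceNames[$i]
    $app  = $apps[$i]
    # ...same start/check/copy/stop body as above, using $name and $app...
}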
I am currently working on a project that requires that I move a file, and then rename it. I am using this code to move it and that is working. However, the rename portion is not taking place as it should. I cannot figure out why this isn't working. What have I goofed up? I have been beating my head against my desk for at least 20 minutes trying to figure this out.
# Variables for Watcher
$folder = "C:\Program Files\Whatever\Connector\Export\JobStatus"
$filter = '*.txt'
$date=(get-date -Format d) -replace("/")
$time=(get-date -Format t) -replace(":")
# Watcher + Settings
$fsw = New-Object IO.FileSystemWatcher $folder, $filter
$fsw.IncludeSubdirectories = $false
$fsw.NotifyFilter = [IO.NotifyFilters]'FileName', 'DirectoryName'
# Register Event (when file is created)
$onCreated = Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    # Foreach file loop
    ForEach ($f in $fsw)
    {
        if (($File = Get-Item $Event.SourceEventArgs.FullPath | select -Expand Extension) -eq ".txt")
        {
            # Used for file testing - opens the text file for 10 secs, then kills it.
            #Start-Process -FilePath $Event.SourceEventArgs.FullPath | %{ sleep 10; $_ } | kill
            # Variables for move
            $folderpath = ($Event.SourceEventArgs.FullPath | Split-Path)
            $folderfile = ($Event.SourceEventArgs.FullPath | Split-Path -Leaf)
            $destination = "C:\Program Files\Whatever\Connector\Staging\"
            $newname = "job.import.$date"+"_"+"$time.txt"
        }
        # Variables for logging
        $logpath = 'C:\Program Files\Whatever\Connector\Export\JobStatus\outlog.txt'
        # Grab current file and move to "Staging" folder
        try
        {
            Get-ChildItem -Path $folderpath -Filter $folderfile | Move-Item -Destination $destination | sleep 5 | Write-Host Rename-Item $destination$folderfile -NewName $newname | Out-File -FilePath $logpath -Append
            Write-Host $destination$newname
            #sleep 5
            #Rename-Item "$destination $folderfile" -NewName $newname
            #Write-Host $destination $folderfile
            #"File $folderfile renamed to $newname" | Out-File -FilePath $logpath -Append
            # Log the move in logfile
            "File $folderfile moved to $destination" | Out-File -FilePath $logpath -Append
        }
        # Log if errors + clear
        catch
        {
            $error | Out-File -FilePath $logpath -Append
            $Error.Clear()
        }
    }
}
The pipeline is broken when there's no object output. Move-Item doesn't output an object unless the -PassThru parameter is used, and Start-Sleep doesn't output anything. So Rename-Item is never reached.
Replace the pipes after Move-Item and sleep with semicolons, and it should work.
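A sketch of the corrected block (names as in the question; the pipes between the three commands become statement separators):
Get-ChildItem -Path $folderpath -Filter $folderfile | Move-Item -Destination $destination;
Start-Sleep 5;
Rename-Item -Path ($destination + $folderfile) -NewName $newname;
"File $folderfile moved to $destination and renamed to $newname" | Out-File -FilePath $logpath -Append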
I actually fixed this by removing the piped rename and replacing it with a 5-second sleep. I do the rename after the sleep and it works fine now. Still not sure why the rename wasn't working in the piped command though.
This identical code has been used on 3 servers, and only on one of them does it silently fail to move the items (it still REMOVES them, but they do not appear in the share).
Azure-MapShare.ps1
param (
[string]$DriveLetter,
[string]$StorageLocation,
[string]$StorageKey,
[string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
cmd.exe /c "net use ${DriveLetter}: ${StorageLocation} /u:${StorageUser} ""${StorageKey}"""
}
Get-Exclusion-Days.ps1
param (
[datetime]$startDate,
[int]$daysBack
)
$date = $startDate
$endDate = (Get-Date).AddDays(-$daysBack)
$allDays =
do {
    "*"+$date.ToString("yyyyMMdd")+"*"
    $date = $date.AddDays(-1)
} until ($date -lt $endDate)
return $allDays
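For example, run on 2020-11-20 with -daysBack 2, the loop yields the patterns *20201120*, *20201119*, and *20201118*; Remove-Item -Exclude later treats these as wildcards matching the backups to keep.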
Migrate-Files.ps1
param(
[string]$Source,
[string]$Filter,
[string]$Destination,
[switch]$Remove=$False
)
#Test if source path exists
if((Test-Path -Path $Source.Trim()) -ne $True) {
    throw 'Source did not exist'
}
#Test if destination path exists
if ((Test-Path -Path $Destination.Trim()) -ne $True) {
    throw 'Destination did not exist'
}
#Test if no files in source
if((Get-ChildItem -Path $Source).Length -eq 0) {
    throw 'No files at source'
}
if($Remove)
{
    #Move-Item removes the source files
    Move-Item -Path $Source -Filter $Filter -Destination $Destination -Force
} else {
    #Copy-Item keeps a local copy
    Copy-Item -Path $Source -Filter $Filter -Destination $Destination -Force
}
return $True
The job step is type "PowerShell" on all 3 servers and contains this identical code:
#Create mapping if missing
D:\Scripts\Azure-MapShare.ps1 -DriveLetter 'M' -StorageKey "[AzureStorageKey]" -StorageLocation "[AzureStorageAccountLocation]\backup" -StorageUser "[AzureStorageUser]"
#Copy files to Archive
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "D:\Databases\BackupArchive"
#Get date range to exclude
$exclusion = D:\Scripts\Get-Exclusion-Days.ps1 -startDate (Get-Date) -DaysBack 7
#Remove items that are not included in exclusion range
Remove-Item -Path "D:\Databases\BackupArchive\*.bak" -exclude $exclusion
#Move files to storage account. They will be destroyed
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "M:\" -Remove
#Remove remote backups that are not from todays backup
Remove-Item -Path "M:\*.bak" -exclude $exclusion
If I run the job step using SQL then the files get removed but do not appear in the storage account. If I run this code block manually, they get moved.
When I start up PowerShell on the server, I get an error message: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed." However, this does not really impact the rest of the operations (copying the backup files to BackupArchive folder, for instance).
I should mention that copy-item also fails to copy across to the share, but succeeds in copying to the /BackupArchive folder
Not sure if this will help, but you could try using the New-PSDrive cmdlet instead of net use to map your shares:
param (
[string]$DriveLetter,
[string]$StorageLocation,
[string]$StorageKey,
[string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
    $securedKey = $StorageKey | ConvertTo-SecureString -AsPlainText -Force
    $credentials = New-Object System.Management.Automation.PSCredential ($StorageUser, $securedKey)
    New-PSDrive -Name $DriveLetter -PSProvider FileSystem -Root $StorageLocation -Credential $credentials -Persist
}
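Invoked the same way as the original Azure-MapShare.ps1 (placeholder values as before); note that New-PSDrive takes the bare letter for -Name, e.g. 'M' rather than 'M:':
D:\Scripts\Azure-MapShare.ps1 -DriveLetter 'M' -StorageKey "[AzureStorageKey]" -StorageLocation "[AzureStorageAccountLocation]\backup" -StorageUser "[AzureStorageUser]"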
Apparently I tricked myself on this one. During testing I must have run the net use command in an elevated command prompt. This apparently hid the mapped drive from non-elevated OS features such as Windows Explorer, and from attempts to view its existence in non-elevated command prompt sessions. I suppose it was also automatically reconnecting during reboots, because rebooting did not fix it.
The solution was as easy as running the net use m: /delete command from an elevated command prompt.
I created a PowerShell script to remove all files and folders older than X days. This works perfectly fine and the logging is also ok. Because PowerShell is a bit slow, it can take some time to delete these files and folders when big quantities are to be treated.
My question: how can I run this script on multiple directories ($Target) at the same time?
Ideally, we would like to have this in a scheduled task on a Win 2008 R2 server and have an input file (txt, csv) to paste new target locations into.
Thank you for your help/advice.
The script
#================= VARIABLES ==================================================
$Target = "\\share\dir1"
$OlderThanDays = "10"
$Logfile = "$Target\Auto_Clean.log"
#================= BODY =======================================================
# Set start time
$StartTime = (Get-Date).ToShortDateString()+", "+(Get-Date).ToLongTimeString()
Write-Output "`nDeleting folders that are older than $OlderThanDays days:`n" | Tee-Object $LogFile -Append
Get-ChildItem -Directory -Path $Target |
Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$OlderThanDays) } | ForEach {
    $Folder = $_.FullName
    Remove-Item $Folder -Recurse -Force -ErrorAction SilentlyContinue
    $Timestamp = (Get-Date).ToShortDateString()+" | "+(Get-Date).ToLongTimeString()
    # If the folder can't be removed
    if (Test-Path $Folder)
    { "$Timestamp | FAILED: $Folder (IN USE)" }
    else
    { "$Timestamp | REMOVED: $Folder" }
} | Tee-Object $LogFile -Append # Output folder names to console & logfile at the same time
# Set end time & calculate runtime
$EndTime = (Get-Date).ToShortDateString()+", "+(Get-Date).ToLongTimeString()
$TimeTaken = New-TimeSpan -Start $StartTime -End $EndTime
# Write footer to log
Write-Output ($Footer = @"
Start Time : $StartTime
End Time : $EndTime
Total runtime : $TimeTaken
$("-"*79)
"@)
# Create logfile
Out-File -FilePath $LogFile -Append -InputObject $Footer
# Clean up variables at end of script
$Target=$StartTime=$EndTime=$OlderThanDays = $null
One way to achieve this would be to write an "outer" script that passes the directory to be cleaned into the "inner" script as a parameter.
For your "outer" script, have something like this:
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList;
foreach ($Directory in $DirectoryList) {
Start-Process -FilePath powershell.exe -ArgumentList ('"{0}\InnerScript.ps1" -Path "{1}"' -f $PSScriptRoot, $Directory);
}
Note: Using Start-Process kicks off a new process that is, by default, asynchronous. If you use the -Wait parameter, then the process will run synchronously. Since you want things to run more quickly and asynchronously, omitting the -Wait parameter should achieve the desired results.
Invoke-Command
Alternatively, you could use Invoke-Command to kick off a PowerShell script, using the parameters -FilePath, -ArgumentList, -ThrottleLimit, and -AsJob. The Invoke-Command command relies on PowerShell Remoting, so that must be enabled, at least on the local machine.
Add a parameter block to the top of your "inner" script (the one you posted above), like so:
param (
[Parameter(Mandatory = $true)]
[string] $Path
)
That way, your "outer" script can pass in the directory path, using the -Path parameter for the "inner" script.
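With that param block in place, the Invoke-Command variant might look like this; a sketch only, assuming the same DirList file as above and that PowerShell Remoting is enabled locally:
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList;
foreach ($Directory in $DirectoryList) {
    # Runs the inner script as a background job on the local machine;
    # -ThrottleLimit caps how many instances run concurrently.
    Invoke-Command -ComputerName localhost -FilePath "$PSScriptRoot\InnerScript.ps1" -ArgumentList $Directory -ThrottleLimit 4 -AsJob;
}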