Some useful info beforehand: what I'm attempting to do is read output from an external command, specifically steamcmd, using PowerShell's Start-Process and System.Diagnostics.ProcessStartInfo.
What I'm running into is the RedirectStandardOutput buffer limit of 4096 bytes. The output I'm getting from steamcmd is larger than that buffer, so I'm only getting a portion of what I need. I have no other method for getting this data than calling steamcmd.
You can see the output as well if you have steamcmd (it's free) and run this:
steamcmd +login anonymous +app_info_update 1 +app_info_print 443030 +quit
This will download all the manifest info about that appid.
I've tried redirecting to a file and also to a variable; both work as expected, it's just that the output is cut short by the buffer. There also doesn't appear to be a PowerShell method in System.Diagnostics.Process to wait for the OutputDataReceived event.
Code used (stolen from another Stack Overflow question):
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.CreateNoWindow = $true
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$psi.FileName = "C:\Path\to\SteamCMD.exe"
$psi.Arguments = "+login anonymous +app_info_update 1 +app_info_print 443030 +quit"
$process = New-Object System.Diagnostics.Process
$process.StartInfo = $psi
[void]$process.Start()
$output = $process.StandardOutput.ReadToEnd()
$process.WaitForExit()
$output
I think the actual issue is that SteamCMD just outputs everything in one big write instead of line by line. I guess a better question would be: how can I increase the StandardOutput buffer size of Start-Process or System.Diagnostics.Process?
Note: redirecting with steamcmd > somefile.txt from the console results in the same truncation.
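For reference, the OutputDataReceived event can be consumed from PowerShell via Register-ObjectEvent; here is a minimal sketch of that approach (the steamcmd path is a placeholder, and the trailing sleep is a crude way to let queued events drain):
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = "C:\Path\to\SteamCMD.exe"   # placeholder path
$psi.Arguments = "+login anonymous +app_info_update 1 +app_info_print 443030 +quit"
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true
$psi.CreateNoWindow = $true
$process = New-Object System.Diagnostics.Process
$process.StartInfo = $psi
# Each line steamcmd writes is delivered as an event and collected here.
$lines = New-Object System.Collections.Generic.List[string]
$null = Register-ObjectEvent -InputObject $process -EventName OutputDataReceived -MessageData $lines -Action {
    if ($null -ne $EventArgs.Data) { $Event.MessageData.Add($EventArgs.Data) }
}
[void]$process.Start()
$process.BeginOutputReadLine()   # asynchronous line-by-line reads; no fixed buffer to overrun
$process.WaitForExit()
Start-Sleep -Milliseconds 500    # crude: give queued events a chance to fire
$output = $lines -join [Environment]::NewLine
$output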
steamcmd.exe appears to work properly only if run from an empty subdirectory, at least for this command line.
I can't explain it, but I was able to repro your issue when I ran the command twice.
Here is one way to work around it: run steamcmd.exe from an empty directory. Depending on your needs, you could use a static temp dir and clean it before each run, or generate a fresh temp dir each time and decide how to clean it up later.
CODE
$steamcmd = "L:\test\steam\steamcmd.exe"
# steam works best if run in an empty directory
# why, i have no clue...
$tempName = [System.IO.Path]::GetRandomFileName()
$parentDir = Split-Path $steamcmd -Parent
$tempPath = Join-Path $parentDir $tempName
$null = New-Item $tempPath -ItemType Directory
$null = Copy-Item $steamcmd $tempPath
Write-Output "temp directory is '$tempPath'"
$steamcmdTemp = Join-Path $tempPath 'steamcmd.exe'
Write-Output "Running '$steamcmdTemp'..."
$output = & $steamcmdTemp +login anonymous +app_info_update 1 +app_info_print 443030 +quit
$now = [DateTime]::Now.ToString("yyyyMMdd HHmmss")
$outFile = "output {0}.txt" -f $now
$outputFile = Join-Path $parentDir $outFile
Write-Output "Saving output to '$outputFile'"
$output | Out-File $outputFile -Force
# todo: deal with removing the temp dir...
# or keep cleaning and reusing a static name...
Write-Output "Remember to cleanup '$tempPath'"
Related
SCRIPT PURPOSE
The idea behind the script is to recursively extract the text from a large number of documents and update a field in an Azure SQL database with the extracted text. Basically, we are moving away from Windows Search of document contents to SQL full-text search to improve speed.
ISSUE
When the script encounters an issue opening a file, such as it being password protected, it fails for every document that follows. Here is the section of the script that processes the files:
foreach ($list in (Get-ChildItem (Join-Path $PSScriptRoot "\FileLists\*") -Include *.txt)) {
    ## Word object
    $word = New-Object -ComObject word.application
    $word.Visible = $false
    $saveFormat = [Enum]::Parse([Microsoft.Office.Interop.Word.WdSaveFormat], "wdFormatText")
    $word.DisplayAlerts = 0
    Write-Output ""
    Write-Output "################# Parsing $list"
    Write-Output ""
    $query = "INSERT INTO tmp_CachedText (tCachedText, tOID)
    VALUES "
    foreach ($file in (Get-Content $list)) {
        if ($file -like "*-*" -and $file -notlike "*~*") {
            Write-Output "Processing: $($file)"
            Try {
                $doc = $word.Documents.OpenNoRepairDialog($file, $false, $false, $false, "ttt")
                if ($doc) {
                    $fileName = [io.path]::GetFileNameWithoutExtension($file)
                    $fileName = $fileName + ".txt"
                    $doc.SaveAs("$env:TEMP\$fileName", [ref]$saveFormat)
                    $doc.Close()
                    $4ID = $fileName.split('-')[-1].replace(' ', '').replace(".txt", "")
                    $text = Get-Content -Raw "$env:TEMP\$fileName"
                    $text = $text.replace("'", "''")
                    $query += "
                    ('$text', $4ID),"
                    Remove-Item -Force "$env:TEMP\$fileName"
                    <# Upload to azure #>
                    $query = $query.Substring(0, $query.Length - 1)
                    $query += ";"
                    Invoke-Sqlcmd @params -Query $query -ErrorAction "SilentlyContinue"
                    $query = "INSERT INTO tmp_CachedText (tCachedText, tOID)
                    VALUES "
                }
            }
            Catch {
                Write-Host "$($file) failed to process" -ForegroundColor Red
                continue
            }
        }
    }
    Remove-Item -Force $list.FullName
    Write-Output ""
    Write-Output "Uploading to azure"
    Write-Output ""
    <# Upload to azure #>
    Invoke-Sqlcmd @params -Query $setQuery -ErrorAction "SilentlyContinue"
    $word.Quit()
    TASKKILL /f /IM WINWORD.EXE
}
Basically, it parses through a folder of .txt files that each contain some number of document paths, builds a T-SQL INSERT statement, and runs it against an Azure SQL database after each list file is fully parsed. The list files are generated with the following:
if (!($continue)) {
    if ($pdf) {
        $files = (Get-ChildItem -Force -Recurse $documentFolder -Include *.pdf).FullName
    }
    else {
        $files = (Get-ChildItem -Force -Recurse $documentFolder -Include *.doc, *.docx).FullName
    }
    $files | Out-File (Join-Path $PSScriptRoot "\documents.txt")
    $i = 0
    Get-Content $documentFile -ReadCount $interval | ForEach-Object { $i++; $_ | Out-File (Join-Path $PSScriptRoot "\FileLists\documents_$i.txt") }
}
The $interval variable defines how many files are extracted for each upload to Azure. Initially I had the Word object created outside the loop and never closed until the end. Unfortunately this doesn't work: every time the script hits a file it cannot open, every file that follows fails, until it reaches the end of the inner loop foreach ($file in (Get-Content $list)).
This means that to get the expected outcome I have to run this with an interval of 1, which takes far too long.
This is a shot in the dark, but to me it sounds like it's failing because the Word COM object is prompting you for some action since it cannot open the file, so all following items in the loop also fail. This might explain why it works when you set $interval to 1: at 1 it closes and reopens the COM object every time, which takes forever (I ran into the same thing with Excel).
What you can do is, in your Catch statement, close the COM object and open a new one, which should let you continue with the loop (though it will be a bit slower if it has to reopen the COM object a lot).
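For example, the Catch block from the question could be extended along these lines (a sketch; $word is the variable from the question's loop):
Catch {
    Write-Host "$($file) failed to process" -ForegroundColor Red
    # Recovery sketch: discard the wedged Word instance and start a
    # fresh one before moving on to the next file.
    try { $word.Quit() } catch { }
    $word = New-Object -ComObject Word.Application
    $word.Visible = $false
    $word.DisplayAlerts = 0
    continue
}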
If you want to debug the problem further, make the COM object visible and step slowly through your program without interacting with Word. This will show you what is happening in Word and whether any prompts are causing the application to hang.
Of course, if you want to run at full speed, you will need to detect beforehand which documents you can't open, or you could multithread it by opening several Word COM objects, which would let you load several documents at a time.
As for...
ISSUE
When the script encounters an issue opening the file such as it being password protected, it fails for every single document that follows.
... then test for this as noted here...
How to check if a word file has a password?
$filename = "C:\path\to\your.doc"
$wd = New-Object -COM "Word.Application"
try {
    $doc = $wd.Documents.Open($filename, $null, $null, $null, "")
} catch {
    Write-Host "$filename is password-protected!"
}
... and skip the file to avoid the failure of the remaining files.
I am trying to join multiple log files into a single archive file, then move that archive to another location, in an effort both to clean up old log files and to save hard drive space. We have a bunch of tools that all log to the same root, with a per-tool folder for their logs, e.g.
C:\ServerLogs
C:\ServerLogs\App1
C:\ServerLogs\2ndApp
each of which will have log files inside, like
C:\ServerLogs\App1\June1.log
C:\ServerLogs\App1\June2.log
C:\ServerLogs\2ndApp\June1.log
C:\ServerLogs\2ndApp\June2.log
I want to go into each of these subfolders, archive up all the files older than 5 days, then move the archive to another (long-term storage) drive and delete the now-zipped files. The tools I'm using are PowerShell and 7-Zip. The code below uses test locations.
I have cobbled together two scripts from various sources online, over the course of two full shifts, but neither one works right. Here's the first:
# Alias for 7-zip
if (-not (test-path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
set-alias 7zip "$env:ProgramFiles\7-Zip\7z.exe"
$Days = 5 #minimum age of files to archive; in other words, newer than this many days ago are ignored
$SourcePath = "C:\WorkingFolder\FolderSource\"
$DestinationPath = "C:\Temp\"
$LogsToArchive = Get-ChildItem -Recurse -Path $SourcePath | Where-Object {$_.lastwritetime -le (get-date).addDays(-$Days)}
$archive = $DestinationPath + $now + ".7z"
#endregion
foreach ($log in $LogsToArchive) {
    #define Args
    $Args = a -mx9 $archive $log
    $Command = 7zip
    #write-verbose $command
    #invoke the command
    invoke-expression -command $Command $Args
}
The problem with this one is that I get errors trying to invoke the expression. I've tried restructuring it, but then I get errors because my $Args contains a bare "a".
So I abandoned this method (despite it being my preferred one) and tried this set.
#region Params
param(
    [Parameter(Position=0, Mandatory=$true)]
    [ValidateScript({Test-Path -Path $_ -PathType 'container'})]
    [System.String]
    $SourceDirectory,

    [Parameter(Position=1, Mandatory=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $DestinationDirectory
)
#endregion

function Compress-File{
    #region Params
    param(
        [Parameter(Position=0, Mandatory=$true)]
        [ValidateScript({Test-Path -Path $_ -PathType 'leaf'})]
        [System.String]
        $InputFile,

        [Parameter(Position=1, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $OutputFile
    )
    #endregion
    try{
        #Creating buffer with size 50MB
        $bytesGZipFileBuffer = New-Object -TypeName byte[](52428800)
        $streamGZipFileInput = New-Object -TypeName System.IO.FileStream($InputFile,[System.IO.FileMode]::Open,[System.IO.FileAccess]::Read)
        $streamGZipFileOutput = New-Object -TypeName System.IO.FileStream($OutputFile,[System.IO.FileMode]::Create,[System.IO.FileAccess]::Write)
        $streamGZipFileArchive = New-Object -TypeName System.IO.Compression.GZipStream($streamGZipFileOutput,[System.IO.Compression.CompressionMode]::Compress)

        for($iBytes = $streamGZipFileInput.Read($bytesGZipFileBuffer, 0, $bytesGZipFileBuffer.Count);
            $iBytes -gt 0;
            $iBytes = $streamGZipFileInput.Read($bytesGZipFileBuffer, 0, $bytesGZipFileBuffer.Count)){
            $streamGZipFileArchive.Write($bytesGZipFileBuffer, 0, $iBytes)
        }

        $streamGZipFileArchive.Dispose()
        $streamGZipFileInput.Close()
        $streamGZipFileOutput.Close()
        Get-Item $OutputFile
    }
    catch { throw $_ }
}

Get-ChildItem -Path $SourceDirectory -Recurse -Exclude "*.7z" | ForEach-Object{
    if($($_.Attributes -band [System.IO.FileAttributes]::Directory) -ne [System.IO.FileAttributes]::Directory){
        #Current file
        $curFile = $_
        #Check the file wasn't modified recently
        if($curFile.LastWriteTime.Date -le (Get-Date).AddDays(-5)){
            $containedDir = $curFile.Directory.FullName.Replace($SourceDirectory,$DestinationDirectory)
            #if target directory doesn't exist - create
            if($(Test-Path -Path "$containedDir") -eq $false){
                New-Item -Path "$containedDir" -ItemType directory
            }
            Write-Host $("Archiving " + $curFile.FullName)
            Compress-File -InputFile $curFile.FullName -OutputFile $("$containedDir\" + $curFile.Name + ".7z")
            Remove-Item -Path $curFile.FullName
        }
    }
}
This actually seems to work, insofar as it creates individual archives for each eligible log, but I need to bundle the logs into one mega-archive, and I can't seem to figure out how to recurse (to get sub-level items) and use a foreach (to confirm age) without that foreach producing individual archives.
I haven't even gotten into the Move and Delete phase, because I can't seem to get the archiving stage to work properly, but I certainly don't mind grinding away at that once this gets figured out (I've already spent two full days trying to figure this one!).
I greatly appreciate any and all suggestions! If I've not explained something, or been a bit unclear, please let me know!
EDIT1: Part of the requirement, which I completely forgot to mention, is that I need to keep the structure in the new location. So the new location will have
C:\ServerLogs --> C:\Archive\
C:\ServerLogs\App1 --> C:\Archive\App1
C:\ServerLogs\2ndApp --> C:\Archive\2ndApp
C:\Archive
C:\Archive\App1\archivedlogs.zip
C:\Archive\2ndApp\archivedlogs.zip
And I have absolutely no idea how to specify that the logs from App1 need to go to App1.
EDIT2: For this latter part, I used Robocopy - It maintains the folder structure, and if you feed it ".zip" as an argument, it'll only do the .zip files.
This line, $Args = a -mx9 $archive $log, likely needs to have the right-side value wrapped in double quotes, OR each non-variable wrapped in quotes with a comma between them, so that you get an array of args.
Another method would be to declare an array of args explicitly, something like this ...
$ArgList = @(
'a'
'-mx9'
$archive
$log
)
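Then hand the array to 7-Zip with the call operator; each element is passed to the exe as a separate argument (using the 7zip alias defined in the question's first script):
& 7zip $ArgList   # expands to: a -mx9 <archive> <log>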
I also recommend you NOT use an automatic variable name. Take a look at Get-Help about_Automatic_Variables and you will see that $Args is one of those. You are strongly recommended NOT to use any of them for anything other than reading; writing to them is iffy. [grin]
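As for the mega-archive requirement from EDIT1, here is one possible shape (a sketch, untested: it assumes the 7zip alias and $Days from the question's first script, the example paths from the question, and 7-Zip's @listfile syntax for reading the names to add from a text file):
$cutoff = (Get-Date).AddDays(-$Days)
foreach ($dir in Get-ChildItem 'C:\ServerLogs' -Directory) {
    $old = Get-ChildItem $dir.FullName -File | Where-Object { $_.LastWriteTime -le $cutoff }
    if (-not $old) { continue }
    # Mirror the per-tool folder name under the archive root.
    $destDir = Join-Path 'C:\Archive' $dir.Name
    $null = New-Item -ItemType Directory -Force $destDir
    # 7-Zip reads the list of files to add from an @listfile.
    $listFile = [System.IO.Path]::GetTempFileName()
    $old.FullName | Set-Content $listFile -Encoding UTF8
    & 7zip a -mx9 (Join-Path $destDir 'archivedlogs.7z') "@$listFile"
    if ($LASTEXITCODE -eq 0) { $old | Remove-Item -Force }   # delete only after a clean exit
    Remove-Item $listFile
}
Robocopy can still handle the long-term-storage move afterwards, as noted in EDIT2.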
What I am looking for is to have PowerShell read file content out to the speech synthesis module.
File name for this example will be read.txt.
Start of the Speech module:
Add-Type -AssemblyName System.speech
$Narrator1 = New-Object System.Speech.Synthesis.SpeechSynthesizer
$Narrator1.SelectVoice('Microsoft Zira Desktop')
$Narrator1.Rate = 2
$Location = "$env:userprofile\Desktop\read.txt"
$Contents = Get-Content $Location
Get-Content $Location -wait -Tail 2 | where {$Narrator1.Speak($Contents)}
This works once. I'd like to use Clear-Content to wipe read.txt after each initial read and have PowerShell wait until a new line is added to read.txt, then process it again and speak the new content. I believe I can also make it run in the background with -WindowStyle Hidden.
Thank you in advance for any assistance.
Scott
I don't think a loop is the answer, I would use the FileSystemWatcher to detect when the file has changed. Try this:
$fsw = New-Object System.IO.FileSystemWatcher
$fsw.Path = "$env:userprofile\Desktop"
$fsw.Filter = 'read.txt'
$fsw.EnableRaisingEvents = $true  # required, or the watcher never raises events

Register-ObjectEvent -InputObject $fsw -EventName Changed -Action {
    Add-Type -AssemblyName System.Speech
    $Narrator1 = New-Object System.Speech.Synthesis.SpeechSynthesizer
    $Narrator1.SelectVoice('Microsoft Zira Desktop')
    $Narrator1.Rate = 2
    $file = $Event.SourceEventArgs.FullPath
    $Contents = Get-Content $file
    $Narrator1.Speak($Contents)
}
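When you want to stop watching, tear the subscription and the watcher down; a minimal sketch (this unregisters every event subscription in the session, so adjust if you have others):
Get-EventSubscriber | Unregister-Event
$fsw.EnableRaisingEvents = $false
$fsw.Dispose()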
Your only problem was that you accidentally used the previously assigned $Contents variable in the where (Where-Object) script block rather than $_, the automatic variable representing the current pipeline object:
Get-Content $Location -Wait -Tail 2 | Where-Object { $Narrator1.Speak($_) }
Get-Content $Location -Wait will poll the input file ($Location here) every second to check for new content and pass it through the pipeline (the -Tail argument only applies to the initial reading of the file; as new lines are added, they are all passed through).
The pipeline will stay alive indefinitely - until you delete the $Location file or abort processing.
Since the command is blocking, you obviously need another session / process to add content to file $Location, such as another PowerShell window or a text editor that has the file open and modifies its content.
You can keep appending to the file with >>, but that will keep growing it.
To discard the file's previous content, you must indeed use Clear-Content, as you say, which truncates the existing file without recreating it, and therefore keeps the pipeline alive; e.g.:
Clear-Content $Location
'another line to speak' > $Location
Caveat: special characters such as ! and ? seem to cause a silent failure to speak. If anyone knows why, do tell us. The docs offer no immediate clues.
As for background operation:
With a background job, curiously, the Clear-Content / > combination appears not to work; if anybody knows why, please tell us.
However, using >> - which grows the file - does work.
The following snippet demonstrates the use of a background job to keep speaking input as it is being added to a specified file (with some delay), until a special end-of-input string is sent:
# Determine the input file (on the user's desktop)
$file = Join-Path ([environment]::GetFolderPath('Desktop')) 'read.txt'
# Initialize the input file.
$null > $file
# Define a special string that acts as the end-of-input marker.
$eofMarker = '[quit]'
# Start the background job (PSv3+ syntax)
$job = Start-Job {
    Add-Type -AssemblyName System.Speech
    $Narrator1 = New-Object System.Speech.Synthesis.SpeechSynthesizer
    $Narrator1.SelectVoice('Microsoft Zira Desktop')
    $Narrator1.Rate = 2
    while ($true) { # A dummy loop we can break out of on receiving the end-of-input marker
        Get-Content $using:file -Wait | Where-Object {
            if ($_ -eq $using:eofMarker) { break } # End-of-input marker received -> exit the pipeline.
            $Narrator1.Speak($_)
        }
    }
    # Remove the input file.
    Remove-Item -ErrorAction Ignore -LiteralPath $using:file
}

# Speak 1, 2, ..., 10
1..10 | ForEach-Object {
    Write-Verbose -Verbose $_
    # !! Inexplicably, using Clear-Content followed by > to keep
    # !! replacing the file content does *not* work with a background task.
    # !! >> - which *appends* to the file - does work, however.
    $_ >> $file
}
# Send the end-of-input marker to make the background job stop reading.
$eofMarker >> $file
# Wait for background processing to finish.
# Note: We'll get here long before the background job has finished speaking.
Write-Verbose -Verbose 'Waiting for processing to finish to cleanup...'
$null = Receive-Job $job -wait -AutoRemoveJob
I have a PowerShell script that moves files from a source directory to a target directory every 15 minutes. Files of around 1 MB are placed in the source directory by an SFTP server, so files can be written at any time by the SFTP clients.
The Move-Item command is moving files; however, it seems to move them without making sure the file isn't still being written to (in use?).
I need some help coming up with a way to write the files from the source to the target and make sure the entire file gets to the target. Anyone run across this issue before with Powershell?
I searched and was able to find a few functions that said they solved the problem but when I tried them out I wasn't seeing the same results.
Existing PowerShell script is below:
Move-Item "E:\SFTP_Server\UserFolder\*.*" "H:\TargetFolder\" -Verbose -Force *>&1 | Out-File -FilePath E:\Powershell_Scripts\LOGS\MoveFilesToTarget-$(get-date -f yyyy-MM-dd-HH-mm-ss).txt
I ended up cobbling together a few things and got this working as I wanted. Basically I'm looping through the files, checking the length of each file once, then waiting a second and checking the length again to see whether it has changed. This seems to be working well. Here's a copy of the script in case it helps anyone in the future!
$logfile = "H:\WriteTest\LogFile_$(Get-Date -Format 'yyyyMMdd_hhmmsstt').txt"

function log($string, $color)
{
    if ($null -eq $color) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $string | Out-File -FilePath $logfile -Append
}

$SourcePath = "E:\SFTP_Server\UserFolder\"
$TargetPath = "H:\TargetFolder\"
$Stuff = Get-ChildItem "$SourcePath\*.*" | Select-Object name, fullname

ForEach ($I in $Stuff) {
    log "Starting to process $($I.name)" green
    $newfile = $TargetPath + $I.name
    $LastLength = 1
    $NewLength = (Get-Item $I.fullname).length
    while ($NewLength -ne $LastLength) {
        $LastLength = $NewLength
        Start-Sleep -Seconds 1
        log "Waiting 1 Second" green
        $NewLength = (Get-Item $I.fullname).length
        log "Current File Length = $NewLength" green
    }
    log "File Not In Use - Ready To Move!" green
    Move-Item $I.fullname $TargetPath
}
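An alternative to length-polling that some may prefer: try to open the file exclusively and only move it once that succeeds. A sketch (it assumes the SFTP server keeps the file open while writing, so the exclusive open fails until the upload finishes):
function Test-FileLocked([string]$path) {
    try {
        # Exclusive open: fails if any other process still has the file open.
        $fs = [System.IO.File]::Open($path, 'Open', 'ReadWrite', 'None')
        $fs.Close()
        $false
    }
    catch {
        $true
    }
}
# Usage inside the loop above, instead of the length checks:
# if (-not (Test-FileLocked $I.fullname)) { Move-Item $I.fullname $TargetPath }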
After getting help in Print PDF to XPS using Powershell, I made two scripts: one converts PDF to XPS and the other converts XPS back to PDF, using a workflow. When I combine them into one script, the first workflow converts the file to XPS; however, the second workflow doesn't seem to run at the appropriate time, or at all.
There are some cleanup items at the end of the second workflow that do seem to work properly. Can I somehow delay the start of the second workflow so that it runs at the appropriate time?
# Define the directories used for converting files
$secure_pdf_dir = Select-FolderDialog # the variable contains user folder selection
$xps_dir = "aaa"
$unsecure_pdf_dir = "bbb"
$secure_pdf_archive = "ccc"
$xps_archive = "ddd"
$unsecure_pdf_archive = "eee"
$date = $(get-date -f yyyy-MMM-dd)
# Archives old files except secure pdfs
# Archives old xps files
New-Item $xps_archive\$date -type Directory
copy-Item $xps_dir\* -Recurse $xps_archive\$date -force
Remove-Item $xps_dir\*
# Archives old unsecure pdf files
New-Item $unsecure_pdf_archive\$date -type Directory
copy-Item $unsecure_pdf_dir\* -Recurse $unsecure_pdf_archive\$date -force
Remove-Item $unsecure_pdf_dir\*
# Converts PDF to XPS
function print_files($secure_pdf_dir){
    Get-ChildItem $secure_pdf_dir -Filter *.pdf -Recurse | Foreach-Object {
        #For each .pdf file in that directory, continue
        same_time $_.FullName
    }
}

# The following function keeps checking for a new window called "Save Print Output As". When the window shows up, it enters the name of the file and presses ENTER.
function enter_my_names($fullname){
    $wshell = New-Object -ComObject wscript.shell
    while($wshell.AppActivate('Save Print Output As') -ne $true){
        $wshell.AppActivate('Save Print Output As')
    }
    $basename = [io.path]::GetFileNameWithoutExtension($fullname)
    #This is where the name is actually entered
    $wshell.SendKeys("$basename")
    $wshell.SendKeys("{ENTER}")
}

# The following workflow simultaneously launches a print job on the input file and a function that waits for the print dialog to show up to name the file.
workflow same_time{
    Param(
        $fullname
    )
    parallel{
        Start-Process -FilePath $fullname -Verb Print -PassThru
        enter_my_names($fullname)
    }
}
# MAIN PROGRAM
# Here the script saves your current printer as default
$defprinter = Get-WmiObject -Query "Select * from Win32_Printer Where Default=$true"
# Queries for a XPS printer
$printer = Get-WmiObject -Query "Select * from Win32_Printer Where Name='Microsoft XPS Document Writer'"
#Sets the XPS printer as Default
$printer.SetDefaultPrinter()
# Starts the main job
print_files($secure_pdf_dir)
# Sets the old default printer back as default again
$defprinter.SetDefaultPrinter()
# This is a small delay to be sure everything is completed before closing Adobe Reader. You can probably shorten it a bit
sleep 5
# Finally, close Adobe Reader
Get-Process "acrord32" | Stop-Process
Move-Item $secure_pdf_dir\*.oxps $xps_dir
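# One possible way to make the second pass start at the right time, sketched
# here as an alternative to the fixed sleeps: poll the XPS printer's queue
# until it is empty (WMI-based; the '*XPS*' name match is an assumption -
# adjust it to your printer's name).
while (Get-WmiObject Win32_PrintJob | Where-Object { $_.Name -like '*XPS*' }) {
    Start-Sleep -Seconds 1   # wait for outstanding print jobs to finish
}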
# Converts XPS to PDF
function print_xps_files($xps_dir){
    Get-ChildItem $xps_dir -Filter *.oxps -Recurse | Foreach-Object {
        #For each .oxps file in that directory, continue
        same_time $_.FullName
    }
}

# The following function keeps checking for a new window called "Save Print Output As". When the window shows up, it enters the name of the file and presses ENTER.
function enter_my_xps_names($fullname){
    $wshell = New-Object -ComObject wscript.shell
    while($wshell.AppActivate('Save Print Output As') -ne $true){
        $wshell.AppActivate('Save Print Output As')
    }
    $basename = [io.path]::GetFileNameWithoutExtension($fullname)
    #This is where the name is actually entered
    $wshell.SendKeys("$basename")
    $wshell.SendKeys("{ENTER}")
}

function press_enter($enter){
    $wshell = New-Object -ComObject wscript.shell
    while($wshell.AppActivate('Print') -ne $true){
        $wshell.AppActivate('Print')
    }
    $wshell.SendKeys("{ENTER}")
}

# The following workflow runs the print job, the Print-dialog handler, and the save-dialog handler in sequence.
workflow same_time2{
    Param(
        $fullname
    )
    sequence{
        Start-Process -FilePath $fullname -Verb Print -PassThru
        press_enter($enter)
        enter_my_xps_names($fullname)
    }
}
# MAIN PROGRAM
# Here the script saves your current printer as default
$defprinter = Get-WmiObject -Query "Select * from Win32_Printer Where Default=$true"
# Queries for a XPS printer
$printer = Get-WmiObject -Query "Select * from Win32_Printer Where Name='Microsoft XPS Document Writer'"
# Sets the XPS printer as Default
$printer.SetDefaultPrinter()
# Starts the main job
print_files($xps_dir)
# Sets the old default printer back as default again
$defprinter.SetDefaultPrinter()
# This is a small delay to be sure everything is completed before closing Adobe Reader. You can probably shorten it a bit
sleep 5
# Archives old secure pdfs
# Archives old secure pdf files
New-Item $secure_pdf_archive\$date -type Directory
copy-Item $secure_pdf_dir\* -Recurse $secure_pdf_archive\$date -force
Remove-Item $secure_pdf_dir\*