I'm writing a script that moves all of my read emails older than 2 weeks to a separate PST for archiving. Once it is acceptable, I'll execute it via a rule.
However, my current code takes a very long time to complete (about 8 minutes), while simply doing a drag and drop in Outlook is phenomenally quicker.
Does anyone know of a better way to move large amounts of emails? Maybe via accessing Outlook's index?
Add-Type -AssemblyName "Microsoft.Office.Interop.Outlook"
$Outlook   = New-Object -ComObject Outlook.Application
$Namespace = $Outlook.GetNameSpace("MAPI")

# $SourcePSTName, $TargetPSTName, $Folder and $SentMaxDate are set earlier in the full script
$Items = 1
while ($Items -gt 0)
{
    $Items = 0
    $SourceFolder = $Namespace.Folders.Item($SourcePSTName).Folders.Item($Folder)
    $TargetFolder = $Namespace.Folders.Item($TargetPSTName).Folders.Item($Folder)

    # Pulls every item in the folder across COM, then filters client-side
    $AllOfDem = ($SourceFolder.Items | Where-Object {$_.SentOn -lt $SentMaxDate -and $_.Unread -eq $False})
    foreach ($Mail in $AllOfDem)
    {
        $Mail.Move($TargetFolder) | Out-Null
        $Items++
    }
}
I suspect your problem is not so much moving the messages (which can be optimized using Extended MAPI or Redemption (I am its author) to move all messages in a single call), but rather looping through all items in a folder - that is a huge problem.
Instead of looping, use Items.Find/FindNext or Items.Restrict to provide a query that only returns the matching items.
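For instance, a minimal sketch of the Restrict approach, reusing $SourceFolder, $TargetFolder and $SentMaxDate from the question (the date format in the filter string may need adjusting for your locale):

# Ask Outlook for only the matching items instead of streaming every item over COM.
$filter = "[SentOn] < '" + $SentMaxDate.ToString("g") + "' AND [UnRead] = False"
$Found  = $SourceFolder.Items.Restrict($filter)

# Walk the collection backwards: moving an item shrinks it, so a forward loop would skip items.
for ($i = $Found.Count; $i -ge 1; $i--) {
    $Found.Item($i).Move($TargetFolder) | Out-Null
}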
I had an Excel script to search for files from a command.
I found this example on a forum; it says that to search for a file by name, you write the name followed by a wildcard (*), but when I run it, it doesn't find anything:
Get-ChildItem -Path "C:\\Folder\\test\*"
What can I do to simplify the code and make it much faster? Waiting 10 minutes to find one file out of 10,000 is far too long.
I have a folder with 10,000 files, and an Excel/VBA script searches it in about 2-3 seconds.
But when I script it in PowerShell via
$find = Get-ChildItem -Path "C:\Folder"
for ($f = 0; $f -lt $find.Count; $f++){
    $path_name = $find[$f].Name
    if ($path_name -eq 'test'){
        Write-Host 'success'
    }
}
it takes far too long: the script hangs for about 10 minutes and stops responding, and only sometimes comes back with an answer.
How can I find a file by filter using Get-ChildItem?
To make your search faster you can use the -Filter parameter of Get-ChildItem.
$fileName = "test.txt"
$filter = "*.txt"
$status = Get-ChildItem -Path "C:\PS\" -Recurse -Filter $filter | Where-Object {$_.Name -match $fileName}
if ($status) {
    Write-Host "$($status.Name) is found"
} else {
    Write-Host "No such file is available"
}
You could also compare the speed of searching by using Measure-Command.
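For example, a rough way to time the two approaches (the path and pattern here are placeholders, not from the question):

# Provider-side -Filter versus filtering everything afterwards with Where-Object.
$withFilter = Measure-Command {
    Get-ChildItem -Path "C:\Folder" -Recurse -Filter "test*"
}
$withWhere = Measure-Command {
    Get-ChildItem -Path "C:\Folder" -Recurse | Where-Object { $_.Name -like "test*" }
}
"Filter:       $($withFilter.TotalSeconds) s"
"Where-Object: $($withWhere.TotalSeconds) s"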
If the disk the data is on is slow then it'll be slow no matter what you do.
If the folder is full of files then it'll also be slow depending on the amount of RAM in the system.
Less files per folder equals more performance so try to split them up into several folders if possible.
Doing that may also mean you can run several Get-ChildItems at once (disk permitting) using PSJobs.
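For instance, a rough sketch of that idea (the folder names are made up; each background job searches one folder):

# One background job per folder; results are collected once every job has finished.
$folders = "D:\Data\A", "D:\Data\B", "D:\Data\C"
$jobs = foreach ($folder in $folders) {
    Start-Job -ScriptBlock {
        param($dir)
        Get-ChildItem -Path $dir -Recurse -Filter "test*"
    } -ArgumentList $folder
}
$results = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job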
Using several loops to take care of a related problem usually makes the whole thing run "number of loops" times as long. That's what Where-Object is for (in addition to the -Filter, -Include and -Exclude flags to Get-ChildItem).
Console I/O takes A LOT of time. Do NOT output ANYTHING unless you have to, especially not inside loops (or cmdlets that act like loops).
For example, including basic statistics:
$StartTime = Get-Date
$FileList = Get-ChildItem -Path "C:\Folder" -File -Filter 'test'
$EndTime = Get-Date
$FileList
$AfterOutputTime = Get-Date
'Seconds taken for listing:'
($EndTime - $StartTime).TotalSeconds
'Seconds taken including output:'
($AfterOutputTime - $StartTime).TotalSeconds
Good evening,
I'm hardly experienced in programming, but every now and then I try to build the things I need myself.
What do I want to achieve with the script?
This script should read a text file of words.
There is one word per line. While reading, the script should check whether the word has between 3 and 16 letters. If it has fewer than 3 or more than 16, the word is skipped. But if it is between 3 and 16, the word is saved to a new text file, again one word per line.
Here is what I created.
Please don't judge my script too hard.
$regex = '[A-Z][a-z]{3,16}'
foreach ($line in Get-Content -Path C:\Users\Administrator\Desktop\namecheck\words.txt)
{
    if ($line -match $regex)
    {
        Out-File -FilePath C:\Users\Administrator\Desktop\namecheck\sorted.txt -Force
    }
}
As mentioned above, the words are not written to a file. However, the file is created and the script also takes a little while to finish. So from my point of view something seems to happen.
'[A-Z][a-z]{3,16}' only enforces a minimum length; because the pattern isn't anchored, words longer than 16 letters still match. That would be your first issue, and your second one is your export: Out-File isn't being told what you want to export. So either provide it a value via pipeline input, or use -InputObject:
$path = 'C:\Users\Administrator\Desktop\namecheck'
$stream = [System.IO.StreamReader]::new("$path\words.txt")
while ($null -ne ($line = $stream.ReadLine()))
{
    # -ge/-le keep words of exactly 3 or 16 letters, as described in the question
    if ($line.Length -ge 3 -and $line.Length -le 16)
    {
        Out-File -FilePath "$path\sorted.txt" -InputObject $line -Append -Force
    }
}
$stream.Close()
$stream.Dispose()
Although a foreach loop is the fastest of the loops, when it's fed by Get-Content it still has to wait for the entire file to be read before it can start moving through the list. I only mention this because you said the script takes quite a while; without knowing the size of your words.txt file, I'm left to assume that's the cause.
With that said, using [System.IO.StreamReader] should speed up the reading of the file, since you get to read and iterate through the file at the same time, using a while loop.
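If you'd rather stay with Get-Content, another option is to pipe it into ForEach-Object; the pipeline streams line by line instead of collecting the whole file first. A sketch, reusing the same $path as above:

# Streams lines one at a time rather than loading the whole file into memory first.
Get-Content -Path "$path\words.txt" | ForEach-Object {
    if ($_.Length -ge 3 -and $_.Length -le 16) { $_ }
} | Out-File -FilePath "$path\sorted.txt" -Force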
I'm new to Powershell scripting, and I'm struggling with how to identify when multiple files have been created in order to initiate a process.
Ultimately, I need to wait until a handful of files have been created (this usually occurs within a finite period of time the same time each day). These files are created at separate times. Once all files have been created and are at their final location, I need to perform a separate action with these files. The trouble I'm having is:
Identifying when all files are available
Identifying how to initiate a separate process once these files are available
If necessary, unregistering the events (I plan to run this script each morning... I don't know how long these events stay registered)
I've toyed with using the IO.FileSystemWatcher with some success in order to monitor when any individual directory has this file. While this logs appropriately, I don't know how to consolidate the collection of these files. Possibly a flag? Not sure how to implement. I've also considered using Test-Path as a way of checking to see if these files exist -- but a drawback with this is that I'd need to periodically run the script (or Sleep) for a pre-defined period of time.
Does anyone have experience with doing something like this? Or possibly provide guidance?
What I've tried (with respect to IO.FileSystemWatcher) using test data:
$i=0
$paths = "C:\","Z:\"
$fileName='Test'+$Datestr.Trim()+'.txt'
foreach ($path in $paths)
{
$fsw = New-Object IO.FileSystemWatcher $path, $fileName -Property #{IncludeSubdirectories = $tru;NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
Register-ObjectEvent $fsw Created -SourceIdentifier "$i+fileCreated" -Action {
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$fpath = $Event.SourceEventArgs.FullPath
$timeStamp = $Event.TimeGenerated
Write-Host "The folder "$Event.SourceEventArgs.FullPath" was $changeType at $timeStamp" -fore green
Out-File -FilePath Z:\log.txt -Append -InputObject "The folder $fpath was $changeType at $timeStamp"
}
I am guessing you do not have control over the process(es) that create the files, or else you would use the completion of that job to trigger the "post processing" you need to do. I have used IO.FileSystemWatcher on a loop/timer like you described, and then used Group-Object on the file names to get a distinct list of the files that have changed. I was using this to monitor small files (about 100 files at ~100KB each) that did not change frequently, and it still generated a large number of events every time the files changed. If you want to take action/start a process every time a change is made, then IO.FileSystemWatcher is your friend.
If the files are larger/take longer to generate, and because you only care once they are all done (not when they are created/modified), you may be better off skipping the file watcher and just checking whether all of the files are there. E.g. the process(es) that generate the files usually finish by 6am, so at 6am you run a script that checks whether all the files exist. (I would also check the file size/last modified date to help ensure that the process which generates each file is done writing to it.) You may still want to build a loop into this, especially if you want your processing to start as soon as the files are done.
$filesIWatch = @()
$filesIWatch += New-Object -TypeName psobject -Property @{"FullName"="C:\TestFile1.txt"
                                                          "AvgSize"=100}
$filesIWatch += New-Object -TypeName psobject -Property @{"FullName"="C:\TestFile2.txt"
                                                          "AvgSize"=100}
[bool]$filesDone = $true
foreach ($file in $filesIWatch) {
    # Check if the file is there. Also, maybe check the file size/modified date to be sure no other process is still writing to the file.
    if (-not (Test-Path ($file.FullName))) {
        $filesDone = $false
    }
}
if ($filesDone) {
    # do your processing
}
else {
    # throw error/handle them not being done
}
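If you do want the loop mentioned above, a rough sketch might look like this (the deadline, poll interval and the idea of treating a file as settled once its LastWriteTime stops changing are all assumptions to tune for your environment):

# Poll until every expected file exists and has not been written to for a few minutes,
# or until a deadline passes. All of the thresholds here are made up for illustration.
$deadline     = (Get-Date).AddHours(2)
$settledAfter = New-TimeSpan -Minutes 5
do {
    $allReady = $true
    foreach ($file in $filesIWatch) {
        $item = Get-Item -Path $file.FullName -ErrorAction SilentlyContinue
        if (-not $item -or ((Get-Date) - $item.LastWriteTime) -lt $settledAfter) {
            $allReady = $false
            break
        }
    }
    if (-not $allReady) { Start-Sleep -Seconds 60 }
} until ($allReady -or (Get-Date) -gt $deadline)
if ($allReady) {
    # do your processing
}
else {
    # handle the files never arriving before the deadline
}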
Long time lurker, first time poster. I'm looking (on my own initiative) to see if there is a method by which I can check for missing files that we expect to receive on a daily basis, and be notified via e-mail.
Our company has what I'd call a relatively unhinged systems infrastructure; since I arrived I've been chipping away here and there, putting in some practices and processes to be more proactive with our monitoring.
Specifically in this case, we receive files via FTP from a vendor that outline our Sales and other data. These files go through some validation and the data is then imported into our ERP platform. However, I am interested in putting in a check that raises an alert when a file has not been received when expected.
The last part of that requirement can potentially change; I'm not sure how specific I can get when trying to raise an alert for an expected file.
I'll preface this by stating I'm a relative novice in this area, but there is really no one in my department any the wiser, so I've been looking into PowerShell.
I've created the following two bits of code so far, which when executed appear to return files that have been created/last written within the last day. This alone would be enough if the output could be sent via e-mail: I would be able to spot quickly if an expected file is not in the list.
Get-ChildItem -Path "Path I am checking" |
    Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-1)}
The above returns one .csv file. I guess if I get a returned file, then I know it's been provided, and if the return is blank/empty, then I know I didn't get a file.
I've used the above for four separate checks, checking other subfolders in the structure.
To outline the folder structure
\"App server"\"Region"\"Vendor"
There are then the following subfolders
Purchases
Sales
Tenders
VAT
Each of the above four folders then has
Incoming
Processed
I am running my checks on the Processed folder for each of the four folders outlined above.
Maybe something like this will help you out:
Function Test-NewerFiles {
    # We use parameters as it makes things easy when we need to change things
    # CmdLetBinding makes sure that we can see our 'Write-Verbose' messages if we want to
    [CmdLetBinding()]
    Param (
        [String]$Path = 'C:\Users\me\Downloads\Input_Test',
        [String]$ExportFile = 'C:\Users\me\Downloads\Log_Test\Attachment.txt'
    )
    # We first save the date, then we don't need to do this every time again
    $CompareDate = (Get-Date).AddDays(-1)

    # Then we collect only the folders and check each folder for files and count them
    Get-ChildItem -Path $Path -Directory -Recurse | ForEach-Object {
        $Files = (Get-ChildItem -Path $_.FullName -File | Where-Object {$_.LastWriteTime -gt $CompareDate} | Measure-Object).Count
        # If we didn't find files the count is 0 and we report this
        if ($Files -eq 0) {
            Write-Verbose "No files found in folder $($_.FullName)"
            Write-Output $_.FullName
        }
        # If we found files it's ok and we don't report it
        else {
            Write-Verbose "Files found in folder $($_.FullName)"
        }
    }
}
# If you don't want to see output you can remove the '-Verbose' switch
Test-NewerFiles -Verbose
$MyNewFiles = Test-NewerFiles
# $ExportFile only exists inside the function, so define it here as well before using it
$ExportFile = 'C:\Users\me\Downloads\Log_Test\Attachment.txt'
$MyNewFiles | Out-File -FilePath $ExportFile -Encoding utf8

# Splatting: @MailParams passes these values to Send-MailMessage as parameters
$MailParams = @{
    To         = 'Chuck.Norris@world.com'
    From       = 'MyScriptServer@world.com'
    SmtpServer = 'SMTPServer'
}
if ($MyNewFiles) {
    Send-MailMessage @MailParams -Priority High -Attachments $ExportFile -Body 'We found problems: check attachment for details'
}
else {
    Send-MailMessage @MailParams -Priority Low -Body 'All is ok'
}
The Verbose switch is only used to report progress. So we can see what it does when it's running. But when we use this code in production, we don't need these messages and just use Test-NewerFiles instead of Test-NewerFiles -Verbose.
I have a folder which contains thousands of PDF files. I need to filter through these files based on file name (which will group them into 2 or more PDFs) and then merge those 2 or more PDFs into 1 PDF.
I'm OK with grouping the files, but I'm not sure of the best way to then merge them into 1 PDF. I have researched iTextSharp but have been unable to get it to work in PowerShell.
Is iTextSharp the best way of doing this? Any help with the code for this would be much appreciated.
Many thanks
Paul
Have seen a few of these PowerShell-tagged questions that are also tagged with itextsharp, and always wondered why answers are given in .NET, which can be very confusing unless the person asking the question is proficient in PowerShell to begin with. Anyway, here's a simple working PowerShell script to get you started:
$workingDirectory = Split-Path -Parent $MyInvocation.MyCommand.Path;
$pdfs = ls $workingDirectory -recurse | where {-not $_.PSIsContainer -and $_.Extension -imatch "^\.pdf$"};
[void] [System.Reflection.Assembly]::LoadFrom(
[System.IO.Path]::Combine($workingDirectory, 'itextsharp.dll')
);
$output = [System.IO.Path]::Combine($workingDirectory, 'output.pdf');
$fileStream = New-Object System.IO.FileStream($output, [System.IO.FileMode]::OpenOrCreate);
$document = New-Object iTextSharp.text.Document;
$pdfCopy = New-Object iTextSharp.text.pdf.PdfCopy($document, $fileStream);
$document.Open();
foreach ($pdf in $pdfs) {
    $reader = New-Object iTextSharp.text.pdf.PdfReader($pdf.FullName);
    $pdfCopy.AddDocument($reader);
    $reader.Dispose();
}
$pdfCopy.Dispose();
$document.Dispose();
$fileStream.Dispose();
To test:
Create an empty directory.
Copy code above into a Powershell script file in the directory.
Copy the itextsharp.dll to the directory.
Put some PDF files in the directory.
Not sure how you intend to group/filter the PDFs based on file name, or if that's your intention (couldn't tell if you meant to just pick out PDFs by extension), but that shouldn't be too hard to add.
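For instance, if the grouping key were everything before the first hyphen in the file name (purely an assumption for illustration), each group could be merged into its own output with the same PdfCopy approach as above:

# Group the PDFs by an assumed "<key>-something.pdf" naming convention and merge
# each group into "<key>-merged.pdf".
$groups = $pdfs | Group-Object { ($_.BaseName -split '-')[0] };
foreach ($group in $groups) {
    $merged     = [System.IO.Path]::Combine($workingDirectory, "$($group.Name)-merged.pdf");
    $fileStream = New-Object System.IO.FileStream($merged, [System.IO.FileMode]::OpenOrCreate);
    $document   = New-Object iTextSharp.text.Document;
    $pdfCopy    = New-Object iTextSharp.text.pdf.PdfCopy($document, $fileStream);
    $document.Open();
    foreach ($pdf in $group.Group) {
        $reader = New-Object iTextSharp.text.pdf.PdfReader($pdf.FullName);
        $pdfCopy.AddDocument($reader);
        $reader.Dispose();
    }
    $pdfCopy.Dispose();
    $document.Dispose();
    $fileStream.Dispose();
}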
Good luck. :)