I'm working on something that extracts information from my desktop Outlook application. It works for most of the folders I've tried it on, but for some that have nearly a decade of e-mails, I get an "Exception getting 'ReceivedTime': 'Insufficient memory to continue the execution of the program.'" This is what I'm trying:
# New Outlook object
$ol = new-object -comobject "Outlook.Application";
# MAPI namespace
$mapi = $ol.getnamespace("mapi");
# Folder/Inbox
$folder = $mapi.Folders.Item('name@email.com').Folders.Item('Inbox')
# Sort by the Received Time
$contents = $folder.Items | sort ReceivedTime
# Get the first element in the array, convert to JSON, and then output to file
echo $contents[0] | convertTo-Json | Out-File C:\Users\ME\outlook_1.json -Encoding UTF8
Is there a better way of approaching this? I'm on PowerShell 5.1.
EDIT: I've also tried this, which loops through the array and breaks on the first instance, but I received the same error:
# New Outlook object
$ol = new-object -comobject "Outlook.Application";
# MAPI namespace
$mapi = $ol.getnamespace("mapi");
# Folder/Inbox
$folder = $mapi.Folders.Item('name@email.com').Folders.Item('Inbox')
# Sort by the Received Time
$contents = $folder.Items | sort ReceivedTime
$i = 1
foreach($item in $contents){
if (-not ([string]::IsNullOrEmpty($item))){
echo $item | ConvertTo-Json | Out-File C:\Users\ME\outlook_1.json -Encoding UTF8
Break
}
}
Sort the items collection using Items.Sort("ReceivedTime", false), then read the first item using Items(1).
Make sure you store the Items collection in a variable instead of accessing MAPIFolder.Items multiple times; otherwise you will get a brand new Items object every time you do that.
EDIT: I'm the OP of the question and am putting the correct code here for those who might be as dense as I am and not initially realize what is being said!
# New Outlook object
$ol = new-object -comobject "Outlook.Application";
# MAPI namespace
$mapi = $ol.getnamespace("mapi");
$folder = $mapi.Folders.Item('name@gmail.com').Folders.Item('Inbox')
# Get the items in the folder
$contents = $folder.Items
# Sort the items in the folder by the metadata, in this case ReceivedTime
$contents.Sort("ReceivedTime")
# Get the first item in the sorted collection; in this case, you will get the oldest item in your inbox.
$item = $contents.GetFirst()
echo $item
# If instead, you wanted to get the newest item, you could do the same thing but do $item = $contents.GetLast()
I'm trying to filter about 2000 automated alerts in an Outlook sub-folder.
I need to do the following series of steps:
Parse sub-folder Account Alert Lockouts
Search for a specific phrase that has a variable username
Dump out that whole phrase with the variable username into csv
Example Phrase
Account Name: jdoe
I have all of the required emails in a sub-folder; I just need to analyze them.
I've gotten my code to work in the Inbox, but it doesn't cover the sub-folder.
Add-Type -Assembly "Microsoft.Office.Interop.Outlook"
$Outlook = New-Object -ComObject Outlook.Application
$namespace = $Outlook.GetNameSpace("MAPI")
$inbox = $namespace.GetDefaultFolder([Microsoft.Office.Interop.Outlook.OlDefaultFolders]::olFolderInbox)
$RE = [RegEx]'(?sm)Account Name\s*:\s*(?<AccName>.*?)$.*'
$DebugPreference = 'Continue'
$Data = foreach ($item in $inbox.items) {
if ($item.body -match $RE) {
Write-Host "ding "
[PSCustomObject]@{ AccName = $Matches.AccName }
}
}
$Data
$Data | Export-Csv '.\data.csv' -NoTypeInformation
Per the documentation for NameSpace.GetDefaultFolder:
To return a specific non-default folder, use the Folders collection.
And the documentation for the Folders collection referenced above:
Use Folders (index), where index is the name or index number, to return a single Folder object. Folder names are case-sensitive.
You should be able to add this:
$subfolder = $inbox.Folders('Account Alert Lockouts')
and change your foreach to iterate over $subfolder.
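As a minimal sketch (reusing the $RE regex and the CSV export from your script), the loop would then look something like this:
$subfolder = $inbox.Folders('Account Alert Lockouts')
$Data = foreach ($item in $subfolder.Items) {
    if ($item.Body -match $RE) {
        [PSCustomObject]@{ AccName = $Matches.AccName }
    }
}
$Data | Export-Csv '.\data.csv' -NoTypeInformation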
I need help with PowerShell.
I will have to start renaming files on a weekly basis; I will be renaming more than 100 a week, each with a dynamic name.
The files I want to rename are in a folder named Scans, located at "C:\Documents\Scans". They are in order, say by time scanned.
I have an Excel file located at "C:\Documents\Mapping\New File Name.xlsx".
The workbook has only one sheet, and the new names are in column A with x rows. As mentioned above, each cell will have a different value.
Please add comments to your suggestions so that I may understand what is going on, since I'm new to coding.
Thank you all for your time and help.
Although I agree with Ad Kasenally that it would be easier to use CSV files, here's something that may work for you.
$excelFile = 'C:\Documents\Mapping\New File Name.xlsx'
$scansFolder = 'C:\Documents\Scans'
########################################################
# step 1: get the new filenames from the first column in
# the Excel spreadsheet into an array '$newNames'
########################################################
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$workbook = $excel.Workbooks.Open($excelFile)
$worksheet = $workbook.Worksheets.Item(1)
$newNames = @()
$i = 1
while ($worksheet.Cells.Item($i, 1).Value() -ne $null) {
$newNames += $worksheet.Cells.Item($i, 1).Value()
$i++
}
$excel.Quit()  # note the parentheses: without them the COM method is not actually invoked
# IMPORTANT: clean-up used Com objects
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($worksheet) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($workbook) | Out-Null
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel) | Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
########################################################
# step 2: rename the 'scan' files
########################################################
$maxItems = $newNames.Count
if ($maxItems) {
$i = 0
Get-ChildItem -Path $scansFolder -File -Filter 'scan*' | # get a list of FileInfo objects in the folder
Sort-Object { [int]($_.BaseName -replace '\D+', '') } | # sort by the numeric part of the filename
Select-Object -First ($maxItems) | # select no more than there are items in the $newNames array
ForEach-Object {
try {
Rename-Item -Path $_.FullName -NewName $newNames[$i] -ErrorAction Stop
Write-Host "File '$($_.Name)' renamed to '$($newNames[$i])'"
$i++
}
catch {
throw
}
}
}
else {
Write-Warning "Could not get any new filenames from the $excelFile file.."
}
You may want to have 2 columns in the excel file:
original file name
target file name
From there you can save the file as a csv.
Use Import-Csv to pull the data into PowerShell and a ForEach loop to cycle through each row with a command like move $item.original $item.target.
There are abundant threads describing the use of Import-Csv with ForEach.
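As a rough sketch, assuming the CSV ends up with columns named original and target (the path and column names here are placeholders, adjust them to your setup):
# Hypothetical CSV path and column names; change them to match your file.
$map = Import-Csv 'C:\Documents\Mapping\rename-map.csv'
foreach ($row in $map) {
    # Rename each file listed in the 'original' column to its 'target' name.
    Rename-Item -Path (Join-Path 'C:\Documents\Scans' $row.original) -NewName $row.target
}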
Good luck.
I am trying to convert a folder full of MSG files to HTML files. I have a script that gets most of the way there, but instead of displaying the text in PowerShell I need it to save each one as an individual HTML file. For some reason I can't get the save to work. I've tried various options like Out-File and $body.SaveAs([ref][system.object]$name, [ref]$saveFormat)
$saveFormat = [Microsoft.Office.Interop.Outlook.olSaveAsType]::olFormatHTML
Get-ChildItem "C:\MSG\" -Filter *.msg |
ForEach-Object {
$body = ""
$outlook = New-Object -comobject outlook.application
$msg = $outlook.Session.OpenSharedItem($_.FullName)
$body = $msg | Select body | ft -AutoSize
}
Any advice on how to save this as individual files would be great.
To start with, you should not capture the output of a Format-* cmdlet in a variable. Those are designed to output to something (screen, file, etc).
Ok, that aside, you are already opening the msg files, so you just need to determine a name and then output the HTMLBody property for each file. Easiest way would be to just tack .htm to the end of the existing name.
Get-ChildItem "C:\MSG\*" -Filter *.msg |
ForEach-Object {
$body = ""
$outlook = New-Object -comobject outlook.application
$msg = $outlook.Session.OpenSharedItem($_.FullName)
$OutputFile = $_.FullName+'.htm'
$msg.HTMLBody | Set-Content $OutputFile
}
I recently finished my script with the help of someone on this site (Matt). Thanks again!
I now need to somehow get the logfile into a tabular format, and I'm not sure how to implement that with the current setup of the script. Any ideas?
Write-Host Report generated at (Get-date)
write-host("Lower Environments Status Check");
# Preprocessing Items
$msg = ""
$array = get-content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.txt"
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable","there was a problem"
$regex = ($errorTexts | ForEach-Object{[regex]::Escape($_)}) -join "|"
write-host("Checking appurls.txt...One moment please.");
("`n---------------------------------------------------------------------------") | out-file $log -Append
Get-Date | Out-File $log -Append
("`n***Checking Links***") | out-file $log -Append
("`n") | out-file $log -Append
# Loop through each element of the array.
ForEach($target in $array){
# Erase results for the next pass in case of error.
$result, $response, $stream, $page = $null
# Navigate to site urls
$result = [System.Net.WebRequest]::Create($target)
$response = $result.GetResponse()
$stream = [System.IO.StreamReader]$response.GetResponseStream()
$page = $stream.ReadToEnd()
# To ensure login/authentication pages that give a 403 response pages still show as online
If($response.StatusCode -eq 403){
$msg = " $target -----> is ONLINE!"}
# Determine if the status code 200 pages are truly up based on the information above.
If($response.StatusCode -eq 200){
# While the page might have rendered need to determine there are no errors present.
If($page -notmatch $regex){
$msg = " $target -----> is ONLINE!"
} else {
$msg = " $target -----> may be DOWN, please check!"
}
} else {
$msg = " $target -----> may be DOWN, please check!"
}
# Log Results.
$msg | Out-File $log -Append -width 120
write-host $msg
# Close the response.
$response.Close()
}
# Write completion to logfile.
("`n") | out-file $log -Append
("`n***Lower Environments Checklist Completed***") | out-file $log -Append
# Write completion to host.
write-host("Lower Environments Checklist Completed");
# Open logfile once script is complete.
Invoke-Item C:\LowerEnvChecklist\lowerenvironmentslog.txt
If you just want to view it in-script, you could use Out-GridView on your log file. This will open a new window with a view of the data in the log file that looks like a table. Depending on your formatting, you may have to add extra items like human-readable headers.
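For the plain text log you already have, a minimal sketch (using the log path from your script) looks like this:
# Show each log line as a row in a grid window; the file is unstructured, so you get a single column of raw lines.
Get-Content 'C:\LowerEnvChecklist\lowerenvironmentslog.txt' | Out-GridView -Title 'Lower Environments Log'
For real columns you need structured objects, which is where the CSV approach below comes in.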
To wet your whistle with structured output, I opted to show you a CSV-based solution. Either way, all avenues require objects. What we do here is create a custom object that we populate as the script progresses. Each pass sends the details down the pipe. Using the pipeline we can use Export-Csv to collect all of the data in a nice file. Even filtering is possible now.
write-host("Lower Environments Status Check");
# Preprocessing Items
$array = Get-Content C:\LowerEnvChecklist\appurls.txt
$log = "C:\LowerEnvChecklist\lowerenvironmentslog.csv"
$errorTexts = "error has occurred","Oops","Unable to display widget data","unexpected error occurred","temporarily unavailable","there was a problem"
$regex = ($errorTexts | ForEach-Object{[regex]::Escape($_)}) -join "|"
# Loop through each element of the array. Use the pipeline to make output easier
$array | ForEach-Object{
# Keep the variable $target so it is not lost in scopes. Build the object to be completed as we go.
$target = [pscustomobject][ordered]@{
URL = $_
Status = ""
Detail = "N/A"
Timestamp = Get-Date
}
# Erase results for the next pass in case of error.
$result, $response, $stream, $page = $null
# Navigate to site urls. If the request fails outright, $response stays $null and the site is marked down.
$result = [System.Net.WebRequest]::Create($target.URL)
# A WebException thrown for a non-success status (e.g. 403) still carries the response, so keep it; connection failures leave $null.
$response = try{$result.GetResponse()}catch{$_.Exception.Response}
switch([int]$response.StatusCode){
403{
$target.Status = "OK"
$target.Detail = "403"
}
200{
# Get page content to confirm up status
$stream = [System.IO.StreamReader]$response.GetResponseStream()
$page = $stream.ReadToEnd()
# While the page might have rendered need to determine there are no errors present.
If($page -notmatch $regex){
$target.Status = "OK"
} else {
$target.Status = "DOWN"
$target.Detail = "Pattern"
}
}
default{
$target.Status = "DOWN"
}
}
# Send the object down the pipeline
$target
# Close the response. The object might not exist so check before we call the methods.
if($response){$response.Close()}
if($stream){$stream.Close()}
} | Export-CSV -Path $log -NoTypeInformation
# Write completion to host.
write-host("Lower Environments Checklist Completed");
# Open logfile once script is complete.
Invoke-Item $log
I took the liberty of adding another column to your request called Detail; it could add context. Not sure what you wanted from the date, but if you have plenty of URLs and processing time then I suppose it could be of use. Also, to reduce the if logic, I added a switch statement. This would be more useful if you react to other status codes down the road. Still, a good thing to know.
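For example, reacting to more status codes later is just a matter of adding cases; here is a standalone sketch of the pattern (the 404 and 503 cases are hypothetical additions, not part of the script above):
# Hypothetical example of extending the same switch pattern with extra status codes.
$code = 404
switch ($code) {
    403     { 'OK (auth required)' }
    200     { 'OK' }
    404     { 'DOWN (not found)' }
    503     { 'DOWN (service unavailable)' }
    default { 'DOWN' }
}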
Sample Output
URL                      Status Detail  Timestamp
---                      ------ ------  ---------
https://7fweb            DOWN   N/A     1/11/2016 12:18:16 PM
http://www.google.ca     OK     N/A     1/11/2016 12:18:16 PM
http://www.microsoft.com DOWN   Pattern 1/11/2016 12:18:16 PM
I added "windows" to $errorTexts to trigger a pattern match for microsoft.com
I've been putting "tags" into the names of files, but that's a terrible way of organizing a large number of files.
ex: "ABC - file name.docx"
So, I want to set the category attribute to "ABC" instead of having it in the name using PowerShell. The script would have to find all of the files with "ABC" in its name in the subdirectories of a certain folder and set the category attribute to "ABC".
So I have the first part where I am finding the files but I don't know where to go from here.
Get-ChildItem -Filter "ABC*" -Recurse
Any ideas?
Thanks.
So this borrows heavily from the Scripting Guys. What we need to do is, for every file we find, use a Word COM object to access the file's special properties. Using the current file name we extract the "category" by splitting on the first hyphen and saving both parts. The first part becomes the category and the second is the new name we give the file, assuming the category update was successful.
There is still margin for error with this, but this should get you started:
$path = "C:\temp"
# Create the Word COM object and hide it from sight
$wordApplication = New-Object -ComObject word.application
$wordApplication.Visible = $false
# Binding flags used for late-bound access to the document's built-in properties
$binding = "System.Reflection.BindingFlags" -as [type]
# Locate Documents.
$docs = Get-childitem -path $Path -Recurse -Filter "*-*.docx"
$docs | ForEach-Object{
$currentDocumentPath = $_.fullname
$document = $wordApplication.documents.open($currentDocumentPath)
$BuiltinProperties = $document.BuiltInDocumentProperties
$builtinPropertiesType = $builtinProperties.GetType()
$categoryUpdated = $false # Assume false as a reset of the state.
# Get the category from the file name for this particular file.
$filenamesplit = $_.Name.split("-",2)
$category = $filenamesplit[0].Trim()
# Attempt to change the property.
Try{
$BuiltInProperty = $builtinPropertiesType.invokemember("item",$binding::GetProperty,$null,$BuiltinProperties,"Category")
$BuiltInPropertyType = $BuiltInProperty.GetType()
$BuiltInPropertyType.invokemember("value",$binding::SetProperty,$null,$BuiltInProperty,[array]$category)
$categoryUpdated = $true
}Catch [system.exception]{
# Error getting property so most likely is not populated.
Write-Host -ForegroundColor Red "Unable to set the 'Category' for '$currentDocumentPath'"
}
# Close the document. It should save by default.
$document.close()
# Release COM objects to ensure process is terminated and document closed.
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($BuiltinProperties) | Out-Null
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($document) | Out-Null
Remove-Variable -Name document, BuiltinProperties
# Assuming the category was successfully changed lets remove the information from the current filename as it is redundant.
If($categoryUpdated){Rename-Item $currentDocumentPath -NewName $filenamesplit[1].Trim()}
}
$wordApplication.quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($wordApplication) | Out-Null
Remove-Variable -Name wordApplication
[gc]::collect()
[gc]::WaitForPendingFinalizers()
You should see some explanation in comments that I tried to add for clarification. Also read the link above to get more of an explanation as to what is happening with the COM object.