So I have a basic program (incredibly buggy, but we quite like it) that uses a shared folder that a couple of people at school have access to (paths have been changed for ease of use). It is designed to work as a messaging application: each user writes into the same Notepad file to send a message to a PowerShell script using Get-Content with the -Wait parameter. I have added a couple of commands prefixed with "/", but I want one (i.e. /online) that a user can type to see all of the other people currently using the program.
I have tried to set up a different text file that is updated every x seconds by each individual user with their own user name, while wiping the previous record:
while (1){
Clear-Content -Path C:\users\Freddie\Desktop\ConvoOnline.txt
Start-Sleep -Milliseconds 5000
Add-Content -Path C:\users\Freddie\Desktop\ConvoOnline.txt $env:UserName
}
So this can be called upon later:
elseif($_ -match "/online"){Get-Content -Path C:\users\Freddie\Desktop\ConvoOnline.txt}
But this doesn't work; it won't sync up between users, so one user will wipe the current users and only their own name will appear as active, until another user's cycle wipes THEIR name.
To avoid the XY Problem, I want a fairly simple way (still only using two files maximum) to determine which users are actively using (therefore updating) the Powershell script they are running.
Whole code:
Add-Type -AssemblyName System.speech
$speak = New-Object System.Speech.Synthesis.SpeechSynthesizer
$speak.Volume = 100
Write-Host "Type /helpp, save it, then hit backspace and save it again for a guide and list of commands!"
Get-Content -Path C:\users\Freddie\Desktop\Convo.txt -Wait |
%{$_ -replace "^", "$env:UserName "} |
%{if($_ -match "/cls"){cls} `
elseif($_ -match "/online"){Get-Content -Path C:\users\Freddie\Desktop \ConvoOnline.txt} `
elseif(($_ -match "/afk") -and ($env:UserName -eq "Freddie")){Write-Host "$env:UserName has gone afk"} `
elseif(($_ -match "/say") -and ($env:UserName -eq "Freddie")) {$speak.Speak($_.Substring(($_.length)-10))} `
elseif($_ -match "/whisper"){
$array = @($_ -split "\s+")
if($array[2] -eq "$env:UserName"){
Write-Host $array[2]
} `
} `
elseif($_ -match "/help"){
Write-Host "Help: `
1. Press Ctrl+S in Notepad to send your message `
2. Make sure you delete it after it's been sent `
3. If your message doesn't send properly, just hit backspace and all but the last letter will be sent `
`
COMMANDS: `
`
/online - Lists all users currently in the chat `
/cls - Clears your screen of all current and previous messages `
/whisper [USERNAME] [MESSAGE] - This allows you to send a message privately to a user"
}
else{Write-Host "$_"}}
#
#
#
#
#Add a command: elseif($_ -match "/[COMMAND]"){[FUNCTION]}
#
#Make it user-specific: elseif($_ -match "/[COMMAND]" -and $env:UserName -eq "[USERNAME]"){[FUNCTION]}
You can add a timestamp with Add-Content and use another ps1 file to clear entries written more than 5 seconds ago (you can do this in the same ps1 file, but a separate ps1 file is better).
Modified user-online update part:
while ($true){
Add-Content -Path d:\ConvoOnline.txt "$($env:UserName);$(get-date)"
Start-Sleep -Milliseconds 5000
}
Another script watches the file and clears entries older than 5 seconds, so the online file is always up to date:
while($true){
Start-Sleep -Milliseconds 5000
$data = get-content -Path D:\ConvoOnline.txt
clear-content -Path D:\ConvoOnline.txt
if($data){
$data | %{ if ([datetime]($_.Split(";")[1]) -ge (Get-Date).AddMilliseconds(-4500)) { Add-Content -Path d:\ConvoOnline.txt $_ } }
}
}
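With the timestamped file in place, the /online branch of the main script can list only users whose last heartbeat is recent. Here is a minimal sketch of that branch, in place of the existing /online line; the path and the 5-second window are assumptions carried over from the code above, and it parses the "username;timestamp" lines the heartbeat writes:
elseif($_ -match "/online"){
    Get-Content -Path D:\ConvoOnline.txt |
        Where-Object { [datetime]($_.Split(";")[1]) -ge (Get-Date).AddSeconds(-5) } |
        ForEach-Object { $_.Split(";")[0] } |
        Select-Object -Unique |
        ForEach-Object { Write-Host "$_ is online" }
}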
Continuing from my previous question:
I have a PowerShell script that exports mailbox size results to a CSV file.
The results contain a "Total Size" column, followed by the name.
However, I want the exported CSV file to be filtered so that it displays only results greater than 25 GB, sorted from high to low.
There is the traditional way of filtering the numbers in Excel after the PowerShell export, but I want it done in the CSV file itself so I do not have to do it over and over again.
Here's the script:
Param
(
[Parameter(Mandatory = $false)]
[switch]$MFA,
[switch]$SharedMBOnly,
[switch]$UserMBOnly,
[string]$MBNamesFile,
[string]$UserName,
[string]$Password
)
Function Get_MailboxSize
{
$Stats=Get-MailboxStatistics -Identity $UPN
$IsArchieved=$Stats.IsArchiveMailbox
$ItemCount=$Stats.ItemCount
$TotalItemSize=$Stats.TotalItemSize
$TotalItemSizeinBytes= $TotalItemSize -replace "(.*\()|,| [a-z]*\)", ""
$TotalSize=$stats.TotalItemSize.value -replace "\(.*",""
$DeletedItemCount=$Stats.DeletedItemCount
$TotalDeletedItemSize=$Stats.TotalDeletedItemSize
#Export result to csv
$Result=@{'Display Name'=$DisplayName;'User Principal Name'=$upn;'Mailbox Type'=$MailboxType;'Primary SMTP Address'=$PrimarySMTPAddress;'IsArchieved'=$IsArchieved;'Item Count'=$ItemCount;'Total Size'=$TotalSize;'Total Size (Bytes)'=$TotalItemSizeinBytes;'Deleted Item Count'=$DeletedItemCount;'Deleted Item Size'=$TotalDeletedItemSize;'Issue Warning Quota'=$IssueWarningQuota;'Prohibit Send Quota'=$ProhibitSendQuota;'Prohibit send Receive Quota'=$ProhibitSendReceiveQuota}
$Results= New-Object PSObject -Property $Result
$Results | Select-Object 'Display Name','User Principal Name','Mailbox Type','Primary SMTP Address','Item Count',@{Name = 'Total Size'; Expression = {($_."Total Size").Split(" ")[0]}},@{Name = 'Unit'; Expression = {($_."Total Size").Split(" ")[1]}},'Total Size (Bytes)','IsArchieved','Deleted Item Count','Deleted Item Size','Issue Warning Quota','Prohibit Send Quota','Prohibit Send Receive Quota' | Export-Csv -Path $ExportCSV -Notype -Append
}
Function main()
{
#Check for EXO v2 module installation
$Module = Get-Module ExchangeOnlineManagement -ListAvailable
if($Module.count -eq 0)
{
Write-Host Exchange Online PowerShell V2 module is not available -ForegroundColor yellow
$Confirm= Read-Host Are you sure you want to install module? [Y] Yes [N] No
if($Confirm -match "[yY]")
{
Write-host "Installing Exchange Online PowerShell module"
Install-Module ExchangeOnlineManagement -Repository PSGallery -AllowClobber -Force
}
else
{
Write-Host "EXO V2 module is required to connect Exchange Online. Please install the module using the Install-Module ExchangeOnlineManagement cmdlet."
Exit
}
}
#Connect Exchange Online with MFA
if($MFA.IsPresent)
{
Connect-ExchangeOnline
}
#Authentication using non-MFA
else
{
#Storing credential in script for scheduling purpose/ Passing credential as parameter
if(($UserName -ne "") -and ($Password -ne ""))
{
$SecuredPassword = ConvertTo-SecureString -AsPlainText $Password -Force
$Credential = New-Object System.Management.Automation.PSCredential $UserName,$SecuredPassword
}
else
{
$Credential=Get-Credential -Credential $null
}
Connect-ExchangeOnline -Credential $Credential
}
#Output file declaration
$ExportCSV=".\MailboxSizeReport_$((Get-Date -format yyyy-MMM-dd-ddd` hh-mm` tt).ToString()).csv"
$Result=""
$Results=@()
$MBCount=0
$PrintedMBCount=0
Write-Host Generating mailbox size report...
#Check for input file
if([string]$MBNamesFile -ne "")
{
#We have an input file, read it into memory
$Mailboxes=@()
$Mailboxes=Import-Csv -Header "MBIdentity" $MBNamesFile
foreach($item in $Mailboxes)
{
$MBDetails=Get-Mailbox -Identity $item.MBIdentity
$UPN=$MBDetails.UserPrincipalName
$MailboxType=$MBDetails.RecipientTypeDetails
$DisplayName=$MBDetails.DisplayName
$PrimarySMTPAddress=$MBDetails.PrimarySMTPAddress
$IssueWarningQuota=$MBDetails.IssueWarningQuota -replace "\(.*",""
$ProhibitSendQuota=$MBDetails.ProhibitSendQuota -replace "\(.*",""
$ProhibitSendReceiveQuota=$MBDetails.ProhibitSendReceiveQuota -replace "\(.*",""
$MBCount++
Write-Progress -Activity "`n Processed mailbox count: $MBCount `n Currently Processing: $DisplayName"
Get_MailboxSize
$PrintedMBCount++
}
}
#Get all mailboxes from Office 365
else
{
Get-Mailbox -ResultSize Unlimited | foreach {
$UPN=$_.UserPrincipalName
$Mailboxtype=$_.RecipientTypeDetails
$DisplayName=$_.DisplayName
$PrimarySMTPAddress=$_.PrimarySMTPAddress
$IssueWarningQuota=$_.IssueWarningQuota -replace "\(.*",""
$ProhibitSendQuota=$_.ProhibitSendQuota -replace "\(.*",""
$ProhibitSendReceiveQuota=$_.ProhibitSendReceiveQuota -replace "\(.*",""
$MBCount++
Write-Progress -Activity "`n Processed mailbox count: $MBCount `n Currently Processing: $DisplayName"
if($SharedMBOnly.IsPresent -and ($Mailboxtype -ne "SharedMailbox"))
{
return
}
if($UserMBOnly.IsPresent -and ($MailboxType -ne "UserMailbox"))
{
return
}
Get_MailboxSize
$PrintedMBCount++
}
}
#Open output file after execution
If($PrintedMBCount -eq 0)
{
Write-Host No mailbox found
}
else
{
Write-Host `nThe output file contains $PrintedMBCount mailboxes.
if((Test-Path -Path $ExportCSV) -eq "True")
{
Write-Host `nThe Output file available in $ExportCSV -ForegroundColor Green
$Prompt = New-Object -ComObject wscript.shell
$UserInput = $Prompt.popup("Do you want to open output file?",`
0,"Open Output File",4)
If ($UserInput -eq 6)
{
Invoke-Item "$ExportCSV"
}
}
}
#Disconnect Exchange Online session
Disconnect-ExchangeOnline -Confirm:$false | Out-Null
}
. main
How can I achieve that?
It's a bit of an unfortunate scenario, but if you're outputting the results to the file on each iteration, you have two options:
1. At the end of the script, read the output file back in, filter out the mailboxes over 25 GB, sort the objects, then output it again.
2. Instead of writing each mailbox to the file as you go, save the results to a variable. At the end, filter, sort, then export to the file.
Without going too far into the code...
Option 1
This might be the simplest approach. After you've retrieved all mailboxes and gathered all statistics, read the exported CSV file back in and filter and sort the data. Then export the info back to the file, overwriting it. Something like:
$tempImport = Import-Csv $exportCSV | Where-Object {([double]$_.'Total Size' -ge 25) -and ($_.Unit -eq "GB")} | Sort-Object {[double]$_.'Total Size'} -Descending
$tempImport | Export-Csv $exportCSV -NoTypeInformation
PowerShell may not like overwriting a file that it is reading from in the same pipeline, hence saving to a temporary variable first.
Option 2
Create a live variable storing all mailbox data, and write the information at the end of the script instead of opening the file to append data each iteration. Then, at the end of the script, filter and sort before exporting.
$global:largeMailboxes = @() # definition to allow reading through all functions
Then, instead of exporting to CSV each time, add the result to the above variable
$tvar = $null
$tvar = $Results | Select-Object 'Display Name','User Principal Name','Mailbox Type','Primary SMTP Address','Item Count',@{Name = 'Total Size'; Expression = {($_."Total Size").Split(" ")[0]}},@{Name = 'Unit'; Expression = {($_."Total Size").Split(" ")[1]}},'Total Size (Bytes)','IsArchieved','Deleted Item Count','Deleted Item Size','Issue Warning Quota','Prohibit Send Quota','Prohibit Send Receive Quota'
$global:largeMailboxes += $tvar
#
# Alternatively, only add the mailbox if it's larger than 25GB, to avoid adding objects you don't care about
if ($TotalItemSizeinBytes -ge 26843545600) # This is 25 GB, better to make a variable called $minSize or such to store this in, in case you want to change it later.
{
# Above code to add to global variable
}
Once all mailboxes have been added, sort the object
$global:largeMailboxes = $global:largeMailboxes | Sort-Object 'Total Size' -descending
Then export as needed
$global:largeMailboxes | Export-CSV $exportCSV -NoTypeInformation
I have my current script display results, but when it comes up it just puts everything in one long column (though the entries are in order). Any ideas how I can get three columns called "Computer Name", "Patch#", and "Installed?"?
Below is my code
$strCategory = "computer"
$objDomain = New-Object System.DirectoryServices.DirectoryEntry
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$ObjSearcher.SearchRoot = $objDomain
$objSearcher.filter = ("(objectCategory=$strCategory)")
$colProplist = "name"
foreach($i in $colProplist){
$objSearcher.PropertiesToLoad.Add($i)
}
#Finds all operating systems and computer names
$colResults = $objSearcher.FindAll()
foreach($objResult in $colResults){
$objComputer = $objResult.Properties;
$names = $objComputer.name
# Define Hotfix to check
$CheckKBS = @("KB2937610", "KB4019264")
# Query the computers for the HotFix
foreach($name in $names){
foreach ($CheckKB in $CheckKBS) {
$HotFixQuery = Get-HotFix -ComputerName $name | Where-Object {$_.HotFixId -eq $CheckKB} | Select-Object -First 1;
if($HotFixQuery -eq $null) {
Out-file C:\Users\xxx\Desktop\12.csv -Append -InputObject "computername: $name"
Out-file C:\Users\xxx\Desktop\12.csv -Append -InputObject "KB Patch: $CheckKB"
Out-file C:\Users\xxx\Desktop\12.csv -Append -InputObject "Hotfix $CheckKB is not installed on $name"
} else {
Out-file C:\xxx\xxx\xxx\12.csv -Append -InputObject "computername: $name"
Out-file C:\xxx\xxx\xxx\12.csv -Append -InputObject "KB Patch: $CheckKB"
Out-file C:\xxx\xxx\xxx\12.csv -Append -InputObject "Hotfix $CheckKB was installed on $name by $($HotFixQuery.InstalledBy)"
}
}
}
}
You're outputting raw text into a CSV, which is OK, but you have to maintain the intended destination format.
Out-File is going to output raw text, one string per line. That is why your calls are creating a single column. You can change your code to collapse your output into one line separated by commas:
Out-file C:\Users\xx\Desktop\12.csv -Append -InputObject "computer name: $name,KB Patch: $CheckKB,Hotfix $CheckKB is not installed on $name"
This will write 1 raw line to your CSV file. Your fields will be respected as columns assuming the rest of the CSV file is intact.
I doubt you want to repeat your headers per line but wanted to show you what it would look like. Now you are going to want to write the initial line of the CSV file before you start your loop to output the headers:
Out-file C:\Users\xx\Desktop\12.csv -Append -InputObject "Computer Name,KB Patch,Hotfix"
Then just remove the header text from the lines written inside your loop. This does mean that the header will be written on each run, so you will either want to wipe the file at the start or test whether it exists before you write the header.
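A small sketch of that existence check (same path as above; adjust as needed):
if (-not (Test-Path C:\Users\xx\Desktop\12.csv)) {
    # write the header row only once, when the file does not exist yet
    Out-File C:\Users\xx\Desktop\12.csv -InputObject "Computer Name,KB Patch,Hotfix"
}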
There are better ways to do this, but this doesn't require you to make significant changes to your work. I would create a custom object for each record, build a collection of those objects, and then export them via Export-Csv; a rough sketch of that approach follows.
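This is an untested sketch, not a drop-in replacement: it assumes $colResults is already populated by your DirectorySearcher code, reuses the KB list, and reuses the output path from the question.
$CheckKBS = @("KB2937610", "KB4019264")
$report = foreach ($objResult in $colResults) {
    foreach ($name in $objResult.Properties.name) {
        foreach ($CheckKB in $CheckKBS) {
            $HotFixQuery = Get-HotFix -ComputerName $name -ErrorAction SilentlyContinue |
                Where-Object { $_.HotFixId -eq $CheckKB } | Select-Object -First 1
            $installed = if ($HotFixQuery) { "Yes (by $($HotFixQuery.InstalledBy))" } else { "No" }
            # one object per computer/patch pair becomes one CSV row with three columns
            [PSCustomObject]@{
                'Computer Name' = $name
                'Patch#'        = $CheckKB
                'Installed?'    = $installed
            }
        }
    }
}
$report | Export-Csv -Path C:\Users\xxx\Desktop\12.csv -NoTypeInformation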
I have a script that I have put together, and for the most part it does what I want it to do: it hits a list of servers looking for log files that are 25+ hours old (indicating that another script isn't doing its job). This worked perfectly in testing (1 to 5 servers). However, once I turned it loose on the 150+ servers I want to check in this environment, the file size increased, and the email process failed because the file size is in excess of 10 MB.
So now I need a way to compress the results. I would like to use 7-Zip, but for some reason I just cannot wrap my head around how to accomplish what I'm trying to do.
Any assistance would be greatly appreciated.
Here is the script I have thus far.
# Specify where the list of servers is located.
$SL = get-content C:\Scripts\lists\AgingLogsServers.txt
# Define logfile age to report on.
$limit = (Get-Date).AddHours(-25)
# Define the current date & time.
$filedate = get-date -f "MM.dd.yy_hh.mm.ss"
$emldate = get-date -f "MM.dd.yy"
# Variable to add current date & time to saved filename.
$filename = "AgingReport_$($filedate).log"
# Files or patterns to exclude from the scan.
$excluded = #(".exe")
# Specify SMTP server
$smtpserver = "mail.yourserver.com"
# Loop to process each server in the pool.
Foreach ($Server in $SL){
$c1++
Write-Progress -Activity 'Looking for Logfiles in excess of 25 hours old' -Status "Processing $($c1) of $($SL.count)" -CurrentOperation $Server -PercentComplete (($c1/$SL.count) * 100)
If (Test-Path "\\$Server\logs") {$SP1 = "\\$Server\Logs"}
Else {$SP1 = "\\$Server\D-Logs"}
Get-ChildItem -ErrorAction SilentlyContinue -Path $SP1 -Exclude $excluded -Include *.zip, *.7z, *.log -Recurse | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Foreach-Object {write-output $_.CreationTime $server $_.fullname} | Out-file C:\Scripts\data\$filename -Append -Width 160
}
# Zip the $filename and remove the original
# And this is where I believe a 7zip process would go to compress the result file, then I can reference that file and path in the Send-MailMessage line.
# Email the results.
Send-MailMessage -From "Aging.Logs@yourhost.com" -To "user@yourhost.com" -Subject "Aging Logs report for $emldate" -Body "Attached is the listing of aging logs for the environment for $emldate" -SmtpServer $smtpserver -Attachments C:\Scripts\data\$filename
# Give the logfile time to release from the email process.
Start-Sleep -s 120
# Clean up the results file.
#Remove-Item C:\Scripts\data\AgingReport*
Running 7-Zip is pretty easy. The syntax is
7z.exe a <archive path> <file to zip path>
That's easy, we just need to know where 7z.exe is. So, we'll make PowerShell find that, then execute it using the call operator &, with those parameters (by the way, the 'a' means that we're adding a file to an archive). Then we clean up the source file, and email the archive.
# Zip the $filename and remove the original
# Find 7-Zip executable
$7Zip = gci c:\Program* -include '7z.exe' -recurse -ea 4|select -first 1 -expand fullname
# Define archive name and path
$ZipFile = "C:\Scripts\data\$filename" -replace "...$","zip"
# Perform Zip
& $7Zip a "$ZipFile" "C:\Scripts\data\$filename" | Out-Null
# Remove source file
Remove-Item -Path "C:\Scripts\data\$filename"
# Email the results.
Send-MailMessage -From "Aging.Logs@yourhost.com" -To "user@yourhost.com" -Subject "Aging Logs report for $emldate" -Body "Attached is the listing of aging logs for the environment for $emldate" -SmtpServer $smtpserver -Attachments $ZipFile
Your archive, by the way, will be the same as your log file, but with a .zip file extension instead of .log.
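To illustrate that rename trick: the regex "...$" matches the last three characters of the path (the "log" extension) and replaces them with "zip". For example (hypothetical file name):
"C:\Scripts\data\AgingReport_11.12.15_08.30.00.log" -replace "...$","zip"
# -> C:\Scripts\data\AgingReport_11.12.15_08.30.00.zip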
@TheMadTechnician, thank you for your very helpful post. I attempted to integrate what you gave me, but no matter how I went about it I was unable to get the desired result. Taking the direction you provided, I finally got it to work. Here is the code that does everything I wanted it to do, in case anyone else is looking to accomplish the same thing.
<#
Script: AgingLogQuery.ps1
Author: Xander J.
Date: 11/12/2015
Aging log query checks the logs and d-logs shares contained within a text file to see if there are any logfiles older than 25
hours old,if it finds a logfile that is older than 25 hours old it passes the server name, the full path and filename and the
files age to the AgingReport log file.
After checking all of the servers in the list, the script archives the logfile, removes the original logfile, emails the
archive as an attachment, then waits a specified amount of time to remove the archive file.
#>
# Specify where the list of servers is located.
$SL = get-content C:\Scripts\lists\AgingLogsServers.txt
# Define logfile age to report on.
$limit = (Get-Date).AddHours(-25)
# Define the current date & time.
$filedate = get-date -f "MM.dd.yy_hh.mm.ss"
$emldate = get-date -f "MM.dd.yy"
# Variable to add current date & time to saved filename.
$filename = "AgingReport_$($filedate).log"
# Files or patterns to exclude from the scan.
$excluded = #("*.exe*")
# Specify SMTP server
$smtpserver = "mail.email.com"
#Get script path
$ScriptPath = Split-Path -Path $($MyInvocation.MyCommand.Definition)
#Get the path for 7za.exe
$zipexe = $ScriptPath + "\7za.exe"
set-alias sz $zipexe
$archive = "AgingReport_$($filedate).zip"
# Loop to process each server on the list.
Foreach ($Server in $SL){
$c1++
Write-Progress -Activity 'Looking for Logfiles in excess of 25 hours old' -Status "Processing $($c1) of $($SL.count)" -CurrentOperation $Server -PercentComplete (($c1/$SL.count) * 100)
If (Test-Path "\\$Server\logs") {$SP1 = "\\$Server\Logs"}
Else {$SP1 = "\\$Server\D-Logs"}
Get-ChildItem -ErrorAction SilentlyContinue -Path $SP1 -Exclude $excluded -Include *.zip, *.7z, *.log -Recurse | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Foreach-Object {write-output $_.CreationTime $server $_.fullname} | Out-file C:\Scripts\data\$filename -Append -Width 160
}
# Zip the $filename
& sz a -mmt -tzip c:\Scripts\Data\$archive C:\Scripts\data\AgingReport*.log -stl
# Clean up the results file.
Remove-Item -Force C:\Scripts\data\$filename
# Email the results.
Send-MailMessage -From "Aging.Logs@echopass.com" -To "user@email.com" -Subject "Aging Logs report for $emldate" -Body "Attached is the listing of aging logs for the environment for $emldate" -SmtpServer $smtpserver -Attachments C:\Scripts\data\$archive
# Give the logfile time to release from the email process.
Start-Sleep -s 15
# Clean up the results file.
Remove-Item -Force C:\Scripts\data\$archive
I spent days trying to implement a parallel jobs and queues system, but I couldn't make it work. Here is the code without any of that implemented, plus an example of the CSV it reads from.
I'm sure this post can help other users in their projects.
Each user has their own PC, so the CSV file looks like:
pc1,user1
pc2,user2
pc800,user800
CODE:
#Source File:
$inputCSV = '~\desktop\report.csv'
$csv = import-csv $inputCSV -Header PCName, User
echo $csv #debug
#Output File:
$report = "~\desktop\output.csv"
#---------------------------------------------------------------
#Define search:
$findSize = 40GB
Write-Host "Lonking for $findSize GB sized Outlook files"
#count issues:
$issues = 0
#---------------------------------------------------------------
foreach($item in $csv){
if (Test-Connection -Quiet -count 1 -computer $($item.PCname)){
$w7path = "\\$($item.PCname)\c$\users\$($item.User)\appdata\Local\microsoft\outlook"
$xpPath = "\\$($item.PCname)\c$\Documents and Settings\$($item.User)\Local Settings\Application Data\Microsoft\Outlook"
if(Test-Path $W7path){
if(Get-ChildItem $w7path -Recurse -force -Include *.ost -ErrorAction "SilentlyContinue" | Where-Object {$_.Length -gt $findSize}){
$newLine = "{0},{1},{2}" -f $($item.PCname),$($item.User),$w7path
$newLine | add-content $report
$issues ++
Write-Host "Issue detected" #debug
}
}
elseif(Test-Path $xpPath){
if(Get-ChildItem $xpPath -Recurse -force -Include *.ost -ErrorAction "SilentlyContinue" | Where-Object {$_.Length -gt $findSize}){
$newLine = "{0},{1},{2}" -f $($item.PCname),$($item.User),$xpPath
$newLine | add-content $report
$issues ++
Write-Host "Issue detected" #debug
}
}
else{
write-host "Error! - bad path"
}
}
else{
write-host "Error! - no ping"
}
}
Write-Host "All done! detected $issues issues"
Parallel data processing in PowerShell is not quite simple, especially with
queueing. Try to use some existing tools which have this already done.
You may take look at the module
SplitPipeline. The cmdlet
Split-Pipeline is designed for parallel input data processing and supports
queueing of input (see the parameter Load). For example, for 4 parallel
pipelines with 10 input items each at a time the code will look like this:
$csv | Split-Pipeline -Count 4 -Load 10, 10 {process{
<operate on input item $_>
}} | Out-File $outputReport
All you have to do is to implement the code <operate on input item $_>.
Parallel processing and queueing is done by this command.
UPDATE for the updated question code. Here is the prototype code with some
remarks. They are important: doing work in parallel is not the same as doing
it directly, and there are some rules to follow.
$csv | Split-Pipeline -Count 4 -Load 10, 10 -Variable findSize {process{
# Tips
# - Operate on input object $_, i.e $_.PCname and $_.User
# - Use imported variable $findSize
# - Do not use Write-Host, use (for now) Write-Warning
# - Do not count issues (for now). This is possible but make it working
# without this at first.
# - Do not write data to a file, from several parallel pipelines this
# is not so trivial, just output data, they will be piped further to
# the log file
...
}} | Set-Content $report
# output from all jobs is joined and written to the report file
UPDATE: How to write progress information
SplitPipeline handled a CSV with 800 targets pretty well, amazing. Is there any
way to let the user know the script is alive...? Scanning a big CSV can take
about 20 minutes. Something like "in progress 25%", "50%", "75%"...
There are several options. The simplest is just to invoke Split-Pipeline with
the switch -Verbose. So you will get verbose messages about the progress and
see that the script is alive.
Another simple option is to write and watch verbose messages from the jobs,
e.g. Write-Verbose ... -Verbose which will write messages even if
Split-Pipeline is invoked without Verbose.
And another option is to use proper progress messages with Write-Progress.
See the scripts:
Test-ProgressJobs.ps1
Test-ProgressTotal.ps1
Test-ProgressTotal.ps1 also shows how to use a collector updated from jobs
concurrently. You can use the similar technique for counting issues (the
original question code does this). When all is done show the total number of
issues to a user.
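As a rough, untested sketch of that collector idea applied to the issue counter: a synchronized ArrayList is shared by all parallel pipelines via -Variable (as in the earlier example), and its count is reported at the end. The detection logic shown is a shortened version of the original W7-path check; adapt as needed.
$issues = [System.Collections.ArrayList]::Synchronized((New-Object System.Collections.ArrayList))
$csv | Split-Pipeline -Count 4 -Load 10, 10 -Variable findSize, issues {process{
    $w7path = "\\$($_.PCname)\c$\users\$($_.User)\appdata\Local\microsoft\outlook"
    if ((Test-Path $w7path) -and
        (Get-ChildItem $w7path -Recurse -Force -Include *.ost -ErrorAction SilentlyContinue |
            Where-Object { $_.Length -gt $findSize })) {
        # record the issue in the shared, thread-safe collection
        $null = $issues.Add($_.PCname)
        # emit the CSV line; output from all pipelines is joined and written to $report below
        "{0},{1},{2}" -f $_.PCname, $_.User, $w7path
    }
}} | Set-Content $report
Write-Host "All done! detected $($issues.Count) issues"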
I'm writing a PowerShell script to automate some updates.
For this purpose I need to execute another script and save the output into a variable.
Afterwards I can cut out the parts that I need and save them into other variables.
This theoretically works, but the other script that I'm executing stops the process, because it needs a key press to continue at the end.
Does somebody know how I can get past this?
The script stops after:
$list = .\list.cmd
Kind regards :)
That's part of the script:
Write-Host "Importing..."
cd "$path"
$list = .\list.cmd
Write-Host "Searching for the certificate file"
$CertificateFile = $list | where {$_ -match "Certificate File:"}
$CertificateFile = $CertificateFile.Substring(18)
Write-Host "I'm trying to find the Password File:"
$PasswordFile = $list | where {$_ -match "Password File:"}
$PasswordFile = $PasswordFile.Substring(15)
Write-Host "Searching for the validity date"
$Enddate = $list | where {$_ -match "Validity NotAfter:"}
$Enddate = $Enddate.Substring(19)
Here is how you send a keystroke. As for timing, it needs to occur when the paused app is in the foreground 'active' window.
add-type -AssemblyName System.Windows.Forms
[System.Windows.Forms.SendKeys]::SendWait("A")
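One possible arrangement, sketched here without testing: queue the keystroke into the console's input buffer just before invoking list.cmd, so that the batch file's pause consumes it when it gets there. This assumes the console running the script is the foreground window at that moment.
Add-Type -AssemblyName System.Windows.Forms
cd "$path"
# queue an Enter key press for this console; it stays buffered until list.cmd pauses and reads it
[System.Windows.Forms.SendKeys]::SendWait("{ENTER}")
$list = .\list.cmd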