The purpose of this code is to get a list of all executables that are actually used from a specific folder. After a month we will delete any .exe files that are not on this list.
I currently get the correct results using this:
while ($true) {
foreach ($process in Get-Process | where {$_.Path -imatch 'ksv'} | select -Unique) {
$dir = $process | Get-ChildItem;
New-Object -TypeName PSObject -Property @{
'Path' = $process.Path;
} | Out-String | Add-Content -LiteralPath Z:\processList.txt
}
Get-Content Z:\processList.txt | sort | Get-Unique > Z:\uniqueprocesslist.txt
}
I'm going to get rid of the while loop as this will be eventually running as a service.
The problem with this is that it creates a huge list in processList.txt, full of duplicates, that I would like to eliminate to save space.
I tried to come up with a better solution that scans the text file to see if the path is already written before adding the new process path. I am not sure what I am doing wrong, but nothing is ever written to the text file:
while ($true) {
foreach ($process in Get-Process | where {$_.Path -imatch 'ksv'} | select -Unique) {
$dir = $process | Get-ChildItem;
$progPath = New-Object -TypeName PSObject -Property @{
'Path' = $process.Path
}
$file = Get-Content "Z:\processList.txt"
$containsLine = $file | %{$_ -match $progPath}
if ($containsLine -contains $false) {
Add-Content -LiteralPath Z:\processList.txt
}
}
}
If I understand your question correctly, you want to build a "recently used" list of executables in a specific directory in a file, and update that (unique) list with each run of your script.
Something like this should do that:
$listfile = 'Z:\processlist.txt'
# Build a dictionary from known paths, so that we can check for already known
# paths with an index lookup instead of a linear search over an array.
$list = @{}
if (Test-Path -LiteralPath $listfile) {
Get-Content $listfile | ForEach-Object {
$list[$_] = $true
}
}
# List processes, expand their path, then check if the path contains the
# string "ksv" and isn't already known. Append the results to the list file.
Get-Process |
Select-Object -Expand Path |
Sort-Object -Unique |
Where-Object {$_ -like '*ksv*' -and -not $list.ContainsKey($_)} |
Add-Content $listfile
Hashtable lookup and wildcard match are used for performance reasons, because they're significantly faster than linear searches in arrays and regular expression matches.
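If you want to see the difference for yourself, a rough micro-benchmark along these lines (a throwaway sketch; exact numbers will vary by machine) contrasts the two lookups:
# Compare a linear -contains search against a hashtable key lookup.
$array = 1..50000 | ForEach-Object { "C:\ksv\app$_.exe" }
$hash = @{}
$array | ForEach-Object { $hash[$_] = $true }
(Measure-Command { 1..1000 | ForEach-Object { $array -contains 'C:\ksv\app49999.exe' } }).TotalMilliseconds
(Measure-Command { 1..1000 | ForEach-Object { $hash.ContainsKey('C:\ksv\app49999.exe') } }).TotalMilliseconds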
while ($true) {
$file = Get-Content "Z:\processList.txt"
$KSVPaths = Get-Process |
Where-Object {$_.Path -imatch 'ksv'} |
Select-Object -ExpandProperty Path |
Select-Object -Unique
ForEach ($KSVPath in $KSVPaths) {
if ($KSVPath -notin $file) {
Add-Content -Path "Z:\processList.txt" -Value $KSVPath
}
}
}
I'm new to PS scripting (really, I started today) and, for a project, I need to create a .txt file with all the extensions from all shared folders on the local machine (a Windows file server).
I think I'm on the right path with this:
get-childitem -Path C:\test -Recurse | select extension -unique > $PSScriptRoot\ExtensionList.txt
It's doing exactly what I want for a given path and all subfolders but now I need to apply this to all shared folders on the machine.
I was able to list all the shared folder's path with this command :
$Shares = @(Get-WmiObject Win32_Share |
Select Name,Path,Type |
Where-Object { $_.Type -match '0|2147483648' } |
Select -ExpandProperty Path |
Select -Unique)
Write-Host $Shares
Now I'm stuck, I suppose I need to use the foreach command but I can't find the way to make it work.
Can someone help me put this together?
Thanks,
You can try the Get-SmbShare cmdlet:
Get-SMBShare | Foreach {
Get-ChildItem "\\$($_.name)" | Select-Object Extension -Unique
}
You're probably looking for something similar to this:
$Shares = @( Get-CimInstance Win32_Share | Where-Object { $_.Type -match '0|2147483648' } | Select -Unique )
ForEach ( $Share In $Shares ) { Get-ChildItem -Path $Share.Path -File -Recurse -ErrorAction Ignore | Select -Unique -ExpandProperty Extension }
I'll leave you to split the lines to match your particular style and to output to a file (I'd advise using Out-File instead of > for that).
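For example, gathering the unique extensions from every share and writing them with Out-File might look something like this (the output path is just an example):
# Collect unique extensions from all shares and write them to a file.
$Extensions = ForEach ( $Share In $Shares ) {
Get-ChildItem -Path $Share.Path -File -Recurse -ErrorAction Ignore | Select -Unique -ExpandProperty Extension
}
$Extensions | Select -Unique | Sort-Object | Out-File "$PSScriptRoot\ExtensionList.txt"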
Thank you guys for your help! I was able to figure it out.
The following script will gather all extensions on the shared folders, sort them, eliminate duplicates and empty lines, add "*" before each extension, and create a file List.txt with the result.
#get shares
$Shares = @( Get-CimInstance Win32_Share |
Where-Object { $_.Type -match '0|2147483648' } |
Select -Unique )
#list all extensions
ForEach ( $Share In $Shares ) { Get-ChildItem -Path $Share.Path -File -Recurse -ErrorAction Ignore | Select -Unique -ExpandProperty Extension | out-file C:\extensions\List1.txt -append }
#remove empty lines
@(gc C:\extensions\List1.txt) -match '\S' | out-file C:\extensions\List2.txt
#Add * before extension type
gc C:\extensions\List2.txt | %{"*$_"} | out-file C:\extensions\List3.txt
#Sort by name
gc C:\extensions\List3.txt | sort | get-unique > C:\extensions\List4.txt
#Remove duplicates
$hash = @{}
gc C:\extensions\List4.txt |
%{if($hash.$_ -eq $null) { $_ }; $hash.$_ = 1} > C:\extensions\List.txt
#Delete list1-4
Remove-Item C:\extensions\List1.txt, C:\extensions\List2.txt, C:\extensions\List3.txt, C:\extensions\List4.txt
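For what it's worth, the intermediate List1-4 files could be avoided entirely by chaining the steps into a single pipeline. A rough sketch of the same logic (untested, using the same share filter and output path as above):
#get shares, list extensions, drop empties, add *, sort, dedupe, write
$Shares = @( Get-CimInstance Win32_Share | Where-Object { $_.Type -match '0|2147483648' } | Select -Unique )
$Shares |
ForEach-Object { Get-ChildItem -Path $_.Path -File -Recurse -ErrorAction Ignore } |
Select-Object -Unique -ExpandProperty Extension |
Where-Object { $_ -match '\S' } |
Sort-Object |
ForEach-Object { "*$_" } |
Out-File C:\extensions\List.txt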
So I'm trying to process CSV files, then give the output a new name. I can do it with one file by explicitly specifying the file name, but is there a way / wildcard I can use to make the script process multiple files at the same time? Let's just say I want to process anything with .csv as an extension. Here's the script I use to process a specific file:
$objs = @();
$output = Import-csv -Path D:\TEP\FilesProcessing\Test\file1.csv | ForEach {
$Object = New-Object PSObject -Property @{
Time = $_.READ_DTTM
Value = $_.{VALUE(KWH)}
Tag = [String]::Concat($_.SUBSTATION,'_',$_.CIRCUITNAME,'_',$_.PHASE,'_',$_.METERID,'_KWH')
}
$objs += $Object;
}
$objs
$objs | Export-CSv -NoTypeInformation D:\TEP\FilesProcessing\Test\file1_out.csv
You can combine Get-ChildItem and Import-Csv.
Here's an example that specifies different input and output directories to avoid name collisions:
$inputPath = "D:\TEP\FilesProcessing\Test"
$outputPath = "D:\TEP\FilesProcessing\Output"
Get-ChildItem (Join-Path $inputPath "*.csv") | ForEach-Object {
$outputFilename = Join-Path $outputPath $_.Name
Import-Csv $_.FullName | ForEach-Object {
New-Object PSObject -Property @{
"Time" = $_.READ_DTTM
"Value" = $_.{VALUE(KWH)}
"Tag" = "{0}_{1}_{2}_{3}_KWH" -f $_.SUBSTATION,$_.CIRCUITNAME,$_.PHASE,$_.METERID
}
} | Export-Csv $outputFilename -NoTypeInformation
}
Note that there's no need to create an array and repeatedly append to it. Just output the custom objects you want and export them afterwards.
Use Get-ChildItem and cut out all the unnecessary intermediate variables so the code reads like more idiomatic PowerShell. Something like this:
Get-ChildItem 'D:\TEP\FilesProcessing\Test\*.csv' | % {
Import-csv $_.FullName | % {
New-Object PSObject -Property @{
Time = $_.READ_DTTM
Value = $_.{VALUE(KWH)}
Tag = '{0}_{1}_{2}_{3}_KWH' -f $_.SUBSTATION, $_.CIRCUITNAME, $_.PHASE, $_.METERID
}
} | Export-CSv ($_.FullName -replace '\.csv', '_out.csv') -NoTypeInformation
}
Get-ChildItem is very useful for situations like this.
You can add wildcards directly into the path:
Get-ChildItem -Path D:\TEP\FilesProcessing\Test\*.csv
You can recurse a path and use the provider to filter files:
Get-ChildItem -Path D:\TEP\FilesProcessing\Test\ -recurse -include *.csv
This should get you what you need.
$Props = @(
@{ Name = 'Time'; Expression = { [datetime]::Parse($_.READ_DTTM) } }
@{ Name = 'Value'; Expression = { $_.{VALUE(KWH)} } }
@{ Name = 'Tag'; Expression = { $_.SUBSTATION, $_.CIRCUITNAME, $_.PHASE, $_.METERID, 'KWH' -join '_' } }
)
$data = Get-ChildItem -Path D:\TEP\FilesProcessing\Test\*.csv | Foreach-Object {Import-CSV -Path $_.FullName}
$data | Select-Object -Property $Props | Export-CSv -NoTypeInformation D:\TEP\FilesProcessing\Test\file1_out.csv
Also, when using PowerShell, avoid doing things like this:
$objs = @();
$objs += $Object;
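Appending to an array with += copies the entire array on every iteration; capturing the pipeline output directly scales much better. A minimal sketch of that pattern, using the same file and columns as in the question:
# Collect pipeline output directly instead of appending to an array.
$objs = Import-Csv -Path D:\TEP\FilesProcessing\Test\file1.csv | ForEach-Object {
New-Object PSObject -Property @{
Time = $_.READ_DTTM
Value = $_.{VALUE(KWH)}
}
}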
We have a file server that processes files that are received. When a file fails to process for whatever reason, it is moved into a failure folder. I've written a script to iterate through every possible one of these folders and spit out the FullName of the file into an e-mail which it sends to me.
Now when I run it manually, it works fine. However, when I set it as a scheduled task (running as Local System), the script still runs successfully, but the e-mail contains paths like \\blahblah\blah\blahblahblah\bl.....
I've tweaked the script a bunch of different ways and every time the output ends up the same. When I run it manually, it works as intended, when it runs as an automated script, it truncates the FullNames. I've found other people with this issue, but not as an automated task.
This is the relevant code of the script.
$emailFileList = ""
$filelist = @()
try {
GCI $topLevelPath -Recurse |
? { $_.PSIsContainer } |
ForEach-Object {
dir $_.FullName |
Where-Object {$_.FullName -like $unableToProcess} | ForEach-Object {
$filelist += dir $_.FullName
}
}
$emailFileList = Out-String -InputObject $($filelist | Select-Object FullName | Format-Table -AutoSize)
$emailBody = $emailBody + $emailFileList
}
EDIT:
I used the HTML method below but it added a bunch of junk markup. I added 4 lines to replace the markup with quotes. The inside of the try block now looks like this, and it works even as scheduled tasks.
GCI $topLevelPath -Recurse |
? { $_.PSIsContainer } |
ForEach-Object {
dir $_.FullName |
Where-Object {$_.FullName -like $unableToProcess} | ForEach-Object {
$filelist += dir $_.FullName
}
}
$emailFileList = $filelist | Select-Object FullName | ConvertTo-Html -fragment
$emailFileList = [regex]::Replace($emailFileList, "<table>.+</th></tr>", "")
$emailFileList = $emailFileList -replace '<tr><td>', '"'
$emailFileList = $emailFileList -replace '</td></tr>', """`r`n"
$emailFileList = $emailFileList -replace '</table>', ''
$emailBody = $emailBody + $emailFileList
I guess I also technically used regex on html what have I done noooooooo
Edit: Regarding answer "duplication", the problem above is SPECIFICALLY an interaction between PowerShell and Windows scheduled tasks.
This gives the kind of output you would probably expect
[command] | Format-Table -AutoSize | Out-String -Width 10000 #| clip.exe
Since you were using Format-Table -AutoSize, it was probably truncating due to the number of characters per line in the PowerShell instance. You can use the ConvertTo-Html cmdlet with the -Fragment switch to create an HTML table.
Try something like this:
$emailFileList = ""
$filelist = @()
try {
Get-ChildItem $topLevelPath -Recurse `
| Where-Object -Property PSIsContainer -EQ -Value $True `
| ForEach-Object {
Get-ChildItem $_.FullName |
Where-Object -Property FullName -Like -Value $UnableToProcess `
| ForEach-Object {
$filelist += Get-ChildItem $_.FullName
}
}
$emailFileList = $filelist | Select-Object FullName | ConvertTo-Html -Fragment
$emailBody = $emailBody + $emailFileList
}
catch
{
}
Your problem is the formatter truncating because the console host can't render the full strings. Here's a solution where you'll get a .txt file with a list of names that can be used however you want.
General rule of thumb: filter left, format right.
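As a quick illustration of that rule before the solution itself (Get-Process here is just a stand-in for any cmdlet):
# Filter left (objects stay objects), format right (only at the very end).
Get-Process |
Where-Object { $_.WorkingSet64 -gt 100MB } |
Select-Object Name, Id |
Format-Table -AutoSize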
#Requires -Version 3
Try
{
Get-ChildItem -Path $TopLevelPath -Recurse -Directory |
## If you're on version 2, replace the -Directory switch with the following:
#Where-Object { $_.PSIsContainer }
ForEach-Object {
## If version 2, remove #().FullName and replace with
# | Select-Object -ExpandProperty FullName
$FileList += @(Get-ChildItem -Path $_.FullName -Filter "*$UnableToProcess*").FullName
}
$EmailBody += @($FileList)
}
Catch
{
}
I need to take a slew of csv files from a directory and get them into an array in Powershell (to eventually manipulate and write back to a CSV).
The problem is there are 5 file types. I need around 8 columns from each. The columns are essentially the same, but have different headings.
Is there an easy way to do this? I started creating a custom object with my 8 fields, looping through the files importing each one, looking at the filename (which tells me the column names I need) and then a bunch of ifs to add it to my custom object array.
I was wondering if there is a simpler way...like with a template saying which columns from each file.
I wound up doing this. It may not have been the most efficient, but it works. I ended up writing out each file separately and combining them at the end, as PowerShell really got bogged down (over a million rows combined).
$Newcsv = @()
$path = "c:\scrap\BWFILES\"
$files = gci -path $path -recurse -filter *.csv | Where-Object { ! ($_.psiscontainer) }
$counter=1
foreach($file in $files)
{
$csv = Import-Csv $file.FullName
if ($file.Name -like '*SAV*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"SV"}},DMBRCH,DMACCT,DMSHRT
}
if ($file.Name -like '*TIME*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"TM"}},TMBRCH,TMACCT,TMSHRT
}
if ($file.Name -like '*TRAN*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"TR"}},DMBRCH,DMACCT,DMSHRT
}
if ($file.Name -like '*LN*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"LN"}},LNBRCH,LNNOTE,LNSHRT
}
$Newcsv | Export-Csv "C:\scrap\$($file.Name)$counter.csv" -Force -NoTypeInformation
$counter++
}
get-childItem "c:\scrap\*.csv" | foreach {
$filePath = $_
$lines = $lines = Get-Content $filePath
$linesToWrite = switch($getFirstLine) {
$true {$lines}
$false {$lines | Select -Skip 1}
}
$getFirstLine = $false
Add-Content "c:\scrap\combined.csv" $linesToWrite
}
With a hashtable for reference, a little RegEx matching, and the automatic variable $Matches in a ForEach-Object loop (alias % used), that could all be shortened to:
$path = "c:\scrap\BWFILES\"
$Reference = @{
'SAV' = 'SV'
'TIME' = 'TM'
'TRAN' = 'TR'
'LN'='LN'
}
Set-Content -Value "PRODUCT,BRCH,ACCT,SHRT" -Path 'c:\scrap\combined.csv'
gci -path $path -recurse -filter *.csv | Where-Object { !($_.psiscontainer) -and $_.Name -match ".*(SAV|TIME|TRAN|LN).*"}|%{
$Product = $Reference[($Matches[1])]
Import-CSV $_.FullName | Select-Object @{Name="PRODUCT";Expression={$Product}},*BRCH,@{l='Acct';e={$_.LNNOTE, $_.DMACCT, $_.TMACCT|?{$_}}},*SHRT | ConvertTo-Csv -NoTypeInformation | Select -Skip 1 | Add-Content 'c:\scrap\combined.csv'
}
That should produce the exact same file. The only slightly tricky part was the LNNOTE/TMACCT/DMACCT field, since you obviously can't just use a wildcard the way *SHRT does.
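In case the Acct part isn't obvious: the calculated property lists all three candidate columns and keeps whichever one happens to be populated for that file type, roughly:
# Only one of the three account columns exists per file type;
# piping through ?{$_} (Where-Object) drops the empty values.
@{l='Acct';e={$_.LNNOTE, $_.DMACCT, $_.TMACCT | ?{$_}}}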
There must be a more efficient way to import multiple CSV files from a directory where the name of the file contains Services_Results*.csv into one variable with unique entries. I'm thinking of looping through all the files in the directory that match the file name with a wildcard then just importing the lines where Success is on the field.
$Success0 = Import-Csv -Path "\\FILE05\Users\USER001\+Projects\Chrome\Services_Results.csv" | Where-Object {$_.Status -eq "Success"}
$Success1 = Import-Csv -Path "\\FILE05\Users\USER001\+Projects\Chrome\Services_Results_150601.csv" | Where-Object {$_.Status -eq "Success"}
$Success2 = Import-Csv -Path "\\FILE05\Users\USER001\+Projects\Chrome\Services_Results_150602.csv" | Where-Object {$_.Status -eq "Success"}
$Success3 = Import-Csv -Path "\\FILE05\Users\USER001\+Projects\Chrome\Services_Results_150602_b.csv" | Where-Object {$_.Status -eq "Success"}
$Success4 = Import-Csv -Path "\\FILE05\Users\USER001\+Projects\Chrome\Services_Results_150602_c.csv" | Where-Object {$_.Status -eq "Success"}
$Success5 = Import-Csv -Path "\\FILE05\Users\USER001\+Projects\Chrome\Services_Results_150603_a.csv" | Where-Object {$_.Status -eq "Success"}
$PCList = $Success0 + $Success1 + $Success2 + $Success3 + $Success4 + $Success5
$PCList = $PCList.PC | sort -Unique
Write-host "PCList" $PCList.count
From Get-Help Import-CSV:
-Path <String[]>
Specifies the path to the CSV file to import. You can also pipe a path to Import-Csv.
Required? false
Position? 1
Default value None
Accept pipeline input? true (ByValue)
Accept wildcard characters? false
So Import-CSV will accept multiple values for -Path from the pipeline:
$PCList =
Get-ChildItem '\\FILE05\Users\USER001\+Projects\Chrome\Services_Results*.csv' |
Select -ExpandProperty FullName |
Import-CSV |
Where-Object {$_.Status -eq "Success"}
How about using PowerShell background jobs to speed up the process? I haven't tested this code, but try to read and understand the pattern. I think you'll appreciate how much it can speed things up if you're importing large CSV files.
### Create an empty array to hold Background Jobs.
$JobList = @();
### Get a list of CSV files.
$FileList = Get-ChildItem -Path \\FILE05\Users\USER001\+Projects\Chrome\*Services_Results*.csv;
foreach ($File in $FileList) {
### Start a new Background Job for each file
$JobList += Start-Job -ScriptBlock { Import-Csv -Path $args[0] | Where-Object -FilterScript { $PSItem.Status -eq 'Success'; } } -ArgumentList $File.FullName -Name $File.Name;
}
### Wait for all the jobs to complete.
Wait-Job -Job $JobList;
### Receive the results of the jobs.
$ResultList = Receive-Job -Job $JobList;
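If the end goal is the same unique PC list as in the question, a possible follow-up (assuming the CSVs have the PC column used above) is to flatten the results and clean up the finished jobs:
### Flatten the per-file results into one unique PC list, then remove the jobs.
$PCList = $ResultList | Select-Object -ExpandProperty PC | Sort-Object -Unique
Remove-Job -Job $JobList
Write-Host "PCList" $PCList.Count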
I think you can do what you want quite neatly with the way you're already doing it, but using a wildcard and a ForEach loop:
$files = Get-ChildItem "\\FILE05\Users\USER001\+Projects\Chrome\Services_Results*.csv"
$PCList = $files | ForEach { (Import-Csv $_ | Where Status -eq "Success") }
$PCList = $PCList | sort -Unique -Property PC
Write-Output $PCList
Write-Output $PCList.Count