Perform function for all c:\users\*\AppData\Local - powershell

I've got the following script that I thought would happily update the specified .ini file in each C:\Users\*\AppData\Local\Greentram folder individually.
function Set-OrAddIniValue {
Param(
[string]$FilePath,
[hashtable]$keyValueList
)
$content = Get-Content $FilePath
$keyValueList.GetEnumerator() | ForEach-Object {
if ($content -match "^$($_.Key)=") {
$content= $content -replace "^$($_.Key)=(.*)", "$($_.Key)=$($_.Value)"
} else {
$content += "$($_.Key)=$($_.Value)"
}
}
$content | Set-Content $FilePath
}
Set-OrAddIniValue -FilePath "C:\Users\*\AppData\Local\Greentram\SDA_Apps.ini" -keyValueList @{
UserName = "Dcebtcv7[[G"
UserEmail = "x}tpwpjmkxmvkYjmklzmx7zv7lr"
UserNo = "*++*(+"
UserKey = "^X(_0[*_/0L)\_0,U,-"
KEM = "H10"
}
What it seems to be doing is somehow combining all the .INI files together and creating a new .INI file for each user.
I had wrongly assumed that C:\Users\*\AppData\Local\Greentram\SDA_Apps.ini would work.
I only want to update or add these specific values to each .INI file.

Your function Set-OrAddIniValue doesn't handle wildcards in paths.
$content = Get-Content $FilePath
...
$content | Set-Content $FilePath
The first statement reads the content of all matching files into a single array. The second statement then writes the entire modified content to all matching files. (How would it decide which content belongs to which file?)
You can either call your function for each file individually:
Get-ChildItem "C:\Users\*\AppData\Local\Greentram\SDA_Apps.ini" | ForEach-Object {
Set-OrAddIniValue -FilePath $_.FullName -keyValueList ...
}
or change your function so that it does the enumeration internally:
function Set-OrAddIniValue {
Param(
[string]$FilePath,
[hashtable]$keyValueList
)
Get-ChildItem $FilePath | Where-Object {
-not $_.PSIsContainer # process only files
} | ForEach-Object {
$file = $_.FullName
$content = Get-Content $file
...
$content | Set-Content $file
}
}
On PowerShell v3 and newer you can use Get-ChildItem -File instead of piping the object list through Where-Object {-not $_.PSIsContainer}.
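Putting the two together, a complete v3+ version of the function, reusing the exact replace/append logic from your original, would look like this:
function Set-OrAddIniValue {
    Param(
        [string]$FilePath,
        [hashtable]$keyValueList
    )
    Get-ChildItem $FilePath -File | ForEach-Object {
        $file = $_.FullName
        $content = Get-Content $file
        $keyValueList.GetEnumerator() | ForEach-Object {
            if ($content -match "^$($_.Key)=") {
                $content = $content -replace "^$($_.Key)=(.*)", "$($_.Key)=$($_.Value)"
            } else {
                $content += "$($_.Key)=$($_.Value)"
            }
        }
        $content | Set-Content $file
    }
}
You can then call it with the wildcard path from your question, e.g. with a single illustrative key/value pair:
Set-OrAddIniValue -FilePath "C:\Users\*\AppData\Local\Greentram\SDA_Apps.ini" -keyValueList @{
    KEM = "H10"
}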

Related

Concatenating Output from Folder

I have thousands of PDF documents that I am trying to comb through and pull out only certain data. I have successfully created a script that goes through each PDF, puts its content into a .txt, and then the final .txt is searched for the requested information. The only part I am stuck on is trying to combine all the data from each PDF into this .txt file. Currently, each successive PDF simply overwrites the previous data and the search is only performed on the final PDF in the folder. How can I alter this set of code to allow each bit of information to be concatenated into the .txt instead of overwriting?
$all = Get-Childitem -Path $file1 -Recurse -Filter *.pdf
foreach ($f in $all){
$outfile = $f -join ', '
$text = convert-PDFtoText $outfile
}
Here is my entire script for reference:
Start-Process powershell.exe -Verb RunAs {
function convert-PDFtoText {
param(
[Parameter(Mandatory=$true)][string]$file
)
Add-Type -Path "C:\ps\itextsharp.dll"
$pdf = New-Object iTextSharp.text.pdf.pdfreader -ArgumentList $file
for ($page = 1; $page -le $pdf.NumberOfPages; $page++){
$text=[iTextSharp.text.pdf.parser.PdfTextExtractor]::GetTextFromPage($pdf,$page)
Write-Output $text
}
$pdf.Close()
}
$content = Read-Host "What are we looking for?: "
$file1 = Read-Host "Path to search: "
$all = Get-Childitem -Path $file1 -Recurse -Filter *.pdf
foreach ($f in $all){
$outfile = $f -join ', '
$text = convert-PDFtoText $outfile
}
$text | Out-File "C:\ps\bulk.txt"
Select-String -Path C:\ps\bulk.txt -Pattern $content | Out-File "C:\ps\select.txt"
Start-Sleep -Seconds 60
}
Any help would be greatly appreciated!
To capture the output from all convert-PDFtoText calls in a single output file, use a single pipeline with the ForEach-Object cmdlet:
Get-ChildItem -Path $file1 -Recurse -Filter *.pdf |
ForEach-Object { convert-PDFtoText $_.FullName } |
Out-File "C:\ps\bulk.txt"
A tweak to your convert-PDFtoText function would allow for a more concise and efficient solution:
Make convert-PDFtoText accept Get-ChildItem input directly from the pipeline:
function convert-PDFtoText {
param(
[Alias('FullName')]
[Parameter(Mandatory, ValueFromPipelineByPropertyName)]
[string] $file
)
begin {
Add-Type -Path "C:\ps\itextsharp.dll"
}
process {
$pdf = New-Object iTextSharp.text.pdf.pdfreader -ArgumentList $file
for ($page = 1; $page -le $pdf.NumberOfPages; $page++) {
[iTextSharp.text.pdf.parser.PdfTextExtractor]::GetTextFromPage($pdf,$page)
}
$pdf.Close()
}
}
This then allows you to simplify the command at the top to:
Get-ChildItem -Path $file1 -Recurse -Filter *.pdf |
convert-PDFtoText |
Out-File "C:\ps\bulk.txt"

Appending file name and last write time as columns in a CSV

I have bunch of text files that I am converting to a CSV.
For example I have a few hundred txt files that look like this
Serial Number : 123456
Measurement : 5
Test Data : 125
And each file is being converted to a single row on the CSV. I can't figure out how to add an additional column for the file name and the last write time.
This is what I currently have that copies all of the data from txt to CSV
$files = "path"
function Get-Data {
param (
[Parameter (Mandatory, ValueFromPipeline, Position=0)] $filename
)
$data=@{}
$lines=Get-Content -LiteralPath $filename | Where-Object {$_ -notlike '*---*'}
foreach ($line in $lines) {
$splitLine=$line.split(":")
$data.Add($splitLine[0],$splitLine[1])
}
return [PSCustomObject]$data
}
$files | Foreach-Object -Process {Get-Data $_} | Export-Csv -Path C:\Scripts\data.csv -NoTypeInformation -Force
I've tried doing this but it doesn't add anything. I might be trying to add the data the wrong way.
$files = "path"
function Get-Data {
param (
[Parameter (Mandatory, ValueFromPipeline, Position=0)] $filename
)
$data=@{}
$name = Get-ChildItem -literalpath $filename | Select Name
$data.Add("Filename", $name)
$lines=Get-Content -LiteralPath $filename | Where-Object {$_ -notlike '*---*'}
foreach ($line in $lines) {
$splitLine=$line.split(":")
$data.Add($splitLine[0],$splitLine[1])
}
return [PSCustomObject]$data
}
$files | Foreach-Object -Process {Get-Data $_} | Export-Csv -Path E:\Scripts\Pico2.csv -NoTypeInformation -Force
Here's a streamlined version of your code that should work as intended:
function Get-Data {
param (
[Parameter (Mandatory, ValueFromPipeline)]
[System.IO.FileInfo] $file # Accept direct output from Get-ChildItem
)
process { # Process each pipeline input object
# Initialize an ordered hashtable with
# the input file's name and its last write time.
$data = [ordered] @{
FileName = $file.Name
LastWriteTime = $file.LastWriteTime
}
# Read the file and parse its lines
# into property name-value pairs to add to the hashtable.
$lines = (Get-Content -ReadCount 0 -LiteralPath $file.FullName) -notlike '*---*'
foreach ($line in $lines) {
$name, $value = ($line -split ':', 2).Trim()
$data.Add($name, $value)
}
# Convert the hashtable to a [pscustomobject] instance
# and output it.
[PSCustomObject] $data
}
}
# Determine the input files via Get-ChildItem and
# pipe them directly to Get-Data, which in turn pipes to Export-Csv.
Get-ChildItem "path" |
Get-Data |
Export-Csv -Path C:\Scripts\data.csv -NoTypeInformation -Force
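With a sample input file like the one shown in the question, the resulting CSV would look something like this (file name and timestamp are hypothetical):
"FileName","LastWriteTime","Serial Number","Measurement","Test Data"
"sample.txt","1/1/2024 12:00:00 AM","123456","5","125"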

PowerShell add string before pattern in a file

I'm trying to add a line in a .sln file before the $pattern. The only problem is that when I try to add the $firstOccurrence condition in the if statement it doesn't add anything at all. It still triggers the Write-Debug.
The first occurrence part is now commented out but I can't seem to find out why it doesn't write anything when I set the first occurrence.
The original source for my solution can be found here:
How to declare a variable and its type is Boolean in PowerShell?
$firstOccurrence = $true;
$pattern = "Global"
(Get-Content $fileName) | Foreach-Object {
#if ($firstOccurrence) {
if ($_ -match $pattern) {
Write-Debug "test"
$firstOccurrence = $false
#Add Lines after the selected pattern
"aaaaaaaaaaaaaaaaaaaaaaa"
}
#}
# send the current line to output
$_
} | Set-Content $fileName
You could also do this using a (very fast) switch -Regex statement, like this:
$fileName = 'D:\Test\blah.txt'
$firstOccurrence = $true
$pattern = "Global"
$insert = "aaaaaaaaaaaaaaaaaaaaaaa"
$newContent = switch -Regex -File $fileName {
$pattern {
if ($firstOccurrence) {
$insert
$firstOccurrence = $false
}
$_
}
default { $_ }
}
$newContent | Set-Content $fileName -Force
Or did you perhaps mean this:
$fileName = 'D:\Test\blah.txt'
$pattern = "Global"
$insert = "aaaaaaaaaaaaaaaaaaaaaaa"
((Get-Content -Path $fileName -Raw) -split $pattern, 2) -join "$insert`r`n$pattern" | Set-Content $fileName -Force
?
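To see what the -split / -join round trip does, here is a minimal, self-contained sketch (hypothetical content, no file involved):
$pattern = "Global"
$insert = "aaaaaaaaaaaaaaaaaaaaaaa"
$text = "line1`r`nGlobal`r`nline3"
($text -split $pattern, 2) -join "$insert`r`n$pattern"
# line1
# aaaaaaaaaaaaaaaaaaaaaaa
# Global
# line3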

Memory exception while filtering large CSV files

I'm getting a memory exception while running this code. Is there a way to filter one file at a time, write the output, and append after processing each file? The code below seems to load everything into memory.
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Get-ChildItem $inputFolder -File -Filter '*.csv' |
ForEach-Object { Import-Csv $_.FullName } |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Maybe you can filter and export your files one by one, appending the result to your output file, like this:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Remove-Item $outputFile -Force -ErrorAction SilentlyContinue
Get-ChildItem $inputFolder -Filter "*.csv" -file | %{import-csv $_.FullName | where machine_type -eq 'workstations' | export-csv $outputFile -Append -notype }
Note: The reason for not using Get-ChildItem ... | Import-Csv ... - i.e., for not directly piping Get-ChildItem to Import-Csv and instead having to call Import-Csv from the script block ({ ... }) of an auxiliary ForEach-Object call - is a bug in Windows PowerShell that has since been fixed in PowerShell Core; see the bottom section for a more concise workaround.
However, even output from ForEach-Object script blocks should stream to the remaining pipeline commands, so you shouldn't run out of memory - after all, a salient feature of the PowerShell pipeline is object-by-object processing, which keeps memory use constant, irrespective of the size of the (streaming) input collection.
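As a quick, self-contained illustration of that object-by-object streaming (a toy pipeline, unrelated to CSV processing):
# Each object passes through ForEach-Object individually rather than
# as a whole collection; Select-Object -First 3 stops the pipeline
# as soon as three objects have arrived.
1..100 | ForEach-Object { $_ * 2 } | Select-Object -First 3   # -> 2, 4, 6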
You've since confirmed that avoiding the aux. ForEach-Object call does not solve the problem, so we still don't know what causes your out-of-memory exception.
Update:
This GitHub issue contains clues as to the reason for excessive memory use, especially with many properties that contain small amounts of data.
This GitHub feature request proposes using strongly typed output objects to help the issue.
The following workaround, which uses the switch statement to process the files as text files, may help:
$header = ''
Get-ChildItem $inputFolder -Filter *.csv | ForEach-Object {
$i = 0
switch -Wildcard -File $_.FullName {
'*workstations*' {
# NOTE: If no other columns contain the word `workstations`, you can
# simplify and speed up the command by omitting the `ConvertFrom-Csv` call
# (you can make the wildcard matching more robust with something
# like '*,workstations,*')
if ((ConvertFrom-Csv "$header`n$_").machine_type -ne 'workstations') { continue }
$_ # row whose 'machine_type' column value equals 'workstations'
}
default {
if ($i++ -eq 0) {
if ($header) { continue } # header already written
else { $header = $_; $_ } # header row of 1st file
}
}
}
} | Set-Content $outputFile
Here's a workaround for the bug of not being able to pipe Get-ChildItem output directly to Import-Csv, by passing it as an argument instead:
Import-Csv -LiteralPath (Get-ChildItem $inputFolder -File -Filter *.csv) |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Note that in PowerShell Core you could more naturally write:
Get-ChildItem $inputFolder -File -Filter *.csv | Import-Csv |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Solution 2:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8 # modify encoding if necessary
$Delimiter=','
# find the header for the files => take the first row of the first file that has data
$Header = Get-ChildItem -Path $inputFolder -Filter *.csv | Where length -gt 0 | select -First 1 | Get-Content -TotalCount 1
# if no header was found, there is no file with size > 0 => quit
if(! $Header) {return}
# create an array from the header
$HeaderArray=$Header -split $Delimiter -replace '"', ''
#open output file
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
# write the header we found
$w.WriteLine($Header)
# loop over the csv files
Get-ChildItem $inputFolder -File -Filter "*.csv" | %{
# open the file for reading
$r = New-Object System.IO.StreamReader($_.fullname, $encoding)
$skiprow = $true
while ($null -ne ($line = $r.ReadLine()))
{
# skip the header row
if ($skiprow)
{
$skiprow = $false
continue
}
# build an object for the current row, using the header we found
$Object=$line | ConvertFrom-Csv -Header $HeaderArray -Delimiter $Delimiter
# write the row to the output file if it matches the condition
if ($Object.machine_type -eq 'workstations') { $w.WriteLine($line) }
}
$r.Close()
$r.Dispose()
}
$w.close()
$w.Dispose()
You have to read and write to the .csv files one row at a time, using StreamReader and StreamWriter:
$filepath = "C:\Change\2019\October"
$outputfile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8
$files = Get-ChildItem -Path $filePath -Filter *.csv
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
$header = $null
foreach ($file in $files)
{
    $r = New-Object System.IO.StreamReader($file.FullName, $encoding)
    $firstRow = $true
    while (($line = $r.ReadLine()) -ne $null)
    {
        if ($firstRow)
        {
            # write the header only once, taken from the first file
            if ($null -eq $header) { $header = $line; $w.WriteLine($header) }
            $firstRow = $false
        }
        elseif (($line | ConvertFrom-Csv -Header (($header -split ',') -replace '"', '')).machine_type -eq 'workstations')
        {
            # keep only rows whose machine_type column is 'workstations'
            $w.WriteLine($line)
        }
    }
    $r.Close()
    $r.Dispose()
}
$w.Close()
$w.Dispose()
Get-Content *.csv | Add-Content combined.csv
Make sure combined.csv doesn't exist when you run this, or it's going to go full Ouroboros.
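If you want to keep that one-liner, a guard like the Remove-Item call from the earlier answer avoids the self-append:
Remove-Item combined.csv -Force -ErrorAction SilentlyContinue
Get-Content *.csv | Add-Content combined.csv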

Replace or add .INI file content

Need to update a .INI file across multiple computers and change the contents. I have the following script that works:
(Get-Content SDA_Apps.ini) | Foreach-Object {
$_ -replace "UserName=.+", "UserName=Test" `
-replace "UserEmail=.+", "UserEmail=test#test.com" `
-replace "UserNo=.+", "UserNo=1234" `
-replace "UserKey=.+", "UserKey=^%&$*$778-" `
-replace "KEM=.+", "KEM=H10"
} | Set-Content SDA_Apps.ini
Sometimes those lines of text do not exist and I need to add the text instead of replace it.
This is my attempt to do this - without success:
function setConfig( $file, $key1, $value1, $key2, $value2 ) {
$content = Get-Content $file
if ( $content -match "^$key\s*=" ) {
$content $_ -replace "^$key1\s*=.*", "$key1=$value1" -replace "^$key2\s*=.*", "$key2=$value2"|
Set-Content $file
} else {
Add-Content $file "$key1 = $value1"
Add-Content $file "$key2 = $value2"
}
}
setConfig "SDA_Apps.ini" "UserName" "Test" "UserEmail" "test#test.com"
I rewrote your function and renamed it to Set-OrAddIniValue to reflect what it actually does:
function Set-OrAddIniValue
{
Param(
[string]$FilePath,
[hashtable]$keyValueList
)
$content = Get-Content $FilePath
$keyValueList.GetEnumerator() | ForEach-Object {
if ($content -match "^$($_.Key)=")
{
$content= $content -replace "^$($_.Key)=(.*)", "$($_.Key)=$($_.Value)"
}
else
{
$content += "$($_.Key)=$($_.Value)"
}
}
$content | Set-Content $FilePath
}
The benefit of this function is that you can pass a key-value list as a hashtable to it. It reads the ini file only once, updates the content and saves it back. Here is a usage example:
Set-OrAddIniValue -FilePath "c:\yourinipath.ini" -keyValueList #{
UserName = "myName"
UserEmail = "myEmail"
UserNewField = "SeemsToWork"
}
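To make the update-vs-append behavior concrete: given a hypothetical ini file containing
UserName=oldName
SomeOtherKey=1
the call above would leave it as
UserName=myName
SomeOtherKey=1
UserEmail=myEmail
UserNewField=SeemsToWork
UserName is updated in place; UserEmail and UserNewField are appended because they did not exist yet.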