I have a Move-Item script which I only want to execute if the file does not contain just a single 0. I have thought about checking the file size for 0 KB/empty, but because of the value in there the file size is 1 KB.
Code tried:
$file = Get-Content "transfer\A28AP.txt"
$containsWord = $file | %{$_ -match "0"}
if ($containsWord -contains $true) {
Move-Item "transfer\A28AP.txt" -Destination "transfer\A28History\"
}
You can simplify your code by just using -match inside your if. I also corrected the regex:
$file = Get-Content "transfer\A28AP.txt" -Raw
# Trim the trailing newline that -Raw preserves, so the ^/$ anchors behave as intended;
# "$file" also guards against $null when the file is empty
if ("$file".Trim() -notmatch '^0$') {
    Move-Item "transfer\A28AP.txt" -Destination "transfer\A28History\"
}
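For context: when a file has more than one line, Get-Content returns an array of strings, and against an array -match acts as a filter rather than a Boolean test; that is why the original attempt collected per-line results and checked them with -contains. With -Raw you get a single string and a plain Boolean. A minimal sketch, using a hypothetical three-line file:
# Without -Raw on a multi-line file: -match filters the array of lines
$lines = Get-Content "transfer\sample.txt"      # e.g. @('a', '0', 'b')
$lines -match '0'                               # returns @('0'), not $true/$false

# With -Raw: one multi-line string; -match returns a Boolean
$text = Get-Content "transfer\sample.txt" -Raw
$text -match '0'                                # True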
I'm getting a memory exception while running this code. Is there a way to filter one file at a time, writing and appending the output after each file is processed? The code below seems to load everything into memory.
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Get-ChildItem $inputFolder -File -Filter '*.csv' |
ForEach-Object { Import-Csv $_.FullName } |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Maybe you can filter and export your files one by one, appending the result to your output file, like this:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Remove-Item $outputFile -Force -ErrorAction SilentlyContinue
Get-ChildItem $inputFolder -File -Filter "*.csv" | ForEach-Object {
    Import-Csv $_.FullName |
        Where-Object machine_type -eq 'workstations' |
        Export-Csv $outputFile -Append -NoTypeInformation
}
Note: The reason for not using Get-ChildItem ... | Import-Csv ... - i.e., for not piping Get-ChildItem directly to Import-Csv and instead calling Import-Csv from the script block ({ ... }) of an auxiliary ForEach-Object call - is a bug in Windows PowerShell that has since been fixed in PowerShell Core; see the bottom section for a more concise workaround.
However, even output from ForEach-Object script blocks should stream to the remaining pipeline commands, so you shouldn't run out of memory - after all, a salient feature of the PowerShell pipeline is object-by-object processing, which keeps memory use constant, irrespective of the size of the (streaming) input collection.
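A toy demonstration of that streaming behavior (not from the original thread): the producer below is effectively endless, yet only one object is in flight at a time, and Select-Object -First stops the upstream as soon as it has what it needs.
# The script block emits values one at a time; -First 3 halts the pipeline
# after three objects, so the endless loop never exhausts memory
& { while ($true) { Get-Random } } | ForEach-Object { $_ } | Select-Object -First 3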
You've since confirmed that avoiding the aux. ForEach-Object call does not solve the problem, so we still don't know what causes your out-of-memory exception.
Update:
This GitHub issue contains clues as to the reason for excessive memory use, especially with many properties that contain small amounts of data.
This GitHub feature request proposes using strongly typed output objects to help the issue.
The following workaround, which uses the switch statement to process the files as text files, may help:
$header = ''
Get-ChildItem $inputFolder -Filter *.csv | ForEach-Object {
    $i = 0
    switch -Wildcard -File $_.FullName {
        '*workstations*' {
            # NOTE: If no other columns contain the word `workstations`, you can
            #       simplify and speed up the command by omitting the `ConvertFrom-Csv` call
            #       (you can make the wildcard matching more robust with something
            #       like '*,workstations,*')
            if ((ConvertFrom-Csv "$header`n$_").machine_type -ne 'workstations') { continue }
            $_ # row whose 'machine_type' column value equals 'workstations'
        }
        default {
            if ($i++ -eq 0) {
                if ($header) { continue }   # header already written
                else { $header = $_; $_ }   # header row of 1st file
            }
        }
    }
} | Set-Content $outputFile
Here's a workaround for the bug of not being able to pipe Get-ChildItem output directly to Import-Csv, by passing it as an argument instead:
Import-Csv -LiteralPath (Get-ChildItem $inputFolder -File -Filter *.csv) |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Note that in PowerShell Core you could more naturally write:
Get-ChildItem $inputFolder -File -Filter *.csv | Import-Csv |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Solution 2:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8 # modify encoding if necessary
$Delimiter=','
#find header for your files => i take first row of first file with data
$Header = Get-ChildItem -Path $inputFolder -Filter *.csv | Where length -gt 0 | select -First 1 | Get-Content -TotalCount 1
#if not header founded then not file with sise >0 => we quit
if(! $Header) {return}
#create array for header
$HeaderArray=$Header -split $Delimiter -replace '"', ''
#open output file
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
#write header founded
$w.WriteLine($Header)
#loop on file csv
Get-ChildItem $inputFolder -File -Filter "*.csv" | %{
#open file for read
$r = New-Object System.IO.StreamReader($_.fullname, $encoding)
$skiprow = $true
while ($line = $r.ReadLine())
{
#exclude header
if ($skiprow)
{
$skiprow = $false
continue
}
#Get objet for current row with header founded
$Object=$line | ConvertFrom-Csv -Header $HeaderArray -Delimiter $Delimiter
#write in output file for your condition asked
if ($Object.machine_type -eq 'workstations') { $w.WriteLine($line) }
}
$r.Close()
$r.Dispose()
}
$w.close()
$w.Dispose()
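For reference, the core trick above is ConvertFrom-Csv with an explicit -Header, which turns a single data line into an object whose properties you can test. A tiny illustration with made-up values:
$HeaderArray = 'name', 'machine_type'
'pc042,workstations' | ConvertFrom-Csv -Header $HeaderArray -Delimiter ','
# name  machine_type
# ----  ------------
# pc042 workstations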
You have to read and write to the .csv files one row at a time, using StreamReader and StreamWriter:
$filepath = "C:\Change\2019\October"
$outputfile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8
$files = Get-ChildItem -Path $filePath -Filter *.csv |
Where-Object { $_.machine_type -eq 'workstations' }
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
$skiprow = $false
foreach ($file in $files)
{
$r = New-Object System.IO.StreamReader($file.fullname, $encoding)
while (($line = $r.ReadLine()) -ne $null)
{
if (!$skiprow)
{
$w.WriteLine($line)
}
$skiprow = $false
}
$r.Close()
$r.Dispose()
$skiprow = $true
}
$w.close()
$w.Dispose()
Get-Content *.csv | Add-Content combined.csv
Make sure combined.csv doesn't exist when you run this, or it's going to go full Ouroboros.
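A minimal guard, assuming the combined file lives in the same folder (and therefore matches *.csv):
# Delete any previous output first so it is not re-read as input
Remove-Item combined.csv -ErrorAction SilentlyContinue
Get-Content *.csv | Add-Content combined.csv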
What I want to do here is get the list of folders containing files in which the value of ErrorCode is greater than 0.
This is what I have done till now.
$fileNames = Get-ChildItem -Path $scriptPath -Recurse -Include *.data
$FoldersToRename = @() # initialize as array
foreach ($file in $fileNames) {
    If (Get-Content $file | %{$_ -match '"ErrorCode": 0'})
    {
        echo "matched"
    }
}
Now, the .data files being searched by this program each contain an object with an "ErrorCode": value entry. I want to perform some operations only if that value is greater than zero.
How do I solve this?
One way to do it is like this:
Get-ChildItem -Path $scriptPath -Filter *.data |
    ForEach-Object {
        if ((Get-Content -Path $_.FullName -Raw) -match '"ErrorCode": [1-9]\d*') {
            "Matched"
        }
    }
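If you also need the list of folders (the question's $FoldersToRename), one way to build it from the same test - a sketch reusing the answer's regex and the question's $scriptPath - is:
# Collect the distinct parent folders of all matching .data files
$FoldersToRename = Get-ChildItem -Path $scriptPath -Recurse -Include *.data |
    Where-Object { (Get-Content -Path $_.FullName -Raw) -match '"ErrorCode": [1-9]\d*' } |
    ForEach-Object { $_.Directory.FullName } |
    Sort-Object -Unique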
I have written the conditional script below to go through the files in the directory and replace one piece of text in all files, but only if the file contains the word 'Health'.
cd -Path "\\shlhfilprd08\Direct Credits\Temp2"
ForEach ($file in (Get-ChildItem -Path "\\shlhfilprd08\Direct Credits\Temp2"))
{
$filecontent = Get-Content -path $file -First 1
if($filecontent -like '*Health*'){$filecontent = $filecontent -replace 'TEACHERF','UniHlth '}
Set-Content $file.PSpath -Value $filecontent
}
I've come across two issues:
If $filecontent -like '*Health*' matches, the word is replaced in the first row, but all the other rows are deleted along with the replacement. I do not want that to happen.
I'm getting a "Set-Content to path ... is denied" error message for files whose content does not contain the 'Health' text.
Can you try this:
cd -Path "\\shlhfilprd08\Direct Credits\Temp2"
$configFiles = Get-ChildItem . *.config -rec
foreach ($file in $configFiles)
{
(Get-Content $file.PSPath) |
Foreach-Object { $_ -replace "TEACHERF", "UniHlth " } |
Set-Content $file.PSPath
}
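If you also want to honor the question's condition - only touching files whose first line contains 'Health' - here's a sketch along the same lines (same path and words as the question):
ForEach ($file in (Get-ChildItem -Path "\\shlhfilprd08\Direct Credits\Temp2" -File))
{
    # check only the first line for the 'Health' marker
    if ((Get-Content -Path $file.FullName -First 1) -like '*Health*')
    {
        # rewrite the whole file so the other rows are preserved
        (Get-Content -Path $file.FullName) |
            ForEach-Object { $_ -replace 'TEACHERF', 'UniHlth ' } |
            Set-Content -Path $file.FullName
    }
}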
I would try this; it worked for me on a small file
(make a small copy of a few of the data files into a new folder and test it there):
$path = "\\shlhfilprd08\Direct Credits\Temp2"
$replace = "TEACHERF" # word to be replaced
$by = "UniHlth "      # by this word (change $replace to $by)

Get-ChildItem $path -File | ForEach-Object {
    # rebuild the whole file so lines without a match are kept
    $newlines = foreach ($line in (Get-Content $_.FullName)) {
        if ($line -like "*$replace*") { $line.Replace($replace, $by) }
        else { $line }
    }
    Set-Content $_.FullName $newlines
}
I want to add an annotation to some .cs model files. If the script finds a specific property, it should put the annotation above that property.
Here is the script:
$annotation = "[DatabaseGenerated(DatabaseGeneratedOption.Computed)]"
Get-ChildItem -Filter *.cs | % {
(Get-Content $_.FullName) | ForEach-Object {
if ($_ -match "StartDateTime") {
$_ -replace $_ , "`n`t`t$annotation`n$_"
}
} | Set-Content $_.FullName
}
The replacing works, but at the end I get a file containing only two lines (the annotation and the property). I realize that the last pipeline segment, Set-Content $_.FullName, is messed up.
If I remove Set-Content, nothing happens to my file (it's not updated)?
This should work better for you:
$filePath = '<YOUR PATH HERE>'
$annotation = "[DatabaseGenerated(DatabaseGeneratedOption.Computed)]"
Get-ChildItem -Path $filePath -Filter *.cs | ForEach-Object {
    $file = $_.FullName
    (Get-Content $file) | ForEach-Object {
        # test all strings in $file
        if ($_ -match "StartDateTime") {
            # emit the annotation followed by the string itself
            "`r`n`t`t$annotation`r`n" + $_
        }
        else {
            # just output the line as-is
            $_
        }
    } | Set-Content -Path $file -Force
}
Within the ForEach-Object I'm capturing $_.FullName for later use, and also so it doesn't get confused with the $_ you use later on as the current line in the file.
Then, if the line matches the if condition, output the annotated line; if it does not (in the else), output the line unchanged.
That way, Set-Content always receives every line, annotated or not.
Since you actually are not replacing anything inside the string, but rather prefixing it with an annotation, this can be simplified a bit like so:
$annotation = "[DatabaseGenerated(DatabaseGeneratedOption.Computed)]"
Get-ChildItem -Path 'D:\' -Filter *.cs | ForEach-Object {
    $file = $_.FullName
    (Get-Content $file) | ForEach-Object {
        # test all strings in $file
        if ($_ -match "StartDateTime") {
            # emit the annotation
            "`r`n`t`t$annotation"
        }
        # output the line as-is
        $_
    } | Set-Content -Path $file -Force
}
I have a list of strings in a CSV file. The format is:
OldValue,NewValue
223134,875621
321321,876330
....
and the file contains a few hundred rows (each OldValue is unique). I need to process changes across a number of text files in a number of folders and subfolders. My best guess at the number of folders, files, and lines of text is: 15 folders, around 150 text files in each folder, with approximately 65,000 lines of text per folder (between 400-500 lines per text file).
I will make two passes at the data, unless I can do it in one. The first pass generates a text file I will use as a checklist to review my changes. The second pass actually makes the changes in the files. Also, I only want to change the text files where the string occurs (not every file).
I'm using the following Powershell script to go through the files & produce a list of the changes needed. The script runs, but is beyond slow. I haven't worked on the replace logic yet, but I assume it will be similar to what I've got.
# replace a string in a file with powershell
[reflection.assembly]::loadwithpartialname("Microsoft.VisualBasic") | Out-Null

Function Search {
    # Parameters $Path and $SearchString
    param ([Parameter(Mandatory=$true, ValueFromPipeline = $true)][string]$Path,
           [Parameter(Mandatory=$true)][string]$SearchString
    )
    try {
        # .NET FindInFiles method to look for the file
        [Microsoft.VisualBasic.FileIO.FileSystem]::GetFiles(
            $Path,
            [Microsoft.VisualBasic.FileIO.SearchOption]::SearchAllSubDirectories,
            $SearchString
        )
    } catch { $_ }
}

if (Test-Path "C:\Work\ListofAllFilenamesToSearch.txt") { # if file exists
    Remove-Item "C:\Work\ListofAllFilenamesToSearch.txt"
}
if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}

$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames1 = Search $filefolder1 $ftype
$filenames1 | Out-File "C:\Work\ListofAllFilenamesToSearch.txt" -Width 2000

if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}

(Get-Content "C:\Work\NumberXrefList.CSV" | where {$_.readcount -gt 1}) | foreach {
    $OldFieldValue, $NewFieldValue = $_.Split("|")
    $filenamelist = (Get-Content "C:\Work\ListofAllFilenamesToSearch.txt" -ReadCount 5) #|
    foreach ($j in $filenamelist) {
        #$testvar = (Get-Content $j )
        #$testvar = (Get-Content $j -ReadCount 100)
        $testvar = (Get-Content $j -Delimiter "\n")
        foreach ($i in $testvar)
        {
            if ($i -imatch $OldFieldValue) {
                $j + "|" + $OldFieldValue + "|" + $NewFieldValue | Out-File "C:\Work\FilesThatNeedToBeChanged.txt" -Width 2000 -Append
            }
        }
    }
}

$FileFolder = (Get-Content "C:\Work\FilesThatNeedToBeChanged.txt" -ReadCount 5)
Get-ChildItem $FileFolder -Recurse |
    select -ExpandProperty fullname |
    foreach {
        if (Select-String -Path $_ -SimpleMatch $OldFieldValue -Debug -Quiet) {
            (Get-Content $_) |
                ForEach-Object { $_ -replace $OldFieldValue, $NewFieldValue } |
                Set-Content $_ -WhatIf
        }
    }
In the code above, I've tried several things with Get-Content - default, with -ReadCount, and -Delimiter - in an attempt to avoid an out of memory error.
The only thing I have control over is the length of the old & new replacement strings file. Is there a way to do this in Powershell? Is there a better option/solution? I'm running Windows 7, Powershell version 3.0.
Your main problem is that you're reading each file over and over again to apply each of the replacement terms. You need to invert the loops: iterate over the files on the outside and over the replacement terms on the inside. Also, pre-load the CSV. Something like:
$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames = gci -Path $filefolder1 -Filter $ftype -Recurse
$replaceValues = Import-Csv -Path "C:\Work\NumberXrefList.CSV"
foreach ($file in $filenames) {
$contents = Get-Content -Path $file
foreach ($replaceValue in $replaceValues) {
$contents = $contents -replace $replaceValue.OldValue, $replaceValue.NewValue
}
Copy-Item $file "$file.old"
Set-Content -Path $file -Value $contents
}
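One caveat about the inner loop: -replace treats OldValue as a regular expression. With purely numeric values that's harmless, but if the list ever contains regex metacharacters, escaping the search term is safer - a hedged variant of the inner loop:
foreach ($replaceValue in $replaceValues) {
    # [regex]::Escape makes the search term match literally
    $contents = $contents -replace [regex]::Escape($replaceValue.OldValue), $replaceValue.NewValue
}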