Count characters for each line - PowerShell

I am new to Windows PowerShell. Would you be so kind as to give me some code or pointers on how to write a program that does the following for all *.txt files in a folder:
1. Count the characters in each line of the file
2. If the length of any line exceeds 1024 characters, create a subfolder within that folder and move the file there (that is how I will know which files have lines over 1024 characters)
I've tried this through VB and VBA (which is more familiar to me), but I want to learn some new cool stuff!
Many thanks!
Edit: I found a partial piece of code to start from:
$fileDirectory = "E:\files";
foreach($file in Get-ChildItem $fileDirectory)
{
# Processing code goes here
}
OR
$fileDirectory = "E:\files";
foreach($line in Get-ChildItem $fileDirectory)
{
if($line.length -gt 1023){# mkdir and mv to subfolder!}
}

If you are willing to learn, why not start here.
You can use the Get-Content cmdlet in PowerShell to inspect the contents of your files. See http://blogs.technet.com/b/heyscriptingguy/archive/2013/07/06/powertip-counting-characters-with-powershell.aspx and Getting character count for each row in text doc
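For example, a minimal sketch of counting characters per line (assuming $fileDirectory points at your folder of *.txt files; adjust the path to yours):
$fileDirectory = "E:\files"
Get-ChildItem $fileDirectory -Filter *.txt | ForEach-Object {
    $name = $_.Name
    # Get-Content returns one string per line; Length is that line's character count
    Get-Content $_.FullName | ForEach-Object {
        "{0}: {1} characters" -f $name, $_.Length
    }
}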

With your second edit I did see some effort so I would like to help you.
$path = "D:\temp"
$lengthToNotExceed = 1024
$longFiles = Get-ChildItem -path -File |
Where-Object {(Get-Content($_.Fullname) | Measure-Object -Maximum Length | Select-Object -ExpandProperty Maximum) -ge $lengthToNotExceed}
$longFiles | ForEach-Object{
$target = "$($_.Directory)\$lengthToNotExceed\"
If(!(Test-Path $target)){New-Item $target -ItemType Directory -Force | Out-Null}
Move-Item $_.FullName -Destination $target
}
You could make this a one-liner, but it would be unnecessarily complicated. We use Measure-Object on the array returned by Get-Content, which is, more or less, a string array; in PowerShell every string has a Length property, which we query.
That returns the maximum line length in the file, and Where-Object filters only those files whose maximum meets the threshold.
Then for each matched file we move it to a subdirectory in the same location as the file, creating the subfolder first if it does not exist.
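For illustration only, the same logic squeezed toward a one-liner might look roughly like this (an untested sketch, reusing the same $path and $lengthToNotExceed):
Get-ChildItem -Path $path -File |
    Where-Object { (Get-Content $_.FullName | Measure-Object -Maximum Length).Maximum -ge $lengthToNotExceed } |
    ForEach-Object { $t = "$($_.Directory)\$lengthToNotExceed"; New-Item $t -ItemType Directory -Force | Out-Null; Move-Item $_.FullName -Destination $t }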
Caveats:
You need at least PowerShell 3.0 for the -File switch. In place of it you can add another clause to the Where-Object filter: -not $_.PSIsContainer (see the sketch after these caveats).
This would perform poorly on files with a large number of lines.
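A minimal sketch of that PowerShell 2.0 fallback, assuming the same $path and $lengthToNotExceed as above:
$longFiles = Get-ChildItem -Path $path |
    Where-Object { -not $_.PSIsContainer -and
        (Get-Content $_.FullName | Measure-Object -Maximum Length | Select-Object -ExpandProperty Maximum) -ge $lengthToNotExceed }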

Here's my comment above indented and line broken in .ps1 script form.
$long = @()
foreach ($file in gci *.txt) {
    $f = 0
    gc $file | %{
        if ($_.length -ge 1024) {
            if (-not($f)) {
                $f = 1
                $long += $file
            }
        }
    }
}
$long | %{
    $dest = @($_.DirectoryName, '\test') -join ''
    [void](ni -type dir $dest -force)
    mv $_ -dest (@($dest, '\', $_.Name) -join '') -force
}
I was also mentioning labels and breaks there. Rather than $f=0 and if (-not($f)), you can break out of the inner loop with break like this:
$long = @()
foreach ($file in gci *.txt) {
    :inner foreach ($line in gc $file) {
        if ($line.length -ge 1024) {
            $long += $file
            break inner
        }
    }
}
$long | %{
    $dest = @($_.DirectoryName, '\test') -join ''
    [void](ni -type dir $dest -force)
    mv $_ -dest (@($dest, '\', $_.Name) -join '') -force
}
Did you happen to notice the two different ways of calling foreach? There's the verbose foreach statement, and then there's command | %{}, where the iterated item is represented by $_.
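For example, these two lines print the same file names; the first uses the foreach statement, the second pipes into ForEach-Object (%), where $_ stands for the current item:
foreach ($file in gci *.txt) { $file.Name }
gci *.txt | %{ $_.Name }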


How to parse through folders and files using PowerShell?

I am trying to construct a script that moves through specific folders and the log files in it, and filters the error codes. After that it passes them into a new file.
I'm not really sure how to do that with for loops, so I'll leave my code below.
If someone could tell me what I'm doing wrong, that would be greatly appreciated.
$file_name = Read-Host -Prompt 'Name of the new file: '
$path = 'C:\Users\user\Power\log_script\logs'
Add-Type -AssemblyName System.IO.Compression.FileSystem
function Unzip
{
    param([string]$zipfile, [string]$outpath)
    [System.IO.Compression.ZipFile]::ExtractToDirectory($zipfile, $outpath)
}
if ([System.IO.File]::Exists($path)) {
    Remove-Item $path
    Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
} else {
    Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
}
$folder = Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles'
$files = foreach($logfolder in $folder) {
    $content = foreach($line in $files) {
        if ($line -match '([ ][4-5][0-5][0-9][ ])') {
            echo $line
        }
    }
}
$content | Out-File $file_name -Force -Encoding ascii
Inside the LogFiles folder are three more folders each containing log files.
Thanks
Expanding on a comment above about recursing the folder structure, and then actually retrieving the content of the files, you could try something like this:
$allFiles = Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' -Recurse
# iterate the files
$allFiles | ForEach-Object {
    # iterate the content of each file, line by line
    Get-Content $_ | ForEach-Object {
        if ($_ -match '([ ][4-5][0-5][0-9][ ])') {
            echo $_
        }
    }
}
It looks like your inner loop is of a collection ($files) that doesn't yet exist. You assign $files to the output of a ForEach(...) loop then try to nest another loop of $files inside it. Of course at this point $files isn't available to be looped.
Regardless, the issue is you are never reading the content of your log files. Even if you managed to loop through the output of Get-ChildItem, you need to look at each line to perform the match.
Obviously I cannot completely test this, but I see a few issues and have rewritten as below:
$file_name = Read-Host -Prompt 'Name of the new file'
$path = 'C:\Users\user\Power\log_script\logs'
$Pattern = '([ ][4-5][0-5][0-9][ ])'
if ( [System.IO.File]::Exists( $path ) ) { Remove-Item $path }
Expand-Archive 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
Select-String -Path 'C:\Users\user\Power\log_script\logs\LogFiles\*' -Pattern $Pattern |
Select-Object -ExpandProperty line |
Out-File $file_name -Force -Encoding ascii
Note: Select-String cannot recurse on its own.
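One way around that (a sketch based on the paths above, assuming the logs sit in nested folders) is to let Get-ChildItem do the recursion and pipe the file list into Select-String:
Get-ChildItem 'C:\Users\user\Power\log_script\logs\LogFiles' -File -Recurse |
    Select-String -Pattern $Pattern |
    Select-Object -ExpandProperty Line |
    Out-File $file_name -Force -Encoding ascii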
I'm not sure you need to write your own UnZip function. PowerShell has the Expand-Archive cmdlet which can at least match the functionality thus far:
Expand-Archive -Path <SourceZipPath> -DestinationPath <DestinationFolder>
Note: The -Force parameter allows it to overwrite the destination files if they are already present, which may be a substitute for testing whether the file exists and deleting it if it does.
If you are going to test for the file that section of code can be simplified as:
if ( [System.IO.File]::Exists( $path ) ) { Remove-Item $path }
Unzip 'C:\Users\user\Power\log_script\logs.zip' 'C:\Users\user\Power\log_script'
This is because you were going to run the UnZip command regardless...
Note: You could also use Test-Path for this.
Also, there are numerous ways to get the matching lines; here are a couple of extra samples:
Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' |
ForEach-Object{
( Get-Content $_.FullName ) -match $Pattern
# Using match in this way will echo the lines that matched from each run of
# Get-Content. If nothing matched nothing will output on that iteration.
} |
Out-File $file_name -Force -Encoding ascii
This approach will read the entire file into an array before running the match on it. For large files it may pose a memory issue, but it enables the clever use of -match.
OR:
Get-ChildItem -Path 'C:\Users\user\Power\log_script\logs\LogFiles' |
Get-Content |
ForEach-Object{ If( $_ -match $Pattern ) { $_ } } |
Out-File $file_name -Force -Encoding ascii
Note: You don't need the alias echo or its real cmdlet Write-Output
UPDATE: After fuzzing around a bit and trying different things I finally got it to work.
I'll include the code below just for demonstration purposes.
Thanks everyone
$start = Get-Date
"`n$start`n"
$file_name = Read-Host -Prompt 'Name of the new file: '
Out-File $file_name -Force -Encoding ascii
Expand-Archive -Path 'C:\Users\User\Power\log_script\logs.zip' -Force
$i = 1
$folders = Get-ChildItem -Path 'C:\Users\User\Power\log_script\logs\logs\LogFiles' -Name -Recurse -Include *.log
foreach($item in $folders) {
    $files = 'C:\Users\User\Power\log_script\logs\logs\LogFiles\' + $item
    foreach($file in $files){
        $content = Get-Content $file
        Write-Progress -Activity "Filtering..." -Status "File $i of $($folders.Count)" -PercentComplete (($i / $folders.Count) * 100)
        $i++
        $output = foreach($line in $content) {
            if ($line -match '([ ][4-5][0-5][0-9][ ])') {
                Add-Content -Path $file_name -Value $line
            }
        }
    }
}
$end = Get-Date
$time = [int]($end - $start).TotalSeconds
Write-Output ("Runtime: " + $time + " Seconds" -join ' ')

Using Powershell to replace multiple strings in multiple files & folders

I have a list of strings in a CSV file. The format is:
OldValue,NewValue
223134,875621
321321,876330
....
and the file contains a few hundred rows (each OldValue is unique). I need to process changes over a number of text files in a number of folders & subfolders. My best guess of the number of folders, files, and lines of text are - 15 folders, around 150 text files in each folder, with approximately 65,000 lines of text in each folder (between 400-500 lines per text file).
I will make 2 passes at the data, unless I can do it in one. First pass is to generate a text file I will use as a check list to review my changes. Second pass is to actually make the change in the file. Also, I only want to change the text files where the string occurs (not every file).
I'm using the following Powershell script to go through the files & produce a list of the changes needed. The script runs, but is beyond slow. I haven't worked on the replace logic yet, but I assume it will be similar to what I've got.
# replace a string in a file with powershell
[reflection.assembly]::loadwithpartialname("Microsoft.VisualBasic") | Out-Null
Function Search {
    # Parameters $Path and $SearchString
    param ([Parameter(Mandatory=$true, ValueFromPipeline = $true)][string]$Path,
           [Parameter(Mandatory=$true)][string]$SearchString
    )
    try {
        #.NET FindInFiles Method to Look for file
        [Microsoft.VisualBasic.FileIO.FileSystem]::GetFiles(
            $Path,
            [Microsoft.VisualBasic.FileIO.SearchOption]::SearchAllSubDirectories,
            $SearchString
        )
    } catch { $_ }
}
if (Test-Path "C:\Work\ListofAllFilenamesToSearch.txt") { # if file exists
    Remove-Item "C:\Work\ListofAllFilenamesToSearch.txt"
}
if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}
$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames1 = Search $filefolder1 $ftype
$filenames1 | Out-File "C:\Work\ListofAllFilenamesToSearch.txt" -Width 2000
if (Test-Path "C:\Work\FilesThatNeedToBeChanged.txt") { # if file exists
    Remove-Item "C:\Work\FilesThatNeedToBeChanged.txt"
}
(Get-Content "C:\Work\NumberXrefList.CSV" | where {$_.readcount -gt 1}) | foreach{
    $OldFieldValue, $NewFieldValue = $_.Split("|")
    $filenamelist = (Get-Content "C:\Work\ListofAllFilenamesToSearch.txt" -ReadCount 5) #|
    foreach ($j in $filenamelist) {
        #$testvar = (Get-Content $j )
        #$testvar = (Get-Content $j -ReadCount 100)
        $testvar = (Get-Content $j -Delimiter "\n")
        Foreach ($i in $testvar)
        {
            if ($i -imatch $OldFieldValue) {
                $j + "|" + $OldFieldValue + "|" + $NewFieldValue | Out-File "C:\Work\FilesThatNeedToBeChanged.txt" -Width 2000 -Append
            }
        }
    }
}
$FileFolder = (Get-Content "C:\Work\FilesThatNeedToBeChanged.txt" -ReadCount 5)
Get-ChildItem $FileFolder -Recurse |
    select -ExpandProperty fullname |
    foreach {
        if (Select-String -Path $_ -SimpleMatch $OldFieldValue -Debug -Quiet) {
            (Get-Content $_) |
                ForEach-Object {$_ -replace $OldFieldValue, $NewFieldValue }|
                Set-Content $_ -WhatIf
        }
    }
In the code above, I've tried several things with Get-Content - default, with -ReadCount, and -Delimiter - in an attempt to avoid an out of memory error.
The only thing I have control over is the length of the old & new replacement strings file. Is there a way to do this in Powershell? Is there a better option/solution? I'm running Windows 7, Powershell version 3.0.
Your main problem is that you're reading the file over and over again to change each of the terms. You need to invert the looping of the replace terms and looping of the files. Also, pre-load the csv. Something like:
$filefolder1 = "C:\TestFolder\WorkFiles"
$ftype = "*.txt"
$filenames = gci -Path $filefolder1 -Filter $ftype -Recurse
$replaceValues = Import-Csv -Path "C:\Work\NumberXrefList.CSV"
foreach ($file in $filenames) {
    $contents = Get-Content -Path $file
    foreach ($replaceValue in $replaceValues) {
        $contents = $contents -replace $replaceValue.OldValue, $replaceValue.NewValue
    }
    Copy-Item $file "$file.old"
    Set-Content -Path $file -Value $contents
}

Powershell Script to search for credit card numbers in a folder

I am using the below script to search for credit card numbers inside a folder that contains many subfolders:
Get-ChildItem -rec | ?{ findstr.exe /mprc:. $_.FullName }
| select-string "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}"
However, this will return all instances found in every folder/subfolder.
How can I amend the script to skip the current folder on the first instance found? Meaning that if it finds a credit card number it will stop processing the current folder and move on to the next folder.
Appreciate your answers and help.
Thanks in advance,
You could use this recursive function:
function cards ($dir) {
    Get-ChildItem -Directory $dir | % { cards($_.FullName) }
    Get-ChildItem -File $dir\* | % {
        if ( Select-String -Path $_.FullName -Pattern "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}" ) {
            write-host "card found in $dir"
            return
        }
    }
}
cards "C:\path\to\base\dir"
It'll keep going through subdirectories of the top-level directory you specify. Whenever it gets to a directory with no subdirectories, or it's been through all the subdirectories of the current directory, it'll start looking through the files for the matching regex, but it will bail out of the function when the first match is found.
So really what you want is the first file in every folder that has a credit card number in the contents.
Break it into two parts. Get a list of all your folders, recursively. Then, for each folder, get the list of files, non-recursively. Search each file until you find one that matches.
I don't see any easy way to do this with pipes alone. That means more traditional programming techniques.
This requires PowerShell 3.0. I've eliminated ?{ findstr.exe /mprc:. $_.FullName } because all I can see that it does is eliminate folders (and zero length files) and this already handles that.
Get-ChildItem -Directory -Recurse | ForEach-Object {
    $Found = $false;
    $i = 0;
    $Files = $_ | Get-ChildItem -File | Sort-Object -Property Name;
    for ($i = 0; ($Files[$i] -ne $null) -and ($Found -eq $false); $i++) {
        $SearchResult = $Files[$i] | Select-String "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}";
        if ($SearchResult) {
            $Found = $true;
            Write-Output $SearchResult;
        }
    }
}
Didn't have the time to test it fully, but I thought about something like this:
$Location = 'H:\'
$Dirs = Get-ChildItem $Location -Directory -Recurse
$Regex1 = "[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}"
$Regex2 = "[456][0-9]{15}"
Foreach ($d in $Dirs) {
    $Files = Get-ChildItem $d.FullName -File
    foreach ($f in $Files) {
        if (($f.Name -match $Regex1) -or ($f.Name -match $Regex2)) {
            Write-Host 'Match found'
            Return
        }
    }
}
Here is another one, why not, the more the merrier.
I'm assuming that your Regex is correct.
Using break in the second loop will skip looking for a credit card in the remaining files if one is found and continue to the next folder.
$path = '<your path here>'
$folders = Get-ChildItem $path -Directory -rec
foreach ($folder in $folders)
{
    $items = Get-ChildItem $folder.fullname -File
    foreach ($i in $items)
    {
        if (($found = $i.FullName | select-string "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}") -ne $null)
        {
            break
        }
    }
}
I think the intention was to look inside each file for the PII data, right?
If so, you need to load the file and search each line, as in the sketch below. The code you posted will only run the regex on the name of the file.
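A rough sketch of the same folder-skipping loop, but matching on file contents via Select-String instead of on the name (regex kept exactly as posted):
$path = '<your path here>'
foreach ($folder in Get-ChildItem $path -Directory -Recurse)
{
    foreach ($file in Get-ChildItem $folder.FullName -File)
    {
        if (Select-String -Path $file.FullName -Pattern "[456][0-9]{15}","[456][0-9]{3}[-| ][0-9]{4} [-| ][0-9]{4}[-| ][0-9]{4}")
        {
            Write-Host "Match found in $($folder.FullName)"
            # first hit in this folder is enough; move on to the next folder
            break
        }
    }
}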

Recursively count files in subfolders

I am trying to count the files in all subfolders in a directory and display them in a list.
For instance the following dirtree:
TEST
  /VOL01
    file.txt
    file.pic
  /VOL02
    /VOL0201
      file.nu
      /VOL020101
        file.jpg
        file.erp
        file.gif
  /VOL03
    /VOL0301
      file.org
Should give as output:
PS> DirX C:\TEST
Directory                    Count
----------------------------------
VOL01                            2
VOL02                            0
VOL02/VOL0201                    1
VOL02/VOL0201/VOL020101          3
VOL03                            0
VOL03/VOL0301                    1
I started with the following:
Function DirX($directory)
{
    foreach ($file in Get-ChildItem $directory -Recurse)
    {
        Write-Host $file
    }
}
Now I have a question: why is my Function not recursing?
Something like this should work:
dir -recurse | ?{ $_.PSIsContainer } | %{ Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count }
dir -recurse lists all files under current directory and pipes (|) the result to
?{ $_.PSIsContainer } which filters directories only then pipes again the resulting list to
%{ Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count } which is a foreach loop that, for each member of the list ($_) displays the full name and the result of the following expression
(dir $_.FullName | Measure-Object).Count which provides a list of files under the $_.FullName path and counts members through Measure-Object
?{ ... } is an alias for Where-Object
%{ ... } is an alias for ForEach-Object
Similar to David's solution, this will work in PowerShell v3.0 and does not use aliases, in case someone is not familiar with them:
Get-ChildItem -Directory | ForEach-Object { Write-Host $_.FullName $(Get-ChildItem $_ | Measure-Object).Count}
Answer Supplement
Based on a comment about keeping with your function and loop structure, I provide the following. Note: I do not condone this solution, as it is ugly and the built-in cmdlets handle this very well. However, I like to help, so here is an update of your script.
Function DirX($directory)
{
    $output = @{}
    foreach ($singleDirectory in (Get-ChildItem $directory -Recurse -Directory))
    {
        $count = 0
        foreach($singleFile in Get-ChildItem $singleDirectory.FullName)
        {
            $count++
        }
        $output.Add($singleDirectory.FullName,$count)
    }
    $output | Out-String
}
For each $singleDirectory we count all files using $count (which gets reset before the next sub-loop) and record each finding in a hash table. At the end we output the hash table as a string. Your question looked like you wanted object output instead of straight text.
Well, the way you are doing it the entire Get-ChildItem cmdlet needs to complete before the foreach loop can begin iterating. Are you sure you're waiting long enough? If you run that against very large directories (like C:) it is going to take a pretty long time.
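As a quick illustration of the difference, a pipeline streams items as Get-ChildItem emits them, so something like this starts printing immediately instead of waiting for the whole recursive scan to finish:
Get-ChildItem $directory -Recurse | ForEach-Object { Write-Host $_.FullName }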
Edit: saw you asked earlier for a way to make your function do what you are asking, here you go.
Function DirX($directory)
{
    foreach ($file in Get-ChildItem $directory -Recurse -Directory )
    {
        [pscustomobject] @{
            'Directory' = $File.FullName
            'Count' = (GCI $File.FullName -Recurse).Count
        }
    }
}
DirX D:\
The foreach loop only gets directories, since that is all we care about; then inside of the loop a custom object is created for each iteration with the full path of the folder and the count of the items inside of the folder.
Also, please note that this will only work in PowerShell 3.0 or newer, since the -Directory parameter did not exist in 2.0.
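If you are stuck on 2.0, an equivalent filter (a small sketch) would be:
# PowerShell 2.0 stand-in for Get-ChildItem -Recurse -Directory
Get-ChildItem $directory -Recurse | Where-Object { $_.PSIsContainer }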
Get-ChildItem $rootFolder `
    -Recurse -Directory |
    Select-Object `
        FullName, `
        @{Name="FileCount";Expression={(Get-ChildItem $_ -File |
            Measure-Object).Count }}
My version - slightly cleaner and dumps content to a file
Original - Recursively count files in subfolders
Second Component - Count items in a folder with PowerShell
$FOLDER_ROOT = "F:\"
$OUTPUT_LOCATION = "F:DLS\OUT.txt"
Function DirX($directory)
{
    Remove-Item $OUTPUT_LOCATION
    foreach ($singleDirectory in (Get-ChildItem $directory -Recurse -Directory))
    {
        $count = Get-ChildItem $singleDirectory.FullName -File | Measure-Object | %{$_.Count}
        $summary = $singleDirectory.FullName+" "+$count+" "+$singleDirectory.LastAccessTime
        Add-Content $OUTPUT_LOCATION $summary
    }
}
DirX($FOLDER_ROOT)
I modified David Brabant's solution just a bit so I could evaluate the result:
$FileCounter=gci "$BaseDir" -recurse | ?{ $_.PSIsContainer } | %{ (gci "$($_.FullName)" | Measure-Object).Count }
Write-Host "File Count=$FileCounter"
If($FileCounter -gt 0) {
... take some action...
}

Loop through files in a directory using PowerShell

How can I change the following code to look at all the .log files in the directory and not just the one file?
I need to loop through all the files and delete all lines that do not contain "step4" or "step9". Currently this will create a new file, but I'm not sure how to use the for each loop here (newbie).
The actual files are named like this: 2013 09 03 00_01_29.log. I'd like the output files to either overwrite them, or to have the SAME name, appended with "out".
$In = "C:\Users\gerhardl\Documents\My Received Files\Test_In.log"
$Out = "C:\Users\gerhardl\Documents\My Received Files\Test_Out.log"
$Files = "C:\Users\gerhardl\Documents\My Received Files\"
Get-Content $In | Where-Object {$_ -match 'step4' -or $_ -match 'step9'} | `
Set-Content $Out
Give this a try:
Get-ChildItem "C:\Users\gerhardl\Documents\My Received Files" -Filter *.log |
Foreach-Object {
$content = Get-Content $_.FullName
#filter and save content to the original file
$content | Where-Object {$_ -match 'step[49]'} | Set-Content $_.FullName
#filter and save content to a new file
$content | Where-Object {$_ -match 'step[49]'} | Set-Content ($_.BaseName + '_out.log')
}
To get the content of a directory you can use
$files = Get-ChildItem "C:\Users\gerhardl\Documents\My Received Files\"
Then you can loop over this variable as well:
for ($i=0; $i -lt $files.Count; $i++) {
    $outfile = $files[$i].FullName + "out"
    Get-Content $files[$i].FullName | Where-Object { ($_ -match 'step4' -or $_ -match 'step9') } | Set-Content $outfile
}
An even easier way to put this is the foreach loop (thanks to @Soapy and @MarkSchultheiss):
foreach ($f in $files){
    $outfile = $f.FullName + "out"
    Get-Content $f.FullName | Where-Object { ($_ -match 'step4' -or $_ -match 'step9') } | Set-Content $outfile
}
If you need to loop through a directory recursively for a particular kind of file, use the command below, which filters for all files of the .doc file type:
$fileNames = Get-ChildItem -Path $scriptPath -Recurse -Include *.doc
If you need to filter on multiple types, use the command below.
$fileNames = Get-ChildItem -Path $scriptPath -Recurse -Include *.doc,*.pdf
Now the $fileNames variable acts as an array that you can loop over and apply your business logic to, as in the sketch below.
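For instance, a minimal sketch (the loop body is a placeholder for your own logic):
foreach ($file in $fileNames) {
    # replace this with your business logic
    Write-Host "Processing $($file.FullName)"
}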
Other answers are great, I just want to add... a different approach usable in PowerShell:
Install GNUWin32 utils and use grep to view the lines / redirect the output to file http://gnuwin32.sourceforge.net/
This overwrites the new file every time:
grep "step[49]" logIn.log > logOut.log
This appends the log output, in case you overwrite the logIn file and want to keep the data:
grep "step[49]" logIn.log >> logOut.log
Note: to be able to use GNUWin32 utils globally you have to add the bin folder to your system path.
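For example, to make grep available in the current session only (the install folder below is a typical default; adjust it to wherever you installed GnuWin32):
$env:Path += ';C:\Program Files (x86)\GnuWin32\bin'
grep --version   # should now resolve from any folder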