I currently have a CSV file that has 2,440 lines of data. The data looks something like:
server1:NT:Y:N:N:00:N
server2:NT:Y:N:n:33:N
This is what I have so far:
$newCsvPath = Get-Content .\server.csv |
    Where-Object { $_ -notmatch '^#|^$|^"#' }
[int]$windows = 0
[int]$totalServers = 0
$Results = @()
$date = Get-Date -Format g
Clear-Content .\results.csv -Force
foreach ($thing in $newCsvPath) {
    $totalServers++
    $split = $thing -split ":"
    if ($split[1] -eq "NT") {   # -eq rather than -contains: the field is a single string
        $windows++
        $thing | Out-File results.csv -Append -Force
    } else {
        continue
    }
}
Clear-Content .\real.csv -Force
$servers = Get-Content results.csv
foreach ($server in $servers) {
    $server.Split(':')[0] | Out-File real.csv -Append -Force
}
My issue is that when the script gets to the $server.Split(':')[0] | Out-File real.csv -Append -Force part, it only outputs 1,264 lines to "real.csv" instead of all 2,440. However, when I remove | Out-File real.csv -Append -Force, $server holds all 2,440 server names.
Does anyone have any idea of why this is happening?
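For what it's worth, here is a minimal single-pass sketch (my own, assuming the same input format as above) that avoids the intermediate results.csv round-trip and writes each output file once instead of appending per line:
$lines = Get-Content .\server.csv | Where-Object { $_ -notmatch '^#|^$|^"#' }
$ntRows = $lines | Where-Object { ($_ -split ':')[1] -eq 'NT' }
# one write per file instead of one append per line
$ntRows | Set-Content .\results.csv
$ntRows | ForEach-Object { ($_ -split ':')[0] } | Set-Content .\real.csv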
I'm trying to automate gci to work on each row of a config file, where each row has the path as the first column, followed by a list of files. Something like this:
C:\Users\*\AppData\Roaming\* *.dll
C:\Test file.txt,file2.txt
This means that gci will search for:
*.dll in C:\Users\*\AppData\Roaming\*
file.txt in C:\Test
file2.txt in C:\Test
To do this, I'm building the Where condition dynamically in the script below. Here's the PowerShell script I'm using:
foreach ($line in Get-Content .\List.txt) {
    try {
        $path,$files = $line.split(' ')
        $files = $files.split(',')
    }
    catch {
        $path = $line
        $files = "*.*"
    }
    if ([string]::IsNullOrEmpty($files)) {
        $files = "*.*"
    }
    $filter = $files -join(" -or `$_.Name -like ")
    $filter = "`$_.Name -like " + $filter
    echo "Searching Path: $path, Pattern: $filter" | out-file -append -encoding ASCII -filepath .\result.txt
    if ($path.Contains("*"))
    {
        gci -Path $path -Recurse | Where {$filter} | Select -ExpandProperty FullName | Out-String -Width 2048 | out-file -append -encoding UTF8 -filepath .\result.txt
    }
    else
    {
        gci -Path $path | Where {$filter} | Select -ExpandProperty FullName | Out-String -Width 2048 | out-file -append -encoding UTF8 -filepath .\result.txt
    }
}
The problem is that the Where filter is ignored; all files are returned.
First attempt, as suggested to me:
foreach ($line in Get-Content .\List.txt) {
    try {
        $path,$files = $line.split(' ')
        $files = $files.split(',')
    }
    catch {
        $path = $line
        $files = "*.*"
    }
    if ([string]::IsNullOrEmpty($files)) {
        $files = "*.*"
    }
    $filter = $files -join(" -or `$_.Name -like ")
    $filter = "`$_.Name -like " + $filter
    $gciParams = @{
        Path    = $Path
        Recurse = $Path.Contains('*')
    }
    "Searching Path: $path, Pattern(s): [$($files -join ',')]" | Add-Content -Path .\result.txt -Encoding ASCII
    Get-ChildItem @gciParams | Where $filter | Select -ExpandProperty FullName | Add-Content -Path .\result.txt -Encoding UTF8
}
If you want to create a piece of code and defer execution of it until later, you need a Script Block.
A Script Block literal in PowerShell is just {}, so to construct a script block that filters on a single comparison, you'd define $filter like this (here $pattern would hold a single wildcard pattern):
$filter = {$_.Name -like $pattern}
At which point you can pass it directly as an argument to Where-Object:
Get-ChildItem $path |Where-Object $filter
... but since you want to test against multiple wildcard patterns, we'll need to write a slightly different filtering routine:
$filter = {
    # Store the name of the file we're filtering
    $FileName = $_.Name
    # Test ALL the patterns in $files and see if at least 1 matches
    $files.Where({$FileName -like $_}, 'First').Count -eq 1
}
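For example, with a couple of hypothetical pattern values (purely for illustration; the .Where() method used in the filter requires PowerShell 4.0 or later):
# $filter picks up whatever $files holds when Where-Object invokes it
$files = '*.dll', '*.exe'
Get-ChildItem C:\Windows\System32 | Where-Object $filter | Select-Object -First 5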
Since the $filter block now references $files to get the patterns, we can simplify your loop as:
foreach ($line in Get-Content .\List.txt) {
    try {
        $path,$files = $line.split(' ')
        $files = $files.split(',')
    }
    catch {
        $path = $line
        $files = "*.*"
    }
    if ([string]::IsNullOrEmpty($files)) {
        $files = "*.*"
    }
    $gciParams = @{
        Path    = $Path
        Recurse = $Path.Contains('*')
    }
    "Searching Path: $path, Pattern(s): [$($files -join ',')]" | Add-Content -Path .\result.txt -Encoding ASCII
    Get-ChildItem @gciParams | Where $filter | Select -ExpandProperty FullName | Add-Content -Path .\result.txt -Encoding UTF8
}
Note that we no longer need to redefine $filter every time the loop runs - the condition is based on the value of $files at runtime, so you can define $filter once before entering the loop and then reuse it on every iteration.
The "trick" with using #gciParams (which allows us to remove the big if/else block) is known as splatting, but you could achieve the same result with Get-ChildItem -Path:$Path -Recurse:$Path.Contains('*') :)
I want to add appending to my script, so that if it crashes it still gives me the data gathered up to that point instead of printing nothing.
Right now the script filters a list by name and date: it removes all names on the blacklist and keeps only entries from the month I entered.
[xml]$config = Get-Content -Path 'C:\Users\DZimmermann\Desktop\EVIM.Script\EVIM-Config.xml'
[xml]$blacklist = Get-Content -Path 'C:\Users\DZimmermann\Desktop\EVIM.Script\EVIM-Blacklist.xml'
#Names to filter
$BLN = $blacklist.Names
#Import Path
$info = Import-Csv $config.config.path.input -Delimiter ';'
$info | Format-Table
#from which month
#$dateCutoff = get-date "02.2020" -Format "MM.yyyy"
$dateCutoff = $config.config.date
$result = foreach ($i in $info) {
    if (-Not ($blacklist -contains $i.SCAN_USER)) {
        $entryDate = get-date $i.SCAN_DATE -Format "MM.yyyy"
        if ($entryDate -eq $dateCutoff) {
            $i
        }
    }
    Write-Host $i.SCAN_DATE
}
#Export path
$result | Export-Csv $config.config.path.output -NoTypeInformation -Delimiter ';'
$dateCutoff
All my changeable vars are linked to a config file, so you don't have to edit the script every time.
Start-Transcript -Path "path to save the transcript" -Append
[xml]$config = Get-Content -Path 'C:\Users\DZimmermann\Desktop\EVIM.Script\EVIM-Config.xml'
[xml]$blacklist = Get-Content -Path 'C:\Users\DZimmermann\Desktop\EVIM.Script\EVIM-Blacklist.xml'
#Names to filter
$BLN = $blacklist.Names
#Import Path
$info = Import-Csv $config.config.path.input -Delimiter ';' -Verbose
$info | Format-Table -Verbose
#from which month
#$dateCutoff = get-date "02.2020" -Format "MM.yyyy"
$dateCutoff = $config.config.date
$result = foreach ($i in $info) {
    if (-Not ($blacklist -contains $i.SCAN_USER)) {
        $entryDate = get-date $i.SCAN_DATE -Format "MM.yyyy"
        if ($entryDate -eq $dateCutoff) {
            $i
        }
    }
    Write-Host $i.SCAN_DATE
}
#Export path
$result | Export-Csv $config.config.path.output -NoTypeInformation -Delimiter ';' -Append -Verbose
$dateCutoff
Stop-Transcript
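Alternatively, a minimal sketch of my own (assuming the same config layout as above): export each matching row as soon as it is found, so that a crash mid-loop still leaves everything processed so far in the output file:
$outPath = $config.config.path.output
foreach ($i in $info) {
    if (-not ($BLN -contains $i.SCAN_USER)) {
        if ((Get-Date $i.SCAN_DATE -Format 'MM.yyyy') -eq $dateCutoff) {
            # -Append (PS 3.0+) preserves previously written rows if the script dies later
            $i | Export-Csv $outPath -NoTypeInformation -Delimiter ';' -Append
        }
    }
}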
Thank you for your help, but I think I got it :) My script now looks like this:
[xml]$config = Get-Content -Path 'C:\Users\DZimmermann\Desktop\EVIM.Script\EVIM-Config.xml'
[xml]$blacklist = Get-Content -Path 'C:\Users\DZimmermann\Desktop\EVIM.Script\EVIM-Blacklist.xml'
#Names to filter
$BLN = $blacklist.Names
#Import Path
$info = Import-Csv $config.config.path.input -Delimiter ';'
$info | Format-Table
#from which month
#$dateCutoff = get-date "02.2020" -Format "MM.yyyy"
$dateCutoff = $config.config.date
$result = foreach ($i in $info) {
    if (-Not ($BLN -contains $i.SCAN_USER)) {
        $entryDate = Get-Date $i.SCAN_DATE -Format "MM.yyyy"
        if ($entryDate -eq $dateCutoff) {
            $i
        }
    }
    $result | Out-File $config.config.path.output
    Write-Host $i
    $config.config.path.output + "\" + $info | Out-File -Append $config.config.path.output
}
I'm getting a memory exception while running this code. Is there a way to filter one file at a time, writing and appending the output after each file is processed? The code below seems to load everything into memory.
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Get-ChildItem $inputFolder -File -Filter '*.csv' |
ForEach-Object { Import-Csv $_.FullName } |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Maybe you can import and filter your files one by one, appending the result to your output file, like this:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
Remove-Item $outputFile -Force -ErrorAction SilentlyContinue
Get-ChildItem $inputFolder -Filter "*.csv" -file | %{import-csv $_.FullName | where machine_type -eq 'workstations' | export-csv $outputFile -Append -notype }
Note: The reason for not using Get-ChildItem ... | Import-Csv ... - i.e., for not piping Get-ChildItem directly to Import-Csv and instead calling Import-Csv from the script block ({ ... }) of an auxiliary ForEach-Object call - is a bug in Windows PowerShell that has since been fixed in PowerShell Core; see the bottom section for a more concise workaround.
However, even output from ForEach-Object script blocks should stream to the remaining pipeline commands, so you shouldn't run out of memory - after all, a salient feature of the PowerShell pipeline is object-by-object processing, which keeps memory use constant, irrespective of the size of the (streaming) input collection.
You've since confirmed that avoiding the aux. ForEach-Object call does not solve the problem, so we still don't know what causes your out-of-memory exception.
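As a generic illustration of that streaming behavior (the numbers are arbitrary):
# each value flows through the pipeline one at a time, so memory use stays
# flat even though 10 million values pass through
1..10000000 | ForEach-Object { $_ * 2 } | Where-Object { $_ -gt 19999990 }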
Update:
This GitHub issue contains clues as to the reason for excessive memory use, especially with many properties that contain small amounts of data.
This GitHub feature request proposes using strongly typed output objects to help the issue.
The following workaround, which uses the switch statement to process the files as text files, may help:
$header = ''
Get-ChildItem $inputFolder -Filter *.csv | ForEach-Object {
    $i = 0
    switch -Wildcard -File $_.FullName {
        '*workstations*' {
            # NOTE: If no other columns contain the word `workstations`, you can
            # simplify and speed up the command by omitting the `ConvertFrom-Csv` call
            # (you can make the wildcard matching more robust with something
            # like '*,workstations,*')
            if ((ConvertFrom-Csv "$header`n$_").machine_type -ne 'workstations') { continue }
            $_ # row whose 'machine_type' column value equals 'workstations'
        }
        default {
            if ($i++ -eq 0) {
                if ($header) { continue } # header already written
                else { $header = $_; $_ } # header row of 1st file
            }
        }
    }
} | Set-Content $outputFile
Here's a workaround for the bug of not being able to pipe Get-ChildItem output directly to Import-Csv, by passing it as an argument instead:
Import-Csv -LiteralPath (Get-ChildItem $inputFolder -File -Filter *.csv) |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Note that in PowerShell Core you could more naturally write:
Get-ChildItem $inputFolder -File -Filter *.csv | Import-Csv |
Where-Object { $_.machine_type -eq 'workstations' } |
Export-Csv $outputFile -NoType
Solution 2:
$inputFolder = "C:\Change\2019\October"
$outputFile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8 # modify encoding if necessary
$Delimiter = ','
# find the header for your files => take the first row of the first file with data
$Header = Get-ChildItem -Path $inputFolder -Filter *.csv | Where length -gt 0 | select -First 1 | Get-Content -TotalCount 1
# if no header was found, then no file has size > 0 => we quit
if (!$Header) { return }
# build an array from the header
$HeaderArray = $Header -split $Delimiter -replace '"', ''
# open the output file
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
# write the header we found
$w.WriteLine($Header)
# loop over the csv files
Get-ChildItem $inputFolder -File -Filter "*.csv" | %{
    # open the file for reading
    $r = New-Object System.IO.StreamReader($_.fullname, $encoding)
    $skiprow = $true
    while (($line = $r.ReadLine()) -ne $null)
    {
        # skip the header row
        if ($skiprow)
        {
            $skiprow = $false
            continue
        }
        # get an object for the current row, using the header we found
        $Object = $line | ConvertFrom-Csv -Header $HeaderArray -Delimiter $Delimiter
        # write the row to the output file if it matches the condition
        if ($Object.machine_type -eq 'workstations') { $w.WriteLine($line) }
    }
    $r.Close()
    $r.Dispose()
}
$w.close()
$w.Dispose()
You have to read and write to the .csv files one row at a time, using StreamReader and StreamWriter:
$filepath = "C:\Change\2019\October"
$outputfile = "C:\Change\2019\output.csv"
$encoding = [System.Text.Encoding]::UTF8
# Note: FileInfo objects have no machine_type property, so the rows (not the
# files) must be filtered, inside the read loop below.
$files = Get-ChildItem -Path $filePath -Filter *.csv
$w = New-Object System.IO.StreamWriter($outputfile, $true, $encoding)
$headerWritten = $false
foreach ($file in $files)
{
    $r = New-Object System.IO.StreamReader($file.fullname, $encoding)
    $isFirstLine = $true
    while (($line = $r.ReadLine()) -ne $null)
    {
        if ($isFirstLine)
        {
            # write the header only once, taken from the first file
            if (!$headerWritten) { $w.WriteLine($line); $headerWritten = $true }
        }
        elseif ($line -like '*workstations*')
        {
            # crude textual row filter; see the CSV-aware variants above
            $w.WriteLine($line)
        }
        $isFirstLine = $false
    }
    $r.Close()
    $r.Dispose()
}
$w.close()
$w.Dispose()
get-content *.csv | add-content combined.csv
Make sure combined.csv doesn't exist when you run this, or it's going to go full Ouroboros.
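A hedged variant that sidesteps the self-ingestion problem by excluding the output file (assuming combined.csv lives in the same folder):
# -Exclude filters the wildcard-matched input files before reading
Get-Content .\*.csv -Exclude combined.csv | Add-Content .\combined.csv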
I need to create this list so that another program can work properly. I use this code:
function analyse {
    Param(
        [parameter(Mandatory=$true)]
        [String]$newPath
    )
    cd $newPath
    dir | Foreach-Object {
        $data = Get-Content -Path o:\******\public\ParcoursArborescence\Limitless\data.txt
        if ($_.PsIsContainer -eq $True) {
            $testPath = $_.FullName + ";"
            $name = $testPath
            $testPath = $data -match [regex]::escape($testPath)
            $testPath
            if ($testPath.Length -eq 0) {
                $name | Out-File -Append "o:\******\public\ParcoursArborescence\Limitless\data.txt"
                if ($_.FullName.Length -gt 248) {
                    "ecriture"
                    $result += $_.FullName + "`r"
                } else {
                    "nouvelle analyse"
                    $_.FullName
                    analyse $_.FullName
                }
            }
        } else {
            $testPath = $_.Directory.FullName + ";"
            $name = $testPath
            $testPath = $data -match [regex]::escape($testPath)
            if ($testPath.Length -eq 0) {
                $name | Out-File -Append "o:\******\public\ParcoursArborescence\Limitless\data.txt"
                $_.FullName.Length
                if ($_.FullName.Length -gt 260) {
                    "ecriture2"
                    $result += $_.Directory.Name + "`r"
                }
            }
        }
    }
    $result | Out-File -Append "o:\******\public\ParcoursArborescence\Limitless\bilanLimitless.txt"
}
But it takes hours and hours, and I need to run it over thousands of folders. Do you have any idea how it could be made faster?
Maybe I'm oversimplifying things here, but why not list all the files at once and test their FullName length (PS 3.0 is needed for the -File parameter of Get-ChildItem)?
$maxLength = 248
Get-ChildItem $newPath -Recurse -File |
    Where-Object { $_.FullName.Length -gt $maxLength } |
    Select-Object -ExpandProperty DirectoryName -Unique |
    Out-File "overlength_paths.txt"
For PS 2.0:
$maxLength = 248
Get-ChildItem $newPath -Recurse |
    Where-Object { ($_.FullName.Length -gt $maxLength) -and (-not $_.PSIsContainer) } |
    Select-Object -ExpandProperty DirectoryName -Unique |
    Out-File "overlength_paths.txt"
I'm new to PowerShell, and I'm writing a script to delete files older than "x" days.
I'm almost done. I need help presenting the data as a table, and the script should not produce a log file if no files will be deleted.
Here's my code:
$max_days = "-30"
$curr_date = Get-Date
$del_date = $curr_date.AddDays($max_days)
$Path = "C:\Desktop\Code"
$DateTime = Get-Date -Format "D=yyyy-MM-dd_T=HH-mm-ss"
$itemsearch = Get-ChildItem C:\Test -Recurse | Where-Object { $_.LastWriteTime -lt $del_date}
foreach ($item in $itemsearch)
{
    Write "File:", $item.Name "Modified:", $item.LastWriteTime "Path:", $item.FullName "Date Deleted:" $del_date | Out-File "C:\Desktop\Code\Deleted\SFTP_DeleteFiles_WORKSPACE_$DateTime.txt" -append
    $item | Remove-Item
}
Can anyone please help me? It's already working, by the way. I just need to present the data in table form and avoid creating a log file when there's nothing to delete.
Update:
Already solved the condition statement by doing:
if ($itemsearch)
{
    foreach ($item in $itemsearch)
    {
        Write "File:", $item.Name "Modified:", $item.LastWriteTime "Path:", $item.FullName "Date Deleted:" $del_date | Out-File "C:\Desktop\Code\Deleted\SFTP_DeleteFiles_WORKSPACE_$DateTime.txt" -append
        $item | Remove-Item
    }
}
else
{
    Write "No files will be deleted."
}
Thanks!
What I want to display in the Excel/text file is something like this one:
http://i59.tinypic.com/30wv33d.jpg
Anyone?
It returns this:
IsReadOnly;"IsFixedSize";"IsSynchronized";"Keys";"Values";"SyncRoot";"Count"
False;"False";"False";"System.Collections.Hashtable+KeyCollection";"System.Collections.Hashtable+ValueCollection";"System.Object";"4"
False;"False";"False";"System.Collections.Hashtable+KeyCollection";"System.Collections.Hashtable+ValueCollection";"System.Object";"4"
False;"False";"False";"System.Collections.Hashtable+KeyCollection";"System.Collections.Hashtable+ValueCollection";"System.Object";"4"
False;"False";"False";"System.Collections.Hashtable+KeyCollection";"System.Collections.Hashtable+ValueCollection";"System.Object";"4"
In Excel. Do you have any idea? I'll keep searching in the meantime.
To introduce tabular logging, I would use a CSV file as output, replacing your foreach block with this code:
$results = @()
foreach ($item in $itemsearch)
{
    $success = $true
    try
    {
        # -ErrorAction Stop turns a failed deletion into a terminating error,
        # so the catch block actually fires
        $item | Remove-Item -ErrorAction Stop
    }
    catch
    {
        $success = $false
    }
    if ($success -eq $true)
    {
        Write-Host $item.FullName 'successfully deleted.'
        $results += [PSCustomObject]@{'File'=$item.Name;'Modified'=$item.LastWriteTime;'Path'=$item.FullName;'Date Deleted'=$del_date;'State'='SUCCESS'}
    }
    else
    {
        Write-Host 'Error deleting' $item.FullName
        $results += [PSCustomObject]@{'File'=$item.Name;'Modified'=$item.LastWriteTime;'Path'=$item.FullName;'Date Deleted'=$del_date;'State'='ERROR'}
    }
}
$results | Export-Csv -Path "C:\Desktop\Code\Deleted\SFTP_DeleteFiles_WORKSPACE_$DateTime.csv" -Encoding UTF8 -Delimiter ';' -NoTypeInformation
First an empty array is created ($results).
The try/catch block detects whether the deletion succeeded (Remove-Item needs -ErrorAction Stop so that a failure becomes a terminating error the catch block can see); then the appropriate line is added to $results.
At the end, the $results array is exported to CSV with a ';' separator so you can open it right away in Excel.
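To also satisfy the question's "no log file when nothing is deleted" requirement, you could guard the export (a small addition of mine, not part of the original answer):
# only create the CSV when at least one file was processed
if ($results.Count -gt 0)
{
    $results | Export-Csv -Path "C:\Desktop\Code\Deleted\SFTP_DeleteFiles_WORKSPACE_$DateTime.csv" -Encoding UTF8 -Delimiter ';' -NoTypeInformation
}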