I have a PowerShell script that collects security permissions on a server and outputs them to a CSV. It is producing the information, but for every folder it reads it repeats the header line "Account, Ace String, Object Path". How do I remove the repeats so the header only appears once, at the start of the document?
Account,Ace String,Object Path
NT AUTHORITY\SYSTEM,Allow FullControl, (Inherited),C:\Users\munjanga\Documents\Operations Orchestration\jetty
BUILTIN\Administrators,Allow FullControl, (Inherited),C:\Users\munjanga\Documents\Operations Orchestration\jetty
EMEA\munjanga,Allow FullControl, (Inherited),C:\Users\munjanga\Documents\Operations Orchestration\jetty
Account,Ace String,Object Path
NT AUTHORITY\SYSTEM,Allow FullControl, (Inherited),C:\Users\munjanga\Documents\Operations Orchestration\jre1.6
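The script currently looks like this: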
$OutFile = "C:\Users\munjanga\Documents\AoN Project\Execute\$([Environment]::MachineName).txt"
$Header = "Folder Path,IdentityReference,AccessControlType,IsInherited,InheritanceFlags,PropagationFlags"
Del $OutFile
Add-Content -Value $Header -Path $OutFile

$RootPath = "C:\Users\munjanga\Documents\Operations Orchestration"
$Folders = dir $RootPath -recurse | where {$_.psiscontainer -eq $true}

$isInherited = @{
    $true  = 'Inherited'
    $false = 'Not Inherited'
}

$inheritance = @{
    0 = 'files only'
    1 = 'this folder and subfolders'
    2 = 'this folder and files'
    3 = 'subfolders and files'
}

$Folders | % {
    $fldr = $_.FullName
    Get-Acl $fldr | select -Expand Access |
        select @{n='Account';e={$_.IdentityReference}},
               @{n='Ace String';e={"{0} {1}, {2} ({3})" -f $_.AccessControlType,
                   $_.FileSystemRights, $inheritance[$_.InheritanceFlags],
                   $isInherited[$_.IsInherited]}},
               @{n='Object Path';e={$fldr}} | ConvertTo-Csv -NoTypeInformation | % {$_ -replace '"', ""} | Out-File $OutFile -Force -Encoding ascii -Append
}
Let's see if this works:
@{n='Object Path';e={$fldr}} | ConvertTo-Csv -NoTypeInformation | Select -Skip 1 | % {$_ -replace '"', ""} | Out-File $OutFile -Force -Encoding ascii -Append
The key bit is the Select -Skip 1 after the ConvertTo-Csv piece.
Edit:
After thinking this through a bit more there's a better answer here. The last line should look something like this:
@{n='Object Path';e={$fldr}}} | ConvertTo-Csv -NoTypeInformation | % {$_ -replace '"', ""} | Out-File $OutFile -Force -Encoding ascii -Append
That loop should be closed before converting to CSV.
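As a rough sketch of what that restructuring looks like with the rest of the script unchanged (so $Folders, $inheritance, $isInherited and $OutFile are assumed to be defined as above), the loop emits the objects and a single ConvertTo-Csv then produces the header exactly once:

# Sketch: close the ForEach-Object before ConvertTo-Csv so only one header row is written
$Folders | % {
    $fldr = $_.FullName
    Get-Acl $fldr | select -Expand Access |
        select @{n='Account';e={$_.IdentityReference}},
               @{n='Ace String';e={"{0} {1}, {2} ({3})" -f $_.AccessControlType,
                   $_.FileSystemRights, $inheritance[$_.InheritanceFlags],
                   $isInherited[$_.IsInherited]}},
               @{n='Object Path';e={$fldr}}
} | ConvertTo-Csv -NoTypeInformation | % {$_ -replace '"', ""} |
    Out-File $OutFile -Force -Encoding ascii -Append

With this shape the separate Add-Content $Header line at the top of the script is no longer needed, since ConvertTo-Csv now writes its own single header row.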
I have the following code to add new columns to a CSV file. I would like to amend the code to run on multiple CSV files within a folder and output them to a different folder.
$source = "C:\input_folder\input.csv"
$destination = "C:\output_folder\output.csv"
(Import-CSV $source |
Select-Object *,#{Name='column1';Expression={'data1'}} |
Select-Object *,#{Name='column2';Expression={'data2'}} |
ConvertTo-csv -NoTypeInformation |
Select-Object -Skip 0) -replace '"' | Set-Content $destination
You could do something like this. You might also want to add a check that the files in the folder are actually .csv files and not something else (one way to do that is sketched after the code).
$source = "C:\input_folder"
$destinationFolder = "C:\output_folder"
$folderContents = Get-ChildItem $source
foreach ($item in $folderContents) {
if (Test-Path -Path $item -PathType Leaf == True) {
(Import-CSV $item |
Select-Object *,#{Name='column1';Expression={'data1'}} |
Select-Object *,#{Name='column2';Expression={'data2'}} |
ConvertTo-csv -NoTypeInformation |
Select-Object -Skip 0) -replace '"' | Set-Content ("$destinationFolder\$item" + "_formatted.csv")
}
}
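One way to fold in that file-type check, sketched here under the same assumed folder paths, is to let Get-ChildItem filter on the extension so the loop only ever sees CSV files (-File requires PowerShell 3.0 or later):

# Sketch: filter to .csv files up front instead of testing each item inside the loop
$source = "C:\input_folder"
$destinationFolder = "C:\output_folder"

Get-ChildItem $source -Filter *.csv -File | ForEach-Object {
    (Import-Csv $_.FullName |
        Select-Object *,@{Name='column1';Expression={'data1'}} |
        Select-Object *,@{Name='column2';Expression={'data2'}} |
        ConvertTo-Csv -NoTypeInformation) -replace '"' |
        Set-Content ("$destinationFolder\" + $_.BaseName + "_formatted.csv")
}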
I am trying to list the first- and second-level folders of a path. The script works, but I am getting the error "You cannot call a method on a null-valued expression." Any idea why?
$folderPath = '\\FILSERVER\DATA$'
$PathScript = "C:\Users\adm\Desktop\Script_V.2"
$sites = "Madrid"

foreach ($site in $Sites){
    #Get_Level_1_Folders
    $PathShare = "\\FILSERVER\DATA$\Data_$site"
    Get-ChildItem -Path $PathShare -Directory -Force -ErrorAction SilentlyContinue | Select-Object FullName | out-file "${PathScript}\level_1_${site}.txt"
    (get-content "${PathScript}\level_1_${site}.txt") -notmatch "--------" | out-file "${PathScript}\level_1_${site}.txt"
    (get-content "${PathScript}\level_1_${site}.txt").replace("\\FILSERVER\DATA$\Data_$site\","") | out-file "${PathScript}\level_1_${site}.txt"
    (get-content "${PathScript}\level_1_${site}.txt") -notmatch "FullName" | out-file "${PathScript}\level_1_${site}.txt"
    (get-content "${PathScript}\level_1_${site}.txt") | Foreach {$_.TrimEnd()} | Set-Content "${PathScript}\level_1_${site}.txt"
    (get-content "${PathScript}\level_1_${site}.txt") | ? {$_.trim() -ne "" } | set-content "${PathScript}\level_1_${site}.txt"

    #Get_Level_2_Folders
    $Level_Folders = get-content "${PathScript}\level_1_${site}.txt"
    foreach($lv1 in $Level_Folders){
        Get-ChildItem -Path $PathShare\$lv1 -Directory -Force -ErrorAction SilentlyContinue | Select-Object FullName | out-file "${PathScript}\level_2_${site}_${lv1}.txt"
        (get-content "${PathScript}\level_2_${site}_${lv1}.txt") -notmatch "--------" | out-file "${PathScript}\level_2_${site}_${lv1}.txt"
        (get-content "${PathScript}\level_2_${site}_${lv1}.txt").replace("\\FILSERVER\DATA$\Data_$site\","") | out-file "${PathScript}\level_2_${site}_${lv1}.txt"
        (get-content "${PathScript}\level_2_${site}_${lv1}.txt") -notmatch "FullName" | out-file "${PathScript}\level_2_${site}_${lv1}.txt"
        (get-content "${PathScript}\level_2_${site}_${lv1}.txt") | Foreach {$_.TrimEnd()} | Set-Content "${PathScript}\level_2_${site}_${lv1}.txt"
        (get-content "${PathScript}\level_2_${site}_${lv1}.txt") | ? {$_.trim() -ne "" } | set-content "${PathScript}\level_2_${site}_${lv1}.txt"
    }
}
As mentioned in comments, the cause is likely that this expandable string:
"${PathScript}\level_2_${site}_${lv1}.txt"
... resolved to the path of a file that's empty.
Get-Content will open the file - which is why you don't get any "file not found" errors - and then return immediately without outputting anything, since there are no lines to consume in an empty file.
The result of the (Get-Content ...) expression is therefore $null, and calling the .Replace() method on $null is what produces the error in question.
You can either use the -replace operator, which accepts any number of input strings (including none) - just make sure you escape the pattern, since -replace treats it as a regex:
(Get-Content "${PathScript}\level_2_${site}_${lv1}.txt") -replace [regex]::Escape("\\FILSERVER\DATA$\Data_$site\") |Out-File ...
Or let the pipeline take care of enumerating the lines, so the .Replace() call simply never runs when the file is empty:
Get-Content "${PathScript}\level_2_${site}_${lv1}.txt" |ForEach-Object {$_.Replace("\\FILSERVER\DATA$\Data_$site\","")} |Out-File ...
I am merging a lot of large CSV files, skipping the leading junk in each one and appending the filename to each line:
Get-ChildItem . | Where Name -match "Q[0-4]20[0-1][0-9].csv" |
    Foreach-Object {
        $file = $_.BaseName
        Get-Content $_.FullName | select-object -skip 3 | % {
            "$_,${file}" | Out-File -Append temp.csv -Encoding ASCII
        }
    }
In PowerShell this is incredibly slow even on an i7/16GB machine (~5 megabyte/minute). Can I make it more efficient or should I just switch to e.g. Python?
Get-Content / Set-Content are terrible with larger files. .NET streams are a good alternative when performance is key. So with that in mind, let's read each file in one go with [System.IO.File]::ReadAllLines and use a StreamWriter to write out the results.
$rootPath = "C:\temp"
$outputPath = "C:\test\somewherenotintemp.csv"
$streamWriter = [System.IO.StreamWriter]$outputPath
Get-ChildItem $rootPath -Filter "*.csv" -File | ForEach-Object{
$file = $_.BaseName
[System.IO.File]::ReadAllLines($_.FullName) |
Select-Object -Skip 3 | ForEach-Object{
$streamWriter.WriteLine(('{0},"{1}"' -f $_,$file))
}
}
$streamWriter.Close(); $streamWriter.Dispose()
Create a StreamWriter, $streamWriter, to output the edited lines of each file. We could read and write in larger batches, which would be faster, but since we need to skip a few lines and change each remaining one, processing line by line is simpler. Avoid writing anything to the console while this runs, as that will just slow everything down.
What '{0},"{1}"' -f $_,$file does is quote that last "column" that is added in case the basename contains spaces.
Measure-Command -Expression {
    Get-ChildItem C:\temp | Where Name -like "*.csv" | ForEach-Object {
        $file = $_.BaseName
        Get-Content $_.FullName | select-object -Skip 3 | ForEach-Object {
            "$_,$($file)" | Out-File -Append C:\temp\t\tempe1.csv -Encoding ASCII -Force
        }
    }
} # TotalSeconds : 12,0526802 for 11415 lines
If you first put everything into an array in memory, things go a lot faster:
Measure-Command -Expression {
    $arr = @()
    Get-ChildItem C:\temp | Where Name -like "*.csv" | ForEach-Object {
        $file = $_.BaseName
        $arr += Get-Content $_.FullName | select-object -Skip 3 | ForEach-Object {
            "$_,$($file)"
        }
    }
    $arr | Out-File -Append C:\temp\t\tempe2.csv -Encoding ASCII -Force
} # TotalSeconds : 0,8197193 for 11415 lines
EDIT: Fixed it so that your filename was added to each row.
To keep -Append from ruining the performance of your script, you can use a buffer array variable:
# Initialize buffer
$csvBuffer = @()

Get-ChildItem *.csv | Foreach-Object {
    $file = $_.BaseName
    $content = Get-Content $_.FullName | Select-Object -Skip 3 | %{
        "$_,${file}"
    }

    # Populate buffer
    $csvBuffer += $content

    # Write buffer to disk if it contains 5000 lines or more
    $csvBufferCount = $csvBuffer | Measure-Object | Select-Object -ExpandProperty Count

    if( $csvBufferCount -ge 5000 )
    {
        $csvBuffer | Out-File -FilePath temp.csv -Encoding ASCII -Append
        $csvBuffer = @()
    }
}

# Important : empty the buffer remainder
if( $csvBuffer.Count -gt 0 )
{
    $csvBuffer | Out-File -FilePath temp.csv -Encoding ASCII -Append
    $csvBuffer = @()
}
How can I avoid getting a blank line at the end of an Out-File?
$DirSearcher = New-Object System.DirectoryServices.DirectorySearcher([adsi]'')
$DirSearcher.Filter = '(&(objectClass=Computer)(!(cn=*esx*)) (!(cn=*slng*)) (!(cn=*dcen*)) )'
$DirSearcher.FindAll().GetEnumerator() | sort-object { $_.Properties.name } `
| ForEach-Object { $_.Properties.name }`
| Out-File -FilePath C:\Computers.txt
I have tried several options and none of them seem to do anything; they all still leave a blank line at the end.
(get-content C:\Computers.txt) | where {$_ -ne ""} | out-file C:\Computers.txt
$file = "C:\Computers.txt"
Get-Content $file | where {$_.Length -ne 0} | Out-File "$file`.tmp"
Move-Item "$file`.tmp" $file -Force
Use [IO.File]::WriteAllText:
[IO.File]::WriteAllText("$file`.tmp",
    ((Get-Content $file) -ne '' -join "`r`n"),
    [Text.Encoding]::UTF8)
Often when you're looking to see if strings have no character data, you will want to use String.IsNullOrWhiteSpace():
Get-Content $file | Where-Object { ![String]::IsNullOrWhiteSpace($_) } | Out-File "$file`.tmp"
This is the best way to avoid the empty line at the end of a txt file with a PowerShell command:
Add-Content C:\Users\e5584332\Desktop\CSS.txt "Footer | Processed unique data || $count " -NoNewline
I have a CSV file (file1) that looks like this (user dirs and their sizes):
Initials,Size
User1,10
User2,100
User3,131
User4,140
I have another CSV file (file2) that looks like this (VIP users):
User2
User4
What I'm trying to do is update file1 so it looks like this:
User1,10
User3,131
User2 and User4 are removed because they are in file2.
I can get them removed, but at the same time I remove the size for all users, so my output contains only the user names:
User1
User3
My code:
$SourcePath = "\\server1\info\SYSINFO\UsrSize"
$DestinationFile = "\\server1\info\SYSINFO\UsrSize\OverLimit\UsersOverLimit1.log"
$VIP_Exclusion_List = "\\server1\info\SYSINFO\UsrSize\OverLimit\_VIP_EXCLUSION_LIST.txt"
$Database = "\\server1\info\SYSINFO\UsrSize\OverLimit\_UsersOverLimitDATABASE.log"
$INT_SizeToLookFor = 100
dir $SourcePath -Filter usr*.txt | import-csv -delimiter "`t" |
    Where-Object {[INT] $_."Size excl. Backup/Pst" -ge $INT_SizeToLookFor} |
    Select-Object Initials,"Size excl. Backup/Pst" | convertto-csv -NoTypeInformation |
    % { $_ -replace '"', ""} | out-file $DestinationFile ;

$Userlist = import-csv $DestinationFile | Select-Object Initials |
    convertto-csv -NoTypeInformation | % { $_ -replace '"', ""};

compare-object ($Userlist) (get-content $VIP_Exclusion_List) |
    select-object inputObject | convertto-csv -NoTypeInformation |
    % { $_ -replace '"', ""} | out-file "\\server1\info\SYSINFO\UsrSize\OverLimit\UsersOverLimitThisTime.log";
If the files are small-ish and you don't care too much about performance, then the following would be a trivial way:
$data = Import-Csv file1
# file2 is just a list of names with no header row, so read it as plain text
$vips = Get-Content file2

$data = $data | ?{ $vips -notcontains $_.Initials }

$data | Export-Csv file1_new -NoTypeInformation
A faster way would be to put the names to remove into a set (see the sketch below), but given what you're describing I doubt you'll get into the range of several thousand or a million users where that starts to matter.
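For completeness, a rough sketch of that set-based variant (same assumed file1/file2 layout as above; the ::new() syntax requires PowerShell 5 or later):

# Sketch: look each user up in a HashSet instead of scanning an array with -notcontains
$vips = [System.Collections.Generic.HashSet[string]]::new([string[]]@(Get-Content file2))

Import-Csv file1 |
    Where-Object { -not $vips.Contains($_.Initials) } |
    Export-Csv file1_new -NoTypeInformation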
I solved it using this code:
$ArrayVIP = get-content $VIP_Exclusion_List
select-string $DestinationFile -pattern $ArrayVIP -notmatch |
select -expand line |
out-file $DestinationFile
Taken from here: Removing lines from a CSV