I have two CSV files, each of which contains a PATH column. For example:
CSV1.csv
PATH,Data,NF
\\server1\folderA,1,1
\\server1\folderB,1,1
\\server2\folderA,1,1
\\server2\folderB,1,1
CSV2.csv
PATH,User,Access,Size
\\server1\folderA\file1,don,1
\\server1\folderA\file2,don,1
\\server1\folderA\file3,sue,1
\\server2\folderB\file1,don,1
What I'm attempting to do is create a script that produces separate CSV exports based on the paths in CSV1, such that each new file contains the rows from CSV2 whose paths fall under that path. For example, from the above I'd end up with 2 results:
result1.csv
\\server1\folderA\file1,don,1
\\server1\folderA\file2,don,1
\\server1\folderA\file3,sue,1
result2.csv
\\server2\folderB\file1,don,1
Previously I've used a script like this when the two values are exact matches:
$reportfile = Import-Csv $apireportoutputfile -Delimiter ';' -Encoding Unicode
$masterlist = Import-Csv $pathlistfile
foreach ($record in $masterlist)
{
    $path = $record.Path
    $filename = $path -replace '\\', '_'
    $filename = '.\Working\sharefiles\' + $filename + '.csv'
    $reportfile | Where-Object { $_.path -eq $path } | Select FilePath, UserName, LastAccessDate, LogicalSize | Export-Csv -Path $filename
    Write-Host " Creating files list for $path" -ForegroundColor Red -BackgroundColor White
}
However, since the two path values are not identical, it returns nothing. I found the -like operator but am not sure how to use it in this code to get the results I want: Where-Object is a filter, while -like returns true/false. Am I on the right track? Any ideas for a solution?
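For reference, a minimal sketch of the -like idea using the asker's variable names, assuming every path in CSV2 begins with one of the folder paths from CSV1:
$reportfile = Import-Csv $apireportoutputfile -Delimiter ';' -Encoding Unicode
$masterlist = Import-Csv $pathlistfile
foreach ($record in $masterlist)
{
    $path = $record.Path
    $filename = '.\Working\sharefiles\' + ($path -replace '\\', '_') + '.csv'
    # A trailing wildcard keeps every CSV2 row whose path sits under $path
    $reportfile |
        Where-Object { $_.Path -like "$path\*" } |
        Export-Csv -Path $filename -NoTypeInformation
}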
Something like this, maybe?
$ht = @{}

# One ArrayList bucket per folder path from CSV1
Import-Csv csv1.csv |
    foreach { $ht[$_.PATH] = New-Object System.Collections.ArrayList }

# Drop each CSV2 row into the bucket for its parent folder
Import-Csv csv2.csv |
    foreach {
        $path = Split-Path $_.PATH -Parent
        $ht[$path].Add($_) > $null
    }

# Export one result file per non-empty bucket
$i = 1
$ht.Values |
    foreach {
        if ($_.Count) {
            $_ | Export-Csv "result$i.csv" -NoTypeInformation
            $i++
        }
    }
My suggestion:
$1 = ipcsv .\csv1.CSV
$2 = ipcsv .\csv2.CSV
# Keep only the parent folders from CSV2 that also appear in CSV1
$equal = diff ($2 | select @{n='PATH';e={Split-Path $_.PATH}}) $1 -Property PATH -IncludeEqual -ExcludeDifferent -PassThru
0..($equal.Count - 1) | % {
    $i = $_
    $2 | ? { (Split-Path $_.PATH) -eq $equal[$i].PATH } | epcsv ".\Result$i.CSV"
}
In this script I'm getting a collection of CSV files, performing a replace, storing the results in an (initially empty) array, and attempting to export it to CSV.
$CSVFiles = Get-ChildItem "C:\GALIC\Test\Test2\WindowsLists\*.csv" -Exclude M*
$AllJobsList = $CSVFiles | ForEach { (Import-CSV $_ -Delimiter ',' | Select 'Agent', 'Name', 'Folder' | Where-Object {$_.Agent -like "*AGENTGROUP*"})}
$UpdatedGroupsList = @()
$AllJobsList | Export-Csv -Path "C:\GALIC\Test\Test2\WindowsLists\FullJobs-Test.csv" -NoTypeInformation -Force
$CSVContent = Get-Content "C:\GALIC\Test\Test2\WindowsLists\FullJobs-Test.csv"
foreach($line in $CSVContent)
{
if($line.Contains('|') -and $line.Contains('HOSTG'))
{
#Write-Host $line
$null = $line.Replace('|', '').Replace('HOSTG', '')
#Write-Host $LineReplace
$UpdatedGroupsList += $line
}
}
$UpdatedGroupsList | Export-CSV -Path "C:\GALIC\Test\Test2\WindowsLists\UpdatedFullJobs.csv" -NoTypeInformation -Force
($CSVContent on down is what's giving me issues.)
After opening the CSV file, the content looks nothing like what I'm expecting. Any ideas/suggestions?
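For what it's worth, a likely culprit in the block above is that String.Replace returns a new string rather than modifying $line in place, so the assignment to $null throws the edited text away and the untouched $line is what lands in $UpdatedGroupsList. A minimal sketch of capturing the result instead (an assumption about the intent, not necessarily the poster's final fix; $lineReplace is a made-up name):
foreach ($line in $CSVContent)
{
    if ($line.Contains('|') -and $line.Contains('HOSTG'))
    {
        # Replace() returns the modified copy; keep it instead of discarding it
        $lineReplace = $line.Replace('|', '').Replace('HOSTG', '')
        $UpdatedGroupsList += $lineReplace
    }
}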
I'm trying (badly) to work through combining CSV files into one file and prepending a column that contains the file name. I'm new to PowerShell, so hopefully someone can help here.
I tried initially to do the well documented approach of using Import-Csv / Export-Csv, but I don't see any options to add columns.
Get-ChildItem -Filter *.csv | Select-Object -ExpandProperty FullName | Import-Csv | Export-Csv CombinedFile.txt -UseQuotes Never -NoTypeInformation -Append
Next I'm trying to loop through the files and prepend the name, which kind of works, but for some reason it stops after the first row is generated. Since this isn't a CSV-aware process, I have to use the switch to skip the title row of every file after the first.
$getFirstLine = $true
Get-ChildItem -Filter *.csv | Where-Object {$_.Name -NotMatch "Combined.csv"} | foreach {
$filePath = $_
$collection = Get-Content $filePath
foreach($lines in $collection) {
$lines = ($_.Basename + ";" + $lines)
}
$linesToWrite = switch($getFirstLine) {
$true {$lines}
$false {$lines | Select -Skip 1}
}
$getFirstLine = $false
Add-Content "Combined.csv" $linesToWrite
}
This is where the -PipelineVariable parameter comes in really handy. You can set a variable to represent the current iteration in the pipeline, so you can do things like this:
Get-ChildItem -Filter *.csv -PipelineVariable File | Where-Object {$_.Name -NotMatch "Combined.csv"} | ForEach-Object { Import-Csv $File.FullName } | Select *,@{l='OriginalFile';e={$File.Name}} | Export-Csv Combined.csv -NoTypeInformation
Merging your CSVs into one and adding a column for the file's name can be done as follows, using a calculated property on Select-Object:
Get-ChildItem -Filter *.csv | ForEach-Object {
    $fileName = $_.Name
    Import-Csv $_.FullName | Select-Object @{
        Name       = 'FileName'
        Expression = { $fileName }
    }, *
} | Export-Csv path/to/merged.csv -NoTypeInformation
I'm trying to automate gci (Get-ChildItem) to work on each row of a config file, where each row has a path as its first column, followed by a list of files. Something like this:
C:\Users\*\AppData\Roaming\* *.dll
C:\Test file.txt,file2.txt
This means that gci will search for:
*.dll in C:\Users\*\AppData\Roaming\*
file.txt in C:\Test
file2.txt in C:\Test
To do this I'm dynamically building the Where condition in the script below. Here is the PowerShell script I'm using:
foreach($line in Get-Content .\List.txt) {
try {
$path,$files = $line.split(' ')
$files = $files.split(',')
}
catch {
$path = $line
$files = "*.*"
}
if([string]::IsNullOrEmpty($files)){
$files = "*.*"
}
$filter = $files -join(" -or `$_.Name` -like ")
$filter = "`$_.Name` -like " + $filter
echo "Searching Path: $path, Pattern: $filter" | out-file -append -encoding ASCII -filepath .\result.txt
if ($path.Contains("*"))
{
gci -Path $path -Recurse | Where {$filter} | Select -ExpandProperty FullName | Out-String -Width 2048 | out-file -append -encoding UTF8 -filepath .\result.txt
}
else
{
gci -Path $path | Where {$filter} | Select -ExpandProperty FullName | Out-String -Width 2048 | out-file -append -encoding UTF8 -filepath .\result.txt
}
}
The problem is that the Where filter is not applied; all files are returned.
First attempt, suggested by
foreach($line in Get-Content .\List.txt) {
try {
$path,$files = $line.split(' ')
$files = $files.split(',')
}
catch {
$path = $line
$files = "*.*"
}
if([string]::IsNullOrEmpty($files)){
$files = "*.*"
}
$filter = $files -join(" -or `$_.Name -like ")
$filter = "`$_.Name -like " + $filter
$gciParams = @{
Path = $Path
Recurse = $Path.Contains('*')
}
"Searching Path: $path, Pattern(s): [$($files -join ',')]" | Add-Content -Path .\result.txt -Encoding ASCII
Get-ChildItem @gciParams | Where $filter | Select -ExpandProperty FullName | Add-Content -Path .\result.txt -Encoding UTF8
}
If you want to create a piece of code and defer execution of it until later, you need a Script Block.
A script block literal in PowerShell is just {}, so to construct a script block that filters on a single comparison, you'd define $filter like this:
$filter = {$_.Name -like $files}
At which point you can pass it directly as an argument to Where-Object:
Get-ChildItem $path |Where-Object $filter
... but since you want to test against multiple wildcard patterns, we'll need to write a slightly different filtering routine:
$filter = {
# Store file name of file we're filtering
$FileName = $_.Name
# Test ALL the patterns in $files and see if at least 1 matches
$files.Where({$FileName -like $_}, 'First').Count -eq 1
}
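As a quick illustration with made-up values (not from the original post), anything with a Name property can be tested, since the block above only looks at $_.Name:
$files = '*.dll', '*.txt'                                         # hypothetical pattern list
[pscustomobject]@{ Name = 'report.txt' } | Where-Object $filter   # passes: '*.txt' matches
[pscustomobject]@{ Name = 'image.png' }  | Where-Object $filter   # dropped: no pattern matches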
Since the $filter block now references $files to get the patterns, we can simplify your loop as:
foreach($line in Get-Content .\List.txt) {
try {
$path,$files = $line.split(' ')
$files = $files.split(',')
}
catch {
$path = $line
$files = "*.*"
}
if([string]::IsNullOrEmpty($files)){
$files = "*.*"
}
$gciParams = @{
Path = $Path
Recurse = $Path.Contains('*')
}
"Searching Path: $path, Pattern(s): [$($files -join ',')]" | Add-Content -Path .\result.txt -Encoding ASCII
Get-ChildItem @gciParams | Where $filter | Select -ExpandProperty FullName | Add-Content -Path .\result.txt -Encoding UTF8
}
Note that we no longer need to re-define $filter every time the loop runs; the condition is based on the value of $files at runtime, so you can define $filter once before entering the loop and then reuse it on every iteration.
The "trick" with using @gciParams (which allows us to remove the big if/else block) is known as splatting, but you could achieve the same result with Get-ChildItem -Path:$Path -Recurse:$Path.Contains('*') :)
I want to filter lines containing a specific word in files, using PowerShell.
For example: the files animal1.txt and animal2.txt. Every file contains the lines:
dog
cat
dog
dog
bird
Then I want to create two derived files:
animal1_bak.txt that stores lines which contains the word 'dog' from animal1.txt
animal2_bak.txt that stores lines which contains the word 'dog' from animal2.txt
What I found on the web is:
Select-String -Path "*.*" -Pattern "dog"
But the step that creates the derived file is missing.
What can I do?
You can use Get-Content first and then Set-Content, like below:
Get-Content -Path E:\KTDocs\Scripts\animal1.txt |
    Where-Object { $_ -like '*dog*' } |
    Set-Content e:\animalbak.txt
Try something like this:
select-string -Path "c:\temp\animal*.txt" -Pattern "dog" | Group Path | %{
$FileName="{0}_bak.txt" -f $_.Name
$_.Group.Line | select -unique | Out-File $FileName -Append
}
$folderpath = "D:\AnimalFolder" # your folder path here
$Allfiles = Get-ChildItem -Path $folderpath -Recurse -File -Force -ErrorAction SilentlyContinue |where{$_.Name -match ".txt"} |Select-Object -expandproperty FullName
foreach($filepath in $allfiles)
{
$Data = get-content $filepath
foreach($line in $data)
{
if($line -match "dog")
{
$newpath = $filepath.split('.')[0]
$getfullpath = $newpath + "_bak.txt"
$line | out-file $getfullpath -append
}
}
}
I need to take a slew of csv files from a directory and get them into an array in Powershell (to eventually manipulate and write back to a CSV).
The problem is there are 5 file types. I need around 8 columns from each. The columns are essentially the same, but have different headings.
Is there an easy way to do this? I started creating a custom object with my 8 fields, looping through the files importing each one, looking at the filename (which tells me the column names I need) and then a bunch of ifs to add it to my custom object array.
I was wondering if there is a simpler way... like a template saying which columns to take from each file.
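Purely as a sketch of that template idea (the $template hashtable and its layout are made up here; the keyword-to-column mapping is lifted from the code further down):
# Hypothetical template: file-name keyword -> product code and source columns
$template = @{
    SAV  = @{ Product = 'SV'; Columns = 'DMBRCH', 'DMACCT', 'DMSHRT' }
    TIME = @{ Product = 'TM'; Columns = 'TMBRCH', 'TMACCT', 'TMSHRT' }
    TRAN = @{ Product = 'TR'; Columns = 'DMBRCH', 'DMACCT', 'DMSHRT' }
    LN   = @{ Product = 'LN'; Columns = 'LNBRCH', 'LNNOTE', 'LNSHRT' }
}
Get-ChildItem c:\scrap\BWFILES -Recurse -Filter *.csv | ForEach-Object {
    $file = $_
    # the first template key found in the file name decides the mapping
    $key = $template.Keys | Where-Object { $file.Name -like "*$_*" } | Select-Object -First 1
    if ($key) {
        $map = $template[$key]
        Import-Csv $file.FullName |
            Select-Object (@(@{ Name = 'PRODUCT'; Expression = { $map.Product } }) + $map.Columns)
    }
}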
I wound up doing this. It may not have been the most efficient, but it works. I wound up writing out each file separately and combining them at the end, as PS really got bogged down (over a million rows combined).
$Newcsv = @()
$path = "c:\scrap\BWFILES\"
$files = gci -path $path -recurse -filter *.csv | Where-Object { ! ($_.psiscontainer) }
$counter=1
foreach($file in $files)
{
$csv = Import-Csv $file.FullName
if ($file.Name -like '*SAV*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"SV"}},DMBRCH,DMACCT,DMSHRT
}
if ($file.Name -like '*TIME*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"TM"}},TMBRCH,TMACCT,TMSHRT
}
if ($file.Name -like '*TRAN*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"TR"}},DMBRCH,DMACCT,DMSHRT
}
if ($file.Name -like '*LN*')
{
$Newcsv = $csv | Select-Object @{Name="PRODUCT";Expression={"LN"}},LNBRCH,LNNOTE,LNSHRT
}
$Newcsv | Export-Csv "C:\scrap\$($file.Name)$counter.csv" -Force -NoTypeInformation
$counter++
}
$getFirstLine = $true
Get-ChildItem "c:\scrap\*.csv" | ForEach-Object {
    $filePath = $_
    $lines = Get-Content $filePath
    $linesToWrite = switch ($getFirstLine) {
        $true  { $lines }
        $false { $lines | Select-Object -Skip 1 }
    }
    $getFirstLine = $false
    Add-Content "c:\scrap\combined.csv" $linesToWrite
}
With a reference hashtable, a little regex matching, and the automatic variable $Matches in a ForEach-Object loop (alias % used), that can all be shortened to:
$path = "c:\scrap\BWFILES\"
$Reference = @{
'SAV' = 'SV'
'TIME' = 'TM'
'TRAN' = 'TR'
'LN'='LN'
}
Set-Content -Value "PRODUCT,BRCH,ACCT,SHRT" -Path 'c:\scrap\combined.csv'
gci -path $path -recurse -filter *.csv | Where-Object { !($_.psiscontainer) -and $_.Name -match ".*(SAV|TIME|TRAN|LN).*"}|%{
$Product = $Reference[($Matches[1])]
Import-CSV $_.FullName | Select-Object @{Name="PRODUCT";Expression={$Product}},*BRCH,@{l='Acct';e={$_.LNNOTE, $_.DMACCT, $_.TMACCT|?{$_}}},*SHRT | ConvertTo-Csv -NoTypeInformation | Select -Skip 1 | Add-Content 'c:\scrap\combined.csv'
}
That should produce the exact same file. The only slightly tricky part was the LNNOTE/TMACCT/DMACCT field, since you obviously can't just wildcard it the way *BRCH and *SHRT work.
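To spell out that Acct calculated property on its own with a made-up row: piping the three candidate columns through the ?{$_} filter keeps whichever one actually has a value, so the same expression works for every file type.
# Made-up row: only one of the three account columns is populated
$row = [pscustomobject]@{ LNNOTE = ''; DMACCT = '12345'; TMACCT = '' }
# Empty strings are dropped by the filter, so Acct ends up as 12345
$row | Select-Object @{ l = 'Acct'; e = { $_.LNNOTE, $_.DMACCT, $_.TMACCT | Where-Object { $_ } } }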