Export content of multiple files to a CSV - PowerShell

I'm trying to export the content of multiple .log files into a single CSV.
I think I'm close, but I just can't figure out what I'm doing wrong. I'm guessing it's somewhere in the foreach:
$dir = "\\server\c$\folder1\folder2"
$filter = "*.log"
$files = Get-ChildItem -Path $dir -Filter $filter | Sort-Object LastAccessTime -Descending | Select-Object -First 10
foreach ($file in $files) {
    Get-Content $file | ConvertTo-Csv | Export-Csv -Path "\\server\share\test.csv"
}
I've tried to write the get-content line in so many ways, but none seem to work.
When I do $files.name, it lists the files perfectly.
The error I get from the code is "Cannot find path 'C:\Users\Myname\filename1.log' because it does not exist." I don't understand why, because I never specified the C:\Users path.

You can simply use the Import-Csv cmdlet to load the files instead of the Get-Content and ConvertTo-Csv cmdlets. Its -Path parameter accepts an array of paths, so all ten files are combined into a single output CSV:
$files = Get-ChildItem -Path $dir -Filter $filter |
    Sort-Object LastAccessTime -Descending |
    Select-Object -First 10 |
    Select-Object -ExpandProperty FullName
Import-Csv -Path $files | Export-Csv "\\server\share\test.csv"

Try:
Get-Content -Path $file.FullName
When a file object is passed to Get-Content as a bare argument, it can be converted to just the file name, which PowerShell then resolves against your current directory; that is why the error points at C:\Users\Myname even though you never specified it. The FullName property always carries the complete path.
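Putting it together, a minimal sketch of a corrected loop, assuming the .log files actually contain CSV-formatted text and that you want all ten appended to one output file (Export-Csv -Append requires PowerShell 3.0 or later):
$dir = "\\server\c$\folder1\folder2"
$filter = "*.log"
$files = Get-ChildItem -Path $dir -Filter $filter |
    Sort-Object LastAccessTime -Descending |
    Select-Object -First 10

foreach ($file in $files) {
    # FullName avoids resolving a bare file name against the current directory
    Import-Csv -Path $file.FullName |
        Export-Csv -Path "\\server\share\test.csv" -NoTypeInformation -Append
}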

Related

Need to Import and ConvertTo-Html for a searched file in PowerShell

I have folders containing similar filenames with different time stamps. I can find the latest file in the folder, but I get an error when importing that particular file and converting it to HTML.
Please can someone help me with this?
$dir = "C:\ABC\Reports"
$filter="*.csv"
$latest = Get-ChildItem -Path $dir -Filter $filter | Sort-Object CreationTime -Descending | Select-Object -First 1
$latest.name
$Dailybackups = Import-Csv -Path $latest.name | ConvertTo-Html -Head $Header
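The likely culprit is Import-Csv -Path $latest.name: the Name property holds only the file name, which PowerShell resolves against the current directory rather than C:\ABC\Reports. A minimal sketch of the fix, assuming $Header is defined elsewhere in your script:
$latest = Get-ChildItem -Path $dir -Filter $filter |
    Sort-Object CreationTime -Descending |
    Select-Object -First 1

# FullName resolves correctly regardless of the current directory
$Dailybackups = Import-Csv -Path $latest.FullName | ConvertTo-Html -Head $Header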

Find the oldest file in each subdirectory with PowerShell

My company recently moved to outlook365. We are entirely VDI based, so our user profiles are stored on a single server. As a result, our users all now have 2+ .ost files taking up storage space on the server. I'd like to write a script to find and delete the extraneous .ost files, and to schedule it to run monthly to clean up any orphaned .ost files that occur for any other reason.
I've tried a few different solutions but can't seem to find the right syntax to identify just the oldest/original .ost in each subdirectory; all attempts have identified the oldest file from the whole directory, or all .ost files in the directory.
$Path = "<path>"
$SubFolders = dir $Path -Recurse | Where-Object {$_.PSIsContainer} | ForEach-Object -Process {$_.FullName}
ForEach ($Folder in $SubFolders)
{
$FullFileName = dir $Folder | Where-Object {!$_.PSIsContainer} | Sort-Object {$_.LastWriteTime} -Descending | Select-Object -First 1
}
Inside of your loop, you could use the following to list the .ost file that has the oldest LastWriteTime value. Just add the -Descending flag to Sort-Object to list the newest file.
$FullFileName = foreach ($folder in $Subfolders) {
    Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
        Sort-Object -Property LastWriteTime |
        Select-Object -Property FullName -First 1
}
$FullFileName
If there is only one .ost file in the $folder path, it will still be returned, so you will need logic to avoid deleting anything when there is only one file. Sorting by LastWriteTime also does not guarantee it is the oldest file; you probably want a combination of the oldest CreationTime and the newest LastWriteTime. The following lists the oldest .ost file based on CreationTime:
$FullFileName = foreach ($folder in $Subfolders) {
    Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
        Sort-Object -Property CreationTime |
        Select-Object -Property FullName -First 1
}
$FullFileName
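If you do want to combine the two signals just mentioned, here is a minimal sketch, under the assumption that the file with the oldest CreationTime is the original and should only be flagged when it is not also the most recently written (i.e. active) one:
$FullFileName = foreach ($folder in $Subfolders) {
    $files = Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost"
    if ($files.Count -ge 2) {
        # candidate "original": oldest creation time
        $oldest = $files | Sort-Object CreationTime | Select-Object -First 1
        # candidate "in use": newest write time
        $newest = $files | Sort-Object LastWriteTime -Descending | Select-Object -First 1
        if ($oldest.FullName -ne $newest.FullName) {
            $oldest | Select-Object -Property FullName
        }
    }
}
$FullFileName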
Another issue is setting the $FullFileName variable inside the foreach loop: it is overwritten on each iteration, so after the loop completes it only holds the last value found. Assigning the output of the foreach loop to the variable instead collects all values into an array.
To only output an OST file path when there are multiple OST files, you can do something like the following:
$FullFileName = foreach ($folder in $Subfolders) {
    $files = Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
        Sort-Object -Property LastWriteTime -Descending
    if ($files.Count -ge 2) {
        $files | Select-Object -Property FullName -First 1
    }
}
$FullFileName
This one-liner should do the job, keeping the .ost file with the newest LastWriteTime:
gci -Path $Path -directory | where {(gci -Path $_\*.ost).count -gt 1}|%{gci -Path $_\*.ost|Sort-Object LastWriteTime -Descending|Select-Object -Skip 1|Remove-Item -WhatIf}
Longer variant follows.
$Path = '<path>'
$Ext = '*.ost'
Get-ChildItem -Path $Path -Directory -Recurse |
    Where-Object {(Get-ChildItem -Path "$_\$Ext" -File -EA 0).Count -gt 1} |
    ForEach-Object {
        Get-ChildItem -Path "$_\$Ext" -File -EA 0 | Sort-Object LastWriteTime -Descending |
            Select-Object -Skip 1 | Remove-Item -WhatIf
    }
The first two lines select folders containing more than one .ost file.
The next lines iterate over those folders, sort each folder's .ost files descending by LastWriteTime, skip the first (newest), and pipe the rest to Remove-Item with the -WhatIf parameter, which only shows what would be deleted while testing.
You can of course also move them to a backup location instead.
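A sketch of that variation, swapping Remove-Item for Move-Item; the backup share path is hypothetical, and file names are prefixed with the source folder name to avoid collisions:
$Backup = '\\server\backup\ost'   # hypothetical backup location

Get-ChildItem -Path $Path -Directory -Recurse |
    Where-Object {(Get-ChildItem -Path "$_\$Ext" -File -EA 0).Count -gt 1} |
    ForEach-Object {
        Get-ChildItem -Path "$_\$Ext" -File -EA 0 |
            Sort-Object LastWriteTime -Descending |
            Select-Object -Skip 1 |
            ForEach-Object {
                # prefix with the parent folder name so same-named files don't collide
                Move-Item -Path $_.FullName -Destination (Join-Path $Backup "$($_.Directory.Name)_$($_.Name)") -WhatIf
            }
    }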

Pointing script to latest file

Looking to pull in the latest file for a nightly update loop. The script points to a folder that has several files with the same naming convention but different time stamps.
Example:
File_Test04212019.csv
File_Test04222019.csv
File_Test04232019.csv
File_Test04242019.csv
File_Test04252019.csv
etc.
When I first ran this script it worked fine, but after I edited a few files to update them, to see if it would pull another updated file, it still tried to pull the file it originally pulled. This is the script I used:
$dir = "C:\temp\File_Test*"
$filter = "*.csv"
$latest = Get-ChildItem -Path $dir -Filter $filter |
Sort-Object LastAccessTime -Descending |
Select-Object -First 1
$latest.Name
Import-Csv -Path $dir | ForEach-Object {
This is the error message I get:
Import-Csv : Cannot perform operation because the path resolved to more than
one file. This command cannot operate on multiple files.
At line:7 char:1
+ Import-Csv -Path $dir | ForEach-Object {
Any idea on how this can be resolved?
Thanks to Lee Dailey, below is the answer. The key changes are sorting on LastWriteTime instead of LastAccessTime (last-access timestamps are unreliable on NTFS, where updating them is disabled by default on modern Windows) and importing the resolved $latest file rather than the wildcard $dir path.
$dir = "C:\temp\File_Test*"
$filter = "*.csv"
$latest = Get-ChildItem -Path $dir -Filter $filter |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1
$latest.Name
$outFile = 'C:\temp\empid_log.csv'
Import-Csv -Path $latest.FullName | ForEach-Object {
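The ForEach-Object body is elided above; purely for illustration, a minimal hypothetical completion that just appends each imported row to $outFile (Export-Csv -Append requires PowerShell 3.0 or later) could look like:
Import-Csv -Path $latest.FullName | ForEach-Object {
    # process the row here as needed, then log it (illustrative only)
    $_ | Export-Csv -Path $outFile -NoTypeInformation -Append
}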

List file count by subfolder

I am trying to use powershell to produce a list of folder names and how many files are in each folder.
I have this script
$dir = "C:\Users\folder"
Get-ChildItem $dir -Recurse -Directory | ForEach-Object {
    [pscustomobject]@{
        Folder = $_.FullName
        Count  = @(Get-ChildItem -Path $_.FullName -File).Count
    }
} | Select-Object Folder,Count
Which lists the file count, but it shows the full path (i.e. C:\Users\name\Desktop\1\2\-movi...). Is there any way to display just the last folder ("movies"), as well as save the result to a .txt file?
Thank you
Instead of $_.FullName, use $_.Name to only get the directory name.
Your Select-Object call is redundant - it is effectively a no-op.
While it's easy to send the results to a .txt file with >, for instance, it's better to use a more structured format for later programmatic processing.
In the simplest form, that means outputting to a CSV file via Export-Csv; generally, however, the most faithful way of serializing objects to a file is to use Export-Clixml.
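For instance, assuming the pipeline output below has been captured in a $results variable, a quick sketch of round-tripping it through CLIXML (file name arbitrary):
# serialize the objects with type fidelity
$results | Export-Clixml -Path results.xml

# later, rehydrate them exactly as they were written
$results = Import-Clixml -Path results.xml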
Using Export-Csv for serialization:
$dir = 'C:\Users\folder'
Get-ChildItem -LiteralPath $dir -Recurse -Directory | ForEach-Object {
    [pscustomobject] @{
        Folder = $_.Name
        Count  = @(Get-ChildItem -LiteralPath $_.FullName -File).Count
    }
} | Export-Csv -NoTypeInformation results.csv
Note that you could streamline your command by replacing the ForEach-Object call with a Select-Object call that uses a calculated property:
$dir = 'C:\Users\folder'
Get-ChildItem -LiteralPath $dir -Recurse -Directory |
    Select-Object Name,
        @{ n='Count'; e={@(Get-ChildItem -LiteralPath $_.FullName -File).Count} } |
    Export-Csv -NoTypeInformation results.csv
You mean something like this...
Clear-Host
Get-ChildItem -Path 'd:\temp' -Recurse -Directory |
    Select-Object Name, FullName,
        @{Name='FileCount'; Expression = {(Get-ChildItem -Path $_.FullName -File -Recurse | Measure-Object).Count}} |
    Format-Table -AutoSize
# Results
Name FullName FileCount
---- -------- ---------
abcpath0 D:\temp\abcpath0 5
abcpath1 D:\temp\abcpath1 5
abcpath2 D:\temp\abcpath2 5
Duplicates D:\temp\Duplicates 12677
EmptyFolder D:\temp\EmptyFolder 0
NewFiles D:\temp\NewFiles 4
PngFiles D:\temp\PngFiles 4
results D:\temp\results 905
...
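To save that view to a .txt file instead of the console, as the question asked, one option is to redirect the formatted output; a minimal sketch (output file name arbitrary):
Get-ChildItem -Path 'd:\temp' -Recurse -Directory |
    Select-Object Name,
        @{Name='FileCount'; Expression = {(Get-ChildItem -Path $_.FullName -File -Recurse | Measure-Object).Count}} |
    Format-Table -AutoSize |
    Out-File -FilePath results.txt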

PowerShell - searching for existing files generates empty output

I wish to search for the specific files listed in $searchFiles and pipe their locations to TestFileLocation.csv. However, my current script only generates an empty CSV. What am I missing?
My TestFindFile.csv is of the form:
Name
123.pdf
321.pdf
aaa.pdf
SNIPPET
$searchFiles = Import-CSV 'C:\Data\SCRIPTS\PS1\TestFindFile.csv' -Header ("Name")
$source = 'C:\Data'
ForEach($File in $searchFiles)
{
Get-ChildItem $source -Filter $File -rec | where {!$_.PSIsContainer} | select-object FullName | export-csv -notypeinformation -delimiter '|' -path c:\data\scripts\ps1\TestFileLocation.csv
}
You were overwriting the CSV for each iteration of the loop.
$searchFiles = Import-CSV 'C:\Data\SCRIPTS\PS1\TestFindFile.csv' -Header ("Name")
$source = 'C:\Data'
$outputPath = 'c:\data\scripts\ps1\TestFileLocation.csv'
$searchFiles | ForEach-Object {
    # SilentlyContinue ignores errors such as paths that are too long to read
    Get-ChildItem $source -Filter $_.Name -rec -ErrorAction SilentlyContinue |
        where {!$_.PSIsContainer} | select-object FullName
} | export-csv -notypeinformation -delimiter '|' -path $outputPath
Example using AlphaFS
A comment asked for an example using AlphaFS because it claims to overcome the long path issue. I'm not going into all the details, but here is how I got it to work.
# download and unzip to C:\AlphaFS
# dir C:\AlphaFS\* -Recurse -File | Unblock-File
[System.Reflection.Assembly]::LoadFrom('C:\AlphaFS\lib\net451\AlphaFS.dll')
$searchFiles = Import-CSV 'C:\Data\SCRIPTS\PS1\TestFindFile.csv' -Header ("Name")
$source = 'C:\Data'
$outputPath = 'c:\data\scripts\ps1\TestFileLocation.csv'
$searchFiles | ForEach-Object {
    # search for this specific file name anywhere under $source
    $files = [Alphaleonis.Win32.Filesystem.Directory]::EnumerateFiles($source, $_.Name, [System.IO.SearchOption]::AllDirectories)
    $files | ForEach-Object { [PSCustomObject] @{FileName = $_} }
} | export-csv -notypeinformation -delimiter '|' -path $outputPath
# type $outputPath
If your .csv file contains the header "Name", there is no need to declare it again when running Import-Csv.
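For instance, this is enough, assuming the header row is present in the file:
# the existing "Name" header row supplies the property automatically
$searchFiles = Import-Csv 'C:\Data\SCRIPTS\PS1\TestFindFile.csv'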
The reason the output is empty is that you are searching for an object which contains the property Name (imported from TestFindFile.csv); search for $File.Name instead. Also pull the commands that don't need to repeat outside the loop:
$searchFiles | Select -ExpandProperty Name | % {
    Get-ChildItem $source -Filter $_ -Recurse | where {!$_.PSIsContainer}
} | select-object FullName | export-csv -notypeinformation -delimiter '|' -path c:\data\scripts\ps1\TestFileLocation.csv