Check if files exist using PowerShell - powershell

I have a Powershell script which I've cobbled together. It uses an external file as a lookup then checks the LastWriteTime of those files.
This was created as a checking procedure, to ensure a set of files had been updated each day.
However, I've noticed that if the files don't exist at run time, they don't show in the list at all. So there's potential for these to be missed.
As well as checking the LastWriteTime, is there a way this can be altered to highlight in some way if any of the files don't exist?
Either a new column saying Exists Y/N?
Or even a total row count VS expected row count?
This is what I've got so far...
#Filelist - This is a simple txt file with various filepaths entered
$filelist = Get-Content "H:\PowerShell\Files_Location_List.txt"
$results = foreach ($file in $filelist) {
    Get-Item $file | select -Property fullname, LastWriteTime #| Measure-Object
}
$results | Export-Csv 'H:\PowerShell\File_Modified_Dates.csv' -NoTypeInformation #| Measure-Object
The contents of Files_Location_List.txt is very simple...
\\server\folder\file1.csv
\\server\folder\file2.csv
\\server\folder\file3.csv
etc

You can try using Test-Path:
if (Test-Path -Path <file_path>) {
    # do stuff
}
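Applied to your loop, it could look something like this (a rough sketch reusing your file list and output path, with the Exists column reported as Y/N as you suggested):
$filelist = Get-Content "H:\PowerShell\Files_Location_List.txt"
$results = foreach ($file in $filelist) {
    if (Test-Path -Path $file) {
        # file exists: report its details plus an Exists flag
        Get-Item $file | Select-Object FullName, LastWriteTime, @{Name = 'Exists'; Expression = {'Y'}}
    }
    else {
        # file is missing: still emit a row so it shows up in the CSV
        [PSCustomObject]@{ FullName = $file; LastWriteTime = $null; Exists = 'N' }
    }
}
$results | Export-Csv 'H:\PowerShell\File_Modified_Dates.csv' -NoTypeInformation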

You can also use -ErrorAction SilentlyContinue on Get-Item to either get a FileInfo object if the file exists or $null if not:
# Filelist - This is a simple txt file with various filepaths entered
$result = Get-Content "H:\PowerShell\Files_Location_List.txt" | ForEach-Object {
    # try and get the FileInfo object for this path. Discard errors
    $file = Get-Item -Path $_ -ErrorAction SilentlyContinue
    if ($file) {
        $file | Select-Object -Property FullName, LastWriteTime, @{Name = 'Exists'; Expression = {$true}}
    }
    else {
        [PSCustomObject]@{
            FullName      = $_
            LastWriteTime = $null
            Exists        = $false
        }
    }
}
$result | Export-Csv 'H:\PowerShell\File_Modified_Dates.csv' -NoTypeInformation
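If you also want the total-vs-expected count mentioned in the question, a couple of lines on top of $result would do (a small sketch reusing the variables above, not part of the original answer):
$expected = @(Get-Content "H:\PowerShell\Files_Location_List.txt").Count
$found    = @($result | Where-Object Exists).Count
Write-Host "$found of $expected expected files were found"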

Related

Grabbing text from a txt file into the clipboard whenever the file is changed

I have a server I connect to that does not allow copy and paste. There is a shared drive I can access from both machines. I wrote a script on the local side to dump the clipboard into a text file any time I copy something. I'm having trouble reversing it on the remote side. I'm trying to monitor the file and, if the LastWriteTime changes, grab the text and dump it into the remote clipboard. However, it seems that no matter what I try, it's WRITING the LastWriteTime instead of READING it. Here's my code:
$copypath = "sharedrive\copy.txt"
$lastModifiedDate = Get-Item $copypath | select -Property LastWriteTime
for()
{
    #$dateA = $lastModifiedDate
    $dateB = Get-ChildItem -Path $copypath $_.LastWriteTime
    if ($dateA -ne $dateB) {
        get-content $copypath|set-clipboard
        $lastModifiedDate = (Get-Item $copypath).LastWriteTime
    }
}
I tried Get-Item $copypath | select -Property LastWriteTime and Get-ChildItem -Path $copypath $_.LastWriteTime, as well as LastAccessTime.
In your code, you are trying to access the LastWriteTime in three different ways, but only the third one is correct. Use that instead:
$copypath = "sharedrive\copy.txt"
$lastDate = (Get-Item $copypath).LastWriteTime
for() {
    $currentDate = (Get-Item $copypath).LastWriteTime
    if ($currentDate -ne $lastDate) {
        Get-Content $copypath | Set-Clipboard
        $lastDate = $currentDate
    }
}
An alternative with Select-Object would be:
Get-Item $copypath | Select-Object -ExpandProperty LastWriteTime
Additional note: Your infinite loop is very expensive. Add a sleep time or use filesystem watchers instead.
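A filesystem-watcher version could look roughly like this (a sketch, untested against your share; it assumes $copypath points at an existing file and that Set-Clipboard is available):
$copypath = "sharedrive\copy.txt"
$watcher  = New-Object System.IO.FileSystemWatcher
$watcher.Path   = (Get-Item $copypath).DirectoryName
$watcher.Filter = Split-Path $copypath -Leaf
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Changed -SourceIdentifier CopyFileChanged | Out-Null
while ($true) {
    # block until the file changes instead of polling in a tight loop
    $null = Wait-Event -SourceIdentifier CopyFileChanged
    Remove-Event -SourceIdentifier CopyFileChanged
    Get-Content $copypath | Set-Clipboard
}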
You can use the .Refresh() method from FileInfo if you keep the original instance instead of using Select-Object:
# store the FileInfo
$file = Get-Item $copypath
# store its LastWriteTime
$date = $file.LastWriteTime
# infinite loop
for() {
    # refresh the instance
    $file.Refresh()
    # if the dates are not equal
    if($date -ne $file.LastWriteTime) {
        # read the file and set clipboard
        $file | Get-Content -Raw | Set-Clipboard
        # update the new LastWriteTime
        $date = $file.LastWriteTime
    }
    # should use sleep here
    Start-Sleep -Milliseconds 300
}

How to format PowerShell results from ForEach

I am trying to complete a request for the number of files in a directory and the total size of that directory. I've come up with this:
Get-Content -Path C:\Users\$USERNAME%\Documents\list.txt |
Foreach-Object {
    cd $_
    Write-Host $_
    (Get-ChildItem -Recurse -File | Measure-Object).Count
    (ls -r|measure -sum Length).Sum
}
The txt file has contents such as:
\\directory\on\network\1
\\directory\on\network\also
Ultimately I need this in a spreadsheet, but I am failing with the formatting. As is, it outputs straight to the PowerShell console, so with thousands of directories this isn't ideal. I've tried exporting to CSV, but it overwrites the CSV with each result, and when I tried setting the function equal to a variable array and then exporting that, it simply output a blank file.
Any assistance with this is appreciated.
In order to export to CSV you will need an object with properties; your code generates a few values without any structure. Surely the % in your sample code is a typo; it definitely doesn't belong there. It is generally considered bad practice to use aliases in scripts, but at a minimum you should keep them consistent: on one line you use Get-ChildItem/Measure-Object and on the next ls/measure. Regardless, you don't show your export, so it's hard to help with what we can't see. You also don't need to CD into the directory; if anything, it would only slow the script down.
The easiest way I know to create an object is to use the [PSCustomObject] type accelerator.
$infile = "C:\Users\$USERNAME\Documents\list.txt"
$outfile = "C:\some\path\to.csv"
Get-Content -Path $infile |
    Foreach-Object {
        Write-Host Processing $_
        [PSCustomObject]@{
            Path  = $_
            Total = (Get-ChildItem $_ -Recurse -File | Measure-Object).Count
            Size  = (Get-ChildItem $_ -Recurse -File | Measure-Object -Sum Length).Sum
        }
    } | Export-Csv $outfile -NoTypeInformation
Edit
We should've run the Get-ChildItem call once and then pulled the info out of it. The first option runs in "pipeline" mode, which can save on memory but might be slower. The second puts everything in memory first, so it can be much quicker if the data set isn't too large.
Get-Content -Path $infile |
    Foreach-Object {
        Write-Host Processing $_
        $files = Get-ChildItem $_ -Recurse -File | Measure-Object -Sum Length
        [PSCustomObject]@{
            Path  = $_
            Total = $files.Count
            Size  = $files.Sum
        }
    } | Export-Csv $outfile -NoTypeInformation
or
$results = foreach ($folder in Get-Content -Path $infile) {
    Write-Host Processing $folder
    $files = Get-ChildItem $folder -Recurse -File | Measure-Object -Sum Length
    [PSCustomObject]@{
        Path  = $folder
        Total = $files.Count
        Size  = $files.Sum
    }
}
$results | Export-Csv $outfile -NoTypeInformation
The -Append switch of Export-Csv allows you to add to an existing file rather than overwriting it.
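For example, with -Append (available in PowerShell 3.0 and later) you could export each record as soon as it is produced instead of collecting everything first (a small illustration based on the loop above, not part of the original answer):
Get-Content -Path $infile |
    ForEach-Object {
        $files = Get-ChildItem $_ -Recurse -File | Measure-Object -Sum Length
        [PSCustomObject]@{ Path = $_; Total = $files.Count; Size = $files.Sum } |
            Export-Csv $outfile -NoTypeInformation -Append
    }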

Write all running processes to a text file in PowerShell

The purpose of this code is to get a list of all used executables from a specific folder. After a month we will delete any exe's not on this list.
I currently get the correct results using this:
while ($true) {
    foreach ($process in Get-Process | where {$_.Path -imatch 'ksv'} | select -Unique) {
        $dir = $process | Get-ChildItem;
        New-Object -TypeName PSObject -Property @{
            'Path' = $process.Path;
        } | Out-String | Add-Content -LiteralPath Z:\processList.txt
    }
    Get-Content Z:\processList.txt | sort | Get-Unique > Z:\uniqueprocesslist.txt
}
I'm going to get rid of the while loop as this will eventually be running as a service.
The problem with this is that it creates a huge list in processlist.txt that I would like to eliminate to save space.
I tried to come up with a better solution that scans the text file to see if the path is already written before adding the new process path. I am not sure what I am doing wrong, but nothing is ever written to the text file:
while ($true) {
    foreach ($process in Get-Process | where {$_.Path -imatch 'ksv'} | select -Unique) {
        $dir = $process | Get-ChildItem;
        $progPath = New-Object -TypeName PSObject -Property @{
            'Path' = $process.Path
        }
        $file = Get-Content "Z:\processList.txt"
        $containsLine = $file | %{$_ -match $progPath}
        if ($containsLine -contains $false) {
            Add-Content -LiteralPath Z:\processList.txt
        }
    }
}
If I understand your question correctly, you want to build a "recently used" list of executables in a specific directory in a file, and update that (unique) list with each run of your script.
Something like this should do that:
$listfile = 'Z:\processlist.txt'

# Build a dictionary from known paths, so that we can check for already known
# paths with an index lookup instead of a linear search over an array.
$list = @{}
if (Test-Path -LiteralPath $listfile) {
    Get-Content $listfile | ForEach-Object {
        $list[$_] = $true
    }
}

# List processes, expand their path, then check if the path contains the
# string "ksv" and isn't already known. Append the results to the list file.
Get-Process |
    Select-Object -Expand Path |
    Sort-Object -Unique |
    Where-Object {$_ -like '*ksv*' -and -not $list.ContainsKey($_)} |
    Add-Content $listfile
Hashtable lookup and wildcard match are used for performance reasons, because they're significantly faster than linear searches in arrays and regular expression matches.
while ($true) {
    $file = Get-Content "Z:\processList.txt"
    $KSVPaths = Get-Process |
        Where-Object {$_.Path -imatch 'ksv'} |
        Select-Object -ExpandProperty Path |
        Select-Object -Unique
    ForEach ($KSVPath in $KSVPaths) {
        if ($KSVPath -notin $file) {
            # append to the list file itself, not to the content held in $file
            Add-Content -Path "Z:\processList.txt" -Value $KSVPath
        }
    }
}
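As a rough way to see the difference between the hashtable lookup and a linear array search described above, you could time both yourself (an ad-hoc sketch with made-up paths, not part of either answer):
# build 100,000 fake paths as an array and as hashtable keys
$paths = 1..100000 | ForEach-Object { "C:\fake\path\$($_).exe" }
$table = @{}
foreach ($p in $paths) { $table[$p] = $true }

# linear search over the array
(Measure-Command { $paths -contains 'C:\fake\path\99999.exe' }).TotalMilliseconds
# hashtable key lookup
(Measure-Command { $table.ContainsKey('C:\fake\path\99999.exe') }).TotalMilliseconds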

PowerShell Test If a Filename From a List Exists Somewhere In a Directory and Export Missing

I've researched this and haven't been able to come up with a solid solution. Basically, I have a separate hard drive containing thousands of music files. I have a CSV list with the names of all the files that should be on the hard drive. Example:
My List
I want to be able to test if each of the files on my list exists on the hard drive, and if not, export it to a separate "missing files" list. The thing is, the files on the hard drive are nested under various subfolders.
As my script is now, I am trying to test whether the path exists by using Join-Path. Here is my code right now - it's returning all of the files in the directory instead of just the missing files:
$documents = 'C:\Users\Me\Documents\ScriptTest'
$CSVexport = 'C:\Users\Me\Documents\ScriptTest\TestResults.csv'
$obj = @()
Write-host "`n_____Results____`n" #Write the results and export to a CSV file
$NewCsv = Import-CSV -Path 'C:\Users\Me\Documents\ScriptTest\Test.csv' |
    Select-Object ID,'File Path' |
    ForEach-Object {
        if (!(Test-Path (Join-Path $documents $_.'File Path'))){
            write-host "`n$($_.'File Path') is missing from the folder.`n"
            $ObjectProperties = @{
                ID = $_.ID
                'File Path' = $_.'File Path'
            }
            $obj += New-Object PSObject -Property $ObjectProperties
        }
    }
$obj | export-csv $CSVexport -NoTypeInformation
How do I account for the sub-directories that vary with each file?
Edit - Resolved
$myFolder = 'C:\Users\Me\Documents\ScriptTest'
$CSVexport = 'C:\Users\Me\Documents\ScriptTest\Results.csv'
$csvPath = 'C:\Users\Me\Documents\ScriptTest\Test.csv'
$FileList = Get-ChildItem $myFolder -Recurse *.wav | Select-Object -ExpandProperty Name -Unique
Import-CSV -Path $csvPath |
    Where-Object {$FileList -notcontains $_.'File Path'} |
    Export-Csv $CSVexport -NoTypeInformation
You could generate a list of filenames from the recursed folders, then check if the file is in that list.
$documents = 'C:\Users\Me\Documents\ScriptTest'
$CSVexport = 'C:\Users\Me\Documents\ScriptTest\TestResults.csv'
$FileList = Get-ChildItem $documents -Recurse |
    Where-Object { -not $_.PSIsContainer } |
    Select-Object -ExpandProperty Name -Unique
Import-CSV -Path 'C:\Users\Me\Documents\ScriptTest\Test.csv' |
    Where-Object {$FileList -notcontains $_.File} |
    Export-CSV $CSVexport -NoTypeInformation
Edit: Answer updated to work with PowerShell 2.0 with suggestions from Bruce Payette and mklement0
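On PowerShell 3.0 or later, the PSIsContainer filter could be shortened with the -File switch (a minor variation, not from the original answer):
$FileList = Get-ChildItem $documents -Recurse -File |
    Select-Object -ExpandProperty Name -Unique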

Scan the properties of files on an Excel sheet by PowerShell

I have a task that requires scanning the properties of all the files stored under certain directories. I need my code to read the following line of information, separated by the delimiter ",", from a .txt file (the directory is made up on my own device, and I created some blank .xlsx files to test my code):
Jakarta,C:\temp\Hfolder,C:\temp\Lfolder
I currently have code that looks like this:
$LocContent = Import-Csv "C:\temp\Location.txt" # -Header $fileHeaders
ForEach($line in $LocContent){C:\temp\test1.csv -NoTypeInformation
    #split fields into values
    $line = $LocContent -split (",")
    $country = $line[0]
    $hDrivePath = $line[1]
    $lDrivePath = $line[2]
    Get-ChildItem $hDrivePath -force -include *.xlsx, *.accdb, *.accde, *.accdt, *.accdr -Recurse
    Get-ChildItem $lDrivePath -force -include *.xlsx, *.accdb, *.accde, *.accdt, *.accdr -Recurse
    ? {
        $_.LastWriteTime -gt (Get-Date).AddDays(-5)
    }
    Select-Object -Property Name, Directory, @{Name="Owner";Expression={(Get-ACL $_.Fullname).Owner}}, CreationTime, LastAccessTime, @{N="Location";E={$country}}, @{N='size in MB';E={$_.Length/1024kb}} | Export-Csv
}
However, there is no output in the .csv file I assigned to receive the information. What is wrong with my code?
Thanks!
There are several flaws within your code:
- The Select-Object has neither an -InputObject nor anything piped to it, so there can't be any output.
- You should decide whether you treat C:\temp\Location.txt as
  - a text file with Get-Content and a split,
  - or as a csv with headers,
  - or without headers, supplying them to the import.
- The Get-ChildItem output isn't piped anywhere nor stored in a variable, so it goes to the screen.
- Export-Csv needs a file name to export to.
Try this untested script:
## Q:\Test\2018\06\26\SO_51038180.ps1
$fileHeaders = @('country','hDrivePath','lDrivePath')
$extensions  = @('*.xlsx','*.accdb','*.accde','*.accdt','*.accdr')

$LocContent = Import-Csv "C:\temp\Location.txt" -Header $fileHeaders

$NewData = ForEach($Row in $LocContent){
    Get-ChildItem $Row.hDrivePath,$Row.lDrivePath -Force -Include $extensions -Recurse |
        Where-Object LastWriteTime -gt (Get-Date).AddDays(-5) |
        Select-Object -Property Name,
            Directory,
            @{Name="Owner";Expression={(Get-ACL $_.Fullname).Owner}},
            CreationTime,
            LastAccessTime,
            @{N="Location";E={$Row.country}},
            @{N='size in MB';E={$_.Length/1024kb}}
}

# you choose what to do with the result; uncomment the desired line
$NewData | Format-Table -Auto
# $NewData | Out-Gridview
# $NewData | Export-Csv '.\NewData.csv' -NoTypeInformation