Error while listing directories recursively on C:\ drive - PowerShell

I'm trying to list every file, including DLLs, EXEs, drivers, etc., on my Windows system using PowerShell.
The following command gives me a list of files only for the "Users" folder:
Get-ChildItem C:\Users -Recurse | Select-Object DirectoryName,Name |Where { $_.DirectoryName -ne $NULL } | Export-CSV C:\Filelist.csv
It lists the files in only the Users folder, and I get an error when I try to list all files under the C drive using the following command:
Get-ChildItem C:\ -Recurse | Select-Object DirectoryName,Name | Where { $_.DirectoryName -ne $NULL } | Export-CSV C:\Filelist.csv
I receive this error:
ERROR : Get-ChildItem : Access to the path 'C:\Windows\CSC\v2.0.6' is denied.
At line:1 char:14
+ Get-Childitem <<<< C:\ -Recurse | Select-Object DirectoryName,Name | Where { $_.DirectoryName -ne $NULL } | Export-CSV C:\Filelist.csv
+ CategoryInfo : PermissionDenied: (C:\Windows\CSC\v2.0.6:String) [Get-ChildItem], UnauthorizedAccessException
+ FullyQualifiedErrorId : DirUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetChildItemCommand
How do I list the entire C drive, or all files on my system including DLLs, scripts, and EXEs, and export the result to a CSV file?

You have to run your script as administrator. You can also suppress the error messages for files you don't have access to by using the -ErrorAction common parameter:
Get-ChildItem C:\ -Recurse -ErrorAction SilentlyContinue |
Select-Object DirectoryName,Name |
Where { $_.DirectoryName -ne $NULL } |
Export-CSV C:\Filelist.csv
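If the goal really is every file on the drive, note that Get-ChildItem skips hidden and system items by default; adding -Force would include those as well (an assumption about your goal, not part of the original command):
# -Force includes hidden and system items; -NoTypeInformation drops the "#TYPE" header line from the CSV
Get-ChildItem C:\ -Recurse -Force -ErrorAction SilentlyContinue |
    Select-Object DirectoryName, Name |
    Where-Object { $_.DirectoryName -ne $null } |
    Export-Csv C:\Filelist.csv -NoTypeInformation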

Related

Catching paths of inaccessible folders from Get-ChildItem

I am working on a small script to capture file hashes on a running system. I only have PowerShell available.
This is the active part of the code:
get-childitem -path $path -filter $filename -Recurse -Force | Select FullName | foreach-object { get-filehash $_.fullname | select * }
This is the command I am testing with:
./Get-FileHashesRecursive.ps1 -path c:\ -filename *.txt
When running the script I get a series of errors because certain folders are inaccessible. I'd like to record the paths of those folders so the user has a record on completion of what failed.
The error looks like this in a console window:
get-childitem : Access to the path 'C:\$Recycle.Bin\S-1-5-21-4167544967-4010527683-3770225279-9182' is denied.
At E:\git\Get-RemoteFileHashesRecursive\Get-FileHashesRecursive.ps1:14 char:9
+ get-childitem -path $path -filter $filename -Recurse -Force | ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (C:\$Recycle.Bin...3770225279-9182:String) [Get-ChildItem], UnauthorizedAccessException
+ FullyQualifiedErrorId : DirUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetChildItemCommand
Is there a way I can grab the path or the entire first line of the error WITHOUT stopping the rest of the script from running?
As requested, here are my earlier comments as an answer:
Get-ChildItem -Path $Path -Filter $Filename -File -Recurse -Force -ErrorVariable FailedItems -ErrorAction SilentlyContinue | ForEach-Object { Get-FileHash -Path $_.FullName | Select-Object * }
$FailedItems | Foreach-Object {$_.CategoryInfo.TargetName} | Out-File "C:\Users\sailingbikeruk\Desktop\noaccess.log"
I have added the -File parameter to Get-ChildItem, because you are dealing specifically with files.
I also added the -ErrorVariable and -ErrorAction parameters to the Get-ChildItem command. -ErrorVariable FailedItems defines a custom variable that stores the errors the command produces during processing. -ErrorAction SilentlyContinue tells the script to continue without notifying you of the errors.
Once the command has finished processing, you can parse the content of the $FailedItems variable. In the example above, I've written each error's TargetName to a file so that you can read it at your leisure (adjust the file path and name as needed).
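If you want to see what each entry in $FailedItems looks like before deciding what to log, the entries are ordinary ErrorRecord objects and can be inspected in the console (a minimal sketch; run it after the Get-ChildItem line above has finished):
$FailedItems.Count                        # how many items could not be read
$FailedItems[0] | Get-Member              # list the ErrorRecord's members
$FailedItems[0].CategoryInfo.TargetName   # the path that failed
$FailedItems[0].Exception.Message         # the full error message text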

Catching errors and handling blocked files in PowerShell

The history of this question lies in an earlier question I asked ("Catching paths of inaccessible folders from Get-ChildItem", above).
I am running this command to get the file hashes of all files in a given location, but I need to capture any that are missed or inaccessible.
Get-ChildItem -Path $Path -Filter $Filename -File -Recurse -Force -ErrorVariable FailedItems -ErrorAction SilentlyContinue | ForEach-Object { Get-FileHash -Path $_.FullName | Select-Object * }
$FailedItems | Foreach-Object {$_.CategoryInfo.TargetName} | Out-File "C:\Users\sailingbikeruk\Desktop\noaccess.log"
In the earlier question I thought that I just needed to catch folders, and the accepted answer did capture any folder access-denied messages, but the command doesn't capture individual files that are inaccessible. The suggested answer (using -ErrorVariable) doesn't appear to record the paths of these.
I am not clear why -ErrorVariable catches the path from this error:
get-childitem : Access to the path 'C:\$Recycle.Bin\S-1-5-21-4167544967-4010527683-3770225279-9182' is denied.
At E:\git\Get-RemoteFileHashesRecursive\Get-FileHashesRecursive.ps1:14 char:9
+ get-childitem -path $path -filter $filename -Recurse -Force | ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (C:\$Recycle.Bin...3770225279-9182:String) [Get-ChildItem], UnauthorizedAccessException
+ FullyQualifiedErrorId : DirUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetChildItemCommand
but not from this one:
Get-FileHash : The file 'E:\devices.csv' cannot be read: The process cannot access the file
'E:\devices.csv' because it is being used by another process.
At E:\Scripts\Ian\git\Get-RemoteFileHashesRecursive\Get-FileHashesRecursive.ps1:25 char:132
+ ... FailedItems | ForEach-Object { Get-FileHash -Path $_.FullName | Selec ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ReadError: (E:\devices.csv:PSObject) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : FileReadError,Get-FileHash
In this example I am writing $_.CategoryInfo.TargetName to the error log, but I have also tried writing $_.TargetObject and get the same result.
The common parameters -ErrorVariable and -ErrorAction apply to a single command only, so you have to add them to Get-FileHash too:
Get-ChildItem -Path $Path -Filter $Filename -File -Recurse -Force -ErrorVariable FailedItems -ErrorAction SilentlyContinue |
    ForEach-Object {
        Get-FileHash -Path $_.FullName -ErrorVariable +FailedItems -ErrorAction SilentlyContinue | Select-Object *
    }
$FailedItems | ForEach-Object { $_.CategoryInfo.TargetName } | Out-File "C:\Users\sailingbikeruk\Desktop\noaccess.log"
Note that I have inserted + in front of the error variable name for Get-FileHash to prevent it from clearing any errors produced by Get-ChildItem. See about_CommonParameters.
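The append behaviour of the + prefix is easy to see in isolation, for example with two deliberately non-existent paths (a throwaway illustration, not part of the script):
Get-Item 'C:\DoesNotExist1' -ErrorVariable MyErrors  -ErrorAction SilentlyContinue
Get-Item 'C:\DoesNotExist2' -ErrorVariable +MyErrors -ErrorAction SilentlyContinue
$MyErrors.Count   # 2 - the second call appended to the variable instead of overwriting it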
Unrelated improvements:
You can remove ForEach-Object and just pipe Get-ChildItem directly into Get-FileHash (it accepts the file objects from the pipeline). Also, Select-Object * is superfluous.
Get-ChildItem -Path $Path -Filter $Filename -File -Recurse -Force -ErrorVariable FailedItems -ErrorAction SilentlyContinue |
Get-FileHash -ErrorVariable +FailedItems -ErrorAction SilentlyContinue

How to avoid an UnauthorizedAccessException when using Get-ChildItem?

I am using PowerShell 5.0 and working on a script to find and list all the versions of log4net.dll under the current directory recursively.
Get-ChildItem log4net.dll -Recurse | % versioninfo | Export-Csv "C:\MyJunk\log4net.csv"
The above statement begins returning version information as expected but execution stops at the first folder I lack permission to access:
Get-ChildItem : The specified network name is no longer available.
At line:1 char:1
+ Get-ChildItem log4net.dll -Recurse | % versioninfo | Export-Csv "C:\M ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ReadError: (J:\ArcPlan_OracleWallet\Production:String) [Get-ChildItem], IOException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Get-ChildItem : Access is denied
At line:1 char:1
+ Get-ChildItem log4net.dll -Recurse | % versioninfo | Export-Csv "C:\M ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Get-ChildItem], UnauthorizedAccessException
+ FullyQualifiedErrorId : System.UnauthorizedAccessException,Microsoft.PowerShell.Commands.GetChildItemCommand
I am running Windows PowerShell ISE as Administrator. The execution policy is RemoteSigned and $ErrorActionPreference is Continue.
Ideally I would like the script to interrogate each folder's ACL and bypass all folders (and their contents) I lack permission to access. However, another solution would be one in which hard-coded folders are bypassed. Being a novice in PowerShell, I focused on the latter.
I have tried bypassing the first problem folder (by name) to see if I could get that working, but I encounter the same exception and processing stops.
Get-ChildItem log4net.dll -Recurse | Where-Object { $_.FullName -notmatch '\\ArcPlan_OracleWallet\\?'} | export-csv 'C:\MyJunk\log4net.csv'
Thanks.
If you want to ignore the errors, use -ErrorAction SilentlyContinue.
There are other useful values for this parameter; you can read about them with Get-Help about_CommonParameters.
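Applied to the command from the question, that would look like this (same output path as before):
Get-ChildItem log4net.dll -Recurse -ErrorAction SilentlyContinue | % versioninfo | Export-Csv "C:\MyJunk\log4net.csv"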
I believe the issue was that Get-ChildItem log4net.dll -Recurse would fail before Where-Object could filter out the unwanted directories.
I want to avoid hard-coding directories, but here is my (clunky) solution so far.
## Version information will be retrieved for $fileName
$fileName = 'log4net.dll'
$ErrorActionPreference = 'Continue'

## Get directories - excluding those you lack permission to access
$directories = Get-ChildItem -Directory |
    Where-Object { $_.FullName -inotmatch 'directory-1' -and
                   $_.FullName -inotmatch 'directory-2' -and
                   $_.FullName -inotmatch 'directory-3'
    }

## Array to hold version information
$allFilesVersionInfo = @()

foreach ($directory in $directories) {
    ## Get all matching files recursively
    $files = Get-ChildItem -Path $directory.FullName -Filter $fileName -Recurse

    foreach ($file in $files) {
        ## Get version information and add it to the array
        $fileVersionInfo = $file | % versioninfo
        $allFilesVersionInfo += $fileVersionInfo
    }
}

## Write the version information in the array to a file
$exportFullPath = "C:\MyJunk\$($fileName)-version.csv"
$allFilesVersionInfo | Export-Csv -Path $exportFullPath

Powershell ErrorAction not silent

I've got a PS script that looks for the Office15 folder on computers on our network. For the most part, the script works as intended. In fact, this is just me being picky. I set -ErrorAction SilentlyContinue, but the error messages when the Office15 folder is not found still appear on screen. I'm wondering if I'm doing something wrong or just don't really understand what my script is doing.
$filePath = "\\"+$computer+"\c$\Program Files (x86)\Microsoft Office\"
$listing = Get-ChildItem $filePath | where-object { $_.name -eq "Office15" } | Select-Object Name -ErrorAction SilentlyContinue
With this script as-is, I get errors like the following:
Get-ChildItem : Cannot find path '\\COMPNAME\c$\Program Files (x86)\Microsoft Office\' because it does not exist.
At C:\Users\someGuy\bootTime\checkOffice.ps1:16 char:20
+ $listing = Get-ChildItem $filePath | where-object { $_.name -eq "Office1 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (\\COMPNAME\c$\Pr...crosoft Office\:String) [Get-ChildItem], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.GetChildItemCommand
I pump all of the valid results into a text file, so the other parts of the script work just fine, and I get the expected results otherwise. I'm only really interested in learning what I might be doing wrong here.
The -ErrorAction parameter only affects the command it is attached to; here the error is raised by Get-ChildItem, not Select-Object, so you need to pass the error action to gci:
$listing = Get-ChildItem $filePath -ErrorAction SilentlyContinue | where-object { $_.name -eq "Office15" } | Select-Object Name
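If you would rather skip machines where the Office folder doesn't exist at all instead of suppressing the error, a Test-Path check before the call would also work (a sketch, not part of the original answer):
if (Test-Path $filePath) {
    # Only query the folder when the remote path actually exists
    $listing = Get-ChildItem $filePath | Where-Object { $_.Name -eq "Office15" } | Select-Object Name
}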

Remove-Item : Cannot find path 'C:\Windows\system32\Microsoft.PowerShell.Commands.GroupInfo' because it does not exist

I'm trying to execute the following commands to clear out files in a temporary directory. If there are multiple files for a particular day, I should keep only the latest file.
$groups = get-ChildItem -Path "D:\Temp\Archive" -Filter "*_bak.zip" | ?{-not $_.PsIsContainer} | Group {$_.LastWriteTime.ToString("yyyy-MM-dd")}
if ($groups -ne $NULL) {
    ForEach ($files in $groups) {
        "Count: $($files.Count)"
        if ($files.Count -gt 1) {
            $files | Sort LastWriteTime | Select-Object -First ($files.Count - 1) | Remove-Item -Force -WhatIf
        }
    }
}
But I'm getting the following error. I'm executing these commands as an administrator, and the execution policy is set to Unrestricted.
Remove-Item : Cannot find path 'C:\Windows\system32\Microsoft.PowerShell.Commands.GroupInfo' because it does not exist.
At D:\User1\Tasks\Delete_backup_files.ps1:86 char:87
+ $files | Sort LastWriteTime | Select-Object -First ($files.Count - 1) | Remove-Item <<<< -Force -WhatIf
+ CategoryInfo : ObjectNotFound: (C:\Windows\syst...mands.GroupInfo:String) [Remove-Item], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.RemoveItemCommand
The Remove-Item command works fine if I just use it with a UNC path.
$files.FullName doesn't have a value, which I suspect could be the issue, but I'm not sure how to fix it. I need the grouping logic to stay as is.
I hope someone could help me out here.
Thanks!
You were actually very close. You have to pass the files within the group to the sort command, not the group itself:
$files.Group | Sort LastWriteTime | Select-Object -First ($files.Count - 1) | Remove-Item -Force -WhatIf
The error you get:
Remove-Item : Cannot find path 'C:\WINDOWS\system32\Microsoft.PowerShell.Commands.GroupInfo' because it does not exist.
happens because Remove-Item expects a path (a string) as input. You are passing a GroupInfo object instead, so PowerShell calls its ToString() method, and Remove-Item interprets the resulting string as a file name relative to your current location, C:\WINDOWS\system32\.
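Putting the fix back into the script from the question, the loop would look like this (a sketch based on the original code; -WhatIf is kept so nothing is deleted until you are happy with the output):
$groups = Get-ChildItem -Path "D:\Temp\Archive" -Filter "*_bak.zip" |
    Where-Object { -not $_.PsIsContainer } |
    Group-Object { $_.LastWriteTime.ToString("yyyy-MM-dd") }

foreach ($files in $groups) {
    "Count: $($files.Count)"
    if ($files.Count -gt 1) {
        # $files.Group contains the actual FileInfo objects in this date group
        $files.Group | Sort-Object LastWriteTime |
            Select-Object -First ($files.Count - 1) |
            Remove-Item -Force -WhatIf
    }
}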