Get-Acl -LiteralPath throws: illegal character in path - PowerShell

I have an ArrayList of all the folders and files I want to get the ACL of.
Everything runs well in my foreach loop, but for some odd reason it throws:
Get-Acl : Illegal characters
+ CategoryInfo : NotSpecified: (:) [Get-Acl], ArgumentException
+ FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.GetAclCommand
When I run Get-Acl manually with the 'faulty' path, I get the ACL.
My code is:
$ACL = Get-Acl -LiteralPath "$Path" | select -ExpandProperty Access | select IdentityReference
$Result = Compare-Object -ReferenceObject $ACLListAccess -DifferenceObject $ACL -Property Access -PassThru
if ($Result.Count -ne 0)
{
    $ExplicitTest = Get-Acl -LiteralPath "$Path" | select -ExpandProperty Access
    if ($ExplicitTest.IsInherited.Count -ne $ACLListAccessFull.Count -and $ExplicitTest.IsInherited -Like "*False*")
    {
        $WrongFolders.Add($Path) | Out-Null
    }
}
The Path with the 'illegal character' is
\mycompany.com\folders\Algemeen\Reme§ysen

Sorry, I cannot reproduce your problem, and a comment does not allow me to paste this:
D:\temp\a> get-acl .\aaa§bbb
Directory: D:\temp\a
Path Owner Access
---- ----- ------
aaa§bbb adil BUILTIN\Administrators Allow FullControl...
If you copy-paste that character, do you also get 167?
C:\> [int][char]'§'
167
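A quick way to check what is really stored in $Path is to dump every character together with its code point; a character that merely looks like § but was mangled somewhere along the way (for example when it was added to the ArrayList) would show a number other than 167. A minimal sketch, using the path from the question:
$Path = '\mycompany.com\folders\Algemeen\Reme§ysen'   # the path from the question
$Path.ToCharArray() | ForEach-Object { '{0,4}  {1}' -f [int]$_, $_ }   # code point, then the character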

Related

How to avoid an UnauthorizedAccessException when using Get-ChildItem?

I am using PowerShell 5.0 and working on a script to find and list all the versions of log4net.dll under the current directory recursively.
Get-ChildItem log4net.dll -Recurse | % versioninfo | Export-Csv "C:\MyJunk\log4net.csv"
The above statement begins returning version information as expected but execution stops at the first folder I lack permission to access:
Get-ChildItem : The specified network name is no longer available.
At line:1 char:1
+ Get-ChildItem log4net.dll -Recurse | % versioninfo | Export-Csv "C:\M ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ReadError: (J:\ArcPlan_OracleWallet\Production:String) [Get-ChildItem], IOException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Get-ChildItem : Access is denied
At line:1 char:1
+ Get-ChildItem log4net.dll -Recurse | % versioninfo | Export-Csv "C:\M ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Get-ChildItem], UnauthorizedAccessException
+ FullyQualifiedErrorId : System.UnauthorizedAccessException,Microsoft.PowerShell.Commands.GetChildItemCommand
I am running Windows PowerShell ISE as Administrator. ExecutionPolicy is RemoteSigned and $ErrorActionPreference is Continue.
Ideally I would like the script to interrogate each folder's ACL and bypass all folders (and their contents) I lack permission to access. However, another solution would be one in which hard-coded folders are bypassed. Being a novice in PowerShell, I focused on the latter.
I have tried bypassing the first problem folder (by name) to see if I could get that working, but encounter the same exception and processing stops.
Get-ChildItem log4net.dll -Recurse | Where-Object { $_.FullName -notmatch '\\ArcPlan_OracleWallet\\?'} | export-csv 'C:\MyJunk\log4net.csv'
Thanks.
If you want to ignore the errors, use -ErrorAction SilentlyContinue.
There are other useful values for this parameter.
You can also fetch help about this with Get-Help about_CommonParameters.
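Applied to the command from the question, that is a single added parameter (a sketch; everything else is unchanged):
# Same pipeline as before; errors from folders you cannot access are suppressed
Get-ChildItem log4net.dll -Recurse -ErrorAction SilentlyContinue | % versioninfo | Export-Csv "C:\MyJunk\log4net.csv"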
I believe the issue was that the Get-ChildItem log4net.dll -Recurse would fail before the Where-Object could filter out the unwanted directories.
I want to avoid hard-coding directories, but here is my (clunky) solution so far.
## Version information will be retrieved for $fileName
$fileName = 'log4net.dll'
$ErrorActionPreference = 'Continue'

## Get directories - excluding those you lack permission to access
$directories = Get-ChildItem -Directory |
    Where-Object {$_.FullName -inotmatch 'directory-1' -and
                  $_.FullName -inotmatch 'directory-2' -and
                  $_.FullName -inotmatch 'directory-3'
    }

## Array to hold version information
$allFilesVersionInfo = @()

foreach ($directory in $directories) {
    ## Get all files recursively
    $files = Get-ChildItem -Path $directory.FullName $fileName -Recurse
    foreach ($file in $files) {
        ## Get version information and add to array
        $fileVersionInfo = $file | % versioninfo
        $allFilesVersionInfo += $fileVersionInfo
    }
}

# Write version information in array to file
$exportFullPath = "C:\MyJunk\$($fileName)-version.csv"
$allFilesVersionInfo | Export-Csv -Path $($exportFullPath)
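A shorter sketch of the same idea, if hard-coded directory names are to be avoided entirely: let -ErrorAction SilentlyContinue from the answer above skip the folders that cannot be read (same file name and export path as in the script above; adjust as needed).
$fileName = 'log4net.dll'

## Recurse from the current directory; inaccessible folders are skipped silently
Get-ChildItem -Path . -Filter $fileName -Recurse -ErrorAction SilentlyContinue |
    ForEach-Object versioninfo |
    Export-Csv -Path "C:\MyJunk\$($fileName)-version.csv" -NoTypeInformation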

Error while listing directories recursively on C:\ drive

I'm trying to list every file on my Windows system using PowerShell, including DLLs, EXEs, drivers, and so on.
The following gives me a list of files from only one particular folder, "Users":
Get-ChildItem C:\Users -Recurse | Select-Object DirectoryName,Name |Where { $_.DirectoryName -ne $NULL } | Export-CSV C:\Filelist.csv
It lists the files in only the Users folder, and I get an error when I try to list all files under the C: drive using the following command:
Get-ChildItem C:\ -Recurse | Select-Object DirectoryName,Name | Where { $_.DirectoryName -ne $NULL } | Export-CSV C:\Filelist.csv
I receive this error:
ERROR : Get-ChildItem : Access to the path ‘C:\Windows\CSC\v2.0.6’ is denied.
At line:1 char:14
+ Get-Childitem <<<< C:\ -Recurse | Select-Object DirectoryName,Name | Where { $_.DirectoryName -ne $NULL } | Export-CSV C:\Filelist.csv
+ CategoryInfo : PermissionDenied: (C:\Windows\CSC\v2.0.6:String) [Get-ChildItem], UnauthorizedAccessException
+ FullyQualifiedErrorId : DirUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetChildItemCommand
How do I list every file on the entire C: drive, including DLLs, scripts, and EXEs, and export the result to a CSV file?
You have to execute your script as admin. Also, you can omit the error messages for files you don't have access to by using the -ErrorAction common parameter:
Get-ChildItem C:\ -Recurse -ErrorAction SilentlyContinue |
Select-Object DirectoryName,Name |
Where { $_.DirectoryName -ne $NULL } |
Export-CSV C:\Filelist.csv
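If you also want to see which folders were skipped, the -ErrorVariable common parameter can collect the suppressed errors for later inspection (a sketch; gciErrors is just an arbitrary variable name):
Get-ChildItem C:\ -Recurse -ErrorAction SilentlyContinue -ErrorVariable gciErrors |
    Select-Object DirectoryName,Name |
    Where { $_.DirectoryName -ne $NULL } |
    Export-CSV C:\Filelist.csv

# The paths that could not be read
$gciErrors | ForEach-Object { $_.TargetObject }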

Powershell ErrorAction not silent

I've got a PS script that looks for the Office15 folder on computers on our network. For the most part, the script works as intended. In fact, this is just me being picky. I set -ErrorAction SilentlyContinue, but the error messages when the Office15 folder is not found still appear on screen. I'm wondering if I'm doing something wrong or just don't really understand what my script is doing.
$filePath = "\\"+$computer+"\c$\Program Files (x86)\Microsoft Office\"
$listing = Get-ChildItem $filePath | where-object { $_.name -eq "Office15" } | Select-Object Name -ErrorAction SilentlyContinue
With this script as-is, I get errors like the following:
Get-ChildItem : Cannot find path '\\COMPNAME\c$\Program Files (x86)\Microsoft Office\' because it does not exist.
At C:\Users\someGuy\bootTime\checkOffice.ps1:16 char:20
+ $listing = Get-ChildItem $filePath | where-object { $_.name -eq "Office1 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (\\COMPNAME\c$\Pr...crosoft Office\:String) [Get-ChildItem], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.GetChildItemCommand
I pump all of the valid results into a text file, so the other parts of the script work just fine, and I get the expected results otherwise. I'm only really interested in learning what I might be doing wrong here.
You need to pass the error action to gci (Get-ChildItem), not to Select-Object at the end of the pipeline:
$listing = Get-ChildItem $filePath -ErrorAction SilentlyContinue | where-object { $_.name -eq "Office15" } | Select-Object Name
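Alternatively, you can avoid generating the error in the first place by testing the path before enumerating it. A sketch, using the $filePath variable from your script:
if (Test-Path -LiteralPath $filePath) {
    $listing = Get-ChildItem $filePath | where-object { $_.name -eq "Office15" } | Select-Object Name
}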

Remove-Item : Cannot find path 'C:\Windows\system32\Microsoft.PowerShell.Commands.GroupInfo' because it does not exist

I'm trying to execute the following commands to clear files in a temporary directory. If there are multiple files for a particular day, I should keep only the latest file.
$groups = Get-ChildItem -Path "D:\Temp\Archive" -Filter "*_bak.zip" | ?{-not $_.PsIsContainer} | Group {$_.LastWriteTime.ToString("yyyy-MM-dd")}
if ($groups -ne $NULL) {
    ForEach ($files in $groups) {
        "Count: $($files.Count)"
        if ($files.Count -gt 1) {
            $files | Sort LastWriteTime | Select-Object -First ($files.Count - 1) | Remove-Item -Force -WhatIf
        }
    }
}
But I'm getting the following error. I'm executing these commands as an administrator, and the execution policy is set to Unrestricted.
Remove-Item : Cannot find path 'C:\Windows\system32\Microsoft.PowerShell.Commands.GroupInfo' because it does not exist.
At D:\User1\Tasks\Delete_backup_files.ps1:86 char:87
+ $files | Sort LastWriteTime | Select-Object -First ($files.Count - 1) | Remove-Item <<<< -Force -WhatIf
+ CategoryInfo : ObjectNotFound: (C:\Windows\syst...mands.GroupInfo:String) [Remove-Item], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.RemoveItemCommand
The Remove-Item command works fine if I just use it with a UNC path.
$files.FullName doesn't have a value. I suspect this could be the issue. But, I'm not sure how to fix it. I need the grouping logic to stay as is.
I hope someone could help me out here.
Thanks!
You were actually very close. You have to pass the files within the group to the sort command, not the group itself:
$files.Group | Sort LastWriteTime | Select-Object -First ($files.Count - 1) | Remove-Item -Force -WhatIf
The error you get:
Remove-Item : Cannot find path 'C:\WINDOWS\system32\Microsoft.PowerShell.Commands.GroupInfo' because it does not exist.
Happens because Remove-Item requires a string (a path) as input. You are passing a GroupInfo object instead, so PowerShell calls its ToString() method, and Remove-Item interprets the resulting string as a file name in your current location, C:\WINDOWS\system32\.
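For reference, the cleanup from the question with that single change applied looks like this (a sketch; -WhatIf is kept so nothing is actually deleted until you drop it):
$groups = Get-ChildItem -Path "D:\Temp\Archive" -Filter "*_bak.zip" |
    Where-Object { -not $_.PsIsContainer } |
    Group-Object { $_.LastWriteTime.ToString("yyyy-MM-dd") }

foreach ($files in $groups) {
    "Count: $($files.Count)"
    if ($files.Count -gt 1) {
        # Sort oldest first and remove everything except the newest file of the day
        $files.Group |
            Sort-Object LastWriteTime |
            Select-Object -First ($files.Count - 1) |
            Remove-Item -Force -WhatIf
    }
}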

Powershell - Export Multiple CSV's into unique folders

I have been working on a PowerShell script for the better part of a week or two. I've been able to get some parts of it working, but I'm unable to get this fully automated.
I deal with a lot of CSV files on a daily basis. I have been tasked with uploading them into our software, but sometimes they're too large to handle, so I break them down based upon their "type" (a column in the CSV) and export them to a single CSV per "type". I've been able to accomplish this with the following:
$file = gci -Filter "*.csv";
Import-Csv $file `
    | Group-Object -Property "type" `
    | Foreach-Object `
    {
        $path = $_.name + ".csv"; $_.group `
            | Export-Csv -Path $path -NoTypeInformation
    }
So this works wonderfully for each individual CSV. Unfortunately, I don't have the time to run it by hand for every CSV. Now I come to my other PowerShell script:
get-childitem -Filter "*.csv" `
| select-object basename `
| foreach-object{ $path=$_.basename+".csv" #iterate through files.
    if(!(Test-Path -path $_.basename)) #If the folder of the file can't be found then it will attempt to create it.
    {
        New-Item $_.basename -type directory; $file=$_.basename+".csv";
        Import-Csv $file `
        | Group-Object -Property "Type" `
        | Foreach-Object {
            $path=$_.name+".csv"; $_.group `
            | `
            if(!(Test-Path -path $path2))
            {
                New-Item $path2 -type directory
                Export-Csv -Path $path2 + "\" + $path -NoTypeInformation
            }
            else
            {
                "Failed on: " + $_.basename
                #Export-Csv -Path $_.basename + "\" + $path -NoTypeInformation
            }
        }
    }
    else
    {
        Import-Csv $path `
        | Group-Object -Property "Type" `
        | Foreach-Object {$path=$_.basename+".csv" ; $_.group
            if(Test-Path -path $._)
            {
                New-Item $path2 -type directory
                Export-Csv -Path $path2 + "\" + $path -NoTypeInformation
            }
            #else
            #{
            Write-Host "Failed on: $_.basename"
            #Export-Csv -Path $_.basename + "\" + $path -NoTypeInformation
            #}
        }
    }
}
I just can't wrap my head around why this isn't working. I have two conditionals: is there a folder for the CSV? If not, create one. I need the second one because one of the "types" contains a \, which errors out if the folder doesn't exist, so I automatically try to create it. When I run the script I get "Path is null".
The Error is:
The term ' ' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or
if a path was included, verify that the path is correct and try again.
At C:\Users\c.burkinshaw\foldermake.ps1:11 char:26
+ | ` <<<<
+ CategoryInfo : ObjectNotFound: ( :String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
Test-Path : Cannot bind argument to parameter 'Path' because it is null.
At C:\Users\c.burkinshaw\foldermake.ps1:12 char:45
+ if(!(Test-Path -path <<<< $path2))
+ CategoryInfo : InvalidData: (:) [Test-Path], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.TestPathCommand
Any help would be greatly appreciated, if you have questions please don't hesitate to ask.
You have not defined $path2 anywhere, so something like Test-Path -path $path2 will say the path is null. And in one place you are using $._, which will also give errors.
Edit after question updated with error message:
Your error message says the same thing:
Test-Path : Cannot bind argument to parameter 'Path' because it is
null. At C:\Users\c.burkinshaw\foldermake.ps1:12 char:45
+ if(!(Test-Path -path <<<< $path2))
Also the other error is in:
$path=$_.name+".csv"; $_.group `
| `
What are you trying to do here with $_.group?
It is not valid; you cannot pipe $_.group into an if statement.
Other comments:
Why are you using $_.basename and then appending .csv? You could have just used $_.name. Try not to use select-object basename; I don't see the value in it.
Extract the common import-csv and export-csv part into a function.
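A rough sketch of what that refactoring could look like; Split-CsvByType is a hypothetical name, the "Type" column and the one-folder-per-file layout are taken from your description, and the type values are sanitised so a \ in a type no longer produces an invalid path:
function Split-CsvByType {
    param([string]$CsvPath)

    # One output folder per source file, named after the file (without extension)
    $outDir = [System.IO.Path]::GetFileNameWithoutExtension($CsvPath)
    if (!(Test-Path -LiteralPath $outDir)) {
        New-Item -Path $outDir -ItemType Directory | Out-Null
    }

    Import-Csv -Path $CsvPath |
        Group-Object -Property "Type" |
        Foreach-Object {
            # Replace characters that are not valid in file names (e.g. the \ you mentioned)
            $safeName = $_.Name -replace '[\\/:*?"<>|]', '_'
            $_.Group | Export-Csv -Path (Join-Path $outDir "$safeName.csv") -NoTypeInformation
        }
}

# Process every CSV in the current directory
Get-ChildItem -Filter "*.csv" | ForEach-Object { Split-CsvByType $_.FullName }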