Powershell script to scan a target directory - powershell

I am getting an error when trying to scan a target directory within Powershell.
The code I am using is:
$path = read-host 'Enter the target drive letter'
$objects = get-childitem $path -force -recurse
Essentially, what I want to do is get this general search working (scanning a user-specified location) so that I can then refine it to search for specific items, such as files by file type, size, etc. When this runs, however, I get an error when the path contains a space, such as the 'Documents and Settings' or 'Program Files' folders on the drive.
Is there any way I can do this without getting this error? I am quite new to PowerShell, and I couldn't see this answered anywhere else, but I apologise if it has already been covered.
Update:
It's running on PowerShell v2.0. Thinking about it, the reason I thought it was space-related is that these are the only items which seem to error, but the message itself says:
Get-ChildItem : Access to the path 'C:\Documents and Settings' is denied.
At C:\users\robert\desktop\v1.ps1:5 char:25
+ $objects = get-childitem <<<< $path -force -recurse
    + CategoryInfo          : PermissionDenied: (C:\Documents and Settings:String) [Get-ChildItem], UnauthorizedAccessException
    + FullyQualifiedErrorId : DirUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetChildItemCommand
Sorry if this was misleading. I thought that, since I was running this as an administrator with an unrestricted execution policy, permissions shouldn't be a problem, but if that is what's actually causing it, is there a way to override it so the script will scan everything on the target drive?

Use (note the additional double quotes):
$objects = get-childitem "$path" -force -recurse
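If the goal from your update is simply to keep the scan going past folders that deny access, one option (a sketch, not part of the answer above) is to suppress those errors and collect them for review afterwards:
$objects = get-childitem "$path" -force -recurse -ErrorAction SilentlyContinue -ErrorVariable scanErrors
# $scanErrors now holds the error records for the folders that could not be read; surface them as warnings
$scanErrors | foreach { Write-Warning $_.Exception.Message }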

Vista or Windows 7? Maybe no problem, if it's just stumbling on the "Documents and Settings" junction on the root. The junction links to c:\users and maybe gci doesn't follow the link, to prevent a circular or incomplete search. Here's another way to stumble on this:
at vanilla cmd:
cd 'C:\documents and settings'
explorer .
On Win7 I get a popup error message:
Location is not available
C:\documents and settings is not accessible
Access is denied
There is no error with cd 'C:\documents and settings', and the shell returns a prompt inside that folder (as opposed to the link target, c:\users); running the single command explorer 'C:\documents and settings' launches the My Documents folder without error... but I digress.
We wouldn't want our recursive searches running in endless loops or skipping the folders between 'Documents and Settings' and Users. Maybe Get-ChildItem should handle this in some better way; I don't know.
Meanwhile, you could exclude junction points this way. I haven't tested large recursive searches this way, but they might run longer.
gci "c:\" -force | where {($_.attributes.toString() -like "*ReparsePoint*") -eq $false}
Results do not include 'Documents and Settings' unless we swap $true for $false.
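If you want a full recursive scan that never descends into junctions at all, a rough sketch (untested on large trees; the function name is made up) is to recurse manually and only step into real folders:
function Get-ChildItemSkipJunctions ($root) {
    Get-ChildItem $root -Force -ErrorAction SilentlyContinue | ForEach-Object {
        $_   # emit the item itself
        if ($_.PSIsContainer -and ($_.Attributes.ToString() -notlike "*ReparsePoint*")) {
            Get-ChildItemSkipJunctions $_.FullName   # recurse only into non-junction folders
        }
    }
}
Get-ChildItemSkipJunctions "C:\"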

Related

How do I reference the current logged in user when a script is running?

So, I'm using Desktop Central to run some scripts on a bunch of machines. The script is supposed to open a zip file in the C:\Users\%USERNAME%\ folder and decompress it to a folder of my choosing. The idea is to use a single script for many machines that can leverage the C:\Users\LOGGEDONUSER\Downloads folder (the default Teams download directory). Each user will download the archive from Teams, and a script will decompress and install from each user's Downloads folder.
The issue is that I don't know how to write a script that uses a variable representing the username of the logged-in user for the -Path argument.
For instance:
#Extract file
Expand-archive -path $home\Downloads\SWANDPDM_SP5.1.zip -DestinationPath C:\temp\swpdminstaller\extracted\ -Force
#Define registry values to modify to allow for no UAC
$RegistryPath = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System'
$Name = 'ConsentPromptBehaviorAdmin'
$Value = '0'
#Run reg change
New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force
#Run installer
Invoke-Item C:\temp\swpdminstaller\extracted\SOLIDWORKS_AND_PDM_2021_SP5.1\startswinstall.exe
#Define reg values to change back to default
$RegistryPath = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System'
$Name = 'ConsentPromptBehaviorAdmin'
$Value = '5'
#Run reg change
New-ItemProperty -Path $RegistryPath -Name $Name -Value $Value -PropertyType DWORD -Force
This works great if I copy the script to the machine manually and launch it as a user. It looks at $home and figures out the correct directory based on whoever is logged in.
However, when it runs as Desktop Central, $home doesn't refer to the same location. It comes back with this:
Expand-archive : The path 'C:\Windows\system32\config\systemprofile\Downloads\SWANDPDM_SP5.1.zip' either does not exist or is not a valid file system path.
At C:\Program Files (x86)\DesktopCentral_Agent\Computer\startup\76507\SWandPDMdecomInstall.ps1:2 char:1
+ Expand-archive -path $home\Downloads\SWANDPDM_SP5.1.zip -DestinationP ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidArgument: (C:\Windows\syst...NDPDM_SP5.1.zip:String) [Expand-Archive], InvalidOperationException
    + FullyQualifiedErrorId : ArchiveCmdletPathNotFound,Expand-Archive
I tried using various env variables with no luck. It seems that because it's a "Desktop Central" account that's running the script remotely, I can't get it to point to the correct folder in c:\users\NAMEOFLOGGEDINUSER\.
So it thinks $home is C:\Windows\system32\config\systemprofile\ instead of c:\users\NAMEOFLOGGEDINUSER\.
Is there a way that I can get the username of the currently logged-on user, assign it to a variable, and then use that variable instead of $home? Keep in mind, it needs to find the logged-in user while running the script as the Desktop Central service account. I've tried running the script as various domain admin/system accounts with no luck.
I thought about doing a whoami, writing to a text file, then omitting the domain portion of the output and assigning it to a variable, but there's got to be a better way.
Any help is greatly appreciated!
EDIT: Thought I was on to something, but it didn't work. I tried:
Expand-archive -path $env:HOMEPATH\Downloads\SWANDPDM_SP5.1.zip -DestinationPath C:\temp\swpdminstaller\extracted\ -Force
I see from the comments that you found a workaround. But to answer your original question, you can't get the logged-in username from the usual PowerShell techniques ($env:USERNAME, whoami, etc.) when you're running the script under a different security context.
But you can check who owns the Explorer.exe process:
$User = (Get-CimInstance Win32_Process -Filter "name = 'explorer.exe'" |
Invoke-CimMethod -MethodName GetOwner).User
The "Desktop central" user will probably not have Explorer running. However, if there are multiple users logged in via RDP sessions this will return an array.

Looking for a simple way to test if folder is empty vs inaccessible/access denied

Testing if a folder is empty is pretty simple and has been discussed many times, mostly using Test-Path or Get-ChildItem.
Example: Powershell test if folder empty
I'm working on a script for auditing and correcting file NTFS permissions on SMB shares in my enterprise environment.
I've run into something that would at first appear simple but I'm not able to find a simple solution. I'm wondering if there is a cmdlet that would help.
That is, differentiating between if a given folder is empty or I don't have access to it.
The problem is all the solutions that I see assume that you have access to the folder.
I was thinking that I was going to have to write a greasy function to do multiple tests, catching and testing error codes or something.
But I thought I would ask the smart coders here at Stack Overflow before going down that rabbithole.
Is there a more elegant way to test this? A Test-MyAccess cmdlet or something?
Test-Path returns the same results for empty folders and folders with no access.
Get-ChildItem returns null/empty for both empty folders and folders with no access, except that a no-access folder also produces an error.
PS H:\PowerShell\NTFS> Test-Path $EmptyFolder
True
PS H:\PowerShell\NTFS> Test-Path $EmptyFolder\*
False
PS H:\PowerShell\NTFS> Test-Path $NoAccessFolder
True
PS H:\PowerShell\NTFS> Test-Path $NoAccessFolder\*
False
PS H:\PowerShell\NTFS> Get-ChildItem -Path $EmptyFolder
PS H:\PowerShell\NTFS> Get-ChildItem -Path $NoAccessFolder
Get-ChildItem : The specified network name is no longer available.
At line:1 char:1
+ Get-ChildItem -Path $NoAccessFolder
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ReadError: (\\server.domain....\Sufolder1\Sub3:String) [Get-ChildItem], IOException
......
Thanks to @Lee_Dailey, I seem to have been on the right track to begin with. It looks like the way to do it is to check Get-ChildItem for errors.
Seems like there should be a cmdlet to do this but I guess I'll write a function.
Something like this I imagine.
Function Get-FolderProperties ($Path) {
    $ErrOut = $Null
    $Folder = Get-ChildItem -Path $Path -ErrorAction SilentlyContinue -ErrorVariable ErrOut
    If ($ErrOut) {
        # There was an error accessing $Path (access denied, bad path, etc.); inspect $ErrOut for details
        'Inaccessible'
    } ElseIf (-not $Folder) {
        # No error and nothing returned, so the folder exists but is empty
        'Empty'
    } Else {
        'HasItems'
    }
}
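For example, assuming the completed function above:
PS H:\PowerShell\NTFS> Get-FolderProperties $EmptyFolder
Empty
PS H:\PowerShell\NTFS> Get-FolderProperties $NoAccessFolder
Inaccessible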

Copying from remote servers to local server

I am trying to do an unattended backup of our websites from 2 webservers to our backup server.
$FolderName = $(Get-Date -Format D)
New-Item -ItemType directory -Path D:\backups\webservers\$FolderName
New-Item -ItemType directory -Path D:\backups\webservers\$FolderName\ColoWebP1
New-Item -ItemType directory -Path D:\backups\webservers\$FolderName\ColoWebD1
Copy-Item \\colowebp1.wa.local\e$\websites D:\backups\webservers\$FolderName\ColoWebP1 -recurse
Copy-Item \\colowebp1.wa.local\e$\backup D:\backups\webservers\$FolderName\ColoWebP1 -recurse
Copy-Item \\colowebd1.wa.local\e$\websites D:\backups\webservers\$FolderName\ColoWebD1 -recurse
Copy-Item \\colowebd1.wa.local\e$\backup D:\backups\webservers\$FolderName\ColoWebD1 -recurse
Now, I still have not got this to run unattended. It creates the folders but does not copy the files. And now a new wrinkle has occurred. When I run it manually I receive this error:
Copy-Item : Access to the path 'D:\backups\webservers\Tuesday, February 25, 2014\ColoWebD1\websites\Agent_eVantage_Beta\Master_wSlider.master' is denied.
At C:\scripts\Webserverbackup.ps1:12 char:10
+ Copy-Item <<<< \\colowebd1.wa.local\e$\websites D:\backups\webservers\$FolderName\ColoWebD1 -recurse
    + CategoryInfo          : PermissionDenied: (Master_wSlider.master:FileInfo) [Copy-Item], UnauthorizedAccessException
    + FullyQualifiedErrorId : CopyFileInfoItemUnauthorizedAccessError,Microsoft.PowerShell.Commands.CopyItemCommand
But all the files appear to be there. (I haven't attempted a restore of this yet).
So my questions are:
Am I reading this error right? Is it having trouble authenticating to the server this is running from?
And how do I get this to run unattended?
The problem is with the dollar sign in your Copy-Item paths (i.e. the e$ in \e$\).
PowerShell is interpreting the $ sign as a variable. I would use a shared folder instead of the administrative drive-letter share:
Copy-Item '\\colowebp1.wa.local\Share\websites' "D:\backups\webservers\$FolderName\ColoWebP1" -recurse
You have to set proper permissions to access the admin share. What happens when you access the target path above with Explorer? If everything is set up correctly, you should be able to get into the share without authentication (e.g. with default network credentials). Your solution itself is fine, however, and it will work once authentication is not required. There are workarounds to this in PowerShell, but you would have to provide some details on your network and UAC setup. I will happily attempt to resolve this once you provide the details.
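For example, one possible unattended workaround (only a sketch; the account name and password are placeholders, not from the question) is to authenticate to the admin share with net use before copying:
net use '\\colowebp1.wa.local\e$' 'P@ssw0rd' /user:WA\backupsvc   # placeholder credentials for the admin share
Copy-Item \\colowebp1.wa.local\e$\websites D:\backups\webservers\$FolderName\ColoWebP1 -recurse
net use '\\colowebp1.wa.local\e$' /delete                         # drop the connection afterwards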
At work I use such paths to admin shares and they work perfectly; PowerShell doesn't treat the share as a variable.
Thanks,
Alex

How to limit the files searched by Get-ChildItem (or limit the depth of recursion)?

Background
There is a directory that is automatically populated with MSI files throughout the day. I plan on leveraging Task Scheduler to run the script shown below every 15 minutes. The script will search the directory and copy any new MSIs that have been created in the last 15 minutes to a network share.
Within this folder, C:\ProgramData\flx\Output\<APP-NAME>_<TIME_STAMP>\<APP-NAME>\, there are two other folders: Repackaged and MSI Package. The Repackaged folder does not need to be searched, as it does not contain any MSIs. I have also found that it needs to be excluded in some way to prevent this error:
Get-ChildItem : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:14 char:32
+ $listofFiles=(Get-ChildItem <<<< -Recurse -Path $outputPath -Include "*.msi" -Exclude "*.Context.msi" | where {$_.LastAccessTime -gt $time.AddMinutes($minutes)})
    + CategoryInfo          : ReadError: (C:\ProgramData\...xcellence\Leg 1:String) [Get-ChildItem], PathTooLongException
    + FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Limitations
I am stuck using Powershell v1.0
I have no control over the directory structure of the source location
Updated:
I don't know the app name or what the time stamp will be. That is something else that is out of my control.
Current plans
I have read about using -Filter, and I am aware of filters that are similar to functions, but I wasn't able to come up with any ideas for how to use them here. My only thought at the moment would be to do something like:
$searchList = Get-ChildItem "all instances of the MSI Package folder"
foreach ($folder in $searchList) {
    $listofFiles = Get-ChildItem "search for *.msi"
    foreach ($file in $listofFiles) { "Logic to copy MSI from source to destination" }
}
However...I thought that there might be a more efficient way of doing this.
Questions
How can I limit depth that Get-ChildItem searches?
How can I limit the Get-ChildItem search to C:\ProgramData\flx\Output\<APP-NAME>_<TIME_STAMP>\<APP-NAME>\MSI Package?
How can I only search folders that have been accessed in the last 15 minutes? I don't want to waste time drilling down into folders when I know the MSI has already been copied.
Any additional advice on how to make this script more efficient overall would also be greatly appreciated.
Script
My current script can be found here. I kept getting: "Your post appears to contain code that is not properly formatted as code" and gave up after the fourth time trying to reformat it.
You can try this:
dir C:\ProgramData\flx\Output\*\*\*\*\* -filter *.msi
This searches for all the .msi files at this level:
C:\ProgramData\flx\Output\<APP-NAME>_<TIME_STAMP>\<APP-NAME>\Repackaged (or 'MSI Package', or whatever other folder is present)
without recursion, which avoids the too-deep folders that give you the error.
Pipe the result to:
Where {$_.LastAccessTime -gt (Get-Date).AddMinutes(-15)} # be sure no action is taken on the files before the dir command
or
Where {$_.LastWriteTime -gt (Get-Date).AddMinutes(-15)} # some files may get copied again
With help from C.B., this is my new search, which eliminates the issues I was having:
Changed -Path to C:\ProgramData\flx\Output\*\*\*\* to help limit the depth that was searched.
Used -Filter instead of -Include and put the -Exclude logic into the where clause.
Get-ChildItem -Path C:\ProgramData\flx\Output\*\*\*\* -Filter "*.msi" | where {$_.Name -notlike "*.Context.msi" -and $_.LastAccessTime -gt (Get-Date).AddMinutes(-15)}
You can't limit the recursion depth of Get-ChildItem except by not using -Recurse, i.e. Get-ChildItem is either depth 0 or fully recursive.
Set up variables for app name and timestamp e.g.:
$appName = "foo"
$timestamp = Get-date -Format HHmmss
Get-ChildItem "C:\ProgramData\flx\Output\${appName}_$timestamp\$appName\MSI Package" -force -r
You can filter the results like so:
Get-ChildItem <path> -R | Where {$_.LastWriteTime -gt (Get-Date).AddMinutes(-15)}
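If you do need something in between on v1.0/v2.0 (Get-ChildItem only gained a built-in -Depth parameter much later, in PowerShell 5.0), a rough sketch of manually limited recursion, with made-up function and parameter names, looks like this:
function Get-ChildItemToDepth ($path, [int]$maxDepth, [int]$depth = 0) {
    Get-ChildItem $path -Force | ForEach-Object {
        $_
        if ($_.PSIsContainer -and $depth -lt $maxDepth) {
            Get-ChildItemToDepth $_.FullName $maxDepth ($depth + 1)   # descend one more level
        }
    }
}
# e.g. look at most 3 levels below Output and keep only the MSI files
Get-ChildItemToDepth 'C:\ProgramData\flx\Output' 3 | Where-Object { -not $_.PSIsContainer -and $_.Name -like '*.msi' }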

I need help understanding PowerShell security and file access issues

I'm working with PowerShell, running a script (from my console) that includes this line:
$inpath = "C:\users\xxxxx\path\foo\bar"
and I keep getting this error:
Get-Content : Access to the path 'C:\users\xxxxx\path\foo\bar' is denied.
At C:\users\xxxxx\path\foo\testscript.ps1:53 char:12
+ Get-Content <<<< $txtfile | Get-WordCount -Exclude (Get-Content c:\temp\exclude.txt) | select -First 15
    + CategoryInfo          : PermissionDenied: (C:\users\xxxxx\path\foo\bar:String) [Get-Content], UnauthorizedAccessException
    + FullyQualifiedErrorId : GetContentReaderUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetContentCommand
The scripts and target files are all located on my local drive. I can access the files in Explorer, view/edit/save them using Notepad, and do not have any permission restrictions set. When I'm on the command line, I can run the Get-Content cmdlet successfully on files in my path. I can change directories (PS C:\> cd C:\users\xxxxx\path\foo\bar) and successfully list what's there. Even more interesting, I can duplicate the line that's erroring in the script and NOT receive an error on the command line.
PS C:\users\xxxxx\path\foo> $inpath = "C:\users\xxxxx\path\foo\bar"
PS C:\users\xxxxx\path\foo>
This makes me suspect that the 'Permission Denied' error is actually something else, or something vague enough that I've got no clue how to proceed with troubleshooting. Is it possible for PS to have different permissions than the user under which it's running? Has anyone seen this behavior before, and how did you solve the problem? I'm sure there's a simple solution that I don't know.
Get-Content : Access to the path 'C:\users\xxxxx\path\foo\bar' is denied.
At C:\users\xxxxx\path\foo\testscript.ps1:53 char:12
That path doesn't look like a file; it looks like a folder.
Are you sure you are appending the file name to the folder and passing that to Get-Content?
Windows gives Access Denied when you try and open a directory as if it were a file without passing extra flags; and .NET does not pass those flags (there are a few specific circumstances for opening a folder, but they do not apply here).
Get-Content reads the contents of a file, not a folder. Add \*.* after your folder path, like below:
Get-Content "D:\Logs\*.*" | ?{($_|Select-String "test")}
If you want to go through all the folders underneath it as well, recurse with Get-ChildItem first and pipe the files to Get-Content (Get-Content itself has no -Recurse parameter):
Get-ChildItem "D:\Logs" -Recurse | Where-Object { -not $_.PSIsContainer } | Get-Content | ?{($_|Select-String "test")}
Instead of this (as per your comment):
foreach ($doc in $inpath) { do-function }
try this:
foreach ($doc in (gci $inpath)) { do-function }
You are doing a foreach on a string object instead of your folder items.
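Putting that together with the Get-Content line from the question, a sketch might look like this (the *.txt filter is an assumption; Get-WordCount and exclude.txt come from the original script):
$inpath = "C:\users\xxxxx\path\foo\bar"
foreach ($doc in (gci $inpath -Filter *.txt)) {
    # pass each file, not the folder, to Get-Content
    Get-Content $doc.FullName | Get-WordCount -Exclude (Get-Content c:\temp\exclude.txt) | select -First 15
}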