Using Where-Object for accessing folders with administrator permission - PowerShell

I would like to scan the Windows folder to find files that were created in a specific date range and export the results to a CSV file. Although I have opened PowerShell as administrator, I still see some "access denied" messages.
PS C:\Windows> Get-ChildItem . -recurse | Where-Object { $_.CreationTime -ge "07/01/2021" -and $_.CreationTime -le "07/31/2021" } | Export-Csv 'e:\scans.csv'
Get-ChildItem : Access to the path 'C:\Windows\CSC' is denied.
...
How can I fix that?

A known trick is to install the Sysinternals suite and use psexec.exe to run your script in a PowerShell session as SYSTEM:
psexec.exe -i -s powershell.exe
Then at least C:\windows\CSC is available.
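With the SYSTEM shell open, the original scan can be re-run there. A minimal sketch, reusing the question's date range and output path; -ErrorAction SilentlyContinue (also suggested in an answer further down) suppresses errors from any folders that even SYSTEM cannot enter, and Select-Object keeps the CSV columns manageable:
Get-ChildItem C:\Windows -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.CreationTime -ge [datetime]'07/01/2021' -and $_.CreationTime -le [datetime]'07/31/2021' } |
    Select-Object FullName, CreationTime, Length |
    Export-Csv 'e:\scans.csv' -NoTypeInformation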

Related

MSIExec via Powershell Install

Find the list of MSI files from a directory and install them on a given PC, remotely or locally. I want to be able to run a script that will install 8 separate MSI files in a given directory one by one. I found this script and think it works, but I feel as if it is missing something, right?
foreach($_msiFiles in
($_msiFiles = Get-ChildItem $_Source -Recurse | Where{$_.Extension -eq ".msi"} |
Where-Object {!($_.psiscontainter)} | Select-Object -ExpandProperty FullName))
{
msiexec /i $_msiFiles /passive
}
It would help you to understand what is going on here. I would write it something like this:
Declare the source directory:
$source = "\\path\to\source\folder"
Put each child .msi object into an array:
$msiFiles = Get-ChildItem $source -File -Recurse | Where-Object {$_.Extension -eq ".msi"}
Iterate the array to run each .msi:
foreach ($msi in $msiFiles) {
    msiexec /i "$($msi.FullName)" /passive
}
This is of course just an explanation of what you are doing. It does not include any error handling, checking of return codes, remote command syntax, etc.
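As a rough illustration of the return-code checking mentioned above (not part of the original answer), one way is to launch msiexec through Start-Process and inspect the exit code; treat this as a sketch only:
foreach ($msi in $msiFiles) {
    # -Wait blocks until the installer finishes; -PassThru returns the process object
    $proc = Start-Process msiexec.exe -ArgumentList "/i `"$($msi.FullName)`" /passive" -Wait -PassThru
    # 0 = success, 3010 = success but a reboot is required
    if ($proc.ExitCode -notin 0, 3010) {
        Write-Warning "$($msi.Name) failed with exit code $($proc.ExitCode)"
    }
}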

Ignore a directory in PowerShell for Get-ChildItem

I am trying to recursively get the files from a folder with multiple sub-folders. I noticed that Windows XP created these 5b....cb\amd64 folders, and they keep throwing an access denied error.
Get-ChildItem: Access to path is denied.
I do not have admin rights in these machines and what I am trying to do is skip these folders.
Is there a way to do this? Here is what I have tried with no success. Is it a good idea to suppress these messages as it does not break the script? Any help will be appreciated.
Get-ChildItem $sourceDir -Recurse | ? { $_.FullName -notlike '*\5b...cb\*' } |
    Where-Object { $_.CreationTime.Date -eq (Get-Date).Date } |
    foreach {
        Write-Host $_
    }
The error you see is thrown by Get-ChildItem, so trying to filter the inaccessible files further down the pipeline won't help. You need to suppress the original error using the ErrorAction common parameter:
Get-ChildItem $sourceDir -Recurse -ErrorAction SilentlyContinue | ...
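Putting the pieces together, a sketch of the full pipeline from the question with the error suppressed (the exclusion pattern and date check are unchanged):
Get-ChildItem $sourceDir -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.FullName -notlike '*\5b...cb\*' } |
    Where-Object { $_.CreationTime.Date -eq (Get-Date).Date } |
    ForEach-Object { Write-Host $_ }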

How to do a data search on remote PCs

I have a script:
get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse > collection.txt
This works great when collecting on a local computer. However, I need to run the same thing on several computers at once. So I tried this in a BAT file:
PSexec @list.txt -u UserID -p Password PowerShell get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse > collection.txt 2>&1
pause
This worked on some remote PCs, but I ran into a couple of problems:
1) The collection.txt file contains all the information with no identification of which piece goes with which computer.
2) When running on a single computer, it sometimes looks like it is running but never finishes, and/or never reports that it has completed or writes to the file.
Is there another way to collect the same data for all users that have logged into the computer? Or am I just not doing it right?
The better approach would be to use PSRemoting rather than PSExec.
$list = "RemoteComputer1","RemoteComputer2"
Invoke-Command -ComputerName $list -ScriptBlock {get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse} | Out-File .\collection.txt
If you need to use PSExec and a BAT file:
PSexec @list.txt -u UserID -p Password PowerShell -command $env:computername; get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse 2>&1 > collection.txt
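With the Invoke-Command approach, each returned object also carries a PSComputerName property, which answers the first problem (knowing which result came from which computer). A minimal sketch building on the example above; the shortened extension list and CSV output here are just for readability:
$list = "RemoteComputer1","RemoteComputer2"
Invoke-Command -ComputerName $list -ScriptBlock {
    Get-ChildItem C:\Users -Include *.mov,*.avi,*.mp4,*.jpg -Recurse
} | Select-Object PSComputerName, FullName, Length |
    Export-Csv .\collection.csv -NoTypeInformation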

Powershell WinSCP won't move it

Here is my script:
$Path = "G:\FTP\APX\APXDropoff\Pay"
$Archive = "G:\FTP\APX\APXDropoff\Pay\Archive"
#$BankOfTulsa = "H:\Documentation\Projects\PJ\BankOfTulsa"
#$compareDate = (Get-Date).AddDays(-1)
$LastFile = Get-ChildItem $Path -Recurse | Where{$_.Name -Match "^CPdb(\d{6})(\d{8}).txt"}; $LastFile
CP $LastFile $Archive
#Call WinSCP Navigate to Incoming\Temp folder for test.
# & 'C:\Program Files (x86)\WinSCP\WinSCP.com' /command "option batch abort" "option confirm off" "open sftp:BankOfTulsa/" "put $LastFile /incoming/arp/"
So here's my issue: I am using a regex to find the file, and CP moves it just fine, but when I go to upload it with WinSCP it says the file doesn't exist.
And it calls it by name, so the variable is there...
Authenticating with pre-entered password.
Authenticated.
Starting the session...
Reading remote directory...
Session started.
Active session: [1] BankOfAmerica
File or folder 'CPdb08131408252014.TXT' does not exist.
System Error. Code: 2.
The system cannot find the file specified
(A)bort, (R)etry, (S)kip, Ski(p) all: Abort
Please help!!
I would think that your issue lies in $LastFile not containing the full path of the file you are trying to upload. I would suggest you use the .FullName property of $LastFile, since you have that from the Get-ChildItem cmdlet.
"put $($LastFile.FullName) /incoming/arp/"
Also, please refrain from using aliases where you can, as some people might not know that CP is an alias for Copy-Item.
Afterthought
$LastFile has the potential to match more than one file. If that is the case, it could make a mess of the rest of the script.
From your comment you can do the following:
Get-ChildItem $Path -Recurse | Where{$_.Name -Match "^CPdb(\d{6})(\d{8}).txt"} |
Sort-Object LastWriteTime -Descending | Select-Object -First 1
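Putting both suggestions together, a rough sketch of how the copy and upload might look once a single newest match is selected and its full path is used (the WinSCP command line itself is taken unchanged from the script in the question):
$LastFile = Get-ChildItem $Path -Recurse | Where-Object { $_.Name -Match "^CPdb(\d{6})(\d{8}).txt" } |
    Sort-Object LastWriteTime -Descending | Select-Object -First 1
Copy-Item $LastFile.FullName $Archive
& 'C:\Program Files (x86)\WinSCP\WinSCP.com' /command "option batch abort" "option confirm off" "open sftp:BankOfTulsa/" "put $($LastFile.FullName) /incoming/arp/"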

Powershell network drive Get-ChildItem issues

Essentially I'm trying to use PowerShell to find files with certain file extensions on a network drive created on or after June 1st of this year. I thought I could do that with the following statement:
Get-ChildItem NETWORKPATH*. -recurse -include .xlsx | Where-Object { $_.CreationTime -ge "06/01/2014" }
I run into 2 problems:
The command only returns files from the root and one folder on the network drive; there are over 100 folders on this network drive.
The command returns 3 out of the 5 files created after 6/1/14, and one created well before my creation-time date.
I have access to all of the folders on the network drive. When I run Windows 7 search it finds all of the files. It doesn't matter if I run PowerShell as administrator or not. It doesn't matter if I run it from my machine (Windows 7) or from one of our 2008 servers. The network drive I'm trying to search through is on a 2003 file server. What am I doing wrong?
Make sure you add a wildcard to your Include parameter. Also, you should not use strings for date comparison; build a DateTime object instead. Try the following:
$testDate = new-object DateTime (2014,06,01)
Get-ChildItem NETWORKPATH*. -recurse -include *.xlsx | Where-Object { $_.CreationTime -ge $testDate }
Also note that files and folders marked as hidden will not show up unless you add a -force to the get-childitem. Not sure if that is part of the issue or not.
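For completeness, here is the same command with -Force added, in case hidden or system items are part of the problem (this is just the command above plus the extra switch):
$testDate = New-Object DateTime (2014,06,01)
Get-ChildItem NETWORKPATH*. -Recurse -Include *.xlsx -Force | Where-Object { $_.CreationTime -ge $testDate }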
gci -path PATH -recurse | where {$_.extension -match "xlsx"} was the silver bullet to all of this.
This is what I use.
$Extensions = '*.xlsx','*.csv','*.xls'
$path = 'Network path'
Get-ChildItem "$path" -Include $Extensions -Recurse -Force | where {$_.CreationTime -gt
[datetime]"10/05/2018"} | Select * | Export-Csv -Path C:\TestExcelfiles.csv -
NoTypeInformation | fl * #format-wide