PowerShell script executed on each file in a folder?

I currently have a PowerShell script which prints out some information regarding the file passed in as an argument.
The command for executing the script is done as such:
.\myscript.ps1 -accessitem C:\folder
I want to apply the script to all files and folders on the C: drive. Is it possible to use a loop to list all files and pass each path as an argument to the script?
The script:
[CmdletBinding()]
Param (
    [Parameter(Mandatory=$True,Position=0)]
    [String]$AccessItem
)
$ErrorActionPreference = "SilentlyContinue"
If ($Error) {
    $Error.Clear()
}
$RepPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$RepPath = $RepPath.Trim()
$str = $AccessItem -replace ':',''
$str = $str -replace '[\\/]','.'  # chain from $str (not $AccessItem) so the first -replace is not discarded; handle both path separators
$FinalReport = "$RepPath\"+$str+".csv"
$ReportFile1 = "$RepPath\NTFSPermission_Report.txt"
If (!(Test-Path $AccessItem)) {
    Write-Host
    Write-Host "`t Item $AccessItem Not Found." -ForegroundColor "Yellow"
    Write-Host
}
Else {
    If (Test-Path $FinalReport) {
        Remove-Item $FinalReport
    }
    If (Test-Path $ReportFile1) {
        Remove-Item $ReportFile1
    }
    Write-Host
    Write-Host "`t Working. Please wait ... " -ForegroundColor "Yellow"
    Write-Host
    ## -- Create The Report File
    $ObjFSO = New-Object -ComObject Scripting.FileSystemObject
    $ObjFile = $ObjFSO.CreateTextFile($ReportFile1, $True)
    $ObjFile.Write("NTFS Permission Set On -- $AccessItem `r`n")
    $ObjFile.Close()
    $ObjFile = $ObjFSO.CreateTextFile($FinalReport, $True)
    $ObjFile.Close()
    [System.Runtime.Interopservices.Marshal]::ReleaseComObject($ObjFSO) | Out-Null
    Remove-Variable ObjFile
    Remove-Variable ObjFSO
    If ((Get-Item $AccessItem).PSIsContainer -EQ $True) {
        $Result = "ItemType -- Folder"
    }
    Else {
        $Result = "ItemType -- File"
    }
    $DT = Get-Date -Format F
    Add-Content $ReportFile1 -Value ("Report Created As On $DT")
    Add-Content $ReportFile1 "=================================================================="
    $Owner = (Get-Item -LiteralPath $AccessItem).GetAccessControl() | Select Owner
    $Owner = $($Owner.Owner)
    $Result = "$Result `t Owner -- $Owner"
    Add-Content $ReportFile1 "$Result `n"
    (Get-Item -LiteralPath $AccessItem).GetAccessControl() | Select * -Expand Access | Select IdentityReference, FileSystemRights, AccessControlType, IsInherited, InheritanceFlags, PropagationFlags | Export-CSV -Path "$RepPath\NTFSPermission_Report2.csv" -NoTypeInformation
    Add-Content $FinalReport -Value (Get-Content $ReportFile1)
    Add-Content $FinalReport -Value (Get-Content "$RepPath\NTFSPermission_Report2.csv")
    Remove-Item $ReportFile1
    Remove-Item "$RepPath\NTFSPermission_Report2.csv"
    Invoke-Item $FinalReport
}
If ($Error) {
    $Error.Clear()
}
I would prefer an outside command doing this, as the workings of the script should not be altered; it is used for single-file testing.

There are two ways to do this:
Add -Recurse Flag to the script
Run the script on each directory
I'm going with option two since the script looks complicated enough that I don't want to touch it.
$path_to_script = "C:\path\to\myscript.ps1"
$start_directory = "C:\folder"
# Call Script on Parent Directory
& "$path_to_script" -AccessItem "$start_directory"
# Call Script on any Child Directories within the "$start_directory"
foreach($child in (ls "$start_directory" -Recurse -Directory))
{
$path = $child.FullName
& "$path_to_script" -AccessItem "$path"
}
Basically, I'm calling the script on the parent directory and any sub-directories within the parent directory.

Related

For loop with list not working as expected Powershell

I'm trying to create a PowerShell script which checks a log file for lines of text and, if the line exists, restarts a service and resets/archives the log. I got it working before with one "checkstring", if you will, but I've been struggling to get it to work with a list of strings. Could anyone help me figure out where I'm going wrong?
This is the code I'm currently using:
$serviceName = "MySQL80"
$file = "test.txt"
$pwd = "C:\tmp\"
$checkStrings = New-Object System.Collections.ArrayList
# Add amount of checkstrings (Out-Null suppresses the index that ArrayList.Add returns)
$checkStrings.Add("Unhandled error. Error message: Error retrieving response.") | Out-Null
$checkStrings.Add("Unhandled error. Error message: Error retrieving response. Second") | Out-Null
$logName = "ServiceCheck.log"
$backupFolder = "Archive"
$date = (Get-Date).ToString("ddMMyyyyHHmmss")
$logString = $date, " - The service has been reset and the log moved to backup" -Join ""
Set-Location -Path $pwd
if (Test-Path -Path $file) {
    if (-not (Test-Path -Path $backupFolder)) {
        New-Item -Path $pwd -Name $backupFolder -ItemType "Directory"
    }
    $fileContent = Get-Content $file  # read the log once, before the loop
    foreach ($element in $checkStrings) {
        $containsWord = $fileContent | ForEach-Object { $_ -match $element }
        if ($containsWord -contains $true) {
            Restart-Service -Name $serviceName
            $backupPath = $pwd, "\", $backupFolder, "\", $date, ".log" -join ""
            $currentFile = $pwd, "\", $file -join ""
            Copy-Item $currentFile -Destination $backupPath
            # use $element here (not an undefined $checkString); parentheses force the
            # file to be fully read before Out-File re-opens it for writing
            (Get-Content $currentFile) | Select-String -Pattern $element -NotMatch | Out-File $currentFile
            if (Test-Path -Path $logName) {
                Add-Content $logName $logString
            } else {
                $logString | Out-File -FilePath $logName
            }
        }
    }
}

Powershell delete files older than 30 days and log

I'm using this code to delete files older than 30 days
Function Remove_FilesCreatedBeforeDate {
$Path = "\\servername\path"
$Date = (Get-Date).AddDays(-30)
$ValidPath = Test-Path $Path -IsValid
If ($ValidPath -eq $True) {
"Path is OK and Cleanup is now running"
Get-ChildItem -Path $path -Recurse | Where-Object { $_.LastWriteTime -lt $Date } | Remove-Item -Recurse -force -Verbose
}
Else {
"Path is not a ValidPath"
}
}
Remove_FilesCreatedBeforeDate
Now I want to log which files were deleted, and also whether there was an error or the path isn't valid. Can anyone help me here?
//EDIT
I'm now using this code (thanks to Efie for helping):
function Write-MyLog {
    [CmdletBinding()]
    param(
        [Parameter()]$LogPath = 'C:\Admin\scripts\Clean_Folder\Log\log.txt',
        [Parameter(ValueFromPipeline)]$Message
    )
    process {
        $timeStampedMessage = "[$(Get-Date -Format 's')] $Message"
        $timeStampedMessage | Out-File -FilePath $LogPath -Append
    }
}
Function Remove-FilesCreatedBeforeDate {
    [CmdletBinding()]
    param(
        [Parameter()]$Path = '\\servername\path\',
        [Parameter()]$Date = $(Get-Date).AddDays(-30)
    )
    process {
        if (-not (Test-Path $Path -IsValid)) {
            "Path $Path was invalid" | Write-MyLog
            return
        }
        "Path $Path is OK and Cleanup is now running" | Write-MyLog
        try {
            Get-ChildItem -Path $Path -Recurse |
                Where-Object {
                    $_.LastWriteTime -lt $Date
                } | Remove-Item -Recurse -Force -Verbose | Write-MyLog
        }
        catch {
            "Remove-Item failed with message $($_.Exception.Message)" | Write-MyLog
        }
    }
}
Write-MyLog
Remove-FilesCreatedBeforeDate
Two files are getting deleted, but I just see this in my log:
[2021-07-22T16:27:53] Path \\servername\path\ is OK and Cleanup is now running
Sadly, I don't see which files are getting deleted.
A simple implementation for your example would be something like this:
Function Remove-FilesCreatedBeforeDate {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]$Path = '\some\default\path',
        [Parameter()]$Date = $(Get-Date).AddDays(-30)
    )
    process {
        if (-not (Test-Path $Path -IsValid)) {
            "Path $Path was invalid" | Write-MyLog
            return
        }
        "Path $Path is OK and Cleanup is now running" | Write-MyLog
        try {
            Get-ChildItem -Path $Path -Recurse |
                Where-Object {
                    $_.LastWriteTime -lt $Date
                } | Remove-Item -Recurse -Force -Verbose
        }
        catch {
            "Remove-Item failed with message $($_.Exception.Message)" | Write-MyLog
        }
    }
}
function Write-MyLog {
    [CmdletBinding()]
    param(
        [Parameter()]$LogPath = 'default\log\path\log.txt',
        [Parameter(ValueFromPipeline)]$Message
    )
    process {
        $timeStampedMessage = "[$(Get-Date -Format 's')] $Message"
        $timeStampedMessage | Out-File -FilePath $LogPath -Append
    }
}
Some notes:
Advanced Functions
process { }, [CmdletBinding()], and [Parameter()] are what turn your function into an 'advanced' function. You get to use loads of built-in features normally reserved for compiled cmdlets this way.
For example, you can now suppress errors by passing -ErrorAction SilentlyContinue to your function, like you're used to doing with native PowerShell cmdlets.
You can pipe your messages to your logging function by adding ValueFromPipeline to your parameter.
Those really just brush the surface of the extra capabilities you get.
Here is some information. I would recommend getting in the habit of writing them like this if you plan to use them in the future.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_functions_advanced?view=powershell-7.1
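As a minimal sketch of those pieces working together (the function name Test-MyPipeline is made up for illustration), an advanced function that accepts pipeline input looks like this:

```powershell
function Test-MyPipeline {
    [CmdletBinding()]
    param(
        # ValueFromPipeline lets callers pipe values straight into this parameter
        [Parameter(ValueFromPipeline)]
        [string]$Message
    )
    process {
        # the process { } block runs once per pipeline item
        "Received: $Message"
    }
}

'one', 'two' | Test-MyPipeline
# Outputs:
# Received: one
# Received: two
```

Because of [CmdletBinding()], the function also picks up the common parameters (-Verbose, -ErrorAction, and so on) for free.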
Error Handling
I'd recommend looking into this documentation by Microsoft on error handling:
https://learn.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-exceptions?view=powershell-7.1
Naming Conventions
I would also recommend taking a look at this about PowerShell function naming conventions:
https://learn.microsoft.com/en-us/powershell/scripting/developer/cmdlet/approved-verbs-for-windows-powershell-commands?view=powershell-7
By PowerShell standards it would make more sense to name your function Remove-FilesCreatedBeforeDate with the dash separating verb-action instead of the underscore.
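You can check the approved-verb list directly from PowerShell itself with the built-in Get-Verb cmdlet, for example:

```powershell
# 'Remove' is on the approved list, so Remove-FilesCreatedBeforeDate is a compliant name
Get-Verb | Where-Object { $_.Verb -eq 'Remove' }
```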
Logging
If you want a little more control and a few more features for logging your functions, here is some information on a tried and true solution for PowerShell using PSFramework:
https://adamtheautomator.com/powershell-logging/
Good luck! Hope some of that helps.
In Unix it's simple:
find /var/log/hive -type f -mtime +30 -delete
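For comparison, a rough PowerShell counterpart of that one-liner would be something like the sketch below (using the share path from the question; add -WhatIf to Remove-Item to preview before deleting anything):

```powershell
# Delete files (not folders) whose last write time is older than 30 days,
# roughly matching find's -type f -mtime +30 -delete
Get-ChildItem -Path '\\servername\path' -File -Recurse |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Force
```

Note that find's -mtime keys off modification time, which corresponds to LastWriteTime in PowerShell.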
Could Start-Transcript with try and catch be your solution here?
Start-Transcript logs everything that you do, including the errors.
I tried this and it does what you want:
Start-Transcript -Path "$PSScriptRoot\RemoveAccountLog.txt" -Force -Append
Get-Date -Format "yyyy-MM-dd HH:mm"  # MM = month, mm = minute
Try
{ # Start Try
    $Path = "\\servername\path"
    $Date = (Get-Date).AddDays(-30)
    $TestPath = Test-Path -Path $Path -PathType Container
    If ( -not $TestPath )  # Test-Path returns $false (not $null) when the path is missing
    { # Start If
        Write-Host "Path $Path is not valid"
    } # End If
    Else
    { # Start Else
        Write-Host "Path is OK and Cleanup is now running... 0%"
        $GetFiles = Get-ChildItem -Path $Path -Recurse -Force |
            Where-Object { $_.LastWriteTime -lt $Date } |
            Remove-Item -Recurse -Force -Verbose
        Write-Host "Path is OK and Cleanup is now running... 100%" -ForegroundColor Green
    } # End Else
} # End Try
Catch
{ # Start Catch
    Write-Warning -Message "## ERROR## "
    Write-Warning -Message "## Script could not start ## "
    Write-Warning $Error[0]
} # End Catch

powershell set-acl for multiple computers provided by a csv

I want to change the permissions of a folder on multiple PCs, provided in a CSV file. The CSV doesn't have a header, just the computer names.
The problem is that it does not import the PC names. I can't use a txt file.
$scriptpath = $MyInvocation.MyCommand.Path
$dir = Split-Path $scriptpath
$file = import-csv -path "$dir\pc.csv"
foreach($pc in $file) {
try {
$acl = get-acl -path "\\$pc\c$\Program Files (x86)\testfolder"
$new = "users","full","ContainerInherit,ObjectInherit","None","Allow"
$accessRule = new-object System.Security.AccessControl.FileSystemAccessRule $new
$acl.SetAccessRule($accessRule)
$acl | Set-Acl "\\$pc\c$\Program Files (x86)\testfolder"
Write-Output $([string](get-date) + "`t $pc success") | out-file -append -filepath "$dir\acl_success.log"
}
catch {Write-Output $([string](get-date) + "`t $pc failed") | out-file -append -filepath "$dir\acl_failed.log"
}
}
Is it possible to use Invoke-Command to set the folder ACL using the provided CSV file?
You say "i can't use a txt file", but what you describe IS a txt file. No header, just pc names each on its own line.
Also, your script would greatly benefit if you would indent the code.
Try
$scriptpath = $MyInvocation.MyCommand.Path
$dir = Split-Path $scriptpath
# set ErrorAction to 'Stop' in order to catch errors
$oldErrorAction = $ErrorActionPreference
$ErrorActionPreference = 'Stop'
Get-Content -Path "$dir\pc.csv" | ForEach-Object {
    $pc = $_  # capture this for when we hit the catch block
    $path = "\\$pc\c$\Program Files (x86)\testfolder"
    try {
        $acl = Get-Acl -LiteralPath $path
        $accessRule = [System.Security.AccessControl.FileSystemAccessRule]::new('Users', 'FullControl', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
        $acl.SetAccessRule($accessRule)
        $acl | Set-Acl -LiteralPath $path
        # write to both the console and to file
        ("{0}`t{1} success" -f (Get-Date).ToString(), $pc) | Add-Content -LiteralPath "$dir\acl_success.log" -PassThru
    }
    catch {
        ("{0}`t{1} failed" -f (Get-Date).ToString(), $pc) | Add-Content -LiteralPath "$dir\acl_failed.log" -PassThru
    }
}
# restore previous ErrorAction
$ErrorActionPreference = $oldErrorAction
Using Invoke-Command:
$scriptpath = $MyInvocation.MyCommand.Path
$dir = Split-Path $scriptpath
$script = {
    # set ErrorAction to 'Stop' in order to catch errors
    $oldErrorAction = $ErrorActionPreference
    $ErrorActionPreference = 'Stop'
    # you're now running this on the remote pc, so use the local path
    $path = "C:\Program Files (x86)\testfolder"
    try {
        $acl = Get-Acl -LiteralPath $path
        $accessRule = [System.Security.AccessControl.FileSystemAccessRule]::new('Users', 'FullControl', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
        $acl.SetAccessRule($accessRule)
        $acl | Set-Acl -LiteralPath $path
        # output the message
        "{0}`t{1} success" -f (Get-Date).ToString(), $env:COMPUTERNAME
    }
    catch {
        "{0}`t{1} failed" -f (Get-Date).ToString(), $env:COMPUTERNAME
    }
    # restore previous ErrorAction
    $ErrorActionPreference = $oldErrorAction
}
$allPCs = @(Get-Content -Path "$dir\pc.csv") # force it to be an array, even if there is just one pc name in the file
# run the scriptblock on the $allPCs array
$result = Invoke-Command -ComputerName $allPCs -ScriptBlock $script
$result | Where-Object { $_ -match 'success' } | Add-Content -Path "$dir\acl_success.log" -PassThru
$result | Where-Object { $_ -match 'failed' } | Add-Content -Path "$dir\acl_failed.log" -PassThru
Instead of this:
$file = import-csv -path "$dir\pc.csv"
Use:
$file = get-content "$dir\pc.csv"

check if a Deny permission already exists to a directory or not in PowerShell

I have been writing a script
Workflow:
Get a list of all fixed disks (excluding CD-ROM drives, floppy drives, and USB drives)
Check whether a path exists in PowerShell
Check whether a Deny permission already exists on a directory in PowerShell
Set a Deny permission for write access for users
My questions are:
1- After the file-exists check below, I also want to check whether a Deny permission already exists on the directory ("$drive\usr\local\ssl").
If (!(Test-Path $path)) {
    New-Item -ItemType Directory -Force -Path $path
}
2- There are about 1000 machines. How can I improve this script?
Thanks in advance,
script :
$computers = import-csv -path "c:\scripts\machines.csv"
Foreach ($computer in $computers) {
    $drives = Get-WmiObject Win32_Volume -ComputerName $computer.ComputerName | Where { $_.drivetype -eq '3' } | Select-Object -ExpandProperty driveletter | sort-object
    foreach ($drive in $drives) {
        $path = "$drive\usr\local\ssl"
        $principal = "users"
        $Right = "Write"
        $rule = new-object System.Security.AccessControl.FileSystemAccessRule($Principal, $Right, "Deny")
        If (!(test-path $path))
        {
            New-Item -ItemType Directory -Force -Path $path
        }
        try
        {
            $acl = get-acl $folder
            $acl.SetAccessRule($rule)
            set-acl $folder $acl
        }
        catch
        {
            write-host "ACL failed to be set on: " $folder
        }
        #### Add-NTFSAccess -Path <path> -Account <accountname> -AccessType Deny -AccessRights <rightstodeny>
    }
}
The first thing I noticed is that in your code, you suddenly use an undefined variable $folder instead of $path.
Also, you get the drives from the remote computer, but set this $path (and try to add a Deny rule) on folders on your local machine:
$path = "$drive\usr\local\ssl"
where you should set that to the folder on the remote computer:
$path = '\\{0}\{1}$\usr\local\ssl' -f $computer, $drive.Substring(0,1)
Then, instead of Get-WmiObject, I would nowadays use Get-CimInstance, which should give you some speed improvement as well, and I would add some basic logging so you will know later what happened.
Try this on a small set of computers first:
Note This is assuming you have permissions to modify permissions on the folders of all these machines.
$computers = Import-Csv -Path "c:\scripts\machines.csv"
# assuming your CSV has a column named 'ComputerName'
$log = foreach ($computer in $computers.ComputerName) {
    # first try and get the list of harddisks for this computer
    try {
        $drives = Get-CimInstance -ClassName Win32_Volume -ComputerName $computer -ErrorAction Stop |
            Where-Object { $_.drivetype -eq '3' } | Select-Object -ExpandProperty driveletter | Sort-Object
    }
    catch {
        $msg = "ERROR: Could not get Drives on '$computer'"
        Write-Host $msg -ForegroundColor Red
        # output a line for the log
        $msg
        continue # skip this one and proceed on to the next computer
    }
    foreach ($drive in $drives) {
        $path = '\\{0}\{1}$\usr\local\ssl' -f $computer, $drive.Substring(0,1)
        $principal = "users"
        $Right = "Write"
        if (!(Test-Path -Path $path -PathType Container)) {
            $null = New-Item -Path $path -ItemType Directory -Force
        }
        # test if the path already has a Deny on write for the principal
        $acl = Get-Acl -Path $path -ErrorAction SilentlyContinue
        if (!$acl) {
            $msg = "ERROR: Could not get ACL on '$path'"
            Write-Host $msg -ForegroundColor Red
            # output a line for the log
            $msg
            continue # skip this one and proceed to the next drive
        }
        if ($acl.Access | Where-Object { $_.AccessControlType -eq 'Deny' -and
                                         $_.FileSystemRights -band $Right -and
                                         $_.IdentityReference -like "*$principal" }) {
            $msg = "INFORMATION: Deny rule already exists on '$path'"
            Write-Host $msg -ForegroundColor Green
            # output a line for the log
            $msg
        }
        else {
            $rule = [System.Security.AccessControl.FileSystemAccessRule]::new($Principal, $Right, "Deny")
            # older PS versions use:
            # $rule = New-Object System.Security.AccessControl.FileSystemAccessRule $Principal, $Right, "Deny"
            try {
                $acl.AddAccessRule($rule)
                Set-Acl -Path $path -AclObject $acl -ErrorAction Stop
                $msg = "INFORMATION: ACL set on '$path'"
                Write-Host $msg -ForegroundColor Green
                # output a line for the log
                $msg
            }
            catch {
                $msg = "ERROR: ACL failed to be set on: '$path'"
                Write-Host $msg -ForegroundColor Red
                # output a line for the log
                $msg
            }
        }
    }
}
# write the log
$log | Set-Content -Path "c:\scripts\SetAccessRuleResults.txt" -Force

PowerShell loop is only running once per filename, even if the filename exists with multiple extensions

I'll be the first to admit that PowerShell isn't my strong suit, but I've pieced together the following after an evening of digging around on the internet. The end goal is to organize a huge drive of images by the DateTaken, as well as sidecar XMP files if they exist. It's probably not the most elegant code, but it almost works.
The last issue, which I can't figure out, is that the foreach loop only executes once per filename, regardless of extension. For example, only DSC00001.arw, DSC00001.xmp, or DSC00001.jpg would be processed.
Any points in the right direction would be appreciated.
Thanks!
$Folders = (Get-ChildItem -Path F:\ -Recurse -Directory -Force).FullName
$objShell = New-Object -ComObject Shell.Application
$Folders | % {
    $objFolder = $objShell.namespace($_)
    foreach ($File in $objFolder.items()) {
        if (($File | Split-Path -Extension) -in ".arw",".gif",".tiff",".jpg",".png",".nef") {
            Write-Host $File.Name`t -NoNewline -ForegroundColor Green
            try {
                $DateTaken = ($objFolder.getDetailsOf($File,12) -replace [char]8206) -replace [char]8207
                $DateTaken = [DateTime]::ParseExact($DateTaken, "g", $null)
                $Year = $DateTaken.ToString('yyyy')
                $Date = $DateTaken.ToString('yyyy-MM-dd')
                Write-Host $Date`t -ForegroundColor Blue -NoNewline
            }
            catch {
                $Year = 'Other'
                $Date = 'Other'
                Write-Host $Date`t -ForegroundColor DarkRed -NoNewline
            }
            finally {
                $DatePath = (Join-Path (Join-Path F:\ $Year) $Date)
            }
            Write-Host $File.Path -> (Join-Path $DatePath ($File | Split-Path -Leaf)) -NoNewline
            #Move-Item $File.Path (Join-Path $DatePath ($File | Split-Path -Leaf))
            Write-Host Completed`n -NoNewline
            if (Test-Path (Join-Path ($File | Split-Path) (($File | Split-Path -LeafBase) + '.xmp'))) {
                Write-Host XMP:`t -ForegroundColor Magenta -NoNewLine
                Write-Host (Join-Path ($File | Split-Path) (($File | Split-Path -LeafBase) + '.xmp')) -> (Join-Path ($DatePath) (($File | Split-Path -LeafBase) + '.xmp'))
                #Move-Item (Join-Path ($File | Split-Path) (($File | Split-Path -LeafBase) + '.xmp')) (Join-Path ($DatePath) (($File | Split-Path -LeafBase) + '.xmp'))
            }
        }
        else {
            Write-Host $File.Name is not an image `n -NoNewline -ForegroundColor DarkYellow
        }
    }
}
I'm not sure why $objFolder.items() doesn't actually return all expected files, but I would suggest using PowerShell's built-in file system provider to discover the actual files in each folder and then use $objFolder.ParseName() to obtain a reference you can pass to GetDetailsOf():
$Folders = (Get-ChildItem -Path F:\ -Recurse -Directory -Force).FullName
$objShell = New-Object -ComObject Shell.Application
$Folders | % {
    $objFolder = $objShell.NameSpace($_)
    foreach ($FileInfo in Get-ChildItem -LiteralPath $_ -File -Force) { # <-- trick is right here, use Get-ChildItem to discover the files in the folder
        if ($FileInfo.Extension -in ".arw",".gif",".tiff",".jpg",".png",".nef") {
            $File = $objFolder.ParseName($FileInfo.Name) # <-- here we resolve the corresponding file item with shell app
            # ... resolve and parse dateTaken here, just like before
            # *.xmp file path might be easier to construct like this
            $xmpPath = $FileInfo.FullName -replace "$($FileInfo.Extension)`$",'.xmp'
            if (Test-Path -LiteralPath $xmpPath) {
                # ...
            }
        }
    }
}
}