Folder permissions script exceeding the time allocated for execution - powershell

I have a simple PowerShell v4 script that pulls directory permissions on a hard drive and writes them to a file.
It runs fine about 90% of the time, but the rest of the time my output file is empty or only partially filled. The error log says the task "was terminated due to exceeding the time allocated for execution", which is 1 hour; normally this task runs in 5 minutes.
Any ideas about what might be causing this, and a possible solution?
$RootPath = "D:"
# $OutFile is defined elsewhere in the script.
$Folders = dir $RootPath -Recurse | where {$_.PSIsContainer -eq $true}
foreach ($Folder in $Folders){
    $ACLs = Get-Acl $Folder.FullName | ForEach-Object { $_.Access }
    foreach ($ACL in $ACLs){
        if ($Folder.FullName -NotMatch "superoldhome"){
            $OutInfo = $Folder.FullName + "`t" + $ACL.IdentityReference + "`t" + $ACL.AccessControlType
            Add-Content -Value $OutInfo -Path $OutFile
        }
    }
}

Is this running as a scheduled Windows task? Maybe it is losing access to the hard drive for some reason. If you really want to use this approach, I would add a few debug lines to make sure it stays connected. Maybe something like:
if (-not (Test-Path $Folder.FullName)) {
    Add-Content -Value "Unable to resolve Path" -Path $OutFile
}
With that in your loop, it'll write to the log if that is the problem. Or you can log everything that happens to a file. I forget how to do that off the top of my head, but let me know and I can probably dig it up.
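Following up on that last point: PowerShell's built-in transcript is one way to capture everything a run prints. A minimal sketch (my addition, not from the original answer; the log path is hypothetical):
# Hypothetical transcript path; adjust for your environment.
Start-Transcript -Path "D:\Logs\FolderPerms-transcript.txt" -Append
try {
    # ... run the permissions loop from the question here ...
}
finally {
    # Stop the transcript even if the loop throws.
    Stop-Transcript
}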

Thank you for all the feedback.
I am experimenting with DumpSec and have high hopes for it.
If it doesn't get me what I want, I will try adding some more logging like Matt & Kenny suggested.

Related

Powershell Get-ChildItem returns nothing for folder

A friend of mine asked me to write a short script for him. The script should check a specific folder, find all files and subfolders older than X days, and remove them. Simple so far: I wrote the script, successfully tested it on my own system, and sent it to him. Here's the thing - it doesn't work on his system. To be more specific, the Get-ChildItem cmdlet does not return anything for the provided path, but it gets even weirder; more on that later. I'm using the following code to first find the files and folders (and log them before deleting them later on):
$Folder = "D:\Data\Drive_B$\General\ExchangeFolder"
$CurrentDate = Get-Date
$TimeSpan = "-1"
$DatetoDelete = $CurrentDate.AddDays($TimeSpan)
$FilesInFolder = (Get-ChildItem -Path $Folder -Recurse -ErrorAction SilentlyContinue | Where-Object {$_.LastWriteTime -lt $DatetoDelete})
All variables are filled, and we both know that the folder is filled to the brim with files and subfolders older than one day, which is the timespan we chose for the test. Now, the interesting part: not only does Get-ChildItem return nothing; going to the folder itself and typing "dir" returns nothing either. I've never seen behaviour like this. I've checked everything I could think of: DFS, typos, folder permissions, share permissions, hidden files, ExecutionPolicy. Everything is as it should be to allow this script to work properly, as it did on my own system when I initially tested it. The script does not return any errors whatsoever.
So for some reason, the contents of the folder cannot be found by PowerShell. Does anyone know of a reason why this could be happening? I'm at a loss here :-/
Thanks for your time & help,
Fred
.AddDays() takes a double, so I would pass it a number instead of the string "-1".
Filter first, then act.
This code should work for you:
Add-Type -AssemblyName System.Windows.Forms  # needed for MessageBox

$folder = Read-Host -Prompt 'File path'
$datetodel = (Get-Date).AddDays(-1)
# -ErrorAction Stop makes gci errors terminating, so the catch block actually fires.
$results = try{ gci -Path $folder -Recurse -ErrorAction Stop | select FullName, LastWriteTime | ?{ $_.LastWriteTime -lt $datetodel } }catch{ $Error[-1] }
$info = "{0} files older than: {1} deleting ...." -f $results.Count, $datetodel
if($results | ogv -PassThru){
    [System.Windows.Forms.MessageBox]::Show($info)
    # Put your code here for the removal of the files
    # $results | %{ Remove-Item $_.FullName -Force }
}else{
    [System.Windows.Forms.MessageBox]::Show("Exiting")
}

PowerShell - Move file, Rename & Rotate

I am fairly new to PowerShell and still learning. I have completed my first script and am now trying to add some logging to it. I can append to the log file OK, but I'm stuck on backing up the log and rotating it. This is what I have so far:
$CheckFile = Test-Path $Logfilepath
if ($CheckFile -eq $false) {
    $Date = (Get-Date).ToString()
    $Date + ' - Automonitor log created - INFO' | Out-File -Append -Force $Logfilepath
}
else {
    if ((Get-Item $Logfilepath).Length -gt $Size) {
        Move-Item $Logfilepath -Destination $LogsOldFolder -Force
    }
}
This is where I am stuck. If the file is bigger than 5 MB I need it to move to another folder (which I have in the script), but once it is moved into that folder I only want to keep the five newest files to avoid storage issues. I will need the files named like the below.
Automonitor.log.1
Automonitor.log.2
Automonitor.log.3
Automonitor.log.4
Automonitor.log.5
Automonitor.log.1 being the newest created file. So I am really baffled about the process I would take: how to rename the files to match the above format, and, when a new file is copied over, how to rename all of them again based on creation date and delete the oldest, so only five files ever exist.
I hope that makes sense; if anyone has any ideas, that would be great.
You can go this way:
$a = gci $destfolder
if ($a.Count -gt 5)
{
    $a | sort LastWriteTime | select -First ($a.Count - 5) | Remove-Item
}
This will remove every file except the five newest.
Note that this script doesn't care about the file names. If you want that, you should change the $a = gci $destfolder part to filter on a wildcard.
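That prunes the folder, but it does not produce the Automonitor.log.1 .. Automonitor.log.5 naming scheme from the question. A rough sketch of that rotation (my sketch, assuming $Logfilepath and $LogsOldFolder from the question's script):
# Shift the numbered backups up: .4 -> .5, .3 -> .4, ... (the old .5 gets overwritten).
for ($i = 4; $i -ge 1; $i--) {
    $old = Join-Path $LogsOldFolder "Automonitor.log.$i"
    if (Test-Path $old) {
        Move-Item $old (Join-Path $LogsOldFolder ("Automonitor.log." + ($i + 1))) -Force
    }
}
# The log that just hit the size limit becomes the newest backup, .1.
Move-Item $Logfilepath (Join-Path $LogsOldFolder "Automonitor.log.1") -Force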

PowerShell Memory leak misunderstanding

New to PowerShell, so I'm kind of learning by doing.
The process I have created works, but it ends up locking down my machine until it is completed, eating up all the memory. I thought I had fixed this by forcing the garbage collector, and also by moving from a foreach statement to using % (ForEach-Object) to loop through everything.
Quick synopsis of the process: I need to merge multiple SharePoint log files into single ones to track usage across all of the company's different SharePoint sites. PowerShell loops through all log directories on the SP server and checks whether each file in the directory already exists on my local machine. If it does exist, it appends the file text; otherwise it does a straight copy. Rinse and repeat for each file and directory on the SharePoint log server. Between each loop I'm forcing the GC because... well, because my basic understanding is that the looped variables are held in memory, and I want to flush them. I'm probably looking at this all wrong. So here is the script in question.
$FinFiles = 'F:\Monthly Logging\Logs'
dir -Path '\\SP-Log-Server\Log-Directory' | ?{$_.PSIsContainer} | %{
    $CurrentDir = $_
    dir $CurrentDir.FullName | ?{-not $_.PSIsContainer} | %{
        if($_.Extension -eq ".log"){
            $DestinationFile = $FinFiles + '\' + $_.Name
            if((Test-Path $DestinationFile) -eq $false){
                New-Item -ItemType file -Path $DestinationFile -Force
                Copy-Item $_.FullName $DestinationFile
            }
            else{
                $A = Get-Content $_.FullName ; Add-Content $DestinationFile $A
                Write-Host "Log File"$_.FullName"merged."
            }
            [GC]::Collect()
        }
    }
    [GC]::Collect()
}
Granted, the completed/appended log files get very, very large (min 300 MB, max 1 GB). Am I not closing something I should be, or keeping something open in memory? (It is currently sitting at 7.5 GB of my 8 GB memory total.)
Thanks in advance.
Don't nest Get-ChildItem commands like that. Use wildcards instead: try dir "\\SP-Log-Server\Log-Directory\*\*.log". That should improve things to start with. Then move this to a ForEach($X in $Y){} loop instead of the ForEach-Object{} loop you're using now. I'm betting that takes care of your problem.
So, re-written just off the top of my head:
$FinFiles = 'F:\Monthly Logging\Logs'
ForEach($LogFile in (dir -Path '\\SP-Log-Server\Log-Directory\*\*.log')){
    $DestinationFile = $FinFiles + '\' + $LogFile.Name
    if((Test-Path $DestinationFile) -eq $false){
        New-Item -ItemType file -Path $DestinationFile -Force
        Copy-Item $LogFile.FullName $DestinationFile
    }
    else{
        $A = Get-Content $LogFile.FullName ; Add-Content $DestinationFile $A
        Write-Host "Log File"$LogFile.FullName"merged."
    }
}
Edit: Oh, right, Alexander Obersht may be quite right as well; you may benefit from a StreamReader approach. At the very least you should use the -ReadCount argument to Get-Content, and there's no reason to save the content in a variable; just pipe it right to the Add-Content cmdlet:
Get-Content $LogFile.FullName -ReadCount 5000 | Add-Content $DestinationFile
To explain my answer a little more: if you use ForEach-Object in the pipeline, it keeps everything in memory (regardless of your GC call). Using a ForEach loop does not do this, and should take care of your issue.
In short: Add-Content, Get-Content and Out-File are convenient, but notoriously slow when you need to deal with large amounts of data or many I/O operations. You want to fall back to the StreamReader and StreamWriter .NET classes for performance and/or memory usage optimization in cases like yours.
Code sample:
$sInFile = "infile.txt"
$sOutFile = "outfile.txt"
$oStreamReader = New-Object -TypeName System.IO.StreamReader -ArgumentList @($sInFile)
# $true sets append mode.
$oStreamWriter = New-Object -TypeName System.IO.StreamWriter -ArgumentList @($sOutFile, $true)
# ReadLine() returns one line at a time, so only a single line is held in memory.
while (($sLine = $oStreamReader.ReadLine()) -ne $null) {
    $oStreamWriter.WriteLine($sLine)
}
$oStreamReader.Close()
$oStreamWriter.Close()

Why is my PowerShell script writing blank lines to console?

I have a bit of an odd problem. Or maybe not so odd. I had to implement a "custom clean" for a PowerShell script developed for building some unique configurations for my current project (the whys are not particularly important). Basically it copies a bunch of files from the release directories into some temporary directories with this code:
$Paths = Get-ChildItem $ProjectPath -Recurse |
    Where-Object { ($_.PSIsContainer -eq $true) -and
        (Test-Path ($_.FullName + '\bin\release')) } |
    Select-Object FullName
ForEach ($Path in $Paths)
{
    $CopyPath = $Path.FullName + '\bin\Temp'
    $DeletePath = $Path.FullName + '\bin\Release'
    New-Item -ItemType directory -Path $CopyPath
    Copy-Item $DeletePath $CopyPath -Recurse
    Remove-Item $DeletePath -Recurse
}
And after the build copies it back with:
ForEach ($Path in $Paths)
{
    $CopiedPath = $Path.FullName + '\bin\Temp\'
    $DeletedPath = $Path.FullName + '\bin\Release\'
    $Files = Get-ChildItem $CopiedPath -Recurse |
        Where-Object {-not $_.PSIsContainer}
    ForEach ($File in $Files)
    {
        if(-not (Test-Path ($DeletedPath + $File.Name)))
        {
            Copy-Item $File.FullName ($DeletedPath + $File.Name)
        }
    }
    Remove-Item $CopiedPath -Recurse -Force
}
This is pretty clunky and noobish (sorry, I'm a PowerShell noob drinking from a fire hose), but it works for the purpose and I will clean it up later. However, when it executes the initial copy to the temp directories, it writes a lot of blank lines to the screen, which isn't ideal: I display a message while this process executes so our CM doesn't freak out and think it broke, and that message gets blown away by the blank lines. Do you know what might be causing this and how I might solve it? I'm using PowerShell 2.0 out of the box, and due to the nature of this project I can't upgrade or get any outside libraries. Thanks guys.
If the only thing you're looking to do is clean up the console output, then all you need to do is suppress the pipeline output. You can prefix the command with [void], which discards everything it writes to the pipeline. You can also pipe the whole thing into the Out-Null cmdlet, which traps all output the same way.
The New-Item cmdlet returns output to the console by default on my version of Windows PowerShell (4.0). This may not be true on previous versions, but I think it is... Remove-Item usually doesn't return any output. If I were to take a stab, I'd kill the output on the lines that use the "Item" noun, using one of the methods mentioned above.
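For example, applied to the New-Item call from the question (a minimal sketch of both suppression styles):
# Option 1: prefix with [void] to discard the pipeline output.
[void](New-Item -ItemType directory -Path $CopyPath)

# Option 2: pipe the output into Out-Null instead.
New-Item -ItemType directory -Path $CopyPath | Out-Null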

Deleting cabinet files

I'm trying to create a script to delete cabinet files in virtual servers. For some reason, the code that I've created ends up not deleting any cabinet files and instead tries to delete the entire WINDOWS Directory, and I have no idea why this is occurring. Was curious if anyone might have any ideas on what the issue may be, since I can't find anything:
$dir = "\\$server" + '\C$\windows'
$cabinetArray = #()
foreach ($item in get-childitem -path $dir){
if ($item.name -like "*.cab"){
$cabinetArray = $cabinetArray + $item
}
}
for ($i = 0; $i -le $cabinetArray.length; $i++){
$removal = $dir + "\" + $cabinetArray[$i]
remove-item $removal -force -recurse
}
I did some testing and it seems that, for some reason, the array I'm trying to use to gather all the cabinet files isn't even getting filled. I'm not sure if there's a specific way to gather only the .cab files, since right now whenever I run this on my test server it tries to delete everything.
I don't know whether deleting all the cab files in that folder is a good idea or not, but I'll answer your question. You're doing a lot of manual work and building your own collection of objects when PowerShell will do it all for you. Try something like this:
$dir = "\\" + $server + '\C$\windows'
$cabinetFiles = Get-ChildItem -Path $dir -Filter "*.cab" -Recurse
$cabinetFiles | %{
    Remove-Item -Path $_.FullName -Force
}
Or, as a one-liner:
Get-ChildItem -Path ("\\" + $server + '\C$\windows') -Filter "*.cab" -Recurse | %{Remove-Item -Path $_.FullName -Force}
Use the pipeline; here's a simplified version of your code (remove -WhatIf to actually delete the files). It gets all *.cab files from the Windows directory of the remote box (recursively), makes sure that only file objects pass on, and then deletes them.
Get-ChildItem "\\$server\admin$" -Filter *.cab -Recurse |
Where-Object {!$_.PSIsContainer} |
Remove-Item -Force -WhatIf
For some reason, the code that I've created ends up not deleting any cabinet files and instead tries to delete the entire WINDOWS Directory, and I have no idea why this is occurring.
It is occurring because your for loop is being entered even though $cabinetArray is empty: the condition uses -le, and 0 -le 0 is true, so the body runs once with a $null element. $removal is then assigned the value of $dir plus a trailing backslash, and you are calling Remove-Item -Recurse on the Windows directory itself.
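A minimal guard, if you keep the indexed loop (my sketch, not from the answers above): switch -le to -lt so an empty array never enters the loop, and build the path from the file's Name property.
# -lt stops at Count - 1, so an empty array skips the loop entirely.
for ($i = 0; $i -lt $cabinetArray.Count; $i++) {
    $removal = Join-Path $dir $cabinetArray[$i].Name
    Remove-Item $removal -Force
}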