PowerShell runs script from previous edit

I am practicing PowerShell in VS Code.
I have noticed that when I hit F5 to run my script, PowerShell runs the previous version of the script (before my changes). I am definitely saving the file after my edits.
So, for example, I might change the text in this line:
Add-Content -Path "D:\Text_File.txt" -value "$_ - MP3 File count is: $Cnt"
But it will continue to show the same output until I run it a second time.
This is the script I am using in case it is relevant:
Param(
    [string]$Path = 'D:\WMA - BU of wma music files',
    [string]$DestinationPath = "D:\WMA-Only",
    [string]$OutputFileDest = "D:\Text_File.txt"
)
$MainDirList = Get-ChildItem -Path $Path -Directory | ForEach-Object { $_.FullName }
$MainDirList | Out-File -FilePath D:\Text_File.txt
$MainDirList | WriteFolder
function WriteFolder
{
    process
    {
        $FilteredList = Get-ChildItem -Path $_ -Force -Recurse -Filter "*.mp3"
        $Cnt = $FilteredList.count
        Add-Content -Path "D:\Text_File.txt" -value "$_ - MP3 File count is: $Cnt"
    }
}
Visual Studio Code version is 1.62.2.
PowerShell extension is v2021.10.2.
I am wondering if I am meant to clear variables or the like at the end of a script?
I am viewing the output in the latest version of Notepad++.

Could it be because you've defined one function inside another function? Normally when you run a script, it finds any Functions and pre-processes them before running the script. But in your case, it only processes the outer function, which only creates the inner function when it's run.
I'm not sure what your parent function is called, but you could try just having the two functions side-by-side:
Function myFunction {
    Param(
        [string]$Path = 'D:\WMA - BU of wma music files',
        [string]$DestinationPath = "D:\WMA-Only",
        [string]$OutputFileDest = "D:\Text_File.txt"
    )
    Process {
        $MainDirList = Get-ChildItem -Path $Path -Directory | ForEach-Object { $_.FullName }
        $MainDirList | Out-File -FilePath D:\Text_File.txt
        $MainDirList | WriteFolder
    }
}
function WriteFolder
{
    process
    {
        $FilteredList = Get-ChildItem -Path $_ -Force -Recurse -Filter "*.mp3"
        $Cnt = $FilteredList.count
        Add-Content -Path "D:\Text_File.txt" -value "$_ - MP3 File count is: $Cnt"
    }
}
Or else you could move the WriteFolder function into its own PowerShell module (*.psm1) and load it in your main script:
Import-Module "WriteFolder.psm1" -Force
The -Force parameter is optional, but if you're regularly changing the module file you'll need it to ensure the updates are loaded, rather than just caching the old version of the module.
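For reference, a minimal sketch of what a WriteFolder.psm1 module file might contain - just the function from the question, with an (optional) explicit export:
# WriteFolder.psm1 - hypothetical module file holding the function from the question
function WriteFolder
{
    process
    {
        $FilteredList = Get-ChildItem -Path $_ -Force -Recurse -Filter "*.mp3"
        $Cnt = $FilteredList.count
        Add-Content -Path "D:\Text_File.txt" -Value "$_ - MP3 File count is: $Cnt"
    }
}
# Optional: script modules export their functions by default
Export-ModuleMember -Function WriteFolder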


PowerShell variable - verbose if it is receiving input

I have created a script that compresses files older than N days using PowerShell, like this:
param (
    $dirPath,
    [int] $daysAgo,
    $logOutput = $dirPath + "\old_reports.log",
    $fileExt
)
$curDate = Get-Date
$timeAgo = ($curDate).AddDays($daysAgo)
$files = Get-ChildItem -Recurse `
    -Path $dirPath `
    -Include *.$fileExt |
    Where-Object { $_.LastWriteTime -lt $timeAgo } |
    Select -ExpandProperty FullName
& 'C:\Program Files\7-Zip\7Z.exe' a -t7z -mx9 old_reports.7z $files -bb1 -sdel
echo $files > $logOutput
It is working, but since there are many files, it takes a while to fill the $files variable. While it is doing that, the prompt shows only a blinking cursor, so I cannot tell whether the script is actually doing something or has been paused by an accidental click.
Is there a way to show that the $files variable is receiving input?
Without restructuring your command - and thereby sacrificing performance - I see only one option:
In addition to capturing file-info objects in variable $files, print them to the display as well, which you can do with the help of the common -OutVariable parameter:
# Output the files of interest *and* capture them in
# variable $files, via -OutVariable
Get-ChildItem -Recurse `
    -Path $dirPath `
    -Include *.$fileExt |
    Where-Object { $_.LastWriteTime -lt $timeAgo } |
    Select -ExpandProperty FullName -OutVariable files
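The captured $files can then feed the rest of the script unchanged, e.g. the 7-Zip call from the question:
& 'C:\Program Files\7-Zip\7Z.exe' a -t7z -mx9 old_reports.7z $files -bb1 -sdel
echo $files > $logOutput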

PowerShell Core script runs but is not executed by Task Scheduler?

I am trying to schedule a PowerShell Core 7.2 script to run on Windows Server 2012 R2.
The script runs manually, without any errors, from the server, and Task Scheduler runs the task; in the History I can see "Task Completed".
The issue is that the script does not actually do its work: it is supposed to move files, and the files are not moved, which suggests the script was never really executed.
Settings of the Task Scheduler task are as follows:
General -> Run whether user is logged on or not; Run with the highest privileges.
Actions -> Action: Start a Program.
Actions -> Program/Script: "C:\Program Files\PowerShell\7\pwsh.exe" (location of pwsh.exe).
Actions -> Add arguments: -ExecutionPolicy Bypass -File "R:\Public\IT\Vantage_Utilities\CNC_Scripts\File Transfer\Fastems\CNC_File_Transfer_Fastems.ps1"
Location -> Name of the local machine.
I am not really sure what is going wrong here.
EDIT
I am thinking there is an issue with the script itself, because there is another script set up the same way with PS Core and Task Scheduler, and it does get executed. I am going to post the script here. It is a simple script that moves all the contents of one folder from one server to another. I achieve this in two functions: MoveFiles moves all the contents of the parent folder (excluding the subfolder called "Mazak"), and MoveMazakFiles moves the contents of "Mazak" only. (I am completely aware I could have done this using fewer lines of code, but that is not the point here.)
Code:
$logPath = "\\MMS25163S1\Public\IT\Vantage_Utilities\CNC_Scripts\File Transfer\Fastems\Log.txt"
$transcriptPath = "\\MMS25163S1\Public\IT\Vantage_Utilities\CNC_Scripts\File Transfer\Fastems\LogTranscript.txt"
$getDate = Get-Date -Format "dddd MM/dd/yyyy HH:mm "
$counter = 0
$mazakCounter = 0
Start-Transcript -Path $transcriptPath -Append
Add-Content -Path $logPath -Value ("LOG CREATED $getDate") -PassThru
# Sources
$srcMca = "\\MMS25163S1\Public\NcLib\FromNC\*"
$srcMcaNameChg = "\\MMS25163S1\Public\NcLib\FromNC"
$srcMazak = "\\MMS25163S1\Public\NcLib\FromNC\Mazak\*"
$srcMcaNameChgMazak = "\\MMS25163S1\Public\NcLib\FromNC\Mazak"
# Destination
$destMca = "\\Sidney2\MfgLib\RevisedPrograms\MC-A"
# Time with milliseconds
$time = (Get-Date -Format hh-mm-fff-tt).ToString()
Function MoveFiles {
    Param(
        [string]$src,
        [string]$dest,
        [string]$srcNameChange
    )
    Get-Item -Path $src -Exclude *Mazak* -ErrorAction SilentlyContinue | ForEach-Object {
        $counter++
        $fileName = $_.BaseName
        $fileNameExt = $_.Name
        Write-Host $fileName -ForegroundColor Green
        Rename-Item -Path "$srcMcaNameChg\$fileNameExt" -NewName ($fileName + "_" + "(Time-$time)" + $_.Extension)
        Add-Content -Path $logPath -Value ("Name changed: Time stamp added to $fileName ") -PassThru
    }
    Move-Item -Path $src -Exclude *Mazak* -Destination $dest -Force
    Add-Content -Path $logPath -Value ("$counter file(s) moved to $dest") -PassThru
}
MoveFiles -src $srcMca -dest $destMca -srcNameChange $srcMcaNameChg
Function MoveMazakFiles {
    Param(
        [string]$srcMazak,
        [string]$dest,
        [string]$srcNameChange
    )
    Get-ChildItem $srcMazak -Recurse -ErrorAction SilentlyContinue | ForEach-Object {
        $mazakCounter++
        $fileName = $_.BaseName
        $fileNameExt = $_.Name
        Write-Host $fileName -ForegroundColor Green
        Rename-Item -Path "$srcMcaNameChgMazak\$fileNameExt" -NewName ($fileName + "_" + "(Time-$time)" + $_.Extension)
    }
    Move-Item -Path $srcMazak -Destination $dest -Force
    Add-Content -Path $logPath -Value ("$mazakCounter file(s) from Mazak folder moved to $dest") -PassThru
}
MoveMazakFiles -srcMazak $srcMazak -dest $destMca -srcNameChange $srcMcaNameChg
Stop-Transcript
When setting up the scheduled task, under Actions -> Start a Program -> Program/Script, call powershell and pass the script as a parameter, like this:
powershell -File "R:\Public\IT\Vantage_Utilities\CNC_Scripts\File Transfer\Fastems\CNC_File_Transfer_Fastems.ps1"
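If the script specifically needs PowerShell 7, the same pattern should work with the pwsh.exe path from the question:
"C:\Program Files\PowerShell\7\pwsh.exe" -ExecutionPolicy Bypass -File "R:\Public\IT\Vantage_Utilities\CNC_Scripts\File Transfer\Fastems\CNC_File_Transfer_Fastems.ps1"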

Trying to iterate through folders and take recursive actions on each file

I'm trying to build a script that I can use to delete old files based on last-accessed date. As part of the script I want to interrogate each subfolder, find files not accessed in the last X days, create a log in the same folder listing the files found, record the file details in the log, and then delete the files.
What I think I need is a nested loop: loop 1 gets each subfolder (Get-ChildItem -Directory -Recurse), then for each folder found a second loop checks all files for last-accessed date and, if a file is outside the limit, appends the file details to a log file in the folder (for user reference) and also to a master log file (for the IT admin).
Loop 1 is working as expected and getting the subfolders, but I cannot get the inner loop to run through the objects in the folder. I'm trying to use Get-ChildItem inside the first loop - is this the correct approach?
Code sample below; I have added pseudo-code to demo the logic. It's really the loops I need help with:
# Set variables
$FolderPath = "E:\TEST_G"
$ArchiveLimit = 7
$ArchiveDate = (Get-Date).AddDays(-$ArchiveLimit)
$MasterLogFile = "C:\Temp\ArchiveLog $(Get-Date -f yyyy-MM-dd).csv"
# Loop 1 - Iterate through each subfolder of $FolderPath
Get-ChildItem -Path $FolderPath -Directory -Recurse | ForEach-Object {
    # Loop 2 - Check each file in the subfolder and if Last Access is past
    # $ArchiveDate take action
    Get-ChildItem -Path $_.DirectoryName | Where-Object {
        $_.LastAccessTime -le $ArchiveDate
    } | ForEach-Object {
        # Check if FolderLogFile exists, if not create it
        # Append file details to folder log
        # Append file & folder details to master log
    }
}
I think you're overcomplicating a bit:
# Set variables
$FolderPath = "E:\TEST_G"
$ArchiveLimit = 7
$ArchiveDate = (Get-Date).AddDays(-$ArchiveLimit)
$MasterLogFile = "C:\Temp\ArchiveLog $(Get-Date -f yyyy-MM-dd).csv"
If (!(Test-Path $MasterLogFile)) { New-Item $MasterLogFile -Force }
Get-ChildItem -Path $FolderPath -File -Recurse |
    Where-Object { $_.LastAccessTime -lt $ArchiveDate -and
                   $_.Extension -ne '.log' } |
    ForEach-Object {
        $FolderLogFile = Join-Path $_.DirectoryName 'name.log'
        Add-Content -Value "details" -Path $FolderLogFile, $MasterLogFile
        Try {
            Remove-Item $_ -Force -EA Stop
        } Catch {
            Add-Content -Value "Unable to delete item! [$($_.Exception.GetType().FullName)] $($_.Exception.Message)" `
                -Path $FolderLogFile, $MasterLogFile
        }
    }
Edit:
Multiple recursive loops are unnecessary, since you're already taking a recursive action in the pipeline; it's powerful enough to do the processing without extra passes. Add-Content from the other answer is an excellent improvement over Out-File as well, so I replaced mine.
One note, though: Add-Content's -Force flag does not create the folder structure the way New-Item's does. That is the reason for the line under the $MasterLogFile declaration.
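For example, a minimal demonstration of the difference, using the $MasterLogFile path from above:
# New-Item -Force also creates a missing C:\Temp folder;
# Add-Content would create the .csv file, but not a missing parent folder.
If (!(Test-Path $MasterLogFile)) { New-Item $MasterLogFile -Force | Out-Null }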
Your nested loop doesn't need recursion (the outer loop already takes care of that). Just process the files in each folder (make sure you exclude the folder log):
Get-ChildItem -Path $FolderPath -Directory -Recurse | ForEach-Object {
    # $_ here is each subfolder (a DirectoryInfo), so address it via FullName
    $FolderLogFile = Join-Path $_.FullName 'FolderLog.log'
    Get-ChildItem -Path $_.FullName -File | Where-Object {
        $_.LastAccessTime -le $ArchiveDate -and
        $_.FullName -ne $FolderLogFile
    } | ForEach-Object {
        'file details' | Add-Content $FolderLogFile
        'file and folder details' | Add-Content $MasterLogFile
        Remove-Item $_.FullName -Force
    }
}
You don't need to test for the existence of the folder log file, because Add-Content will automatically create it if it's missing.

PowerShell run script simultaneously

I created a PowerShell script to remove all files and folders older than X days. This works perfectly fine, and the logging is also OK. Because PowerShell is a bit slow, it can take some time to delete these files and folders when large quantities are involved.
My question: how can I run this script on multiple directories ($Target) at the same time?
Ideally, we would like to have this in a scheduled task on a Windows Server 2008 R2 server and have an input file (txt, csv) to paste some new target locations into.
Thank you for your help/advice.
The script
#================= VARIABLES ==================================================
$Target = "\\share\dir1"
$OlderThanDays = "10"
$Logfile = "$Target\Auto_Clean.log"
#================= BODY =======================================================
# Set start time
$StartTime = (Get-Date).ToShortDateString() + ", " + (Get-Date).ToLongTimeString()
Write-Output "`nDeleting folders that are older than $OlderThanDays days:`n" | Tee-Object $LogFile -Append
Get-ChildItem -Directory -Path $Target |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$OlderThanDays) } | ForEach-Object {
        $Folder = $_.FullName
        Remove-Item $Folder -Recurse -Force -ErrorAction SilentlyContinue
        $Timestamp = (Get-Date).ToShortDateString() + " | " + (Get-Date).ToLongTimeString()
        # If folder can't be removed
        if (Test-Path $Folder)
        { "$Timestamp | FAILED: $Folder (IN USE)" }
        else
        { "$Timestamp | REMOVED: $Folder" }
    } | Tee-Object $LogFile -Append # Output folder names to console & logfile at the same time
# Set end time & calculate runtime
$EndTime = (Get-Date).ToShortDateString() + ", " + (Get-Date).ToLongTimeString()
$TimeTaken = New-TimeSpan -Start $StartTime -End $EndTime
# Write footer to log
Write-Output ($Footer = @"
Start Time : $StartTime
End Time : $EndTime
Total runtime : $TimeTaken
$("-"*79)
"@)
# Create logfile
Out-File -FilePath $LogFile -Append -InputObject $Footer
# Clean up variables at end of script
$Target = $StartTime = $EndTime = $OlderThanDays = $null
One way to achieve this would be to write an "outer" script that passes the directory to be cleaned into the "inner" script as a parameter.
For your "outer" script, have something like this:
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList;
foreach ($Directory in $DirectoryList) {
    Start-Process -FilePath powershell.exe -ArgumentList ('-File "{0}\InnerScript.ps1" -Path "{1}"' -f $PSScriptRoot, $Directory);
}
Note: Using Start-Process kicks off a new process that is, by default, asynchronous. If you use the -Wait parameter, then the process will run synchronously. Since you want things to run more quickly and asynchronously, omitting the -Wait parameter should achieve the desired results.
Invoke-Command
Alternatively, you could use Invoke-Command to kick off a PowerShell script, using the parameters -FilePath, -ArgumentList, -ThrottleLimit, and -AsJob. The Invoke-Command command relies on PowerShell Remoting, so that must be enabled, at least on the local machine.
Add a parameter block to the top of your "inner" script (the one you posted above), like so:
param (
    [Parameter(Mandatory = $true)]
    [string] $Path
)
That way, your "outer" script can pass in the directory path, using the -Path parameter for the "inner" script.
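Putting the two together, a hypothetical sketch of the Invoke-Command variant (InnerScript.ps1 and the DirList file are the same assumed names as above; localhost is the remoting target):
# Kick off the inner script against each directory as a background job on the
# local machine, then collect the results.
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList
$jobs = foreach ($Directory in $DirectoryList) {
    Invoke-Command -ComputerName localhost `
        -FilePath "$PSScriptRoot\InnerScript.ps1" `
        -ArgumentList $Directory `
        -AsJob
}
$jobs | Wait-Job | Receive-Job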

powershell backup script with error logging per file

I really need help creating a script that backs up files and logs the error along with each file that did not copy.
Here is what I tried:
Creating lists of file paths to pass on to Copy-Item, in hopes of later catching errors per file and logging them.
By using $list2X I would be able to cycle through each file, but Copy-Item loses the directory structure and shoots it all out to a single folder.
So for now I am using $list2, and later I do Copy-Item -Recurse to copy the folders:
# Create list to copy
$list = Get-ChildItem -Path $source | Select-Object Fullname
$list2 = $list -replace ("}"), ("")
$list2 = $list2 -replace ("@{Fullname="), ("")
Out-File -FilePath g:\backuplog\DirList.txt -InputObject $list2
# Create list to crosscheck later
$listX = Get-ChildItem -Path $source -Recurse | Select-Object Fullname
$list2X = $listX -replace ("}"), ("")
$list2X = $list2X -replace ("@{Fullname="), ("")
Out-File -FilePath g:\backuplog\FileDirList.txt -InputObject $list2X
And here I would pass the list:
$error.Clear()
foreach ($item in $list2) {
    Copy-Item -Path $item -Destination $destination -Recurse -Force -ErrorAction Continue
}
Out-File -FilePath g:\backuplog\errorsBackup.txt -InputObject $error
Any help with this is greatly appreciated!!!
The answer to complex file-copying or backup scripts is almost always: "Use robocopy."
Bill
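For example, a minimal robocopy sketch using the $source and $destination variables from the question (the log path is an assumption modeled on the g:\backuplog folder used there):
# /E copies subfolders (including empty ones), so the directory structure is kept;
# /V also lists skipped files, /TEE mirrors output to the console,
# and /LOG writes the per-file results - including failures - to a log.
robocopy $source $destination /E /V /TEE /LOG:"g:\backuplog\robocopy-backup.log"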
"Want to copy all the items in C:\Scripts (including subfolders) to C:\Test? Then simply use a wildcard character..."
Next make it easier on yourself and do something like this:
$files = (Get-ChildItem $path).FullName #Requires PS 3.0
#or
$files = Get-ChildItem $path | % {$_.Fullname}
$files | Out-File $outpath
Well, it took me a long time (considering my response time), but here is my copy function, which logs most errors (network drops, failed copies, etc.), including the target object of each error.
Function backUP {
    Param (
        [string]$destination1,
        $list1
    )
    $destination2 = $destination1
    # Extract the leaf of the destination path to build the backup-log folder
    # ($logdrive is assumed to be defined elsewhere in the script)
    $index = $destination2.LastIndexOf("\")
    $count = $destination2.Length - $index
    $source1 = $destination2.Substring($index, $count)
    $finalstr2 = $logdrive + $source1
    Foreach ($item in $list1) {
        Copy-Item -Container:$true -Recurse -Force -Path $item -Destination $destination1 -ErrorAction Continue
        if (-not $?) {
            Write-Output "Copy ERROR: " $error | Format-List | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            Foreach ($erritem in $error) {
                Write-Output "Error Data:" $erritem.TargetObject | Out-File -Append "$finalstr2\GCI-ERRORS-backup.txt"
            }
            $error.Clear()
        }
    }
}
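A hypothetical invocation, reusing $list2 and $destination from the question; $logdrive is an assumption and must not end in a backslash, since the function prepends the destination's leaf segment (which starts with one):
# Example call: the error log lands in "$logdrive\<destination leaf>\GCI-ERRORS-backup.txt"
# (that folder must already exist - Out-File does not create directories).
$logdrive = "g:\backuplog"
backUP -destination1 $destination -list1 $list2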