"Get ChildItem : Cannot find path" showing path I did not supply - powershell

I inherited a script that should simply move files from -source to -target. I am prompted for both, and after supplying the paths, it tells me that it cannot find the path while showing a path that I absolutely did not submit. I can't figure out how it's arriving at that path.
param(
    [Parameter(
        Mandatory = $true,
        Position = 0,
        HelpMessage = "Root of the folders or share to archive"
    )]
    [string] $source,

    [Parameter(
        Mandatory = $true,
        Position = 1,
        HelpMessage = "Path of the folder or share of archive"
    )]
    [string] $target,

    [Parameter(
        Mandatory = $false,
        Position = 3
    )]
    [int] $days = 30
)
# Get all the files from the source path that are not shortcuts (*.lnk) and are older than the days set
Get-ChildItem $source -Recurse |
    Where-Object { !$_.PSIsContainer -and ((Get-Date) - $_.LastWriteTime).TotalDays -gt $days -and $_.Extension -ne ".lnk" } |
    ForEach-Object {
        # For each file build the destination path
        $dest = $_.FullName -replace ([regex]::Escape($source)), $target
        # Move the files into the destination
        Move-Item -Path $_.FullName -Destination $dest -ErrorAction SilentlyContinue
    }
The log says "Cannot find path '\\appserver\abc$\Automation\Daily\Archive\appserver\abc$\Storage' because it does not exist" - see how it starts repeating itself? \\appserver\abc$\Automation\Daily\Archive\ is the location of the script, whereas \\appserver\abc$\Storage\ is what I am entering as -source. So I have no idea why it is looking at the path to the script, then appending the source path concurrently.
EDIT: This is how I am calling the script (from a little-known finance application called APX):
SHELL PowerShell \\appserver\abc$\Automation\Daily\Archive\ArchiveFiles.ps1 -source \\appserver\abc$\Dataport\dpdata -target \\appserver\abc$\Dataport\archived -days 30

When your script starts, it begins in the directory you ran it from, so it is already in \\appserver\abc$\Automation\Daily\Archive\. If you do not supply a rooted prefix such as \\ (UNC) or a drive letter like A:\, Get-ChildItem will look for items from that current directory down. So when you supply the folder path, PowerShell appends it to the current directory and cannot find the resulting path.
As this would only happen if you had omitted the \\ at the beginning of your string, I would only expect your output if you had submitted appserver\abc$\Storage\ as your source. If you are sure you did supply the \\, then look more closely at whatever line is passing the command to this script, to see whether it is stripping the \\ off beforehand.
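Not tested against your environment, but as a sketch of a defensive check you could add at the top of ArchiveFiles.ps1 (the $source and $target parameter names come from your param block; [System.IO.Path]::IsPathRooted is standard .NET):
# Hypothetical guard: fail fast if a path arrives without its leading \\ or drive letter,
# i.e. it is not rooted and would otherwise be resolved relative to the current directory.
foreach ($p in $source, $target) {
    if (-not [System.IO.Path]::IsPathRooted($p)) {
        throw "Path '$p' is not rooted; the leading \\ was probably stripped by the caller."
    }
}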

powershell Get-Content and .Net Open() handling paths differently [duplicate]

Get-Content appears to use the current working directory to resolve relative paths. However, the .NET System.IO.File Open() method does not. What is the PowerShell-centric way to resolve a relative path for .NET?
PS C:\src\t> type .\ReadWays.ps1
[CmdletBinding()]
param (
    [Parameter(Mandatory=$true)]
    [String]$Path
)
Write-Host "Path is $Path"
Get-Content -Path $Path | Out-Null
if ([System.IO.StreamReader]$sr = [System.IO.File]::Open($Path, [System.IO.FileMode]::Open)) { $sr.Close() }
PS C:\src\t> .\ReadWays.ps1 -Path '.\t.txt'
Path is .\t.txt
MethodInvocationException: C:\src\t\ReadWays.ps1:8
Line |
   8 |  if ([System.IO.StreamReader]$sr = [System.IO.File]::Open($Path, [Syst …
     |  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
     | Exception calling "Open" with "2" argument(s): "Could not find file 'C:\Program Files\PowerShell\7\t.txt'."
PS C:\src\t> $PSVersionTable.PSVersion.ToString()
7.2.0
You can add a test to see if the path is relative and, if so, convert it to absolute, like:
if (![System.IO.Path]::IsPathRooted($Path) -or $Path -match '^\\[^\\]+') {
    $Path = [System.IO.Path]::GetFullPath([System.IO.Path]::Combine($pwd, $Path))
}
I added $Path -match '^\\[^\\]+' to also convert relative paths starting with a single backslash, like \ReadWays.ps1, meaning the path starts at the root of the current drive. UNC paths that start with two backslashes are regarded as absolute.
The following works fine for me and is compatible with Windows and Linux. It uses Convert-Path to resolve the relative paths. I was previously using Resolve-Path, which is incorrect; only Convert-Path resolves to file-system-native paths. Thanks mklement0 for pointing it out.
param(
    [ValidateScript({
        if(Test-Path $_ -PathType Leaf)
        {
            return $true
        }
        throw 'Invalid File Path'
    })]
    [string]$Path
)

if(-not $Path.StartsWith('\\'))
{
    [string]$Path = Convert-Path $Path
}

$reader = [System.IO.StreamReader]::new(
    [System.IO.File]::Open(
        $Path, [System.IO.FileMode]::Open
    )
)
$reader.BaseStream
$reader.Close()
Last Edit
The following should:
Handle UNC paths
Work on Windows and Linux
Be efficient
Handle relative paths
Starting from the base that $Path is valid thanks to the ValidateScript attribute, we only need to determine if the path we are dealing with is UNC, relative or absolute.
UNC paths must always be fully qualified. They can include relative directory segments (. and ..), but these must be part of a fully qualified path. You can use relative paths only by mapping a UNC path to a drive letter.
We can assume a UNC path must always start with \\, so this condition should suffice to determine if $Path will be manipulated or not:
if(-not $Path.StartsWith('\\'))
Lastly, in the begin block, we update the environment's current directory each time our script or function runs:
[Environment]::CurrentDirectory = $pwd.ProviderPath
By doing so, ([System.IO.FileInfo]$Path).FullName should give us the absolute path of our parameter, be it UNC, Relative or Absolute.
param(
    [ValidateScript({
        if(Test-Path $_ -PathType Leaf) {
            return $true
        }
        throw 'Invalid File Path'
    })]
    [string]$Path
)
begin
{
    [Environment]::CurrentDirectory = $pwd.ProviderPath
}
process
{
    if(-not $Path.StartsWith('\\'))
    {
        $Path = ([System.IO.FileInfo]$Path).FullName
    }
    try
    {
        $reader = [System.IO.StreamReader]::new(
            [System.IO.File]::Open(
                $Path, [System.IO.FileMode]::Open
            )
        )
        $reader.BaseStream
    }
    catch
    {
        $_.Exception.Message
    }
    finally
    {
        # Guard: if File.Open threw, $reader was never assigned
        if ($reader)
        {
            $reader.Close()
            $reader.Dispose()
        }
    }
}
This is a common question. Somehow, .NET and PowerShell don't agree on the current directory.
[System.IO.File]::Open("$pwd\$Path", [System.IO.FileMode]::Open)

How do I modify this script to take parameters?

I have a PowerShell script that combines PowerPoint presentations. The issue is that it only works on presentations in the current directory (the directory the script is in) and saves the combined presentation to Documents. How do I change the script to work with any directory given as a parameter? I run the script like this: ./Merge-Presentation -Source $Presentations -Destination $save -Verbose -Open; where $Presentations is the path of the individual PowerPoint files and $save is the path where the combined presentation is saved. Here is the script.
#region function definitions
#Function for releasing a COM object
Function Remove-Ref
{
    param
    (
        [Object]
        $ref
    )
    $null = Remove-Variable -Name $ref -ErrorAction SilentlyContinue
    while ([System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$ref) -gt 0)
    {
    }
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()
}
#Main function for merging PowerPoint presentations
Function Merge-PowerPointPresentation
{
    <#
        .SYNOPSIS
        Merge multiple PowerPoint presentation files to one file

        .DESCRIPTION
        Merge multiple PowerPoint presentation files to one file

        .PARAMETER Source
        The PowerPoint presentation files to merge, specified by their full names

        .PARAMETER Destination
        The target PowerPoint presentation file specified by its full name

        .PARAMETER Open
        A switch to specify if we keep the PowerPoint application opened after the processing

        .EXAMPLE
        Get-ChildItem -Path $CurrentDir -Filter *.pptx | Sort-Object -Property Name | Merge-PowerPointPresentation -Verbose -Open
        Will merge all the PowerPoint files in the current directory into one single PowerPoint file by using a timestamped filename (ie. yyyyMMddTHHmmss.pptx like 20170126T091011.pptx)
        The output will be verbose
        The PowerPoint application won't be left after the processing

        .EXAMPLE
        $Presentations = "$CurrentDir\0.pptx","$CurrentDir\1.pptx","$CurrentDir\2.pptx","$CurrentDir\3.pptx","$CurrentDir\4.pptx","$CurrentDir\5.pptx","$CurrentDir\6.pptx","$CurrentDir\7.pptx","$CurrentDir\8.pptx","$CurrentDir\9.pptx"
        Merge-PowerPointPresentation -Source $Presentations -Destination C:\Temp\MergedPresentation.pptx
        Will merge all the specified PowerPoint files into the C:\Temp\MergedPresentation.pptx PowerPoint file
    #>
    [CmdletBinding()]
    Param(
        #The collection of the powerpoint files to merge
        [Parameter(Mandatory = $True,ValueFromPipeline = $True,ValueFromPipelineByPropertyName = $True)]
        [ValidateScript({
            (Test-Path -Path $_ -PathType Leaf) -and ($_ -match "\.ppt(x{0,1})$")
        })]
        [alias('FilePath', 'Path', 'FullName')]
        [string[]]$Source,

        #The path of the generated powerpoint file
        [Parameter(Mandatory = $False)]
        [ValidateNotNullOrEmpty()]
        [alias('OutputFile')]
        [string]$Destination = $(Join-Path -Path $([Environment]::GetFolderPath('MyDocuments')) -ChildPath $('{0:yyyyMMddTHHmmss}' -f (Get-Date))),

        #To keep open the generated Powerpoint presentation
        [parameter(Mandatory = $False)]
        [switch]$Open
    )
    begin
    {
        #Opening the PowerPoint application once
        Add-Type -AssemblyName Microsoft.Office.Interop.PowerPoint
        $Powerpoint = New-Object -ComObject Powerpoint.Application
        #Creating a new PowerPoint presentation
        $NewPresentation = $Powerpoint.Presentations.Add($True)
        # Adding an empty slide : mandatory
        $null = $NewPresentation.Slides.Add(1, [Microsoft.Office.Interop.PowerPoint.PpSlideLayout]::ppLayoutBlank)
        $SlidesNb = 0
    }
    process
    {
        #For all files passed as argument outside a pipeline context
        foreach ($CurrentSource in $Source)
        {
            #Getting the base name of the processed presentation
            $CurrentPresentationName = (Get-Item -Path $CurrentSource).BaseName
            #Inserting the slides of the current presentation into the new one
            $InsertedSlidesNb = $NewPresentation.Slides.InsertFromFile($CurrentSource, $SlidesNb)
            #Applying the original template
            $NewPresentation.Slides.Range(($SlidesNb+1)..($SlidesNb+$InsertedSlidesNb)).ApplyTemplate($CurrentSource)
            #Adding a new section for the inserted content with the name of the processed presentation
            Write-Verbose -Message "Adding the section $CurrentPresentationName before Slide $($SlidesNb+1)..."
            $null = $NewPresentation.SectionProperties.AddBeforeSlide($SlidesNb+1, $CurrentPresentationName)
            Write-Verbose -Message "Processed file $CurrentSource by inserting $InsertedSlidesNb slides ($($SlidesNb+1) ==> $($SlidesNb+$InsertedSlidesNb)) ..."
            $SlidesNb += $InsertedSlidesNb
        }
    }
    end
    {
        #Deleting the useless empty slide (added at the beginning)
        $NewPresentation.Slides.Range($SlidesNb+1).Delete()
        #Saving the final file
        $NewPresentation.SaveAs($Destination)
        Write-Host -Object "The new presentation was saved in $($NewPresentation.FullName) ($SlidesNb slides)"
        #If the -Open switch is specified we keep the PowerPoint application opened
        if (!$Open)
        {
            $NewPresentation.Close()
            #$Powerpoint.Quit() | Out-Null
            Write-Verbose -Message 'Releasing PowerPoint ...'
            Remove-Ref -ref ($NewPresentation)
            Remove-Ref -ref ($Powerpoint)
        }
    }
}
#endregion
Clear-Host
#Getting the current directory (where this script file resides)
$CurrentDir = Split-Path -Path $MyInvocation.MyCommand.Path
#Loading the PowerPoint assembly
#Example 1 : Processing all the PowerPoint presentations in the current directory in alphabetical order
Get-ChildItem -Path $CurrentDir -Filter *.pptx |
    Sort-Object -Property Name |
    Merge-PowerPointPresentation -Verbose -Open
#Example 2 : Processing a list of some PowerPoint presentations specified by their absolute path
$Presentations = "$CurrentDir\0.pptx", "$CurrentDir\1.pptx", "$CurrentDir\2.pptx", "$CurrentDir\3.pptx", "$CurrentDir\4.pptx", "$CurrentDir\5.pptx", "$CurrentDir\6.pptx", "$CurrentDir\7.pptx", "$CurrentDir\8.pptx", "$CurrentDir\9.pptx"
Merge-PowerPointPresentation -Source $Presentations -Destination $CurrentDir\all.pptx -Verbose
The expected result is to load the PowerPoint files from the directory specified as a parameter and save the combined presentation in the directory specified as a parameter.
You can just add a parameter declaration at the top, something like:
Param(
    [parameter(Mandatory = $false, Position = 1)]
    [ValidateScript( {Test-Path $_} )]
    [String]$SourcePath = ( Split-Path $MyInvocation.MyCommand.Path ),

    [parameter(Mandatory = $false, Position = 2)]
    [ValidateScript( {Test-Path $_} )]
    [String]$DestinationPath
) #End Parameter Block...
I used SourcePath as the parameter name because it's descriptive, and $CurrentDir wouldn't make sense here. I defaulted the parameter to the same value you had. In the script body you'll want to replace $CurrentDir with $SourcePath.
Get-ChildItem -Path $SourcePath -Filter *.pptx |
    Sort-Object -Property Name |
    Merge-PowerPointPresentation -Verbose -Open -Destination $DestinationPath
Note the addition of -Destination $DestinationPath in the above call. That should override the default value in the Merge-PowerPointPresentation function.
If you don't use the parameters, the script should behave the same as before. If you do use them, it will operate against whatever paths you give it.
Obviously not tested, but let me know if that helps. Thanks.
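For illustration, a hypothetical call to the modified script (the file name Merge-Presentation.ps1 comes from the question; the C:\Decks paths are made up). Note that ValidateScript({Test-Path $_}) on -DestinationPath requires the target to already exist, so you may want to loosen or drop that check for an output file:
# Example paths only; -SourcePath is the folder containing the .pptx files,
# -DestinationPath is forwarded to Merge-PowerPointPresentation -Destination.
.\Merge-Presentation.ps1 -SourcePath 'C:\Decks' -DestinationPath 'C:\Decks\Combined.pptx'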

Accept a path as literal path, relative path or relative to the script root

Currently we want to cover these 3 cases when retrieving file information:
A literal path
A relative path, relative to the script root
A relative path, relative to the present working directory
To cover these 3 cases we created the following function:
Function Get-FilePathItemHC {
    Param (
        [Parameter(Mandatory)]
        [String]$Path
    )
    $Params = @(
        @{
            # Path relative to the script root
            LiteralPath = Join-Path -Path $PSScriptRoot -ChildPath $Path
        }
        @{
            # Literal or path relative to the present work directory
            Path = $Path
        }
    )
    $Item = $null
    foreach ($P in $Params) {
        if ($Item = Get-Item @P -ErrorAction Ignore) {
            $Item
            Break
        }
    }
    if (-not $Item) {
        throw "Cannot find path '$Path' because it does not exist."
    }
}
Is this the right way of doing it? It seems like we're reinventing the wheel here.
Make your -Path parameter a System.IO.FileInfo object and just pass in a relative path as the argument. The FileInfo object will resolve with either a relative or full path, and you can then use $Path.FullName to reference the full path to the file.
Function Get-FilePathItemHC {
    Param (
        [Parameter(Mandatory)]
        [ValidateScript({ $_.Exists })]
        [System.IO.FileInfo]$Path
    )
    # The ValidateScript attribute makes sure the file you passed in exists
    # so your validation code no longer is required
}
If you want to handle both directories and files, you will need two separate parameters in this case, since directory paths become a System.IO.DirectoryInfo object, but you can make the arguments mutually exclusive:
Function Get-FilePathItemHC {
    Param (
        [Parameter(Mandatory=$true, ParameterSetName="FilePath")]
        [ValidateScript({ $_.Exists })]
        [System.IO.FileInfo]$FilePath,

        [Parameter(Mandatory=$true, ParameterSetName="DirectoryPath")]
        [ValidateScript({ $_.Exists })]
        [System.IO.DirectoryInfo]$DirectoryPath
    )
    $Path = $FilePath
    if( $DirectoryPath ) {
        $Path = $DirectoryPath
    }
    # The ValidateScript attribute makes sure the file you passed in exists
    # so your validation code no longer is required
}
Get-FilePathItemHC -Path .\path\to\file.txt
Get the relative path from $PSScriptRoot
I'm not sure why you need the path relative to $PSScriptRoot if you already have the full path to the file, but after getting the System.IO.FileInfo or System.IO.DirectoryInfo object, you can use Resolve-Path from $PSScriptRoot to get the relative path from that directory:
$file = Get-FilePathItemHC -Path .\path\to\file.txt
Push-Location $PSScriptRoot
$relativeFromScriptRootPath = Resolve-Path -Relative $file
Pop-Location
Push-Location and Pop-Location treat the location as a stack. The push operation sets a new location and adds it to the stack, and the pop operation removes the last added location from the stack and places you at the next most recent location. Works a bit like cd - on Linux if you're familiar.
Resolve-Path will return a file path, and the -Relative switch will return a path relative to your current directory. You cannot pass in an alternate directory to resolve from, which is why we change the location to run this.
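One small, optional hardening of the same pattern (nothing new assumed, just the cmdlets already shown): wrap the location change in try/finally so the location stack is always popped, even if Resolve-Path throws:
$file = Get-FilePathItemHC -Path .\path\to\file.txt

Push-Location $PSScriptRoot
try {
    # -Relative resolves against the current location, which is now $PSScriptRoot
    $relativeFromScriptRootPath = Resolve-Path -Relative $file
}
finally {
    # Always return to the previous location, even if Resolve-Path fails
    Pop-Location
}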

How to search content in files in Powershell

I want to make a dynamic function that searches for the requested $ErrorCode within the files given as input and eventually copies the files containing the error to another folder.
Right now, my code takes only one file and returns the matching line where the $ErrorCode was found. I want to search through multiple files and return the names of the files that contain the $ErrorCode.
function SearchError{
    Param (
        [Parameter (Mandatory=$true)] [STRING] $SourcePath,
        [Parameter (Mandatory=$true)] [STRING] $SourceFile,
        [Parameter (Mandatory=$true)] [STRING] $ErrorCode,
        [Parameter (Mandatory=$true)] [STRING] $FileType
        # [Parameter (Mandatory=$true)] [STRING] $DestPath
    )
    $TargetPath = "$($SourcePath)\$($SourceFile)"
    #Return $TargetPath
    $DestinationPath = "$($DestPath)"
    #Return $DestinationPath
    #foreach($error in $TargetPath) {
    Get-ChildItem $TargetPath | Select-String -pattern $ErrorCode
}
SearchError
Select-String's output objects - which are of type [Microsoft.PowerShell.Commands.MatchInfo] - have a .Path property that reflects the input file path.
Adding the -List switch to Select-String makes it stop searching after the first match in the file, so you'll get exactly 1 output object for each file in which at least 1 match was found.
Therefore, the following outputs only the paths of the input files in which at least 1 match was found:
Get-ChildItem $TargetPath |
Select-String -List -Pattern $ErrorCode | ForEach-Object Path
Note: -Pattern supports an array of regex patterns, so if you define your $ErrorCode parameter as [string[]], files that have any one of the patterns will match; use -SimpleMatch instead of -Pattern to search by literal substrings instead.
Re:
eventually copy the files with the error to another folder
Simply appending | Copy-Item -Destination $DestPath to the above command should do.
Re:
I want to search through multiple files
Depending on your needs, you can make your $SourcePath and $SourceFile parameters array-valued ([string[]]) and / or pass wildcard expressions as arguments.
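Not tested, but pulling those pieces together, the function might look something like the sketch below. The [string[]] parameter types and the re-enabled $DestPath parameter are assumptions based on the suggestions above:
function SearchError {
    Param (
        [Parameter(Mandatory=$true)] [string]   $SourcePath,
        [Parameter(Mandatory=$true)] [string[]] $SourceFile,   # one or more file names / wildcards
        [Parameter(Mandatory=$true)] [string[]] $ErrorCode,    # one or more regex patterns
        [Parameter(Mandatory=$true)] [string]   $DestPath
    )
    # Resolve the input files relative to the source folder
    $targets = $SourceFile | ForEach-Object { Join-Path -Path $SourcePath -ChildPath $_ }

    # -List stops after the first match per file, so each matching file appears once;
    # ForEach-Object Path keeps only the file paths, which Copy-Item then copies.
    Get-ChildItem -Path $targets |
        Select-String -List -Pattern $ErrorCode |
        ForEach-Object Path |
        Copy-Item -Destination $DestPath
}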

Combine multiple log files to archive using PoSh and 7zip

I am trying to join multiple log files into a single archive file, then move that archive file to another location, in an effort to both clean up old log files, and save hard drive space. We have a bunch of tools that all log to the same root, with a per-tool folder for their logs. (E.g.,
C:\ServerLogs
C:\ServerLogs\App1
C:\ServerLogs\2ndApp
each of which will have log files inside, like
C:\ServerLogs\App1\June1.log
C:\ServerLogs\App1\June2.log
C:\ServerLogs\2ndApp\June1.log
C:\ServerLogs\2ndApp\June2.log
I want to go into each of these subfolders, archive up all the files older than 5 days, then move the archive to another (long-term storage) drive and delete the now-zipped files. The tools I'm using are PowerShell and 7zip. The below code is using test locations.
I have cobbled together two scripts from various sources online, over the course of two full shifts, but neither one works right. Here's the first:
# Alias for 7-zip
if (-not (test-path "$env:ProgramFiles\7-Zip\7z.exe")) {throw "$env:ProgramFiles\7-Zip\7z.exe needed"}
set-alias 7zip "$env:ProgramFiles\7-Zip\7z.exe"

$Days = 5 #minimum age of files to archive; in other words, files newer than this many days ago are ignored
$SourcePath = C:\WorkingFolder\FolderSource\
$DestinationPath = C:\Temp\
$LogsToArchive = Get-ChildItem -Recurse -Path $SourcePath | Where-Object {$_.lastwritetime -le (get-date).addDays(-$Days)}
$archive = $DestinationPath + $now + ".7z"
#endregion

foreach ($log in $LogsToArchive) {
    #define Args
    $Args = a -mx9 $archive $log
    $Command = 7zip
    #write-verbose $command
    #invoke the command
    invoke-expression -command $Command $Args
}
The problem with this one is that I get errors trying to invoke the expression. I've tried restructuring it, but then I get errors because my $Args contains an "a".
So I abandoned this method (despite it being my preferred one) and tried this set.
#region Params
param(
    [Parameter(Position=0, Mandatory=$true)]
    [ValidateScript({Test-Path -Path $_ -PathType 'container'})]
    [System.String]
    $SourceDirectory,

    [Parameter(Position=1, Mandatory=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $DestinationDirectory
)
#endregion

function Compress-File{
    #region Params
    param(
        [Parameter(Position=0, Mandatory=$true)]
        [ValidateScript({Test-Path -Path $_ -PathType 'leaf'})]
        [System.String]
        $InputFile,

        [Parameter(Position=1, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [System.String]
        $OutputFile
    )
    #endregion
    try{
        #Creating buffer with size 50MB
        $bytesGZipFileBuffer = New-Object -TypeName byte[](52428800)
        $streamGZipFileInput = New-Object -TypeName System.IO.FileStream($InputFile,[System.IO.FileMode]::Open,[System.IO.FileAccess]::Read)
        $streamGZipFileOutput = New-Object -TypeName System.IO.FileStream($OutputFile,[System.IO.FileMode]::Create,[System.IO.FileAccess]::Write)
        $streamGZipFileArchive = New-Object -TypeName System.IO.Compression.GZipStream($streamGZipFileOutput,[System.IO.Compression.CompressionMode]::Compress)

        for($iBytes = $streamGZipFileInput.Read($bytesGZipFileBuffer, 0, $bytesGZipFileBuffer.Count);
            $iBytes -gt 0;
            $iBytes = $streamGZipFileInput.Read($bytesGZipFileBuffer, 0, $bytesGZipFileBuffer.Count)){
            $streamGZipFileArchive.Write($bytesGZipFileBuffer, 0, $iBytes)
        }

        $streamGZipFileArchive.Dispose()
        $streamGZipFileInput.Close()
        $streamGZipFileOutput.Close()
        Get-Item $OutputFile
    }
    catch { throw $_ }
}

Get-ChildItem -Path $SourceDirectory -Recurse -Exclude "*.7z" | ForEach-Object{
    if($($_.Attributes -band [System.IO.FileAttributes]::Directory) -ne [System.IO.FileAttributes]::Directory){
        #Current file
        $curFile = $_
        #Check the file wasn't modified recently
        if($curFile.LastWriteTime.Date -le (get-date).adddays(-5)){
            $containedDir = $curFile.Directory.FullName.Replace($SourceDirectory,$DestinationDirectory)
            #if target directory doesn't exist - create
            if($(Test-Path -Path "$containedDir") -eq $false){
                New-Item -Path "$containedDir" -ItemType directory
            }
            Write-Host $("Archiving " + $curFile.FullName)
            Compress-File -InputFile $curFile.FullName -OutputFile $("$containedDir\" + $curFile.Name + ".7z")
            Remove-Item -Path $curFile.FullName
        }
    }
}
This actually seems to work, insofar as it creates individual archives for each eligible log, but I need to "bundle" up the logs into one mega-archive, and I can't seem to figure out how to recurse (to get sub-level items) and do a foreach (to confirm age) without having that foreach produce individual archives.
I haven't even gotten into the Move and Delete phase, because I can't seem to get the archiving stage to work properly, but I certainly don't mind grinding away at that once this gets figured out (I've already spent two full days trying to figure this one!).
I greatly appreciate any and all suggestions! If I've not explained something, or been a bit unclear, please let me know!
EDIT1: Part of the requirement, which I completely forgot to mention, is that I need to keep the structure in the new location. So the new location will have
C:\ServerLogs --> C:\Archive\
C:\ServerLogs\App1 --> C:\Archive\App1
C:\ServerLogs\2ndApp --> C:\Archive\2ndApp
C:\Archive
C:\Archive\App1\archivedlogs.zip
C:\Archive\2ndApp\archivedlogs.zip
And I have absolutely no idea how to specify that the logs from App1 need to go to App1.
EDIT2: For this latter part, I used Robocopy - it maintains the folder structure, and if you feed it *.zip as an argument, it will only copy the .zip files.
This line, $Args = a -mx9 $archive $log, likely needs to have the right-side value wrapped in double quotes, OR each non-variable wrapped in quotes with a comma between each so that you get an array of args.
Another method would be to declare an array of args explicitly, something like this ...
$ArgList = @(
    'a'
    '-mx9'
    $archive
    $log
)
I also recommend you NOT use an automatic $Var name. Take a look at Get-Help about_Automatic_Variables and you will see that $Args is one of those. You are strongly recommended NOT to use any of them for anything other than reading; writing to them is iffy. [grin]
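To tie it together, a rough, untested sketch of how the loop body could look with an explicit argument array and the call operator & instead of Invoke-Expression. The 7zip alias, $archive and $log come from your script; a and -mx9 are 7-Zip's "add" and maximum-compression switches:
foreach ($log in $LogsToArchive) {
    # Build the argument list as an array; each element becomes one argument to 7z.exe
    $ArgList = @(
        'a'             # add to archive
        '-mx9'          # maximum compression
        $archive        # target .7z file
        $log.FullName   # use .FullName so 7-Zip gets the full path, not just the file name
    )
    # The call operator (&) runs the aliased exe directly; no Invoke-Expression needed
    & 7zip @ArgList
}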