How to pipe embedded script to powershell

I adapted some code from a solution here, but it doesn't work for me when executed as a bat script. Executed within a bat script, it gives the error below. Executed on the command line (with the correct line number, as calculated by the script, in place of %Line%), it works fine.
The key line is
more +%Line% %0 | powershell -c –
The error is
û : The term 'û' is not recognized as the name of a cmdlet, function, script file, or operable program.
The full Bat script is
@echo off
set fn=Archive-Extract
set fnp0=C:\RMT\VCS\GIT\Games\soulfu\build\dependencies2\libogg-1.2.2.zip
set fnp1=C:\RMT\VCS\GIT\Games\soulfu\build\dependencies2
::::::::::::::::::::::::::::::::::::
REM Set Line to the line number of each line starting with ':'; this leaves Line holding the line number of the last such line.
for /f "delims=:" %%a In ('findstr /Bn ":" %0') do set /A Line=%%a
REM Send the content of this script past the last line starting with ':' to powershell.
more +%Line% %0 | powershell -c –
::::::::::::::::::::::::::::::::::::
dir *.bat
pause & exit /b
::::::::::::::::::::::::::::::::::::
function Archive-Extract([string]$zipFilePath, [string]$destinationPath) {
# This will get added when paths are joined, and path comparison will need it to be absent.
$destinationPath = $destinationPath.TrimEnd("\");
[Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem');
$zipfile = [IO.Compression.ZipFile]::OpenRead($zipFilePath);
# Determine how many top level entries there are.
$ziplevel0files = @{};
$zipfile.Entries | foreach {
$s = ($_.FullName.TrimEnd("/") -split "/")[0];
if ($ziplevel0files.ContainsKey($s)) {
$ziplevel0files[$s] = $ziplevel0files[$s] + 1;
} else {
$ziplevel0files[$s] = 0;
}
}
if ($ziplevel0files.count -ne 1) {
Write-Host "Zip archives are (at this time) expected to contain one top-level directory, and all content within it.";
return 1; # Failure
}
$zipDirPath = Join-Path -Path $destinationPath -ChildPath $ziplevel0files.Keys[0];
# If the directory does not exist, extract the zip archive into the current folder.
if (Test-Path -LiteralPath $zipDirPath) {
Write-Host "Top-level extraction directory already exists.";
return 2; # Failure
}
$zipfile.Entries | foreach {
$extractFilePath = Join-Path -Path $destinationPath -ChildPath $_.FullName;
$extractFileDirPath = Split-Path -Parent $extractFilePath;
# Skip the top-level directory everything comes under.
if ($extractFileDirPath -ne $destinationPath) {
if (-not (Test-Path -LiteralPath $extractFileDirPath -PathType Container)) {
New-Item -Path $extractFileDirPath -Type Directory | Out-Null;
}
# Sometimes a directory comes after a file within the directory (the latter causes it to be created implicitly above).
if (-not $extractFilePath.EndsWith("\")) {
try {
[IO.Compression.ZipFileExtensions]::ExtractToFile($_, $extractFilePath, $true);
} catch {
Write-Host "Failed to extract file:" $extractFilePath;
return 3; # Failure
}
}
}
}
return 0; # Success
}
# Anything that calls should execute the powershell and set the parameters.
$fn = (Get-ChildItem Env:fn).Value;
$f = (get-item -path function:$fn);
Write-Host "sss1" $fn;
if ($fn -eq "Archive-Extract") {
Write-Host "sss2";
$archivepath = (Get-ChildItem Env:fnp0).Value;
$destinationpath = (Get-ChildItem Env:fnp1).Value;
$err = & $f.ScriptBlock $archivepath $destinationpath;
exit $err;
} else {
Write-Host "sss3";
Write-Error "Failed to match function: "+ $fn;
exit 1000;
}
What needs to be added to the BAT script so that it executes the same as when the code is run line-by-line on the command line?
EDIT: Note that when I adapted this script to follow the multi-line PowerShell comment approach recommended by npocmaka, the existing code above based on more, but with no skipped line numbers, worked. However, I am not entirely convinced that this solves the problem: I believe the above code worked fine at one point, as is, and worked for the person who came up with the original base code.
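For reference, here is a minimal, self-contained sketch of the more-based approach (the :EOB marker name is arbitrary). Note the plain ASCII hyphen after -c: in the failing line above, the character after -c is an en-dash (–), and that byte rendered in the console's OEM code page shows up as û, which matches the error message.

@echo off
REM Find the line number of the :EOB marker, then pipe everything after it to PowerShell's stdin.
for /f "delims=:" %%a In ('findstr /Bn ":EOB" "%~f0"') do set /A Line=%%a
more +%Line% "%~f0" | powershell -noprofile -c -
exit /b
:EOB
Write-Host "Hello from the embedded PowerShell block";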

<# :
@echo off
echo --START POWERSHELL--
:: $cmdargs variable will contain the command line.
powershell $cmdargs="""%*""";$exp=$(Get-Content '%~f0') -replace """`n""",""";""" ;invoke-expression """$exp"""
echo --START BATCH--
dir *.bat
pause & exit /b
#>
Write-Host "sss";
exit;
Try this. I think it's a better way of hybridization, and it allows something like argument passing to the powershell script. Just keep in mind that every line should be terminated with ;
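For example, a minimal sketch of argument passing with this pattern (the -split below is a naive whitespace split and won't preserve quoted arguments):

<# :
@echo off
powershell $cmdargs="""%*""";$exp=$(Get-Content '%~f0') -replace """`n""",""";""" ;invoke-expression """$exp"""
exit /b
#>
$argv = -split $cmdargs;
Write-Host "Received $($argv.Count) argument(s): $argv";
exit;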

Related

How to generate Powershell registry import script?

Basically I simply want to right-click on a branch in Regedit and say 'Generate Powershell script for import'. So that instead of a .reg file I get a PS script which will import/create elsewhere the entire selected registry branch with all keys/values etc.
I thought this would be a standard thing somewhere but I can't find anything, nor anyone with the same question, which surprises me.
Of course I could code it all out in PS but I'm feeling really lazy...
What you're looking for would indeed be convenient, but, as of this writing:
There is no official mechanism for customizing the regedit.exe utility's GUI that I'm aware of - unlike the (registry-based) mechanism for customizing File Explorer's shortcut menus.
Conceivably, specialized tools / advanced WinAPI-based techniques exist to achieve that.
Separately, there's no packaged PowerShell solution that I'm aware of that creates self-contained .ps1 scripts that bundle registry-import code with the data to import.
Leaving the regedit.exe GUI-integration aspect out of the picture, the building blocks of what you're looking for are:
(a) Using reg.exe export to export a given registry key's subtree to a .reg file.
(b) Later using reg.exe import to import such a file.
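In isolation, the two building blocks look like this (console.reg is a hypothetical file name; HKCU\Console is the sample key used below):

# (a) Export the HKCU\Console subtree to a .reg file (/y overwrites an existing file without prompting).
reg.exe export HKCU\Console console.reg /y
# (b) Later, possibly on another machine, import that file.
reg.exe import console.reg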
The PowerShell code below combines (a) and (b) as follows:
It performs (a) ...
... and embeds the resulting .reg file's content in a dynamically generated script (.ps1) ...
which, when executed on a given machine, imports the embedded data into the registry, via (b).
Below is function New-RegistryImportScript, which implements the steps above; here's a sample invocation:
Get-Item HKCU:\Console | New-RegistryImportScript -OutPath .
The above creates script .\Import-RegKey_HKEY_CURRENT_USER_Console.ps1, which has the data from the HKEY_CURRENT_USER\Console registry key (subtree) embedded and, when executed, imports that data into the registry.
The script file name was auto-generated, from the given key path, because only an output directory was specified to -OutPath (. to target the current dir.), but you may specify a file path instead, so as to use a file name of choice.
As for regedit.exe integration: Invoke shortcut-menu command Copy Key Name on the key of interest, and then pass it as an argument to New-RegistryImportScript; e.g.:
# 'HKEY_CURRENT_USER\Console' is an example path copied from regedit.exe
New-RegistryImportScript HKEY_CURRENT_USER\Console .
New-RegistryImportScript source code:
function New-RegistryImportScript {
<#
.SYNOPSIS
Generates a self-contained registry-import script.
.DESCRIPTION
Generates a self-contained registry-import script that bundles the
data exported from a given registry key (subtree), using `reg.exe`
behind the scenes.
By default, the content of the generated script is output; redirect
it to a file as needed.
Alternatively, use -OutPath to directly save it to a file.
If you specify a *directory*, a file name is auto-generated as
Import-RegKey_<sanitized_key_path>.ps1, where <sanitized_key_path>
is the input key path with all non-alphanumeric characters replaced with
"_".
If you provide multiple key paths via the pipeline, a *single* output file
is created if you pass a *file* path to -OutPath.
With a *directory* path, an auto-named script is generated for each
input key path.
.EXAMPLE
Get-Item HKCU:\Console | New-RegistryImportScript -OutPath .
Creates automatically named script .\Import-RegKey_HKEY_CURRENT_USER_Console.ps1
with the data exported from HKEY_CURRENT_USER\Console embedded in it.
#>
param(
[Alias('PSPath')]
[Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)] [string] $KeyPath,
[string] $OutPath
)
begin {
# Code to add at the top and bottom of the generated script
$scriptProlog = @'
[CmdletBinding()] param()
$tempFile = "$env:TEMP\" + [System.IO.Path]::GetRandomFileName() + '.reg'
& {
'@
$scriptEpilog = @'
} | Set-Content -Encoding Unicode -LiteralPath $tempFile
reg.exe import $tempFile
Remove-Item -LiteralPath $tempFile
exit $LASTEXITCODE
'@
if ($env:OS -ne 'Windows_NT') { throw "This command runs on Windows only." }
# Note: For cross-PS-edition compatibility we ensure that UTF-8 files *with BOM* are created.
$enc = if ($IsCoreCLR) { 'utf8BOM' } else { 'utf8' }
$autoGenerateName = $OutPath -and (Test-Path -Type Container -LiteralPath $OutPath)
if (-not $OutPath) {
$scriptProlog # Output the prolog to the success output stream.
} elseif (-not $autoGenerateName) {
if (($parentPath = (Split-Path -Parent $OutPath)) -and -not (Test-Path -Type Container -LiteralPath $parentPath)) {
throw "Cannot find part of the output path: $OutPath"
}
Write-Verbose "Generating script `"$($outFile.FullName)`"..."
# Initialize the single output file.
$scriptProlog | Set-Content -LiteralPath $OutPath -Encoding $enc
}
}
process {
# First, try to convert to a full, provider-native path.
$nativeRegPath = Convert-Path -ErrorAction Ignore -LiteralPath $KeyPath
if (-not $nativeRegPath) { $nativeRegPath = $KeyPath } # Assume that a native registry path was directly given.
# Resolve it to a full, native registry path via a Get-Item call.
# By using "Microsoft.PowerShell.Core\Registry::" as the prefix, we rule out non-registry paths.
# !! Sadly, even the .Name property does NOT contain the *case-exact* form of the key path - it reflects the case *as specified*.
# !! However, given that the registry is inherently case-INsensitive, this should not matter.
$nativeRegPath = (Get-Item -ErrorAction Ignore -LiteralPath "Microsoft.PowerShell.Core\Registry::$nativeRegPath").Name
if (-not $nativeRegPath) {
"Not an (existing) registry path: `"$KeyPath`""
return
}
Write-Verbose "Targeting registry key `"$nativeRegPath`""
# Export the target key's subtree from the registry.
$tempFile = New-TemporaryFile
reg.exe export $nativeRegPath $tempFile /y >$null # Creates a UTF-16LE file.
if ($LASTEXITCODE) {
Write-Error "Export of registry key `"$nativeRegPath`" failed."
return
}
$regFileContent = Get-Content -Raw $tempFile
$tempFile | Remove-Item
# Create the part of the generated script that has the exported
# data embedded as a here-string.
$scriptEmbeddedData = @"
Write-Verbose "Importing into ``"$nativeRegPath``"..."
@'
$regFileContent
'@
"@
if (-not $OutPath) {
$scriptEmbeddedData # output to the success output stream
}
else {
if ($autoGenerateName) {
# Auto-generate a filename for the key path at hand.
$OutFile = Join-Path $OutPath ('Import-RegKey_' + ($nativeRegPath -replace '[^\p{L}\d]', '_') + '.ps1')
Write-Verbose -Verbose "Generating auto-named script `"$OutFile`"..."
$scriptProlog, $scriptEmbeddedData, $scriptEpilog | Set-Content -Encoding $enc $OutFile
} else {
# Append the embedded data to the single output script.
$scriptEmbeddedData | Add-Content -Encoding $enc $OutPath
}
}
}
end {
if (-not $OutPath) {
# Output the epilog.
$scriptEpilog
}
elseif (-not $autoGenerateName) {
# Single output file? Append the epilog.
$scriptEpilog | Add-Content -Encoding $enc $OutPath
}
}
}
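Since the generated script's prolog declares [CmdletBinding()], it can be run like any other .ps1 and supports -Verbose:

.\Import-RegKey_HKEY_CURRENT_USER_Console.ps1 -Verbose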

Script file: wait until generated files are complete

I have a script file (batch file) which generates three files in a specific folder. Then I have a ps1 file which copies/moves the generated files to another server/folder. Separately, everything works properly.
I'd like to merge these, if possible, with a wait step between the two scripts: the copy/move ps1 function should only launch once the three files have been correctly generated.
The following assumes:
that the files are created and written in full in a single operation.
that it is the appearance of a *.zip file that signals that all files of interest have been created (though they may still be in the process of being written to), as you've indicated in a later comment.
$inFolder = '.' # set to the folder of interest.
$outFolder = './out' # ditto
Write-Verbose -vb 'Waiting for a *.zip file to appear...'
while (-not (Test-Path "$inFolder/*.zip")) { Start-Sleep 1 }
# Get a list of all files.
$files = Get-ChildItem -File $inFolder
Write-Verbose -vb 'Waiting for all files to be written completely...'
$files | ForEach-Object {
do {
# Infer from the ability to obtain an exclusive lock that the file
# has been written in its entirety.
try { [IO.File]::Open($_.FullName, 'Open', 'Read', 'None').Dispose(); return }
catch { Start-Sleep 1 }
} while ($true)
}
# Move the files elsewhere
Write-Verbose -vb 'Moving...'
$files | Move-Item -Destination $outFolder -WhatIf
Note: The -WhatIf common parameter in the last command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
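The lock-probing idea from the loop above, extracted into a hypothetical reusable helper:

function Wait-FileComplete([string] $LiteralPath) {
    # Block until an exclusive open succeeds, i.e. until no other process still holds the file open.
    while ($true) {
        try { [IO.File]::Open($LiteralPath, 'Open', 'Read', 'None').Dispose(); return }
        catch { Start-Sleep 1 }
    }
}
# Usage: Get-ChildItem -File $inFolder | ForEach-Object { Wait-FileComplete $_.FullName }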
param(
    [String]$sourceDirectory = "c:\tmp\001",
    [String]$destDirectory = "c:\tmp\001"
)
Get-ChildItem $sourceDirectory | ? {
    # This step waits until the file's lock is free.
    [bool]$flag = $false
    while (!$flag) {
        try {
            $FileStream = [System.IO.File]::Open($_.FullName, 'Open', 'Write')
            $FileStream.Close()
            $FileStream.Dispose()
            $flag = $true
        }
        catch {
            Start-Sleep -Seconds 1
        }
    }
    $true
} | Copy-Item -Destination $destDirectory

Converting a line of cmd to powershell

EDIT2: Final code below
I need help converting some code, as I am very new to mkvmerge, PowerShell, and the command prompt.
The CMD code is from https://github.com/Serede/mkvtoolnix-batch/blob/master/mkvtoolnix-batch.bat
for %%f in (*.mkv) do %mkvmerge% @options.json -o "mkvmerge_out/%%f" "%%f"
What I've managed so far
$SourceFolder = "C:\tmp" #In my actual code, this is done using folder browser
$SourceFiles = Get-ChildItem -LiteralPath $SourceFolder -File -Include *.mkv
$SourceFiles | foreach
{
start-process "F:\Desktop\#progs\mkvtoolnix\mkvmerge.exe"
}
I'd be grateful for any help as I'm having trouble understanding and converting while learning both sides. Thank you very much.
**EDIT 2:** Here's my final working code.
Function Get-Folder($initialDirectory) {
#Prompt to choose source folder
[void] [System.Reflection.Assembly]::LoadWithPartialName('System.Windows.Forms')
$FolderBrowserDialog = New-Object System.Windows.Forms.FolderBrowserDialog
$FolderBrowserDialog.Description = 'Choose the video folder'
$FolderBrowserDialog.RootFolder = 'MyComputer'
if ($initialDirectory) { $FolderBrowserDialog.SelectedPath = $initialDirectory }
[void] $FolderBrowserDialog.ShowDialog()
return $FolderBrowserDialog.SelectedPath
}
Function ExitMessage
{
#endregion Function output
Write-Host "`nOperation complete";
Write-Host -NoNewLine 'Press any key to continue...';
$null = $Host.UI.RawUI.ReadKey('NoEcho,IncludeKeyDown');
Exit;
}
$SourceFolder = Get-Folder
#Check for output folder and create if unavailable
$TestFile = "$SourceFolder" + "\mkvmerge_out"
if (-not (Test-Path -LiteralPath $TestFile))
{
new-item -Path $SourceFolder -name "mkvmerge_out" -type directory
Write-Host 'Folder created';
}
#Checking for the presence of a Json file
$TestFile = (Get-ChildItem -LiteralPath $SourceFolder -File -Filter *.json)
if ($TestFile.count -eq 0)
{
Write-Host 'json file not found';
ExitMessage;
}
$TestFile = "$SourceFolder" + "\$TestFile"
#Getting the total number of files and start timer.
[Int] $TotalFiles = 0;
[Int] $FilesDone = 0;
$TotalFiles = (Get-ChildItem -LiteralPath $SourceFolder -File -Filter *.mkv).count
$PercentFiles = 0;
$Time = [System.Diagnostics.Stopwatch]::StartNew()
#Start mkvmerge process with progress bar
$mkvmergeExe = 'F:\Desktop\#progs\mkvtoolnix\mkvmerge.exe'
$JsonFile = "$TestFile" # alternatively, use Join-Path
Get-ChildItem -LiteralPath $SourceFolder -File -Filter *.mkv | ForEach-Object {
$PercentFiles = [math]::truncate(($FilesDone/$TotalFiles)*100)
Write-Progress -Activity mkvmerge -Status ("{0}% Completed; {1}/{2} done; Time Elapsed: {3:d2}:{4:d2}:{5:d2}" -f $PercentFiles, $FilesDone, $TotalFiles, $Time.Elapsed.Hours, $Time.Elapsed.minutes, $Time.Elapsed.seconds) -PercentComplete $PercentFiles;
Write-Host "Processing $_"
$f = $_.FullName
$of = "$SourceFolder\mkvmerge_out\$($_.Name)"
& $mkvmergeExe -q `#$JsonFile -o $of $f
$FilesDone++
}
Remove-Item -LiteralPath $JsonFile #Remove this line if you want to keep the Json file
$PercentFiles = [math]::truncate(($FilesDone/$TotalFiles)*100)
Write-Progress -Activity mkvmerge -Status ("{0}% Completed; {1}/{2} done; Time Elapsed: {3:d2}:{4:d2}:{5:d2}" -f $PercentFiles, $FilesDone, $TotalFiles, $Time.Elapsed.Hours, $Time.Elapsed.minutes, $Time.Elapsed.seconds) -PercentComplete $PercentFiles;
ExitMessage;
$mkvmergeExe = 'F:\Desktop\#progs\mkvtoolnix\mkvmerge.exe'
$optionsFile = "$SourceFolder\options.json" # alternatively, use Join-Path
Get-ChildItem -LiteralPath $SourceFolder -File -Filter *.mkv | ForEach-Object {
$f = $_.FullName
$of = "$SourceFolder\mkvmerge_out\$($_.Name)"
& $mkvmergeExe `#$optionsFile -o $of $f
}
Note that your cmd code assumes that it's operating in the current directory, while your PowerShell code passes a directory explicitly via $SourceFolder; therefore, the options.json file must be looked for in $SourceFolder too, and the output file path passed to -o must be prefixed with $SourceFolder as well, which is achieved via expandable strings ("...").
The main points to consider:
for %%f in (*.mkv) has no direct counterpart in PowerShell; you correctly used Get-ChildItem instead, to get a list of matching files, which are returned as System.IO.FileInfo instances.
However, -Include won't work as intended in the absence of -Recurse (unless you append \*; see this GitHub issue); -Filter does work, and is also the faster method, but it has its limitations and legacy quirks (see this answer).
While PowerShell too allows you to execute commands whose names or paths are stored in a variable (or specified as a quoted string literal), you then need &, the call operator, to invoke it, for syntactic reasons; see the small example after this list.
Inside a script block ({ ... }) passed to the ForEach-Object cmdlet, automatic variable $_ represents the pipeline input object at hand.
$_.FullName ensures that the System.IO.FileInfo input instances are represented by their full path when used in a string context.
This extra step is no longer necessary in PowerShell [Core] 6+, where System.IO.FileInfo instances thankfully always stringify as their full paths.
The @ character is preceded by ` (backtick), PowerShell's escape character, because @ - unlike in cmd - is a metacharacter, i.e. a character with special syntactic meaning. `@ ensures that the @ is treated verbatim, and therefore passed through to mkvmerge.
Alternatively, you could have quoted the argument instead of escaping just the @: "@$optionsFile"
See this answer for background information.
You generally do not need to enclose arguments in "..." in PowerShell, unless they contain spaces or other metacharacters.
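To illustrate the call-operator point with a hypothetical executable path containing spaces:

# & is required because the command path is held in a variable (hypothetical path and switch).
$exe = 'C:\Program Files\SomeTool\tool.exe'
& $exe --version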

Powershell: how to read an environment variable?

This is the first PowerShell script I have attempted. When I run it, part one runs fine and creates log.txt, but when I run it again it still runs part 1. How can I check whether the log file exists?
EDIT: I forgot to mention that I am running the script from PowerShell ISE if that makes a difference.
#Variables.
#Set working directory to scripts location.
$scriptpath = $MyInvocation.MyCommand.Path
$dir = Split-Path $scriptpath
#Check if log file exists.
$ChkFile = "%userprofile%\Desktop\log.txt"
$FileExists = (Test-Path $ChkFile -PathType Leaf)
#Set dir to script location.
Set-Location $dir
#Part 1.
If (!($FileExists))
{
Write-Host "Part 1"
echo 0 >>log.txt
}
#Part 2.
ElseIf ($FileExists)
{
Write-Host "Part 2"
}
% is for cmd.exe variable expansion. You need a different syntax for PowerShell. Instead use:
"$env:userprofile\Desktop\log.txt"

How to speed up Powershell Get-Childitem over UNC

DIR or GCI is slow in Powershell, but fast in CMD. Is there any way to speed this up?
In CMD.exe, after a sub-second delay, this responds as fast as the CMD window can keep up
dir \\remote-server.domain.com\share\folder\file*.*
In Powershell (v2), after a 40+ second delay, this responds with noticeable slowness (maybe 3-4 lines per second)
gci \\remote-server.domain.com\share\folder\file*.*
I'm trying to scan logs on a remote server, so maybe there's a faster approach.
get-childitem \\$s\logs -include $filemask -recurse | select-string -pattern $regex
Okay, this is how I'm doing it, and it seems to work.
$files = cmd /c "$GETFILESBAT \\$server\logs\$filemask"
foreach( $f in $files ) {
if( $f.length -gt 0 ) {
select-string -Path $f -pattern $regex | foreach-object { $_ }
}
}
Then $GETFILESBAT points to this:
@dir /a-d /b /s %1
@exit
I'm writing and deleting this BAT file from the PowerShell script, so I guess it's a PowerShell-only solution, but it doesn't use only PowerShell.
My preliminary performance metrics show this to be eleventy-thousand times faster.
I tested gci vs. cmd dir vs. FileIO.FileSystem.GetFiles from @Shawn Melton's referenced link.
The bottom line is that, for daily use on local drives, GetFiles is the fastest. By far. CMD DIR is respectable. Once you introduce a slower network connection with many files, CMD DIR is slightly faster than GetFiles. Then Get-ChildItem... wow, this ranges from not too bad to horrible, depending on the number of files involved and the speed of the connection.
Some test runs. I've moved GCI around in the tests to make sure the results were consistent.
10 iterations of scanning c:\windows\temp for *.tmp files
.\test.ps1 "c:\windows\temp" "*.tmp" 10
GetFiles ... 00:00:00.0570057
CMD dir ... 00:00:00.5360536
GCI ... 00:00:01.1391139
GetFiles is 10x faster than CMD dir, which itself is more than 2x faster than GCI.
10 iterations of scanning c:\windows\temp for *.tmp files with recursion
.\test.ps1 "c:\windows\temp" "*.tmp" 10 -recurse
GetFiles ... 00:00:00.7020180
CMD dir ... 00:00:00.7644196
GCI ... 00:00:04.7737224
GetFiles is a little faster than CMD dir, and both are almost 7x faster than GCI.
10 iterations of scanning an on-site server on another domain for application log files
.\test.ps1 "\\closeserver\logs\subdir" "appname*.*" 10
GetFiles ... 00:00:00.3590359
CMD dir ... 00:00:00.6270627
GCI ... 00:00:06.0796079
GetFiles is about 2x faster than CMD dir, itself 10x faster than GCI.
One iteration of scanning a distant server on another domain for application log files, with many files involved
.\test.ps1 "\\distantserver.company.com\logs\subdir" "appname.2011082*.*"
CMD dir ... 00:00:00.3340334
GetFiles ... 00:00:00.4360436
GCI ... 00:11:09.5525579
CMD dir is fastest going to the distant server with many files, but GetFiles is respectably close. GCI on the other hand is a couple of thousand times slower.
Two iterations of scanning a distant server on another domain for application log files, with many files
.\test.ps1 "\\distantserver.company.com\logs\subdir" "appname.20110822*.*" 2
CMD dir ... 00:00:00.9360240
GetFiles ... 00:00:01.4976384
GCI ... 00:22:17.3068616
More or less linear increase as test iterations increase.
One iteration of scanning a distant server on another domain for application log files, with fewer files
.\test.ps1 "\\distantserver.company.com\logs\othersubdir" "appname.2011082*.*" 10
GetFiles ... 00:00:00.5304170
CMD dir ... 00:00:00.6240200
GCI ... 00:00:01.9656630
Here GCI is not too bad, GetFiles is 3x faster, and CMD dir is close behind.
Conclusion
GCI needs a -raw or -fast option that does not try to do so much. In the meantime, GetFiles is a healthy alternative that is only occasionally a little slower than CMD dir, and usually faster (due to spawning CMD.exe?).
For reference, here's the test.ps1 code.
param ( [string]$path, [string]$filemask, [switch]$recurse=$false, [int]$n=1 )
[reflection.assembly]::loadwithpartialname("Microsoft.VisualBasic") | Out-Null
write-host "GetFiles... " -nonewline
$dt = get-date
for ($i = 0; $i -lt $n; $i++) {
    if ($recurse) {
        [Microsoft.VisualBasic.FileIO.FileSystem]::GetFiles($path,
            [Microsoft.VisualBasic.FileIO.SearchOption]::SearchAllSubDirectories, $filemask) | out-file ".\testfiles1.txt"
    } else {
        [Microsoft.VisualBasic.FileIO.FileSystem]::GetFiles($path,
            [Microsoft.VisualBasic.FileIO.SearchOption]::SearchTopLevelOnly, $filemask) | out-file ".\testfiles1.txt"
    }
}
$dt2 = get-date
write-host $dt2.subtract($dt)
write-host "CMD dir... " -nonewline
$dt = get-date
for ($i = 0; $i -lt $n; $i++) {
    if ($recurse) { cmd /c "dir /a-d /b /s $path\$filemask" | out-file ".\testfiles2.txt" }
    else { cmd /c "dir /a-d /b $path\$filemask" | out-file ".\testfiles2.txt" }
}
$dt2 = get-date
write-host $dt2.subtract($dt)
write-host "GCI... " -nonewline
$dt = get-date
for ($i = 0; $i -lt $n; $i++) {
    if ($recurse) { get-childitem "$path\*" -include $filemask -recurse | out-file ".\testfiles0.txt" }
    else { get-childitem "$path\*" -include $filemask | out-file ".\testfiles0.txt" }
}
$dt2 = get-date
write-host $dt2.subtract($dt)
Here is a good explanation by Lee Holmes of why Get-ChildItem is slow. If you take note of the comment from "Anon 11 Mar 2010 11:11 AM" at the bottom of the page, Anon's solution might work for you.
Anon's Code:
# SCOPE: SEARCH A DIRECTORY FOR FILES (W/WILDCARDS IF NECESSARY)
# Usage:
# $directory = "\\SERVER\SHARE"
# $searchterms = "filename[*].ext"
# PS> $Results = Search $directory $searchterms
[reflection.assembly]::loadwithpartialname("Microsoft.VisualBasic") | Out-Null
Function Search {
# Parameters $Path and $SearchString
param ([Parameter(Mandatory=$true, ValueFromPipeline = $true)][string]$Path,
[Parameter(Mandatory=$true)][string]$SearchString
)
try {
#.NET FindInFiles Method to Look for file
# BENEFITS : Possibly running as background job (haven't looked into it yet)
[Microsoft.VisualBasic.FileIO.FileSystem]::GetFiles(
$Path,
[Microsoft.VisualBasic.FileIO.SearchOption]::SearchAllSubDirectories,
$SearchString
)
} catch { $_ }
}
I tried some of the suggested methods with a large number of files (~190,000). As mentioned in Kyle's comment, GetFiles isn't very useful here, because it takes nearly forever.
cmd dir was better than Get-ChildItem in my first tests, but it seems GCI speeds up a lot if you use the -Force parameter; with it, the time needed was about the same as for cmd dir.
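Presumably something along these lines (using the question's UNC path):

gci -Force \\remote-server.domain.com\share\folder\file*.*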
P.S.: In my case I had to exclude most of the files because of their extension. This was done with -Exclude in gci and with a | where in the other commands, so the results for a plain file search might differ slightly.
Here's a reader that parses cmd /c dir output (which can handle UNC paths) and collects the three most important properties most people need: full path, size, and timestamp.
Usage would be something like $files_with_details = $faster_get_files.GetFileList($unc_compatible_folder), and there's a helper function to check the combined size: $faster_get_files.GetSize($files_with_details).
$faster_get_files = New-Module -AsCustomObject -ScriptBlock {
#$DebugPreference = 'Continue' #verbose, this will take figuratively forever
#$DebugPreference = 'SilentlyContinue'
$directory_filter = "Directory of (.+)"
$file_filter = "(\d+/\d+/\d+)\s+(\d+:\d+ \w{2})\s+([\d,]+)\s+(.+)" # [1] is date, [2] is time (AM/PM), [3] is size, [4] is filename
$extension_filter = "(.+)[\.](\w{3,4})" # [1] is leaf, [2] is extension
$directory = ""
function GetFileList ($directory = $this.directory) {
if ([System.IO.Directory]::Exists($directory)) {
# Gather raw file list
write-Information "Gathering files..."
$files_raw = cmd /c dir "$directory\*.*" /s /a-d
# Parse file list
Write-Information "Parsing file list..."
$files_with_details = foreach ($line in $files_raw) {
Write-Debug "starting line {$($line)}"
Switch -regex ($line) {
$this.directory_filter {
$directory = $matches[1]
break
}
$this.file_filter {
Write-Debug "parsing matches {$($matches.value -join ";")}"
$date = $matches[1]
$time = $matches[2] # am/pm style
$size = $matches[3]
$filename = $matches[4]
# we do a second match here so as to not append a fake period to files without an extension, otherwise we could do a single match up above
Write-Debug "parsing extension from {$($filename)}"
if ($filename -match $this.extension_filter) {
$file_leaf = $matches[1]
$file_extension = $matches[2]
} else {
$file_leaf = $filename
$file_extension = ""
}
[pscustomobject][ordered]@{
"fullname" = [string]"$($directory)\$($filename)"
"filename" = [string]$filename
"folder" = [string]$directory
"file_leaf" = [string]$file_leaf
"extension" = [string]$file_extension
"date" = get-date "$($date) $($time)"
"size" = [int]$size
}
break
}
} # finish directory/file test
} # finish all files
return $files_with_details
} #finish directory exists test
else { throw ("Directory not found") } #directory doesn't exist
}
function GetSize($files_with_details) {
$combined_size = ($files_with_details|measure -Property size -sum).sum
$pretty_size_gb = "$([math]::Round($combined_size / 1GB, 4)) GB"
return $pretty_size_gb
}
Export-ModuleMember -Function * -Variable *
}
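A quick end-to-end sketch (the UNC path is a placeholder):

$files_with_details = $faster_get_files.GetFileList('\\server\share\logs')
$faster_get_files.GetSize($files_with_details)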