Background:
I'm trying to filter a specific Git branch out of my local branches, so I'm using git branch --all.
Powershell specific question:
I'm performing pipeline filtering via Where-Object and want to ensure that only one object and not an array is returned from the pipeline.
I've e.g. :
$branch = Invoke-Expression "git branch --all" | % { $_.Trim('*').Trim() } | ? { $_ -match "MySpecificBranchRegex" }
If I mess up my specific filter regex $branch might be an array and not a string.
Is there an elegant way to ensure that only one string is returned? Possible solutions I don't like:
Call Select-Object -First 1 at the end of the pipeline
Performing a type check like if ($arr.GetType().BaseType.Name -eq "Array")
Thx.
Well, Select-Object -First 1 is the elegant solution imho, but you could turn it around by forcing the result to always be an array:
$branches = @(Invoke-Expression "git branch --all" | % { $_.Trim('*').Trim() } | ? { $_ -match "MySpecificBranchRegex" })
if ($branches.Count -ne 1)
{
throw "Something went wrong..."
}
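A small follow-up sketch (not part of the original answer): once the count check has passed, you can simply index into the array to get the single string.
# Safe after the count check above: exactly one element exists.
$branch = $branches[0]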
$result = git branch --all | where { $_ -match 'MySpecificBranchRegex' }
$result.count
I wouldn't be above using findstr (even with quotes) instead of where. /i is case insensitive.
$result = git branch --all | findstr /i MySpecificBranchRegex
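If you also want the single-result guarantee with this approach, the array subexpression operator from the previous answer works here too; a small sketch:
# @() makes .Count reliable even when findstr returns just one line.
$result = @(git branch --all | findstr /i MySpecificBranchRegex)
if ($result.Count -ne 1) { throw "Expected exactly one matching branch." }
$branch = $result[0].Trim('*').Trim()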
I'm trying to create a report file which lists all files in my Git repository with these columns:
Name
Size
Last Modified
Git Commit
The first 3 are no problem:
gci -File -Recurse | Select-Object -Property @{
label = 'Size(KB)'
expr = { [string]::Format("{0:0.00}", $_.Length/1KB) }
}, LastWriteTime, FullName
However, retrieving the git commit requires running Git, a native command.
I've tried, among others:
gci -File -Recurse | Select-Object -Property @{
label = 'Git commit'
expr = { Invoke-Expression "git log -1 --format:format=`"%s`" -- $($_.FullName)" }
},
@{
label = 'Size(KB)'
expr = { [string]::Format("{0:0.00}", $_.Length/1KB) }
}, LastWriteTime, FullName
But it just gets stuck.
Does anyone know how to do this?
P.S.
The exact flags and options passed to Git don't really matter here; I just copied what I already had.
As @mklement0 suggested in the comments, the issue was just that the formatting of your --format argument was off just enough to cause a problem.
You had:
--format:format="%s" # won't work
--format=format:"%s" # works
So to fix it, we just swap in the right format, giving us this command with the output below.
gci -File | Select-Object -Property @{
label = 'Git commit'
expr = { Invoke-Expression "git log -1 --format=format:`"%s`" $($_.Name)" }
},
@{
label = 'Size(KB)'
expr = { [string]::Format("{0:0.00}", $_.Length/1KB) }
}, LastWriteTime, FullName
Git commit Size(KB) LastWriteTime FullName
---------- -------- ------------- --------
applied gitingore 6.07 5/11/2020 11:22:06 AM C:\git\ClientFaux\.gitignore
cleanin up layout 1.40 10/9/2020 3:20:33 PM C:\git\ClientFaux\ClientFaux.sln
Create LICENSE (#25) 34.98 7/13/2020 9:55:00 AM C:\git\ClientFaux\LICENSE
Update README.md (#27) 3.37 7/13/2020 9:55:00 AM C:\git\ClientFaux\README.md
31.13 7/13/2020 9:55:27 AM C:\git\ClientFaux\UpgradeLog.htm
FoxDeploy's helpful answer contains the crucial pointer, but let me offer a (corrected) PowerShell-idiomatic reformulation of your code.
Get-ChildItem -File -Recurse | Select-Object -Property @{
label = 'Git commit'
expr = { git log -1 --format=format:%s -- $_.FullName }
},
@{
label = 'Size(KB)'
expr = { '{0:0.00}' -f ($_.Length/1KB) }
}, LastWriteTime, FullName
Note:
Invoke-Expression should generally be avoided; definitely don't use it to invoke an external program or PowerShell script, so the above uses direct invocation of git.
As mentioned, --format:format="%s" is syntactically incorrect and should be --format=format:"%s" (I've omitted the " above, which PowerShell would strip behind the scenes anyway).
-f, the format operator, is used as the PowerShell-idiomatic alternative to directly calling the underlying .NET API, System.String.Format.
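For instance, a quick illustration of the -f operator with an arbitrary value:
'{0:0.00}' -f (1536 / 1KB)   # -> 1.50, same result as [string]::Format("{0:0.00}", 1536/1KB)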
Calling native executables in calculated properties:
Generally, note that there's nothing fundamentally special about calling native executables from the expression script block ({ ... }) of a calculated property.
Specifically, the following considerations apply:
In PowerShell, stdout output from native executables is invariably converted to text ([string] instances). If the output comprises multiple lines, the property value becomes a (regular [object[]]) array containing those strings.
If the native executable produces no stdout output, the value of the property is "nothing", i.e. the so-called "Automation Null" value ([System.Management.Automation.Internal.AutomationNull]::Value), which in most contexts behaves like $null.
Any stderr output from the native executable is quietly discarded. (Analogously, errors from PowerShell commands or expressions in expression script blocks are quietly ignored.)
Therefore, in order to troubleshoot an expression script block, execute it stand-alone, via a ForEach-Object command (whose built-in alias is %), so as to surface any errors; e.g.:
# Uses broken `git` syntax; note the resulting error message.
PS> (Get-ChildItem -File)[0] | % { git log -1 --format:format=%s -- $_.FullName }
fatal: unrecognized argument: --format:format=%s
As for what you tried:
Because your git command was syntactically incorrect, git only produced stderr output, which PowerShell then ignored, as explained above.
Thus, your Git commit properties ended up containing "nothing", which simply renders blank in output formatting.
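If you want to see git's stderr instead of having it silently discarded, one option while troubleshooting (a small sketch, not part of the original answer) is to merge the error stream into the success stream with 2>&1:
# Merge stderr into stdout so git's error message becomes visible as output.
(Get-ChildItem -File)[0] | % { git log -1 --format:format=%s -- $_.FullName 2>&1 }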
How do I get the name of the previous cmdlet in the pipeline? For example:
gci myDir\*.ps1 | % { $prevCmdletName = ...?... }
resolve-Path myDir\*.ps1 | % { $prevCmdletName = ...?... }
gci myDir\*.ps1 | ? { $_.fullname -match 'tests' } | % { $prevCmdletName = ...?... }
'test1.ps1', 'test2.ps1' | % { $prevCmdletName = ...?... }
Is there common code to determine the previous cmdlet in the pipeline? Is there a module with such functions?
Thanks.
What you may be after is a transcript: Start-Transcript filepath\filename. It really depends on what the desired end result will be, but a transcript will show you which commands are executing and what those commands are doing.
If you want your code to tell you what command you are running while you're running it, then that's a seemingly strange requirement; however, it can be done.
gci c:\ | % {(Get-PSCallStack).Position.StartScriptPosition.GetFullScript()}
Doing this will obscure your output, but you can add it as one of the outputs. To determine the precise previous command in the pipeline, you could get creative: take that string output, split it on the pipeline character, and wrap the logic in a script block that picks the item in the split list you're after. Doing this all in one line as part of the pipe will yield undesired results.
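A rough sketch of that splitting idea (illustrative only; it assumes the whole pipeline sits on one line and that '|' never appears inside strings or nested script blocks):
gci C:\ | % {
    $fullScript = (Get-PSCallStack).Position.StartScriptPosition.GetFullScript()
    $segments = $fullScript -split '\|'
    # first token of the segment before this ForEach-Object, e.g. 'gci'
    ($segments[0].Trim() -split '\s+')[0]
}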
It's unclear what you are after.
The previously iterated $PsItem/$_ in a ForEach?
Other than storing the previous in a variable?
$prevCmdletName = ""
Get-ChildItem myDir\*.ps1 | ForEach-Object {
"Current {0} previous {1}" -f $_,$prevCmdletName
$prevCmdletName=$_
}
Currently, I'm writing a PowerShell module which automatically configures aliases for all git commands, inspired by git-sh.
Then I wrote functions below.
The Enable-GitAliases function is the entry point to configure aliases automatically.
It collects git's subcommands via Get-GitCommands, which parses git --help -a to get all of git's subcommands.
Then it defines the wrapper functions for the collected git commands.
My question is: why is git --help -a called so many times (possibly infinitely) when invoking Enable-GitAliases, causing a significant slowdown?
After writing the code, I found that Enable-GitAliases takes too much time (I've never seen it finish).
According to the Task Manager, the git --help -a command is launched and exits repeatedly.
I expected the git --help -a command to be called only once.
Actually, Get-GitCommands | % { echo $_ } calls git --help -a only once.
What is the difference, and what is best way to fix?
function Get-GitCommands {
-Split (git --help -a | select-string -pattern '^ [-a-zA-Z0-9.]+\s*')
}
function Enable-GitAliases($avoidConflicts = $true) {
Get-GitCommands | % {
$aliasName = $_
if (-not ($avoidConflicts -and (Get-Command $aliasName 2> $null) -ne $null)) {
Enable-GitAliases $aliasName
}
}
}
function Enable-GitAlias($commandName) {
$wrapper = @'
function global:{0} {{
git {0} $args
}}
'@ -f $commandName
Invoke-Expression $wrapper
}
You call Enable-GitAliases recursively, but is this intended?
Is this your intention?
function Enable-GitAliases($avoidConflicts = $true) {
Get-GitCommands | % {
$aliasName = $_
if (-not ($avoidConflicts -and (Get-Command $aliasName 2> $null) -ne $null)) {
# Enable-GitAliases -> Enable-GitAlias
Enable-GitAlias $aliasName
}
}
}
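With that one-character fix applied, a quick usage sketch (assuming git is on your PATH and the functions above have been loaded):
Enable-GitAliases
# Each git subcommand now has a global wrapper function, e.g.:
status          # runs: git status
checkout main   # runs: git checkout main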
I was wondering if this was possible. I am trying to make a script we will refer to as a master script. This script queries a DB to get a list of servers we will call $svrs. Simple stuff.
The thing I don't know how to do, or whether it is even possible, is to run a series of subscripts from the master script using the $svrs.Name variable as a parameter to those scripts.
$svrs = "get list sql stuff"
$scrpath = 'D:\test'
$scripts = Get-ChildItem $scrpath
$scripts.Name | ForEach-Object {
Invoke-Expression $_ {I have no idea how to get server name variable here}
}
Based on the comments, you do need a nested loop which won't be too complicated.
$Scripts | Select-Object -ExpandProperty Name | % { $curScript = $_
    $Servers | % { & ".\$curScript" $_ }
}
I ended up resolving this myself with @JNK's assistance...
Here is how I got the result I needed.
$allServers | ForEach-Object {
$currentServer = $_
$scripts.Name | ForEach-Object {
Invoke-Expression ".\$_ $currentServer"
}
}
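As an aside, the same nested loop can avoid Invoke-Expression (which is generally best avoided) by using the call operator; a small sketch, assuming each subscript takes the server name as its first positional parameter:
$allServers | ForEach-Object {
    $currentServer = $_
    $scripts | ForEach-Object {
        # invoke the script file directly; no command string needs to be built
        & $_.FullName $currentServer
    }
}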
I have written a simple PowerShell filter that pushes the current object down the pipeline if its date is between the specified begin and end date. The objects coming down the pipeline are always in ascending date order, so as soon as the date exceeds the specified end date I know my work is done, and I would like to tell the pipeline that the upstream commands can abandon their work so that the pipeline can finish. I am reading some very large log files and I will frequently want to examine just a portion of the log. I am pretty sure this is not possible, but I wanted to ask to be sure.
It is possible to break a pipeline with anything that would otherwise break an outside loop or halt script execution altogether (like throwing an exception). The solution then is to wrap the pipeline in a loop that you can break if you need to stop the pipeline. For example, the below code will return the first item from the pipeline and then break the pipeline by breaking the outside do-while loop:
do {
Get-ChildItem|% { $_;break }
} while ($false)
This functionality can be wrapped into a function like this, where the last line accomplishes the same thing as above:
function Breakable-Pipeline([ScriptBlock]$ScriptBlock) {
do {
. $ScriptBlock
} while ($false)
}
Breakable-Pipeline { Get-ChildItem|% { $_;break } }
It is not possible to stop an upstream command from a downstream command. It will continue to filter out objects that do not match your criteria, but the first command will still process everything it was set to process.
The workaround is to do more filtering in the upstream cmdlet or function/filter. Working with log files makes it a bit more complicated, but perhaps using Select-String and a regular expression to filter out the undesired dates might work for you.
Unless you know how many lines you want to take and from where, the whole file will be read to check for the pattern.
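For example, a rough sketch of the Select-String idea mentioned above (the file name and date pattern are made up for illustration):
# Pre-filter lines whose timestamp falls on specific days before any
# further date-range filtering downstream.
Select-String -Path .\big.log -Pattern '^2013-06-1[0-5]' | ForEach-Object { $_.Line }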
You can throw an exception when ending the pipeline.
gc demo.txt -ReadCount 1 | %{$num=0}{$num++; if($num -eq 5){throw "terminated pipeline!"}else{write-host $_}}
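Note that the thrown exception will abort the rest of your script unless you catch it; a small wrapper sketch (not from the original answer):
try {
    gc demo.txt -ReadCount 1 | %{$num=0}{$num++; if($num -eq 5){throw "terminated pipeline!"}else{write-host $_}}
}
catch {
    # swallow the deliberate pipeline-termination exception and carry on
}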
or
Look at this post about how to terminate a pipeline: https://web.archive.org/web/20160829015320/http://powershell.com/cs/blogs/tobias/archive/2010/01/01/cancelling-a-pipeline.aspx
Not sure about your exact needs, but it may be worth your time to look at Log Parser to see if you can't use a query to filter the data before it even hits the pipe.
If you're willing to use non-public members, here is a way to stop the pipeline. It mimics what select-object does. invoke-method (alias im) is a function to invoke non-public methods. select-property (alias selp) is a function to select (similar to select-object) non-public properties; however, it automatically acts like -ExpandProperty if only one matching property is found. (I wrote select-property and invoke-method at work, so I can't share the source code of those.)
# Get the system.management.automation assembly
$script:smaa=[appdomain]::currentdomain.getassemblies()|
? location -like "*system.management.automation*"
# Get the StopUpstreamCommandsException class
$script:upcet=$smaa.gettypes()| ? name -like "*StopUpstreamCommandsException*"
function stop-pipeline {
# Create a StopUpstreamCommandsException
$upce = [activator]::CreateInstance($upcet,@($pscmdlet))
$PipelineProcessor=$pscmdlet.CommandRuntime|select-property PipelineProcessor
$commands = $PipelineProcessor|select-property commands
$commandProcessor= $commands[0]
$ci = $commandProcessor|select-property commandinfo
$upce.RequestingCommandProcessor | im set_commandinfo @($ci)
$cr = $commandProcessor|select-property commandruntime
$upce.RequestingCommandProcessor| im set_commandruntime @($cr)
$null = $PipelineProcessor|
invoke-method recordfailure @($upce, $commandProcessor.command)
if ($commands.count -gt 1) {
$doCompletes = @()
1..($commands.count-1) | % {
write-debug "Stop-pipeline: added DoComplete for $($commands[$_])"
$doCompletes += $commands[$_] | invoke-method DoComplete -returnClosure
}
foreach ($DoComplete in $doCompletes) {
$null = & $DoComplete
}
}
throw $upce
}
EDIT: per mklement0's comment:
Here is a link to the Nivot Ink blog post on the "poke" module, which similarly gives access to non-public members.
As far as additional comments, I don't have meaningful ones at this point. This code just mimics what a decompilation of select-object reveals. The original MS comments (if any) are of course not in the decompilation. Frankly I don't know the purpose of the various types the function uses. Getting that level of understanding would likely require a considerable amount of effort.
My suggestion: get Oisin's poke module. Tweak the code to use that module. And then try it out. If you like the way it works, then use it and don't worry how it works (that's what I did).
Note: I haven't studied "poke" in any depth, but my guess is that it doesn't have anything like -returnClosure. However, adding that should be as easy as this:
if (-not $returnClosure) {
$methodInfo.Invoke($arguments)
} else {
{$methodInfo.Invoke($arguments)}.GetNewClosure()
}
Here's an - imperfect - implementation of a Stop-Pipeline cmdlet (requires PS v3+), gratefully adapted from this answer:
#requires -version 3
Filter Stop-Pipeline {
$sp = { Select-Object -First 1 }.GetSteppablePipeline($MyInvocation.CommandOrigin)
$sp.Begin($true)
$sp.Process(0)
}
# Example
1..5 | % { if ($_ -gt 2) { Stop-Pipeline }; $_ } # -> 1, 2
Caveat: I don't fully understand how it works, though fundamentally it takes advantage of Select -First's ability to stop the pipeline prematurely (PS v3+). However, in this case there is one crucial difference to how Select -First terminates the pipeline: downstream cmdlets (commands later in the pipeline) do not get a chance to run their end blocks.
Therefore, aggregating cmdlets (those that must receive all input before producing output, such as Sort-Object, Group-Object, and Measure-Object) will not produce output if placed later in the same pipeline; e.g.:
# !! NO output, because Sort-Object never finishes.
1..5 | % { if ($_ -gt 2) { Stop-Pipeline }; $_ } | Sort-Object
Background info that may lead to a better solution:
Thanks to PetSerAl, my answer here shows how to produce the same exception that Select-Object -First uses internally to stop upstream cmdlets.
However, there the exception is thrown from inside the cmdlet that is itself connected to the pipeline to stop, which is not the case here:
Stop-Pipeline, as used in the examples above, is not connected to the pipeline that should be stopped (only the enclosing ForEach-Object (%) block is), so the question is: How can the exception be thrown in the context of the target pipeline?
Try these filters. They force the pipeline to stop after the first object (or the first n elements) and store it (or them) in a variable; you need to pass the name of the variable. If you don't, the object(s) are pushed out but cannot be assigned to a variable.
filter FirstObject ([string]$vName = '') {
if ($vName) {sv $vName $_ -s 1} else {$_}
break
}
filter FirstElements ([int]$max = 2, [string]$vName = '') {
if ($max -le 0) {break} else {$_arr += ,$_}
if (!--$max) {
if ($vName) {sv $vName $_arr -s 1} else {$_arr}
break
}
}
# can't assign to a variable directly
$myLog = get-eventLog security | ... | firstObject
# pass the varName
get-eventLog security | ... | firstObject myLog
$myLog
# can't assign to a variable directly
$myLogs = get-eventLog security | ... | firstElements 3
# pass the number of elements and the varName
get-eventLog security | ... | firstElements 3 myLogs
$myLogs
####################################
get-eventLog security | % {
if ($_.timegenerated -lt (date 11.09.08) -and`
$_.timegenerated -gt (date 11.01.08)) {$log1 = $_; break}
}
#
$log1
Another option would be to use the -file parameter on a switch statement. Using -file will read the file one line at a time, and you can use break to exit immediately without reading the rest of the file.
switch -file $someFile {
# Parse current line for later matches.
{ $script:line = [DateTime]$_ } { }
# If less than min date, keep looking.
{ $line -lt $minDate } { Write-Host "skipping: $line"; continue }
# If greater than max date, stop checking.
{ $line -gt $maxDate } { Write-Host "stopping: $line"; break }
# Otherwise, date is between min and max.
default { Write-Host "match: $line" }
}
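For reference, a minimal setup for trying the switch -file approach above (file contents and variable values are purely illustrative):
# One parseable date per line; bounds chosen so all three branches fire.
$someFile = "$env:TEMP\dates.txt"
'2013-01-05', '2013-02-10', '2013-03-20' | Set-Content $someFile
$minDate = [DateTime]'2013-02-01'
$maxDate = [DateTime]'2013-03-01'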