I am able to execute a PowerShell script on the machine, but not with the Jenkins PowerShell plugin.
My PowerShell script launches another program's UI (QlikView) and then closes it. It works when I execute the script directly on the machine, but when I run it through the Jenkins PowerShell plugin it hangs: the execution runs indefinitely.
[CmdletBinding()]
param (
    $FullQvwPath
)

function qv-SaveAndClose-QVW
{
    param(
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)]
        $QvwPath
    )
    try {
        $qvComObject = New-Object -ComObject QlikTech.QlikView
        $NewCreatedDoc = $qvComObject.CreateDoc()
        $NewCreatedDoc.SaveAs($QvwPath)
        $NewCreatedDoc.CloseDoc()
        $qvComObject.Quit()
    }
    finally {
    }
}

qv-SaveAndClose-QVW -QvwPath $FullQvwPath
I have put the above code in the file QlikSaveAndClose.ps1 and invoke it as:
.\QlikSaveAndClose.ps1 -FullQvwPath 'C:\Program Files (x86)\Jenkins\Dashboard.qvw'
Could it be that the file already exists? In that case SaveAs prompts to overwrite the file, so remove it first. Also, place the Quit call in a finally block so the COM object is always closed, even on errors. And while we're at it, use only approved Verb-Noun names for your functions:
[CmdletBinding()]
param (
    [Parameter(Mandatory=$true)]
    [String] $FullQvwPath
)

function Save-QVW
{
    param (
        [Parameter(Mandatory=$true,ValueFromPipeline=$true)]
        [String] $Path
    )

    $qvComObject = New-Object -ComObject "QlikTech.QlikView"
    try
    {
        $newCreatedDoc = $qvComObject.CreateDoc()

        if (Test-Path -Path $Path)
        {
            Remove-Item -Path $Path -Force
        }

        $newCreatedDoc.SaveAs($Path)
        $newCreatedDoc.CloseDoc()
    }
    finally
    {
        $qvComObject.Quit()
    }
}

Save-QVW -Path $FullQvwPath
Summary
I would like to use PowerShell to retrieve an image from the Windows clipboard, and save it to a file.
When I copy an image to the clipboard, from a web browser for example, I am able to retrieve a System.Windows.Interop.InteropBitmap object.
Once I have this object, I need to save the object to disk.
Question: How can I accomplish this?
Add-Type -AssemblyName PresentationCore
$DataObject = [System.Windows.Clipboard]::GetDataObject()
$DataObject | Get-Member
All credit goes to this answer; this is merely a PowerShell adaptation of that code.
The code consists of two functions: one outputs the InteropBitmap instance of the image on the clipboard, and the other receives this object from the pipeline and writes it to a file. The Set-ClipboardImage function can output using three different encoders; you can add more if needed. See the System.Windows.Media.Imaging namespace for details.
These functions are tested and compatible with Windows PowerShell 5.1 and PowerShell Core. I do not know if this can or will work in prior versions.
using namespace System.Windows
using namespace System.Windows.Media.Imaging
using namespace System.IO
using namespace System.Windows.Interop

Add-Type -AssemblyName PresentationCore

function Get-ClipboardImage {
    $image = [Clipboard]::GetDataObject().GetImage()
    do {
        Start-Sleep -Milliseconds 200
        $downloading = $image.IsDownloading
    } while($downloading)
    $image
}

function Set-ClipboardImage {
    [cmdletbinding()]
    param(
        [Parameter(Mandatory)]
        [ValidateScript({
            if(Test-Path (Split-Path $_)) {
                return $true
            }
            throw [ArgumentException]::new(
                'Destination folder does not exist.', 'Destination'
            )
        })]
        [string] $Destination,

        [Parameter(Mandatory, ValueFromPipeline, DontShow)]
        [InteropBitmap] $InputObject,

        [Parameter()]
        [ValidateSet('Png', 'Bmp', 'Jpeg')]
        [string] $Encoding = 'Png'
    )

    end {
        try {
            $Destination = $PSCmdlet.GetUnresolvedProviderPathFromPSPath($Destination)
            $encoder = switch($Encoding) {
                Png  { [PngBitmapEncoder]::new(); break }
                Bmp  { [BmpBitmapEncoder]::new(); break }
                Jpeg { [JpegBitmapEncoder]::new() }
            }
            $fs = [File]::Create($Destination)
            $encoder.Frames.Add([BitmapFrame]::Create($InputObject))
            $encoder.Save($fs)
        }
        catch {
            $PSCmdlet.WriteError($_)
        }
        finally {
            $fs.ForEach('Dispose')
        }
    }
}

Get-ClipboardImage | Set-ClipboardImage -Destination .\test.jpg -Encoding Jpeg
I am trying to get dynamic ValidateSet values in my PowerShell script, like:
MyScript.ps1 -Parameter1 DynamicAttributeDefinedWithinRuntime
All the help I found is for dynamic parameters within a PowerShell function, like:
MyPowerShellFunction -Parameter1 DynamicAttributeDefinedWithinRuntime
If I create a function with dynamic parameters inside the script, can these be used at the CLI when the script file is executed from a PowerShell window?
Here's a great tutorial on Dynamic ValidateSet parameters:
https://vexx32.github.io/2018/11/29/Dynamic-ValidateSet/
One method from that site uses a second function:
function Get-ValidValues {
    [CmdletBinding()]
    param($Path)

    (Get-ChildItem -Path $Path -File).Name
}

function Clear-FileInCurrentLocation {
    <#
    .SYNOPSIS
        Dynamic ValidateSet from: https://vexx32.github.io/2018/11/29/Dynamic-ValidateSet/
    #>
    [CmdletBinding(SupportsShouldProcess, ConfirmImpact = 'High')]
    param(
        [Parameter(Position = 0, Mandatory)]
        [ArgumentCompleter(
            {
                param($Command, $Parameter, $WordToComplete, $CommandAst, $FakeBoundParams)
                Get-ValidValues -Path (Get-Location)
            }
        )]
        [ValidateScript(
            {
                $_ -in (Get-ValidValues -Path (Get-Location))
            }
        )]
        [string]
        $Path
    )

    Clear-Content -Path $Path
}
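To answer the script part of the question directly: yes, the same pattern works at script level, because a script's param block accepts the same attributes as a function's. A minimal sketch, assuming the file is saved as MyScript.ps1 (the file name and parameter name are illustrative, not from the original post):

```powershell
# MyScript.ps1 -- hypothetical sketch: ArgumentCompleter and ValidateScript
# applied directly to a script parameter, so tab completion and validation
# work when invoking .\MyScript.ps1 from a PowerShell prompt.
[CmdletBinding()]
param(
    [Parameter(Position = 0, Mandatory)]
    [ArgumentCompleter({
        param($Command, $Parameter, $WordToComplete, $CommandAst, $FakeBoundParams)
        # Completion values are computed at runtime, each time Tab is pressed.
        (Get-ChildItem -File).Name -like "$WordToComplete*"
    })]
    [ValidateScript({ $_ -in (Get-ChildItem -File).Name })]
    [string] $Parameter1
)

"You picked: $Parameter1"
```

Note that ArgumentCompleter only drives tab completion; the ValidateScript attribute is what actually rejects values outside the dynamic set.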
I have a .zip containing an installer (setup.exe and associated files).
How can I run setup.exe in a PowerShell script without extracting the zip?
Also, I need to pass command line parameters to setup.exe.
I tried
& 'C:\myzip.zip\setup.exe'
but I get an error
... not recognized as the name of a cmdlet, function, script file, or operable program.
This opens the exe:
explorer 'C:\myzip.zip\setup.exe'
but I cannot pass parameters.
What you're asking is not possible. You must extract the zip file in order to be able to run the executable. The explorer statement only works because the Windows Explorer does the extraction transparently in the background.
What you can do is write a custom function to encapsulate extraction, invocation, and cleanup.
function Invoke-Installer {
    Param(
        [Parameter(Mandatory=$true)]
        [ValidateScript({Test-Path -LiteralPath $_})]
        [string[]]$Path,

        [Parameter(Mandatory=$false)]
        [string[]]$ArgumentList = @()
    )
    Begin {
        Add-Type -Assembly System.IO.Compression.FileSystem
    }
    Process {
        $Path | ForEach-Object {
            $zip, $exe = $_ -split '(?<=\.zip)\\+', 2
            if (-not $exe) { throw "Invalid installer path: ${_}" }

            $tempdir = Join-Path $env:TEMP ([IO.Path]::GetFileName($zip))
            [IO.Compression.ZipFile]::ExtractToDirectory($zip, $tempdir)

            $installer = Join-Path $tempdir $exe
            & $installer @ArgumentList

            Remove-Item $tempdir -Recurse -Force
        }
    }
}
Invoke-Installer 'C:\myzip.zip\setup.exe' 'arg1', 'arg2', ...
Note that this requires .NET Framework 4.5 or newer.
I have a few functions stored in a .psm1 file that is used by several different ps1 scripts. I created a logging function (shown below), which I use throughout these ps1 scripts. Typically, I import the module within a script by simply calling something like:
Import-Module $PSScriptRoot\Module_Name.psm1
Then within the module, I have a write logger function:
Write-Logger -Message "Insert log message here." @logParams
This function is used throughout the main script and the module itself. The splatting parameter @logParams is defined in my main .ps1 file and is not explicitly passed to the module; I suppose the variables are implicitly in the module's scope once it is imported. What I have works, but I feel like it isn't great practice. Would it be better practice to add a param block within my module to require @logParams to be explicitly passed from the main .ps1 script? Thanks!
function Write-Logger {
    [cmdletbinding()]
    Param (
        [Parameter(Mandatory=$true)]
        [string]$Path,

        [Parameter(Mandatory=$true)]
        [string]$Message,

        [Parameter(Mandatory=$false)]
        [string]$FileName = "Scheduled_IDX_Backup_Transcript",

        [switch]$Warning,
        [switch]$Error
    )

    # Creates a new log directory if it does not exist
    if (-Not (Test-Path ($Path))) {
        New-Item ($Path) -type directory | Out-Null
    }

    if ($Error) {
        $label = "Error"
    }
    elseif ($Warning) {
        $label = "Warning"
    }
    else {
        $label = "Normal"
    }

    # Mutex allows writing to a single log file from multiple runspaces
    $mutex = New-Object System.Threading.Mutex $false, 'MutexTest'
    [void]$mutex.WaitOne()
    Write-Host "$(Format-LogTimeStamp) $label`: $Message"
    "$(Format-LogTimeStamp) $label`: $Message" | Out-File "$Path\$FileName.log" -Encoding UTF8 -Append
    [void]$mutex.ReleaseMutex()
}
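For context, the @logParams splat referenced above is just a hashtable defined in the main .ps1 script. A hypothetical sketch of what it might look like (the path and file name here are illustrative, not from the original post):

```powershell
# Hypothetical splat defined once in the main script; each key matches
# a parameter name of Write-Logger.
$logParams = @{
    Path     = 'C:\Logs\Backup'
    FileName = 'Scheduled_IDX_Backup_Transcript'
}

# Every call then supplies only the message; the common parameters
# are expanded from the hashtable by splatting.
Write-Logger -Message "Insert log message here." @logParams
```

Making the module take these values explicitly (rather than relying on them leaking into scope at import time) is generally considered the cleaner practice, since it makes the dependency visible at every call site.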
I have this code in a ps1 file that I dot-source into scripts where I want to produce my own logs. The ps1 contains the simpleLogger class as well as the routine below, which creates a global variable. The script can be dot-sourced again and the global variable's value passed to subsequently spawned jobs to maintain a single log file.
class simpleLogger
{
    [string]$source
    [string]$target
    [string]$action
    [datetime]$datetime
    hidden [string]$logFile = $global:simpleLog

    simpleLogger()
    {
        $this.datetime = Get-Date
    }

    simpleLogger( [string]$source, [string]$target, [string]$action )
    {
        $this.action = $action
        $this.source = $source
        $this.target = $target
        $this.datetime = Get-Date
    }

    static [simpleLogger] log( [string]$source, [string]$target, [string]$action )
    {
        $newLogger = [simpleLogger]::new( $source, $target, $action )
        do {
            $done = $true
            try {
                $newLogger | Export-Csv -Path $global:simpleLog -Append -NoTypeInformation
            }
            catch {
                # Another runspace holds the file; back off and retry
                $done = $false
                Start-Sleep -Milliseconds $(Get-Random -Min 1000 -Max 10000)
            }
        } until ( $done )
        return $newLogger
    }
}

if( -not $LogSession ){
    $global:logUser = $env:USERNAME
    $global:logDir = $env:TEMP + "\"
    $startLog = (Get-Date).ToString("MMddyyyyHHmmss")
    $global:LogSessionGuid = (New-Guid)
    $global:simpleLog = $global:logDir + $global:logUser + "-" + $global:LogSessionGuid + ".log"
    [simpleLogger]::new() | Export-Csv -Path $global:simpleLog -NoTypeInformation
    $global:LogSession = [simpleLogger]::log( $global:logUser, $global:LogSessionGuid, 'Log init' )
}
Is it possible in PowerShell to dot-source or somehow re-use a script's functions without executing the script? I'm trying to reuse the functions of a script without executing the script itself. I could factor the functions out into a functions-only file, but I'm trying to avoid that.
Example dot-sourced file:
function doA
{
    Write-Host "DoAMethod"
}

Write-Host "reuseme.ps1 main."
Example consuming file:
. ".\reuseme.ps1"
Write-Host "consume.ps1 main."
doA
Execution results:
reuseme.ps1 main.
consume.ps1 main.
DoAMethod
Desired result:
consume.ps1 main.
DoAMethod
You have to execute the function definitions to make them available. There is no way around it.
You could try throwing the PowerShell parser at the file and only executing function definitions and nothing else, but I guess the far easier approach would be to structure your reusable portions as modules or simply as scripts that don't do anything besides declaring functions.
For the record, a rough test script that would do exactly that:
$file = 'foo.ps1'
$tokens = @()
$errors = @()
$result = [System.Management.Automation.Language.Parser]::ParseFile($file, [ref]$tokens, [ref]$errors)

$tokens | % { $s = ''; $braces = 0 } {
    if ($_.TokenFlags -eq 'Keyword' -and $_.Kind -eq 'Function') {
        $inFunction = $true
    }
    if ($inFunction) { $s += $_.Text + ' ' }
    if ($_.TokenFlags -eq 'ParseModeInvariant' -and $_.Kind -eq 'LCurly') {
        $braces++
    }
    if ($_.TokenFlags -eq 'ParseModeInvariant' -and $_.Kind -eq 'RCurly') {
        $braces--
        if ($braces -eq 0) {
            $inFunction = $false
        }
    }
    if (!$inFunction -and $s -ne '') {
        $s
        $s = ''
    }
} | iex
You will have problems if functions declared in the script reference script parameters (as the parameter block of the script isn't included). And there are probably a whole host of other problems that can occur that I cannot think of right now. My best advice is still to distinguish between reusable library scripts and scripts intended to be invoked.
After your function, the line Write-Host "reuseme.ps1 main." is known as "procedure code" (i.e., it is not within the function). You can tell the script not to run this procedure code by wrapping it in an IF statement that evaluates $MyInvocation.InvocationName -ne "."
$MyInvocation.InvocationName looks at how the script was invoked and if you used the dot (.) to dot-source the script, it will ignore the procedure code. If you run/invoke the script without the dot (.) then it will execute the procedure code. Example below:
function doA
{
    Write-Host "DoAMethod"
}

If ($MyInvocation.InvocationName -ne ".")
{
    Write-Host "reuseme.ps1 main."
}
Thus, when you run the script normally, you will see the output. When you dot-source the script, you will NOT see the output; however, the function (but not the procedure code) will be added to the current scope.
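Assuming the file above is saved as reuseme.ps1, the behavior would look roughly like this illustrative transcript:

```powershell
PS> .\reuseme.ps1        # normal invocation: InvocationName is '.\reuseme.ps1'
reuseme.ps1 main.

PS> . .\reuseme.ps1      # dot-sourced: InvocationName is '.', so the
PS> doA                  # procedure code is skipped, but doA is now in scope
DoAMethod
```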
The best way to re-use code is to put your functions in a PowerShell module. Simply create a file with all your functions and give it a .psm1 extension. You then import the module to make all your functions available. For example, reuseme.psm1:
function doA
{
    Write-Host "DoAMethod"
}

Write-Host "reuseme.ps1 main."
Then, in whatever script you want to use your module of functions,
# If you're using PowerShell 2, you have to set $PSScriptRoot yourself:
# $PSScriptRoot = Split-Path -Parent -Path $MyInvocation.MyCommand.Definition
Import-Module -Name (Join-Path $PSScriptRoot reuseme.psm1 -Resolve)
doA
While looking a bit further for solutions to this issue, I came across one that is pretty much a follow-up to the hints in Aaron's answer. The intention is a bit different, but it can be used to achieve the same result.
This is what I found:
https://virtualengine.co.uk/2015/testing-private-functions-with-pester/
I needed it for some testing with Pester, where I wanted to avoid changing the structure of the file before having written any tests for the logic.
It works quite well and gives me the confidence to write tests for the logic first, before refactoring the structure of the files so I no longer have to dot-source the functions.
Describe "SomeFunction" {
    # Import the 'SomeFunction' function into the current scope
    . (Get-FunctionDefinition -Path $scriptPath -Function SomeFunction)

    It "executes the function without executing the script" {
        SomeFunction | Should Be "fooBar"
    }
}
And the code for Get-FunctionDefinition
#Requires -Version 3
<#
.SYNOPSIS
Retrieves a function's definition from a .ps1 file or ScriptBlock.
.DESCRIPTION
Returns a function's source definition as a PowerShell ScriptBlock from an
external .ps1 file or existing ScriptBlock. This module is primarily
intended to be used to test private/nested/internal functions with Pester
by dot-sourcing the internal function into Pester's scope.
.PARAMETER Function
The source function's name to return as a [ScriptBlock].
.PARAMETER Path
Path to a PowerShell script file that contains the source function's
definition.
.PARAMETER LiteralPath
Literal path to a PowerShell script file that contains the source
function's definition.
.PARAMETER ScriptBlock
A PowerShell [ScriptBlock] that contains the function's definition.
.EXAMPLE
If the following functions are defined in a file named 'PrivateFunction.ps1'

function PublicFunction {
    param ()
    function PrivateFunction {
        param ()
        Write-Output 'InnerPrivate'
    }
    Write-Output (PrivateFunction)
}

The 'PrivateFunction' function can be tested with Pester by dot-sourcing
the required function in either the 'Describe', 'Context' or 'It'
scopes.

Describe "PrivateFunction" {
    It "tests private function" {
        ## Import the 'PrivateFunction' definition into the current scope.
        . (Get-FunctionDefinition -Path "$here\$sut" -Function PrivateFunction)
        PrivateFunction | Should BeExactly 'InnerPrivate'
    }
}
.LINK
https://virtualengine.co.uk/2015/testing-private-functions-with-pester/
#>
function Get-FunctionDefinition {
    [CmdletBinding(DefaultParameterSetName='Path')]
    [OutputType([System.Management.Automation.ScriptBlock])]
    param (
        [Parameter(Position = 0,
                   ValueFromPipeline = $true,
                   ValueFromPipelineByPropertyName = $true,
                   ParameterSetName = 'Path')]
        [ValidateNotNullOrEmpty()]
        [Alias('PSPath','FullName')]
        [System.String] $Path = (Get-Location -PSProvider FileSystem),

        [Parameter(Position = 0,
                   ValueFromPipelineByPropertyName = $true,
                   ParameterSetName = 'LiteralPath')]
        [ValidateNotNullOrEmpty()]
        [System.String] $LiteralPath,

        [Parameter(Position = 0,
                   ValueFromPipeline = $true,
                   ParameterSetName = 'ScriptBlock')]
        [ValidateNotNullOrEmpty()]
        [System.Management.Automation.ScriptBlock] $ScriptBlock,

        [Parameter(Mandatory = $true,
                   Position = 1,
                   ValueFromPipelineByPropertyName = $true)]
        [Alias('Name')]
        [System.String] $Function
    )
    begin {
        if ($PSCmdlet.ParameterSetName -eq 'Path') {
            $Path = $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath($Path);
        }
        elseif ($PSCmdlet.ParameterSetName -eq 'LiteralPath') {
            ## Set $Path reference to the literal path(s)
            $Path = $LiteralPath;
        }
    } # end begin
    process {
        $errors = @();
        $tokens = @();
        if ($PSCmdlet.ParameterSetName -eq 'ScriptBlock') {
            $ast = [System.Management.Automation.Language.Parser]::ParseInput($ScriptBlock.ToString(), [ref] $tokens, [ref] $errors);
        }
        else {
            $ast = [System.Management.Automation.Language.Parser]::ParseFile($Path, [ref] $tokens, [ref] $errors);
        }

        [System.Boolean] $isFunctionFound = $false;
        $functions = $ast.FindAll({ $args[0] -is [System.Management.Automation.Language.FunctionDefinitionAst] }, $true);
        foreach ($f in $functions) {
            if ($f.Name -eq $Function) {
                Write-Output ([System.Management.Automation.ScriptBlock]::Create($f.Extent.Text));
                $isFunctionFound = $true;
            }
        } # end foreach function

        if (-not $isFunctionFound) {
            if ($PSCmdlet.ParameterSetName -eq 'ScriptBlock') {
                $errorMessage = 'Function "{0}" not defined in script block.' -f $Function;
            }
            else {
                $errorMessage = 'Function "{0}" not defined in "{1}".' -f $Function, $Path;
            }
            Write-Error -Message $errorMessage;
        }
    } # end process
} # end function Get-FunctionDefinition
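Tying this back to the original question, the helper can also be used outside Pester to pull a single function out of a script without running its procedure code. An illustrative sketch, assuming the reuseme.ps1 file from earlier in the thread:

```powershell
# Extract only the 'doA' function from reuseme.ps1 and dot-source it;
# the script's trailing Write-Host line never executes.
. (Get-FunctionDefinition -Path .\reuseme.ps1 -Function doA)

doA   # prints: DoAMethod
```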