PowerShell Test-Path not working in scriptblock

I have the following PowerShell code that tests if a file exists within a script block:
$scriptblock =
{
Param($filename)
return "Scriptblock filename $filename Exists? -> $(Test-Path $filename)"
}
$myFilename = "MyFile.xml"
Write-Host "Main filename $myFilename Exists? -> $(Test-Path $myFilename)"
$job = Start-Job -Name "myJob" -ScriptBlock $scriptBlock -ArgumentList $myFilename
$result = Receive-Job -Name "myJob"
Write-Host $result
When I run it, I get the following output, which indicates the file exists in the main execution context but not in the script block:
Main filename MyFile.xml Exists? -> True
Scriptblock filename MyFile.xml Exists? -> False
Can someone please indicate what is needed to test for file existence in a script block?
Thanks!

As a best practice, you should include the full path to the file you want to test rather than relying on the current directory. Start-Job runs the script block in a separate PowerShell process whose working directory is generally not the caller's current directory (and can also vary with the user context), so a relative path like "MyFile.xml" may resolve to a different location inside the job.
$scriptblock = {
param($filename)
"Scriptblock filename $filename Exists? -> $(Test-Path $filename)"
}
$myFilename = "C:\Temp\MyFile.xml"
"Main filename $myFilename Exists? -> $(Test-Path $myFilename)"
Start-Job -Name "myJob" -ScriptBlock $scriptBlock -ArgumentList $myFilename
Receive-Job -Name "myJob"
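If you would rather keep a relative file name, a rough alternative is to pass the caller's current directory into the job and resolve the path there. A minimal sketch, assuming the file lives in the directory you launch the script from:
$scriptblock = {
    Param($filename, $baseDir)
    # Resolve the relative name against the caller's directory inside the job
    $fullPath = Join-Path $baseDir $filename
    "Scriptblock filename $fullPath Exists? -> $(Test-Path $fullPath)"
}
$myFilename = "MyFile.xml"
Start-Job -Name "myJob" -ScriptBlock $scriptblock -ArgumentList $myFilename, (Get-Location).Path |
    Wait-Job | Receive-Job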

Related

Specify NON-hardcoded path and filename in a scriptblock?

OK, I feel spechul for even asking, but I have tried several iterations of this and nothing has worked except hard-coding the script name in the Scriptblock statement, which is unacceptable.
Here is the code that works, hard coded, unacceptable....
$Scriptblock = { C:\Scripts\Path1\ScriptName.ps1 -arguement0 $args[0] -arguement1 $args[1] }
Start-Job -ScriptBlock $Scriptblock -ArgumentList $argue0, $argue1 | Out-Null
I've tried this, and it doesn't work...
$loc = (Get-Location).Path
Set-Location -Path $loc
And this....
$rootpath = $MyInvocation.MyCommand.Path.Substring(0, ($MyInvocation.MyCommand.Path).LastIndexOf("\"))
Set-Location -Path $rootpath
And this....
$rootpath = $MyInvocation.MyCommand.Path.Substring(0, ($MyInvocation.MyCommand.Path).LastIndexOf("\"))
$scriptFilename = $([string]::Format("{0}\ScriptName.ps1", $rootpath))
$sb = $([string]::Format("{0} -arguement0 $args[0] -arguement1 $args[1]", $scriptFilename))
$Scriptblock = { $sb }
Start-Job -ScriptBlock $Scriptblock -ArgumentList $argue0, $argue1 | Out-Null
Nothing else has worked except the first code above with hardcoded path and script name - I know it has to be something stupid I am missing - help me fix stoopid please! ;-)
In your last example, this line:
$ScriptBlock = { $sb }
simply creates a script block that returns the string in $sb instead of running it. Change it to:
$ScriptBlock = [scriptblock]::Create($sb)
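Putting it together with the path discovery you already have, a sketch (keeping your -arguement0 / -arguement1 parameter names, and assuming the calling script is saved to disk so $MyInvocation has a path):
$rootpath = Split-Path -Parent $MyInvocation.MyCommand.Path
$scriptFilename = Join-Path $rootpath "ScriptName.ps1"
# Backtick-escape $args so it is evaluated inside the job, not while building the string
$sb = [scriptblock]::Create("& '$scriptFilename' -arguement0 `$args[0] -arguement1 `$args[1]")
Start-Job -ScriptBlock $sb -ArgumentList $argue0, $argue1 | Out-Null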

Waiting for copy process to finish

Is there any way to wait for the copy process to finish before running another command?
I tried Start-job and Wait-Job, but it doesn't work.
$func = {
function move-tozip
{
param([string]$filedest)
$Shell = New-Object -com Shell.Application
$b = $shell.namespace($zippath.ToString())
$b.CopyHere($filedest.tostring())
#Remove-Item -Path $filedest
}
}
start-job -InitializationScript $func -ScriptBlock {move-tozip $args[0]} -ArgumentList $file
The easiest way to wait for a job to complete is to give it a name and tell Wait-Job to wait on the job with that name. The script below will wait for the job named WaitForMe to complete and then run the rest of your code once it has.
Using the -Name parameter with your code:
$func =
{
function Move-ToZip
{
Param([string[]]$path, [string]$zipfile)
if (-not $zipfile.EndsWith('.zip')) {$zipfile += '.zip'}
if (-not (test-path $zipfile))
{
set-content $zipfile ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
}
$shell = (new-object -com shell.application).NameSpace($zipfile)
foreach($file in $path)
{
$shell.CopyHere($file)
start-sleep -milliseconds 100
}
}
}
Start-Job -Name "WaitForMe" -InitializationScript $func -ScriptBlock {Move-ToZip -path $args[0] -zipfile $args[1]} -ArgumentList "D:\data.log", "D:\datazip.zip"
Write-Host "Waiting for job to complete"
Wait-Job -Name "WaitForMe"
Write-Host "Job has completed :D"
To zip one file or folder
-ArgumentList "D:\testfile.log", "D:\datazip.zip"
To zip multiple files or folders
-ArgumentList @("D:\testfile.log","D:\testFolder1"), "D:\testzip.zip"
EDIT 17/12/2015
I've adapted code from this MSDN blog post for the Move-ToZip function, as the previous code didn't work for me at all; I've tested the above code successfully on files and folders. I have not tested the performance of this method. If you wish to compress/zip multiple large files/folders, I would highly suggest looking into a known working library or a third-party utility like 7-Zip.
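If you are on PowerShell 5.0 or later, another option along the same lines is the built-in Compress-Archive cmdlet, which only returns once the archive has been written, so no sleep loop is needed. A sketch, untested against large files:
$job = Start-Job -Name "WaitForMe" -ScriptBlock {
    param([string[]]$Path, [string]$ZipFile)
    # Compress-Archive blocks until the archive is fully written
    Compress-Archive -Path $Path -DestinationPath $ZipFile -Force
} -ArgumentList @("D:\data.log", "D:\testFolder1"), "D:\datazip.zip"
Wait-Job -Name "WaitForMe" | Receive-Job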

Powershell: Call functions outside scriptblock

I was reading this post about getting functions passed into a scriptblock for use with jobs:
Powershell start-job -scriptblock cannot recognize the function defined in the same file?
I get how that works by passing the function in as a variable, and it works for the simple example. What about a real-world solution, though? Is there a more elegant way of handling this?
I have a script I'm using to deploy changes to vendor software. It reads an XML file that tells it how to navigate the environment and performs the various tasks, i.e. map drives, stop services, call a Perl installation script. I would like to provide a parameter to the script to allow it to run concurrently; that way, if the Perl script takes 5 minutes (not uncommon) and you're rolling out to 11 servers, you're not waiting an hour for the script to finish.
I'm just going to post some snippets since the full script is a little lengthy. A log function:
function Log
{
Param(
[parameter(ValueFromPipeline=$true)]
$InputObject,
[parameter()]
[alias("v")]
$verbosity = $debug
)
$messageIndex = [array]::IndexOf($verbosityArray, $verbosity)
$verbosityIndex = [array]::IndexOf($verbosityArray, $loggingVerbosity)
if($messageIndex -ge $VerbosityIndex)
{
switch($verbosity)
{
$debug {Write-Host $verbosity ": " $InputObject}
$info {Write-Host $verbosity ": " $InputObject}
$warn {Write-Host $verbosity ": " $InputObject -ForegroundColor yellow}
$error {Write-Host $verbosity ": " $InputObject -ForegroundColor red}
}
}
}
Here's another function that calls the log function:
function ExecuteRollout
{
param(
[parameter(Mandatory=$true)]
[alias("ses")]
$session,
[parameter(Mandatory=$true)]
$command
)
#invoke command
Invoke-Command -session $session -ScriptBlock {$res = cmd /v /k `"$args[0]`"} -args $command
#get the return code from the remote session
$res = Invoke-Command -session $session {$res}
Log ("Command Output: "+$res)
$res = [string] $res
$exitCode = $res.substring($res.IndexOf("ExitCode:"), 10)
$exitCode = $exitCode.substring(9,1)
Log ("Exit code: "+$exitCode)
return $exitCode
}
And lastly, a snippet from my main so you can get an idea of what's going on. $target.Destinations.Destination will contain all the servers the deployment goes to, along with the relevant information about them. I removed some variable setup and logging to make this more compact, so yes, you'll see variables referenced that are never defined:
#Execute concurrently
$target.Destinations.Destination | %{
$ScriptBlock = {
$destination = $args[0]
Log -v $info ("Starting remote session on: "+$destination.Server)
$session = New-PSSession -computerName $destination.Server
$InitializeRemote -session $session -destination $destination
#Gets a little tricky here, we need to keep the cmd session so it doesn't lose the sys vars set by env.bat
#String everything together with &'s
$cmdString = $destDrive + ": & call "+$lesDestDir+"data\env.bat & cd "+$rolloutDir+" & perl ..\JDH-rollout-2010.pl "+$rollout+" NC,r:\les & echo ExitCode:!errorlevel!"
Log ("cmdString: "+$cmdString)
Log -v $info ("Please wait, executing the rollout now...")
$exitCode = $ExecuteRollout -session $session -command $cmdString
Log ("ExitCode: "+$exitCode)
#respond to return code from rollout script
$HandleExitCode -session $session -destination $destination -exitCode $exitCode
$CleanUpRemote -session $session -destination $destination
}
Start-Job $ScriptBlock -Args $_
}
So if I go with the approach in the link, I'd be converting all my functions to variables and passing them into the script block. Currently, my log function will by default log at DEBUG unless the verbosity parameter is explicitly passed as a different verbosity. If I convert my functions to variables, however, PowerShell doesn't seem to like this syntax:
$Log ("Print this to the log")
So I think I'd need to use the parameter all the time now:
$Log ("Print this to the log" -v $debug
So bottom line it looks like I just need to pass all my functions as variables to the script block and change some formatting when I call them. It's not a huge effort, but I'd like to know if there's a better way before I start hacking my script up. Thanks for the input and for looking, I know this is quite a long post.
I started another post about passing parameters to functions stored as variables; the answer to that also resolves this issue. That post can be found here:
Powershell: passing parameters to functions stored in variables
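The gist of that answer, as a sketch: keep the function body in a scriptblock and invoke it with the call operator, which still accepts named parameters:
$Log = ${function:Log}                      # grab the Log function body as a scriptblock
& $Log "Print this to the log"              # default verbosity
& $Log "Print this to the log" -v $debug    # explicit verbosity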
The short answer is that you can use the -InitializationScript parameter of Start-Job to feed in all your functions, if you wrap them in a script block and store that in a variable.
Example:
# concurrency
$func = {
function Logx
{
param(
[parameter(ValueFromPipeline=$true)]
$msg
)
Write-Host ("OUT:"+$msg)
}
}
# Execution starts here
cls
$colors = @("red","blue","green")
$colors | %{
$scriptBlock =
{
Logx $args[0]
Start-Sleep 9
}
Write-Host "Processing: " $_
Start-Job -InitializationScript $func -scriptblock $scriptBlock -args $_
}
Get-Job
while(Get-Job -State "Running")
{
write-host "Running..."
Start-Sleep 2
}
# Output
Get-Job | Receive-Job
# Cleanup jobs
Remove-Job *

How to solve the error while invoking function through Start-Job?

I have a file named "Build.ps1" that contains a function called "Execute-Build".
I am calling that function from another file named "Dailybuild.ps1", like this:
. ./Build.ps1
# starting different jobs (parallel processing)
$job1 = Start-Job { Execute-Build "List.txt" }
$job2 = Start-Job { Execute-Build "List2.txt" }
# synchronizing all jobs, waiting for all to be done
Wait-Job $job1, $job2
# receiving all results
Receive-Job $job1, $job2
# cleanup
Remove-Job $job1, $job2
But I am receiving an error like the following:
Receive-Job : The term 'Execute-Build' is not recognized as the name
of a cmdlet, function, script file, or operable program. Check the
spelling of the name, or if a path was included, verify that the path
is correct and try again.
Why does this error occur, and how can I resolve it?
The dot-sourced code will not be available in the background job.
One way to solve this is to dot-source Build.ps1 inside the background job, like this:
$job1 = Start-Job {
. "C:\Path\To\Build.ps1"
Execute-Build "List.txt"
}
You can also pass the path as a parameter like this:
$path = (Resolve-Path ./Build.ps1).Path
$job1 = Start-Job {
param ($ScriptPath)
. "$ScriptPath"
Execute-Build "List.txt"
} -ArgumentList $path
Start-Job opens a new instance of PowerShell.exe which doesn't have your Execute-Build function. You need to include it in the script block and then call it, or use the -InitializationScript parameter:
$a = { function myfunction {return "whatever!"} }
$job = Start-Job {myfunction} -InitializationScript $a
Get-Job
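To actually see the function's output, wait for the job and collect it, for example:
Wait-Job $job | Out-Null
Receive-Job $job    # prints "whatever!"
Remove-Job $job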

Background Job in Powershell

I'm trying to run a .exe with parameters as a background job, and the path to the executable has spaces in it. For example:
$exec = "C:\Program Files\foo.exe"
and I want to run this with parameters:
foo.exe /param1 /param2, etc.
I know that Start-Job does this, but I've tried tons of different combinations, and it either gives me an error because of the whitespace or because of the parameters. Can someone help me out with the syntax here? I need to assume that $exec is the path of the executable, because it is part of a configuration file and could change later on.
One way to do this is to use a script block with a param block.
If a single argument contains a space, such as a file/folder path, it should be quoted so it is treated as a single item. The arguments are passed to the script block as an array.
This example uses a script block, but you could instead use a PowerShell script via the -FilePath parameter of the Start-Job cmdlet rather than the -ScriptBlock parameter.
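For example, a minimal sketch along those lines, using the $exec path and the /param1 /param2 switches from the question:
$exec = "C:\Program Files\foo.exe"
$scriptBlock = {
    param([string]$Path)
    # The call operator (&) runs the quoted path even though it contains spaces
    & $Path /param1 /param2
}
$job = Start-Job -ScriptBlock $scriptBlock -ArgumentList $exec
Wait-Job $job
Receive-Job $job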
Here is another example that has arguments with spaces:
$scriptBlock = {
param (
[string] $Source,
[string] $Destination
)
$output = & xcopy $Source $Destination 2>&1
return $output
}
$job = Start-Job -scriptblock $scriptBlock -ArgumentList 'C:\My Folder', 'C:\My Folder 2'
Wait-Job $job
Receive-Job $job
Here is an example using the $args built-in variable instead of the param block.
$scriptBlock = {
$output = & xcopy $args 2>&1
return $output
}
$path1 = "C:\My Folder"
$path2 = "C:\My Folder 2"
"hello world" | Out-File -FilePath "$path1\file.txt"
$job = Start-Job -scriptblock $scriptBlock -ArgumentList $path1, $path2
Wait-Job $job
Receive-Job $job
Andy's trick generally works very well. If you have parameter sets, or otherwise want to move complex information into the job, you can also try this technique:
$jobArgs = @{Source="foo"; Destination="bar"}
$jobArgs | Export-CliXml -Path $env:Temp\MyArgs.clixml
and in the job...
Start-Job {
.... $jobArgs = Import-CliXml -Path $env:Temp\MyArgs.clixml
} | Wait-Job | Receive-Job
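Fleshed out, that round trip might look something like this (a sketch using the same foo/bar placeholders):
# Caller: serialize whatever the job needs
$jobArgs = @{Source="foo"; Destination="bar"}
$jobArgs | Export-Clixml -Path "$env:TEMP\MyArgs.clixml"
$job = Start-Job -ScriptBlock {
    # Inside the job: rehydrate the arguments from disk
    $jobArgs = Import-Clixml -Path "$env:TEMP\MyArgs.clixml"
    "Copying $($jobArgs.Source) -> $($jobArgs.Destination)"
}
$job | Wait-Job | Receive-Job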
I use both approaches routinely.
I use -ArgumentList / ScriptBlock parameters when:
I am not dealing with parameter sets
I'm using an In-Memory Job (like the -AsJob capability in ShowUI or WPK) where the arguments are real objects, and they cannot die
I'm running in a user context where I can run a job, but can't store to disk (web servers, ISO compliant labs, etc)
If I need complex arguments and they don't need to be passed in memory (or it's otherwise convenient to have them on disk later), I'll use the hashtable approach.
Hope this helps.