I am currently trying to import a .psm1 file dynamically into a script block to execute it.
I am using parallelisation along with jobs as I need to trigger several modules simultaneously as different users.
This is the code:
$tasksToRun | ForEach-Object -Parallel {
    $ScriptBlock = {
        param ($scriptName, $Logger, $GlobalConfig, $scriptsRootFolder)
        Write-Output ("hello $($scriptsRootFolder)\tasks\$($scriptName)")
        Import-Module ("$($scriptsRootFolder)\tasks\$($scriptName)")
        & $scriptName -Logger $Logger -GlobalConfig $GlobalConfig
    }
    $job = Start-Job -ScriptBlock $ScriptBlock `
        -Credential $Cred -Name $_ `
        -ArgumentList ($_, $using:Logger, $using:globalConfig, $using:scriptsRootFolder)
    Write-Host ("Running task $_")
    $job | Wait-Job -Timeout $using:timeout
    if ($job.State -eq 'Running') {
        # Job is still running, stop it
        $job.StopJob()
        Write-Host "Stopped $($job.Name) task as it took too long"
    }
    else {
        # Job completed normally, get the results
        $job | Receive-Job
        Write-Host "Finished task $($job.Name)"
    }
}
The logger variable is a hashtable as defined here:
$Logger = @{
    generalLog         = $function:Logger
    certificateLog     = $function:LoggerCertificate
    alertLog           = $function:LoggerAlert
    endpointServiceLog = $function:LoggerEndpointService
}
Currently, it is erroring with the following:
ObjectNotFound: The term
' blah blah blah, this is the code straight from the logger function '
is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
The logger function serves the purpose of logging to a file in a specific way; it is generalised so that it can be used across many tasks.
A cut down example of a logger (probably won't compile, just deleted a bunch of lines to give you the general idea):
function LoggerEndpointService {
    param (
        # The full service name.
        [string]$ServiceFullName,
        # The unique identifier of the service assigned by the operating system.
        [string]$ServiceId,
        # The description of the service.
        [string]$Description,
        # The friendly service name.
        [string]$ServiceFriendlyName,
        # The start mode for the service. (disabled, manual, auto)
        [string]$StartMode,
        # The status of the service. (critical, started, stopped, warning)
        [string]$Status,
        # The user account associated with the service.
        [string]$User,
        # The vendor and product name of the Endpoint solution that reported the event, such as Carbon Black Cb Response.
        [string]$VendorProduct
    )
    $ServiceFullName     = If ([string]::IsNullOrEmpty($ServiceFullName)) { "" } Else { $ServiceFullName }
    $ServiceId           = If ([string]::IsNullOrEmpty($ServiceId)) { "" } Else { $ServiceId }
    $ServiceFriendlyName = If ([string]::IsNullOrEmpty($ServiceFriendlyName)) { "" } Else { $ServiceFriendlyName }
    $StartMode           = If ([string]::IsNullOrEmpty($StartMode)) { "" } Else { $StartMode }
    $Status              = If ([string]::IsNullOrEmpty($Status)) { "" } Else { $Status }
    $User                = If ([string]::IsNullOrEmpty($User)) { "" } Else { $User }
    $Description         = If ([string]::IsNullOrEmpty($Description)) { "" } Else { $Description }
    $VendorProduct       = If ([string]::IsNullOrEmpty($VendorProduct)) { "" } Else { $VendorProduct }
    $EventTimeStamp = Get-Date -Format "yyyy-MM-ddTHH:mm:ssK"
    $Delay = 100
    For ($i = 0; $i -lt 30; $i++) {
        try {
            $logLine = "{{timestamp=""{0}"" dest=""{1}"" description=""{2}"" service=""{3}"" service_id=""{4}""" `
                + " service_name=""{5}"" start_mode=""{6}"" vendor_product=""{7}"" user=""{8}"" status=""{9}""}}"
            $logLine -f $EventTimeStamp, $env:ComputerName, $Description, $ServiceFullName, $ServiceId, $ServiceFriendlyName, $StartMode, $VendorProduct, $User, $Status | Add-Content $LogFile -ErrorAction Stop
            break
        }
        catch {
            Start-Sleep -Milliseconds $Delay
        }
        if ($i -eq 29) {
            Write-Error "Alert logger failed to log, likely due to Splunk holding the file, check eventlog for details." -ErrorAction Continue
            if ([System.Diagnostics.EventLog]::SourceExists("SDOLiveScripts") -eq $False) {
                Write-Host "Doesn't exist"
                New-EventLog -LogName Application -Source "SDOLiveScripts"
            }
            Write-EventLog -LogName "Application" -Source "SDOLiveScripts" `
                -EventID 1337 `
                -EntryType Error `
                -Message "Failed to log to file $($_.Exception.InnerException.Message)" `
                -ErrorAction Continue
        }
    }
}
Export-ModuleMember -Function LoggerEndpointService
If anyone could help that'd be great, thank you!
As mentioned in the comments, PowerShell Jobs execute in separate processes and you can't share live objects across process boundaries.
By the time the job executes, $Logger.generalLog is no longer a reference to the scriptblock registered as the Logger function in the calling process - it's just a string, containing the definition of the source function.
You can re-create it from the source code:
$actualLogger = [scriptblock]::Create($Logger.generalLog)
or, in your case, to recreate all of them:
@($Logger.Keys) | ForEach-Object { $Logger[$_] = [scriptblock]::Create($Logger[$_]) }
This will only work if the logging functions are completely independent of their environment - any references to variables in the calling scope or belonging to the source module will fail to resolve!
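To tie that back to the question's code, here is a minimal sketch (same variable names as the question; only the re-creation step is new) that rebuilds the script blocks inside the job's script block before the task is invoked:
$ScriptBlock = {
    param ($scriptName, $Logger, $GlobalConfig, $scriptsRootFolder)

    # The hashtable arrives holding *strings* (each function's source text)
    # rather than live script blocks, so rebuild every entry before use.
    @($Logger.Keys) | ForEach-Object {
        $Logger[$_] = [scriptblock]::Create($Logger[$_])
    }

    Import-Module ("$($scriptsRootFolder)\tasks\$($scriptName)")
    & $scriptName -Logger $Logger -GlobalConfig $GlobalConfig
}
The task module can then invoke an entry as usual, e.g. & $Logger.endpointServiceLog -ServiceFullName 'foo', provided the logger only uses its own parameters (note that $LogFile in the example above would also have to be passed in or defined inside the job).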
Related
I have a function which requires the user to enter Y or N to delete a file, and I'm trying to run this whole function via Start-Job. My code is:
$JobFunction2 = {
    function init {
        try {
            $output = terraform init
            $path = Get-Item -Path ".\"
            $in = $output | Select-String "Terraform has been successfully initialized!"
            if ($in) {
                Write-Host -ForegroundColor GREEN 'Initialization successful'
            }
            else {
                Write-Host -ForegroundColor YELLOW 'Initialization failed. Please enter Y to delete the .terraform folder'
                $fail = del $path/.terraform -Force
                $output = terraform init
                return $fail, $output
            }
        }
        catch {
            Write-Warning "Error Occurred while Initializing the folder and the path is: $path"
        }
    }
}
Start-Job -InitializationScript $JobFunction2 -ScriptBlock { init } | Wait-Job | Receive-Job
When I run Start-Job it shows "Wait-Job cmdlet cannot finish working, because one or more jobs are blocked waiting for user interaction". But if I call only the function name 'init' without starting a job, it works perfectly. Is there any way I could prompt the user to input Y or N so that the function could work in Start-Job?
Not sure what is setting your job's State to Blocked, but if the question is: Is there any way I could prompt the user to input Y or N so that the function could work in Start-Job?
The answer is yes. Below you can see an example of how it can be implemented; note that I'm using a while loop instead of Wait-Job.
According to the documentation, the cmdlet has a -State parameter that takes a <JobState> argument, but I couldn't get it to work. I even tried:
Wait-Job -Any -Force -State Blocked
And got the same error as you, so here is the brute force solution you can use:
function init {
    'Initializing Function {0}' -f $MyInvocation.MyCommand.Name
    do
    {
        $choice = Read-Host 'Waiting for user input. Continue? [Y/N]'
        if($choice -notmatch '^y|n$')
        {
            'Invalid input.'
        }
    }
    until ($choice -match '^y|n$')
    if($choice -eq 'y')
    {
        'Continue script...'
    }
    else
    {
        'Abort script...'
    }
}
$funcDef = [scriptblock]::Create("function init { $function:init }")
$job = Start-Job -InitializationScript $funcDef -ScriptBlock { init }
$desiredState = 'Stopped', 'Completed', 'Failed'
while($job.State -notin $desiredState)
{
    if($job.State -eq 'Blocked')
    {
        Receive-Job $job
    }
    Start-Sleep -Milliseconds 500
}
$job | Receive-Job -Wait -AutoRemoveJob
$fail = Remove-Item $path/.terraform -Force -Recurse -Confirm:$false
This helped me to solve my issue.
This is my first program in PowerShell. I'm trying to get input from the user, ping the IP address or hostname, and create a text file on the desktop.
But if the user wants to add more than one IP, I get into an infinite loop.
Here I'm asking for the IP addresses:
$dirPath = "C:\Users\$env:UserName\Desktop"
function getUserInput()
{
    $ipsArray = @()
    $response = 'y'
    while($response -ne 'n')
    {
        $choice = Read-Host '
======================================================================
======================================================================
Please enter HOSTNAME or IP Address, enter n to stop adding'
        $ipsArray += $choice
        $response = Read-Host 'Do you want to add more? (y\n)'
    }
    ForEach($ip in $ipsArray)
    {
        createFile($ip)
        startPing($ip)
    }
}
Then I create the file for each IP address:
function createFile($ip)
{
    $textPath = "$($dirPath)\$($ip).txt"
    if(!(Test-Path -Path $textPath))
    {
        New-Item -Path $dirPath -Name "$ip.txt" -ItemType "file"
    }
}
And now you can see the problem. Because I want to write with a time format, I have a problem with the ForEach loop: when I start to ping, I can't reach the next element in the array until I stop cmd.exe.
function startPing($ip)
{
    ping.exe $ip -t | foreach { "{0} - {1}" -f (Get-Date), $_ } >> $dirPath\$ip.txt
}
Maybe I should create separate files for each IP address and pass params?
Here's an old script I have. You can watch a list of computers in a window.
# pinger.ps1
# example: pinger yahoo.com
#          pinger c001,c002,c003
#          $list = cat list.txt; pinger $list
param ($hostnames)
#$pingcmd = 'test-netconnection -port 515'
$pingcmd = 'test-connection'
$sleeptime = 1
$sawup = @{}
$sawdown = @{}
foreach ($hostname in $hostnames) {
    $sawup[$hostname] = $false
    $sawdown[$hostname] = $false
}
#$sawup = 0
#$sawdown = 0
while ($true) {
    # if (invoke-expression "$pingcmd $($hostname)") {
    foreach ($hostname in $hostnames) {
        if (& $pingcmd -count 1 $hostname -ea 0) {
            if (! $sawup[$hostname]) {
                echo "$([console]::beep(500,300))$hostname is up $(get-date)"
                $sawup[$hostname] = $true
                $sawdown[$hostname] = $false
            }
        } else {
            if (! $sawdown[$hostname]) {
                echo "$([console]::beep(500,300))$hostname is down $(get-date)"
                $sawdown[$hostname] = $true
                $sawup[$hostname] = $false
            }
        }
    }
    sleep $sleeptime
}
pinger microsoft.com,yahoo.com
microsoft.com is down 11/08/2020 17:54:54
yahoo.com is up 11/08/2020 17:54:55
Have a look at PowerShell Jobs. Note that there are better and faster alternatives (like thread jobs, runspaces, etc), but for a beginner, this would be the easiest way. Basically, it starts a new PowerShell process.
A very simple example:
function startPing($ip) {
    Start-Job -ScriptBlock {
        param ($Address, $Path)
        ping.exe $Address -t | foreach { "{0} - {1}" -f (Get-Date), $_ } >> $Path
    } -ArgumentList $ip, "$dirPath\$ip.txt"
}
This simplified example does not take care of stopping the jobs. So depending on what behavior you want, you should look that up.
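As a rough sketch of one way to handle that (my addition: it assumes startPing lets the Start-Job output fall through as above, and that it runs where $ipsArray is in scope, e.g. at the end of getUserInput):
# Collect the job objects that startPing emits, let the pings run for a
# while, then stop the endless ping.exe -t processes and remove the jobs.
$jobs = foreach ($ip in $ipsArray) { startPing $ip }

Start-Sleep -Seconds 60
$jobs | Stop-Job
$jobs | Remove-Job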
Also, note that there is also PowerShell's equivalent to ping, Test-Connection.
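A rough sketch of the same logging job using Test-Connection instead of ping.exe (this variant is mine, not from the answer above; the parameters used are standard Test-Connection parameters):
function startPing($ip) {
    Start-Job -ScriptBlock {
        param ($Address, $Path)
        while ($true) {
            # One echo request per loop; -Quiet returns $true or $false
            $ok = Test-Connection -ComputerName $Address -Count 1 -Quiet
            "{0} - {1} reachable: {2}" -f (Get-Date), $Address, $ok >> $Path
            Start-Sleep -Seconds 1
        }
    } -ArgumentList $ip, "$dirPath\$ip.txt"
}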
Trying to use the Start-Parallel module but receiving no output. While debugging I could see that execution reaches that line but never enters the function that I have provided.
I have tried using a script block and a function as well. I went through the examples provided for the Start-Parallel module but got no result. Be it a script block or a function, it shows a progress bar saying it is executing two tasks, but nothing gets executed.
Function avolume {
    param ($SCID_server_profile)
    #foreach ($SCID_server_profile in $SCID_JSON.serverProfile)
    #{
    if ($SCID_server_profile.spdefName -like "spdef_workload*")
    {
        #---------------------------------
        # Getting Server Profile Details
        #---------------------------------
        $SCID_frame_number = $SCID_server_profile.synergyFrame
        $SCID_bay_number = $SCID_server_profile.bayNumber
        $OV_server_profile_uri = (Get-HPOVServer -Name "$SCID_frame_number, Bay $SCID_bay_number").serverProfileUri
        if ($OV_server_profile_uri)
        {
            $OV_server_profile = Send-HPOVRequest -uri $OV_server_profile_uri
            $result = Add-SanVolume -ServerProfile $SCID_server_profile -serverprofileobj $OV_server_profile
            Report-Row -reportwriter $reportwriter -category $result.category -status $result.status -description $result.description
            if ($result.status -eq "fail")
            {
                $Script:status = $false
            }
            else { $Successful_Configs += 1 }
        }
        else
        {
            Report-Row -reportwriter $reportwriter -category "Update Server Profile" -status "Fail" -description "Server Profile does not exist for $SCID_frame_number, $SCID_frame_number"
            $Script:status = $false
        }
    }
    #}
}
#$SCID_JSON.serverProfile | Start-Parallel -ScriptBlock $Script
#Start-Parallel -InputObject $SCID_JSON.serverProfile -ScriptBlock $Script
$SCID_JSON.serverProfile | Start-Parallel -Scriptblock ${Function:\avolume}
I am expecting that whatever is in the function should get executed in parallel for the values that I am providing.
I have a few functions that get called from Jenkins as part of a pipeline; they also get called from a Pester test, or lastly they can get called from the PowerShell console. The issue I have really stems from Jenkins not seeming to handle Write-Output in the way I think it should.
So what I am doing is creating a Boolean param that will allow me to choose whether I terminate my function with an exit code or a return message. The exit code will be used by my pipeline logic and the return message for the rest.
Is there an alternate approach I should be using? This seems to be a bit of a hack.
function Get-ServerPowerState
{
    [CmdletBinding()]
    param
    (
        [string[]]$ilo_ip,
        [ValidateSet('ON', 'OFF')]
        [string]$Status,
        [boolean]$fail
    )
    BEGIN
    {
        $here = Split-Path -Parent $Script:MyInvocation.MyCommand.Path
        $Credentials = IMPORT-CLIXML "$($here)\Lib\iLOCred.xml"
    }
    PROCESS
    {
        foreach ($ip in $ilo_ip)
        {
            New-LogEntry -Message ("Getting current powerstate " + $ip)
            If (Test-Connection -ComputerName $ip.ToString() -Count 1 -Quiet)
            {
                $hostPower = Get-HPiLOhostpower -Server $ip -Credential $Credentials -DisableCertificateAuthentication
            }
        }
    }
    END
    {
        If ($fail) {
            New-LogEntry -Message "Script been set to fail with exit code" -Log Verbose
            New-LogEntry -Message "The host is powered - $($HostPower.Host_Power)" -Log Verbose
            If ($hostPower.HOST_POWER -match $Status)
            {
                Exit 0
            }
            else {
                Exit 1
            }
        }
        else {
            New-LogEntry -Message "Script been set to NOT fail with exit code" -Log Verbose
            New-LogEntry -Message "The host is powered - $($HostPower.Host_Power)" -Log Verbose
            If ($hostPower.HOST_POWER -match $Status)
            {
                return 0
            }
            else {
                return 1
            }
        }
    }
}
Like this:
function Get-Output {
    param ([switch]$asint)
    if ($asint) {
        return 1
    }
    else {
        Write-Output 'one'
    }
}
Get-Output
Get-Output -asint
If you intend to use the output in the pipeline then use Write-Output. If you intend to only send it to the host process then use Write-Host. I typically use the return keyword if I want to assign a return value to a variable.
[int]$result = Get-Output -asint
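A quick sketch of why that distinction matters (standard PowerShell behaviour, using the Get-Output function from the answer above; the extra Write-Host line is only for illustration):
# Success-stream output (return / Write-Output) can be captured:
[int]$result = Get-Output -asint    # $result is 1
$text = Get-Output                  # $text is 'one'

# Write-Host goes to the host (the information stream since PowerShell 5.0),
# so a plain assignment captures nothing:
$fromHost = Write-Host 'hello'      # prints 'hello'; $fromHost stays $null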
This lambda function executes as expected:
$WriteServerName = {
param($server)
Write-Host $server
}
$server = "servername"
$WriteServerName.invoke($server)
servername
However, using the same syntax, the following script prompts for credentials and then exits to the command line (running like this: .\ScriptName.ps1 -ConfigFile Chef.config), implying that the lambda functions aren't executing properly (for testing, each should just output the server name).
Why does the former lambda function return the server name, but the ones in the script don't?
Param(
    $ConfigFile
)
Function Main {
    #Pre-reqs: get credential, load config from file, and define lambda functions.
    $jobs = @()
    $Credential = Get-Credential
    $Username = $Credential.username
    $ConvertedPassword = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($Credential.password)
    $Password = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($ConvertedPassword)
    $Config = Get-Content $ConfigFile -Raw | Out-String | Invoke-Expression
    #Define lambda functions
    $BootStrap = {
        param($Server)
        Write-Host $Server
    }
    $RunChefClient = {
        param($Server)
        Write-Host $Server
    }
    $SetEnvironment = {
        param($Server)
        Write-Host $Server
    }
    #Create bootstrap job for each server and pass lambda functions to Scriptblock for execution.
    if (($Username -ne $null) -and ($Password -ne $null))
    {
        ForEach ($HashTable in $Config)
        {
            $Server = $HashTable.Server
            $Roles = $HashTable.Roles
            $Tags = $HashTable.Tags
            $Environment = $HashTable.Environment
            $ScriptBlock = {
                param ($Server, $BootStrap, $RunChefClient, $SetEnvironment)
                $BootStrap.invoke($Server)
                $RunChefClient.invoke($Server)
                $SetEnvironment.invoke($Server)
            }
            $Jobs += Start-Job -ScriptBlock $ScriptBlock -ArgumentList @($Server, $BootStrap, $RunChefClient, $SetEnvironment)
        }
    }
    else { Write-Host "Username or password is missing, exiting..." -ForegroundColor Red; exit }
}
Main
Main
Without testing, I am going to go ahead and say it's because you are putting your scriptblock executions in PowerShell Jobs and then not doing anything with them. When you start a job, it starts a new PowerShell instance and executes the code you give it along with the parameters you give it. Once it completes, the completed PSRemotingJob object sits there and does nothing until you actually do something with it.
In your code, all the jobs you start are assigned to the $Jobs variable. You can also get all your running jobs with Get-Job:
Get-Job -State Running
If you want to get any of the data returned from your jobs, you'll have to use Receive-Job:
# Either
$Jobs | Receive-Job
# Or
Get-Job -State Running | Receive-Job
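Since Receive-Job only returns what a job has produced so far, a hedged follow-up (my addition, assuming $Jobs is made visible outside Main, e.g. via $script:Jobs, or replaced with Get-Job) is to wait for the jobs before collecting and cleaning up:
$Jobs | Wait-Job | Receive-Job   # block until the jobs finish, then emit their output
$Jobs | Remove-Job               # clean up the finished job objects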