I have a fairly large PowerShell script that I've broken into two separate scripts. The first script ends with a call to an async function and then exits.
The function checks on the status of an e-mail export that was generated in the previous script, performing a WHILE loop where it makes a call to Google every sixty seconds to see if the export is ready. Once the export is ready it updates a couple of SQL dbs and my second script knows to take over.
This works 100% of the time when I run the "first script" in the shell/console. But I've started noticing that when my scheduled task is triggered in Task Scheduler... nothing happens. I have extensive logging, so I know that the parameters ARE being sent over to the async function, but it seems to just poop out rather than continue through the WHILE loop and do the every-sixty-second checks.
I feel like I've done my due diligence in Googling here, but... is there something I'm missing with a Task Scheduler job to ensure that a function containing a WHILE loop will properly run?
EDIT BELOW
To better explain what I'm doing, I'll include stripped-down code from my function and the call to it from the main script below.
First, the call to the function at the very end of "script_01."
# Let's send the Google Vault export information over to our function.
Try {
    $VaultParameters = @{
        employee_name   = "$employee_name"
        export_name     = "$export_name"
        sql_id          = "$sql_id"
        vault_status_id = "$vault_status_id"
    }
    VaultExport @VaultParameters
    $LoggingParameters = @{
        logfile = "C:\script_logs\log.log"
        log     = "INFO: Sent the Google Vault export information over to our async function."
    }
    EventLogging @LoggingParameters
} Catch {
    $LoggingParameters = @{
        logfile = "C:\script_logs\log.log"
        log     = "ERROR: Could not send the Google Vault export information over to our async function.`n$_"
    }
    EventLogging @LoggingParameters
}
And now the function itself. It is large...
function VaultExport {
    [CmdletBinding()]
    param (
        [parameter()]
        [string]$employee_name,
        [parameter()]
        [string]$export_name,
        [parameter()]
        [string]$sql_id,
        [parameter()]
        [string]$vault_status_id
    )
    $scriptBlock = {
        param ($employee_name,$export_name,$sql_id,$vault_status_id)
        Import-Module SimplySQL
        $logfile = "C:\script_logs\log.log"
        $now = (Get-Date).ToString("MM-dd-yyyy hh:mm:ss")
        # Let's define our MySQL database credentials for later use.
        # DEFINING SQL CREDS HERE
        # Let's generate secure credentials for our MySQL 'terms' db.
        # GENERATING SECURE CREDS HERE
        # And now we'll connect to our SQL db...
        # CONNECTING TO SQL HERE
        $vault_ready = "no"
        Add-Content $logfile "$now INFO: Beginning the WHILE loop while $export_name completes..."
        while ($vault_ready -eq "no") {
            $vault_status = gam info export "Email Exports" "$export_name"
            $vault_status = $vault_status -Match "COMPLETED"
            $vault_status = $vault_status -split ": "
            $vault_status = $vault_status[1]
            if ($vault_status -eq "COMPLETED") {
                $vault_ready = "yes"
                $completed = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
                Invoke-SqlUpdate -Query "UPDATE ``table`` SET ``vault_status`` = '$vault_status', ``vault_completed`` = '$completed' WHERE ``id`` = '$vault_status_id'"
                Invoke-SqlUpdate -Query "UPDATE ``table`` SET ``vault_status`` = '1' WHERE ``id`` = '$sql_id'"
                $now = (Get-Date).ToString("MM-dd-yyyy hh:mm:ss")
                Add-Content $logfile "$now INFO: $export_name is ready to download. Updated vault_status in our dbs."
            } else {
                $vault_status = gam info export "Email Exports" "$export_name"
                $vault_status = $vault_status -Match "IN_PROGRESS"
                $vault_status = $vault_status -split ": "
                $vault_status = $vault_status[1]
                Invoke-SqlUpdate -Query "UPDATE ``table`` SET ``vault_status`` = '$vault_status' WHERE ``id`` = '$vault_status_id'"
                $now = (Get-Date).ToString("MM-dd-yyyy hh:mm:ss")
                Add-Content $logfile "$now INFO: $export_name is not yet ready: ($vault_status). Checking again in sixty seconds."
                Start-Sleep 60
            }
        }
    }
    Start-Job -ScriptBlock $scriptBlock -ArgumentList @($employee_name,$export_name,$sql_id,$vault_status_id)
}
exit
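One thing worth checking: Start-Job runs its scriptblock in a child PowerShell process that belongs to the session that launched it, and script_01 calls exit immediately after Start-Job. In an interactive console the session stays open, but under Task Scheduler the host process is gone as soon as the script returns, taking the job with it. A minimal way to test that theory (it defeats the fire-and-forget goal, but it confirms the diagnosis) is to hold the parent open until the job completes:
# Diagnostic variant of the end of VaultExport: block until the watcher finishes.
$job = Start-Job -ScriptBlock $scriptBlock -ArgumentList @($employee_name,$export_name,$sql_id,$vault_status_id)
$job | Wait-Job | Receive-Job
exit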
I'm unsure why you wouldn't just have the task execute every 30 seconds instead of having a process run indefinitely and using its own timers.
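For example, a repeating task can be registered with schtasks (a sketch; the task name and script path are placeholders, and one minute is the smallest interval schtasks accepts directly):
schtasks /Create /TN "CheckVaultExport" /SC MINUTE /MO 1 /RU SYSTEM /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\scripts\check_export.ps1"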
The condition you are setting for the WHILE statement is being met, which is why the loop doesn't continue.
Change the condition to something that will never be met and the problem should go away, e.g.
$Value1 = 0
while ($Value1 -ne 1) {
    # do things, never updating the $Value1 variable
}
Related
I have a script that's designed to ping another host across a site-to-site VPN tunnel every minute. After 10 minutes, it checks the average uptime of test-connection, and if it falls below a certain threshold, it sends a Teams message telling us to check things out.
This works perfectly well when I manually run the script in situ, however, when I leave it to run as a background job, it isn't sending the Teams messages.
My question is this: as a relatively new sysadmin, the tools in my toolkit are pretty limited. Does anyone have a good tip for where I should start looking, to troubleshoot this issue? To rule out potential problems with my script, I've also included it below. But I suspect the issue is more to do with leaving the script to run on a server that I then log out of. The server in question is running Windows Server 2012 (yes I know, migration is on my to-do list).
Import-Module message_module # a module I wrote to wrap messages to Teams webhooks (included below)
# this array will accept output values from the ongoing test
$test_table = new-object system.collections.arraylist
# this index counts how many times we've checked recently
[int32]$test_index = 1
# our desired threshold for uptime / response
$uptime = .8
# how many minutes to count before testing
$count_length = 10
# IP to ping
$ping_ip = 'XXX.XXX.XXX.XXX'
$test_ip = '142.251.33.110' # google.com, used for testing
# here's the actual function that does the pinging and puts values in the arraylist
function Ping-VPN {
$ping_host = test-connection $ping_ip -erroraction silentlycontinue
if ( $ping_host ) {
$test_table.add(1) > $null
} else {
$test_table.add(0) > $null
}
}
# this function calculates the average of the values in test_table, and then clears them
function Get-Average-Uptime {
$sum = 0
foreach ($entry in $test_table) {
$sum += $entry
}
$avg = $sum / $test_table.count
return $avg
}
function Main-Loop {
while ( $test_index -lt $count_length ) {
Ping-VPN
$test_index += 1
start-sleep -seconds 60
}
$avguptime = Get-Average-Uptime
$test_table.Clear() # parentheses are needed to actually invoke the method
if ( $avguptime -lt $uptime ) {
$title = "XXX/XXX VPN Down"
$message = "XXXXXX response to ping from XXXXXXX at less than desired rate. Please investigate."
Send-TeamsMessage -Message $message -Title $title
start-sleep -seconds 3600 # sleep for an hour, to avoid spamming us
}
$test_index = 0 # restart the testing interval
Main-Loop
}
Main-Loop
And the module code:
function Send-TeamsMessage {
Param(
[Parameter(Position = 0, Mandatory = $true)][String]$Message,
[Parameter(Position = 1, Mandatory = $true)][String]$Title
)
$JSONBody = [PSCustomObject][Ordered]@{
"@type" = "MessageCard"
"@context" = "http://schema.org/extensions"
"themeColor" = '0078D7'
"title" = $Title
"text" = $Message
}
$TeamMessageBody = ConvertTo-Json $JSONBody -Depth 100
$parameters = @{
"URI" = 'XXXXXXXX (webhook URI)'
"Method" = 'POST'
"Body" = $TeamMessageBody
"ContentType" = 'application/json'
}
Invoke-RestMethod @parameters | Out-Null
}
Export-ModuleMember -Function Send-TeamsMessage
Right now, I'm calling the main file with:
Start-Job -FilePath C:\path\to\file.ps1
Then I minimize the terminal and disconnect from the server. I suspect the problem has something to do with this, and that I'm missing something really obvious.
As it turns out, this question is quite similar to another, though they're phrased very differently.
Basically, what I need to do is to run the script as NT AUTHORITY\System on startup. Run an infinite command on Windows Server even if someone is logged out
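A minimal sketch of registering it that way with the ScheduledTasks cmdlets (available on Server 2012 and later; the task name and script path are placeholders):
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\scripts\vpn_monitor.ps1'
$trigger = New-ScheduledTaskTrigger -AtStartup
$principal = New-ScheduledTaskPrincipal -UserId 'NT AUTHORITY\SYSTEM' -LogonType ServiceAccount
# Register it; the monitor now runs at boot as SYSTEM and survives logoff.
Register-ScheduledTask -TaskName 'VPN Monitor' -Action $action -Trigger $trigger -Principal $principal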
I definitely feel like I've done my due diligence in trying to sort this one out, so bear with me as I try to explain what I'd like to accomplish.
I have a PowerShell script that automates many different offboarding tasks. Some of these tasks can take quite some time. For example, waiting on an e-mail export to complete can take hours depending on the employee and their length of time with the company. So, rather than performing my "while" loop in the script itself (and log jamming all of the next steps) I am trying to pass it off to an async function to run in the background while the rest of my offboarding script moves along.
I can tell that the async function successfully runs and the required variables are being passed, but... nothing is happening. I'm getting no updates printed to my log file. I'm getting no updates to the entry in my SQL db. I'm truly at a loss with this one.
Here is the call to my function in the main script:
$VaultParameters = @{
    employee_name   = "$employee_name"
    export_name     = "$export_name"
    sql_id          = "$sql_id"
    vault_status_id = "$vault_status_id"
}
VaultExport @VaultParameters
And here is the function itself:
function VaultExport {
    [CmdletBinding()]
    param (
        [parameter()]
        [string]$employee_name,
        [parameter()]
        [string]$export_name,
        [parameter()]
        [string]$sql_id,
        [parameter()]
        [string]$vault_status_id
    )
    $scriptBlock = {
        param ($employee_name,$export_name,$sql_id,$vault_status_id)
        $vault_ready = "no"
        while ($vault_ready -eq "no") {
            $vault_status = gam info export "Email Exports" "$export_name"
            $vault_status = $vault_status -Match "COMPLETED"
            $vault_status = $vault_status -split ": "
            $vault_status = $vault_status[1]
            Invoke-SqlUpdate -Query "UPDATE ``db_name``.``table_name`` SET ``vault_status`` = '$vault_status' WHERE ``id`` = '$vault_status_id'"
            if ($vault_status -eq "COMPLETED") {
                $vault_ready = "yes"
                $completed = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
                Invoke-SqlUpdate -Query "UPDATE ``db_name``.``table_name`` SET ``vault_completed`` = '$completed' WHERE ``id`` = '$vault_status_id'"
                Invoke-SqlUpdate -Query "UPDATE ``db_name``.``table_name`` SET ``vault_status`` = '1' WHERE ``id`` = '$sql_id'"
                $LoggingParameters = @{
                    logfile = "C:\script_logs\threxit.log"
                    log     = "INFO: (GVEF, $employee_name) $export_name is ready ($vault_status). Downloading now..."
                }
                EventLogging @LoggingParameters
                Write-Output "$export_name is ready ($vault_status). Downloading now..."
            } else {
                $vault_status = gam info export "Email Exports" "$export_name"
                $vault_status = $vault_status -Match "IN_PROGRESS"
                $vault_status = $vault_status -split ": "
                $vault_status = $vault_status[1]
                $LoggingParameters = @{
                    logfile = "C:\script_logs\threxit.log"
                    log     = "INFO: (GVEF, $employee_name) $export_name is not yet ready ($vault_status). Checking again in ten seconds."
                }
                EventLogging @LoggingParameters
                Write-Output "$export_name is not yet ready ($vault_status). Checking again in ten seconds."
                Start-Sleep 10
            }
        }
    }
    Start-Job -ScriptBlock $scriptBlock -ArgumentList @($employee_name,$export_name,$sql_id,$vault_status_id)
}
exit
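One workaround I'm considering, since a job can't outlive the PowerShell process that started it: launching the poller as a fully detached process instead of a job. A sketch, assuming the while loop were moved into its own script (the VaultWatcher.ps1 path and its parameters are hypothetical):
# Detached watcher; survives this script's exit.
$watcherArgs = @(
    '-NoProfile', '-ExecutionPolicy', 'Bypass',
    '-File', 'C:\scripts\VaultWatcher.ps1',   # hypothetical script holding the while loop
    '-export_name', $export_name, '-sql_id', $sql_id, '-vault_status_id', $vault_status_id
)
Start-Process -FilePath 'powershell.exe' -ArgumentList $watcherArgs -WindowStyle Hidden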
I am currently trying to import a .psm1 file dynamically into a script block to execute it.
I am using parallelisation along with jobs as I need to trigger several modules simultaneously as different users.
This is the code:
$tasksToRun | ForEach-Object -Parallel {
$ScriptBlock = {
param ($scriptName, $Logger, $GlobalConfig, $scriptsRootFolder )
Write-Output ("hello $($scriptsRootFolder)\tasks\$($scriptName)")
Import-Module ("$($scriptsRootFolder)\tasks\$($scriptName)")
& $scriptName -Logger $Logger -GlobalConfig $GlobalConfig
}
$job = Start-Job -scriptblock $ScriptBlock `
-credential $Cred -Name $_ `
-ArgumentList ($_, $using:Logger, $using:globalConfig, $using:scriptsRootFolder)
Write-Host ("Running task $_")
$job | Wait-job -Timeout $using:timeout
if ($job.State -eq 'Running') {
# Job is still running, stop it
$job.StopJob()
Write-Host "Stopped $($job.Name) task as it took too long"
}
else {
# Job completed normally, get the results
$job | Receive-Job
Write-Host "Finished task $($job.Name)"
}
}
The logger variable is a hashtable as defined here:
$Logger = @{
generalLog = $function:Logger
certificateLog = $function:LoggerCertificate
alertLog = $function:LoggerAlert
endpointServiceLog = $function:LoggerEndpointService
}
Currently, it is erroring with the following:
ObjectNotFound: The term
' blah blah blah, this is the code straight from the logger function '
is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
The logger function serves the purpose of logging to a file in a specific way; it is generalised so that it can be used across many tasks.
A cut down example of a logger (probably won't compile, just deleted a bunch of lines to give you the general idea):
function LoggerEndpointService {
param (
# The full service name.
[string]$ServiceFullName,
# The unique identifier of the service assigned by the operating system.
[string]$ServiceId,
# The description of the service.
[string]$Description,
# The friendly service name.
[string]$ServiceFriendlyName,
# The start mode for the service. (disabled, manual, auto)
[string]$StartMode,
# The status of the service. (critical, started, stopped, warning)
[string]$Status,
# The user account associated with the service.
[string]$User,
# The vendor and product name of the Endpoint solution that reported the event, such as Carbon Black Cb Response.
[string]$VendorProduct
)
$ServiceFullName = If ([string]::IsNullOrEmpty($ServiceFullName)) { "" } Else { $ServiceFullName }
$ServiceId = If ([string]::IsNullOrEmpty($ServiceId)) { "" } Else { $ServiceId }
$ServiceFriendlyName = If ([string]::IsNullOrEmpty($ServiceFriendlyName)) { "" } Else { $ServiceFriendlyName }
$StartMode = If ([string]::IsNullOrEmpty($StartMode)) { "" } Else { $StartMode }
$Status = If ([string]::IsNullOrEmpty($Status)) { "" } Else { $Status }
$User = If ([string]::IsNullOrEmpty($User)) { "" } Else { $User }
$Description = If ([string]::IsNullOrEmpty($Description)) { "" } Else { $Description }
$VendorProduct = If ([string]::IsNullOrEmpty($VendorProduct)) { "" } Else { $VendorProduct }
$EventTimeStamp = Get-Date -Format "yyyy-MM-ddTHH:mm:ssK"
$Delay = 100
For ($i = 0; $i -lt 30; $i++) {
try {
$logLine = "{{timestamp=""{0}"" dest=""{1}"" description=""{2}"" service=""{3}"" service_id=""{4}""" `
+ "service_name=""{5}"" start_mode=""{6}"" vendor_product=""{7}"" user=""{8}"" status=""{9}""}}"
$logLine -f $EventTimeStamp, $env:ComputerName, $Description, $ServiceFullName, $ServiceId, $ServiceFriendlyName, $StartMode, $VendorProduct, $User, $Status | Add-Content $LogFile -ErrorAction Stop
break;
}
catch {
Start-Sleep -Milliseconds $Delay
}
if ($i -eq 29) {
Write-Error "Alert logger failed to log, likely due to Splunk holding the file, check eventlog for details." -ErrorAction Continue
if ([System.Diagnostics.EventLog]::SourceExists("SDOLiveScripts") -eq $False) {
Write-Host "Doesn't exist"
New-EventLog -LogName Application -Source "SDOLiveScripts"
}
Write-EventLog -LogName "Application" -Source "SDOLiveScripts" `
-EventID 1337 `
-EntryType Error `
-Message "Failed to log to file $_.Exception.InnerException.Message" `
-ErrorAction Continue
}
}
}
Export-ModuleMember -Function LoggerEndpointService
If anyone could help that'd be great, thank you!
As mentioned in the comments, PowerShell Jobs execute in separate processes and you can't share live objects across process boundaries.
By the time the job executes, $Logger.generalLog is no longer a reference to the scriptblock registered as the Logger function in the calling process - it's just a string, containing the definition of the source function.
You can re-create it from the source code:
$actualLogger = [scriptblock]::Create($Logger.generalLog)
or, in your case, to recreate all of them:
@($Logger.Keys) | ForEach-Object { $Logger[$_] = [scriptblock]::Create($Logger[$_]) }
This will only work if the logging functions are completely independent of their environment - any references to variables in the calling scope or belonging to the source module will fail to resolve!
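A minimal self-contained round-trip you can paste into a console to see both the problem and the fix (the function name is just for the demo):
function Write-DemoLog { param([string]$Message) "LOG: $Message" }
$Logger = @{ generalLog = $function:Write-DemoLog }
Start-Job -ScriptBlock {
    param($LoggerTable)
    # Across the process boundary the scriptblock arrives as a plain string,
    # so rebuild it before invoking:
    $log = [scriptblock]::Create($LoggerTable.generalLog)
    & $log -Message 'hello from the job'
} -ArgumentList $Logger | Wait-Job | Receive-Job   # -> LOG: hello from the job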
This lambda function executes as expected:
$WriteServerName = {
param($server)
Write-Host $server
}
$server = "servername"
$WriteServerName.invoke($server)
servername
However, using the same syntax, the following script prompts for credentials and then exits to the command line (running like this: .\ScriptName.ps1 -ConfigFile Chef.config), implying that the lambda functions aren't executing properly (for testing, each should just output the server name).
Why does the former lambda function return the server name, but the ones in the script don't?
Param(
$ConfigFile
)
Function Main {
#Pre-reqs: get credential, load config from file, and define lambda functions.
$jobs = @()
$Credential = Get-Credential
$Username = $Credential.username
$ConvertedPassword = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($Credential.password)
$Password = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto($ConvertedPassword)
$Config = Get-Content $ConfigFile -Raw | Out-String | Invoke-Expression
#Define lambda functions
$BootStrap = {
param($Server)
write-host $server
}
$RunChefClient = {
param($Server)
write-host $server
}
$SetEnvironment = {
param($Server)
write-host $server
}
#Create bootstrap job for each server and pass lambda functions to Scriptblock for execution.
if(($Username -ne $null) -and ($Password -ne $null))
{
ForEach($HashTable in $Config)
{
$Server = $HashTable.Server
$Roles = $HashTable.Roles
$Tags = $HashTable.Tags
$Environment = $HashTable.Environment
$ScriptBlock = {
param ($Server,$BootStrap,$RunChefClient,$SetEnvironment)
$BootStrap.invoke($Server)
$RunChefClient.invoke($Server)
$SetEnvironment.invoke($Server)
}
$Jobs += Start-Job -ScriptBlock $ScriptBlock -ArgumentList @($Server,$BootStrap,$RunChefClient,$SetEnvironment)
}
}
else {Write-Host "Username or password is missing, exiting..." -ForegroundColor Red; exit}
}
Main
Without testing, I am going to go ahead and say it's because you are putting your scriptblock executions in PowerShell Jobs and then not doing anything with them. When you start a job, it starts a new PowerShell instance and executes the code you give it along with the parameters you give it. Once it completes, the completed PSRemotingJob object sits there and does nothing until you actually do something with it.
In your code, all the jobs you start are assigned to the $Jobs variable. You can also get all your running jobs with Get-Job:
Get-Job -State Running
If you want to get any of the data returned from your jobs, you'll have to use Receive-Job
# Either
$Jobs | Receive-Job
# Or
Get-Job -State Running | Receive-Job
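If the caller should block until every job finishes before collecting, Wait-Job composes with Receive-Job:
# Wait for all jobs to complete, then pull back everything they wrote:
$Jobs | Wait-Job | Receive-Job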
TL;DR: the actual question is at the bottom.
I'm trying to troubleshoot a Powershell v1.0 script issue. The script basically downloads a file from an FTP site and puts it on a remote server via UNC and emails the success or failure of the task.
The script runs as a task with a generic ID that is a Domain Admin but is not used to log into systems so the server it runs off of does not contain a profile for it.
If I do a runas for that user and execute the script via the command line, it works flawlessly. However, if I try to run it as a task, it runs and then exits instantly. If I open a runas command prompt and run the scheduled task via the command line, all I get back is:
SUCCESS: Attempted to run the scheduled task "Task Name".
I've tried writing variable values to a text file to see what is going on but it never writes even when I write them as the very first step of execution.
What I want to do is capture any script error messages you would normally see when trying to run the script and/or write the variable information to a text file.
Is there any way to do this? BTW, I'm doing this by calling powershell with the following arguments:
-file -ExecutionPolicy Bypass "d:\datscript\myscript.ps1"
-I've tried -command instead of -file.
-I've tried "d:\datscript\myscript.ps1 5>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 9>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 | out-file d:\datscript\test.txt"
Nothing worked. I'm sure I can fix whatever bug I have but I'm banging my head against the wall trying to get some kind of failure info.
--Update: Here is a copy of the script minus details--
#-------------------------------------------------------------------------------------------------------------------------------------------------------------
#
#Variable Declaration
#
#$path = Path on local server to download DAT to
#$olddat = Old/last DAT downloaded
#$currentdat = Next DAT number
#$ftpsite = McAfee FTP site. Update if path changes
#$ftpuser = FTP user (anon login)
#$ftppass = FTP password (anon login)
#$tempstring = Manipulation variable
#$gotdat = Boolean if updated DAT exists
#$success = Status if a new DAT exists and has been downloaded (used for email notification).
#$thetime = Variable used to hold time of day manipulation.
$path = "\\myservername\ftproot\pub\mcafee\datfiles\"
$olddat = ""
$currentdat =""
$ftpsite = "ftp://ftp.nai.com/virusdefs/4.x/"
$ftpuser = "something"
$ftppass = "anything"
$tempstring =""
$gotdat = "False"
$success = ""
$thetime = ""
#
#Normalized functions handles UNC paths
#
function Get-NormalizedFileSystemPath
{
<#
.Synopsis
Normalizes file system paths.
.DESCRIPTION
Normalizes file system paths. This is similar to what the Resolve-Path cmdlet does, except Get-NormalizedFileSystemPath also properly handles UNC paths and converts 8.3 short names to long paths.
.PARAMETER Path
The path or paths to be normalized.
.PARAMETER IncludeProviderPrefix
If this switch is passed, normalized paths will be prefixed with 'FileSystem::'. This allows them to be reliably passed to cmdlets such as Get-Content, Get-Item, etc, regardless of Powershell's current location.
.EXAMPLE
Get-NormalizedFileSystemPath -Path '\\server\share\.\SomeFolder\..\SomeOtherFolder\File.txt'
Returns '\\server\share\SomeOtherFolder\File.txt'
.EXAMPLE
'\\server\c$\.\SomeFolder\..\PROGRA~1' | Get-NormalizedFileSystemPath -IncludeProviderPrefix
Assuming you can access the c$ share on \\server, and PROGRA~1 is the short name for "Program Files" (which is common), returns:
'FileSystem::\\server\c$\Program Files'
.INPUTS
String
.OUTPUTS
String
.NOTES
Paths passed to this command cannot contain wildcards; these will be treated as invalid characters by the .NET Framework classes which do the work of validating and normalizing the path.
.LINK
Resolve-Path
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('PSPath', 'FullName')]
[string[]]
$Path,
[switch]
$IncludeProviderPrefix
)
process
{
foreach ($_path in $Path)
{
$_resolved = $_path
if ($_resolved -match '^([^:]+)::')
{
$providerName = $matches[1]
if ($providerName -ne 'FileSystem')
{
Write-Error "Only FileSystem paths may be passed to Get-NormalizedFileSystemPath. Value '$_path' is for provider '$providerName'."
continue
}
$_resolved = $_resolved.Substring($matches[0].Length)
}
if (-not [System.IO.Path]::IsPathRooted($_resolved))
{
$_resolved = Join-Path -Path $PSCmdlet.SessionState.Path.CurrentFileSystemLocation -ChildPath $_resolved
}
try
{
$dirInfo = New-Object System.IO.DirectoryInfo($_resolved)
}
catch
{
$exception = $_.Exception
while ($null -ne $exception.InnerException)
{
$exception = $exception.InnerException
}
Write-Error "Value '$_path' could not be parsed as a FileSystem path: $($exception.Message)"
continue
}
$_resolved = $dirInfo.FullName
if ($IncludeProviderPrefix)
{
$_resolved = "FileSystem::$_resolved"
}
Write-Output $_resolved
}
} # process
} # function Get-NormalizedFileSystemPath
#
#Get the number of the existing DAT file and increment for the next DAT if the DAT's age is older than today.
# Otherwise, exit the program if the DAT's age is today.
#
$tempstring = "xdat.exe"
$env:Path = $env:Path + ";d:\datscript"
$path2 ="d:\datscript\debug.txt"
add-content $path2 $path
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
$path = Get-NormalizedFileSystemPath -Path $path
Set-Location -Path $path
$olddat = dir $path | %{$_.Name.substring(0, 4) }
$olddatfull = "$olddat" + "$tempstring"
if ( ((get-date) - (ls $olddatfull).LastWriteTime).day -lt 1)
{
#***** Commented out for testing!
# exit
}
$currentdat = [INT] $olddat
$currentdat++
$currentdat = "$currentdat" + "$tempstring"
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
#
#Connect to FTP site and get a current directory listing.
#
[System.Net.FtpWebRequest]$ftp = [System.Net.WebRequest]::Create($ftpsite)
$ftp.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$response = $ftp.getresponse()
$stream = $response.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
#
# Read all the data available from the ftp directory stream, writing it to the
# output buffer when done. After that the buffer is searched to see if it contains the expected
# latest DAT.
#
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
} catch { $foundMore = $false; $read = 0 }
} while($read -gt 0)
} while($foundmore)
$gotdat = $outputbuffer.Contains($currentdat)
$target = $path + $currentdat
#
# Downloads DATs and cleans up old DAT file. Returns status of the operation.
# Return 1 = success
# Return 2 = Latest DAT not found and 4pm or later
# Return 3 = DAT available but did not download or is 0 bytes
# Return 4 = Latest DAT not found and before 4pm
#
$success = 0
if ($gotdat -eq "True")
{
$ftpfile = $ftpsite + $ftppath + $currentdat
write-host $ftpfile
write-host $target
$ftpclient = New-Object system.Net.WebClient
$uri = New-Object System.Uri($ftpfile)
$ftpclient.DownloadFile($uri, $target)
Start-Sleep -s 30
if ( ((get-date) - (ls $target).LastWriteTime).days -ge 1)
{
$success = 3
}
else
{
$testlength = (get-item $target).length
if( (get-item $target).length -gt 0)
{
Remove-Item "$olddatfull"
$success = 1
}
else
{
$success = 3
}
}
}
else
{
$thetime = Get-Date
$thetime = $thetime.Hour
if ($thetime -ge 16)
{
$success = 2
}
else
{
$success = 4
exit
}
}
#
# If successful download (success = 1) run push bat
#
if ($success -eq 1)
{
Start-Process "cmd.exe" "/c c:\scripts\mcafeepush.bat"
}
#Email structure
#
#Sends result email based on previous determination
#
#SMTP server name
$smtpServer = "emailserver.domain.com"
#Creating a Mail object
$msg = new-object Net.Mail.MailMessage
#Creating SMTP server object
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "email1#domain.com"
$msg.ReplyTo = "email2#domain.com"
$msg.To.Add("email2#domain.com")
switch ($success)
{
1 {
$msg.subject = "McAfee Dats $currentdat successful"
$msg.body = ("DAT download completed successfully. Automaton v1.0")
}
2 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("Looking for DAT $currentdat on the FTP site but I coud not find it. Human intervention may be required. Automaton v1.0")
}
3 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("$currentdat is available for download but download has failed. Human intervention will be required. Automaton v1.0")
}
default {
$msg.subject = "DAT Automaton Error"
$msg.body = ("Something broke with the McAfee automation script. Human intervention will be required. Automaton v1.0")
}
}
#Sending email
$smtp.Send($msg)
#Needed to keep the program from exiting too fast.
Start-Sleep -s 30
#debugging stuff
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
Apparently you have an error starting PowerShell, either because the execution policy is different on the PowerShell version you start, or on the account, or there is an access error on the scheduled task. To gather the actual error, you can launch the task like so:
cmd /c "powershell.exe -file d:\datscript\myscript.ps1 test.txt 2>&1" >c:\windows\temp\test.log 2>&1
This way if there would be an error on starting Powershell, it will be logged in the c:\windows\temp\test.log file. If the issue is in execution policy, you can create and run (once) a task with the following:
powershell -command "Get-ExecutionPolicy -List | out-file c:/windows/temp/policy.txt; Set-ExecutionPolicy RemoteSigned -Scope LocalMachine -Force"
Running a task under the account you plan to run your main task under will first get the policies in effect (so that if setting the machine-level policy doesn't help, you'll know which scope to alter) and then set the machine-level policy to "RemoteSigned", the least restrictive level short of allowing every script (which is highly discouraged; there are file-encrypting scripts written in PowerShell that can ruin your data).
Hope this helps.
UPDATE: If it's not policy, there might be an error in how the parameters for the task are written. You can do this: create a .bat file with the string that launches your script and redirects output to, say, test1.txt, then change the scheduled task to run cmd.exe /c launcher.bat >test2.txt, properly specifying the home folder. Run the task and review both files; at least one of them should contain an error that prevents your script from launching.
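The launcher might look like this (a sketch; adjust the paths to your environment):
@echo off
rem Redirect everything the script writes, including errors, to a log file.
powershell.exe -ExecutionPolicy Bypass -file d:\datscript\myscript.ps1 > d:\datscript\test1.txt 2>&1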