PowerShell as a scheduled task: issue with importing a module, it seems

I am attempting to configure some PowerShell/View PowerCLI scripts for our VMware Horizon environment. I have a PowerShell script that works properly to query the Horizon instance and check machine states. However, when I try to run it as a scheduled task under a service account, it appears to fail to import a module, because a command is unrecognized ("The term 'Connect-HVServer' is not recognized as the name of a cmdlet, function, script file, or operable program.")
I tried profiles as well; it didn't matter.
What I observed is that if I open PowerShell as the user in question (Run as different user > authenticate as the service account), leaving that PowerShell instance open allows the scheduled task to run as expected. However, if I close the PowerShell instance, the scheduled task fails. This is obviously not viable, since the goal is for the script to run on a schedule without the service account (or any account) being logged into the Windows server at the time the script runs.

The problem you're running into is environment variables. When running as a user versus running as the machine, the PSModulePath environment variable differs: the user's path includes per-user directories for user-scoped module installs. You should install PowerCLI machine-wide.
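For the machine-wide install, something along these lines should do it (a sketch, assuming the machine can reach the PowerShell Gallery; run from an elevated session):
# Install PowerCLI for all users so it lands on the machine-wide PSModulePath
Install-Module -Name VMware.PowerCLI -Scope AllUsers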
Alternatives (these assume your service account has admin privileges):
Modify your $Env:PSModulePath in the script to include each user's module path
Specify the path in an Import-Module statement in your script before you use any of the cmdlets
Example of the first alternative:
# Append each local user's per-user module directory to PSModulePath
foreach ($user in (Get-ChildItem -Path C:\Users -Directory)) {
    $Env:PSModulePath += ";$($user.FullName)\Documents\WindowsPowerShell\Modules"
}
Example of the second:
Import-Module -Name 'C:\Users\KnownUser\Documents\WindowsPowerShell\Modules\PowerCLI'

Related

Run powershell script within powershell script (Intune related)

This is Intune related but could probably apply outside the scope of Intune as well.
I wrote a PowerShell script that downloads a folder from an Azure blob storage and extracts the content. Within the extracted folder is another PowerShell script that I want to run.
The PowerShell script is deployed in Intune and runs successfully all the way up to the point where the second PowerShell script runs. From the log, it's running into a permission issue.
Scripts deployed through Intune are run as administrator/SYSTEM and don't require any local policy change to allow the execution of PowerShell scripts on the device. However, on the device the user account is only a standard user, so they don't have permission to execute PowerShell scripts. In the first script, I've already included a Set-ExecutionPolicy Bypass command.
I need to be able to deploy the script from Intune to the local device and essentially run another script as the local user (a non-administrator account). I thought maybe the local user needed to be included in the local Administrators group to be able to run the second script, but that did not work either.
I also read somewhere that PowerShell can run PowerShell scripts from other PowerShell scripts directly, and that the only time you need Start-Process for that is when you want to run the called script with elevated privileges (which isn't necessary here, since the parent script is already running elevated).
^^^Is this my issue? My script does include Start-Process to run the next PowerShell script.
Script below for reference.
New-Item -Path "C:\IT Drivers" -ItemType Directory ;
Invoke-WebRequest -Uri 'https://xyz.zip' -OutFile "C:\xyz.zip" ;
Expand-Archive -Path "C:\xyz.zip" -DestinationPath "C:\" ;
Set-ExecutionPolicy Bypass ;
Start-Sleep -Seconds 30 ;
Start-Process powershell "C:\xyz.ps1"
Any guidance would be appreciated, thank you!
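As a quick illustration of the quoted advice, the last line could invoke the downloaded script directly in the same (already elevated) session instead of using Start-Process; a sketch using the placeholder path from the question:
# Run the child script in the current PowerShell process
& 'C:\xyz.ps1'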

Task Scheduler - PowerShell script not firing?

I've created numerous scripts in PowerShell that work as intended if I execute them directly; however, when I try to set up a schedule to run them in Task Scheduler (to run with highest privileges), it doesn't seem to run anything at all.
I'm running the following in my actions:
powershell.exe -ExecutionPolicy Bypass -File C:\PS\Mailboxes\CheckForwardingList.ps1
I'm getting a "Last Run Result" of 0x0. The purpose of the above script is to generate a TXT file from Exchange Online (EXO), which it then mails out via SMTP. I've yet to receive any emails, and I also don't see any TXT file being generated in the folder where the script is located.
I have two additional scripts set up which aren't running either, but once I've addressed the issue with the above, that should quickly rectify their problems too.
I like to test my PowerShell scripts from a command prompt first.
For example, a script called C:\Tests\Test-PowerShellScriptsRunning.ps1 that contains only the following one-liner helps me test whether scripts can run successfully on a machine:
Write-Host -ForegroundColor Yellow "If you see this, then your script is running"
Next, I run this script from a command prompt, to get the syntax right in my scheduled task:
powershell.exe -nologo -file c:\Tests\Test-PowerShellScriptsRunning.ps1
Of course, you can add the -ExecutionPolicy Bypass parameter, but I prefer to test the execution policy first.
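For example, to see what policy is in effect at each scope before adding a bypass:
Get-ExecutionPolicy -List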
However, as you are running a script that connects to Exchange Online, I suspect this has to do with the user you are running the task under. Does the script run if you run the task under your own credentials, or are the credentials stored on the system or in the script?
You might want to check this article to see how you can register an app and authenticate without storing your credentials on the machine to run the script unattended: App-only authentication for unattended scripts in the EXO V2 module
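A minimal sketch of that app-only pattern (the app ID, certificate thumbprint, and tenant name below are placeholders for your own app registration details):
# Requires the ExchangeOnlineManagement module
# Unattended, certificate-based connection to Exchange Online - no stored user credentials
Connect-ExchangeOnline -AppId '00000000-0000-0000-0000-000000000000' -CertificateThumbprint 'ABCDEF1234567890ABCDEF1234567890ABCDEF12' -Organization 'contoso.onmicrosoft.com'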

Running quser.exe in PowerShell scripts works in IDE but not when running as a service

If I run the following snippet in a console window or in ISE it works as expected, listing the active user sessions on the local computer:
(Invoke-Expression "$env:windir\system32\quser.exe") -replace '\s{2,}', ',' | ConvertFrom-Csv
Unfortunately, this is not a console application but rather a PowerShell script that is installed as a service. The service runs as LocalSystem (not LocalService). When the service attempts to run this code, it outputs the following error:
The term 'C:\Windows\system32\quser.exe' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
I contacted Sapien support and was informed that "a service runs with no profile and does not have execution access to the system folders. You need to give the service account execution access to the EXE and its support DLLs, as well as use the full path to the EXE."
I have proven (I think) that the security principal has access to quser. I used PsExec to open a PowerShell console running as LocalSystem and successfully ran the quser application.
The issue MUST be that I'm running as a service. Does anyone know how I can access/use QUser in a service?
I guess the real question would be, how can services running as LocalSystem execute applications in system folders?
My guess is that whatever tool you are using to run your PowerShell script as a service is 32-bit, and there is not a quser.exe in C:\Windows\SysWOW64.
If this is the case, you can probably work around this on a 64-bit OS by running C:\Windows\Sysnative\quser.exe (see File System Redirector in the documentation for details).
If that's the case, I would say that the information you got ("service runs with no profile and does not have execution access to the system folders") is simply incorrect.
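If the 32-bit host does turn out to be the culprit, a small sketch like this picks the right path at run time (Sysnative is only visible to 32-bit processes on a 64-bit OS):
# Escape WOW64 file system redirection when running in a 32-bit process
$quserPath = if ([Environment]::Is64BitOperatingSystem -and -not [Environment]::Is64BitProcess) {
    "$env:windir\Sysnative\quser.exe"
} else {
    "$env:windir\System32\quser.exe"
}
(& $quserPath) -replace '\s{2,}', ',' | ConvertFrom-Csv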

PowerShell Script Deployed through Intune - Command Not Found

Not sure if this is for Stack Overflow or Server Fault.
I am deploying a PowerShell script using Microsoft Intune. The script works when run locally, but when deployed I get the error below:
Remove-LocalGroupMember : The term 'Remove-LocalGroupMember' is not
recognized as the name of a cmdlet, function, script file, or
operable program. Check the spelling of the name, or if a path was
included, verify that the path is correct and try again. At
C:\Program Files (x86)\Microsoft Intune Management
I am not sure why this cmdlet is unavailable, as it is definitely there if I open a PowerShell window and run the command myself.
I am logging the $user variable to check that it is not null or running under a different context.
The code is quite simple as below:
$user = $(whoami)
$user | Out-File 'C:\powershelllog.log'
Remove-LocalGroupMember -Group Administrators -Member $user
I believe I've run into the same issue you are having. I've been trying to create a local admin account on machines, running the PowerShell script in the system context in Intune. What I've found is that you must check "Run script in 64 bit PowerShell Host" in Intune where you import PowerShell scripts.
Apparently not all cmdlets are available in the 32-bit PowerShell host it uses otherwise.
I also used Get-Command to determine which module the command that was reported as not found lives in, and added an Import-Module for it at the top of my script just in case.
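For reference, another common workaround (a sketch, not specific to Intune) is to have the script relaunch itself in the 64-bit host when it detects it was started in the 32-bit one:
# PROCESSOR_ARCHITEW6432 is only set for 32-bit processes on 64-bit Windows
if ($env:PROCESSOR_ARCHITEW6432 -eq 'AMD64') {
    & "$env:windir\Sysnative\WindowsPowerShell\v1.0\powershell.exe" -ExecutionPolicy Bypass -File $PSCommandPath
    exit $LASTEXITCODE
}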

How do I get PowerShell run from a scheduled task to pick up the newest environment variables?

I have a scheduled task that runs a PowerShell script as the SYSTEM user. That's all good, except that it doesn't seem to pick up the latest environment variables.
I have verified that the environment variable in question is a "System Variable" and not just a user variable for me only.
In the scheduled task I've specified PowerShell as the command and provided arguments like:
-command "& 'myscript' 'my args'"
The script runs, but I fail to import a module since it seems like the scheduled task is using an old environment.
The "Local Service" user can see the updated variables, but not the system user.
How do you set your environment variable? Using setx?
You can use the following to verify the variable interactively.
From an elevated command prompt I used setx /m test testvalue
Then I used PsExec to run PowerShell as the SYSTEM user:
psexec -i -s powershell.exe -noexit
In the opened PowerShell session, I can read the variable:
PS C:\Windows\system32> $env:test
testvalue
Update
I confirm that the newest environment variables are not seen from the scheduler, but after a reboot this works.
A post on Super User suggests that killing all taskeng.exe processes should be enough, but this has not worked on my 2008 R2 server.
My guess is that the registry values are read when the Task Scheduler service starts and are not reloaded at each task run. Still, it does not explain why the variables are accessible to the Local Service account...
As a workaround, you should be able to read the environment value directly from the registry.
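For example, using the 'test' variable from earlier (machine-scoped variables live under this key):
# Read the value straight from the registry, bypassing the stale process environment
(Get-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager\Environment' -Name test).test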