Update: I used Azure Automation per BenV's suggestion below and it worked! More info can be found here.
I have a PowerShell script that needs to run a few Azure commands like New-AzureStorageContext, Get-AzureStorageContainer, Set-AzureStorageBlobContent, etc. I'd like to run the script as a webjob.
When I run this script as a webjob I receive errors below on the Azure commands. Other PS commands run successfully from the webjob.
I searched Stack Overflow and couldn't find posts for these errors generated when Azure commands are run from a webjob. Somewhat related posts mentioned using Import-Module, which is similar to the advice given below.
An older MSDN blog post suggested adding "Import-Module Azure.ps1" in the PS script and including Azure.ps1 inside the webjob zip file. (It's really Azure.psd1 from my local C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure.) Separately, I tried Import-Module with Azure.psd1 and then Azure.ps1, thinking the errors might be related to the file extension, but they weren't.
My webjob .zip file only has my .CMD file, GetLinks.ps1 and Azure.ps1.
My .CMD file launches my PS script with: PowerShell.exe -ExecutionPolicy RemoteSigned -File GetLinks.ps1
At the top of this .ps1 file I have: "Import-Module .\Azure.ps1". This runs successfully since I see "INFO" statements in my WebJob run log.
Next my PS script tries to run the Azure PS commands, and I still get the same errors, like the example below.
New-AzureStorageContext : The term 'New-AzureStorageContext' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At D:\local\Temp\jobs\triggered\getlinks2\b2025qk5.ddj\GetLinks.ps1:75 char:19
+ $storageContext = New-AzureStorageContext -StorageAccountName $storageAccountNam ...
+ ~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (New-AzureStorageContext:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
Azure PowerShell is not currently supported from the sandbox in which WebJobs runs. There are a variety of factors:
1. The cmdlets are not installed on the worker.
2. Even if installed, there are issues that prevent them from running correctly.
3. Even if they run, you'd need to authenticate before running commands. This last part is solvable using Service Principals.
Factor #2 is the biggest blocker. We'd like to get to a point where this is possible, but right now it is tricky.
One potential workaround is to do ARM requests directly, though that's a bit more work (and you still need to auth using Service Principal). You could also write C# code to make the calls.
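As a rough sketch of that route (all the IDs below are placeholders; it assumes a Service Principal with a client secret already exists and that Invoke-RestMethod, i.e. PowerShell 3.0+, is available in the sandbox):
# Placeholders - substitute your own tenant, subscription, and Service Principal values
$tenantId     = '00000000-0000-0000-0000-000000000000'
$subscription = '11111111-1111-1111-1111-111111111111'
$clientId     = '22222222-2222-2222-2222-222222222222'
$clientSecret = $env:SP_CLIENT_SECRET
# Get a bearer token for ARM using the client-credentials grant
$token = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" `
    -Body @{
        grant_type    = 'client_credentials'
        client_id     = $clientId
        client_secret = $clientSecret
        resource      = 'https://management.core.windows.net/'
    }
# Call ARM directly, e.g. list the resource groups in the subscription
Invoke-RestMethod -Method Get `
    -Headers @{ Authorization = "Bearer $($token.access_token)" } `
    -Uri "https://management.azure.com/subscriptions/$subscription/resourcegroups?api-version=2015-01-01"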
Another possible option might be to use a node script and the Azure CLI. Alas, I also tried to work around this issue with a .sh script, but that fails while trying to set up the Azure CLI environment (see https://github.com/projectkudu/kudu/issues/1935). Finally, if it's just storage functions you need, you might try using SAS tokens and HTTP requests to do the basic operations within your own PS functions...
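For the SAS-token idea, a minimal sketch (the SAS URL and file paths are placeholders; you would generate the SAS up front with the right permissions on the container or blob):
# Placeholder: a pre-generated SAS URL pointing at the target blob
$sasUrl = 'https://myaccount.blob.core.windows.net/mycontainer/links.json?<sas-token>'
# Upload a local file as a block blob via the Put Blob REST operation
Invoke-RestMethod -Method Put -Uri $sasUrl `
    -Headers @{ 'x-ms-blob-type' = 'BlockBlob' } `
    -InFile 'D:\local\Temp\links.json'
# Download it again with a plain GET
Invoke-RestMethod -Method Get -Uri $sasUrl -OutFile 'D:\local\Temp\links-copy.json'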
When I try to run the command Add-PSSnapin Microsoft.SharePoint.Powershell in the SharePoint Online Management Shell, I get the following error:
Add-PSSnapin : The Windows PowerShell snap-in 'Microsoft.SharePoint.Powershell' is not installed on this computer
+ CategoryInfo: InvalidArgument: (Microsoft.SharePoint.Powershell:string) [Add-PSSnapin], PSArgumentException
From what I understand this is supposed to come installed with the SharePoint Online Management Shell anyway (this is a fresh download), so why won't it let me install it?
I can log in to SharePoint using the $AdminURL, $AdminName & $Password, so it's not the end of the world (and it proves that this should work), but it obviously makes the script less easy to run across sites, as it has to be modified every time to change the URL and admin name.
EDIT: I read on another post that adding this module (not snapin) would fix my issue of the errors "Get-SPSite is not recognized as the name of a cmdlet, function, script file, or operable program" and "Get-SPWebApplication is not recognized as the name of a cmdlet, function, script file, or operable program". However, this did not fix my problem...
As mentioned by user kuzimoto, the commands I was using are not compatible with SharePoint Online; they are designed to be used with SharePoint Server.
The correct way to connect is to use the Connect-SPOService command and pass the credentials through there, or just have them hard-coded in your script.
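A minimal sketch, assuming the SharePoint Online Management Shell is installed (the tenant admin URL and account below are placeholders):
# Placeholders - substitute your own tenant admin URL and account
$AdminURL  = 'https://contoso-admin.sharepoint.com'
$AdminName = 'admin@contoso.onmicrosoft.com'
$Password  = Read-Host -Prompt 'Admin password' -AsSecureString
$cred = New-Object System.Management.Automation.PSCredential($AdminName, $Password)
Connect-SPOService -Url $AdminURL -Credential $cred
# Use the SPO cmdlets instead of the on-premises Get-SPSite / Get-SPWebApplication
Get-SPOSite | Select-Object Url, Owner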
I am running a PowerShell script which requires connection to a DPM Server.
When I run the Connect-DPMServer <DPM Server Name> cmdlet from the DPM Management Shell, the command succeeds and I am able to connect to the server.
However, when I enclose the same command in a script and invoke the script through the DPM Management Shell, the following error occurs:
The term 'Connect-DPMServer' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
+ CategoryInfo : ObjectNotFound: (Connect-DPMServer:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
Similar is the case with other DPM cmdlets like Get-DPMProtectionGroup.
I am running Powershell version 2.0 on Windows Server 2008 R2.
What is the reason for this peculiar behaviour and how can I get around this?
Edit
Here is an observation I made. My script has two parts: a wrapper script and a helper script, which is called by the wrapper script as an independent job.
All the DPM commands are recognized in the wrapper script, but they are not recognized in the helper script when it runs as a job.
Any explanation why this may be, and any suggestions to resolve it?
I figured out the solution and here it is:
What is Happening
The wrapper script runs in the DPM PowerShell and then invokes the helper script as a separate job or thread. The environment in which this helper script runs is native Windows PowerShell, not DPM PowerShell, so the DPM commands are not recognized there.
Solution
The DPM-specific modules need to be imported as soon as the helper script is invoked. The steps are as follows:
1. Right-click the DPM Management Shell icon and view its properties.
2. Note the value of Target. For me, it looks like this: C:\Windows\system32\windowspowershell\v1.0\powershell.exe -noexit -File "D:\DPM\DPM\bin\dpmcliinitscript.ps1"
3. The value of the -File parameter, "D:\DPM\DPM\bin\dpmcliinitscript.ps1", is the file which, when loaded into Windows PowerShell, turns it into the DPM Management Console; that is, it loads the shell with the DPM commands.
4. Include this file in the helper script through dot-sourcing, which means the first line of the helper script should look like this: . "D:\DPM\DPM\bin\dpmcliinitscript.ps1"
This will let the invoked shell recognize the DPM-specific commands.
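Put together, a sketch of the two files might look like this (helper.ps1, wrapper.ps1, and the server name are hypothetical; the dpmcliinitscript.ps1 path is the one taken from the shortcut's Target above):
# helper.ps1 - runs as an independent job in plain Windows PowerShell,
# so the first line dot-sources the DPM init script to load the DPM cmdlets
. "D:\DPM\DPM\bin\dpmcliinitscript.ps1"
Connect-DPMServer -DPMServerName 'MyDpmServer'        # placeholder server name
Get-DPMProtectionGroup -DPMServerName 'MyDpmServer'
# wrapper.ps1 - launches the helper as a separate job and collects its output
Start-Job -FilePath .\helper.ps1 | Wait-Job | Receive-Job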
I have a PowerShell script saved in a .cmd file that downloads a file from the web and then unzips it. My Azure web role executes it upon startup. This is the script:
powershell -ExecutionPolicy Bypass -c $(New-Object Net.WebClient).DownloadFile('URL.zip', 'FILE.zip') ;
(New-Object -com shell.application).namespace('c:\FOLDER').Copyhere((New-Object -com shell.application).namespace('FILE.zip').items())
When I run the script via Azure startup tasks:
The first part of the script works. The file is downloaded. The second part of the script which unzips does not run.
When I run the script via the command line when remoted into the VM:
The entire script runs.
I therefore know this is not a syntax error. The only difference I can think of between the two cases above is a permissions issue. But I am running PowerShell with -ExecutionPolicy set to Bypass, which is the least restrictive execution policy. Anybody have any ideas? Thanks!
Change the command so that its output is dumped into a file. Something like this should work:
<YOUR_COMMAND> > out.log 2> err.log
Run the task again and checkout the output in the logs.
Also, you are using relative paths rather than absolute ones. The scheduled task probably runs with the Windows system folder as its working directory, so you may be getting a permissions error from that. Try using an absolute path to a directory you created.
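Putting both suggestions together, a hypothetical version of the startup task might look like this (the URL, folder, and log file names are placeholders), with the one-liner moved into its own .ps1 so the paths stay readable:
rem startup.cmd - capture stdout and stderr so the startup task leaves a trace
powershell -ExecutionPolicy Bypass -File "%~dp0download-unzip.ps1" > "C:\FOLDER\startup-out.log" 2> "C:\FOLDER\startup-err.log"
# download-unzip.ps1 - same logic as before, but with absolute paths throughout
(New-Object Net.WebClient).DownloadFile('http://example.com/FILE.zip', 'C:\FOLDER\FILE.zip')
$shell = New-Object -ComObject Shell.Application
$shell.NameSpace('C:\FOLDER').CopyHere($shell.NameSpace('C:\FOLDER\FILE.zip').Items())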
Trying to retrieve help from a script gives the following error:
Get-Help : Cannot find Help for topic ".\Process-Test.ps1".
At line:1 char:9
+ get-help <<<< .\Process-Test.ps1
+ CategoryInfo : ResourceUnavailable: (:) [Get-Help], HelpNotFoundException
+ FullyQualifiedErrorId : HelpNotFound,Microsoft.PowerShell.Commands.GetHelpCommand
I've encountered the same error when attempting to retrieve help information from any custom PowerShell script. This does not happen when viewing help information from built-in cmdlets.
A test script is below:
<#
.SYNOPSIS
Adds a file name extension to a supplied name.
.DESCRIPTION
Adds a file name extension to a supplied name.
Takes any strings for the file name or extension.
.EXAMPLE
C:\PS> extension -name "File"
File.txt
#>
Write-Host "Test script"
Troubleshooting steps I've taken:
I've copied this script (or similar scripts) to other machines with PowerShell installed and used it to view help successfully.
I've also been able to view the help successfully using a different account (User2) on my computer, but only when logged in as that user (versus running the PowerShell console as User2 while logged in as User1).
I've tried viewing the help with and without my PowerShell profile loaded, with the same result (I only have one profile loaded, my personal profile versus machine profiles).
I took this to be a sign that there was a problem with my Windows user profile, so I deleted my profile and re-created it with the same result. I've also tried running System Restore, with no change.
This happens in the PowerShell console along with the ISE.
Occurs when using both Get-Help as well as help.
I noticed, however, that my PowerShell console settings stayed consistent throughout deleting and re-creating my Windows user profile (height, width, colors, etc), which I wouldn't have expected since I deleted my user profile.
Since I'm using Windows 7, I'm not able to uninstall PowerShell and re-install as it's baked into the OS.
Google wasn't helpful for me in this case, but my google skills could be lacking. Any ideas as to further troubleshooting steps, or anyone who's seen this error before?
Edit: this only happens with the 64-bit version of the console and ISE, not with the 32-bit version, and persists through profile deletion
Have you tried setting the execution policy?
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope Process
Then run Get-Help .\script.ps1.
I had the same problem. That was because my script was located on a network share, in a DFS folder, so I was pointing at a network file. When I copied the file locally, directly to the root of my C drive, and called help for my script with the usual Get-Help .\myscript.ps1 syntax, it worked!
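In other words, something as simple as this did the trick (the UNC path is just a placeholder for wherever the script lives on the share):
# Copy the script off the DFS share and ask for help on the local copy
Copy-Item '\\dfsroot\share\scripts\Process-Test.ps1' 'C:\Process-Test.ps1'
Get-Help 'C:\Process-Test.ps1' -Full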
I'm using Hudson version 1.324 for CI and have a couple of issues:
Environment:
Windows Server 2008
Powershell v1.0
Hudson 1.324 running as a service
Hudson Powershell Plugin installed
Psake (aka "PowerShell Make/Rake", available from GitHub) 0.23
(All current/latest versions as of this initial post)
I have a Powershell (PS) script that works to compile, run NUnit tests, and if successful, create a 7z file of the output. The PS script works from the command line, on both my local development box as well as the CI server where Hudson is installed.
1) Execution Policy with Powershell.
I initially ran a PS console on the server and ran Set-ExecutionPolicy Unrestricted, which allows any script to be run. (Yes, I realize the security concerns here; I'm trying to get something to work, and Unrestricted should remove the security issues so I can focus on other problems.)
[This worked, and allowed me to fire off the PS build script from Hudson yesterday. I then encountered another problem, but we'll discuss that more in item #2.]
Once Hudson could fire off a PS script, it complained with the following error:
"C:\Windows\system32\WindowsPowerShell\v1.0\powershell "&
'OzSystems.Tools\psake\psake.ps1' '.\oz-build.ps1'" The term
'OzSystems.Tools\psake\psake.ps1' is not recognized as a cmdlet, funct
ion, operable program, or script file. Verify the term and try again.
At line:1 char:2
+ & <<<< 'OzSystems.Tools\psake\psake.ps1' '.\oz-build.ps1'"
Using the same command line, I am able to successfully execute the PS script from the command line manually. However, Hudson is unable to get PS to do the same. After looking at additional PS documentation, I also tried this:
"& 'OzSystems.Tools\psake\psake.ps1' '.\oz-build.ps1'"
and got a similar error. There does not appear to be any documentation for the Powershell plugin for Hudson. I've gone through all the Powershell plugin files and don't see anything that's configurable. I can't find a log file for Hudson to get additional information.
Can anyone help me past this?
I spent yesterday wrestling with #1. I came in this AM and tried to dig in again, after restarting the Hudson server/service, and now it appears that the ExecutionPolicy has been reset to Restricted. I did what worked yesterday: opened a PS console and set the ExecutionPolicy to Unrestricted. It shows Unrestricted in the PS console, but Hudson says that it doesn't have rights to execute PS scripts. I reopened a new PS console and confirmed that the ExecutionPolicy is still Unrestricted -- it is. But Hudson evidently is not aware of this change. Restarting the Hudson service again does not change Hudson's view of the policy.
Does anyone know what's going on here?
Thanks, Derek
I just ran into the problem of running PowerShell scripts in Hudson. The thing is that you are running a 32-bit Java process, and you've configured 64-bit PowerShell but not 32-bit. See the following thread we created on the Microsoft forums.
http://social.technet.microsoft.com/Forums/en/winserverpowershell/thread/a9c08f7e-c557-46eb-b8a6-a19ba457e26d
If you're lazy:
1. Start Windows PowerShell (x86) from the Start menu as administrator.
2. Set the execution policy to RemoteSigned, as shown below.
Run this once and you're home free.
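In other words, from that elevated 32-bit console, one command is enough:
Set-ExecutionPolicy RemoteSigned
Get-ExecutionPolicy    # should now report RemoteSigned in the 32-bit host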
When running PowerShell from a scheduled task or Hudson, you want to:
Specify the -ExecutionPolicy parameter (in your case: -Ex Unrestricted)
Specify the command using either -Command { ... } or -File, NOT BOTH, and not without specifying which you mean.
Try this (except that I don't recommend using relative paths):
PowerShell.exe -Ex Unrestricted -Command "C:\Path\To\OzSystems.Tools\psake\psake.ps1" ".\oz-build.ps1"
To be clear, this will work too:
PowerShell.exe -Ex Unrestricted -Command "&{&'OzSystems.Tools\psake\psake.ps1' '.\oz-build.ps1'}"
The first string after -Command is interpreted as THE NAME OF A COMMAND, and every parameter after that is just passed to that command as a parameter. The string is NOT a script, it's the name of a command (in this case, a script file)... you cannot put "&'OzSystems.Tools\psake\psake.ps1'" but you can put "OzSystems.Tools\psake\psake.ps1" even if it has spaces.
To quote from the help (run PowerShell -?) emphasis mine:
-Command
Executes the specified commands (and any parameters) as though they were
typed at the Windows PowerShell command prompt, and then exits, unless
NoExit is specified. The value of Command can be "-", a string, or a
script block.
If the value of Command is "-", the command text is read from standard
input.
If the value of Command is a script block, the script block must be enclosed
in braces ({}). You can specify a script block only when running PowerShell.exe
in Windows PowerShell. The results of the script block are returned
to the parent shell as deserialized XML objects, not live objects.
If the value of Command is a string, Command must be the last parameter
in the command, because any characters typed after the command are
interpreted as the command arguments.
I have been having the same problems as you (as you've seen from my comments). I have given up on the PowerShell launcher and moved to running things using the batch-file launcher. Even though I had set the system to Unrestricted, that setting didn't seem to matter to Hudson's launcher. I don't know if it runs in some other context or something; even adding things to the global profile.ps1 didn't seem to help. What I ended up doing was running
powershell " set-executionpolicy Unrestricted; & 'somefile.ps1'"
which does what I need, although it isn't ideal. I've e-mailed the plugin author about this and will update.
For question #1, try this (assuming you are using PowerShell 2.0):
"C:\Windows\system32\WindowsPowerShell\v1.0\powershell -executionPolicy Unrestricted -file OzSystems.Tools\psake\psake.ps1 C:\{path}\oz-build.ps1"
You are using "." for the path to oz-build.ps1. I suspect you will need to provide the full path to your oz-build.ps1 file to make this work, unless the infrastructure that executes the command above happens to have the current dir set correctly. And even if it is set correctly for the process, that only matters to .NET/Win32 API calls and not to PowerShell cmdlets. The current dir in PowerShell is tracked differently from the process's current dir because PowerShell can have multiple runspaces running simultaneously; that sort of global, mutable value doesn't work in a concurrent scenario.
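You can see this from any PowerShell prompt; the cmdlet location and the process working directory move independently (a quick illustration, nothing Hudson-specific):
Get-Location                       # PowerShell's current location, used by cmdlets
[Environment]::CurrentDirectory    # the process working directory, used by .NET/Win32 calls
Set-Location C:\Windows
Get-Location                       # now C:\Windows
[Environment]::CurrentDirectory    # unchanged - Set-Location does not move the process CWD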
As for question #2, what account does the Hudson service run under? Make sure that account has run Set-ExecutionPolicy RemoteSigned (or Unrestricted).
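If in doubt, you can check what that account actually sees, for example from a build step running under the service account (this assumes the execution-policy scopes available from PowerShell 2.0 onward):
Get-ExecutionPolicy -List                              # shows MachinePolicy, UserPolicy, Process, CurrentUser, LocalMachine
Set-ExecutionPolicy RemoteSigned -Scope LocalMachine   # machine-wide; needs an elevated prompt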
I just got through this exact problem. What a pain!
If you are running a 32-bit JVM on a 64-bit Windows, make sure that you set the execution policy for the 32-bit Powershell interface. I found my 32 bit executable here:
C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe
The 32- and 64-bit Powershell environments are completely distinct so setting the execution policy in one has no effect on the other.
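As a one-time fix from an elevated command prompt, you can set the policy in both hosts explicitly (on 64-bit Windows, System32 holds the 64-bit host and SysWOW64 the 32-bit one):
rem 64-bit PowerShell
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command "Set-ExecutionPolicy RemoteSigned"
rem 32-bit (x86) PowerShell
C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe -Command "Set-ExecutionPolicy RemoteSigned"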