How to execute PowerShell script as Azure Automation Runbook from InlineScript inside PSWorkflow runbook?

In a PowerShell Workflow activity, I can call a native PowerShell script using InlineScript:
workflow test
{
    InlineScript
    {
        .\script.ps1
    }
}
But in Azure Automation, the .\ path (at least in my tests) resolved to c:\windows\system32, and the script-as-runbook did not exist there; the call failed because it could not find the script.
Is it possible to execute a native PS runbook stored in AAuto like this?
If so, how do I specify the path to the file?
Is this a bug/oversight in Azure Automation's parsing/compilation process of Workflow runbooks & InlineScript activities, preventing the dependent runbook from being copied to the worker?
I did a little hunting, and found that when native PS runbooks are executed:
They are first inspected for any other runbook references.
As part of the deployment to the worker for execution, a randomly-named folder is created under C:\Temp\
Referenced runbooks are eventually copied to this folder.
If runbooks are NOT found to be referenced, they are NOT copied to the temp directory.
The root runbook does not appear to be copied to the folder.
The dynamically-named folder is NOT created (under c:\Temp) when executing a Workflow runbook.
As part of standard workflow compilation, InlineScript activities have their contents copied into the autogenerated XAML. I'm uncertain what happens with a linked file, though based on the behavior that looks to be a runtime concern. My guess is that compilation happens each time a workflow is executed (hence the delayed start) and takes place on the worker, using the standard PowerShell workflow compilation just as it would locally.
I cannot (easily) convert this script to a workflow, and it is used from within other workflow activities. Right now the only way I can make this 'work' is to copy & paste the script into the first InlineScript within a workflow that requires it, which is obviously tedious & annoying from a maintenance perspective.
Presumably, as a workaround, I could use a Hybrid Worker, but that comes with a host of other issues, like ensuring the child runbooks are published there & having to maintain them separately, or AAuto not automatically pushing custom modules from the Automation Account to the worker (though this is planned), etc.
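To illustrate, here is a minimal sketch of that copy/paste workaround (the InlineScript body is a placeholder standing in for the real contents of script.ps1):
workflow test
{
    InlineScript
    {
        # contents of script.ps1 pasted here verbatim
        Write-Output "work that script.ps1 used to do"
    }
}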

Please see https://azure.microsoft.com/en-us/blog/announcing-powershell-script-support-azure-automation-2/:
Right now, you can only invoke inline PowerShell runbooks from PowerShell runbooks, and PowerShell Workflow or Graph runbooks from PowerShell Workflow or Graph runbooks. This may change in the future.
It hasn't changed yet :)
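For completeness, here is a hedged sketch of the combinations that do work today (runbook names are hypothetical):
# From a PowerShell Workflow runbook, call another *workflow* runbook inline, by name:
workflow Parent-Workflow
{
    Child-WorkflowRunbook -SomeParameter "value"
}
# From a plain PowerShell runbook, call another *PowerShell* runbook as a script:
.\Child-PowerShellRunbook.ps1 -SomeParameter "value"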

Related

Scale up Azure Analysis service with Windows powershell

I'm working on a tabular model deployed on Azure Analysis Services.
We use Microsoft SSIS to load the DW and process the tabular model.
We now want to increase the QPU via SSIS or with a PowerShell script.
I found a script for SSIS (http://microsoft-bitools.bl...) and another option: running a PowerShell script via a runbook directly on Azure.
1. What's the difference between a PowerShell script and a PowerShell runbook script?
2. Do you know how I can do this task?
Thank you for your help.
For Q1, from a functional point of view there is no difference between scaling up Azure Analysis Services with a local PowerShell script and with an Azure Automation PowerShell runbook. Both need to import the Azure PowerShell modules and call the same functions.
For a local PowerShell script, you install the modules you need with Install-Module; for an Azure Automation PowerShell runbook, you can manage the required modules in the Azure portal.
For Q2, if you want to change your Analysis Services pricing tier (SKU) to get more QPU, you can use the PowerShell command below:
Set-AzAnalysisServicesServer -Sku <sku name, such as B1 or B2> -Name <your server name> -ResourceGroupName <the resource group name>
See the reference documentation for this command.
If you are not sure how to use Azure PowerShell commands, please see the official guide.
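As a hedged illustration, a minimal runbook along those lines might look like the following. It assumes the Az.AnalysisServices module has been imported into the Automation account and that the account's managed identity has permissions on the server; all names are placeholders.
# Sign in as the Automation account's managed identity (assumption: a managed identity is enabled)
Connect-AzAccount -Identity | Out-Null
# Scale the Analysis Services server to the target pricing tier for more QPU
Set-AzAnalysisServicesServer -ResourceGroupName "my-resource-group" -Name "myanalysisserver" -Sku "B2"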

Azure DevOps - Calling a PS script from within a PS script

Can you not call a script from within a script in an Azure PowerShell task?
Background -
I have an Azure Repo with two scripts in it (let's call them script0 and script1). There's no build going on, so there's no build pipeline; there's just a release pipeline. The artifact it picks up is from an Azure Repos Git repository. I have just one task in the (release) pipeline, and it's the Azure PowerShell task.
In script0, which is the main script, I have a for loop that requires me to run script1 (apart from the various other things that go on in the loop).
For the life of me, I am unable to figure out how I can achieve that. Worst of all, it works locally. Also, everything else in the loop works. I have tried tons of things to fix it, but I will start with just this for now: the error I am thrown when I run
$TeamFoundationCollectionUri$TeamProject/testscript.ps1 $stage $FunctionHosts[$i] (($hashtable | select -First 6).Key[$i]) $ResourceGroupName $location $functionApps $AdminClientSecret $VaultName $JsonFile
(Now, mind you - that is part of script0 - the main script).
Here's the error (a screenshot in the original; the blurred area is script0 and testscript.ps1 is script1).
I have tried almost everything:
Using the Call operator &
Using \, /, //
Invoke-Expression -Command "<code here>"
Invoke-Command
Also tried powershell.exe -Command <code here>
As you can tell, none of these have worked.
I got this working by using the Call operator (&) before the path where the script resides. So, I did this:
& $(System.ArtifactsDirectory)\$(System.TeamProject)\testscript.ps1 <pass the params here>
and it worked.
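A hedged sketch of that pattern (the parameter list here is illustrative, not the asker's exact one): build the child script path from the pipeline variables and invoke it with the call operator.
# In an inline Azure PowerShell task, the $(...) tokens are expanded by the pipeline before PowerShell runs
$childScript = "$(System.ArtifactsDirectory)\$(System.TeamProject)\testscript.ps1"
& $childScript $stage $ResourceGroupName $location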

Execute PowerShell script from network folder

I'm developing a fairly large automated build in TFS 2017 with a local VSTS build machine. My custom tasks will mostly be in PowerShell.
The inline PowerShell task handles only 500 or so characters and is too small for most tasks. Right now I edit my PowerShell script, check it in, do a test run, read the log for errors, correct, check in again, and so on.
This is a bit tedious, and I wonder what options there are. I would like to avoid checking in each change to the script. Is there any option, like executing my PowerShell tasks from a network location during development of the build process?
You can specify a UNC file path in the PowerShell task.
You can also store the script files on a server (e.g. an FTP server), then download the file to the working directory during the build through PowerShell or another task.
On the other hand, there is a PowerShell on Target Machines task that can execute PowerShell scripts on remote machines.
You can use dot sourcing with your UNC path:
PS> . \\server\path\to\your\scriptmcscript.ps1
or use the invocation operator:
& \\server\path\to\your\scriptmcscript.ps1
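The difference, briefly (the path and function name below are placeholders): dot sourcing runs the script in the current scope, so anything it defines stays available afterwards, while the call operator runs it in a child scope.
. \\server\share\helpers.ps1          # dot source: functions/variables defined in helpers.ps1 stay in scope
Get-HelperThing                       # ...so they can be used afterwards (hypothetical function)
& \\server\share\build.ps1 -Verbose   # call operator: the script runs in its own scope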
You can use a UNC path for the file with the PowerShell task.
Or you could use the PowerShell on Target Machines task to run it.
But be careful about your choice. Keep in mind that it is the build/deployment agent that runs your script, so while you are running it inside your corporate network everything will be fine, because the agent can see your UNC path.
The moment you use an agent on a machine outside your network, you will have to think about another solution, which may include saving your PowerShell file to a repo like Git or TFVC and then downloading it to the local computer where the agent runs.
This is the only way that works for me: call PowerShell from a .bat script with the execution policy set to Bypass (process scope only).
-NonInteractive = do not prompt for user input
-NoProfile = do not load the PowerShell profile
powershell.exe -NoProfile -ExecutionPolicy Bypass -NonInteractive -Command C:\Users\User\Script.ps1
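If you would rather stay inside PowerShell than use a batch wrapper, the same process-scoped bypass can be set from PowerShell itself; this is an alternative sketch, not the answer above:
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope Process -Force   # affects only the current process
& "C:\Users\User\Script.ps1"                                        # then invoke the script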

Including a runbook.ps1 script in your runbook

Scenario:
I have 2 runbooks, runbook A and runbook B.
Inside runbook B, I want to load runbook A into memory.
Running the below doesn't work; can someone help me out, please?
.\runbookb.ps1
. ./runbookb.ps1
Assuming that you want to invoke runbook B from runbook A, first of all please make sure runbook B exists in your Automation Account and is published. Then, depending on the runbook type, you need to use different syntax:
For PowerShell runbooks, the correct syntax is: ./runbookb.ps1 or . ./runbookb.ps1 (depending on whether you want the "dot-sourcing" behavior or not)
For PowerShell Workflow runbooks, the correct syntax is: runbookb
If you really want to load the runbook content into memory, this is different:
For PowerShell runbooks: Get-Content ./runbookb.ps1
For PowerShell Workflow runbooks: use the Export-AzureRmAutomationRunbook cmdlet.

Azure Runbook - Create from gallery using a Powershell script

I was trying to create a runbook in Azure Automation and chose to use a PowerShell script (cmdlet), but the creation failed with an error because Azure couldn't directly convert it, and a manual edit is required to make it a "workflow". I am not a pro at PowerShell, and I understand you need to know how a workflow works, but my immediate goal is to get the runbook working, so I'm trying to figure out an easy way to convert a PS script to a PS workflow for the purposes of the runbook. Appreciate any help.
Please see this blog post for info on PowerShell script to PowerShell Workflow conversion: http://azure.microsoft.com/blog/2014/11/25/introducing-the-azure-automation-script-converter/
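For what it's worth, a manual conversion often amounts to wrapping the script body in a workflow block and moving anything workflow doesn't support into an InlineScript; a hedged sketch (the body is a placeholder):
workflow My-ConvertedRunbook
{
    InlineScript
    {
        # original script body goes here, largely unchanged
        Write-Output "Hello from the converted runbook"
    }
}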
However, regardless of conversion, you can't interact with local files on your computer from Azure Automation, so this script won't work. See https://social.msdn.microsoft.com/Forums/en-US/03a7aace-4e3e-4dab-9a8f-4ed1002fde4e/how-to-upload-files-from-users-machine-to-blob-storage-using-azure-automation?forum=azureautomation for more details.