Background
I have one Azure Runbook (named RunStoredProcedure2) defined as such:
param(
    [parameter(Mandatory=$True)]
    [string] $SqlServer,
    [parameter(Mandatory=$False)]
    [int] $SqlServerPort = 1433,
    [parameter(Mandatory=$True)]
    [string] $Database,
    [parameter(Mandatory=$True)]
    [string] $StoredProcedureName,
    [parameter(Mandatory=$True)]
    [PSCredential] $SqlCredential
)
# Get the username and password from the SQL credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password
# Open the connection and execute the stored procedure
$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")
$Conn.Open()
$Cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC $StoredProcedureName", $Conn)
$Cmd.CommandTimeout = 120
$Cmd.ExecuteNonQuery()
# Close the SQL connection
$Conn.Close()
I'm now attempting to call this runbook from another Azure Runbook using this command:
& .\RunStoredProcedure2.ps1 -Database 'adventureworksnh.database.windows.net' -SqlServer 'AdventureWorksDW' -SqlServerPort 1433 -StoredProcedureName 'TestJob1' -sqlCredential 'awadmin'
Issue
When I attempt to run this, I get this error:
C:\Temp\uzahthmc.su1\RunStoredProcedure2.ps1 : Cannot process argument transformation on parameter 'SqlCredential'. A
command that prompts the user failed because the host program or the command type does not support user interaction.
The host was attempting to request confirmation with the following message: Enter your credentials.
I am able to run RunStoredProcedure with these parameters successfully
What I've Tried
Adding/removing the preceding ampersand
Using Invoke-Expression
Do you have a Credential asset named "awadmin" in your Automation account? When you start a runbook directly (using the Start button in the portal or the Start-AzureRmAutomationRunbook cmdlet), Azure Automation allows you to pass a credential asset name as a parameter value, and it will retrieve the specified credential automatically. However, when you invoke a runbook from another runbook, you have to follow plain PowerShell rules and pass a PSCredential object, as declared:
$sqlCred = Get-AutomationPSCredential -Name 'awadmin'
& .\RunStoredProcedure2.ps1 ... -sqlCredential $sqlCred
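A full invocation might look like the sketch below (parameter values are illustrative; note that in the question's call the server and database arguments appear swapped, so the fully-qualified server name is passed to -SqlServer here):

```powershell
# Retrieve the PSCredential object from the Automation credential asset
$sqlCred = Get-AutomationPSCredential -Name 'awadmin'

# Pass the credential object itself, not the asset name
& .\RunStoredProcedure2.ps1 `
    -SqlServer 'adventureworksnh.database.windows.net' `
    -SqlServerPort 1433 `
    -Database 'AdventureWorksDW' `
    -StoredProcedureName 'TestJob1' `
    -SqlCredential $sqlCred
```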
Related
I am having trouble creating a release definition on Azure DevOps. I believe the Register-Environment function in PowerShellOnTargetMachines.ps1 is failing:
try
{
$connection = Get-VssConnection -TaskContext $distributedTaskContext
Write-Verbose "Starting Register-Environment cmdlet call for environment : $environmentName with filter $machineFilter"
$environment = Register-Environment -EnvironmentName $environmentName -EnvironmentSpecification $environmentName -UserName $adminUserName -Password $adminPassword -WinRmProtocol $protocol -TestCertificate ($testCertificate -eq "true") -Connection $connection -TaskContext $distributedTaskContext -ResourceFilter $machineFilter
Write-Verbose "Completed Register-Environment cmdlet call for environment : $environmentName"
Write-Verbose "Starting Get-EnvironmentResources cmdlet call on environment name: $environmentName"
$resources = Get-EnvironmentResources -Environment $environment
if ($resources.Count -eq 0)
{
Write-Telemetry "Input_Validation" "No machine exists for given environment"
throw (Get-LocalizedString -Key "No machine exists under environment: '{0}' for deployment" -ArgumentList $environmentName)
}
$resourcesPropertyBag = Get-ResourcesProperties -resources $resources
}
With the following error (I have omitted some of my organization's information, but it is there and looks right):
2019-09-04T12:34:55.6886629Z ##[debug]VssConnection created
2019-09-04T12:34:55.7518340Z ##[debug]Starting Register-Environment cmdlet call for environment : [machine] with filter [machine]
2019-09-04T12:34:55.7843531Z ##[debug]Begin Create-Environment cmdlet
2019-09-04T12:34:55.7872731Z ##[debug]UserName=[username]
2019-09-04T12:34:55.7878292Z ##[debug]WinRmProtocol=HTTP
2019-09-04T12:34:55.7878658Z ##[debug]TestCertificate=False
2019-09-04T12:34:55.7878965Z ##[debug]Unable to create a environment object for given json - Unexpected character encountered while parsing value: A. Path '', line 0, position 0.
2019-09-04T12:34:55.7879241Z ##[debug]projectName=[projectName]
2019-09-04T12:34:55.7879517Z ##[debug]Getting environment [machine] from DTL Service
2019-09-04T12:34:55.8485808Z ##[debug]Processed: ##vso[task.logissue type=error;code={"Task_Internal_Error":Page not found.};]
And I do not know what to do. Any ideas are appreciated. Thanks for taking the time to read my question.
Based on the details from the comment that Mike left, the issue should be caused by the task version you are using and the feature script we defined for it.
According to the ticket description you raised in DC ticket:
I am trying to use V2 of the following 'PowerShellOnTargetMachines'
PowerShell script for a release on azure devops, from
azure-pipeline-tasks.
the version of PowerShellOnTargetMachines you are using is 2.*.
We can get the Register-Environment cmdlet script with a decompile tool, following the method I showed in this case.
You can see that the parameters this cmdlet supports in PowerShellOnTargetMachines V2.* do not include TaskContext and Connection. That's why you receive the error message "Unable to create a environment object for given json - Unexpected character encountered while parsing value": the parameters the task passes in do not match the signature of the Register-Environment cmdlet.
You can try with PowerShellOnTargetMachines V1.*. In 1.* we support the parameters TaskContext and Connection.
My organization currently has 15 QA and 5 UAT SQL environments in total. As of now we manage all SQL Agent jobs using the multi-server administration feature of SQL Server on one target server. We are now planning to migrate the databases of all these SQL environments to Azure. In Azure we are using the SQL Database service, not any VM.
So, is there any feature or alternative solution to manage all jobs of all environments in one central location in Azure? All SQL jobs are of type T-SQL only.
You can use Azure Automation to centralize all your Azure SQL Database jobs in one place.
You can use the following PowerShell Workflow on Azure Automation to schedule execution of any stored procedure on any Azure SQL Database no matter on which Azure SQL logical server they reside.
workflow SQL_Agent_SprocJob
{
    [cmdletbinding()]
    param
    (
        # Fully-qualified name of the Azure DB server
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $SqlServerName,

        # Name of database to connect and execute against
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $DBName,

        # Name of stored procedure to be executed
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $StoredProcName,

        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [PSCredential] $Credential
    )

    inlinescript
    {
        Write-Output "JOB STARTING"

        # Setup variables
        $ServerName = $Using:SqlServerName
        $UserId = $Using:Credential.UserName
        $Password = ($Using:Credential).GetNetworkCredential().Password
        $DB = $Using:DBName
        $SP = $Using:StoredProcName

        # Create & open connection to the database
        $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $DatabaseConnection.ConnectionString = "Data Source=$ServerName;Initial Catalog=$DB;User ID=$UserId;Password=$Password;"
        $DatabaseConnection.Open()
        Write-Output "CONNECTION OPENED"

        # Create & define command and query text
        $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $DatabaseCommand.CommandType = [System.Data.CommandType]::StoredProcedure
        $DatabaseCommand.Connection = $DatabaseConnection
        $DatabaseCommand.CommandText = $SP
        Write-Output "EXECUTING QUERY"

        # Execute the query
        $DatabaseCommand.ExecuteNonQuery()

        # Close connection to DB
        $DatabaseConnection.Close()
        Write-Output "CONNECTION CLOSED"
        Write-Output "JOB COMPLETED"
    }
}
Use this step-by-step tutorial to get started.
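Once the workflow is published, it can be started on demand or from a schedule; a minimal sketch using the AzureRM Automation cmdlets (resource group, account, and asset names below are placeholders):

```powershell
# Placeholder names; substitute your own Automation account details
$params = @{
    SqlServerName  = 'myserver.database.windows.net'
    DBName         = 'MyDatabase'
    StoredProcName = 'dbo.MyNightlyJob'
    Credential     = 'MySqlCredentialAsset'   # credential asset name; resolved by Azure Automation
}
Start-AzureRmAutomationRunbook -ResourceGroupName 'MyRG' `
    -AutomationAccountName 'MyAutomationAccount' `
    -Name 'SQL_Agent_SprocJob' `
    -Parameters $params
```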
I have an issue with a PowerShell script block executed by a ServiceNow MID Server on a target host.
The target host runs SQL Server, where I need to execute a PS script that runs SQL commands and post-processes the results (using a JDBC activity is not valid in this case).
So I am running PowerShell on the MID server (a kind of proxy, for those who are not familiar with ServiceNow), and I need to execute the PS script as an encoded command via the Invoke-WmiMethod cmdlet (I cannot use Invoke-Command, because PS remoting is disabled and there is no way to enable it due to company policies), like below:
$Bytes=[System.Text.Encoding]::Unicode.GetBytes($CallArgumentsBlock)
# Command block is passed as encoded command
$EncodedCommand=[Convert]::ToBase64String($Bytes)
$WMIArgs = @{
    Class        = 'win32_process'
    Name         = 'Create'
    ComputerName = $computer
    ArgumentList = "powershell -EncodedCommand $EncodedCommand"
    ErrorAction  = 'SilentlyContinue'
    Credential   = $cred
}
Invoke-WmiMethod @WMIArgs | Out-Null
The issue is that I need to push pre-evaluated variables into $EncodedCommand. Example of the script block:
$CallArgumentsBlock={
Param(
[string] $SQLServer = "$inputSQLServer", #SQL Server Name running on host
[string] $SQLDBName = "$inputSQLDBName", #SQL DB name
[string] $InputData = "$inputInputData", #JSON string holding data processed by PS+SQL
[string] $ResultFolderPath = "$OutputPath_forSQL" #File path where to output results cause WMI can not return output of invoked command
)
$ProcessData = ConvertFrom-JSON -InputObject $InputData
#For each object call sql...
...
*PS code*
...
}
So what would be the PowerShell way to do this?
Or can anybody suggest a better ServiceNow-based solution?
Thanks a lot for your answers!
My initial thought is to remove the Param() statement and just list the variables, setting their values when creating the script block. Any reason this wouldn't work?
$CallArgumentsBlock={
[string] $SQLServer = "$inputSQLServer" #SQL Server Name running on host
[string] $SQLDBName = "$inputSQLDBName" #SQL DB name
[string] $InputData = "$inputInputData" #JSON string holding data processed by PS+SQL
[string] $ResultFolderPath = "$OutputPath_forSQL" #File path where to output results cause WMI can not return output of invoked command
$ProcessData = ConvertFrom-JSON -InputObject $InputData
#For each object call sql...
...
*PS code*
...
}
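One caveat with either form: variables inside a literal { } block are not expanded when the block is created, only when it runs on the target. To bake pre-evaluated values in before encoding, one approach is to build the script text as an expandable here-string and create the encoded command from that (a sketch; the variable names follow the question, and the remote-side logic is abbreviated):

```powershell
# Values known on the MID server expand into the string now;
# the backtick-escaped `$ProcessData is left for the target host to evaluate.
$scriptText = @"
`$ProcessData = ConvertFrom-Json -InputObject '$inputInputData'
# ... rest of the PS code, using '$inputSQLServer' and '$inputSQLDBName',
# writing results under '$OutputPath_forSQL' ...
"@

$Bytes = [System.Text.Encoding]::Unicode.GetBytes($scriptText)
$EncodedCommand = [Convert]::ToBase64String($Bytes)
```

If the JSON itself can contain single quotes, it would need escaping before being embedded in the here-string.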
How would you go on to secure your DSC configuration that's using credentials properly in Azure Automation?
E.g.:
configuration MyServer {
param(
[Parameter(Mandatory=$true)][PSCredential]$MyCredential
);
# Some configuration using credentials
}
Normally I'd set up a public key and a proper certificate installed on each node and pass along CertificateFile and Thumbprint to ConfigurationData when compiling the configuration documents.
In Azure Automation I can't find any good solution.
The documentation says Azure Automation encrypts the entire MOF on its own: https://azure.microsoft.com/en-us/documentation/articles/automation-certificates/ and the article specifies to use PSAllowPlainTextCredentials.
When you then register a node to its pull server and it pulls its configuration, anyone with local admin rights or read access to the temp location can read out the password in plain text after the configuration has been pulled down or updated. This is not good from a security perspective.
What I'd like, ideally would be to upload such a public key/certificate to the Azure Automation credentials and use it as a part of ConfigurationData when starting the compilation job.
However, today "CertificateFile" expects a path and not an AutomationCertificate, so I cannot see a way to start the compilation job with any public key present in Azure Automation. I can't see any way of referring to my certificate asset when running the job.
Any ideas if this is possible in the current state of Azure Automation and the way it works with DSC/pull, to secure it properly using either the asset store in Azure Automation or Azure Key Vault?
You should create an Azure Automation Credential and reference it in the configuration like so:
# Compile mof
$ConfigurationData = @{
    AllNodes = @(
        @{
            NodeName = $nodeName
            PSDscAllowPlainTextPassword = $true
        }
    )
}
$Parameters = @{
    "nodeName"   = $nodeName
    "credential" = $credName # Note: this is only the name of the Azure Automation Credential asset; Azure Automation will securely pull the credential and use it during compilation
}
Start-AzureRmAutomationDscCompilationJob -ResourceGroupName $ResourceGroupName -AutomationAccountName $AutomationAccountName `
-ConfigurationName $configurationName -Parameters $Parameters -ConfigurationData $ConfigurationData
You should not worry about the PSDscAllowPlainTextPassword since Azure Automation does encrypt everything at rest for you, it's just that DSC doesn't know that (so you have to supply that to the DSC engine).
And in DSC you should have something like:
Configuration name
{
Param (
[Parameter(Mandatory)][ValidateNotNullOrEmpty()][String]$nodeName,
[Parameter(Mandatory)][ValidateNotNullOrEmpty()][pscredential]$credential
)
Import-DscResource -Module modules
Node $nodeName {DOSTUFF}
}
The correct way to pass a credential to a DSC file from Azure Automation is to use an Azure Automation Credential.
Then inside your DSC file you use the command Get-AutomationPSCredential
Example:
Configuration BaseDSC
{
Import-DscResource -ModuleName xActiveDirectory
Import-DscResource -ModuleName PSDesiredStateConfiguration
Import-DscResource -ModuleName XNetworking
$Credential = Get-AutomationPSCredential -Name "CredentialName"
Node $AllNodes.Nodename
{ ...
The credential is stored encrypted in Azure Automation, and is put into the encrypted MOF file in Azure Automation when you run the compilation job.
Additionally, the password can be updated in Azure Automation and then updated in the MOFs by just recompiling.
The password is not able to be retrieved in clear text from Azure.
Use a secure credential to create a user on a Windows server and add it to the Administrators group using DSC:
**Solution (PowerShell DSC)**
First, create a credential in the Automation account from the Azure portal or using any Azure module:
Home > Resource group > ... > Automation account > Credentials
Configuration user_windows_user
{
param
(
[Parameter()][string]$username,
[Parameter()][string]$azurePasswordCred # name (string) of the credential asset
)
$passwordCred = Get-AutomationPSCredential -Name $azurePasswordCred
Node "localhost"
{
User UserAdd
{
Ensure = "Present" # To ensure the user account does not exist, set Ensure to "Absent"
UserName = $username
FullName = "$username-fullname"
PasswordChangeRequired = $false
PasswordNeverExpires = $false
Password = $passwordCred # This needs to be a credential object
}
Group AddtoAdministrators
{
GroupName = "Administrators"
Ensure = "Present"
MembersToInclude = @($username)
}
}
} # end of Configuration
#
$cd = @{
    AllNodes = @(
        @{
            NodeName = 'localhost'
            PSDscAllowPlainTextPassword = $true
        }
    )
}
Upload the DSC file in Azure Automation > Configurations.
Compile the configuration (provide the inputs: the username, and the credential asset name as a string).
Add the configuration to a node and wait for the configuration deployment.
I'm trying to pass a hash like {"server":"database","server2":"database_b"} as a parameter to a runbook on Microsoft Azure. But neither
[parameter(Mandatory=$true)] [hashtable]$ServersWithCorrespondingDatabase,
nor
[parameter(Mandatory=$true)] [object]$ServersWithCorrespondingDatabase,
seems to work?
In this example from the Runbook Gallery, they pass the argument ChildRunbookInputParams as a hashtable, like:
Start-ChildRunbook `
-ChildRunbookName "Update-VM" `
-ChildRunbookInputParams @{'VMName'='VM204';'Retries'=3} `
-AzureConnectionName "Visual Studio Ultimate with MSDN" `
-AutomationAccountName "Contoso IT Automation Production" `
-WaitForJobCompletion $true `
-ReturnJobOutput $true `
-JobPollingIntervalInSeconds 20 `
-JobPollingTimeoutInSeconds 120
But somehow I cannot pass the string @{"Server"="DB";"Server2"="DB3"} to my Azure runbook as a parameter... Any idea?
In the runbook, you should define the parameter as [object] type, not [hashtable]. Are you starting the runbook through the Azure Automation portal, or inline via another runbook?
Assuming this is your runbook:
workflow a {
param(
[object] $obj
)
$obj
}
If via the portal, you need to specify the object parameter as JSON, ex:
{"server":"database","server2":"database_b"}
If via a runbook calling this runbook inline, specify it as a PowerShell object or hashtable:
a -Obj @{"server"="database";"server2"="database_b"}
For more info see http://azure.microsoft.com/blog/2014/08/12/azure-automation-runbook-input-output-and-nested-runbooks/
Edit: If you are trying to start this runbook specifically by calling the Start-ChildRunbook runbook from within another runbook, it would look like this. The Start-ChildRunbook runbook takes the parameters of the runbook to start as a hashtable, and in this case the value of one of those parameters is itself a hashtable/object:
$ValueForObjParam = @{"server"="database";"server2"="database_b"}
Start-ChildRunbook `
-ChildRunbookName "a" `
-ChildRunbookInputParams @{"obj"=$ValueForObjParam} `
-AzureConnectionName "Visual Studio Ultimate with MSDN" `
-AutomationAccountName "Contoso IT Automation Production" `
-WaitForJobCompletion $true `
-ReturnJobOutput $true `
-JobPollingIntervalInSeconds 20 `
-JobPollingTimeoutInSeconds 120
The script you refer to in the gallery uses a hashtable to specify all the params to the child runbook, not a single param of type hashtable. So, the child runbook probably has a param block like so:
workflow start-update-vm
{
param ($VMName, $retries)
#rest of code
}
You can, however, pass a string of JSON as input and it will be converted into a hashtable/array, as described here: http://blogs.technet.com/b/orchestrator/archive/2014/01/10/sma-capabilities-in-depth-runbook-input-output-and-nested-runbooks.aspx.
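As a sketch of that approach, the runbook can also accept the JSON as a plain string and convert it itself (the parameter name here is illustrative; ConvertFrom-Json is standard PowerShell):

```powershell
param(
    [parameter(Mandatory=$true)]
    [string] $ServersJson   # e.g. '{"server":"database","server2":"database_b"}'
)

# Convert the JSON string into an object and walk its properties
$servers = ConvertFrom-Json -InputObject $ServersJson
foreach ($prop in $servers.PSObject.Properties) {
    Write-Output "Server '$($prop.Name)' uses database '$($prop.Value)'"
}
```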