I'm deploying a DACPAC with the following post-deployment script (script 1):
ALTER DATABASE [$(DatabaseName)]
MODIFY FILE (NAME = [$(DatabaseName)],
SIZE = 100MB, MAXSIZE = UNLIMITED, FILEGROWTH = 20%)
However, when I parameterize the size in the post-deployment script (script 2), deployment fails:
ALTER DATABASE [$(DatabaseName)]
MODIFY FILE (NAME = [$(DatabaseName)],
SIZE = $(size), MAXSIZE = UNLIMITED, FILEGROWTH = 20%)
So my question is: what is the correct syntax for passing the size?
I use the following commands to initiate the deployment (script 1 works, script 2 doesn't):
$resourceGroup = "SQL1"
$vmName = "SQL1"
$storageName = "sql1artifactsstorage111"
$ConfigurationPath = "C:\DSC\Ext\deployDB.ps1"
$ConfigurationName = "deployDB"
$configurationArchive = "deployDB.ps1.zip"
#Publish the configuration script into user storage
Publish-AzureRmVMDscConfiguration -ConfigurationPath $ConfigurationPath -ResourceGroupName $resourceGroup -StorageAccountName $storageName -force
#Set the VM to run the DSC configuration
$configurationArguments = @{
Credential = Get-Credential;
DatabaseName = 'Database1'
size = '100MB'
}
Set-AzureRmVmDscExtension -Version 2.22 -Name dscExtension -ResourceGroupName $resourceGroup -VMName $vmName -ArchiveStorageAccountName $storageName -ArchiveBlobName $configurationArchive -AutoUpdate:$true -ConfigurationName $ConfigurationName -ConfigurationArgument $configurationArguments
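The deployDB configuration itself isn't shown in the question. For context, here is a minimal sketch of how such a configuration's deployment step might forward these arguments to a DACPAC publish; the SqlPackage path, dacpac location, and target server are assumptions, and only the /Variables (/v:) switch for SQLCMD variables is standard SqlPackage syntax:
# Hypothetical sketch, not the OP's deployDB.ps1: forward the DSC arguments to
# SqlPackage.exe as SQLCMD variables so $(DatabaseName) and $(size) resolve in
# the post-deployment script.
param (
    [string] $DatabaseName = 'Database1',
    [string] $size = '100MB'
)
& 'C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe' `
    /Action:Publish `
    /SourceFile:"C:\Artifacts\$DatabaseName.dacpac" `
    /TargetServerName:localhost `
    /TargetDatabaseName:$DatabaseName `
    /v:DatabaseName=$DatabaseName `
    /v:size=$size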
This is the error DSC shows:
Dac Deploy Failed: 'Exception calling "Deploy"
with "3" argument(s): "An error occurred during deployment plan generation.
Deployment cannot continue."'
Dynamic SQL is the last refuge of the scoundrel:
EXEC ('ALTER DATABASE [$(DatabaseName)] MODIFY FILE (NAME = [$(DatabaseName)],SIZE = $(size), MAXSIZE = UNLIMITED, FILEGROWTH = 20%)');
GO
worked for me, albeit in a local deployment rather than through Azure DSC.
This is a general-purpose technique for these "DBA-esque" activities, which often don't support T-SQL variable substitution.
Note to future readers: this trick apparently worked, which is to say the OP has marked it as the answer. However, it was probably only necessary due to differing versions of DacFX between the workstation and the build server, see for example another SO post relating to the same error message.
Related
The PowerShell function Invoke-AzOperationalInsightsQuery (docs, source) allows me to pass an arbitrary KQL string to a Log Analytics workspace.
Set-AzContext -Subscription "my-sub"
$workspaceName = "vl-loganalytics-workspace"
$workspaceRG = "vl-loganalytics"
$WorkspaceID = (Get-AzOperationalInsightsWorkspace -Name $workspaceName -ResourceGroupName $workspaceRG).CustomerID
$query = "print current_cluster_endpoint()"
$kqlQuery = Invoke-AzOperationalInsightsQuery -WorkspaceId $WorkspaceID -Query $query
$kqlQuery.Results
HT learningbydoing.cloud
Does an equivalent method exist for querying an Azure Data Explorer cluster directly? No public function is listed in the Az.Kusto module as of version 2.1.0, but perhaps there is a community module or blog post documenting an ad-hoc method for this?
Referencing Kusto .NET client libraries from PowerShell, this is possible with the helper code below, after downloading and unzipping the Microsoft.Azure.Kusto.Tools NuGet package (one way to fetch the package is sketched first).
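For example, one way to download and unpack the package; the nuget.org download endpoint is real, but the file and folder names are illustrative:
# Download the latest Microsoft.Azure.Kusto.Tools package; a .nupkg is a zip
# archive, so saving it with a .zip extension lets Expand-Archive open it.
Invoke-WebRequest -Uri 'https://www.nuget.org/api/v2/package/Microsoft.Azure.Kusto.Tools' `
    -OutFile 'Microsoft.Azure.Kusto.Tools.zip'
Expand-Archive -Path 'Microsoft.Azure.Kusto.Tools.zip' -DestinationPath '.'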
$clusterUrl = 'https://help.kusto.windows.net;Fed=True'
# DatabaseName may be null, empty, or incorrect
$databaseName = 'Samples'
$query = 'print current_cluster_endpoint()'
$packagesRoot = Resolve-Path "tools\net6.0"
[System.Reflection.Assembly]::LoadFrom("$packagesRoot\Kusto.Data.dll")
$kcsb = New-Object Kusto.Data.KustoConnectionStringBuilder ($clusterUrl, $databaseName)
$queryProvider = [Kusto.Data.Net.Client.KustoClientFactory]::CreateCslQueryProvider($kcsb)
$crp = New-Object Kusto.Data.Common.ClientRequestProperties
$crp.ClientRequestId = "MyPowershellScript.ExecuteQuery." + [Guid]::NewGuid().ToString()
$crp.SetOption([Kusto.Data.Common.ClientRequestProperties]::OptionServerTimeout, [TimeSpan]::FromSeconds(30))
$reader = $queryProvider.ExecuteQuery($query, $crp)
$dataTable = [Kusto.Cloud.Platform.Data.ExtendedDataReader]::ToDataSet($reader).Tables[0]
$dataView = New-Object System.Data.DataView($dataTable)
$dataView
Note that $databaseName does not need to correspond to an existing database for the connection to be established. This can cause errors if you typo a database name, or it can be helpful if the command you wish to execute does not require a database context.
When I run the following script to create a Scheduled Task, I receive the error
Bad type (Exception from HRESULT: 0x80020005 (DISP_E_TYPEMISMATCH))
$u = "domain\$env:username"
$p = "SomePassword"
$UserPass = ConvertTo-SecureString $p -AsPlainText -Force
$TaskName = "ML"
$TaskDescr = "Descriptor"
$TaskCommand = "$pos\$nm"
$TaskStartTime = [datetime]::Now.AddMinutes(5)
$service = new-object -ComObject("Schedule.Service")
$service.Connect()
$rtFr = $service.GetFolder("\")
$TaskDefinition = $service.NewTask(0)
$TaskDefinition.RegistrationInfo.Description = "$TaskDescr"
$TaskDefinition.Settings.Enabled = $true
$TaskDefinition.Settings.AllowDemandStart = $true
$triggers = $TaskDefinition.Triggers
$dd = "T"
$vv = "yyyy-MM-dd"
$xx = "HH:mm:ss"
$pr = "$vv$dd$xx"
$trigger = $triggers.Create(9)
$trigger.StartBoundary = $TaskStartTime.ToString($pr)
$trigger.Enabled = $true
$Action = $TaskDefinition.Actions.Create(0)
$action.Path = "$TaskCommand"
$rtFr.RegisterTaskDefinition("$TaskName",$TaskDefinition,6,$u,$UserPass,5)
There is too much to say about your script, but let's focus on your issue:
Exception from HRESULT: 0x80020005 (DISP_E_TYPEMISMATCH)
This error is caused by the fact that the password parameter of $rtFr.RegisterTaskDefinition requires clear text, not a SecureString. So change your script as follows:
Use $p instead of $UserPass
Change trigger type from 9 to 1 (see below why)
Change logon type from 5 to 6 (see below why)
Modifications:
# Change these lines
$trigger = $triggers.Create(9)
$rtFr.RegisterTaskDefinition("$TaskName",$TaskDefinition,6,$u,$UserPass,5)
# To this
$trigger = $triggers.Create(1)
$rtFr.RegisterTaskDefinition($TaskName,$TaskDefinition,6,$u,$p,6)
refs:
https://learn.microsoft.com/en-us/windows/win32/taskschd/taskfolder-registertaskdefinition
https://learn.microsoft.com/en-us/windows/win32/taskschd/triggercollection-create
This is enough for your script to be executed without errors
...
Now, if you don't change the trigger type from 9 to 1, you will encounter this error:
Exception from HRESULT: 0x80070005 (E_ACCESSDENIED)
You are trying to create a task that will be executed at logon ($trigger = $triggers.Create(9)), which requires administrative privileges, so you must run your script from an elevated PowerShell session (run as Administrator).
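A standard way to verify elevation before running the script (not part of the OP's code):
# Returns $true when the current PowerShell session runs elevated (as Administrator).
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($identity)
$principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)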
And finally, if you don't change the logon type from 5 to 6, you will encounter this error:
(XX,XX):UserId: At :XX char:XX
You are trying to create a task with logon type TASK_LOGON_SERVICE_ACCOUNT without specifying which service account to use. Also, you are passing username and password parameters to create the task; with a service-account logon type they should be empty/null.
# Change this line
$rtFr.RegisterTaskDefinition($TaskName,$TaskDefinition,6,$u,$p,6)
# To this
$TaskDefinition.Principal.UserId = "S-1-5-18" # i.e Local System Account
$rtFr.RegisterTaskDefinition($TaskName,$TaskDefinition,6,$null,$null,5)
Conclusion:
I don't know what you are trying to achieve, but you must be consistent across all parameters for the task creation to execute properly.
I would advise you to use SCHTASKS.exe instead of the COM object; it is simpler, and it's available on Windows 7 and higher (a hypothetical equivalent is sketched below).
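For example, a one-shot task roughly equivalent to the script above; the task name, command path, and start time are illustrative assumptions:
# Hypothetical SCHTASKS.exe equivalent: register a task that runs once at 14:30
# under the given account. /RP takes the clear-text password.
schtasks.exe /Create /TN "ML" /TR "C:\Path\To\MyProgram.exe" `
    /SC ONCE /ST 14:30 /RU "domain\$env:username" /RP "SomePassword"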
I am doing batch uploads from a CSV file to Azure Table storage through a PowerShell script, and I have this command: $table.CloudTable.ExecuteBatch($batchOperation)
for which I'm getting the error mentioned in the title of my post. I believe that ExecuteBatch is a method in the old AzureRm module and not the newer Az module which I am using, which is causing it to break. Is there a corresponding method in the Az module for ExecuteBatch?
According to my test, if we use the new Azure PowerShell module Az to manage Azure Table storage, we need to use the SDK Microsoft.Azure.Cosmos.Table.
So if you want to use the ExecuteBatch method, you need to create the batch with [Microsoft.Azure.Cosmos.Table.TableBatchOperation] $batchOperation = New-Object -TypeName Microsoft.Azure.Cosmos.Table.TableBatchOperation. For example:
Connect-AzAccount
$ResourceGroupName = "testfun06"
$StorageAccountName="testfun06bf01"
$TableName="People"
$keys=Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -Name $StorageAccountName
$ctx = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $keys[0].Value
$table = Get-AzStorageTable -Name $TableName -Context $ctx
$e = New-Object Microsoft.Azure.Cosmos.Table.DynamicTableEntity("Jim","test")
$e1 = New-Object Microsoft.Azure.Cosmos.Table.DynamicTableEntity("Jim","test1")
[Microsoft.Azure.Cosmos.Table.TableBatchOperation] $batchOperation = New-Object -TypeName Microsoft.Azure.Cosmos.Table.TableBatchOperation
$batchOperation.InsertOrMerge($e)
$batchOperation.InsertOrMerge($e1)
$table.CloudTable.ExecuteBatch($batchOperation)
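One general Table storage caveat, not specific to this script: a single batch may contain at most 100 operations, and every entity in the batch must share the same partition key. A hedged sketch of chunking a larger entity set, with illustrative variable names ($entities is assumed to hold DynamicTableEntity objects with one PartitionKey):
# Sketch: split $entities (all sharing one PartitionKey) into batches of at most
# 100 operations, the Table service limit for a single ExecuteBatch call.
$batchSize = 100
for ($i = 0; $i -lt $entities.Count; $i += $batchSize) {
    $batch = New-Object Microsoft.Azure.Cosmos.Table.TableBatchOperation
    $upper = [Math]::Min($i + $batchSize, $entities.Count) - 1
    $entities[$i..$upper] | ForEach-Object { $batch.InsertOrMerge($_) }
    $table.CloudTable.ExecuteBatch($batch) | Out-Null
}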
I am attempting to loop through an Invoke-Sqlcmd for multiple Azure SQL databases via Azure Automation. The first item in the loop executes, but all the rest fail with:
Invoke-Sqlcmd : A network-related or instance-specific error occurred
while establishing a connection to SQL Server. The server was not
found or was not accessible. Verify that the instance name is correct
and that SQL Server is configured to allow remote connections.
(provider: Named Pipes Provider, error: 40 - Could not open a
connection to SQL Server)
I am guessing that I need to close the connection from the first Invoke-Sqlcmd before executing the next, but I have not found a direct method to accomplish that with Invoke-Sqlcmd. Here is my loop:
param(
# Parameters to Pass to PowerShell Scripts
[parameter(Mandatory=$true)][String] $azureSQLServerName = "myazuresql",
[parameter(Mandatory=$true)][String] $azureSQLCred = "myazureautosqlcred"
)
# DB Name Array
$dbnamearray = @("database1","database2","database3")
$dbnamearray
# Datatable Name
$tabName = "RunbookTable"
#Create Table object
$table = New-Object system.Data.DataTable "$tabName"
#Define Columns
$col1 = New-Object system.Data.DataColumn dbname,([string])
#Add the Columns
$table.columns.add($col1)
# Add Row and Values for dname Column
ForEach ($db in $dbnamearray)
{
$row = $table.NewRow()
$row.dbname = $db
#Add the row to the table
$table.Rows.Add($row)
}
#Display the table
$table | format-table -AutoSize
# Loop through the datatable using the values per column
$table | ForEach-Object {
# Set loop variables as these are easier to pass than $_.
$azureSQLDatabaseName = $_.dbname
# Execute SQL Query Against Azure SQL
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
$Cred = Get-AutomationPSCredential -Name $azureSQLCred
$SQLOutput = $(Invoke-Sqlcmd -ServerInstance $azureSQLServerName -Username $Cred.UserName -Password $Cred.GetNetworkCredential().Password -Database $azureSQLDatabaseName -Query "SELECT * FROM INFORMATION_SCHEMA.TABLES " -QueryTimeout 65535 -ConnectionTimeout 60 -Verbose) 4>&1
Write-Output $SQLOutput
}
You can try making each connection a PowerShell job. This solved a very similar issue I had some time ago; see "Send-MailMessage closes every 2nd connection when using attachments" if you want to read an explanation. Basically, if you're unable to use a .Close() method, you can force connections to close by terminating the entire session for each run. In an ideal world the cmdlet would handle all this for you, but not everything was created perfectly.
# Loop through the datatable using the values per column
$table | ForEach-Object {
# Set loop variables as these are easier to pass than $_.
$azureSQLDatabaseName = $_.dbname
# Execute SQL Query Against Azure SQL
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
$Cred = Get-AutomationPSCredential -Name $azureSQLCred
# Pass the server, credential, and database name via -ArgumentList and start the job.
# Note: the database name goes in as $args[2]; reusing $args[0] there would try to
# query a database named after the server.
Start-Job -ScriptBlock { Write-Output $(Invoke-Sqlcmd -ServerInstance $args[0] -Username $args[1].UserName -Password $args[1].GetNetworkCredential().Password -Database $args[2] -Query "SELECT * FROM INFORMATION_SCHEMA.TABLES" -QueryTimeout 65535 -ConnectionTimeout 60 -Verbose) 4>&1 } -ArgumentList $azureSQLServerName, $Cred, $azureSQLDatabaseName | Wait-Job | Receive-Job
}
This is untested since I do not have a server to connect to, but perhaps with a bit of work you can make something out of it.
I faced the same issue previously while doing something with an Azure SQL database. You can try this:
1. Create Automation Account
New-AzureRmAutomationAccount -ResourceGroupName $resourceGroupName -Name $automationAccountName -Location $location
2. Set the Automation account to work with
Set-AzureRmAutomationAccount -Name $automationAccountName -ResourceGroupName $resourceGroupName
3. Create / Import a Runbook
Here we already have a runbook ready, so we import it. Here's the runbook code:
workflow runbookNameValue
{
inlinescript
{
$MasterDatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
$MasterDatabaseConnection.ConnectionString = "ConnectionStringValue"
# Open connection to Master DB
$MasterDatabaseConnection.Open()
# Create command
$MasterDatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
$MasterDatabaseCommand.Connection = $MasterDatabaseConnection
$MasterDatabaseCommand.CommandText = "Exec stored procedure"
# Execute the query
$MasterDatabaseCommand.ExecuteNonQuery()
# Close connection to Master DB
$MasterDatabaseConnection.Close()
}
}
4. Importing
Import-AzureRMAutomationRunbook -Name $runBookName -Path $scriptPath -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName -Type PowerShell
I hope this helps. Instead of using Invoke-Sqlcmd, use $MasterDatabaseCommand.ExecuteNonQuery() as I've shown in the runbook. It will work.
It seems that you append .database.windows.net to the server name inside the loop. I guess that's why it works for the first iteration only.
Just move this line:
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
before this line:
$table | ForEach-Object {
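Put together, a trimmed sketch of the reordered loop (the same calls as the OP's script, with the concatenation and credential lookup hoisted out):
# Build the FQDN and fetch the credential once, outside the loop.
$azureSQLServerName = $azureSQLServerName + ".database.windows.net"
$Cred = Get-AutomationPSCredential -Name $azureSQLCred
$table | ForEach-Object {
    $azureSQLDatabaseName = $_.dbname
    Invoke-Sqlcmd -ServerInstance $azureSQLServerName `
        -Username $Cred.UserName `
        -Password $Cred.GetNetworkCredential().Password `
        -Database $azureSQLDatabaseName `
        -Query "SELECT * FROM INFORMATION_SCHEMA.TABLES" `
        -QueryTimeout 65535 -ConnectionTimeout 60
}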
When creating a new Slot for an Azure WebApp, how can I successfully change one or more of the AppSettings?
The docs for New-AzureRmWebAppSlot suggest that there is a parameter called -AppSettingsOverrides, but that does not work.
It should be noted however that the linked docs seem to incorrectly reference the New-AzureRmWebApp Cmdlet, so I can't be sure if the parameter is actually valid (although it seems to be accepted without error).
Here is the code that I am running.
New-AzureRmWebAppSlot -ResourceGroupName $resourceGroupName -Name $webAppName -Slot $slotName -AppSettingsOverrides @{"FUNCTION_APP_EDIT_MODE" = "readwrite"} -ErrorAction Stop
Has anyone else experienced this seemingly incorrect behaviour, and if so, how did you fix it?
My Azure version is 3.5.0.
You could create the slot first, then use Set-AzureRmWebAppSlot to change the app settings. The following script works for me.
$myResourceGroup = "shuiapp"
$mySite = "shuicli"
$slotName = "Test1"
$webApp = Get-AzureRMWebAppSlot -ResourceGroupName $myResourceGroup -Name $mySite -Slot $slotName
$appSettingList = $webApp.SiteConfig.AppSettings
$hash = @{}
ForEach ($kvp in $appSettingList) {
$hash[$kvp.Name] = $kvp.Value
}
$hash['ExistingKey2'] = "NewValue12"
Set-AzureRMWebAppSlot -ResourceGroupName $myResourceGroup -Name $mySite -AppSettings $hash -Slot $slotName
This question will be helpful.
Credit to dingmeng-xue on GitHub for pointing this out, but for the benefit of everyone reading Stack Overflow: it appears that the cmdlet a) doesn't check that -SourceWebApp and -AppSettingsOverrides are set together, and b) only does a null check if $SourceWebApp is defined, silently doing nothing even if $AppSettingsOverrides was defined. So you might set them both and test whether that solves the issue (sketched below).
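A hedged sketch of that combination, reusing the variable names from the question; whether it resolves the behaviour in a given module version is untested here:
# Assumption: -AppSettingsOverrides appears to be applied only when -SourceWebApp
# is also supplied, so pass the existing web app as the source.
$sourceApp = Get-AzureRmWebApp -ResourceGroupName $resourceGroupName -Name $webAppName
New-AzureRmWebAppSlot -ResourceGroupName $resourceGroupName -Name $webAppName -Slot $slotName `
    -SourceWebApp $sourceApp `
    -AppSettingsOverrides @{ "FUNCTION_APP_EDIT_MODE" = "readwrite" } `
    -ErrorAction Stop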