My organization currently has 15 QA and 5 UAT SQL Server environments in total. Today we manage all SQL Agent jobs across them with SQL Server's multiserver administration feature, with one server acting as the central server. We are now planning to migrate all of these environments' databases to Azure, using the Azure SQL Database service rather than SQL Server on a VM.
Is there a feature or alternative solution in Azure for managing all jobs of all environments from one central location? All of our jobs are plain T-SQL job steps.
You can use Azure Automation to centralize all your Azure SQL Database jobs in one place.
You can use the following PowerShell workflow in Azure Automation to schedule execution of any stored procedure on any Azure SQL database, no matter which Azure SQL logical server it resides on.
workflow SQL_Agent_SprocJob
{
    [cmdletbinding()]
    param
    (
        # Fully-qualified name of the Azure SQL logical server
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $SqlServerName,

        # Name of the database to connect and execute against
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $DBName,

        # Name of the stored procedure to be executed
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $StoredProcName,

        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [PSCredential] $Credential
    )

    inlinescript
    {
        Write-Output "JOB STARTING"

        # Set up variables
        $ServerName = $Using:SqlServerName
        $UserId = $Using:Credential.UserName
        $Password = ($Using:Credential).GetNetworkCredential().Password
        $DB = $Using:DBName
        $SP = $Using:StoredProcName

        # Create and open a connection to the database
        $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $DatabaseConnection.ConnectionString = "Data Source = $ServerName; Initial Catalog = $DB; User ID = $UserId; Password = $Password;"
        $DatabaseConnection.Open()
        Write-Output "CONNECTION OPENED"

        # Create and define the command
        $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $DatabaseCommand.CommandType = [System.Data.CommandType]::StoredProcedure
        $DatabaseCommand.Connection = $DatabaseConnection
        $DatabaseCommand.CommandText = $SP
        Write-Output "EXECUTING QUERY"

        # Execute the stored procedure
        $DatabaseCommand.ExecuteNonQuery()

        # Close the connection to the database
        $DatabaseConnection.Close()
        Write-Output "CONNECTION CLOSED"
        Write-Output "JOB COMPLETED"
    }
}
Use this step-by-step tutorial to get started.
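Once the runbook is published, you can attach an Azure Automation schedule to it so it behaves like a recurring agent job. Here is a minimal sketch using the AzureRM Automation cmdlets; the resource group, account name, schedule name, and parameter values are all placeholders. Note that when Azure Automation starts the runbook, the credential asset name can be passed as the value of the PSCredential parameter and is resolved automatically.
# Create an hourly schedule (names and values below are assumptions; adjust to your account)
$schedule = New-AzureRmAutomationSchedule -ResourceGroupName "MyRG" `
    -AutomationAccountName "MyAutomationAccount" -Name "HourlySprocRun" `
    -StartTime (Get-Date).AddMinutes(10) -HourInterval 1

# Bind the schedule to the workflow above, passing its parameters
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "MyRG" `
    -AutomationAccountName "MyAutomationAccount" -RunbookName "SQL_Agent_SprocJob" `
    -ScheduleName "HourlySprocRun" `
    -Parameters @{ SqlServerName = "myserver.database.windows.net";
                   DBName = "MyDb"; StoredProcName = "dbo.MyProc"; Credential = "SqlCredAsset" }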
Related
Is there any method to deploy Power BI reports to Power BI Report Server without having to manually copy the files, upload them to the server, and finally change the data source connection information for each report on a report-by-report basis? Doing this at every customer site is not practical.
E.g. the Power BI report file 'Report_1' needs to be deployed to customer servers S1, S2, S3, and so on.
Today we do exactly that by hand: copy the files, upload them to the server, and change the data source connection information report by report.
How can we automate the deployment of PBIX reports to Power BI Report Server and change the data source connection string programmatically?
Microsoft is releasing a feature in January 2020 to update the connection string using the API.
But is there any way to do it in 2019? Any other way to update the connection string?
Microsoft Link
I finally came up with a trick to update the connection string in Power BI Report Server.
First, install the ReportingServicesTools module in PowerShell.
The Microsoft API doesn't give you the ability to update the connection string, but it does let you update the username.
Both the username and the connection string are stored in encrypted form in the report server database.
So the logic is: pass the connection string in as the username, then copy the encrypted string into the connection string column in the database.
Just check the example below, which implements this trick. Thank you.
# Code by SB, 2019
$ReportServerURI = 'http://localhost/PowerBIReports'   # Report portal URL of the Power BI Report Server
$filePath = "C:\12.pbix"                               # Local path of the PBIX file
$PBIxfileName = "12"                                   # Power BI file name (without extension)
$FolderName = 'NewDataset'                             # Report server folder to deploy to
$Username = 'admin'
$password = 'password'
$ReportServerName = 'localhost\SQL2017'                # SQL Server instance hosting the report server database
$ReportServerDatabase = 'ReportServerPowerBI'          # Report server database name
$ConnectionString = 'data source=Client01\SQL2019;initial catalog=Client_SB_1'  # New/client connection string
$FolderLocation = '/'
$FolderPath = $FolderLocation + $FolderName

write-host "Deployment Started ..." -ForeGroundColor Yellow
$session = New-RsRestSession -ReportPortalUri $ReportServerURI
Write-RsRestCatalogItem -WebSession $session -Path $filePath -RsFolder $FolderPath -Overwrite

# Trick part 1: write the connection string into the Username field via the supported API
$datasources = Get-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIxfileName"
$datasources[0].DataModelDataSource.AuthType = 'Windows'
$datasources[0].DataModelDataSource.Username = $ConnectionString
$datasources[0].DataModelDataSource.Secret = $password
Set-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIxfileName" -RsItemType PowerBIReport -DataSources $datasources

# Trick part 2: copy the encrypted value from the Username column into the ConnectionString column
$ID = $datasources[0].Id
$Query = "UPDATE [DataModelDataSource] SET ConnectionString = Username FROM [dbo].[DataModelDataSource] WHERE DataSourceID = '" + $ID + "'"
Invoke-Sqlcmd -Query $Query -ServerInstance $ReportServerName -Database $ReportServerDatabase

# Restore the real username and secret
$datasources = Get-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIxfileName"
$datasources[0].DataModelDataSource.Username = $Username
$datasources[0].DataModelDataSource.Secret = $password
Set-RsRestItemDataSource -WebSession $session -RsItem "$FolderPath/$PBIxfileName" -RsItemType PowerBIReport -DataSources $datasources

write-host "Deployment Done . . ." -ForeGroundColor Green
This would only work if the change you need can be driven by a parameter, e.g. for a SQL Server source you can set the database, schema, or table name (but not the server name).
First I would set up the query definitions to use query parameter(s) and test. The specifics depend on your data sources and scenario - you have not provided any info on that.
Then I would call the appropriate REST API Update Parameters method - probably the Group version, as sketched below.
https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/updateparametersingroup
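If you go the REST API route (note this is the Power BI service API), a minimal sketch using Invoke-RestMethod might look like the following; the group ID, dataset ID, parameter name, and access token are all placeholders you would supply yourself:
# Placeholders: supply your own token, IDs, and parameter name/value
$accessToken = "<AAD access token>"
$groupId   = "00000000-0000-0000-0000-000000000000"
$datasetId = "11111111-1111-1111-1111-111111111111"
$body = @{
    updateDetails = @(
        @{ name = "ServerName"; newValue = "Client01\SQL2019" }
    )
} | ConvertTo-Json -Depth 3

Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/groups/$groupId/datasets/$datasetId/Default.UpdateParameters" `
    -Headers @{ Authorization = "Bearer $accessToken" } `
    -ContentType "application/json" `
    -Body $body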
You can deploy to Power BI Report Server, and change connections and other settings, using PowerShell with the ReportingServicesTools library. As Power BI Report Server is essentially SSRS, you can use the same tools to load reports, change data connections, etc.
Example of deploying a file and here
You can also change the connection settings directly in the PBIX file. If you change the extension from .pbix to .zip, you can take a look inside.
If you open the 'Connections' entry, it contains the settings in a JSON-structured file:
{"Version":1,"Connections":[{"Name":"EntityDataSource","ConnectionString":"Data Source=asazure://region.asazure.windows.net/somecubegoeshere;Initial Catalog=SmartSpacesAnalysis;Cube=SmartSpacesModel","ConnectionType":"analysisServicesDatabaseLive"}]}
That can be read and changed if needed, as sketched below.
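Here is a minimal sketch of editing that entry in place with .NET's zip support (PBIX files use the ZIP container format); the file path and the replacement connection string are assumptions:
Add-Type -AssemblyName System.IO.Compression.FileSystem

$pbixPath = "C:\Reports\test.pbix"   # hypothetical path
$zip = [System.IO.Compression.ZipFile]::Open($pbixPath, [System.IO.Compression.ZipArchiveMode]::Update)
try {
    # Read the Connections entry as JSON
    $entry = $zip.GetEntry("Connections")
    $reader = New-Object System.IO.StreamReader($entry.Open())
    $json = $reader.ReadToEnd() | ConvertFrom-Json
    $reader.Close()

    # Point the first connection at the new server/catalog (placeholder values)
    $json.Connections[0].ConnectionString = "Data Source=Client01\SQL2019;Initial Catalog=Client_SB_1"

    # Rewrite the entry: delete and recreate it with the updated JSON
    $entry.Delete()
    $newEntry = $zip.CreateEntry("Connections")
    $writer = New-Object System.IO.StreamWriter($newEntry.Open())
    $writer.Write(($json | ConvertTo-Json -Depth 5 -Compress))
    $writer.Close()
}
finally {
    $zip.Dispose()
}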
Is it possible to decrease the instance count of an application hosted in Service Fabric without having to redeploy the whole package?
When I deployed it, the application instance count was set to -1; now I would like to reduce it to 3.
Here is the PowerShell script I'm using. It translates the application parameters XML into a PowerShell hashtable and runs Start-ServiceFabricApplicationUpgrade with it:
Param
(
    [Parameter(Mandatory=$true)]
    [String]
    $ApplicationName,

    [Parameter(Mandatory=$true)]
    [String]
    $ApplicationTypeVersion,

    [Parameter(Mandatory=$true)]
    [String]
    $ApplicationParameterFile
)

function Read-ApplicationParameters
{
    Param (
        [ValidateScript({Test-Path $_ -PathType Leaf})]
        [String]
        $ApplicationParameterFile
    )

    # Flatten the <Parameters> XML into a hashtable of name/value pairs
    $applicationParameterXml = [Xml] (Get-Content $ApplicationParameterFile)
    $applicationParameter = @{}
    $applicationParameterXml.Application.Parameters.ChildNodes | Foreach {$applicationParameter[$_.Name] = $_.Value}
    return $applicationParameter
}

$appParams = Read-ApplicationParameters $ApplicationParameterFile

Start-ServiceFabricApplicationUpgrade -ApplicationName $ApplicationName -ApplicationTypeVersion $ApplicationTypeVersion -ApplicationParameter $appParams -UnmonitoredAuto
1. Take the latest application parameters XML file.
2. Modify the values you want to update (leave the others untouched).
3. Connect to the cluster.
4. Run Get-ServiceFabricApplication.
5. Run the script above, taking the ApplicationName and ApplicationTypeVersion values from the output of step 4; ApplicationParameterFile is the path to the newly modified XML file (see the sample invocation below).
6. Monitor the upgrade by using Get-ServiceFabricApplicationUpgrade -ApplicationName fabric:/MyApp.
7. Once it is done, verify the parameters by running Get-ServiceFabricApplication again.
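For step 5, a sample invocation might look like this; the cluster endpoint, script file name, application name, version, and file path are all placeholders:
# Hypothetical example; save the script above as Update-AppParameters.ps1
Connect-ServiceFabricCluster -ConnectionEndpoint "mycluster.westus.cloudapp.azure.com:19000"

.\Update-AppParameters.ps1 -ApplicationName "fabric:/MyApp" `
    -ApplicationTypeVersion "1.0.0" `
    -ApplicationParameterFile ".\ApplicationParameters\Cloud.xml"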
Background
I have one Azure Runbook (named RunStoredProcedure2) defined as follows:
param(
    [parameter(Mandatory=$True)]
    [string] $SqlServer,

    [parameter(Mandatory=$False)]
    [int] $SqlServerPort = 1433,

    [parameter(Mandatory=$True)]
    [string] $Database,

    [parameter(Mandatory=$True)]
    [string] $StoredProcedureName,

    [parameter(Mandatory=$True)]
    [PSCredential] $SqlCredential
)

# Get the username and password from the SQL credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password

# Open the connection and execute the stored procedure
$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")
$Conn.Open()
$Cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC $StoredProcedureName", $Conn)
$Cmd.CommandTimeout = 120
$Cmd.ExecuteNonQuery()

# Close the SQL connection
$Conn.Close()
I'm now attempting to call this runbook from another Azure Runbook using this command:
& .\RunStoredProcedure2.ps1 -Database 'adventureworksnh.database.windows.net' -SqlServer 'AdventureWorksDW' -SqlServerPort 1433 -StoredProcedureName 'TestJob1' -sqlCredential 'awadmin'
Issue
When I attempt to run this, I get this error:
C:\Temp\uzahthmc.su1\RunStoredProcedure2.ps1 : Cannot process argument transformation on parameter 'SqlCredential'. A
command that prompts the user failed because the host program or the command type does not support user interaction.
The host was attempting to request confirmation with the following message: Enter your credentials.
I am able to run RunStoredProcedure with these parameters successfully
What I've Tried
Adding/removing the preceding ampersand
Using Invoke-Expression
Do you have a Credential asset named "awadmin" in your Automation account? When you start a runbook directly (using the Start button in the portal or the Start-AzureRmAutomationRunbook cmdlet), Azure Automation allows you to pass a credential asset name as a parameter value, and it will retrieve the specified credential automatically. However, when you invoke a runbook from another runbook, you have to follow plain PowerShell rules and pass a PSCredential object, as declared:
$sqlCred = Get-AutomationPSCredential -Name 'awadmin'
& .\RunStoredProcedure2.ps1 ... -sqlCredential $sqlCred
How would you go on to secure your DSC configuration that's using credentials properly in Azure Automation?
E.g.:
configuration MyServer {
    param(
        [Parameter(Mandatory=$true)][PSCredential]$MyCredential
    );
    # Some configuration using credentials
}
Normally I'd set up a public key and a proper certificate installed on each node and pass along CertificateFile and Thumbprint to ConfigurationData when compiling the configuration documents.
In Azure Automation I can't find any good solution.
The documentation says Azure Automation encrypts the entire MOF by itself: https://azure.microsoft.com/en-us/documentation/articles/automation-certificates/ and the article specifies the use of PSDscAllowPlainTextPassword.
When you then register a node with its pull server and it pulls its configuration, you can read out the password in plain text after it has been pulled down/updated, as long as you have local admin (or read access to temp). This is not good from a security perspective.
What I'd like, ideally, would be to upload such a public key/certificate to the Azure Automation credentials and use it as part of the ConfigurationData when starting the compilation job.
However, today "CertificateFile" expects a path, not an AutomationCertificate, so I cannot see a way to start the compilation job with any public key present in Azure Automation. I can't see any way of referring to my certificate asset when running the job.
Any ideas whether this is possible in the current state of Azure Automation, and the way it works with DSC/pull, to secure it properly using either the asset store in Azure Automation or Azure Key Vault?
You should create an Azure Automation credential asset and reference it in the configuration like so:
# Compile the MOF
$ConfigurationData = @{
    AllNodes = @(
        @{
            NodeName = $nodeName
            PSDscAllowPlainTextPassword = $true
        }
    )
}

$Parameters = @{
    "nodeName"   = $nodeName
    "credential" = $credName   # Note: this is only the NAME of the Azure Automation credential asset; Azure Automation securely pulls the credential and uses it during compilation
}

Start-AzureRmAutomationDscCompilationJob -ResourceGroupName $ResourceGroupName -AutomationAccountName $AutomationAccountName `
    -ConfigurationName $configurationName -Parameters $Parameters -ConfigurationData $ConfigurationData
You should not worry about PSDscAllowPlainTextPassword, since Azure Automation encrypts everything at rest for you; it's just that DSC doesn't know that, so you have to tell the DSC engine to allow plain text.
And in DSC you should have something like:
Configuration name
{
    Param (
        [Parameter(Mandatory)][ValidateNotNullOrEmpty()][String]$nodeName,
        [Parameter(Mandatory)][ValidateNotNullOrEmpty()][pscredential]$credential
    )

    Import-DscResource -Module modules

    Node $nodeName { DOSTUFF }
}
The correct way to pass a credential to a DSC file from Azure Automation is to use an Azure Automation Credential.
Then inside your DSC file you use the command Get-AutomationPSCredential
Example:
Configuration BaseDSC
{
    Import-DscResource -ModuleName xActiveDirectory
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName xNetworking

    $Credential = Get-AutomationPSCredential -Name "CredentialName"

    Node $AllNodes.Nodename
    { ...
The credential is stored encrypted in Azure Automation and is put into the encrypted MOF file when you run the compilation job.
Additionally, the password can be updated in Azure Automation and then pushed into the MOFs by simply recompiling.
The password cannot be retrieved in clear text from Azure.
Use a secure credential to create a user on a Windows server and add it to the Administrators group using DSC:
**Solution (PowerShell DSC)**
First, create the credential in the Automation account, from the Azure portal or using any Azure module:
Home > Resource group > ... > Automation account > Credentials
Configuration user_windows_user
{
    param
    (
        [Parameter()][string]$username,
        [Parameter()]$azurePasswordCred   # name (string) of the credential asset
    )

    $passwordCred = Get-AutomationPSCredential -Name $azurePasswordCred

    Node "localhost"
    {
        User UserAdd
        {
            Ensure = "Present"  # To ensure the user account does not exist, set Ensure to "Absent"
            UserName = $username
            FullName = "$username-fullname"
            PasswordChangeRequired = $false
            PasswordNeverExpires = $false
            Password = $passwordCred  # This needs to be a credential object
        }

        Group AddtoAdministrators
        {
            GroupName = "Administrators"
            Ensure = "Present"
            MembersToInclude = @($username)
        }
    }
} # end of Configuration

$cd = @{
    AllNodes = @(
        @{
            NodeName = 'localhost'
            PSDscAllowPlainTextPassword = $true
        }
    )
}
Upload the DSC file in Azure Automation > Configurations.
Compile the configuration, providing the inputs: -username and the credential asset name (a string); a sample compilation call is sketched below.
Add the configuration to a node and wait for the configuration deployment.
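A minimal sketch of that compilation step from PowerShell, mirroring the compilation call shown earlier; the resource group, account name, username, and credential asset name are placeholders:
# Hypothetical names; the credential asset "WinUserCred" must already exist in the account
$cd = @{
    AllNodes = @(
        @{ NodeName = 'localhost'; PSDscAllowPlainTextPassword = $true }
    )
}

Start-AzureRmAutomationDscCompilationJob -ResourceGroupName "MyRG" `
    -AutomationAccountName "MyAutomationAccount" `
    -ConfigurationName "user_windows_user" `
    -Parameters @{ username = "svcuser"; azurePasswordCred = "WinUserCred" } `
    -ConfigurationData $cd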
I have to implement a solution where I have to deploy an SSIS project (xy.ispac) from one machine to another. So far I've copy-pasted together the following from all around the internet:
# Variables
$ServerName = "target"
$SSISCatalog = "SSISDB" # sort of constant
$CatalogPwd = "catalog_password"
$ProjectFilePath = "D:\Projects_to_deploy\Project_1.ispac"
$ProjectName = "Project_name"
$FolderName = "Data_collector"
# Load the IntegrationServices Assembly
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
# Store the IntegrationServices Assembly namespace to avoid typing it every time
$ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Write-Host "Connecting to server ..."
# Create a connection to the server
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;Integrated Security=SSPI;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString
$integrationServices = New-Object "$ISNamespace.IntegrationServices" $sqlConnection
$catalog = $integrationServices.Catalogs[$SSISCatalog]
# Create the Integration Services object if it does not exist
if (!$catalog) {
# Provision a new SSIS Catalog
Write-Host "Creating SSIS Catalog ..."
$catalog = New-Object "$ISNamespace.Catalog" ($integrationServices, $SSISCatalog, $CatalogPwd)
$catalog.Create()
}
$folder = $catalog.Folders[$FolderName]
if (!$folder)
{
#Create a folder in SSISDB
Write-Host "Creating Folder ..."
$folder = New-Object "$ISNamespace.CatalogFolder" ($catalog, $FolderName, $FolderName)
$folder.Create()
}
# Read the project file, and deploy it to the folder
Write-Host "Deploying Project ..."
[byte[]] $projectFile = [System.IO.File]::ReadAllBytes($ProjectFilePath)
$folder.DeployProject($ProjectName, $projectFile)
This seemed to be working surprisingly well on the development machine / test server pair. However, the live environment will be a bit different: the machine doing the deployment job (deployment server, or DS from now on) and the SQL Server (DB for short) the project is to be deployed to are in different domains, and since SSIS requires Windows authentication, I'm going to need to run the above code locally on DS but using the credentials of a user on the DB side.
And that's the point where I fail. The only thing that worked was to start the PowerShell command-line interface using runas /netonly /user:thatdomain\anuserthere powershell, enter the password, and paste the script unaltered into it. Alas, this is not an option, since there's no way to pass the password to runas (at best, /savecred lets you enter it once), and user interactivity is not possible anyway (the whole thing has to be automated).
I've tried the following:
Simply running the script on DS: the line $sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString uses the credentials from DS, which are not recognized by DB, and New-Object does not have a -Credential argument I could pass.
Putting everything into an Invoke-Command with -Credential, which requires using -ComputerName as well. I guess it would be possible to use the local machine as the 'remote' one (using . as ComputerName), but it still complains about access being denied. I'm scanning through about_Remote_Troubleshooting, so far without any success.
Any hints on how to overcome this issue?
A solution might be to use a SQL user (with the right access rights) instead of an AD user.
Something like the sketch below should work.
(Check also the answer to correct the connection string.)
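The original snippet is not included here, so this is a minimal sketch of the idea, assuming SQL authentication is acceptable on the target server; the login name and password are placeholders:
# Sketch only: same deployment script as above, but with a SQL login
# instead of Integrated Security=SSPI (login/password are placeholders).
$ServerName = "target"
$SqlUser    = "ssis_deploy"      # hypothetical SQL login with rights on SSISDB
$SqlPass    = "strong_password"

# SQL authentication in the connection string
$sqlConnectionString = "Data Source=$ServerName;Initial Catalog=master;User ID=$SqlUser;Password=$SqlPass;"
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString

[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")
$integrationServices = New-Object "Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices" $sqlConnection
# ... continue with the catalog/folder/DeployProject steps from the script above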