Service Fabric: change application instance count without having to redeploy the whole package - azure-service-fabric

Is it possible to decrease the instance count of an application hosted in Service Fabric without having to redeploy the whole package?
When I deployed the application, the instance count was set to -1. Now I would like to reduce it to 3.

Here is the PowerShell script I'm using. It translates the application parameters XML into a hashtable and runs Start-ServiceFabricApplicationUpgrade with it:
Param
(
    [Parameter(Mandatory=$true)]
    [String]
    $ApplicationName,

    [Parameter(Mandatory=$true)]
    [String]
    $ApplicationTypeVersion,

    [Parameter(Mandatory=$true)]
    [String]
    $ApplicationParameterFile
)

function Read-ApplicationParameters
{
    Param (
        [ValidateScript({Test-Path $_ -PathType Leaf})]
        [String]
        $ApplicationParameterFile
    )

    # Parse the parameters XML into a hashtable of name/value pairs
    $applicationParameterXml = [Xml] (Get-Content $ApplicationParameterFile)
    $applicationParameter = @{}
    $applicationParameterXml.Application.Parameters.ChildNodes | Foreach {$applicationParameter[$_.Name] = $_.Value}
    return $applicationParameter
}

$appParams = Read-ApplicationParameters $ApplicationParameterFile

Start-ServiceFabricApplicationUpgrade -ApplicationName $ApplicationName -ApplicationTypeVersion $ApplicationTypeVersion -ApplicationParameter $appParams -UnmonitoredAuto
To apply the change (an example run follows the list):
1. Take the latest application parameters XML file.
2. Modify the values you want to update (leave the others untouched).
3. Connect to the cluster.
4. Run Get-ServiceFabricApplication.
5. Run the script above, taking the ApplicationName and ApplicationTypeVersion values from the output of step 4. ApplicationParameterFile is the path to the newly modified XML file.
6. Monitor the upgrade by running Get-ServiceFabricApplicationUpgrade -ApplicationName fabric:/MyApp.
7. Once it is done, verify the parameters by running Get-ServiceFabricApplication again.
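For illustration, an end-to-end run might look like the following sketch. The script file name (Update-AppInstanceCount.ps1), cluster endpoint, application name, type version, and parameter file path are hypothetical placeholders; substitute the values Get-ServiceFabricApplication returns for your cluster.

# Hypothetical example run; endpoint, names, and paths are placeholders
Connect-ServiceFabricCluster -ConnectionEndpoint 'mycluster.westus.cloudapp.azure.com:19000'

# Note the ApplicationName and ApplicationTypeVersion in the output
Get-ServiceFabricApplication

# Run the script above with those values and the modified parameter file
.\Update-AppInstanceCount.ps1 -ApplicationName 'fabric:/MyApp' -ApplicationTypeVersion '1.0.0' -ApplicationParameterFile '.\ApplicationParameters\Cloud.xml'

# Poll until the upgrade completes
Get-ServiceFabricApplicationUpgrade -ApplicationName 'fabric:/MyApp'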

Related

Alternative approach in Azure for multi-server administration SQL Agent jobs?

My organization currently has a total of 15 QA and 5 UAT SQL environments. At the moment we manage all SQL Agent jobs with SQL Server's multi-server administration feature on one of the target servers. We are now planning to migrate the databases of all these SQL environments to Azure. In Azure we are using the SQL Database service, not a VM.
So, is there any feature or alternative solution to manage the jobs of all environments in one central location in Azure? All SQL jobs are plain T-SQL.
You can use Azure Automation to centralize all your Azure SQL Database jobs in one place.
You can use the following PowerShell workflow in Azure Automation to schedule the execution of any stored procedure on any Azure SQL Database, no matter which Azure SQL logical server it resides on.
workflow SQL_Agent_SprocJob
{
    [cmdletbinding()]
    param
    (
        # Fully-qualified name of the Azure DB server
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $SqlServerName,

        # Name of database to connect and execute against
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $DBName,

        # Name of stored procedure to be executed
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string] $StoredProcName,

        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        [parameter(Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [PSCredential] $Credential
    )

    inlinescript
    {
        Write-Output "JOB STARTING"

        # Setup variables
        $ServerName = $Using:SqlServerName
        $UserId = $Using:Credential.UserName
        $Password = ($Using:Credential).GetNetworkCredential().Password
        $DB = $Using:DBName
        $SP = $Using:StoredProcName

        # Create & Open connection to Database
        $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $DatabaseConnection.ConnectionString = "Data Source = $ServerName; Initial Catalog = $DB; User ID = $UserId; Password = $Password;"
        $DatabaseConnection.Open();
        Write-Output "CONNECTION OPENED"

        # Create & Define command and query text
        $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $DatabaseCommand.CommandType = [System.Data.CommandType]::StoredProcedure
        $DatabaseCommand.Connection = $DatabaseConnection
        $DatabaseCommand.CommandText = $SP
        Write-Output "EXECUTING QUERY"

        # Execute the query
        $DatabaseCommand.ExecuteNonQuery()

        # Close connection to DB
        $DatabaseConnection.Close()
        Write-Output "CONNECTION CLOSED"
        Write-Output "JOB COMPLETED"
    }
}
Use this step-by-step tutorial to get started.
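As a sketch of how this workflow could then be started from PowerShell (the automation account, resource group, and all parameter values below are made-up placeholders; because the runbook is started directly, Azure Automation resolves the credential asset name into a PSCredential automatically):

# Hypothetical invocation; account, group, and values are placeholders
$params = @{
    SqlServerName  = 'myserver.database.windows.net'
    DBName         = 'MyDatabase'
    StoredProcName = 'dbo.NightlyETL'
    Credential     = 'MySqlCredentialAsset'  # asset name; resolved automatically on direct start
}
Start-AzureRmAutomationRunbook -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAutomationAccount' -Name 'SQL_Agent_SprocJob' -Parameters $params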

PowerShell log file file-system rights

I have written a function that writes log files for my scripts. The first time the function is used, it creates a log file in the script's directory, named after the script. On every subsequent run, log messages are appended to the file.
So far, so good. Against all odds, other people are starting to use my scripts now! The scripts are mainly used by administrators with local admin rights on servers, but they all get errors when writing to the log file. To my understanding, when you access files through rights granted by the "Administrators" group, you must be running with elevated privileges. But I don't want that. I manually tried to grant Modify to the "Users" group, but then "Administrators" seems to take precedence.
Does anyone have an idea which rights to set (and/or revoke), and how to achieve this in PowerShell?
As Ansgar Wiecher comments, you should probably look into using the Event Log service instead.
With event logs, you only need elevated privileges the first time one of the scripts runs, in order to create the log and register the event source; after that, anyone can write to it:
function Write-MyLog {
    param(
        [Parameter(Mandatory = $true)]
        [string]$Message,

        [Parameter(Mandatory = $true)]
        [ValidateRange(1,65535)]
        [int]$EventId,

        [Parameter(Mandatory = $false)]
        [System.Diagnostics.EventLogEntryType]$EntryType = 'Information'
    )

    # Prepend PID and script path to message
    $PSBoundParameters['Message'] = '[{0}: {1}]{2}{3}' -f $PID,$MyInvocation.ScriptName,[Environment]::NewLine,$Message

    # Set event log target
    $PSBoundParameters['LogName'] = $logName = 'LeosEvents'
    $PSBoundParameters['Source'] = $logSource = 'LeosScripts'

    if (-not (Get-WinEvent -ListLog $logName -ErrorAction SilentlyContinue)) {
        # Create event log and source if it doesn't exist already
        # This is the only step that requires elevation, can be created via GPO if desired
        New-EventLog -LogName $logName -Source $logSource
    }

    # Write event log entry (splat the bound parameters)
    Write-EventLog @PSBoundParameters
}
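A quick usage sketch (the message text and event IDs are arbitrary examples; remember the very first call needs elevation to create the log, as noted above):

# Write entries, then read the custom log back
Write-MyLog -Message 'Nightly maintenance completed' -EventId 1000
Write-MyLog -Message 'Disk space below threshold' -EventId 2001 -EntryType Warning

# Inspect the most recent entries
Get-WinEvent -LogName 'LeosEvents' -MaxEvents 5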

PowerShell encoded command variable evaluation before execution (ServiceNow, Invoke-WmiMethod)

I have an issue with a PowerShell script block executed by a ServiceNow MID Server on a target host.
The target host runs SQL Server, where I need to execute a PowerShell script that runs SQL commands and post-processes the results (using a JDBC activity is not valid in this case).
So I am running PowerShell on the MID server (a kind of proxy, for those who are not familiar with ServiceNow), and I need to execute the script as an encoded command via the Invoke-WmiMethod cmdlet (I cannot use Invoke-Command because PS remoting is disabled, with no way to enable it due to company policy), like below:
$Bytes = [System.Text.Encoding]::Unicode.GetBytes($CallArgumentsBlock)

# Command block is passed as an encoded command
$EncodedCommand = [Convert]::ToBase64String($Bytes)

$WMIArgs = @{
    Class        = 'win32_process'
    Name         = 'Create'
    ComputerName = $computer
    ArgumentList = "powershell -EncodedCommand $EncodedCommand"
    ErrorAction  = 'SilentlyContinue'
    Credential   = $cred
}
Invoke-WmiMethod @WMIArgs | Out-Null
The issue is that the variables in $EncodedCommand need to be evaluated before it is encoded. Example of the script block:
$CallArgumentsBlock = {
    Param(
        [string] $SQLServer = "$inputSQLServer",           #SQL Server name running on host
        [string] $SQLDBName = "$inputSQLDBName",           #SQL DB name
        [string] $InputData = "$inputInputData",           #JSON string holding data processed by PS+SQL
        [string] $ResultFolderPath = "$OutputPath_forSQL"  #File path for the results, since WMI cannot return the output of the invoked command
    )
    $ProcessData = ConvertFrom-JSON -InputObject $InputData
    #For each object call sql...
    ...
    *PS code*
    ...
}
So what would be the PowerShell way to do this?
Or can anybody suggest a better ServiceNow-based solution?
Thanks a lot for your answers!
My initial thought is to remove the Param() statement and just list the variables, setting their values when the script block is created; any reason this wouldn't work? (See the note on variable expansion after the code below.)
$CallArgumentsBlock = {
    [string] $SQLServer = "$inputSQLServer"            #SQL Server name running on host
    [string] $SQLDBName = "$inputSQLDBName"            #SQL DB name
    [string] $InputData = "$inputInputData"            #JSON string holding data processed by PS+SQL
    [string] $ResultFolderPath = "$OutputPath_forSQL"  #File path for the results

    $ProcessData = ConvertFrom-JSON -InputObject $InputData
    #For each object call sql...
    ...
    *PS code*
    ...
}
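One caveat worth flagging (my assumption, not part of the original answer): variables inside a literal { } script block are not expanded when the block is defined, so both versions above would still ship the variable names rather than their values to the target host. A minimal sketch of one way to bake the values in before encoding, assuming $inputSQLServer and the other $input* variables exist in the MID server session (quoting of embedded values, such as JSON containing apostrophes, is left aside):

# Build the command text with the values expanded NOW, in the MID server session;
# the `$ escapes keep the target-side variables literal
$commandText = @"
`$SQLServer = '$inputSQLServer'
`$SQLDBName = '$inputSQLDBName'
`$InputData = '$inputInputData'
`$ResultFolderPath = '$OutputPath_forSQL'
`$ProcessData = ConvertFrom-Json -InputObject `$InputData
# ...rest of the PS code...
"@

# Encode the already-expanded text, exactly as in the original script
$Bytes = [System.Text.Encoding]::Unicode.GetBytes($commandText)
$EncodedCommand = [Convert]::ToBase64String($Bytes)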

Error calling one Azure Automation Runbook from another

Background
I have one Azure Runbook (named RunStoredProcedure2) defined as such:
param(
    [parameter(Mandatory=$True)]
    [string] $SqlServer,

    [parameter(Mandatory=$False)]
    [int] $SqlServerPort = 1433,

    [parameter(Mandatory=$True)]
    [string] $Database,

    [parameter(Mandatory=$True)]
    [string] $StoredProcedureName,

    [parameter(Mandatory=$True)]
    [PSCredential] $SqlCredential
)

# Get the username and password from the SQL Credential
$SqlUsername = $SqlCredential.UserName
$SqlPass = $SqlCredential.GetNetworkCredential().Password

$Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$SqlServer,$SqlServerPort;Database=$Database;User ID=$SqlUsername;Password=$SqlPass;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;")
$Conn.Open()

$Cmd = New-Object System.Data.SqlClient.SqlCommand("EXEC $StoredProcedureName", $Conn)
$Cmd.CommandTimeout = 120
$Cmd.ExecuteNonQuery()

# Close the SQL connection
$Conn.Close()
I'm now attempting to call this runbook from another Azure Runbook using this command:
& .\RunStoredProcedure2.ps1 -Database 'adventureworksnh.database.windows.net' -SqlServer 'AdventureWorksDW' -SqlServerPort 1433 -StoredProcedureName 'TestJob1' -sqlCredential 'awadmin'
Issue
When I attempt to run this, I get this error:
C:\Temp\uzahthmc.su1\RunStoredProcedure2.ps1 : Cannot process argument transformation on parameter 'SqlCredential'. A
command that prompts the user failed because the host program or the command type does not support user interaction.
The host was attempting to request confirmation with the following message: Enter your credentials.
I am able to run RunStoredProcedure with these parameters successfully
What I've Tried
Adding/removing the preceding ampersand
Using Invoke-Expression
Do you have a Credential asset named "awadmin" in your Automation account? When you start a runbook directly (using the Start button in the portal or the Start-AzureRmAutomationRunbook cmdlet), Azure Automation allows you to pass a credential asset name as a parameter value, and it will retrieve the specified credential automatically. However, when you invoke a runbook from another runbook, you have to follow plain PowerShell rules and pass a PSCredential object, as declared:
$sqlCred = Get-AutomationPSCredential -Name 'awadmin'
& .\RunStoredProcedure2.ps1 ... -sqlCredential $sqlCred
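Put together with the values from the question's original command, the inner call would then look like this:

# Resolve the credential asset to a PSCredential object, then pass the object
$sqlCred = Get-AutomationPSCredential -Name 'awadmin'
& .\RunStoredProcedure2.ps1 -Database 'adventureworksnh.database.windows.net' -SqlServer 'AdventureWorksDW' -SqlServerPort 1433 -StoredProcedureName 'TestJob1' -SqlCredential $sqlCred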

Generating OpenCover Reports Using TFS 2015 Build Preview

I'm using the following code to generate a coverage report from a PowerShell script task in TFS 2015's Build Preview. I can run it on the build server and it generates the report correctly, but when it runs as part of the build, it complains that there are no PDB files.
No results, this could be for a number of reasons. The most common
reasons are:
1) missing PDBs for the assemblies that match the filter please review the output file and refer to the Usage guide (Usage.rtf) about filters.
2) the profiler may not be registered correctly, please refer to the Usage guide and the -register switch.
After a bit of Googling I discovered that /noshadow should have been enough, but the parameters for NUnit seem to be getting ignored. I'm assuming they are ignored because the /nologo switch should strip the copyright info from the output, yet I can still see it in the console output.
Using the build output directory as the working directory should also have fixed this, but calling Set-Location didn't resolve the problem during the build.
This is the script that I'm currently running:
Param
(
    [string] $SourceDir = $env:BUILD_SOURCESDIRECTORY,
    [string] $UnitTestDir = "",
    [string] $UnitTestDll = "",
    [string] $Filter = "",
    [string] $ExcludeByAttribute = "System.CodeDom.Compiler.GeneratedCodeAttribute",
    [string] $nUnitOutputPath = "Output.txt",
    [string] $nUnitErrorOutputPath = "Error.text",
    [string] $XmlOutputPath = "_CodeCoverageResult.xml",
    [string] $ReportOutputPath = "_CodeCoverageReport"
)

$openCoverPath = "E:\BuildTools\OpenCover.4.5.3723\OpenCover.Console.exe"
$nUnitPath = "E:\BuildTools\NUnit.Runners.2.6.4\tools\nunit-console.exe"
$reportGeneratorPath = "E:\BuildTools\ReportGenerator.2.1.1.0\ReportGenerator.exe"
$nUnitArgs = "$SourceDir\$UnitTestDir\$UnitTestDll /noshadow /nologo"

Write-Host "[Debug] Setting location to $SourceDir\$UnitTestDir"
Set-Location $SourceDir\$UnitTestDir

if (!(Test-Path $SourceDir\CodeCoverage)) {
    New-Item $SourceDir\CodeCoverage -type directory
}

Write-Host "[Debug] Running unit tests from $SourceDir\$UnitTestDir\$UnitTestDll"
Write-Host "[Debug] Command: $openCoverPath -target:$nUnitPath -targetargs:""$nUnitArgs"" -filter:$Filter -excludebyattribute:$ExcludeByAttribute -register:user -output:""$SourceDir\CodeCoverage\$XmlOutputPath"""
& $openCoverPath -target:$nUnitPath -targetargs:"$nUnitArgs" -filter:$Filter -excludebyattribute:$ExcludeByAttribute -register:user -output:"$SourceDir\CodeCoverage\$XmlOutputPath"

Write-Host "[Debug] Generating report"
Write-Host "[Debug] Command: $reportGeneratorPath ""-reports:$SourceDir\CodeCoverage\$XmlOutputPath"" ""-targetdir:$SourceDir\CodeCoverage\$ReportOutputPath"""
& $reportGeneratorPath -reports:$SourceDir\CodeCoverage\$XmlOutputPath -targetdir:$SourceDir\CodeCoverage\$ReportOutputPath

Write-Host "[Debug] Finished running tests and generating report"
You probably need to wrap your NUnit args in an extra set of embedded quotes using the backtick escape (`"):
& $openCoverPath -target:$nUnitPath -targetargs:"`"$nUnitArgs`"" -filter:$Filter -excludebyattribute:$ExcludeByAttribute -register:user -output:"$SourceDir\CodeCoverage\$XmlOutputPath"
There is a fix in OpenCover for the next release that will detect when you are passing unrecognized arguments to OpenCover.