SSAS Tabular - Switch among the data source connections

It's obvious that we need to import the data from a data source into an SSAS Tabular model.
Imagine we have two data source connections for two different environments, ENV1 and ENV2. Both environments contain the same tables but with different data.
Is it possible to switch to ENV2 while I am working against ENV1 in SSAS Tabular? Is there any alternative available for this requirement?
Thanks in advance,
Lalith Varanasi.

It sounds like you want to have one data source, but to update the connection string based on the environment you deploy to.
I have built a CI/CD process for our tabular models which uses the TOM library in a PowerShell script to read the .bim file, modify the connection strings based on the environment we are deploying to, and create partitions as needed, as well as the administrative roles. I can't share the full script at the moment because there are a few references specific to my company, but basically:
try{
    Write-Log "loading Microsoft.AnalysisServices assemblies that we need" Debug
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Core") | Out-Null
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular") | Out-Null
}
catch{
    Write-Log "Could not load the needed assemblies... TODO: Figure out and document how to install the needed assemblies. (I would start with the SQL feature pack)" Error -ErrorAction Stop
}
$modelBim = [IO.File]::ReadAllText($bimFilePath)
$db = [Microsoft.AnalysisServices.Tabular.JsonSerializer]::DeserializeDatabase($modelBim)
#Our DEV and TEST models get deployed on the same SSAS instance. We have to modify the name of the model to reference which environment they are reading from.
$db.ID = "$($modelName)_$($TargetEnvironment)"
$db.Name = "$($modelName)_$($TargetEnvironment)"
Write-host "Updating the data source connections to use the $TargetEnvironment environment."
foreach ($ds in $db.Model.DataSources){
    Write-Log "Updating connection information for the $($ds.Name) connection" Debug
    #I use a PowerShell function called Get-DBServerFromEnvironment to pull the correct server name for each of our different databases. Our database names are the same in each environment, except that they are prefixed with the environment name.
    #Using this design, DB1/DB2/DB3 is the name of the data source (say ApplicationOLAP, DataWarehouse, ThirdPartyDB) and you set environment-specific connection strings in a separate custom function so that the logic is stored in one place.
    switch($ds.Name){
        "DB1"{$ds.ConnectionString = "EnvironmentSpecificConnectionStringToDB1"}
        "DB2"{$ds.ConnectionString = "EnvironmentSpecificConnectionStringToDB2"}
        "DB3"{$ds.ConnectionString = "EnvironmentSpecificConnectionStringToDB3"}
        "DB4"{$ds.ConnectionString = "EnvironmentSpecificConnectionStringToDB4"}
        default{Write-Log "Unknown data source name" Warning}
    }
}
$server = New-Object Microsoft.AnalysisServices.Tabular.Server
#$serverName is the SSAS server; I get this by calling a custom function and specifying the target environment.
$server.Connect($serverName)
$server.BeginTransaction()
if ($server.Databases.Contains($db.ID)){
    Write-Log "Tabular database with the ID: $($db.ID) exists. Dropping and recreating"
    $server.Databases.FindByName($db.Name).Drop()
}
else{
    Write-Log "Tabular database with the ID: $($db.ID) does not exist. Creating"
}
$server.Databases.Add($db) | Out-Null
#This is where you actually write your changes to the server. Modify as needed.
$db.Update("ExpandFull")
$db.Model.RequestRefresh("Automatic")
$saveOptions = New-Object Microsoft.AnalysisServices.Tabular.SaveOptions
$saveOptions.MaxParallelism = 5
Write-Log "Starting the processing at [$([DateTime]::Now)]. The script will hang while the cube is processing."
$ProcessElapsed = [System.Diagnostics.Stopwatch]::StartNew()
$result = $db.Model.SaveChanges($saveOptions)
$impact = $result.Impact
$xmlaResult = $result.XmlaResults
#TODO: Check the result for success/failure.
Write-Log "Processing took $($ProcessElapsed.Elapsed.ToString()). Hours:Minutes:Seconds:Milliseconds"
$server.CommitTransaction()
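A minimal sketch of the Get-DBServerFromEnvironment helper referenced above might look like this (the environment and server names are placeholders, not the real values):
# Hypothetical sketch of the Get-DBServerFromEnvironment helper; environment
# and server names below are placeholders for your own.
function Get-DBServerFromEnvironment {
    param([string]$TargetEnvironment)
    switch ($TargetEnvironment) {
        "DEV"  { "DEV-SQL01" }
        "TEST" { "TEST-SQL01" }
        "PROD" { "PROD-SQL01" }
        default { throw "Unknown environment: $TargetEnvironment" }
    }
}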

In your BIM model you can change your data source connection string.
Go to Model > Existing Connections > Modify,
or use the Tabular Model Explorer and change your data source.
-> You need to process your tables after this change.
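If you want to script that processing step, one quick option (a sketch assuming the SqlServer PowerShell module is installed; the server and database names are placeholders) is:
# Process the full database after changing the connection; assumes the
# SqlServer PowerShell module, which ships the Analysis Services cmdlets.
Invoke-ProcessASDatabase -Server "localhost\TABULAR" -DatabaseName "MyTabularModel" -RefreshType "Full"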
Is that what you are looking for?
Have a nice day,
Arnaud

If you're using Tabular Editor, there's a simple option to prevent connection strings from being deployed, in the "Deployment Wizard" under "Model" > "Deploy..."
By default, "Deploy Connections" is unchecked, meaning that the connection strings used on the target database will be left unchanged, regardless of what you're using in your development database.

Related

Publish SSRS by Octopus

I'm building the setup to deploy my SSRS reports through Octopus Deploy. I found an Octopus Library template and I'm working with it, but I've had some issues:
1º ---- Error message (the path is alright, but it keeps giving the same warning):
WARNING: Unable to find datasource SalesDrivers in /Sales Drivers/Data Sources
2º ---- The method doesn't exist:
Method invocation failed because [Microsoft.PowerShell.Commands.NewWebserviceProxy.AutogeneratedTypes.WebServiceProxy3er_ReportService2005_asmx_wsdl.ReportingService2005] doesn't contain a method named 'LoadReportDefinition'.
The PowerShell function from the template/library that is throwing the error can be seen below:
#region Update-ReportParameters()
Function Update-ReportParameters($ReportFile)
{
    # declare local variables
    $ReportParameters = @();
    # necessary so that when attempting to use the report execution service, it doesn't puke on you when it can't find the data source
    $ReportData = (Remove-SharedReferences -ReportFile $ReportFile)
    # get just the report name
    $ReportName = $ReportFile.SubString($ReportFile.LastIndexOf("\") + 1)
    $ReportName = $ReportName.SubString(0, $ReportName.IndexOf("."))
    # create warnings object
    $ReportExecutionWarnings = $null
    # load the report definition
    Write-Host "*********************************************"
    #Write-Host $ReportData
    #(Remove-SharedReferences -ReportFile $ReportFile)
    #Write-Host $ReportExecutionWarnings
    $ExecutionInfo = $ReportExecutionProxy.LoadReportDefinition($ReportData, [ref] $ReportExecutionWarnings);
    # loop through the report execution parameters
    foreach($Parameter in $ExecutionInfo.Parameters)
    {
        # create new item parameter object
        $ItemParameter = New-Object "$ReportServerProxyNamespace.ItemParameter";
        # fill in the properties except valid values, that one needs special processing
        Copy-ObjectProperties -SourceObject $Parameter -TargetObject $ItemParameter;
        # fill in the valid values
        $ItemParameter.ValidValues = Convert-ValidValues -SourceValidValues $Parameter.ValidValues;
        # add to list
        $ReportParameters += $ItemParameter;
    }
    # force the parameters to update
    Write-Host "Updating report parameters for $ReportFolder/$ReportName"
    if ($IsReportService2005) {
        $ReportServerProxy.SetReportParameters("$ReportFolder/$ReportName", $ReportParameters);
    }
    elseif ($IsReportService2010) {
        $ReportServerProxy.SetItemParameters("$ReportFolder/$ReportName", $ReportParameters);
    }
    else { Write-Warning 'Report Service Unknown in Update-ReportParameters method. Use ReportService2005 or ReportService2010.' }
}
#endregion
Does anyone know how I could sort this out?
I have solved a similar problem but took a slightly different approach. Rather than using PowerShell and Octopus directly, I used the useful open-source tool RSBuild to deploy the reports. It is pretty easy to bundle the rsbuild.exe executable (it is tiny) and a deploy.config along with your reports inside the Octopus package. Then you can use Octopus's substitution feature to rewrite the config file, and a PowerShell function to execute the executable. This also has the advantage that you can deploy easily without Octopus, the config for data sources and reports is declarative XML rather than procedural PowerShell, and the smarts of your scripted deployment live alongside your reports rather than buried in Octopus.
So my config looks a bit like:
<?xml version="1.0" encoding="utf-8" ?>
<Settings>
  <Globals>
    <Global Name="CollapsedHeight">0.5in</Global>
  </Globals>
  <ReportServers>
    <ReportServer Name="RS1" Protocol="http" Host="${ReportServer}" Path="${ReportServerPath}" Timeout="30" />
  </ReportServers>
  <DataSources>
    <DataSource Name="Source1" Publish="true" Overwrite="true" TargetFolder="Data Sources" ReportServer="RS1">
      <ConnectionString>data source=${ReportServer};initial catalog=${DatabaseName}</ConnectionString>
      <CredentialRetrieval>Store</CredentialRetrieval>
      <WindowsCredentials>False</WindowsCredentials>
      <UserName>${ReportsUser}</UserName>
      <Password>${ReportsPassword}</Password>
    </DataSource>
  </DataSources>
  <Reports>
    <ReportGroup Name="Details" DataSourceName="Source1" TargetFolder="Reports" ReportServer="RS1" CacheTime="10080">
      <Report Name="BusinessReportABC">
        <FilePath>reports\BusinessReportABC.rdl</FilePath>
      </Report>
      <!--More reports here-->
    </ReportGroup>
  </Reports>
</Settings>
My deployed octopacked artefacts contain RSBuild.Core.dll, RSBuild.exe, deploy.config and the report files.
Then I simply call the executable using PowerShell:
PS> rsbuild deploy.config
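For the Octopus step itself, a minimal PowerShell wrapper might look like this (the package-directory system variable name is an assumption; check it against your Octopus version):
# Minimal sketch of a deploy.ps1 wrapper; assumes rsbuild.exe and deploy.config
# sit in the root of the extracted package, and that this Octopus system
# variable name matches your Octopus version.
$packageDir = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]
& (Join-Path $packageDir "rsbuild.exe") (Join-Path $packageDir "deploy.config")
if ($LASTEXITCODE -ne 0) { throw "rsbuild failed with exit code $LASTEXITCODE" }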

Configuration file transformation in ASP.NET 5

We are building a web application using the new ASP.NET 5 platform. I am configuring the build and deployment automation tools, and I want the ability to change the application settings during deployment (like changing the web-service URL). In ASP.NET 5 we don't have web.config files anymore, only the new JSON configuration files. Is there a mechanism in ASP.NET 5 similar to web.config transformation in the previous versions of ASP.NET?
I know that web.configs are not really supported, but they are still used in ASP.NET under IIS.
I had a desire to apply transforms as well, as I wanted to control the environment variable from the config like so:
<aspNetCore>
  <environmentVariables xdt:Transform="Replace">
    <environmentVariable name="ASPNETCORE_ENVIRONMENT" value="Production" />
  </environmentVariables>
</aspNetCore>
If you really want to transform them in ASP.NET Core / 5, you can use the following method:
1. Add as many different web.config transform files as you want to your project. For example, you can add Web.Development.config, Web.Staging.config, Web.Production.config, etc. Name them however you like.
2. Modify your project.json file to output the files by adding this line to the publishOptions, right below your current web.config line:
"web.*.config"
3. Create a publish profile and modify the PowerShell script for your publish profile (located at Web Project\Properties\PublishProperties\profilename-publish.ps1) with the modifications below.
Add this function above the try/catch (I found this function here: Web.Config transforms outside of Microsoft MSBuild?, slightly modified):
function XmlDocTransform($xml, $xdt)
{
    if (!$xml -or !(Test-Path -path $xml -PathType Leaf)) {
        throw "File not found. $xml";
    }
    if (!$xdt -or !(Test-Path -path $xdt -PathType Leaf)) {
        throw "File not found. $xdt";
    }
    "Transforming $xml using $xdt";
    $scriptPath = (Get-Variable MyInvocation -Scope 1).Value.InvocationName | split-path -parent
    #C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\Web\Microsoft.Web.Publishing.Tasks.dll
    Add-Type -LiteralPath "${Env:ProgramFiles(x86)}\MSBuild\Microsoft\VisualStudio\v14.0\Web\Microsoft.Web.XmlTransform.dll"
    $xmldoc = New-Object Microsoft.Web.XmlTransform.XmlTransformableDocument;
    $xmldoc.PreserveWhitespace = $true
    $xmldoc.Load($xml);
    $transf = New-Object Microsoft.Web.XmlTransform.XmlTransformation($xdt);
    if ($transf.Apply($xmldoc) -eq $false)
    {
        throw "Transformation failed."
    }
    $xmldoc.Save($xml);
}
Add these lines ABOVE the Publish-AspNet call:
$xdtFiles = Get-ChildItem $packOutput | Where-Object {$_.Name -match "^web\..*\.config$"};
$webConfig = $packOutput + "web.config";
foreach($xdtFile in $xdtFiles) {
    XmlDocTransform -xml $webConfig -xdt "$packOutput$($xdtFile.Name)"
}
You don't really need config transforms in ASP.NET 5, as it has out-of-the-box support for chained configuration sources. For example, take this sample:
public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IApplicationEnvironment appEnv, IHostingEnvironment env)
    {
        _configuration = new ConfigurationBuilder(appEnv.ApplicationBasePath)
            .AddJsonFile("config.json")
            .AddEnvironmentVariables()
            .Build();
    }
    // ...
}
We add two config sources and build the configuration out of them. If I ask for a config key, it will try to get a value for that key by looking at the sources in last-to-first order. In the above case, I can work with a config.json file during development, and I can override that by providing the proper configuration through environment variables.
Look at the Configuration docs for more information.
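As a quick illustration of that ordering (the key name here is made up):
# Suppose config.json contains { "AppSettings": { "SiteTitle": "Dev title" } }.
# Because AddEnvironmentVariables() is registered last, setting this environment
# variable (note the colon separator) overrides the JSON value for this process:
[Environment]::SetEnvironmentVariable("AppSettings:SiteTitle", "Overridden title", "Process")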
As indicated by @tugberk, you can use environment variables instead, which is a much better way of handling this situation. If you are running in a development environment and want to store passwords or connection strings, you can also use user secrets to add them. After all that, you can still use environment-specific config files like so (this is an ASP.NET 5 Beta 5 sample):
ConfigurationBuilder configurationBuilder = new ConfigurationBuilder(
applicationEnvironment.ApplicationBasePath);
// Add configuration from the config.json file.
configurationBuilder.AddJsonFile("config.json");
// Add configuration from an optional config.development.json, config.staging.json or
// config.production.json file, depending on the environment. These settings override the ones in the
// config.json file.
configurationBuilder.AddJsonFile($"config.{hostingEnvironment.EnvironmentName}.json", optional: true);
if (hostingEnvironment.IsEnvironment(EnvironmentName.Development))
{
// This reads the configuration keys from the secret store. This allows you to store connection strings
// and other sensitive settings on your development environment, so you don't have to check them into
// your source control provider. See http://go.microsoft.com/fwlink/?LinkID=532709 and
// http://docs.asp.net/en/latest/security/app-secrets.html
configurationBuilder.AddUserSecrets();
}
// Add configuration specific to the Development, Staging or Production environments. This config can
// be stored on the machine being deployed to or if you are using Azure, in the cloud. These settings
// override the ones in all of the above config files.
// Note: To set environment variables for debugging navigate to:
// Project Properties -> Debug Tab -> Environment Variables
// Note: To get environment variables for the machine use the following command in PowerShell:
// $env:[VARIABLE_NAME]
// Note: To set environment variables for the machine use the following command in PowerShell:
// $env:[VARIABLE_NAME]="[VARIABLE_VALUE]"
// Note: Environment variables use a colon separator e.g. You can override the site title by creating a
// variable named AppSettings:SiteTitle. See
// http://docs.asp.net/en/latest/security/app-secrets.html
configurationBuilder.AddEnvironmentVariables();
IConfiguration configuration = configurationBuilder.Build();

OctopusDeploy - Every website in the deploy has a different AppPool and Website name; how to deal; no other differences

I'm trying to set up a deploy process that targets 16 websites, each hosting an instance of the same application.
Websites and AppPools are named as such:
appServer1:
app10.site.com
app11.site.com
app12.site.com
app13.site.com
appServer2:
app20.site.com
app21.site.com
app22.site.com
app23.site.com
etc.
etc.
...with each website having a correspondingly named AppPool.
I am desperately trying to determine how to use a single Deploy NuGet Package step to target all of these websites/app pools using variables and a combination of PowerShell scripts, if possible.
I'd like to have a single step where I can variable-substitute the website and app pool names, as these are the only differences. I basically need the equivalent of being able to loop the NuGet package step, passing it a list of website and app pool names. I cannot simply use variables, because I can only resolve to the machine level with variable scoping.
Create a list of all website and AppPool names and iterate over it, passing each value to a step for execution - a "ForEach" processing step, for lack of better words.
I do have the ability to rename the AppPools if need be for a more consistent pattern, but I cannot change the website names.
Any ideas would be greatly appreciated.
http://help.octopusdeploy.com/discussions/questions/3481-every-website-in-the-deploy-has-a-different-apppool-and-website-name-how-to-deal-no-other-differences
There's a lot to your question, but I'm going to take a stab at explaining our approach, in hopes of jogging your creative juices.
tl;dr
Simply put, use your own PowerShell scripts to install the web application. In there you can set the app pool name on a per-website basis.
For starters, we do do a separate deployment step for each project. The scripts we use will allow you to do all deployments from a single deploy.ps1 (including unique appPool names), but we find that it really helps keep each deployment nice and lean, and easy to manage. Each project gets its own nupkg and therein contains the predeploy.ps1, deploy.ps1, and postdeploy.ps1, as well as a folder of build/deploy scripts that we've open sourced, and a folder of environment config XML files.
A sample of an environment config would be this. The name is simply [envName].xml
<!-- environments\Production.xml -->
<environmentSettings>
  <webSites>
    <app>
      <physicalPathRoot>c:\inetpub</physicalPathRoot>
      <physicalFolderPrefix>appname</physicalFolderPrefix>
      <siteProtocol>https</siteProtocol>
      <siteName>appname.tld</siteName>
      <siteHost>appname.tld</siteHost>
      <portNumber>443</portNumber>
      <appPath>/</appPath>
      <appPool>
        <name>appname.tld</name>
        <!-- valid identityTypes are: [LocalSystem, LocalService, NetworkService, SpecificUser, ApplicationPoolIdentity] -->
        <identityType>NetworkService</identityType>
        <!-- Set this value to the User the Service will run under in the format DOMAIN\username -->
        <!-- If Running as 'NetworkService' then 'NT AUTHORITY\Network Service' is used -->
        <userName>NT AUTHORITY\Network Service</userName>
        <!-- Leave blank unless using SpecificUser -->
        <password></password>
        <maxWorkerProcesses>5</maxWorkerProcesses>
      </appPool>
    </app>
  </webSites>
  <serverDatabase>
    <name>database_name</name>
    <connectionString>REPLACED BY OCTOPUS</connectionString>
    <providerName>System.Data.SqlClient</providerName>
  </serverDatabase>
</environmentSettings>
You can see in the corresponding Get-EnvironmentSettings.ps1 how we load up the config and then update it with any Octopus variables. This is the trickiest part, because we use dot-notation to update the paths (case sensitive).
Our Octopus variables really only contain information that is secret, as everything else lives in [environment].xml:
| Name                          | Value       | Scope      |
|-------------------------------|-------------|------------|
| webSites.app.appPool.password | supersecret | Production |
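A minimal sketch of that override step (not our actual Get-EnvironmentSettings.ps1) could translate each Octopus variable name into an XPath and overwrite the matching node:
# Sketch only: treat each Octopus variable name, such as
# "webSites.app.appPool.password", as a dot-notation path into the environment
# XML and overwrite the matching node's value. XPath is case sensitive, which
# matches the behavior described above. $OctopusParameters is provided by Octopus.
[xml]$config = Get-Content ".\environments\$environment.xml"
foreach ($name in $OctopusParameters.Keys) {
    $xpath = "/environmentSettings/" + ($name -replace '\.', '/')
    $node = $config.SelectSingleNode($xpath)
    if ($node -ne $null) { $node.InnerText = $OctopusParameters[$name] }
}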
So now a typical deployment script simply imports the modules, grabs the environment settings, updates the config, and installs the web app.
# Top of the script, get Octopus environment and version
param(
    [string] $version = $OctopusPackageVersion,
    [string] $environment = $OctopusEnvironmentName
)
# Make sure a failed deployment actually fails
$ErrorActionPreference = "Stop"
# Import the modules
$currentDir = Split-Path $script:MyInvocation.MyCommand.Path
$moduleDir = "$currentDir\modules"
Import-Module BuildDeployModules
# Grab the environment settings
$environmentSettings = Get-EnvironmentSettings $environment "//environmentSettings"
$databaseSettings = $environmentSettings.serverDatabase
$websiteSettings = $environmentSettings.webSites.app
# update the config
Update-XmlConfigValues $currentDir\website\Web.config "//appSettings/add[#key='databaseName']" $($databaseSettings.name) "value"
Update-XmlConfigValues $currentDir\website\Web.config "//connectionStrings/add[#name='databaseConnection']" $($databaseSettings.connectionString) "connectionString"
Update-XmlConfigValues $currentDir\website\Web.config "//connectionStrings/add[#name='databaseConnection']" $($databaseSettings.providerName) "providerName"
# Install the web application
Install-WebApplication $environment $websiteSettings $version "anonymousAuthentication"
In doing all of this, the web application is installed into IIS with a specific application pool, and appropriate config transforms without relying on any unknowns.
Our nupkg structure looks something like this
appname.1.2.3.4.nupkg
  environments
    dev.xml
    staging.xml
    qual.xml
    production.xml
  modules
    [all of our build modules]
  website
    [all of our website files]
This is super repeatable, easy to maintain, and easy to edit config. Hope it helps

From an MSI, how to get the list of files packed in each feature?

We have used WiX to create an MSI. Each MSI will have 1, 2, or 3 features, such as an Appserver feature, a Webserver feature, and a DB server feature.
Now I have been asked to get the list of config files present in each feature.
It is tough to find the list of web.config files associated with each feature through the .wxs file.
Is it possible to find the list of files associated with a feature with a particular search pattern?
For example: find all the web.config files packed in the Appserver feature.
Is there an easy way (querying or some other automated script, such as PowerShell) to get the list?
WiX comes with a .NET SDK referred to as DTF ("deployment tools foundation"). It wraps the Windows msi.dll among other things. You can find these .NET Microsoft.Deployment.*.dll assemblies in the SDK subdirectory of the WiX Toolset installation directory. The documentation is in dtf.chm and dtfapi.chm in the doc subdirectory.
As shown in the documentation, you can use this SDK to write code which queries the MSI database with SQL. You will be interested in the Feature, FeatureComponents and File tables.
If you haven't explored the internals of an MSI before, you can open it with Orca to get a feel for it.
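As a sketch of that query from PowerShell using DTF (the assembly path, MSI path, and feature name are assumptions to adjust for your setup):
# Load the DTF assembly from the WiX SDK; the path varies with your WiX version.
Add-Type -Path "C:\Program Files (x86)\WiX Toolset v3.11\SDK\Microsoft.Deployment.WindowsInstaller.dll"
$msiDb = New-Object Microsoft.Deployment.WindowsInstaller.Database("C:\path\to\installer.msi")
# Join Feature -> FeatureComponents -> Component -> File for one feature.
$sql = "SELECT File.FileName FROM Feature, FeatureComponents, Component, File " +
       "WHERE Feature.Feature = FeatureComponents.Feature_ " +
       "AND FeatureComponents.Component_ = Component.Component " +
       "AND Component.Component = File.Component_ " +
       "AND Feature.Feature = 'AppserverFeature'"
$view = $msiDb.OpenView($sql)
$view.Execute()
while (($record = $view.Fetch()) -ne $null) {
    # FileName may hold "SHORTN~1|long name.ext"; keep the long name and
    # filter for web.config files.
    $name = ($record.GetString(1) -split '\|')[-1]
    if ($name -eq 'web.config') { $name }
}
$view.Close()
$msiDb.Close()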
You can do it by making slight modifications to the Get-MsiProperties function described in this PowerShell article.
Please read the original article and create the prescribed comObject.types.ps1xml file.
function global:Get-MsiFeatures {
    PARAM (
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true,HelpMessage="MSI Database Filename",ValueFromPipeline=$true)]
        [Alias("Filename","Path","Database","Msi")]
        $msiDbName
    )
    # A quick check to see if the file exists
    if(!(Test-Path $msiDbName)){
        throw "Could not find " + $msiDbName
    }
    # Create an empty hashtable to store the features in
    $msiFeatures = @{}
    # Create the WindowsInstaller COM object and load the MSI database
    $wiObject = New-Object -com WindowsInstaller.Installer
    $wiDatabase = $wiObject.InvokeMethod("OpenDatabase", (Resolve-Path $msiDbName).Path, 0)
    # Open a view over the Feature table
    $view = $wiDatabase.InvokeMethod("OpenView", "SELECT * FROM Feature")
    $view.InvokeMethod("Execute")
    # Loop through the table
    $r = $view.InvokeMethod("Fetch")
    while($r -ne $null) {
        # Add the feature and value to the hash table
        $msiFeatures[$r.InvokeParamProperty("StringData",1)] = $r.InvokeParamProperty("StringData",2)
        # Fetch the next row
        $r = $view.InvokeMethod("Fetch")
    }
    $view.InvokeMethod("Close")
    # Return the hash table
    return $msiFeatures
}
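Usage is then simply (the MSI path is an example):
# Returns a hashtable keyed by feature name:
Get-MsiFeatures -msiDbName "C:\path\to\installer.msi"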

Use AppCmd to LIST CONFIG in APPHOST only

I have a requirement to use PowerShell to configure IIS 7.5 on web applications that have not yet had code deployed to the file system (possibly none at all; possibly old/broken web.configs exist). I would like to be able to do this all at the APPHOST level. (See the note at the bottom about using PowerShell rather than AppCmd.)
I can SET all the values properly. However, being somewhat diligent, I also like to validate that the values were set properly by retrieving them after setting.
Here's the scenario:
I can set this value using AppCmd so the setting is applied at the APPHOST level using the /Commit:APPHOST flag. However, I haven't found a way to READ the values exclusively at the APPHOST level.
Setting the value succeeds:
C:\Windows\System32\inetsrv\appcmd.exe set config "webSiteName/webAppName" -section:system.webServer/security/authentication/anonymousAuthentication /enabled:"True" /commit:apphost
However, I can't find a way to read the values using AppCmd (or PowerShell).
Running the following AppCmd returns an error due to the broken pre-existing web.config in the folder (the specific error is unimportant; the point is that it is reading the WebApp's web.config instead of the ApplicationHost.config/APPHOST):
C:\Windows\System32\inetsrv\appcmd.exe list config "MACHINE/WEBROOT/APPHOST/webSiteName/webAppName" -section:system.webServer/security/authentication/anonymousAuthentication
ERROR ( message:Configuration error
Filename: \\?\c:\inetpub\wwwroot\webSiteName\webAppName\web.config
Line Number: 254
Description: The configuration section 'system.runtime.caching' cannot be read because it is missing a section declaration
. )
Note: I would prefer to do this all in PowerShell instead of using AppCmd, so if anyone has the syntax for modifying the APPHOST settings for the anonymousAuthentication section of a WebApplication that lives under a Website, from inside PowerShell (Get-WebConfiguration seems to only use the WebApp's web.config), that would be totally awesome and much appreciated!
Here's how to do this in PowerShell:
[Reflection.Assembly]::Load("Microsoft.Web.Administration, Version=7.0.0.0, Culture=Neutral, PublicKeyToken=31bf3856ad364e35") > $null
$serverManager = New-Object Microsoft.Web.Administration.ServerManager
$config = $serverManager.GetApplicationHostConfiguration()
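# Note: "simpleasp.net" below is the site (or "site/application") path to read; substitute your own, e.g. "webSiteName/webAppName".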
$anonymousAuthenticationSection = $config.GetSection("system.webServer/security/authentication/anonymousAuthentication", "simpleasp.net")
Write-Host "Current value: " $anonymousAuthenticationSection["enabled"]
# Now set new value
$anonymousAuthenticationSection["enabled"] = $true
$serverManager.CommitChanges()
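If you would rather stay in the WebAdministration module, an apphost-only read should also be possible by pinning the PSPath to the apphost level and passing the site/app as a location (a sketch; verify against your IIS version):
Import-Module WebAdministration
# Read the setting from applicationHost.config only, ignoring the app's local web.config:
Get-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Location 'webSiteName/webAppName' -Filter 'system.webServer/security/authentication/anonymousAuthentication' -Name 'enabled'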