Using different Datasources for different configurations in SSRS - deployment

I'm wondering if it's possible for SSRS reports to use different datasources depending on what configuration is being used?
So, for example, if I'm using the 'DebugLocal' configuration, the reports would use a different database than if I'm using the 'Production' configuration.
Thanks
Bryan

In Report Designer, the simplest way to do this would be to edit the shared datasource immediately prior to running the report.
In Report Manager, you can achieve this by having the same datasource pointing to different databases in different folders (e.g. one folder for DebugLocal, another for Production), and deploying and running the report in the appropriate folder.
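If you script your deployments, that second approach can be automated through the ReportingService2010 web service: create the same-named shared datasource in each folder with a different connection string. A rough C# sketch against a generated proxy (the folder, server, and database names are made up):
// Sketch only: "rs" is a proxy generated from the ReportService2010.asmx
// endpoint; all names below are illustrative.
static void DeployDataSource(ReportingService2010 rs, string folder, string database)
{
    var definition = new DataSourceDefinition
    {
        Extension = "SQL",
        ConnectString = "Data Source=myServer;Initial Catalog=" + database,
        CredentialRetrieval = CredentialRetrievalEnum.Integrated,
        Enabled = true,
        EnabledSpecified = true,
    };
    // Same datasource name in every folder, pointing at a different database.
    rs.CreateDataSource("ReportDb", folder, true, definition, null);
}
// e.g. DeployDataSource(rs, "/DebugLocal", "AppDb_Dev");
//      DeployDataSource(rs, "/Production", "AppDb");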

SSIS Clear encrypted password when unchecking Sensitive

Tools used
Visual Studio 2017 (with SSIS)
SQL Server Management Studio 17.9.1
Involved in the process
Two SSIS developers, plus SSMS with the Integration Services Catalog, which stores the deployed projects.
Overview
I have a solution containing SSIS projects. Each project has project parameters specifying two params for each database connection: a connection string and a password. The password is marked sensitive.
The project and all its packages have ProtectionLevel set to EncryptAllWithPassword. The project gets pushed to a git repository and another developer downloads the changes. Now he needs to provide the password in order to work with the project (or multiple projects within the solution). So far so good: we have a "master password" at project level which protects access to parameters such as sensitive passwords. When a developer goes to Project.params and unticks the Sensitive mark, the password is shown. That is still fine, since he had to know the project password first in order to see the passwords.
Here's the tricky part
When the project is deployed to the Integration Services Catalog, the ProtectionLevel is changed, and the project that can be exported from Management Studio is no longer password protected. To export such a project one obviously needs the ssis_admin permission, but that's out of scope for this issue. Once the project has been deployed and then imported back from SSMS into SSIS, a developer can open it without a password and untick the Sensitive mark on the Project.params passwords. All the passwords are now visible to him. This is wrong.
What am I trying to achieve
I want to mimic the behaviour sensitive values have in SSMS: whenever you untick the Sensitive mark on an environment variable, the value is cleared.
However, when I do the same in SSIS Project.params (untick the Sensitive mark), the value is still shown, so I can see all the passwords.
I'd like the value to stay stored as it is, but without its plain-text value being visible.
Is it possible at all? Or maybe there's a better way to organise this? I need to be able to execute packages from SQL Server Agent (in SSMS), supplying environment variables, as well as from my own computer in SSIS, which is why I need to store these passwords rather than re-enter them every time.
The problem you describe is a real issue for any team working collaboratively on SSIS. I'll describe the pattern I've used to solve it, which might be helpful. First, I should state that I don't like storing passwords in source control, even if they're encrypted. Here is what I typically do:
Set all SSIS packages and projects to Don't Save Sensitive. This removes all passwords from the files and closes the source-control loophole.
When possible, every developer should have a local setup of the ETL ecosystem: SQL databases (no data or just test data), file system, etc. All packages should be configured to work against this local environment. That way you can be an admin, connect with Windows authentication, and have full control over the test data. It also keeps you from interfering with anyone else's development and testing.
For a SQL connection, set parameters for the connection string and password. The connection string can point to your local instance and use Windows auth. The password can be blank and checked as sensitive. If everyone sets up their local system the same way, then nothing needs to change when another developer opens the project and begins work.
For deployment, environments can be configured for each server. The password can optionally be used for SQL authentication, and the connection string would change to include the user name property rather than Windows auth.
The above pattern makes it really easy to develop as a team and fairly straightforward to automate deployment.
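To make the deployment side concrete, here is a rough sketch of creating such an environment with a sensitive password variable through the SSIS management API (Microsoft.SqlServer.Management.IntegrationServices); the server, folder, and variable names are made up:
using System;
using System.Data.SqlClient;
using Microsoft.SqlServer.Management.IntegrationServices;

// Connect to the SSIS catalog and pick the target folder (names assumed).
var connection = new SqlConnection("Data Source=myServer;Integrated Security=SSPI;");
var catalog = new IntegrationServices(connection).Catalogs["SSISDB"];
CatalogFolder folder = catalog.Folders["ETL"];

// Create the environment and its variables. Sensitive values are
// encrypted by SSISDB, and SSMS clears them when the flag is unticked.
var environment = new EnvironmentInfo(folder, "Production", "Prod connection data");
environment.Create();
environment.Variables.Add("DbConnectionString", TypeCode.String,
    "Data Source=prodServer;Initial Catalog=AppDb;", false, "connection string");
environment.Variables.Add("DbPassword", TypeCode.String, "s3cret", true, "SQL auth password");
environment.Alter();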
I would propose using the SSIS Project Catalog and Project Environments together, with the following approach.
Think of SSIS packages as programs or runners, and of databases as resources. Packages are thus independent of resources, and resources are configured at package setup time in each specific environment.
In practice, this leads to the following configuration and activities:
Packages are created and developed in SSIS Project Mode. All connection managers are declared at project level.
Do not save passwords in Packages or Projects.
Each environment the project is deployed to has a defined environment variable configuration where we store the database configuration, namely:
Connection strings, which can be cut and pasted from the original package
DB name
Server name
User name if Windows Auth is not used
Password if Windows Auth is not used
After deploying a project, one has to map all project connection params to environment variables. We created a simple C# program for that; a sketch of the idea follows this list.
Values from environment variables are used as the corresponding connection param values. Moreover, you can store other configuration params there, not only connections.
You can have several sets of params in the same environment, and choose a set when starting a package.
Automated testing is done with scripted execution, and the environment is specified in the testing script.
So, every environment we deploy a project to has a configuration environment with all the connection data. Connectivity params in QA environments are supplied by the environment engineer; the developer does not need to worry about them.
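The mapping program mentioned above boils down to something like this sketch (same Microsoft.SqlServer.Management.IntegrationServices API; all names are illustrative):
using System.Data.SqlClient;
using Microsoft.SqlServer.Management.IntegrationServices;

var catalog = new IntegrationServices(
    new SqlConnection("Data Source=myServer;Integrated Security=SSPI;")).Catalogs["SSISDB"];
CatalogFolder folder = catalog.Folders["ETL"];
ProjectInfo project = folder.Projects["MyEtlProject"];

// Reference the environment that holds the connection data.
project.References.Add("Production", folder.Name);
project.Alter();

// Bind each project parameter to the environment variable of the same name
// (this sketch assumes the variables were named after the parameters).
foreach (var parameter in project.Parameters)
{
    parameter.Set(ParameterInfo.ParameterValueType.Referenced, parameter.Name);
}
project.Alter();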

SSRS 2008 R2 Correct Generation of .rsds for SharePoint

I've put together a PowerShell script to deploy some reports and their corresponding datasets and datasources, as well as link the embedded dataset references to the shared datasets, but I'm getting stuck at specifying the shared datasource for the shared dataset.
Initially I had to rename the .rds to .rsds for it to show up as a selectable datasource in the SharePoint UI. However, when I set the DataSource for the DataSet, either programmatically or manually via the UI, I get an error saying the schema is wrong. So I ran Build->Deploy from BIDS and downloaded the resulting .rsds to see the difference. It turns out the locally built version looks like this:
<?xml....?>
<RptDataSource...>
  <ConnectionProperties>
    <Extension>SHAREPOINTLIST</Extension>
    <ConnectionString>...my sharepoint site url...</ConnectionString>
    <IntegratedSecurity>true</IntegratedSecurity>
  </ConnectionProperties>
  <DataSourceID>...some guid...</DataSourceID>
</RptDataSource>
whereas BIDS generates this when deploying to SharePoint via Build->Deploy:
<?xml....?>
<DataSourceDefinition>
  <Extension>SHAREPOINTLIST</Extension>
  <ConnectionString>...my sharepoint site url...</ConnectionString>
  <CredentialRetrieval>Integrated</CredentialRetrieval>
  <Enabled>True</Enabled>
</DataSourceDefinition>
So, is there a built-in way (either in BIDS or an existing PowerShell module/script) to get this generated when building locally rather than running a Deploy? Or am I going to have to run some XSLT to transform it (or just copy an existing source file and replace the connection string, as it's the only thing that matters) and rename it as a post-build step?
Roighto! I found that there's a way to create a datasource via the ReportService2010.asmx service, so I'm using that and ignoring the .rds written when building the project in BIDS.
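For anyone after the same thing, the call looks roughly like this in C# (a sketch against a generated ReportingService2010 proxy; the URLs and names are placeholders):
// Sketch: create the .rsds server-side instead of transforming the .rds.
// "rs" is a proxy generated from ReportService2010.asmx; in SharePoint
// integrated mode the parent is a document library URL.
var definition = new DataSourceDefinition
{
    Extension = "SHAREPOINTLIST",
    ConnectString = "http://my.sharepoint.site/", // placeholder
    CredentialRetrieval = CredentialRetrievalEnum.Integrated,
    Enabled = true,
    EnabledSpecified = true,
};
rs.CreateDataSource("MyDataSource.rsds", "http://my.sharepoint.site/DataSources", true, definition, null);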

SSRS - Credentials on reports after deploy do not match source rdl file

The project I am working on has an issue where stored credentials in the .rdl files are not reflected on the server we deploy to, and the behavior seems inconsistent. We are NOT using a shared data source, because we have to generate the data source connection dynamically, so each report has its own embedded data source (even though they are all identical) with the credentials stored. I look at the .rdl file in VS and see the credentials are stored, deploy to our test server, look on the test server, and the credentials are set to "Not Required". The other strange thing is that this doesn't happen to all reports all the time; some reports keep their credentials, but not every time. It all seems very random.
Some settings and facts about the project that may be useful:
Using SQL Server 2008 R2
Using Visual Studio 2012 to deploy
ReportProject setting OverwriteDatasets = True
ReportProject setting OverwriteDataSources = True
I have had similar issues: delete the report on the SSRS server first, then deploy it. SSRS does retain certain attributes of a report, and it is very inconsistent about when it does this. This is more of a workaround than an answer, but it works for me.
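If you deploy from a script, that workaround is easy to automate with the ReportingService2010 web service; a sketch (the report path is illustrative):
// Delete the stale catalog item so the fresh deploy carries the embedded
// datasource credentials with it. "rs" is a ReportingService2010 proxy;
// SoapException comes from System.Web.Services.Protocols.
try
{
    rs.DeleteItem("/MyReports/SalesReport");
}
catch (SoapException)
{
    // The item did not exist yet - nothing to delete.
}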

RDL vs RDLC woes when using custom code and subreports in a local report

I'm having a rather complicated problem with SSRS.
First off, we are using local reports. We have to, because the data we are using comes from REST services.
Secondly, we have built a custom library for our reports called ReportFunctions that we want to use.
We are using SQL Server Data Tools to build the .rdl (not to be confused with .rdlc), and then we copy the file into our VS project. We set the build action to Content and Copy If Newer.
This works great for all of our base reports. We can even use the custom library.
Now we want to use some of these base reports as subreports. Let's call the combined report "AllReports". It consists of a "SummaryReport" and a "DetailReport".
Summary and Detail are two RDL files that also need to be loaded independently.
In the code for building AllReports, I have this:
// Register the handler that supplies subreport data, then load each
// subreport definition. The name passed to LoadSubreportDefinition must
// match the subreport's ReportName inside the parent RDL.
reportViewer.LocalReport.SubreportProcessing += LocalReport_SubreportProcessing;
reportViewer.LocalReport.LoadSubreportDefinition("SummaryReport", File.OpenText(Server.MapPath("SummaryReport.rdl")));
reportViewer.LocalReport.LoadSubreportDefinition("DetailReport", File.OpenText(Server.MapPath("DetailReport.rdl")));

private void LocalReport_SubreportProcessing(object sender, SubreportProcessingEventArgs e)
{
    //Set e.DataSources here depending on e.ReportPath
}
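For reference, the handler body ends up shaped something like this (the dataset names and lookup methods here are placeholders, not our real ones):
private void LocalReport_SubreportProcessing(object sender, SubreportProcessingEventArgs e)
{
    // e.ReportPath identifies which subreport is being processed; supply
    // the dataset(s) its RDL expects. GetSummaryRows/GetDetailRows and
    // the dataset names are made-up placeholders.
    if (e.ReportPath.EndsWith("SummaryReport"))
        e.DataSources.Add(new ReportDataSource("SummaryDataSet", GetSummaryRows()));
    else if (e.ReportPath.EndsWith("DetailReport"))
        e.DataSources.Add(new ReportDataSource("DetailDataSet", GetDetailRows()));
}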
When I run SummaryReport or DetailReport, they work fine. When I run AllReports I get an error:
The subreport 'SummarySubreport' could not be found at the specified location C:\path\to\my\project\Reports\SummaryReport.rdlc. Please verify that the subreport has been published and that the name is correct.
For fun, I switched the extension of the two subreports to .rdlc. Then I get a build error when I compile in Visual Studio:
Error while loading code module: ‘ReportFunctions, Version=1.0.0.0, Culture=neutral, PublicKeyToken=d88351e3d4dffe2f’. Details: Could not load file or assembly 'ReportFunctions, Version=1.0.0.0, Culture=neutral, PublicKeyToken=d88351e3d4dffe2f' or one of its dependencies. The system cannot find the file specified.
DetailReport is the one that uses the external library of custom functions.
I cannot place my DLL anywhere on the server except in the bin of the project. I can't stress this enough. I also cannot move to server reports. I have to use local reports.
If I set the build action to None on the RDLC, it works, but when I deploy, my RDLC is not published.
Can I get the RDLC to compile? What am I missing to make my report viewer use an RDL instead of an RDLC for my subreports?
After 11 hours of working on this problem, I discovered the build server was assigning revision numbers to all libraries when they were built. So, locally, ReportFunctions would be given 1.0.0.0 as the version and everything was cool. Then, when it was deployed, the build server would do its thing and the reports would break.
To anyone else having this issue, check the version numbers of the compiled DLLs, especially if you are using a build process.
Also, to get around the compile issue with the RDLC, we switched the build action to None and set up the build server to copy the RDLC files over once the build succeeds.
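One way to prevent the mismatch in the first place, if you control the library's build, is to pin the assembly version so the build server can't stamp a new revision; the RDLC's CodeModules entry references the exact version string it was built against. A minimal sketch for the ReportFunctions AssemblyInfo.cs (assuming your build server honours an explicit version rather than rewriting it):
// AssemblyInfo.cs for ReportFunctions - keep the version fixed so the
// RDLC's CodeModules reference (Version=1.0.0.0) keeps resolving.
using System.Reflection;

[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]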

Amending Web Config for Test Fixtures

I'm using CassiniDevLib to host an MVC app for integration testing.
In order to do that, I need to amend some config settings on the web server so they match the integration testing environment, the first being the connection string, so that it points to the test database.
I know I can have two copies of the web.config file and rename them, but I was wondering if there is a more elegant way, i.e. a way to amend the settings in code as part of the test fixture setup. The challenge is that I need to access the web server process from my test fixture process.
Would appreciate any thoughts on this.
I assume you are using Visual Studio 2010, which has a feature called Config Transforms. Basically, you can have a separate config file for each build environment, including your own custom ones; you can add a new environment by going to Configuration Manager and adding a new configuration.
http://blogs.msdn.com/b/webdevtools/archive/2009/05/04/web-deployment-web-config-transformation.aspx
You can search the internet for Config Transforms if you need more examples.
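Alternatively, if you want to amend the settings in code as part of the fixture setup, something like this sketch works when the fixture can reach the site's web.config on disk (the path and the "Default" connection string name are assumptions):
using System.Configuration;

// Open the hosted site's web.config by path and repoint a connection
// string at the test database before the tests run.
static void PointSiteAtTestDb(string webConfigPath, string testConnectionString)
{
    var map = new ExeConfigurationFileMap { ExeConfigFilename = webConfigPath };
    var config = ConfigurationManager.OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);
    config.ConnectionStrings.ConnectionStrings["Default"].ConnectionString = testConnectionString;
    config.Save(ConfigurationSaveMode.Modified); // rewrites web.config on disk
}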