Amending Web Config for Test Fixtures - asp.net-mvc-2

I'm using CassiniDevLib to host an MVC app for integration testing.
In order to do that I need to amend some config settings on the web server so they match the integration testing environment, the first being the connection string so it points to the test database.
I know I can keep two copies of the web.config file and rename them, but I was wondering if there is a more elegant way, i.e. a way to amend the settings in code as part of the test fixture setup. The challenge is that I need to access the web server process from my test fixture process.
Would appreciate any thoughts on this.

I assume that you are using Visual Studio 2010. It has a feature called Config Transforms: basically, you can have a separate config file for each build configuration. You can also define your own custom build configuration by going to Configuration Manager and adding a new one.
http://blogs.msdn.com/b/webdevtools/archive/2009/05/04/web-deployment-web-config-transformation.aspx
You can search the internet for "Config Transforms" if you need more examples.
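For example, a transform file for a hypothetical "Test" build configuration (Web.Test.config) might swap the connection string like this (the connection string name and values are illustrative, not from your project):

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="AppDb"
         connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=AppDb_Test;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>

One caveat: by default the transform is only applied when you package or publish the project, not on a plain local build, so for a CassiniDev-hosted test run you may need to package first or apply the transform yourself as part of the test setup.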

Related

SSIS Clear encrypted password when unchecking Sensitive

Tools used
Visual Studio 2017 (with SSIS)
SQL Server Management Studio 17.9.1
Involved in the process
Two SSIS developers, plus SSMS with an Integration Services Catalog that stores the deployed projects.
Overview
I have a solution with SSIS projects inside. Each project has project parameters specifying, for each database connection, two different params: a connection string and a password. The password is marked sensitive.
The project and all its packages have ProtectionLevel set to EncryptAllWithPassword. The project gets pushed to a git repository and another developer downloads the changes. Now he needs to provide the password in order to be able to work with the project (or multiple projects within the solution). So far so good: we have a "master password" at project level which protects access to parameters such as sensitive passwords. When a developer goes to Project.params and unticks the sensitive mark, the password is shown. That is still fine, since he needed to know the project password first in order to see the passwords.
Here's the tricky part
When the project is deployed to the Integration Services Catalog, the ProtectionLevel is changed, and a project exported from Management Studio is no longer password protected. To export such a project one obviously needs the ssis_admin permission, but that's out of scope for this issue. Once the project has been deployed and then imported back from SSMS into SSIS, a developer can open it without a password and untick the sensitive mark on the Project.params passwords. All passwords are now visible to him. This is wrong.
What am I trying to achieve
I want to mimic the behaviour sensitive values have in SSMS: whenever you untick the sensitive mark on an environment variable, the value is cleared.
However, when I do the same in SSIS Project.params (untick the sensitive mark), the value is still shown, so I can see all the passwords.
I'd like the value to still be stored, but without its plain text being visible.
Is this possible at all? Or maybe there's a better way to organise this? I need to be able to execute packages from within SQL Server Agent (SSMS), providing environment variables, as well as from my own computer in SSIS, which is why I need to store these passwords rather than re-enter them every time.
This problem that you described is a real issue for any team working collaboratively on SSIS. I'll describe the pattern that I've used to solve this, which might be helpful. First, I should state that I don't like storing passwords in source control, even if they're encrypted. Here is what I typically do:
Set all SSIS packages and projects to DontSaveSensitive. This removes all passwords from the files and closes the source control loophole.
When possible, all the developers should have a local setup of the ETL ecosystem - SQL databases (no data or just test data), file system, etc. All packages should be configured to work against this local environment. That way you can be an admin, connect with Windows Authentication and have full control over the test data. It also helps you avoid interfering with anyone else's development and testing.
For a SQL connection, create parameters for the connection string and password. The connection string can point to your local instance and use Windows Authentication. The password can be blank and marked as sensitive. If everyone sets up their local system the same way, then nothing needs to change when another developer opens the project and begins work on it.
For deployment, environments can be configured for each server. The password can optionally be used for SQL authentication, in which case the connection string would change to include the user name property instead of Windows Authentication.
The above pattern makes it really easy to develop as a team and pretty straightforward to automate deployment.
I would propose using the SSIS project catalog and project environments together, with the following approach.
Think of SSIS packages as programs or runners, and of databases as resources. Packages are thus independent of resources, and the resources are configured at deployment time for each specific environment.
In practice, this leads to the following configuration and activities:
Packages are created and developed using the SSIS project deployment model. All connection managers are declared at project level.
Do not save passwords in Packages or Projects.
Each environment the project is deployed to has an environment variable configuration where we store the database configuration, namely:
Connection string, which can be copied and pasted from the original package
DB name
Server name
User name if Windows Auth is not used
Password if Windows Auth is not used
After deploying the project, you have to map all project connection parameters to environment variables. We created a simple C# program for that.
The environment variable values are then used as the corresponding connection parameter values. Moreover, you can store other configuration params there, not only connections.
You can have several sets of params, and choose which set to use when starting the package.
Automated testing is done with scripted execution, and the environment is specified in the testing script.
So, every environment we deploy the project to has a configuration environment with all the connection data. Connectivity params in QA environments are supplied by the environment engineer; the developer does not need to worry about that.
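For illustration, the environment and the mapping can also be scripted against the SSIS catalog in T-SQL. This is only a sketch; the folder, project, environment and parameter names below are made up:

-- create an environment and a sensitive variable holding the password
EXEC SSISDB.catalog.create_environment
    @folder_name = N'ETL', @environment_name = N'QA';
EXEC SSISDB.catalog.create_environment_variable
    @folder_name = N'ETL', @environment_name = N'QA',
    @variable_name = N'DbPassword', @data_type = N'String',
    @sensitive = 1, @value = N'secret';

-- let the project reference the environment
DECLARE @ref_id BIGINT;
EXEC SSISDB.catalog.create_environment_reference
    @folder_name = N'ETL', @project_name = N'MyProject',
    @environment_name = N'QA', @reference_type = 'R',
    @reference_id = @ref_id OUTPUT;

-- map the sensitive project parameter to the environment variable ('R' = referenced value)
EXEC SSISDB.catalog.set_object_parameter_value
    @object_type = 20, @folder_name = N'ETL', @project_name = N'MyProject',
    @parameter_name = N'DbPassword', @parameter_value = N'DbPassword',
    @value_type = 'R';

Because the variable is created with @sensitive = 1, the catalog stores it encrypted and SSMS clears it when the sensitive flag is unticked, which is the behaviour the question describes on the server side.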

How to keep separate dev, test, and prod databases in Play! 2 Framework?

In particular, for test-cases, I want to keep the test database separate so that the test cases don't interfere with development or production databases.
What are some good practices for separating development, test and production environments?
EDIT1: Some context
In Ruby on Rails, there are by convention different configuration files for different environments. Does Play! 2 also support that?
Or do I have to create the configuration files myself and then write some glue code that selects the appropriate one?
At the moment, if I run sbt test it uses the development database (configured as "default" in conf/application.conf). However, I would like Play! 2 to use a different test database.
EDIT2: On commands that play provides
For the Play! 2 framework, I observed this:
$ help play
Welcome to Play 2.2.2!
These commands are available:
-----------------------------
...OUTPUT SKIPPED...
run <port> Run the current application in DEV mode.
test Run Junit tests and/or Specs from the command line
start <port> Start the current application in another JVM in PROD mode.
...OUTPUT SKIPPED...
There are three well-defined commands for the "test", "development" and "production" instances:
test: This runs the test cases, so it should automatically select the test configuration.
run <port>: This runs the development instance on the specified port, so it should automatically select the development configuration.
start <port>: This runs the production instance on the specified port, so it should automatically select the production configuration.
However, all these commands select the values that are provided in conf/application.conf. I feel there is some gap to be filled here.
Please do correct me if I am wrong.
EDIT3: Best approach is using Global.scala
Described here: How to manage application.conf in several environments with play 2.0?
Good practice is to keep separate instances of the application in separate folders and sync them, e.g. via a git repo.
If you want to keep a single instance, you can use an alternative configuration file for each environment.
In your application.conf file there is an entry (or entries) for your database, e.g. db.default.url="jdbc:mysql://127.0.0.1:3306/devdb" (the quotes matter, since // would otherwise start a comment in the config syntax).
The conf file can read environment variables using the ${?ENV_VAR_NAME} syntax, so change that to something like db.default.url=${?DB_URL} and use environment variables.
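For example (the key names and values below are just placeholders), conf/application.conf could contain:

db.default.driver=org.h2.Driver
db.default.url="jdbc:mysql://127.0.0.1:3306/devdb"
db.default.url=${?DB_URL}

Because ${?DB_URL} is an optional substitution, the second db.default.url line only takes effect when the variable is set, so the development default above it still applies out of the box. A test run can then supply its own value, e.g.:

DB_URL="jdbc:h2:mem:testdb" sbt test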
A simpler way to get this done and to manage your configuration is via GlobalSettings. It has a method you can override called "onLoadConfig"; check its API documentation for details.
Basically, in your project's conf/ folder, you set up something similar to the below:
conf/application.conf --> configurations common for all environment
conf/dev/application.conf --> configurations for development environment
conf/test/application.conf --> configurations for testing environment
conf/prod/application.conf --> configurations for production environment
With this, your application knows which configuration to load for your specific environment mode. For a code snippet of my implementation of onLoadConfig, see my blog post.
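As a rough, untested sketch (assuming the Play 2.2-era Scala API and the conf/dev, conf/test, conf/prod layout above), the override could look like this:

import java.io.File
import com.typesafe.config.ConfigFactory
import play.api._

object Global extends GlobalSettings {
  override def onLoadConfig(config: Configuration, path: File,
                            classloader: ClassLoader, mode: Mode.Mode): Configuration = {
    // mode is Dev, Test or Prod; map it to conf/dev, conf/test or conf/prod
    val modeFolder = mode.toString.toLowerCase
    val modeSpecific = Configuration(ConfigFactory.parseResources(classloader, modeFolder + "/application.conf"))
    // values from the mode-specific file override the common conf/application.conf
    super.onLoadConfig(config ++ modeSpecific, path, classloader, mode)
  }
}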
I hope this is helpful.

When precompiling ASP.NET MVC 4 project in Visual Studio 2012, does it try to resolve the entire config chain locally?

Can someone please confirm or deny my assumption below and/or offer any alternatives?
My Goal
I'd like to be able to precompile and merge my ASP.NET MVC 4 application (as documented here and here) when one-click publishing to our production environment.
The Symptom
I've got an ASP.NET MVC 4 project in Visual Studio 2012. My Web.config contains various entries that are removed in the Web.Release.config transformation. One of the removed entries is the entire configSections element because we maintain those entries in the production server's Machine.config.
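(For reference, the removal in Web.Release.config is an ordinary xdt Remove, roughly like the following:)

<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <configSections xdt:Transform="Remove" />
</configuration>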
However, when I configure my project to precompile and merge for release publication, I get the following error:
Unrecognized configuration section [our custom section name]
My Assumption
What I assume is happening is that it is precompiling everything locally before publishing to the production server (which makes perfect sense) but that part of that precompilation process is to resolve and validate the entire .config chain, from the project's Web.config up to my local Machine.config (which doesn't make much sense, practically). And since my local Machine.config does not declare configSections (or any of the other settings we rely on the production server's Machine.config for), the resolved Web.config doesn't validate.
And since the Web.config doesn't validate, the site can't be precompiled and so nothing is ever published to production.
The Rub
If that is indeed what's happening, then we won't be able to precompile, because the only solution I can think of (other than some potential configuration option I haven't been able to find) would be for all of our developers to have local copies of our production server's Machine.config on their machines. And that simply isn't reasonable because it defeats the whole purpose of having those common settings declared in a single location.

Automated deployment of Check Script for Nagios

We currently use Ant to automate our deployment process. One of the tasks that needs carrying out when setting up a new service is to implement monitoring for it.
This involves adding the service to one of the host files in the Nagios configuration directory.
Has anyone attempted to automate all of this? It seems that the Nagios configuration is laid out so that the files are split up per host, as opposed to per application.
For example:
localhost.cfg
This may cause an issue for an automated solution, because I'm setting up the monitoring at the same time as I'm deploying the application to the environment (i.e. the host). It's like a jigsaw puzzle where two pieces don't quite fit together. Any suggestions?
OK, you could say that you really only need to set up the monitoring once, but I want the developers to have the power to update the check script when the testing criteria change, without too much involvement from Operations.
Anyone have any comments on this?
Kind Regards,
Steve
The splitting of Nagios configuration files is optional: you can have it all in one file if you want, or split it up into several files as you see fit. The cfg_dir configuration statement can be used to have Nagios pick up any .cfg files found in a directory.
When configuration files have changed, you'll have to reload the configuration in Nagios. This can be done via the external commands pipe.
Nagios provides a configuration validation tool, so that you can verify that your new configuration is ok before loading it into the live environment.
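A rough sketch of how those pieces fit together (all paths and names here are illustrative, not from any particular install):

# nagios.cfg: pick up every .cfg file your Ant deployment drops into this directory
cfg_dir=/usr/local/nagios/etc/services

# services/myapp.cfg: a service definition deployed alongside the application
define service {
    use                   generic-service
    host_name             apphost01
    service_description   MyApp health check
    check_command         check_http!-u /health
}

# validate the new configuration, then tell the running Nagios to reload
/usr/local/nagios/bin/nagios -v /usr/local/nagios/etc/nagios.cfg
echo "[$(date +%s)] RESTART_PROGRAM" > /usr/local/nagios/var/rw/nagios.cmd

The Ant task would then only need to template the service definition, write the file into that directory, and run the last two commands.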

How do you deploy a website and database project using TFS 2010?

I've been trying to figure this out and so far haven't found a simple solution. Is it really that hard to deploy a database project (and a web site) using TFS 2010 as part of the build process?
I've found one example that involved lots of complicated checks and editing the workflow (which is a giant workflow btw).
I've even purchased the book "professional application lifecycle management with VS 2010", but apparently professionals don't deploy their applications since it isn't even mentioned in the book.
I know I'm clueless when it comes to TFS, but it seems like there should be an easy way to do this. Is there?
I can't speak for the database portion, but I just went through this for the web portion. The magic part is a not-very-well-documented component, namely the MSBuild parameters.
In your build definition:
Process on the Left
Required > Items to Build > Configurations to Build
Edit, add a new one, for this example
Configuration: Dev (I cover how to create a configuration below)
Platform: Any CPU
Advanced > MSBuild Process
Use the following arguments (at least for me, your publish method may vary).
MsBuild Params:
/p:MSDeployServiceURL="http://myserver"
/p:MSDeployPublishMethod=RemoteAgent
/p:DeployOnBuild=True
/p:DeployTarget=MsDeployPublish
/p:CreatePackageOnPublish=True
/p:username=aduser
/p:password=adpassword
Requirements:
You need to install the MS Deploy Remote Agent Service on the destination web server; MSDeploy needs to be on the build/deployer server as well, but that should be the case by default.
The account you use in the params above needs admin access, at least to IIS...I'm not sure what the minimum permission requirements are.
You configure which WebSite/Virtual Directory the site goes to in the Web project you're deploying. Personally I have a build configuration for each environment, this makes the builds very easy to handle and organize. For example we have Release, Debug and Dev (there are more but for this example that's it). Only the Web project has a Dev configuration.
To do this, right-click the solution, choose Configuration Manager..., then on the web project click the configuration dropdown and click New.... Give it a name ("Dev" for this example) and copy settings from Debug or Release, whichever matches your deployment server environment most closely. Make sure "Create new solution configurations" is checked (it is by default). After creating this, change the configuration dropdown for the solution to the new Dev one and Any CPU, and make sure your projects are all correct (I had some flipping to x86 and x64 randomly; not sure of the exact cause of that).
In your web project, right-click and choose Properties. On the left, click Package/Publish Web (you'll also want to look at the Package/Publish SQL tab, but I can't speak to that). In the options on the right, check Create deployment package as a zip file. The default location is fine; the next textbox is one I didn't find documented anywhere. The format is WebSite/Virtual Directory, so if you have a site called "BuildSite" in IIS with no virtual directory (app == site root), you would put only BuildSite in this box. If it was in a virtual directory, you might have Default Web Site/BuildVirtualDirectory.
After you set all that, make sure to check-in the solution and web project so the build server has the configuration changes you made, then kick off a build :)
If you have more questions, I recommend you watch this video by Vishal Joshi, specifically around 22 and 59 minutes in; he covers the database portion as well, but I have no actual experience trying it since we're on a non-MSSQL database.