RedGate SQL Source Control not picking up migration scripts

I have 3 SQL Server databases using RedGate SQL Source Control with SVN. When I create a custom migration script it is picked up by 1 of the 3 databases, but not by the others. Here is the command line that I'm using:
"C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare" /server1:dev03 /db1:Dev_CORE /version1:head /server2:testsql01 /db2:Test_Core /username2:foo /password2:bar /ScriptFile:"C:\MigrationScriptCore.sql" /Force /verbose

Redgate uses SQL Server extended properties to figure out what SVN version the DB is at, as well as where to find the scripts it needs to run to update the DB. The REALLY STUPID thing is that these extended properties are CASE SENSITIVE. So the SVN path in my dev DB was:
SQLSourceControl Migration Scripts Location
<?xml version="1.0" encoding="utf-16" standalone="yes"?>
<ISOCCompareLocation version="1" type="SvnLocation">
<RepositoryUrl>http://svn.company.com/svn/**DIR1**/trunk/Database/Core/MigrationScripts/</RepositoryUrl>
</ISOCCompareLocation>
And in my Test DB it was:
SQLSourceControl Migration Scripts Location
<?xml version="1.0" encoding="utf-16" standalone="yes"?>
<ISOCCompareLocation version="1" type="SvnLocation">
<RepositoryUrl>http://svn.company.com/svn/**dir1**/trunk/Database/Core/MigrationScripts/</RepositoryUrl>
</ISOCCompareLocation>
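If you want to check the stored values yourself, here's a minimal sketch (this is not RedGate tooling; the connection strings just reuse the server and database names from my command line above) that dumps the database-level extended properties on both servers so the two RepositoryUrl values can be diffed:
using System;
using System.Data.SqlClient;

class DumpExtendedProperties
{
    static void Main()
    {
        // Connection strings reuse the servers/databases from the sqlcompare call above.
        var connectionStrings = new[]
        {
            "Server=dev03;Database=Dev_CORE;Integrated Security=true",
            "Server=testsql01;Database=Test_Core;User Id=foo;Password=bar"
        };

        foreach (var cs in connectionStrings)
        {
            using (var conn = new SqlConnection(cs))
            {
                conn.Open();
                // class = 0 restricts the query to database-level extended
                // properties, which is where SQL Source Control keeps its settings.
                var cmd = new SqlCommand(
                    "SELECT name, CAST(value AS nvarchar(max)) " +
                    "FROM sys.extended_properties WHERE class = 0", conn);
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0}\n{1}\n", reader.GetString(0), reader.GetString(1));
                }
            }
        }
    }
}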
The guys at Redgate weren't able to fix my problem right away, but after a lot of trial and error I was able to find it on my own. Hopefully this will save a few people a lot of time!
Cheers.

Related

liquibase integration not interoperable between command line and application

I've been using liquibase for several years and it's extremely helpful for me as an application developer to keep source code and database in sync, so thank you to all contributors for this tool.
During my daily work, I usually start liquibase from the command line in order to test the changesets and database operations. If everything is wired right, I start my application (Spring Boot) and the liquibase setup within the application performs all those sync steps. This setup works perfectly unless my changelog file contains changesets with loadData that populate data from CSV files into the database. Every application start fails with liquibase.exception.ValidationFailedException: Validation Failed:
change sets check sum
The reason seems to be the different file locations for the CSV files referenced by loadData, which are part of the checksum computation. If started from the application, the changeset info looks like this:
classpath:liquibase/changelog.xml: classpath:liquibase/changelog.xml::loadDefaultRolePermissions::dominik
But if started from the command line, there is no way to use classpath resources, so the changeset info looks like this:
liquibase: src/main/resources/liquibase/changelog.xml: src/main/resources/liquibase/changelog.xml::loadDefaultRolePermissions::dominik
The two values differ and lead to different checksums.
If you look into liquibase.integration.commandline.Main.java, there is no classpath resource accessor used:
FileSystemResourceAccessor fsOpener = new FileSystemResourceAccessor();
CommandLineResourceAccessor clOpener = new CommandLineResourceAccessor(classLoader);
CompositeResourceAccessor fileOpener = new CompositeResourceAccessor(fsOpener, clOpener);
from liquibase.integration.commandline.Main.java
Is there any way to make liquibase interoperable between the command line AND application startup?
Thanks in advance
Dominik
RESOLVED by updating from liquibase 3.3.1 to 3.5.3
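Aside from upgrading, I believe the path-dependence itself can be avoided with liquibase's logicalFilePath attribute, which fixes the path that goes into the changeset identity (and thus the checksum) no matter how the file was located. A minimal sketch of a changelog using it:
<?xml version="1.0" encoding="utf-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd"
    logicalFilePath="liquibase/changelog.xml">
    <!-- changesets here are identified by the logicalFilePath above,
         regardless of whether the file was found on the classpath or
         under src/main/resources on the command line -->
</databaseChangeLog>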

Azure Cloud Deploy Keeps Recycling

I get the following error:
Recycling (Waiting for role to start... Sites are being deployed.
[2012-12-17T05:30:10Z])
Running One or more role instance is unhealthy. 1 Instance: 1
Unhealthy
I was actually trying to convert my web application to a cloud application.
Here is what I did:
I added a cloud project to my solution,
I added a web role which linked to my web app,
I created a SQL Azure database and copied my whole structure and also the data to the DB,
I inserted the connection string in my web.config and tried to run it in emulation; this worked fine,
then I tried to deploy it by creating a cloud service, running the builder to create the packages and uploading the packages in "staging" mode. This is where I got the errors.
I tried to create an empty cloud app, add a default web role and load it to the cloud; this worked fine. So I figured maybe I had something wrong in the settings of my web role.
I checked the difference between both and noticed that in both solutions diagnostics was enabled, but the storage account was empty in my own solution, so I inserted "UseDevelopmentStorage=true" here. This didn't change anything though. I also saw a difference in the "packages.config".
The default role had:
<package id="Microsoft.WindowsAzure.ConfigurationManager" version="1.7.0.0" targetFramework="net40" />
<package id="System.Web.Providers" version="1.1" targetFramework="net40" />
<package id="System.Web.Providers.Core" version="1.0" targetFramework="net40" />
<package id="WindowsAzure.Storage" version="1.7.0.0" targetFramework="net40" />
Mine had:
<package id="Microsoft.WindowsAzure.ConfigurationManager" version="1.7.0.0" targetFramework="net35" />
<package id="WindowsAzure.Storage" version="1.7.0.0" targetFramework="net35" />
I tried changing this and uploading; it didn't do anything.
I am not using a worker role, and I have only 1 running instance (same as the default).
My application uses some authentication in Global.asax where it reads User.Identity.Name and compares it with a user in the database (this user is inserted in the SQL Azure DB). At first I thought this might be the cause of the problem, but even if I comment out this code the application will not run on the cloud.
VM size is small, trust level = Full trust.
I also saw some differences in the settings where I had remote access parameters. I tried removing all of these just to rule them out.
I read something about setting references to "Copy Local is true", but I'm not sure whether this will make any difference.
Any ideas? I don't really know what to do anymore.
EDIT:
I modified all the references to "Copy Local is true" and disabled diagnostics just to be sure there's nothing wrong with it.
But now I get this error:
<!-- Web.Config Configuration File -->
<configuration>
<system.web>
<customErrors mode="Off"/>
</system.web>
</configuration>
The funny thing is, in my web.config this is already set... and I can only find one web.config.
I'm not quite sure what I'm doing wrong.
"Keeps recycling" almost always means that an exception occurred when your application started. You might not be able to see any errors or exceptions through the Diagnostics Monitor, since the exception may have occurred before you configured and started diagnostics.
I recommend enabling the IntelliTrace option when deploying. It's very easy to do if you are using Visual Studio. Then you can retrieve the IntelliTrace result through Visual Studio and figure out what exception occurred. I strongly suspect there are some references missing on Azure for which you need to set Copy Local = true, but you need IntelliTrace to find them.
The problem was that I had some referenced projects with an app.config file containing a connection string to a local database.
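For anyone hitting the same wall, the offending entry looked roughly like this (names are illustrative, not my real ones): a connection string in a referenced class library's app.config that resolves on a dev machine but not from an Azure role instance, so the role throws during startup and recycles.
<!-- app.config of a referenced class library (illustrative names) -->
<configuration>
  <connectionStrings>
    <!-- points at a local SQL Server instance that does not exist
         on the Azure VM, so opening it throws at role startup -->
    <add name="MyAppDb"
         connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=MyAppDb;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>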

Could not create the driver from NHibernate.Driver.NpgsqlDriver

I've inherited a C#/NHibernate/MS SQL Server project and am new to NHibernate. One of the first tasks given to me was to migrate the database from MS SQL Server (2008 R2) to PostgreSQL 9.2. I'm using Npgsql 2.0.12 (the .NET 2.0 version). Mono.Security.dll and Npgsql.dll are included in my project references and they exist in my bin directory. When the code executes the following line:
SessionFactory.OpenSession();
an exception is thrown with the message
"Could not create the driver from NHibernate.Driver.NpgsqlDriver."
Searching the web gave me a few ideas, but none have worked. This code I've inherited is in production at several clients with no issues using MS SQL Server. Here is my hibernate.cfg.xml file:
<?xml version="1.0" encoding="utf-8"?>
<hibernate-configuration xmlns="urn:nhibernate-configuration-2.2">
<session-factory>
<property name="connection.provider">NHibernate.Connection.DriverConnectionProvider</property>
<property name="dialect">NHibernate.Dialect.PostgreSQLDialect</property>
<property name="connection.driver_class">NHibernate.Driver.NpgsqlDriver</property>
<property name="connection.connection_string">server=localhost;Port=5432;Database=vehicletracker;User Id=postgres;Password=********;</property>
</session-factory>
</hibernate-configuration>
I did not forget to include "using Npgsql;"; it is there. Any suggestions?
Regards,
B
I found the answer to my own question. There were several issues specific to my work environment but ultimately the Mono.Security.dll and Npgsql.dll were not available in my final output directory. The two files were present in the bin directory of my Data Access Layer (a class library) but not in the bin directory of my Test Project that called the class library. Everything is working fine now.
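A likely reason the DLLs never made it up: NHibernate loads the driver via reflection, so the test project has no compile-time reference to Npgsql and MSBuild does not copy indirect, unreferenced assemblies. The straightforward fix is a direct reference to Npgsql.dll (and Mono.Security.dll) in the test project with Copy Local = true. A known alternative hack, sketched below, is to touch an Npgsql type from code so the dependency becomes explicit (NpgsqlConnection is a real Npgsql type; the anchor class name is my own invention):
using System;

// Hack: referencing any Npgsql type from the test project itself gives
// MSBuild a hard dependency on Npgsql.dll, so it is copied into this
// project's output directory rather than only the class library's bin.
internal static class NpgsqlCopyAnchor
{
    // Never called; compiling the typeof(...) reference is enough.
    private static Type ForceNpgsqlCopy()
    {
        return typeof(Npgsql.NpgsqlConnection);
    }
}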
If you're using NuGet for the NHibernate libraries, try uninstalling the packages and reinstalling them. I'm sure someone knows a more efficient way to fix this kind of issue, but for me it solved this exact problem.
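In the Package Manager Console that's a one-liner, assuming a NuGet version new enough to support the -Reinstall flag:
Update-Package NHibernate -Reinstall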

EF4 Code First pre-generated view has different hash values in other machines

My pre-generated views for EF4 Code First using this T4 template do not work on the build server. I am not re-generating the views on the build server, just compiling and running the MSTests. An error is thrown when the tests are run:
System.Data.MappingException: The mapping and metadata information for
EntityContainer 'DB' no longer matches the information used to create
the pre-generated views
I ran the same template on another machine and the hash values are different. I guess this is the reason why it does not work on the build server: the hash values are different at runtime on other machines, hence the verification fails and throws the exception.
I'm using:
VS 2010 Pro
.Net Framework v4.0.30319
Entity Framework v4.2 (Code First)
EF CodeFirst View Generation T4 Template for C# (v1.0.1) - slightly modified GetEdmxSchemaVersion to return the correct EntityFrameworkVersions version for my setup.
using Class Library project template
The tests that I am running connect to a SQL DB file that is checked in with the code.
I have checked the build server and it's using the same EF dll version and .NET Framework version.
Any idea why the hash values are different?
UPDATE:
I've generated and compared two XML files from two dev machines using EdmxWriter.WriteEdmx().
Here is the schema version (the same on both machines):
<?xml version="1.0" encoding="utf-8"?>
<Edmx Version="2.0" xmlns="http://schemas.microsoft.com/ado/2008/10/edmx">
The obvious difference is the order of some nodes in the XML files. Here is an example:
Machine 1:
<EntityType Name="PersonEntity" BaseType="Self.Entity" />
<EntityType Name="CompanyEntity" BaseType="Self.Entity" />
Machine 2:
<EntityType Name="CompanyEntity" BaseType="Self.Entity" />
<EntityType Name="PersonEntity" BaseType="Self.Entity" />
Any idea why they are in different order?
UPDATE 2:
The Edmx (XML) from the build server is also different from the other 2 dev machines'. Again, the order of some nodes is different.
Machine 1 and the build server both have System.Data.Entity.dll (same File and Product version -- v4.0.30319.1) in %windir%\Microsoft.NET\assembly\GAC_MSIL\System.Data.Entity\v4.0_4.0.0.0__b77a5c561934e089.
UPDATE 3:
I also looked at the version of System.Data.Entity.Design.dll. The T4 template references this assembly. Machine 1 has two copies of this dll ... in the GAC (v4.0.30319.233) and in C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework.NETFramework\v4.0 (v4.0.30319.1). This is also true on the build server and Machine 2. I wonder if the hash validation function at runtime uses this dll, as it's not referenced in my project. If it does, which version is used? But then again, the hash validation succeeds on Machine 1.
I'm answering my own question. Here's how we solved the main problem (how to pre-generate views in EF Code First that are usable on different machines). I am now using .NET runtime 4.0.30319.17929.
In the ABC.csproject where MyContext.cs is located, delete the T4 template and MyContext.Views.cs.
Compile ABC.csproject
Create a console app that will generate the views. I copied most of Pawel's T4 template. Reference the ABC.dll (and other required dlls). Here's one of the changes:
var edmx = GetEdmx(typeof(MyContext));
Save the string output of GenerateViews() to a text file.
Run the console app.
Add a new file named MyContext.Views.cs to ABC.csproject and copy the content of the text file into this class.
Recompile ABC.csproject
I find my solution insane; it should be simplified or automated, but it works. A rough sketch of the console app from step 3 follows.
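This covers only the edmx-dumping part, and it assumes MyContext has a parameterless constructor and that ABC is the namespace as well as the assembly name (my assumption); the view generation itself then follows Pawel's template code:
using System;
using System.Data.Entity.Infrastructure;
using System.Xml;

class GenerateViews
{
    static void Main()
    {
        // Build the context from the referenced ABC.dll so the model is
        // constructed exactly as the application constructs it, then dump
        // the edmx that the view generator consumes.
        using (var context = new ABC.MyContext())
        using (var writer = XmlWriter.Create("MyContext.edmx",
                   new XmlWriterSettings { Indent = true }))
        {
            EdmxWriter.WriteEdmx(context, writer);
        }
        Console.WriteLine("Edmx written.");
    }
}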

Is there a system that I can use to update my projects on my VPS?

Hey guys, so I recently got a VPS, just so I can start gaining experience. I'm looking for a service/program where I can code on my PC and then, when I'm done, run a script or a command to have it updated on my VPS.
I thought I was looking for Git, but apparently git does not do what I'm looking for.
Any suggestions?
Windows or Linux?
On Windows, there's a host of tools.
First of all, you code. Visual Studio is the most common choice. You get a .sln file and a batch of *.*proj files.
When talking about deploying to remote servers, often a continuous integration server is used. We are using TeamCity (http://www.jetbrains.com/teamcity/). Download it locally, install it and create a new project, selecting the "SLN runner". Point it to your .sln file.
When you want the deployment part working, create a small build file such as "MyProj.build" that contains something along the lines of:
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="BuildProject"
         InitialTargets="CheckRequiredProperties"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003"
         ToolsVersion="4.0">
  <Target Name="BuildProject">
    <Message Text="Starting $(Configuration) build. Web site publish location $(OutputWebSite)" />
    <MSBuild Projects="$(SolutionPath)"
             Targets="Build"
             Properties="BuildOutputPath=$(BuildOutputPath);
                         BuildOutputPathBin=$(BuildOutputPathBin);
                         Configuration=$(Configuration);
                         BuildConstants=$(BuildConstants);
                         MSBuildTargets=$(MSBuildTargets);
                         TargetFrameworkVersion=$(TargetFrameworkVersion);
                         TargetFrameworkProfile=$(TargetFrameworkProfile)">
    ...
Where SolutionPath points to your .sln file.
You will then update the TeamCity config to point to MyProj.build instead, using the MsBuild runner.
Then you need a way of having TeamCity upload everything to your server. PowerShell is a nice scripting environment that can run .NET code, but you'd be invoking it through MsBuild...
Something like this
http://community.bartdesmet.net/blogs/bart/archive/2008/02/16/invoking-powershell-scripts-from-msbuild.aspx
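Roughly, the MsBuild end of that hand-off can be a plain Exec task (the target and script names here are placeholders):
<Target Name="Deploy" DependsOnTargets="BuildProject">
  <!-- hand the uploading off to a PowerShell deployment script -->
  <Exec Command="powershell.exe -NonInteractive -ExecutionPolicy Bypass -File Deploy.ps1" />
</Target>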
And then you can script with MsDeploy across to your server:
http://blogs.iis.net/jamescoo/archive/2008/08/21/using-msdeploy-in-powershell.aspx
"rsync" or "scp" tools may be useful