Moving EntityFramework database to Azure, __MigrationHistory table is lost

I'm in the process of migrating our on-premises hosted ASP.NET application into the cloud and I'm running into an issue with the migration of my database.
We're using EF5 for database interaction and we've enabled Code First Migrations to deal with database changes. However, whenever I generate a BACPAC file, the __MigrationHistory table is ignored, and when I use the SSIS tool to migrate data between my local and Azure databases, the values in the identity columns are changed regardless of whether I select 'enable identity insert', which messes up my entire database.
What am I doing wrong here? I've tried various routes and none seem to have the desired effect.
Thanks,
Patrick
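
A quick way to confirm the migration-history damage from code is EF's model-compatibility check. A minimal sketch (EF5), where MyDbContext stands in for your real context class:

using System;

class MigrationHistoryCheck
{
    static void Main()
    {
        using (var ctx = new MyDbContext())
        {
            // Compares the model hash stored in __MigrationHistory with the
            // current code-first model. Note: with throwIfNoMetadata set to
            // false, a completely missing __MigrationHistory table reports as
            // compatible (true); pass true to get an exception in that case.
            bool compatible = ctx.Database.CompatibleWithModel(false);
            Console.WriteLine(compatible
                ? "Migration metadata found and in sync."
                : "__MigrationHistory is out of sync with the model.");
        }
    }
}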

Related

Implement Oracle external-table-like functionality in Azure managed PostgreSQL

Currently we are using Oracle 19c external table functionality on-prem, whereby CSV files are dropped in a specific location on the DB server and automatically become queryable through an Oracle external table; the file location is specified as part of the table DDL.
We have a requirement to migrate to Azure managed PostgreSQL. From the PostgreSQL documentation, functionality similar to Oracle external tables can be achieved in standalone PostgreSQL using "foreign tables" via the file_fdw extension. But in Azure managed PostgreSQL we cannot use this, since we have no access to the DB file system.
One option I came across was Azure Data Factory, but that looks expensive. The expected volume is about 1 million record inserts per day.
Could anyone advise on possible alternatives? One option I was considering is a scheduled shell script on an Azure VM that loads the files into PostgreSQL using psql commands such as \copy. Would that be a good option for this volume?
Regards
Jacob
One last option that could be simple to implement for the migration is EnterpriseDB (EDB), which avoids vendor lock-in and is free of cost.
The video below walks through the migration procedure:
https://www.youtube.com/watch?v=V_AQs8Qelfc
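
On the \copy idea from the question: if the scheduled job ends up as a .NET service rather than a shell script, the same server-side bulk path is available through Npgsql's COPY support. A minimal sketch, with connection string, directory, and table name as placeholders; at roughly 1 million rows per day, either this or psql \copy should handle the volume comfortably:

using System.IO;
using Npgsql;

class CsvLoader
{
    static void Main()
    {
        var connString = "Host=myserver.postgres.database.azure.com;Username=loader;Password=secret;Database=mydb;SslMode=Require";
        using (var conn = new NpgsqlConnection(connString))
        {
            conn.Open();
            foreach (var file in Directory.EnumerateFiles("/data/incoming", "*.csv"))
            {
                // COPY ... FROM STDIN streams the client-side file over the
                // wire, so no access to the managed server's file system is
                // required - the same trick psql's \copy uses.
                using (var writer = conn.BeginTextImport(
                    "COPY staging_table FROM STDIN (FORMAT csv, HEADER true)"))
                {
                    writer.Write(File.ReadAllText(file));
                }
            }
        }
    }
}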

IdentityServer4 entity framework SQL Server connection string

I am trying to follow the quickstart to set up SQL Server (not the LocalDB version of SQL Server that comes with Visual Studio) as my data store. It looks like two databases will be needed - one for configuration and the other for operations. But my problem is that I couldn't figure out what database names I should use. I created two databases using names I came up with and ran the scripts I downloaded from the quickstart to create all the tables. Now, when I try to make a connection, I think I will need to specify the database names in my connection string, don't I? What should I use to replace the original connection string provided by the quickstart - "Data Source=(LocalDb)\MSSQLLocalDB;database=IdentityServer4.Quickstart.EntityFramework-4.0.0;trusted_connection=yes;"?
You can have one database for both. But in general I would keep the configuration part in memory if the number of clients is small. Why spend hours keeping the config in a database for just a few clients and resources?
Better to just keep the users and persisted grants in a database.
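
On the connection-string question itself: the database names are simply whatever you created before running the table scripts; each store's DbContext just needs to be pointed at the right one. A sketch following the quickstart's registration pattern, with server and database names as placeholders (a single shared database here; two separate connection strings work the same way):

// In ConfigureServices:
const string connString =
    "Data Source=myserver;Database=MyIdentityServerDb;Integrated Security=true;";

services.AddIdentityServer()
    .AddConfigurationStore(options =>
        options.ConfigureDbContext = b => b.UseSqlServer(connString))
    .AddOperationalStore(options =>
        options.ConfigureDbContext = b => b.UseSqlServer(connString));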

Connection error when trying to create entity model from SAP HANA DB

I am trying to build an ADO.NET entity model from a SAP HANA database. This is for SAP B1. This process is pretty straightforward using MS SQL Server, MySQL, etc.
However, when I follow the steps to create the HANA model, I get the following error on clicking "Test Connection":
general error: database 'EOH_CCL_TEST' does not exist
I have added a reference for Sap.Data.Hana.v4.5.dll.
Version is 1.0.120.0.
The database exists and I am able to perform queries on it.
Note: I am using the same credentials as I used to log into SAP HANA Studio.
What am I missing here?
There is a previous post: ADO.NET Provider for SAP HANA - Version mismatch issue
But in that issue, the user was able to make the connection.
You are using the schema name EOH_CCL_TEST as the database name. The database name is different from the schema name. Did you log on to the SYSTEMDB database or to a tenant database in HANA Studio? Using that database name should solve the issue for you. PS: I also do not think that you need to add a port in the hostname property field.
Going by the screenshot, you are not using a HANA system with multiple database containers. In this "classic" setup there is no separate admin object "database", and connections don't take a database name.
Just put in the hostname and port and leave the database name empty. EOH_CCL_TEST is indeed just the schema name.
Beyond that, it's really not a good idea to use the SYSTEM user for working with data, or really for anything beyond bootstrapping the system.
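
Putting that together, a connection attempt would look roughly like this. An untested sketch: HanaConnection is the provider's ADO.NET connection class, but the exact connection-string keywords should be verified against your Sap.Data.Hana version:

using Sap.Data.Hana;

class HanaConnect
{
    static void Main()
    {
        // Host and port only - no database name in a single-container system.
        // Prefer a dedicated application user over SYSTEM.
        using (var conn = new HanaConnection(
            "Server=hanahost:30015;UserID=MY_APP_USER;Password=secret"))
        {
            conn.Open();
        }
    }
}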

Database and user creation in EF Core when deploying to Kubernetes

We have a .NET Core app being deployed to a Kubernetes cluster which accesses an AWS RDS MS SQL database.
In this environment we'd like to use EF Code First to handle our model (maybe with migrations later, but initially dropping and creating is fine).
How in this environment do we create a SQL user with appropriate permissions on the RDS instance so that the web application can login as this user and create the code first model?
Our initial approach involved creating a user as part of a .sh script, creating a database, and assigning permissions. This fell down because when the C# code ran Database.EnsureCreated(), it saw there was already a database and didn't build the model.
I thought perhaps not creating the database and assigning higher permissions to the user might work, but this feels like a bad approach unless we run some kind of post-deploy step to remove the dbo-esque permissions afterwards.
What is the recommended approach for an ephemeral deployment where we intend to drop/create/seed on each run?
I've been tussling with this question as well. We're doing .NET Core EF code first on Kubernetes with a Microsoft SQL database.
I've been messing around with context.Database.Migrate(). This will create the DB, then create the tables and run the migrations (case 1); or, if the DB already exists, it will just create the tables and run the migrations (case 2).
For case 1, the account needs the dbcreator server role. Once it creates the DB, it will assign itself the dbo database role.
For case 2, you could potentially just grant db_ddladmin, db_datareader, and db_datawriter. I've tested this and it seems to work fine, but I'm unsure of the side effects of not having dbo access. Julie?
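
For the ephemeral drop/create/seed flow in the question, the startup hook might look like this. A sketch only: AppDbContext and SeedData are hypothetical names, and host is assumed to be the built .NET Core host:

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

// Inside Program.Main, after building the host and before running it:
using (var scope = host.Services.CreateScope())
{
    var db = scope.ServiceProvider.GetRequiredService<AppDbContext>();

    db.Database.EnsureDeleted();   // drop whatever the previous run left behind
    db.Database.EnsureCreated();   // case 1: needs the dbcreator server role
    // Or, once migrations are introduced, replace both calls with:
    // db.Database.Migrate();      // case 2: db_ddladmin + db_datareader +
                                   // db_datawriter suffice for an existing DB
    SeedData.Run(db);              // hypothetical seeding helper
}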

T-SQL: How do I move data between SQL Server instances?

I have two SQL Server instances, each with an identical schema. One is running in SQL Azure; the other is a standard SQL Server 2008 instance. I need to copy the data from the Azure database to my local instance.
Essentially I want to do this:
INSERT INTO LOCAL_TABLE (col1, col2)
SELECT col1, col2
FROM AZURE_TABLE
How would I do that?
To move data between SQL Servers when one of them is SQL Azure, you have a couple of options:
SQL Azure Data Sync
Using SSIS
Write your own code that moves the data, most probably using the SqlBulkCopy class (sketched below).
If you would like to just copy all the data, you could also use the SQL Azure Migration Wizard - you can omit the option for copying the schema and let it copy just the data.
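
For the third option, a minimal SqlBulkCopy sketch (connection strings and table names are placeholders). Note SqlBulkCopyOptions.KeepIdentity, which preserves the source identity values during the copy:

using System.Data.SqlClient;

class CopyAzureToLocal
{
    static void Main()
    {
        const string azure = "Server=tcp:myserver.database.windows.net;Database=mydb;User ID=copyuser;Password=secret;Encrypt=True;";
        const string local = "Server=.;Database=mydb;Integrated Security=True;";

        using (var source = new SqlConnection(azure))
        using (var dest = new SqlConnection(local))
        {
            source.Open();
            dest.Open();

            using (var cmd = new SqlCommand("SELECT col1, col2 FROM AZURE_TABLE", source))
            using (var reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dest, SqlBulkCopyOptions.KeepIdentity, null))
            {
                bulk.DestinationTableName = "LOCAL_TABLE";
                // Streams rows from the Azure reader straight into the local
                // table without materializing them all in memory.
                bulk.WriteToServer(reader);
            }
        }
    }
}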
EDIT
And, as in the original answer from Matthew PK, you could link to your SQL Azure server from your on-prem server, but this is only an option when you just want to do some ad-hoc testing. I would not use this approach in production for constantly syncing data.
You could accomplish that in a single statement using linked servers.
http://msdn.microsoft.com/en-us/library/ms188279(v=sql.100).aspx
EDIT: Here is a link which appears to explain how to link to SQL Azure:
http://blogs.msdn.com/b/sqlcat/archive/2011/03/08/linked-servers-to-sql-azure.aspx
EDIT: Here is a write-up on connecting to Azure with SSMS
http://www.silverlighthack.com/post/2009/11/11/Connecting-to-SQL-Azure-with-SQL-Server-Management-Studio-2008-R2.aspx
Otherwise I believe you need to do it in two statements.
Linked servers are not officially supported. However, here are a couple of resources that are supported and would help you do what you are looking for:
1) Check out the SQL Azure DAC examples: http://sqldacexamples.codeplex.com/
2) The other option is SQL Azure Data Sync.
Use a product like SQL Data Compare from Redgate. I am not an Azure user, but I am guessing it would work; I have used it for a few years and it's pretty solid.
Is this a one-time copy or ongoing?
If one-time, then use the SQL Azure Migration Wizard (from CodePlex).
If ongoing, then use SQL Azure Data Sync.
You can also verify that the schema is compliant using SQL Server Data Tools in VS: set the target platform to SQL Azure (or SQL Server 2012 or 2008), then build, and any schema errors will show up.