I'm using Visual Studio 2008, SQL Server 2008, and ASP.NET MVC 2.
Now I have to change my database to PostgreSQL. I tried many approaches, but ended up doing something different.
I created the data model based on SQL Server 2008 database.
I have the same structured database in postgresql.
So I just changed the connection string of the entity data model, and I got the following error:
The 'System.Data.SqlClient' provider from the specified SSDL
artifact(s) does not match the expected 'Npgsql' provider from the
connection string.
How do I solve this error?
Please also let me know whether this is the correct way to implement the switch.
Well, the physical layer of your model (the "SSDL" file in your metadata) will of course contain the physical aspects of your database model - including the database provider that the model is based on. You cannot simply change the database connection string and be done with it.
I see two options:
my preferred solution: just re-create the EF model based on your Postgres database - that's the cleanest way to go. If you're doing it right, this is contained in a single assembly in your project anyway, so you could more or less swap in a new assembly to go against Postgres instead of SQL Server
a hack in my opinion: you could have the metadata files (the *.ssdl, *.msl, *.csdl) for your model written out to disk, and then manually edit the SSDL file to switch to the Postgres provider. I have no idea whether that will even work or what side-effects it might have! Do this at your own risk, and do it on a backup copy of your project first!
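For reference, the provider is pinned in two places that must agree: the Provider attribute in the SSDL and the provider named in the entity connection string. A minimal sketch of the connection-string side using EntityConnectionStringBuilder (the model name, database details, and context class are hypothetical):

using System.Data.EntityClient;

// The Provider set here must match the Provider="..." attribute in the
// model's SSDL, otherwise EF throws exactly the mismatch error quoted above.
var builder = new EntityConnectionStringBuilder
{
    Provider = "Npgsql",  // was "System.Data.SqlClient"
    ProviderConnectionString = "Server=localhost;Database=mydb;User Id=me;Password=secret",  // hypothetical
    Metadata = "res://*/MyModel.csdl|res://*/MyModel.ssdl|res://*/MyModel.msl"  // hypothetical model name
};

using (var context = new MyModelEntities(builder.ToString()))  // hypothetical generated context
{
    // query as usual
}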
Our team is thinking of utilizing Entity Framework Core code-first to help model the database. We could keep both DB projects and EF models in sync using schema compares, as per the article Database Projects vs. Entity Framework Database Migrations; we're just trying to figure out what will be the source of truth.
Does Entity Framework support all features in SQL Server SSDT Database Projects?
What features does EF Core 2 not support? (e.g., does it lack support for any of the following: triggers, views, functions, stored procedures, encryption keys, certificates, DB properties (ANSI NULLS, QUOTED IDENTIFIER), partitions?)
I am trying to locate the Microsoft Resource.
tl;dr Database Projects are feature-rich, but database-first. Migrations is code-first, but has a very limited built-in set of database features.
For many people it won't be relevant to compare Database Projects and Migrations. They represent two different modes of working with Entity Framework. Migrations is code-first, DP is database-first. Sure, you can use migrations to control the database schema and besides that keep a DP in sync with the generated database to satisfy DBAs (as the link suggests). But both lead their own separate lives and there's no Single Source Of Truth.
So comparing them is useful if you're not sure yet which working mode you're going to choose.
For me the most important difference is that DP will cover all database objects and detect all changes between them when comparing databases. Migrations only detect changes between a database and the mapped model. And the set of options for generating database objects is very limited. For everything beyond that, you have to inject SQL statements into the migration code. These statements are your own responsibility. You have to figure out for yourself whether a migration needs an ALTER PROCEDURE statement or not (for example). EF won't complain if the script and the database differ in this respect.
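For example, such an injected statement sits in the migration as an opaque string (a sketch; the procedure is hypothetical), which is exactly why EF can't validate it:

using Microsoft.EntityFrameworkCore.Migrations;

// Inside a generated EF Core migration class; EF never parses this SQL.
protected override void Up(MigrationBuilder migrationBuilder)
{
    migrationBuilder.Sql(@"ALTER PROCEDURE dbo.GetActiveOrders  -- hypothetical procedure
AS
SELECT Id, CustomerId FROM dbo.Orders WHERE IsActive = 1;");
}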
This is the main reason why I've never been a great fan of migrations. It's virtually impossible to maintain a mature database schema including storage, file groups, privileges, collations, and what have you.
Another advantage of DP is that they're great in combination with source control. Each database object has its own file and it's very easy to check the change history of each individual object. That's not possible with generated migrations. Indeed, many intermediate changes may never make it to a generated migration.
Of course the obvious advantage of migrations is the possibility to do a runtime check (albeit incomplete) whether the code and the database match. In database-first projects you need to create your own mechanism for that.
EF Core is only an ORM.
1) You should be ready to create all DB objects except tables manually. What I create manually: constraints (defaults as well as check conditions). Since this is code-first, there is no need for stored procedures, functions and so on. If you use an ORM, the DB is only storage. Of course, practice matters: for me, default constraints add comfort on tables where I create test data manually, and check conditions are also useful in situations where you do not trust your (team's) code.
2) You will move the creation (and dropping) of views, triggers, stored procedures and so on into the "migration" code (there is such a concept in EF) as plain SQL:
migrationBuilder.Sql("CREATE VIEW ...");
As a result you could have a separate "migration" program (e.g. a command-line tool) that installs or removes both the EF Core tables and your manually created objects, and applies or reverts the data migrations.
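A minimal sketch of such a tool, assuming an EF Core DbContext called AppDbContext (the name and connection handling are hypothetical):

using Microsoft.EntityFrameworkCore;

public static class MigratorProgram
{
    public static void Main(string[] args)
    {
        // Assumes AppDbContext has a constructor taking DbContextOptions.
        var options = new DbContextOptionsBuilder<AppDbContext>()
            .UseSqlServer(args[0])  // connection string passed on the command line
            .Options;

        using (var context = new AppDbContext(options))
        {
            // Applies all pending migrations, including the raw-SQL steps
            // that create views, triggers and stored procedures.
            context.Database.Migrate();
        }
    }
}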
"EF Core migrations" is quite complex api (reserve a week for learning). Interesting topics: managing several dbcontexts in one db, createing db object during migration from model annotations, unistall. Or find a freelancer for it (this part of project is good for outsourcing).
I need to migrate data from many XML files to a SQL Server DB, and it has to be done in a transaction.
I thought about EF and DbContext, as it's a unit of work in its own right.
My question is:
Can you do Database First at run time?
What I want to do is read all the tables from the DB, store them in a class/dataset, map each DB column to its equivalent in the XML file, and commit.
This has to work in such a way that if a table or column is added, it still works without any code changes, as it is driven by the DB.
The problem I face is that, with a DB generated from a model, if a new column is required then somebody later on who is "unfamiliar with EF" has to add the column - a "manual job".
I can do what I want with raw ADO.NET by reading the DB schema and mapping to a dataset, but I'm wondering if I could do it using EF.
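For context, this is roughly what the raw ADO.NET version looks like (just a sketch; the connection string and the XML mapping are placeholders):

using System.Data;
using System.Data.SqlClient;

using (var connection = new SqlConnection("Server=.;Database=Target;Integrated Security=true"))  // placeholder
{
    connection.Open();
    // Discover every column at run time, so a new table or column
    // needs no code changes.
    DataTable columns = connection.GetSchema("Columns");
    foreach (DataRow row in columns.Rows)
    {
        string table = (string)row["TABLE_NAME"];
        string column = (string)row["COLUMN_NAME"];
        // ...look up the matching element in the XML file and stage the value...
    }
}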
Hope that's all clear. Any suggestions?
Yes, you can.
You can use the Entity Framework Power Tools, which let you reverse-engineer Code-First classes from your existing database, so you can carry on as if you had started with Code-First.
Or you can use the Database-First approach instead. It's not that hard.
If you try to use database-first approach, please read my trial-and-error experience: post1, post2
Is it possible to use Entity Framework 4.3 without linking the model to an actual DB in the back-end?
I need to build a conceptual model of a database in the VS designer and then I'd like to manually handle fetches, inserts and updates to various back-end databases (horrible legacy systems). I need to be able to do this without EF moaning about not having tables mapped, etc. I realise that this is a very odd thing to want to do...
The reason for this is that we would like to move from these legacy systems into a well designed data model and .NET environment, but we need to still maintain functionality and backward compatibility with the old systems during development. We will then reach a stage where we can import the old data (coming from about 6 different databases) into a single DB that matches the EF model I'm building. In theory, we should then be able to switch over from the hacked up EF model to a proper EF model matching the new data structure.
Is this viable? Is it possible to use the EF interface, with LINQ without actually pointing it to a database?
I have managed to query the legacy systems by overriding the generated DbContext and exposing IQueryable properties which query the old systems. My big fight now is with actually updating the data.
If I can have EF track changes to entities but not actually save those changes, I should be able to override the SaveChanges() method on the context to manually insert into the various legacy tables.
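Roughly what I have in mind (a sketch, assuming EF 4.3's DbContext change tracker; the actual legacy SQL is omitted):

using System.Data;
using System.Data.Entity;
using System.Linq;

public class LegacyContext : DbContext
{
    public override int SaveChanges()
    {
        // Snapshot the tracked changes, then write them to the legacy
        // tables by hand instead of letting EF generate SQL.
        foreach (var entry in ChangeTracker.Entries()
            .Where(e => e.State != EntityState.Unchanged).ToList())
        {
            switch (entry.State)
            {
                case EntityState.Added:
                    // hand-written INSERT against the legacy tables here
                    entry.State = EntityState.Unchanged;  // keep tracking, nothing left to save
                    break;
                case EntityState.Modified:
                    // hand-written UPDATE against the legacy tables here
                    entry.State = EntityState.Unchanged;
                    break;
                case EntityState.Deleted:
                    // hand-written DELETE against the legacy tables here
                    entry.State = EntityState.Detached;  // stop tracking the removed entity
                    break;
            }
        }
        return 0;  // nothing left for EF itself to persist
    }
}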
I'm sort of at wits end with this issue at the moment.
UPDATE 4 Sept 2012: I've opted to use the EDMX file designer to build the data model, and I generate the code using T4. This enables me to manually write mapping code to suit my needs. It also allows me to perform the legacy data migration later with relative ease.
If I were in your situation I'd set up the new DB server and link the legacy servers to it. Then create stored procedures to interface with EF for the INSERTs/UPDATEs/DELETEs. This way your EF code remains separate from the legacy-support messiness. As you decommission the legacy DB servers, you can update your stored procedures accordingly. Once you have no more legacy DB servers, you can either continue using your sprocs or refresh your EF data connection to use the table schema directly.
Entity Framework is there to link entities to a data store without manual population.
Otherwise you're just using classes with LINQ.
If you mean you don't want a separate data store like SQL Server, Mongo, etc., then just let your application create the database as an MDF file that gets bundled in your App_Data folder. That means you don't need a database server as such, and the database is part of your app.
If on the other hand you want a different way to save to the database, you can create your own data adapters that behave however you like. The Mongo .NET Entity Framework provider is an example of this.
Alternatively, using code only, you can just use stored procedures to persist to the database, which can be a bit verbose and annoying with EF, but it could bridge the gap for you and allow you to build a good architecture with the model you want, which then gets translated into the crappy one in your repositories.
Then when the new database is ready, you can just rework your repos to use SaveChanges and you're done.
This will of course only work with the code-only approach.
I have two XSD files with the exact same schema. The first was generated from a SQL connection and the other from a FoxPro OleDb connection by the DataSet Designer. I can't use the same XSD for generating my table adapters and tables because the SQL and OleDb providers generate different types. I can't use EF because it doesn't support FoxPro OleDb, or at least not officially.
I also have the problem that the SQL XSD file has table, field, and table adapter names in UPPERCASE while the FoxPro OleDb one has them in lowercase. This creates a situation where I can't trick the code into casting them from a common base type.
Is there a suitable workaround for this problem, or do I have to have two sets of code in my DAL? I would hate to have to rewrite the CRUD operations for all these tables.
For clarification: are you connecting to BOTH "servers" for data? Are you trying to copy data from one server to another? You could always have your queries from each return the same column names and comparable data types (text vs char, integer works in both, memo vs blob, etc.).
I've actually written a data-querying wrapper object based on the INTERFACE types that performs the querying and updating once per database connection, then subclasses from that for each individual table, so most of the grunt work is done once.
As for the interfaces, I'm referring to those like IDbConnection (SqlConnection vs OleDbConnection), IDbCommand (SqlCommand vs OleDbCommand), etc. This has worked well for me connecting to SQL Server, MySQL, Access, SQLite and Visual FoxPro databases.
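For illustration, the core of that wrapper looks roughly like this (a sketch; the real class adds parameters, transactions, and per-table subclasses):

using System.Data;

// The method only sees IDbConnection/IDbCommand, so it works unchanged
// against SqlConnection, OleDbConnection, and any other ADO.NET provider.
public static DataTable RunQuery(IDbConnection connection, string sql)
{
    using (IDbCommand command = connection.CreateCommand())
    {
        command.CommandText = sql;
        if (connection.State != ConnectionState.Open)
            connection.Open();

        var table = new DataTable();
        using (IDataReader reader = command.ExecuteReader())
        {
            table.Load(reader);
        }
        return table;
    }
}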
All that being said, what and how are you using your XSD files for? With a little more clarification, I might be able to offer a little more help.
I'm trying to store a document in SQL Server 2008 using the Entity Framework.
I believe I have the code for doing this completed. The problem I'm now facing is which Data Type to use in SQL Server and in my entity model.
'Image' was my first choice, but this causes an "Invalid mapping" error when I update the model. I see that there's no equivalent of 'Image' (going by the Type drop-down in the entity's properties).
So then I tried 'varbinary(MAX)' and I see that this maps to 'binary' in the entity model. However, when I run the code it tells me that the data would be truncated so it stopped. Upon investigation I see that the SQL Server Data Type 'binary' is 8000 bytes long - which is why I chose 'varbinary(MAX)' - so the entity model seems to be reducing/mapping 'varbinary(MAX)' to 'binary'.
Is this right?
If so, what should my Data Types be (in both SQL Server 2008 and in my entity model) please? Any suggestions?
Change the data type in your model to byte[] and it will be all right, dude; if you need more explanation, please leave a comment.
EDIT:
Dude, I had tried it before in LINQ to SQL, and this time I tried it in EF. In the conceptual model of your Foo.edmx file, the type is Binary (you can open it via the "Open With" context menu in Visual Studio and selecting XML Editor, or any other text editor like Notepad), but in the generated file named Foo.designer.cs your data type is Byte[].
And there is no limit like the one you mentioned above.
I tried it with 10,000 bytes, and it was inserted successfully without truncating my array. As for benchmarks on saving documents in the database versus the file system: I read an article which said that under SQL Server 7 the file system had better performance for retrieving stored data, but that later versions of SQL Server overtook the file system's speed, so it suggested saving documents in SQL Server.
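A minimal sketch of the pattern (the entity, context, and file path are hypothetical; Content is the Byte[] property that the varbinary(MAX) column maps to):

using System.IO;

byte[] bytes = File.ReadAllBytes(@"C:\temp\report.pdf");  // hypothetical file

using (var context = new FooEntities())  // hypothetical EDMX-generated context
{
    var document = new Document { Name = "report.pdf", Content = bytes };
    context.Documents.AddObject(document);  // EF4 ObjectContext style
    context.SaveChanges();
}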
IMHO, on saving documents: if they are not too large, I prefer to store them in the DB (NoSQL DBs have great performance here, as far as I know), because of:
First: the integrity of my data;
Second: better performance (if your folder holds a large number of files, reading and writing in that folder gradually slows down more and more, unless you organize them into more than one folder, preferably in a tree of folders);
Third: security policies that you can apply to them through your application more easily (although you can do this with the file-system approach too, I think it's easier here);
Fourth: you can benefit from the facilities provided by your DBMS for querying, manipulating and ... those files;
and much more ... :-)
Ideally you should not store documents in the database; instead, store the path to the document in the database, which then points to the physical document on the web server itself (or some other storage, a CDN, etc.).
However, if you must store it in SQL Server (and seeing as you're on SQL 2008), you should be using FILESTREAM.
And it is supported by EF4 (I believe). It maps to binary.
That said, I'm not sure how well this will perform, so run some benchmarks - and if it's not performing too well, try using the regular ADO.NET/FileStream API.
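If you do drop down to the raw API, the canonical shape is roughly this (a sketch; the table, column, file path, and connection string are hypothetical, and the column must be declared varbinary(MAX) FILESTREAM):

using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

// FILESTREAM data must be read/written inside a transaction.
using (var connection = new SqlConnection("Server=.;Database=Docs;Integrated Security=true"))  // hypothetical
{
    connection.Open();
    using (SqlTransaction transaction = connection.BeginTransaction())
    {
        var command = new SqlCommand(
            "SELECT Content.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
            "FROM dbo.Documents WHERE Id = @id",  // hypothetical table/column
            connection, transaction);
        command.Parameters.AddWithValue("@id", 1);

        string path;
        byte[] txContext;
        using (SqlDataReader reader = command.ExecuteReader())
        {
            reader.Read();
            path = reader.GetString(0);
            txContext = (byte[])reader[1];
        }

        // Stream the document straight into the NTFS store behind the column.
        using (var stream = new SqlFileStream(path, txContext, FileAccess.Write))
        {
            byte[] data = File.ReadAllBytes(@"C:\temp\report.pdf");  // hypothetical file
            stream.Write(data, 0, data.Length);
        }

        transaction.Commit();
    }
}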
I still think you should put it on the file system, not in the database (my opinion of course)