Is it possible to use Entity Framework 4.3 without linking the model to an actual DB in the back-end?
I need to build a conceptual model of a database in the VS designer and then I'd like to manually handle fetches, inserts and updates to various back-end databases (horrible legacy systems). I need to be able to do this without EF moaning about not having tables mapped, etc. I realise that this is a very odd thing to want to do...
The reason for this is that we would like to move from these legacy systems into a well designed data model and .NET environment, but we need to still maintain functionality and backward compatibility with the old systems during development. We will then reach a stage where we can import the old data (coming from about 6 different databases) into a single DB that matches the EF model I'm building. In theory, we should then be able to switch over from the hacked up EF model to a proper EF model matching the new data structure.
Is this viable? Is it possible to use the EF interface, with LINQ without actually pointing it to a database?
I have managed to query the legacy systems by overriding the generated DbContext and exposing IQueryable properties which query the old systems. My big fight now is with actually updating the data.
If I am able to have EF track changes to entities without actually saving those changes, I should be able to override the SaveChanges() method on the context and manually insert into the various legacy tables.
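Something along these lines is what I have in mind; a minimal sketch, assuming a hand-written LegacyWriter class that knows how to issue the legacy INSERT/UPDATE/DELETE statements (all names here are illustrative):

using System.Data;          // EntityState lives here in EF 4.x/5
using System.Data.Entity;
using System.Linq;

// Hypothetical hand-written mapper; in practice it would run ADO.NET commands
// against the various legacy databases.
public static class LegacyWriter
{
    public static void Write(object entity, EntityState state)
    {
        // switch on entity type / state and run the appropriate legacy SQL here
    }
}

public class LegacyBridgeContext : DbContext
{
    public override int SaveChanges()
    {
        var pending = ChangeTracker.Entries()
            .Where(e => e.State == EntityState.Added ||
                        e.State == EntityState.Modified ||
                        e.State == EntityState.Deleted)
            .ToList();

        foreach (var entry in pending)
        {
            LegacyWriter.Write(entry.Entity, entry.State);
            entry.State = EntityState.Unchanged; // stop EF from trying to persist it
        }

        // Skip base.SaveChanges() entirely so EF never hits a real database.
        return pending.Count;
    }
}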
I'm sort of at my wits' end with this issue at the moment.
UPDATE 4 Sept 2012: I've opted to use the EDMX file designer to build the data model, and I generate the code using T4. This enables me to manually write mapping code to suit my needs. It also allows me to perform a legacy data migration later with relative ease.
If I were in your situation I'd set up the new DB server and link the legacy servers to it. Then create stored procedures to interface with EF for the INSERT/UPDATE/DELETE operations. This way your EF code remains separate from the legacy-support messiness. As you decommission the legacy DB servers you can update your stored procedures accordingly. Once you have no more legacy DB servers you can either continue using your sprocs or do a refresh of your EF data connection to use the table schema directly.
Entity Framework exists to link entities to a data store without you manually populating them.
Otherwise you're just using classes with LINQ.
If you mean you don't want a separate data store like SQL Server, Mongo, etc., then just let your application create the database as an .mdf file that gets bundled in your App_Data folder. That means you don't need a database server as such, and the database is part of your app.
If, on the other hand, you want a different way to save to the database, you can create your own data adapters to behave however you like. The MongoDB .NET Entity Framework component is an example of this.
Alternatively, using the code-only approach you can just use stored procedures to persist to the database, which can be a bit verbose and annoying with EF, but it could bridge the gap for you and allow you to build a good architecture with the model you want, translated into the crappy one inside your repositories.
Then, when the new database is ready, you can just rework your repos to use SaveChanges and you're done.
This will of course only work with the code-only approach.
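Roughly what that bridge repository could look like; a sketch only, with every name made up, using ExecuteSqlCommand to call a legacy stored procedure for now and plain SaveChanges later:

using System.Data.Entity;
using System.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

public class CustomerRepository
{
    private readonly DbContext _context;

    public CustomerRepository(DbContext context)
    {
        _context = context;
    }

    public void Add(Customer customer)
    {
        // Today: push the clean model into the crappy legacy schema via a sproc.
        _context.Database.ExecuteSqlCommand(
            "EXEC legacy.usp_InsertCustomer @Name, @Email",
            new SqlParameter("@Name", customer.Name),
            new SqlParameter("@Email", customer.Email));

        // Later, once the new database exists:
        // _context.Set<Customer>().Add(customer);
        // _context.SaveChanges();
    }
}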
In one of my projects, I am using an existing SQL Server database. All the database scripts are managed using DBUp and SQL script migrations.
In my application, I am using Entity Framework Core to communicate with this database. When I configure my entities in EF configurations, should I still define functions like IsRequired(), HasMaxLength(), etc.?
I am not using these EF configurations to generate migration scripts; all the migration is outside of EF. I am just using these configurations to communicate with the database.
When I configure my Entities in EF configurations, should I still define functions like IsRequired(), HasMaxLength() etc.?
Other than table and column name mapping and data type mapping, it's not required, but additional model metadata might be used by front-end components for validation.
In general, yes, you should keep them. Many of these configurations are used throughout EF to make decisions at runtime. For example, some queries can be further optimized if EF knows that a column is never NULL, the max length is used to configure the SQL parameters it sends to the database, and unique constraints are used to sort SQL statements during SaveChanges.
While a few things like constraint names, non-unique indexes, index filters, and sequences aren't currently used at runtime, it's hard to know which ones EF will and won't use, so it's best just to keep them all.
And sometimes database features, like Always Encrypted on SQL Server, will fail entirely if the mappings aren't precise.
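For illustration, a configuration for a table whose schema is owned by the DbUp scripts might look like this; the entity, table and column names are made up, and the calls only describe what already exists in the database:

using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

public class CustomerConfiguration : IEntityTypeConfiguration<Customer>
{
    public void Configure(EntityTypeBuilder<Customer> builder)
    {
        builder.ToTable("Customer", "dbo");
        builder.HasKey(c => c.Id);

        builder.Property(c => c.Name)
               .IsRequired()        // lets EF treat the column as non-nullable
               .HasMaxLength(200);  // sizes the SqlParameter EF sends to the database

        builder.HasIndex(c => c.Email)
               .IsUnique();         // helps EF order statements during SaveChanges
    }
}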
Our team is thinking of using Entity Framework Core code-first to help model the database. We can have both DB projects and EF models kept in sync via schema compares, as per the article Database Projects vs. Entity Framework Database Migrations; we're just trying to figure out what will be the source of truth.
Does Entity Framework support all features in SQL Server SSDT Database Projects?
What features does EF Core 2 not support? (For example, does it support any of the following: triggers, views, functions, stored procedures, encryption keys, certificates, database properties (ANSI NULLS, QUOTED IDENTIFIER), partitions?)
I am trying to locate the official Microsoft resource.
tl;dr Database Projects are feature-rich, but database-first. Migrations is code-first, but has a very limited built-in set of database features.
For many people it won't be relevant to compare Database Projects and Migrations. They represent two different modes of working with Entity Framework. Migrations is code-first, DP is database-first. Sure, you can use migrations to control the database schema and besides that keep a DP in sync with the generated database to satisfy DBAs (as the link suggests). But both lead their own separate lives and there's no Single Source Of Truth.
So comparing them is useful if you're not sure yet which working mode you're going to choose.
For me the most important difference is that DP covers all database objects and detects all changes between them when comparing databases. Migrations only detect changes between a database and the mapped model, and the set of options for generating database objects is very limited. For everything you need beyond that, you have to inject SQL statements into the migration code. These statements are your own responsibility. You have to figure out yourself whether a migration needs an ALTER PROCEDURE statement or not (for example). EF won't complain if the script and the database differ in this respect.
This is the main reason why I've never been a great fan of migrations. It's virtually impossible to maintain a mature database schema including storage, file groups, privileges, collations, and what have you.
Another advantage of DP is that they're great in combination with source control. Each database object has its own file and it's very easy to check the change history of each individual object. That's not possible with generated migrations. Indeed, many intermediate changes may never make it to a generated migration.
Of course the obvious advantage of migrations is the possibility to do a runtime check (albeit incomplete) whether the code and the database match. In database-first projects you need to create your own mechanism for that.
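For example, one way to express that runtime check with EF Core migrations; a minimal sketch that fails fast at startup if the database is missing migrations the compiled model expects:

using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public static class StartupChecks
{
    public static void EnsureDatabaseMatchesModel(DbContext context)
    {
        var pending = context.Database.GetPendingMigrations().ToList();
        if (pending.Any())
        {
            throw new InvalidOperationException(
                "Database is behind the model. Pending migrations: " +
                string.Join(", ", pending));
        }
    }
}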
EF Core is only an ORM.
1) You should be ready to create all DB objects except tables manually. What I create manually: constraints (defaults as well as check conditions). Since this is code-first, there is no need for stored procedures, functions and so on. If you use an ORM, the DB is only storage. Of course, practice matters: for me, default constraints add comfort on tables where I create test data manually, and check conditions are also useful in situations where you do not trust your (team's) code.
2) You will add the creation (and dropping) of views, triggers, stored procedures and so on to the "migration" code (there is such a concept in EF) as plain SQL:
migrationBuilder.Sql("CREATE VIEW ...");
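Fleshed out a little, such a hand-edited migration might look like this (the view and its columns are just examples):

using Microsoft.EntityFrameworkCore.Migrations;

public partial class AddCustomerSummaryView : Migration
{
    protected override void Up(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.Sql(
            "CREATE VIEW dbo.CustomerSummary AS SELECT Id, Name FROM dbo.Customers;");
    }

    protected override void Down(MigrationBuilder migrationBuilder)
    {
        migrationBuilder.Sql("DROP VIEW dbo.CustomerSummary;");
    }
}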
As a result you could have a separate "migration" program (e.g. a command-line tool) that installs or removes both the EF Core tables and your manually created objects, and applies or reverts the data migrations.
"EF Core migrations" is quite a complex API (reserve a week for learning it). Interesting topics: managing several DbContexts in one database, creating DB objects during migration from model annotations, and uninstalling. Or find a freelancer for it (this part of the project is good for outsourcing).
I'm working on a green-field application that has a corporate mandate that Stored Procedures are used for all database interaction.
I'd like to use Entity Framework and leverage Stored Procedure Mapping to gain the benefits of the ORM.
Since we will be developing the database and .NET application in parallel, I'm looking for information to help the database developer/administrator. Does anyone know of a consolidated guide on how to design tables and stored procedures so they can be best integrated with the Entity Framework?
A couple of tips I've collected are below (a rough sketch of the resulting mapping follows the list):
Update Stored Procedures require exactly 1 parameter per table column
There must be an insert, update, and delete Stored Procedure for every table
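For reference, this is roughly where those tips end up when the mapping is declared in code; the sketch below uses the EF6 Code First syntax (in the designer the equivalent mapping is configured in the EDMX instead), and every entity, table and procedure name is illustrative:

using System.Data.Entity;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Customer>()
            .MapToStoredProcedures(s =>
                s.Insert(i => i.HasName("Customer_Insert"))   // one sproc per operation...
                 .Update(u => u.HasName("Customer_Update"))   // ...with one parameter per column
                 .Delete(d => d.HasName("Customer_Delete"))); // typically key parameter(s) only
    }
}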
I want to know as much about how the database should be designed for easy use with Entity Framework because the database is very difficult to change later in our environment.
I wrote a blog post describing the limitations of using mapping in this way after working on this for several months:
The Pitfalls of Mapping the Entity Framework to Stored Procedures
If stored procedures are used for all database interaction, I just don't see the need to use Entity Framework. One good reason for EF is to save you from writing T-SQL, and if you don't take advantage of this, why even use EF?
I am using Self Tracking Entities with the Entity Framework 4. I have 2 databases, with the exact same schema. However, tables in one database will be added to/edited etc (and I mean data will be added/edited, not the actual table definitions) and at certain points of the day I will need to synchronize all the changes between this database and the other database.
I can create a separate context for both of them. But if I read a large graph from one database, how can I update the other database with the graph? Is there an easy way?
My database model is large and complex and fully relational. So it would be a big job to go through every single entity and do a read from the other database to see if it exists or not, update/insert it if need be, and then carry this on through the full object graph!
Any ideas?
This is not a use case for EF. In EF you will have to do exactly what you've described. Self-tracking entities are able to track changes to those object instances, but they know nothing about changes made to their own database over time, and they will not know anything about the state of your second database either.
Try to look at SQL Server native features (including mirroring, transaction log shipping or SSIS) and the MS Sync Framework. Depending on your detailed requirements, these tools may suit you better.
I'm trying to display the results of a sproc in my MVC 3 web app.
However, the sproc calls into 4 tables on one database and joins them with 5 views (single table views only, thank goodness) on another database. Each (SQL Server) db is on a separate server but that shouldn't matter.
I've read this: http://blogs.msdn.com/b/swiss_dpe_team/archive/2008/02/04/linq-to-sql-returning-multiple-result-sets.aspx
and this:
http://www.codeproject.com/KB/dotnet/linqToSql5.aspx
and still cannot determine whether I should use the dataContext classes or just embed the straight SQL.
Perhaps there is a better way to return my results than LINQ to SQL (15 columns, 3 different data types)? I need to update the tables as well. The user will have the ability to update each value if they choose. Is this a task best suited for the entity framework classes?
I plan on using the repository pattern so I can change data access technology if I must but would rather make the correct decision the 1st go 'round.
I was hoping for a resource that was more up-to-date than say, NerdDinner and more robust than the movie apps for MVC3 that abound, particularly implementing the sproc results inside a view. Any suggestions would surely be appreciated. Thanks.
Once you plan to "update" data, you are going to have to handle it all through stored procedures. Neither LINQ to SQL nor Entity Framework will help you with this, because they are not able to persist changes to something created from an arbitrary query. You should very carefully check whether you are even able to track the data back to the correct record in the correct table. Generally, the result of a stored procedure is mostly for viewing data; once you want to modify the data, you must work with each table directly or again use some stored procedure which will do the task. Working with tables from multiple databases can be pretty complex in Entity Framework (EF doesn't support objects from multiple databases in one entity model).
Also, what do you mean by 15 columns, 3 different data types? Stored procedure support in both LINQ to SQL and Entity Framework will return an enumeration of one flattened data type containing 15 properties.
I'm not really aware of anything that LINQ to SQL can do that Entity Framework can't, so EF seems to be the better solution in this case. You can add a stored procedure to your Entity Framework model as well, so you can just have it call the procedure and deal with whatever comes back.
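Whether you add it as a function import in the model or call it through the DbContext API, the shape is similar; a rough sketch using SqlQuery (the sproc name, parameter and result columns are made up):

using System.Collections.Generic;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Linq;

// One flattened row type for the 15 columns the sproc returns.
public class ReportRow
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Amount { get; set; }
    // ... remaining columns
}

public class ReportService
{
    private readonly DbContext _context;
    public ReportService(DbContext context) { _context = context; }

    public List<ReportRow> GetReport(int customerId)
    {
        return _context.Database
            .SqlQuery<ReportRow>(
                "EXEC dbo.usp_GetCustomerReport @CustomerId",
                new SqlParameter("@CustomerId", customerId))
            .ToList();
    }
}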
Since the end goal will involve accessing the same databases with either technology, and they will both be using SQL to retrieve the data, it's really a subjective answer.
I would use whichever technology you are most comfortable with and focus more on the implementation. Both data access platforms are built on ADO.NET and are for the most part equally powerful.
Regardless of the technology I would evaluate how the data is accessed and make implementation decisions based on that.