Is there Entity Framework support for saving to multiple databases on the same SQL Server in a single transaction?

I have two databases on the same SQL Server instance. I would like to write a record to each database in a single transaction.
In LINQ to SQL, I would connect to one of the databases with a single DataContext and use three-part naming to identify tables in the other.
Is there a similar capability in Entity Framework?
I'm trying to avoid DTC (it has been forbidden), so the usual TransactionScope approach is not available to me.

There is no way that I know of. You could potentially use the Unit of Work pattern:
http://www.codeproject.com/Articles/581487/Unit-of-Work-Design-Pattern
That might at least allow you to go back to the other database and undo the change.
Personally, I think you're going to struggle.
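One workaround worth sketching here (my own suggestion, not something the original answer covers): since both databases live on the same SQL Server instance, you can create synonyms in the primary database that point at the tables in the second database, map EF entities to those synonyms, and then a single context with an ordinary local transaction covers both writes, so DTC never comes into play. A rough sketch, with hypothetical Order/AuditEntry entities and an OtherDb database name:

    using System.Data.Entity;

    public class Order      { public int Id { get; set; } public int CustomerId { get; set; } }
    public class AuditEntry { public int Id { get; set; } public string Message { get; set; } }

    // Assumes this synonym has been created once in the primary database:
    //   CREATE SYNONYM dbo.AuditEntries FOR OtherDb.dbo.AuditEntries;
    public class CombinedContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }             // table in the primary database
        public DbSet<AuditEntry> AuditEntries { get; set; }  // synonym resolving to the other database
    }

    public static class Example
    {
        public static void SaveToBothDatabases()
        {
            // Both databases already exist, so keep EF from trying to create anything.
            Database.SetInitializer<CombinedContext>(null);

            // One SaveChanges call = one implicit local transaction on one connection,
            // spanning both databases via the synonym, so DTC is never required.
            using (var db = new CombinedContext())
            {
                db.Orders.Add(new Order { CustomerId = 1 });
                db.AuditEntries.Add(new AuditEntry { Message = "Order created" });
                db.SaveChanges();
            }
        }
    }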

Related

Can Entity Framework create tables in other databases?

I am curious to know whether Entity Framework can create tables in databases other than MS SQL Server.
Moreover, is there any provision to create an XML schema through EF?
Under the hood, Entity Framework uses providers that are specific to different databases, so whether EF can create tables depends on the provider. However, I haven't heard of a provider that lacks this capability. The easiest way to be sure is to write a simple program with a few lines of code.
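That "simple program" might look something like this (a sketch only; the Blog entity and the "BlogDb" connection-string name are made up, and the connection string decides which provider gets exercised):

    using System.Data.Entity;

    public class Blog
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    public class BlogContext : DbContext
    {
        // "BlogDb" is a hypothetical connection-string name; point it at whichever
        // provider you want to test (SQL Server, SQL Server Compact, etc.).
        public BlogContext() : base("name=BlogDb") { }

        public DbSet<Blog> Blogs { get; set; }
    }

    public static class Program
    {
        public static void Main()
        {
            using (var db = new BlogContext())
            {
                db.Blogs.Add(new Blog { Title = "Created by EF" });
                db.SaveChanges(); // the default initializer creates the database and tables on first use
            }
        }
    }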
As for the XML schema: are you asking about using XML files instead of a database as the storage for your data? If so, again it depends on the provider. You could theoretically create one that uses XML files, but I haven't tried to do so and I don't think it is a good idea; there are technologies that fit better here (see this question).

How to get a connection and hold it using DAAB?

I have a task ahead of me that requires the use of local temporary tables. For performance reasons I can't use transactions.
Temporary tables, much like transactions, require that all queries come from one connection, which must not be closed or reset. How can I accomplish this using the Enterprise Library Data Access Application Block?
Enterprise Library will use a single database connection if a transaction is active. However, there is no way to force a single connection for all Database methods in the absence of a transaction.
You can definitely use the Database.CreateConnection method to get a database connection. You could then use that connection along with the DbCommand objects to perform the appropriate logic.
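For example, something along these lines (a sketch against an older Enterprise Library version; the exact factory call differs between releases, and the #Work table and values are made up):

    using System.Data;
    using System.Data.Common;
    using Microsoft.Practices.EnterpriseLibrary.Data;

    public static class TempTableExample
    {
        public static void Run()
        {
            Database db = DatabaseFactory.CreateDatabase();

            // Keep one connection open for the whole unit of work so the local
            // temporary table stays alive between commands.
            using (DbConnection connection = db.CreateConnection())
            {
                connection.Open();

                using (DbCommand create = db.GetSqlStringCommand("CREATE TABLE #Work (Id int)"))
                {
                    create.Connection = connection;
                    create.ExecuteNonQuery();
                }

                using (DbCommand insert = db.GetSqlStringCommand("INSERT INTO #Work (Id) VALUES (@id)"))
                {
                    insert.Connection = connection;
                    db.AddInParameter(insert, "@id", DbType.Int32, 42);
                    insert.ExecuteNonQuery();
                }

                // #Work remains visible to any command that reuses this connection;
                // it disappears when the connection is closed.
            }
        }
    }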
Other approaches would be to modify the Enterprise Library source code to do exactly what you want, or to create a new Database implementation that does not perform connection management.
I can't see a way of doing that with the DAAB. I think you are going to have to drop back to ADO.NET connections and manage them yourself, but even then, manipulating temporary tables on the server from a client-side app doesn't strike me as an optimal solution to the problem.

TransactionScope Vs stored procedure

I have a web service hosted on multiple servers, and as traffic increases, race conditions arise. We're using Entity Framework and hosting on Azure. I've been looking into either wrapping the queries in a TransactionScope or moving the logic into a stored procedure and doing the transaction there.
I was wondering what's the difference between using TransactionScope or a stored procedure? What are the best practices for this problem?
I would strongly discourage you from implementing transactions in stored procedures. This can greatly limit the flexibility you have in creating units of work (which a transaction is). Since you are using EF, I would encourage you to manage transactions in your business tier code. In this way, you have greater flexibility in defining and managing units of work.
TransactionScope allows for transactions around your EF statements, so the entire LINQ operation will roll back, whereas a transaction inside a stored procedure will only roll back whatever is processed within the sproc.
Since you are using EF, which lets you interact with the database through LINQ, you might as well go with TransactionScope, IMO.
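For illustration, the TransactionScope route might look like this (a sketch; MyContext, Order, and the status update are hypothetical stand-ins for your own model):

    using System.Data.Entity;
    using System.Linq;
    using System.Transactions;

    public class Order { public int Id { get; set; } public string Status { get; set; } }

    public class MyContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }
    }

    public static class ShippingService
    {
        public static void MarkShipped(int orderId)
        {
            // The business tier owns the unit of work; nothing is committed
            // unless scope.Complete() is reached.
            using (var scope = new TransactionScope())
            using (var db = new MyContext())
            {
                var order = db.Orders.Single(o => o.Id == orderId);
                order.Status = "Shipped";
                db.SaveChanges();

                scope.Complete();
            }
        }
    }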

Entity Framework without a DB?

Is it possible to use Entity Framework 4.3 without linking the model to an actual DB in the back-end?
I need to build a conceptual model of a database in the VS designer and then I'd like to manually handle fetches, inserts and updates to various back-end databases (horrible legacy systems). I need to be able to do this without EF moaning about not having tables mapped, etc. I realise that this is a very odd thing to want to do...
The reason for this is that we would like to move from these legacy systems into a well designed data model and .NET environment, but we need to still maintain functionality and backward compatibility with the old systems during development. We will then reach a stage where we can import the old data (coming from about 6 different databases) into a single DB that matches the EF model I'm building. In theory, we should then be able to switch over from the hacked up EF model to a proper EF model matching the new data structure.
Is this viable? Is it possible to use the EF interface, with LINQ, without actually pointing it at a database?
I have managed to query the legacy systems by overriding the generated DbContext and exposing IQueryable properties which query the old systems. My big fight now is with actually updating the data.
If I can have EF track changes to entities but not actually save those changes, I should be able to override the SaveChanges() method on the context to manually insert into the various legacy tables.
I'm sort of at my wits' end with this issue at the moment.
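A rough sketch of that SaveChanges override idea (assuming an EF 4.3 DbContext; BridgeContext, Customer, and WriteToLegacySystems are hypothetical stand-ins for the manual legacy SQL):

    using System.Data;
    using System.Data.Entity;
    using System.Data.Entity.Infrastructure;
    using System.Linq;

    public class Customer { public int Id { get; set; } public string Name { get; set; } }

    public class BridgeContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }

        public override int SaveChanges()
        {
            var pending = ChangeTracker.Entries()
                .Where(e => e.State == EntityState.Added ||
                            e.State == EntityState.Modified ||
                            e.State == EntityState.Deleted)
                .ToList();

            foreach (var entry in pending)
            {
                WriteToLegacySystems(entry);        // manual ADO.NET against the old databases
                entry.State = EntityState.Detached; // stop EF from generating its own SQL
            }

            return pending.Count; // base.SaveChanges() is deliberately never called
        }

        private void WriteToLegacySystems(DbEntityEntry entry)
        {
            // hypothetical: translate entry.Entity and entry.State into the
            // appropriate legacy INSERT/UPDATE/DELETE statements
        }
    }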
UPDATE 4 Sept 2012: I've opted to use the EDMX file designer to build the data model and to generate the code using T4. This lets me manually write mapping code to suit my needs, and it also allows me to perform the legacy data migration later with relative ease.
If I were in your situation I'd set up the new DB server and link the legacy servers to it, then create stored procedures to interface with EF for the INSERT/UPDATE/DELETE operations. This way your EF code remains separate from the legacy-support messiness. As you decommission the legacy DB servers you can update your stored procedures accordingly. Once you have no more legacy DB servers you can either continue using your sprocs or refresh your EF data connection to use the table schema directly.
Entity Framework exists to link entities to a data store without manual population; otherwise you're just using classes with LINQ.
If you mean you don't want a separate data store like SQL Server, Mongo, etc., then just let your application create the database as an mdb file that gets bundled in your App_Data folder. That means you don't need a database server, so to speak, and the database is part of your app.
If, on the other hand, you want a different way to save to the database, you can create your own data adapters to behave however you like. The MongoDB .NET Entity Framework component is an example of this.
Alternatively, using code only, you can just use stored procedures to persist to the database. That can be a bit verbose and annoying with EF, but it could bridge the gap for you and allow you to build a good architecture with the model you want, translated into the crappy one inside your repositories.
Then, when the new database is ready, you can just rework your repos to use SaveChanges and you're done.
This will, of course, only work with the code-only approach.
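For instance, a repository method could persist through a stored procedure today and be swapped for SaveChanges later (a sketch; LegacyContext, Customer, and dbo.InsertCustomer are invented names):

    using System.Data.Entity;

    public class Customer { public int Id { get; set; } public string Name { get; set; } }

    public class LegacyContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }
    }

    public class CustomerRepository
    {
        // Today: persist through a legacy stored procedure.
        public void Add(Customer customer)
        {
            using (var db = new LegacyContext())
            {
                db.Database.ExecuteSqlCommand(
                    "EXEC dbo.InsertCustomer @p0", customer.Name);
            }
        }

        // Later, once the new schema exists, the body becomes:
        //   db.Customers.Add(customer);
        //   db.SaveChanges();
    }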

MVC 3 and LINQ to SQL or Entity Framework

I'm trying to display the results of a sproc in my MVC 3 web app.
However, the sproc calls into 4 tables on one database and joins them with 5 views (single table views only, thank goodness) on another database. Each (SQL Server) db is on a separate server but that shouldn't matter.
I've read this: http://blogs.msdn.com/b/swiss_dpe_team/archive/2008/02/04/linq-to-sql-returning-multiple-result-sets.aspx
and this:
http://www.codeproject.com/KB/dotnet/linqToSql5.aspx
and still cannot determine whether I should use the DataContext classes or just embed the straight SQL.
Perhaps there is a better way to return my results than LINQ to SQL (15 columns, 3 different data types)? I need to update the tables as well. The user will have the ability to update each value if they choose. Is this a task best suited for the entity framework classes?
I plan on using the repository pattern so I can change data access technology if I must but would rather make the correct decision the 1st go 'round.
I was hoping for a resource that was more up-to-date than say, NerdDinner and more robust than the movie apps for MVC3 that abound, particularly implementing the sproc results inside a view. Any suggestions would surely be appreciated. Thanks.
Once you plan to "update" the data, you are going to have to handle it all through stored procedures. Neither LINQ to SQL nor Entity Framework will help you with this, because they cannot persist changes to something created from an arbitrary query. You should check very carefully whether you can even track the data back to the correct record in the correct table. Generally, the result of a stored procedure is mostly for viewing data; once you want to modify the data you must work with each table directly, or again use a stored procedure that does the task. Working with tables from multiple databases can be pretty complex in Entity Framework (EF doesn't support objects from multiple databases in one entity model).
Also, what do you mean by 15 columns, 3 different data types? Stored procedure support in both LINQ to SQL and Entity Framework returns an enumeration of one flattened data type containing 15 properties.
I'm not really aware of anything LINQ to SQL can do that Entity Framework can't, so EF seems the better solution in this case. You can add a stored procedure to your Entity Framework model as well, so you can just have it call the procedure and deal with whatever comes back.
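If the procedure isn't imported into the model, you can also materialize its result set directly into a flattened row type (a sketch; DashboardRow and dbo.GetDashboardData are invented names matching the 15-column shape described above):

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Linq;

    // One flattened row type whose properties match the sproc's 15 columns
    // (only a few shown here).
    public class DashboardRow
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Amount { get; set; }
    }

    public class ReportContext : DbContext
    {
        public List<DashboardRow> GetDashboard(int customerId)
        {
            // Column names in the result set must match the property names.
            return Database.SqlQuery<DashboardRow>(
                "EXEC dbo.GetDashboardData @p0", customerId).ToList();
        }
    }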
Since the end goal involves accessing the same databases with either technology, and they will be using SQL to retrieve the data either way, it's really a subjective answer.
I would use whichever technology you are most comfortable with and focus more on the implementation. Both data access platforms are built on ADO.NET and are, for the most part, equally powerful.
Regardless of the technology, I would evaluate how the data is accessed and make implementation decisions based on that.