LINQ to Entities seems much easier and safer than Entity SQL.
Can you give an example where using Entity SQL makes more sense than LINQ to Entities?
Personally I think the best use for EntitySQL is when you don't have any CLR classes for your entities:
I.e. using eSQL you can query against the conceptual model (with all the mapping flexibility the Entity Framework offers) without actually needing classes for each of your entities.
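For example, a minimal sketch of what that looks like through EntityClient (the "MyEntities" named connection string, container name and Customers entity set are assumptions, and this uses the pre-EF6 System.Data.EntityClient namespace), streaming the results as plain data records:

```csharp
using System;
using System.Data;
using System.Data.EntityClient;

class EsqlWithoutClasses
{
    static void Main()
    {
        // "name=MyEntities" resolves a named connection string from the config file.
        using (var conn = new EntityConnection("name=MyEntities"))
        {
            conn.Open();
            using (var cmd = new EntityCommand(
                "SELECT c.Name, c.City FROM MyEntities.Customers AS c WHERE c.City = @city",
                conn))
            {
                cmd.Parameters.Add(new EntityParameter("city", DbType.String) { Value = "London" });

                // EntityCommand readers must be opened with SequentialAccess.
                using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
                {
                    while (reader.Read())
                    {
                        // Plain column access -- no CLR entity classes anywhere.
                        Console.WriteLine("{0} ({1})", reader.GetString(0), reader.GetString(1));
                    }
                }
            }
        }
    }
}
```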
Alex
Entity SQL was the original language design for use with EF and bears many similarities with T-SQL. It can be a very useful stepping stone for migrating your DBAs to a domain-modelling framework. They can then move on to 'proper' LINQ to SQL.
The main difference is that the first execution of an Entity SQL query is much faster than the first execution of a LINQ to Entities query (if the latter is not using CompiledQuery).
One more EFv1-specific point: internal database functions (defined in the provider manifest) could only be called from Entity SQL queries.
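For reference, the CompiledQuery workaround looks roughly like this (MyEntities is a hypothetical ObjectContext-derived class with an Orders entity set):

```csharp
using System;
using System.Data.Objects;
using System.Linq;

static class OrderQueries
{
    // The LINQ to Entities expression is translated to SQL once and cached,
    // so executions no longer pay the full translation cost up front.
    public static readonly Func<MyEntities, int, IQueryable<Order>> ByCustomer =
        CompiledQuery.Compile((MyEntities ctx, int customerId) =>
            ctx.Orders.Where(o => o.CustomerId == customerId));
}

// Usage: var orders = OrderQueries.ByCustomer(context, 42).ToList();
```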
Devart Team
http://www.devart.com/dotconnect
ADO.NET data providers for Oracle, MySQL, PostgreSQL, SQLite with
Entity Framework and LINQ to SQL support
You can specify a (database) collation in eSQL, which you can't do in L2E (or LINQ-to-anything, actually). You can also restrict to an exact type with IS OF (ONLY ...), which you can't do in L2E.
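For illustration, both might look roughly like this in eSQL (the MyEntities container, the Customers/People entity sets, the MyModel.Employee type and the collation name are all invented):

```csharp
using System.Data.Common;
using System.Data.Objects;

static class EsqlOnlyFeatures
{
    // ctx is an ObjectContext-derived MyEntities instance (hypothetical model).
    public static void Run(MyEntities ctx)
    {
        // Order by a specific database collation.
        ObjectQuery<DbDataRecord> byName = ctx.CreateQuery<DbDataRecord>(
            "SELECT c.Name FROM MyEntities.Customers AS c " +
            "ORDER BY c.Name COLLATE Latin1_General_CS_AS");

        // Keep only instances of the exact type, excluding subtypes.
        ObjectQuery<Person> employeesOnly = ctx.CreateQuery<Person>(
            "SELECT VALUE p FROM MyEntities.People AS p " +
            "WHERE p IS OF (ONLY MyModel.Employee)");
    }
}
```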
I would like to ask the Entity Framework Core team what their ambition is for the scope/complexity of query translation compared to EF6.
I've used EF6 extensively and I know that if you can express it in LINQ and don't use any untranslatable functions, EF can probably translate the query correctly.
Will EF Core's translation eventually be as good as that, or is it considered secondary, like the lazy loading feature?
If so, roughly what is the team eventually aiming for compared to EF6?
There's a ticket discussing GroupBy that appears to indicate the team deems grouping an advanced type of query, but compared to what EF6 can translate, a normal group-by is fairly basic.
(I'm asking here as the EF Core team says on its site that it monitors SO for questions.)
We took a very different approach in EF Core. Every LINQ query should work--even if you use untranslatable functions. We do this by translating the parts of the query we can into SQL and processing the rest on the client after the results are returned by the server. As EF Core evolves, we'll translate more and more of the query into SQL (e.g. GROUP BY) which can make it more efficient.
In theory, our goal is to translate everything that the store supports. In some cases however (especially on NoSQL stores) there simply is no translation for a LINQ operator, and we feel it's better to be functional and inefficient than to throw.
If you want to ensure your whole query is translated, you can disable client evaluation. This will cause it to throw like EF6.
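As a sketch of how that opt-out looks against the EF Core 2.x APIs (the OrdersContext class and connection string below are made up):

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Diagnostics;

public class OrdersContext : DbContext
{
    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder
            .UseSqlServer("Server=.;Database=Orders;Trusted_Connection=True;")
            // Turn the "query is being evaluated on the client" warning into
            // an exception, so untranslatable queries fail like they do in EF6.
            .ConfigureWarnings(warnings =>
                warnings.Throw(RelationalEventId.QueryClientEvaluationWarning));
    }
}
```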
We have a very large established database (PostgreSQL, for that matter) that suffers from a lack of standardization, and I'd like to use Entity Framework in our upcoming ASP.NET MVC application.
Is there a way to write the entity mappings manually?
I really don't want to auto-generate entities with the Database First approach, and I don't have the option, for now, of migrating/updating our schema.
Thank you.
Edit: after some more thinking, we decided to go with NHibernate and Fluent NHibernate for the mappings.
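For anyone landing here, a minimal sketch of the kind of hand-written mapping this gives you (the User class and the "users" table and column names are invented):

```csharp
using FluentNHibernate.Mapping;

public class User
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

public class UserMap : ClassMap<User>
{
    public UserMap()
    {
        // Map explicitly onto the existing, non-standard PostgreSQL names.
        Table("users");
        Id(x => x.Id).Column("user_id");
        Map(x => x.Name).Column("user_name");
    }
}
```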
I'm trying to display the results of a sproc in my MVC 3 web app.
However, the sproc calls into 4 tables on one database and joins them with 5 views (single table views only, thank goodness) on another database. Each (SQL Server) db is on a separate server but that shouldn't matter.
I've read this: http://blogs.msdn.com/b/swiss_dpe_team/archive/2008/02/04/linq-to-sql-returning-multiple-result-sets.aspx
and this:
http://www.codeproject.com/KB/dotnet/linqToSql5.aspx
and still cannot determine whether I should use the dataContext classes or just embed the straight SQL.
Perhaps there is a better way to return my results than LINQ to SQL (15 columns, 3 different data types)? I need to update the tables as well. The user will have the ability to update each value if they choose. Is this a task best suited for the entity framework classes?
I plan on using the repository pattern so I can change data access technology if I must but would rather make the correct decision the 1st go 'round.
I was hoping for a resource that was more up-to-date than say, NerdDinner and more robust than the movie apps for MVC3 that abound, particularly implementing the sproc results inside a view. Any suggestions would surely be appreciated. Thanks.
Once you plan to update the data, you are going to have to handle it all through stored procedures. Neither LINQ to SQL nor Entity Framework will help you with this, because they are not able to persist changes to something created from an arbitrary query. You should check very carefully whether you are even able to track the data back to the correct record in the correct table. Generally the result of a stored procedure is mostly for viewing data; once you want to modify the data you must work with each table directly, or again use a stored procedure that does the task. Working with tables from multiple databases can be pretty complex in Entity Framework (EF doesn't support objects from multiple databases in one entity model).
Also, what do you mean by 15 columns, 3 different data types? Stored procedure support in both LINQ to SQL and Entity Framework will return an enumeration of one flattened type containing 15 properties.
I'm not aware of anything that LINQ to SQL can do that Entity Framework can't, really, so EF seems to be a better solution in this case. You can add a stored procedure to your Entity Framework model as well, so you can just have it call the procedure and deal with whatever comes back.
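For instance, with an EF 4.1+ DbContext one option is Database.SqlQuery&lt;T&gt; (a function import on the EDMX model is the other common route); the ReportRow shape, the repository wrapper and the dbo.GetReport name below are assumptions about your schema:

```csharp
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Linq;

public class ReportRow
{
    public int Id { get; set; }
    public string Name { get; set; }
    public DateTime CreatedOn { get; set; }
    // ...remaining columns, matched to the result set's column names
}

public class ReportRepository
{
    private readonly DbContext _context;

    public ReportRepository(DbContext context)
    {
        _context = context;
    }

    public List<ReportRow> GetReport(int regionId)
    {
        // Columns in the result set are mapped to ReportRow properties by name.
        return _context.Database
            .SqlQuery<ReportRow>(
                "EXEC dbo.GetReport @regionId",
                new SqlParameter("@regionId", regionId))
            .ToList();
    }
}
```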
Since the end goal involves accessing the same databases with either technology, and they will both be using SQL to retrieve the data, it's really a subjective answer.
I would use whatever technology you are most comfortable with and focus more on the implementation. Both data access platforms are built on ADO.NET and are, for the most part, equally powerful.
Regardless of the technology, I would evaluate how the data is accessed and make implementation decisions based on that.
We know that we can generate an EDMX model from SQL Server because EF supports SQL Server. If my database is Oracle, MS Access, or MySQL, is that supported as well? Does EF support ODBC?
EF is database independent, but it requires an EF ADO.NET provider to be supplied for the database. You can check the list of databases offering such a provider; MS Access is not among them.
The independence is a little bit theoretical, because if you are using EDMX, its SSDL part is always bound to a single provider. If you want to support more databases, you must have a separate SSDL, or a whole EDMX, for each provider. This is not a problem with EFv4.1 and the code-first approach.
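As a rough sketch of that code-first point (the Blog/BloggingContext names and the "BloggingDb" connection string name are made up): there is no EDMX/SSDL in the model below, so switching databases is largely a matter of pointing the connection string at a different EF ADO.NET provider.

```csharp
using System.Data.Entity;

public class Blog
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class BloggingContext : DbContext
{
    // "BloggingDb" is an assumed connection string name in App.config/Web.config;
    // its providerName attribute decides which EF ADO.NET provider is used,
    // so no provider-specific SSDL is baked into the model.
    public BloggingContext() : base("name=BloggingDb") { }

    public DbSet<Blog> Blogs { get; set; }
}
```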
What is faster - ADO.NET or ADO.NET Entity Framework?
Nothing is faster than an ADO.NET DataReader.
Entity Framework also uses this under the hood.
However, Entity Framework helps you map from the database to objects; with ADO.NET you have to do that yourself.
How fast it is depends on how you program it.
When you use ADO.NET DataTables as "objects", they are a bit slower and more memory-hungry than plain objects.
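To make the comparison concrete, here is a rough sketch of the hand-written mapping a plain DataReader requires (the Products table, columns and connection string are assumptions); this is the work EF's mapping layer does for you:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class ProductReader
{
    // With a plain DataReader the database-to-object mapping is written by hand.
    public static List<Product> LoadAll(string connectionString)
    {
        var products = new List<Product>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT Id, Name FROM Products", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    products.Add(new Product
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return products;
    }
}
```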
As Julian de Wit says nothing is faster than ADO.NET DataReaders.
ADO.NET Entity Framework is a wrapper over the old ADO.NET.
It is a provider-independent ORM and EDL system.
It gives us a lot of benefits that we had to hand-craft or "copy & paste" in the past.
Another benefit that comes with it is that it is completely provider independent.
Even if you like the old ADO.NET mechanism, or you are a dinosaur like me (:P), you can use the Entity Framework through the EntityClient (like SqlClient or MySqlClient) and use the power of Entity SQL, which is provider independent.
I know that with ADO.NET you can write a data access layer and the DataReaders etc. can be "independent", but you still have queries that are provider specific.
On the other hand, in an enterprise application you may never want to change the data provider.
But as the technology grows, new needs always arise and you may have to alter the database schema.
When that happens with the old ADO.NET approach, we have to refactor a lot of code, which is hardly maintainable no matter how much code we reuse.
The performance will be affected, but with all the caching technologies out there we can overcome this.
As I always say, "C is fast, Assembly even more... but we use C#/VB.NET/Java".