Can we dynamically create a table in the database from our web application using EF, where the user defines the fields for the table?
I am working with MVC 4.
Is it possible, and if so, how do I do it?
It depends on what you understand by dynamic here. Changing the compiled Context at runtime isn't possible.
Generating, compiling, and loading NEW code is. So a new context and POCO classes can be declared, compiled, and loaded, all controlled by a piece of running code.
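By way of illustration, here is a minimal sketch of that idea using CodeDOM; the entity, context, and connection-string names are assumptions made up for this example, not a recommended production setup.

    using System;
    using System.CodeDom.Compiler;
    using System.Data.Entity;
    using Microsoft.CSharp;

    public static class DynamicModelBuilder
    {
        public static void CreateUserDefinedTable()
        {
            // Source for a brand new POCO + context, built from user-defined fields.
            // Everything here (names, fields, connection string) is illustrative.
            string source = @"
                using System.Data.Entity;
                namespace Dynamic
                {
                    public class Customer
                    {
                        public int Id { get; set; }
                        public string Name { get; set; }   // user-defined field
                    }

                    public class DynamicContext : DbContext
                    {
                        public DynamicContext() : base(""name=DynamicDb"") { }
                        public DbSet<Customer> Customers { get; set; }
                    }
                }";

            using (var provider = new CSharpCodeProvider())
            {
                var parameters = new CompilerParameters { GenerateInMemory = true };
                parameters.ReferencedAssemblies.Add("System.dll");
                parameters.ReferencedAssemblies.Add("System.Core.dll");
                parameters.ReferencedAssemblies.Add("System.Data.dll");
                parameters.ReferencedAssemblies.Add(typeof(DbContext).Assembly.Location); // EntityFramework.dll

                CompilerResults results = provider.CompileAssemblyFromSource(parameters, source);
                if (results.Errors.HasErrors)
                    throw new InvalidOperationException("Dynamic compilation failed.");

                Type contextType = results.CompiledAssembly.GetType("Dynamic.DynamicContext");
                using (var context = (DbContext)Activator.CreateInstance(contextType))
                {
                    // Creates the new table(s) the first time the context is used.
                    context.Database.CreateIfNotExists();
                }
            }
        }
    }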
However, the ongoing lifecycle gets complicated.
The use of code first migrations becomes tricky.
How to manage changes to extensions in production.
How to merge ongoing dev and prod.
Managing the mapping in Fluent API in generated code is awful.
You can't do it all with attributes.
The context backflips and automated migrations nightmares have cost me many a night.
Dealing with natural extensions to the main build versus production/localised implementation extensions gets VERY hard. You need to make sure they are incorporated back into the build.
And a must: you need to know how to manage different contexts on the same DB at the same time.
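One small piece of that last puzzle, as a hedged sketch (the context and configuration names here are made up): when two contexts share one database, you typically let only one of them create or migrate the schema and silence the initializer on the other.

    using System.Data.Entity;
    using System.Data.Entity.Migrations;

    // Illustrative context names: only MainContext owns the schema.
    public class MainContext : DbContext { }
    public class ReportingContext : DbContext { }

    public class MainConfiguration : DbMigrationsConfiguration<MainContext> { }

    public static class ContextSetup
    {
        public static void Configure()
        {
            // MainContext creates and migrates the database...
            Database.SetInitializer(new MigrateDatabaseToLatestVersion<MainContext, MainConfiguration>());
            // ...while the second context is never allowed to touch the schema.
            Database.SetInitializer<ReportingContext>(null);
        }
    }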
I have exactly this UNFINISHED project at the moment. I have put it on hold until we are on EF6.
I expect to spend several weeks on this topic alone.
And I'm still not sure I want to continue down this path.
I am seriously considering moving to a metadata / bucket-data approach instead.
So it is possible, but difficult (in my opinion at least). It will be easier in EF6 due to the multiple-context support, but it will still suffer from most of these issues and will still require code generation and compilation.
Good luck...
I'm starting a new project and I'm interested in using Entity Framework. However, since this is a new project, there isn't an existing model or database yet, so I could either use database-first or model-first in this situation.
When starting with a blank slate like this, is it recommended to use model-first and let EF determine the database design or design the database yourself? I'm comfortable designing normalized databases so I'm not afraid of that aspect, but I'm not sure if there are maintainability and performance benefits from letting EF handle all the database design.
Thanks in advance guys!
In a greenfield project my choice would be code first, or possibly model first. As your project progresses over time, you should only be dealing with code and have EF manage database schema changes for you. Using other approaches can shift focus from core activities to model maintenance. On a project I worked on, we were coerced by management into using a database created by a dedicated schema developer who only knew how to use SQL Server Management Studio. Therefore, every change in the database meant we had to regenerate the code. This eventually caused us to spend two days developing tooling for automatic importing of the DB schema and generation of the object model (this was in EF 3.5 days).
I'd leave the database-first approach for brownfield and maintenance projects.
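To make the comparison concrete, a code-first model is nothing more than plain classes like these (a minimal sketch; the class and property names are just examples), and Code First Migrations then evolves the database schema from that code.

    using System.Data.Entity;

    // The entire model lives in code; EF derives the schema from it.
    public class Blog
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    public class BlogContext : DbContext
    {
        public DbSet<Blog> Blogs { get; set; }
    }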
Check out link1 and link2, then decide; it's up to you.
Well, as a developer, comfortable with any old way to build databases, I've found that code first lets me concentrate on my task. Your mileage may vary.
I've used it many times and have been able to maintain my databases using the code-first style without difficulty.
I am currently working on a project where I want to use the Entity Framework for the first time. I read a lot of information in the books by Lerman/Miller, on MSDN, on the ADO.NET blog, and here on Stack Overflow about the most recent developments regarding the DbContext API and the Code First Migrations capabilities available since EF 4.3.
Since the latter in particular is really great, I wondered whether in the meantime it has become possible to do the same thing centered on "Model First". Is it possible to do migrations based on the visual EDMX designer (instead of the code) after creating the code and the database from it?
I found a pretty recent link (2012-04) saying that it is not possible (in EF 4.3):
Using EF4 migration tool with model-first approach
Secondly, I found information that old code and database tables are overwritten when trying to regenerate them from the EDMX designer. However, the info I refer to is about the Power Tools:
http://blogs.msdn.com/b/adonet/archive/2012/04/09/ef-power-tools-beta-2-available.aspx
Reverse Engineer Code First
This command allows one-time generation of Code First mappings for an existing database. [..]
• Running this command multiple times will overwrite any previously generated files, including any changes that have been made to generated files
Is this restriction specific to the Power Tools' Reverse Engineer Code First command, or does it extend to the EDMX designer in general, especially the "Model First" approach, too?
Furthermore, in the above article I found:
View Entity Data Model (Read-only)
Displays the Code First model in the Entity Framework designer.
• This is a read-only representation of the model; you cannot update the Code First model using the designer.
And the same question applies here.
So is there currently a way to get full round-trip modelling without data loss (code and database) while keeping the EDMX file writable, preferably following "Model First"?
From which version of EF on is this supported (already or planned)? Which version of .NET should I target then (is 4.0 sufficient?), and will this work with Visual Studio 2010 Professional? Could you give a rough estimate of the date by which you might have this implemented?
This would of course be awesome and a huge breakthrough! I think I can only roughly imagine how much work this would be, and I am aware that you are already working at full capacity. I want to thank you for your great work so far and encourage you to keep it up.
If I understand your question correctly, you are after Migrations for the EF Designer (i.e. update the model in the designer and have the database incrementally changed). This isn't currently supported; it is on our backlog to address, but we don't have specific plans for a particular release. One of the things we need to work out is whether we should just integrate/extend the existing Code First Migrations feature to work with the EF Designer or whether we need something that is a bit more designer-focused.
Obviously things can change, but at this stage I wouldn’t be expecting us to start working on this feature in the next 6 months. Beyond that it’s going to depend on what features we see folks asking for… so I would create a new feature on http://data.uservoice.com and get folks voting on it.
~Rowan
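As a point of reference, the code-based migrations discussed above are explicit classes roughly like this one; the table and column names here are made up for illustration.

    using System.Data.Entity.Migrations;

    // An explicit Code First migration: Update-Database runs Up(), and
    // Down() describes how to revert the same change.
    public partial class AddCustomerEmail : DbMigration
    {
        public override void Up()
        {
            AddColumn("dbo.Customers", "Email", c => c.String(maxLength: 256));
        }

        public override void Down()
        {
            DropColumn("dbo.Customers", "Email");
        }
    }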
My app has a potentially very large CoreData datastore underneath it (could easily be upwards of 30MB). I've started noticing memory issues when using the automatic migration (addPersistentStoreWithType:configuration:URL:options:error:) so I started looking into the methods of migrating smaller parts of the store to avoid all the CoreData object buildup that happens when you migrate everything at once.
This is discussed in the official documentation in the "Multiple Passes" section; however, it looks like their suggested approach is to divide up your migration by entity type, i.e. make multiple mapping models, each of which migrates a subset of the entity types from the complete data model.
The only problem is - what if one entity type is the majority of your datastore? Going by the Apple-recommended approach, that whole entity type is still going to be done in a single migration and the memory issues will presumably persist.
Are there any techniques available to actually migrate a sub-set of entities of a specific type to guarantee that you will not run out of memory when trying to migrate them all?
Thanks in advance for any help.
EDIT: After doing more digging, I have discovered that the Apple-recommended split of the DB into entity types actually only works for non-related entities (as discussed here), so it's even less likely to solve the problems of a real-world DB than I thought when I originally wrote this post.
I'm starting to think that CoreData migrations that are actually done through NSMigrationManager don't scale at all now and you basically can't have a DB that is bigger than about 20-30MB if you want to be able to migrate it on current generation iOS devices. The only viable approach seems to be to short-circuit all the NSMigrationManager / NSMappingModel stuff completely and write the migration completely custom in code. Seems like a huge oversight on Apple's part if this is actually the case.
I was able to get around this in the short-term by leveraging "lightweight" migration, as described in http://developer.apple.com/library/mac/#documentation/Cocoa/Conceptual/CoreDataVersioning/Articles/vmLightweight.html#//apple_ref/doc/uid/TP40008426-DontLinkElementID_1.
The trick is to pull the NSMigrationManager subclass that is specifically for SQLite store types when you're manually invoking the migration, as shown in the last code sample on that page.
This isn't a general-purpose fix though since it only works if the schema change in your datastore is simple enough that lightweight migration is possible. Still waiting to hear back from Apple as to what the recommended solution is when you're dealing with a non-trivial mapping.
I was surprised to find a public letter proposing a vote of no confidence in the entity framework (see http://efvote.wufoo.com/forms/ado-net-entity-framework-vote-of-no-confidence/)
Would the reasons stated in the letter keep you from using the current version of the entity framework? Would you rather wait for v4.0? Or rather use another ORM?
The current version of EF is definitely not perfect, and has lots of gotchas and drawbacks. I probably wouldn't use it right now - but the upgrade path to EF v2 (or is it EF4?) sure looks pretty rosy!
• complete persistence ignorance - you can use your straight up POCO classes
• deferred loading configurable as an option
• much improved designer with support for pluralization/singularization (even in multiple languages!)
• ability to do "domain first" design and create database from your model
• ability to have self-tracking entities across multiple layers that allow you to send data to the client and get back changes and apply them to your entity context
All in all, EF v2 looks very promising and I'm very eager to give it a serious spin. If it really keeps all the promises out there right now, it's definitely a winner!
Check out the ADO.NET team blog for a flurry of recent blog posts on EF v2.
Marc
Another ORM.
Don't get me wrong, you'll probably get flamed with responses, but currently only NHibernate is functionally complete.
I'm a TDD fan, so I want an easily testable POCO ORM solution. If that's your bag, then EF 3.5 is out. EF 4.0 is introducing it (http://blogs.msdn.com/adonet/archive/2009/05/21/poco-in-the-entity-framework-part-1-the-experience.aspx), but it still has at least one big drawback -> it doesn't support inheritance.
NHibernate is more complete, but EF could be easier to use. As ever, best tool for the job... but if it's an enterprise-scale, TDD-developed app, go NHibernate.
Also -> there's a profiler that makes nHibernate dev much easier -> http://www.nhprof.com/
I tried using it for my current project, which basically involves rewriting our current mess of a data layer.
It just doesn't work.
First, if you're trying to base an Entity off of a View, the designer tries to force every NOT NULL property to be an entity key... which is pretty much never what I wanted. To work around that, you have to edit the XML in at least two places, and do it every time you add an object, because the designer refreshes and re-adds the EntityKey properties. Must specify mapping for all key properties in Entity Framework?
Second, when you are creating associations you MUST use every entity key - How can you make an association without using all entity keys in entity framework?
Those two things held me up for three days; then I went back to LINQ to SQL and had it done in a couple of hours. (Well, at least the part of the system I was struggling with...) I don't know if those issues are in the Vote of No Confidence, but it's just not ready, in my opinion.
Also, given the lack of answers I got here on every EF question I've asked, I have to assume current usage is so low that getting help and support is going to be difficult... which is possibly the BIGGEST reason not to use something.
Let's hope the next version is better...
EDIT: Our current plan is to stick with LINQ to SQL (I have to finish a project by Friday) and then evaluate all the other ORMs to see if anything else is better. The other developer hates L2S, for the record, but I've never had any major problems using it...
EF has some rich design-time support, but I have to agree that NHibernate is the way to go, despite the learning curve. If you need to make something fast and don't care about TDD or serialization (which is a large weakness of all of MS's ORM offerings), go with EF.
Well, my experience of version 1 was interesting. I wanted to use POCOs, but it didn't support them. After reading around, I came across some code from a bod at Microsoft that did this.
It was a bit messy to generate the code, but on the whole this part of the process was not so bad.
A really nasty part that I came across was the lack of built-in concurrency checking for n-tier development. You have to manage this yourself, which, after looking at the problem, was not so bad, especially if you want to hand the versioning back to the client for user intervention.
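The kind of check you end up writing by hand looks roughly like this sketch, assuming a rowversion/timestamp column on the table; the table, column, and method names are illustrative.

    using System.Data.SqlClient;

    public static class OrderRepository
    {
        // Optimistic concurrency by hand: only update the row if the version
        // the client originally read is still the current one.
        public static bool UpdateName(string connectionString, int id, string newName, byte[] originalVersion)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "UPDATE dbo.Orders SET Name = @name " +
                "WHERE Id = @id AND RowVersion = @originalVersion", connection))
            {
                command.Parameters.AddWithValue("@name", newName);
                command.Parameters.AddWithValue("@id", id);
                command.Parameters.AddWithValue("@originalVersion", originalVersion);

                connection.Open();
                // 0 rows affected means someone else changed the row first:
                // hand the conflict back to the client for user intervention.
                return command.ExecuteNonQuery() == 1;
            }
        }
    }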
The second nasty and absolutely stupid thing missing was the IN keyword for LINQ queries. It is not supported and so needs to be worked around. I found a solution, but it was a real mess, bringing in some other code that quickly patched up the omission.
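One common shape for that kind of workaround (not necessarily the code referred to above) is to build the equivalent of IN as a chain of OR'ed equality comparisons in an expression tree, for example a hypothetical WhereIn helper like this:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Linq.Expressions;

    public static class QueryableExtensions
    {
        // Builds x => x.Key == v1 || x.Key == v2 || ... so that providers
        // without IN/Contains support can still translate the filter.
        public static IQueryable<T> WhereIn<T, TValue>(
            this IQueryable<T> source,
            Expression<Func<T, TValue>> selector,
            IEnumerable<TValue> values)
        {
            ParameterExpression parameter = selector.Parameters[0];
            Expression body = null;

            foreach (TValue value in values)
            {
                Expression equal = Expression.Equal(
                    selector.Body, Expression.Constant(value, typeof(TValue)));
                body = body == null ? equal : Expression.OrElse(body, equal);
            }

            if (body == null)
                return source.Where(x => false); // an empty value list matches nothing

            return source.Where(Expression.Lambda<Func<T, bool>>(body, parameter));
        }
    }

Usage would then look something like context.Orders.WhereIn(o => o.CustomerId, customerIds), where Orders and CustomerId are purely hypothetical names.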
Would I use EF 4.0 (2.0)? Yes, absolutely, why not? In fact, in stage 2 I will be using it. It looks like it supports POCOs, and it looks like my concurrency model will move straight across with no problems (basically delta-copy stuff). It's all good so far, and I hope this time round the big guys at Microsoft have seen the error of their ways and provided a solution that works.
If you're buying into entity development and the whole concept-model-first thing, then it's the only way to go for a complete Microsoft solution, although the work being done on the M language might eclipse the idea and move the whole modelling thing back to the database.
If you're not buying into the entity stuff, then I would strongly recommend Enterprise Library. It's a proven technology that works every time, built on a solid code foundation and a database-centric paradigm. I would also go this route if you think that stored procedures are the bee's knees and like what they bring to the table.
If you're feeling really exotic and a bit frisky, I would go with a NoSQL approach such as CouchDB. This, however, does take some getting used to. It's damn weird and feels really, really wrong, but things get developed in super quick time and the solutions seem to be robust and faster than expected. I would not go for this type of solution, though, if you're big into normalization and think that it can be applied to a NoSQL approach. The whole model needs to be turned on its head, and the application needs to be modelled in a way that is driven by the technology applied.
I find the CouchDB way a bit dirty and very, very wrong, but there are so many compelling reasons to use it that I think it will seep into the psyche of every programmer, and it will definitely go mainstream in the next couple of years.
My biggest gripe with the whole entity thing, though, even in the new version 4, is that there really has not been much thought put into n-tier environments. It still feels like a 2-tier solution, with a lot of boilerplate code still needing to be written by the end user (developer) to get it working in a robust and dependable n-tier way.
We've been having some discussions at work recently on approaches to using the Entity Framework. We have a fairly large and complex n-tier web-based application which is due for a major overhaul.
The question is: if we were to start using the Entity Framework, would it be better to create one big model or a set of smaller functional/activity-based models?
I have my own opinions on this, but would be interested to hear what some other people think.
Update (17th November 2008):
I have been creating one model, wiping it out and re-creating it, etc., for small projects at home. Although I haven't tried it, I suspect that this approach will be a bit more challenging when there are a large number of entity types involved.
Also, does anyone have any experience of using EF with a large team using TFS or similar?
In my experience with it, I would just make one big model of the database. Otherwise, it might be hard to track what tables changed where. When I make changes to the database, I just delete all the tables in the model and regenerate it.
Of course, I also didn't customize my model by adding "entity" functionality to it (not sure how that works exactly).
So I'm no expert in it, but I usually end up using the LINQ-To-SQL models/objects instead of the Entity Framework - it's worked better for me so far.