NHibernate, Entity Framework or LightSpeed: which is preferable? - entity-framework

Can anyone tell me which is best suited for performance-oriented applications?

All of the above. Or none of the above. No way to tell without measuring performance and seeing which one does or does not work for you.

I would agree with the existing answers here: Understand what performance really means to your application before going off half-cocked on something (most of us have been there). If you're looking for something super-performant that still has some "ORM-ish" behavior and takes some of the monkey coding out of the ADO.NET equation, take a look at the various .NET micro-ORMs out there, such as:
Dapper (with extensions)
ServiceStack's OrmLite
Insight.Database
Massive
There are several others out there, some of which are referenced from the Dapper site.
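For a flavor of what the micro-ORM approach looks like, here's a minimal Dapper sketch (the Customer class, table and connection string are made up; Dapper's Query extension method on IDbConnection is the real API):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerQueries
{
    public static IEnumerable<Customer> GetActive(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // Dapper maps result columns onto Customer by name and
            // turns the anonymous object into SQL parameters.
            return connection.Query<Customer>(
                "SELECT Id, Name FROM Customers WHERE IsActive = @active",
                new { active = true });
        }
    }
}
```

You still write the SQL yourself, which is exactly why these libraries tend to win on raw speed.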
If you really are stuck with those three choices, it definitely does depend on a lot of factors and how much time you spend tuning. That being said, I've used all three quite a bit, especially NHib 2-3 and EF 4-6. I think if you are doing just quick-and-dirty coding without spending a lot of time on optimizing, LightSpeed is a really good choice and I've personally found it to outperform the other two very handily when it comes to most basic CRUD operations and LINQ queries.
The big downside of LightSpeed is that you have to inherit from their base classes. This is somewhat mitigated by partial-class support, and you can also insert your own base classes in between. There's also no true "code-first" support, although you can hand-code the classes and skip the designer if you like. They all work well if tuned properly. Just pick the right tool for the job.
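To illustrate that mitigation, the usual shape is to slot a base class of your own between LightSpeed's and your entities, and put hand-written members in partial classes. (Treat the Entity<int> base type and namespace below as assumptions from memory - check the Mindscape docs for the exact names.)

```csharp
using Mindscape.LightSpeed;   // assumed namespace; verify against the LightSpeed docs

// Your own layer in between, so cross-cutting members live in one place.
public abstract class DomainEntity : Entity<int>
{
    // auditing helpers, common validation, etc. go here
}

// The designer-generated half of an entity is a partial class...
public partial class Customer : DomainEntity
{
    // generated fields/properties would live here
}

// ...so hand-written behaviour can go in a second file without touching generated code.
public partial class Customer
{
    public string DisplayName
    {
        get { return "Customer #" + Id; }   // Id assumed to come from the LightSpeed base entity
    }
}
```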
Whichever you choose, use your SQL Profiler / MiniProfiler / NHProf / EFProf, etc.

Related

Azure Hadoop and Entity Framework

I am starting a new project that needs to be portable and in some scenarios will have hundreds of millions of entities.
Now, with Azure getting Hadoop, that of course got my attention for the big-data scenario. But I also have a small-data scenario of under 1 million rows.
Entity Framework code-first is the way I see designing this, but needing Hadoop in the mix may of course complicate things (Entity Framework being used to give simpler storage providers for the smaller data sets).
Now the question is: does anyone have experience with this mix?
Can anyone say whether this is a good approach or not? If not, is there a better way?
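For context, the code-first shape I have in mind for the small-data side is roughly this (entity and context names are made up; this is the DbContext API that shipped as EF 4.1 Code First):

```csharp
using System.Collections.Generic;
using System.Data.Entity;   // EF Code First (4.1+)

// Hypothetical entities for the "small data" scenario.
public class Sensor
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Reading> Readings { get; set; }
}

public class Reading
{
    public long Id { get; set; }
    public int SensorId { get; set; }
    public double Value { get; set; }
    public virtual Sensor Sensor { get; set; }
}

// The context doubles as the unit of work; the connection string decides
// where it points (SQL Server, SQL Azure, SQL CE, ...).
public class TelemetryContext : DbContext
{
    public TelemetryContext(string connectionStringName) : base(connectionStringName) { }

    public DbSet<Sensor> Sensors { get; set; }
    public DbSet<Reading> Readings { get; set; }
}
```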
I work on a reasonably large system based on Entity Framework Code First, with the caveats that I have been working with EF4 and cannot upgrade to 5, that your mileage may vary, and that the outcomes are strongly affected by what you intend to do. My experience has been that EF doesn't handle large amounts of data tremendously well and is quite inflexible: if you need to change its standard behaviour in some way, there is a good chance you will end up hacking in some nasty workarounds, and performance isn't amazing. If you want to do things that aren't exactly what EF expects you to do, you can run into walls.
For a relatively simple/small-scale ASP.NET MVC setup, I think EF is a really good choice. For a larger-scale operation where you need more flexibility, or you are planning to go beyond basic operations, you might find that something like NHibernate works better. I don't have experience with that, but colleagues who have worked with both tend to prefer NHibernate. (Brief article on the comparison - a bit old, so EF has addressed some, but not all, of the points there. It is also different by design, of course.)
It may be that for higher-traffic or unusual stuff you need to roll your own data access anyway, just to achieve the right performance or be able to find the right data. Unquestionably, if you are planning to try working with EF, I strongly recommend some serious prototyping just to ensure it can do what you need.

ColdFusion MVC framework with CRUD for many-to-many

I've spent the last couple of months on and off with CFWheels.
It has a lot going for it in terms of simplicity, but I always come up against an issue or two that I have a hard time troubleshooting. The biggest of them is getting a many-to-many relationship to work properly.
I'm switching between two environments, Railo/MySQL and CF/MSSQL, so it would be nice if it could work on both.
I'm trying to roll out a web application in a limited amount of time, as I've already spent too much time on CFWheels.
Can anyone recommend a framework that makes creating many-to-many relationships and the related CRUD easy, and has a big community?
Some of the ones I've seen mentioned frequently are Mach-II, FuseBox, Model-Glue, and ColdBox.
Most of those frameworks you list do not have a built-in ORM like Wheels does. This means you'll be using either straight SQL queries or the CF9 (Hibernate) ORM. I think it is only fair to mention that both of those options are also available in CFWheels.
I've written a fairly large app in CFWheels. In my app there were several instances of many-to-many relationships, and I was able to make it work without too much pain. That being said, I have felt your frustration with the CFWheels ORM. It can be clunky once you get to complex relationships. In those cases, I've had to make a judgement call as to whether it was worth it to try to build a query using the ORM, or just build a custom SQL query and store it in the CFC for my model. In fact, for 99% of my report queries for this app, I just resorted to writing the SQL in the model. But for CRUD operations, this wasn't really a limiting factor.
I'm curious what specific problems you're experiencing with Wheels - care to post an example?
Yes, the ORM in CFWheels can be buggy at times. If you encounter a bug, or even what you THINK might be a bug, we want to know - please take the time to file a bug report so we can investigate. That all being said, I'm very surprised that the CF community hasn't taken notice of Don Humphrey's ORM called CFRel. It's probably one of the biggest things to happen to CFML since Fusebox.
Oh... and there is even a CFWheels plugin for it.

Need some advice on starting a New Life with MVC 2 and which Tools to use for RAD in MVC2?

I have finally decided to hop on the MVC 2 train.
I have been doing a lot of reading lately, and the following is the architecture which I think will be good enough for most business web applications.
Layered Architecture:
Model (layer which communicates with the database) - EF4
Repository (layer which communicates with the Model and includes all the queries)
Business Layer (Validations, Helper Functions, Calls to repository)
Controllers (Controls the flow of the application and is responsible for providing data to the view from the Business Layer.)
Views (UI)
Now I have decided to create a separate project for each layer (just to respect the separation-of-concerns principle; although I know it's not necessary, I think it makes the project look more professional :-))
I am using the AutoMetaData T4 template for validation. I also came across FluentValidation but can't find much on it. Which one should I go with?
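From what I've seen so far, a FluentValidation validator is just a class per model with rules declared in the constructor, roughly like this (Product is a made-up model):

```csharp
using FluentValidation;

public class Product
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

// One validator class per model; rules are declared fluently in the constructor.
public class ProductValidator : AbstractValidator<Product>
{
    public ProductValidator()
    {
        RuleFor(p => p.Name)
            .NotEmpty()
            .Length(1, 100);

        RuleFor(p => p.Price)
            .GreaterThan(0m);
    }
}
```

You then call new ProductValidator().Validate(product) and check result.IsValid.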
Which View Engine to go for?
The Razor view engine was love at first sight, but it's still in beta and I think it won't be easy to find examples of it. Am I right?
Spark... I can't find much on it either, and I don't want to get stuck somewhere in the middle crying for help when there is no one to listen... :-(
T4 templates auto-generate views, and I can customize them to generate the views the way I want. Will this be possible with Razor and Spark, or do I have to create the views manually?
Is there any way to auto-generate the repositories?
I would really appreciate it if I could see a project based on the architecture above.
Kindly let me know if it's a good architecture to follow.
I also have some confusion about the business layer - is it really necessary?
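On the repositories: or would a generic repository along these lines remove the need to generate them at all? (A sketch against EF4's ObjectContext API; type names are illustrative.)

```csharp
using System.Data.Objects;   // EF4 ObjectContext API
using System.Linq;

// A single generic repository reused for every aggregate, instead of
// hand-writing (or generating) one class per entity.
public class Repository<T> where T : class
{
    private readonly ObjectContext context;
    private readonly ObjectSet<T> set;

    public Repository(ObjectContext context)
    {
        this.context = context;
        this.set = context.CreateObjectSet<T>();
    }

    public IQueryable<T> Query()  { return set; }
    public void Add(T entity)     { set.AddObject(entity); }
    public void Remove(T entity)  { set.DeleteObject(entity); }
    public void Save()            { context.SaveChanges(); }
}
```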
This is a very broad question. I decided to use Fluent NHibernate's automapping feature for a greenfield application, and was quite impressed. A lot of my colleagues use CakePHP, and it needed very little configuration to get it to generate a database schema compatible with the default conventions Cake uses, which is great for us.
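For reference, getting the automapping going is only a few lines, roughly like this (a sketch assuming Fluent NHibernate's automapping API and a made-up Product entity; in a real project you would also restrict which types get auto-mapped):

```csharp
using FluentNHibernate.Automapping;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;

public class Product
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

public static class SessionFactoryBuilder
{
    public static ISessionFactory Build(string connectionString)
    {
        // Automapping: entities are mapped by convention,
        // with no per-class mapping files needed.
        return Fluently.Configure()
            .Database(MsSqlConfiguration.MsSql2008.ConnectionString(connectionString))
            .Mappings(m => m.AutoMappings.Add(AutoMap.AssemblyOf<Product>()))
            .BuildSessionFactory();
    }
}
```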
I highly suggest the book ASP.NET MVC2 in Action. This book does a good job at covering the ecosystem of libraries that are used in making a maintainable ASP.NET MVC application.
As for the choice of view engines, that can depend on your background. I personally prefer my view to look as much like the HTML as possible, so I would choose Spark. On the other hand if you are used to working with ASP.NET classic, the WebForms view engine may get you up and running fastest.
Kindly let me know if it's a good architecture to follow?
It's a fine start - the only thing I would suggest you add is a layer of abstraction between your Business Logic and Data Access (i.e: Dependency Inversion / Injection) - see this: An Introduction to Dependency Inversion.
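Concretely, that abstraction is just an interface owned by the business layer, with the data-access implementation injected from outside; a rough sketch with made-up names:

```csharp
using System;
using System.Collections.Generic;

// The business layer depends only on this contract, not on EF directly.
public interface ICustomerRepository
{
    Customer GetById(int id);
    IEnumerable<Customer> FindByCountry(string country);
    void Add(Customer customer);
}

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Country { get; set; }
}

// Business logic receives the repository via constructor injection, so it can
// be unit tested with a fake and re-pointed at a different data store later.
public class CustomerService
{
    private readonly ICustomerRepository repository;

    public CustomerService(ICustomerRepository repository)
    {
        this.repository = repository;
    }

    public void Register(string name, string country)
    {
        if (string.IsNullOrEmpty(name))
            throw new ArgumentException("Name is required.", "name");

        repository.Add(new Customer { Name = name, Country = country });
    }
}
```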
I know it's not necessary but I think it makes the project look more professional :-)
Ha! Usually you'll find that a lot of "stuff" isn't necessary - right up until the moment when it is, at which point it's usually too late.
Re view engines: I'm still a newbie to ASP.NET MVC myself and so am not familiar with the view engines you're talking about; if I were you, I'd dream up some test scenarios and then try tackling them with each product so you can directly compare them. Often you need to take things for a test drive to be comfortable with them - this might take time, but it's usually worth it.
Edit:
If I suggest this layer to my PM and give him the above two reasons then I don't think he will accept it.
Firstly, PMs are not tech leads (usually); you have responsibility for the design of the solution - not the PM. This isn't uncommon; in my experience, most of the time the PM isn't even aware they are encroaching on turf that isn't theirs. It's not that I'm a "political land grabber", but I just tend to think of "separation of concerns" and, well, I'm sure you understand.
As the designer/architect, it's up to you to interpret requirements and (taking business priorities into account) come up with a solution that provides the best 'platform' going forward.
(Regarding DI) My question is: is it really worth it?
If you put a gun to my head I would say yes, however the real world is a little more complex.
If you answer yes to any of these questions then it's more likely that using DI would be a good idea:
The system is non-trivial
The expected life of the system is more than (not sure what the right figure is here, there probably isn't one, so I'm going to put a stake in the ground at) 2 years.
The system and/or its requirements are fluid.
Splitting up the work (BL / DAL) into different teams would be advantageous to the project (perhaps you're part of a distributed team).
The system is intended for a market with a diverse technical landscape (e.g: not everyone will want to use MS SQL).
You want to perform quality testing (this would make it easier).
The system is large / complex, so splitting up functionality and putting it into other systems is a possibility.
You want to offer more than one way to store data (say a file based repository for free, and a database driven repository for a fee).
Business drivers / environment are volatile - what if they came to you and said "this is excellent but now we want to offer a cloud-based version, can you put it on Azure?"
I'd also like to point out that whilst there's definitely a learning curve involved, it's not that huge, and once you're up to speed you'll still be at least as fast as you are now; at worst you'll take a little longer, but you'll be providing much more value (with relatively less effort).
In terms of how much effort is involved...
One-Off Tasks (beyond getting the team up to speed):
Writing a provider loader or picking a DI framework. Once you've done this it will be reusable in all your projects.
'New' Common Tasks (assuming you're following the approach taken in the article):
Defining the interface (on paper) - this is something you'll be doing right now anyway, except that you might not realise it. Basically it's OO design, but as it's going to be the formal interface between two or more packages, you need to give it some thought (and yes, you can still refactor it - but ideally the interface should be "stable" and not change a lot; if it does change, it's better to 'add' than to 'remove or change' existing members).
Writing the interface code. This is very fast (minutes, not hours), as you're not writing any implementation; and when you go to implement, you can use tools provided by your IDE to generate code stubs based on the interface.
Things you do now that you'd do differently:
Instantiating a variable (in your BL classes) to hold the provider, probably via a factory. Writing this shouldn't take long (again, minutes, not hours) and it's fairly simple code to copy, paste and refactor where required.
Writing the DAL code: should be the same as before.
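Pulling those pieces together, the shape is roughly a data-access contract plus a small provider loader (or DI container registration) that the business layer uses to obtain an implementation. A sketch with illustrative names:

```csharp
using System;
using System.Configuration;

// The formal interface between the BL and DAL packages.
public interface IOrderStore
{
    void Save(Order order);
    Order Load(int id);
}

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

// A trivial provider loader: read the concrete type name from config and
// instantiate it. A DI framework replaces this with a container registration.
public static class OrderStoreFactory
{
    public static IOrderStore Create()
    {
        var typeName = ConfigurationManager.AppSettings["orderStoreType"];
        var type = Type.GetType(typeName, true);   // true = throw if the type isn't found
        return (IOrderStore)Activator.CreateInstance(type);
    }
}
```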
Sometimes it is much easier to learn patterns from code: Sharp Architecture is a concrete implementation of good practices in MVC, using DDD.

Would you use v3.5 of the ADO.NET Entity Framework, or wait for v4.0?

I was surprised to find a public letter proposing a vote of no confidence in the entity framework (see http://efvote.wufoo.com/forms/ado-net-entity-framework-vote-of-no-confidence/)
Would the reasons stated in the letter keep you from using the current version of the entity framework? Would you rather wait for v4.0? Or rather use another ORM?
The current version of EF is definitely not perfect, and has lots of gotchas and drawbacks. I probably wouldn't use it right now - but the upgrade path to EF v2 (or is it EF4?) sure looks pretty rosy!
complete persistence ignorance - you can use your straight-up POCO classes (see the sketch just after this list)
deferred loading configurable as an option
much improved designer with support for pluralization/singularization (even in multiple languages!)
ability to do "domain first" design and create database from your model
ability to have self-tracking entities across multiple layers that allow you to send data to the client and get back changes and apply them to your entity context
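To make the persistence-ignorance point concrete: under EF4's POCO support an entity can be a plain class with no EF base type or attributes, along these lines (hypothetical names; the virtual navigation properties are only needed if you want lazy-loading proxies):

```csharp
using System;
using System.Collections.Generic;

// No EntityObject base class, no EF attributes - just a plain CLR class.
public class Order
{
    public int Id { get; set; }
    public DateTime PlacedOn { get; set; }
    public virtual ICollection<OrderLine> Lines { get; set; }
}

public class OrderLine
{
    public int Id { get; set; }
    public string Product { get; set; }
    public int Quantity { get; set; }
    public virtual Order Order { get; set; }
}
```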
All in all, EF v2 looks very promising and I'm very eager to give it a serious spin. If it really keeps all the promises out there right now, it's definitely a winner!
Check out the ADO.NET team blog for a flurry of recent blog posts on EF v2.
Marc
Another ORM.
Don't get me wrong - you'll probably get flamed with responses - but currently only NHibernate is functionally complete.
I'm a TDD fan, so I want an easily testable POCO ORM solution. If that's your bag then EF 3.5 is out. EF 4.0 is introducing it (http://blogs.msdn.com/adonet/archive/2009/05/21/poco-in-the-entity-framework-part-1-the-experience.aspx), but it still has at least one big drawback -> it doesn't support inheritance.
NHibernate is more complete, but EF could be easier to use. As ever, best tool for the job... but if it's an enterprise-scale, TDD-developed app, go NHibernate.
Also -> there's a profiler that makes nHibernate dev much easier -> http://www.nhprof.com/
I tried using it for my current project, which basically involves rewriting our current mess of a data layer.
It just doesn't work.
First, if you're trying to base an entity off of a view, the designer tries to force every NOT NULL property to be an entity key... which is pretty much never what I wanted. To work around that you have to edit the XML in at least two places, and do it every time you add an object, because it refreshes and re-adds the EntityKey properties (see: Must specify mapping for all key properties in Entity Framework?).
Second, when you are creating associations you MUST use every entity key (see: How can you make an association without using all entity keys in entity framework?).
Those two things held me up for three days; then I went back to LINQ to SQL and had it done in a couple of hours (well, at least the part of the system I was struggling with...). I don't know if those issues are in the Vote of No Confidence, but it's just not ready, in my opinion.
Also, given the lack of answers I got here on every EF question I've asked, I have to assume current usage is so low that getting help and support is going to be difficult... which is possibly the BIGGEST reason not to use something.
Let's hope the next version is better...
EDIT: Our current plan is to stick with LINQ to SQL (I have to finish a project by Friday) and then evaluate all the other ORMs to see if anything else is better. The other developer hates L2S, for the record, but I've never had any major problems using it...
EF has some rich design-time support, but I have to agree that NHibernate is the way to go, despite the learning curve. If you need to make something fast and don't care about TDD or serialization (which is a big weakness of all of MS's ORM offerings), go EF.
Well, my experience of version 1 was interesting. I wanted to use POCOs, but it didn't support them. After reading around, I came across some code from a bod at Microsoft that did this.
It was a bit messy to generate the code but on the whole this part of the process was not so bad.
A really nasty part that I came across was the lack of built-in concurrency checking for N-tier development. You have to manage this yourself, which, after looking at the problem, was not so bad, especially if you want to hand the versioning back to the client for user intervention.
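For what it's worth, the hand-rolled approach usually boils down to carrying a version value across the tiers and checking it before saving; a sketch with made-up types:

```csharp
using System;

// Hand-rolled optimistic concurrency for an N-tier flow: the version read by
// the client travels out with the DTO and must still match when the update
// comes back in. All types here are illustrative.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int Version { get; set; }
}

public interface ICustomerStore
{
    CustomerDto LoadCurrent(int id);
    void Save(CustomerDto customer);
}

public class ConcurrencyConflictException : Exception
{
    public ConcurrencyConflictException(string message) : base(message) { }
}

public class CustomerUpdater
{
    private readonly ICustomerStore store;

    public CustomerUpdater(ICustomerStore store) { this.store = store; }

    public void Update(CustomerDto incoming)
    {
        var current = store.LoadCurrent(incoming.Id);

        // Someone else saved in the meantime: hand the conflict back so the
        // user can decide whose changes win.
        if (current.Version != incoming.Version)
            throw new ConcurrencyConflictException("Record was changed by another user.");

        incoming.Version = current.Version + 1;
        store.Save(incoming);
    }
}
```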
The second nasty and absolutely stupid thing missing was the IN keyword for LINQ queries. It's not supported and so needs to be worked around. I found a solution, but it was a real mess, bringing in some other code that quickly patched up the omission.
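One common workaround at the time was to build the IN clause yourself as a chain of OR'd equality checks, since LINQ's Contains() wasn't translated by EF v1 (later versions do translate it to SQL IN). A sketch of that kind of helper:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

public static class QueryExtensions
{
    // Builds "selector == v1 || selector == v2 || ..." so that providers which
    // can't translate Contains() still produce an IN-style filter.
    public static Expression<Func<TElement, bool>> BuildInPredicate<TElement, TValue>(
        Expression<Func<TElement, TValue>> valueSelector,
        IEnumerable<TValue> values)
    {
        var parameter = valueSelector.Parameters.Single();

        var comparisons = values
            .Select(value => (Expression)Expression.Equal(
                valueSelector.Body,
                Expression.Constant(value, typeof(TValue))))
            .ToList();

        if (comparisons.Count == 0)
        {
            // No candidate values: match nothing.
            return x => false;
        }

        var body = comparisons.Aggregate((current, next) => Expression.OrElse(current, next));
        return Expression.Lambda<Func<TElement, bool>>(body, parameter);
    }
}
```

You would then write query.Where(QueryExtensions.BuildInPredicate<Order, int>(o => o.CustomerId, ids)) instead of the unsupported ids.Contains(o.CustomerId) (Order and CustomerId being hypothetical).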
Would I use EF 4.0 (2.0)? Yes, absolutely - why not? In fact, in stage 2 I will be using it. It looks like it supports POCOs, and it looks like my concurrency model will move straight across with no problems (basically delta-copy stuff). It's all good so far, and I hope this time round the big guys at Microsoft have seen the error of their ways and provided a solution that works.
If you're buying into entity development and the whole conceptual-model-first thing, then it's the only way to go for a complete Microsoft solution, although the work being done on the M language might eclipse the idea and move the whole modelling thing back to the database.
If you're not buying into the entity stuff, then I would strongly go with Enterprise Library. It's a proven technology that works every time, built on a solid code foundation and a database-centric paradigm. I would also go this route if you think that stored procedures are the bee's knees and like what they bring to the table.
If you're feeling really exotic and a bit frisky, I would go with a NoSQL approach such as CouchDB. This does, however, take some getting used to. It's damn weird and feels really, really wrong, but things get developed in super quick time and the solutions seem to be robust and faster than expected. I would not go for this type of solution, though, if you're big into normalization and think it can be applied to a NoSQL approach. The whole model needs to be turned on its head, and the application needs to be modelled in a way that is driven by the technology applied.
I find the CouchDB way a bit dirty and very, very wrong, but there are so many compelling reasons to use it that I think it will seep into the psyche of every programmer, and it will definitely go mainstream in the next couple of years.
My biggest gripe with the whole entity thing, though, even in the new version 4, is that there really has not been much thought given to N-tier environments. It still feels like a 2-tier solution, with a lot of boilerplate code still needing to be written by the end user (developer) to get it working in a robust and dependable N-tier way.

What level of complexity requires a framework?

At what level of complexity is it mandatory to switch to an existing framework for web development?
What measurement of complexity is practical for web development? Code length? Feature list? Database Size?
If you work on several different sites then by using a common framework across all of them you can spend time working on the code rather than trying to remember what is located where and why.
I'd always use a framework of some sort, even if it's your own, as the uniformity will help you structure your project - unless it's a one-page static HTML project.
There is no mandatory limit, however.
I don't think there is a level of complexity that necessitates a framework. For me, whenever I am writing a dynamic site I immediately consider a framework, and if it will save me time, I use it (it almost always does, and I almost always do).
Consider that the question may be faulty. Many of the most complex websites don't use any popular, pre-existing framework. Google has its own web server and its own custom way of doing things, as do Amazon and probably lots of other sites.
If a framework makes your task easier, or provides added value, go for it. However, when you adopt that framework you are tied to a new dependency. I'm starting to essentially recreate a Joel on Software post, so I will redirect you here for more on adding unneeded dependencies to your code:
http://www.joelonsoftware.com/articles/fog0000000007.html
All factors matter. You should measure how much time you can save using a third-party framework and compare it to the risks of using other people's code.
Never "mandatory." Some problems are not well solved by any framework. It would be suggestible to switch to a framework when most of the code you are implementing has already be implemented by the framework in question in a way that suits your particular application. This saves you time, energy, and will most likely be more stable than the fresh code you would have written.
This is really two questions, you realize. :-) The answer to the first one is that it's never mandatory, but honestly, parsing HTTP request parameters directly is pretty horrible right from the start. I don't want to do it even once, so I tend to go toward a framework relatively early on.
As far as what measurement is practical, well, what are you worried about? All of the descriptions that you list have value. Database size matters primarily for scaling, in my opinion (you can write a very simple app if you have a very simple schema, even if there are hundreds of thousands of rows in the database). The feature list will probably determine the number and complexity of UI pages, which will in turn help to dictate the code length.
There are frameworks for getting moving very quickly with a simple blog (Django or RoR), all the way up to enterprise full-stack applications (Zope). And not to be tied to just the buzz world, you also have ASP.NET, J2EE, etc.
All frameworks and libraries are tools at your disposal. Determine which ones will make your life easier for your given project and use them.
I would say the reverse is true. At some point, your project gets so expansive that you actually get slowed down by the shortcomings of the framework. For sufficiently large projects you may, in fact, be better off developing your own framework to meet your own needs. I have seen many cases where people were held back in the decisions they could make, or the work they could produce, because they were trying to do something the framework didn't anticipate, and doing things the framework doesn't anticipate can be very troublesome. The nice thing about making your own framework is that it can evolve with your project, to be a help to your system instead of a hindrance.
So, to conclude: small projects should use existing frameworks; large projects should contain their own framework.