Umbraco vs Craft CMS

I cannot find any good comparisons between Umbraco and Craft CMS. What are the strengths/weaknesses between the two?
I am a sole developer (owner) at a marketing agency and have projects/sites of varying sizes and types. My major considerations are how time-consuming it is to develop basic sites, how much maintenance time goes into things like updates, where the platform falls short / where I will run into problems on larger projects, how well it handles things like SEO and page speed, and of course the long-term costs.

It ultimately comes down to what platform you're most comfortable working with.
I'm a big fan of Umbraco, and I've used it for the better part of a decade on both small brochureware sites and huge builds. It's probably the best CMS available on the .NET platform, and it's easy to get something built quickly with minimal fuss.
I've also used Craft in the past, and I think it's a great CMS, albeit not as user-friendly as Umbraco; as long as you provide adequate user training, though, that shouldn't matter.
Umbraco's weakness isn't really a weakness, it's a preference: being a .NET CMS, it ties you to the .NET platform on Windows, and many developers would prefer not to work with Windows tooling. If building and hosting on Windows is a problem for you, choose Craft; otherwise I'd recommend Umbraco.

Related

Seeking a roadmap for becoming a web developer

I am planning to learn web development during these summer holidays so that I can do some freelancing once I learn it, but I am a bit confused as to where to start and on a few terms. I was hoping to receive some recommendations about what to learn and in what order.
I'm doing my bachelor's in Computer Science and have done a lot of C++, so I have a fair grasp of programming concepts. I've also taken a database course, so I know a decent amount about databases and SQL.
I also know a fair amount of HTML, CSS, JavaScript, jQuery, PHP (very basic PHP, just enough for interaction with MySQL).
An obvious path for me is to improve my PHP skills, but then there are frameworks and CMSes. I have read about them online, but I can't quite grasp what exactly a framework or a CMS is. Is there something other than a framework or a CMS? Which should I learn first? Should I just learn a framework/CMS right away?
Also, there seem to be a lot of freelance projects for WordPress, so when it comes to a CMS I would rather go with that.
A framework usually provides premade code that you can use to "speed up your workflow" in the long run, once you've learned how to use the framework. It's not necessary for everything; I've been developing without any particular framework for years, though I could probably benefit from adopting one eventually.
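To make "premade code" concrete, here is a rough sketch of a controller in CodeIgniter, a popular PHP framework; the class, table, and view names are made up for illustration:

    <?php
    // CodeIgniter 3-style controller (sketch). The framework's router maps a
    // URL like /blog/post/5 to this class and method automatically, so you
    // don't write your own request parsing.
    class Blog extends CI_Controller {

        public function post($id) {
            // The database library (premade code) builds a parameterized query.
            $this->load->database();
            $data['post'] = $this->db->get_where('posts', array('id' => $id))->row();

            // The view loader (more premade code) renders a template with the data.
            $this->load->view('post_view', $data);
        }
    }

In plain PHP you would hand-write the URL handling, query escaping, and templating yourself; that is the "long run" speed-up being described.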
A CMS (Content Management System) is the back-end or administration section of a website that the webmaster/company/client uses to put content onto the site. It's an interface for non-technical users, which makes changing pages, updates, products, etc. easy. A good free PHP-based CMS to look at is Joomla, or you could look at WordPress. Joomla generally takes no more than a couple of days to learn how to use.
Hope this helps.

Problems/questions regarding content management system implementation

I've been on this issue for probably a good two months now and really haven't found a stable solution, so I thought I'd just ask. I have an existing site at http://keyjaycompound.com that runs off a CMS I designed myself. While it was good at the time, I've now outgrown it, and looking at it now, it looks sloppy XD.
So at first I started redoing the CMS, until I thought (and read): with so many CMS solutions available, why spin my wheels? It seemed more logical to get a third-party solution that handles the mundane tasks like article CRUD and user management, so that I'd primarily worry about the add-ons.
So I searched and tried many solutions that I thought would suit my PHP development needs. As my test base, I wanted to see how well my current site would transfer over and how much hassle would ensue. While CMSes like Drupal, e107, and others were great... on paper, none seemed to suit my needs. They were either too bloated, lacking in documentation or community support, seemingly a large hassle for simple tasks, or just downright confusing >_<.
So the road has now led me to frameworks, and I'm currently trying to learn CodeIgniter. Now my issue becomes security! One of the advantages of CMSes like Drupal or Joomla is that they have been (and constantly are) field-tested for security holes, the kind a lone, modestly experienced developer like myself would probably never find. However, some have told me that the fact that the CMS is designed by me does create something of a layer of security, since it isn't as publicly known as Drupal or WordPress.
So with that here are my questions. In consideration of time and practicality:
How do pros actually approach something like this, i.e. selecting a content management system for their project?
Do they start with frameworks and build out, adjusting to security problems along the way?
Do they use a particular CMS solution so they don't have to worry as much about common security holes?
Should I start with a framework like CodeIgniter and grow with it as my security and user-management needs change?
Thanks guys. I'd really like to finally stick with a solution to learn so I can finally get back to developing.
This might be too old to answer, but I'm shocked nobody has bothered to answer the question! I'm in the same situation and saw this.
I started out with a CMS, but after a security attack that wiped a project site clean (and the CMS forum was completely clueless) I picked up CodeIgniter. Some projects later, having recreated my own CMS (twice), I settled on WordPress for small-to-medium projects (from personal websites to online news/magazine types). As you put it, I've outgrown my own CMS for these types of projects.
Answers (in the order you asked them):
1) It depends mostly on what you are doing. If it's something that can be delivered with off-the-shelf open-source software (with a little patience while learning it), you may be better off with that, as long as you keep it updated at all times. But if you're doing something very different from all of these, I'm afraid you're pretty much stuck with a custom solution, which you can accelerate with frameworks.
2, 3, 4) With frameworks (for starters), sticking to the security guidelines of the framework in question helps a lot, while proofing the usual suspects (form validation, session hijacking, injection, etc.). I ran my first CMS past a certified hacker and he said it was rock solid (given how paranoid I was about security while developing). Follow the framework's blog for security updates (they do happen).
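To make those usual suspects concrete, here is a minimal plain-PHP sketch of each; the connection details and table name are placeholders, and framework helpers (CodeIgniter's input and database classes, for instance) wrap the same ideas:

    <?php
    // SQL injection: use parameterized queries, never string concatenation.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');
    $stmt = $pdo->prepare('SELECT title, body FROM articles WHERE id = ?');
    $stmt->execute(array($_GET['id']));
    $article = $stmt->fetch(PDO::FETCH_ASSOC);

    // Output escaping (the flip side of form validation): encode user-supplied
    // data before echoing it into HTML, or you invite XSS.
    echo htmlspecialchars($article['title'], ENT_QUOTES, 'UTF-8');

    // Session hijacking: issue a fresh session ID whenever privileges change,
    // e.g. right after a successful login.
    session_start();
    session_regenerate_id(true);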
For CI though, a major item you have to consider thoroughly is user management. AFAIK CI didn't ship with one at the time, and picking one with security in mind made me realize how important it is.
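Whatever user-management library you pick, the heart of it is password storage. Here is a sketch on current PHP (password_hash() needs PHP 5.5+; $storedHash and $userId are assumed to come from your own users-table lookup):

    <?php
    // Registration: store only a salted hash, never the plain password.
    // PASSWORD_DEFAULT picks a strong algorithm and handles the salt for you.
    $hash = password_hash($_POST['password'], PASSWORD_DEFAULT);

    // Login: compare the submitted password against the stored hash.
    if (password_verify($_POST['password'], $storedHash)) {
        session_start();
        session_regenerate_id(true);    // fresh session ID after login
        $_SESSION['user_id'] = $userId;
    }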
What is looking like a good idea to me is finding a CMS that works within CodeIgniter and that I can extend with ease. I don't know yet whether that is the same as a standalone CMS built on CodeIgniter, but tackling security problems for me would amount to running tests and staying as alert as I can as I go.
Sorry for the long talk. Hope this helps.

Which CMS, if any, would be best suited for a database-driven website?

For educational purposes, I am delving into some web development. What I have in mind right now is a website where users can submit as well as view benchmark scores for CPUs, GPUs, etc. As is evident, this will be heavily driven by a database, which will store all the scores and related data.
I have programming experience with OOP languages (C++, C#), and am not too worried about picking up PHP. However, I feel intimidated by front-end design (HTML, CSS, etc.), and for that reason am shying away from developing the website from scratch.
I'm using MS WebMatrix, but I'm not sure which CMS will be best suited for me. So far I've reviewed DotNetNuke, Umbraco, Joomla, and Drupal, but haven't been able to settle on one yet.
Any suggestions as to which will be best suited for my kind of website?
Most widespread CMSes like WordPress and Drupal (and others) are extensible, meaning you can create your own content types by following the workflow imposed by each one's architecture. So the best suited for you will be the one that takes the least time to learn.
I recommend WordPress because I found its learning curve to be minimal if you can read its PHP source code; there is no need to read some book in its nth edition cover to cover.
This page is a good starting point for creating a post type for Benchmarks. But again, you could accomplish the same with another CMS, say Drupal. A sibling site of SO is devoted solely to WordPress.
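In case that link goes stale, the core of it is a single WordPress call, hooked on init in your theme's functions.php; here is a sketch (the labels and supports arrays are illustrative):

    <?php
    // Register a "benchmark" custom post type so submitted scores get their
    // own section in the WordPress admin, separate from ordinary posts.
    function register_benchmark_post_type() {
        register_post_type('benchmark', array(
            'labels'   => array(
                'name'          => 'Benchmarks',
                'singular_name' => 'Benchmark',
            ),
            'public'   => true,
            'supports' => array('title', 'editor', 'custom-fields'),
        ));
    }
    add_action('init', 'register_benchmark_post_type');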
hope that helps!

Drupal or from-scratch web app development?

I am looking to develop a multi-user web application that supports the following key features:
fill out forms with demographic data on individuals
define and administer surveys & polls
generate nice reports (with graphs)
user rights administration and generic login stuff
My dilemma is whether to use a CMS (Drupal?) or develop from scratch.
Putting aside the time and cost issues for a minute, which are an obvious CMS strength, what are the weaknesses and potential risks of using a CMS? My gut tells me that a CMS will be very easy and quick to start with, but when the feature list begins to grow, I will pay the bill by having to delve into an unfamiliar DB structure and code, and to tweak existing modules or write my own from scratch.
Is it really better, over the long run, to use a CMS?
There are two basic types of CMSes:
focused on features
focused on flexibility
The first type, focused on features, usually offers a lot of modules or extensions to expand the basic functionality. You can build your website very quickly using ready-to-use third-party modules. The disadvantage of this approach is that it isn't so easy to bend or customize those modules; usually you need to rewrite them.
Drupal, WordPress, and Joomla are good examples of the first type of CMS.
The second type, focused on flexibility, and sometimes called Content Management Frameworks, doesn't offer as many prefabricated modules, but provides many more tools and ways to make the structure and the relationships between elements fit your needs. It takes more time to learn this kind of CMS or to build your first website with it, but you can easily customize anything you need.
Some examples of the second type CMSes: SilverStripe, Symphony CMS, appRain, MODx, ezPublish.
appRain is one of the most customizable options, giving you both a CMS and the ability to do complex coding through its framework.
The development process is also easy. A new version, 4.0.4, is on its way to release.

Should developer tools, languages, frameworks, etc. be standardized across an organization? [closed]

The organization that I currently work for seems to be heading in the direction of dictating to software developers which tools, languages, frameworks, etc. must be used. However, nobody has convinced me that this is a good thing. The main argument I have heard is that it will make training easier. But, after developing software for over 10 years, I've never relied on training to learn how to use an IDE, programming language, or anything else; so I just can't relate.
With the rapid speed at which technology evolves, and the s-l-o-w-n-e-s-s at which I know the standards will adapt, I am concerned that my customers will have requirements that I won't be able to easily implement or won't be able to implement as efficiently as I should. For example, if there is a UI requirement for an auto-complete feature in a web app, and no API has been approved for this yet, I would need to implement auto-complete myself as opposed to using one of the many APIs that provide it out of the box.
A more radical example is if my customers wanted to have Google Wave features. In that case I would want the flexibility of configuring my development environment (including the IDE) and selecting appropriate frameworks (ex: GWT) to use.
Please provide feedback on whether or not you think that software developer tools, languages, etc should be standardized and a few points to support your argument.
There is a lot of benefit in standardization. My organization has fairly fixed standards on what technology we will use. We realize strong benefits in the following areas:
Hiring. It is easy to describe what technologies we are looking for and make sure our recruiters are looking for the right people.
License/Software costs. I can buy enterprise licenses easily. It gives me the opportunity to keep costs down by letting me spend more with a smaller number of vendors and thus get more leverage.
Consistency of delivery. Our teams have a very good idea of what projects will take to build, rollout and maintain because they have done it with success before (and they know the pitfalls too).
Agility. I can have one team take over for another or one individual take over for another more easily because of standardization.
Quality. We have peer reviews across teams as well as QA across teams.
Without a consistent use of a technology stack, tools, languages and frameworks, these types of benefits would be more difficult to realize. I am not closed off to new technologies, but there has to be a concrete reason beyond "what if I want to ..."
A major issue with standardization is that once standards are out there, they get stamped in concrete and are difficult to change. This is why our corporate IT environment is stuck on IE 6, and the best change control system we have access to is CVS. Given this situation, some developers break the rules, and some find jobs at more innovative companies.
You have a mixed bag here.
I wouldn't standardize on IDEs, because every developer works differently. Those who are insanely proficient in emacs may see their performance suffer if forced to use Visual Studio. I optimize my Visual Studio experience with a 30" monitor and find it incredibly productive.
However, standardizing on some tools, such as SCons or make or something to build products is perfectly reasonable.
Banning some libraries and having a process whereby new libraries are either approved or not is also very reasonable. I know lots of companies that ban Boost, or jQuery, or open-source libraries in general, etc., and they had good reasons for doing it. I know I got fairly upset when an intern incorporated some random "security" library he found on the internet without running it by anyone.
In the end every company is different. You have to be standardized enough to avoid serious complications and issues as people come and go, or as new products are formed and organizational structures change. But you have to be flexible enough to avoid re-inventing every wheel you need.
The important thing is to have clear reasons for adopting a certain tool or banning some other tool or library. You can't just have management dictate that thou shalt use this and not that without consulting the engineering team and making the decision for good reasons. And once decisions are made those reasons should be written down and clearly communicated.
And also, if, in the end, your favorite tool or library isn't adopted, please don't whine about it. Be adaptable and do your job, or find a new one that makes you happier.
I once worked for a manager who felt the need to innovate at every level of his software development operation. Every development tool had to be cutting edge (preferably in beta). Many of the tools he asked us to use didn't have good documentation, and training was not available. Ultimately, most of the technology we tried simply didn't work. We wasted a lot of time churning through new technologies, only to dump them when it became clear we couldn't make progress.
I tried to make the case that innovation belongs in the area where your value proposition lies. Innovation can also be used judiciously where standard techniques fail. But for most mundane tasks, using tried-and-true tools and methods should be the default. Less risk, less cost, less management attention needed. That way you can focus time and energy on the areas where innovation has the most benefit.
So I think standardization has an important role. But blindly saying everything must be standard is just as sure to fail as my manager who thought everything must be innovative.
The number one argument in favor of standardization is that it maximizes the ability of the organization as a whole to use a common body of knowledge. Don't know how custom web controls are built in ASP.NET/C#? Ask Bill down the hall who has the knowledge. If you use different tools, such organizational wisdom is cut off at the knees. While it is not good to be restricted to a least common denominator (and hopefully your management will realize this) you should not overlook the benefits of shared experience!
UPDATE: I do not agree that innovation and standardization are polar opposites. Indeed, would we have nearly the level of web innovation if we still had the mishmash of networking standards characteristic of the 1980s? No we would not. Of course, we might have more innovation on new low-level networking protocols but is that really worth it? In its place, we've had an explosion of creativity within the bounds of TCP/IP and the Web standards (http, html, etc.)
The trick is knowing how to standardize without using it as an argument for closing down all new exploration. For example, we use only ASP.NET/C#/SQL Server in my company but I'm perfectly open to the use of new tools within this framework (we recently adopted the DevExpress reporting package, for example, supplanting the earlier standard).
Standardization is a must for a productive development team. However, that doesn't mean you can't revisit the standards from time to time to adjust them to new technologies and trends.
Whether you develop operations software for internal clients, or products for external clients, there is no compelling reason not to standardize. You certainly did not give one.
Had you seen how companies struggle to hold together heterogeneous products that have been maintained for 10 years or more, and that are now a conglomerate of various technologies that developers at some point thought made sense, you would not have asked this question.
Off the top of my head, I could name at least 2 well-known software companies that will be driven out of business because their cost of maintenance has become so high that they can no longer compete (but I won't).
I think the misconception here is that suppressing individualism would suppress innovation. That is simply not true. It is poor technical leadership that suppresses innovation.
One unpleasant consequence of standardization is that it tends to stifle innovation.
Innovation is scary. It involves cost and risk.
Standardization is not scary. It reduces cost and risk in the short term. Until your competitors have created a game-changing innovation. Then standardization is very costly.
It depends on the organization I think. One like Microsoft, yes, there should be a bit of a standard. A small business with one IT department, no. A larger business with several offices around the world ... maybe.
it all depends :-P
Assuming the organization has a broad suite of enterprise applications to manage, I'd say no for the following reasons, though I may be taking the message of everything being the same a bit too literally:
Compromise on using best-of-breed systems, e.g. if all the databases are to be MS SQL Server, then any Oracle DB solution is thrown out. This also means that everyone using an IDE has to use the same one, whether they are doing data-warehouse report development, web applications, console applications, or WinForms. I'm thinking of systems like ERP, CRM, SCM, CMS, SSO and various other TLAs, FLAs, and SLAs. (LA = letter acronyms, as a decoding hint if you need it)
Upgrading by committee is another interesting issue. Where each team can choose its tools, one person can decide when to upgrade, e.g. to start using Visual Studio 2008 instead of Visual Studio 2005. With a standard, you have to determine at what threshold it is worth upgrading everyone simultaneously, which can be a big headache if there are more than a few developers. For example, over the past 10 years, when would the IDE changes, framework changes, etc. have happened?
Exceptions to the standards. Could a contractor bring in something not used in the organization if they believe it helps them build better software, e.g. ReSharper or other add-ons that some contractors consider very worthwhile but that the organization doesn't want to spend the money on? What about legacy systems that may make the standard a bit unwieldy, e.g. something built in ASP.NET 1.1 that means everyone has to have VS 2003 installed even if most will never use it?
Just my thoughts on this.
There are several good reasons to standardize.
First, it gives the enterprise better organizational flexibility if everybody is more or less familiar with the same things, and it allows people to help each other better. I can't help with problems in the ASP.NET stuff, and there aren't all that many people who can help me on the C++ side.
Second, it reduces support problems and expenses. Oracle and SQL Server are both decent products, but using both for similar functions is only going to cause problems. Not to mention that I've been in shops using several widely different platforms to do similar things, and it wasn't fun.
Third, there are some things that simply have to be standardized. We couldn't operate with half of us on VS 2005 and half on VS 2008, since we keep project files under source control. We had to pick a time and convert over.
Fourth, in some businesses, it simplifies the regulatory problems. I don't know what business you're in. I work at a place where we can get away with making mistakes right now, but I've also contracted at a bank and a utility, where it's necessary to be able to show auditors that everything is going in a standard way.
Fifth, it can simplify procurement, if you're dealing with software that costs money.
This doesn't particularly limit us, since if there's something we need that isn't standardized on we just go ahead and get it or do it.
If you want to make a business case against standardization, you'll need to have a business-related argument. Your argument seems to be that you won't be able to implement features the user wants, and that is a consideration. Got another argument?
There's nothing wrong with standardizing on an IDE that is rich enough to be configured for individual developers.
However, do make sure that you don't prevent individual developers from using additional tools, as long as the tools are licensed and that the use of the tool by one developer doesn't require all other developers to use it.
For instance, I happen to use NORMA to help me design databases. The output is SQL Server DDL (or anything else I want). I can make the DDL part of the project without making my NORMA source part of it. Later developers do not need to use NORMA to work on the project.
On the other hand, if I decided to use the Configuration Section Designer to create configuration sections, then future developers would also have to use it. A decision would need to be made about whether to use that tool.
The company I work for uses C#, ASP.NET, JavaScript and generates HTML. The advantages over and above those mentioned above are that there is a perception of improved velocity for maintenance and adaptive changes. The disadvantages include generating some boredom for people who are technically savvy (geeky) and prefer to use a mix and match of languages, depending on what they fancy is better suited, or for 'performance reasons'.
Technical and personal supervision is always good to have when you are developing as fast as you can to meet tight deadlines and competing in a highly saturated market for web development.