I am looking to develop a multi-user web application that supports the following key features:
fill out forms with demographic data on individuals
define and administer surveys & polls
generate nice reports with graphs
user rights administration and generic login stuff
My dilemma is whether to use a CMS (Drupal?) or develop from scratch.
Putting the time and cost issues aside for a minute (they are an obvious CMS strength), what are the weaknesses and potential risks of using a CMS? My gut tells me that a CMS will be very easy and quick to start with, but when the feature list begins to grow, I will pay the bill by having to delve into an unfamiliar DB structure and codebase, and either tweak existing modules or write my own from scratch.
Is it really better, over the long run, to use a CMS?
There are two basic types of CMSes:
focused on features
focused on flexibility
The first type - focused on features - usually offer a lot of modules or extensions to expand the basic functionality. You can build your web site very quickly using ready-to-use third-party modules. The disadvantage of this approach is that it isn't easy to bend or customize these modules; usually you need to rewrite them.
Drupal, WordPress, and Joomla are good examples of the first type of CMS.
The second type - focused on flexibility, sometimes called Content Management Frameworks - don't offer as many prefabricated modules, but offer many more tools and ways to make your structure and the relationships between elements fit your needs. It takes more time to learn this kind of CMS or to build your first web site, but you can easily customize anything you need.
Some examples of the second type of CMS: SilverStripe, Symphony CMS, appRain, MODx, ezPublish.
appRain is one of the most customizable options: you get both a ready-made CMS and the ability to do complex custom coding through its framework.
The development process is also easy. A new version, 4.0.4, is on the way to being released.
I'm planning on starting a new project, and am evaluating various web frameworks. There is one that I'm seriously considering, but I worry about its lasting power.
When choosing a web framework, what should I look for when deciding what to go with?
Here's what I have noticed with the framework I'm looking at:
Small community. There are only a few messages on the users list each day
No news on the "news" page since the previous release, over 6 months ago
No SVN commits in the last 30 days
Good documentation, but the wiki hasn't been updated since the previous release
Most recent release still not in a maven repository
It is not the officially sanctioned Java EE framework, but I've seen several people mention it as a good solution in answers to various questions on Stack Overflow.
I'm not going to say which framework I'm looking at, because I don't want this to get into a framework war. I want to know what other aspects of the project I should look at in my evaluation of risk. This should apply to other areas besides just Java EE web, like ORM, etc.
I'll say that so-called "dead" projects are not that great a danger as long as the project itself is solid and you like it. If the library or framework already does everything you think you'll want, then it's not such a big deal. Once you get a stable project up and running, you should be done thinking about the framework (done!) and able to focus only on your webapp. You shouldn't be required to update the framework itself with the latest release every month.
Personally, I think the most important point is that you find one that is intuitive for your project. What makes the most sense? MVC? Should each element in the URL be a separate object? How would interactivity (AJAX) work? It makes no sense to pick something just because it's an "industry standard" or because it's used by a lot of big-name sites. Maybe they chose it for needs entirely different from yours. Read the tutorials for each framework and be critical. If it doesn't gel with your way of thinking, or you have seen it done more elegantly, then move on. What you are considering here is the design, and good design is paramount for staying flexible and scalable. There are hundreds of web frameworks out there, old and new, in every language. You're bound to find half a dozen that work just the way you want to think in your project.
Points I consider mandatory:
Extensible through plug-ins: check if there's already plug-ins for various middleware tasks such as memcache, gzip, OpenID, AJAX goodness, etc.
Simplicity and modularity: the more complex, the steeper the learning curve and the less you can trust its stability; the more "locked" to specific technologies, the higher the chances that you'll end up with a chain around your ankle.
Database agnostic: can you use sqlite3 for development and then switch to your production DB by changing a single line of code or configuration? (See the sketch after this list.)
Platform agnostic: can you run it on Apache, lighttpd, etc.? Could you port it to run in a cloud?
Template agnostic: can you switch out the template system? Let's say you hire dedicated designers and they really want to go with something else.
Documentation: I am not that strict if it's open-source, but there would need to be enough official documentation to enable me to fully understand how to write my own plug-ins, for example. Also look to see if there's source code of working sites using the same framework.
License and source code: do you have access to the source code and are you allowed to modify it? Consider whether you can use it commercially! (Even if you have no plans to do so currently.)
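To make the "database agnostic" point above concrete, here is a minimal sketch of what that one-line switch can look like, using a Django-style settings file purely as an illustration (the specific keys and values are assumptions for the example, not a recommendation of Django itself):

```python
# settings.py -- hypothetical Django-style database configuration.
# Development: a local SQLite file, zero setup required.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": "dev.sqlite3",
    }
}

# Production: only this mapping changes; no application code is touched.
# DATABASES = {
#     "default": {
#         "ENGINE": "django.db.backends.postgresql",
#         "NAME": "myapp",
#         "USER": "myapp",
#         "HOST": "db.example.com",
#     }
# }
```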
All in all: flexibility. If I am satisfied with all of these points, I'm pretty much done. Notice how I didn't have anything about "deadness" in there? If the core design is good and there are easily installable plug-ins for doing every web-dev 3.0-beta buzzword thing you want to do, then I don't care if the last SVN commit was in 2006.
Here are the things I look for in a framework before I decide to use it for a production environment project:
Plenty of well laid out and written documentation. Bad documentation just means I'm wasting time trying to find how everything works. This is OK if I am playing around with some cool new micro framework or something else, but not when it's for a client.
A decently sized community so that you can ask questions, etc. A fun and active IRC channel is a big plus.
Constant iteration of the product. Are bugs being closed or opened on a daily/weekly basis? Probably a good sign.
I can go through the code of the framework and understand what's going on. Good framework code means the project's long-term life has a better chance of success.
I enjoy working with it. If I play with it for a few hours and it's the worst time of my life, I sure as hell won't be using it for a client.
I can go on, but those are some primary ones off the top of my head.
Besides looking at the framework, you also need to consider a lot of things about yourself (and any other team members) when evaluating the risks:
If the framework is a new, immature, "bleeding-edge" framework, are you going to be willing and able to debug it and fix or work around whatever problems you encounter?
If there is a small community, you'll have to do a lot of this debugging and diagnosis yourself. Will you have time to do that and still meet whatever deadlines you may have?
Have you looked at the framework yourself to determine how good it is, or are you willing to rely on what others say about it? Why do you trust their judgment?
Why do you want to use this rather than the "officially sanctioned Java EE framework"? Is it a pragmatic reason, or just a desire to try something new?
If problems with the framework cause you to miss deadlines or deliver a poor product, how will you talk about it with your boss or customer?
All the signs you've cited could be bad news for your framework choice.
Another thing that I look for is books available on Amazon and the like. If good documentation is available in print, it means that authors believe the framework has traction, and you'll be able to find users who know it.
The only saving grace I can think of is relative maturity. If the framework or open source component is mature, there's a chance that it does the job as written and doesn't require further extension.
There should still be a bug tracker with some evidence of activity, because no software is without bugs (except for mine). But it need not be a gusher of requests in that case.
The organization that I currently work for seems to be heading in the direction of dictating to software developers which tools, languages, frameworks, etc. must be used. However, nobody has convinced me that this is a good thing. The main argument I have heard is that it will make training easier. But, after developing software for over 10 years, I've never relied on training to learn how to use an IDE, programming language, or anything else; so I just can't relate.
With the rapid speed at which technology evolves, and the s-l-o-w-n-e-s-s at which I know the standards will adapt, I am concerned that my customers will have requirements that I won't be able to easily implement or won't be able to implement as efficiently as I should. For example, if there is a UI requirement for an auto-complete feature in a web app, and no API has been approved for this yet, I would need to implement auto-complete myself as opposed to using one of the many APIs that provide it out of the box.
A more radical example is if my customers wanted to have Google Wave features. In that case I would want the flexibility of configuring my development environment (including the IDE) and selecting appropriate frameworks (ex: GWT) to use.
Please provide feedback on whether or not you think that software developer tools, languages, etc should be standardized and a few points to support your argument.
There is a lot of benefit for standardization. My organization has fairly set standards on what technology we will use. We realize strong benefits in the following areas ...
Hiring. It is easy to describe what technologies we are looking for and make sure our recruiters are looking for the right people.
License/Software costs. I can buy enterprise licenses easily. It gives me the opportunity to keep costs down by letting me spend more with a smaller number of vendors and thus get more leverage.
Consistency of delivery. Our teams have a very good idea of what projects will take to build, rollout and maintain because they have done it with success before (and they know the pitfalls too).
Agility. I can have one team take over for another or one individual take over for another more easily because of standardization.
Quality. We have peer reviews across teams as well as QA across teams.
Without a consistent use of a technology stack, tools, languages and frameworks, these types of benefits would be more difficult to realize. I am not closed off to new technologies, but there has to be a concrete reason beyond "what if I want to ..."
A major issue with standardization is that once standards are out there, they get stamped in concrete and are difficult to change. This is why our corporate IT environment is stuck on IE 6, and the best change control system we have access to is CVS. Given this situation, some developers break the rules, and some find jobs at more innovative companies.
You have a mixed bag here.
I wouldn't standardize on IDEs, because every developer works differently. Those who are insanely proficient in emacs may see their performance suffer if forced to use Visual Studio. I optimize my Visual Studio experience with a 30" monitor and find it incredibly productive.
However, standardizing on some tools, such as SCons or make or something to build products is perfectly reasonable.
Banning some libraries, and having a process where new libraries are either approved or not, is also very reasonable. I know lots of companies that ban Boost, or jQuery, or open-source libraries in general, and they had good reasons for doing it. I know I got fairly upset when an intern incorporated some random "security" library he found on the internet without running it by anyone.
In the end every company is different. You have to be standardized enough to avoid serious complications and issues as people come and go, or as new products are formed and organizational structures change. But you have to be flexible enough to avoid re-inventing every wheel you need.
The important thing is to have clear reasons for adopting a certain tool or banning some other tool or library. You can't just have management dictate that thou shalt use this and not that without consulting the engineering team and making the decision for good reasons. And once decisions are made those reasons should be written down and clearly communicated.
And also, if, in the end, your favorite tool or library isn't adopted, please don't whine about it. Be adaptable and do your job, or find a new one that makes you happier.
I once worked for a manager who felt the need to innovate at every level of his software development operation. Every development tool had to be cutting edge (preferably in beta). Many of the tools he asked us to use didn't have good documentation, and training was not available. Ultimately, most of the technology we tried simply didn't work. We wasted a lot of time churning through new technologies, only to dump them when it became clear we couldn't make progress.
I tried to make the case that innovation is perfect in the area where your value proposition lies. Innovation can also be used judiciously where standard techniques fail. But for most mundane tasks, using tried-and-true tools and methods should be the default. Less risk, less cost, less management attention needed. So you can focus time and energy on the areas where innovation has the most benefit.
So I think standardization has an important role. But blindly saying everything must be standard is just as sure to fail as my manager who thought everything must be innovative.
The number one argument in favor of standardization is that it maximizes the ability of the organization as a whole to use a common body of knowledge. Don't know how custom web controls are built in ASP.NET/C#? Ask Bill down the hall who has the knowledge. If you use different tools, such organizational wisdom is cut off at the knees. While it is not good to be restricted to a least common denominator (and hopefully your management will realize this) you should not overlook the benefits of shared experience!
UPDATE: I do not agree that innovation and standardization are polar opposites. Indeed, would we have nearly the level of web innovation if we still had the mishmash of networking standards characteristic of the 1980s? No we would not. Of course, we might have more innovation on new low-level networking protocols but is that really worth it? In its place, we've had an explosion of creativity within the bounds of TCP/IP and the Web standards (http, html, etc.)
The trick is knowing how to standardize without using it as an argument for closing down all new exploration. For example, we use only ASP.NET/C#/SQL Server in my company but I'm perfectly open to the use of new tools within this framework (we recently adopted the DevExpress reporting package, for example, supplanting the earlier standard).
Standardization is a must for a productive development team. However, that doesn't mean that you can't revisit the standards from time to time to adjust them to new technologies and trends.
Whether you develop operations software for internal clients, or products for external clients, there is no compelling reason not to standardize. You certainly did not give one.
Had you seen how companies struggle to hold together heterogeneous products that have been maintained for 10 years or more, and are now a conglomerate of various technologies that developers at some point thought made sense, you would not have asked this question.
Off the top of my head, I could name at least two well-known software companies that will be driven out of business because their cost of maintenance has become so high that they can no longer compete (but I won't).
I think the misconception here is that suppressing individualism would suppress innovation. That is simply not true. It is poor technical leadership that suppresses innovation.
One unpleasant consequence of standardization is that it tends to stifle innovation.
Innovation is scary. It involves cost and risk.
Standardization is not scary. It reduces cost and risk in the short term. Until your competitors have created a game-changing innovation. Then standardization is very costly.
It depends on the organization I think. One like Microsoft, yes, there should be a bit of a standard. A small business with one IT department, no. A larger business with several offices around the world ... maybe.
it all depends :-P
Assuming the organization has a broad suite of enterprise applications to manage, I'd say no for the following reasons, though I may be taking the message of everything being the same a bit too literally:
Compromise on using best-of-breed systems: e.g. if all databases are to be MS SQL Server, then any Oracle DB solution is thrown out. The same applies to IDEs: everyone has to use the same one whether they are doing data warehouse report development, web applications, console applications, or WinForms. I'm thinking of systems like ERP, CRM, SCM, CMS, SSO, and various other TLAs, FLAs, and SLAs. (LA = letter acronyms, as a decoding hint if you need it.)
Upgrading by committee is another interesting issue. Where each team can choose its own tools, one person can decide when to upgrade, e.g. to start using Visual Studio 2008 instead of Visual Studio 2005. Under a standard, you have to determine at what threshold it is worth upgrading everyone simultaneously, which may be a big headache if there are more than a few developers. For example, over the past 10 years, when would the IDE changes, framework changes, etc. have happened?
Exceptions to the standards. Could a contractor bring in something not used in the organization if they believe it helps them build better software, e.g. ReSharper or other add-ons that some contractors consider very worthwhile but that the organization doesn't want to spend the money on? What about legacy systems that may make the standard unwieldy, e.g. this was built in ASP.NET 1.1, so everyone has to have VS 2003 installed even if most will never use it?
Just my thoughts on this.
There are several good reasons to standardize.
First, it allows the enterprise better organizational flexibility, if everybody is more or less familiar with the same things. It also allows people to help each other better. I can't help with problems in the ASP.NET stuff, and there aren't all that many people who can help me on the C++ side.
Second, it reduces support problems and expenses. Oracle and SQL Server are both decent products, but using both for similar functions is only going to cause problems. Not to mention that I've been in shops using several widely different platforms to do similar things, and it wasn't fun.
Third, there are some things that just have to be standardized. We couldn't operate with half the team on VS 2005 and half on VS 2008, since we keep project files under source control. We had to pick a time and convert over.
Fourth, in some businesses, it simplifies the regulatory problems. I don't know what business you're in. I work at a place where we can get away with making mistakes right now, but I've also contracted at a bank and a utility, where it's necessary to be able to show auditors that everything is going in a standard way.
Fifth, it can simplify procurement, if you're dealing with software that costs money.
This doesn't particularly limit us, since if there's something we need that isn't standardized on we just go ahead and get it or do it.
If you want to make a business case against standardization, you'll need to have a business-related argument. Your argument seems to be that you won't be able to implement features the user wants, and that is a consideration. Got another argument?
There's nothing wrong with standardizing on an IDE that is rich enough to be configured for individual developers.
However, do make sure that you don't prevent individual developers from using additional tools, as long as the tools are licensed and that the use of the tool by one developer doesn't require all other developers to use it.
For instance, I happen to use NORMA to help me design databases. The output is SQL Server DDL (or anything else I want). I can make the DDL part of the project without making my NORMA source part of it. Later developers do not need to use NORMA to work on the project.
On the other hand, if I decided to use the Configuration Section Designer to create configuration sections, then future developers would also have to use it. A decision would need to be made about whether to use that tool.
The company I work for uses C#, ASP.NET, and JavaScript, and generates HTML. The advantages, over and above those mentioned above, are a perception of improved velocity for maintenance and adaptive changes. The disadvantages include generating some boredom for people who are technically savvy (geeky) and prefer to mix and match languages, depending on what they fancy is better suited, or for 'performance reasons'.
Technical and personal supervision is always good to have when you are developing as fast as you can to meet tight deadlines and competing in a highly saturated market for web development.
If you were going to start building web sites as a consulting business on the side -- keeping your day job -- and you also had a toddler and a wife, what frameworks/tools would you pick to save you typing?
Any language.
I'm looking for a productivity superstar stack that won't tie my hands too much when I have to update the site 6 months later, or "evolve" the data model once in production.
It needs to allow me to say "yes" to the client: community features, CMS, security, moderation, AJAX, ...
I would suggest Django. It's super simple to get something up and running really quickly, and you are using Python, which has a large library ecosystem to go with it. For me, Ruby on Rails would be a close second.
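As a rough illustration of that speed (the model and field names here are hypothetical, not from the question): define a model and register it with Django's built-in admin, and you get list, add, and edit screens without writing any UI code.

```python
# models.py -- a hypothetical survey model.
from django.db import models

class Survey(models.Model):
    title = models.CharField(max_length=200)
    created = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return self.title

# admin.py -- one registration line yields full list/add/edit screens:
# from django.contrib import admin
# admin.site.register(Survey)
```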
I'd probably look at DotNetNuke. It's easy to set up (a lot of hosts will do it for you) and easy to use to put together a custom site that businesses will be able to maintain in the future.
It's fairly easy to create custom modules that are specific to a business, and there are hundreds of modules for sale (or free) that can be integrated into DNN for special uses.
Take a look at Microsoft's SharePoint server if you'd like a pre-made framework with many options for plugging in your own code. SharePoint is kind of a world unto itself, but it is a very powerful environment.
Update: I'm surprised to have been voted down on this one. Keep in mind that the questioner specifically requested frameworks that include a CMS. SharePoint meets this criterion - unlike straight .NET or other web development frameworks.
If you are going to vote the entry down, I think you owe it to the person who asked the question to explain why you think he shouldn't even explore it as an option. You could be right - collective wisdom is what voting on SO is all about. But without an explanation, we don't know why you think you are right.
My answers are going to revolve around the .NET stack.
Use Master pages and CSS templates. This makes it so much easier to pop in a new look and feel for your customer.
For sure I'd include the Dynamic Data framework in the .NET world.
Hosting might become an issue for your customer: questions around managing email addresses, procedures for quickly updating the website to include a new contact phone number (different for each customer, I'd assume), and so on. Consider getting a reseller account on your favorite web host, and dole out hosting accounts as appropriate. There are lots of issues around this point. It may turn out to be a nice source of recurring revenue.
Build yourself a few patterns, including a database wrapper that handles all your data calls (i.e. a DLL that wraps all your data access, sets up your ADO.NET objects, runs your sproc calls, and picks up the connection string from app.config or something similar; a rough sketch follows below).
This goes a long way to maintainability as well.
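The advice above is phrased in .NET/ADO.NET terms; as a language-neutral sketch of the same "single data-access wrapper, connection string in config" pattern, here is what it might look like in Python (the module, file, and key names are made up for illustration):

```python
# db_wrapper.py -- hypothetical centralized data-access layer.
# All data calls go through this module, so connection details live
# in exactly one place (app.ini plays the role of app.config).
import configparser
import sqlite3  # stand-in driver; swap in your production DB module


def _connection_string():
    config = configparser.ConfigParser()
    config.read("app.ini")
    return config["database"]["connstring"]


def run_query(sql, params=()):
    """Open a connection, run one parameterized query, return all rows."""
    conn = sqlite3.connect(_connection_string())
    try:
        return conn.execute(sql, params).fetchall()
    finally:
        conn.close()

# Callers never touch connection setup:
#   rows = run_query("SELECT name FROM customers WHERE region = ?", ("EU",))
```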
I would recommend going with anything MVC in a language you can understand! There are a couple of CMSs in Python, PHP, and Ruby using that design, and, well... it leaves you combat-ready for AJAX and able to expand anything very fast.
This is definitely not a question that can be answered.
I prefer ASP.NET WebForms because I think it allows for extremely rapid web app development, but I am sure you will receive recommendations for:
ASP.NET MVC
Ruby on Rails
PHP and some framework
Python and some framework such as Django
I believe PHP has the most pre-built apps that you can use, though ASP.NET also has the things you are looking for.
All of these platforms and frameworks can do what you want.
Choose between Rails and Django. They both have different strengths. I like Rails better in general, but Django's admin interface can save you a lot of time when you need it.
There's another factor to take into consideration here: what are you the most familiar with? I believe that some studies have found upwards of a 30% loss of productivity when trying to learn a new language/framework.
Sometimes, there's nothing wrong with just sticking to what you know. But if you're interested in what languages/frameworks to learn, I'll refer you to the other posts because the above was the only thing I really have to add.
I recommend looking into Grails. It uses Groovy, which is similar to Java (so if you know Java already, you're good to go). Groovy runs on the JVM, so you can still use all the great libraries already available for Java. Yet, since it's a dynamic language with a lot of the same bells and whistles as Ruby, you can use closures and that kind of neat stuff when you need or want to. And you're not slowed down by Java's traditional slow compile-deploy-test development cycle.
Grails comes already set up with Hibernate and Spring. You can create a CRUD application in practically no time (much like a Rails application), and at the same time drill down and control every little detail, since it's built on such proven and well-supported technologies. In addition, there are literally hundreds of plugins available that help you easily set up things like mailing lists, security, AJAX components and so on.
Otherwise, if you want to set up a community site and don't want to code a single line you could always check out ning.com.
We have a user experience designer in our team who has no programming background. He is expected to design screens within Eclipse as a development environment. His (valid) complaint is that every time he designs a specific screen and gives it to development - they tell him what is not possible technically using either SWT or GEF. So, he wants me to teach him basics of SWT/GEF so that he can make informed decisions and maybe even try out certain things in eclipse (as opposed to using Photoshop) before proposing designs to save time.
My personal belief is that design should not be constrained by technical possibilities and in theory, everything that the designer dreams of (at least the practical things) should be possible technically - albeit with workarounds or a little hacking.
So, my question is this - how important do you think is programming knowledge for user interface design? And if it is, how do you go about teaching someone with absolutely no programming experience the graphical frameworks on various platforms?
In principle, I agree with you. Programming knowledge shouldn't be necessary to be a skilled designer of UIs and work flows. However, knowing the abilities and limitations of the technology in use can help a UI designer work more effectively with the programming staff.
Where programming knowledge can help is if the development staff is blowing smoke that something cannot be done when it can be, some knowledge of the tools being used can help refute that. If the development staff is correct that something cannot be done, then knowledge of the tools can help the UI designer find an appropriate solution that meets the design goals and is achievable.
With a properly cooperative development staff, the UI designer would need very little (if any) knowledge about the specific GUI tools being used.
I've been on the developer side of this where I was being asked to do something impossible or impractical. I always worked with the designers to find a happy middle that met the design goals. Sometimes what I thought was impossible was in fact possible. Sometimes we had to do things a different way. A few things had to be put off as "possible, but too much effort." (Such as an SWT based application that became a Windows task bar. Definitely possible, but impractical for the project in question as it would require native code.)
What is most important is that both sides realize that they are on the same team.
It's very important.
Not knowing about:
technology in general
the technology you have chosen to dedicate your time to in producing the end product
will result in a complete waste of time for everyone.
Even the end user needs to learn a bit about the technology employed in order to use whatever product we make.
Someone who drives a car will always need to know how to fill up the gas tank and the basics of what a car is and what it can do; software works in a similar fashion.
It's like asking someone who doesn't know that today's cars need wheels to draw your next release model.
The way to make them more aware of the technology is:
Show him/her similar products to the ones you should be making
Show him/her stand alone implementations of the building blocks you might consider using
But by all means, this doesn't mean you should stifle their creativity. Have them draw what they dream; just make them aware enough of reality to get something done in this lifetime.
So, my question is this - how important do you think is programming knowledge for user interface design?
I think a basic knowledge of the standard user interface for the platform is required (text fields, combo boxes, radio buttons, etc). A good designer should be familiar with the capabilities and limitations of these GUI components, from a developer's point of view. So I guess some basic programming knowledge would be useful.
My personal belief is that design should not be constrained by technical possibilities and in theory, everything that the designer dreams of (at least the practical things) should be possible technically - albeit with workarounds or a little hacking.
I think there are important qualifications here --- each OS has guidelines on what constitutes good GUI design, and it's beneficial for your product that you follow them because the user has a certain mental model of how he or she should interact with applications on that platform. (Having said that, there may be good reasons for breaking some design conventions, e.g. in games, specialized graphics/music applications.)
how do you go about teaching someone with absolutely no programming experience the graphical frameworks on various platforms?
Each toolkit makes available a whole bunch of small sample programs to demonstrate the use of different components --- this is probably a good first step to acquaint oneself with them.
It's not as important as common sense, in my opinion.
It helps, of course. But if the designer is asking for something that could be done (because some other application uses it), the development lead should at least present a workaround.
Programming knowledge probably not, but limitations on the chosen platform certainly.
I think it's better to learn up front, but if your UI designer is forced to learn on the fly, make sure that each time he is turned away, it's explained why something can't be done rather than just a flat refusal. This will keep him from getting as frustrated as he might otherwise be because he'll be able to form at least some logical framework for what he can and can't do.
I think the designer should be aware of the features and limitations that the tools he's using offer, and he should be aware of the limitations and the deadline of the current project that those guys are making.
He should also be aware of the background processing that's going on to show the screen UI, and all these things will come only if he has some rudimentary knowledge of programming.
He doesn't have to dabble in the depth of OOP, learn SQL, know the intricacies of reflection or anything fancy like that. He just has to know his platform well, and that I think is a requirement even for the designers.
The very core of "design" is to find a way to achieve a desired result within the constraints. If you don't know anything about the part of the constraints that affects your goal, then you can't design.
It all depends on your tools.
Edit: What I mean is that there are tools designed for designers, and tools for programmers. Eclipse, for one, is not a designer tool. Photoshop is. Flash, maybe; Flex, no. I wouldn't require a Flash designer to program, but a Flex designer does need to program.
As for telling them about the limits of your tools: it depends. Really good creative designers will embrace those limits and make incredible work; mediocre designers will perceive the limits as roadblocks, stop being creative, and just follow the rules out of fear.
I have given it some thought, and based on the answers given previously, I have reached certain conclusions:
A preliminary knowledge of what is possible and what the constraints are when designing on a given platform is mandatory. This means that the graphics designer should be aware of the following:
The basic design guidelines on that platform
The standard UI toolkit / widgets provided on that platform (for ex: textboxes, drop-down lists, etc)
What is not possible (or is too cumbersome) on a given platform (for ex: creating translucent modal dialogs while fading out the background in Eclipse)
This amount of knowledge might not require the designer to dabble in programming.
The second level is where the designer is attempting either to create new widgets or to (knowingly) go against the set standards for a given platform. For instance, the design may include graphs, the need to depict special relationships, or a unique combination of text, graphics and images that is not implicitly supported by any standard toolsets. In this case, the designer should be aware of the technical possibilities and the limits of a given platform, and I would argue that the designer should be able to write a little code and try out a few things to ascertain what might be within the realm of possibility.