Real-World QVT - Eclipse

QVT (Query/View/Transformation) is an OMG specification for a model-to-model transformation language. Some tools already implement it (Eclipse, AndroMDA). I'm wondering whether it is really used in real-world cases. Will it ever take off and be used to tackle real-world problems? Is anybody using the QVT language?

From observing the MDD community for our own projects, I'd guess that QVT will eventually pick up. Currently ATL and Kermeta seem to be very popular and, judging from the postings in the groups, not only in academia.
There's an implementation of declarative QVT now (see the M2M Eclipse group for the announcement); that'll be very interesting for us. We've been using the ModelMorf prototype, but it was only a prototype and had a very long turnaround time. I hope that with the integration of dQVT into the Eclipse tool chain we'll be able to use it for our own projects (a SoftEng tool, see http://rcos.iist.unu.edu, sorry, academic of course :).
I guess the pain of doing model-driven development by hand/with manpower is not high enough yet... once the tools really deliver an order-of-magnitude increase in productivity, that'll change.

Seems like QVT is being used for Model-Driven Security applications. It is a good choice because of its clearly defined semantics and provability. This is still research, however. France Telecom is experimenting with QVT; they want to use it for database migrations and a generative approach to applications.
http://smartqvt.elibel.tm.fr/events/QVT%20Experimentations%20at%20France%20Telecom.pdf
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4159881

Related

Enterprise application framework supporting DDD

I spent a short time studying Habanero and found it a good approach for building enterprise applications in a really short period of time.
The pattern which Habanero uses is "Active Record", as its developers say.
My questions are:
Is there any similar application to Habanero which fully supports Domain-Driven Design by determining aggregate roots, entities and value objects?
Is it the right decision to use such tools in big organizations?
Is it worth training our team on such a tool?
Thank you.
Framework support for Domain-Driven Design is quite different from framework support for data-driven applications. Such a framework should increase the productivity of developers who work with a ubiquitous language that evolves with the business and is learned from domain experts.
Developers should not have to face concepts like aggregates, roots and value objects in the framework itself, because those are just modelling concepts, conceptual tools to ease the development process. Thus a framework exposing abstract classes or interfaces named AggregateRoot, Entity or ValueObject is fundamentally broken. It doesn't provide any real value to an application, just useless indirections.
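To make that concrete, here is a minimal sketch in Scala (all names are hypothetical, not from any particular framework): the model speaks the ubiquitous language directly, and the aggregate root is a matter of design rather than of inheriting a marker type:

```scala
// Value objects are plain immutable case classes; no ValueObject interface.
final case class Money(amount: BigDecimal, currency: String) {
  def +(other: Money): Money = {
    require(currency == other.currency, "currency mismatch")
    Money(amount + other.amount, currency)
  }
}

final case class OrderLine(product: String, price: Money)

// Order is the aggregate root by design: all changes to its lines
// go through its methods, not because it extends AggregateRoot.
final class Order(val id: String) {
  private var lines: List[OrderLine] = Nil

  def add(line: OrderLine): Unit = { lines = line :: lines }

  def total: Option[Money] = lines.map(_.price).reduceOption(_ + _)
}
```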
However:
There are a few frameworks designed to support domain-driven design, listed here. Moreover, I'm developing one myself, based on previous experiences that worked very well.
It depends, obviously. For example, we used all of Epic's modeling patterns with success.
We used some "home-made" frameworks too, and some of them proved to really increase productivity. However, such frameworks (if useful) always have steep learning curves, and it depends very much on how reliable the software has to be and what the developers' skills are.
It depends on the framework, on the complexity of the business (if you don't need a domain expert to understand it, you don't need DDD) and on the developers, too. I've seen success stories and huge failures with different frameworks in different contexts. I've also given a conference talk on the topic (you can see the slides here).

What is the most mature library for building a Data Analytics Pipeline in Java/Scala for Hadoop?

I found many options recently, and I'm interested in comparing them, primarily by maturity and stability.
Crunch - https://github.com/cloudera/crunch
Scrunch - https://github.com/cloudera/crunch/tree/master/scrunch
Cascading - http://www.cascading.org/
Scalding - https://github.com/twitter/scalding
FlumeJava
Scoobi - https://github.com/NICTA/scoobi/
As I'm a developer of Scoobi, don't expect an unbiased answer.
First of all, FlumeJava is an internal Google project that provides an (awesomely productive) abstraction on top of MapReduce (not Hadoop, though). Google released a paper about it, which is what projects like Scoobi and Crunch are based on.
If your only criterion is maturity, I guess Cascading is your best bet.
However, if you're looking for the (imho superior) FlumeJava-style abstraction, you'll want to pick between (S)crunch and Scoobi.
The biggest difference, superficial as it may be, is that Crunch is written in Java with Scala bindings (Scrunch), while Scoobi is written in Scala with Java bindings (scoobij). They're both really solid choices, and you won't go wrong whichever you choose. I'm sure there's quite a similar story with Crunch, but Scoobi is being used in real projects and is under continual development; we're very active in fixing bugs and implementing features.
Anyway, they're both great projects with great people behind them, and they were both released within days of each other. They provide the same abstraction (with similar APIs), so switching between the two won't be an issue in the slightest. My recommendation is to give them both a try and see what works for you. There's no lock-in with either project, so you don't need to commit :)
And if you have any feedback for either project, please be sure to provide it :)
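To give a flavor of what the FlumeJava-style abstraction looks like in Scoobi, here's a word count along the lines of the project's README. This is a sketch from memory; exact method names (e.g. the form combine takes) have varied between Scoobi releases:

```scala
import com.nicta.scoobi.Scoobi._

// Word count over Scoobi's distributed list (DList) abstraction.
// Each transformation builds up a MapReduce plan that only runs
// when the result is persisted.
object WordCount extends ScoobiApp {
  def run() {
    val lines: DList[String] = fromTextFile(args(0))

    val counts: DList[(String, Int)] =
      lines.flatMap(_.split(" "))
           .map(word => (word, 1))
           .groupByKey
           .combine((a: Int, b: Int) => a + b)

    persist(toTextFile(counts, args(1)))
  }
}
```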
I'm a big Scoobi fan myself and I've used it in production. I like the way it allows you to write type-safe Hadoop programs in a very idiomatic Scala way. If that is not necessarily your thing and you like the Cascading model but are scared off by the huge amount of boilerplate code you'd have to write, Twitter has recently open sourced its own Scala abstraction layer on top of Cascading called Scalding.
Announcement: https://dev.twitter.com/blog/scalding
GitHub: https://github.com/twitter/scalding
I guess it's all a matter of taste at this point since feature-wise most of the frameworks are very close to one another.
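For a concrete taste of Scalding, here's the canonical word count in its fields-based API, essentially as shown in Twitter's announcement post (the input/output arguments are illustrative):

```scala
import com.twitter.scalding._

// Read lines, split into words, group and count, write TSV output.
// Run with: --input <path> --output <path>
class WordCountJob(args: Args) extends Job(args) {
  TextLine(args("input"))
    .flatMap('line -> 'word) { line: String => line.split("""\s+""") }
    .groupBy('word) { _.size }
    .write(Tsv(args("output")))
}
```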
Scalding also has the advantage of significant open source projects built atop it, such as Matrix API and Algebird.
Here are some examples:
http://sujitpal.blogspot.com/2012/08/scalding-for-impatient.html
Cascalog was released almost two years before Scalding, and arguably has more advanced features for building robust workflows:
https://github.com/nathanmarz/cascalog/wiki

Bringing Scala into my company

Now I know that this is actually not a very technical question, but it's one that has been bothering me for some time. We are using a lot of C++ and PHP at our company, and some of our developers are really hoping for a new and modern language to come along and help us become more productive. I have been talking about what Scala can do, and the other coders seem to be gaining some interest in the language. The tough job is: how do you convince your boss to consider Scala as a language for the company? I saw the presentation "Sneaking Scala into your company", but it deals with the situation where you are already using Java at your company, which we aren't.
How do you fight off the usual "that is just esoteric stuff" and "we can already do that in $LANGUAGE" arguments? I was planning to give a talk about Scala, and since I don't have much time, I need ideas on how to get people interested in the language rather than setting off reactions like "currying? we can already do something like this with boost::bind".
How did you guys do it?
Regards,
raichoo
EDIT: Gave my talk yesterday, people were very excited. My company is going to give it a try! Thanks for all your suggestions.
If you don't already have killer arguments, what are you basing your reasoning on that Scala will make your company more productive?
Don't like something and then hunt for reasons to use it at work; let the reasons speak for themselves.
"A hammer looking for nails"
Using it to do some stuff on the side, such as data migrations, testing and similar things, will make sure the necessary experience is built up and can give it some exposure.
ScalaTest is really nice to help with acceptance/integration testing. (Yes, I know it is nice for unit testing, but I do not see that immediately happening with C++/PHP target code, and it would probably be unwise).
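As a sketch of what that could look like: a ScalaTest suite driving a running instance of the existing C++/PHP application over HTTP. The URL and the assertion below are hypothetical, just to show the shape of such a test:

```scala
import org.scalatest.FunSuite
import scala.io.Source

// Acceptance test against a locally running instance of the app;
// the endpoint and expected content are made up for illustration.
class SearchAcceptanceSuite extends FunSuite {

  test("search page returns results for a known term") {
    val html = Source.fromURL("http://localhost:8080/search?q=widget").mkString
    assert(html.contains("results"))
  }
}
```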
Proofs of concept and other prototypes are great for two reasons:
1) They showcase the capabilities.
2) You are certain they will be thrown away if you have to reimplement them in C++/PHP.
Now, a bad time to introduce Scala would be when you REALLY need it: hopes will be high, it will not immediately work as intended, hopes are dashed, and everybody will blame Scala. As a result it will be burnt for a long time in the organisation.
Sooner or later some suit will think it was his idea to introduce Scala and use it on a formal project. If that project is moderately successful, then it is sold.
These kinds of changes are complicated people issues, and the harder you push, the harder the push-back you will face. On the other hand, a persistent mind can move mountains.
Redo some of your work-related code in Scala and compare KLOC, code structure and performance. If it looks and works better, show it to your peers and your managers.
In other words:
Talk is cheap. Show me the code.
-- Torvalds, Linus (2000-08-25)
In the case of our company (and I assume many companies share the same scenario), the move to Scala (from Java) was initiated by tech people, who 1. wanted to be more productive writing code (living in the 21st century, utilizing modern approaches), 2. wanted fewer troubles building concurrent applications (the actor concept promoted by Scala is way simpler than Java's thread-based concurrency), and 2.1 wanted a simpler way of building scalable, staged event-driven architectures.
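To illustrate the point about actors, here is a minimal sketch using the scala.actors library that shipped with Scala at the time (the Counter example itself is made up):

```scala
import scala.actors.Actor
import scala.actors.Actor._

// A tiny message-driven counter: its state is confined to the actor,
// so no explicit locks are needed even under concurrent senders.
class Counter extends Actor {
  def act() {
    var n = 0
    loop {
      react {
        case "inc"  => n += 1
        case "get"  => reply(n)
        case "stop" => exit()
      }
    }
  }
}

object CounterDemo extends App {
  val c = new Counter
  c.start()
  c ! "inc"
  c ! "inc"
  println(c !? "get") // synchronous ask; prints 2
  c ! "stop"
}
```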
In our company, the transition to Scala was more or less simple, because Scala was literally sold to business people as a library for Java :) From their POV we're still using the same platform (JVM), application servers, etc., but developers are having more fun at work and therefore are more inspired and work more efficiently.
Maybe you could pitch Scala by showing off the suite of tools that is used for development? For example, if you are not already using Eclipse in your company, show your execs a demo of what a modern IDE can do for your productivity.
There is a book called "Fearless Change" (Linda Rising) that describes a pattern language for "powerless leaders" (I LOVE that role title!). SE-radio had a really motivating interview with the author: http://www.se-radio.net/podcast/2009-06/episode-139-fearless-change-linda-rising. Listen up on that interview to collect a few non-technical strategies that can help you in this struggle!
I haven't used Scala yet for any real business code, but I know people who have.
One group used it to write a tool to analyze log files. So they didn't use it for mission-critical business code, but for a non-critical tool to support the project.
Another person I know is an architect, and he just went and wrote some Scala on his own for some production code without telling his manager. After the code was deployed successfully, he did tell him. One of the things he mentioned is that because Scala runs on the JVM, the people who support the application don't even notice: to them, Scala is just another library that's included with the application (they were already used to the JVM). Of course this approach is risky, and not everybody will be in the position or be willing to do this.
You could start small - use it as your personal preferred scripting language for small things that you need yourself. Tell your fellow developers about it and make them enthusiasts too. If they also start using it then you can step it up to make some side code for your project (such as for example that log analyser tool).
This isn't a really easy task. I would concentrate on the fact that you will be able to produce code, and therefore products, faster and with higher quality. Those are always the two things business wants to hear from you and will listen to.
Maybe you can show an example of 1-2 very small projects you did in your company with C++/PHP and compare the effort, quality etc. with a similar/the same implementation in Scala? This would be very impressive and should also convince people who are not on the coding side.
There was a very good talk at Scala Days 2010 by David Copeland:
Sneaking Scala into your organisation
The executive summary: Testing. You can use Scala for testing without affecting release code.

Should developer tools, languages, frameworks, etc. be standardized across an organization? [closed]

The organization that I currently work for seems to be heading in the direction of dictating to software developers which tools, languages, frameworks, etc. must be used. However, nobody has convinced me that this is a good thing. The main argument I have heard is that it will make training easier. But, after developing software for over 10 years, I've never relied on training to learn how to use an IDE, programming language, or anything else; so I just can't relate.
With the rapid speed at which technology evolves, and the s-l-o-w-n-e-s-s at which I know the standards will adapt, I am concerned that my customers will have requirements that I won't be able to easily implement or won't be able to implement as efficiently as I should. For example, if there is a UI requirement for an auto-complete feature in a web app, and no API has been approved for this yet, I would need to implement auto-complete myself as opposed to using one of the many APIs that provide it out of the box.
A more radical example is if my customers wanted to have Google Wave features. In that case I would want the flexibility of configuring my development environment (including the IDE) and selecting appropriate frameworks (ex: GWT) to use.
Please provide feedback on whether or not you think that software developer tools, languages, etc should be standardized and a few points to support your argument.
There is a lot of benefit for standardization. My organization has fairly set standards on what technology we will use. We realize strong benefits in the following areas ...
Hiring. It is easy to describe what technologies we are looking for and make sure our recruiters are looking for the right people.
License/Software costs. I can buy enterprise licenses easily. It gives me the opportunity to keep costs down by letting me spend more with a smaller number of vendors and thus get more leverage.
Consistency of delivery. Our teams have a very good idea of what projects will take to build, rollout and maintain because they have done it with success before (and they know the pitfalls too).
Agility. I can have one team take over for another or one individual take over for another more easily because of standardization.
Quality. We have peer reviews across teams as well as QA across teams.
Without a consistent use of a technology stack, tools, languages and frameworks, these types of benefits would be more difficult to realize. I am not closed off to new technologies, but there has to be a concrete reason beyond "what if I want to ..."
A major issue with standardization is that once standards are out there, they get stamped in concrete and are difficult to change. This is why our corporate IT environment is stuck on IE 6, and the best change control system we have access to is CVS. Given this situation, some developers break the rules, and some find jobs at more innovative companies.
You have a mixed bag here.
I wouldn't standardize on IDEs, because every developer works differently. Those who are insanely proficient in emacs may see their performance suffer if forced to use Visual Studio. I optimize my Visual Studio experience with a 30" monitor and find it incredibly productive.
However, standardizing on some tools, such as SCons or make or something to build products is perfectly reasonable.
Banning some libraries and having a process where new libraries are either approved or not is also very reasonable. I know lots of companies that ban boost, or JQuery, or ban open source libraries in general, etc. And they had good reasons for doing it. I know I got fairly upset when an intern incorporated some random "security" library he found on the internet without running it by anyone.
In the end every company is different. You have to be standardized enough to avoid serious complications and issues as people come and go, or as new products are formed and organizational structures change. But you have to be flexible enough to avoid re-inventing every wheel you need.
The important thing is to have clear reasons for adopting a certain tool or banning some other tool or library. You can't just have management dictate that thou shalt use this and not that without consulting the engineering team and making the decision for good reasons. And once decisions are made those reasons should be written down and clearly communicated.
And also, if, in the end, your favorite tool or library isn't adopted, please don't whine about it. Be adaptable and do your job, or find a new one that makes you happier.
I once worked for a manager who felt the need to innovate at every level of his software development operation. Every development tool had to be cutting edge (preferably in beta). Many of the tools he asked us to use didn't have good documentation, and training was not available. Ultimately, most of the technology we tried simply didn't work. We wasted a lot of time churning through new technologies, only to dump them when it became clear we couldn't make progress.
I tried to make the case that innovation is perfect in the area where your value proposition lies. Innovation can also be used judiciously where standard techniques fail. But for most mundane tasks, using tried-and-true tools and methods should be the default. Less risk, less cost, less management attention needed. So you can focus time and energy on the areas where innovation has the most benefit.
So I think standardization has an important role. But blindly saying everything must be standard is just as sure to fail as my manager who thought everything must be innovative.
The number one argument in favor of standardization is that it maximizes the ability of the organization as a whole to use a common body of knowledge. Don't know how custom web controls are built in ASP.NET/C#? Ask Bill down the hall who has the knowledge. If you use different tools, such organizational wisdom is cut off at the knees. While it is not good to be restricted to a least common denominator (and hopefully your management will realize this) you should not overlook the benefits of shared experience!
UPDATE: I do not agree that innovation and standardization are polar opposites. Indeed, would we have nearly the level of web innovation if we still had the mishmash of networking standards characteristic of the 1980s? No we would not. Of course, we might have more innovation on new low-level networking protocols but is that really worth it? In its place, we've had an explosion of creativity within the bounds of TCP/IP and the Web standards (http, html, etc.)
The trick is knowing how to standardize without using it as an argument for closing down all new exploration. For example, we use only ASP.NET/C#/SQL Server in my company but I'm perfectly open to the use of new tools within this framework (we recently adopted the DevExpress reporting package, for example, supplanting the earlier standard).
Standardization is a must for a productive development team. However, that doesn't mean you can't revisit the standards from time to time and adjust them to new technologies and trends.
Whether you develop operations software for internal clients, or products for external clients, there is no compelling reason not to standardize. You certainly did not give one.
Had you seen how companies struggle to hold together heterogeneous products that have been maintained for 10 years or more, and are now a conglomerate of various technologies that developers at some point thought made sense, you would not have asked this question.
Off the top of my head, I could name at least 2 well-known software companies that will be driven out of business because their cost of maintenance has become so high that they can no longer compete (but I won't).
I think the misconception here is that suppressing individualism would suppress innovation. That is simply not true. It is poor technical leadership that suppresses innovation.
One unpleasant consequence of standardization is that it tends to stifle innovation.
Innovation is scary. It involves cost and risk.
Standardization is not scary. It reduces cost and risk in the short term. Until your competitors have created a game-changing innovation. Then standardization is very costly.
It depends on the organization I think. One like Microsoft, yes, there should be a bit of a standard. A small business with one IT department, no. A larger business with several offices around the world ... maybe.
it all depends :-P
Assuming the organization has a broad suite of enterprise applications to manage, I'd say no for the following reasons, though I may be taking the message of everything being the same a bit too literally:
Compromise on using best-of-breed for systems, e.g. if all the databases are to be MS-SQL then any Oracle DB solution is thrown out. This would also apply to the fact that everyone using an IDE has to use the same one whether they be doing Data Warehouse report development, web applications, console applications or winForms. I'm thinking of systems like ERP, CRM, SCM, CMS, SSO and various other TLAs, FLAs, and SLAs. (LA = Letter acronyms for a decoding hint if you need it)
Upgrading by committee is another interesting issue. If each team can choose its own tools, one person can decide when to upgrade, e.g. to start using Visual Studio 2008 instead of Visual Studio 2005. With a standard, you have to determine at what threshold it is worth upgrading everyone simultaneously, which may be a big headache if there are more than a few developers. For example, over the past 10 years, when would the IDE changes, framework changes, etc. have happened?
Exceptions to the standards. Could a contractor bring in something not used in the organization if they believe it helps them build better software, e.g. ReSharper or other add-ons that some contractors believe are very worthwhile but that the organization doesn't want to spend the money on? What about legacy systems that may make the standard become a bit unwieldy, e.g. this was built in ASP.NET 1.1, so everyone has to have VS 2003 installed even if most will never use it?
Just my thoughts on this.
There are several good reasons to standardize.
First, it allows the enterprise better organizational flexibility, if everybody is more or less familiar with the same things. It also allows people to help each other better. I can't help with problems in the ASP.NET stuff, and there's not all that many people who can help me on the C++ side.
Second, it reduces support problems and expenses. Oracle and SQL Server are both decent products, but using both for similar functions is only going to cause problems. Not to mention that I've been in shops using several widely different platforms to do similar things, and it wasn't fun.
Third, there are some things that just have to be standardized. We couldn't operate with half the team on VS 2005 and half on VS 2008, since we keep project files under source control. We had to pick a time and convert over.
Fourth, in some businesses, it simplifies the regulatory problems. I don't know what business you're in. I work at a place where we can get away with making mistakes right now, but I've also contracted at a bank and a utility, where it's necessary to be able to show auditors that everything is going in a standard way.
Fifth, it can simplify procurement, if you're dealing with software that costs money.
This doesn't particularly limit us, since if there's something we need that isn't standardized on we just go ahead and get it or do it.
If you want to make a business case against standardization, you'll need to have a business-related argument. Your argument seems to be that you won't be able to implement features the user wants, and that is a consideration. Got another argument?
There's nothing wrong with standardizing on an IDE that is rich enough to be configured for individual developers.
However, do make sure that you don't prevent individual developers from using additional tools, as long as the tools are licensed and that the use of the tool by one developer doesn't require all other developers to use it.
For instance, I happen to use NORMA to help me design databases. The output is SQL Server DDL (or anything else I want). I can make the DDL part of the project without making my NORMA source part of it. Later developers do not need to use NORMA to work on the project.
On the other hand, if I decided to use the Configuration Section Designer to create configuration sections, then future developers would also have to use it. A decision would need to be made about whether to use that tool.
The company I work for uses C#, ASP.NET, JavaScript and generates HTML. The advantages over and above those mentioned above are that there is a perception of improved velocity for maintenance and adaptive changes. The disadvantages include generating some boredom for people who are technically savvy (geeky) and prefer to use a mix and match of languages, depending on what they fancy is better suited, or for 'performance reasons'.
Technical and personal supervision is always good to have when you are developing as fast as you can to meet tight deadlines and competing in a highly saturated market for web development.

Is programming knowledge necessary for a user experience designer?

We have a user experience designer in our team who has no programming background. He is expected to design screens within Eclipse as a development environment. His (valid) complaint is that every time he designs a specific screen and gives it to development - they tell him what is not possible technically using either SWT or GEF. So, he wants me to teach him basics of SWT/GEF so that he can make informed decisions and maybe even try out certain things in eclipse (as opposed to using Photoshop) before proposing designs to save time.
My personal belief is that design should not be constrained by technical possibilities and in theory, everything that the designer dreams of (at least the practical things) should be possible technically - albeit with workarounds or a little hacking.
So, my question is this - how important do you think is programming knowledge for user interface design? And if it is, how do you go about teaching someone with absolutely no programming experience the graphical frameworks on various platforms?
In principle, I agree with you. Programming knowledge shouldn't be necessary to be a skilled designer of UIs and work flows. However, knowing the abilities and limitations of the technology in use can help a UI designer work more effectively with the programming staff.
Where programming knowledge can help is when the development staff is blowing smoke, claiming something cannot be done when it can be; some knowledge of the tools being used can help refute that. If the development staff is correct that something cannot be done, then knowledge of the tools can help the UI designer find an appropriate solution that meets the design goals and is achievable.
With a properly cooperative development staff, the UI designer would need very little (if any) knowledge about the specific GUI tools being used.
I've been on the developer side of this, where I was being asked to do something impossible or impractical. I always worked with the designers to find a happy middle that met the design goals. Sometimes what I thought was impossible was in fact possible. Sometimes we had to do things a different way. A few things had to be put off as "possible, but too much effort." (Such as an SWT-based application that became a Windows task bar. Definitely possible, but impractical for the project in question, as it would require native code.)
What is most important is that both sides realize that they are on the same team.
It's very important.
Not knowing about:
technology in general
the technology you have chosen to invest your time in to produce the end product
will result in a complete waste of time for everyone.
Even the end user needs to learn a bit about the technology employed in order to use whatever product we make.
Someone who drives a car will always need to know how to fill up the tank and know the basics of what a car is and what it can do; software works in a similar fashion.
It's like asking someone who doesn't know that cars (the ones of today) need wheels to make a drawing of your next release model.
The way to make them more aware of the technology is:
Show him/her products similar to the ones you should be making
Show him/her stand-alone implementations of the building blocks you might consider using
But by all means, this doesn't mean you should stifle their creativity. Have them draw what they dream; just make them that little bit aware of reality, as needed, in order to have something done in this lifetime.
So, my question is this - how important do you think programming knowledge is for user interface design?
I think a basic knowledge of the standard user interface for the platform is required (text fields, combo boxes, radio buttons, etc). A good designer should be familiar with the capabilities and limitations of these GUI components, from a developer's point of view. So I guess some basic programming knowledge would be useful.
My personal belief is that design should not be constrained by technical possibilities and in theory, everything that the designer dreams of (at least the practical things) should be possible technically - albeit with workarounds or a little hacking.
I think there are important qualifications here --- each OS has guidelines on what constitutes good GUI design, and it's beneficial for your product that you follow them because the user has a certain mental model of how he or she should interact with applications on that platform. (Having said that, there may be good reasons for breaking some design conventions, e.g. in games, specialized graphics/music applications.)
how do you go about teaching someone with absolutely no programming experience the graphical frameworks on various platforms?
Each toolkit makes available a whole bunch of small sample programs to demonstrate the use of different components --- this is probably a good first step to acquaint oneself with them.
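For instance, the canonical minimal SWT program is only a dozen lines. The sketch below is written in Scala to match the other examples on this page (the official samples are in Java, but it's the same API on the JVM):

```scala
import org.eclipse.swt.SWT
import org.eclipse.swt.layout.FillLayout
import org.eclipse.swt.widgets.{Display, Label, Shell}

// Open a window containing one centered label, then pump the
// SWT event loop until the user closes the window.
object SwtHello extends App {
  val display = new Display()
  val shell   = new Shell(display)
  shell.setText("Hello SWT")
  shell.setLayout(new FillLayout())
  new Label(shell, SWT.CENTER).setText("Hello, world")
  shell.pack()
  shell.open()
  while (!shell.isDisposed)
    if (!display.readAndDispatch()) display.sleep()
  display.dispose()
}
```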
It's not as important as common sense, in my opinion.
It helps, of course. But if the designer is asking for something that could be done (because some other application uses it), the development lead should at least present a workaround.
Programming knowledge, probably not; knowledge of the limitations of the chosen platform, certainly.
I think it's better to learn up front, but if your UI designer is forced to learn on the fly, make sure that each time he is turned away, it's explained why something can't be done rather than just a flat refusal. This will keep him from getting as frustrated as he might otherwise be because he'll be able to form at least some logical framework for what he can and can't do.
I think the designer should be aware of the features and limitations of the tools he's using, and of the constraints and deadline of the current project those guys are building.
He should also be aware of the background processing that's going on to show the screen UI, and all these things will come only if he has some rudimentary knowledge of programming.
He doesn't have to dabble in the depths of OOP, learn SQL, know the intricacies of reflection or anything fancy like that. He just has to know his platform well, and that, I think, is a requirement even for the designers.
The very core of "design" is to find a way to achieve a desired result within the constraints. If you don't know anything about a part of the constraints that affects your goal, then you can't design.
It all depends on your tools.
Edit: What I mean is there are tools that are designed for designers, and tools for programmers. Eclipse, for one is not a designer tool. Photoshop is. Flash maybe, Flex not. I wouldn't require a Flash designer to program, but a Flex designer does need to program.
As for telling them about the limits of your tools, it depends: really good creative designers will embrace those limits and make incredible work; mediocre designers will perceive the limits as roadblocks, stop being creative, and just follow the rules with fear.
I have given it some thought and, based on the answers given previously, I have reached certain conclusions:
Preliminary knowledge of what is possible and what the constraints are when designing on a given platform is mandatory. This means that the graphics designer should be aware of the following:
The basic design guidelines on that platform
The standard UI toolkit / widgets provided on that platform (for ex: textboxes, drop-down lists, etc)
What is not possible (or is too cumbersome) on a given platform (for ex: creating translucent modal dialogs while fading out the background in Eclipse)
This amount of knowledge might not require the designer to dabble in programming.
The second level is where the designer is attempting either to create new widgets or to (knowingly) go against the set standards for a given platform. For instance, if the design includes graphs, or the need to depict special relationships, or a unique combination of text, graphics and images that is not supported out of the box by any standard toolset. In this case, the designer should be aware of the technical possibilities and the limits of a given platform, and I would argue that the designer should be able to write a little code and try out a few things to ascertain what might be within the realms of possibility.