I'm working on establishing automated testing practices and test suites in an organization. A peer is telling me that we "should use a framework". To me, a framework is any set of code and/or other tool that helps you create something.
My peer seems to be suggesting that there are industry standard automated testing frameworks.
I've seen the following patterns in designing Test systems before:
Data Driven
Keyword Driven
Model Driven
Query Driven
My counterpart includes "Modular" as one of these. Because of my background in Software Engineering, I hear the word "Modular" and think of modular programming (as opposed to object-oriented, aspect-oriented or procedural programming)... a way of organizing code, not a methodology or framework type in and of itself.
I've seen the wikipedia definition for "Modular Automation" and it looks the same as the programming paradigm.
What am I missing? What can I read to get on the same page as my counterpart? Is it me or him who doesn't understand something? I have over a decade of software engineering experience; my counterpart has been in QA for nearly that long. He's not able to cite references. I've searched Google for six hours now trying to learn about this "Modular Framework" and can't find a technical example, only descriptions of the standard programming paradigm (i.e. organizing code into modules).
It turns out the major industry-standard designs for automated testing are:
Data Driven
Keyword Driven
Model Driven
Query Driven
Additionally, "hybrid" approaches are used, in which more than one of the above designs is combined.
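To make the distinction concrete, here is a minimal sketch of a data-driven test, assuming Java and JUnit 5 (the shippingCost function and its threshold are made up for illustration). The test logic is written once and the framework feeds it rows of data; in a real suite the rows usually come from a CSV file, spreadsheet, or database.

```java
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;
import static org.junit.jupiter.api.Assertions.assertEquals;

class ShippingCostTest {

    // The system under test, stubbed here so the sketch is self-contained.
    static int shippingCost(int orderTotal) {
        return orderTotal >= 50 ? 0 : 5;
    }

    // Data-driven design: one test body, many data rows.
    @ParameterizedTest
    @CsvSource({
        "49, 5",   // just under the free-shipping threshold
        "50, 0",   // exactly at the threshold
        "120, 0"   // well above it
    })
    void chargesShippingBelowThreshold(int orderTotal, int expectedCost) {
        assertEquals(expectedCost, shippingCost(orderTotal));
    }
}
```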
In a number of places on the web (including Wikipedia), "Modularity Driven" test case design is mistakenly listed as if it were one of the automated test case design strategies above. The definition of this term has more to do with the organizational aspects of coding than with the way one drives an automated test. "Modularity Driven" automated testing is a misnomer; in other words, there is no such design strategy. The term "modular" simply describes the programming paradigm being used.
The modular aspect of a test lies in its organization: storing code in modules, as opposed to other programming paradigms such as OOP or procedural programming.
I have heard Modular Automation also referred to as Component Based Test Case Design. HP is a big player in this space; they came up with a product called Business Process Testing.
It consists of:
•Reusable Business Components
•Business Components combined into Business Process Tests
Business components are reusable units that perform a specific task in a business process (for example, Add to Cart).
A business process test is a scenario composed of business components (for example, Place an Order).
In HP's Quality Center the Business Components module enables you to create and manage reusable business components.
Then the Test Plan module enables you to drag and drop the components into business process tests, and debug the components.
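This is not HP's API, just a rough plain-Java sketch (all class and method names are hypothetical) of what the component-based / modular idea amounts to: small reusable components, each automating one business task, composed into a business process test such as "Place an Order".

```java
// A plain-Java illustration of component-based test design (not HP's API).
// Each component automates one business task; a business process test strings
// several components together into an end-to-end scenario.
import java.util.HashMap;
import java.util.List;
import java.util.Map;

interface BusinessComponent {
    void run(TestContext ctx);
}

// Shared state the components pass along (session id, cart id, and so on).
class TestContext {
    final Map<String, String> values = new HashMap<>();
}

class LoginComponent implements BusinessComponent {
    public void run(TestContext ctx) {
        // ...drive the UI or API to log in...
        ctx.values.put("sessionId", "dummy-session");
    }
}

class AddToCartComponent implements BusinessComponent {
    public void run(TestContext ctx) {
        // ...add an item to the cart using the session from the previous step...
        ctx.values.put("cartId", "dummy-cart");
    }
}

class CheckoutComponent implements BusinessComponent {
    public void run(TestContext ctx) {
        // ...pay for whatever is in ctx.values.get("cartId")...
    }
}

// The "Place an Order" business process test is just an ordered list of components.
class PlaceAnOrderTest {
    public static void main(String[] args) {
        TestContext ctx = new TestContext();
        for (BusinessComponent step : List.of(
                new LoginComponent(), new AddToCartComponent(), new CheckoutComponent())) {
            step.run(ctx);
        }
        System.out.println("Place an Order completed with context: " + ctx.values);
    }
}
```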
Update: This question was inspired by my larger quest for mapping ontologically the whole software systems architecture enchilada. I've written a blog post about it, and hopefully it will help clarify what I'm after.
Many, many, many event-driven frameworks and stacks have too much variation for my little head to get around. Are there resources somewhere that define the outline of a reasonable Application Event Model: what events there are, and which triggers are most common?
I've got my own framework with a plugin and event-driven architecture, but I want to open-source it, and as such would like to make it closer to some common ground as not to alienate people.
So to clarify: this is for an application, meaning setting up the environment, the dependencies, and the data sources (like databases); being an MVC framework, setting up the model and the view and launching controllers/actions; and in the GUI, the various stages of the interface (header, content, columns, etc.).
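To make that concrete, here is roughly the kind of lifecycle I have in mind, sketched in Java. The event names are my own invention, not taken from any framework, which is exactly the problem: is there a commonly agreed set?

```java
import java.util.ArrayList;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

// Illustrative only: these event names are invented for this question,
// not taken from any particular framework.
enum AppEvent {
    BOOTSTRAP,          // environment and configuration loaded
    DEPENDENCIES_READY, // plugins and services wired up
    DATASOURCE_READY,   // database connections established
    REQUEST_RECEIVED,   // an incoming request has been mapped to a controller
    CONTROLLER_INVOKED, // the action has produced a model
    VIEW_RENDER,        // header, content, columns, etc. are rendered
    SHUTDOWN
}

class EventBus {
    private final Map<AppEvent, List<Runnable>> listeners = new EnumMap<>(AppEvent.class);

    void on(AppEvent event, Runnable handler) {
        listeners.computeIfAbsent(event, e -> new ArrayList<>()).add(handler);
    }

    void emit(AppEvent event) {
        listeners.getOrDefault(event, List.of()).forEach(Runnable::run);
    }
}

class LifecycleDemo {
    public static void main(String[] args) {
        EventBus bus = new EventBus();
        // A plugin hooks whichever stages it cares about.
        bus.on(AppEvent.VIEW_RENDER, () -> System.out.println("plugin: inject sidebar widget"));
        for (AppEvent stage : AppEvent.values()) {
            bus.emit(stage);
        }
    }
}
```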
Ideas? Thoughts? Pointers? (And I've made it language and platform neutral at this point)
I read your blog entry, which btw I found an extremely interesting read, but... this question does not seem to reflect the broadness of the issue you are presenting there.
What you are after is very abstract and theoretical. What I mean to say is that if you tie any of those ideas to actual technology, you will find yourself 'stuck' with it. This is why many of us are reluctant to use any framework, especially the 'relabeled' products suddenly claiming to conform to the trend. We choose mainly on the basis of what appears to be needed to reach a predetermined result.
Frameworks (or tools in general) that target the application architecture domain distinguish themselves primarily by the amount of responsibility they are designed to take on. Spring, for example, only deals with the concept of decoupling and is therefore easily adopted and usable in many situations. The quality of any framework is expressed in terms of how well its designers were able to keep the product within the boundaries of that responsibility. Some front-to-end products do exactly the opposite, code generators being among the 'worst' of them.
To answer your question at the top of this page, I do not think there is a framework that does what you want at this time and I do not think there is a single model of how applications (should) work. Keep in mind though that the application architecture domain deals with technology more than it does with concepts. In other words: If it works and meets the requirements, then you're pretty much done.
That said, you might find something of value in agent-based systems.
Heh. Most developers pick the major framework they like the tools for and stick with it. That's usually the winning strategy. I sympathize with your desire not to marry a single vendor.
Keep in mind however, that in developing your own framework, you're going to end up tied to a single vendor anyway. :-)
Are there resources somewhere that define the outline of a reasonable Application Event Model: what events there are, and which triggers are most common?
I don't think so.
From what I see, there are two kinds of models out there: those with a real framework with which you can make a working data entry dialog, and abstract meta-meta-models that are optimized for modeling themselves.
Try surveying a few current frameworks that have good documentation online and cross-reference the major terminology in a spreadsheet. It's an interesting exercise.
I'd have a look at Spring for Java, and the XT Framework Spring module (http://springmodules.dev.java.net/docs/reference/0.9/html/xt.html), which apparently supports event-driven architecture, as starting points. Spring has an MVC framework (inc. convention-based routing to controllers), db configuration (for Hibernate, particularly), plus full dependency injection support. There's also a mechanism in Spring for modularising your web apps, called Spring Slices. And it can be integrated with Jersey for building RESTful apps.
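For a flavour of what that gives you, here is a minimal sketch of a Spring MVC controller plus a listener reacting to a container lifecycle event (the /orders path and the class names are placeholders, not part of Spring):

```java
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.servlet.ModelAndView;

// The path and class names are placeholders; only the annotations and
// interfaces are Spring's.
@Controller
public class OrderController {

    @RequestMapping("/orders")
    public ModelAndView list() {
        // Spring resolves the view name "orders/list" to a template.
        return new ModelAndView("orders/list");
    }
}

class StartupListener implements ApplicationListener<ContextRefreshedEvent> {
    public void onApplicationEvent(ContextRefreshedEvent event) {
        // Fired once the application context has been wired up; a natural hook
        // for cache warming, plugin registration, and similar startup work.
        System.out.println("Context ready: " + event.getApplicationContext().getDisplayName());
    }
}
```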
(Unfortunately, I tried to provide links to everything, but this place only lets new users post a single link. So you'll have to do some googling :) )
I'm a new software architect/lead, coming up with the software design for a team of software developers. I'm producing the requirements spec, interface header files, Visio software design docs, build plan, etc.
My question is: what does the rest of the team do during this period? I'm certainly engaging them in the design, but we don't need the whole team actively working on what I'm doing all the time.
Are there any good books for a new software architect?
Generally the various stages overlap, so there will be some coding during design, etc. Beyond that, there is a lot to do besides programming: they can be reviewing unfamiliar technology that is going to be used, setting up the source control system, reviewing business requirements, and reviewing your documents to make sure they make sense and are clear.
What a software team does while the lead does the design varies a lot from company to company. At my company we try to work on the design while the developers are finalizing other projects or fixing bugs.
Another approach I've taken when starting a whole new project is to get the developers to work on the design as well: people with a good understanding of the requirements can help you design smaller parts of the system and write the specs for them. Others can work on mockups or frameworks. This worked rather well for the small software team I led in a previous job (4 developers in total).
I also found it useful to have other team members research parts I'm unsure of (or even validate that things I think should work will indeed work), such as:
Investigating whether an external API provides the features we need
Writing a small proof of concept or technology demonstrator
Creating an API mockup (header file, interface, or REST endpoint) to investigate whether the API looks useful (see the sketch below).
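For instance, a throwaway interface like the one below (all names are hypothetical) is usually enough to judge whether a proposed API will be pleasant to use, before anyone commits to an implementation:

```java
import java.util.List;
import java.util.Optional;

// A throwaway mockup of a proposed API: no implementation, just the shape.
// Reviewing something this concrete surfaces awkward signatures and missing
// operations much earlier than a prose spec does. All names are hypothetical.
public interface CustomerDirectory {

    Optional<Customer> findById(String customerId);

    List<Customer> search(String freeTextQuery, int maxResults);

    Customer create(String name, String email);

    // Minimal placeholder type so the mockup stands on its own.
    final class Customer {
        public final String id, name, email;
        public Customer(String id, String name, String email) {
            this.id = id; this.name = name; this.email = email;
        }
    }
}
```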
As others have said, you typically want a ramp-up period during the first part of the project and through the first iteration. You're planning on building this iteratively, aren't you? Start with a core team (no more than 3-4 people, since you're going to need to communicate heavily with each other) to help you explore the requirements, get a basic data model in place, identify and set up any frameworks, and identify and set up build and test tools. Some coding activities typically take place in the design phase: UI mockups, and run-ahead prototypes of technically sensitive areas (whatever risks you have should be mitigated by exploratory coding, be they new technologies, undocumented interfaces to integrated systems, or unstable requirements).
But coders in the design phase should help with the design, in order to get their buy-in and to help train up the rest of the team during the first iterations. Your role during this is to ensure that the major nonfunctional requirements are known, prioritized, met by the design, and testable. You should also collaborate with the project lead or whoever else is responsible for staffing and financing in order to sketch out the iterations and the staffing levels needed. Ensure the solution can be built iteratively, and aim at implementing only a basic structure during the first iteration, both to build confidence and to eliminate risks. (Sometimes you can push major risks to the second iteration and focus the first on confidence and team building.)
And of course, be sure you are not designing every detail. You should be able to use every design artifact in the next iteration (and elaborate them later as needed). Since design decisions are expensive to change, try to postpone them. However, some influence the entire solution (for instance, the data model, or your approach to security) and absolutely must be at least outlined up front. This isn't waterfall. This is just not closing your eyes and hoping a viable architecture will emerge by magic.
But design proceeds throughout the iterations. It's just that you do less of it as you go along, and with lesser impact on the solution (unless you're unlucky... and then things get expensive).
Stop doing the useless things you do and just start coding with them! ;)
If there is no overlap with another ongoing project, getting them involved as you're doing is great; maybe push it a little further by having them prototype and present the pros and cons of alternative technologies (APIs, frameworks, libraries, etc.) that your project could use.
As a new software architect, I can recommend some books that helped me understand the role of the architect (but of course not to master it):
Fundamentals of Software Architecture: An Engineering Approach:
This book gives a good, modern overview of software architecture and its many aspects; a good place to start if you are a beginner or want to broaden your knowledge.
Software Architecture in Practice:
Explains what software architecture is, why it's important, and how to design, instantiate, analyze, evolve, and manage it in disciplined and effective ways.
Software Architect's Handbook:
This book takes you through all the important concepts, right from design principles to different considerations at various stages of your career in software architecture. It begins by covering the fundamentals, benefits, and purpose of software architecture.
Clean Architecture: A Craftsman's Guide to Software Structure and Design:
Learn what software architects need to achieve and how to achieve it, master essential software design principles and see how designs and architectures go wrong.
Software Architecture: The Hard Parts:
An advanced architecture book; you'll learn how to think critically about the trade-offs involved in distributed architectures.
Usually there's another project they can work on, but...
I have my team review the project specs/requirements and put together a basic/preliminary structure to get them already thinking through the application and working out specific questions.
When we convene at the table to discuss the plan they already have an idea of what the project is and requires and in some cases, they present questions I may have missed or overlooked.
Although it's too late now, a good way to approach it is to move the architect over before his current project has ended. Start freeing him up at around 25%, then work your way up to 75-100% on the new project a month or two before it starts (maybe more, depending on how much analysis and customer interaction there is).
On a trivial project (let's say 2 man-years) it might not be necessary, but anything bigger than that can end up in chaos if somebody doesn't at least get the analysis right before everybody jumps aboard.
If your team does not have any other projects to work on, ask the experienced programmers on your team to come up with a prototype so that you can create a requirements doc according to the needs of the client.
Also, programmers new to the technologies being used by the team could use this time to familiarize themselves with the technologies on which your team is going to develop the project.
architect != designer
Chances are that all of your developers can help with the design; let them. Architects don't have to be "lone wolves" and do everything themselves. You lay out the guidelines and the principles and the scaffolding, rough in the wiring, and let your developers flesh out the details - whether it is drawing Visio diagrams or building prototypes to mitigate unknowns/risks.
Migrate towards Agile/XP and away from waterfall methods, and you'll find the team a lot more help.
When making the general design, it's very handy to have programmers create proofs of concept. Do that especially with parts of the system that could end up being showstoppers if they don't work the way you planned, so you can think of alternatives and adjust the design.
That's going to help you to make the right design-decisions before moving entirely into a certain direction.
Just doing a design, and then moving on and start coding is a sure way to mess up a project. You won't realize that your design is not feasible (or just plain sucks) until you're half-way coding, and by then it's too late to make radical changes.
You'll waste time mitigating non-existing problems during the design, and you'll run into unforeseen problems during implementation.
Microsoft, of Cairo fame, is working on Oslo, a new modeling platform. Bob Muglia, Senior Vice President of Microsoft Server & Tools Business, states that the benefits of modeling have always been clear.
In simple, practical terms, what are the clear benefits that Oslo bestows upon its users?
In theory, there are a few benefits:
The people with the business knowledge can create the software models so you're less likely to lose anything in translation.
When non-technical stakeholders create models, it forces them to "think like a developer". They see that what they considered obvious and easy is actually difficult once you formalize it.
It's more efficient. Business people have business knowledge and technical people have technical knowledge so, why not let each group design a system in their area of expertise? No more games of telephone as business experts re-explain what they mean to a developer. Developers are no longer distracted by cryptic business needs. They can focus on the interaction between highly technical systems.
In practice, it's a lot trickier:
Models are hard and that's that. Just because you push model creation to a different group doesn't mean you get foolproof models. Software development is all about modeling so developers are used to it. You may actually lose efficiency as a second group comes to grips with formalizing their understanding of a business need.
Model driven dev is tightly linked to OO concepts. OO is good for a lot of things, but not everything. What happens if what you really need falls outside the abilities of your modeling tool?
In my experience, the division between business and technical people is artificial. The most effective people are technical-minded business people or business-minded technical people. They make things happen. If you separate business tasks from technical tasks, you ruin the opportunity for cross-training and cross-thinking.
I think modeling is simply the next abstraction level. Once it is established, it will lead to higher productivity.
MDSD today - mostly in the form of code generation - saves time. Duplicating working patterns for different parts of your software and writing only the real business code by hand boosts productivity a little, and most likely leads to better software quality and a cleaner architecture.
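As a toy illustration of that split (everything below is made up), the repetitive persistence skeleton is the sort of code a generator would typically produce from the model, while the business rule at the bottom stays hand-written:

```java
import java.math.BigDecimal;

// --- typically generated from the model: entities and repository interfaces ---
class Invoice {
    long id;
    BigDecimal amount;
    String customerId;
}

interface InvoiceRepository {
    Invoice findById(long id);
    void save(Invoice invoice);
}

// --- written by hand: the actual business logic ---
class LateFeePolicy {
    // Adds a 5% late fee; the percentage is a domain decision no generator can infer.
    BigDecimal withLateFee(BigDecimal amount) {
        return amount.multiply(new BigDecimal("1.05"));
    }
}
```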
I think the short answer is research projects!
A good place to start, though, if you're keen to look into it more, is Doug Purdy's PDC talk "A Lap Around Oslo", which you can see here. He explains how Oslo "captures the essence of the code without the ceremony"... whatever that means.
HTH.
Is the CSLA.NET architecture a good choice for an enterprise-level application?
That depends on what you mean by enterprise level... What is the application? Is it a desktop application? Web-based? Middleware? A shared library?
CSLA has some good and bad points. I've asked a similar question. Look at that one and this one
My criticisms are that it is very verbose and sometimes overly complex, and that it encapsulates the data layer within its classes, meaning that TDD is tough with it.
However, it was designed for business use and gives you some brilliant functionality straight out of the tin. The book is well worth a read.