I have been hearing this phrase for a long time. I have read a few articles, but I am still not able to understand what it actually means. I always see it attached to some framework name, but I want to understand what it means and why it came about. Can anyone help me here?
Framework agnostic in general means exactly that, i.e. agnostic of, or independent of, any particular framework.
Since you have not mentioned the context here, I assume that you want to know about it from the point of view of JavaScript and front-end web development.
To give you an example, companies or product teams often deal with the following competing priorities:
Often in large companies there are multiple teams working on various modules or sections of the front end of their product. These teams are often autonomous and self-steering, meaning they choose their own JavaScript framework for front-end development.
However, as a company you want to make sure the user experience is the same across all the different modules and sections of your product, irrespective of what each team chooses to work with.
To balance these two competing priorities, the concept of framework-agnostic web component libraries comes in handy. As a company you encourage your teams to develop a library of web components that are independent of any framework such as Vue, Angular, or React. The teams can then use these components interchangeably, no matter which framework they build with.
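As a minimal sketch of what such a component can look like, here is a button built directly on the browser's standard Custom Elements API, with no framework involved (the element name, styling, and event are made up for illustration):

```javascript
// A self-contained button built on the native Custom Elements API.
// Because it relies only on web standards, it can be dropped into a
// React, Vue, or Angular app (or plain HTML) unchanged.
class BrandButton extends HTMLElement {
  connectedCallback() {
    // Shadow DOM keeps the component's styles from leaking in or out.
    const shadow = this.attachShadow({ mode: 'open' });
    shadow.innerHTML = `
      <style>
        button { padding: 8px 16px; border-radius: 4px; cursor: pointer; }
      </style>
      <button><slot></slot></button>
    `;
    // Expose behaviour through ordinary DOM events, which every framework
    // already knows how to listen to.
    shadow.querySelector('button').addEventListener('click', () => {
      this.dispatchEvent(new CustomEvent('brand-click', { bubbles: true }));
    });
  }
}

customElements.define('brand-button', BrandButton);
// Usage in any framework, or in plain HTML: <brand-button>Save</brand-button>
```

Because the component speaks plain DOM (tags, slots, events), every framework can consume it; the micro-frontends link below discusses exactly this "the DOM is the API" idea.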
Here are some links that should help
https://dev.to/stefannieuwenhuis/3-reasons-why-i-went-framework-agnostic-and-why-you-should-do-that-too-2o37
https://micro-frontends.org/#the-dom-is-the-api
https://j11y.io/javascript/a-framework-agnostic-model/
"xyz is framework agnostic" simply means that xyz does not depend on any framework. It is a great and much-needed idea: build libraries/components that do not depend on any specific framework for their implementation, and instead develop something generic that caters to everyone.
Here is a brilliant article that explains the core idea behind it.
https://micro-frontends.org/#the-dom-is-the-api
Update: This question was inspired by my larger quest for mapping ontologically the whole software systems architecture enchilada. I've written a blog post about it, and hopefully it will help clarify what I'm after.
Many, many, many frameworks and stacks that are event-driven have too much variation for my little head to get around. Is there somewhere some resources that define the outline of a reasonable Application Event Model: what events there are, and what triggers are most common?
I've got my own framework with a plugin and event-driven architecture, but I want to open-source it, and as such would like to make it closer to some common ground as not to alienate people.
So to clarify: this is for an application, meaning setting up the environment, the dependencies, and the data sources (like databases); and, this being an MVC framework, setting up the model and the view, launching controllers/actions, and, in the GUI, the various stages of the interface (header, content, columns, etc.).
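To make those stages concrete, here is a rough sketch of the kind of lifecycle events I have in mind, written in JavaScript purely for illustration (the event names are my own working vocabulary, not taken from any existing framework):

```javascript
const { EventEmitter } = require('events'); // Node's built-in emitter

// A straw-man application lifecycle: plugins subscribe to whichever
// stages they care about.
const app = new EventEmitter();

app.on('env.ready', (config) => { /* environment and dependencies set up */ });
app.on('datasource.connected', (db) => { /* databases are available */ });
app.on('model.loaded', (models) => { /* MVC model layer is initialized */ });
app.on('controller.dispatch', (route) => { /* an action is about to run */ });
app.on('view.render', (stage) => { /* header, content, columns, ... */ });

// The bootstrap then fires the events in order:
app.emit('env.ready', { debug: true });
app.emit('datasource.connected', {/* db handle */});
app.emit('model.loaded', {});
app.emit('controller.dispatch', { path: '/' });
app.emit('view.render', 'header');
```

What I am looking for is whether there is any common ground on what this sequence and these event names should be.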
Ideas? Thoughts? Pointers? (And I've kept it language- and platform-neutral at this point.)
I read your blog entry, which by the way I found an extremely interesting read, but this question does not seem to reflect the broadness of the issue you are presenting there.
What you are after is very abstract and theoretical. What I mean to say is that if you tie any of those ideas to actual technology, you will find yourself 'stuck' with it. This is why many of us are reluctant to use any framework, especially the 'relabeled' products suddenly claiming to conform to the trend. We choose mainly on the basis of what appears to be needed to reach a predetermined result.
Frameworks (or tools in general) that target the application architecture domain distinguish themselves primarily by the amount of responsibility they are designed to take on. Spring, for example, only deals with the concept of decoupling and is therefore easily adopted and usable in many situations. The quality of any framework is expressed in terms of how well its designers were able to keep their product within the boundaries of that responsibility. Some front-to-end products do exactly the opposite, code generators being among the 'worst' of them.
To answer your question at the top of this page, I do not think there is a framework that does what you want at this time and I do not think there is a single model of how applications (should) work. Keep in mind though that the application architecture domain deals with technology more than it does with concepts. In other words: If it works and meets the requirements, then you're pretty much done.
That said, you might find something of value in agent-based systems.
Heh. Most developers pick the major framework they like the tools for and stick with it. That's usually the winning strategy. I sympathize with your desire not to marry a single vendor.
Keep in mind however, that in developing your own framework, you're going to end up tied to a single vendor anyway. :-)
"Is there somewhere some resources that define the outline of a reasonable Application Event Model, what events there are, and what triggers are most common?"
I don't think so.
From what I see, there are two kinds of models out there: those with a real framework with which you can make a working data entry dialog, and abstract meta-meta-models that are optimized for modeling themselves.
Try surveying a few current frameworks that have good documentation online and cross-reference the major terminology in a spreadsheet. It's an interesting exercise.
I'd have a look at Spring for Java, and the XT Framework Spring module (http://springmodules.dev.java.net/docs/reference/0.9/html/xt.html), which apparently supports event-driven architecture, as starting points. Spring has an MVC framework (including convention-based routing to controllers), database configuration (particularly for Hibernate), plus full dependency injection support. There's also a mechanism in Spring for modularising your web apps, called Spring Slices. And it can be integrated with Jersey for building RESTful apps.
(Unfortunately, I tried to provide links to everything, but this place only lets new users post a single link, so you'll have to do some googling. :) )
The organization that I currently work for seems to be heading in the direction of dictating to software developers which tools, languages, frameworks, etc. must be used. However, nobody has convinced me that this is a good thing. The main argument I have heard is that it will make training easier. But, after developing software for over 10 years, I've never relied on training to learn how to use an IDE, programming language, or anything else; so I just can't relate.
With the rapid speed at which technology evolves, and the s-l-o-w-n-e-s-s at which I know the standards will adapt, I am concerned that my customers will have requirements that I won't be able to easily implement or won't be able to implement as efficiently as I should. For example, if there is a UI requirement for an auto-complete feature in a web app, and no API has been approved for this yet, I would need to implement auto-complete myself as opposed to using one of the many APIs that provide it out of the box.
A more radical example is if my customers wanted to have Google Wave features. In that case I would want the flexibility of configuring my development environment (including the IDE) and selecting appropriate frameworks (ex: GWT) to use.
Please provide feedback on whether or not you think that software developer tools, languages, etc should be standardized and a few points to support your argument.
There is a lot of benefit to standardization. My organization has fairly set standards on what technology we will use. We realize strong benefits in the following areas:
Hiring. It is easy to describe what technologies we are looking for and make sure our recruiters are looking for the right people.
License/Software costs. I can buy enterprise licenses easily. It gives me the opportunity to keep costs down by letting me spend more with a smaller number of vendors and thus get more leverage.
Consistency of delivery. Our teams have a very good idea of what projects will take to build, rollout and maintain because they have done it with success before (and they know the pitfalls too).
Agility. I can have one team take over for another or one individual take over for another more easily because of standardization.
Quality. We have peer reviews across teams as well as QA across teams.
Without a consistent use of a technology stack, tools, languages and frameworks, these types of benefits would be more difficult to realize. I am not closed off to new technologies, but there has to be a concrete reason beyond "what if I want to ..."
A major issue with standardization is that once standards are out there, they get stamped in concrete and are difficult to change. This is why our corporate IT environment is stuck on IE 6, and the best change control system we have access to is CVS. Given this situation, some developers break the rules, and some find jobs at more innovative companies.
You have a mixed bag here.
I wouldn't standardize on IDEs, because every developer works differently. Those who are insanely proficient in emacs may see their performance suffer if forced to use Visual Studio. I optimize my Visual Studio experience with a 30" monitor and find it incredibly productive.
However, standardizing on some tools, such as SCons or make or something to build products is perfectly reasonable.
Banning some libraries and having a process where new libraries are either approved or not is also very reasonable. I know lots of companies that ban Boost, or jQuery, or ban open-source libraries in general, etc. And they had good reasons for doing it. I know I got fairly upset when an intern incorporated some random "security" library he found on the internet without running it by anyone.
In the end every company is different. You have to be standardized enough to avoid serious complications and issues as people come and go, or as new products are formed and organizational structures change. But you have to be flexible enough to avoid re-inventing every wheel you need.
The important thing is to have clear reasons for adopting a certain tool or banning some other tool or library. You can't just have management dictate that thou shalt use this and not that without consulting the engineering team and making the decision for good reasons. And once decisions are made those reasons should be written down and clearly communicated.
And also, if, in the end, your favorite tool or library isn't adopted, please don't whine about it. Be adaptable and do your job, or find a new one that makes you happier.
I once worked for a manager who felt the need to innovate at every level of his software development operation. Every development tool had to be cutting edge (preferably in beta). Many of the tools he asked us to use didn't have good documentation, and training was not available. Ultimately, most of the technology we tried simply didn't work. We wasted a lot of time churning through new technologies, only to dump them when it became clear we couldn't make progress.
I tried to make the case that innovation is perfect in the area where your value proposition lies. Innovation can also be used judiciously where standard techniques fail. But for most mundane tasks, using tried-and-true tools and methods should be the default. Less risk, less cost, less management attention needed. So you can focus time and energy on the areas where innovation has the most benefit.
So I think standardization has an important role. But blindly saying everything must be standard is just as sure to fail as my manager who thought everything must be innovative.
The number one argument in favor of standardization is that it maximizes the ability of the organization as a whole to use a common body of knowledge. Don't know how custom web controls are built in ASP.NET/C#? Ask Bill down the hall who has the knowledge. If you use different tools, such organizational wisdom is cut off at the knees. While it is not good to be restricted to a least common denominator (and hopefully your management will realize this) you should not overlook the benefits of shared experience!
UPDATE: I do not agree that innovation and standardization are polar opposites. Indeed, would we have nearly the level of web innovation if we still had the mishmash of networking standards characteristic of the 1980s? No we would not. Of course, we might have more innovation on new low-level networking protocols but is that really worth it? In its place, we've had an explosion of creativity within the bounds of TCP/IP and the Web standards (http, html, etc.)
The trick is knowing how to standardize without using it as an argument for closing down all new exploration. For example, we use only ASP.NET/C#/SQL Server in my company but I'm perfectly open to the use of new tools within this framework (we recently adopted the DevExpress reporting package, for example, supplanting the earlier standard).
Standardization is a must for a productive development team. However, that doesn't mean you can't revisit the standards from time to time to adjust them to new technologies and trends.
Whether you develop operations software for internal clients, or products for external clients, there is no compelling reason not to standardize. You certainly did not give one.
Had you seen how companies struggle to hold together heterogeneous products that have been maintained for 10 years or more, and are now a conglomerate of various technologies that developers at some point thought made sense, you would not have asked this question.
Off the top of my head, I could name at least 2 well-known software companies that will be driven out of business because their cost of maintenance has become so high that they can no longer compete (but I won't).
I think the misconception here is that suppressing individualism would suppress innovation. That is simply not true. It is poor technical leadership that suppresses innovation.
One unpleasant consequence of standardization is that it tends to stifle innovation.
Innovation is scary. It involves cost and risk.
Standardization is not scary. It reduces cost and risk in the short term. Until your competitors have created a game-changing innovation. Then standardization is very costly.
It depends on the organization I think. One like Microsoft, yes, there should be a bit of a standard. A small business with one IT department, no. A larger business with several offices around the world ... maybe.
it all depends :-P
Assuming the organization has a broad suite of enterprise applications to manage, I'd say no for the following reasons, though I may be taking the message of everything being the same a bit too literally:
Compromise on using best-of-breed systems, e.g. if all the databases are to be MS SQL Server, then any Oracle DB solution is thrown out. The same applies to IDEs: everyone has to use the same one, whether they are doing data warehouse report development, web applications, console applications, or WinForms. I'm thinking of systems like ERP, CRM, SCM, CMS, SSO, and various other TLAs, FLAs, and SLAs. (LA = letter acronyms, a decoding hint if you need it.)
Upgrading by committee is another interesting issue. If each team can choose its own tools, one person can decide when to upgrade things, e.g. to start using Visual Studio 2008 instead of Visual Studio 2005. Under a standard, you have to determine at what threshold it is worth upgrading everyone simultaneously, which may be a big headache if there are more than a few developers. For example, over the past 10 years, when would the IDE changes, framework changes, etc. have happened?
Exceptions to the standards. Could a contractor bring in something not used in the organization if they believe it helps them build better software, e.g. ReSharper or other add-ons that some contractors consider very worthwhile but that the organization doesn't want to spend the money on? What about legacy systems that may make the standard a bit unwieldy, e.g. this was built in ASP.NET 1.1, so everyone has to have VS 2003 installed even if most will never use it?
Just my thoughts on this.
There are several good reasons to standardize.
First, it allows the enterprise better organizational flexibility if everybody is more or less familiar with the same things. It also allows people to help each other better: I can't help with problems in the ASP.NET stuff, and there aren't all that many people who can help me on the C++ side.
Second, it reduces support problems and expenses. Oracle and SQL Server are both decent products, but using both for similar functions is only going to cause problems. Not to mention that I've been in shops using several widely different platforms to do similar things, and it wasn't fun.
Third, there are some things that just have to be standardized. We couldn't operate with half of us on VS 2005 and half on VS 2008, since we keep project files under source control. We had to pick a time and convert over.
Fourth, in some businesses, it simplifies the regulatory problems. I don't know what business you're in. I work at a place where we can get away with making mistakes right now, but I've also contracted at a bank and a utility, where it's necessary to be able to show auditors that everything is going in a standard way.
Fifth, it can simplify procurement, if you're dealing with software that costs money.
This doesn't particularly limit us, since if there's something we need that isn't standardized on we just go ahead and get it or do it.
If you want to make a business case against standardization, you'll need to have a business-related argument. Your argument seems to be that you won't be able to implement features the user wants, and that is a consideration. Got another argument?
There's nothing wrong with standardizing on an IDE that is rich enough to be configured for individual developers.
However, do make sure that you don't prevent individual developers from using additional tools, as long as the tools are licensed and the use of a tool by one developer doesn't require all other developers to use it.
For instance, I happen to use NORMA to help me design databases. The output is SQL Server DDL (or anything else I want). I can make the DDL part of the project without making my NORMA source part of it. Later developers do not need to use NORMA to work on the project.
On the other hand, if I decided to use the Configuration Section Designer to create configuration sections, then future developers would also have to use it. A decision would need to be made about whether to use that tool.
The company I work for uses C#, ASP.NET, and JavaScript, and generates HTML. The advantages, over and above those mentioned above, are a perception of improved velocity for maintenance and adaptive changes. The disadvantages include some boredom for people who are technically savvy (geeky) and prefer to mix and match languages, depending on what they fancy is better suited, or for 'performance reasons'.
Technical and personal supervision is always good to have when you are developing as fast as you can to meet tight deadlines while competing in a highly saturated web development market.
I'm a new software architect/lead, coming up with the software design for a team of software developers. I'm producing the requirements spec, interface header files, Visio software design docs, build plan, etc.
My question is: what does the rest of the team do during this period? I'm certainly engaging them in the design, but we don't need the whole team actively working on what I'm doing all the time.
Are there any good books for a new software architect?
Generally the various stages overlap, so there will be some coding during design, etc. There are a lot of things to do besides that: reviewing unfamiliar technology that is going to be used, setting up the source control system, reviewing business requirements, and reviewing your documents to make sure they make sense and are clear. There is a lot of other work to be done besides programming.
What a software team does while the lead does the design varies greatly from company to company. At my company we try to work on the design while the developers are finalizing other projects or fixing bugs.
Another approach that I've taken when starting a whole new project is to get the developers to work on the design as well: people with a good understanding of the requirements can help you design smaller parts of the system and write the specs for them. Others can work on mockups and frameworks. This worked rather well for the small software team I led in a previous job (4 developers in total).
I also found it useful to have other team members research parts I'm unsure of (or even validate that things I think should work will indeed work), such as the following (a small sketch of the last item appears after the list):
Investigating whether an external API provides the features we need
Writing a small proof of concept or technology demonstrator
Creating an API mockup (header file, interface, or REST endpoint) to investigate whether the API looks useful.
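As a sketch of that last kind of mockup (the endpoint, fields, and port are made up for illustration), something as small as a throwaway Express stub is often enough for the team to react to:

```javascript
// A disposable stub that fakes the API under consideration, so the team
// can judge whether its shape is workable before anything real is built.
const express = require('express');
const app = express();

// Hypothetical endpoint: does returning a resource with an embedded
// 'attributes' map feel right to its consumers?
app.get('/api/resources/:id', (req, res) => {
  res.json({
    id: req.params.id,
    name: 'sample resource', // canned data, no real backend behind it
    attributes: { X: true, Y: false },
  });
});

app.listen(3000, () => console.log('mock API on http://localhost:3000'));
```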
As others have said, you typically want a ramp-up period during the first part of the project and through the first iteration. You're planning on building this iteratively, aren't you? Start with a core team (no more than 3-4 people, since you're going to need to communicate heavily with each other) to help you explore the requirements, get a basic data model in place, identify and set up any frameworks, and identify and set up build and test tools. Some coding activities typically take place in the design phase: UI mockups, and run-ahead prototypes of technically sensitive areas (whatever risks you have should be mitigated by exploratory coding, be they new technologies, undocumented interfaces to integrated systems, or unstable requirements).
But coders in the design phase should help with the design, in order to get their buy-in and to help train up the rest of the team during the first iterations. Your role during this is to ensure that the major nonfunctional requirements are known, prioritized, met by the design, and testable. You should also collaborate with the project lead, or whoever else is responsible for staffing and financing, to sketch out the iterations and the staffing levels needed. Ensure the solution can be built iteratively, and aim at implementing only a basic structure during the first iteration, both to build confidence and to eliminate risks. (Sometimes you can push major risks to the second iteration, and focus the first on confidence and team building.)
And of course, be sure you are not designing every detail. You should be able to use every design artifact in the next iteration (and elaborate them later as needed). Since design decisions are expensive to change, try to postpone them. However, some influence the entire solution (for instance, the data model, or your approach to security) and absolutely must be at least outlined up front. This isn't waterfall. This is just not closing your eyes and hoping a viable architecture will emerge by magic.
But design proceeds throughout the iterations. It's just that you do less of it as you go along, and with lesser impact on the solution (unless you're unlucky... and then things get expensive).
Stop doing the useless things you do and just start coding with them! ;)
If there is no overlap with another ongoing project, getting them involved as you're doing is great; maybe push it a little further by having them prototype and present the pluses and minuses of alternative technologies (APIs, frameworks, libraries, etc.) that your project could use.
As a new software architect, I can recommend some books that helped me understand the role of the architect (though of course not master it):
Fundamentals of Software Architecture: An Engineering Approach:
This book gives a good modern overview of software architecture and its many aspects; a good place to start if you are a beginner or want to broaden your knowledge.
Software Architecture in Practice:
Explains what software architecture is, why it's important, and how to design, instantiate, analyze, evolve, and manage it in disciplined and effective ways.
Software Architect's Handbook:
This book takes you through all the important concepts, right from design principles to different considerations at various stages of your career in software architecture. It begins by covering the fundamentals, benefits, and purpose of software architecture.
Clean Architecture: A Craftsman's Guide to Software Structure and Design:
Learn what software architects need to achieve and how to achieve it, master essential software design principles and see how designs and architectures go wrong.
Software Architecture: The Hard Parts:
An advanced architecture book; it teaches you how to think critically about the trade-offs involved in distributed architectures.
Usually there's another project they can work on, but...
I have my team review the project specs/requirements and put together a basic/preliminary structure to get them already thinking through the application and working out specific questions.
When we convene at the table to discuss the plan, they already have an idea of what the project is and requires, and in some cases they present questions I may have missed or overlooked.
Although it's too late now, a good way to approach it is to move the architect over before his current project has ended. Start freeing him up at around 25%, then work your way up to 75-100% on the new project a month or two before it starts (maybe more, depending on how much analysis and customer interaction there is).
On a trivial project (let's say 2 man-years) it might not be necessary, but anything bigger than that can end up in chaos if somebody doesn't at least get the analysis right before everybody jumps aboard.
If your team does not have any other projects to work on, ask the experienced programmers on your team to come up with a prototype so that you can create a requirements doc according to the needs of the client.
Also, programmers who are new to the technologies the team uses could spend this time familiarizing themselves with the technologies on which your team is going to develop the project.
architect != designer
Chances are that all of your developers can help with the design; let them. Architects don't have to be "lone wolves" and do everything themselves. You lay out the guidelines and the principles and the scaffolding, rough in the wiring, and let your developers flesh out the details - whether it is drawing Visio diagrams or building prototypes to mitigate unknowns/risks.
Migrate towards Agile/XP and away from waterfall methods, and you'll find the team a lot more help.
When making the general design, it's very handy to have programmers create proof-of-concepts. Do that especially with parts of the system that could end up being show stoppers if they don't work in the way you plan to do them, so you can think of alternatives, and adjust the design.
That's going to help you to make the right design-decisions before moving entirely into a certain direction.
Just doing a design and then moving on to start coding is a sure way to mess up a project. You won't realize that your design is not feasible (or just plain sucks) until you're halfway through coding, and by then it's too late to make radical changes.
You'll waste time mitigating nonexistent problems during the design, and you'll run into unforeseen problems during implementation.
I need to create an internal website and can't figure out whether we should write our own or use an existing framework.
Most of the website will essentially be a front end to a database. We need to have a number of people enter data into forms. We then want to be able to show different views of all this data -- including running small queries (e.g. how many resources do we have with attribute 'X'). As is usually the case with this, we will want to tweak the UI on a regular basis.
The actual data design is not a simple 1:1 mapping of resource to entry. For example, we might track several attributes for one item as the "base set of data" for that item. Then we could have several additional sets of data.
Imagine a recipe application. You might have a recipe for a starter. This could then be referenced by several other recipes that need that same information.
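To sketch the shape I mean in plain data (the names and fields are invented), the starter would be stored once and referenced by id, not copied into each recipe:

```javascript
// The base entry holding the shared "starter" data...
const starter = {
  id: 101,
  name: 'Sourdough starter',
  attributes: { flour: 'rye', hydration: '100%' },
};

// ...and several recipes that reference it by id rather than duplicating it,
// so a change to the starter is picked up by every recipe that uses it.
const recipes = [
  { id: 201, name: 'Country loaf', starterId: 101, extraSets: ['baking notes'] },
  { id: 202, name: 'Pancakes', starterId: 101, extraSets: ['toppings'] },
];
```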
I feel like this is best suited to a general framework (Ruby on Rails, Django, etc.), but I wonder whether it might be a good fit for a "traditional" CMS platform like Drupal instead. I specifically mention Drupal since the people who would develop this have the most knowledge of PHP and MySQL.
I usually lean towards wanting to use an existing platform, but am interested in other people's thoughts. To give you an idea of scope, I would imagine if we wrote this from scratch we are probably talking about 3-5 weeks of development.
Would you recommend writing our own, or using an existing framework? If you would suggest using something that exists what would you recommend?
Would you consider this to be best suited for a straight framework or a straight CMS?
Thanks!
It's possible that Drupal will be a good solution for you, though you'll probably need a few key additional modules like the "Content Creation Kit" (CCK) and "Views".
Unlike other web CMS systems (WordPress, Exponent, phpNuke), Drupal treats your entries as a "pool" of content, from which you pull various subsets for different areas of your site.
There is a lot of documentation for Drupal (almost too much); the biggest problem is finding the piece that's relevant to what you're trying to achieve. Dropping into one of the interactive IRC channels can be a good idea, as the community is quite helpful and is almost always willing to give you a pointer in the right direction.
The power, flexibility, and capability of Drupal is both its biggest strength and its biggest weakness: I know it took me a bit of effort to get my head around key concepts, and I'm far from being a Drupal expert.
One last comment: Having written my own CMS from scratch, which I abandoned in favour of Drupal, I'd suggest your 3-5 week estimate is likely on the light side.
Stay away from Drupal for any site that requires customized functionality. I recently used Drupal for a website at work, and it was VERY difficult to figure out how to get it to do what I wanted it to do. There is a lot of documentation out there, but all of it is unhelpful: it answers very specific questions about specific issues but does not provide any context as to how you would approach building the site as a whole. If you're a programmer, using a more general framework will probably work better, as CMSes are designed for a specific kind of site, and if you want your site to have non-standard functionality you are going to be fighting the system instead of working with it. If your developers are most experienced in PHP, try one of the PHP frameworks that mimics the architecture of Rails, e.g. CakePHP or CodeIgniter.
CMSes usually make sense when you have a broad and potentially expanding array of different content types and modes you need to handle; Drupal has literally dozens. Given that you mentioned RoR, it sounds like what you need is more of an MVC-style framework, maybe similar to the sort of thing Stack Overflow was built with. Is .NET an issue for you?
If you are really limited to 3-5 weeks, however, I think a Rails-style strategy makes sense, so go with RoR or CodeIgniter.
If Drupal can do what you need easily I would say go with Drupal. I don't know much about Drupal though.
Otherwise, what you describe sounds like a data-driven web app, or more like a reporting app. It sounds like you might have some very specific needs, or that users might have very specific needs in the future. That is something hard to get from premade software, since you have no idea what users are going to request. Since I'm a programmer, I would probably want to build it myself.
Funny you should ask... I just came across this in SD Times' Linkpalooza this afternoon:
Ten free powerful content management systems…
There are at least 4 more mentioned in the comments to this post.
It seems to make little sense to develop a new one with so many from which to choose!
BTW, this is neither a recommendation nor endorsement of any particular CMS.
Treat Drupal as a framework. Core modules + CCK + Views is a good start to build on.
If you're doing something that you might want to expose to other applications, consider the Services module. A lot of interesting things have been done with Flex frontends connected to Drupal running Services with AMFPHP.