Evaluating the impact of upgrading SAPUI5 libraries - sapui5

I am an SAP Fiori developer and have spent some days reading about the best way to plan an upgrade with the maximum level of assurance, to avoid unexpected errors.
I know that to evaluate the impact of changes on our applications, I have to read “What's New” and do a careful read and analysis of the changelog.
My idea is to create a manual, step-by-step procedure, so that if we follow all the steps, the impact is evaluated with a very high percentage of coverage. I have assumed that there isn't an automated process for evaluating this; is that true?
We have a table listing which controls and components appear in every application and view/controller, for evaluating the “direct” impact of upgrades.
My doubt is how I can be sure whether a fix could cause wrong behavior. I will explain it with an example: “1.71.21 - [FIX] format/NumberFormat: parse special values ‘00000’”. Analyzing it, I know that sap.ui.comp.smarttable.SmartTable uses NumberFormat to display the number of records in the header title, but from the API alone it is impossible to know that. This is only one example; reading the changelog, a lot of doubts like this appear, and some are even harder to associate with concrete controls.
To give you more info, I have thought of using the CDN with the new version, but this would put us in a scenario where we would have to test everything manually, looking for errors, warnings, and wrong behavior.
How do you analyze an upgrade before doing it, and how do you avoid human testing of everything, with the risk of forgetting things? Are you using some tool?
Thanks in advance
Best regards

Upgrading the library carries a risk of introducing defects comparable to that of regular development. Therefore, conventional quality assurance guidelines apply to an upgrade as well: set up and run your tests, manually or automatically.
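One way to automate part of that testing is a browser-level smoke test that you run once against the current SAPUI5 version and once against the upgrade candidate (for example bootstrapped from the CDN, as the question suggests). Here is a minimal sketch using Selenium WebDriver in Java; the URL and element ID are placeholders for your own app, not real identifiers:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

public class SmartTableSmokeTest {

    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            // Placeholder URL: the same app served once against the current
            // SAPUI5 version and once bootstrapped against the upgrade candidate.
            driver.get("https://myhost.example.com/app/index.html");

            // Placeholder element ID: the SmartTable header that shows the record
            // count, exactly the kind of spot a NumberFormat [FIX] can affect.
            // A real test would add an explicit wait for the UI to finish rendering.
            WebElement header = driver.findElement(By.id("smartTable-header"));
            if (!header.getText().matches(".*\\(\\d[\\d,.]*\\)")) {
                throw new AssertionError("Record count missing or malformed: " + header.getText());
            }
            System.out.println("Header looks sane: " + header.getText());
        } finally {
            driver.quit();
        }
    }
}
```

Running the same checks against both versions and diffing the failures narrows the manual testing down to the spots where behavior actually changed.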

Related

Is it better to have 2 simple versions of a piece of software or 1 complicated version?

This is a simple question. Is it better to have 2 versions of code or one? One would be an engineering version that is not V&V tested, with lots of functionality; the other a stripped-down customer version that is highly regulated and highly tested.
I have a simple problem: my code base is ballooning as new engineering features are demanded. However, the core customer code is extremely simple. The engineering portions sometimes cause unexpected bugs in the customer code. Also, every engineering bug has to be debugged and treated the same as a customer one, which leads to longer lead times for releases.
I see both good and bad in splitting. First, it would allow engineering functions to be quickly built and released with no need to V&V test the software. The second advantage would be fewer, higher-quality releases of customer-facing software. However, it would mean 2 (smaller) codebases to maintain.
What is the common solution to this problem? Is there a limit at which you decide it is time to break into two versions? Or should I just learn even more in-depth organizational techniques? I already try my best to follow the best organizational tips for the software and follow OOP best practices.
As of now, I would say my code base is about 50% customer software and 50% engineering functionality. However, I just got a new (large) engineering/manufacturing project to add to the software.
Any experience would be appreciated.
TL;DR: Split it.
You can use either of the models that you mention. It's about how you see it growing in the future and how it is presently set up.
If you decide to go the one-large-code-base route, I would suggest that you use the MVC architecture, as it effectively sorts one large code base into smaller, more manageable parts. This adds a layer of abstraction for the engineers, where they can easily narrow an issue down to its own team. It also means one team does not have to worry about how another implemented a feature; i.e., the graphics team does not need to worry about how the data is stored on your servers.
If you decide to go with multiple code bases, you add an additional layer of "difficulty". You now have to maintain "twice" as much code. You will also need to make sure that the two sources play nicely with each other. The benefit of doing it this way is that you completely isolate the engineering version from your production version.
In your case, since you mention that the split is currently 50/50 between the 2 versions, I would recommend that you split the customer and the engineering versions. This isolates the engineering code, which is not tested, from the customer code, which is properly tested. It does give you "more" code to control, but this issue can easily be mitigated by having 2 teams, each responsible for its own version.

Should I use multiple knowledge bases?

We have a system that will have tens of thousands of units. Each unit will have 5-10 meters, and each meter will have a value associated with it. The values of these meters change, and we need our rules engine to be able to respond to these changes in real time.
We will have rules of the sort "if the first meter from unit #1 is greater than 10 and the second meter from unit #1 is less than 30 then ...", although the rules may get much longer than this. The rules for each unit will be completely independent, so there will be no rules that condition on the values of two different meters from two different units.
We'll have about 30 rules that are the same for every unit, and then each unit will have about 5-15 custom rules. These rules will need to be added dynamically while the rules engine is running. It's likely that a unit will add 5-10 rules right when it signs up and then add or remove a rule roughly once a week from that point on.
We decided to use Drools for this, and I'm trying to figure out how best to implement it. I'm really new to Drools, so I'm kind of confused. Would it make sense for each unit to have its own knowledge base? If so, is there any way to share the rules that are the same for each unit?
I'm worried we might not have enough memory to store all these rules, so I was thinking if we had a knowledge base for each unit, we could just serialize all the knowledge bases, put them in a database, and retrieve them when we need them. Would that be reasonable?
The other reason I was thinking of using a separate knowledge base for each unit is that, because each unit's rules are completely independent of every other unit's rules, there might be a performance hit from putting them all into the same knowledge base. Is this correct, or is the Rete algorithm smart enough to figure that out?
Also, is dynamically adding rules while the engine is running possible? Will all the rules have to be recompiled? How much time will it take and is this feasible if the engine still needs to be responding to meter changes in realtime?
Thanks guys.
Would it make sense for each unit to have its own knowledge base? If so, is there any way to share the rules that are the same for each unit?
On giving each unit its own knowledge base, see below. For organization purposes, you can think of putting the rules in separate packages. To share rules between packages, you can create the rules in the global area and import them. Though this will help share the common rules, it has some flaws (an alternative approach is sketched after this list):
When you have a large number of packages, you will need to import the rules manually the first time.
When you modify an imported rule in a specific package, it changes the global rule, so the change affects every other package that has the rule imported. In the Guvnor UI there is no way to visibly tell whether a rule is imported or specific to the package.
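The alternative: compile the ~30 shared rules once and reuse the compiled packages in every unit's knowledge base. A minimal sketch against the Drools 5 knowledge API (consistent with the Guvnor mention above); the .drl file names are placeholders:

```java
import java.util.Collection;

import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.definition.KnowledgePackage;
import org.drools.io.ResourceFactory;

public class UnitKnowledgeBases {

    // Compile the shared rules once; the compiled packages can then be added
    // to every unit's knowledge base without recompiling the DRL each time.
    private static final Collection<KnowledgePackage> SHARED_PACKAGES =
            compile("common-rules.drl"); // placeholder file name

    public static KnowledgeBase buildForUnit(String unitRulesFile) {
        KnowledgeBase kbase = KnowledgeBaseFactory.newKnowledgeBase();
        kbase.addKnowledgePackages(SHARED_PACKAGES);
        kbase.addKnowledgePackages(compile(unitRulesFile)); // the unit's 5-15 custom rules
        return kbase;
    }

    private static Collection<KnowledgePackage> compile(String drlFile) {
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        kbuilder.add(ResourceFactory.newClassPathResource(drlFile), ResourceType.DRL);
        if (kbuilder.hasErrors()) {
            throw new IllegalStateException(kbuilder.getErrors().toString());
        }
        return kbuilder.getKnowledgePackages();
    }
}
```

This avoids the manual import bookkeeping in Guvnor, at the cost of managing the DRL files and their packaging yourself.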
I'm worried we might not have enough memory to store all these rules, so I was thinking if we had a knowledge base for each unit, we could just serialize all the knowledge bases, put them in a database, and retrieve them when we need them. Would that be reasonable?
This is an option, but if the rules change you will need to recreate the affected knowledge bases.
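As a rough sketch of that round trip, assuming Drools 5, where a compiled KnowledgeBase is serializable: the byte array below is what you would store in, say, a BLOB column, and deserializing skips the expensive DRL compilation step.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

import org.drools.KnowledgeBase;

public class KnowledgeBaseStore {

    // Serialize a compiled knowledge base, e.g. for a BLOB column in a database.
    public static byte[] toBytes(KnowledgeBase kbase) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(kbase);
        }
        return bytes.toByteArray();
    }

    // Restore it later. This avoids recompiling the DRL, which is the slow
    // part, but any rule change still means rebuild-and-reserialize.
    public static KnowledgeBase fromBytes(byte[] data) throws Exception {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return (KnowledgeBase) in.readObject();
        }
    }
}
```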
The other reason I was thinking of using a separate knowledge base for each unit is because each unit's rules are completely independent from every other unit's rules, there might be a performance hit from putting them all into the same knowledge base. Is this correct or is the Rete algorithm smart enough to figure that out?
The book Drools JBoss Rules 5.0 Developer's Guide by Michal Bali says:
"The performance of the Rete algorithm is theoretically independent of the number of rules in the knowledge base."
Further:
"If you are curious and want to see some benchmarks, you can find them in the drools-examples module that is downloadable from the Drools web site. There are also some web sites that regularly publish benchmarks of various rule engines solving well-known mathematical problems (usually the 'Miss Manners' test and the 'Waltz'), for example, http://illation.com.au/benchmarks/. Performance of Drools is comparable to other open source or even commercial engines."
Also, is dynamically adding rules while the engine is running possible? Will all the rules have to be recompiled? How much time will it take and is this feasible if the engine still needs to be responding to meter changes in realtime?
No concrete idea here, but I am sure the new rules need to be compiled to be usable. The previously created kbase can still be used while the new packages are being compiled; they are independent.
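To make that concrete, here is a sketch of adding a rule to a live knowledge base with the Drools 5 API; the DRL string would come from the unit signing up, and compiling it in a separate KnowledgeBuilder means the existing kbase keeps serving sessions in the meantime:

```java
import org.drools.KnowledgeBase;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;

public class RuntimeRuleLoader {

    // Compile a new DRL string and attach it to a live knowledge base. Only
    // the new DRL is compiled; the rules already in the kbase are untouched.
    public static void addRule(KnowledgeBase kbase, String drl) {
        KnowledgeBuilder kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder();
        kbuilder.add(ResourceFactory.newByteArrayResource(drl.getBytes()), ResourceType.DRL);
        if (kbuilder.hasErrors()) {
            throw new IllegalStateException(kbuilder.getErrors().toString());
        }
        kbase.addKnowledgePackages(kbuilder.getKnowledgePackages());
    }
}
```

New sessions created from the kbase will use the added rules immediately; whether the brief pause while the package is attached is acceptable under a real-time load is something to measure rather than assume.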

How do you evaluate a framework, library, or tool before adding it to your project?

There are so many cool ideas out there (ninject, automapper, specflow, etc.) that look like they would help, but I don't want to add something, tell others about it, and try using it, just for it to end up on the growing heap of ideas that didn't quite work out. How can I determine whether the promised benefits will happen and that it won't end up as something to be ignored or worked around?
Have a problem
Identify the cost of having the problem, or the value to solving it
Prioritize it against other problems
When it's the top priority, look for a solution that solves the problem at a proportional cost
Do you have the problem that ninject solves? Is it an important problem to solve? Is it the most important? What value will you get from solving it?
I don't think that you can tell whether any framework will deliver on your expectations until you try it, and try it in anger and in context. This is usually time-consuming, and inevitably you'll have a few misses before you get any hits. Don't commit yourself on the basis of working through a simple sample from the author's website or how-to files; these will always work and may impress, but until you try to use the framework in the context of your billion-user, multilingual, real-time, on- and offline application, you're not going to find its shortcomings.

Auto-generation of code in order to mass produce. Is this sensible?

Our company plans to auto-generate our projects from the domain layer up to the presentation layer so that we can mass-produce software. The idea is that we can produce a project within 24 hours. I think this is possible, but that's not my concern.
What are the ramifications of such a plan? I just think that the quality of software produced from such a grandiose idea would be less than good. First, clients have varying requirements. Even assuming we can standardize what's common among them, there will still be requirements that go beyond our original template.
Second, how can such software be reliable if it is not fully tested? Can a 24-hour period cover full unit/integration/other types of testing?
In the end, it appears we won't be able to hit the 24-hour target, thereby defeating our original purpose.
I just believe it's better to build quality software than to mass-produce it. How would I tell my boss that their idea is wrong?
Sorry, but I don't think this is sensible.
In order to build a system that can auto-generate any kind of software fulfilling any kind of requirement, you will more or less have to implement more than all the software you plan to generate.
Auto-generated code is great when you have repetitive tasks, information, or components that are similar enough to let you make a one-time effort to generate all the repetitions.
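As an illustration of that narrow, legitimate case, a minimal sketch that stamps out repetitive data classes from a field list; the DtoGenerator name and the example fields are invented:

```java
import java.util.List;

public class DtoGenerator {

    // Generate a trivial repetitive Java class from a list of field names.
    // This is the narrow use of code generation that pays off: many
    // near-identical artifacts from one template, not whole projects.
    public static String generate(String className, List<String> fields) {
        StringBuilder src = new StringBuilder("public class " + className + " {\n");
        for (String field : fields) {
            String cap = field.substring(0, 1).toUpperCase() + field.substring(1);
            src.append("    private String ").append(field).append(";\n\n");
            src.append("    public String get").append(cap).append("() { return ")
               .append(field).append("; }\n");
            src.append("    public void set").append(cap).append("(String v) { ")
               .append(field).append(" = v; }\n\n");
        }
        return src.append("}\n").toString();
    }

    public static void main(String[] args) {
        System.out.println(generate("Customer", List.of("name", "email")));
    }
}
```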
But trying to write a system to produce any kind of project is not feasible. In order to support a wide enough range of projects, your system will have to offer a very wide set of capabilities for describing project configuration and behavior, and the time required to describe each project's behavior will not necessarily be shorter than the time it would have taken to implement the project in the first place. You will just end up developing a development environment, and implementing projects in your own language.
So instead, why not just take an existing development environment that is already available, like Visual Studio?
Maybe you should have a big library repository and put everything that could be reusable into it. That way, the software you have to write would be very small. There could be documentation templates associated with the library, and you would just have to copy and paste them.
However, it takes time to build such a library.
Or do it this way.

How do I plan an enterprise level web application?

I'm at a point in my freelance career where I've developed several web applications for small to medium-sized businesses that support things such as project management, booking/reservations, and email management.
I like the work, but find that eventually my applications get to a point where the overhead for maintenance is very high. I look back at code I wrote 6 months ago and find I have to spend a while just relearning how I originally coded it before I can make a fix or add a feature. I do try to use frameworks (I've used Zend Framework before, and am considering Django for my next project).
What techniques or strategies do you use to plan out an application that is capable of handling a lot of users without breaking, while still keeping the code clean enough to maintain easily?
If anyone has any books or articles they could recommend, that would be greatly appreciated as well.
Although there are certainly good articles on that topic, none of them is a substitute for real-world experience.
Maintainability is not something you can plan straight ahead, except on very small projects. It is something you need to take care of during the whole project. In fact, creating loads of classes and infrastructure code in advance can produce code that is even harder to understand than naive spaghetti code.
So my advice is to clean up your existing projects by continuously refactoring them. Look at the parts that were a pain to change, and strive for simpler solutions that are easier to understand and to adjust. If the code is too bad even for that, consider rewriting it from scratch.
Don't start new projects and expect them to succeed just because you read some more articles or used a new framework. Instead, identify the failures of your existing projects and fix their specific problems. Whenever you need to change your code, ask yourself how to restructure it to support similar changes in the future. This is what you need to do anyway, because there will be similar changes in the future.
By doing those refactorings you'll stumble across various specific questions you can ask and read articles about. That way you'll learn more than by just asking general questions and reading general articles about maintenance and frameworks.
Start cleaning up your code today. Don't defer it to your future projects.
(The same is true for documentation. Everyone's first docs are very bad. After several months they turn out to be too verbose and filled with unimportant stuff. So complement the documentation with solutions to the problems you actually had, because chances are good that next year you'll be confronted with a similar problem. Those experiences will improve your writing style more than any "how to write well" style guide.)
I'd honestly recommend looking at Martin Fowler's Patterns of Enterprise Application Architecture. It discusses a lot of ways to make your application more organized and maintainable. In addition, I would recommend using unit testing to give you a better comprehension of your code. Kent Beck's book on Test-Driven Development is a great resource for learning how to address change to your code through unit tests.
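To give a flavor of the test-driven style Beck describes, here is a minimal JUnit 5 sketch; the PriceCalculator class and its discount rule are invented for illustration, and in the TDD cycle the test would be written before the class:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class PriceCalculatorTest {

    // A made-up class under test, kept inline so the sketch is self-contained.
    static class PriceCalculator {
        private final double threshold, discount;
        PriceCalculator(double threshold, double discount) {
            this.threshold = threshold;
            this.discount = discount;
        }
        double priceFor(double amount) {
            return amount >= threshold ? amount * (1 - discount) : amount;
        }
    }

    // In the TDD cycle this test comes first; you then write just enough of
    // PriceCalculator to make it pass, and refactor with the test kept green.
    @Test
    void appliesTenPercentDiscountAtOrAboveThreshold() {
        PriceCalculator calc = new PriceCalculator(100.0, 0.10);
        assertEquals(90.0, calc.priceFor(100.0), 0.001);
        assertEquals(50.0, calc.priceFor(50.0), 0.001); // below threshold: no discount
    }
}
```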
To improve maintainability you could:
If you are the sole developer, then adopt a coding style and stick to it. That will give you confidence later, when navigating through your own code, about the things you could possibly have done and the things you absolutely wouldn't have. Being confident about where to look, what to look for, and what not to look for will save you a lot of time.
Always take time to bring documentation up to date. Include the task in the development plan; include that time in the plan as part of any change or new feature.
Keep documentation balanced: some high-level diagrams, meaningful comments. The best comments tell what cannot be read from the code itself, like the business reasons or "whys" behind certain chunks of code.
Include in the plan the effort to keep code structure, folder names, namespaces, and object, variable, and routine names up to date and reflective of what they actually do. This will go a long way toward improving maintainability. Always call a spade a "spade". Avoid large chunks of code; structure it by the means available within your language of choice, and give the chunks meaningful names.
Aim for low coupling and high cohesion. Make sure you are up to date with techniques for achieving these: design by contract, dependency injection, aspects, design patterns, etc. (see the sketch after this list).
From a task-management point of view, you should estimate more time and charge a higher rate for non-continuous pieces of work. Do not hesitate to make the customer aware that you need extra time to do small non-continuous changes spread over time, as opposed to bigger continuous projects and ongoing maintenance, since the administration and analysis overhead is greater (you need to manage and analyse each change, including its impact on the existing system, separately). One benefit your customer is going to get is a greater life expectancy of the system. The other is accurate documentation that preserves their option to seek someone else's help should they decide to do so. Both protect the customer's investment and are strong selling points.
Use source control if you don't do so already.
Keep a detailed log of everything done for the customer, plus any important communication (a simple computer- or paper-based CMS). Refresh your memory before each assignment.
Keep a log of open issues, ideas, and suggestions per customer; again, refresh your memory before beginning an assignment.
Plan ahead for how post-implementation support is going to be conducted, and discuss it with the customer. Make your systems easy to maintain. Plan for parameterisation, monitoring tools, and built-in sanity checks. Sell post-implementation support to the customer as part of the initial contract.
Expand by hiring, even if you need someone just to provide that post-implementation support and do the admin bits.
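On the dependency-injection technique mentioned in the list above, a minimal sketch of constructor injection in plain Java, with no container involved; ReportService and SqlReportRepository are invented names:

```java
// The service depends on an abstraction, not a concrete data source:
// low coupling, and easy to substitute a fake implementation in tests.
interface ReportRepository {
    String findReport(int id);
}

class SqlReportRepository implements ReportRepository {
    @Override
    public String findReport(int id) {
        return "report-" + id; // stand-in for a real database lookup
    }
}

class ReportService {
    private final ReportRepository repository;

    // Constructor injection: the collaborator is handed in from outside
    // instead of being constructed (and thus hard-wired) in here.
    ReportService(ReportRepository repository) {
        this.repository = repository;
    }

    String render(int id) {
        return "<h1>" + repository.findReport(id) + "</h1>";
    }
}

public class Demo {
    public static void main(String[] args) {
        ReportService service = new ReportService(new SqlReportRepository());
        System.out.println(service.render(42));
    }
}
```

The point is that ReportService never constructs its collaborator itself, so a test can hand it an in-memory fake, and a future change of data store touches only the wiring in main.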
Recommended reading:
"Code Complete" by Steve McConnell
Anything on design patterns also belongs on that list.
The most important advice I can give, having helped grow an old web application into an extremely highly available, high-demand web application, is to encapsulate everything. In particular:
Use good MVC principles and frameworks to separate your view layer from your business logic and data model.
Use a robust persistence layer so you don't couple your business logic to your data model.
Plan for statelessness and asynchronous behaviour (see the sketch after the link below).
Here is an excellent article on how eBay tackles these problems
http://www.infoq.com/articles/ebay-scalability-best-practices
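On the statelessness point, a small contrast sketch in Java; the class names are invented, and the shared map stands in for whatever external store (cache, database) all your server instances share:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Anti-example: per-instance state breaks as soon as you run 2+ servers,
// because each server counts its own hits independently.
class StatefulCounter {
    private int hits;
    String handle(String user) { return user + ":" + (++hits); }
}

class StatelessCounter {
    private final Map<String, Integer> sharedStore; // stand-in for a shared cache/DB

    StatelessCounter(Map<String, Integer> sharedStore) {
        this.sharedStore = sharedStore;
    }

    // All state lives in the shared store, none in the handler itself,
    // so any server instance can serve any request interchangeably.
    String handle(String user) {
        int hits = sharedStore.merge(user, 1, Integer::sum);
        return user + ":" + hits;
    }
}

public class StatelessDemo {
    public static void main(String[] args) {
        StatelessCounter counter = new StatelessCounter(new ConcurrentHashMap<>());
        System.out.println(counter.handle("alice")); // alice:1
        System.out.println(counter.handle("alice")); // alice:2
    }
}
```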
Use a framework / MVC system. The more organised and centralised your code is, the better.
Try using Memcache. PHP has a built-in extension for it; it takes about ten minutes to set up and another twenty to put into your application. You can cache whatever you want in it (I cache all my database records in it, for every application). It does wonders.
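The answer above is about PHP, but the same read-through caching pattern in Java, sketched with the spymemcached client and a Memcached server assumed on localhost:11211 (the key and the database stub are invented):

```java
import java.net.InetSocketAddress;

import net.spy.memcached.MemcachedClient;

public class RecordCache {

    public static void main(String[] args) throws Exception {
        // Assumes a Memcached server running on localhost:11211.
        MemcachedClient cache = new MemcachedClient(new InetSocketAddress("localhost", 11211));

        String key = "customer:42"; // invented key scheme
        Object cached = cache.get(key);
        if (cached == null) {
            // Cache miss: fetch from the database (stubbed here) and store
            // the record with a one-hour expiry.
            String record = loadFromDatabase(42);
            cache.set(key, 3600, record);
            cached = record;
        }
        System.out.println("Record: " + cached);
        cache.shutdown();
    }

    private static String loadFromDatabase(int id) {
        return "customer-" + id; // stand-in for the real query
    }
}
```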
I would recommend using a source control system such as Subversion if you aren't already.
You could consider using SharePoint. It's an environment that is already designed to do everything you have mentioned, and it has many other features you maybe haven't thought about (but may need in the future :-) ).
Here's some information from the official site.
There are 2 different SharePoint environments you can use: Windows SharePoint Services (WSS) and Microsoft Office SharePoint Server (MOSS). WSS is free and ships with Windows Server 2003, while MOSS isn't free but has many more features and covers almost all of your enterprise's needs.