When to upgrade libraries

I work with a lot of open source libraries in my daily tasks (Java, FYI). By the time a project comes close to maturing, many of the libraries have released newer versions. Do people generally upgrade just to upgrade, or wait until they hit a specific bug? Often the release notes say things like "improved performance and memory management", so I'm not sure if it's worth it to potentially break something. But on the other hand, most libraries strive, and claim, not to break anything when releasing new versions.
What is your stance on this? I'll admit, I am kind of addicted to upgrading the libraries. Oftentimes it really does help with performance and makes things easier.

The rule for us is to stay up to date before integration testing but to permit no changes to any library once we're past this point. Of course, if integration testing reveals flaws that are due to an issue with a library that has been fixed, then we'd go back and update. Fortunately, I cannot remember a situation where that has happened.
Update: I understand Philuminati's point about not upgrading until you have a reason. However, I think of it this way: I continuously pursue improvements in my own code and I use libraries built by people that I believe think the same way. Would I fail to use my own improved code? Well, no. So why would I fail to use the code that others have improved?

I keep what works until there is a reason to upgrade.

If a security advisory pertaining to the old version appears on Secunia or SecurityFocus...
Otherwise, upgrade only if new functionality is needed (better performance is also a 'functionality').

I'm with the lazy crowd - I can't remember ever formulating a different strategy than "upgrade when there is a reason to" - but now that I consider the question, there is something to be said about proactive upgrades.
Upgrading does make it easier for you to report a bug in the lib, should you find one. If you find a bug and have not upgraded, it's the first thing you're going to have to do before you get any help or support. You might as well do that proactively.
Especially if you have a good test suite, upgrading proactively will flush out problems early, and that is always a smart move.
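For what it's worth, the kind of test that flushes these problems out doesn't have to be elaborate. Here is a minimal sketch in JUnit 5; Slugifier and com.example.textlib are hypothetical stand-ins for whichever third-party class your code actually depends on:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    import com.example.textlib.Slugifier; // hypothetical third-party class

    class SlugifierUpgradeTest {

        // Characterization test: pin down the exact library behavior the rest
        // of the codebase relies on. If an upgrade silently changes it
        // (different separator, different Unicode handling, ...), the build
        // fails here instead of in production.
        @Test
        void slugsAreStableAcrossLibraryUpgrades() {
            Slugifier slugifier = new Slugifier();

            assertEquals("hello-world", slugifier.slugify("Hello, World!"));
            assertEquals("uber-cafe", slugifier.slugify("Über Café"));
        }
    }

Run such tests in CI right after bumping the dependency version, and a proactive upgrade becomes a cheap experiment instead of a leap of faith.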

It depends a lot on your deployments. If you are supporting multiple platforms then the very latest libraries may not be available on all at any given moment. I've been frustrated by trying to install something that requires the very latest version of some lib, and it's not available as a package yet.
If you deploy to customers you want to develop against libraries that are stable and widely available.

Frameworks for hacklang?

Hacklang may still be relatively new, but are there any frameworks (MVC, for instance) that are well-documented for it? I've stumbled across Fastuc or Hack-mvc, but they don't seem ready yet and the documentation is quite scarce.
Any thoughts?
Thanks a lot!
First, remember that, since Hack is backwards-compatible with PHP, any PHP framework can be used in Hack. You won't get the benefit of type checking in the code that interacts with that framework, of course, but the type system was carefully designed to deal with missing type information ("gradual typing"). So if you want to use Laravel, Symfony, etc., you can do so perfectly fine while still writing the code that uses them in Hack.
As for pure Hack frameworks, you can look at the Hack language section of the HHVM blog which contains a few "community roundups". One of the most active today is probably Titon, though I've not used it myself and so can't say how complete or ready to use it is.
At this point in time, PHP frameworks seem to be dropping Hack support, and Hack seems to be going down the path of truly diverging from PHP, so existing PHP frameworks are no longer appropriate for use under Hack.
Titon seems to be the most recently modified Hack framework, though it was last touched in 2015. None of its forks seem to have been touched much more recently either.
It looks like there are currently no actively maintained Hack frameworks, and whether any will appear unfortunately remains to be seen, given the apparent momentum of PHP 7.

Is Lucene.net abandoned?

I'm currently testing Lucene.Net, and it's perfect for my needs, but I've seen this recent post on the dev mailing list (with no answers)...
Do you think it's unsafe to start developing with this library?
I thought it was widely used?
As far as I know, Lucene.NET is used by RavenDB, so it should be in pretty good shape.
Also, it depends on what you mean by "unsafe". It is hard to guarantee that any OSS project will never stop, so all of them are inherently "unsafe". The same is actually true of commercial projects.
Lucene.NET seems to be a reliable project at this point (I only used it in a small project, so I cannot guarantee that, but RavenDB seems to do just fine), so even if new development stops, it should still be possible to rely on it.
I think it all depends on the longevity of your project, on your readiness to fix any issues in Lucene (if they arise), and on the requirements of the project owners.

Should upgrading tools / frameworks / dependencies to latest version be automatic?

At my current job, it goes without question that if a new version of a technology that we use in our project is released, we upgrade it ASAP. At my previous job, that was not the case... we had to convince management that it was necessary. As such, we often had to do without features that could have been helpful and continue living with bugs that had long ago been fixed. At times, it was even hard to get support for the old versions we were using. I don't really see that point of view, especially after experiencing the opposite approach. Are there really two sides to this question?
Of the two approaches, I absolutely would prefer the one where you are now. Having applications falling behind can be painful for many reasons, some of which you noted.
The only caveats would be centered around time, really; it would usually take some non-trivial amount of time to update something for new frameworks/dependencies. It's nice when frameworks maintain backward-compatibility, but that does not always happen.
Breaking changes are usually obvious, and usually (we hope) exist for some very good reasons. More troublesome are the silent changes that do not prevent building, but cause subtle bugs; like a library function with the same signature, but which has slightly different behavior or return results.
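To make that concrete, here is a self-contained toy in Java; both methods are stand-ins for two releases of a hypothetical third-party utility, nothing here refers to a real library:

    public class SilentChangeDemo {

        // Version 1.x behavior: trims and collapses inner whitespace.
        static String normalizeV1(String s) {
            return s.trim().replaceAll("\\s+", " ");
        }

        // Version 2.x behavior: only trims. Same name, same signature.
        static String normalizeV2(String s) {
            return s.trim();
        }

        public static void main(String[] args) {
            String input = "  hello   world  ";
            System.out.println(normalizeV1(input)); // "hello world"
            System.out.println(normalizeV2(input)); // "hello   world"
            // Both versions compile and link fine; only a test that pins the
            // expected output would notice the difference.
        }
    }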
But if an application is meant to be supported long-term, keeping it up-to-date is really a must, IMO.
There is a grey area that lies somewhere in the middle.
Your old place lived by a "don't rock the boat" approach. Yes, the stuff might not be up to date, but they knew what it could do (and maybe couldn't) and how to handle it. If there's an issue, you hopefully know it's not the kit, as it's been around the block (or you've been around the block enough to find out all the bugs in it, how to handle them, and how to keep your kit up and running).
Your new place puts all their faith in the fact that the newer kit must be better and can't possibly have been released without lots of checks and balances. Yes, there might be some quirks, but knowing some of those old bugs are no more (well, the release docs say they're fixed anyway) is worth the time spent finding new bugs in the latest release.
It's a fine line to tread, and it depends very much on what the tech is used for and how mission-critical it is.
Yes, there certainly are two sides to this question. I'll weigh in on the side of not upgrading whenever a new version is released ...
If it ain't broke, don't fix it. If your system is working, then any change is a risk, and not all risks are worth taking.
If you upgrade every time a new release of any component pops up, you will find yourselves following other people's schedules.
Change management is a vital discipline for robust and reliable systems.

Risk evaluation for framework selection

I'm planning on starting a new project, and am evaluating various web frameworks. There is one that I'm seriously considering, but I worry about its lasting power.
When choosing a web framework, what should I look for when deciding what to go with?
Here's what I have noticed with the framework I'm looking at:
Small community. There are only a few messages on the users list each day
No news on the "news" page since the previous release, over 6 months ago
No svn commits in the last 30 days
Good documentation, but wiki not updated since previous release
Most recent release still not in a maven repository
It is not the officially sanctioned Java EE framework, but I've seen several people mention it as a good solution in answers to various questions on Stack Overflow.
I'm not going to say which framework I'm looking at, because I don't want this to get into a framework war. I want to know what other aspects of the project I should look at in my evaluation of risk. This should apply to other areas besides just Java EE web, like ORM, etc.
I'll say that so-called "dead" projects are not that great a danger as long as the project itself is solid and you like it. The thing is that if the library or framework already does everything you can think you want, then it's not such a big deal. If you get a stable project up and running then you should be done thinking about the framework (done!) and focus only on your webapp. You shouldn't be required to update the framework itself with the latest release every month.
Personally, I think the most important point is that you find one that is intuitive to your project. What makes the most sense? MVC? Should each element in the URL be a separate object? How would interactivity (AJAX) work? It makes no sense to pick something just because it's an "industry standard" or because it's used by a lot of big-name sites. Maybe they chose it for needs entirely different from yours. Read the tutorials for each framework and be critical. If it doesn't gel with your way of thinking, or you have seen it done more elegantly, then move on. What you are considering here is the design, and good design is paramount for staying flexible and scalable. There are hundreds of web frameworks out there, old and new, in every language. You're bound to find half a dozen that work just the way you want to think in your project.
Points I consider mandatory:
Extensible through plug-ins: check if there's already plug-ins for various middleware tasks such as memcache, gzip, OpenID, AJAX goodness, etc.
Simplicity and modularity: the more complex, the steeper the learning curve and the less you can trust its stability; the more "locked" to specific technologies, the higher the chances that you'll end up with a chain around your ankle.
Database agnostic: can you use sqlite3 for development and then switch to your production DB by changing a single line of code or configuration? (A small sketch of this appears at the end of this answer.)
Platform agnostic: can you run it on Apache, lighttpd, etc.? Could you port it to run in a cloud?
Template agnostic: can you switch out the template system? Let's say you hire dedicated designers and they really want to go with something else.
Documentation: I am not that strict if it's open-source, but there would need to be enough official documentation to enable me to fully understand how to write my own plug-ins, for example. Also look to see if there's source code of working sites using the same framework.
License and source code: do you have access to the source code and are you allowed to modify it? Consider whether you can use it commercially! (Even if you have no plans to do that currently.)
All in all: flexibility. If I am satisfied with all of these points, I'm pretty much done. Notice how I didn't have anything about "deadness" in there? If the core design is good and there's easily installable plug-ins for doing every web-dev 3.0-beta buzzword thing you want to do, then I don't care if the last SVN commit was in 2006.
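As a small illustration of the "database agnostic" point above, here is a minimal, hedged sketch of config-driven connection handling in Java. The property names and the app.properties file are assumptions for illustration; a real framework would provide its own equivalent:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.Properties;

    public class ConnectionFactory {

        public static Connection open() throws IOException, SQLException {
            Properties config = new Properties();
            try (FileInputStream in = new FileInputStream("app.properties")) {
                config.load(in);
            }
            // app.properties (development):
            //   db.url=jdbc:sqlite:dev.db
            // app.properties (production):
            //   db.url=jdbc:postgresql://db.internal/myapp
            String url = config.getProperty("db.url");
            String user = config.getProperty("db.user", "");
            String password = config.getProperty("db.password", "");
            return DriverManager.getConnection(url, user, password);
        }
    }

If switching databases really is a one-line change like this, the framework has earned the "agnostic" label; if it means rewriting query code, it hasn't.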
Here are the things I look for in a framework before I decide to use it for a production environment project:
Plenty of well laid out and written documentation. Bad documentation just means I'm wasting time trying to find how everything works. This is OK if I am playing around with some cool new micro framework or something else, but not when it's for a client.
A decently sized community so that you can ask questions, etc. A fun and active IRC channel is a big plus.
Constant iteration of the product. Are bugs being closed or opened on a daily/weekly basis? Probably a good sign.
I can go through the code of the framework and understand what's going on. Good framework code means the project's long-term life has a better chance of success.
I enjoy working with it. If I play with it for a few hours and it's the worst time of my life, I sure as hell won't be using it for a client.
I can go on, but those are some primary ones off the top of my head.
Besides looking at the framework, you also need to consider a lot of things about yourself (and any other team members) when evaluating the risks:
If the framework is a new, immature, "bleeding-edge" framework, are you going to be willing and able to debug it and fix or work around whatever problems you encounter?
If there is a small community, you'll have to do a lot of this debugging and diagnosis yourself. Will you have time to do that and still meet whatever deadlines you may have?
Have you looked at the framework yourself to determine how good it is, or are you willing to rely on what others say about it? Why do you trust their judgment?
Why do you want to use this rather than the "officially sanctioned Java EE framework"? Is it a pragmatic reason, or just a desire to try something new?
If problems with the framework cause you to miss deadlines or deliver a poor product, how will you talk about it with your boss or customer?
All the signs you've cited could be bad news for your framework choice.
Another thing that I look for are books available at Amazon and such. If there's good documentation available, it means that authors believe it has traction and you'll be able to find users that know it.
The only saving grace I can think of is relative maturity. If the framework or open source component is mature, there's a chance that it does the job as written and doesn't require further extension.
There should still be a bug tracker with some evidence of activity, because no software is without bugs (except for mine). But it need not be a gusher of requests in that case.

When generating code, what language should you generate?

I've worked on a number of products that make use of code generation. It seems to be the only way to achieve both a high degree of user-customizability and high execution speed.
The downside is that we are requiring users to install a compiler (primarily on MS Windows).
This has been an ongoing headache, because vendors like MS keep obsoleting compilers, and some users tend to have more than one compiler installed.
We're considering using GNU C, and possibly C++, but even there, there are continual version issues.
I've considered possibly generating assembly language, in an effort to get off the compiler-version-treadmill, but assembly languages are all machine-specific.
Ideally there would be some way to produce generated code that would be flexible, run fast, and not expose us to the whims of third-party providers.
Maybe I'm overlooking something simple, like Java. Any ideas would be appreciated. Thanks.
If you're considering C and even assembler, take a look at LLVM first: http://llvm.org
I might be missing some context here, but could you just pin yourself to a specific version? E.g., .NET 2.0 can be installed side by side with .NET 1.1 and .NET 3.5, as well as other versions that will come out in the future. So as long as your code makes use of a specific version of a compiler, what's the problem?
I've considered possibly generating assembly language, in an effort to get off the compiler-version-treadmill, but assembly languages are all machine-specific.
That would be called a compiler :)
Why don't you stick to C90?
I haven't heard of many severe standards violations on GCC's side, as long as you don't use extensions.
And you can always distribute a certain version of GCC along with your product, say, 4.3.2, giving users the option to use their own compiler at their own risk.
As long as all code is generated by you (i.e., you don't embed your instructions into other people's code), there shouldn't be any problems in testing against this version and using it to compile your libraries.
If you want to generate assembly language code, you may take a look at asmjit.
One option would be to use a language/environment that provides access to the compiler in code; C#, for example, lets you invoke the compiler at runtime from your own program.
Why not ship a GNU C compiler with your code generator? That way you have no version issues, and the client can constantly generate code that is usable.
It sounds like you're looking for LLVM.
Start here: The Code Generation conference
In the spirit of "it might not be too late to add my 2 cents", as in Alvin's answer, here is something I'd think about: if your application is meant to last for some years, it is going to face several changes in how applications and systems work.
For instance, let's say you were thinking about this 10 years ago. I was watching Dexter back then, but I guess you actually have memories of how things were at that time. From what I can tell, multithreading was not much of an issue to developers in 2000, and now it is; Moore's law, as they knew it, broke on them. Before that, people didn't even care about what would happen with "Y2K".
Speaking of Moore's law, processors are indeed getting quite fast, so maybe certain optimizations won't even be that necessary. And possibly the array of optimizations will be much bigger: some processors are getting optimizations for server-centric tasks (XML, cryptography, compression and regex! I am surprised such things can be done on a chip) and also spend less energy (which is probably very important for warfare hardware...).
My point being that focusing on what exists today as a platform for tomorrow is not a good idea. Make it work today, and surely it will work tomorrow (backward compatibility is especially valued by Microsoft; Apple is not bad, it seems; and Linux is very liberal about letting you make things work as you want).
There is, yes, one thing that you can do: attach your technology to something that just won't (likely) die, such as JavaScript. I'm serious, JavaScript VMs are getting terribly efficient nowadays and are just going to get better, plus everyone loves it, so it's not going to disappear suddenly. If you need more efficiency/features, maybe target the CLR or JVM?
Also, I believe multithreading will become more and more of an issue. I have a gut feeling the number of processor cores will follow a Moore's law of its own. And architectures are more than likely to change, from the looks of the cloud buzz.
PS: In any case, I believe C optimizations of the past are still quite valid under modern compilers!
I would stick to whatever language you are using to generate the code. You can generate and compile Java code in Java, Python code in Python, C# in C#, and even Lisp in Lisp, etc.
But it is not clear whether such languages are sufficiently fast for you. For top speed I would choose to generate C++ and use GCC for compilation.
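If generating Java does turn out to be fast enough, a minimal sketch of the generate-and-compile loop using the JDK's own compiler API (javax.tools) might look like the following; the class and file names are just for illustration:

    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import javax.tools.JavaCompiler;
    import javax.tools.ToolProvider;

    public class GenerateAndCompile {
        public static void main(String[] args) throws Exception {
            String source =
                "public class Generated {\n"
                + "    public static int answer() { return 42; }\n"
                + "}\n";

            Path dir = Files.createTempDirectory("codegen");
            Path file = dir.resolve("Generated.java");
            Files.write(file, source.getBytes(StandardCharsets.UTF_8));

            // Requires a JDK rather than a bare JRE: the compiler ships with
            // the JDK itself, so users don't need a separate compiler install.
            JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
            int result = compiler.run(null, null, null, file.toString());
            System.out.println(result == 0 ? "compiled OK" : "compile failed");

            // Generated.class now sits next to the source in `dir` and can be
            // loaded with a URLClassLoader and invoked via reflection.
        }
    }

That keeps the whole toolchain inside the runtime you already ship, which sidesteps the compiler-version treadmill the question describes.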
Why not use something like SpiderMonkey or Rhino (JavaScript support in C++ or Java, respectively)? You can export your objects to JavaScript namespaces, and your users don't have to compile anything.
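A minimal sketch of that idea on the Java side, using the standard javax.script API (backed by Rhino on Java 6/7 and Nashorn on Java 8 through 14; on newer JDKs you would bundle an engine such as Rhino or GraalJS yourself):

    import javax.script.ScriptEngine;
    import javax.script.ScriptEngineManager;

    public class EmbeddedJsExample {
        public static void main(String[] args) throws Exception {
            ScriptEngine js = new ScriptEngineManager().getEngineByName("JavaScript");

            // Expose a host object to the script, then let generated or
            // user-supplied JavaScript drive it; no compiler is needed on the
            // user's machine.
            js.put("greeting", "Hello from the host application");
            Object result = js.eval("greeting.toUpperCase() + '!'");
            System.out.println(result); // HELLO FROM THE HOST APPLICATION!
        }
    }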
Embed an interpreter for a language like Lua/Scheme into your program, and generate code in that language.