Drools rules export/import when upgrading - import

I am working with an old version of Drools (5.2). I can export all of the rules in an XML format, and the resulting file has more than 6 million lines. I cannot find any way to import that file into the most recent version of Drools. Is this a legitimate way to do the upgrade? Is there a better way? I need to do a mass migration due to the sheer number of rules I'm dealing with (more than 17,000). I have been struggling for a long time, reading documentation and trying to figure this dilemma out.

In the 5.2 version, there is a way to create and deploy a snapshot. On the same screen, there is a button to "View Package Source". With as many rules as we have, it took some time and even failed a few times, but eventually I was able to view that source. I copied the text, pasted it into a text editor, and saved it as a .drl file. Copying that way included the line numbers and a '|' character. There are probably other ways to clean that up, such as importing into Excel, but I wrote a little Java program that stripped out the unwanted data. I could then import that file directly into my new version of the Workbench using the "Import Asset" functionality once I had created a project. I have done some tweaking since then, but that is the gist of what I needed. The file validates, but I have not yet tested whether the rules work properly.
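For anyone doing the same cleanup, here is a minimal sketch of the kind of stripping program I mean. It assumes each copied line is prefixed by a line number and a '|' character before the actual rule source (the exact prefix format may differ in your copy, so adjust accordingly); the class name and the input/output arguments are my own, not anything supplied by Drools.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

// Strips the assumed "NNN|" prefix from each line of text copied out of
// Guvnor's "View Package Source" screen, leaving a plain .drl file.
public class DrlCleaner {
    public static void main(String[] args) throws IOException {
        List<String> cleaned = new ArrayList<>();
        for (String line : Files.readAllLines(Paths.get(args[0]))) {
            // Drop everything up to and including the first '|' (the assumed prefix).
            int pipe = line.indexOf('|');
            cleaned.add(pipe >= 0 ? line.substring(pipe + 1) : line);
        }
        Files.write(Paths.get(args[1]), cleaned);
    }
}
```

Run it as, for example, `java DrlCleaner exported.txt rules.drl` and import the resulting .drl file.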

Related

Entire project shown in import window

I've been working with Unity3D since version 4.2 and just recently upgraded to different 5.X versions.
When importing an external package, the import window used to show a list of only the package's contents.
After upgrading, it seems to show how the package will fit into the project by displaying the entire file structure, using white names for package elements and grey names for the rest of the files/folders.
I'm currently working on a very big project and this forces me to meticulously search through an unnecessarily HUGE list of elements looking for the ones I need to add or exclude from the import process.
Googling has been of no avail, does anyone happen to know how to hide my project files from that window?
Thanks in advance
THIS ANSWER IS STILL VALID ON 23/03/17. THIS MAY CHANGE IN FUTURE RELEASES.
After googling everything and trying to force different combinations of settings my conclusion is that this is not possible.
I really hope Unity will decide to rollback this change, as it makes working with big projects really frustrating.

Human-editable snippet store in Eclipse

I am looking for an easier way to manage my Eclipse code snippets. I know and have used Eclipse's template and snippets features, but as far as I have found, they can only be exported and imported as XML files.
Since I use many versions of Eclipse and keep migrating between machines, managing the snippets is a hassle. I am looking for an UltiSnips-like way to manage these snippets/templates. Is there one?
I also looked at snip2code, but it didn't appeal to me because I sometimes work offline. Also, I want to have much greater control over the snippets using version control.
Oh well! I couldn't find anything that matched the set of requirements I had. So, I've ended up creating my own.

How to update zfproject.xml file after deleting some controllers, dbtables, etc. in Zend Framework?

I am using the NetBeans IDE to work with Zend Framework. When I create a new controller, action, etc. using the NetBeans Zend Command Window, the zfproject.xml file is updated automatically. However, when I delete some of them, the file is not updated and still keeps the names that I deleted.
Is there a way (apart from manual way) to update this file?
Is it needed to update zfproject.xml to run the project properly or is it just an organized schema of the project?
Thanks a lot
This is a very good question. zfproject.xml often gets out of sync when you use both Zend Tool and manual creation of the files.
Is there a way (apart from manual way) to update this file?
I don't know a good answer for this part. You may try iterating over the application directory structure.
Is it needed to update zfproject.xml to run the project properly or is it just an organized schema of the project?
This is just a schema; it is not parsed during the normal application lifecycle and is used only by the tools.

Checkstyle source control integration

I've been looking into Checkstyle recently as part of some research into standard coding conventions. While it seems perfectly suitable for brand-new projects, it appears to have a huge barrier to adoption for existing projects, as it doesn't supply a way to check only new or edited code. Maybe I'm wrong?
If you have a codebase that has never had a coding standard, it could be a massive effort to bring the whole codebase in line with a standard all at once. Allowing it to be done incrementally over time, as the code naturally evolves, seems like a more reasonable approach. But that doesn't seem possible with Checkstyle.
I assume this would have to tie in with a source control system to be possible. Is that possible with Checkstyle, or is there another tool that can provide this functionality?
As far as I know, Checkstyle is meant to analyze source without considering its history or revisions.
Adding that kind of feature would mean scripting the Checkstyle analysis to feed it exactly the subset of files representing the delta.
But then certain kinds of checks would be likely to fail or miss things, like the duplicate-code check.
So for that kind of incremental analysis, you not only need to restrict the set of sources but also the set of rules you want to enforce, because some of those rules only make sense across all the sources.
So, why couldn't you run a full check on each file and then filter the results based on changes managed by your source control system? Does anything like that exist?
Not to my knowledge, especially with plugins like eclipse-cs for Eclipse: if they analyze a file, they will display all warnings, even though source control shows the file has not changed since a given revision.
Only an external script would be able to do this.
The principle is simple (although it could be a bit slow at execution time):
for each file, do a diff to check whether modifications have been made;
if yes,
do an svn blame to annotate each line with the revision number of its last change,
then analyze the file with Checkstyle.
The script can then filter the warnings down to the lines currently being modified (or to all the lines modified after a given revision).
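To make that concrete, here is a rough, untested sketch of such a script in Java, driving svn and the Checkstyle CLI as external processes. The jar and config paths, the blame parsing, and the assumption that Checkstyle's plain output looks like "path:line:col: message" are all mine and will likely need adjusting for your Checkstyle and Subversion versions.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Runs Checkstyle on a full file, then keeps only the warnings that fall on
// lines which "svn blame" says changed after a given baseline revision.
public class IncrementalCheckstyle {

    public static void main(String[] args) throws Exception {
        String file = args[0];
        long baselineRevision = Long.parseLong(args[1]);

        // 1. svn blame: the first token of each output line is the revision that last touched it.
        Map<Integer, Long> revisionByLine = new HashMap<>();
        List<String> blame = run("svn", "blame", file);
        for (int i = 0; i < blame.size(); i++) {
            String rev = blame.get(i).trim().split("\\s+", 3)[0];
            if (rev.matches("\\d+")) { // uncommitted lines show "-" instead of a revision
                revisionByLine.put(i + 1, Long.parseLong(rev));
            }
        }

        // 2. Run the Checkstyle CLI on the whole file (adjust jar and config paths to your setup).
        List<String> warnings = run("java", "-jar", "checkstyle.jar", "-c", "checks.xml", file);

        // 3. Keep only warnings whose line was changed after the baseline revision.
        for (String warning : warnings) {
            Integer line = extractLineNumber(warning);
            if (line != null && revisionByLine.getOrDefault(line, 0L) > baselineRevision) {
                System.out.println(warning);
            }
        }
    }

    // Assumes "path:line:col: message" style output; returns null for lines that don't match.
    private static Integer extractLineNumber(String warning) {
        String[] parts = warning.split(":");
        try {
            return parts.length > 1 ? Integer.parseInt(parts[1].trim()) : null;
        } catch (NumberFormatException e) {
            return null;
        }
    }

    // Runs an external command and returns its combined stdout/stderr lines.
    private static List<String> run(String... command) throws IOException, InterruptedException {
        Process process = new ProcessBuilder(command).redirectErrorStream(true).start();
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        }
        process.waitFor();
        return lines;
    }
}
```

A wrapper loop over the files reported by `svn diff --summarize` would turn this per-file filter into the whole-repository script described above.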
We developed a Checkstyle plugin for SCM-Manager, a tool for managing Git, Subversion and Mercurial repositories. If activated, it checks committed source code against your Checkstyle rules. If the check finds errors, the commit is aborted.

Solution deployment, CM, InstallShield

People,
We have 4 or 5 utilities that work in conjunction with our application. These utilities are .bat files, VB apps, PowerBuilder apps, and so on. I am trying to manage these utilities in source control and to figure out a better way to assign versions to them. Right now, the developers use the version control system's meta-data -- specifically the label -- to store the version number of the tool.
My goal is to have individual InstallShield packages for each utility, and an easy means to manage and assign version numbers to these packages.
Would you recommend a separate .ini file with the info, storing the info in the InstallShield .ism file itself, or just using the meta-data from the version control tool?
UPDATE:
I like the idea, Orion. I have one concern, though. The script that increments the version number... it can't be intelligent enough to increment the major number and so on, right? E.g. if one of the utils has version 1.2.3 and we are at a point where the new version should be 2.0.0, the script may not be able to handle this.
I think this has a lot to do with our branching techniques -- we don't have any. The folks thought that since the utils are so small, the source may not need branches.
PowerBuilder in particular has a nice trick you can do to incorporate the build number from an ini file into the compiled application.
Details here: http://www.pbdr.com/pbtips/ex/autorev.htm
We have an ini file inside source control that stores the build number; its value is used in our build scripts to determine what label to apply to the source tree after a successful build. It works very nicely for our needs. When we branch, we do have to manually edit the file to increment the proper number, though.
I managed our build system at my last job, which seemed to have some parallels to what you're asking.
There were ~30 C++ projects which needed compiling, and various .NET/Java things, and the odd perl script.
This was all built on our build machine using NAnt; if I were doing it today I'd use rake, but the idea is the same.
We basically had an auto-incrementing build number which was stored in a version.txt file in the root of the repository.
Each time we did a build (automatically each night, or on demand if necessary) the script would increment this number and check the file back into source control.
All the other apps referenced this file for their version number; for things which didn't support working like this, the script would set environment variables or perform other workarounds:
I'm pretty sure that our InstallShield packages referenced an environment variable for their version number, but we deprecated them in favour of WiX, as InstallShield really did suck.
In the case of Visual Studio, the script would grep/replace the number within the .csproj files and check them back in.
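As a rough illustration of that bump-and-stamp step (not the actual build script), here is a small Java sketch. The version.txt name comes from the answer above; the "1.2.3.45" numbering scheme, the <Version> element in the .csproj files, and the class name are my own assumptions, so adapt them to your layout. Checking the touched files back in is left to the surrounding build script.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

// Increments the build number stored in version.txt at the repository root,
// then rewrites the version inside every .csproj file under the tree.
public class BumpVersion {
    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        Path versionFile = root.resolve("version.txt");

        // version.txt is assumed to hold something like "1.2.3.45"; bump the last segment.
        String old = Files.readAllLines(versionFile).get(0).trim();
        int lastDot = old.lastIndexOf('.');
        int build = Integer.parseInt(old.substring(lastDot + 1)) + 1;
        String next = old.substring(0, lastDot + 1) + build;
        Files.write(versionFile, (next + System.lineSeparator()).getBytes());

        // Stamp the new number into every .csproj found under the root.
        try (Stream<Path> paths = Files.walk(root)) {
            paths.filter(p -> p.toString().endsWith(".csproj")).forEach(p -> {
                try {
                    String content = new String(Files.readAllBytes(p));
                    content = content.replaceAll("<Version>[^<]*</Version>",
                                                 "<Version>" + next + "</Version>");
                    Files.write(p, content.getBytes());
                } catch (IOException e) {
                    throw new RuntimeException(e);
                }
            });
        }
        System.out.println("Bumped version to " + next);
    }
}
```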
Hope this gives you some ideas
Using the meta-data from your version control system should keep things simpler. It's how your developers already use the system, and there is no additional file to maintain. My personal experience has taught me to version the satellite applications with the same version as the main app. K.I.S.S.