kie-maven-plugin for Gradle (Drools 6.4)

We are developing a Drools application and our organization has standardized on Gradle. We need to use the kie-maven-plugin to create and compile our rules in a KJar. This is so we can hot swap new versions of the rules. Is there a Gradle version of this plugin available? I don’t think that Maven plugins can be used in Gradle.

No, you would need to port it yourself; I think it's this project: https://github.com/kiegroup/droolsjbpm-integration/tree/master/kie-maven-plugin
However, there is a way of using Drools with plain resource files, as shown here: https://github.com/Spantree/drools-examples/
and this line is what got me stuck: https://github.com/Spantree/drools-examples/blob/master/src/test/groovy/net/spantree/examples/drools/helloworld/HelloWorldSpec.groovy#L19
(They get the session directly from the container, skipping the step of getting a KieBase first; roughly the approach sketched below.)
This approach does not give you the compile-time checks that the Maven plugin gives you.
However, hot swapping resource-based Drools rules should be easier (I think).
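To illustrate the resource-file approach, here is a minimal sketch. It assumes the rules live under src/main/resources and are declared in a META-INF/kmodule.xml; the session name is hypothetical and must match a ksession entry in that file.

```groovy
import org.kie.api.KieServices
import org.kie.api.runtime.KieContainer
import org.kie.api.runtime.KieSession

// Build a container straight from the classpath; the rules are plain resources,
// so nothing is compiled into a KJar ahead of time.
KieServices kieServices = KieServices.Factory.get()
KieContainer kieContainer = kieServices.getKieClasspathContainer()

// Like the linked example, get the session directly from the container,
// without asking for a KieBase first.
KieSession kieSession = kieContainer.newKieSession('ksession-rules')
try {
    // insert your fact objects here, then fire
    kieSession.fireAllRules()
} finally {
    kieSession.dispose()
}
```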

Related

How to resolve classpath incompatibilities between plugin code and IntelliJ SDK

I am currently trying to develop a plugin for IntelliJ that will use a "core" library. The core library already has its own dependencies (JAR files) and is used in other non-IntelliJ projects. Unfortunately, some of the dependencies of the IntelliJ SDK are the same as those of the plugin core, but with conflicting versions. So far this has been manageable because we remove the dependencies from the SDK and provide the core's dependencies instead, and running the plugin through IntelliJ works fine. However, I really want to be able to write automated unit tests for the plugin, and this causes problems.
Following the instructions from here, I set up my first unit test to extend LightCodeInsightFixtureTestCase. However, it fails to get past the setUp method, throwing NoClassDefFoundErrors. See the gist of the error here (picocontainer is the conflicting dependency).
By inspecting the classes loaded while running the plugin, I can see that the same class from a conflicting dependency is loaded in two different classloaders: a URLClassLoader for the com.intellij dependency, and a PluginClassLoader for my plugin's dependency. This explains why the plugin can be executed successfully while the test fails.
A small, self-contained example of a project that fails in this way is available here: https://github.com/holger-s/libraryconflict
My question is, what is the recommended way to resolve these conflicts that allows unit testing with the IntelliJ test fixtures?
Full disclosure: I have also sought an answer on the IntelliJ Plugin Development forum.
This is late, but I was running into the same problem, and I fixed it by adding the library under Project Structure/Libraries, then going to Modules/Dependencies and changing its scope to Provided.
https://confluence.jetbrains.com/display/PhpStorm/Setting-up+environment+for+PhpStorm+plugin+development

How to create a Spring+PrimeFaces+Hibernate (no Maven) project in Eclipse?

I am new to J2EE. I would like to create a Spring+Primefaces+Hibernate project.
I googled for it.
But all the example projects I found on the internet use Maven. My questions are:
1. Is it possible to create a Spring+PrimeFaces+Hibernate project in Eclipse without Maven? If not, why is Maven needed?
2. How do I add the PrimeFaces, Spring, and Hibernate JAR files in Eclipse?
3. Will the Spring configuration XML files (application context or dispatcher servlet) be created automatically or manually? I mean for Spring MVC.
4. Will the Hibernate mapping files also be created automatically or manually?
5. If possible, can anyone point me to a tutorial (preferably video) that implements the same?
I am using Tomcat 7 and Eclipse Kepler.
Any help is appreciated.
If this is downvoted, please specify the reason as well.
1. Although it's not a must to use Maven or any other build tool, you should strongly consider using one. Eclipse Kepler has Maven support by default, but feel free to use another build tool (Gradle, Ant) or none at all (see 2). Maven and the other build tools remove the headache of scaffolding, searching for dependencies (external jars like spring-mvc, Hibernate, some DB drivers), and even deploying applications to a server.
2. If you choose not to use a build tool, you have to fetch your project dependencies manually and add them to your project's build path (right-click -> Build Path, then enter their location). As you have noticed, this step can be really, really time consuming... (see the build-script sketch after this answer for what a build tool does for you).
3. No, you have to create the configuration manually, unless you start from another project that already has what you need; again, this can get easier with a build tool (Maven archetypes, for example).
4. The same as 3.
5. You won't have a hard time finding resources about these technologies; they are used practically everywhere, and I think the Spring team has some videos on their YouTube channel.
Hope that helps a little!
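Purely to illustrate point 1 (what declaring dependencies in a build tool looks like compared with hunting down jars by hand), here is a minimal Gradle sketch; the artifact versions are examples only:

```groovy
// build.gradle -- a minimal sketch; the build tool fetches these jars and their
// transitive dependencies for you instead of a manual download-and-copy step
apply plugin: 'war'

repositories {
    mavenCentral()
}

dependencies {
    compile 'org.springframework:spring-webmvc:4.0.5.RELEASE'
    compile 'org.hibernate:hibernate-core:4.3.5.Final'
    compile 'org.primefaces:primefaces:5.0'
}
```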
1:* The fundamental difference between Maven and Ant is that Maven's design regards all projects as having a certain structure and a set of supported task work-flows (e.g., getting resources from source control, compiling the project, unit testing, etc.). While most software projects in effect support these operations and actually do have a well-defined structure, Maven requires that this structure and the operation implementation details be defined in the POM file. Thus, Maven relies on a convention on how to define projects and on the list of work-flows that are generally supported in all projects.
This design constraint resembles the way that an IDE handles a project, and it provides many benefits, such as a succinct project definition, and the possibility of automatic integration of a Maven project with other development tools such as IDEs, build servers, etc.
But one drawback to this approach is that Maven requires a user to first understand what a project is from the Maven point of view, and how Maven works with projects, because what happens when one executes a phase in Maven is not immediately obvious just from examining the Maven project file. In many cases, this required structure is also a significant hurdle in migrating a mature project to Maven, because it is usually hard to adapt from other approaches.
In Ant, projects do not really exist from the tool's technical perspective. Ant works with XML build scripts defined in one or more files. It processes targets from these files and each target executes tasks. Each task performs a technical operation such as running a compiler or copying files around. Targets are executed primarily in the order given by their defined dependency on other targets. Thus, Ant is a tool that chains together targets and executes them based on inter-dependencies and other Boolean conditions.
The benefits provided by Ant are also numerous. It has an XML language optimized for clearer definition of what each task does and on what it depends. Also, all the information about what will be executed by an Ant target can be found in the Ant script.
A developer not familiar with Ant would normally be able to determine what a simple Ant script does just by examining the script. This is not usually true for Maven.
However, even an experienced developer who is new to a project using Ant cannot infer what the higher level structure of an Ant script is and what it does without examining the script in detail. Depending on the script's complexity, this can quickly become a daunting challenge. With Maven, a developer who previously worked with other Maven projects can quickly examine the structure of a never-before-seen Maven project and execute the standard Maven work-flows against it while already knowing what to expect as an outcome.
It is possible to use Ant scripts that are defined and behave in a uniform manner for all projects in a working group or an organization. However, when the number and complexity of projects rises, it is also very easy to stray from the initially desired uniformity. With Maven this is less of a problem because the tool always imposes a certain way of doing things.
2:* You have to download all the required jar files for Hibernate/Spring/PrimeFaces from the internet and place them in your project's build path or in a lib folder.
3:* The Spring configuration files need to be created by you, so that you learn the concepts.
4:* Hibernate mapping files can be created using Hibernate's reverse engineering tools, which generate the hbm files for you, or you can use annotations if you don't want XML (see the entity sketch after this answer).
I suggest you first create a sample Java project in Eclipse, then download all the required jars and place them in a lib folder. Then configure Hibernate in the project and add the Spring integration.
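To make the annotation option from 4:* concrete, a mapped entity is just a class with JPA annotations instead of an hbm.xml file. A minimal sketch (the Customer class is made up, and it is written in Groovy to match the other snippets in this document; plain Java works the same way):

```groovy
import javax.persistence.Column
import javax.persistence.Entity
import javax.persistence.GeneratedValue
import javax.persistence.GenerationType
import javax.persistence.Id
import javax.persistence.Table

// Hypothetical entity: one row in the 'customer' table per Customer object,
// with no customer.hbm.xml needed.
@Entity
@Table(name = 'customer')
class Customer {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    Long id

    @Column(nullable = false)
    String name
}
```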

best way to enable hot deployment on Jetty when using Gradle+Eclipse

I'm used to mvn, but I'm testing Gradle (v1.8) for a small web development project.
I've noticed that the Jetty Gradle plugin supports autoscan and hot deployment, so I've enabled it. My goal is to recompile from Eclipse and have Jetty reload the context every time I change a controller, etc.
However, this is not working, mainly because Gradle's compilation output goes to build/, while the Gradle Eclipse plugin creates a .classpath configuration that directs all Eclipse output to bin/ (even mixing test and main source folders).
Is there a way to...
Run gradle jettyRun in a separate console.
Save a modified class in Eclipse (triggering a compilation).
See Jetty pick up the change and reload the context.
Based on my research, I've identified three workarounds, but none of them solves the question above (I'm posting them in case you have related comments or further recommendations):
Tweak the Gradle Eclipse config to direct test and main build output to the same directory that Gradle uses (using the pattern seen here; a sketch appears after this list). This is not recommended by some people, as it means using two different compilation systems that could interfere with each other.
Use the Gradle eclipse-wtp plugin to generate a WTP2 config, and use Eclipse's "Run As -> Run on Server". This accomplishes the hot deployment / iterative goal and keeps both systems (IDE and Gradle) isolated. However, you need to set up the server in Eclipse.
(Not really a workaround:) I've tested Spring's Eclipse build (STS) Gradle integration; however, the integration seems focused on project setup, and while Gradle builds can be triggered automatically, Eclipse compilation is still redirected to bin/.
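For reference, workaround 1 boils down to something like the following in build.gradle. This is a sketch only: the output path assumes Gradle's default layout, and the caveat about the two compilers interfering still applies.

```groovy
apply plugin: 'java'
apply plugin: 'eclipse'

// Point Eclipse's compiler output at the directory the Gradle build (and
// therefore jettyRun's scanner) uses, instead of the default bin/ folder.
eclipse {
    classpath {
        defaultOutputDir = file("$buildDir/classes/main")
    }
}
```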
So you are interested in fine-tuning hot-deployment, right?
Please consider using the Gretty Gradle plugin: https://github.com/akhikhl/gretty
It is an advanced Gradle plugin for running web apps on Jetty. It does exactly what you want regarding hot deployment (and possibly even more).
Disclosure: I am the author of the Gretty plugin.
If you don't want to switch to another plugin, here is a two-step workaround:
Add the configuration below to your build.gradle:
jettyRun {
    reload = "automatic"
    scanIntervalSeconds = 1
}
Each time you change Java code, run the following task:
gradle compileJava
Because Jetty is watching the *.class files, it will hot-reload only after the *.class files change.
Refer to this link: https://discuss.gradle.org/t/hot-deploy-with-jetty-plugins-jettyrun/7416

Keeping Eclipse project dependencies in sync with an external build system

Here is the situation. A development team has a large number (hundreds) of Eclipse projects. The code is very much in churn - new projects are being created; projects are being renamed and project dependencies are constantly changing. The external build system is ant. It is proving extremely challenging to keep the dependencies defined in the ant build files in sync with the state of the world in Eclipse. The external ant build needs constant changes to keep up. For various reasons, using ant as the default builder in Eclipse is not an option. The developers want to continue using Eclipse as the build and edit environment for local use.
Question: Is there a tool which will allow a single set of dependencies to be maintained which can be used by Eclipse as well as an external build system like ant?
I have heard of Gradle but never used it before. Would it make sense in this context? I am pretty sure Maven wouldn't work for what is needed.
The typical workflow should be:
1. Developers continue working as they currently do - creating and changing Eclipse project dependencies at will and using the default Eclipse builder to compile and test locally.
2. Some mechanism exists by which these dependencies can be carried into an external build system like Ant, with an external continuous build triggered on every check-in.
Appreciate your feedback - thanks!
We have been quite successful at using Gradle to tackle a similar problem. Here's an outline of the setup:
Each project contains a build.gradle that defines project specific dependencies and tasks (may even be empty).
A special master project contains build.gradle that sets up common dependencies and tasks for child projects, and/or injects settings pertinent to a group of child projects.
Logically the master project is the parent project, but it exists as a sibling folder so that Eclipse is more comfortable with it.
Gradle contains a built-in Eclipse plugin which allows generation of the Eclipse settings files for each project from the dependency information (including inter-project dependencies). It works nicely for simple projects, and for more complicated ones Gradle lets you tinker with the settings files, so you can do pretty much everything (a stripped-down example appears after this list). From here you have two options:
Don't store the Eclipse settings files in the repository, and run the generation task every time you do a fresh check-out (I prefer this option).
Tell Gradle to use custom variables so that it generates generic settings files which can be checked in to the repository. You'll then only need to run the generation task when dependencies or other configuration change.
(Optional) It's a little tricky, but you can make Gradle parse existing project ivy.xml files and set up dependencies from there. I had some success with this, although I would recommend converting dependencies into Gradle format for more flexibility.
Continuous build systems integrate with Gradle very well (same as with Ant). If you are using Jenkins (Hudson), there is a Gradle plugin.
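To make the outline concrete, a stripped-down master build.gradle might look roughly like this; the project layout, the common dependencies, and the settings shown are assumptions, not the actual setup described above:

```groovy
// master/build.gradle -- applied to every child project listed in settings.gradle
subprojects {
    apply plugin: 'java'
    apply plugin: 'eclipse'

    repositories {
        mavenCentral()
    }

    // common dependencies injected into all child projects
    dependencies {
        testCompile 'junit:junit:4.11'
    }

    // tinker with the generated Eclipse files where the defaults are not enough
    eclipse {
        classpath {
            downloadSources = true
        }
    }
}
```

Running gradle eclipse then (re)generates each project's .project and .classpath files, and the same dependency information drives the command-line build on the CI server.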
The advantage of using Gradle is that it scales pretty well, and you can support other IDEs like IntelliJ or NetBeans at the same time without much effort (unless you have lots of crazy custom settings). An advantage and a disadvantage is that Gradle is a powerful build system that requires learning Groovy and the Gradle DSL, which may take some time. Also, the documentation is awesome.
Gradle has a very active community with the sole purpose of tackling exactly this kind of problem.
Hope this helps, and best of luck!
How about parsing the .classpath files, generating a dependency tree, and building from the root? What you need is a convention for the layout of your projects, or a generic (Ant) build file that can be changed in each project if needed (e.g. for different project layouts). I'm not sure whether Eclipse Tycho could be used for that, since it is a set of Maven plugins for building Eclipse plug-ins and projects, but it is able to resolve bundle and project dependencies against Maven repositories and Eclipse update sites.
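A rough sketch of that parsing idea in Groovy; the workspace path is made up, and it assumes inter-project dependencies show up as 'src' classpath entries whose path starts with '/':

```groovy
// Collect, for every project in the workspace, the other workspace projects it
// depends on according to its Eclipse .classpath file.
def projectDeps = [:]
new File('/path/to/workspace').eachDir { projectDir ->
    def cpFile = new File(projectDir, '.classpath')
    if (!cpFile.exists()) return
    def classpath = new XmlSlurper().parse(cpFile)
    projectDeps[projectDir.name] = classpath.classpathentry
        .findAll { it.@kind.text() == 'src' && it.@path.text().startsWith('/') }
        .collect { it.@path.text().substring(1) }   // '/other-project' -> 'other-project'
}
// projectDeps now maps project name -> its workspace dependencies, which is
// enough to sort topologically and feed a generic Ant (or Gradle) build.
println projectDeps
```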

eclipse, one classpath for compiling, another for launching

example:
For logging, my code uses log4j, but other jars my code depends on use slf4j instead. So both jars must be on the build path. Unfortunately, it's now possible for my code to use (depend on) slf4j directly, either via content assist or some other developer's changes. I would like any use of slf4j to show up as an error, but my application (and tests) will still need it on the classpath when running.
explanation:
I'd like to find out if this is possible in Eclipse. This scenario happens often for me: I'll have a large project that uses a lot of third-party libraries, and of course those third-party jars have their own dependencies as well. So I have to include all the dependencies in the classpath ("build path" in Eclipse) for the application and its tests to compile and run (from within Eclipse).
But I don't want my code to use all of those jars, just the few direct dependencies I've decided on myself. So if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally as "class not found", but any error would do.
I know I can manually configure the classpath when running outside of Eclipse, and even within Eclipse I can modify the classpath for a specific class I'm running (in the run configurations), but that's not manageable if you run a lot of individual test cases or have a lot of main() classes.
It sounds like your project has enough dependency relationships that you might consider structuring it with OSGi bundles (plug-ins). Each bundle gets its own classloader and gets to specify what bundles (and optionally what version ranges, etc.) it depends on, what packages it exports, whether it re-exports stuff from its dependencies, etc.
Eclipse itself is structured out of Eclipse plug-ins and fragments, which are just OSGi bundles with an optional tiny bit of additional Eclipse wiring (plugin.xml, which is used to declare Eclipse "extension points" and "extensions") attached. Eclipse thus has fairly good tooling for creating and managing bundles built-in (via the Plug-in Development Environment). Much of what you find out there may lead you to conflate "OSGi bundle" with "plug-in that extends the Eclipse IDE", but the two concepts are quite separable.
The Eclipse tooling does distinguish rather clearly (and sometimes annoyingly, but in the "helpful medicine" way) between the bundles in your build environment vs. the bundles that a particular run configuration includes.
After a few years of living in OSGi land, the default Java "flat classpath" feels weird and even kind of broken to me, largely because (as you've experienced) it throws all JARs into one giant arena and hopes they can sort of work things out. The OSGi environment gives me a lot more control over dependency relationships, and as a "side effect" also naturally demands clarification of those relationships. Between these clear declarations and the tooling's enforcement of them, the project's structure is more obvious to everyone on the team.
if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally, as class not found, but any error would do.
Put your code in one plug-in, your direct dependencies in other plug-ins, their dependencies in other plug-ins, etc. and declare each plug-in's dependencies. Eclipse will immediately do exactly what you want. You won't be offered dependencies' dependencies' contents in autocompletes; you'll get red squiggles and build errors; etc.
Why not use access rules to keep your code clean?
It looks like this would be better managed with Maven, integrated into Eclipse with m2eclipse.
That way, you can execute only part of the Maven build lifecycle, and you can manage separate sets of dependencies per build step.
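The same separation can also be expressed in Gradle if you prefer it; a minimal sketch (the versions are examples, and any transitive dependency that drags slf4j onto the compile classpath would still need an explicit exclude):

```groovy
dependencies {
    // your own code compiles against log4j only
    compile 'log4j:log4j:1.2.17'

    // slf4j is on the classpath when the app and its tests run, but not at
    // compile time, so a direct reference to it in your code fails to compile
    runtime 'org.slf4j:slf4j-api:1.7.5'
    runtime 'org.slf4j:slf4j-log4j12:1.7.5'
}
```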
In my experience it helps to be more restrictive: I made the team fill out (paper) forms explaining why each jar was needed and under what license...
and they would rather type a few lines of code than drag along 20 jars just to open a file with one line of code, or for some other fancy 'feature'.
Using Maven can help for a while, but when you first spot jars with names like nightly-build or snapshot, you will know you're in jar hell.
Conclusion: choose your dependencies well.
Would using the slf4j-log4j12 binding jar be useful? It allows code written against slf4j to send its actual logging to log4j.