Referencing and deploying ILOG jrules jars within XOM of an IBM ODM rule - classpath

I am working on implementing some logic with cascading rules execution on IBM ODM. I was pretty much following these instructions from the documentation. Namely, I wrote some simple logic invoking one ruleset from another using a library class, and referenced two jar files:
jrules-res-8.10.5.1-execution.jar and
jrules-res-8.10.5.1-session-java.jar.
I took them from a local ODM installation and referenced them in Eclipse via the .classpath of a XOM project like this:
<classpathentry kind="var" path="ILOG_BR_HOME/WEB-INF/lib/jrules-res-8.10.5.1-session-java.jar"/>
The classpath variable ILOG_BR_HOME points to a local folder with the required jars.
The documentation states that I need to exclude the two jars from export, which I did.
The problem, however, occurred when I tried to deploy the updated ruleset to an ODM instance. I exported an archive of all projects (the XOM with the jar files, plus the updated BOMs) and imported it via the web interface of the Decision Center. The import was successful and, for instance, the verbalization of the new items was accessible in the web editor of action rules. However, I received the following errors upon deployment:
error: ilog.rules.teamserver.brm.builder.IlrBuildException: While applying business to execution (B2X) model mapping
GBREX0033E: Cannot load execution class 'io.cascadingruletest.RulesetChildRunner' for translating business class 'io.cascadingruletest.RulesetChildRunner', got '[java.lang.NoClassDefFoundError: ilog.rules.res.session.config.IlrSessionFactoryConfig]'
GBREX0001E: Cannot find execution class 'ilog.rules.res.session.IlrSessionException' for translating business class 'ilog.rules.res.session.IlrSessionException'
GBREX0033E: Cannot load execution class 'io.cascadingruletest.contracts.ValueContext' for translating business class 'io.cascadingruletest.contracts.ValueContext', got '[java.lang.NoClassDefFoundError: ilog.rules.res.session.IlrSessionException]'
From what I gather, this means the referenced jars are not accessible during deployment. I tried various ways of deploying, putting the jars directly in the XOM archives or in the root of the archive to be imported, with the same results. Having tried to find documentation on how exactly the linking should be done in this situation, I have to admit I am confused and don't understand some of the principles here:
Should the jrules* jars be present inside the zip archive of the XOM? Or are they already part of the installation and should simply be referenced properly? (I checked: the same jars I have locally are also present on the remote instance of ODM I was deploying to.) Or maybe they should be deployed separately?
What is the preferred way of linking external (and internal) libraries in order to invoke them in the XOM of a rule? Having looked through some examples in the documentation and on GitHub, I've encountered several ways:
a. Via the .classpath file (see my example above or the GitHub example)
b. Via the .ruleproject file:
<entries xsi:type="ilog.rules.studio.model.xom:SystemXOMPathEntry" name="jrules-teamserver.jar" url="file:C:/Program Files/IBM/ODM8104/teamserver/lib/jrules-teamserver.jar" exported="true"/>
Taken from here
c. Reference in a deploy ant script with res-deploy-xom task:
<target name="runloanvalidation">
    <res-deploy-xom
        hostname="localhost"
        portnumber="9080"
        webapp="res"
        userid="resAdmin"
        password="resAdmin"
        jarMajorVersion="false"
        libName="person"
        outputRulesetProperty="ruleset.managedxom.uris">
        <xompath>
            <fileset dir="hello">
                <include name="*.jar"/>
            </fileset>
        </xompath>
    </res-deploy-xom>
    <echo message="Resulting property: ${ruleset.managedxom.uris}"/>
</target>
Taken from documentation.
I do not have an ant script for deployment at the moment; previously I've been deploying either from Eclipse, by syncing with the Decision Center, or by exporting a zip and importing it via the web interface of an ODM instance.
d. Referenced as external jars and libraries via the Explorer tab of the Rule Execution Server console, using the Add Library Reference button
If the jrules jar files are already present on an ODM instance and should simply be referenced properly, what are the ways to check the relative paths and list all accessible libraries? And do I need to grant specific rights and privileges to make such references?
Thanks in advance for your help!
PS I am using version 8.10.5.1 of ODM Decision Center, and Eclipse as an IDE with the Rule Designer plugin.

The RulesetChildRunner mentioned in the documentation is an interesting way to cascade ruleset executions. However, it is a bit complex to use.
If you have a simple use case and you can handle your own rule project dependencies, I would suggest creating a top-level rule project that orchestrates the sequence of the two main ruleflows you want to cascade.
That would be simpler to manage than creating custom code.
If you need to pass parameters from the decisionoperation1 ruleset to the decisionoperation2 ruleset, you can do that using a BAL rule to initialize parameter_ruleset2 from the parameter (or XOM) info of ruleset1.
Hope this helps,
Best
Emmanuel

Related

mybatis-generator: Create non-java files?

I am using the mybatis-generator in a maven project to generate the Java files for a few tables. At the end of the generation, I would like to generate a few non-java files like properties files and resources. However the default generator allows me to generate only XML and Java files. Is there any way to also get the generator to create sql files, SPI definitions and property files for example?
Looking inside the generator, it seems that the generated Java and XML files go through some further processing (formatting et al.). Even if I write a custom plugin, I can only generate an XML or a Java file, not a properties or SQL file. Even if I did, I could not get the process to finish, because the subsequent steps would fail.
Currently, I am getting around this by creating my own files and writing them through a custom plugin. However, during the plugin execution, the folder target/generated-sources/mybatis-generator has not been created yet, so assuming that location already exists is ruled out. On the other hand, if I go ahead and create the folder and its internal META-INF/services folder, I am not sure whether it will be overwritten at a later stage. In addition, my plugin does not (by virtue of the way the generator initializes plugins) have access to the project root folder, so that is not an option either.
I don't have access to the ShellCallback either, which means postponing the file creation to a well-defined point in the build process is also not possible.
So how do I go about creating the service definitions and the additional resource files?
The last resort is to hard-code the project folder, or to pass the project folder in through a property; this is what is coming to my rescue now. But clearly, the generated files are being detected by my git client, and I have to clean these files up as well, despite their being generated.
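To make the last-resort workaround concrete, here is a minimal sketch of writing the extra SPI definition and properties files directly, given a project root passed in externally (e.g. via a system property). The class and method names are hypothetical helpers, not part of the mybatis-generator API:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

// Hypothetical helper for the workaround: write the additional resource
// files directly, using a project root supplied from outside (e.g.
// -DprojectRoot=...), since the plugin cannot obtain it itself.
public class ExtraFilesWriter {

    public static Path writeSpiDefinition(Path projectRoot, String serviceInterface,
                                          String implementation) throws IOException {
        // SPI definitions live under META-INF/services/<fully-qualified interface name>
        Path servicesDir = projectRoot.resolve(
                "target/generated-sources/mybatis-generator/META-INF/services");
        // Create the folder ourselves, since the generator may not have created it yet.
        Files.createDirectories(servicesDir);
        Path spiFile = servicesDir.resolve(serviceInterface);
        Files.writeString(spiFile, implementation + System.lineSeparator());
        return spiFile;
    }

    public static Path writeProperties(Path projectRoot, String fileName,
                                       Properties props) throws IOException {
        Path dir = projectRoot.resolve("target/generated-sources/mybatis-generator");
        Files.createDirectories(dir);
        Path file = dir.resolve(fileName);
        try (OutputStream out = Files.newOutputStream(file)) {
            props.store(out, "generated");
        }
        return file;
    }
}
```

Since the files land under target/, they would at least be cleaned by a normal build and stay out of version control if target/ is ignored.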
Hints please?
Thanks in advance.
Rahul
The generator currently supports Java, Kotlin, and XML file generation. There is an open feature request to support other file types in plugins. You can follow it here: https://github.com/mybatis/generator/issues/752

Don't load/scan class files from a specific jar

I'd like to know how to configure the maven-bundle-plugin (backed by bnd) to completely ignore the classes contained within an embedded jar.
Background
I'm working in a controlled environment where the environment my code is running on is defined by a single company (including all the tools). The code is java and uses OSGi to define module dependencies.
Some of the provided modules contain what look like invalid class files; I can only assume that the system 'corrects' these class files before it tries to load them into any kind of JVM. In any case, these class files work when deployed onto the target system.
I'm trying to create a build system based on Maven that can produce packages the system understands, and I have hit a problem where these invalid class files are being read by bnd (via Apache Felix), which causes errors.
I'd like a way to have the jars that contain these class files on the class path of the bundle but where the contained .class files aren't read/processed by bnd. I could settle for simply ignoring the errors and continuing but can't find a way to do that either without felix aborting the entire build phase.
I just found the -failok directive; I don't know why I didn't find it before. Adding <_failok>true</_failok> to the instructions allows me to continue working.
See instructions-ref
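For reference, the directive goes inside the maven-bundle-plugin instructions in the pom; a sketch (your plugin version and the rest of the configuration will differ):

```xml
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- tell bnd to report errors but keep going instead of failing the build -->
      <_failok>true</_failok>
    </instructions>
  </configuration>
</plugin>
```

Note that this suppresses all bnd errors for the bundle, not just the ones caused by the unreadable class files.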

How to Create a Spring+Primefaces+Hibernate (no maven) project in eclipse?

I am new to J2EE. I would like to create a Spring+Primefaces+Hibernate project.
I googled for it.
But I found all projects examples show in internet contains maven. My questions are
Is it possible to create a Spring+PrimeFaces+Hibernate project in Eclipse without Maven? If not, what is the need for Maven?
How do I add the jar files of PrimeFaces, Spring, and Hibernate in Eclipse?
Will the Spring controller XML file (Spring context or dispatcher servlet) be created automatically or manually? I mean Spring MVC.
Will the Hibernate file (mapping file) also be created automatically or manually?
If possible, can anyone guide me to tutorial (preferably video) to implement the same?
I am using tomcat 7 and Eclipse - kepler.
Any help is appreciated.
If this is downvoted, please also specify the reason.
Although it's not a must to use Maven or any other build tool, you should strongly consider using one. Eclipse Kepler has Maven support by default, but feel free to use other build tools (Gradle, Ant) or none (see 2). Maven and the other build tools remove the headache of scaffolding, searching for dependencies (external jars like spring-mvc, hibernate, some DB drivers), and even deploying applications to a server.
If you choose not to use a build tool, you have to manually get your project dependencies and enter them into your project's build path (Right Click -> Build Path, then enter their location). As you have noticed, this step can be really time consuming...
No, you have to manually create the configuration, unless you use another project that already has what you need; again, this might get easier with a build tool (Maven archetypes, for example).
The same as 3.
You won't have a hard time finding resources about these technologies, they are being used practically everywhere, and I think the Spring team has some videos in their YouTube channel.
Hope that helps a little!
1:* The fundamental difference between Maven and Ant is that Maven's design regards all projects as having a certain structure and a set of supported task work-flows (e.g., getting resources from source control, compiling the project, unit testing, etc.). While most software projects in effect support these operations and actually do have a well-defined structure, Maven requires that this structure and the operation implementation details be defined in the POM file. Thus, Maven relies on a convention on how to define projects and on the list of work-flows that are generally supported in all projects.
This design constraint resembles the way that an IDE handles a project, and it provides many benefits, such as a succinct project definition, and the possibility of automatic integration of a Maven project with other development tools such as IDEs, build servers, etc.
But one drawback to this approach is that Maven requires a user to first understand what a project is from the Maven point of view, and how Maven works with projects, because what happens when one executes a phase in Maven is not immediately obvious just from examining the Maven project file. In many cases, this required structure is also a significant hurdle in migrating a mature project to Maven, because it is usually hard to adapt from other approaches.
In Ant, projects do not really exist from the tool's technical perspective. Ant works with XML build scripts defined in one or more files. It processes targets from these files and each target executes tasks. Each task performs a technical operation such as running a compiler or copying files around. Targets are executed primarily in the order given by their defined dependency on other targets. Thus, Ant is a tool that chains together targets and executes them based on inter-dependencies and other Boolean conditions.
The benefits provided by Ant are also numerous. It has an XML language optimized for clearer definition of what each task does and on what it depends. Also, all the information about what will be executed by an Ant target can be found in the Ant script.
A developer not familiar with Ant would normally be able to determine what a simple Ant script does just by examining the script. This is not usually true for Maven.
However, even an experienced developer who is new to a project using Ant cannot infer what the higher level structure of an Ant script is and what it does without examining the script in detail. Depending on the script's complexity, this can quickly become a daunting challenge. With Maven, a developer who previously worked with other Maven projects can quickly examine the structure of a never-before-seen Maven project and execute the standard Maven work-flows against it while already knowing what to expect as an outcome.
It is possible to use Ant scripts that are defined and behave in a uniform manner for all projects in a working group or an organization. However, when the number and complexity of projects rises, it is also very easy to stray from the initially desired uniformity. With Maven this is less of a problem, because the tool always imposes a certain way of doing things.
2:* You have to download all the required jar files for Hibernate/Spring/PrimeFaces from the internet and place them in your project build path or in the lib folder.
3:* Spring configuration files need to be created by you so that you can get the concept.
4:* Hibernate mapping files can be created by using reverse engineering techniques for Hibernate, from which you can generate hbm files, or you can use annotations if you don't want XML.
I suggest you first create a sample Java project in Eclipse, then download all the required jars and place them in the lib folder. Then configure Hibernate in the project and the Spring integration.
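As a sketch of the kind of Spring MVC configuration you would create by hand for point 3 (the package name here is an example, not taken from your project):

```xml
<!-- dispatcher-servlet.xml: minimal hand-written Spring MVC configuration (illustrative) -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:mvc="http://www.springframework.org/schema/mvc"
       xsi:schemaLocation="
         http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
         http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
         http://www.springframework.org/schema/mvc http://www.springframework.org/schema/mvc/spring-mvc.xsd">

  <!-- scan for @Controller classes; the base package is an example -->
  <context:component-scan base-package="com.example.web"/>
  <mvc:annotation-driven/>
</beans>
```

You would then register the DispatcherServlet in web.xml and point it at this file.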

How can you develop bottom-up JAX-WS web services referencing classes contained in separate jar files?

I am developing a Java EE 6 bottom-up JAX-WS web service to expose an EJB 3.1 stateless session bean. The web service, packaged in a WAR, is failing to install on deployment because it references an external jar (or shared library) which one can assume is not loaded yet.
The common suggestion is to include the jars in the /lib folder, which does fix the issue, however the jars need to remain in this external shared library location and NOT in the ear file, because they amount to 30MB.
What are some techniques to get around this issue in a WebSphere (WAS v8) environment, or any server environment?
Some suggestions I have found include:
1. define classpath in META-INF file.
2. define the resources in deployment.xml
3. alter class loading order
4. (from ibm) In the case where the jars are part of a Shared Library configured on WebSphere Application Server, then a User Library must be used to configure the project for development before generating the WebService.
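For suggestion 1, the entry is just a space-separated Class-Path attribute in the module's META-INF/MANIFEST.MF; a sketch (the jar paths are examples, relative to the EAR root):

```text
Manifest-Version: 1.0
Class-Path: sharedlib/service-api.jar sharedlib/service-model.jar
```

Manifest lines must stay under 72 bytes, with continuation lines starting with a single space.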
However, I have been unsuccessful to find any help online in these areas. Is there another technique or does anyone know anything about accomplishing this? Thanks in advance.
EDIT: If I specify the libraries in META-INF using Class-Path, they are loaded before extensions, shared libraries, etc., but they are still loaded after the WAR, which is not good. Again, this isn't a runtime issue, because the web services are created on the fly at deployment.
I submitted a ticket to IBM. The libraries referenced by the web service are needed during deployment and must be bundled into the EAR in some fashion; I threw them in the WEB-INF/lib folder. However, if the referenced libraries then depend on additional libraries, those can be placed in the Shared Libraries. Seems odd to me too, but let's all just admit "shared libraries" are a hack anyway.
If you still have issues, just make sure your class loading is set to parent_last.

Spring Data repository scanning in different maven modules

Is it possible to get Spring Data to scan separate modules for repositories?
I have created a repository in one maven module and wish to access it from another on which it has a dependency.
However I cannot figure out the configuration to tell it to scan in multiple modules/jar files.
In the logs I am seeing multiple references to scanning "core-engine", whereas the repository that I require is sitting in "test-model":
DEBUG main - core.io.support.PathMatchingResourcePatternResolver - Searching directory
[<path>\engine\core-engine\target\test-classes\] for files matching pattern
[<path>/engine/core-engine/target/test-classes/**/model/repository/**/*.class]
The project has a number of modules but there are only 2 that should have an impact in this case and they are "core-engine" and "test-model".
"test-model" contains all of the configuration i.e. the repository definitions, the entities and the repository interfaces.
"core-engine" has a dependency on "test-model".
I am using SpringRunner to run my tests and have tried referring to the ContextConfiguration in "test-model" directly, or indirectly by importing the repository config XML into a separate "core-engine" config, to no avail.
I have tests running within the "test-model" module which use the repositories, my problem is just getting access to these repositories from "core-engine".
--> test-model (maven module)
---->src/main/java
------>com.test.model.domain (various simple Entities)
------>com.test.model.repository (the repository interfaces)
---->src/main/resources
------>META-INF/datapump/dao-jpa-repository.xml
---->src/test/java
------>com.test.model.domain (various simple CRUD tests using the repositories)
---->src/test/resources
------>META-INF/test-context.xml (defines the application context and imports dao-jpa-repository)
dao-jpa-repository.xml contains a line which is found and testable within the test-model module
core-engine has a dependency on test-model.
--> core-engine (maven module)
---->src/main/java
------>com.test.model.inject (classes which attempt to use the repositories defined in test-model)
---->src/test/java
------>com.test.model.inject (tests for the above classes)
---->src/test/resources
------>META-INF/test-context.xml (defines the application context and also imports dao-jpa-repository from test-model)
From above I have a test in the core-engine that tries to persist an entity from the test-model using its repository. However I cannot get access to the repository (through autowiring or by manually looking it up) as it appears that the repository is not in the context.
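For reference, repository scanning in XML-configured Spring Data JPA is driven by an element like the following; the base-package matches the layout above, while the surrounding namespace declarations and any entityManagerFactory/transactionManager bean references are assumptions about your setup:

```xml
<!-- sketch of the repository-scan element; base-package must cover the
     interfaces in test-model regardless of which module imports this file -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:jpa="http://www.springframework.org/schema/data/jpa"
       xsi:schemaLocation="
         http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
         http://www.springframework.org/schema/data/jpa http://www.springframework.org/schema/data/jpa/spring-jpa.xsd">

  <jpa:repositories base-package="com.test.model.repository"/>
</beans>
```

The log excerpt suggests the scan is only looking under core-engine's own output directories, so whatever context core-engine builds needs to declare (or import) a scan that targets com.test.model.repository.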
If anyone could help I'd appreciate it.
Cheers