Beginning jBPM 6: questions about integration and deployment expectations

I have been researching the jBPM/Drools engine and have been working on a proof of concept using jBPM 6 and the KIE Workbench. The proof of concept is for page navigation: the process receives a variable representing an action the user took on the page, the engine uses that variable to decide which page will be displayed to the user next, and it returns that value.
I have created this navigation example in Drools and jBPM, with the "user action" variable mapped as a parameter before starting the process. Both a Drools and a jBPM application were created through Eclipse, where the process runs from a JbpmJUnitBaseTestCase subclass with console printouts that show me both processes are working as I expect.
Now I am working with the KIE Workbench in an effort to construct the same projects but produce a jar file that can be consumed by a standalone application. Currently I am having trouble finding information for some questions, and I am not sure whether my expectations exceed what the KIE Workbench was designed to do.
Overall, I would like to have a jar file deployed from the KIE Workbench that can be used in a standalone application. It will use a data object from the data modeler that can be assigned values from the standalone application. I will then run the business process from the standalone application to get back a result to work with and load the corresponding page.
I have accessed the data object included in the deployed jar; can that jar also contain a knowledge base and session that I can attach the data object to and run the process against? I would eventually like a jar file I can include in a web application where I can instantiate the data object class, assign values, and then attach and run a session without adding any additional libraries to the standalone application.
Thank you in advance.

If you managed to create your project(s) with all the assets and data models, you can build and deploy the project, and that will generate a Maven artifact (jar) which will be installed in a local Maven repository. You can consume that jar inside your standalone application. If you are using Maven in your standalone application, you just need to add the dependency and the kie-wb repository to your project and it should work.
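As a sketch, the consuming application's pom.xml would gain entries along these lines. The group/artifact/version coordinates and the repository URL are placeholders; use the GAV shown for your project in the Workbench and your own server's address:

```xml
<!-- Dependency on the kjar built and deployed by the KIE Workbench.
     The coordinates below are placeholders for your project's GAV. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>navigation-process</artifactId>
  <version>1.0.0</version>
</dependency>

<!-- The Workbench's built-in Maven repository; host and context path
     are placeholders for your own installation. -->
<repository>
  <id>kie-wb</id>
  <url>http://localhost:8080/kie-wb/maven2/</url>
</repository>
```

With that in place, Maven resolves the kjar like any other dependency, so the data object classes and process resources inside it are on your application's classpath.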

Related

ZK Project , Need Structural Advice for multiple ZK war applications

I need some design advice. The structure of my application is as follows.
I have three modules built with the ZK framework, each as a separate WAR (web) application: finance-module, general-ledger, and cash-account. All are separate WAR files that can be deployed on Tomcat individually.
Now I want a separate ZK WAR application that has an index or home page with a menu, and from that menu I want to be able to call these three modules/WAR applications.
-------------------------- Main module -------------
Menu: general-ledger link, cash-account link, finance-module link
1) This fourth main module also supports user login and changing user preferences, which means it can also contain code such as view modules as well as Spring services.
Now the question is: how do I call the ZUL pages of the other WAR files, how do I manage them from the fourth module, and how do I share the session across the four WAR files or applications?
Thanks
Vikas
I have worked in a similar scenario and this worked well for us:
When you create separate WARs, each one runs independently in its own web context inside Tomcat. Usually, when you want to share resources like data sources, transaction managers, or any other kind of JEE resource, you need to configure them at the Tomcat instance level; they will then be available in the JNDI directory of the Tomcat server, and any web application can pull any of them in to use it.
https://tomcat.apache.org/tomcat-8.0-doc/config/context.html
https://tomcat.apache.org/tomcat-8.0-doc/jndi-resources-howto.html
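For example, a shared data source can be declared once at the Tomcat instance level (in conf/context.xml, or under GlobalNamingResources in server.xml). The JNDI name, driver, and connection settings below are placeholders:

```xml
<!-- conf/context.xml: a JDBC data source visible to every deployed
     web application on this Tomcat instance. All values shown here
     (name, driver, URL, credentials) are placeholders. -->
<Resource name="jdbc/SharedDS"
          auth="Container"
          type="javax.sql.DataSource"
          driverClassName="org.postgresql.Driver"
          url="jdbc:postgresql://localhost:5432/accounts"
          username="app" password="secret"
          maxTotal="20" maxIdle="5"/>
```

Each WAR can then look the resource up under java:comp/env/jdbc/SharedDS (declaring a matching resource-ref in its own web.xml), so all four applications share the same pool.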
There are several JEE resources that you can share in the Tomcat server, but if you need more flexibility and robustness you might need a full-stack Java enterprise application server like WildFly (http://wildfly.org/news/2014/11/20/WildFly82-Final-Released/) or a commercial one such as WebSphere, Oracle WebLogic, etc.
Now, if you want to share Java classes, ZUL files, or any other file, you may want to package them in separate common jars, use them in any WAR as a dependency, and then reference them through the classpath of the web application. To organize and maintain these modular projects, Maven and Gradle are very good tools you can use.
From ZK you can call any URL of the other WARs as simply as:
<a href="http://myserver/account/home.zul" label="Account"/>
<a href="http://myserver/finance/home.zul" label="Finance"/>
To share the session, what you need is to implement single sign-on (there are other implementations, like Oracle OpenSSO). You can configure it directly in Tomcat, but be aware of the issues discussed in "Sharing security context between few web applications".
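On the Tomcat side, container-level single sign-on is enabled with the SingleSignOn valve inside the Host element of server.xml. Note that this shares the authenticated user across the WARs on that host, not arbitrary session attributes:

```xml
<!-- conf/server.xml: once a user authenticates in any WAR deployed on
     this Host, the other WARs on the same Host recognize the sign-on. -->
<Host name="localhost" appBase="webapps">
  <Valve className="org.apache.catalina.authenticator.SingleSignOn"/>
</Host>
```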
Spring Security has excellent support for this kind of scenario.

jBPM 6 - deploy process definition from API to jbpm-console

I've created a process definition in jBPM Project in Eclipse and now I'd like to deploy this definition to jbpm-console on remote database.
I found two ways to deploy a process (here: https://developer.jboss.org/thread/234899), but neither is what I want:
- use an archetype to create a Maven project for the kjar, then simply run mvn clean install and use the Deployments view in the jBPM console to deploy it
- push your Maven project into the jBPM console Git repository and build and deploy it from within the console; there is a Git integration screencast in the jBPM installer chapter of the docs that might be useful
Is it possible to do this from the API, i.e. programmatically?
If I understand your question well, you are looking for a remote API call that allows you to upload your process definitions into the jBPM Console. Am I right?
Unfortunately, there is no such option. The remote API only provides methods to manipulate the resources that are already on the server. You can get your resources there using one of the two methods you have mentioned.
However, for process definitions there is also a third option, which is more user-friendly although there is no easy way to automate it: you can create a new business process directly using the web interface of the jBPM Console and edit your process definition in the jBPM Designer.

Is it possible to apply changes to JSF files without republishing?

I'm using IBM RAD version 8.0 and deploying the EAR applications to IBM WebSphere 7.0. Each time I change a JSF file, I need to republish the application, otherwise the changes are not visible.
Publishing takes some time, so it usually takes at least a minute before I'm able to see the effects of even the most minor change. In "normal" application development it takes a few seconds, which is crucial for someone who is no JSF expert, is still learning, and needs to experiment...
Is it possible to use JSF's ability to reload the page definition without restarting the application when working with IBM RAD and WebSphere? Or will I be forced to create a second environment with Eclipse & Tomcat, for JSF experiments only?
This is normally configured in the server configuration. Double-click the desired server in Eclipse's Servers view and head to the Publishing section.
Note that you should take the Facelet cache into account as well, particularly when using MyFaces, which caches relatively aggressively. If you make sure that the javax.faces.PROJECT_STAGE context parameter is set to Development, then both MyFaces and Mojarra will relax the Facelet caching strategy, causing the Facelet file to be recompiled almost instantly instead of the cached version being used for a rather long time.
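Concretely, the stage is set via a standard JSF context parameter in web.xml:

```xml
<!-- web.xml: put JSF into the Development project stage so Facelets
     are recompiled on change instead of being served from the cache. -->
<context-param>
  <param-name>javax.faces.PROJECT_STAGE</param-name>
  <param-value>Development</param-value>
</context-param>
```

Remember to switch this back to Production before deploying for real, since the Development stage trades performance for faster feedback.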
An alternative to the publishing setting is to use JRebel. It is able to publish changes to Java classes such as managed beans and EJBs as well, saving a lot of hot-deployment time. It has an Eclipse plugin, too.
This thread is old, but I still had the same problem using Eclipse and WebSphere.
One place to check is this: if you use JSF files with the .xhtml extension, you have to make sure that changes to these files do not trigger automatic republishing.
In the tab "Servers" double-click on your server.
Open the "Publishing settings for WebSphere Application Server"
Click on "Set Advanced Publishing Settings...".
In the "List of file extensions that do not trigger the server to publish ..." insert or append ", *.xhtml".
Close these settings and restart the server.
In web.xml I have also added a parameter with the name javax.faces.PROJECT_STAGE and the value Development, which may have an influence on the offending behavior.

Are there disadvantages to setting up unit or integration tests in Eclipse as a separate project?

I'm currently working on a project using Eclipse where the unit and integration tests are in one project that also contains the DAO and service layer, and there is another project that includes the Web interface. The Web interface contains the Spring configuration files, and instead of duplicating them for the tests in the DAO project, I want to reference the ones that already exist. However, as I started thinking about it, if this is possible, why not just move them into their own project completely and setup project dependencies. Has anyone done this, and do you have an example of this setup, or can you provide some roadblocks you encountered?
I went ahead with this approach, and it doesn't appear to be causing any issues so far. One of our projects has a (classpath) dependency on the other, but the third test project is able to manage that with some setup and configuration.

Is it possible to use Spring within Eclipse plugins?

Is it possible to use a Spring container for DI from inside Eclipse plugins?
I'm wondering because I know that Eclipse causes a lot of issues with class loading, looking up things within the plugin, etc.
The plugin is intended to be distributed as a JAR.
Yes, but you will need Spring DM: http://www.springsource.org/osgi
The answer is yes. You can use Spring DM, but you don't have to. It is probably better with it.
I did it without Spring DM, and the main concern is class loading issues (I'm not sure whether Spring DM solves them, but I guess it should). Assuming you bundle the Spring JAR in a separate plugin with dependencies, you will need to load the context with the class loader of the invoking plugin.
Example:
Plugin A - your functional plugin
Plugin B - The Spring lib plugin exporting the spring packages
Plugin A depends on B. When plugin A starts, it will load the application context; when invoking this load, you will need to do something like:
Thread.currentThread().setContextClassLoader(PluginAActivator.class.getClassLoader())
so that the loading of the classes happens under your own class loader. Now you can use a ClassPathXmlApplicationContext to load configuration XMLs from your classpath.
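A minimal sketch of that save/swap/restore pattern is below. PluginAActivator and the Spring context creation are stand-ins for the real plugin code (shown only in comments), so the class loader swap itself stays runnable without OSGi or Spring on the classpath:

```java
// Sketch of the thread-context-class-loader swap described above.
// In a real Eclipse plugin, pluginLoader would be
// PluginAActivator.class.getClassLoader() (hypothetical activator name).
public class ContextLoaderDemo {

    public static void main(String[] args) {
        ClassLoader original = Thread.currentThread().getContextClassLoader();
        // Stand-in for the plugin's own class loader:
        ClassLoader pluginLoader = ContextLoaderDemo.class.getClassLoader();
        try {
            Thread.currentThread().setContextClassLoader(pluginLoader);
            // Here the real plugin would create the Spring context, e.g.:
            //   ApplicationContext ctx =
            //       new ClassPathXmlApplicationContext("beans.xml");
            // so Spring resolves resources and classes via pluginLoader.
            System.out.println("swapped="
                + (Thread.currentThread().getContextClassLoader() == pluginLoader));
        } finally {
            // Always restore the original loader, or later code on this
            // thread will see the wrong class loader.
            Thread.currentThread().setContextClassLoader(original);
        }
        System.out.println("restored="
            + (Thread.currentThread().getContextClassLoader() == original));
    }
}
```

The try/finally restore is the important part: Eclipse reuses threads across plugins, so leaving a foreign context class loader installed can break unrelated code.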
One small note: the default ClassPathXmlApplicationContext validates your XMLs upon loading. You may want to disable this or point your XMLs to a local schema (rather than the standard Spring schema on springframework.org); otherwise, it will connect to the internet to download the schema files upon loading, and working offline will fail.
Do you have a code example for your post?
This would be great, since I've been stuck on this for a while.
Cheers!