Is it possible to get Spring Data to scan separate modules for repositories?
I have created a repository in one Maven module and wish to access it from another module which depends on it.
However, I cannot figure out the configuration to tell Spring to scan multiple modules/jar files.
In the logs I am seeing multiple references to scanning "core-engine", whereas the repository that I require sits in "test-model":
DEBUG main - core.io.support.PathMatchingResourcePatternResolver - Searching directory
[<path>\engine\core-engine\target\test-classes\] for files matching pattern
[<path>/engine/core-engine/target/test-classes/**/model/repository/**/*.class]
The project has a number of modules, but only two should have an impact in this case: "core-engine" and "test-model".
"test-model" contains all of the configuration, i.e. the repository definitions, the entities and the repository interfaces.
"core-engine" has a dependency on "test-model".
I am using SpringRunner to run my tests and have tried pointing @ContextConfiguration at the config in "test-model" itself, or indirectly importing the repository config XML into a separate "core-engine" config, to no avail.
I have tests running within the "test-model" module which use the repositories, my problem is just getting access to these repositories from "core-engine".
--> test-model (maven module)
---->src/main/java
------>com.test.model.domain (various simple Entities)
------>com.test.model.repository (the repository interfaces)
---->src/main/resources
------>META-INF/datapump/dao-jpa-repository.xml
---->src/test/java
------>com.test.model.domain (various simple CRUD tests using the repositories)
---->src/test/resources
------>META-INF/test-context.xml (defines the application context and imports dao-jpa-repository)
dao-jpa-repository.xml contains the repository declaration, which is found and testable within the test-model module.
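For reference, here is a minimal sketch of what that declaration presumably looks like; the base-package value is an assumption inferred from the package layout above:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:jpa="http://www.springframework.org/schema/data/jpa"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/data/jpa
           http://www.springframework.org/schema/data/jpa/spring-jpa.xsd">

    <!-- Scans the base package (and sub-packages) for Spring Data repository interfaces -->
    <jpa:repositories base-package="com.test.model.repository"/>

</beans>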
core-engine has a dependency on test-model.
--> core-engine (maven module)
---->src/main/java
------>com.test.model.inject (classes which attempt to use the repositories defined in test-model)
---->src/test/java
------>com.test.model.inject (tests for the above classes)
---->src/test/resources
------>META-INF/test-context.xml (defines the application context and also imports dao-jpa-repository from the test-model)
From the above, I have a test in core-engine that tries to persist an entity from test-model using its repository. However, I cannot get access to the repository (through autowiring or by manually looking it up), as it appears that the repository is not in the context.
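One thing worth checking (a suggestion, not a confirmed fix): make sure the core-engine test context imports the repository config with a classpath*: prefix, so that Spring searches dependency jars as well as the local module's own output directories:

<!-- core-engine: src/test/resources/META-INF/test-context.xml (sketch) -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- classpath*: also matches resources packaged inside the test-model jar -->
    <import resource="classpath*:META-INF/datapump/dao-jpa-repository.xml"/>

</beans>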
If anyone could help I'd appreciate it.
Cheers
I am working on an implementation of some logic with cascading rules execution on IBM ODM. I was pretty much following these instructions from the documentation. Namely, I wrote some simple logic invoking one ruleset from another, using a library class, and referenced two jar files:
jrules-res-8.10.5.1-execution.jar and
jrules-res-8.10.5.1-session-java.jar.
I took these from a local ODM installation and referenced them in Eclipse via the .classpath of a XOM project, like this:
<classpathentry kind="var" path="ILOG_BR_HOME/WEB-INF/lib/jrules-res-8.10.5.1-session-java.jar"/>
Classpath variable ILOG_BR_HOME points to a local folder with required jars.
The documentation states that I need to exclude the two jars from export, which I did.
The problem, however, occurred when I tried to deploy the updated ruleset to an instance of ODM. I exported an archive of all projects (the XOM with jar files and the updated BOMs) and imported it via the web interface of the Decision Center. The import was successful and, for instance, the verbalization of new items was accessible via the web editor of action rules. However, I received the following errors upon deployment:
error ilog.rules.teamserver.brm.builder.IlrBuildException: While applying business to execution (B2X) model mapping
GBREX0033E: Cannot load execution class 'io.cascadingruletest.RulesetChildRunner' for translating business class 'io.cascadingruletest.RulesetChildRunner', got '[java.lang.NoClassDefFoundError: ilog.rules.res.session.config.IlrSessionFactoryConfig]'
GBREX0001E: Cannot find execution class 'ilog.rules.res.session.IlrSessionException' for translating business class 'ilog.rules.res.session.IlrSessionException'
GBREX0033E: Cannot load execution class 'io.cascadingruletest.contracts.ValueContext' for translating business class 'io.cascadingruletest.contracts.ValueContext', got '[java.lang.NoClassDefFoundError: ilog.rules.res.session.IlrSessionException]'
From what I gather, this means that the referenced jars are not accessible for deployment. I tried various ways of deploying, putting the jars directly in the XOM archives, or in the root of the archive to be imported, with the same results. Having tried to find documentation on how exactly the linking should be done in this situation, I have to admit I am confused and don't understand some principles here:
Should the jrules* jars be present inside a zip archive of the XOM? Or are they already a part of the installation and should simply be referenced properly? (I checked: the same jars I have locally are also present inside the remote instance of ODM I was deploying to.) Or maybe they should be deployed separately?
What is the preferred way of linking external (and internal) libraries in order to invoke them in the XOM of a rule? Having looked through some examples in the documentation and on GitHub, I've encountered several ways:
a. Via the .classpath file (see my example above or the GitHub example)
b. Via .ruleproject file:
<entries xsi:type="ilog.rules.studio.model.xom:SystemXOMPathEntry" name="jrules-teamserver.jar" url="file:C:/Program Files/IBM/ODM8104/teamserver/lib/jrules-teamserver.jar" exported="true"/>
Taken from here
c. Reference in a deploy ant script with res-deploy-xom task:
<target name="runloanvalidation">
  <res-deploy-xom
      hostname="localhost"
      portnumber="9080"
      webapp="res"
      userid="resAdmin"
      password="resAdmin"
      jarMajorVersion="false"
      libName="person"
      outputRulesetProperty="ruleset.managedxom.uris">
    <xompath>
      <fileset dir="hello">
        <include name="*.jar"/>
      </fileset>
    </xompath>
  </res-deploy-xom>
  <echo message="Resulting property: ${ruleset.managedxom.uris}"/>
</target>
Taken from documentation.
I do not have an ant script for deployment at the moment; I've previously been deploying either by using Eclipse to sync with the Decision Center, or by exporting a zip and importing it using the web interface of an ODM instance.
d. Referenced as external jars and libraries using the Explorer tab of the Rule Execution Server console, via the Add Library Reference button
If the jrules jar files are already present on an instance of ODM and should simply be referenced properly, what are the ways to check the relative paths and list all accessible libraries? And do I need to grant specific rights and privileges to make such references?
Thanks in advance for your help!
PS: I am using version 8.10.5.1 of the ODM Decision Center, and Eclipse as an IDE with the Rule Designer plugin.
The RulesetChildRunner that is mentioned in the documentation is an interesting way to cascade ruleset executions. However, it is a bit complex to use.
If you have a simple use case and you can handle your own rule project dependencies, I would suggest creating a top rule project that orchestrates the sequence of the two main ruleflows you want to cascade.
That would be simpler to manage than creating custom code.
If you need to pass parameters from the decisionoperation1 ruleset to the decisionoperation2 ruleset, you can do that using a BAL rule to initialize parameter_ruleset2 from parameter (or XOM) info from ruleset1.
Hope this helps,
Best
Emmanuel
I am writing a Scala client that should perform several reads from a remote Maven repository (dependency tree evaluation).
To perform e2e tests on my code I need a running Maven repository (Artifactory, Nexus, Archiva, etc.) with several artifacts deployed.
I am looking for a test utility that will allow me to start an embedded server with code-configured artifacts and dependency relationships. That way I can set it up just before my test, use it and stop it.
If possible, I want to avoid using the filesystem.
Of course, the library can be either Scala or Java.
There is a MockRepositoryServer in the MojoHaus project, run by the Maven committers and others, that does what you need. It is specifically designed for that exact testing purpose.
You can also use a full-blown Nexus Repository Manager in a local install. Either will work.
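For illustration, a hedged sketch of wiring up the MojoHaus Mock Repository Manager (mrm-maven-plugin) around integration tests; the version and the source directory below are assumptions, and note that this particular mock serves artifacts from a directory, so it only partially avoids the filesystem:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>mrm-maven-plugin</artifactId>
    <version>1.2.0</version>
    <executions>
        <execution>
            <id>mock-repo</id>
            <goals>
                <!-- start before integration tests, stop afterwards -->
                <goal>start</goal>
                <goal>stop</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <repositories>
            <!-- serves the artifacts laid out under this directory;
                 the URL of the running mock is exposed as ${mrm.repository.url} -->
            <mockRepo>
                <source>src/test/resources/mock-repo</source>
            </mockRepo>
        </repositories>
    </configuration>
</plugin>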
When we generate a maven AEM project, how do we decide the archetype to use? What are the deciding factors and best practices for the same?
You can find a baseline structure provided in the Adobe-Marketing-Cloud space on GitHub: aem-project-archetype.
This is a very basic structure to start with and provides you with the following modules:
Core : Core bundle (java code goes here)
it.launcher - Baseline bundle to support Integration Tests with AEM
it.test - Integrations tests
ui.apps - Module for your components, templates and other code.
ui.content - Project sample/test content, or possibly actual content (actual content in the codebase is not a good practice)
Important things to know prior to deciding the structure for your project are:
Is the implementation for multiple brands, or to be used across multiple projects?
Is there a need for a platform which provides the basic/core functionality/features to be extended by different implementations?
What is the roadmap for the project?
That said, a best practice is to separate interface and implementation into different modules. Most modules will have 3 sub-modules (api, core and package).
api: the OSGi specification describes a modular system with a separate API bundle
core: An implementation bundle providing a service
package: Packages the two bundles to generate an AEM content package (see the sketch below).
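As a hedged illustration of that packaging step (the plugin coordinates are real, but the group/artifact ids and target path are placeholders), the package module can embed the bundles via the content-package-maven-plugin:

<plugin>
    <groupId>com.day.jcr.vault</groupId>
    <artifactId>content-package-maven-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <embeddeds>
            <!-- copy the api and core bundles into the package's install folder -->
            <embedded>
                <groupId>com.example.myproject</groupId>
                <artifactId>mymodule-api</artifactId>
                <target>/apps/myproject/install</target>
            </embedded>
            <embedded>
                <groupId>com.example.myproject</groupId>
                <artifactId>mymodule-core</artifactId>
                <target>/apps/myproject/install</target>
            </embedded>
        </embeddeds>
    </configuration>
</plugin>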
There can also be packages which consist of content without an API/service. Such modules do not follow the OSGi bundle convention, for example configuration, components, designs, etc.
In most of our AEM implementations, the project was generated from the com.cqblueprints.archetypes:multi-module Maven archetype and its folder structure was refactored according to the AEM 6 Implementation Guidelines.
All modules are created to organize dependencies in a better way and to have a clean separation of package deployment.
The number of modules can vary depending on the project; some common reusable baseline modules may include:
1. build-settings
This folder can hold commonly used settings and scripts :
- CI server scripts/settings
- Maven's settings.xml
- Reusable bash profile specific to project etc.
2. Common Module
This will have api, core and content sub-modules. As the name suggests, this should contain generic service or utility classes that do not belong to any one module or can be used across all modules. Be extra careful and justify the reason for adding classes to this module; otherwise, as a malpractice, everything ends up in the common module.
3. UI Module
This will have api (optional, if you need OSGi services here), core and content sub-modules.
- The core module holds all your Sling Models, WCMUse extensions and supporting POJOs.
- The content package contains all your UI functionality related to components and templates. It is important to structure this module correctly so that the addition of components, pages, etc. doesn't make it unmanageable.
We created the following structure in the content module, /apps/<your_project>/ui:
components : all components go here, further sub-categorized as [content, global, scripts]
install
page : page components
templates : page templates
4. Configurations Module
This module carries the OSGi and cloud configurations and, if implemented, the /conf based implementations as well. A /conf based implementation sample is here.
OSGi Configurations Module : Package module with all configurations as content.
Cloud Configuration Module : Package module with all configurations as content.
5. Sling Error Handler Module
Any error handling content should reside here. The sample configuration displays the error stack in author mode and returns a 404 response in publish mode.
6. Designs Module
Any design related content should reside here.
7. Content Module
Packages sample content and/or test content. In some implementations we chose to keep test content as separate module.
8. Complete Module
This is the package module that gets built last and combines all the packages generated in the above modules into a single package for deployment to the server.
If your application has a lot of business logic or processing, you could add more modules; for instance, in a couple of projects we have the following additional modules as well:
Grunt/Gulp build
Services/Operations (for business layer)
Validations
Data Import
In-container tests
In-container test content
In addition to these, we created a POM project that abstracts all the dependencies, configurations, plugins and profiles specific to an AEM project, and used it as the parent for the project POM. This cleaned up the project POM and allowed reuse across projects for the same client.
Sample parent.pom here
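As an illustration only (the group/artifact ids and versions below are placeholders, not the contents of the linked parent.pom), such a parent might look like:

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.client.aem</groupId>
    <artifactId>aem-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>

    <!-- shared dependency versions for all of the client's AEM projects -->
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.apache.sling</groupId>
                <artifactId>org.apache.sling.api</artifactId>
                <version>2.9.0</version>
                <scope>provided</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <!-- shared plugin configuration (bundle, content-package, etc.) -->
    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.felix</groupId>
                    <artifactId>maven-bundle-plugin</artifactId>
                    <version>2.5.4</version>
                    <extensions>true</extensions>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>
</project>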
We have some Maven modules shared between several teams, with the mandate to share the source code even though our projects use different dependencies and resources. To accomplish this, we have our modules set up as recommended in Using Maven When You Can't Use the Conventions under "Producing Multiple Unique JARs from a Single Source Directory." Specifically, we have a shared parent module containing the src directory but whose pom declares <packaging>pom</packaging> and only builds the two submodules. Each submodule inherits from this parent and refers to the shared src directory using this:
<build>
<sourceDirectory>../src/main/java</sourceDirectory>
</build>
The two submodules have different artifact ids, allowing dependent modules and projects to specify which version and dependency set they need. It also upholds the Maven principle of "one module, one output."
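For concreteness, a sketch of that arrangement under the stated assumptions (artifact ids are placeholders):

<!-- parent pom.xml: owns the shared src/ directory but builds no jar itself -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example.shared</groupId>
    <artifactId>shared-source-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>
    <modules>
        <!-- each submodule compiles ../src/main/java against its own dependency set -->
        <module>module-project-a</module>
        <module>module-project-b</module>
    </modules>
</project>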
This all works great in Maven-land: compilation, installation, deployments, etc. What doesn't work well is Eclipse integration. Some things work fine: building the modules, deploying to our Maven repo, pulling in dependencies to build our project. But things such as code completion and jumping to class/method definitions do not work at all. It's as though Eclipse doesn't recognize the source at all.
If we just check out a module from SVN, Eclipse doesn't know about the classes but instead uses the jars from the repo. If we then import the modules as Maven modules, they show up in the Package Explorer and the project build path. However, all references to those classes and methods are now flagged as errors by Eclipse. And we still do not have code completion or navigation.
So my questions are these: How can we get Eclipse to recognize the code and do its normal code navigation while still satisfying our varying project requirements? Am I missing some simple Eclipse configuration? Do we need to rework our Maven module structure, and if so, how?
Some additional context: The different dependencies for the projects are rather large, including different major versions for things such as Weblogic and Spring. The Weblogic versions will converge some time next year, but the other dependencies will be slower (and some resource files will likely always remain distinct). So for the near- to mid-future, we have to account for different dependencies between the projects.
We are using profiles to allow our Jenkins server to build both submodules while allowing individual developers to build only the submodule their project needs. Using profiles to manage the dependencies is problematic because we lose transitivity of dependencies.
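For example (profile and module names are placeholders), the parent can expose each submodule behind a profile:

<profiles>
    <!-- Jenkins activates both profiles; a developer picks one with -P project-a -->
    <profile>
        <id>project-a</id>
        <modules>
            <module>module-project-a</module>
        </modules>
    </profile>
    <profile>
        <id>project-b</id>
        <modules>
            <module>module-project-b</module>
        </modules>
    </profile>
</profiles>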
Update (12/8/15)
I was eventually able to make Eclipse recognize the source directory by using "Link Source..." on the "Configure Build Path..." dialog. Adding a source folder would not let me reference the module's parent directory, which derailed me for a while. However, "Link Source..." let me assign an arbitrary directory to use.
It's not ideal, but it seems to be working. We can now jump to definitions with F3, and errors are now highlighted correctly. It's good enough that I don't feel bad recommending it to the other team. I wish Eclipse would automatically allow a parent source directory to be referenced, but at least the manual intervention worked.
Ultimately we are trying to figure out a build/deploy process at my company. As the developer, I need to provide my source code to an audit group. They review the software for security deviations. If the source passes the audit, then the code goes to the Software Configuration Group. Their job is to archive and compile my app to a WAR. The WAR file is then provided to the IT department, who will put the WAR on the server. I think the process would be easy if I had one self-contained project.
But in Eclipse I have two Maven projects, where one depends on the other. One project, core, provides core functionality. I separated it out because these core functionalities will be used by all my other (internal) web app projects:
Logging
filters
common models (phonebook, employee, etc)
common utilities (emailing employees, string utils, etc.)
In the other projects, say project1, I add a dependency on core in the POM. I'm not sure if I need to do this, but I also edited the Eclipse project properties and added a reference to the core project. With some finagling (I'm new to Maven) I was able to get project1 deployed to my local install of JBoss. I opened the WAR, and in the WEB-INF/lib folder I could see that core-0.0.1-SNAPSHOT.jar was automatically included.
But how do I give SCM my source for project1, which also needs the source for core, without manually copying core's source into project1's source?
I could copy core-0.0.1-SNAPSHOT.jar into project1, but they should also be reviewing core's source every time I deploy a new app, because I may have added or tweaked some core functionality.
You should learn more about Maven SNAPSHOT and release repositories. Then install a Nexus server as the destination for the produced jars, wars, javadocs and sources (called artifacts).
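For context, publishing to such a server is just a matter of pointing distributionManagement at it (the URLs below are placeholders) and running mvn deploy:

<distributionManagement>
    <repository>
        <id>releases</id>
        <url>http://nexus.example.com/content/repositories/releases/</url>
    </repository>
    <snapshotRepository>
        <id>snapshots</id>
        <url>http://nexus.example.com/content/repositories/snapshots/</url>
    </snapshotRepository>
</distributionManagement>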
After that, maybe you will be interested in the commercial Nexus version with its staged-deployment option.
http://www.sonatype.com/people/2009/06/using-staging-repositories-for-deployment-in-nexus/
To solve the packaging problem you can use the Maven Assembly Plugin. You can have all sources and dependencies in one file.
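For instance, a minimal sketch using the plugin's predefined project descriptor, which archives the whole project tree, sources included, into a single file:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <!-- predefined descriptor: bundles the entire project directory -->
            <descriptorRef>project</descriptorRef>
        </descriptorRefs>
    </configuration>
    <executions>
        <execution>
            <id>bundle-project-sources</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>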
Maybe there are other Maven plugins even more suitable for your needs.