BOM Entry creation fails for XML Schema (Business Rules service on Bluemix)

I am working with the Rule Designer for the Business Rules service on Bluemix to create a BOM entry from an XML schema. When I select the XOM entry in the wizard, I see the error:
"Invalid XOM entry, please check your log file"
The Eclipse log file contains the following stacktrace:
!MESSAGE An error occurred while loading the XML schema
C:\mySchema.xsd.
!STACK 0
ERROR ERR011: in source file:C:/mySchema.xsd, after line 18, before
lines ?, The type reference on MyType cannot be resolved.
ERROR ERR011: in source file:C:/mySchema.xsd, after line 21, before
lines ?,
The type reference on MyType cannot be resolved.
at ilog.rules.xml.model.IlrXsdXomConvertorBase.convertSchema(IlrXsdXomConvertorBase.java:111)
at com.ibm.rules.dynamic.xom.SchemaDriver.loadModel(SchemaDriver.java:159)
at com.ibm.rules.dynamic.xom.XsdBuilder.buildXom(XsdBuilder.java:63)
at ilog.rules.studio.model.xom.impl.IlrDynamicXOMPathEntryImpl.getXsdReflect(IlrDynamicXOMPathEntryImpl.java:676)
...
How do I resolve this?

I had multiple xsd:import statements for the same namespace.
The fix was to put all the declarations for that namespace in a single file and then use a single xsd:import for that namespace.
Another way is to add a new schema file that uses several xsd:include statements to pull all the schema declarations for the namespace into this new file, and then import only that file.
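A minimal sketch of that second approach (the file names, namespace URIs, and type names below are made up for illustration): a wrapper schema includes every file that declares types for the namespace, and the importing schema then needs only one xsd:import.
<?xml version="1.0" encoding="UTF-8"?>
<!-- myNamespace-all.xsd: hypothetical wrapper, the single entry point for the namespace -->
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            targetNamespace="http://example.com/myNamespace"
            elementFormDefault="qualified">
    <!-- each included file must declare the same targetNamespace (or none) -->
    <xsd:include schemaLocation="myTypes-part1.xsd"/>
    <xsd:include schemaLocation="myTypes-part2.xsd"/>
</xsd:schema>
The schema that references those types then imports the namespace exactly once:
<?xml version="1.0" encoding="UTF-8"?>
<!-- mySchema.xsd: one xsd:import per namespace, pointing at the wrapper file -->
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            targetNamespace="http://example.com/main"
            xmlns:my="http://example.com/myNamespace"
            elementFormDefault="qualified">
    <xsd:import namespace="http://example.com/myNamespace"
                schemaLocation="myNamespace-all.xsd"/>
    <!-- MyType is assumed to be declared in one of the included files -->
    <xsd:element name="root" type="my:MyType"/>
</xsd:schema>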

I had the same error message when I used a "www." prefix in my xsd namespace in one place but not the other.
Check the schema's targetNamespace and xmlns:tns attributes and adjust the namespace until you can create the BOM successfully. Verify you have exactly the same value for both.
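For instance, a minimal schema header where the two values match (the namespace URI and type name are hypothetical):
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            targetNamespace="http://example.com/mySchema"
            xmlns:tns="http://example.com/mySchema"
            elementFormDefault="qualified">
    <xsd:complexType name="MyType">
        <xsd:sequence>
            <xsd:element name="value" type="xsd:string"/>
        </xsd:sequence>
    </xsd:complexType>
    <!-- a reference like tns:MyType only resolves when both attributes above agree -->
    <xsd:element name="myElement" type="tns:MyType"/>
</xsd:schema>
If one attribute reads http://www.example.com/mySchema while the other reads http://example.com/mySchema, that is the kind of mismatch described above.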

Related

OpenSource: Encryption of JDBC Password in configuration properties file

I noticed there is a plugin available for the enterprise version (https://download.rundeck.com/plugins/encrypted-datasource-plugin.html); is there an option for users of Rundeck open source to perform the same kind of encryption of the datasource password in the configuration file?
I noticed many people mention writing their own Java programs that leverage the Jasypt utilities, so I tried this. I have two jar files (one for encryption and one for decryption). Since I'm using an RPM-based Rundeck 3.3 installation, I created the directory /var/lib/rundeck/lib and added it to the JVM classpath in /etc/sysconfig/rundeckd via: export RDECK_JVM_SETTINGS="-Djava.class.path=/var/lib/rundeck/lib/*". I then converted my /etc/rundeck/rundeck-config.properties file to Groovy format and updated /etc/sysconfig/rundeck with: export RDECK_CONFIG_FILE="/etc/rundeck/rundeck-config.groovy". However, when I change the /etc/rundeck/rundeck-config.groovy entry for datasource.password to:
datasource.password=MyDecrypt("MyTest123Password")
I get an error in the Rundeck logs after restarting:
[2020-09-08T18:01:03,168] WARN context.AnnotationConfigServletWebServerApplicationContext - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'application': Initialization of bean failed; nested exception is groovy.lang.MissingMethodException: No signature of method: groovy.util.ConfigSlurper$_parse_closure5.MyDecrypt() is applicable for argument types: (String) values: [MyTest123Password]
Any suggestions?
That encryption feature is only available in Rundeck Enterprise; the best approach on Rundeck Community is probably to secure the rundeck-config.properties file through UNIX file permissions, for example by making it owned by and readable only by the user that runs Rundeck.

Allow JasperReports Server users to create a domain

I would like some users (those with role X) to be able to create domains on my JasperReports Server.
I found documentation where everything is explained (doc), and it worked in a demo enterprise version of JasperReports,
but when I tried the same process on the production JasperReports Server (LDAP configured), I got this error:
Error creating bean with name 'remoteServiceConfiguration' -> Error creating bean with name 'authenticationAuthoirizationFilterChainProxy' ->
Cannot resolve reference to bean 'filterInvocationInterceptor' while setting constructor argument with key [6] ->
java.lang.IllegalArgumentException: Unsupported configuration attributes: [L4_IT_DEVS] (my role)
Has anyone already had this kind of error when configuring domain creation for users?
I had this problem before and solved it by using roles with the prefix "ROLE_" in their names, like "ROLE_L4_IT_DEVS" for example.
Hope that helps!

XML Import Warning: Informatica

I am getting the following warning message when I try to import an XML file into the Informatica repository.
Warning: Unexpected condition at: Wcursor.cpp: 305
Contact Informatica technical support for assistance
Continuing may result in damage to your repository.
The XML file is around 70 MB and contains around 4500 objects. I am migrating an entire application from one server to another.
I am not sure why this issue happens. I have tried several times, and from another client system as well, but no luck.
Importing the XML from the command line with the "pmrep" command requires a control file, but I don't have a control file for this XML, so I can't go with that option.
It would be great if somebody could help me sort out this issue.
Details:
Informatica version 9.1
Hosted on a Unix environment.
I had the same problem some time ago. XML parsing takes a lot of memory and/or the GUI can't handle it. My solution was to use the pmrep command line tool. It worked for me; my workflow was composed of around 3600 objects, as far as I recall.
If you don't have a control file, create one! Here's a very simple template:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE IMPORTPARAMS SYSTEM "impcntl.dtd">
<!--IMPORTPARAMS This inputs the options and inputs required for import operation -->
<!--CHECKIN_AFTER_IMPORT Check in objects on successful import operation -->
<!--CHECKIN_COMMENTS Check in comments -->
<!--APPLY_LABEL_NAME Apply the given label name on imported objects -->
<!--RETAIN_GENERATED_VALUE Retain existing sequence generator, normalizer and XML DSQ current values in the destination -->
<!--COPY_SAP_PROGRAM Copy SAP program information into the target repository -->
<!--APPLY_DEFAULT_CONNECTION Apply the default connection when a connection used by a session does not exist in the target repository -->
<IMPORTPARAMS CHECKIN_AFTER_IMPORT="YES" CHECKIN_COMMENTS="PMREP_IMPORT_TEST" RETAIN_GENERATED_VALUE="NO" COPY_SAP_PROGRAM="NO" APPLY_DEFAULT_CONNECTION="NO">
<!--FOLDERMAP matches the folders in the imported file with the folders in the target repository -->
<FOLDERMAP SOURCEFOLDERNAME="YOUR FIRST SOURCE FOLDER NAME" SOURCEREPOSITORYNAME="REP_DEV" TARGETFOLDERNAME="YOUR FIRST SOURCE FOLDER NAME" TARGETREPOSITORYNAME="REP_TEST"/>
<FOLDERMAP SOURCEFOLDERNAME="YOUR SECOND TARGET FOLDER NAME" SOURCEREPOSITORYNAME="REP_DEV" TARGETFOLDERNAME="YOUR SECOND TARGET FOLDER NAME" TARGETREPOSITORYNAME="REP_TEST"/>
<!--Import will only import the objects in the selected types in TYPEFILTER node -->
<!--TYPENAME type name to import. This should conform to the element names in powermart.dtd, e.g. SOURCE, TARGET, etc.-->
<!--RESOLVECONFLICT allows you to specify the resolution for conflicting objects during import. A combination of the specified child nodes can be supplied -->
<RESOLVECONFLICT>
<!--TYPEOBJECT allows objects of certain type to apply replace/reuse upon conflict-->
<!--TYPEOBJECT = ALL conflict resolution for ALL types of objects -->
<TYPEOBJECT OBJECTTYPENAME="ALL" RESOLUTION="REPLACE"/>
<!--SPECIFICOBJECT allows a particular object(name, typename etc.) to apply replace/reuse upon conflict -->
<!--NAME Object name-->
<!--EXTRANAME Source DBD name - required for source object to identify uniquely-->
<!--OBJECTTYPENAME Object type name-->
<!--FOLDERNAME Folder which the object belongs to-->
<!--REPOSITORYNAME Repository name that this object belongs to-->
<!--RESOLUTION Resolution to apply for the object in case of conflict-->
<!--SPECIFICOBJECT NAME="your_object" OBJECTTYPENAME="your_object_type" FOLDERNAME="your_source_folder" REPOSITORYNAME="your_source_repo" RESOLUTION="REPLACE"/-->
</RESOLVECONFLICT>
</IMPORTPARAMS>
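Once the control file is in place, the import itself is run through pmrep: connect to the target repository first, then use its ObjectImport command and point it at the exported XML plus this control file. The exact options are listed in the pmrep command reference for your Informatica version.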

OFBiz-11.04 deployment in JBoss-5.1.0

In order to get flexibility in load balancing and clustering, I thought of deploying my OFBiz application in JBoss-5.1.0 as per this document (https://cwiki.apache.org/OFBTECH/deploying-ofbiz-904-on-jboss-510.html).
The build was successful and I could see all the WAR files in server/default/deploy/OFBiz.ear and the other JAR files in the lib folder, but I am getting a few issues when starting the JBoss server. Please take a look and help me if you have any clues.
Issues:
Could not find definition for entity name EntityKeyStore
......................
Could not find definition for entity name JobSandbox
...................
The entity definition for EntityKeyStore is located in entitymodel.xml in the framework/entity/entitydef folder. I could not find this XML anywhere inside OFBiz.ear. (Not only this one; none of the entitymodel.xml files are found in the ear, yet there is no issue with other entities, and I do not know why.) I checked my database (server/default/data/derby) and found ENTITY_KEY_STORE there.
The same is the case for JobSandbox.
After this exception, the server seems to proceed with creating a dispatcher for each component, starting with accounting. There I am getting another issue:
Could not get root location for component with name [common], error was: org.ofbiz.base.component.ComponentException: No component found named : common
.....
....
....
---- exception report ----------------------------------------------------------
Error processing include at [component://common/webcommon/WEB-INF/common-controller.xml]:java.net.MalformedURLException: Could not get root location for component with name [common], error was: org.ofbiz.base.component.ComponentException: No component found named : common
Exception: java.net.MalformedURLException
This exception is printed repeatedly in the log file, and further down the same kind of exception is thrown for other components like commonext, accounting, and ecommerce.
After these exceptions, "Could not find definition for entity name Tenant" is also thrown.
My OFBiz application uses a multitenant environment. Any help is greatly appreciated.

Seam 2.2GA + JBoss AS 5.1GA + Postgres 8.4

Sorry for the big wall of text, but it's mostly logs.
Thanks for any help with any of my problems.
I've been trying to get help on the Seam forums, but in vain.
I'm trying the setup mentioned in the title, but unsuccessfully.
I have it all installed correctly, and the problems start with seam-gen.
This is my build.properties:
#Generated by seam setup
#Sat Aug 29 19:12:18 BRT 2009
hibernate.connection.password=abc123
workspace.home=/home/rgoytacaz/workspace
hibernate.connection.dataSource_class=org.postgresql.ds.PGConnectionPoolDataSource
model.package=com.atom.Commerce.model
hibernate.default_catalog=PostgreSQL
driver.jar=/home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
action.package=com.atom.Commerce.action
test.package=com.atom.Commerce.test
database.type=postgres
richfaces.skin=glassX
glassfish.domain=domain1
hibernate.default_schema=Core
database.drop=n
project.name=Commerce
hibernate.connection.username=postgres
glassfish.home=C\:/Program Files/glassfish-v2.1
hibernate.connection.driver_class=org.postgresql.Driver
hibernate.cache.provider_class=org.hibernate.cache.HashtableCacheProvider
jboss.domain=default
project.type=ear
icefaces.home=
database.exists=y
jboss.home=/srv/jboss-5.1.0.GA
driver.license.jar=
hibernate.dialect=org.hibernate.dialect.PostgreSQLDialect
hibernate.connection.url=jdbc\:postgresql\:Atom
icefaces=n
./seam create-project works okay, but when I try generate-entities, I get the following...
generate-model:
[echo] Reverse engineering database using JDBC driver /home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
[echo] project=/home/rgoytacaz/workspace/Commerce
[echo] model=com.atom.Commerce.model
[hibernate] Executing Hibernate Tool with a JDBC Configuration (for reverse engineering)
[hibernate] 1. task: hbm2java (Generates a set of .java files)
[hibernate] log4j:WARN No appenders could be found for logger (org.hibernate.cfg.Environment).
[hibernate] log4j:WARN Please initialize the log4j system properly.
[javaformatter] Java formatting of 4 files completed. Skipped 0 file(s).
This is problem no. 1: what is this warning, and how do I fix it? (I had to do this in Eclipse, and it worked.)
Then I imported the seam-gen created project into Eclipse and deployed it to JBoss 5.1. While the server starts, I noticed the following:
03:18:56,405 ERROR [SchemaUpdate] Unsuccessful: alter table PostgreSQL.atom.productsculturedetail add constraint FKBD5D849BC0A26E19 foreign key (culture_Id) references PostgreSQL.atom.cultures
03:18:56,406 ERROR [SchemaUpdate] ERROR: cross-database references are not implemented: "postgresql.atom.productsculturedetail"
03:18:56,407 ERROR [SchemaUpdate] Unsuccessful: alter table PostgreSQL.atom.productsculturedetail add constraint FKBD5D849BFFFC9417 foreign key (product_Id) references PostgreSQL.atom.products
03:18:56,408 ERROR [SchemaUpdate] ERROR: cross-database references are not implemented: "postgresql.atom.productsculturedetail"
03:18:56,408 INFO [SchemaUpdate] schema update complete
Problem no. 2: what are these cross-database references?
And what about this:
03:18:55,089 INFO [SettingsFactory] JDBC driver: PostgreSQL Native Driver, version: PostgreSQL 8.4 JDBC3 (build 701)
Problem no. 3: I told build.properties to use the JDBC4 driver, so I don't know why Seam insists on using the JDBC3 driver. Where do I change this?
When I go to http://localhost:5443/Commerce and try to browse the auto-generated CRUD UI, I get this error: Error reading 'resultList' on type com.atom.Commerce.action.ProductsList_$$_javassist_seam_2
And this is what shows up in my server logs:
03:34:00,828 INFO [STDOUT] Hibernate:
select
products0_.product_Id as product1_0_,
products0_.active as active0_
from
PostgreSQL.atom.products products0_ limit ?
03:34:00,848 WARN [JDBCExceptionReporter] SQL Error: 0, SQLState: 0A000
03:34:00,849 ERROR [JDBCExceptionReporter] ERROR: cross-database references are not implemented: "postgresql.atom.products"
Position: 81
03:34:00,871 SEVERE [viewhandler] Error Rendering View[/ProductsList.xhtml]
javax.el.ELException: /ProductsList.xhtml: Error reading 'resultList' on type com.atom.Commerce.action.ProductsList_$$_javassist_seam_2
Caused by: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: could not execute query
Problem no. 4: what is going on here? Cross-database references again?
Thanks for any help with any of my problems.
You did receive a few answers on the Seam forums (here and here), but you didn't follow up. Anyway, all these are actually caused by one problem:
As Stuart Douglas told you, you shouldn't use a catalog when connecting to PostgreSQL. To fix this, replace the property "hibernate.default_catalog=PostgreSQL" in your properties file with the property "hibernate.default_catalog.null=", so that your file looks like this:
...
model.package=com.atom.Commerce.model
hibernate.default_catalog.null= # <-- This is the replaced property
driver.jar=/home/rgoytacaz/postgresql-8.4-701.jdbc4.jar
...
You should be able to use seam generate-entities fine afterwards (assuming the rest of your configuration is correct). I'd recommend doing the generation into a clean folder.
A cross-database reference is when a query tries to access two or more different databases. PostgreSQL does not support this, and thus complains when there is more than one period in the table name; so in PostgreSQL.atom.productsculturedetail, the PostgreSQL. prefix should not be there. Hibernate adds this prefix when you tell it to use a default catalog, which we already fixed in step 1 above (by telling it not to use a catalog), so this problem should be fixed after you regenerate your entities.
(Note that this is effectively the same as what Stuart Douglas told you, that you should remove the catalog="PostgreSQL" attribute in the annotations on your entity classes.)
When you specified the postgresql-8.4-701.jdbc4.jar file in the properties file, this didn't mean that the driver supports JDBC4. Although the name of the file would suggest so, the driver's website clearly states that "The driver provides a reasonably complete implementation of the JDBC 3 specification". This shouldn't be a problem for you, as you're not using the driver directly (or at least you're not supposed to). The driver is sufficient for Hibernate to fulfill its requirements and provide the required functionality.
This issue is caused by the same problem above. Hibernate is unable to read data from the database because of the incorrect query. Fixing the catalog problem should fix this issue.