XML Import Warning: Informatica - metadata

I am getting the following warning message when I try to import an XML file into the Informatica repository.
Warning: Unexpected condition at: Wcursor.cpp: 305
Contact Informatica technical support for assistance
Continuing may result in damage to your repository.
The XML file is around 70 MB and contains around 4,500 objects. I am migrating an entire application from one server to another.
I am not sure why this issue happens. I have tried several times, and from another client system as well, but no luck.
Importing the XML from the command line with the pmrep command requires a control file, but I don't have a control file for this XML, so I can't go with that option.
It would be great if somebody could help me sort out this issue.
Details:
Informatica version 9.1
Hosted on a Unix environment.

I had the same problem some time ago. XML parsing takes a lot of memory and/or the GUI can't handle it. My solution was to use the pmrep command-line tool. It worked for me; my workflow was composed of around 3,600 objects, as far as I recall.
If you don't have a control file, create one! Here's a very simple template:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE IMPORTPARAMS SYSTEM "impcntl.dtd">
<!--IMPORTPARAMS This inputs the options and inputs required for import operation -->
<!--CHECKIN_AFTER_IMPORT Check in objects on successful import operation -->
<!--CHECKIN_COMMENTS Check in comments -->
<!--APPLY_LABEL_NAME Apply the given label name on imported objects -->
<!--RETAIN_GENERATED_VALUE Retain existing sequence generator, normalizer and XML DSQ current values in the destination -->
<!--COPY_SAP_PROGRAM Copy SAP program information into the target repository -->
<!--APPLY_DEFAULT_CONNECTION Apply the default connection when a connection used by a session does not exist in the target repository -->
<IMPORTPARAMS CHECKIN_AFTER_IMPORT="YES" CHECKIN_COMMENTS="PMREP_IMPORT_TEST" RETAIN_GENERATED_VALUE="NO" COPY_SAP_PROGRAM="NO" APPLY_DEFAULT_CONNECTION="NO">
<!--FOLDERMAP matches the folders in the imported file with the folders in the target repository -->
<FOLDERMAP SOURCEFOLDERNAME="YOUR FIRST SOURCE FOLDER NAME" SOURCEREPOSITORYNAME="REP_DEV" TARGETFOLDERNAME="YOUR FIRST SOURCE FOLDER NAME" TARGETREPOSITORYNAME="REP_TEST"/>
<FOLDERMAP SOURCEFOLDERNAME="YOUR SECOND TARGET FOLDER NAME" SOURCEREPOSITORYNAME="REP_DEV" TARGETFOLDERNAME="YOUR SECOND TARGET FOLDER NAME" TARGETREPOSITORYNAME="REP_TEST"/>
<!--Import will only import the objects in the selected types in TYPEFILTER node -->
<!--TYPENAME type name to import. This should conform to the element name in powermart.dtd, e.g. SOURCE, TARGET, etc.-->
<!--RESOLVECONFLICT allows you to specify the resolution for conflicting objects during import. A combination of the child nodes below can be supplied -->
<RESOLVECONFLICT>
<!--TYPEOBJECT allows objects of certain type to apply replace/reuse upon conflict-->
<!--TYPEOBJECT = ALL conflict resolution for ALL types of objects -->
<TYPEOBJECT OBJECTTYPENAME="ALL" RESOLUTION="REPLACE"/>
<!--SPECIFICOBJECT allows a particular object(name, typename etc.) to apply replace/reuse upon conflict -->
<!--NAME Object name-->
<!--EXTRANAME Source DBD name - required for source object to identify uniquely-->
<!--OBJECTTYPENAME Object type name-->
<!--FOLDERNAME Folder which the object belongs to-->
<!--REPOSITORYNAME Repository name that this object belongs to-->
<!--RESOLUTION Resolution to apply for the object in case of conflict-->
<!--SPECIFICOBJECT NAME="your_object" OBJECTTYPENAME="your_object_type" FOLDERNAME="your_source_folder" REPOSITORYNAME="your_source_repo" RESOLUTION="REPLACE"/-->
</RESOLVECONFLICT>
</IMPORTPARAMS>
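Save the template as, for example, control_file.xml, and then drive the import with pmrep. A rough sketch of the two calls (repository, domain, user, and file names are placeholders; double-check the options against the pmrep ObjectImport documentation for your version):
pmrep connect -r REP_TEST -d Domain_Name -n Administrator -x your_password
pmrep objectimport -i your_export.xml -c control_file.xml
Since this bypasses the client GUI's XML handling, it tends to cope much better with large export files like yours.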

Related

Modsec ruleRemoveTargetById is not removing rules

I am trying to write ModSecurity rule exclusions and can't seem to get ctl:ruleRemoveTargetById to work as described in the reference manual.
My server is running Debian 9 with apache2 2.4.25-3+deb9. I have tried following the reference manual on GitHub and emulating the sample rules in REQUEST-900-EXCLUSION-RULES-BEFORE-CRS.conf. I have written a simple rule to fire on certain arguments and then tried excluding it, but the exclusion does not remove the rule for those arguments. I can see both the rule ID and the rule exclusion ID in the logs.
I have, somewhat arbitrarily, put the rule below in REQUEST-905-COMMON-EXCEPTIONS.conf:
SecRule ARGS "#rx propfind" "id:905999,phase:2,log,msg:'test msg delete rule'"
This fires as it should. There is an argument named <?xml version that contains the pattern "propfind" in my Nextcloud settings page, which I am using for testing.
I have also written an exclusion and put it into REQUEST-900-EXCLUSION-RULES-BEFORE-CRS.conf:
SecRule ARGS_NAMES "@rx <\?xml\sversion" "phase:2,log,id:1030,ctl:ruleRemoveTargetById=905999;ARGS:/<\?xml\sversion/"
This rule triggers as expected but does not prevent rule 905999 from firing on the argument named <?xml version.
I have simplified the rules to the two below, and they work as expected: when I request example.com/?test=trigger I see rule 905999 in the log, but if I send example.com/?testarg=trigger I only see the exclusion rule 1030, as expected.
SecRule ARGS "@rx trigger" "id:905999,phase:2,log,msg:'test msg delete rule'"
SecRule ARGS_NAMES "@rx testarg" "phase:2,log,id:1030,ctl:ruleRemoveTargetById=905999;ARGS:testarg"
When I open the Nextcloud settings tab, the HTTP request contains
<?xml version="1.0"?><d:propfind xmlns:d="DAV:"><d:prop><d:resourcetype/></d:prop></d:propfind>
which ModSecurity interprets as the argument name <?xml version, containing the value "1.0"?><d:propfind xmlns:d="DAV:"><d:prop><d:resourcetype/></d:prop></d:propfind>
I would expect my rule 1030 to stop 905999 from firing, but I still see both rules in the log. I assume it is because of the space between "xml" and "version", but I can't figure out how to exclude the target.
For some reason the regex in ctl:ruleRemoveTargetById=905999;ARGS:/<\?xml\sversion/ did not work. Using SecRuleUpdateTargetById 905999 "!ARGS:/<\?xml\sversion/" instead, and placing it in RESPONSE-999-EXCLUSION-RULES-AFTER-CRS.conf so that the rule targets are updated after the rules are loaded, works as desired.
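In other words, the exclusion that ended up working is a single directive in RESPONSE-999-EXCLUSION-RULES-AFTER-CRS.conf (the rule ID and regex are the ones from the question; adjust them to your own rules):
# runs after the CRS rules are loaded, so the target list of rule 905999 is updated in place
SecRuleUpdateTargetById 905999 "!ARGS:/<\?xml\sversion/"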

MailKit - FolderCache and deleted folder

I am in a situation where I have two clients (ClientA and ClientB) connected to an IMAP server. ClientA is running MailKit. When I delete or move a folder with ClientB, the MailKit client gets an error on any attempt to open or fetch messages from the deleted folder. In fact, I get disconnected from the server when I try to fetch a message from a deleted folder (I guess that is the expected behavior from the server). Because of that, I am trying to detect whether the folder I am about to run a command against still exists.
I see that MailKit uses a FolderCache, and even after I reconnect the client, GetFolder(string path) still returns an IMailFolder reference for the deleted folder. To avoid the FolderCache, I am creating a new instance of ImapClient each time I am about to synchronize remote folders, so that non-existent folders do not linger in the cache. I would like to know whether that is the recommended approach in this situation.
UPDATE:
So, I am now using the GetSubfolders method and I can see a LIST command being sent to the server. However, there seems to be an issue with it in the following scenario:
ClientB deletes the folder INBOX.spam.op while ClientA is trying to move a folder with the path INBOX.spam.op.folder1. What happens is that the server creates a new folder INBOX.spam.op with the NonExistent attribute. That is the expected server behavior in order to create a folder with the path INBOX.spam.op.folder1.
But see what happens with MailKit when I use GetSubfolders on INBOX.spam: I get an instance of IMailFolder with Name = "op" and Attributes that are a mix of the new NonExistent attribute and the attributes of the old "op" folder (the one in the FolderCache). UidValidity should be 0 for a NonExistent folder, but it is the same as the UidValidity of the "op" folder in the FolderCache, even though the server response is this:
C: A00000102 LIST "" "INBOX.spam.%" RETURN (SUBSCRIBED CHILDREN STATUS (UIDVALIDITY))
S: * LIST (\NonExistent \HasChildren) "." INBOX.spam.op
S: A00000102 OK List completed (0.001 + 0.000 secs).
I tried to subclass ImapClient and add my own GetFolderNoCache(string path) method, but this does not work because of the internal classes. Any other suggestions?
What you want to do is get the top-level folder from the namespace. Then, using that ImapFolder object, get the list of its children (and so on, if you are trying to check whether a deeply nested folder exists).
var toplevel = client.GetFolder (client.PersonalNamespaces[0]);
foreach (var folder in toplevel.GetSubfolders ()) {
    // look for the folder you are interested in...
    // if it's not here, then the folder has been deleted
}
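To turn that into a reusable check, here is a rough sketch (the FolderExists helper is my own, not a MailKit API; it assumes using MailKit; and using MailKit.Net.Imap; directives and a personal-namespace path such as INBOX.spam.op):
static bool FolderExists (ImapClient client, string path)
{
    // start from the top-level personal namespace folder rather than the FolderCache
    var current = client.GetFolder (client.PersonalNamespaces[0]);

    foreach (var name in path.Split (current.DirectorySeparator)) {
        IMailFolder next = null;

        // GetSubfolders issues a LIST for each level of the path
        foreach (var child in current.GetSubfolders ()) {
            if (child.Name == name && (child.Attributes & FolderAttributes.NonExistent) == 0) {
                next = child;
                break;
            }
        }

        if (next == null)
            return false; // this path segment is missing or marked \NonExistent

        current = next;
    }

    return true;
}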

OFBiz-11.04 deployment in JBoss-5.1.0

In order to get flexibility in load balancing and clustering, I am trying to deploy my OFBiz application on JBoss 5.1.0 as per this document (https://cwiki.apache.org/OFBTECH/deploying-ofbiz-904-on-jboss-510.html).
The build was successful and I can see all the WAR files in server/default/deploy/OFBiz.ear and the other JAR files in the lib folder. But I am getting a few issues when starting the JBoss server. Please take a look and help me if you have any clues.
Issues:
Could not find definition for entity name EntityKeyStore
......................
Could not find definition for entity name JobSandbox
...................
The entity definition for EntityKeyStore is located in entitymodel.xml in the framework/entity/entitydef folder. I could not find this XML anywhere inside OFBiz.ear. (Not only this one; none of the entitymodel.xml files are found in the EAR, yet there is no issue with the other entities, and I do not know why that is.) I checked my database (server/default/data/derby) and found ENTITY_KEY_STORE there.
The same is the case for JobSandbox.
After this exception, the server seems to proceed with creating a dispatcher for each component, starting with accounting. I am getting another issue here:
Could not get root location for component with name [common], error was: org.ofbiz.base.component.ComponentException: No component found named : common
.....
....
....
---- exception report ----------------------------------------------------------
Error processing include at [component://common/webcommon/WEB-INF/common-controller.xml]:java.net.MalformedURLException: Could not get root location for component with name [common], error was: org.ofbiz.base.component.ComponentException: No component found named : common
Exception: java.net.MalformedURLException
This exception is printed repeatedly in the log file, and further down, the same kind of exception is thrown for other components such as commonext, accounting, and ecommerce.
After these exceptions, "Could not find definition for entity name Tenant" is also thrown.
My OFBiz application uses a multitenant environment. Any help is greatly appreciated.

FATAL org.apache.hadoop.conf.Configuration - error parsing conf file: org.xml.sax.SAXParseException

I'm trying to run Pig locally (installed using Homebrew) to test a script. However, I get the following error when I attempt to run a simple DUMP from the interactive prompt pig -x local:
2012-07-16 23:20:40,447 [Thread-7] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
[Fatal Error] :63:85: Character reference "&#2" is an invalid XML character.
2012-07-16 23:20:40,688 [Thread-7] FATAL org.apache.hadoop.conf.Configuration - error parsing conf file: org.xml.sax.SAXParseException: Character reference "&#2" is an invalid XML character.
The same load/dump works fine on Elastic MapReduce.
I can't find any XML config files, and I've tried with both versions 0.9.2 and 0.10.0.
What am I missing?
Edit: I just checked a direct download (vs. Homebrew) and it doesn't seem to work either.
You should check that your Hadoop configuration files contain valid configuration data.
Have a look in your hadoop/conf directory, inside:
hdfs-site.xml
mapred-site.xml
core-site.xml
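If nothing stands out by eye, a quick way to spot a file that is not well-formed XML (assuming xmllint is installed) is to validate each one:
for f in hadoop/conf/*.xml; do
    xmllint --noout "$f" || echo "not well-formed: $f"
done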
I finally worked out what the problem was. I ended up having to use dtruss -p on the Pig/Java process. This revealed a temporary directory and dynamically generated XML files. Once the temporary directory was discovered, it all fell into place quickly.
It was picking up the proxy exclusions from my network settings, which had, as far as I can tell, the character &#2 (http://www.fileformat.info/info/unicode/char/02/index.htm) embedded in them. How this invalid value came to be in my network preferences in the first place, I haven't the faintest clue.
The value was then being pulled into dynamically generated files, for example /tmp/hadoop-vertis/mapred/staging/vertis-1005847898/.staging/job_local_0001/job.xml.
The offending lines:
<property><name>ftp.nonProxyHosts</name><value>localhost|*.localhost|127.0.0.1|h|*.h</value></property>
<property><name>socksNonProxyHosts</name><value>localhost|*.localhost|127.0.0.1|h|*.h</value></property>
<property><name>http.nonProxyHosts</name><value>localhost|*.localhost|127.0.0.1|h|*.h</value></property>
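If you suspect the same cause, one way to confirm it (assuming the escaped character reference survives verbatim in the generated job configuration; the staging path is just the pattern from my run) is:
grep -rn '&#2' /tmp/hadoop-*/mapred/staging/ 2>/dev/null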

Solr 3.1 Jboss server deployment failed

When I deploy Solr 3.1 to the JBoss application server (version 6.0.0.Final), I get the following exception message:
Failed to create Resource solr.war - cause: java.lang.Exception:Failed to start deployment [vfs:///D:/jboss-6.0.0.Final/server/default/deploy/solr.war] during deployment of 'solr.war' - cause: java.lang.RuntimeException:org.jboss.deployers.client.spi.IncompleteDeploymentException: Summary of incomplete deployments (SEE PREVIOUS ERRORS FOR DETAILS): * DEPLOYMENTS IN ERROR: Name -> Error vfs:///D:/jboss-6.0.0.Final/server/default/deploy/solr.war -> org.jboss.deployers.spi.DeploymentException: Error creating managed object for vfs:///D:/jboss-6.0.0.Final/server/default/deploy/solr.war DEPLOYMENTS IN ERROR: Deployment "vfs:///D:/jboss-6.0.0.Final/server/default/deploy/solr.war" is in error due to the following reason(s): org.xml.sax.SAXException: Element type "tlibversion" must be declared. # vfs:///D:/jboss-6.0.0.Final/server/default/deploy/solr.war/WEB-INF/lib/velocity-tools-2.0-beta3.jar/META-INF/velocity-view.tld[22,16] ->
I wonder why this error occurred.
I tried deploying both Solr 1.4 and Solr 4.0 to the same server, and no error occurred.
(My deployment method: use the JBoss AS 6 Admin Console and add "solr.war" as a new resource for a standalone web application.)
Thank you for your attention; any help is appreciated.
Me again :) Good news: I fixed it. I just edited this file: solr.war\WEB-INF\lib\velocity-tools-2.0-beta3.jar\META-INF\velocity-view.tld
Apparently the bundled descriptor uses the old element name tlibversion, which the JSP 1.2 tag library DTD does not declare, so I replaced the file with the JSP 1.2-style descriptor below (you can copy and paste it as is):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE taglib PUBLIC "-//Sun Microsystems, Inc.//DTD JSP Tag Library 1.2//EN" "http://java.sun.com/dtd/web-jsptaglibrary_1_2.dtd">
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
<taglib>
<tlib-version>1.0</tlib-version>
<jsp-version>1.2</jsp-version>
<short-name>velocity</short-name>
<uri>http://velocity.apache.org/velocity-view</uri>
<display-name>VelocityView Tag</display-name>
<description><![CDATA[Support for using Velocity and VelocityTools within JSP files and tags.
This makes it trivial to render VTL (Velocity Template Language)
or process a Velocity template from within JSP using the current
context. This also provides the typical VelocityView support
for accessing and configuring both custom and provided
VelocityTools.]]></description>
<tag>
<name>view</name>
<tag-class>org.apache.velocity.tools.view.jsp.VelocityViewTag</tag-class>
<body-content>tagdependent</body-content>
<attribute>
<name>id</name>
<required>false</required>
<rtexprvalue>true</rtexprvalue>
<description><![CDATA[An id unique to this usage of the VelocityViewTag. This id is used to uniquely identify this tag in log messages and hopefully at some point serve as a key under which any body for this tag may be cached as an already-parsed template for improved performance. If no id is specified, then a unique one is automatically generated, though that will understandably be less useful in log messages.]]></description>
</attribute>
<attribute>
<name>var</name>
<required>false</required>
<rtexprvalue>true</rtexprvalue>
<description><![CDATA[A variable name whose value should be set to the rendered result of this tag.]]></description>
</attribute>
<attribute>
<name>scope</name>
<required>false</required>
<rtexprvalue>true</rtexprvalue>
<description><![CDATA[This property is meaningless unless a 'var' attribute is also set. When it is, this determines the scope into which the resulting variable is set.]]></description>
</attribute>
<attribute>
<name>template</name>
<required>false</required>
<rtexprvalue>true</rtexprvalue>
<description><![CDATA[The name of a template to be requested from the configured Velocity resource loaders and rendered into the page (or variable if the 'var' attribute is set) using the current context. If this tag also has body content, then the body will be rendered first and placed into the context used to render the template as '$bodyContent'; this approximates the "two-pass render" used by the VelocityLayoutServlet.]]></description>
</attribute>
<attribute>
<name>bodyContentKey</name>
<required>false</required>
<rtexprvalue>true</rtexprvalue>
<description><![CDATA[This property is meaningless unless a 'template' attribute is set and the tag has body content in it. When it is, this changes the key under which the rendered result of the body content is placed into the context for use by the specified template. The default value is "bodyContent" and should be sufficient for nearly all users.]]></description>
</attribute>
</tag>
</taglib>