I have a Mule flow that fetches data from a table in a PostgreSQL database, converts it into XML, and writes it to a file:
<mule ...>
<spring:bean id="Postgres-jdbcDataSource"
class="org.enhydra.jdbc.standard.StandardDataSource" destroy-method="shutdown">
<spring:property name="driverName" value="org.postgresql.Driver" />
<spring:property name="url"
value="jdbc:postgresql://host:port/schema?user=username&password=password" />
</spring:bean>
<jdbc:connector name="Postgres-jdbcConnector"
dataSource-ref="Postgres-jdbcDataSource" pollingFrequency="60000"
transactionPerMessage="false">
<jdbc:query key="read" value="SELECT * FROM tablename" />
</jdbc:connector>
<file:connector name="file_connector" fileAge="500"
streaming="false" pollingFrequency="60000" />
<flow name="Postgres-flow">
<jdbc:inbound-endpoint queryKey="read"
connector-ref="Postgres-jdbcConnector">
<jdbc:transaction action="ALWAYS_BEGIN" />
<property key="receiveMessageInTransaction" value="true" />
</jdbc:inbound-endpoint>
<custom-transformer name="Postgres-transformer"
class="com.example.transformer.DbToXmlTransformer" ignoreBadInput="false"
encoding="UTF-8" />
<file:outbound-endpoint connector-ref="file_connector"
path="/home/path" outputPattern="file.xml" responseTimeout="10000"
encoding="UTF-8" />
</flow>
</mule>
When I run this flow, it does not fetch data from the database or write to the file. It does not throw any errors or exceptions either. But when I run the same flow against a MySQL or SQL Server database, changing the driverName and url properties accordingly, the flow works fine.
Any idea why it does not work with the PostgreSQL database? Does it perhaps require a different DataSource class?
There is also a PostgreSQL data source element for Mule that you can use instead of the Spring bean:
<jdbc:postgresql-data-source name="PostgreSQL_Data_Source" user="your user name" password="your pwd" url="jdbc:postgresql://localhost:5432/TestDB" transactionIsolation="UNSPECIFIED" doc:name="PostgreSQL Data Source"/>
Anyway, with your existing config you can check what is happening by wrapping the JDBC inbound endpoint in a poll component and placing a logger before the file outbound endpoint to inspect the payload. If the logger prints a payload value, the flow is fetching the data. Let me know if it works. You can try the following:
<mule ...>
<spring:bean id="Postgres-jdbcDataSource"
class="org.enhydra.jdbc.standard.StandardDataSource" destroy-method="shutdown">
<spring:property name="driverName" value="org.postgresql.Driver" />
<spring:property name="url"
value="jdbc:postgresql://host:port/schema?user=username&password=password" />
</spring:bean>
<jdbc:connector name="Postgres-jdbcConnector"
dataSource-ref="Postgres-jdbcDataSource" pollingFrequency="60000"
transactionPerMessage="false">
<jdbc:query key="read" value="SELECT * FROM tablename" />
</jdbc:connector>
<file:connector name="file_connector" fileAge="500"
streaming="false" pollingFrequency="60000" />
<flow name="Postgres-flow">
<poll frequency="1000" doc:name="Poll">
<jdbc:inbound-endpoint queryKey="read"
connector-ref="Postgres-jdbcConnector">
<jdbc:transaction action="ALWAYS_BEGIN" />
<property key="receiveMessageInTransaction" value="true" />
</jdbc:inbound-endpoint>
</poll>
<!-- You can also use the object-to-XML transformer if you are not using a custom transformer -->
<!--<mulexml:object-to-xml-transformer doc:name="Object to XML"/> -->
<custom-transformer name="Postgres-transformer"
class="com.example.transformer.DbToXmlTransformer" ignoreBadInput="false"
encoding="UTF-8" />
<logger message="Payload :- #[message.payload]" level="INFO" doc:name="Logger"/>
<file:outbound-endpoint connector-ref="file_connector"
path="/home/path" outputPattern="file.xml" responseTimeout="10000"
encoding="UTF-8" />
</flow>
</mule>
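If the logger still shows no payload, it can also help to test the same DataSource outside Mule to rule out the driver setup. Below is a rough standalone sketch (not part of the flow): it reuses the StandardDataSource class and URL pattern from your config, with host, port, credentials and tablename as placeholders to replace, and assumes the PostgreSQL driver and XAPool jars are on the classpath.
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import org.enhydra.jdbc.standard.StandardDataSource;

public class PostgresDataSourceCheck {
    public static void main(String[] args) throws Exception {
        StandardDataSource ds = new StandardDataSource();
        ds.setDriverName("org.postgresql.Driver");
        // Same URL style as in the Mule config; replace the placeholders.
        ds.setUrl("jdbc:postgresql://host:port/schema?user=username&password=password");
        try (Connection con = ds.getConnection();
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM tablename")) {
            int rows = 0;
            while (rs.next()) {
                rows++;
            }
            System.out.println("Fetched " + rows + " row(s) from PostgreSQL");
        } finally {
            ds.shutdown(); // same method the Spring bean's destroy-method calls
        }
    }
}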
For more, here is a reference on using a PostgreSQL database with Mule: http://www.dotnetfunda.com/articles/show/2068/using-mule-studio-to-read-data-from-postgresqlinbound-and-write-it-to
Related
I am trying to connect to a DB2 database on a mainframe. I am using the db2jcc.jar driver. My config looks like this:
<spring:beans>
<spring:bean id="db2DataSource" name="db2DataSource" class="com.ibm.db2.jcc.DB2DataSource" destroy-method="finalize" scope="singleton">
<spring:property name="serverName" value="mycompany.com"/>
<spring:property name="portNumber" value="7803"/>
<spring:property name="databaseName" value="DBNAME"/>
<spring:property name="driverType" value="4"/>
<spring:property name="user" value="username"/>
<spring:property name="password" value="password"/>
</spring:bean>
</spring:beans>
<db:generic-config name="DB2_Database"
driverClassName="com.ibm.db2.jcc.DB2Driver"
doc:name="Generic Database Configuration" dataSource-ref="db2DataSource"/>
<flow name="databaseexampleFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/test" doc:name="HTTP"/>
<db:select config-ref="DB2_Database" doc:name="Select from Table">
<db:dynamic-query><![CDATA[SELECT * FROM DB2.EA_SALEFRC_PRCSPOC;]]></db:dynamic-query>
</db:select>
<logger message="Selection: #[payload]" level="INFO" doc:name="Logger"/>
</flow>
I am getting an error complaining about the DB2DataSource class.
Caused by: java.lang.ClassNotFoundException: Cannot load class 'com.ibm.db2.jcc.DB2DataSource'
I can test the connection and it works fine. Any ideas?
Please check that, when you build your project for deployment, the JARs for the DB2 driver are exported with your project.
You can go to
{mule home}/apps/{your application}/lib
and check whether the JARs are there.
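If the JARs are in place and the class still cannot be found, a quick way to confirm whether the driver classes are visible at all is to load them by name. A rough standalone sketch, run with db2jcc.jar (and the DB2 license jar, if your driver version requires one) on the classpath:
// Quick classloading check for the DB2 JDBC driver classes referenced in the config.
public class Db2DriverCheck {
    public static void main(String[] args) {
        String[] classes = {
            "com.ibm.db2.jcc.DB2Driver",
            "com.ibm.db2.jcc.DB2DataSource"
        };
        for (String name : classes) {
            try {
                Class.forName(name);
                System.out.println("OK: " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("MISSING: " + name);
            }
        }
    }
}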
I'm trying to use the SmtpAppender of log4net in order to send logs via Gmail. But it's not working: I did not receive any email. Internal logging didn't show any errors, so I'm not even sure whether it failed or not.
Here is the config:
<appSettings>
<add key="log4net.Internal.Debug" value="true" />
</appSettings>
<system.diagnostics>
<trace autoflush="true">
<listeners>
<add name="tracer"
type="System.Diagnostics.TextWriterTraceListener"
initializeData="D:\\Dev\\Camps\\log4net.log" />
</listeners>
</trace>
</system.diagnostics>
<log4net>
<root>
<level value="ALL" />
<appender-ref ref="SmtpAppender" />
</root>
<appender name="SmtpAppender" type="log4net.Appender.SmtpAppender">
<authentication value="Basic" />
<username value="...@gmail.com" />
<password value="..." />
<to value="...@gmail.com" />
<from value="...@gmail.com" />
<subject value="log4net message from Camps.DAL" />
<smtpHost value="smtp.gmail.com" />
<port value="587"/>
<bufferSize value="1" />
<EnableSsl value="true"/>
<lossy value="false" />
<evaluator type="log4net.Core.LevelEvaluator">
<threshold value="ALL"/>
</evaluator>
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%newline%date [%thread] %-5level %logger [%property{NDC}] - %message%newline%newline%newline" />
</layout>
</appender>
</log4net>
It seems you have to use your username instead of your email address:
<username value="...@gmail.com" /> <<---- username, not email address
I am trying to use a logger that will log to MongoDB, but I cannot get it to work. In the same configuration I have set up loggers that log to email and to a file, and both work just fine.
Here is my NLog.config file
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<extensions>
<add assembly="NLog.MongoDB"/>
</extensions>
<!--
See http://nlog-project.org/wiki/Configuration_file
for information on customizing logging rules and outputs.
-->
<targets>
<target xsi:type="File" name="file" fileName="${basedir}/logs/${shortdate}.log"
layout="${longdate} ${uppercase:${level}} ${message}" />
<target xsi:type="Mongo"
name="mongoDefault"
connectionString="mongodb://localhost/nlog"
collectionName="cdss"
cappedCollectionSize="26214400">
<property name="ThreadID" layout="${threadid}" bsonType="Int32" />
<property name="ThreadName" layout="${threadname}" />
<property name="ProcessID" layout="${processid}" bsonType="Int32" />
<property name="ProcessName" layout="${processname:fullName=true}" />
<property name="UserName" layout="${windows-identity}" />
</target>
<target name="TcpOutlet" xsi:type="Chainsaw" address="tcp4://localhost:4505" > </target>
<target name="Email" xsi:type="Mail"
smtpServer="localhost"
smtpPort="25"
smtpAuthentication="None"
enableSsl="false"
from="ssss#sdsdfs"
to="ssss#sdsdfs" html="true"
/>
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="file,TcpOutlet,mongoDefault,Email" />
<logger name="*" minlevel="Error" writeTo="file,TcpOutlet,Email,mongoDefault" />
</rules>
</nlog>
I have also installed the NLog.Mongo NuGet package. My database is called nlog. Whatever I do, the loggers do not write to MongoDB. I am using NLogger.
I am using JBoss AS 5.1.0 and JBoss ESB 4.10.
I am trying to invoke a service which has a single action. I have set MEP = OneWay for the service.
When I invoke the service using the method below, I do not get a reply but an exception.
new ServiceInvoker("Chapter3Sample", "Chapter3Service").deliverSync(esbMessage, 10000);
When I change mep=RequestResponse, I am able to get the reply.
As per my understanding, the ESB message has a ReplyTo field (since I am invoking a synchronous request), so the message should be returned by the last action, which is not happening in my case. Please find the ESB XML below:
<?xml version="1.0"?>
<jbossesb parameterReloadSecs="5"
xmlns="http://anonsvn.labs.jboss.com/labs/jbossesb/trunk/product/etc/schemas/xml/jbossesb-1.0.1.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://anonsvn.labs.jboss.com/labs/jbossesb/trunk/product/etc/schemas/xml/jbossesb-1.0.1.xsd http://anonsvn.jboss.org/repos/labs/labs/jbossesb/trunk/product/etc/schemas/xml/jbossesb-1.0.1.xsd">
<providers>
<jms-provider connection-factory="ConnectionFactory" name="JBossMQ">
<jms-bus busid="chapter3GwChannel">
<jms-message-filter dest-name="queue/chapter3_Request_gw" dest-type="QUEUE"/>
</jms-bus>
<jms-bus busid="chapter3EsbChannel">
<jms-message-filter dest-name="queue/chapter3_Request_esb" dest-type="QUEUE"/>
</jms-bus>
</jms-provider>
</providers>
<services>
<service category="Chapter3Sample"
description="A template for Chapter3" name="Chapter3Service">
<listeners>
<jms-listener busidref="chapter3GwChannel" is-gateway="true" name="Chapter3GwListener"/>
<jms-listener busidref="chapter3EsbChannel" name="Chapter3Listener"/>
</listeners>
<actions mep="OneWay">
<action class="org.jboss.soa.esb.samples.chapter3.MyAction"
name="BodyPrinter">
<property name="process" value="displayMessage"/>
<property name="symbol" value="*"/>
<property name="count" value="50"/>
<property name="propertyName">
<hierarchicalProperty attr="value">
<inner name="myName" random="randomValue"/>
</hierarchicalProperty>
</property>
<property name="exceptionMethod" value="processException"/>
<property name="okMethod" value="processSuccess"/>
</action>
</actions>
</service>
</services>
</jbossesb>
When you are invoking the call synchronously:
new ServiceInvoker("Chapter3Sample", "Chapter3Service").deliverSync(esbMessage, 10000);
set mep="RequestResponse".
When you are invoking the call asynchronously:
new ServiceInvoker("Chapter3Sample", "Chapter3Service").deliverAsync(esbMessage);
set mep="OneWay".
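Putting the two cases together, a rough sketch of the client code (assuming the standard JBoss ESB client API and the service category/name from your config; the payload string is just an example):
import org.jboss.soa.esb.client.ServiceInvoker;
import org.jboss.soa.esb.message.Message;
import org.jboss.soa.esb.message.format.MessageFactory;

public class Chapter3Client {
    public static void main(String[] args) throws Exception {
        // Build a message with a simple string payload.
        Message esbMessage = MessageFactory.getInstance().getMessage();
        esbMessage.getBody().add("Hello Chapter3Service");

        ServiceInvoker invoker = new ServiceInvoker("Chapter3Sample", "Chapter3Service");

        // With mep="RequestResponse" on the <actions> element, a reply is returned:
        Message reply = invoker.deliverSync(esbMessage, 10000);
        System.out.println("Reply body: " + reply.getBody().get());

        // With mep="OneWay" there is no reply to wait for, so deliver asynchronously instead:
        // invoker.deliverAsync(esbMessage);
    }
}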
I am using MyBatis 3.0.4 for a test against a MySQL 5.5 database with the MySQL Connector/J JDBC driver, version 5.1.16.
The problem I am experiencing is that if I get a SqlSession via the openSession() method and retrieve data via a select from the database, subsequent selects in the same session are not aware of changes made (and committed) to the database, even if I call clearCache() on the session. To concurrently modify the database I am using the MySQL command line client. Setting cacheEnabled to false in the configuration file doesn't help either.
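For illustration, the access pattern that shows the stale reads is roughly the following (a simplified sketch; the statement id TestMapper.selectAll and the config resource name are placeholders for my actual code):
import java.io.Reader;
import java.util.List;
import org.apache.ibatis.io.Resources;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;

public class StaleReadDemo {
    static SqlSessionFactory getSessionFactory() throws Exception {
        Reader reader = Resources.getResourceAsReader("mybatis-config.xml");
        try {
            return new SqlSessionFactoryBuilder().build(reader);
        } finally {
            reader.close();
        }
    }

    public static void main(String[] args) throws Exception {
        SqlSession session = getSessionFactory().openSession();
        try {
            List<Object> first = session.selectList("TestMapper.selectAll");
            System.out.println("First read: " + first.size() + " rows");

            // ...rows are inserted and committed from the mysql command line client here...
            Thread.sleep(30000);

            session.clearCache(); // does not help
            List<Object> second = session.selectList("TestMapper.selectAll");
            // With the driver's default isolation level this still reports the old row count.
            System.out.println("Second read: " + second.size() + " rows");
        } finally {
            session.close();
        }
    }
}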
My configuration file is below.
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE configuration
PUBLIC "-//ibatis.apache.org//DTD Config 3.0//EN"
"http://ibatis.apache.org/dtd/ibatis-3-config.dtd">
<configuration>
<settings>
<setting name="lazyLoadingEnabled" value="false"/>
<setting name="cacheEnabled" value="false"/>
</settings>
<environments default="development">
<environment id="development">
<transactionManager type="JDBC" />
<dataSource type="POOLED">
<property name="poolMaximumIdleConnections" value="20"></property>
<property name="poolMaximumActiveConnections" value="80"></property>
<property name="poolMaximumCheckoutTime" value="600"></property>
<property name="poolTimeToWait" value="600"></property>
<property name="driver" value="com.mysql.jdbc.Driver"/>
<property name="url" value="jdbc:mysql://localhost:3306/testdb"/>
<property name="username" value="root"/>
<property name="password" value="password"/>
</dataSource>
</environment>
</environments>
<mappers>
<mapper resource="mappers/TestMapper.xml" />
</mappers>
</configuration>
OK, solved it myself. It was not a MyBatis issue; it was the transaction isolation level of the JDBC connection, which by default was TRANSACTION_REPEATABLE_READ, whereas I needed TRANSACTION_READ_COMMITTED. Solved with:
getSessionFactory().openSession(TransactionIsolationLevel.READ_COMMITTED);
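For completeness, the fix in context looks roughly like this (same placeholder statement id and config resource name as the sketch in the question):
import java.io.Reader;
import java.util.List;
import org.apache.ibatis.io.Resources;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;
import org.apache.ibatis.session.TransactionIsolationLevel;

public class ReadCommittedDemo {
    public static void main(String[] args) throws Exception {
        Reader reader = Resources.getResourceAsReader("mybatis-config.xml");
        SqlSessionFactory factory = new SqlSessionFactoryBuilder().build(reader);
        reader.close();

        // Open the session with READ_COMMITTED so later selects in the same
        // session see rows committed by other connections.
        SqlSession session = factory.openSession(TransactionIsolationLevel.READ_COMMITTED);
        try {
            List<Object> rows = session.selectList("TestMapper.selectAll");
            System.out.println("Rows: " + rows.size());
        } finally {
            session.close();
        }
    }
}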