How is one supposed to build tests with the repository approach in Spring Data MongoDB? I would like to use a test database for my tests, since I don't want to use the production database for this purpose. It should probably be possible, but I have no idea how. This is my application context:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns:neo4j="http://www.springframework.org/schema/data/neo4j"
xsi:schemaLocation=
"http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context-3.0.xsd
http://www.springframework.org/schema/data/mongo
http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/neo4j
http://www.springframework.org/schema/data/neo4j/spring-neo4j.xsd">
<!-- Default bean name is 'mongo' -->
<mongo:mongo host="${mongo.host}" port="${mongo.port}">
<mongo:options connections-per-host="8"
threads-allowed-to-block-for-connection-multiplier="4"
connect-timeout="${mongo.connect-timeout}"
max-wait-time="${mongo.max-wait-time}"
auto-connect-retry="true"
socket-keep-alive="true"
socket-timeout="${mongo.socket-timeout}"
slave-ok="true"
write-number="1"
write-timeout="0"
write-fsync="true"/>
</mongo:mongo>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="mongo" />
<constructor-arg name="databaseName" value="${mongo.db}" />
</bean>
<context:component-scan base-package="domain.company.group.project.data.repositories"/>
<!-- MongoDB repositories -->
<mongo:repositories base-package="domain.company.group.project.data.repositories.mongodb"/>
<!-- some other stuff -->
</beans>
And let's say I have a simple repository as follows:
public interface LocationRepository extends MongoRepository<Location, String>, LocationRepositoryCustom {
}
where LocationRepositoryImpl is the class implementing all my custom methods for a certain Location (domain object) class. My test class looks like:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations={"/test-context.xml"})
public class LocationRepositoryTest {

    @Autowired
    private LocationRepository locationRepository;

    /* Some tests... */
}
I have tried to embed a MongoDB instance within my running tests (as explained here), but it does not work: the connection to the test database is established, but the mongo template does not seem to get overridden, as all save methods keep inserting data into the "production" database.
I am using Spring 3.2.0 and Spring Data MongoDB 1.1.0.RELEASE, and JUnit for testing.
Any suggestions?
Thank you in advance.
Jaranda,
I faced the same problem last week and coincidentally heard about Fongo, "an in-memory java implementation of mongo."
So I decided to use it to test my custom repositories, and it worked perfectly for me. Below is an example of how to configure Spring to use Fongo in JUnit tests. Note that I'm not using XML configuration.
Hope that will be useful!
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class LocationRepositoryTest {

    private static final String PLAYER_ID = ObjectId.get().toString();

    @Autowired private LocationRepositoryCustom playerRepository;
    @Autowired private MongoTemplate mongoTemplate;

    /* Some tests... */

    @Configuration
    static class LocationRepositoryTestConfiguration {

        @Bean
        public Mongo mongo() {
            // Configure a Fongo instance
            return new Fongo("mongo-test").getMongo();
        }

        @Bean
        public MongoTemplate mongoTemplate() {
            return new MongoTemplate(mongo(), "collection-name");
        }

        @Bean
        public LocationRepositoryCustom playerRepository() {
            // This is necessary if MongoTemplate is an argument of the custom implementation's constructor
            return new LocationRepositoryCustomImpl(mongoTemplate());
        }
    }
}
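For completeness, a couple of test methods inside the class above could look roughly like the sketch below. This is only a sketch: the no-arg constructor and the setId setter on Location are assumptions, so adapt them to your actual domain class.

@Before
public void setUp() {
    // Start every test from an empty in-memory database
    mongoTemplate.dropCollection(Location.class);
}

@Test
public void savedLocationCanBeReadBack() {
    Location location = new Location();   // assumes a public no-arg constructor
    location.setId(PLAYER_ID);            // assumes a String id property with a setter
    mongoTemplate.save(location);

    assertNotNull(mongoTemplate.findById(PLAYER_ID, Location.class));
}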
Related
@Bean
public LockProvider lockProvider(DataSource dataSource) {
    return new JdbcTemplateLockProvider(dataSource);
}

@Bean
public ScheduledLockConfiguration taskScheduler(LockProvider lockProvider) {
    return ScheduledLockConfigurationBuilder
        .withLockProvider(lockProvider)
        .withPoolSize(10)
        .withDefaultLockAtMostFor(Duration.ofMinutes(10))
        .build();
}
My requirement is to run only a single scheduler, on only one instance, in a clustered environment. For this I am using ShedLock, but the problem is that at server startup I get the exception below: "java.lang.ClassCastException: net.javacrumbs.shedlock.spring.SpringLockableTaskSchedulerFactoryBean cannot be cast to org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler"
Please help me with this.
You can easily do this with dlock. You simply do the following and add the registrar to your XML config.
Java Code
@TryLock(name = "doSomeWork", owner = "serviceA", lockFor = ONE_MINUTE)
public void doSomeWork() {
    //...
}
XML Config
<!-- A bean for the lock implementation. Note that there should be only one global implementation-->
<bean id="postgresLock" class="com.yusufaytas.dlock.jdbc.PostgresIntervalLock">
<constructor-arg type="javax.sql.DataSource" ref="lockDataSource"/>
</bean>
<!-- The lock gets auto-registered to the registrar -->
<bean id="lockRegistrar" class="com.yusufaytas.dlock.spring.IntervalLockRegistrar"/>
I have a problem with PDX serialization on a remote Geode instance when using Spring Boot to create a Geode client cache as follows:
@Configuration
public class GeodeClientConfiguration {

    @Bean
    ClientCache cache() {
        return new ClientCacheFactory()
            .setPdxPersistent(true)
            .setPdxDiskStore("foo")
            .setPdxReadSerialized(true)
            .setPdxSerializer(new ReflectionBasedAutoSerializer(false, "foo.EpgProgram"))
            .create();
    }

    @Bean
    Region<String, List<EpgProgram>> testRegion(final ClientCache cache) {
        return cache.<String, List<EpgProgram>>getRegion("schedule");
    }
}
The cache.xml looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<client-cache
xmlns="http://geode.apache.org/schema/cache"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://geode.apache.org/schema/cache
http://geode.apache.org/schema/cache/cache-1.0.xsd"
version="1.0">
<pool name="serverPool">
<locator host="localhost" port="10334"/>
</pool>
<region name="schedule" refid="CACHING_PROXY">
<region-attributes pool-name="serverPool"
scope="global" />
</region>
</client-cache>
In Gfsh I have created a region as
create region --name=/schedule --type=REPLICATE_PERSISTENT
When I add a List of EpgProgram to the region during testing, in this method:
public List<EpgProgram> getScheduleFromWhatson(String channel, LocalDate broadcastDate, Boolean expand) throws RestClientException, URISyntaxException {
List<EpgProgram> programs = transform(whatsOnServiceInternal.getScheduleFromWhatson(channel, broadcastDate), expand);
schedule.put(channel, programs);
return programs;
}
the PDX instance seems to get generated using reflection, from what I can see in the info trace:
[info 2016/12/09 11:32:33.361 CET <http-nio-8080-exec-1> tid=0xc8] Auto serializer generating type for class dk.dr.epg.core.EpgProgram for fields:
printable: private boolean dk.dr.epg.core.EpgProgram.printable
live: private boolean dk.dr.epg.core.EpgProgram.live
rerun: private boolean dk.dr.epg.core.EpgProgram.rerun
But just after that I get an exception:
org.apache.geode.pdx.PdxInitializationException: The PDX metadata must be persistent in a member that has persistent data. See CacheFactory.setPdxPersistent.
Have I missed any other place where I have to set PDX persistence?
Geode version: 1.0.0-incubating.
You need to configure the PDX type registry to be persistent on the server. You can do that in gfsh like this:
gfsh> configure pdx --disk-store=DEFAULT
I think you need to restart the server after that for the changes to take effect.
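The exception message also points at CacheFactory.setPdxPersistent, which is the programmatic equivalent if the server cache is bootstrapped from Java rather than gfsh/cache.xml. A minimal sketch, assuming the disk store is the DEFAULT one:

// Somewhere in the server bootstrap code (not the client):
Cache serverCache = new CacheFactory()
        .setPdxPersistent(true)      // persist the PDX type registry, as the exception demands
        .setPdxDiskStore("DEFAULT")  // assumed disk store name
        .create();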
I need to periodically check about 30 mailboxes and want to do this with annotations only. I know how to do it with XML files; it looks like this:
<mail:inbound-channel-adapter id="ImapAdapter"
store-uri="imaps://${login}:${pass}#${host}:993/inbox"
channel="testReceiveEmailChannel"
should-delete-messages="false"
should-mark-messages-as-read="true"
auto-startup="true"
java-mail-properties="javaMailProperties">
<int:poller fixed-delay="200"
time-unit="SECONDS"
task-executor="asyncTaskExecutor"/>
</mail:inbound-channel-adapter>
<int:channel id="testReceiveEmailChannel">
<int:interceptors>
<int:wire-tap channel="logger"/>
</int:interceptors>
</int:channel>
<int:service-activator input-channel="testReceiveEmailChannel"
ref="testMailReceiverService"
method="receive"/>
<bean id="testMailReceiverService" class="com.myproject.email.EmailReceiverService">
<property name="mailBox" value="${login}"/>
</bean>
<int:logging-channel-adapter id="logger" level="DEBUG"/>
I know that Spring 4+ has @InboundChannelAdapter, but I don't know how to use it. Actually, I am new to Spring, so any help is much appreciated!
You are looking in the right direction: @InboundChannelAdapter. If you take a proper look at the documentation, you'll see something like this:
@Bean
@InboundChannelAdapter(value = "testReceiveEmailChannel", poller = @Poller(fixedDelay = "200000", taskExecutor = "asyncTaskExecutor"))
public MessageSource<javax.mail.Message> mailMessageSource(MailReceiver mailReceiver) {
    MailReceivingMessageSource mailReceivingMessageSource = new MailReceivingMessageSource(mailReceiver);
    // other setters here
    return mailReceivingMessageSource;
}
Where MailReceiver is something like this:
@Bean
public MailReceiver imapMailReceiver(@Value("imaps://${login}:${pass}@${host}:993/inbox") String storeUrl) {
    ImapMailReceiver imapMailReceiver = new ImapMailReceiver(storeUrl);
    // other setters here
    return imapMailReceiver;
}
and so on with other @Beans for the MessageChannel and an @ServiceActivator for your EmailReceiverService.
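Those remaining pieces could be sketched roughly like this (a sketch only: it assumes @EnableIntegration on the configuration class, and the javax.mail.Message parameter matches what the mail adapter emits, but the exact signature of your receive method is up to you):

@Bean
public MessageChannel testReceiveEmailChannel() {
    return new DirectChannel();
}

// On the EmailReceiverService bean itself:
@ServiceActivator(inputChannel = "testReceiveEmailChannel")
public void receive(javax.mail.Message message) {
    // handle the incoming mail here
}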
Also consider the Spring Integration Java DSL as a tool for Java configuration.
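With the DSL, the whole flow can be expressed in one bean definition. A rough sketch (the URL is hard-coded for brevity, and the method names come from the spring-integration-mail DSL, so double-check them against the version you use):

@Bean
public IntegrationFlow imapMailFlow() {
    return IntegrationFlows
            .from(Mail.imapInboundAdapter("imaps://login:pass@host:993/inbox")
                      .shouldMarkMessagesAsRead(true)
                      .shouldDeleteMessages(false),
                  e -> e.poller(Pollers.fixedDelay(200, TimeUnit.SECONDS)))
            .handle("testMailReceiverService", "receive")
            .get();
}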
I want to bind the variable dataSourceString (possible values: HR, FINANCE; I get the dataSourceString value dynamically from a JSP) to a DataSource. When the dataSourceString value is HR, connect to TESTDS; when it is FINANCE, connect to TESTDS1. In other words, I want to choose the data source based on the dataSourceString value.
Environment: EJB 3, WebLogic 10.3.3, JPA
Note: I don't want to write an if-else chain in the session bean (when dataSourceString is HR connect to this EntityManager, else to a different EntityManager). Currently there are 10-15 possible values of dataSourceString. I want to write the code so that if a new dataSourceString value is added in the future, I only have to change persistence.xml.
After some research I came up with the following code, but I am getting an error.
Error:-
No persistence unit named 'em' is available in scope test.jar. Available persistence units: [HR, FINANCE]
at weblogic.ejb.container.deployer.EJBModule.prepare(EJBModule.java:467)
at weblogic.application.internal.flow.ModuleListenerInvoker.prepare(ModuleListenerInvoker.java:199)
at weblogic.application.internal.flow.DeploymentCallbackFlow$1.next(DeploymentCallbackFlow.java:507)
at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:41)
at weblogic.application.internal.flow.DeploymentCallbackFlow.prepare(DeploymentCallbackFlow.java:149)
Truncated. see log file for complete stacktrace
Caused By: java.lang.IllegalArgumentException: No persistence unit named 'em' is available in scope test.jar. Available persistence units: [HR, FINANCE]
at weblogic.deployment.ModulePersistenceUnitRegistry.getPersistenceUnit(ModulePersistenceUnitRegistry.java:132)
at weblogic.deployment.BasePersistenceContextProxyImpl.<init>(BasePersistenceContextProxyImpl.java:38)
at weblogic.deployment.TransactionalEntityManagerProxyImpl.<init>(TransactionalEntityManagerProxyImpl.java:35)
at weblogic.deployment.BaseEnvironmentBuilder.createPersistenceContextProxy(BaseEnvironmentBuilder.java:974)
at weblogic.deployment.BaseEnvironmentBuilder.addPersistenceContextRefs(BaseEnvironmentBuilder.java:855)
Truncated. see log file for complete stacktrace
The error is obvious: no persistence unit named 'em' is available in persistence.xml. But how can I achieve dynamic lookup of the data source using JPA?
Following is my code.
Session Bean
package entity.library;
import java.util.Collection;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import javax.persistence.PersistenceContext;
import javax.persistence.PersistenceUnit;
import java.io.Serializable;
import javax.ejb.*;
@Remote(TestInterface.class)
@Stateless(mappedName="ejb3/TestBeans")
public class TestSessionBean implements Serializable, TestInterface
{
    protected TestJPA test;
    protected Collection <TestJPA> list;

    @PersistenceContext
    private EntityManager em;

    @PersistenceUnit
    private EntityManagerFactory emf;

    public Collection <TestJPA> getAllList(String dataSourceString) {
        emf = Persistence.createEntityManagerFactory(dataSourceString);
        em = emf.createEntityManager();
        list = em.createQuery("SELECT test FROM TestJPA test").getResultList();
        return list;
    }
}
persistence.xml
<persistence version="1.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd">
<persistence-unit name="HR" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>TESTDS</jta-data-source>
<non-jta-data-source>TESTDS</non-jta-data-source>
<properties>
<property name="eclipselink.target-server" value="WebLogic_10"/>
<property name="eclipselink.logging.level" value="FINEST"/>
</properties>
</persistence-unit>
<persistence-unit name="FINANCE" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>TESTDS1</jta-data-source>
<non-jta-data-source>TESTDS1</non-jta-data-source>
<properties>
<property name="eclipselink.target-server" value="WebLogic_10"/>
<property name="eclipselink.logging.level" value="FINEST"/>
</properties>
</persistence-unit>
</persistence>
If there is a single persistence unit defined in persistence.xml, then it is the default unit for the application, and it is the one injected by the annotation.
You can manually look up a specific persistence context at runtime:
javax.persistence.EntityManager entityManager =
(javax.persistence.EntityManager)initCtx.lookup(
"java:comp/env/" + persistenceContext);
Even better...
Instead of:
@PersistenceContext(name="myPU")
private EntityManager em;
specify:
@PersistenceContext(unitName="myPU")
private EntityManager em;
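If the unit really has to be chosen at runtime from dataSourceString, the manual lookup shown above can be wrapped in a small helper, so only the declared references (and persistence.xml) change when a new unit is added. A sketch, assuming the persistence contexts are exposed to the bean under java:comp/env names matching dataSourceString:

private EntityManager entityManagerFor(String dataSourceString) {
    try {
        // "HR", "FINANCE", ... must match the declared persistence-context-ref names
        return (EntityManager) new InitialContext()
                .lookup("java:comp/env/" + dataSourceString);
    } catch (NamingException e) {
        throw new IllegalArgumentException("No persistence unit mapped for " + dataSourceString, e);
    }
}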
I am looking up multiple data sources depending on the value of x, in EJB 3.0.
To do this I have written the following code.
Session Bean
package entity.library;
import java.util.Collection;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import java.io.Serializable;
import javax.ejb.*;
@Remote(TestInterface.class)
@Stateless(mappedName="ejb3/TestBeans")
public class TestSessionBean implements Serializable, TestInterface {
/**
*
*/
private static final long serialVersionUID = 1L;
#PersistenceContext(unitName="EntityBeanDS1")
EntityManager emds1;
#PersistenceContext(unitName="EntityBeanDS2")
EntityManager emds2;
protected TestJPA test;
protected Collection <TestJPA> list;
public Collection <TestJPA> getAllList(int x) {
System.out.println("TestInterface.java:getAllPmns x "+x);
if(x==1)
{
System.out.println("going to lookup datasource1");
list=emds1.createQuery("SELECT test FROM TestJPA test").getResultList();
}
else if(x==2)
{
System.out.println("going to lookup datasource2");
list=emds2.createQuery("SELECT test FROM TestJPA test").getResultList();
}
return list;
}
}
persistence.xml
<persistence version="1.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd">
<persistence-unit name="EntityBeanDS1" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>TESTDS</jta-data-source>
<non-jta-data-source>TESTDS</non-jta-data-source>
<properties>
<property name="eclipselink.target-server" value="WebLogic_10"/>
<property name="eclipselink.logging.level" value="FINEST"/>
</properties>
</persistence-unit>
<persistence-unit name="EntityBeanDS2" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>TESTDS1</jta-data-source>
<non-jta-data-source>TESTDS1</non-jta-data-source>
<properties>
<property name="eclipselink.target-server" value="WebLogic_10"/>
<property name="eclipselink.logging.level" value="FINEST"/>
</properties>
</persistence-unit>
</persistence>
The above code works successfully, but I think this is not a good technique, for the following reasons:
1. There are 10-15 session beans, and in each bean I have to write an if-else for the data source lookup.
2. If a new data source or a new value of x is added in the future, I have to modify all 10-15 files.
Can someone give me code for connecting to multiple data sources such that I only have to change a single file when the value of x changes? What would that single file look like so that I can retrieve the EntityManager object? Or is there another method (like a modification in persistence.xml) to do this?
You could inject both entity managers into another session bean (QueryBean) and inject QueryBean into your session beans instead of the entity managers themselves. Then delegate the query creation to QueryBean. This bean then decides which entity manager to use depending on the value of x.
When adding another entity manager or a new value of x you just have to adjust QueryBean.
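A minimal sketch of such a QueryBean, reusing the two persistence units from your persistence.xml (the findAll method name is just an example):

@Stateless
public class QueryBean {

    @PersistenceContext(unitName = "EntityBeanDS1")
    private EntityManager emds1;

    @PersistenceContext(unitName = "EntityBeanDS2")
    private EntityManager emds2;

    /** The single place that maps a value of x to an entity manager. */
    private EntityManager entityManagerFor(int x) {
        switch (x) {
            case 1:  return emds1;
            case 2:  return emds2;
            default: throw new IllegalArgumentException("No data source mapped for x=" + x);
        }
    }

    @SuppressWarnings("unchecked")
    public Collection<TestJPA> findAll(int x) {
        return entityManagerFor(x)
                .createQuery("SELECT test FROM TestJPA test")
                .getResultList();
    }
}

The other session beans then inject QueryBean (for example with @EJB) and call findAll(x), so adding a new data source only touches persistence.xml and the switch above.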