Uploading files retrieved from an FTP server to a MongoDB collection using Spring Boot

I have implemented a Spring Boot application to retrieve files from an FTP server and download them into my local directory.
Following is the code I used to do that.
@Configuration
public class FTPConfiguration {

    @ServiceActivator(inputChannel = "ftpMGET")
    @Bean
    public FtpOutboundGateway getFiles() {
        FtpOutboundGateway gateway = new FtpOutboundGateway(sf(), "mget", "payload");
        gateway.setAutoCreateDirectory(true);
        gateway.setLocalDirectory(new File("./downloads/"));
        gateway.setFileExistsMode(FileExistsMode.REPLACE_IF_MODIFIED);
        gateway.setFilter(new AcceptOnceFileListFilter<>());
        gateway.setOutputChannelName("fileResults");
        return gateway;
    }

    @Bean
    public MessageChannel fileResults() {
        DirectChannel channel = new DirectChannel();
        channel.addInterceptor(tap());
        return channel;
    }

    @Bean
    public WireTap tap() {
        return new WireTap("logging");
    }

    @ServiceActivator(inputChannel = "logging")
    @Bean
    public LoggingHandler logger() {
        LoggingHandler logger = new LoggingHandler(LoggingHandler.Level.INFO);
        logger.setLogExpressionString("'Files:' + payload");
        return logger;
    }

    @Bean
    public DefaultFtpSessionFactory sf() {
        DefaultFtpSessionFactory sf = new DefaultFtpSessionFactory();
        sf.setHost("localhost");
        sf.setPort(2121);
        sf.setUsername("anonymous");
        sf.setPassword("");
        return sf;
    }

    @MessagingGateway(defaultRequestChannel = "ftpMGET", defaultReplyChannel = "fileResults")
    public interface GateFile {
        List<File> mget(String directory);
    }
}
Now I need to upload these files to MongoDB automatically when I run this program.
Can anyone please help me or guide me on the steps I should follow?

Please take a look at the MongoDB support in Spring Integration: https://docs.spring.io/spring-integration/docs/current/reference/html/mongodb.html#mongodb-outbound-channel-adapter
You probably need to think about how to build a POJO from the files of that MGET result and send it to that MongoDB channel adapter, to be stored as documents in some collection.
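For illustration only (this is a sketch, not code from the original answer): assuming you route the MGET result (a List<File>) to a channel such as "filesToMongo" (for example by pointing the gateway's output channel there, or by adding another wire tap on fileResults), the flow could be a splitter that emits one File per message, a transformer that maps each File to an org.bson.Document, and a MongoDbStoringMessageHandler as the MongoDB outbound channel adapter. The channel names, the collection name and the document shape are assumptions to adapt:
// Split the List<File> reply from the MGET into individual File messages.
@Splitter(inputChannel = "filesToMongo", outputChannel = "singleFiles")
public List<File> split(List<File> files) {
    return files;
}

// Map each File to an org.bson.Document; adjust to whatever POJO/document shape you need.
@Transformer(inputChannel = "singleFiles", outputChannel = "toMongo")
public Document toDocument(File file) {
    try {
        return new Document("name", file.getName())
                .append("content", new String(Files.readAllBytes(file.toPath())));
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}

// MongoDB outbound channel adapter: stores each payload in a collection ("ftpFiles" is a hypothetical name).
@Bean
@ServiceActivator(inputChannel = "toMongo")
public MessageHandler mongoOutboundAdapter(MongoDatabaseFactory mongoDbFactory) {
    MongoDbStoringMessageHandler handler = new MongoDbStoringMessageHandler(mongoDbFactory);
    handler.setCollectionNameExpression(new LiteralExpression("ftpFiles"));
    return handler;
}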

Related

Consuming SOAP Web Services with ActiveMQ in Spring Boot

I am new to web service development and am currently building a SOAP web service with Spring Boot 2.7.0 and Java 17,
as well as a client application that communicates with this SOAP service via JMS.
But I do not know the procedure of the process.
The way I see it -> The client application (Producer) sends a message to a queue that lives on the server side (Consumer), the queue pops the message when ready to consume and redirects it to the endpoint handler method and then sends a response in the response queue back to the client side.
However, I don't know how to redirect the JMS message to the endpoint, nor do I know how to send it back. I have read all of the documentation related to "SOAP over JMS", "CXF-SOAP-JMS", "ActiveMQ with Spring", etc. None of it helped me fix this problem.
Using SOAP over HTTP is pretty easy using the WebServiceTemplate provided by the Spring-WS API. But when I tried using it over JMS I encountered several problems, including the following:
What to do with the JMS Message once in the destination object?
How do I send it specifically to my endpoint handler method?
What and how do I send back to the response destination?
Sample code of my latest attempt
CLIENT APP
Client Configuration
@Configuration
public class SoapClientConfiguration {

    @Value("${spring.activemq.broker-url}")
    private String activeMqUrl;
    @Value("${spring.activemq.user}")
    private String userName;
    @Value("${spring.activemq.password}")
    private String password;

    @Bean
    Jaxb2Marshaller jaxb2Marshaller() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setPackagesToScan("com.mile.soap.client.app.quiz");
        return marshaller;
    }

    @Bean
    WebServiceTemplate template() {
        return new WebServiceTemplate(jaxb2Marshaller());
    }

    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory() {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(mqConnectionFactory());
        return factory;
    }

    @Bean
    public SingleConnectionFactory mqConnectionFactory(){
        SingleConnectionFactory factory = new SingleConnectionFactory();
        ActiveMQConnectionFactory mqConnectionFactory = new ActiveMQConnectionFactory();
        mqConnectionFactory.setBrokerURL(activeMqUrl);
        mqConnectionFactory.setUserName(userName);
        mqConnectionFactory.setPassword(password);
        factory.setTargetConnectionFactory(mqConnectionFactory);
        return factory;
    }

    @Bean
    public JmsTemplate jmsTemplate(){
        JmsTemplate template = new JmsTemplate();
        template.setConnectionFactory(mqConnectionFactory());
        return template;
    }
}
Client Service
@Service
public class SoapClient extends WebServiceGatewaySupport {

    @Autowired WebServiceTemplate template;
    @Autowired JmsTemplate jmsTemplate;

    public CategoriesResponse getCategories() {
        CategoriesResponse response = new CategoriesResponse();
        try {
            SAAJResult soapRequest = new SAAJResult();
            template.getMarshaller().marshal(new GetCategoriesRequest(), soapRequest);
            Message m = jmsTemplate.sendAndReceive("example.queue", new MessageCreator() {
                @Override
                public Message createMessage(Session session) throws JMSException {
                    return session.createObjectMessage(soapRequest.toString());
                }
            });
            response = m.getBody(CategoriesResponse.class);
        }
        catch (Exception e) {
            e.printStackTrace();
        }
        return response;
    }
}
SERVER SIDE APP
ActiveMQ Configuration
@Configuration @EnableJms
public class ActiveMqConfig {

    @Value("${spring.activemq.broker-url}")
    private String activeMqUrl;
    @Value("${spring.activemq.user}")
    private String userName;
    @Value("${spring.activemq.password}")
    private String password;

    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory() {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(mqConnectionFactory());
        return factory;
    }

    @Bean
    public SingleConnectionFactory mqConnectionFactory(){
        SingleConnectionFactory factory = new SingleConnectionFactory();
        ActiveMQConnectionFactory mqConnectionFactory = new ActiveMQConnectionFactory();
        mqConnectionFactory.setBrokerURL(activeMqUrl);
        mqConnectionFactory.setUserName(userName);
        mqConnectionFactory.setPassword(password);
        factory.setTargetConnectionFactory(mqConnectionFactory);
        return factory;
    }
}
Main Configuration (WSDL/SERVLET)
@Configuration
@EnableWs
public class SoapConfiguration extends WsConfigurerAdapter {

    @Bean(name = Bus.DEFAULT_BUS_ID)
    public SpringBus springBus(){
        SpringBus bus = new SpringBus();
        return bus;
    }

    @Bean
    public ServletRegistrationBean<MessageDispatcherServlet> messageDispatcherServlet(
            ApplicationContext applicationContext, SpringBus springBus){
        MessageDispatcherServlet servlet = new MessageDispatcherServlet();
        servlet.setApplicationContext(applicationContext);
        servlet.setTransformWsdlLocations(true);
        return new ServletRegistrationBean<>(servlet, "/*");
    }

    //wsdl
    @Bean(name = "quiz") @SneakyThrows
    public DefaultWsdl11Definition defaultWsdl11Definition(XsdSchema schema) {
        DefaultWsdl11Definition defaultWsdl11Definition = new DefaultWsdl11Definition();
        defaultWsdl11Definition.setPortTypeName("QuizMainEndPoint");
        defaultWsdl11Definition.setLocationUri("/");
        defaultWsdl11Definition.setTargetNamespace("http://www.mile.com/collection/management/soap/Quiz");
        defaultWsdl11Definition.setTransportUri("http://www.openuri.org/2002/04/soap/jms/");
        defaultWsdl11Definition.setSchema(schema);
        return defaultWsdl11Definition;
    }

    @Override
    public void addInterceptors(List<EndpointInterceptor> interceptors) {
        EndpointInterceptor endpointInterceptor = new PayloadRootSmartSoapEndpointInterceptor(
                new QuizMainEndpointInterceptor(), "http://www.mile.com/collection/management/soap/Quiz", "GetCategoriesRequest");
        interceptors.add(endpointInterceptor);
    }

    @Bean
    public XsdSchema schema() {
        return new SimpleXsdSchema(new ClassPathResource("/schemas/QuizSchema/quiz.xsd"));
    }
}
Listener
@Component
public class Listener {

    @JmsListener(destination = "example.queue")
    public void listenRequests(Message message) {
        System.out.println(message.toString());
        /* I RECEIVE THE MESSAGE BUT I HAVE NO IDEA WHAT TO DO WITH IT.
         * HOW DO I CONSUME IT?
         */
    }
}
Method in a class annotated with @Endpoint
@ResponsePayload
@PayloadRoot(namespace = NAMESPACE, localPart = "GetCategoriesRequest")
public CategoriesResponse getCategories(@RequestPayload GetCategoriesRequest request) {
    CategoriesResponse response = new CategoriesResponse(service.getCategories());
    /*
     * How to CONVERT my JMS Message living in the Destination Object - "example.queue" To a SOAP Message
     * and be RECOGNISED by this exact method??
     * Do I send a JMS response here or somewhere else?
     * Is it sent by default?
     */
    return response;
}
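To make the question more concrete, this is roughly the kind of bridging I am trying to achieve in the listener (a sketch only: the service bean, the reply handling, and the assumption that the producer sends the marshalled request XML as a String are all hypothetical, and I don't know whether this is the right approach):
@Component
public class Listener {

    @Autowired private Jaxb2Marshaller marshaller;   // same marshaller setup as the endpoint
    @Autowired private JmsTemplate jmsTemplate;
    @Autowired private QuizService service;          // hypothetical service behind the @Endpoint

    @JmsListener(destination = "example.queue")
    public void listenRequests(Message message) throws JMSException {
        // Assumes the request payload arrives as the marshalled XML of GetCategoriesRequest.
        String xml = message.getBody(String.class);
        GetCategoriesRequest request = (GetCategoriesRequest) marshaller.unmarshal(new StringSource(xml));

        CategoriesResponse response = new CategoriesResponse(service.getCategories());

        StringResult result = new StringResult();
        marshaller.marshal(response, result);

        // Reply to the temporary queue created by sendAndReceive() on the client side.
        jmsTemplate.send(message.getJMSReplyTo(), session -> {
            TextMessage reply = session.createTextMessage(result.toString());
            reply.setJMSCorrelationID(message.getJMSCorrelationID());
            return reply;
        });
    }
}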
Thank you for reading thoroughly. I'd appreciate any kind of help.

Spring Cloud Stream unit test Producer

I'm using Spring Cloud Stream 3.1.3.
I'm migrating from a pre-3.1 version, so I wrote my producer using a java.util.function.Function (I know I can use a Supplier, but this is what I need).
The application.yaml file is configured with the function definition and the input and output bindings, and this is what I have:
@EnableAutoConfiguration
@Service
public class Producer {

    // StreamBridge injection was omitted in the original snippet; it is required for the send below.
    @Autowired
    private StreamBridge streamBridge;

    public void produce(int messageId, Object data) {
        Message<Object> message = MessageBuilder
                .withPayload(data)
                .setHeader(PARTITION_KEY, messageId)
                .build();
        streamBridge.send("produceMessage-in-0", message);
    }

    @Bean
    public Function<Message<Object>, Message<Object>> produceMessage() {
        return (input) -> {
            int messageId = input.getHeaders().get(PARTITION_KEY, Integer.class);
            Object message = input.getPayload();
            return MessageBuilder
                    .withPayload(message)
                    .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
                    .setHeader(PARTITION_KEY, messageId)
                    .setHeader("type", "MyMessage")
                    .build();
        };
    }
}
Now, I would like to test this implementation, so I wrote this test class:
@SpringBootTest
class ProducerTest {

    @Autowired
    private Producer producer;
    @Autowired
    private ObjectMapper objectMapper;

    @Test
    void produceOk() {
        try (ConfigurableApplicationContext context = new SpringApplicationBuilder(TestChannelBinderConfiguration.getCompleteConfiguration(Producer.class)).run()) {
            producer.produce(1, new MyMessage(1, "Hello"));
            OutputDestination output = context.getBean(OutputDestination.class);
            Message<byte[]> received = output.receive();
            Assertions.assertNotNull(received);
        }
    }
}
Test fails because output.receive() returns null.
Is this the right way to test my code?
Thanks
It is difficult to see what your issue may be since we don't see the entire setup of your project, but here are a few pointers that may help...
Please look at any of the tests we use in the framework, as well as the Testing section of the reference manual.
There is also a dedicated StreamBridgeTests.java, which is what I believe you are looking for.
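For illustration only (a sketch, not code from the original answer): the key point in those tests is that the Producer, the test binder and the OutputDestination all come from the same application context built on TestChannelBinderConfiguration, rather than from the surrounding @SpringBootTest context. The binding names, the timeout and the explicit function-definition property are assumptions:
class ProducerTest {

    @Test
    void produceOk() {
        try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
                TestChannelBinderConfiguration.getCompleteConfiguration(Producer.class))
                .web(WebApplicationType.NONE)
                // passed explicitly in case the test context does not pick it up from application.yaml
                .run("--spring.cloud.function.definition=produceMessage")) {

            // Use the Producer bean (and therefore the StreamBridge) from THIS context.
            Producer producer = context.getBean(Producer.class);
            producer.produce(1, new MyMessage(1, "Hello"));

            OutputDestination output = context.getBean(OutputDestination.class);
            Message<byte[]> received = output.receive(1000, "produceMessage-out-0");
            Assertions.assertNotNull(received);
        }
    }
}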

Use multiple mongo DBs in same application for same model & same Repository

I need to implement a Spring Boot - MongoDB application where there are 2 Mongo DBs which have exactly the same database name & collections. Based on the user making the request, I need to choose whether to fetch data from DB1 or DB2 (the only difference is the Mongo URI host - IP).
E.g. I need some way to create 2 MongoTemplates, like mTempA & mTempB, in my repository &, based on some condition, use either of the templates to execute a query as below:
@Repository
public class MyCustomRepository {

    private Logger logger = LoggerFactory.getLogger(MyCustomRepository.class);

    @Autowired
    private MongoTemplateA mongoTemplateA; // Need to know if this is possible & how
    @Autowired
    private MongoTemplateB mongoTemplateB; // Need to know if this is possible & how

    public List<MyModel> findByCriteria(MyRequest request) {
        List<MyModel> result;
        //Query query = <build query based on request>
        if (request.getUserType().equals("A")) {
            result = mongoTemplateA.find(query, MyModel.class);
        } else {
            result = mongoTemplateB.find(query, MyModel.class);
        }
        logger.debug("Result fetched with {} records", result.size());
        return result;
    }
}
I don't want to have 2 separate repositories (classes or interfaces) or different models. I just want 2 different MongoTemplates to be injected into a single repository.
Is this possible? If yes, please give some example code.
I have followed the tutorial below:
https://dzone.com/articles/multiple-mongodb-connectors-with-spring-boot
As rightly pointed out by @Lucia, below is how it can be done:
Have 2 different configuration classes:
@Configuration
@EnableMongoRepositories(basePackages = "com.snk.repository", mongoTemplateRef = "mongoTemplateA")
public class MongoConfigA {
    // Configuration class for DB 1 access
}

@Configuration
@EnableMongoRepositories(basePackages = "com.snk.repository", mongoTemplateRef = "mongoTemplateB")
public class MongoConfigB {
    // Configuration class for DB 2 access
}
Add a class which binds the custom MongoDB properties from application.properties:
@ConfigurationProperties(prefix = "mongodb")
public class MultipleMongoProperties {

    private MongoProperties adb = new MongoProperties();
    private MongoProperties bdb = new MongoProperties();

    public MongoProperties getAdb() {
        return adb;
    }

    public MongoProperties getBdb() {
        return bdb;
    }
}
Add a configuration class to create mongoTemplates:
@Configuration
@EnableConfigurationProperties(MultipleMongoProperties.class)
public class MultipleMongoConfig {

    @Autowired
    private MultipleMongoProperties mongoProperties = new MultipleMongoProperties();

    @Bean(name = "mongoTemplateA")
    @Primary
    public MongoTemplate mongoTemplateA() {
        return new MongoTemplate(aDbFactory(this.mongoProperties.getAdb()));
    }

    @Bean(name = "mongoTemplateB")
    public MongoTemplate mongoTemplateB() {
        return new MongoTemplate(bDbFactory(this.mongoProperties.getBdb()));
    }

    @Bean
    @Primary
    public MongoDbFactory aDbFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClientURI(mongo.getUri()));
    }

    @Bean
    public MongoDbFactory bDbFactory(final MongoProperties mongo) {
        return new SimpleMongoDbFactory(new MongoClientURI(mongo.getUri()));
    }
}
Add the below declarations to your service/repository:
@Autowired
@Qualifier("mongoTemplateA")
private MongoTemplate mongoTemplateA;

@Autowired
@Qualifier("mongoTemplateB")
private MongoTemplate mongoTemplateB;
Add below properties in your application.properties:
mongodb.adb.uri=mongodb://user:pass@myhost1:27017/adb
mongodb.bdb.uri=mongodb://user:pass@myhost2:27017/bdb
If you have a Mongo replica set, the URI can be set as:
mongodb.adb.uri=mongodb://user:pass@myhost1,myhost2,myhost13/adb?replicaSet=rsName
mongodb.bdb.uri=mongodb://user:pass@myhost1,myhost2,myhost13/bdb?replicaSet=rsName
Based on your logic, use either of the templates.
Though, there are a few catches:
Notice the @Primary annotation; one bean needs to be marked as primary. I haven't found a solution for the case where no template is marked as primary.
If either of the Mongo DBs is down & the application is started/restarted, the application will not start/deploy. To avoid this, @Autowired needs to be changed to @Autowired(required = false).
If either of the Mongo DBs is down & the application is already running, it automatically uses the 2nd Mongo DB (the one which is not down). So even if you want to use DB A, if it's down, requests are processed with DB B & vice versa.

Using Spring Batch to write to a Cassandra Database

As of now, I'm able to connect to Cassandra via the following code:
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;
public static Session connection() {
    Cluster cluster = Cluster.builder()
            .addContactPoints("IP1", "IP2")
            .withCredentials("user", "password")
            .withSSL()
            .build();
    Session session = null;
    try {
        session = cluster.connect("database_name");
        session.execute("CQL Statement");
    } finally {
        IOUtils.closeQuietly(session);
        IOUtils.closeQuietly(cluster);
    }
    return session;
}
The problem is that I need to write to Cassandra in a Spring Batch project. Most of the starter kits seem to use a JdbcBatchItemWriter to write to a MySQL database from a chunk. Is this possible? It seems that a JdbcBatchItemWriter cannot connect to a Cassandra database.
The current itemwriter code is below:
@Bean
public JdbcBatchItemWriter<Person> writer() {
    JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
    writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
    writer.setDataSource(dataSource);
    return writer;
}
Spring Data Cassandra provides repository abstractions for Cassandra that you should be able to use in conjunction with the RepositoryItemWriter to write to Cassandra from Spring Batch.
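For example (not shown in the original answer), assuming a Spring Data Cassandra PersonRepository already exists, the RepositoryItemWriter route could be sketched as:
@Bean
public RepositoryItemWriter<Person> writer(PersonRepository personRepository) {
    RepositoryItemWriter<Person> writer = new RepositoryItemWriter<>();
    writer.setRepository(personRepository); // Spring Data Cassandra repository (assumed to exist)
    writer.setMethodName("save");           // delegate each item in the chunk to CrudRepository.save
    return writer;
}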
It is possible to extend Spring Batch to support Cassandra by customising ItemReader and ItemWriter.
ItemWriter example:
public class CassandraBatchItemWriter<Company> implements ItemWriter<Company>, InitializingBean {

    protected static final Log logger = LogFactory.getLog(CassandraBatchItemWriter.class);

    private final Class<Company> aClass;

    @Autowired
    private CassandraTemplate cassandraTemplate;

    @Override
    public void afterPropertiesSet() throws Exception { }

    public CassandraBatchItemWriter(final Class<Company> aClass) {
        this.aClass = aClass;
    }

    @Override
    public void write(final List<? extends Company> items) throws Exception {
        logger.debug("Write operation is performing, the size is " + items.size());
        if (!items.isEmpty()) {
            logger.info("Deleting in a batch performing...");
            cassandraTemplate.deleteAll(aClass);
            logger.info("Inserting in a batch performing...");
            cassandraTemplate.insert(items);
        } else {
            logger.debug("Items list is empty...");
        }
    }
}
Then you can inject it as a @Bean through @Configuration:
@Bean
public ItemWriter<Company> writer() {
    final CassandraBatchItemWriter<Company> writer = new CassandraBatchItemWriter<Company>(Company.class);
    return writer;
}
Full source code can be found in the GitHub repo: Spring-Batch-with-Cassandra

@EnableMongoAuditing for MongoDB on Cloud Foundry / mongolab

My setup works locally but not when I deploy it to Cloud Foundry/mongolab.
The config is very similar to the docs.
My local Spring config:
@Configuration
@Profile("dev")
@EnableMongoAuditing
@EnableMongoRepositories(basePackages = "com.foo.model")
public class SpringMongoConfiguration extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "myDb";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient("localhost");
    }

    @Bean
    public AuditorAware<User> myAuditorProvider() {
        return new SpringSecurityAuditorAware();
    }
}
This is the Cloud Foundry setup:
@Configuration
@Profile("cloud")
@EnableMongoAuditing
@EnableMongoRepositories(basePackages = "com.foo.model")
public class SpringCloudMongoDBConfiguration extends AbstractMongoConfiguration {

    private Cloud getCloud() {
        CloudFactory cloudFactory = new CloudFactory();
        return cloudFactory.getCloud();
    }

    @Bean
    public MongoDbFactory mongoDbFactory() {
        Cloud cloud = getCloud();
        MongoServiceInfo serviceInfo = (MongoServiceInfo) cloud.getServiceInfo(cloud.getCloudProperties().getProperty("cloud.services.mongo.id"));
        String serviceID = serviceInfo.getId();
        return cloud.getServiceConnector(serviceID, MongoDbFactory.class, null);
    }

    @Override
    protected String getDatabaseName() {
        Cloud cloud = getCloud();
        return cloud.getCloudProperties().getProperty("cloud.services.mongo.id");
    }

    @Override
    public Mongo mongo() throws Exception {
        Cloud cloud = getCloud();
        return new MongoClient(cloud.getCloudProperties().getProperty("cloud.services.mongo.connection.host"));
    }

    @Bean
    public MongoTemplate mongoTemplate() {
        return new MongoTemplate(mongoDbFactory());
    }

    @Bean
    public AuditorAware<User> myAuditorProvider() {
        return new SpringSecurityAuditorAware();
    }
}
And the error I'm getting when I try to save a document in Cloud Foundry is:
OUT ERROR: org.springframework.data.support.IsNewStrategyFactorySupport - Unexpected error
OUT java.lang.IllegalArgumentException: Unsupported entity com.foo.model.project.Project! Could not determine IsNewStrategy.
OUT at org.springframework.data.mongodb.core.MongoTemplate.insert(MongoTemplate.java:739)
OUT at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221)
OUT at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
Any ideas? Is it my config file etc..?
Thanks in advance
Niclas
This is usually caused when the Mongo mapping metadata for your entities is not obtained at application startup. By default, AbstractMongoConfiguration uses the package of the actual configuration class to look for @Document annotated classes at startup.
The exception message makes me assume that SpringCloudMongoDBConfiguration is not located in any of the super packages of com.foo.model.project. There are two solutions to this:
Stick to the convenience of putting application configuration classes into the root package of your application. This will cause your application packages to be scanned for domain classes, metadata to be obtained, and the is-new-detection to work as expected.
Manually hand the package containing the domain classes to the infrastructure by overriding AbstractMongoConfiguration.getMappingBasePackage(), as sketched below.
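For illustration only (not part of the original answer), option 2 could look roughly like this in the cloud configuration class, assuming the domain classes live under com.foo.model:
@Configuration
@Profile("cloud")
@EnableMongoAuditing
@EnableMongoRepositories(basePackages = "com.foo.model")
public class SpringCloudMongoDBConfiguration extends AbstractMongoConfiguration {

    // Tell the mapping subsystem where the @Document classes live, so the
    // persistent-entity metadata (and the IsNewStrategy) is built at startup.
    @Override
    protected String getMappingBasePackage() {
        return "com.foo.model";
    }

    // ... rest of the configuration as above ...
}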
The reason you might see the configuration working in the local environment is that the mapping metadata might be obtained through a non-persisting persistence operation (e.g. a query), and everything else proceeds from there.