Spring Data MongoDB: MappingMongoConverter remove _class

The default MappingMongoConverter adds a custom type key ("_class") to each object in the database. So, if I create a Person:
package my.dto;
public class Person {
String name;
public Person(String name) {
this.name = name;
}
}
and save it to the db:
MongoOperations ops = new MongoTemplate(new Mongo(), "users");
ops.insert(new Person("Joe"));
the resulting object in the mongo will be:
{ "_id" : ObjectId("4e2ca049744e664eba9d1e11"), "_class" : "my.dto.Person", "name" : "Joe" }
Questions:
What are the implications of moving the Person class into a different namespace?
Is it possible not to pollute the object with the "_class" key, without writing a unique converter just for the Person class?

So here's the story: we add the type by default as a hint about which class to instantiate. Since you have to pass a type into MongoTemplate to read the document anyway, there are two possible cases:
You hand in a type that the actually stored type can be assigned to. In that case we consider the stored type and use it for object creation. The classical example here is doing polymorphic queries: suppose you have an abstract class Contact and your Person. You could then query for Contacts, and we essentially have to determine a type to instantiate.
If you, on the other hand, pass in a completely different type, we simply marshal into that given type, not into the one actually stored in the document. That covers your question about what happens if you move the type.
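To make the two cases concrete, here is a small fragment against the Person/Contact example from the question (the PersonDto class and the "person" collection name are assumptions for illustration):
// Case 1: the requested type is assignable from the stored type -> the _class hint wins.
List<Contact> contacts = ops.findAll(Contact.class); // may contain Person instances
// Case 2: the requested type is unrelated to the stored one -> we map into the requested type.
List<PersonDto> dtos = ops.findAll(PersonDto.class, "person");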
You might be interested in watching this ticket, which covers a pluggable type mapping strategy for turning the type information into an actual type. This can serve simple space-saving purposes, as you might want to reduce a long qualified class name to a hash of a few letters. It would also allow more complex migration scenarios where you might find completely arbitrary type keys produced by another datastore client and bind those to Java types.

Here's my annotation-based configuration, and it works.
@Configuration
public class AppMongoConfig {
public @Bean
MongoDbFactory mongoDbFactory() throws Exception {
return new SimpleMongoDbFactory(new Mongo(), "databasename");
}
public @Bean
MongoTemplate mongoTemplate() throws Exception {
//remove _class
MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext());
converter.setTypeMapper(new DefaultMongoTypeMapper(null));
MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory(), converter);
return mongoTemplate;
}
}
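With this configuration in place, the insert from the question should produce a document without the type key, i.e. roughly { "_id" : ObjectId("..."), "name" : "Joe" }.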

If you want to disable the _class attribute by default, but preserve polymorphism for specific classes, you can explicitly define the type of the (optional) _class field by configuring:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
Map<Class<?>, String> typeMapperMap = new HashMap<>();
typeMapperMap.put(com.acme.domain.SomeDocument.class, "role");
TypeInformationMapper typeMapper1 = new ConfigurableTypeInformationMapper(typeMapperMap);
MongoTypeMapper typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Arrays.asList(typeMapper1));
MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext());
converter.setTypeMapper(typeMapper);
MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory(), converter);
return mongoTemplate;
}
This will preserve the _class field (or whatever you name it in the constructor) only for the specified entities.
You can also write your own TypeInformationMapper, for example based on annotations. If you annotate your document with @DocumentType("aliasName"), you will keep polymorphism by keeping an alias of the class.
I have explained it briefly on my blog, but here is a quick piece of code:
https://gist.github.com/athlan/6497c74cc515131e1336

<mongo:mongo host="hostname" port="27017">
<mongo:options ...options... />
</mongo:mongo>
<mongo:db-factory dbname="databasename" username="user" password="pass" mongo-ref="mongo"/>
<bean id="mongoTypeMapper" class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper">
<constructor-arg name="typeKey"><null/></constructor-arg>
</bean>
<bean id="mongoMappingContext" class="org.springframework.data.mongodb.core.mapping.MongoMappingContext" />
<bean id="mongoConverter" class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory" />
<constructor-arg name="mappingContext" ref="mongoMappingContext" />
<property name="typeMapper" ref="mongoTypeMapper"></property>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mongoConverter" ref="mongoConverter" />
<property name="writeResultChecking" value="EXCEPTION" />
</bean>

While Mkyong's answer still works, I would like to add my version of the solution, as a few bits are deprecated and may be on the verge of cleanup.
For example: MappingMongoConverter(mongoDbFactory(), new MongoMappingContext()) is deprecated in favor of new MappingMongoConverter(dbRefResolver, new MongoMappingContext()), and SimpleMongoDbFactory(new Mongo(), "databasename") in favor of new SimpleMongoDbFactory(new MongoClient(), database).
So, my final working answer without deprecation warnings is :
@Configuration
public class SpringMongoConfig {
@Value("${spring.data.mongodb.database}")
private String database;
@Autowired
private MongoDbFactory mongoDbFactory;
public @Bean MongoDbFactory mongoDBFactory() throws Exception {
return new SimpleMongoDbFactory(new MongoClient(), database);
}
public @Bean MongoTemplate mongoTemplate() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
// Remove _class
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, new MongoMappingContext());
converter.setTypeMapper(new DefaultMongoTypeMapper(null));
return new MongoTemplate(mongoDBFactory(), converter);
}
}
Hope this helps people who would like to have a clean class with no deprecation warnings.

For Spring Boot 2.3.0.RELEASE it's easier: just override the method mongoTemplate, which already has everything you need to set the type mapper. See the following example:
@Configuration
@EnableMongoRepositories(
// your package ...
)
public class MongoConfig extends AbstractMongoClientConfiguration {
// .....
@Override
public MongoTemplate mongoTemplate(MongoDatabaseFactory databaseFactory, MappingMongoConverter converter) {
// remove _class field from mongo
converter.setTypeMapper(new DefaultMongoTypeMapper(null));
return super.mongoTemplate(databaseFactory, converter);
}
// .....
}

This is my one-line solution:
@Bean
public MongoTemplate mongoTemplateFraud() throws UnknownHostException {
MongoTemplate mongoTemplate = new MongoTemplate(getMongoClient(), dbName);
((MappingMongoConverter)mongoTemplate.getConverter()).setTypeMapper(new DefaultMongoTypeMapper(null));//removes _class
return mongoTemplate;
}

I struggled with this problem for a long time. I followed the approach from mkyong, but when I introduced a LocalDate attribute (any JSR-310 class from Java 8) I received the following exception:
org.springframework.core.convert.ConverterNotFoundException:
No converter found capable of converting from type [java.time.LocalDate] to type [java.util.Date]
The corresponding converter org.springframework.format.datetime.standard.DateTimeConverters is part of Spring 4.1 and is referenced in Spring Data MongoDB 1.7. Even when I used newer versions, the converter didn't kick in.
The solution was to use the existing MappingMongoConverter and only provide a new DefaultMongoTypeMapper (the code from mkyong is left in as a comment):
@Configuration
@EnableMongoRepositories
class BatchInfrastructureConfig extends AbstractMongoConfiguration
{
@Override
protected String getDatabaseName() {
return "yourdb"
}
@Override
Mongo mongo() throws Exception {
new Mongo()
}
@Bean MongoTemplate mongoTemplate()
{
// overwrite type mapper to get rid of the _class column
// get the converter from the base class instead of creating it
// def converter = new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext())
def converter = mappingMongoConverter()
converter.typeMapper = new DefaultMongoTypeMapper(null)
// create & return template
new MongoTemplate(mongoDbFactory(), converter)
}
}
To summarize:
extend AbstractMongoConfiguration
annotate with @EnableMongoRepositories
in mongoTemplate, get the converter from the base class; this ensures that the type conversion classes are registered

@Configuration
public class MongoConfig {
@Value("${spring.data.mongodb.database}")
private String database;
@Value("${spring.data.mongodb.host}")
private String host;
public @Bean MongoDbFactory mongoDbFactory() throws Exception {
return new SimpleMongoDbFactory(new MongoClient(host), database);
}
public @Bean MongoTemplate mongoTemplate() throws Exception {
MappingMongoConverter converter = new MappingMongoConverter(new DefaultDbRefResolver(mongoDbFactory()),
new MongoMappingContext());
converter.setTypeMapper(new DefaultMongoTypeMapper(null));
MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory(), converter);
return mongoTemplate;
}
}

The correct answer above seems to use a number of deprecated dependencies. For example, the code mentions MongoDbFactory, which is deprecated in the latest Spring release. If you happen to be using MongoDB with Spring Data in 2020, this solution is showing its age. For instant results, check this snippet of code. It works 100%.
Just create a new AppConfig.java file and paste this block of code. You'll see the "_class" property disappearing from the MongoDB document.
package "Your Package Name";
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@Configuration
public class AppConfig {
@Autowired
MongoDatabaseFactory mongoDbFactory;
@Autowired
MongoMappingContext mongoMappingContext;
@Bean
public MappingMongoConverter mappingMongoConverter() {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext);
converter.setTypeMapper(new DefaultMongoTypeMapper(null));
return converter;
}
}

I'm using:
package YOUR_PACKAGE;
import javax.annotation.PostConstruct;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
@Configuration
public class MongoConfiguration {
@Autowired
private MappingMongoConverter mongoConverter;
@PostConstruct
public void setUpMongoEscapeCharacterAndTypeMapperConversion() {
mongoConverter.setMapKeyDotReplacement("_");
// This will remove _class: key
mongoConverter.setTypeMapper(new DefaultMongoTypeMapper(null));
}
}
Btw: it also replaces "." with "_" in map keys.

You just need to add the @TypeAlias annotation to the class definition instead of changing the type mapper.
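For illustration, here is what that looks like on the document class from the question (the alias value is an arbitrary example); note that @TypeAlias does not remove the type key, it just stores a short alias instead of the fully qualified class name:
import org.springframework.data.annotation.TypeAlias;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
@TypeAlias("person")
public class Person {
    String name;
}
The resulting document then carries "_class" : "person" rather than "my.dto.Person", which keeps polymorphic reads working while shortening the stored value.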

I've tried the solutions above; some of them don't work in combination with auditing, and none of them seems to set the MongoCustomConversions correctly.
A solution that works for me is the following
@Configuration
public class MongoConfig {
@Bean
public MappingMongoConverter mappingMongoConverterWithCustomTypeMapper(
MongoDatabaseFactory factory,
MongoMappingContext context,
MongoCustomConversions conversions) {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
MappingMongoConverter mappingConverter = new MappingMongoConverter(dbRefResolver, context);
mappingConverter.setCustomConversions(conversions);
/**
* replicate the way that Spring
* instantiates a {@link DefaultMongoTypeMapper}
* in {@link MappingMongoConverter#MappingMongoConverter(DbRefResolver, MappingContext)}
*/
CustomMongoTypeMapper customTypeMapper = new CustomMongoTypeMapper(
context,
mappingConverter::getWriteTarget);
mappingConverter.setTypeMapper(customTypeMapper);
return mappingConverter;
}
}
public class CustomMongoTypeMapper extends DefaultMongoTypeMapper {
public CustomMongoTypeMapper(
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext,
UnaryOperator<Class<?>> writeTarget) {
super(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext, writeTarget);
}
@Override
public TypeInformation<?> readType(Bson source) {
/**
* do your conversion here, and eventually return
*/
return super.readType(source);
}
}
As an alternative, you could use a BeanPostProcessor to detect the creation of a mappingMongoConverter, and add your converter there.
Something like
public class MappingMongoConverterHook implements BeanPostProcessor {
@Override
public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
if ("mappingMongoConverter".equals(beanName)) {
((MappingMongoConverter) bean).setTypeMapper(new CustomMongoTypeMapper());
}
return bean;
}
}

Spring Data MongoDB Converter not getting registered

I have a setup with multiple MongoDB configurations. Here is the configuration class:
@Configuration
@RequiredArgsConstructor
@EnableConfigurationProperties(MongoConfigProperties.class)
public class MultipleMongoConfig {
private static final Logger logger = LoggerFactory.getLogger(MultipleMongoConfig.class);
private final MongoConfigProperties mongoProperties;
@Primary
@Bean(name = "sysdiagMongoTemplate")
public MongoOperations sysdiagMongoTemplate() {
MappingMongoConverter mappingMongoConverter = new MappingMongoConverter(new DefaultDbRefResolver(sysdiagFactory(mongoProperties.getSysdiag())),
new MongoMappingContext());
List<Converter<?, ?>> converters = new ArrayList<>();
converters.add(new AggregationResultReadConverter());
mappingMongoConverter.setCustomConversions(new CustomConversions(CustomConversions.StoreConversions.NONE, converters));
mappingMongoConverter.afterPropertiesSet();
boolean canConvert = mappingMongoConverter.getConversionService().canConvert(Document.class, AggregationResult.class);
mappingMongoConverter.afterPropertiesSet();
logger.info("canConvertFromDocumentToAggResult:: " + canConvert); //gives TRUE
return new MongoTemplate(sysdiagFactory(this.mongoProperties.getSysdiag()), mappingMongoConverter);
}
#Bean(name = "monitoringMongoTemplate")
public MongoOperations monitoringMongoTemplate() {
return new MongoTemplate(monitoringFactory(this.mongoProperties.getMonitoring()));
}
public MongoDbFactory sysdiagFactory(final MongoProperties mongo) {
return new SimpleMongoDbFactory(new MongoClient(mongo.getHost(), mongo.getPort()),
mongo.getDatabase());
}
public MongoDbFactory monitoringFactory(final MongoProperties mongo) {
return new SimpleMongoDbFactory(new MongoClient(mongo.getHost(), mongo.getPort()),
mongo.getDatabase());
}
}
Here is the read converter class (I only require reading from MongoDB). We have dynamic keys in the document, due to which I need to convert them into a Map:
public class AggregationResultReadConverter implements Converter<Document, AggregationResult> {
@Override
public AggregationResult convert(Document source) {
AggregationResult aggregationResult = new AggregationResult();
aggregationResult.setData(new HashMap());
for(Map.Entry<String,Object> entry : source.entrySet()){
if(entry.getKey().matches("[A-Z][A-Z][A-Z]")){
aggregationResult.getData().put(entry.getKey(), entry.getValue());
}
}
return aggregationResult;
}
}
Here is the mapping configuration for one of the MongoDB databases:
@Configuration
@EnableMongoRepositories(basePackages = {"com.hns.services.restapi.db.mongo.sysdiag.entity", "com.hns.services.restapi.db.mongo.sysdiag.repo"}, mongoTemplateRef = "sysdiagMongoTemplate")
public class SysdiagMongoConfig {
}
And here is the repository interface:
@Repository
public interface AggregationResultRepository extends MongoRepository<AggregationResult, ObjectId> {
@Query("{ TIME: {$gte : ?0, $lt: ?1}}")
List<AggregationResult> findInTimeRange(Long startTime, Long endTime);
}
When I query using AggregationResultRepository, I expect the converter code to be executed so that I can convert the fields and put them in the entity (document) class object as per the logic. The query runs fine, as I saw in the debug logs, and I see output, but the converter is not getting called.
The converter is getting registered with the Mongo template, as the canConvertFromDocumentToAggResult logger gives TRUE. I tried changing the converter from Document -> AggregationResult to DBObject -> AggregationResult, but no luck. Not sure what I am missing here.

MongoTemplate Custom Config using Spring Boot

I was looking to change the WriteResultChecking property of mongoTemplate whilst working on a Spring Boot app (2.0.5). I found a way via extending AbstractMongoConfiguration, as below.
I got it working; however, I found this approach a bit risky.
I say this because the approach forced me to write an implementation for:
public MongoClient mongoClient() {
return new MongoClient(host, port);
}
Now MongoClient is the central class for maintaining connections with MongoDB, and if I am forced to write an implementation for it, then I may possibly be missing out on optimizations that the Spring framework does.
Can someone please suggest any other optimal way of overriding some properties/behaviours without having to tinker too much?
@Configuration
public class MyMongoConfigs extends AbstractMongoConfiguration {
@Value("${spring.data.mongodb.database}")
private String databaseName;
@Value("${spring.data.mongodb.host}")
private String host;
@Value("${spring.data.mongodb.port}")
private int port;
@Override
public MongoClient mongoClient() {
return new MongoClient(host, port);
}
@Override
protected String getDatabaseName() {
return databaseName;
}
@Bean
public MongoTemplate mongoTemplate() throws Exception {
MongoTemplate myTemp = new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
myTemp.setWriteResultChecking(WriteResultChecking.EXCEPTION);
return myTemp;
}
}
You are on the right track. Using AbstractMongoConfiguration you override the configs that you need to customize, and that is the right way to do it. AbstractMongoConfiguration is still a Spring-provided class, so you don't have to worry about optimization unless you mess with your configuration.
This is my approach:
package app.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.WriteResultChecking;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import com.mongodb.WriteConcern;
@Configuration
class ApplicationConfig {
@Bean
MongoTemplate mongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter converter) {
MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory, converter);
mongoTemplate.setWriteConcern(WriteConcern.MAJORITY);
mongoTemplate.setWriteResultChecking(WriteResultChecking.EXCEPTION);
return mongoTemplate;
}
}
I have figured this out by inspecting the source code of org.springframework.boot.autoconfigure.data.mongo.MongoDataAutoConfiguration.
I wrote a CustomMongoTemplate which overrides the find() method, adding a criteria to check whether the document has a 'deletedAt' field.
Custom MongoTemplate:
public class CustomMongoTemplate extends MongoTemplate {
public CustomMongoTemplate(MongoDatabaseFactory mongoDbFactory, MongoConverter mongoConverter) {
super(mongoDbFactory, mongoConverter);
}
@Override
public <T> List<T> find(Query query, Class<T> entityClass, String collectionName) {
query.addCriteria(Criteria.where("deletedAt").exists(Boolean.FALSE));
return super.find(query, entityClass, collectionName);
}
}
Then create the bean in a configuration class:
@Configuration
public class MyConfiguration {
//...
@Bean(name = "mongoTemplate")
CustomMongoTemplate customMongoTemplate(MongoDatabaseFactory databaseFactory, MappingMongoConverter converter) {
return new CustomMongoTemplate(databaseFactory, converter);
}
//...
}
And the last thing: allow Spring to override the default MongoTemplate bean. Add the following to your application.properties file:
spring.main.allow-bean-definition-overriding=true

spring-data-mongodb issue with java.util.Currency

I'm facing the same issue as mentioned in the link:
Mongo spring-data issue with java.util.Currency
I tried the accepted answer, but it's not working and I'm getting a NullPointerException at this line:
new CustomConversions(Arrays.asList(currencyToString, stringToCurrency));
So I defined the converters in new classes, like this:
@Component
public class CurrencyToStringConverter implements Converter<Currency,String>{
/* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(Currency arg0) {
System.out.println("***INSIDE CurrencyToStringConverter***"+arg0);
return arg0.getCurrencyCode();
}
}
and
@Component
public class StringToCurrencyConverter implements Converter<String,Currency>{
/* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public Currency convert(String arg0) {
System.out.println("***INSIDE StringToCurrencyConverter***"+arg0);
return Currency.getInstance(arg0);
}
}
I used them in "customConversions()" as shown below:
@Configuration
public class CustomMongoConfiguration extends AbstractMongoConfiguration {
@Autowired
private Environment env;
@Bean
@Override
public CustomConversions customConversions() {
System.out.println("***CUSTOM MONGO CONVERSIONS***");
List<Converter<?,?>> converters=new ArrayList<>();
converters.add(new CurrencyToStringConverter());
converters.add(new StringToCurrencyConverter());
return new CustomConversions(converters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#getDatabaseName()
*/
@Override
protected String getDatabaseName() {
// TODO Auto-generated method stub
String prop = env.getProperty("spring.data.mongodb.database");
System.out.println("database name:: "+prop);
return prop;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#mongo()
*/
@Override
@Bean
public Mongo mongo() throws Exception {
String prop = env.getProperty("spring.data.mongodb.host");
System.out.println("host name:: "+prop);
return new MongoClient(prop);
}
}
When the Spring Boot application starts up, I could see the statements below printed:
System.out.println("***CUSTOM MONGO CONVERSIONS***");
System.out.println("database name:: "+prop);
System.out.println("host name:: "+prop);
But when I try to get my object from MongoDB, it never calls the "convert" methods in the Converter classes, and moreover I always get this exception:
org.springframework.data.mapping.model.MappingException: No property null found on entity class java.util.Currency to bind constructor parameter to!
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:74)
at org.springframework.data.mapping.model.SpELExpressionParameterValueProvider.getParameterValue(SpELExpressionParameterValueProvider.java:63)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:71)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:83)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:251)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:231)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1186)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.access$200(MappingMongoConverter.java:78)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1134)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.getValueInternal(MappingMongoConverter.java:870)
I followed this link too:
Spring not using mongo custom converters
I changed the @Configuration annotation to @Component in the class CustomMongoConfiguration, as suggested in the link. But it's also not working. If I use the @Component annotation, the statements below are printed multiple times. I don't know the reason.
System.out.println("***CUSTOM MONGO CONVERSIONS***");
System.out.println("database naem:: "+prop);
System.out.println("host naem:: "+prop);
I'm using spring-boot version 1.3.2.RELEASE.
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.3.2.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
By the way, I'm using application.properties to configure the MongoDB properties rather than any Java-based configuration:
spring.data.mongodb.host=localhost
spring.data.mongodb.password=
spring.data.mongodb.port=27017
spring.data.mongodb.repositories.enabled=true
spring.data.mongodb.database=abcde
I've been struggling to solve this for the past three days. Can anyone please help me out?
Currency conversion has been added via b7131b and is available as of 1.9.0.M1.
Please add @ReadingConverter and @WritingConverter to your Converter implementations so that CustomConversions, and not only the underlying ConversionService, is aware of those.
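For illustration, a minimal sketch of the converters from the question with those annotations applied (the two classes are shown together for brevity; in practice each goes in its own file):
import java.util.Currency;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;

@WritingConverter
public class CurrencyToStringConverter implements Converter<Currency, String> {
    @Override
    public String convert(Currency source) {
        return source.getCurrencyCode();
    }
}

@ReadingConverter
public class StringToCurrencyConverter implements Converter<String, Currency> {
    @Override
    public Currency convert(String source) {
        return Currency.getInstance(source);
    }
}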
My bad. The problem is with data that was inserted earlier in MongoDB with an improper "currency" format, i.e. instead of just a string value like USD or GBP it has the structure below:
"currency" : {
"currencyCode" : "GBP",
"defaultFractionDigits" : 2,
"numericCode" : 826
}
instead of a simple code like:
"currency" : "GBP"
After I wrote the converters, Mongo was unable to parse this structure and threw this exception:
org.springframework.data.mapping.model.MappingException: No property null found on entity class java.util.Currency to bind constructor parameter to!
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:74)
I think this exception is misleading. Anyway, I figured it out. Now both converters are being called.
Final Solution that worked for me:
@Configuration
public class CustomMongoConfiguration extends AbstractMongoConfiguration {
@Autowired
private MongoProperties mongoProps;
@Bean
@Override
public CustomConversions customConversions() {
System.out.println("***CUSTOM MONGO CONVERSIONS***");
List<Converter<?, ?>> converters = new ArrayList<>();
converters.add(new CurrencyToStringConverter());
converters.add(new StringToCurrencyConverter());
return new CustomConversions(converters);
}
@Override
protected String getDatabaseName() {
return mongoProps.getDatabase();
}
@Override
@Bean
public Mongo mongo() throws Exception {
return new MongoClient(mongoProps.getHost(), mongoProps.getPort());
}
}
As per the Spring docs, http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mongo.converter-disambiguation, you need to put @ReadingConverter and @WritingConverter on converters only if both source and target are native types (for example String to Long or Long to String). Otherwise, you don't need to have them on the converters. In my case they are not required, as Currency is not a native Mongo type.

Spring Batch process an encoded zipped file

I’m investigating the use of spring batch to process records from an encoded zipped file. The records are variable length with nested variable length data fields encoded within them.
I’m new to Spring and Spring Batch, this is how I plan to structure the batch configuration.
The ItemReader would need to read a single record from the zipped (*.gz) file input stream into a POJO (byte array); the length of this record would be contained in the first two bytes of the stream.
The ItemProcessor will decode the byte array and store info in relevant attributes in the POJO.
The ItemWriter would populate a database.
My initial problem is understanding how to set up the ItemReader, I’ve looked at some of the examples of using a FlatFileItemReader, but my difficulty is the expectation to have a Line Mapper. I don't see how I can do that in my case (no concept of a line in the file).
There are some articles indicating the use of a custom BufferedReaderFactory, but it would be great to see a worked example of this.
Help would be appreciated.
If the gzipped file is a simple txt file, you only need a custom BufferedReaderFactory; the line mapper then gets the String of the current line.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.UnsupportedEncodingException;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.GZIPInputStream;
import org.springframework.batch.item.file.BufferedReaderFactory;
import org.springframework.core.io.Resource;
public class GZipBufferedReaderFactory implements BufferedReaderFactory {
/** Default value for gzip suffixes. */
private List<String> gzipSuffixes = new ArrayList<String>() {
{
add(".gz");
add(".gzip");
}
};
/**
* Creates a BufferedReader for a gzip Resource, and handles normal resources
* too.
*
* @param resource
* @param encoding
* @return
* @throws UnsupportedEncodingException
* @throws IOException
*/
@Override
public BufferedReader create(Resource resource, String encoding)
throws UnsupportedEncodingException, IOException {
for (String suffix : gzipSuffixes) {
// test for filename and description, description is used when
// handling itemStreamResources
if (resource.getFilename().endsWith(suffix)
|| resource.getDescription().endsWith(suffix)) {
return new BufferedReader(new InputStreamReader(new GZIPInputStream(resource.getInputStream()), encoding));
}
}
return new BufferedReader(new InputStreamReader(resource.getInputStream(), encoding));
}
public List<String> getGzipSuffixes() {
return gzipSuffixes;
}
public void setGzipSuffixes(List<String> gzipSuffixes) {
this.gzipSuffixes = gzipSuffixes;
}
}
Simple item reader configuration:
<bean id="itemReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
<property name="resource" value="#{jobParameters['input.file']}" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.PassThroughLineMapper" />
</property>
<property name="strict" value="true" />
<property name="bufferedReaderFactory">
<bean class="your.custom.GZipBufferedReaderFactory" />
</property>
</bean>
I tested that this simple configuration for reading lines from a zipped & encoded file in S3 works.
Key points:
Implement a BufferedReaderFactory that uses Apache's GZIPInputStreamFactory, and set that as the bufferedReaderFactory on the FlatFileItemReader.
Configure a SimpleStorageResourceLoader from Spring Cloud with an AmazonS3Client, and use it to get the zipped flat file in S3. Set that as the resource on the FlatFileItemReader.
Note: reading into a string can be easily replaced by reading into a POJO.
GZIPBufferedReaderFactory.java
Using Apache's GZIPInputStreamFactory
public class GZIPBufferedReaderFactory implements BufferedReaderFactory {
private final GZIPInputStreamFactory gzipInputStreamFactory;
public GZIPBufferedReaderFactory(GZIPInputStreamFactory gzipInputStreamFactory) {
this.gzipInputStreamFactory = gzipInputStreamFactory;
}
@Override
public BufferedReader create(Resource resource, String encoding) throws IOException {
return new BufferedReader(new InputStreamReader(gzipInputStreamFactory.create(resource.getInputStream()), encoding));
}
}
AWSConfiguration.java
@Configuration
public class AWSConfiguration {
@Bean
public AmazonS3Client s3Client(AWSCredentialsProvider credentials, Region region) {
ClientConfiguration clientConfig = new ClientConfiguration();
AmazonS3Client client = new AmazonS3Client(credentials, clientConfig);
client.setRegion(region);
return client;
}
}
How you configure the AWSCredentialsProvider and Region beans can vary and I will not detail that here since there is documentation elsewhere.
BatchConfiguration.java
@Configuration
@EnableBatchProcessing
public class SignalsIndexBatchConfiguration {
@Autowired
public AmazonS3Client s3Client;
@Bean
public GZIPInputStreamFactory gzipInputStreamFactory() {
return new GZIPInputStreamFactory();
}
@Bean
public GZIPBufferedReaderFactory gzipBufferedReaderFactory(GZIPInputStreamFactory gzipInputStreamFactory) {
return new GZIPBufferedReaderFactory(gzipInputStreamFactory);
}
@Bean
public SimpleStorageResourceLoader simpleStorageResourceLoader() {
return new SimpleStorageResourceLoader(s3Client);
}
@Bean
@StepScope
protected FlatFileItemReader<String> itemReader(
SimpleStorageResourceLoader simpleStorageResourceLoader,
GZIPBufferedReaderFactory gzipBufferedReaderFactory) {
FlatFileItemReader<String> flatFileItemReader = new FlatFileItemReader<>();
flatFileItemReader.setBufferedReaderFactory(gzipBufferedReaderFactory);
flatFileItemReader.setResource(simpleStorageResourceLoader.getResource("s3://YOUR_FLAT_FILE.csv"));
flatFileItemReader.setLineMapper(new PassThroughLineMapper());
return flatFileItemReader;
}
@Bean
public Job job(Step step) {
return jobBuilderFactory.get("job").start(step).build();
}
@Bean
protected Step step(GZIPInputStreamFactory gzipInputStreamFactory) {
return stepBuilderFactory.get("step")
.<String, String> chunk(200)
.reader(itemReader(simpleStorageResourceLoader(), gzipBufferedReaderFactory(gzipInputStreamFactory)))
.processor(itemProcessor())
.faultTolerant()
.build();
}
/*
* These components are some of what we
* get for free with the @EnableBatchProcessing annotation
*/
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
public JobRepository jobRepository;
/*
* END Freebies
*/
@Bean
public JobLauncher jobLauncher() throws Exception {
SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
jobLauncher.setJobRepository(jobRepository);
jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
jobLauncher.afterPropertiesSet();
return jobLauncher;
}
}
From the feature request ticket to spring batch (https://jira.spring.io/browse/BATCH-1750):
public class GZIPResource extends InputStreamResource implements Resource {
public GZIPResource(Resource delegate) throws IOException {
super(new GZIPInputStream(delegate.getInputStream()));
}
}
The custom GZipBufferedReaderFactory won't work with anything other than a FlatFileItemReader.
Edit: lazy version. This doesn't try to open the file until getInputStream is called. This avoids exceptions caused by the file not existing yet if you create the Resource at program initialization (e.g. with autowiring).
public class GzipLazyResource extends FileSystemResource implements Resource {
public GzipLazyResource(File file) {
super(file);
}
public GzipLazyResource(String path) {
super(path);
}
@Override
public InputStream getInputStream() throws IOException {
return new GZIPInputStream(super.getInputStream());
}
}
Edit 2: this only works for input Resources.
Adding a similar getOutputStream method won't work, because Spring uses FileSystemResource.getFile, not FileSystemResource.getOutputStream.
My confusion was based around the file handling in the custom ItemReader: if I were to open and process the file in the read() method, I would have to keep track of where I was in the file, etc. I managed to tackle this by creating a BufferedInputStream (new BufferedInputStream(new GZIPInputStream(new FileInputStream(file)))) in the constructor of the custom ItemReader, and then processing that stream in the read() method on each iteration of the step, as sketched below.
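For reference, a minimal sketch of that kind of reader (the big-endian two-byte length prefix, the file-path constructor, and returning the raw record as a byte[] are my assumptions, not details from the original post):
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import org.springframework.batch.item.ItemReader;

public class GzipRecordItemReader implements ItemReader<byte[]> {

    private final BufferedInputStream in;

    public GzipRecordItemReader(String path) throws IOException {
        // open the stream once; read() advances through it on every call
        this.in = new BufferedInputStream(new GZIPInputStream(new FileInputStream(path)));
    }

    @Override
    public byte[] read() throws IOException {
        // first two bytes hold the record length (assumed big-endian, unsigned)
        int hi = in.read();
        if (hi == -1) {
            return null; // end of input signals end of step
        }
        int lo = in.read();
        int length = (hi << 8) | lo;
        byte[] record = new byte[length];
        int offset = 0;
        while (offset < length) {
            int n = in.read(record, offset, length - offset);
            if (n == -1) {
                throw new IOException("Unexpected end of stream inside a record");
            }
            offset += n;
        }
        return record;
    }
}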

Using JdbcTemplate with Named parameters in spring batch

I'm trying to pass a parameter to my query in Spring Batch. I decided to create a tasklet and use JdbcTemplate as follows:
public RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext)
throws EpsilonBatchBusinessException {
LOGGER.debug("Enter execute.");
JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.query(queryString,
new PreparedStatementSetter() {
public void setValues(PreparedStatement preparedStatement) throws SQLException {
preparedStatement.setInt(1, runNumber);
}
},
rowMapper);
LOGGER.debug("Exit execute.");
return RepeatStatus.FINISHED;
}
So I am injecting into this bean a dataSource, queryString, rowMapper object, and the parameter (runNumber). This tasklet will be called within a step to create a list. I usually pass the row mapper to a JdbcCursorItemReader Spring bean and wouldn't write a tasklet, but my query string needs a parameter, hence I am writing this tasklet. I am just not sure whether this tasklet will do the trick in the same way as a JdbcCursorItemReader? Your input will be appreciated.
A better option would be to use the JdbcCursorItemReader and write a custom PreparedStatementSetter.
The PreparedStatementSetter interface is very simple; pretty much all the code you'd need to write is below. Once the setter is written, all you need to do is configure it as a new bean with the runNumber value injected in the config, and then inject that bean into a JdbcCursorItemReader. This allows you to use all the usual ItemReaders and ItemWriters instead of having to implement everything by hand in a Tasklet.
package com.foo;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import org.springframework.jdbc.core.PreparedStatementSetter;
public class YourParamSetter implements PreparedStatementSetter {
private int runNumber;
public void setValues(PreparedStatement ps) throws SQLException {
ps.setInt(1, runNumber);
}
public void setRunNumber(int runNumber) {
this.runNumber = runNumber;
}
public int getRunNumber() {
return runNumber;
}
}
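To tie it together, here is a hedged sketch of wiring that setter into a JdbcCursorItemReader via Java config (the bean names, the SQL, and the YourRowType placeholder are illustrative, not from the original post):
import javax.sql.DataSource;
import com.foo.YourParamSetter;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.BeanPropertyRowMapper;

@Configuration
public class ReaderConfig {

    @Bean
    public YourParamSetter paramSetter() {
        // inject the actual run number from properties or job parameters as needed
        YourParamSetter setter = new YourParamSetter();
        setter.setRunNumber(42);
        return setter;
    }

    @Bean
    public JdbcCursorItemReader<YourRowType> itemReader(DataSource dataSource) {
        JdbcCursorItemReader<YourRowType> reader = new JdbcCursorItemReader<>();
        reader.setDataSource(dataSource);
        reader.setSql("SELECT ... FROM ... WHERE run_number = ?"); // your parameterized query
        reader.setPreparedStatementSetter(paramSetter());
        reader.setRowMapper(new BeanPropertyRowMapper<>(YourRowType.class));
        return reader;
    }
}
This keeps the parameterized query inside the normal reader/processor/writer flow instead of a hand-rolled tasklet.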