spring-data-mongodb issue with java.util.Currency - mongodb

I'm facing the same issue as mentioned in this link:
Mongo spring-data issue with java.util.Currency
I tried the accepted answer, but it's not working and I'm getting a NullPointerException at this line:
new CustomConversions(Arrays.asList(currencyToString, stringToCurrency));
So I defined the converters in new classes, like this:
@Component
public class CurrencyToStringConverter implements Converter<Currency, String> {

    /* (non-Javadoc)
     * @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
     */
    @Override
    public String convert(Currency arg0) {
        System.out.println("***INSIDE CurrencyToStringConverter***" + arg0);
        return arg0.getCurrencyCode();
    }
}
and
@Component
public class StringToCurrencyConverter implements Converter<String, Currency> {

    /* (non-Javadoc)
     * @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
     */
    @Override
    public Currency convert(String arg0) {
        System.out.println("***INSIDE StringToCurrencyConverter***" + arg0);
        return Currency.getInstance(arg0);
    }
}
I used them in "customConversions()" as shown below:
@Configuration
public class CustomMongoConfiguration extends AbstractMongoConfiguration {

    @Autowired
    private Environment env;

    @Bean
    @Override
    public CustomConversions customConversions() {
        System.out.println("***CUSTOM MONGO CONVERSIONS***");
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new CurrencyToStringConverter());
        converters.add(new StringToCurrencyConverter());
        return new CustomConversions(converters);
    }

    /* (non-Javadoc)
     * @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#getDatabaseName()
     */
    @Override
    protected String getDatabaseName() {
        String prop = env.getProperty("spring.data.mongodb.database");
        System.out.println("database name:: " + prop);
        return prop;
    }

    /* (non-Javadoc)
     * @see org.springframework.data.mongodb.config.AbstractMongoConfiguration#mongo()
     */
    @Override
    @Bean
    public Mongo mongo() throws Exception {
        String prop = env.getProperty("spring.data.mongodb.host");
        System.out.println("host name:: " + prop);
        return new MongoClient(prop);
    }
}
When the Spring Boot application starts up, I can see the following statements printed:
System.out.println("***CUSTOM MONGO CONVERSIONS***");
System.out.println("database name:: " + prop);
System.out.println("host name:: " + prop);
But when I try to get my object from MongoDB, it never calls the "convert" methods in the converter classes, and moreover I always get this exception:
org.springframework.data.mapping.model.MappingException: No property null found on entity class java.util.Currency to bind constructor parameter to!
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:74)
at org.springframework.data.mapping.model.SpELExpressionParameterValueProvider.getParameterValue(SpELExpressionParameterValueProvider.java:63)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:71)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:83)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:251)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:231)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1186)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.access$200(MappingMongoConverter.java:78)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1134)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.getValueInternal(MappingMongoConverter.java:870)
I followed this link too:
Spring not using mongo custom converters
I changed the @Configuration annotation to @Component on the CustomMongoConfiguration class, as suggested in the link, but that's not working either. If I use the @Component annotation, the statements below are printed multiple times; I don't know the reason.
System.out.println("***CUSTOM MONGO CONVERSIONS***");
System.out.println("database name:: " + prop);
System.out.println("host name:: " + prop);
I'm using spring-boot version 1.3.2.RELEASE.
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.3.2.RELEASE</version>
<relativePath /> <!-- lookup parent from repository -->
</parent>
By the way, I'm using application.properties to configure the MongoDB properties rather than any Java-based configuration:
spring.data.mongodb.host=localhost
spring.data.mongodb.password=
spring.data.mongodb.port=27017
spring.data.mongodb.repositories.enabled=true
spring.data.mongodb.database=abcde
I've been struggling with this for the past three days. Can anyone please help me out?

Currency conversion has been added via b7131b and is available as of 1.9.0.M1.
Please add @ReadingConverter and @WritingConverter to your Converter implementations so that CustomConversions, and not only the underlying ConversionService, is aware of them.
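For reference, a minimal sketch of the two converters from the question with those annotations applied (only the annotations and tidier parameter names are new; everything else mirrors the code above):
import java.util.Currency;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;

// Used when writing a java.util.Currency to MongoDB.
@WritingConverter
public class CurrencyToStringConverter implements Converter<Currency, String> {
    @Override
    public String convert(Currency source) {
        return source.getCurrencyCode();
    }
}

// Used when reading the stored string back into a java.util.Currency.
// (Keep the two classes in separate files in a real project.)
@ReadingConverter
class StringToCurrencyConverter implements Converter<String, Currency> {
    @Override
    public Currency convert(String source) {
        return Currency.getInstance(source);
    }
}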

My bad. The problem is with data that was inserted into MongoDB earlier. It had been stored with an improper "currency" format, i.e., instead of just a string value like USD or GBP it had the structure below:
"currency" : {
"currencyCode" : "GBP",
"defaultFractionDigits" : 2,
"numericCode" : 826
}
instead of a simple code like:
"currency" : "GBP"
After I wrote the converters, Mongo was unable to parse this structure and threw this exception:
org.springframework.data.mapping.model.MappingException: No property null found on entity class java.util.Currency to bind constructor parameter to!
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:74)
I think this exception is misleading. Anyway, I figured it out; now both converters are being called.
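If the old documents cannot be cleaned up, one option (a hedged sketch, not from the original post; the class name is made up) would be an extra reading converter that accepts the legacy sub-document shape and only extracts the currencyCode field:
import java.util.Currency;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;

import com.mongodb.DBObject;

// Handles documents written in the old, nested format:
// { "currencyCode" : "GBP", "defaultFractionDigits" : 2, "numericCode" : 826 }
@ReadingConverter
public class LegacyCurrencyDocumentToCurrencyConverter implements Converter<DBObject, Currency> {
    @Override
    public Currency convert(DBObject source) {
        // Only the ISO code is needed to rebuild the java.util.Currency instance.
        return Currency.getInstance((String) source.get("currencyCode"));
    }
}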
Final Solution that worked for me:
@Configuration
public class CustomMongoConfiguration extends AbstractMongoConfiguration {

    @Autowired
    private MongoProperties mongoProps;

    @Bean
    @Override
    public CustomConversions customConversions() {
        System.out.println("***CUSTOM MONGO CONVERSIONS***");
        List<Converter<?, ?>> converters = new ArrayList<>();
        converters.add(new CurrencyToStringConverter());
        converters.add(new StringToCurrencyConverter());
        return new CustomConversions(converters);
    }

    @Override
    protected String getDatabaseName() {
        return mongoProps.getDatabase();
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        return new MongoClient(mongoProps.getHost(), mongoProps.getPort());
    }
}
As per the Spring docs, http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/#mongo.converter-disambiguation, you need to put @ReadingConverter and @WritingConverter on converters only if both the source and target are native types (for example, String to Long or Long to String). Otherwise, you don't need them on the converters. In my case they are not required, since Currency is not a native Mongo type.
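To illustrate the ambiguous case the docs describe, here is a hedged sketch of a converter pair between two Mongo-native types, where the annotations are what tells Spring Data which converter is meant for reading and which for writing:
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;

// Both String and Long can be stored natively by MongoDB, so without the
// annotations Spring Data could not decide the intended direction of each converter.
@WritingConverter
class LongToStringConverter implements Converter<Long, String> {
    @Override
    public String convert(Long source) {
        return Long.toString(source);
    }
}

@ReadingConverter
class StringToLongConverter implements Converter<String, Long> {
    @Override
    public Long convert(String source) {
        return Long.valueOf(source);
    }
}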

Related

overwrite findAll() method of QuerydslPredicateExecutor

My goal is to add a dynamic Predicate to the findAll method of QuerydslPredicateExecutor. This should be used to filter entities based on the organization of the currently active user.
I'm using Spring Data together with Spring Data REST to get the REST API out of the box, i.e. I have no dedicated REST service where I can intercept the incoming data and modify it.
By extending SimpleJpaRepository and registering it with @EnableJpaRepositories it is possible to override a method and change its default behavior. I wanted to do this, but my repository interfaces implement QuerydslPredicateExecutor, and this does not seem to work.
My failed approach started as:
public class CustomizedJpaRepositoryIml<T, ID extends Serializable> extends SimpleJpaRepository<T, ID> {

    private EntityManager entityManager;

    @Autowired
    public CustomizedJpaRepositoryIml(JpaEntityInformation<T, ?> entityInformation,
            EntityManager entityManager) {
        super(entityInformation, entityManager);
        this.entityManager = entityManager;
    }
}
but obviously this extension does not provide the method to be overridden. I debugged how the implementing QuerydslJpaPredicateExecutor is wired, but this is rather complex and I see no easy way of plugging in something here.
Another idea was to use a filter that intercepts the URL call and adds parameters, but this does not sound nice.
I could also override the controller path for the finder with a @BasePathAwareController, but this would mean doing it for every entity I have rather than in a single place.
Any ideas how to achieve this? Maybe there are also completely different ways to add the additional filtering to the Querydsl Predicate.
I found a way in the meantime. It requires providing your own implementation of QuerydslPredicateExecutor, which Spring Data does not make easy. The answer is motivated by https://stackoverflow.com/a/53960209/3351474, but a constructor has since changed in newer Spring Data, so it cannot be taken over 1:1.
I use a different example than in my question, but with this solution you also have complete freedom to add and append any Predicate. As an example I take a customized Querydsl implementation that always uses the creationDate of an entity as the sort criterion if nothing is passed. I assume in this example that this column exists in some @MappedSuperclass for all entities. In real life, use the generated static metadata instead of the hard-coded string "creationDate".
First, the wrapper CustomQuerydslJpaRepositoryIml, which delegates all methods to the QuerydslJpaPredicateExecutor:
/**
 * Customized Querydsl JPA repository to apply custom filtering and sorting logic.
 */
public class CustomQuerydslJpaRepositoryIml<T> implements QuerydslPredicateExecutor<T> {

    private final QuerydslJpaPredicateExecutor querydslPredicateExecutor;

    public CustomQuerydslJpaRepositoryIml(QuerydslJpaPredicateExecutor querydslPredicateExecutor) {
        this.querydslPredicateExecutor = querydslPredicateExecutor;
    }

    private Sort applyDefaultOrder(Sort sort) {
        if (sort.isUnsorted()) {
            return Sort.by("creationDate").ascending();
        }
        return sort;
    }

    private Pageable applyDefaultOrder(Pageable pageable) {
        if (pageable.getSort().isUnsorted()) {
            Sort defaultSort = Sort.by(AuditableEntity_.CREATION_DATE).ascending();
            pageable = PageRequest.of(pageable.getPageNumber(), pageable.getPageSize(), defaultSort);
        }
        return pageable;
    }

    @Override
    public Optional<T> findOne(Predicate predicate) {
        return querydslPredicateExecutor.findOne(predicate);
    }

    @Override
    public List<T> findAll(Predicate predicate) {
        return querydslPredicateExecutor.findAll(predicate);
    }

    @Override
    public List<T> findAll(Predicate predicate, Sort sort) {
        return querydslPredicateExecutor.findAll(predicate, applyDefaultOrder(sort));
    }

    @Override
    public List<T> findAll(Predicate predicate, OrderSpecifier<?>... orders) {
        return querydslPredicateExecutor.findAll(predicate, orders);
    }

    @Override
    public List<T> findAll(OrderSpecifier<?>... orders) {
        return querydslPredicateExecutor.findAll(orders);
    }

    @Override
    public Page<T> findAll(Predicate predicate, Pageable pageable) {
        return querydslPredicateExecutor.findAll(predicate, applyDefaultOrder(pageable));
    }

    @Override
    public long count(Predicate predicate) {
        return querydslPredicateExecutor.count(predicate);
    }

    @Override
    public boolean exists(Predicate predicate) {
        return querydslPredicateExecutor.exists(predicate);
    }
}
Next, the CustomJpaRepositoryFactory that does the magic and provides the Querydsl wrapper class instead of the default one. The default one is passed in as a parameter and wrapped.
/**
 * Custom JpaRepositoryFactory allowing to support a custom QuerydslJpaRepository.
 */
public class CustomJpaRepositoryFactory extends JpaRepositoryFactory {

    /**
     * Creates a new {@link JpaRepositoryFactory}.
     *
     * @param entityManager must not be {@literal null}
     */
    public CustomJpaRepositoryFactory(EntityManager entityManager) {
        super(entityManager);
    }

    @Override
    protected RepositoryComposition.RepositoryFragments getRepositoryFragments(RepositoryMetadata metadata) {
        final RepositoryComposition.RepositoryFragments[] modifiedFragments = {RepositoryComposition.RepositoryFragments.empty()};
        RepositoryComposition.RepositoryFragments fragments = super.getRepositoryFragments(metadata);
        // Because QuerydslJpaPredicateExecutor uses some internal classes, only a wrapper can be used.
        fragments.stream().forEach(
                f -> {
                    if (f.getImplementation().isPresent() &&
                            QuerydslJpaPredicateExecutor.class.isAssignableFrom(f.getImplementation().get().getClass())) {
                        modifiedFragments[0] = modifiedFragments[0].append(RepositoryFragment.implemented(
                                new CustomQuerydslJpaRepositoryIml((QuerydslJpaPredicateExecutor) f.getImplementation().get())));
                    } else {
                        // append returns a new instance, so the result must be reassigned.
                        modifiedFragments[0] = modifiedFragments[0].append(f);
                    }
                }
        );
        return modifiedFragments[0];
    }
}
Finally, the CustomJpaRepositoryFactoryBean. This must be registered with the Spring Boot application to make Spring aware of where to get the repository implementations from, e.g. with:
@SpringBootApplication
@EnableJpaRepositories(basePackages = "your.package",
        repositoryFactoryBeanClass = CustomJpaRepositoryFactoryBean.class)
...
And here is the class itself:
public class CustomJpaRepositoryFactoryBean<T extends Repository<S, I>, S, I> extends JpaRepositoryFactoryBean<T, S, I> {

    /**
     * Creates a new {@link JpaRepositoryFactoryBean} for the given repository interface.
     *
     * @param repositoryInterface must not be {@literal null}.
     */
    public CustomJpaRepositoryFactoryBean(Class<? extends T> repositoryInterface) {
        super(repositoryInterface);
    }

    protected RepositoryFactorySupport createRepositoryFactory(EntityManager entityManager) {
        return new CustomJpaRepositoryFactory(entityManager);
    }
}
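For completeness, a hedged sketch of a repository interface that picks up this behaviour (the Customer entity is made up). Because the factory bean above is registered via @EnableJpaRepositories, findAll calls without an explicit sort now fall back to sorting by creationDate:
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;

public interface CustomerRepository
        extends JpaRepository<Customer, Long>, QuerydslPredicateExecutor<Customer> {
    // No extra code needed; the Querydsl fragment is replaced by
    // CustomQuerydslJpaRepositoryIml through CustomJpaRepositoryFactory.
}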

Spring Data MongoDB No property get found for type at org.springframework.data.mapping.PropertyPath

I am using Spring Data MongoDB version 1.4.2.RELEASE. For Spring Data MongoDB, I have created a custom repository interface and implementation in one location and created a custom query method getUsersName(Users users).
However, I am still getting the exception below:
Caused by: org.springframework.data.mapping.PropertyReferenceException: No property get found for type Users!
at org.springframework.data.mapping.PropertyPath.<init>(PropertyPath.java:75)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:327)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:359)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:359)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:307)
at org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:270)
at org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:241)
at org.springframework.data.repository.query.parser.Part.<init>(Part.java:76)
at org.springframework.data.repository.query.parser.PartTree$OrPart.<init>(PartTree.java:201)
at org.springframework.data.repository.query.parser.PartTree$Predicate.buildTree(PartTree.java:291)
at org.springframework.data.repository.query.parser.PartTree$Predicate.<init>(PartTree.java:271)
at org.springframework.data.repository.query.parser.PartTree.<init>(PartTree.java:80)
at org.springframework.data.mongodb.repository.query.PartTreeMongoQuery.<init>(PartTreeMongoQuery.java:47)
Below is my Spring Data MongoDB structure:
/* Users Domain Object */
@Document(collection = "users")
public class Users {

    @Id
    private ObjectId id;

    @Field("last_name")
    private String last_name;

    @Field("first_name")
    private String first_name;

    public String getLast_name() {
        return last_name;
    }

    public void setLast_name(String last_name) {
        this.last_name = last_name;
    }

    public String getFirst_name() {
        return first_name;
    }

    public void setFirst_name(String first_name) {
        this.first_name = first_name;
    }
}
/* UsersRepository.java main interface */
@Repository
public interface UsersRepository extends MongoRepository<Users, String>, UsersRepositoryCustom {
    List<Users> findUsersById(String id);
}

/* UsersRepositoryCustom.java custom interface */
@Repository
public interface UsersRepositoryCustom {
    List<Users> getUsersName(Users users);
}
/* UsersRepositoryImpl.java custom interface implementation */
@Component
public class UsersRepositoryImpl implements UsersRepositoryCustom {

    @Autowired
    MongoOperations mongoOperations;

    @Override
    public List<Users> getUsersName(Users users) {
        return mongoOperations.find(
                Query.query(Criteria.where("first_name").is(users.getFirst_name()).and("last_name").is(users.getLast_name())), Users.class);
    }
}
/* Mongo Test function inside Spring JUnit Test class calling custom function with main UsersRepository interface */
@Autowired
private UsersRepository usersRepository;

@Test
public void getUsersName() {
    Users users = new Users();
    users.setFirst_name("James");
    users.setLast_name("Oliver");
    List<Users> usersDetails = usersRepository.getUsersName(users);
    System.out.println("users List" + usersDetails.size());
    Assert.assertTrue(usersDetails.size() > 0);
}
The query method declaration in your repository interface is invalid. As clearly stated in the reference documentation, query methods need to start with get…By, read…By, find…By or query…By.
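For example, derived query methods following those conventions might look like this (a hedged sketch; it assumes the entity exposes camelCase properties firstName and lastName, unlike the underscored getters above):
import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical derived queries that the PartTree parser can handle:
public interface UsersRepository extends MongoRepository<Users, String> {

    List<Users> findByFirstNameAndLastName(String firstName, String lastName);

    List<Users> getByLastName(String lastName);
}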
With custom repositories, there shouldn't be a need for method naming conventions, as Oliver stated. I have mine working with a method named updateMessageCount.
Having said that, I can't see the problem with the code provided here.
I resolved this issue with the help of the post below; I wasn't naming my Impl class correctly:
No property found for type error when try to create custom repository with Spring Data JPA

Morphia converter calling other converters

I want to convert Optional<BigDecimal> in Morphia. I created a BigDecimalConverter, and it works fine. Now I want to create an OptionalConverter.
Optional can hold any object type. In my OptionalConverter.encode method I can extract the underlying object, and I'd like to pass it to the default Mongo conversion, so that if there is a string I'll just get the string, and if there is one of my entities I'll get the encoded entity. How can I do this?
There are two questions:
1. How to call other converters?
2. How to create a converter for a generic class whose type parameters are not statically known?
The first one is possible by creating the MappingMongoConverter and the custom converter together:
@Configuration
public class CustomConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        // ...
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        // ...
    }

    @Override
    @Bean
    public MappingMongoConverter mappingMongoConverter() throws Exception {
        MappingMongoConverter mmc = new MappingMongoConverter(
                mongoDbFactory(), mongoMappingContext());
        mmc.setCustomConversions(new CustomConversions(CustomConverters
                .create(mmc)));
        return mmc;
    }
}
public class FooConverter implements Converter<Foo, DBObject> {
private MappingMongoConverter mmc;
public FooConverter(MappingMongoConverter mmc) {
this.mmc = mmc;
}
public DBObject convert(Foo foo) {
// ...
}
}
public class CustomConverters {
    public static List<?> create(MappingMongoConverter mmc) {
        // Use List<Object> for the local variable so that elements can actually be added.
        List<Object> list = new ArrayList<>();
        list.add(new FooConverter(mmc));
        return list;
    }
}
The second one is much more difficult due to type erasure. I've tried to create a converter for Scala's Map but haven't found a way: I was unable to get the exact type information for the source Map when writing, or for the target Map when reading.
For very simple cases, e.g. if you don't need to handle all possible parameter types and there is no ambiguity while reading, it may be possible though.
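Following the Spring Data style shown above, a write-side-only sketch for the Optional case could look roughly like this (heavily hedged: it assumes MappingMongoConverter's convertToMongoType method is available and ignores the harder read-side/type-erasure problem):
import java.util.Optional;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

// Hypothetical converter: unwraps the Optional and hands the value back to the
// mapping converter so that strings stay strings and entities are encoded as usual.
@WritingConverter
public class OptionalWritingConverter implements Converter<Optional<?>, Object> {

    private final MappingMongoConverter mmc;

    public OptionalWritingConverter(MappingMongoConverter mmc) {
        this.mmc = mmc;
    }

    @Override
    public Object convert(Optional<?> source) {
        return source.map(mmc::convertToMongoType).orElse(null);
    }
}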

Spring Data MongoTemplate not throwing DataAccessException

I am trying to learn MongoDB and at the same time write a simple REST application using the Spring framework.
I have a simple model:
@Document
public class Permission extends documentBase {

    @Indexed(unique = true)
    private String name;

    public Permission(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
Then I have a simple DAO:
@Repository
@Transactional
@Profile({"production", "repositoryTest", "mongoIntegrationTest"})
public class DaoImpl implements DAO {

    @Autowired
    protected MongoTemplate mongoTemplate;

    public <T> T addObject(T object) {
        mongoTemplate.insert(object);
        return object;
    }
}
Then I have my integration tests:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:mvc-dispatcher-servlet.xml", "classpath:IntegrationContext.xml" }, loader = TestXmlContextLoader.class)
@ActiveProfiles("mongoIntegrationTest")
public class RepositoryIntegrationTest extends AccountTestBase {

    @Autowired DAO repository;
    @Autowired WebApplicationContext wac;

    @Test
    public void AddPermission() {
        Permission permission_1 = new Permission("test");
        Permission permission_2 = new Permission("test");
        repository.addObject(permission_1);
        repository.addObject(permission_2);
    }
}
My configuration:
<!-- MongoDB host -->
<mongo:mongo host="${mongo.host.name}" port="${mongo.host.port}"/>
<!-- Template for performing MongoDB operations -->
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate"
c:mongo-ref="mongo" c:databaseName="${mongo.db.name}"/>
I am expecting that, on adding "permission_2", an exception would be thrown from MongoDB, translated by Spring and caught as a DataAccessException in the DAO.
Looking at the log files from MongoDB I can see that a duplicate-key exception is thrown, but it never reaches my DAO.
So I guess I am doing something wrong, but at the moment I am blind to my own mistakes.
//lg
Make sure you configure the WriteConcern of the MongoTemplate to something non-default (e.g. WriteConcern.SAFE). By default MongoDB is in fire-and-forget mode and does not throw exceptions on index violations or server errors in general.
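A minimal sketch of that in Java config (the database name "mydb" is a placeholder):
@Bean
public MongoTemplate mongoTemplate(Mongo mongo) throws Exception {
    MongoTemplate template = new MongoTemplate(mongo, "mydb");
    // Non-default WriteConcern so duplicate-key and other server errors are reported
    // back to the driver and can be translated into DataAccessExceptions.
    template.setWriteConcern(WriteConcern.SAFE);
    return template;
}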
Still struggling with this.
Finally I succeeded in getting the exception translation working. MongoDB throws an exception which is translated into a Spring Data exception.
Now I am stuck with another problem.
My DAO shown above also has the following code:
@ExceptionHandler(DataAccessException.class)
public void handleDataAccessException(DataAccessException ex) {
    // For debug only
    DataAccessException test = ex;
    test.printStackTrace();
}
I was expecting this code to catch the exception thrown, but this is not the case.
Why not?
//lasse

Spring data MongoDb: MappingMongoConverter remove _class

The default MappingMongoConverter adds a custom type key ("_class") to each object in the database. So, if I create a Person:
package my.dto;
public class Person {
String name;
public Person(String name) {
this.name = name;
}
}
and save it to the db:
MongoOperations ops = new MongoTemplate(new Mongo(), "users");
ops.insert(new Person("Joe"));
the resulting object in the mongo will be:
{ "_id" : ObjectId("4e2ca049744e664eba9d1e11"), "_class" : "my.dto.Person", "name" : "Joe" }
Questions:
What are the implications of moving the Person class into a different namespace?
Is it possible not to pollute the object with the "_class" key, without writing a unique converter just for the Person class?
So here's the story: we add the type by default as a hint about which class to instantiate. Since you have to pass in a type to read the document into via MongoTemplate anyway, there are two possible options:
You hand in a type that the actual stored type can be assigned to. In that case we consider the stored type and use it for object creation. The classic example here is doing polymorphic queries: suppose you have an abstract class Contact and your Person. You could then query for Contacts, and we essentially have to determine a type to instantiate.
If, on the other hand, you pass in a completely different type, we'd simply marshal into that given type, not into the one actually stored in the document. That covers your question about what happens if you move the type.
You might be interested in watching this ticket, which covers a pluggable type-mapping strategy to turn the type information into an actual type. This can serve simple space-saving purposes, as you might want to reduce a long qualified class name to a hash of a few letters. It would also allow more complex migration scenarios, where you might find completely arbitrary type keys produced by another datastore client and bind those to Java types.
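A hedged sketch of the polymorphic case described above (class and collection names are illustrative, and ops is the MongoOperations instance from the question):
// The stored "_class" hint lets Spring Data instantiate the concrete subtype
// when you read through the abstract supertype.
abstract class Contact {
    String name;
}

class Person extends Contact {
    String phone;
}

// Reading via the supertype: each document's _class value decides whether a
// Person (or another Contact subtype) gets created, e.g.:
// List<Contact> contacts = ops.findAll(Contact.class, "contacts");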
Here's my configuration, and it works.
@Configuration
public class AppMongoConfig {

    public @Bean MongoDbFactory mongoDbFactory() throws Exception {
        return new SimpleMongoDbFactory(new Mongo(), "databasename");
    }

    public @Bean MongoTemplate mongoTemplate() throws Exception {
        // remove _class
        MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext());
        converter.setTypeMapper(new DefaultMongoTypeMapper(null));
        MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory(), converter);
        return mongoTemplate;
    }
}
If you want to disable the _class attribute by default but preserve polymorphism for specific classes, you can explicitly define the type of the (optional) _class field by configuring:
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    Map<Class<?>, String> typeMapperMap = new HashMap<>();
    typeMapperMap.put(com.acme.domain.SomeDocument.class, "role");
    TypeInformationMapper typeMapper1 = new ConfigurableTypeInformationMapper(typeMapperMap);
    MongoTypeMapper typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Arrays.asList(typeMapper1));
    MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext());
    converter.setTypeMapper(typeMapper);
    MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory(), converter);
    return mongoTemplate;
}
This will preserve the _class field (or whatever you want to name it in the constructor) only for the specified entities.
You can also write your own TypeInformationMapper, for example based on annotations. If you annotate your document with @DocumentType("aliasName"), you will keep polymorphism by storing an alias of the class.
I have explained it briefly on my blog, but here is some quick code:
https://gist.github.com/athlan/6497c74cc515131e1336
<mongo:mongo host="hostname" port="27017">
<mongo:options
...options...
</mongo:mongo>
<mongo:db-factory dbname="databasename" username="user" password="pass" mongo-ref="mongo"/>
<bean id="mongoTypeMapper" class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper">
<constructor-arg name="typeKey"><null/></constructor-arg>
</bean>
<bean id="mongoMappingContext" class="org.springframework.data.mongodb.core.mapping.MongoMappingContext" />
<bean id="mongoConverter" class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory" />
<constructor-arg name="mappingContext" ref="mongoMappingContext" />
<property name="typeMapper" ref="mongoTypeMapper"></property>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mongoConverter" ref="mongoConverter" />
<property name="writeResultChecking" value="EXCEPTION" />
</bean>
While Mkyong's answer still works, I would like to add my version of the solution, as a few bits are deprecated and may be on the verge of cleanup.
For example: MappingMongoConverter(mongoDbFactory(), new MongoMappingContext()) is deprecated in favor of new MappingMongoConverter(dbRefResolver, new MongoMappingContext()), and SimpleMongoDbFactory(new Mongo(), "databasename") in favor of new SimpleMongoDbFactory(new MongoClient(), database).
So, my final working answer without deprecation warnings is:
@Configuration
public class SpringMongoConfig {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Autowired
    private MongoDbFactory mongoDbFactory;

    public @Bean MongoDbFactory mongoDBFactory() throws Exception {
        return new SimpleMongoDbFactory(new MongoClient(), database);
    }

    public @Bean MongoTemplate mongoTemplate() throws Exception {
        DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
        // Remove _class
        MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, new MongoMappingContext());
        converter.setTypeMapper(new DefaultMongoTypeMapper(null));
        return new MongoTemplate(mongoDBFactory(), converter);
    }
}
Hope this helps people who would like to have a clean class with no deprecation warnings.
For Spring Boot 2.3.0.RELEASE it's easier: just override the mongoTemplate method, which already has everything you need to set the type mapper. See the following example:
@Configuration
@EnableMongoRepositories(
        // your package ...
)
public class MongoConfig extends AbstractMongoClientConfiguration {

    // .....

    @Override
    public MongoTemplate mongoTemplate(MongoDatabaseFactory databaseFactory, MappingMongoConverter converter) {
        // remove __class field from mongo
        converter.setTypeMapper(new DefaultMongoTypeMapper(null));
        return super.mongoTemplate(databaseFactory, converter);
    }

    // .....
}
This is my one line solution:
@Bean
public MongoTemplate mongoTemplateFraud() throws UnknownHostException {
    MongoTemplate mongoTemplate = new MongoTemplate(getMongoClient(), dbName);
    ((MappingMongoConverter) mongoTemplate.getConverter()).setTypeMapper(new DefaultMongoTypeMapper(null)); // removes _class
    return mongoTemplate;
}
I struggled with this problem for a long time. I followed the approach from mkyong, but when I introduced a LocalDate attribute (or any JSR-310 class from Java 8) I received the following exception:
org.springframework.core.convert.ConverterNotFoundException:
No converter found capable of converting from type [java.time.LocalDate] to type [java.util.Date]
The corresponding converter org.springframework.format.datetime.standard.DateTimeConverters is part of Spring 4.1 and is referenced in Spring Data MongoDB 1.7. Even when I used newer versions, the converter didn't kick in.
The solution was to use the existing MappingMongoConverter and only provide a new DefaultMongoTypeMapper (the code from mkyong is left in as a comment):
@Configuration
@EnableMongoRepositories
class BatchInfrastructureConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "yourdb"
    }

    @Override
    Mongo mongo() throws Exception {
        new Mongo()
    }

    @Bean
    MongoTemplate mongoTemplate() {
        // overwrite type mapper to get rid of the _class column
        // get the converter from the base class instead of creating it
        // def converter = new MappingMongoConverter(mongoDbFactory(), new MongoMappingContext())
        def converter = mappingMongoConverter()
        converter.typeMapper = new DefaultMongoTypeMapper(null)

        // create & return template
        new MongoTemplate(mongoDbFactory(), converter)
    }
}
To summarize:
extend AbstractMongoConfiguration
annotate with @EnableMongoRepositories
in mongoTemplate, get the converter from the base class; this ensures that the type conversion classes are registered (a rough Java equivalent follows below)
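A rough Java equivalent of the same idea (a sketch assuming the Spring Data MongoDB 1.x AbstractMongoConfiguration API, where mappingMongoConverter() is a base-class bean method):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

import com.mongodb.Mongo;

@Configuration
@EnableMongoRepositories
public class BatchInfrastructureConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "yourdb";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new Mongo();
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        // Reuse the converter built by the base class so its registered conversions
        // (e.g. the JSR-310 date converters) stay in place, then drop the _class hint.
        MappingMongoConverter converter = mappingMongoConverter();
        converter.setTypeMapper(new DefaultMongoTypeMapper(null));
        return new MongoTemplate(mongoDbFactory(), converter);
    }
}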
@Configuration
public class MongoConfig {

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.host}")
    private String host;

    public @Bean MongoDbFactory mongoDbFactory() throws Exception {
        return new SimpleMongoDbFactory(new MongoClient(host), database);
    }

    public @Bean MongoTemplate mongoTemplate() throws Exception {
        MappingMongoConverter converter = new MappingMongoConverter(new DefaultDbRefResolver(mongoDbFactory()),
                new MongoMappingContext());
        converter.setTypeMapper(new DefaultMongoTypeMapper(null));
        MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory(), converter);
        return mongoTemplate;
    }
}
The accepted answer above uses a number of deprecated dependencies. For example, if you check the code, it mentions MongoDbFactory, which is deprecated in the latest Spring release. If you happen to be using MongoDB with Spring Data in 2020, that solution is dated. For instant results, check this snippet of code instead.
Just create a new AppConfig.java file and paste this block of code. You'll see the "_class" property disappear from the MongoDB document.
package "Your Package Name";
import org.apache.naming.factory.BeanFactory;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.convert.CustomConversions;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
#Configuration
public class AppConfig {
#Autowired
MongoDatabaseFactory mongoDbFactory;
#Autowired
MongoMappingContext mongoMappingContext;
#Bean
public MappingMongoConverter mappingMongoConverter() {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext);
converter.setTypeMapper(new DefaultMongoTypeMapper(null));
return converter;
}
}
I'm using:
package YOUR_PACKAGE;

import javax.annotation.PostConstruct;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

@Configuration
public class MongoConfiguration {

    @Autowired
    private MappingMongoConverter mongoConverter;

    @PostConstruct
    public void setUpMongoEscapeCharacterAndTypeMapperConversion() {
        mongoConverter.setMapKeyDotReplacement("_");
        // This will remove the _class key
        mongoConverter.setTypeMapper(new DefaultMongoTypeMapper(null));
    }
}
Btw, it also replaces "." with "_" in map keys.
You just need to add the @TypeAlias annotation to the class definition instead of changing the type mapper.
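A minimal sketch of that approach; note that @TypeAlias changes the value stored under _class rather than removing the field entirely:
import org.springframework.data.annotation.TypeAlias;
import org.springframework.data.mongodb.core.mapping.Document;

// Stored as "_class" : "person" instead of the fully qualified class name.
@Document
@TypeAlias("person")
public class Person {
    String name;
}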
I've tried the solutions above; some of them don't work in combination with auditing, and none seems to set the MongoCustomConversions correctly.
A solution that works for me is the following:
@Configuration
public class MongoConfig {

    @Bean
    public MappingMongoConverter mappingMongoConverterWithCustomTypeMapper(
            MongoDatabaseFactory factory,
            MongoMappingContext context,
            MongoCustomConversions conversions) {
        DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
        MappingMongoConverter mappingConverter = new MappingMongoConverter(dbRefResolver, context);
        mappingConverter.setCustomConversions(conversions);

        /**
         * replicate the way that Spring
         * instantiates a {@link DefaultMongoTypeMapper}
         * in {@link MappingMongoConverter#MappingMongoConverter(DbRefResolver, MappingContext)}
         */
        CustomMongoTypeMapper customTypeMapper = new CustomMongoTypeMapper(
                context,
                mappingConverter::getWriteTarget);
        mappingConverter.setTypeMapper(customTypeMapper);

        return mappingConverter;
    }
}
public class CustomMongoTypeMapper extends DefaultMongoTypeMapper {

    public CustomMongoTypeMapper(
            MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext,
            UnaryOperator<Class<?>> writeTarget) {
        super(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext, writeTarget);
    }

    @Override
    public TypeInformation<?> readType(Bson source) {
        /**
         * do your conversion here, and eventually return
         */
        return super.readType(source);
    }
}
As an alternative, you could use a BeanPostProcessor to detect the creation of a mappingMongoConverter, and add your converter there.
Something like
public class MappingMongoConverterHook implements BeanPostProcessor {

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        // Compare bean names with equals() rather than ==.
        if ("mappingMongoConverter".equals(beanName)) {
            ((MappingMongoConverter) bean).setTypeMapper(new CustomMongoTypeMapper());
        }
        return bean;
    }
}
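One follow-up note (an assumption on my part, not from the answer above): the BeanPostProcessor only runs if it is itself registered as a bean, so a registration sketch could look like this:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PostProcessorConfig {

    // Declared static so the post-processor is created early, before the regular
    // beans it needs to inspect (the usual recommendation for BeanPostProcessor beans).
    @Bean
    public static MappingMongoConverterHook mappingMongoConverterHook() {
        return new MappingMongoConverterHook();
    }
}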