I am using Elasticsearch for my application and this is the domain entity:
@Document(indexName = "bookstore", type = "book", refreshInterval = "-1")
public class Book {

    @Id
    @Column(name = "ID")
    private Long id;

    @Column(name = "NAME")
    @Field(type = FieldType.String)
    private String name;

    @Column(name = "DESCRIPTION")
    private String description;

    @Column(name = "PRICE")
    private Double price;
}
This is the config file
@Configuration
@EnableElasticsearchRepositories(basePackages = "elasticsearch.repo")
public class BookSearchRepositoryTestConfig {

    @Bean
    public ElasticsearchOperations elasticsearchTemplate() throws IOException {
        return new ElasticsearchTemplate(nodeBuilder()
                .loadConfigSettings(false)
                .local(true)
                .settings(
                        ImmutableSettings.settingsBuilder()
                                .put("index.store.type", "memory")
                                .put("index.number_of_shards", 1)
                                .put("index.number_of_replicas", 0).build()
                ).node().client());
    }
}
These settings don't work. Elasticsearch uses the default settings and creates 5 shards.
I know this can be done by using @Document
@Document(indexName = "bookstore", type = "book", shards = 1, replicas = 0, indexStoreType = "memory", refreshInterval = "-1")
or using @Setting
@Setting(settingPath = "/settings/elasticsearch-settings.json")
But I am trying to set these properties from the config file.
Please guide me to solve this issue.
@Bean
public ElasticsearchOperations elasticsearchTemplate() throws IOException {
    Settings settings = ImmutableSettings.settingsBuilder().loadFromClasspath("elasticsearch.yml").build();
    return new ElasticsearchTemplate(nodeBuilder()
            .loadConfigSettings(false)
            .local(true)
            .settings(settings).node().client());
}
This works, but I had to add @Setting(settingPath = "/") to the domain entity.
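To sanity-check that the settings from elasticsearch.yml are actually applied, something along these lines can be run once the context is up. This is only a rough sketch: it assumes the elasticsearchTemplate bean from the config above, and that getSetting(String) is available in the Spring Data Elasticsearch version in use.

// Create the index for Book (the index name "bookstore" comes from @Document) and read back its settings.
elasticsearchTemplate.createIndex(Book.class);
Map settings = elasticsearchTemplate.getSetting("bookstore");
System.out.println(settings.get("index.number_of_shards"));   // expected: "1"
System.out.println(settings.get("index.number_of_replicas")); // expected: "0"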
I have the below class as my document.
@Data
@Builder
@Document(collection = "test")
public class TestData {

    @Id
    private String id;

    private String name;

    @Indexed(unique = true)
    private String hash;
}
Even though I'm using @Indexed with unique enabled, I'm able to insert duplicate documents into the collection.
But if I create the index in the mongo shell, then it works.
Is there any way I can specify a unique index through code only?
This is how the compound index is used in my code
@Getter
@Setter
@Document
@CompoundIndexes({
        @CompoundIndex(name = "name_author_idx", def = "{'name' : 1, 'author' : 1}", unique = true, background = true)})
public class Book implements Transformer {

    @Id
    private String id;

    @Field(name = "name")
    private String name;

    @Field(name = "author")
    private String author;

    @Field(name = "qty")
    private Integer qty;

    @Field(name = "price")
    private Double price;

    @Field(name = "created_time")
    private LocalDateTime createdTime = LocalDateTime.now();
}
Add the following to the application.properties file of your Spring Boot application and it will work:
spring.data.mongodb.auto-index-creation: true
Thank you
If you have a configuration class, you should override the autoIndexCreation method like this:
@Configuration
public class MongoConfiguration extends AbstractMongoClientConfiguration {

    @Override
    protected String getDatabaseName() {
        return "yourDatabase"; // also required when extending AbstractMongoClientConfiguration
    }

    @Override
    protected boolean autoIndexCreation() {
        return true;
    }
}
I have two entities exposed by a Spring Boot application powered by Spring Data REST.
@Entity
@Table(name = "joke")
@Data
public class Joke {

    @Id
    @Column(name = "joke_id")
    private Long id;

    @Column(name = "content")
    private String content;

    @JsonProperty("category")
    @JoinColumn(name = "category_fk")
    @ManyToOne(fetch = FetchType.EAGER)
    private Category category;
}
and Category:
@Entity
@Table(name = "category")
@Data
public class Category {

    @Id
    @Column(name = "category_id")
    private int id;

    @Column(name = "name")
    private String name;
}
It works and exposes the HAL+JSON format. I'm using the Traverson client, which also works fine:
Traverson client = new Traverson(URI.create("http://localhost:8080/api/"),
MediaTypes.HAL_JSON);
HashMap<String, Object> parameters = Maps.newHashMap();
parameters.put("size", "2");
PagedModel<JokesDTO> jokes = client
.follow("jokes")
.withTemplateParameters(parameters)
.toObject(new PagedModelType<JokesDTO>() {
});
return jokes;
where JokesDTO is:
@Builder(toBuilder = true)
@Value
@JsonDeserialize(builder = JokesDTO.JokesDTOBuilder.class)
@JsonInclude(Include.NON_NULL)
public class JokesDTO {

    private String content;

    @JsonPOJOBuilder(withPrefix = "")
    @JsonIgnoreProperties(ignoreUnknown = true)
    public static class JokesDTOBuilder {
    }
}
I'm new to HAL and HATEOAS, and I would like to achieve two things (and the question is: is it possible, and how?):
1) Based on the Traverson client call, how can I retrieve the category (or a link to the category) in one call? How do I extend what I wrote? I'm not talking about adding an additional @JsonProperty annotation to my class definition.
2) Is it possible to expose the inner query from Spring Data REST, so that I can get all the data with one call? Is that possible with @RepositoryRestResource?
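For context, the repositories behind these endpoints are not shown above; presumably they are plain exported Spring Data REST repositories along these lines (the interface names and paths are assumptions):

@RepositoryRestResource(collectionResourceRel = "jokes", path = "jokes")
public interface JokeRepository extends PagingAndSortingRepository<Joke, Long> {
}

@RepositoryRestResource(collectionResourceRel = "categories", path = "categories")
public interface CategoryRepository extends PagingAndSortingRepository<Category, Integer> {
}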
I want to send a @RequestBody in a POST request to a RESTful service. In the @RequestBody I have id, title, aboutMe, and an image file. I set produces = MediaType.MULTIPART_FORM_DATA_VALUE, but when I test the REST call I get an error that says
406 Not Acceptable. How can I solve it?
My controller:
@RequestMapping(value = "aboutMe", method = RequestMethod.POST, produces = MediaType.MULTIPART_FORM_DATA_VALUE)
public String saveAboutMe(@RequestBody Author author) {
    authorService.saveAboutMe(author);
    return "saved";
}
And the entity:
@Entity
@Table(name = "author")
public class Author {

    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "HIBERNATE_SEQUENCE")
    @SequenceGenerator(name = "HIBERNATE_SEQUENCE", sequenceName = "HIBERNATE_SEQUENCE", allocationSize = 1, initialValue = 1)
    private int id;

    @Column(name = "title")
    private String title;

    @JsonFormat(pattern = "yyyy-MM-dd HH:mm")
    @Column(name = "dateOfAuthor")
    @Temporal(TemporalType.TIMESTAMP)
    private Date modifiedTime;

    @Column(name = "aboutMe", length = 10000)
    private String aboutMe;

    @Column(name = "image")
    @Lob
    private Blob image;
}
Here is a screenshot of the REST error I get.
You cannot pass a Blob object with @RequestBody; the image should be a MultipartFile object. So, for working code:
1) Inject SessionFactory into the controller class, or into any class where you want to convert the MultipartFile to a Blob object:
@Autowired
private SessionFactory sessionFactory;
2) Change the controller class:
@RequestMapping(value = "aboutMe", method = RequestMethod.POST)
public String saveAboutMe(Author author, @RequestPart("image") MultipartFile imageFile) {
    // Convert the uploaded MultipartFile into a Blob
    try {
        Blob image = Hibernate.getLobCreator(sessionFactory.getCurrentSession())
                .createBlob(imageFile.getInputStream(), imageFile.getSize());
        author.setImage(image);
    } catch (HibernateException | IOException exception) {
        log.error(exception.getMessage(), exception);
    }
    authorService.saveAboutMe(author);
    return "saved";
}
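For completeness, a client call to this endpoint sends the author fields and the image as multipart form data. A rough sketch using Spring's RestTemplate, LinkedMultiValueMap, and FileSystemResource (the URL, field values, and file path are assumptions):

MultiValueMap<String, Object> body = new LinkedMultiValueMap<>();
body.add("title", "My title");                        // bound to Author.title
body.add("aboutMe", "Some text about me");            // bound to Author.aboutMe
body.add("image", new FileSystemResource("me.png"));  // bound to the @RequestPart("image") parameter

HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.MULTIPART_FORM_DATA);

String response = new RestTemplate().postForObject(
        "http://localhost:8080/aboutMe",
        new HttpEntity<>(body, headers),
        String.class);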
I want to persist an entity (MyEntity) with the merge method. This entity has some bean validation constraints.
public class MyEntity extends AbstractEntity {

    @Basic(optional = false)
    @Column(name = "city", length = 255, nullable = false)
    @NotNull
    @NotEmpty(message = "{myentity.validation.size.name}")
    private String city;

    private String number;

    @Basic(optional = false)
    @Column(name = "zipcode", length = 255, nullable = false)
    @NotNull
    private String zipcode;

    private String phoneNumber;

    @Email(message = "{myentity.validation.conform.email}")
    @Size(min = 2, max = 100, message = "{myentity.validation.size.email}")
    private String email;

    private String website;

    private String gpsLocation;

    @ElementCollection
    @CollectionTable(name = "translation_poi", joinColumns = @JoinColumn(name = "point_id"))
    @MapKeyJoinColumn(name = "locale")
    @NotEmpty
    private Map<Locale, MyEntityI18n> translations = new HashMap<>();
}
@Embeddable
public class MyEntityI18n implements java.io.Serializable {

    @Basic(optional = false)
    @Column(name = "name", length = 255, nullable = false)
    @NotNull
    @NotEmpty(message = "{myentity.validation.size.name}")
    private String name;

    @Column(name = "comment", length = 1200)
    private String comment;

    @Column(name = "short_description", length = 1200)
    private String shortDescription;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
The merge succeeds on an existing entity, but with a new entity the merge fails, despite the fact that the following validation succeeds.
private boolean validate(MyEntity poi) {
    boolean result = true;
    Set<ConstraintViolation<MyEntity>> constraintViolations = validator.validate(poi);
    if (constraintViolations.size() > 0) {
        result = false;
        for (ConstraintViolation<MyEntity> constraints : constraintViolations) {
            FacesContext context = FacesContext.getCurrentInstance();
            String message = constraints.getPropertyPath() + " " + constraints.getMessage();
            context.addMessage(null, new FacesMessage(FacesMessage.SEVERITY_WARN, constraints.getMessage(), message));
        }
    }
    return result;
}
Try adding @Valid to the MyEntity.translations property. I think your validation method doesn't take the MyEntityI18n.name validation into account.
As for the merge failure: do you have a not-null DB constraint on the MyEntityI18n.name field?
Good luck!
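A minimal sketch of the suggested change; only the annotations on the translations field change (javax.validation.Valid is assumed to be available):

@ElementCollection
@CollectionTable(name = "translation_poi", joinColumns = @JoinColumn(name = "point_id"))
@MapKeyJoinColumn(name = "locale")
@NotEmpty
@Valid // cascades validation to each MyEntityI18n value in the map
private Map<Locale, MyEntityI18n> translations = new HashMap<>();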
I am trying to map a class with a composite key in DataNucleus. The primary key is composed of two foreign keys, and I can't seem to include these foreign classes in the fetch group.
Using annotations:
@PrimaryKey
@Column(name = "idElementOne", allowsNull = "false")
private Long idElementOne;

@PrimaryKey
@Column(name = "idElementTwo", allowsNull = "false")
private Long idElementTwo;
works
@PrimaryKey
@Column(name = "idElementOne", allowsNull = "false")
private ElementOne elementOne;

@Column(name = "idElementTwo", allowsNull = "false")
private Long idElementTwo;
works
but
@PrimaryKey
@Column(name = "idElementOne", allowsNull = "false")
private ElementOne elementOne;

@PrimaryKey
@Column(name = "idElementTwo", allowsNull = "false")
private Long idElementTwo;
does not.
How am I meant to do this?
Thanks to comments from the DataNucleus user and documentation from the official website, here is what I was missing.
ElementOne needs a PrimaryKey class so that we can use a constructor accepting a String argument in the main class's PrimaryKey.
ElementOne PrimaryKey class:
public static class PK implements Serializable
{
    public Long idElementOne;

    public PK()
    {
    }

    public PK(String s)
    {
        this.idElementOne = Long.valueOf(s);
    }

    public String toString()
    {
        return "" + idElementOne;
    }

    //...
}
Main class with its PrimaryKey class:
@PersistenceCapable(objectIdClass = PK.class)
public class MainClass
{
    @PrimaryKey
    @Column(name = "idElementOne", allowsNull = "false")
    private ElementOne elementOne;

    @PrimaryKey
    @Column(name = "idElementTwo", allowsNull = "false")
    private Long idElementTwo;

    //...

    public static class PK implements Serializable
    {
        public Long idElementTwo;        // Same name as the real field in the main class
        public ElementOne.PK elementOne; // Same name as the real field in the main class

        public PK()
        {
        }

        public PK(String s)
        {
            String[] constructorParam = s.split("::");
            this.idElementTwo = Long.parseLong(constructorParam[0]);
            this.elementOne = new ElementOne.PK(constructorParam[1]);
        }

        public String toString()
        {
            return "" + idElementTwo + "::" + this.elementOne.toString();
        }

        //...
    }
}
PS: The examples on the DataNucleus website use StringTokenizer, which is not implemented in GWT; use String.split() instead. Moreover, the Javadoc states that:
"StringTokenizer is a legacy class that is retained for compatibility reasons although its use is discouraged in new code. It is recommended that anyone seeking this functionality use the split method of String or the java.util.regex package instead."
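As a rough illustration, once the objectIdClass is in place, a lookup by composite identity might look like the sketch below (the id values are made up, and pm is assumed to be a JDO PersistenceManager obtained from a PersistenceManagerFactory):

// Rebuild the composite application identity from its string form:
// idElementTwo = 42, and the referenced ElementOne has idElementOne = 7.
MainClass.PK id = new MainClass.PK("42::7");
MainClass found = pm.getObjectById(MainClass.class, id);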