Is Replication Possible with Spring Data Couchbase? - spring-data

Hi, I just want to know whether XDCR replication is possible with Spring Data Couchbase. If it is possible, how can I achieve it? Please help.
My Code Sample
//configuration class
@Configuration
public class ApplicationConfig {

    @Bean
    public CouchbaseClient couchbaseClient() throws IOException {
        return new CouchbaseClient(
                Arrays.asList(URI.create("http://localhost:8091/pools")), "xxxw", "");
    }

    @Bean
    public CouchbaseTemplate couchbaseTemplate() throws IOException {
        return new CouchbaseTemplate(couchbaseClient());
    }
}
@Document
public class Person {

    @Field
    String name;

    @Id
    String id;

    int age;

    JSONObject body;

    public Person(final String personId, final String personname,
                  final int personIdAge, final JSONObject personData) {
        this.id = personId;
        this.name = personname;
        this.age = personIdAge;
        this.body = personData;
    }
}
//main Test class
public class Test {
    public static void main(String s[]) {
        try {
            ApplicationContext context = new AnnotationConfigApplicationContext(
                    ApplicationConfig.class);
            CouchbaseTemplate template = context.getBean("couchbaseTemplate",
                    CouchbaseTemplate.class);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
How can I achieve replication to an Elasticsearch index through Spring Data Couchbase with these sample classes?

Using Elasticsearch with Couchbase does not depend on what your client application looks like or whether or not it uses Spring. As long as you are storing data in Couchbase in JSON format, things should work fine.
Setting up Elasticsearch is more of an operations task than a development task. Take a look at the instructions at the link below and then run your application code as is. If you have configured everything properly, your data should end up in Elasticsearch.
http://docs.couchbase.com/couchbase-elastic-search/
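For illustration, a minimal sketch of the application side under that approach: the entity is simply saved through CouchbaseTemplate, which stores it as a JSON document, and the XDCR/Elasticsearch plugin replicates it from there. The document key, name, age and empty JSONObject below are assumptions, not taken from the original question; only the configuration class and the Person constructor come from the sample above.
// Sketch: persist a Person through CouchbaseTemplate; once XDCR to Elasticsearch
// is configured on the cluster, the stored JSON document is replicated without
// any extra application code.
ApplicationContext context = new AnnotationConfigApplicationContext(ApplicationConfig.class);
CouchbaseTemplate template = context.getBean("couchbaseTemplate", CouchbaseTemplate.class);

Person person = new Person("person::1", "John", 30, new JSONObject());
template.save(person); // stored as JSON in the bucket, then picked up by XDCR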

Related

Query for multiple values of the same property with queryDSL and Spring Data JPA

Is there a way to query for multiple values of the same property with Spring Data REST, JPA and Querydsl? I am not sure what the format of the query URL should be and whether I need extra customization in my bindings. I couldn't find anything in the documentation. If I have a "student" table in my database with a "major" column and a corresponding Student entity, I would assume that querying for all students who have "math" and "science" majors would look like http://localhost:8080/students?major=math&major=science. However, in this query only the first part is taken into account and major=science is ignored.
The example below customizes the Querydsl web support to perform a collection "in" operation. The URI /students?major=sword&major=magic then searches for students whose major is in ["sword", "magic"].
Entity and repository
public class Student {
    private Long id;
    private String name;
    private String major;
}

public interface StudentRepos extends PagingAndSortingRepository<Student, Long>,
        QuerydslPredicateExecutor<Student>,
        QuerydslBinderCustomizer<QStudent> {

    @Override
    default void customize(QuerydslBindings bindings, QStudent root) {
        bindings.bind(root.major)
                .all((path, value) -> Optional.of(path.in(value)));
    }
}
Test data
new Student("Arthur", "sword");
new Student("Merlin", "magic");
new Student("Lancelot", "lance");
Controller
@RestController
@RequestMapping("/students")
@RequiredArgsConstructor
public class StudentController {

    private final StudentRepos studentRepos;

    @GetMapping
    ResponseEntity<List<Student>> getAll(Predicate predicate) {
        Iterable<Student> students = studentRepos.findAll(predicate);
        return ResponseEntity.ok(StreamSupport.stream(students.spliterator(), false)
                .collect(Collectors.toList()));
    }
}
Test case
@Test
@SneakyThrows
public void queryAll() {
    mockMvc.perform(get("/students"))
            .andExpect(status().isOk())
            .andExpect(jsonPath("$").isArray())
            .andExpect(jsonPath("$", hasSize(3)))
            .andDo(print());
}

@Test
@SneakyThrows
void querySingleValue() {
    mockMvc.perform(get("/students?major=sword"))
            .andExpect(status().isOk())
            .andExpect(jsonPath("$").isArray())
            .andExpect(jsonPath("$", hasSize(1)))
            .andExpect(jsonPath("$[0].name").value("Arthur"))
            .andExpect(jsonPath("$[0].major").value("sword"))
            .andDo(print());
}

@Test
@SneakyThrows
void queryMultiValue() {
    mockMvc.perform(get("/students?major=sword&major=magic"))
            .andExpect(status().isOk())
            .andExpect(jsonPath("$").isArray())
            .andExpect(jsonPath("$", hasSize(2)))
            .andExpect(jsonPath("$[0].name").value("Arthur"))
            .andExpect(jsonPath("$[0].major").value("sword"))
            .andExpect(jsonPath("$[1].name").value("Merlin"))
            .andExpect(jsonPath("$[1].major").value("magic"))
            .andDo(print());
}
The full Spring Boot application is on GitHub.

Using Spring Batch to write to a Cassandra Database

As of now, I'm able to connect to Cassandra via the following code:
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public static Session connection() {
    Cluster cluster = Cluster.builder()
            .addContactPoints("IP1", "IP2")
            .withCredentials("user", "password")
            .withSSL()
            .build();
    Session session = null;
    try {
        session = cluster.connect("database_name");
        session.execute("CQL Statement");
    } finally {
        IOUtils.closeQuietly(session);
        IOUtils.closeQuietly(cluster);
    }
    return session;
}
The problem is that I need to write to Cassandra in a Spring Batch project. Most of the starter kits seem to use a JdbcBatchItemWriter to write to a MySQL database from a chunk. Is this possible? It seems that a JdbcBatchItemWriter cannot connect to a Cassandra database.
The current itemwriter code is below:
@Bean
public JdbcBatchItemWriter<Person> writer() {
    JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
    writer.setItemSqlParameterSourceProvider(
            new BeanPropertyItemSqlParameterSourceProvider<Person>());
    writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
    writer.setDataSource(dataSource);
    return writer;
}
Spring Data Cassandra provides repository abstractions for Cassandra that you should be able to use in conjunction with the RepositoryItemWriter to write to Cassandra from Spring Batch.
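As a rough sketch of that suggestion: the PersonRepository interface, its String id type and the bean wiring below are assumptions for illustration, not part of the original question.
// Hypothetical Spring Data Cassandra repository for the Person items
public interface PersonRepository extends CrudRepository<Person, String> {
}

// Chunk writer that delegates each item to PersonRepository.save(...)
@Bean
public RepositoryItemWriter<Person> cassandraItemWriter(PersonRepository personRepository) {
    RepositoryItemWriter<Person> writer = new RepositoryItemWriter<>();
    writer.setRepository(personRepository);
    writer.setMethodName("save");
    return writer;
}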
It is possible to extend Spring Batch to support Cassandra by customising ItemReader and ItemWriter.
ItemWriter example:
public class CassandraBatchItemWriter<Company> implements ItemWriter<Company>, InitializingBean {

    protected static final Log logger = LogFactory.getLog(CassandraBatchItemWriter.class);

    private final Class<Company> aClass;

    @Autowired
    private CassandraTemplate cassandraTemplate;

    public CassandraBatchItemWriter(final Class<Company> aClass) {
        this.aClass = aClass;
    }

    @Override
    public void afterPropertiesSet() throws Exception { }

    @Override
    public void write(final List<? extends Company> items) throws Exception {
        logger.debug("Write operation is being performed, the size is " + items.size());
        if (!items.isEmpty()) {
            logger.info("Deleting in a batch...");
            cassandraTemplate.deleteAll(aClass);
            logger.info("Inserting in a batch...");
            cassandraTemplate.insert(items);
        }
        logger.debug("Write operation finished.");
    }
}
Then you can expose it as a @Bean in a @Configuration class:
@Bean
public ItemWriter<Company> writer() {
    final CassandraBatchItemWriter<Company> writer = new CassandraBatchItemWriter<Company>(Company.class);
    return writer;
}
Full source code can be found in the GitHub repo: Spring-Batch-with-Cassandra

REST API with Spring Data MongoDB - Repository method not working

I am reading about and learning Spring Boot with Spring Data MongoDB. I have about 10 records in my database in the following format:
{
"_id" : ObjectId("5910c7fed6df5322243c36cd"),
name: "car"
}
When I open the url:
http://localhost:8090/items
I get an exhaustive list of all items. However, I want to use the methods of MongoRepository such as findById, count etc. When I use them as such:
http://localhost:8090/items/count
http://localhost:8090/items/findById/5910c7fed6df5322243c36cd
http://localhost:8090/items/findById?id=5910c7fed6df5322243c36cd
I get a 404.
My setup is as so:
@SpringBootApplication
public class Application {
    public static void main(String[] args) throws IOException {
        SpringApplication.run(Application.class, args);
    }
}
@Document
public class Item implements Serializable {

    private static final long serialVersionUID = -4343106526681673638L;

    @Id
    private String id;

    private String name;

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
@RepositoryRestResource(collectionResourceRel = "item", path = "items")
public interface ItemRepository<T, ID extends Serializable> extends MongoRepository<Item, String>, ItemRepositoryCustom {
}
What am I doing wrong? Do I need to implement the methods as defined by MongoRepository, or will they be automatically implemented? I am lost and have been trying to figure this out for so long. I do not have any methods in my controller; it's empty.
You have to declare the findById method in order for it to be exposed.
Item findById(String id);
Item findByName(String name);
Note that you don't need to implement the methods; Spring Data will analyse the method name and provide the proper implementation.
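For instance, a minimal sketch of the repository with the query method declared; the /search URL in the comment assumes Spring Data REST's default exposure of declared query methods and the example port/record above.
@RepositoryRestResource(collectionResourceRel = "item", path = "items")
public interface ItemRepository extends MongoRepository<Item, String> {

    // exposed by Spring Data REST as GET /items/search/findByName?name=car
    Item findByName(@Param("name") String name);
}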
I had the same issue.
After removing @Configuration and @ComponentScan, everything worked fine.

Spring Data MongoDB No property get found for type at org.springframework.data.mapping.PropertyPath

I am using Spring Data MongoDB version 1.4.2.RELEASE. I have created the custom repository interface and implementation in one location and created a custom query method getUsersName(Users users).
However, I am still getting the exception below:
Caused by: org.springframework.data.mapping.PropertyReferenceException:
No property get found for type Users! at org.springframework.data.mapping.PropertyPath.<init>(PropertyPath.java:75) at
org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:327) at
org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:359) at
org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:359) at
org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:307) at
org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:270) at
org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:241) at
org.springframework.data.repository.query.parser.Part.<init>(Part.java:76) at
org.springframework.data.repository.query.parser.PartTree$OrPart.<init>(PartTree.java:201) at
org.springframework.data.repository.query.parser.PartTree$Predicate.buildTree(PartTree.java:291) at
org.springframework.data.repository.query.parser.PartTree$Predicate.<init>(PartTree.java:271) at
org.springframework.data.repository.query.parser.PartTree.<init>(PartTree.java:80) at
org.springframework.data.mongodb.repository.query.PartTreeMongoQuery.<init>(PartTreeMongoQuery.java:47)
Below is my Spring Data MongoDB structure:
/* Users Domain Object */
@Document(collection = "users")
public class Users {

    @Id
    private ObjectId id;

    @Field("last_name")
    private String last_name;

    @Field("first_name")
    private String first_name;

    public String getLast_name() {
        return last_name;
    }

    public void setLast_name(String last_name) {
        this.last_name = last_name;
    }

    public String getFirst_name() {
        return first_name;
    }

    public void setFirst_name(String first_name) {
        this.first_name = first_name;
    }
}
/* UsersRepository.java main interface */
@Repository
public interface UsersRepository extends MongoRepository<Users, String>, UsersRepositoryCustom {
    List<Users> findUsersById(String id);
}

/* UsersRepositoryCustom.java custom interface */
@Repository
public interface UsersRepositoryCustom {
    List<Users> getUsersName(Users users);
}
/* UsersRepositoryImpl.java custom interface implementation */
@Component
public class UsersRepositoryImpl implements UsersRepositoryCustom {

    @Autowired
    MongoOperations mongoOperations;

    @Override
    public List<Users> getUsersName(Users users) {
        return mongoOperations.find(
                Query.query(Criteria.where("first_name").is(users.getFirst_name())
                        .and("last_name").is(users.getLast_name())),
                Users.class);
    }
}

/* Test method inside a Spring JUnit test class, calling the custom method through the main UsersRepository interface */
@Autowired
private UsersRepository usersRepository;

@Test
public void getUsersName() {
    Users users = new Users();
    users.setFirst_name("James");
    users.setLast_name("Oliver");
    List<Users> usersDetails = usersRepository.getUsersName(users);
    System.out.println("users List" + usersDetails.size());
    Assert.assertTrue(usersDetails.size() > 0);
}
The query method declaration in your repository interface is invalid. As clearly stated in the reference documentation, query methods need to start with get…By, read…By, find…By or query…By.
With custom repositories, there shouldn't be a need for the method naming conventions Oliver stated; I have mine working with a method named updateMessageCount.
Having said that, I can't see the problem with the code provided here.
I resolved this issue with the help of the post below; I wasn't naming my Impl class correctly:
No property found for type error when try to create custom repository with Spring Data JPA
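As a rough illustration of that fix, and assuming the convention used by this generation of Spring Data (custom implementation classes are located by the repository interface name plus the Impl suffix), the pieces need to line up like this:
// Custom fragment interface with the hand-written method
public interface UsersRepositoryCustom {
    List<Users> getUsersName(Users users);
}

// Implementation named after the repository interface + "Impl"
// so that Spring Data picks it up instead of trying to derive a query
public class UsersRepositoryImpl implements UsersRepositoryCustom {
    // hand-written MongoOperations logic as shown above
}

// Main repository interface combining the two
public interface UsersRepository extends MongoRepository<Users, String>, UsersRepositoryCustom {
}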

Spring Data MongoTemplate not throwing DataAccessException

I am trying to learn MongoDB and at the same time write a simple REST application using the Spring framework.
I have a simple model:
@Document
public class Permission extends documentBase {

    @Indexed(unique = true)
    private String name;

    public Permission(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
Then I have a simple DAO:
@Repository
@Transactional
@Profile({"production", "repositoryTest", "mongoIntegrationTest"})
public class DaoImpl implements DAO {

    @Autowired
    protected MongoTemplate mongoTemplate;

    public <T> T addObject(T object) {
        mongoTemplate.insert(object);
        return object;
    }
}
Then I have my integration tests:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:mvc-dispatcher-servlet.xml", "classpath:IntegrationContext.xml" },
        loader = TestXmlContextLoader.class)
@ActiveProfiles("mongoIntegrationTest")
public class RepositoryIntegrationTest extends AccountTestBase {

    @Autowired DAO repository;
    @Autowired WebApplicationContext wac;

    @Test
    public void AddPermission() {
        Permission permission_1 = new Permission("test");
        Permission permission_2 = new Permission("test");
        repository.addObject(permission_1);
        repository.addObject(permission_2);
    }
}
My configuration:
<!-- MongoDB host -->
<mongo:mongo host="${mongo.host.name}" port="${mongo.host.port}"/>
<!-- Template for performing MongoDB operations -->
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate"
c:mongo-ref="mongo" c:databaseName="${mongo.db.name}"/>
I am expecting that, on adding "permission_2", there would be an exception thrown from MongoDB, which would be translated by Spring and caught as a DataAccessException in the DAO.
Looking at the log files from MongoDB, I can see that a duplicate-key exception is thrown, but it never reaches my DAO.
So I guess I am doing something wrong, but at the moment I am blind to my own mistakes.
//lg
Make sure you configure the WriteConcern of the MongoTemplate to something non-default (e.g. WriteConcern.SAFE). By default MongoDB is in fire-and-forget mode and does not throw exceptions on index violations or server errors in general.
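For example, a minimal sketch of raising the write concern in Java config; the bean wiring and database name are assumptions, and the XML configuration above could be adjusted equivalently.
@Bean
public MongoTemplate mongoTemplate(Mongo mongo) {
    MongoTemplate template = new MongoTemplate(mongo, "databaseName");
    // Acknowledged writes: duplicate-key errors now surface as exceptions
    // that Spring can translate into DataAccessException subclasses.
    template.setWriteConcern(WriteConcern.SAFE);
    return template;
}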
Still struggling with this.
Finally I succeeded in getting the exception translation working: MongoDB throws an exception which is translated to a Spring Data exception.
Now I am stuck with another problem.
My DAO shown above also has the following code:
@ExceptionHandler(DataAccessException.class)
public void handleDataAccessException(DataAccessException ex) {
    // For debug only
    DataAccessException test = ex;
    test.printStackTrace();
}
I was expecting this code to catch the exception thrown, but this is not the case.
Why not?
//lasse