Postgres similarity function with spring data - postgresql

I tried to implement a search query in my spring-boot service which utilizes the similarity(text, text) function of postgres.
I got similarity working in the Postgres console and managed to carry it over to my @Repository interface as a native query.
It seems to construct the query correctly, but every time I try to execute the query I get
ERROR: function similarity(text, character varying) does not exist
When I try to create the extension again, I get an exception saying that the extension is already installed.
What am I missing? Do I need some Spring/JPA magic Object to enable this?
Example entity:
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

import lombok.Data;

@Entity
@Table(name = "example")
@Data
public class ExampleEntity {

    @Id
    private String id;

    private String textField;
}
Example repository:
import java.util.List;

import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.CrudRepository;
import org.springframework.data.repository.query.Param;
import org.springframework.stereotype.Repository;

@Repository
public interface ExampleRepository extends CrudRepository<ExampleEntity, String> {

    @Query(nativeQuery = true,
           value = "SELECT * FROM example ORDER BY similarity(:searchString, text_field)")
    List<ExampleEntity> findBySimilarity(@Param("searchString") String searchString);

    @Query(nativeQuery = true, value = "CREATE EXTENSION pg_trgm")
    void createSimilarityExtension();
}
Test code (excluding setup, as it is rather complex):
public void test() {
    ExampleEntity r1 = dbUtils.persistNewRandomEntity();
    ExampleEntity r2 = dbUtils.persistNewRandomEntity();
    ExampleEntity r3 = dbUtils.persistNewRandomEntity();

    try {
        exampleRepository.createSimilarityExtension();
    } catch (InvalidDataAccessResourceUsageException e) {
        // always says that the extension is already set up
    }

    List<ExampleEntity> bySimilarity = exampleRepository.findBySimilarity(r2.getTextField());
    for (ExampleEntity entity : bySimilarity) {
        System.out.println(entity);
    }
}

It turned out I had created the extension in the wrong schema while trying out whether the extension would work at all.
I then added the extension to my DB migration script, but the script skipped the CREATE EXTENSION whenever the extension already existed. As a result, the extension stayed registered in the public schema and was never available in the schema my service actually uses.
So if you run into the same problem I had, make sure your extension is created in the correct schema by using:
SET SCHEMA '<your_schema>'; CREATE EXTENSION pg_trgm;
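As an aside (my addition, not part of the original answer): if you want to verify where the extension actually landed, you can query the PostgreSQL catalogs. A minimal sketch, assuming Spring's JdbcTemplate is available; the helper class name is hypothetical:

import java.util.List;

import org.springframework.jdbc.core.JdbcTemplate;

public class PgTrgmSchemaCheck {

    private final JdbcTemplate jdbcTemplate;

    public PgTrgmSchemaCheck(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Lists the schema(s) in which the pg_trgm extension is currently installed.
    public List<String> pgTrgmSchemas() {
        return jdbcTemplate.queryForList(
                "SELECT n.nspname FROM pg_extension e "
                        + "JOIN pg_namespace n ON n.oid = e.extnamespace "
                        + "WHERE e.extname = 'pg_trgm'",
                String.class);
    }
}

CREATE EXTENSION also accepts an explicit target schema (CREATE EXTENSION pg_trgm WITH SCHEMA your_schema;), which avoids depending on whatever schema happens to be current.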

Related

Add custom database session variable to Spring Boot JPA queries?

I am trying to apply SET SESSION encrypt.key='some_key' to my database queries or connection.
The thing is, I have the following column definition in my model class:
@ColumnTransformer(forColumn = "first_name",
        read = "pgp_sym_decrypt(first_name, current_setting('encrypt.key'))",
        write = "pgp_sym_encrypt(?, current_setting('encrypt.key'))")
@Column(name = "first_name", columnDefinition = "bytea")
private String firstName;
The above works when we set encrypt.key directly in postgresql.conf, but our requirement is to have encrypt.key configurable from our Spring properties file.
Things I tried:
An AttributeConverter with a custom converter class, which only works through JPA and does not support LIKE operations.
A ContextEventListener in which I executed the SET SESSION query at application startup, but that only works for a few requests.
A CustomTransactionManager extending JpaTransactionManager, in which I was doing the following:
@Override
protected void prepareSynchronization(DefaultTransactionStatus status, TransactionDefinition definition) {
    super.prepareSynchronization(status, definition);
    if (status.isNewTransaction()) {
        final String query = "SET encrypt.key='" + encryptKey + "'";
        entityManager.createNativeQuery(query).executeUpdate();
    }
    log.info("Encrypt Key : {}", entityManager.createNativeQuery("SELECT current_setting('encrypt.key')").getSingleResult());
}
The above does not work when I call normal JPA repository methods: encrypt.key is not set because the CustomTransactionManager is not invoked.
Any guidance in the right direction would help me a lot.
Since I had created a CustomTransactionManager extending JpaTransactionManager:
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Primary;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.stereotype.Component;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.support.DefaultTransactionStatus;

import javax.persistence.EntityManager;

@Component
@Slf4j
@Primary
@Qualifier(value = "transactionManager")
public class CustomTransactionManager extends JpaTransactionManager {

    @Autowired
    private EntityManager entityManager;

    @Value("${database.encryption.key}")
    private String encryptKey;

    @Override
    protected void prepareSynchronization(DefaultTransactionStatus status, TransactionDefinition definition) {
        super.prepareSynchronization(status, definition);
        if (status.isNewTransaction()) {
            final String query = "SET SESSION encrypt.key='" + encryptKey + "'";
            entityManager.createNativeQuery(query).executeUpdate();
        }
    }
}
This was not getting called when I used plain JPA repository methods, for example:
public interface UserRepository extends JpaRepository<User, Long> {
    Optional<User> findByFirstName(String firstName);
}
Adding @Transactional on the repository interface overrode the framework logic in which a shared transaction is created behind the scenes for all repository beans. As a result, my CustomTransactionManager is now called even for plain repository methods.
I initially thought that adding @Transactional was overkill, but it turns out such a transaction gets created automatically at the framework level anyway, so adding the annotation manually has no additional footprint on its own; only the code/query you run inside the CustomTransactionManager adds per-request overhead.
So I ended up adding the @Transactional annotation on all repository interfaces whose domain (table) has encrypted columns, as shown below.
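For illustration (my addition), this is just the repository from above with the annotation applied:

import java.util.Optional;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.transaction.annotation.Transactional;

// Every repository call now runs inside a transaction, so the custom transaction
// manager is invoked and SET SESSION encrypt.key=... is applied before the query.
@Transactional
public interface UserRepository extends JpaRepository<User, Long> {

    Optional<User> findByFirstName(String firstName);
}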
For my use case, this was the most flexible way to get column-level encryption on the Azure Database for PostgreSQL service with Spring Boot, because we cannot add custom environment variables there from the Azure Portal, and editing postgresql.conf directly is not possible either, since it is a managed (SaaS) service.

What are schemas for in Apache Beam?

I was reading the docs about schemas in Apache Beam, but I cannot understand what their purpose is, how and why to use them, or in which cases I would need them. What is the difference between using schemas and using a class that implements the Serializable interface?
The docs have an example:
@DefaultSchema(JavaFieldSchema.class)
public class TransactionPojo {
    public String bank;
    public double purchaseAmount;
}

PCollection<TransactionPojo> transactionPojos = readTransactionsAsPojo();
But it doesn't explain how the readTransactionsAsPojo function is built. I think a lot of explanation is missing here.
There are several reasons to use Beam schemas; some of them are below:
You won't need to specify a Coder for objects that have a schema.
If you have objects with the same schema but represented in different ways (e.g. a JavaBean and a POJO, as in your example), Beam schemas allow you to use the same schema PTransforms on the PCollections of those objects.
With schema-aware PCollections it's much easier to write joins, since they require much less boilerplate code.
Using Beam SQL over a PCollection requires a Beam schema. For example, you can read Avro files whose schema is automatically converted into a Beam schema, and then apply a Beam SQL transform over those Avro records.
Also, I'd recommend watching the talks from Beam Summit 2019 about schema-aware PCollections and Beam SQL.
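The docs don't spell out the question's readTransactionsAsPojo(), but with a schema-annotated POJO it could be roughly as simple as the sketch below (my own illustration, not from the Beam docs; the file path and the comma-separated field order are assumptions):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;

public class ReadTransactions {

    // Reads comma-separated lines and maps them onto the schema-annotated POJO.
    // Because TransactionPojo is annotated with @DefaultSchema(JavaFieldSchema.class),
    // Beam infers a schema (and coder) for the resulting PCollection automatically.
    static PCollection<TransactionPojo> readTransactionsAsPojo(Pipeline pipeline) {
        return pipeline
                .apply("ReadLines", TextIO.read().from("/path/to/transactions.csv"))
                .apply("ToPojo", ParDo.of(new DoFn<String, TransactionPojo>() {
                    @ProcessElement
                    public void processElement(@Element String line, OutputReceiver<TransactionPojo> out) {
                        String[] parts = line.split(",");
                        TransactionPojo pojo = new TransactionPojo();
                        pojo.bank = parts[0];
                        pojo.purchaseAmount = Double.parseDouble(parts[1]);
                        out.output(pojo);
                    }
                }));
    }
}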
The docs still give NO answer as to how readTransactionsAsPojo() is implemented:
PCollection<TransactionPojo> transactionPojos = readTransactionsAsPojo();
Keeping the documentation abstract and not having complete code in the repo makes it hard to understand!
Here is sample code that worked for me.
package com.beam.test;

import com.beam.test.schema.Address;
import com.beam.test.schema.Purchase;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class SchemaExample {

    public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.create();
        Pipeline pipeline = Pipeline.create(options);

        pipeline.apply("Create input:", TextIO.read().from("path\\to\\input\\file.txt"))
                .apply(ParDo.of(new ConvertToPurchase()))
                .apply(ParDo.of(new DoFn<Purchase, Void>() {
                    @ProcessElement
                    public void processElement(@Element Purchase purchase) {
                        System.out.println(purchase.getUserId() + ":" + purchase.getAddress().getHouseName());
                    }
                }));

        pipeline.run().waitUntilFinish();
    }

    static class ConvertToPurchase extends DoFn<String, Purchase> {
        @ProcessElement
        public void processElement(@Element String input, OutputReceiver<Purchase> outputReceiver) {
            String[] inputArr = input.split(",");
            Purchase purchase = new Purchase(inputArr[0], new Address(inputArr[1], inputArr[2]));
            outputReceiver.output(purchase);
        }
    }
}
package com.beam.test.schema;

import org.apache.beam.sdk.schemas.JavaBeanSchema;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;
import org.apache.beam.sdk.schemas.annotations.SchemaCreate;

@DefaultSchema(JavaBeanSchema.class)
public class Purchase {

    private String userId;
    private Address address;

    public String getUserId() {
        return userId;
    }

    public Address getAddress() {
        return address;
    }

    @SchemaCreate
    public Purchase(String userId, Address address) {
        this.userId = userId;
        this.address = address;
    }
}
package com.beam.test.schema;

import org.apache.beam.sdk.schemas.JavaBeanSchema;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;
import org.apache.beam.sdk.schemas.annotations.SchemaCreate;

@DefaultSchema(JavaBeanSchema.class)
public class Address {

    private String houseName;
    private String postalCode;

    public String getHouseName() {
        return houseName;
    }

    public String getPostalCode() {
        return postalCode;
    }

    @SchemaCreate
    public Address(String houseName, String postalCode) {
        this.houseName = houseName;
        this.postalCode = postalCode;
    }
}
My test file contains data in the following format:
user1,abc,1234
user2,def,3456

How would you test a service that uses a JpaRepository with Spock and in memory database?

I am using Spock to test a small REST application that uses a JpaRepository, but I seem to be having trouble using an in-memory database in my test. I admit that I am mostly following examples and still don't really understand all that is going on, so any help would be appreciated and any explanation as a bonus would be greatly welcomed.
This is what I have so far.
My controller:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.ArrayList;
import java.util.Collection;

import static org.springframework.web.bind.annotation.RequestMethod.GET;
import static org.springframework.web.bind.annotation.RequestMethod.POST;

@RestController
@RequestMapping("facilities")
public class FacilityController {

    private IFacilityService facilityService;

    @Autowired
    public FacilityController(IFacilityService facilityService) {
        this.facilityService = facilityService;
    }

    @RequestMapping(path = "list", method = GET)
    public ResponseEntity<Collection<Facility>> list() {
        Collection<Facility> facilities = new ArrayList<>();
        facilityService.getFacilities().forEach(facilities::add);
        return ResponseEntity.ok().body(facilities);
    }

    @RequestMapping(path = "add", method = POST)
    public ResponseEntity<?> add(@RequestBody FacilityDto facilityDto) {
        Facility facility = facilityService.add(facilityDto);
        return ResponseEntity.ok().body(facility);
    }
}
The service it is using:
import com.futsaltime.database.repositories.IFacilityRepository;
import com.futsaltime.mappers.FacilityMapper;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class FacilityService implements IFacilityService {

    private IFacilityRepository facilityRepository;

    @Autowired
    public FacilityService(IFacilityRepository facilityRepository) {
        this.facilityRepository = facilityRepository;
    }

    @Override
    public Iterable<Facility> getFacilities() {
        return facilityRepository.findAll();
    }

    @Override
    public Facility add(FacilityDto facilityDto) {
        Facility facility = FacilityMapper.toModel(facilityDto);
        return facilityRepository.save(facility);
    }
}
The repository:
@Transactional
public interface IFacilityRepository extends CrudRepository<Facility, Long> {
    Collection<Facility> findAllByFacilityName(String facilityName);
}
The application.yml:
server:
  port: 8000

endpoints:
  enabled: true
  sensitive: false

spring:
  datasource: # DataSource settings: set here your own configurations for the database
    url: jdbc:mysql://localhost:3306/futsaltime
    type: com.zaxxer.hikari.HikariDataSource
    username: root
    password: root
    testWhileIdle: true # Keep the connection alive if idle for a long time (needed in production)
    validationQuery: SELECT 1
  jpa:
    show-sql: true
    hibernate:
      ddl-auto: create
    # Use spring.jpa.properties.* for Hibernate native properties (the prefix is stripped before adding them to the entity manager)
    properties:
      hibernate:
        dialect: org.hibernate.dialect.MySQL57Dialect # To force the engine to InnoDB
        format_sql: true
And the test in question:
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.test.context.ContextConfiguration
import spock.lang.Specification

@ContextConfiguration
@SpringBootTest
@DataJpaTest
class FacilityServiceTest extends Specification {

    FacilityService facilityService

    @Autowired
    IFacilityRepository facilityRepository

    def setup() {
        facilityService = new FacilityService(facilityRepository)
    }

    def "retrieves all the facilities saved in the database"() {
        given:

        when:
        def result = facilityService.getFacilities()

        then:
        result.isEmpty()
    }
}
I read that @ContextConfiguration is required to be able to scan the classes used for autowiring (is this true?) and that @SpringBootTest picks up the application.yml in the test folder. I have one there that sets up an in-memory H2 database, but I am not sure whether that is needed when I use @DataJpaTest.
If I run the test, the repository is not autowired, so the service is created with a null repository and I get an NPE.
Can someone point out what I am missing, point me to a guide, or even offer some comments on the annotations and whether they are used correctly?

Returning JSON from RESTful Java server code?

I've inherited a web project that a contractor started. My coworkers and I are unfamiliar with the technology used and have a number of questions. From what we can tell, this appears to be some sort of RESTful Java server code, but my understanding is that there are lots of different kinds of Java RESTful services. Which one is this? Specific questions:
1) Where can we read more (particularly introductory information) about this specific service?
2) The code creates and returns a JSON through some kind of "magic"... I merely return a model class (code below) that has getter and setter methods for its fields, and it's automagically converted into a JSON. I'd like to learn more about how this is done automagically.
3) We already have some code that creates a JSON. We need to return this using this framework. If I already have a JSON, how do I return that? I tried something like this:
String testJSON = "{\"menu\": {\"id\": \"file\", \"value\": \"Hello there\"}}";
return testJSON;
instead of returning a model object with getters/setters, but this returns a literal text string, not a JSON. Is there a way to return an actual JSON that's already a JSON string, and have it be sent as a JSON?
You don't have to be able to answer all of the questions above. Any/all pointers in a helpful direction appreciated!
CODE
First, the view controller that returns the JSON:
package com.aimcloud.server;

import com.aimcloud.util.MySqlConnection;

import javax.ws.rs.GET;
import javax.ws.rs.PUT;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.QueryParam;
import javax.ws.rs.FormParam;
import javax.ws.rs.HeaderParam;
import javax.ws.rs.Produces;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.MediaType;

import java.io.File;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

import com.aimcloud.models.SubscriptionTierModel;

@Path("subscription_tier")
public class SubscriptionTierController
{
    // this method will return a list of subscription_tier table entries that are currently active
    @GET
    @Produces({ MediaType.APPLICATION_JSON })
    public ArrayList<SubscriptionTierModel> getSubscriptionTiers(@QueryParam("includeActiveOnly") Boolean includeActiveOnly)
    {
        MySqlConnection mysql = MySqlConnection.getConnection();
        ArrayList<SubscriptionTierModel> subscriptionTierArray = new ArrayList<SubscriptionTierModel>();

        String queryString;
        if (includeActiveOnly)
            queryString = "SELECT * FROM subscription_tier WHERE active=1";
        else
            queryString = "SELECT * FROM subscription_tier";

        List<Map<String, Object>> resultList = mysql.query(queryString, null);
        for (Map<String, Object> subscriptionRow : resultList)
            subscriptionTierArray.add( new SubscriptionTierModel(subscriptionRow) );

        // String testJSON = "{\"menu\": {\"id\": \"file\", \"value\": \"Hello there\"}}";
        // return testJSON;
        return subscriptionTierArray;
    }
}
Next, the model the code above returns:
package com.aimcloud.models;

// NOTE this does NOT import Globals
import java.sql.Types;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Date;
import java.util.List;
import java.util.Map;

import org.json.JSONObject;

import com.aimcloud.util.LoggingUtils;

public class SubscriptionTierModel extends ModelPrototype
{
    private String name;
    private Integer num_studies;
    private Integer cost_viewing;
    private Integer cost_processing;
    private Integer active;

    protected void setupFields()
    {
        this.fields.add("name");
        this.fields.add("num_studies");
        this.fields.add("cost_viewing");
        this.fields.add("cost_processing");
        this.fields.add("active");
    }

    public SubscriptionTierModel()
    {
        super("subscription");
        this.setupFields();
    }

    public SubscriptionTierModel(Map<String, Object> map)
    {
        super("subscription");
        this.setupFields();
        this.initFromMap(map);
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getName() {
        return this.name;
    }

    public void setNum_Studies(Integer num_studies) {
        this.num_studies = num_studies;
    }

    public Integer getNum_studies() {
        return this.num_studies;
    }

    public void setCost_viewing(Integer cost_viewing) {
        this.cost_viewing = cost_viewing;
    }

    public Integer getCost_viewing() {
        return this.cost_viewing;
    }

    public void setCost_processing(Integer cost_processing) {
        this.cost_processing = cost_processing;
    }

    public Integer getCost_processing() {
        return this.cost_processing;
    }

    public void setActive(Integer active) {
        this.active = active;
    }

    public Integer getActive() {
        return this.active;
    }
}
public abstract class ModelPrototype {

    protected MySqlConnection mysql;
    protected ArrayList<String> fields;
    protected String table;
    protected Integer id = null;

    public Integer getId() {
        return this.id;
    }

    public void setId(Integer id) {
        this.id = id;
    }

    abstract protected void setupFields();

    public ModelPrototype() {
        mysql = MySqlConnection.getConnection();
        this.fields = new ArrayList<String>();
        this.fields.add("id");
    }

    public void initFromDbResult(List<Map<String, Object>> result) {
        if (result.size() >= 1)
        {
            Map<String, Object> userRow = result.get(0);
            this.initFromMap(userRow);
            if (result.size() > 1)
            {
                Thread.dumpStack();
            }
        }
        else
        {
            throw new WebApplicationException(ServerUtils.generateResponse(Response.Status.NOT_FOUND, "resource not found"));
        }
    }

    protected void initFromMap(Map<String, Object> map) {
        for (Map.Entry<String, Object> entry : map.entrySet()) {
            Object value = entry.getValue();
            // LoggingUtils.log(entry.getKey() + " " + entry.getValue().toString());
            if (value != null && this.fields.contains(entry.getKey())) {
                this.setField(entry.getKey(), value);
            }
        }
    }

    ....
1) Where can we read more (particularly introductory information) about this specific service?
This is a RESTful service built with the standard JAX-RS annotations. I suggest looking for a tutorial like "REST using Jersey" or "REST using CXF".
2) The code creates and returns a JSON through some kind of "magic"...
The RESTful framework in use takes care of this. The @Produces({ MediaType.APPLICATION_JSON }) annotation tells the framework to perform this conversion. This will be defined somewhere in the configuration; check the Spring config files if you are using Spring to define the beans. Usually a mapper or a provider is registered that converts the object to JSON.
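For example (purely illustrative, since we don't know which stack the contractor chose): if the service turns out to run on Jersey 2, JSON support is typically enabled by registering a Jackson provider in the application configuration.

import org.glassfish.jersey.jackson.JacksonFeature;
import org.glassfish.jersey.server.ResourceConfig;

// Hypothetical Jersey 2 configuration class (the resource package is taken from the code above).
public class AppConfig extends ResourceConfig {

    public AppConfig() {
        packages("com.aimcloud.server"); // scan this package for @Path resources
        register(JacksonFeature.class);  // enables POJO <-> JSON mapping via Jackson
    }
}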
3) We already have some code that creates a JSON. We need to return this using this framework. If I already have a JSON, how do I return that? I tried something like this:
If you already have JSON, just return that string from the method. Remember to still have the @Produces({ MediaType.APPLICATION_JSON }) annotation on the method.
but this returns a literal text string, not a JSON
JSON is a string. That is what you will see in the response, unless you deserialize it back into an object.
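If you want to be explicit about it, you can also hand the pre-built JSON string to a JAX-RS Response yourself. A minimal sketch (not from the inherited project):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("menu")
public class MenuResource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Response getMenu() {
        String testJSON = "{\"menu\": {\"id\": \"file\", \"value\": \"Hello there\"}}";
        // The string is written to the response body as-is; the Content-Type header
        // tells clients to treat it as JSON.
        return Response.ok(testJSON, MediaType.APPLICATION_JSON).build();
    }
}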
I suggest you read up on JAX-RS, the Java specification for RESTful web services. All of the "javax.ws.rs.*" classes/annotations come from JAX-RS.
As JAX-RS is just a specification, there needs to be something that implements the spec. There is probably a third-party JAX-RS component used to run this service. Jersey is one popular implementation; Apache CXF is another.
Now back to JAX-RS. When you read up on this, you will see that the annotations on your class determine the REST characteristics of your service. For example,
#Path("subscription_tier")
defines your class as the resource with URI BASE_PATH/subscription_tier, where BASE_PATH is propbably defined in a configuration file for your web service framework.
As for how the objects are "automagically" converted into a JSON response: that is the role of the web service framework as well. It probably uses some kind of standard object-to-JSON mapping to accomplish this. (I have worked with CXF and XML resources; in that case JAXB was the mapping mechanism.) This is a good thing, as the web service developer does not have to worry about this mapping and can focus on coding just the implementation of the service itself.
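As a stripped-down illustration of that idea (my own example, not taken from the inherited code): the annotations describe the resource, and the runtime's JSON provider serializes whatever the method returns.

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("ping")
public class PingResource {

    public static class Pong {
        private final String status = "ok";

        public String getStatus() { // picked up by the object-to-JSON mapper
            return status;
        }
    }

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Pong ping() {
        return new Pong(); // serialized to {"status":"ok"} by the JSON provider
    }
}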

JPA Warning: "No mapping is associated with the state field path 't.progress'"

I'm using JPA (EclipseLink 2.4.1) with a mapping file containing named queries. Eclipse shows me the following warning message in my mapping file:
No mapping is associated with the state field path 't.progress'.
The warning is of the type JPA Problem. The corresponding lines in my named-queries.xml-file look like this:
<named-query name="FinishedTasks">
<query><![CDATA[SELECT t FROM Task t WHERE t.progress = 100]]></query>
</named-query>
However, the query runs fine when executed, so there is no warning at runtime.
Here's what the file Task.java looks like (excerpt):
@Entity
public class Task extends Issue {

    private Integer progress = 0;

    public Integer getProgress() {
        return progress;
    }

    public void setProgress(final Integer progress) {
        this.progress = progress;
    }
}
Issue.java looks like this (excerpt):
@Entity
@Inheritance(strategy = InheritanceType.JOINED)
public class Issue implements Serializable {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private long id;

    public long getId() {
        return id;
    }

    public void setId(final long id) {
        this.id = id;
    }
}
I have no warnings about queries using Issue.
So my question is, how do I get rid of the warning? And does the warning have some implication I'm not aware of (as said, the query runs fine).
No mapping is associated with the state field path 't.progress'
I believe this is entirely down to the Eclipse JPA Details view (the orm.xml editor) and has nothing to do with EclipseLink or JPA in general. The warning is reminding you that the named query uses a JPA query path (t.progress) that is not mapped in the mapping file. The view/XML editor does not analyse the metadata of your Java classes, so it is not aware of whether the path is mapped via JPA annotations.
In other words, the tool is doing the best job it possibly can for you, given its technology and scope limitations.
Solution:
Understand what the message is saying, manually ensure that the warning is addressed via JPA annotations (or, if you really must, insert the appropriate entity mappings into your entity mapping XML file), and move on...
:^)
This seems to be wrong.
<named-query name="FinishedTasks">
<query><![CDATA[SELECT t FROM Task t WHERE t.progress = 100]]></query>
</named-query>
I can't find anything like that with CDATA. See examples at http://wiki.eclipse.org/EclipseLink/Examples/JPA/QueryOptimization
Try this in your named-queries.xml, or use the @NamedQuery annotation as below.
<named-query name="FinishedTasks">
<query>SELECT t FROM Task t WHERE t.progress = 100</query>
</named-query>
I just built a test project and used this:
package test;

import javax.persistence.Entity;
import javax.persistence.NamedQuery;

@Entity
@NamedQuery(name = "FinishedTasks",
        query = "SELECT t FROM Task t WHERE t.progress = 100")
public class Task extends Issue {

    private Integer progress = 0;

    public Integer getProgress() {
        return progress;
    }

    public void setProgress(final Integer progress) {
        this.progress = progress;
    }
}
Running it with JUnit didn't produce any warning.
package test;

import static org.junit.Assert.assertEquals;

import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import javax.persistence.Query;

import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

public class TaskTest {

    private static EntityManager em;

    @BeforeClass
    public static void setUpBeforeClass() throws Exception {
        EntityManagerFactory factory = Persistence.createEntityManagerFactory("test");
        em = factory.createEntityManager();
        em.getTransaction().begin();
        Task t = new Task();
        t.setProgress(100);
        em.persist(t);
        em.getTransaction().commit();
    }

    @AfterClass
    public static void tearDownAfterClass() throws Exception {
        em.close();
    }

    @Test
    public void test() {
        Query q = em.createNamedQuery("FinishedTasks");
        List<?> list = q.getResultList();
        int expected = 1;
        int actual = list.size();
        assertEquals(expected, actual);
    }
}
My log:
[EL Info]: 2013-05-01 21:57:55.561--ServerSession(1763596)--EclipseLink, version: Eclipse Persistence Services - 2.4.1.v20121003-ad44345
[EL Info]: connection: 2013-05-01