How to mock a @PrePersist method? - jpa

How do I mock a @PrePersist method, e.g. preInit(), of an entity that I instantiate?
I'm using TestNG. EasyMock is preferred.
@Test(enabled = true)
public void testCreateOrder() {
    // Instantiating the new mini order will automatically invoke the pre-persist method, which needs to be mocked/overwritten!
    MiniOrder order = new MiniOrder();
    order.setDate(new Date());
    order.setCustomerId(32423423);
}
MiniOrder.java is an entity that has a pre-persist method; again, that is the one I'd like to mock/overwrite, e.g. so that it just does this.id = 1. Alternatively, one could mock the IdGenerator.getNewId() method.
@PrePersist
protected void preInit() {
    this.id = IdGenerator.getNewId();
}
I don't want the IdGenerator class to be called, because it attempts to grab a JNDI resource. I just don't understand how to intercept this pre-persist method in advance, so that it is not triggered, or is replaced by different code, before the object is fully instantiated.

In this case, what you really want is to mock the IdGenerator dependency, which happens to be called from a @PrePersist method.
Using JMockit, the test can be written as follows:
@Test
public void createOrder()
{
    new MockUp<IdGenerator>() {
        // change as needed...
        @Mock int getNewId() { return 123; }
    };
    MiniOrder order = new MiniOrder();
    order.setDate(new Date());
    order.setCustomerId(32423423);
}
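If JMockit is not an option (the question prefers EasyMock, which on its own cannot redefine a static method such as IdGenerator.getNewId()), a dependency-free alternative is to override the protected callback in a test subclass. The classes below are simplified stand-ins invented for illustration, not the real entities:

```java
// Simplified stand-ins for the question's classes (not the real entities).
class MiniOrder {
    protected long id;

    // In the real entity this is annotated with @PrePersist and calls
    // IdGenerator.getNewId(), which grabs a JNDI resource.
    protected void preInit() {
        throw new IllegalStateException("would hit JNDI here");
    }

    long getId() { return id; }
}

// Test subclass: overriding the protected callback means no JNDI lookup can happen.
class TestableMiniOrder extends MiniOrder {
    @Override
    protected void preInit() {
        this.id = 1L; // fixed id for the test
    }
}
```

The test then instantiates TestableMiniOrder instead of MiniOrder; the persistence provider still sees a MiniOrder and would invoke the overridden preInit() on persist. Note that @PrePersist only fires when the provider persists the entity, not on plain instantiation.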

Related

How to properly test kafkaTemplate.send() within a function in Junit5?

I'm learning how to write tests, especially tests that have a producer in them. I cannot post all the classes because the project is huge (and not mine; I should just practice by changing the test to work with KafkaTemplate). I'm lost as to how a call like this should be tested.
I'm getting an NPE because of a producer.send("topic", JsonObject) call in the function I'm testing. The function is built like so:
@Autowired
private KafkaTemplate<String, EventDto> kafkaTemplate;

public EventDto sendEvent(Event event) {
    EventDto eventToSend = this.dtoMapper.mapToDto(event, SomeEvent.class);
    this.kafkaTemplate.send("topic", eventToSend);
    return eventToSend;
}
in the unit test it's like this (irrelevant parts omitted):
@Test
void testSendEvent() {
    //omitted lines regarding second assert that works
    EventProducer producer = new EventProducer(something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    //there is a second assert here that passes, nothing to do with kafka
}
We have Mockito, and I assume I need to mock the KafkaTemplate somehow. But I'm not quite getting how I can "direct" sendEvent to use the mocked KafkaTemplate within the producer.sendEvent() call.
Solution edit: I changed the @Autowired field injection to constructor injection instead. Works well! Here is the full class and method now:
@Service
public class EventProducer implements EventProducerInterface {

    private final IDtoMapper dtoMapper;
    private KafkaTemplate<String, EventDto> kafkaTemplate;

    @Autowired
    public EventProducer(KafkaTemplate<String, EventDto> kafkaTemplate, IDtoMapper dtoMapper) {
        Assert.notNull(dtoMapper, "dtoMapper must not be null");
        this.dtoMapper = dtoMapper;
        this.kafkaTemplate = kafkaTemplate;
    }

    public EventDto sendEvent(Event event) {
        EventDto eventToSend = this.dtoMapper.mapToDto(event, EventDto.class);
        this.kafkaTemplate.send("output-topic", eventToSend);
        return eventToSend;
    }
}
You should use constructor injection instead of @Autowired field injection:
private KafkaTemplate<String, EventDto> kafkaTemplate;

public EventProducer(KafkaTemplate<String, EventDto> kafkaTemplate, something) {
    this.kafkaTemplate = kafkaTemplate;
}

public EventDto sendEvent(Event event) {
    EventDto eventToSend = this.dtoMapper.mapToDto(event, SomeEvent.class);
    this.kafkaTemplate.send("topic", eventToSend);
    return eventToSend;
}
This way you can inject a mock in your tests:
@Test
void testSendEvent() {
    //omitted lines regarding second assert that works
    KafkaTemplate<String, EventDto> templateMock = mock(KafkaTemplate.class);
    EventProducer producer = new EventProducer(templateMock, something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    //there is a second assert here that passes, nothing to do with kafka
}
If you can't change the class's constructor, you can provide a mock using @MockBean:
@MockBean
KafkaTemplate<String, EventDto> kafkaTemplate;

@Test
void testSendEvent() {
    //omitted lines regarding second assert that works
    EventProducer producer = new EventProducer(something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    //there is a second assert here that passes, nothing to do with kafka
}
But there's something odd with this design: does the EventProducer class have both @Autowired fields and constructor arguments? Autowiring only works on beans, and usually a class either has a default constructor with @Autowired dependencies, or injects everything through the constructor.
If the options I presented do not work for you, please add more details on the class's constructor and overall design.
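The point of the constructor-injection advice can be seen without any mocking library at all. The sketch below uses hand-rolled stand-ins (plain classes invented here, not the real Spring Kafka or mapper types) to show why a constructor-injected collaborator makes sendEvent() testable without a broker:

```java
import java.util.ArrayList;
import java.util.List;

// Hand-rolled stand-in for KafkaTemplate: records what would have been sent.
class RecordingTemplate {
    final List<String> sent = new ArrayList<>();

    void send(String topic, String payload) {
        sent.add(topic + ":" + payload);
    }
}

// The producer receives its collaborator through the constructor,
// so a test can pass in the recording stand-in instead of a real template.
class SimpleEventProducer {
    private final RecordingTemplate template;

    SimpleEventProducer(RecordingTemplate template) {
        this.template = template;
    }

    String sendEvent(String event) {
        template.send("output-topic", event);
        return event;
    }
}
```

A test can then assert both on the return value and on what reached the fake. Mockito's mock(KafkaTemplate.class) plays exactly the role of RecordingTemplate here, with verify() replacing the manual list.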

PowerMock not stubbing the right method

I am facing a weird PowerMock issue. Let me explain in more detail.
My code:
@Service
public class TestMe {

    @Autowired
    private ClassA a;

    @Autowired
    private ClassB b;

    @Autowired
    private ClassStatic staticClass;

    public void init() {
        List<String> nameList = returnNames(); // Line#1
        // Work with names
        List<String> placeList = returnPlaces(); // Line#2
        // Work with places
    }

    public List<String> returnNames() {
        // Code to return list of names
    }

    public List<String> returnPlaces() {
        // Code to return list of places
    }
}
My Test Class
@RunWith(PowerMockRunner.class)
@PrepareForTest({ClassStatic.class})
public class TestMeTest {

    @Mock
    private ClassA aMock;

    @Mock
    private ClassB bMock;

    @InjectMocks
    private TestMe testMeMock;

    @Test
    public void testInit() {
        List<String> listNames = ...  // some list of names
        List<String> listPlaces = ... // some list of places
        when(testMeMock.returnNames()).thenReturn(listNames);
        // listPlaces gets returned in Line#1 shown in the main code.
        when(testMeMock.returnPlaces()).thenReturn(listPlaces);
        testMeMock.init();
    }
}
So, as you can see, at Line#1 I get listPlaces instead of listNames. If I rearrange the when calls, then I get listNames instead of listPlaces at Line#2.
Why does PowerMock confuse the methods? Or is there something else I am missing while working with PowerMock?
I could solve the issue by using thenReturn twice, like below:
when(testMeMock.returnNames()).thenReturn(listNames).thenReturn(listPlaces);
// Removed the returnPlaces() call
// when(testMeMock.returnPlaces()).thenReturn(listPlaces);
testMeMock.init();
But why can't PowerMock distinguish between the two different method calls, returnNames() and returnPlaces()?
A different perspective. This here:
@InjectMocks
private TestMe testMeMock;
and that:
when(testMeMock.returnPlaces()).thenReturn(listPlaces);
simply don't make sense together.
The @InjectMocks annotation is meant to create an instance of your class under test, which gets "filled" with the other mocks you created via the @Mock annotation. But then you use that class-under-test instance as if it were a mock (by calling when(testMeMock.foo())).
You should start by stepping back and clarifying for yourself what exactly you intend to do. Probably partial mocking: you want to test your init() method, but you also intend to control what the other methods of your class under test do.
Finally, you might also want to step back and re-think your overall design. Having public methods that return lists and are also called from a public init() method simply sounds wrong, too.
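Partial mocking can be illustrated without PowerMock at all (with Mockito one would typically use spy() plus doReturn(...).when(...)). The dependency-free sketch below, using simplified stand-in classes invented here, overrides the helper methods in a test subclass so that init() runs its real logic against controlled data:

```java
import java.util.Arrays;
import java.util.List;

// Simplified stand-in for the class under test.
class TestMe {
    List<String> names;
    List<String> places;

    public void init() {
        names = returnNames();   // Line#1
        places = returnPlaces(); // Line#2
    }

    public List<String> returnNames() {
        throw new IllegalStateException("real lookup not available in tests");
    }

    public List<String> returnPlaces() {
        throw new IllegalStateException("real lookup not available in tests");
    }
}

// "Partial mock" by hand: init() stays real, only the helpers are overridden.
class PartiallyMockedTestMe extends TestMe {
    @Override public List<String> returnNames()  { return Arrays.asList("Alice", "Bob"); }
    @Override public List<String> returnPlaces() { return Arrays.asList("Berlin"); }
}
```

Unlike the stubbing in the question, each overridden method returns its own data, so Line#1 and Line#2 cannot get mixed up.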

Why Lazy Collections do not work with JavaFX getters / setters?

I experienced poor performance when using em.find(entity, primaryKey).
The reason seems to be that em.find() will also load entity collections that are annotated with FetchType.LAZY.
This small test case illustrates what I mean:
public class OriginEntityTest4 {
    [..]
    @Test
    public void test() throws Exception {
        final OriginEntity oe = new OriginEntity("o");
        final ReferencePeakEntity rpe = new ReferencePeakEntity();
        oe.getReferencePeaks().add(rpe);
        DatabaseAccess.onEntityManager(em -> {
            em.persist(oe);
            em.persist(rpe);
        });
        System.out.println(rpe.getEntityId());
        DatabaseAccess.onEntityManager(em -> {
            em.find(OriginEntity.class, oe.getEntityId());
        });
    }
}
@Access(AccessType.PROPERTY)
@Entity(name = "Origin")
public class OriginEntity extends NamedEntity {
    [..]
    private final ListProperty<ReferencePeakEntity> referencePeaks =
            new SimpleListProperty<>(FXCollections.observableArrayList(ReferencePeakEntity.extractor()));

    @Override
    @OneToMany(mappedBy = "origin", fetch = FetchType.LAZY)
    public final List<ReferencePeakEntity> getReferencePeaks() {
        return this.referencePeaksProperty().get();
    }

    public final void setReferencePeaks(final List<ReferencePeakEntity> referencePeaks) {
        this.referencePeaksProperty().setAll(referencePeaks);
    }
}
I cannot find any documentation on this; my question is basically how I can prevent the EntityManager from loading the lazy collection.
Why do I need em.find()?
I use the following method to decide whether I need to persist a new entity or update an existing one.
public static void mergeOrPersistWithinTransaction(final EntityManager em, final XIdentEntity entity) {
    final XIdentEntity dbEntity = em.find(XIdentEntity.class, entity.getEntityId());
    if (dbEntity == null) {
        em.persist(entity);
    } else {
        em.merge(entity);
    }
}
Note that OriginEntity is a JavaFX bean, where getter and setter delegate to a ListProperty.
Because FetchType.LAZY is only a hint. Depending on the JPA implementation and on how you configured your entity, the provider may or may not be able to honor it.
Not an answer to the title's question, but maybe to your problem:
You can also use em.getReference(entityClass, primaryKey) in this case. It should be more efficient, since it just obtains a reference to a possibly existing entity, typically without loading it.
See When to use EntityManager.find() vs EntityManager.getReference()
On the other hand, I think your check is perhaps not needed. You could just persist or merge without the check?
See JPA EntityManager: Why use persist() over merge()?
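The last suggestion can be sketched without JPA at all: if an entity's id tells you whether it has ever been persisted, the find()-based existence check (and the lazy-loading it triggers) can often be dropped entirely. All types below are invented stand-ins for illustration, not real JPA types:

```java
// Invented stand-ins, not real JPA types.
class ToyEntity {
    Long id; // null until first persisted
}

class ToyStore {
    int persisted = 0;
    int merged = 0;

    // Decide persist vs merge from the id alone: no database round trip,
    // so nothing lazy ever gets loaded just to answer "does this exist?".
    void persistOrMerge(ToyEntity e) {
        if (e.id == null) {
            e.id = 1L; // a real provider would generate this
            persisted++;
        } else {
            merged++;
        }
    }
}
```

Whether this is safe depends on how ids are assigned in the real application; with provider-generated ids, a non-null id is a reasonable proxy for "already persisted".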

How can we apply aspect-oriented programming to a particular line of a Java class?

How can we apply AOP to the last line of the main method?
Below is a test class for call-by-value in Java. I was asked in an interview to apply aspect-oriented programming to the last line of the class. Is it possible to apply AOP to a particular line of any Java class? If yes, please give some example code.
public class TestCallByValue {

    public static void main(String[] args) {
        Student st = new Student("Sanjeev", 1);
        changeName(st);
        System.out.println(st.getName()); // apply aop on this line to stop printing sysout
    }

    public static void changeName(Student st) {
        st = new Student("Rajeev", 2);
        st.setName("Amit");
    }
}
class Student {
    String name;
    Integer id;

    public Student(String name, Integer id) {
        this.name = name;
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        this.id = id;
    }
}
What an aspect can be applied to at a particular place in Java code is called a join point.
This link lists the possible join points you can define in your code with AspectJ. As you can see, only constructor calls, method calls, field initialization, etc. can be defined as join points.
The only way is to apply a pointcut on System.out#println. You could as well encapsulate System.out.println(st.getName()); in a dedicated method.
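As a sketch of that encapsulation idea: moving the println into a dedicated method gives AspectJ a clean method-call join point (the class and method names here are invented for illustration). The boolean guard merely stands in for the effect an around advice would have:

```java
// Sketch: the println moved into a dedicated method, which is a clean
// method-call join point for AspectJ. The guard stands in for an around
// advice that chooses not to call proceed().
class PrintGateway {
    static boolean printingEnabled = true; // an around advice would decide this
    static String lastPrinted = null;      // recorded for demonstration only

    static void printName(String name) {
        if (printingEnabled) {
            lastPrinted = name;
            System.out.println(name);
        }
    }
}
```

In main, System.out.println(st.getName()) would become PrintGateway.printName(st.getName()), and a pointcut like call(void PrintGateway.printName(String)) then matches exactly that one call.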
AspectJ doesn't operate on source code; it operates on the semantic structure of Java programs. As such, it doesn't have a concept of "lines". The interviewer meant that you should prevent a particular method call from happening, and told you where that method call is; in this particular case, it's the last statement of the main method.
This statement is located in the TestCallByValue.main() method and invokes println() on System.out, which is a PrintStream. While we cannot tell AspectJ to prevent only the "last" statement from executing, we can narrow it down to: method calls to the println(String) method of the PrintStream class, returning void, within the code of the TestCallByValue.main(String[]) method.
To prevent the method call from happening, you will need an around advice which doesn't call proceed(). We can also check whether the target of the method call is actually System.out, so we prevent only System.out.println(String), not println(String) calls on other instances of PrintStream.
The above can be achieved with the following aspect:
aspect DummyAspect {
    pointcut printlnStatementInMain(): withincode(void TestCallByValue.main(String[]))
            && call(void java.io.PrintStream.println(String));

    void around(): printlnStatementInMain() {
        if (thisJoinPoint.getTarget() != System.out) {
            proceed();
        }
    }
}

Disposing of object context when implementing a repository (DAL) and Mocking DAL for TDD

I am running into an issue where I can't figure out how to properly dispose of the object context I create every time I instantiate a new object.
public class OrderBLL {

    private IOrderLineDal _iOrderLineDal;

    public OrderBLL() {
        _iOrderLineDal = new OrderLineDal(new EntityObject(dbConnectionString));
    }

    public OrderBLL(IOrderLineDal mockOrderLineDal) {
        _iOrderLineDal = mockOrderLineDal;
    }
}
So the problem is that every 30 seconds my service creates a new instance of OrderBLL and then runs a method to check whether there are any new orders in the database.
So every 30 seconds I create a new entityObject that is never disposed of. The old implementation of the code was written with a using statement.
public bool HasNewOrders() {
    using (var entityObject = new EntityObject(dbConnectionString)) {
        var newOrders = entityObject.GetNewOrders();
    }
    // some logic
}
The problem with this using statement is that I cannot mock out the entityObject and easily write unit tests for any methods inside this OrderBLL class.
I tried disposing of it with a Dispose method inside the OrderLineDal, calling it once I got the data. That worked well on the first iteration, but on the following iterations, 30 seconds later, it would say that the entityObject was disposed of and cannot be used. (This doesn't make sense to me, since I am creating a new one every time?)
Is there a way I can implement this repository pattern, still dispose of all the new entityObjects, and mock the DAL out for unit testing?
I am working with EF 4, and it was not set up Code First, so I do not have POCOs.
Ideally you would want to create your context outside of your OrderBLL (search Google for the Repository pattern).
public class OrderRepository : IOrderRepository, IDisposable
{
    private readonly IOrderDBContext _dbContext;

    // poor man's dependency injection
    public OrderRepository() : this(new OrderDbContext("YourConnectionString"))
    {}

    public OrderRepository(IOrderDBContext dbContext)
    {
        if (dbContext == null) throw new ArgumentNullException("dbContext");
        _dbContext = dbContext;
    }

    public IEnumerable<Order> GetNewOrders()
    {
        return _dbContext.Orders.Where(o => o.IsNew == true);
    }

    public void Dispose()
    {
        if (_dbContext != null) _dbContext.Dispose();
    }
}
public class OrderBLL : IOrderBLL
{
    private readonly IOrderRepository _repository;

    public OrderBLL(IOrderRepository repository)
    {
        if (repository == null) throw new ArgumentNullException("repository");
        _repository = repository;
    }

    public bool HasNewOrders()
    {
        var newOrders = _repository.GetNewOrders();
        if (newOrders == null) return false;
        return newOrders.Count() > 0;
    }
}
[Test]
public void HasNewOrders_GivenNoNewOrdersReturnedFromRepo_ReturnsFalse()
{
    // test using NUnit and NSubstitute
    // Arrange
    var repository = Substitute.For<IOrderRepository>();
    var emptyOrderList = new List<Order>();
    repository.GetNewOrders().Returns(emptyOrderList);
    var orderBLL = new OrderBLL(repository);
    // Act
    var result = orderBLL.HasNewOrders();
    // Assert
    Assert.IsFalse(result);
}
Now you can inject your context into this class and easily test your business logic. Eventually you will still need to create your dbContext somewhere, and you should always dispose of it. I would suggest having a look at a DI container like Castle Windsor to manage the lifetime of your objects, although in a service you may just want to manually create and dispose your context as close to the code entry point as possible (e.g. in the main method).