We are using RestTemplate to consume external REST services. There are many different kinds of services in our project, and all of them are tested using different strategies, such as mocking the RestTemplate or mocking our communication object.
We used the code below in a test case to test one service using MockRestServiceServer:
RestTemplate restTemplate = new RestTemplate();
mockServer = MockRestServiceServer.createServer(restTemplate);
So our question is:
Is there a way to destroy this server as soon as this test case completes, so that it doesn't affect other test cases?
First and foremost, the MockRestServiceServer is not a real server -- for example, it is not listening on a TCP port. The only thing the MockRestServiceServer does is modify your RestTemplate (see details below).
So to answer your question: there is no server to destroy.
However... if your RestTemplate is created in your ApplicationContext and injected into multiple components (e.g., in your service layer), you may want to reset the initial state of the RestTemplate. If that's the case, read on...
There is currently no "official" way to reset the RestTemplate passed to MockRestServiceServer.createServer(), but that doesn't mean you can't implement such a feature on your own.
The key to understanding this is knowing that the MockRestServiceServer.createServer() method replaces the ClientHttpRequestFactory in the provided RestTemplate with a mocked version (i.e., the private, internal MockRestServiceServer.RequestMatcherClientHttpRequestFactory).
So you should be able to reset the original state of the RestTemplate by tracking the original request factory and setting it in the template after your test. Something like the following should work:
RestTemplate restTemplate = // likely injected into the test
ClientHttpRequestFactory originalRequestFactory = restTemplate.getRequestFactory();
MockRestServiceServer mockServer = MockRestServiceServer.createServer(restTemplate);
try {
// use mockServer as usual...
mockServer.verify();
} finally {
restTemplate.setRequestFactory(originalRequestFactory);
}
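If several tests in the same class need this, the same bookkeeping could live in JUnit lifecycle methods instead of a try/finally block. A minimal sketch, assuming JUnit 4 and a restTemplate injected into the test (the field and method names are illustrative):
private ClientHttpRequestFactory originalRequestFactory;
private MockRestServiceServer mockServer;

@Before
public void setUpMockServer() {
    // remember the original factory before createServer() swaps it out
    originalRequestFactory = restTemplate.getRequestFactory();
    mockServer = MockRestServiceServer.createServer(restTemplate);
}

@After
public void restoreRestTemplate() {
    // restore the RestTemplate to its pre-test state
    restTemplate.setRequestFactory(originalRequestFactory);
}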
Let me know if that solves your problem!
Cheers,
Sam (author of the Spring TestContext Framework)
I'm writing automated tests using TestNG for the REST API of my application. The application has a RestController which contains an @Autowired service class. When the REST endpoint is called with an HTTP GET request, the service looks into a storage directory for XML files, transforms their contents into objects, and stores them in a database. The important thing for my question is that the path to the storage directory is stored in /src/main/resources/application.yml (source.storage) and imported via a @Value annotation.
Now, I have the source.storage property also in src/test/resources/application.yml, pointing to a different directory within src/test where I store my testing XML files, and I import it into my test class with a @Value annotation again. My test calls the REST endpoint with an HTTP GET. However, it seems that the service still draws the source.storage property from the main application.yml, while I would like that value overridden by the one in the test application.yml file. In other words, the service tries to import XML files from the application storage directory rather than from my testing storage.
@ActiveProfiles and @TestPropertySource do not seem to work for me. Scanning the main application.yml for its storage property is not an option, as in the end the application.yml will be drawn from a Spring Cloud Config, and I would not know where the main application.yml would be located.
Is there a way with which I could make the @Autowired service draw the source.storage property from the test application.yml rather than from the main one?
Any advice would be appreciated.
Thanks, Petr
Well, it really depends on what you're trying to build: some sort of unit test of the controller, or, more likely, an integration test. Both approaches are explained in this tutorial.
If you're trying to write an integration test, which seems a bit more likely from your question, then @ActiveProfiles or @TestPropertySource should work for you. I would suggest using profiles; in a growing application with a lot of properties, it is a bit more convenient to just replace some of the properties for testing. Below is the setup which worked for me when writing integration tests for controller endpoints:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles("test")
@FixMethodOrder(MethodSorters.NAME_ASCENDING)
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_CLASS)
public class AreaControllerTest {

    @Autowired
    TestRestTemplate rest;

    @MockBean
    private JobExecutor jobExecutor;

    @Test
    public void test01_List() {
        //
    }

    @Test
    public void test02_Get() {
        //
    }

    // ...
}
There are several important things.
The testing properties are in src/test/resources/application-test.properties and merge with the ones in application.properties, as the @ActiveProfiles("test") annotation suggests (a sample file is shown after these notes).
Essential is also @RunWith(SpringRunner.class), which is JUnit specific; for a TestNG alternative, please refer to this SO question.
Finally, the @SpringBootTest annotation will start the whole application context.
@FixMethodOrder and @DirtiesContext are further setup of the test case and are not strictly necessary.
Notice also the @MockBean annotation; in this case we did not want to use the real-life implementation of JobExecutor, so we replaced it with a mock.
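For illustration, such a test profile file might override just the storage path. A minimal sketch of src/test/resources/application-test.properties, where the directory value is a hypothetical example:
source.storage=src/test/resources/storage
With the test profile active, this value takes precedence over the source.storage defined in the main configuration.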
If you want to write unit tests where you just check the logic of the controller and the service on their own, then you have to have two test classes, each testing the respective class. Testing the service should be a standard unit test; testing the controller is a bit trickier and probably closer to a partial integration test. If this is your case, I would recommend the MockMvc approach explained in the above-mentioned tutorial. A small snippet from there:
import static org.hamcrest.Matchers.containsString;
import static org.mockito.Mockito.when;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultHandlers.print;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

@RunWith(SpringRunner.class)
@WebMvcTest(GreetingController.class)
public class WebMockTest {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private GreetingService service;

    @Test
    public void greetingShouldReturnMessageFromService() throws Exception {
        when(service.greet()).thenReturn("Hello Mock");
        this.mockMvc.perform(get("/greeting")).andDo(print())
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("Hello Mock")));
    }
}
Notice the @MockBean annotation, which mocks the service and lets you specify your own mock behaviour. This point is critical, because this sort of test does not load the whole application context, only the MVC context, so the services are not available. Again, as in the integration test, the @RunWith(SpringRunner.class) annotation is essential. Finally, @WebMvcTest(GreetingController.class) starts only the MVC context of the GreetingController class and not the whole application.
You can try supplying the property directly to the Spring Boot test:
@SpringBootTest(properties = {"source.storage=someValue"})
Regarding the application picking up the wrong property source, you should also check that your application is being built properly.
I'm trying to wire up Spring Data JPA objects manually so that I can generate DAO proxies (aka Repositories) - without using a Spring bean container.
Inevitably, I will be asked why I want to do this: it is because our project is already using Google Guice (and on the UI using Gin with GWT), and we don't want to maintain another IoC container configuration, or pull in all the resulting dependencies. I know we might be able to use Guice's SpringIntegration, but this would be a last resort.
It seems that everything is available to wire the objects up manually, but since it's not well documented, I'm having a difficult time.
According to the Spring Data user's guide, using repository factories standalone is possible. Unfortunately, the example shows RepositoryFactorySupport, which is an abstract class. After some searching I managed to find JpaRepositoryFactory.
JpaRepositoryFactory actually works fairly well, except that it does not automatically create transactions. Transactions must be managed manually, or nothing will be persisted to the database:
entityManager.getTransaction().begin();
repositoryInstance.save(someJpaObject);
entityManager.getTransaction().commit();
The problem turned out to be that @Transactional annotations are not applied automatically and need the help of a TransactionInterceptor.
Thankfully, the JpaRepositoryFactory can take a callback to add more AOP advice to the generated Repository proxy before returning:
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(emf.createEntityManager());
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new AnnotationTransactionAttributeSource()));
    }
});
This is where things are not working out so well. Stepping through the code in a debugger, the TransactionInterceptor is indeed creating a transaction - but on the wrong EntityManager. Spring manages the active EntityManager by looking at the currently executing thread. The TransactionInterceptor does this, sees there is no active EntityManager bound to the thread, and decides to create a new one.
However, this new EntityManager is not the same instance that was created and passed into the JpaRepositoryFactory constructor, which requires an EntityManager. The question is, how do I make the TransactionInterceptor and the JpaRepositoryFactory use the same EntityManager?
Update:
While writing this up, I found out how to solve the problem, but it still may not be the ideal solution. I will post this solution as a separate answer. I would be happy to hear any suggestions on a better way to use Spring Data JPA standalone than how I've solved it.
The general principle behind the design of JpaRepositoryFactory and the corresponding Spring integration JpaRepositoryFactoryBean is the following:
We're assuming you run your application inside a managed JPA runtime environment, not caring about which one.
That's the reason we rely on an injected EntityManager rather than an EntityManagerFactory. By definition the EntityManager is not thread-safe, so if we dealt with an EntityManagerFactory directly, we would have to rewrite all the resource-managing code that a managed runtime environment (such as Spring or EJB) would otherwise provide for you.
To integrate with Spring's transaction management we use Spring's SharedEntityManagerCreator, which actually does the transaction resource binding magic you've implemented manually. So you probably want to use that one to create EntityManager instances from your EntityManagerFactory. If you want to activate transactionality on the repository beans directly (so that a call to e.g. repo.save(…) creates a transaction if none is already active), have a look at the TransactionalRepositoryProxyPostProcessor implementation in Spring Data Commons. It activates transactions when Spring Data repositories are used directly (e.g. for repo.save(…)) and slightly customizes the transaction configuration lookup to prefer interfaces over implementation classes, allowing repository interfaces to override the transaction configuration defined in SimpleJpaRepository.
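To make that suggestion concrete, here is a minimal sketch of what the standalone wiring could look like with SharedEntityManagerCreator (the persistence unit name and FooRepository are taken from the code in this question; the rest is my assumption, not an official recipe):
// Build a transaction-aware EntityManager proxy instead of a plain EntityManager.
// At call time the proxy delegates to the EntityManager bound to the current transaction.
EntityManagerFactory emf = Persistence.createEntityManagerFactory("com.foo.model");
EntityManager sharedEm = SharedEntityManagerCreator.createSharedEntityManager(emf);

// Repositories created from the shared proxy pick up the transactional EntityManager,
// so no manual TransactionSynchronizationManager binding is needed.
JpaRepositoryFactory factory = new JpaRepositoryFactory(sharedEm);
FooRepository repository = factory.getRepository(FooRepository.class);
Transactions would still be driven by a JpaTransactionManager plus a TransactionInterceptor (or the TransactionalRepositoryProxyPostProcessor mentioned above), as in the question's own code.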
I solved this by manually binding the EntityManager and EntityManagerFactory to the executing thread, before creating repositories with the JpaRepositoryFactory. This is accomplished using the TransactionSynchronizationManager.bindResource method:
emf = Persistence.createEntityManagerFactory("com.foo.model", properties);
em = emf.createEntityManager();

// Create your transaction manager and RepositoryFactory
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(em);

// Make sure calls to the repository instance are intercepted for annotated transactions
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new MatchAlwaysTransactionAttributeSource()));
    }
});

// Create your repository proxy instance
FooRepository repository = factory.getRepository(FooRepository.class);

// Bind the same EntityManager used to create the repository to the thread
TransactionSynchronizationManager.bindResource(emf, new EntityManagerHolder(em));
try {
    repository.save(someInstance); // Done in a transaction using one EntityManager
} finally {
    // Make sure to unbind when done with the repository instance
    TransactionSynchronizationManager.unbindResource(emf);
}
There must be a better way, though. It seems strange that the RepositoryFactory was designed to use an EntityManager instead of an EntityManagerFactory. I would expect it to first check whether an EntityManager is bound to the thread, and then either create a new one and bind it, or use the existing one.
Basically, I would want to inject the repository proxies and expect them to internally obtain a fresh EntityManager on every call, so that the calls are thread-safe.
I need to create a REST endpoint dynamically in my Spring Boot application. Instead of statically creating the class with @RestController, is there a way to instantiate and activate a REST service at runtime? It should be possible to specify the endpoint, input parameters, etc. at runtime.
Are there some Groovy options too?
Thanks,
Sandeep Joseph
I think the approach to take would be to create a custom MvcEndpoint that will handle all requests on a specific path; from there, depending on your internal configuration, you can process the requests. It's basically just a Servlet (which is also an option). You're fully in control of the request.
public class MyEndpoint extends AbstractMvcEndpoint
// can optionally implement ApplicationContextAware, ServletContextAware
// to inject configuration, etc.
{
    // AbstractMvcEndpoint has no default constructor: supply the endpoint's
    // base path and the sensitive flag (the values here are illustrative)
    public MyEndpoint() {
        super("/dynamic-endpoints-prefix", false);
    }

    // mappings inside an MvcEndpoint are relative to the base path above
    @RequestMapping("/**")
    public ModelAndView handle(HttpServletRequest request, HttpServletResponse response)
            throws Exception {
        // here you have the request and response. Can do anything.
        return null; // build whatever response you need
    }
}
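For the endpoint to be discovered it still needs to be registered as a bean; a minimal sketch, assuming a Spring Boot 1.x actuator setup (the configuration class name is illustrative):
@Configuration
public class DynamicEndpointConfig {
    @Bean
    public MyEndpoint myEndpoint() {
        return new MyEndpoint();
    }
}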
I'm trying to test my verticle, but with a mocked MongoDB (so as not to perform real DB actions during unit testing). I've tried to mock my client, but it looks like my mocks are not taken into account when I use vertx.deployVerticle().
Here's an example of my test setup:
@RunWith(VertxUnitRunner.class)
@PrepareForTest({ MongoClient.class })
public class VerticleTest {

    @Rule
    public PowerMockRule rule = new PowerMockRule();

    private Vertx vertx;
    private Integer port;

    @Before
    public void setUp(TestContext context) throws Exception {
        vertx = Vertx.vertx();
        mockStatic(MongoClient.class);
        MongoClient mongo = Mockito.mock(MongoClientImpl.class);
        when(MongoClient.createShared(any(), any())).thenReturn(mongo);

        // grab a free port for the verticle's HTTP server
        ServerSocket socket = new ServerSocket(0);
        port = socket.getLocalPort();
        socket.close();

        DeploymentOptions options = new DeploymentOptions().setConfig(new JsonObject().put("http.port", port));
        vertx.deployVerticle(TalWebVerticle.class.getName(), options, context.asyncAssertSuccess());
    }
}
And what I actually see is that MongoClient.createShared is still being called, even though I've mocked it.
What can I do in this case?
Edit 1.
It looks like the problem is that MongoClient is an interface, and PowerMockito is not able to mock static methods in this case.
I'm still trying to find a workaround.
I didn't know that MongoClient is an interface when I gave my first answer.
PowerMock doesn't support mocking static calls on interfaces (bug #510; Javassist fixed the exception, but mocking static methods on interfaces still isn't supported). It should be fixed in the next release.
I was focusing on the issue in PowerMock, not on why the mocking is needed. I agree with the answer which was provided on the mailing list:
You could work around it by creating a helper method in your own code
that returns MongoClient.createShared(). Then in your test, mock that
helper to return your mocked MongoClientImpl.
But this is not just a workaround; it is the right design solution. Mocking MongoClient is not a good approach, because you should not mock types you don't own.
So the better way is to create a custom helper which creates the MongoClient for you, and then mock that helper in the unit test. You will also need integration tests for this helper which call the real MongoClient.createShared().
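A minimal sketch of such a helper might look like this (the class name is my own; using an instance method also lets you mock it with plain Mockito, without PowerMock):
// Thin wrapper around the static factory; inject it into the verticle so a test
// can substitute a mock that returns the mocked MongoClient.
public class MongoClientProvider {
    public MongoClient createShared(Vertx vertx, JsonObject config) {
        return MongoClient.createShared(vertx, config);
    }
}
The verticle then obtains its client from the provider instead of calling MongoClient.createShared directly, and the test simply stubs the provider.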
If you don't have the opportunity to change the code (or you don't want to change code without tests), then I've created an example with a workaround showing how the PowerMock bug can be bypassed.
Main ideas:
create a custom MainMockTransformer. The transformer will transform interface classes to enable support for mocking static calls on interfaces
create a custom PowerMockRunner which will be used to add the custom MockTransformer to the transformer chain.
Please pay attention to the package names where these new classes are located; this is important. If you want to move them into other packages, you will need to add the new packages to @PowerMockIgnore.
I am trying to create a method that can be exposed through an ADO.NET data service. No matter what I do, the client cannot see the method that I am exposing. I am out of ideas. Please help:
[WebGet]
public ObjectResult<Product> GetAllProducts()
{
    ProductOrdersEntities entities = new ProductOrdersEntities();
    return entities.GetAllProducts();
}
I have also kept access open to the methods:
public static void InitializeService(IDataServiceConfiguration config)
{
    config.SetEntitySetAccessRule("*", EntitySetRights.All);
    config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
}
Still, when I create a client proxy, it cannot see the method GetAllProducts().
I was told by a developer on the Astoria team that the current code generation tool does not support generating code for service operations. By that time I had already started using the .Execute method to make an explicit HTTP request to invoke the method, and this strategy works fine; it's just not elegant or type-safe.