Griffon integration tests with JPA

I'm writing a griffon application with JavaFX and the JPA plugin. I have a service I'd like to test - this service makes use of the JPA plugin (withJpa {...}) and it's this database access that I want to test.
So I want to write this test so it inserts some data, then checks that the service produces the right answer, thus verifying the SQL query is correct.
I've written a simple test:
class ReportServiceTests extends GriffonUnitTestCase {
    GriffonApplication app

    public void testStats() {
        println app.getServices()
        println app.getControllers()
    }
}
but I cannot get hold of a valid service - both the println statements above produce "[:]".
How do I get hold of the 'ReportService' instance and exercise it against the database? I don't want to mock the database interaction.
Thanks.

There's no need to mock the database. As explained in http://griffon.codehaus.org/guide/latest/guide/testing.html#integrationTesting, applications reach the INITIALIZE phase during integration testing. Addons get initialized during this phase. Services, on the other hand, are initialized lazily as they are pulled in by MVC members when those are instantiated; they are not instantiated out of the box just because you call app.getServices(). However, you can instruct the application to eagerly instantiate all services, which will make your code work as expected; just add the following flag to Config.groovy:
griffon.services.eager.instantiation = true
More info on services can be found at http://griffon.codehaus.org/guide/latest/guide/controllersAndServices.html#services

Related

How to create a JpaRepository but NOT using dependency injection

In my application I am currently creating a JpaRepository via the usual mechanism, i.e. by defining an interface like so:
public interface HistoryInfoRepository extends JpaRepository<HistoryInfo, Long> { }
This all works fine, BUT if the DB is not available or not accessible when starting up the application, then the creation of the repo bean fails, the injection into the respective service (which receives that repo as an autowired constructor argument) fails, and as a consequence my application's main class:
SpringApplication.run(Application.class, args);
fails as well, i.e. the entire application just dies with a gigantic stack trace.
I want to make this situation a bit more user-friendly, in that the application at least comes up far enough to display some (hopefully) helpful error message with an explanation and some hints about what should be checked.
That would require NOT autowiring the JpaRepository but creating it programmatically, e.g. in said service's constructor, wrapped in some try-catch construct.
I don't want to give up the convenience that the JpaRepository mechanism provides, but how can one create such a JpaRepository programmatically when only the interface is defined and everything else is done automagically by Spring?
Is there some API-call to create such a JpaRepository for a given interface?
I would like to see something like this:
public HistoryInfoService( /* HistoryInfoRepository historyRepo */ ) {
    ...
    try {
        historyRepo = createJpaRepositoryFromInterface(HistoryInfoRepository.class);
    } catch (Exception ex) {
        // inform user about missing or non-reachable DB server and suggest remedies...
    }
    ...
}
Here I commented out the constructor argument that is normally provided via auto-wiring; instead I want to create the JpaRepository using some method call. That would allow me to react properly to a failed repository creation, as indicated by the comment in the catch clause.
I searched for something like that but I must have been using the wrong search terms and found nothing so far. Any hints?
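For illustration only (this is not from the original thread): Spring Data JPA does expose a programmatic entry point, JpaRepositoryFactory, which builds the repository proxy for a given interface from an EntityManager. A minimal sketch along those lines might look like the following; the service shape and comments are assumptions, only JpaRepositoryFactory and getRepository come from the Spring Data API.
import javax.persistence.EntityManager; // jakarta.persistence on newer Spring Boot versions
import org.springframework.data.jpa.repository.support.JpaRepositoryFactory;
import org.springframework.stereotype.Service;

@Service
public class HistoryInfoService {

    // Created manually instead of being autowired, so a failure can be handled here.
    private HistoryInfoRepository historyRepo;

    public HistoryInfoService(EntityManager entityManager) {
        try {
            // JpaRepositoryFactory builds the repository proxy for the given interface.
            historyRepo = new JpaRepositoryFactory(entityManager)
                    .getRepository(HistoryInfoRepository.class);
        } catch (Exception ex) {
            // inform user about missing or non-reachable DB server and suggest remedies...
        }
    }
}
Note that if the DataSource or EntityManagerFactory itself cannot be created at startup, the failure happens before this constructor runs, so a sketch like this only covers part of the scenario described above.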

How to override main application.yml with testing application.yml when testing REST API in an Autowired service class?

I'm writing automated tests using TestNG for the REST API of my application. The application has a RestController which contains an @Autowired service class. When the REST endpoint is called with an HTTP GET request, the service looks into a storage directory for XML files, transforms their contents into objects and stores them in a database. The important thing for my question is that the path to the storage directory is stored in /src/main/resources/application.yml (source.storage) and imported via a @Value annotation.
Now, I have the source.storage property also in src/test/resources/application.yml, pointing to a different directory within src/test where I store my testing XML files, and I import that value into my test class with a @Value annotation again. My test calls the REST endpoint with an HTTP GET. However, it seems that the service still draws the source.storage property from the main application.yml, while I would like that value overridden by the one in the test application.yml file. In other words, the service tries to import XML files from the application storage directory rather than from my testing storage.
@ActiveProfiles and @TestPropertySource do not seem to work for me. Scanning the main application.yml for its storage property is not an option, as in the end the application.yml will be drawn from a Spring Cloud Config, and I would not know where the main application.yml would be located.
Is there a way I could make the @Autowired service draw the source.storage property from the test application.yml rather than from the main one?
Any advice would be appreciated.
Thanks, Petr
Well, it really depends on what you're trying to build: some sort of unit test of the controller or, more likely, an integration test. Both approaches are explained in this tutorial.
If you're trying to write an integration test, which seems a bit more likely from your question, then @ActiveProfiles or @TestPropertySource should work for you. I would suggest using profiles; in a growing application with a lot of properties it is a bit more convenient to just replace some of the properties for testing. Below is a setup that worked for me when writing integration tests for controller endpoints:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles("test")
@FixMethodOrder(MethodSorters.NAME_ASCENDING)
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_CLASS)
public class AreaControllerTest {

    @Autowired
    TestRestTemplate rest;

    @MockBean
    private JobExecutor jobExecutor;

    @Test
    public void test01_List() {
        //
    }

    @Test
    public void test02_Get() {
        //
    }

    // ...
}
There are several important things.
The testing properties are in src/test/resources/application-test.properties and merge with the ones in application.properties, as the @ActiveProfiles("test") annotation suggests.
Also essential is @RunWith(SpringRunner.class), which is JUnit specific; for a TestNG alternative please refer to this SO question (a rough sketch follows after these notes).
Finally, the @SpringBootTest annotation will start the whole application context.
@FixMethodOrder and @DirtiesContext are further setup of the test case and are not strictly necessary.
Notice also the @MockBean annotation; in this case we did not want to use the real implementation of JobExecutor, so we replaced it with a mock.
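Since the question uses TestNG rather than JUnit, here is a rough, untested sketch of what the same integration-test setup might look like with TestNG, extending Spring's AbstractTestNGSpringContextTests instead of using SpringRunner (the class name and endpoint path are placeholders):
import static org.testng.Assert.assertNotNull;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.testng.AbstractTestNGSpringContextTests;
import org.testng.annotations.Test;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles("test") // picks up application-test.properties / application-test.yml
public class AreaControllerTestNGTest extends AbstractTestNGSpringContextTests {

    @Autowired
    private TestRestTemplate rest;

    @Test
    public void listEndpointResponds() {
        // Placeholder endpoint; assert on the response body as needed.
        String body = rest.getForObject("/areas", String.class);
        assertNotNull(body);
    }
}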
If you want to write unit tests that just check the logic of the controller and the service on their own, then you need two test classes, each testing the respective class. Testing the service should be a standard unit test; testing the controller is a bit trickier and leans more toward a partial integration test. If this is your case, I would recommend the MockMvc approach explained in the above-mentioned tutorial. A small snippet from there:
@RunWith(SpringRunner.class)
@WebMvcTest(GreetingController.class)
public class WebMockTest {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private GreetingService service;

    @Test
    public void greetingShouldReturnMessageFromService() throws Exception {
        when(service.greet()).thenReturn("Hello Mock");
        this.mockMvc.perform(get("/greeting")).andDo(print()).andExpect(status().isOk())
                .andExpect(content().string(containsString("Hello Mock")));
    }
}
Notice the @MockBean annotation, which mocks the service and lets you specify the mock's behaviour. This point is critical, because this sort of test does not load the whole application context, only the MVC context, so the services are not available. Again, as in the integration test, the @RunWith(SpringRunner.class) annotation is essential. Finally, @WebMvcTest(GreetingController.class) starts only the MVC context of the GreetingController class and not the whole application.
You can also try supplying the property directly to the Spring Boot test:
@SpringBootTest(properties = {"source.storage=someValue"})
Regarding the application picking up the wrong property source, you should also check whether your application is being built properly.

No event context active - RESTeasy, Seam

I'm trying to add a RESTful web service with RESTeasy to our application running on JBoss 7.x, using Seam 2.
I wanted to use as little Seam as possible, but I need it for dependency injection.
My REST endpoints are as follows:
#Name("myEndpoint")
#Stateless
#Path("/path")
#Produces(MediaType.APPLICATION_JSON+"; charset=UTF-8")
public class MyEndpoint {
#In private FooService fooService;
#GET
#Path("/foo/{bar}")
public Response foobar(#CookieParam("sessionId") String sessionId,
#PathParam("bar") String bar)
{ ... }
}
I'm using a class extending Application. There is no XML config.
I can use the web service methods and they work, but I always get an IllegalStateException:
Exception processing transaction Synchronization after completion: java.lang.IllegalStateException: No event context active
Complete StackTrace
I did try everything in the documentation, but I can't get rid of it. If I leave out the @Stateless annotation, no injection is done at all. Adding @Scope doesn't do jack. Accessing the service via seam/resource/ doesn't even work (even without the Application class with @ApplicationPath).
It goes away if I don't use dependency injection, but instead add the following to each and every method:
fooService = Component.getInstance("fooService");
Lifecycle.beginCall();
...
Lifecycle.endCall();
which isn't really a good solution. Nah, doesn't work either...
I have resolved the issue. For some reason (still not sure why, maybe because I tried to use annotations and code exclusively and no XML config), my REST service was available under a "non-standard" URL.
Usually it'd be something like "/seam/resources/rest".
Anyway, if you have a "custom" path, Seam doesn't know it should inject a context. You need to add <web:context-filter url-pattern="something" /> to your components.xml.
Specifically, we already had this tag, but with the attribute regex-url-pattern, and I extended it to match the REST URL.

How do I inject Db into Service classes when unit testing ServiceStack.OrmLite with NUnit?

I'm interested in writing unit tests (using NUnit) for some service classes created using ServiceStack, using the "New API" (inheriting from ServiceStack.ServiceInterface.Service). The services' Db property is properly auto-wired when hosted in an ASP.NET application using AppHost, but I can't seem to figure out the proper technique when running outside of that environment. I see a variety of testing-related namespaces and classes in ServiceStack but can't find a clear example where a service's Db property is getting injected, as opposed to simply setting up a connection factory directly and then calling the various IDbConnection extension methods (Insert, Select, etc.).
I have tried having my test class inherit from ServiceStack.ServiceInterface.Testing.TestBase and overriding its Configure method to register an IDbConnectionFactory (using ":memory:"), as well as setting OrmLiteConfig.DialectProvider = SqliteDialect.Provider; in my TestFixtureSetUp, but I continue to get a NullReferenceException when calling my service methods (at ServiceStack.ServiceInterface.Service.get_Db()). It seems that the Funq container is not auto-wiring anything.
SQLite itself is properly set up, which I'm able to confirm with simpler unit tests that bypass my service classes and simply do direct IDbConnection calls.
What am I missing?
Edit
It appears that unit-testing ServiceStack services requires the existence of a host and a client, although it looks like there are ways to set this up to avoid the serialization cost (using the DirectServiceClient, as shown here) -- though I haven't succeeded in getting that to work in my case. I managed to get this working using an AppHostHttpListenerBase approach (see here) although it's more of an integration test than a unit one (and is accordingly slower).
The docs on Testing show a couple of different approaches for injecting dependencies.
If you look at the implementation of the base Service, it just creates Db from the IDbConnectionFactory:
private IDbConnection db;
public virtual IDbConnection Db
{
    get { return db ?? (db = TryResolve<IDbConnectionFactory>().OpenDbConnection()); }
}
The IDbConnectionFactory is in turn resolved from the local or global IResolver IOC container:
public static IResolver GlobalResolver { get; set; }

private IResolver resolver;

public virtual IResolver GetResolver()
{
    return resolver ?? GlobalResolver;
}

public virtual T TryResolve<T>()
{
    return this.GetResolver() == null
        ? default(T)
        : this.GetResolver().TryResolve<T>();
}
So to inject your own dependencies (when using the Service base class) you just need to configure an IAppHost with the dependencies your services need, which you can do with:
using (var appHost = new BasicAppHost {
    ConfigureContainer = c => {
        c.Register<IDbConnectionFactory>(new ...);
    }
}.Init())
{
    //...
}
You can then resolve an auto-wired instance of your service from the appHost, e.g.:
var service = appHost.ResolveService<MyService>();
This will autowire all dependencies configured in your AppHost; you can also add your own ad hoc, test-specific dependencies via normal property access, e.g.:
service.MyDependency = new Mock<IMyDependency>().Object;
From then on you can just call and test your C# class methods as per usual:
var response = service.Get(new RequestDto { ... });
Assert.That(response.Result, Is.EqualTo("Expected Result from DB"));

Dynamic configs with Structuremap

Here's what I am trying to accomplish with Structuremap.
On each web request, database connection strings and web service URLs used in our clients will vary based on some business logic. Currently, our SQL and web service client implementations receive the configs in their constructors.
I wanted to use profiles, only to discover that it is not possible to use them per request.
In our team, we're having a debate over two solutions:
1- Pass a config factory into the registry that can resolve which configuration to use when the container needs to instantiate something.
The problem I see is that we might have to use HttpContext.Items, as most of the app objects are not instantiated by StructureMap, and it seems hard to get the current request context from within the factory.
2- Instantiate a container for each different configuration and decide which container to use depending on the business logic.
The problems I see are the load time, the memory consumption and maybe the lifecycles of objects. So I don't really find any hard problem here; it just feels wrong to me to have multiple containers.
1- Do you see other problems?
2- Any better idea?
3- Which one would you choose?
Thank you
EDIT
and it seems hard to get the current request context from within the factory.
I don't mean HttpContext, I mean the request data. For this app, it is a WCF request object.
it seems hard to get the current request context from within the factory.
Not sure why it seems that way. Wouldn't the following do the trick?
ObjectFactory.Configure(config => {
    config.For<HttpContextBase>()
          .Use(() => { return new HttpContextWrapper(HttpContext.Current); });
    config.For<Service>().Use<Service>();
});

var service = ObjectFactory.GetInstance<Service>();
public class ConfigurationFactory
{
    public ConfigurationFactory(System.Web.HttpContextBase context)
    {
    }
}

public class Service
{
    public Service(ConfigurationFactory configuration)
    {
    }
}