How to generate both Spring REST Docs and a Pact.io pact from a JUnit test?

Using Spring Boot and MockMvc, I have a test class with the following @BeforeEach method:
@BeforeEach
void setUp(WebApplicationContext context,
           RestDocumentationContextProvider restDocumentation) {
    MockMvcRestDocumentationConfigurer configurer = documentationConfiguration(restDocumentation);
    configurer.operationPreprocessors()
            .withRequestDefaults(prettyPrint())
            .withResponseDefaults(prettyPrint());
    configurer.snippets()
            .withDefaults(
                    httpRequest(),
                    httpResponse()
            );
    this.mockMvc = MockMvcBuilders.webAppContextSetup(context)
            .apply(configurer)
            .build();
}
and the following test method:
@Test
void createAdminHttpRequest() throws Exception {
    var adminDTO = HandlerTestObjectGenerator.createFixedAdminDTO();
    mockMvc.perform(
            RestDocumentationRequestBuilders
                    .post("/api/admins")
                    .content(objectMapper.writeValueAsString(adminDTO))
                    .contentType(MediaType.APPLICATION_JSON_UTF8)
    ).andExpect(status().isCreated())
            .andDo(document("create-admin",
                    preprocessRequest(),
                    preprocessResponse(),
                    requestFields(
                            usernameFieldDescriptor,
                            passwordFieldDescriptor,
                            rolesFieldDescriptor
                    ),
                    responseFields(
                            admin_adminIdFieldDescriptor,
                            admin_usernameFieldDescriptor,
                            admin_rolesFieldDescriptor
                    ),
                    SpringCloudContractRestDocs.dslContract()
            ));
}
This test works well and generates both the Spring REST Docs documentation and the Groovy contract.
But for front-end (React) testing, I need to generate a Pact.io contract, which is framework independent.
Question
So, my question is: is it possible to generate both Spring REST Docs output and a Pact.io pact using a single @Test method?
My research
What I have found so far is that pacts are generated from @Pact-annotated methods using Pact's own request builder.
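For reference, here is a minimal sketch of what such a @Pact-annotated consumer method looks like with the pact-jvm consumer DSL. The provider/consumer names and body fields below are hypothetical, and the exact import packages depend on the pact-jvm version in use:
import au.com.dius.pact.consumer.dsl.PactDslJsonBody;
import au.com.dius.pact.consumer.dsl.PactDslWithProvider;
import au.com.dius.pact.core.model.RequestResponsePact;
import au.com.dius.pact.core.model.annotations.Pact;

public class AdminConsumerPactTest {

    // Hypothetical provider/consumer names; the request body mirrors the AdminDTO fields used above.
    @Pact(provider = "admin-service", consumer = "react-frontend")
    public RequestResponsePact createAdminPact(PactDslWithProvider builder) {
        return builder
                .uponReceiving("a request to create an admin")
                    .path("/api/admins")
                    .method("POST")
                    .body(new PactDslJsonBody()
                            .stringType("username", "admin")
                            .stringType("password", "secret"))
                .willRespondWith()
                    .status(201)
                .toPact();
    }
}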
Additionally, I have found this conversation: https://gitter.im/spring-cloud/spring-cloud-contract/archives/2018/08/06 and I am trying to implement my own Maven plugin to convert Groovy contracts to pacts, but there seems to be an error in the BodyConverter class and I am getting the following exception:
java.lang.UnsupportedOperationException: use the array(String name) form
at au.com.dius.pact.consumer.dsl.PactDslJsonBody.array(PactDslJsonBody.java:673)
My Maven plugin code samples:
Initialization:
private PactContractConverter pactContractConverter = new PactContractConverter();
private ContractVerifierDslConverter contractDslConverter = new ContractVerifierDslConverter();
Conversion:
private void processFiles(List<File> contractFiles) throws Exception {
    for (File file : contractFiles) {
        logger.info("Processing " + file.getAbsolutePath());
        Collection<Contract> contracts = contractDslConverter.convertFrom(file);
        Collection<Pact> pacts = pactContractConverter.convertTo(contracts);
        String jsonPacts = mapper.writeValueAsString(pactContractConverter.store(pacts));
        File pactsFile = new File(outputDir, file.getName() + "_pact.json");
        FileUtils.writeByteArrayToFile(pactsFile, jsonPacts.getBytes());
        logger.info("Generated pact file: " + pactsFile.getAbsolutePath());
    }
}
But I am getting the exception mentioned above. The converter makes a direct call to the no-argument array() method, which throws UnsupportedOperationException. I found the other method, array(String name), but that one does not seem to be called from the converter code.
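For context, the difference between the two overloads of PactDslJsonBody is roughly the following (a minimal sketch; the "roles" attribute name is only an illustration):
import au.com.dius.pact.consumer.dsl.PactDslJsonBody;

class PactDslArrayExample {

    static void namedArrayAttribute() {
        PactDslJsonBody body = new PactDslJsonBody();
        // array(String name) adds a named array attribute to the JSON object
        body.array("roles")
                .stringValue("ADMIN")
                .closeArray();
        // The no-argument array() overload deliberately throws
        // UnsupportedOperationException("use the array(String name) form"),
        // because an attribute of a JSON object always needs a name.
        // body.array();
    }
}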

Let's begin with this statement:
But for front-end (React) testing, I need to generate a Pact.io contract, which is framework independent.
You can use Spring Cloud Contract in the polyglot world. Just use Docker (https://spring.io/blog/2018/02/13/spring-cloud-contract-in-a-polyglot-world) and https://cloud.spring.io/spring-cloud-static/spring-cloud-contract/2.2.0.RELEASE/reference/html/docker-project.html
Coming back to your question
So, my question is: is it possible to generate both Spring REST Docs output and a Pact.io pact using a single @Test method?
Let's do it in a different way... Since you already have the DSLs, I guess you would like to also get the Pact files. If you check the documentation under this section (https://cloud.spring.io/spring-cloud-static/spring-cloud-contract/2.2.0.RELEASE/reference/html/howto.html#how-to-generate-pact-from-scc) you'll see exactly the answer to your question. It's enough to add a plugin that, after your tests have generated the DSLs, converts those DSLs to something else, e.g. Pact files.
Example of using Maven plugin
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>exec-maven-plugin</artifactId>
    <version>1.6.0</version>
    <executions>
        <execution>
            <id>convert-dsl-to-pact</id>
            <phase>process-test-classes</phase>
            <configuration>
                <classpathScope>test</classpathScope>
                <mainClass>
                    org.springframework.cloud.contract.verifier.util.ToFileContractsTransformer
                </mainClass>
                <arguments>
                    <argument>
                        org.springframework.cloud.contract.verifier.spec.pact.PactContractConverter
                    </argument>
                    <argument>${project.basedir}/target/pacts</argument>
                    <argument>
                        ${project.basedir}/src/test/resources/contracts
                    </argument>
                </arguments>
            </configuration>
            <goals>
                <goal>java</goal>
            </goals>
        </execution>
    </executions>
</plugin>
If you modify ${project.basedir}/src/test/resources/contracts to point to the location where the DSLs got dumped by your REST Docs tests, you'll get the Pact files dumped to ${project.basedir}/target/pacts. Below is a similar example for Gradle:
task convertContracts(type: JavaExec) {
    main = "org.springframework.cloud.contract.verifier.util.ToFileContractsTransformer"
    classpath = sourceSets.test.compileClasspath
    args("org.springframework.cloud.contract.verifier.spec.pact.PactContractConverter",
            "${project.rootDir}/build/pacts", "${project.rootDir}/src/test/resources/contracts")
}

Related

Integrating Spring-Shell with the MongoDb driver

Is it me, or are the MongoDb drivers and Spring-Shell deeply incompatible? To start, I'm not talking about the Spring-Data-Mongo stuff; I'm talking about the actual Java client that the MongoDb folks put out.
My Pom is as follows:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.1.7.RELEASE</version>
</parent>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.shell</groupId>
        <artifactId>spring-shell-starter</artifactId>
        <version>2.0.1.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongodb-driver-sync</artifactId>
        <version>3.11.0</version>
    </dependency>
</dependencies>
If I try to use the MongoDb client from the Spring shell, I consistently get NoClassDefFound errors all over the place. A simplified, bare-bones shell method is as follows:
import com.mongodb.MongoClientSettings;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import org.bson.codecs.configuration.CodecRegistry;
import org.bson.codecs.pojo.PojoCodecProvider;
import org.springframework.shell.standard.ShellComponent;
import org.springframework.shell.standard.ShellMethod;
import java.util.Date;
import static org.bson.codecs.configuration.CodecRegistries.fromProviders;
import static org.bson.codecs.configuration.CodecRegistries.fromRegistries;

@ShellComponent
public class AuditCommands {

    @ShellMethod("Just testing here")
    public int cube(int number)
    {
        return number * number * number;
    }

    @ShellMethod("Sends a test document to mongo")
    public void mgo()
    {
        System.out.println("Hello there. Doing some mongo stuff");
        //MongoClient mongoClient = MongoClients.create();
        MongoClient mongoClient = MongoClients.create("mongodb://whateversite:12345");
        // New up a registry to automatically handle pojos
        CodecRegistry pojoCodecRegistry = fromRegistries(MongoClientSettings.getDefaultCodecRegistry(),
                fromProviders(PojoCodecProvider.builder().automatic(true).build()));
        // Grab database instance
        MongoDatabase database = mongoClient.getDatabase("MyDb");
        database = database.withCodecRegistry(pojoCodecRegistry);
        MongoCollection<Audit> collection = database.getCollection("MyCollection", Audit.class);
        Audit audit = new Audit();
        audit.setAuditId(1);
        audit.setAuditTypeId(5);
        audit.setCreatedOn(new Date());
        audit.setMessage("Making mongo great again..");
        collection.insertOne(audit);
        System.out.println("Done..!!..");
    }
}
I receive the following error if I try to execute the "mgo" ShellMethod in my example:
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2019-09-02 18:10:54.634 ERROR 18848 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :
***************************
APPLICATION FAILED TO START
***************************
Description:
An attempt was made to call a method that does not exist. The attempt was made from the following location:
com.mongodb.client.internal.MongoClientImpl.<init>(MongoClientImpl.java:67)
The following method did not exist:
com.mongodb.MongoClientSettings.getAutoEncryptionSettings()Lcom/mongodb/AutoEncryptionSettings;
The method's class, com.mongodb.MongoClientSettings, is available from the following locations:
jar:file:/C:/Users/xxxxx/.m2/repository/org/mongodb/mongodb-driver-core/3.8.2/mongodb-driver-core-3.8.2.jar!/com/mongodb/MongoClientSettings.class
It was loaded from the following location:
file:/C:/Users/xxxxx/.m2/repository/org/mongodb/mongodb-driver-core/3.8.2/mongodb-driver-core-3.8.2.jar
Action:
Correct the classpath of your application so that it contains a single, compatible version of com.mongodb.MongoClientSettings
Process finished with exit code 1
If I remove Spring-Shell and Spring-Boot, that MongoDb code works fine.
So what gives here? Am I missing some essential point here or is this stuff essentially broken? I'm not a Java/Spring native, so I'm sure it won't come as a surprise when I say that connecting to Mongo and throwing a couple of documents around comes off muuuuuuch cleaner in C#, Python, and node. (And yes I know I can use spring-data-mongo, but that just seems like a really opinionated API for someone coming from a different language background)
Okay, I'm going to answer my own question here because I've learned a little more about this.
I wound up giving up on trying to make Spring Boot exclude all the MongoDB dependencies that I didn't want and didn't ask for. So this wasn't really a Spring Shell issue. I wound up just matching the version of the driver that Spring Boot is using in my own pom.
As in..
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>mongo-java-driver</artifactId>
    <version>3.8.2</version>
</dependency>
Once you do that, you get a bunch of "connecting to localhost:27017..." issues. You can fix this by excluding the ludicrous MongoAutoConfiguration that Spring defaults to. More specifically, you have to go with a parameterized @SpringBootApplication annotation like this:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration;

@SpringBootApplication(exclude = MongoAutoConfiguration.class)
public class Main {
    public static void main(String[] args)
    {
        SpringApplication.run(Main.class);
    }
}
As I mentioned earlier, I'm not a Java native, so my opinions are heavily flavored by the frameworks I grew up on. But the idea that Spring just automatically tries to connect to a potentially networked resource is completely asinine to me. It's one thing if I'm actually trying to use spring-mongodb and its super-opinionated pattern, but I'm not in my case. The equivalent would be if I pulled the Dapper assemblies from NuGet and they tried to log into the nearest local instance of SQL Server just for the heck of it. Very sketchy value proposition at best, and surface area for some sort of creative exploit at worst. I just don't see what "I" get out of this behavior.

Spring Cloud Stream unable to detect message router

I'm trying to set up a simple cloud stream Sink but keep running into the following errors.
I've tried several binders and they all keep giving the same error.
"SEVERE","logNameSource":"org.springframework.boot.diagnostics.LoggingFailureAnalysisReporter","message":"
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of method binderAwareRouterBeanPostProcessor in org.springframework.cloud.stream.config.BindingServiceConfiguration required a bean of type '[Lorg.springframework.integration.router.AbstractMappingMessageRouter;' that could not be found.
Action:
Consider defining a bean of type '[Lorg.springframework.integration.router.AbstractMappingMessageRouter;' in your configuration.
I'm trying to use a simple Sink to log an incoming message from a Kafka topic:
@EnableBinding(Sink.class)
public class ReadEMPMesage {

    private static Logger logger = LoggerFactory.getLogger(ReadEMPMesage.class);

    public ReadEMPMesage() {
        System.out.println("In constructor");
    }

    @StreamListener(Sink.INPUT)
    public void loggerSink(String ccpEvent) {
        logger.info("Received" + ccpEvent);
    }
}
and my configuration is as follows
# Test consumer properties
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.group-id=testEmbeddedKafkaApplication
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
# Binding properties
spring.cloud.stream.bindings.output.destination=testEmbeddedOut
spring.cloud.stream.bindings.input.destination=testEmbeddedIn
spring.cloud.stream.bindings.output.producer.headerMode=raw
spring.cloud.stream.bindings.input.consumer.headerMode=raw
spring.cloud.stream.bindings.input.group=embeddedKafkaApplication
and my pom
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-stream-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream</artifactId>
</dependency>
TL;DR - check your version of Spring Boot and try upgrading it a few minor revs.
I ran into this problem on a project after upgrading from Spring Cloud DALSTON.RELEASE to Spring Cloud Edgware.SR4 -- it was strange because other projects worked fine but there was a single one that didn't.
After further investigation, I realized that the troublemaker project was using Spring Boot 1.5.3.RELEASE while the others were using 1.5.9.RELEASE.
After upgrading Spring Boot to 1.5.9.RELEASE, things seemed to start working.

Is it possible to write JUnit tests that are agnostic to your JAX-RS implementation?

I wrote a REST web service using JAX-RS that knows nothing about the specific JAX-RS implementation I chose. I happen to be using TomEE, which means my JAX-RS implementation is Apache CXF.
I'd like to write unit tests for the web service that also know nothing about the JAX-RS implementation. Is this possible? So far every example I've found involves using classes from a specific JAX-RS implementation (JAXRSClientFactory for Apache CXF, the Jersey Test Framework, etc.).
I've started experimenting with tomee-embedded and am able to test my EJBs, but it doesn't seem to start up the REST services.
My solution was to use Arquillian paired with an Embedded TomEE. Arquillian provides a ton of functionality but I'm only using it to start/stop the Embedded TomEE. Therefore, all I needed to do was add this to my pom.xml:
<dependency>
    <groupId>org.apache.openejb</groupId>
    <artifactId>arquillian-tomee-embedded</artifactId>
    <version>${tomee.version}</version>
    <scope>test</scope>
</dependency>
Then I could write a JUnit test with a little extra Arquillian stuff and plain JAX-RS:
@RunWith(Arquillian.class)
public class MyServiceIT {

    @ArquillianResource
    private URL webappUrl;

    @Deployment()
    public static WebArchive createDeployment() {
        return ShrinkWrap.create(WebArchive.class)
                .addClasses(MyService.class)
                .addAsWebInfResource("META-INF/persistence.xml") // Refers to src/main/resources/META-INF/persistence.xml
                .addAsWebInfResource("test-resources.xml", "resources.xml") // Refers to src/test/resources/test-resources.xml
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Test
    public void randomTest() throws URISyntaxException {
        // Get data from the web service.
        Client client = ClientBuilder.newClient();
        WebTarget webTarget = client.target(webappUrl.toURI().resolve("myentity"));
        Response response = webTarget.request(MediaType.APPLICATION_JSON).get();
        int status = response.getStatus();
        List<MyEntity> myEntities = response.readEntity(new GenericType<List<MyEntity>>() {});
        // Perform some tests on the data
    }
}

Struggling with Bean Validation within a JAX-RS service running in a Glassfish container

I'm working on a simple Java EE Application, using Glassfish.
Everything runs fine, my Entity and Session Beans are working.
I also created some JAX RS Resources to invoke the Session Beans, which also works fine.
Now I'm struggling with Bean Validation.
Let's have a look at a little snippet:
@GET
@Path( "{portaluser}" )
@NotNull
public PortaluserResponse load( @PathParam( "portaluser" ) @NotBlank @Email final String strEmail )
{ ... some implementation ... }
My Jersey Application, which of course extends ResourceConfig, looks like this:
public JerseyApplication()
{
    packages( PortaluserService.class.getPackage().getName() );
    register( JacksonFeature.class );
    register( ValidationConfig.class );
    property( ServerProperties.BV_SEND_ERROR_IN_RESPONSE, true );
}
In my pom.xml I included the following dependency:
<dependency>
    <groupId>org.glassfish.jersey.ext</groupId>
    <artifactId>jersey-bean-validation</artifactId>
    <version>2.9</version>
</dependency>
If I invoke the REST service with nonsense data, the validation doesn't kick in.
Why is that? I expect to get a validation error.
I found a Jersey sample which covers the bean validation stuff, and my REST resource works within that project.
The only difference is that the Jersey sample doesn't run in Glassfish, but in Jetty.
Can it be that Jersey bean validation doesn't work when running in a Java EE container?
I would appreciate some hints.

Arquillian Persistence Extension doesn't work

I'm trying to get my web service tested. This web service uses EJB with JPA to retrieve its data, so I want to use the Arquillian Persistence Extension to get this done.
This is my Arquillian test class:
@RunWith(Arquillian.class)
public class PersonWebServiceIT {

    private PersonWebService service;

    @Deployment(testable = false)
    public static Archive<?> createDeployment() {
        return ShrinkWrap
                .create(ZipImporter.class, "test.ear")
                .importFrom(new File("simple-webservice-ear-1.0.0-SNAPSHOT.ear"))
                .as(EnterpriseArchive.class);
    }

    @Test
    @UsingDataSet("dataset.yml")
    @SneakyThrows
    public void testFindPersons(@ArquillianResource final URL deploymentUrl) {
        loadService(deploymentUrl);
        Assert.assertEquals(2, service.findPersons().size());
    }

    private void loadService(final URL deploymentUrl) {
        // load webservice
    }
}
This is my datasets/dataset.yml file:
person:
  - id: 1
    firstName: "stijn"
  - id: 2
    firstName: "cremers"
my arquillian.xml:
<?xml version="1.0" encoding="UTF-8"?>
<arquillian xmlns="http://jboss.com/arquillian" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="
                http://jboss.org/schema/arquillian
                http://jboss.org/schema/arquillian/arquillian-1.0.xsd">
    <extension qualifier="persistence">
        <property name="defaultDataSource">java:/DefaultDS</property>
    </extension>
</arquillian>
My test data never gets loaded. I even tried with a wrongly formatted YAML file, but even then I get no error.
The problem is with your test run mode. When you define your @Deployment with the attribute testable = false, all tests are run in client mode, i.e. they are not run in-container.
The Arquillian Persistence Extension (as of 1.0.0.Alpha5) does not support running tests in client mode; only in-container tests are supported for now. Support for client-mode tests in APE may come in a future release.
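As a rough sketch of the in-container variant, based on the deployment from the question (whether PersonWebService can really be injected with @EJB depends on how it is packaged, so treat that part as illustrative), dropping testable = false looks like this:
import java.io.File;

import javax.ejb.EJB;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.arquillian.persistence.UsingDataSet;
import org.jboss.shrinkwrap.api.Archive;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.importer.ZipImporter;
import org.jboss.shrinkwrap.api.spec.EnterpriseArchive;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(Arquillian.class)
public class PersonWebServiceIT {

    // Injected by the container, because the test itself now runs in-container.
    @EJB
    private PersonWebService service;

    // No testable = false: the default (testable = true) repackages the test class into
    // the deployment and runs it inside the container, which is what the Persistence Extension needs.
    @Deployment
    public static Archive<?> createDeployment() {
        return ShrinkWrap
                .create(ZipImporter.class, "test.ear")
                .importFrom(new File("simple-webservice-ear-1.0.0-SNAPSHOT.ear"))
                .as(EnterpriseArchive.class);
    }

    @Test
    @UsingDataSet("dataset.yml")
    public void testFindPersons() {
        // dataset.yml has been inserted into the default data source before this method runs
        Assert.assertEquals(2, service.findPersons().size());
    }
}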
<property name="defaultDataSource">java:/DefaultDS</property>
You're specifying a data source which is defined on the server.
In client mode, test cases are run outside the container (i.e. in another JVM), so the Persistence Extension cannot make use of that data source, and hence you cannot use the Arquillian Persistence Extension in client mode.
If there were a way to specify a JDBC URL instead of a data source name in the arquillian.xml file, then you could use the Persistence Extension.