Is it possible to publish Spring Cloud Contract Producer verification to a Pact broker?
You would have to convert the DSL to Pact files and then push those. So technically that is possible.
Update: We describe how to do this in the documentation - https://cloud.spring.io/spring-cloud-contract/reference/html/howto.html#how-to-generate-pact-from-scc
Since on SO it seems that a "check the docs" answer is not an accepted one, let me just copy and paste the documentation.
How Can I Generate Pact, YAML, or X files from Spring Cloud Contract Contracts?
Spring Cloud Contract comes with a ToFileContractsTransformer class that lets you dump contracts as files for the given ContractConverter. It contains a static void main method that lets you execute the transformer as an executable. It takes the following arguments:
argument 1 : FQN: Fully qualified name of the ContractConverter (for example, PactContractConverter). REQUIRED.
argument 2 : path: Path where the dumped files should be stored. OPTIONAL — defaults to target/converted-contracts.
argument 3 : path: Path where the contracts should be searched for. OPTIONAL — defaults to src/test/resources/contracts.
After executing the transformer, the Spring Cloud Contract files are processed and, depending on the provided FQN of the ContractTransformer, the contracts are transformed to the required format and dumped to the provided folder.
The following example shows how to configure Pact integration for both Maven and Gradle:
Maven
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.6.0</version>
  <executions>
    <execution>
      <id>convert-dsl-to-pact</id>
      <phase>process-test-classes</phase>
      <configuration>
        <classpathScope>test</classpathScope>
        <mainClass>org.springframework.cloud.contract.verifier.util.ToFileContractsTransformer</mainClass>
        <arguments>
          <argument>org.springframework.cloud.contract.verifier.spec.pact.PactContractConverter</argument>
          <argument>${project.basedir}/target/pacts</argument>
          <argument>${project.basedir}/src/test/resources/contracts</argument>
        </arguments>
      </configuration>
      <goals>
        <goal>java</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Gradle
task convertContracts(type: JavaExec) {
    main = "org.springframework.cloud.contract.verifier.util.ToFileContractsTransformer"
    classpath = sourceSets.test.compileClasspath
    args("org.springframework.cloud.contract.verifier.spec.pact.PactContractConverter",
            "${project.rootDir}/build/pacts", "${project.rootDir}/src/test/resources/contracts")
}
test.dependsOn("convertContracts")
Once the files have been generated in build/pacts or target/pacts, you can use the Pact Gradle or Maven plugin to upload those files to the broker.
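For the Maven side, a minimal sketch of such an upload step could look like the following; the plugin version and broker URL here are placeholders and should be checked against the Pact JVM documentation:
<plugin>
  <!-- Sketch only: publishes the pacts generated above to a Pact broker.
       Version and broker URL are assumptions / placeholders. -->
  <groupId>au.com.dius.pact.provider</groupId>
  <artifactId>maven</artifactId>
  <version>4.1.11</version>
  <configuration>
    <!-- directory produced by the ToFileContractsTransformer execution above -->
    <pactDirectory>${project.basedir}/target/pacts</pactDirectory>
    <pactBrokerUrl>http://your-pact-broker:9292</pactBrokerUrl>
  </configuration>
</plugin>
With something like that in place, the plugin's publish goal (mvn pact:publish), run after the convert-dsl-to-pact execution, should push the generated files to the broker.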
Related
When trying to build our project from within Eclipse I keep getting the following error:
Execution generate-sources of goal
org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java failed: A required
class was missing while executing
org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java:
javax/xml/bind/annotation/adapters/HexBinaryAdapter
The reason for that is that - while we still compile for a Java 8 target environment - the tool chain (i.e. Eclipse, M2E (Eclipse's Maven plugin), Maven, and CXF) is executed using Java 11.
In Java 9+, javax.xml.bind is no longer part of the JDK's default runtime, hence the class is missing when the plugin tries to start up. Elsewhere I found that one can enable it by specifying an "--add-modules java.xml.bind" JVM option.
I tried adding that option to the MAVEN_OPTS environment variable but that is apparently ignored when M2E starts up Maven (and with it the CXF plugin) in a separate VM.
Next I tried to specify that option in the plugin's configuration in the pom.xml like so:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.cxf</groupId>
      <artifactId>cxf-codegen-plugin</artifactId>
      <version>${cxf.version}</version>
      <executions>
        <execution>
          <id>generate-sources</id>
          <phase>generate-sources</phase>
          <configuration>
            <fork>true</fork>
            <additionalJvmArgs>--add-modules java.xml.bind</additionalJvmArgs>
            ...
          </configuration>
          <goals>
            <goal>wsdl2java</goal>
          </goals>
        </execution>
      </executions>
      ...
... but that also didn't fly. :-(
Does anyone have an idea how and where one can specify that option, or how I can make the former standard javax classes available to a Maven plugin running under Java 9+ (when executed from Eclipse M2E)?
Just in case: this is NOT an Eclipse or M2E issue! Even when I start Maven on the command line using Java 9+ I get:
...
[ERROR] Failed to execute goal org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java (generate-sources) on project my_project: Execution generate-sources of goal org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java failed: A required class was missing while executing org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java: javax/xml/bind/annotation/adapters/HexBinaryAdapter
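One workaround that might help here (a sketch only, not verified against this setup; the artifact versions are assumptions) is to put the missing JAXB classes on the plugin's own classpath by declaring them as dependencies of the plugin itself, so the plugin no longer relies on the JDK providing javax.xml.bind:
<plugin>
  <groupId>org.apache.cxf</groupId>
  <artifactId>cxf-codegen-plugin</artifactId>
  <version>${cxf.version}</version>
  <!-- Sketch: supply the javax.xml.bind classes that Java 11 no longer ships.
       Versions are assumptions and may need adjusting. -->
  <dependencies>
    <dependency>
      <groupId>javax.xml.bind</groupId>
      <artifactId>jaxb-api</artifactId>
      <version>2.3.1</version>
    </dependency>
    <dependency>
      <groupId>com.sun.xml.bind</groupId>
      <artifactId>jaxb-impl</artifactId>
      <version>2.3.1</version>
    </dependency>
    <dependency>
      <groupId>com.sun.xml.bind</groupId>
      <artifactId>jaxb-core</artifactId>
      <version>2.3.0.1</version>
    </dependency>
  </dependencies>
  <!-- executions as shown above -->
</plugin>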
I have tests for two consumers and a producer working fine offline, but the consumer tests fail when I change them to retrieve the stubs from Artifactory.
This is the code for working offline:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = ContractTestConfiguration.class, webEnvironment = SpringBootTest.WebEnvironment.NONE)
@AutoConfigureStubRunner(ids = {"com.mycompany:service-name:+:stubs"}, workOffline = true)
@ImportAutoConfiguration(org.springframework.cloud.stream.test.binder.TestSupportBinderAutoConfiguration.class)
@DirtiesContext
public class MyContractTest
And this is for online:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = ContractTestConfiguration.class, webEnvironment = SpringBootTest.WebEnvironment.NONE)
@AutoConfigureStubRunner(ids = {"com.mycompany:service-name:+:stubs"}, repositoryRoot = "https://artifactory.companyname.com/artifactory/artifacts-snapshot-local")
@ImportAutoConfiguration(org.springframework.cloud.stream.test.binder.TestSupportBinderAutoConfiguration.class)
@DirtiesContext
public class MyContractTest {
I get this error:
Exception occurred while trying to download a stub for group [com.mycompany] module [service-name] and classifier [stubs] in [remote0 (https://artifactory.mycompany.com/artifactory/artifacts-snapshot-local, default, releases+snapshots)]
org.eclipse.aether.resolution.ArtifactResolutionException: Could not find artifact com.mycompany.domain:service-name:jar:stubs:1.6.0-SNAPSHOT
I have looked in Artifactory and in https://artifactory.mycompany.com/artifactory/artifacts-snapshot-local and the stubs jar appears there. I have done a mvn install of the producer and when I run the tests again I get this error "The artifact was found in the local repository but you have explicitly stated that it should be downloaded from a remote one".
I have also tried adding to the consumers a dependency on the stubs of the producer but I get similar errors. And I would prefer to avoid it because it would add a dependency with the specific version of the producer:
<dependency>
  <groupId>com.companyname</groupId>
  <artifactId>service-name</artifactId>
  <classifier>stubs</classifier>
  <version>1.6.0-SNAPSHOT</version>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <groupId>*</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>
I have added this to the POM file of the producer:
<spring.cloud.contract.verifier.skip>true</spring.cloud.contract.verifier.skip>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <id>stub</id>
      <goals>
        <goal>single</goal>
      </goals>
      <phase>prepare-package</phase>
      <inherited>false</inherited>
      <configuration>
        <attach>true</attach>
        <descriptor>${basedir}/src/assembly/stub.xml</descriptor>
      </configuration>
    </execution>
  </executions>
</plugin>
And this is the content of the file stub.xml that is under src/assembly:
<assembly
    xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3 http://maven.apache.org/xsd/assembly-1.1.3.xsd">
  <id>stubs</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>src/main/java</directory>
      <outputDirectory>/</outputDirectory>
      <includes>
        <include>**com/companyname/projectname/*.*</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>${project.build.directory}/classes</directory>
      <outputDirectory>/</outputDirectory>
      <includes>
        <include>**com/companyname/projectname/*.*</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>${project.build.directory}/snippets/stubs</directory>
      <outputDirectory>META-INF/${project.groupId}/${project.artifactId}/${project.version}/mappings</outputDirectory>
      <includes>
        <include>**/*</include>
      </includes>
    </fileSet>
    <fileSet>
      <directory>${basedir}/src/test/resources/contracts</directory>
      <outputDirectory>META-INF/${project.groupId}/${project.artifactId}/${project.version}/contracts</outputDirectory>
      <includes>
        <include>**/*.groovy</include>
      </includes>
    </fileSet>
  </fileSets>
</assembly>
Any idea what I am missing? Thanks in advance.
I have looked in Artifactory and in https://artifactory.mycompany.com/artifactory/artifacts-snapshot-local and the stubs jar appears there. I have done a mvn install of the producer and when I run the tests again I get this error "The artifact was found in the local repository but you have explicitly stated that it should be downloaded from a remote one".
This happens when you install a stub locally and then try to download it from Artifactory, but the SHAs are different, so Aether (the engine that downloads stubs) picks the local one. In this case we throw an exception, because you wanted to download the stub from a remote location and not take it from the local one.
Exception occurred while trying to download a stub for group [com.mycompany] module [service-name] and classifier [stubs] in [remote0 (https://artifactory.mycompany.com/artifactory/artifacts-snapshot-local, default, releases+snapshots)]
org.eclipse.aether.resolution.ArtifactResolutionException: Could not find artifact com.mycompany.domain:service-name:jar:stubs:1.6.0-SNAPSHOT
It looks like the Maven metadata in Artifactory has an entry saying that the latest JAR is 1.6.0-SNAPSHOT, but the JAR itself is no longer there. Can you double-check that it is actually there?
I have also tried adding to the consumers a dependency on the stubs of the producer but I get similar errors. And I would prefer to avoid it because it would add a dependency with the specific version of the producer:
That only suggests that something is messed up with your Artifactory or project settings. Do things still not work if you hardcode the versions?
UPDATE:
If your artifactory instance requires credentials or is behind a proxy you can use these values:
https://github.com/spring-cloud/spring-cloud-contract/blob/v1.1.4.RELEASE/spring-cloud-contract-stub-runner/src/main/java/org/springframework/cloud/contract/stubrunner/spring/StubRunnerProperties.java#L72-L87
You can provide the stubrunner.username, stubrunner.password, stubrunner.proxyHost, and stubrunner.proxyPort properties.
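For example, one way you might supply them when running the tests through Maven (a sketch only; it assumes these Spring Boot properties can be passed as plain system properties to the test JVM and are resolved from the Spring Environment like any other property):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <systemPropertyVariables>
      <!-- Sketch: credential and proxy values below are placeholders -->
      <stubrunner.username>myusername</stubrunner.username>
      <stubrunner.password>mypassword</stubrunner.password>
      <stubrunner.proxyHost>proxy.mycompany.com</stubrunner.proxyHost>
      <stubrunner.proxyPort>8080</stubrunner.proxyPort>
    </systemPropertyVariables>
  </configuration>
</plugin>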
Can stubrunner.username and stubrunner.password be provided in the @AutoConfigureStubRunner annotation? I have tried the following:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE)
@AutoConfigureStubRunner(ids = "com.mycompany.myproj:myservice:+:stubs:8100",
        repositoryRoot = "http://artifactory.mycompany.com/artifactory/libs-snapshot-local",
        properties = {"stubrunner.username=myusername", "stubrunner.password=mypassword"},
        stubsMode = StubRunnerProperties.StubsMode.REMOTE)
@DirtiesContext
class MyApiContractsVerificationTest {
    ...
}
The credentials are correct, and the stubs are correctly generated and deployed into the remote Artifactory repository. The tests run fine if I configure them to look in the local .m2/repository (removing the "repositoryRoot" and "properties" annotation args), but with the above configuration I get the following error:
Could not find metadata com.mycommany.myproject:myservice/maven-metadata.xml in local (/Users/myname/.m2/repository),
org.eclipse.aether.transfer.MetadataTransferException: Could not transfer metadata com.mycommany.myproject:myservice/maven-metadata.xml from/to remote0 (http://artifactory.mycompany.com/artifactory/libs-snapshot-local): Unauthorized (401)]
...
I do clean the local .m2/repository of the stubs before I run the tests with the remote mode enabled, so there is no conflict.
Am I incorrectly providing the username and password? Is something incorrect or missing in the configuration?
I am using ServiceMix (v4.5.3) and want to deploy my application (which depends on hundreds of third-party libraries) as a bundle via the maven-bundle-plugin.
Below is my pom.xml
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <version>2.3.7</version>
  <extensions>true</extensions>
  <executions>
    <execution>
      <id>wrap-my-dependency</id>
      <goals>
        <goal>wrap</goal>
      </goals>
      <configuration>
        <wrapImportPackage></wrapImportPackage>
        <instructions>
          <Include-Resource>{maven-resources}</Include-Resource>
          <Bundle-ClassPath>.</Bundle-ClassPath>
          <Embed-Dependency>*;scope=compile|runtime</Embed-Dependency>
          <Embed-Transitive>true</Embed-Transitive>
          <Import-Package>*</Import-Package>
          <Bundle-SymbolicName>${project.groupId}.${project.artifactId}</Bundle-SymbolicName>
          <Bundle-Name>${project.artifactId}</Bundle-Name>
          <Bundle-Version>1.0.0</Bundle-Version>
          <Bundle-Activator>com.bundle.example.Main</Bundle-Activator>
        </instructions>
      </configuration>
    </execution>
  </executions>
</plugin>
I've followed this for creating the bundle, but when I execute mvn bundle:wrap it converts the external JARs into bundles and places them into the target/classes folder of my project.
Now, my question is: do I have to copy all the bundles into the deploy folder of the ServiceMix installation directory to run my application? I've followed this approach, but I still get errors while my application starts.
Manifest file :
Imported Packages
com.dhtmlx.connector from dhtmlxgridConnector (476)
com.google.gson,version=[1.7,2) -- Cannot be resolved
com.googlecode.ehcache.annotations,version=[1.1,2) -- Cannot be resolved
com.hazelcast.core,version=[2.6,3) from com.hazelcast (437)
com.tinkerpop.blueprints -- Cannot be resolved
com.tinkerpop.blueprints.impls.orient -- Cannot be resolved
com.tinkerpop.frames -- Cannot be resolved
This is just a small part of the manifest file of my bundle. Some packages are still unresolved, which I think is the reason my bundle does not start.
And my second question: is there a better approach to handling all the third-party libraries while using the maven-bundle-plugin?
Waiting for some valuable suggestions.
When I need to convert a JAR to a bundle in ServiceMix and install it, I use:
./bin/servicemix
osgi:install -s wrap:file:///<jar location, for example /lib/ojdbc6-13.jar>
Then execute the shutdown command and choose the yes option.
Now your JAR will be available as a bundle in ServiceMix.
I am trying to use the aspectj-maven-plugin in our project, which has multiple modules, following the instructions given at this link: http://mojo.codehaus.org/aspectj-maven-plugin/weaveJars.html
I am using @AspectJ annotations. My aspect is in a separate Maven module called
artifactId - consumer
And the class whose method I want to intercept or advise is in
artifactId - producer
I have added the following configuration in the pom file of the consumer module:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.4</version>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
    <showWeaveInfo>true</showWeaveInfo>
    <weaveDependencies>
      <weaveDependency>
        <groupId>com.home.demo</groupId>
        <artifactId>producer</artifactId>
      </weaveDependency>
    </weaveDependencies>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Also added "producer" as a dependency in the same pom file.
When i am doing mvn clean install for the consumer module the following information comes in the console.
[INFO] [aspectj:compile {execution: default}]
[INFO] Join point 'method-execution(void com.home.demo.producer.messaging.MomServiceEndpointListener.handle(com.home.messaging.service.MessageContext, com.home.messaging.service.MessageContext))' in
Type 'com.home.demo.producer.messaging.MomServiceEndpointListener' (MomServiceEndpointListener.java:21) advised by before advice from 'com.home.demo.ods.app.OdsConsumer' (OdsConsumer.java:38)
But when executing the application, it does not work. The aspect is not getting invoked.
I am not able to understand what I am missing.
I am also confused about which module the plugin configuration shown above should go in: the consumer (where my aspects are) or the producer.
The problem is that weaveDependencies act like sources only.
Your consumer module takes the original "sources" from the weaveDependencies (producer), weaves them with the aspects, and puts the woven classes into the consumer's (!!!) target/classes.
Therefore, the producer artifact never knows about the aspects and you use it unchanged.
You would have to re-build the producer JAR using the classes from consumer/target/classes.
I don't think that is convenient, so I gave up trying to use the plugin this way.
Also, several weaveDependencies will be merged into one scrap-heap of classes.
You had better build the aspects into the producer instead: keep the aspects in your external JAR dependency (the consumer module) and put the plugin configuration into the producer.
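A rough sketch of that alternative, placed in the producer's POM (it assumes the plugin's aspectLibraries mechanism and that the producer also declares a regular dependency on the consumer artifact, which may require untangling the current consumer-to-producer dependency):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.4</version>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
    <showWeaveInfo>true</showWeaveInfo>
    <!-- Sketch: weave the producer's own classes with the aspects packaged
         in the consumer artifact, so the producer JAR ships already woven. -->
    <aspectLibraries>
      <aspectLibrary>
        <groupId>com.home.demo</groupId>
        <artifactId>consumer</artifactId>
      </aspectLibrary>
    </aspectLibraries>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>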
I tried changing to the release version of GWT 2.4 and ran into a problem. I use multiple projects in my setup: I have a project with server-side code, one project with shared code that can be used in different GWT projects, and a GWT project binding everything together. I build everything with Maven. I followed the instructions for annotation processing found here:
http://code.google.com/p/google-web-toolkit/wiki/RequestFactoryInterfaceValidation
When I compile my shared project, where the proxies and services are, the folder "generated-sources\apt\" with DeobfuscatorBuilder.java is created. I have the sources of this project as a dependency of my main project and try to run the validator there as well, but DeobfuscatorBuilder.java is not created here. Everything compiles, but when I invoke a call to the RequestFactory I get the error:
com.google.web.bindery.requestfactory.server.UnexpectedException: No RequestContext for operation ZwI9iqZS626uTt_TFwRtUwPYSOE=
I guess there is a mistake in my setup, but I couldn't find where.
Does anybody know how to solve this problem?
Regards
arne
UPDATE:
I added this to my pom:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack</id>
      <phase>initialize</phase>
      <goals>
        <goal>unpack</goal>
        <!-- <goal>build-classpath</goal> -->
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.myproject.core</groupId>
            <artifactId>shared</artifactId>
            <version>${shared.version}</version>
            <classifier>sources</classifier>
            <overWrite>true</overWrite>
            <outputDirectory>${project.build.directory}/com.myproject.shared</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
This unpacks the sources of my dependencies and puts them into my target folder.
Then I added:
<configuration>
  <sourceDirectory>target/com.myproject.shared</sourceDirectory>
</configuration>
to my processor-plugin.
This way it is not necessary to have all the projects in the workspace, and it should work with a continuous integration system. I wouldn't have figured that out without Andy's reply, though :)
I had the same issue and spent hours scouring the web for an answer without any luck. If I add the processor plugin to the shared project, it generates the DeobfuscatorBuilder class, but I get the same No RequestContext exception as you. If I just have the processor plugin on the GWT war project, the builder isn't generated at all.
With a fair amount of trial and error I found adding the source directory from the shared project into the processor plugin configuration on the war project worked...
http://code.google.com/p/android-shuffle/source/browse/shuffle-app-engine/pom.xml#269
It's a bit dirty, but it does the trick. If there's an official method that doesn't require cross-project hackery I'd be more than happy to switch, but I haven't seen anything suggested yet.
Cheers
Andy