I have a project in Eclipse with Maven, Cucumber, SoapUI and JUnit. I have been able to successfully build it without errors (Yay! I am new to Maven, SoapUI and Cucumber).
This project has a Cucumber feature file with two scenarios. I have the following configuration for SoapUI in the pom.xml file:
.
.
<pluginRepositories>
  <pluginRepository>
    <id>smartbear-sweden-plugin-repository</id>
    <url>http://www.soapui.org/repository/maven2/</url>
  </pluginRepository>
</pluginRepositories>
<build>
  <plugins>
    <plugin>
      <groupId>com.smartbear.soapui</groupId>
      <artifactId>soapui-pro-maven-plugin</artifactId>
      <version>4.6.1</version>
      <executions>
        <execution>
          <phase>test</phase>
          <goals>
            <goal>test</goal>
          </goals>
          <configuration>
            <projectFile>soapui-project.xml</projectFile>
            <outputFolder>soapuiOut</outputFolder>
            <junitReport>true</junitReport>
            <exportAll>true</exportAll>
            <printReport>false</printReport>
          </configuration>
        </execution>
      </executions>
    </plugin>
.
.
Currently when I build it with Maven, it runs the whole SoapUI project with all the test cases in it. I want to link the two test cases in SoapUI to the two scenarios in the feature file. Is it possible to run a single test case from the SoapUI test suite in the step definition for a scenario? The scenario should pass only if the SoapUI test case related to it passes.
Q: Is it possible to run a single test case from the SoapUI test suite in the step definition for a scenario?
A: Sure, use the testCase setting in your configuration, as per the documentation here: http://www.soapui.org/Test-Automation/maven-2x.html#5-1-test-settings
One small comment: you mentioned you are new to Maven. If you wish to make your project much more Maven-ized, place your soapui-project.xml in src/test/soapui and adjust your pom.xml accordingly.
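For example (the suite and test case names below are placeholders for whatever they are called in your SoapUI project), the execution's configuration could look like this, with projectFile pointing at src/test/soapui; you would need one such execution, or one Maven invocation, per scenario's test case:
<configuration>
  <projectFile>src/test/soapui/soapui-project.xml</projectFile>
  <testSuite>MyTestSuite</testSuite>
  <testCase>ScenarioOneTestCase</testCase>
  <outputFolder>soapuiOut</outputFolder>
  <junitReport>true</junitReport>
</configuration>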
I have a Scala app (v2.13) created using Maven v3. My resources path is:
src -> main -> resources -> application.conf and application.prod.conf
When I generate the JAR file for production, I want the configuration to come from application.conf, but with its settings overridden by application.prod.conf.
I cannot find a solution for this; all the examples I have found are for the Play framework or older Maven versions.
The JAR file is generated using the mvn package command.
application.prod.conf file
include "application.conf"
# override default (DEV) settings
http {
  host = "99.999.999.9"
  port = 1111
}
The following example doesn't work for me, because from the target path I only take the JAR file to move it to production:
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <delete file="${project.build.outputDirectory}/application.conf"/>
          <copy file="src/main/resources/application.prod.conf"
                tofile="${project.build.outputDirectory}/application.conf"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
A few options here:
If your application.prod.conf is static and gets shipped with the jar, why can't you have logic in the code which loads the appropriate app conf based on the environment the app is being executed in?
If it is a Typesafe config, then while running in prod you can pass the -Dconfig.resource=/application.prod.conf command line argument to java (see the example below).
Or, if application.prod.conf is not shipped with the jar, you can pass -Dconfig.file=/path/to/application.prod.conf.
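For example (the jar name is a placeholder), a production launch with a Typesafe/Lightbend config could then look like one of these:
java -Dconfig.resource=/application.prod.conf -jar your-app.jar
java -Dconfig.file=/path/to/application.prod.conf -jar your-app.jar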
Maven has a concept of phases (we're talking about the package phase here, to be precise), which are logical places in the life cycle where plugins can be invoked. Some plugins, like the one that creates the jar, are associated with phases automatically (out of the box); others you declare explicitly and associate with a phase (like maven-antrun-plugin, which is executed during the package phase as you've shown in the code snippet).
With that in mind, is it possible that the file is copied only after the jar has already been packaged, i.e. that the antrun plugin is invoked after the artifacts were packaged into the jar?
If so, the easiest solution is to move it one phase earlier, for example prepare-package:
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <phase>prepare-package</phase> <!-- Note the change here -->
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <delete file="${project.build.outputDirectory}/application.conf"/>
          <copy file="src/main/resources/application.prod.conf"
                tofile="${project.build.outputDirectory}/application.conf"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>
Assuming you have Maven 3 (there is no Maven 4 release yet, so that might be a typo), the information about which phases are available can be found in the Maven build lifecycle documentation: https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
Having said that, it's probably not a good idea to "bake" the production configuration file into the artifact; there are two issues here:
Your source code contains information about production - hosts, ports, maybe even sensitive information like passwords or keys - which shouldn't really happen.
From the build's point of view, your artifact is coupled to a concrete environment, which is also generally considered bad practice.
The techniques to resolve this are beyond the scope of the question but at least you've been warned :)
I wanted to know if, with Spring Boot and Eclipse, it is possible to configure something so that the project's coverage percentage does not take certain classes into account.
If you are using the JaCoCo plugin, you can configure it like below:
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>${jacoco.mavenplugin.version}</version>
  <configuration>
    <excludes>
      <exclude>some/package/**/*</exclude>
    </excludes>
  </configuration>
  <executions>
    ...
  </executions>
</plugin>
Please also refer to:
Maven Jacoco Configuration - Exclude classes/packages from report not working
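For completeness, since the executions element is elided in the snippet above: a typical setup (not taken from the linked question) binds the prepare-agent and report goals, for example:
<executions>
  <execution>
    <goals>
      <goal>prepare-agent</goal>
    </goals>
  </execution>
  <execution>
    <id>report</id>
    <phase>verify</phase>
    <goals>
      <goal>report</goal>
    </goals>
  </execution>
</executions>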
I was going through this tutorial https://www.youtube.com/watch?v=k5ncCJBarRI&t=1443s
Around 1:07:30 the author mentions that "Gradle has continuous build", which later on is able to detect changes in the tests and automatically regenerate the asciidoc. I was wondering if anyone knows how to set this up in Maven?
I have looked through the Spring and Asciidoctor plugin docs, but was not able to find anything related to this.
I was able to get Maven to re-render the HTML whenever there is a change in index.adoc by changing the <goal> from process-asciidoc to auto-refresh. However, this does not watch for changes in the tests.
Question
Is there a way to tell Maven to watch our test files and re-compile when changes are made?
POM.XML
<plugin>
  <groupId>org.asciidoctor</groupId>
  <artifactId>asciidoctor-maven-plugin</artifactId>
  <version>1.5.7.1</version>
  <executions>
    <execution>
      <id>generate-docs</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>auto-refresh</goal>
      </goals>
      <configuration>
        <sourceDocumentName>index.adoc</sourceDocumentName>
        <backend>html</backend>
        <attributes>
          <snippets>${project.build.directory}/generated-snippets</snippets>
        </attributes>
      </configuration>
    </execution>
  </executions>
</plugin>
Thank you.
This is not a continuous build solution but it works similarly. However, the process does take some time because it essentially re-packages the project every time there is a change... It may not be ideal for some use cases.
I found a plugin that watches files: https://github.com/fizzed/maven-plugins
Change the watch directory to wherever your test files are, and change the goal from compile to package.
The watcher will execute mvn package when a change is detected; the Asciidoctor Maven plugin will then run again as the project is re-packaged.
Add this to your plugins:
<plugin>
  <groupId>com.fizzed</groupId>
  <artifactId>fizzed-watcher-maven-plugin</artifactId>
  <version>1.0.6</version>
  <configuration>
    <touchFile>target/classes/watcher.txt</touchFile>
    <watches>
      <watch>
        <directory>src/test/[your test package]</directory>
      </watch>
    </watches>
    <goals>
      <goal>package</goal>
      <!-- <goal>compile</goal> -->
    </goals>
    <profiles>
      <profile>optional-profile-to-activate</profile>
    </profiles>
  </configuration>
</plugin>
Maven does not have an equivalent of Gradle's continuous build. If you want changes in the tests to be detected and to trigger recompilation of the tests and execution of all of the tasks that depend (directly or indirectly) on the compiled test classes, you'll have to use Gradle.
I'm just starting with Maven 3 for a Scala project in IntelliJ.
I have generated a JAR file following this guide.
I moved archetype.jar to the directory where I want to create a new project. But my questions are:
Is this file stand-alone? Is it enough? It does not work with the command "mvn archetype:generate".
Is it possible to use the jar file without the intervention of any repository? So I can share it with colleagues.
What's the best method for this? I've been researching and all the guides are based on repositories only, not on working locally. Even the local repositories only consist of XML files with the id but not the contents.
This is a sort of complex question you are asking... I am going to try to summarise what I know and let's see if it helps.
Answers to your questions:
1) If you have just the basic Maven project structure after you have generated the archetype and so on, then if you run mvn clean install and the project produces a jar, this in theory should be immediately executable from the command line and standalone.
However, as you add dependencies to your project, not all of them are automatically built into the standalone jar; sometimes you have to tell Maven to bundle them into it.
The Maven Shade Plugin adds all the needed dependencies into your jar:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>MainApp</mainClass>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
        <shadedArtifactAttached>true</shadedArtifactAttached>
        <shadedClassifierName>launcher</shadedClassifierName>
      </configuration>
    </plugin>
  </plugins>
</build>
2) You do not need to integrate with any repository; it's recommended for wider projects so that you can publish artifacts to your firm's repo for ease of use between developers.
3) The easiest method is to create a new project in IntelliJ itself, specifying Maven as the project type, and that will give you the default project structure. In the pom file you then specify the build block I pasted above and you are essentially good to go... If you also need dependencies to run your own code, you will need to add them in a dependencies block.
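For example (the artifact below is purely illustrative; a Scala project would typically declare at least the Scala standard library), a dependencies block looks like this:
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.13.8</version>
  </dependency>
</dependencies>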
So, I finally used the following command before generating:
mvn install:install-file -Dfile=<path-to-jar-archetype-file> -DgroupId=<groupId> -DartifactId=<artifactId> -Dversion=<version> -Dpackaging=jar
After executing the command, the jar is installed in your local repo. You can then generate the project with, for instance, IntelliJ, or go to the path where you want to generate the project and run...
mvn archetype:generate
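If archetype:generate does not offer your archetype interactively, you can point it at the installed coordinates explicitly (the placeholders are the same ones used in the install-file command above):
mvn archetype:generate -DarchetypeGroupId=<groupId> -DarchetypeArtifactId=<artifactId> -DarchetypeVersion=<version>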
I have an application which I want to deploy in Karaf. I have created a feature file and I am able to add features through this file using the Karaf console. What I want to achieve now is to create this feature file through Maven instead of creating it manually, and then create a custom Karaf distribution using this feature file. How can I achieve it?
My approach so far is to create a Maven module that generates the feature file using the karaf-maven-plugin, and then create another module that generates the custom Karaf distribution, so that we don't need access to Maven in the production environment.
Is this approach correct? Do I really need to make two different modules to achieve it? How can I get access to the feature file from the second module?
These are my poms:
all dependencies
<build>
  <finalName>${project.artifactId}-${project.version}</finalName>
  <plugins>
    <plugin>
      <groupId>org.apache.karaf.tooling</groupId>
      <artifactId>karaf-maven-plugin</artifactId>
      <version>4.0.5</version>
      <extensions>true</extensions>
      <executions>
        <execution>
          <id>generate</id>
          <phase>generate-resources</phase>
          <goals>
            <goal>features-generate-descriptor</goal>
          </goals>
          <configuration>
            <startLevel>80</startLevel>
            <aggregateFeatures>true</aggregateFeatures>
            <includeTransitiveDependency>true</includeTransitiveDependency>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
I am not able to figure out the second part yet. Any help with that is really appreciated.
To generate a custom Karaf you just need to use the karaf-maven-plugin.
For example the following will generate a fully working custom Karaf:
<plugin>
  <groupId>org.apache.karaf.tooling</groupId>
  <artifactId>karaf-maven-plugin</artifactId>
  <version>4.0.0</version>
  <extensions>true</extensions>
  <configuration>
    <!-- no startupFeatures -->
    <bootFeatures>
      <feature>minimal</feature>
    </bootFeatures>
    <installedFeatures>
      <feature>wrapper</feature>
      <feature>spring/4.0.7.RELEASE_1</feature>
    </installedFeatures>
  </configuration>
</plugin>
This will generate a custom Karaf based on the minimal set of features needed to create the minimal distro. If you want to base it on the standard distro, just exchange minimal with standard.
By the way, all of this is also documented in the Karaf documentation.
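Regarding getting access to the feature file from the second module: the features-generate-descriptor goal attaches the generated descriptor to the first module as an additional artifact (type xml, classifier features), so one common approach, sketched here with placeholder coordinates and feature name, is to declare it as a dependency of the assembly module and reference the feature by name in bootFeatures:
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-app-features</artifactId>
  <version>1.0.0-SNAPSHOT</version>
  <classifier>features</classifier>
  <type>xml</type>
  <!-- the scope is assumed here to control the stage (startup/boot/installed); check the Karaf assembly docs -->
  <scope>runtime</scope>
</dependency>
With karaf-assembly packaging, the karaf-maven-plugin picks such feature repositories up from the dependencies, and you can then list your own feature name in bootFeatures just like minimal above.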