skip mvn exec:exec for modules in multimodule maven project - exec-maven-plugin

I have a multimodule project for which I want to verify some prerequisites by running a shell script with the exec-maven-plugin. When I run the command below:
mvn exec:exec -Dexec.executable=/apps/rm-scripts/verify.sh
it goes through all the modules and executes the script. My requirement is that it should not descend into the modules; it should run only on the parent project.
Can anyone please help? Thanks for your time.

You can run maven with the --non-recursive option so the goal will only be run on the parent project:
mvn --non-recursive exec:exec -Dexec.executable=/apps/rm-scripts/verify.sh
If you need to run the script only on some projects, use the --projects option. Its argument is a comma-separated list of relative module paths.
With this project hierarchy:
project
+-- module_A
+-- module_B
|   +-- module_B1
|   +-- module_B2
+-- module_C
    +-- module_C1
To run the command only for the module_A, module_B1 and module_C1 projects:
mvn --projects module_A,module_B/module_B1,module_C/module_C1 exec:exec -Dexec.executable=/apps/rm-scripts/verify.sh
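If you prefer not to pass --non-recursive on every invocation, another option is to bind the plugin in the parent pom.xml with inherited set to false, so the execution is not inherited by child modules. A sketch (the phase and execution id are illustrative choices):

```xml
<!-- parent pom.xml: run verify.sh only here, not in child modules -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.6.0</version>
  <inherited>false</inherited>
  <executions>
    <execution>
      <id>verify-prerequisites</id>
      <phase>validate</phase>
      <goals><goal>exec</goal></goals>
      <configuration>
        <executable>/apps/rm-scripts/verify.sh</executable>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, a plain mvn validate in the parent runs the script once, and child builds skip it.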

Related

create project jar in scala

I have a self-contained application in SBT. My data is stored on HDFS (the Hadoop file system). How can I build a jar file so I can run my work on another machine?
The directory of my project is the following:
/MyProject
/target
/scala-2.11
/MyApp_2.11-1.0.jar
/src
/main
/scala
If you don't have any dependencies, then running sbt package will create a jar with all your code.
You can then run your Spark app as:
$SPARK_HOME/bin/spark-submit --name "an-app" my-app.jar
If your project has external dependencies (other than Spark itself; if it's just Spark or any of its dependencies, the above approach still works), then you have two options:
1) Use the sbt-assembly plugin to create an uber jar with your entire classpath. Running sbt assembly will create another jar which you can use in the same way as before.
2) If you only have a few simple dependencies (say, just joda-time), then you can simply include them in your spark-submit invocation:
$SPARK_HOME/bin/spark-submit --name "an-app" --packages "joda-time:joda-time:2.9.6" my-app.jar
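For option 1, a minimal sketch of enabling sbt-assembly in project/plugins.sbt (the version number here is an assumption; check the plugin's README for a current one):

```scala
// project/plugins.sbt — enables the `sbt assembly` task
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
```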
Unlike Java, in Scala, the file’s package name doesn’t have to match the directory name. In fact, for simple tests like this,
you can place this file in the root directory of your SBT project, if you prefer.
From the root directory of the project, you can compile the project:
$ sbt compile
Run the project:
$ sbt run
Package the project:
$ sbt package
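The commands above assume a minimal build.sbt at the project root. A sketch matching the layout shown in the question (the name, versions, and Spark dependency are illustrative assumptions):

```scala
// build.sbt — minimal build definition for MyApp
name := "MyApp"
version := "1.0"
scalaVersion := "2.11.8"

// Marked "provided" because spark-submit supplies Spark at runtime,
// so it should not be bundled into the jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"
```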
Here is a link with more details:
http://alvinalexander.com/scala/sbt-how-to-compile-run-package-scala-project

Eclipse maven project package disorder

A Maven project is usually organized as src/main/java/somepackage, but from time to time my Eclipse recognizes src/main/java as part of the package too. How can I fix this?
In the following picture, common and helloworld are both Maven projects; common is organized correctly but helloworld is not.
In Eclipse, src is the default directory for source code, while in Maven it is src/main/java. Check whether the .classpath file has been corrupted or externally modified. It should contain something like this:
<classpathentry kind="src" output="target/classes" path="src/main/java">
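For reference, a sketch of what a healthy Maven-generated .classpath can look like (entries abridged; the container strings and JRE version vary between setups):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<classpath>
  <classpathentry kind="src" output="target/classes" path="src/main/java"/>
  <classpathentry kind="src" output="target/test-classes" path="src/test/java"/>
  <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
  <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER"/>
  <classpathentry kind="output" path="target/classes"/>
</classpath>
```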
The code line
package main.java.helloworld;
in the helloworld project results in this package hierarchy:
main
+-- java
    +-- helloworld
Is that really what you mean?
In a maven project, you have this directory structure:
src
+-- main
|   +-- java
|   |   +-- <here your packages and classes>
|   +-- resources
+-- test
    +-- java
    |   +-- <here your test packages and classes>
    +-- resources
src/main/java (production code) and src/test/java (tests only) are the roots of your package hierarchy. If you create the nested package com.enterprise within src/test/java and create a class Test within com.enterprise, then Test.java must contain this package declaration:
package com.enterprise;
Take a look at your directory and package structure in the helloworld project.
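To make the mapping concrete, here is a small shell sketch (the path is illustrative) that derives the expected package declaration from a source file's location under src/test/java:

```shell
# Derive the Java package name from a file path under src/test/java:
# strip the source-root prefix, drop the file name, turn slashes into dots
path="src/test/java/com/enterprise/Test.java"
pkg=$(dirname "${path#src/test/java/}" | tr '/' '.')
echo "package $pkg;"   # → package com.enterprise;
```

The package in the question, main.java.helloworld, is what you get when the source root is accidentally src instead of src/main/java.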

How I can list all sbt dependencies?

I need to list all sbt dependencies in order to check whether a Debian package already exists for each of them (I also noticed that there is a DEB package for sbt, but it seems that external dependencies are not packaged).
At the moment I built a list of sbt dependencies with the following steps:
Install sbt manually.
Write a simple script that lists all jar files in the ~/.ivy2 directory (excluding the sbt jar itself). Here is the result of the execution:
Group;Artifact;Artifact+Version
org.scala-lang;jline;jline-2.10.5
org.scala-lang;scala-compiler;scala-compiler-2.10.5
org.scala-lang;scala-library;scala-library-2.10.5
org.scala-lang;scala-reflect;scala-reflect-2.10.5
com.jcraft;jsch;jsch-0.1.46
org.scalamacros;quasiquotes_2.10;quasiquotes_2.10-2.0.1
jline;jline;jline-2.11
com.thoughtworks.paranamer;paranamer;paranamer-2.6
org.json4s;json4s-ast_2.10;json4s-ast_2.10-3.2.10
org.json4s;json4s-core_2.10;json4s-core_2.10-3.2.10
org.scala-lang.modules;scala-pickling_2.10;scala-pickling_2.10-0.10.0
org.scala-tools.sbinary;sbinary_2.10;sbinary_2.10-0.4.2
org.fusesource.jansi;jansi;jansi-1.4
org.spire-math;json4s-support_2.10;json4s-support_2.10-0.6.0
org.spire-math;jawn-parser_2.10;jawn-parser_2.10-0.6.0
Do you think this is the right way to list all sbt dependencies?
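A sketch of the kind of script described above (the ~/.ivy2 location and the sbt exclusion pattern are assumptions; adjust them to your layout). It walks a directory tree and prints every jar name except sbt's own:

```shell
# List all jar file names under a directory, excluding sbt's own jar
list_jars() {
  find "$1" -type f -name '*.jar' ! -name 'sbt*.jar' -exec basename {} \; | sort
}

# Typical invocation:
# list_jars ~/.ivy2
```

Note that this lists everything Ivy has cached, which may include artifacts from other builds, not only sbt's own dependencies.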
There is a nice sbt plugin for that:
https://github.com/jrudolph/sbt-dependency-graph
Simply add the following line to ~/.sbt/0.13/plugins/plugins.sbt:
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.0")
Then running sbt dependencyTree prints an "ASCII graph" like:
...
[info] | +-org.apache.lucene:lucene-spatial:4.10.2
[info] | | +-com.spatial4j:spatial4j:0.4.1
[info] | | +-org.apache.lucene:lucene-core:4.10.2
[info] | | +-org.apache.lucene:lucene-queries:4.10.2
[info] | | +-org.apache.lucene:lucene-core:4.10.2
[info] | |
[info] | +-org.apache.lucene:lucene-suggest:4.10.2
[info] | +-org.apache.lucene:lucene-analyzers-common:4.10.2
[info] | | +-org.apache.lucene:lucene-core:4.10.2
[info] | |
[info] | +-org.apache.lucene:lucene-core:4.10.2
[info] | +-org.apache.lucene:lucene-misc:4.10.2
[info] | | +-org.apache.lucene:lucene-core:4.10.2
[info] | |
[info] | +-org.apache.lucene:lucene-queries:4.10.2
[info] | +-org.apache.lucene:lucene-core:4.10.2
...
In case the dependency hierarchy provided by sbt-dependency-graph is not needed, the following might be useful:
sbt 'show dependencyClasspathFiles'
Just adding here how to install sbt-dependency-graph, as I think it is relevant to the question.
IMPORTANT: this answer covers only the part related to sbt-dependency-graph. The complete answer (sbt + Scala + Homebrew + plugin) can be found here.
In order to use the Snyk CLI to test Scala projects, you need to install the sbt-dependency-graph plugin.
Installing the sbt-dependency-graph plugin for sbt 0.13
Prerequisites:
Ensure you have installed Scala.
Ensure you have installed sbt and have run sbt at least once.
NOTE: the steps below install sbt-dependency-graph as a global plugin.
1) Navigate to the sbt directory: cd ~/.sbt
2) From there, check which versions are present; running ls shows whether 0.13 and/or 1.0 exist in the directory.
3) Navigate to 0.13 (cd 0.13), then make a directory called plugins: mkdir plugins
4) Navigate into it (cd plugins) and create a file called "plugins.sbt": touch plugins.sbt
5) Edit plugins.sbt in your preferred editor, add the following line, and save the changes:
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")
Then repeat for the 1.0 directory. Check whether 1.0 exists by running ls in the ~/.sbt directory:
If 1.0 does NOT exist, create it in the sbt directory: mkdir 1.0
If 1.0 exists, change into it: cd ~/.sbt/1.0
Make a directory called "plugins" in that folder: mkdir plugins
Copy the existing "plugins.sbt" file from the 0.13 directory into it: cp ../0.13/plugins/plugins.sbt ./plugins
Validate that the plugin has been installed correctly by running the following command in your project's directory: sbt "-Dsbt.log.noformat=true" dependencyTree
This generates the dependency graph; you can rerun the command each time you want to regenerate it.
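The steps above can be sketched as a single script (a sketch, assuming a global install under ~/.sbt and the plugin version shown above):

```shell
# Install sbt-dependency-graph as a global plugin for both sbt 0.13 and 1.x
for v in 0.13 1.0; do
  mkdir -p "$HOME/.sbt/$v/plugins"
  echo 'addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")' \
    >> "$HOME/.sbt/$v/plugins/plugins.sbt"
done
```

Using >> preserves any plugins you already have configured; check the files first if you rerun the script, to avoid duplicate lines.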

Maven works but eclipse error

I created a Maven project from the command-line interface and configured its dependencies on my other Maven projects. The project runs without any problem using the following Maven commands:
$ mvn clean
$ mvn install
$ mvn exec:java
Hello world!
Then I use the following Maven command to generate the Eclipse project files:
$ mvn eclipse:eclipse
When I open the project in Eclipse Kepler, there is an exclamation point on the project and it does not run from Eclipse.
The problem is that I do not know how to find the error in Eclipse.

Maven2 + Eclipse 3.5 web Project Errors

I'm using Maven 2.2 to build a simple web project and integrate it with Eclipse.
I'm doing it the following way:
1) Going to my workspace directory using command line:
2) Create Project using the following command:
mvn archetype:generate -DgroupId=com.vanilla.test -DartifactId=myTest -DarchetypeArtifactId=maven-archetype-webapp -DinteractiveMode=false
3) Convert this project to Eclipse project:
cd myTest
mvn eclipse:eclipse -Dwtpversion=R7
When I import the project into Eclipse, I get a red 'x' on the project name. I tried cleaning the project, refreshing it, and running index.jsp, but I can't fix it.
There are no other problems with the project.
Why does this happen?
Perhaps you should configure the workspace before converting it to an Eclipse project:
mvn eclipse:configure-workspace -Declipse.workspace=<path-of-your-workspace>