Due to some quirks in some dependencies, I'm having trouble with sbt-assembly, and have been told that people working with Java have had good results with Maven's shade plugin.
How can I use Maven's shade plugin for Scala / sbt?
You can add the following to your POM:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.group.id.Launcher1</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
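Note that the Shade plugin only packages what has already been compiled; it does not compile Scala itself. If you move an sbt build over to Maven, you also need the scala-maven-plugin so your Scala sources are compiled before the shade goal runs. A minimal sketch (the version shown is illustrative):
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.2.2</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
</plugin>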
Related
I am able to build a jar for Spark applications and Java applications, but the same approach is not working for Snowpark applications. I would like to know how to build an executable jar from a Snowpark Scala application and run it from the command line. I am able to build the jar but not able to execute it from the command line.
Below is my error:
Exception in thread "main" net.snowflake.client.jdbc.SnowflakeSQLException: User Error Report:
Java Stack Trace:
java.lang.RuntimeException: java.lang.ClassNotFoundException: us.company.snowpark.etl.HashProcessor
at function_handler_0//com.snowflake.snowpark.internal.JavaUtils$.doDeserializeAndCloseInputStream(JavaUtils.scala:351)
at function_handler_0//com.snowflake.snowpark.internal.JavaUtils$.deserialize(JavaUtils.scala:335)
at function_handler_0//com.snowflake.snowpark.internal.JavaUtils.deserialize(JavaUtils.scala)
at function_handler_0//SnowUDF.<init>(InlineCode.java:12)
Caused by: java.lang.ClassNotFoundException: us.company.snowpark.etl.HashProcessor
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Class.java:398)
at java.base/java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:745)
at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1965)
at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1851)
at java.base/java.io.ObjectInputStream.readClass(ObjectInputStream.java:1814)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1639)
at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2328)
at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2166)
at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1668)
at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:482)
at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:440)
at function_handler_0//com.snowflake.snowpark.internal.JavaUtils$.doDeserializeAndCloseInputStream(JavaUtils.scala:348)
... 3 more
in function SNOWPARK_TEMP_FUNCTION_KTVPWBIRM0FSHTU with handler SnowUDF.compute
at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowExceptionSub(SnowflakeUtil.java:127)
at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowException(SnowflakeUtil.java:67)
at net.snowflake.client.core.StmtUtil.pollForOutput(StmtUtil.java:442)
at net.snowflake.client.core.StmtUtil.execute(StmtUtil.java:345)
at net.snowflake.client.core.SFStatement.executeHelper(SFStatement.java:487)
at net.snowflake.client.core.SFStatement.executeQueryInternal(SFStatement.java:198)
at net.snowflake.client.core.SFStatement.executeQuery(SFStatement.java:135)
at net.snowflake.client.core.SFStatement.execute(SFStatement.java:781)
at net.snowflake.client.core.SFStatement.execute(SFStatement.java:677)
at net.snowflake.client.jdbc.SnowflakeStatementV1.executeQueryInternal(SnowflakeStatementV1.java:238)
at net.snowflake.client.jdbc.SnowflakePreparedStatementV1.executeQuery(SnowflakePreparedStatementV1.java:117)
at com.snowflake.snowpark.internal.ServerConnection.$anonfun$runQueryGetResult$1(ServerConnection.scala:358)
at com.snowflake.snowpark.internal.ServerConnection.withValidConnection(ServerConnection.scala:810)
at com.snowflake.snowpark.internal.ServerConnection.runQueryGetResult(ServerConnection.scala:353)
at com.snowflake.snowpark.internal.ServerConnection.runQuery(ServerConnection.scala:336)
at com.snowflake.snowpark.Session.runQuery(Session.scala:781)
at com.snowflake.snowpark.internal.UDXRegistrationHandler.createJavaUDF(UDXRegistrationHandler.scala:735)
at com.snowflake.snowpark.internal.UDXRegistrationHandler.$anonfun$registerUDF$5(UDXRegistrationHandler.scala:117)
at com.snowflake.snowpark.internal.UDXRegistrationHandler.retryAfterFixingClassPath(UDXRegistrationHandler.scala:54)
at com.snowflake.snowpark.internal.UDXRegistrationHandler.$anonfun$registerUDF$4(UDXRegistrationHandler.scala:99)
at com.snowflake.snowpark.internal.UDXRegistrationHandler.withUploadFailureCleanup(UDXRegistrationHandler.scala:169)
at com.snowflake.snowpark.internal.UDXRegistrationHandler.registerUDF(UDXRegistrationHandler.scala:99)
at com.snowflake.snowpark.UDFRegistration.register(UDFRegistration.scala:2368)
at com.snowflake.snowpark.functions$.registerUdf(functions.scala:2998)
at com.snowflake.snowpark.functions$.udf(functions.scala:3110)
at us.company.snowpark.etl.HashProcessor.<init>(HashProcessor.scala:228)
at us.company.snowpark.app.SnowparkAppDriver$.main(SnowparkAppDriver.scala:24)
at us.company.snowpark.app.SnowparkAppDriver.main(SnowparkAppDriver.scala)
I have added the build section below to my POM and ran "mvn clean install". The executable fat jar is generated in the project's target folder. More information about the Maven JAR plugin can be found at https://maven.apache.org/plugins/maven-jar-plugin/
<build>
  <sourceDirectory>src/main/scala</sourceDirectory>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <args>
          <arg>-encoding</arg>
          <arg>${project.build.sourceEncoding}</arg>
        </args>
        <checkMultipleScalaVersions>false</checkMultipleScalaVersions>
      </configuration>
    </plugin>
    <plugin>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.1</version>
      <executions>
        <execution>
          <id>jar-with-dependencies</id>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>us.company.mainClass</mainClass>
              </transformer>
            </transformers>
            <shadedArtifactAttached>true</shadedArtifactAttached>
            <shadedClassifierName>FAT</shadedClassifierName>
            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>2.7</version>
      <configuration>
        <skipTests>true</skipTests>
      </configuration>
    </plugin>
  </plugins>
</build>
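A note on running the result: with shadedArtifactAttached set to true and shadedClassifierName set to FAT, the shaded jar is the one carrying the classifier, while the plain <artifactId>-<version>.jar next to it is the original thin jar and will fail with ClassNotFoundException if launched directly. Assuming a hypothetical artifact name:
java -jar target/snowpark-etl-1.0-FAT.jar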
Just follow the guide in the Snowpark Documentation:
https://docs.snowflake.com/en/sql-reference/stored-procedures-scala.html#using-sbt-to-build-a-jar-file-with-dependencies
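If the jar builds but your own classes still cannot be found when the UDF executes inside Snowflake, it can also help to register the fat jar with the session explicitly before creating any UDFs. A minimal Scala sketch, assuming a hypothetical jar path and config file:
import com.snowflake.snowpark.Session

object SnowparkAppDriver {
  def main(args: Array[String]): Unit = {
    val session = Session.builder.configFile("profile.properties").create
    // Upload the fat jar so that classes like us.company.snowpark.etl.HashProcessor
    // are on the server-side classpath when the UDF handler deserializes them.
    session.addDependency("target/snowpark-etl-1.0-FAT.jar")
    // ... register UDFs and run the pipeline only after the dependency is added
  }
}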
I have a Maven project with both Java and Scala components. When I use maven-shade-plugin to relocate package names, it only renames the packages inside Java class files; the Scala class files still contain the older package names. What am I missing?
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <!--<minimizeJar>true</minimizeJar>-->
        <artifactSet>
          <includes>
            <include>ml.dmlc:xgboost4j-spark</include>
            <include>ml.dmlc:xgboost4j</include>
          </includes>
        </artifactSet>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <relocations>
          <relocation>
            <pattern>ml.dmlc.xgboost4j</pattern>
            <shadedPattern>ml.dmlc.xgboost4j.shaded</shadedPattern>
          </relocation>
        </relocations>
        <transformers>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
Sadly, I believe the Shade plugin is intended to have this functionality but, currently (Dec 2020), it does not. This can be seen in this bug ticket:
https://issues.apache.org/jira/browse/MSHADE-345
(As far as I understand, relocation rewrites references in the JVM bytecode but not the pickled metadata in Scala's @ScalaSignature annotations, so Scala code compiled against the shaded jar still resolves the original package names.)
Workaround
I have personally used a silly workaround for this: I make a new, otherwise empty Maven project that has only the dependency:
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>28.0-jre</version>
</dependency>
And the plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.</pattern>
            <shadedPattern>shader.com.google.</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
Then, in the project whose code requires both a low version of Guava and the new version of Guava, include the empty project as a dependency:
<dependency>
  <groupId>com.yoursite.yourwork</groupId>
  <artifactId>shader</artifactId>
  <version>0.0.1</version>
</dependency>
Then, in a .scala file, you import the new (version 28) Guava classes like this:
import shader.com.google.common.graph.Network
Why this works
Since the bug only occurs in Scala projects where you refer to your own classes that use the dependency (or, as the question says, "Scala files still contain the older package names"), shading a project that never refers to its own dependencies bypasses the bug.
Yes, it does work. Choose whichever version you want and import the library into your Scala project.
I am getting the above exception in my web application running in Tomcat when I package all my dependencies, including spring-data-jpa.jar, into a single jar using maven-shade-plugin and put it under the WEB-INF/lib directory.
The problem disappears if I package spring-data-jpa.jar directly into WEB-INF/lib along with my shaded jar.
NOTE: I will be running the same package as an AWS Lambda, hence I need to create a shaded jar.
To help others: the problem was that multiple spring-*.jar files contain META-INF/spring.handlers files, which overwrite each other when the maven-shade-plugin runs.
To resolve this, use <transformers> in the plugin configuration. My final plugin configuration looks as follows:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <createDependencyReducedPom>false</createDependencyReducedPom>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <!-- Remove signatures from transitive dependencies and append spring handlers and schemas -->
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/spring.handlers</resource>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/spring.schemas</resource>
          </transformer>
        </transformers>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
The above will merge all handlers into one single file in the final jar. Enjoy :-)
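For the same reason, if any of your bundled dependencies register implementations under META-INF/services (JDBC drivers, for example), those files also overwrite each other when shading; the stock ServicesResourceTransformer merges them in the same way. A sketch of the extra transformer entry to add alongside the two above:
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>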
I have downloaded the Gatling Maven example and am trying to add the Maven Shade plugin to it as below. The jar gets created, but it doesn't contain any classes, so it fails during execution:
E:\projects\gatling-maven>java -jar target\gatling-maven-plugin-demo-2.2.3.jar
Error: Could not find or load main class Engine
Here is the pom.xml I have added:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>io.gatling</groupId>
  <artifactId>gatling-maven-plugin-demo</artifactId>
  <version>2.2.3</version>
  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <gatling.version>${project.version}</gatling.version>
    <gatling-plugin.version>2.2.1</gatling-plugin.version>
    <scala-maven-plugin.version>3.2.2</scala-maven-plugin.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>io.gatling.highcharts</groupId>
      <artifactId>gatling-charts-highcharts</artifactId>
      <version>${gatling.version}</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>${scala-maven-plugin.version}</version>
      </plugin>
      <plugin>
        <groupId>io.gatling</groupId>
        <artifactId>gatling-maven-plugin</artifactId>
        <version>${gatling-plugin.version}</version>
        <executions>
          <execution>
            <goals>
              <goal>execute</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <mainClass>Engine</mainClass>
                </transformer>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
and the package structure is as shown in the original screenshot (not reproduced here).
You have to move all your resources and Scala files to src/main/resources and src/main/scala. The Shade plugin will not include your test resources and Scala test sources. I have also tried shadedTestjar, and it does not work either. The other options would be to use either:
the Maven Dependency plugin and move all dependencies manually - error-prone and ugly, or
the Assembly plugin - not suitable.
I have tried moving your resources and Scala files into src/main and it worked. The following is the working POM content:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>io.gatling</groupId>
  <artifactId>gatling-maven-plugin-demo</artifactId>
  <version>2.2.3</version>
  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <gatling.version>${project.version}</gatling.version>
    <gatling-plugin.version>2.2.1</gatling-plugin.version>
    <scala-maven-plugin.version>3.2.2</scala-maven-plugin.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>io.gatling.highcharts</groupId>
      <artifactId>gatling-charts-highcharts</artifactId>
      <version>${gatling.version}</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>${scala-maven-plugin.version}</version>
      </plugin>
      <plugin>
        <groupId>io.gatling</groupId>
        <artifactId>gatling-maven-plugin</artifactId>
        <version>${gatling-plugin.version}</version>
        <executions>
          <execution>
            <goals>
              <goal>execute</goal>
            </goals>
            <configuration>
              <configFolder>src/main/resources</configFolder>
              <dataFolder>src/main/resources/data</dataFolder>
              <resultsFolder>target/gatling/results</resultsFolder>
              <requestBodiesFolder>src/main/resources/request-bodies</requestBodiesFolder>
              <simulationsFolder>src/main/scala</simulationsFolder>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <transformers>
                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <mainClass>Engine</mainClass>
                </transformer>
              </transformers>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
Hope it solves your problem.
Try again after changing your plugin configuration as follows:
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>${scala-maven-plugin.version}</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
and
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>Engine</mainClass>
          </transformer>
        </transformers>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
There are still file-not-found exceptions, but I believe they are trivial to solve.
First, learn about the standard Maven directory structure convention. Your main project sources are currently put under src/test/ instead of the correct src/main/, so they are treated as code used only in unit tests and are therefore not packaged.
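For reference, a sketch of the expected layout (file names are illustrative):
src/
  main/
    scala/          Engine.scala and your simulations, compiled and packaged
    resources/      gatling.conf, data files, request bodies
  test/
    scala/          test-only code, never packaged into the jar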
I'm trying to use the Maven Shade plugin (according to the tutorial here) to create a "fat jar" from my project.
I'm working on my project in Eclipse, and when I look at the (huge) resulting fat jar, I see that it contains a lot (possibly all) of the classes from the Eclipse IDE code itself.
Why is it doing that, and how can I prevent it?
I've tried just listing a bunch of directories in <exclude> filters, but the Eclipse JDT jars also have some files in the jars' root folders, and it's not easy to list all of these as well.
The shade <plugin> part of the pom.xml file currently looks like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>${maven.shade.version}</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <createDependencyReducedPom>true</createDependencyReducedPom>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>vertx-unit*/**</exclude>
              <exclude>org/apache/derby/**</exclude>
              <exclude>javax/annotation/**</exclude>
              <exclude>com/google/googlejavaformat/**</exclude>
              <exclude>org/eclipse/**</exclude>
              <exclude>jdtCompilerAdapter.jar</exclude>
              <exclude>ant_tasks/**</exclude>
              <exclude>META-INF/services/org.osgi.framework.launch.FrameworkFactory</exclude>
              <exclude>about_files/**</exclude>
              <exclude>org/osgi/**</exclude>
            </excludes>
          </filter>
        </filters>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <manifestEntries>
              <Main-Class>io.vertx.core.Starter</Main-Class>
              <Main-Verticle>io.thesphere.service.App</Main-Verticle>
            </manifestEntries>
          </transformer>
        </transformers>
        <artifactSet />
        <outputFile>${project.build.directory}/${project.artifactId}-${project.version}-fat.jar</outputFile>
      </configuration>
    </execution>
  </executions>
</plugin>
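Rather than maintaining path-by-path <exclude> filters, it is usually cleaner to cut the offending artifacts out wholesale with an <artifactSet> (note the currently empty <artifactSet /> element above), since filters match paths inside jars while artifactSet excludes entire dependencies. A sketch with illustrative coordinates - run mvn dependency:tree first to see which of your dependencies actually drags the Eclipse jars in:
<artifactSet>
  <excludes>
    <exclude>org.eclipse.jdt:*</exclude>
    <exclude>org.eclipse.platform:*</exclude>
  </excludes>
</artifactSet>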