iText and org.bouncycastle.asn1.ASN1Primitive not found - itext

I'm a newbie with iText; this is my first project using the library.
I'm building a PDF that is essentially one big table, and while compiling I'm getting this class-not-found error: class file for org.bouncycastle.asn1.ASN1Primitive not found
I'm confused, since I'm only using the basic functionality and didn't even touch the PDF signing features. What should I do to fix the error?
I'm using:
JDK 1.7
iText 5.3.5
extrajars 2.2 (which provides bcmail-jdk15-146.jar, bcprov-jdk15-146.jar and bctsp-jdk15-146.jar)
I'm only using iText inside one class, with these imports:
import com.itextpdf.text.BadElementException;
import com.itextpdf.text.BaseColor;
import com.itextpdf.text.Chunk;
import com.itextpdf.text.Document;
import com.itextpdf.text.DocumentException;
import com.itextpdf.text.Font;
import com.itextpdf.text.Image;
import com.itextpdf.text.Paragraph;
import com.itextpdf.text.Phrase;
import com.itextpdf.text.Rectangle;
import com.itextpdf.text.pdf.PdfPCell;
import com.itextpdf.text.pdf.PdfPTable;
import com.itextpdf.text.pdf.PdfWriter;
If it helps, I'd like to clarify that when I run the project inside NetBeans, it compiles and runs just fine. The error only appears when I try to build it into a single executable jar file (which bundles dist/lib).
This is the build.xml target where the error appears:
<target name="single_jar" depends="jar">
<property name="store.jar.name" value="Final"/>
<property name="store.dir" value="store"/>
<property name="store.jar" value="${store.dir}/${store.jar.name}.jar"/>
<echo message="Packaging ${application.title} into a single JAR at ${store.jar}"/>
<delete dir="${store.dir}"/>
<mkdir dir="${store.dir}"/>
<jar destfile="${store.dir}/temp_final.jar" filesetmanifest="skip">
<zipgroupfileset dir="dist" includes="*.jar"/>
<zipgroupfileset dir="dist/lib" includes="*.jar"/>
<manifest>
<attribute name="Main-Class" value="${main.class}"/>
</manifest>
</jar>
<zip destfile="${store.jar}">
<zipfileset src="${store.dir}/temp_final.jar"
excludes="META-INF/*.SF, META-INF/*.DSA, META-INF/*.RSA"/>
</zip>
<delete file="${store.dir}/temp_final.jar"/>
</target>

Current iText versions (since 5.3.0) use BouncyCastle 1.47, but you are providing 1.46. Even though that looks like a small step, there are substantial changes between those BC versions; any sensible version management would have called it 2.0.
Please update your dependencies.
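For the Ant build in the question, that means putting the 1.47 jars into dist/lib (the artifacts iText 5.3.x is built against use the "jdk15on" naming, e.g. bcprov-jdk15on-147.jar and bcmail-jdk15on-147.jar) and making sure the old 1.46 jars are no longer bundled. A minimal sketch of the changed fileset, assuming those file names:
<!-- Sketch only: bundle everything from dist/lib except the stale 1.46 BouncyCastle jars. -->
<!-- Assumes the 1.47 "jdk15on" jars have been copied into dist/lib. -->
<zipgroupfileset dir="dist/lib" includes="*.jar"
excludes="bcprov-jdk15-146.jar, bcmail-jdk15-146.jar, bctsp-jdk15-146.jar"/>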

I was getting java.lang.NoClassDefFoundError: org/bouncycastle/asn1/ASN1Primitive when depending on:
<dependency>
<groupId>com.itextpdf.tool</groupId>
<artifactId>xmlworker</artifactId>
<version>5.5.0</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>itextpdf</artifactId>
<version>5.5.0</version>
</dependency>
I needed to explicitly include newer BouncyCastle artifacts:
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcprov-jdk15on</artifactId>
<version>1.50</version>
</dependency>
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcpkix-jdk15on</artifactId>
<version>1.50</version>
</dependency>
<dependency>
<groupId>com.itextpdf.tool</groupId>
<artifactId>xmlworker</artifactId>
<version>5.5.0</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>itextpdf</artifactId>
<version>5.5.0</version>
</dependency>
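If other dependencies might still drag in an older BouncyCastle release transitively, one optional way to keep everything on the same version is to pin it once via dependencyManagement; this is just a sketch of that approach, not something the fix above strictly requires:
<!-- Optional sketch: pin BouncyCastle centrally so every resolution picks 1.50 -->
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcprov-jdk15on</artifactId>
<version>1.50</version>
</dependency>
<dependency>
<groupId>org.bouncycastle</groupId>
<artifactId>bcpkix-jdk15on</artifactId>
<version>1.50</version>
</dependency>
</dependencies>
</dependencyManagement>
You can verify which BouncyCastle version actually ends up on the classpath with mvn dependency:tree.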

Related

Quartz 1.8.5 + OpenLiberty 18.0.0.4 / WebSphere Liberty 17.0.0.4: java.lang.NoClassDefFoundError: oracle/sql/BLOB

We get the following exception when using Quartz 1.8.5 and the Liberty server. With a Tomcat server (7.0.81) the exception doesn't occur.
java.lang.NoClassDefFoundError: oracle/sql/BLOB
at org.quartz.impl.jdbcjobstore.oracle.OracleDelegate.writeDataToBlob(OracleDelegate.java:642)
at org.quartz.impl.jdbcjobstore.oracle.OracleDelegate.insertJobDetail(OracleDelegate.java:207)
pom.xml
<dependency>
<groupId>org.quartz-scheduler</groupId>
<artifactId>quartz</artifactId>
<version>1.8.5</version>
</dependency>
<dependency>
<!-- ASYNC-METHOD-INVOCATION -->
<groupId>org.quartz-scheduler</groupId>
<artifactId>quartz-oracle</artifactId>
<version>1.8.5</version>
</dependency>
<dependency>
<groupId>com.oracle</groupId>
<artifactId>ojdbc6</artifactId>
<version>12.1.0.2.0</version>
<scope>provided</scope>
</dependency>
dataSource
<server>
<library id="oracleDriver">
<fileset dir="..\sw\oracle" includes="*.jar" scanInterval="120s" />
</library>
<!-- xa datasource -->
<dataSource id="ORACLE_DS_XA" jndiName="jdbc/xxx/xxx" pool-name="xxx">
<jdbcDriver libraryRef="oracleDriver" javax.sql.ConnectionPoolDataSource="oracle.jdbc.xa.client.OracleXADataSource" />
<properties.oracle URL="jdbc:oracle:thin:@localhost:1521:sid" password="user" user="password" />
<connectionManager minPoolSize="1" maxPoolSize="10" />
</dataSource>
<keyStore id="defaultKeyStore" password="password" />
</server>
What could be the reason for this exception, and how can we solve the problem?
Tell me if I should provide more information about our configuration.
TIA
I'm not really familiar with either, but if Quartz is bundled in your application and needs access to the classes from your Oracle driver, then you need to expose the shared library to your application.
You'd do this by adding a <classloader> element to the <application> or <webApplication> block in your server.xml.
E.g.
<application ...rest of your app configuration...>
<classloader commonLibraryRef="oracleDriver"/>
</application>
If you're currently deploying your app by putting it in the dropins directory, you'll have to change that to deploy your application to the apps directory instead and create an <application> or <webApplication> block in your server.xml.
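Putting that together, a server.xml fragment along these lines should work; the id, location and contextRoot values below are placeholders for your own application:
<!-- Sketch: deploy the app explicitly and give it access to the shared Oracle library -->
<webApplication id="myApp" location="myApp.war" contextRoot="/myApp">
<classloader commonLibraryRef="oracleDriver"/>
</webApplication>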
Documentation links:
Deploying an app and adding the server.xml configuration
Reference for <application> element (includes classloader as a sub-element)
Reference for <webApplication> element (includes classloader as a sub-element)

Eclipse Maven dependency jar grayed out, can't import classes from it

I'm helping a friend configure a Maven project with m2eclipse for the first time. We're both pretty unfamiliar with it and are hitting an issue: even though a dependency jar shows up with its packages under "Maven Dependencies" in the project tree, trying to import anything from that jar's packages fails because the class can't be found.
I noticed that the jars that are having issues are gray and not as opaque as the rest of the jars that are working.
What's strange is that if you hover over the class name in the import, it shows a brief description of the class (from the documentation in the jar!), but it won't let me import it. All the other Maven dependencies can be imported fine. Any ideas? We can't even find out what the darker icon means.
Also, the pom.xml is dead simple:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.something.portal.test</groupId>
<artifactId>PortalFrontEndTests</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>PortalFrontEndTests</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<!-- Selenium -->
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>2.53.1</version>
</dependency>
<!-- TestNG -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.11</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
I'm not sure what I'm missing here
open your pom.xml file
check for the name of the grayed out jar file
change
<scope>test</scope>
to
<scope>compile</scope>
I found the issue. It was because I had the class in the source directory instead of the test directory, and both of the Maven dependencies had been marked as "Visible only to test".
I had the same problem when I used <scope>test</scope> in the Maven pom.
It seems that newer Eclipse/Java versions have a new attribute:
<classpathentry kind="src" output="target/test-classes" path="src/test/java/...">
<attributes>
<attribute name="test" value="true"/>
</attributes>
</classpathentry>
This should be enabled in the Java Build Path settings:
(Image: the "Contains test sources" option in the build path menu.)
After enabling this, I got rid of all the compiler errors.
Check the dependency scope in your POM file.
compile, provided, system and test are the available scopes.
Changing test to compile will turn the dependency from grey to white.
If a dependency has test scope, it is not available for normal use in the application, whereas compile scope puts the dependency on your project's classpath.
I am not sure about the grayed-out part; it may be a feature suggesting that test classes should live under /test rather than /src.
However, the solution to your problem is the dependency's scope: change it to compile and you will be good to go.
i.e. replace test with compile:
<scope>test</scope>
<scope>compile</scope>
That's it. You will no longer get errors when importing the testing packages.
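For instance, applied to the TestNG dependency from the question's pom, the change would look like this (just a sketch of that single dependency block):
<!-- The TestNG dependency from the question, with the scope switched to compile -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.11</version>
<scope>compile</scope>
</dependency>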
Just removing the scope also works. I tried the following:
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>5.8.0-M1</version>
</dependency>
I also faced the same problem:
set the scope to compile or remove the scope
open the Maven dependencies
right-click the dependency and click download resources

Dependency issue with spark-sql-kafka with spark-submit

I have written a simple driver class in scala that uses spark-sql-kafka for structured streaming. I have used eclipse+maven to package it into a jar. Relevant part of pom.xml file is as follows:
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.1.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.11</artifactId>
<version>1.5.0</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.11</artifactId>
<version>2.0.2</version>
<scope>provided</scope>
</dependency>
</dependencies>
Then, I submit the resulting jar file to spark-submit using the following command:
spark-submit --properties-file {path}/kafka-streaming-conf --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.0.2 --class TestStreamDriver --master yarn {path}/StructuredStreaming-1.0-SNAPSHOT.jar
kafka-streaming-conf is as follows:
spark.executor.extraJavaOptions -Dhttp.proxyHost=proxyName -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxyName -Dhttps.proxyPort=8080
spark.jars.ivySettings {path}/ivysettings_proxy.xml
ivysettings_proxy.xml file is as follows:
<ivysettings>
<settings defaultResolver="default" />
<credentials host = "proxyName:8080" username = "" passwd = ""/>
<include url="${ivy.default.settings.dir}/ivysettings-public.xml" />
<include url="${ivy.default.settings.dir}/ivysettings-shared.xml" />
<include url="${ivy.default.settings.dir}/ivysettings-local.xml" />
<include url="${ivy.default.settings.dir}/ivysettings-main-chain.xml" />
<include url="${ivy.default.settings.dir}/ivysettings-default-chain.xml"/>
</ivysettings>
I also changed the JAVA_OPTS variable with:
export JAVA_OPTS="$JAVA_OPTS -Dhttp.proxyHost=proxyName -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxyName -Dhttps.proxyPort=8080"
When I run spark-submit with the above command, it tries to download from the Maven repository and other URLs and then exits with a "Connection timed out" error.
How can I make spark-submit download dependencies through a proxy?
Thanks.
What worked for me is:
I changed the spark-submit properties file to:
spark.driver.extraJavaOptions -Dhttp.proxyHost=proxyName -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxyName -Dhttps.proxyPort=8080
spark.executor.extraJavaOptions -Dhttp.proxyHost=proxyName -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxyName -Dhttps.proxyPort=8080
which led to a certificate error.
Then I added the certificate for https://repo.maven.apache.org/maven2/
to the {path}/jdk1.8.0_144\jre\lib\security\cacerts file. (I used a free program called Portecle to add certificates to the cacerts file.)
Since I run spark-submit in yarn mode, I had to copy the new cacerts file to all nodes with:
pscp.pssh -h cluster-hosts ./cacerts {path}/jdk1.8.0_40/jre/lib/security/

The import com.google.api.services.storage cannot be resolved

import com.google.api.services.storage.Storage;
import com.google.api.services.storage.StorageScopes;
What are the JARs to be added and where can I find them?
That looks like you're trying to use the GCS component of the Google API Client Library for Java. If you're using Maven, it's:
<project>
<dependencies>
<dependency>
<groupId>com.google.apis</groupId>
<artifactId>google-api-services-storage</artifactId>
<version>v1-rev82-1.22.0</version>
</dependency>
</dependencies>
</project>
The JAR itself is available at https://developers.google.com/resources/api-libraries/download/storage/v1/java
These links, as well as Gradle snippets and documentation, are at: https://developers.google.com/api-client-library/java/apis/storage/v1

Confluence Macro Development in SDK6

I am kind of frustrated. I am trying to develop a simple "hello world" macro for Confluence, but the tutorials no longer really work with the current SDK 6.
I tried this tutorial:
https://developer.atlassian.com/confdev/tutorials/macro-tutorials-for-confluence/creating-a-new-confluence-macro#CreatingaNewConfluenceMacro-Step1.Createthepluginprojectandtrimtheskeleton
But as you can see from the discussion on that page, it no longer works correctly. I think some elements have been changed in SDK 6 and the tutorials are not up to date anymore.
I asked on the Confluence forum for help, but without any luck. There are several posts about this issue, none with a solution.
The problem is that the add-on/plugin is visible in the system administration panel, but I cannot use the macro on a page and I cannot see the macro in the macro browser.
Now it works - Update
This is what I did:
1) Download SDK
I downloaded sdk-installer-6.2.4.exe and installed it
2) Creating new plugin
I created a new plugin for Confluence by typing
atlas-create-confluence-plugin
with the following group and artifact IDs:
groupid : com.example.plugins.tutorial.confluence
artifactid : tutorial-confluence-macro-demo
version : 1.0-SNAPSHOT
package : com.example.plugins.tutorial.confluence
3) Creating the Eclipse project
Then I created the Eclipse project by typing
atlas-mvn eclipse:eclipse
4) Modify pom.xml
I modified the pom.xml just as ppasler explained in his answer. I also modified the company name and the version in order to check in Confluence whether the modification takes effect. The pom looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example.plugins.tutorial.confluence</groupId>
<artifactId>tutorial-confluence-macro-demo</artifactId>
<version>4.4-SNAPSHOT</version>
<organization>
<name>Hauke Company</name>
<url>http://www.example.com/</url>
</organization>
<name>tutorial-confluence-macro-demo</name>
<description>This is the com.example.plugins.tutorial.confluence:tutorial-confluence-macro-demo plugin for Atlassian Confluence.</description>
<packaging>atlassian-plugin</packaging>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.atlassian.confluence</groupId>
<artifactId>confluence</artifactId>
<version>${confluence.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.atlassian.plugin</groupId>
<artifactId>atlassian-spring-scanner-annotation</artifactId>
<version>${atlassian.spring.scanner.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.atlassian.plugin</groupId>
<artifactId>atlassian-spring-scanner-runtime</artifactId>
<version>${atlassian.spring.scanner.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>1</version>
<scope>provided</scope>
</dependency>
<!-- WIRED TEST RUNNER DEPENDENCIES -->
<dependency>
<groupId>com.atlassian.plugins</groupId>
<artifactId>atlassian-plugins-osgi-testrunner</artifactId>
<version>${plugin.testrunner.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>jsr311-api</artifactId>
<version>1.1.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.2.2-atlassian-1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.atlassian.maven.plugins</groupId>
<artifactId>maven-confluence-plugin</artifactId>
<version>${amps.version}</version>
<extensions>true</extensions>
<configuration>
<productVersion>${confluence.version}</productVersion>
<productDataVersion>${confluence.data.version}</productDataVersion>
<enableQuickReload>true</enableQuickReload>
<enableFastdev>false</enableFastdev>
<!-- See here for an explanation of default instructions: -->
<!-- https://developer.atlassian.com/docs/advanced-topics/configuration-of-instructions-in-atlassian-plugins -->
<instructions>
<Atlassian-Plugin-Key>${atlassian.plugin.key}</Atlassian-Plugin-Key>
<!-- Add package to export here -->
<Export-Package>
com.example.plugins.tutorial.confluence.api,
</Export-Package>
<!-- Add package import here -->
<Import-Package>
org.springframework.osgi.*;resolution:="optional",
org.eclipse.gemini.blueprint.*;resolution:="optional",
*
</Import-Package>
<!-- Ensure plugin is spring powered -->
<Spring-Context>*</Spring-Context>
</instructions>
</configuration>
</plugin>
<plugin>
<groupId>com.atlassian.plugin</groupId>
<artifactId>atlassian-spring-scanner-maven-plugin</artifactId>
<version>1.2.6</version>
<executions>
<execution>
<goals>
<goal>atlassian-spring-scanner</goal>
</goals>
<phase>process-classes</phase>
</execution>
</executions>
<configuration>
<scannedDependencies>
<dependency>
<groupId>com.atlassian.plugin</groupId>
<artifactId>atlassian-spring-scanner-external-jar</artifactId>
</dependency>
</scannedDependencies>
<verbose>false</verbose>
</configuration>
</plugin>
</plugins>
</build>
<properties>
<confluence.version>5.9.7</confluence.version>
<confluence.data.version>5.9.7</confluence.data.version>
<amps.version>6.2.4</amps.version>
<plugin.testrunner.version>1.1.1</plugin.testrunner.version>
<atlassian.spring.scanner.version>1.2.6</atlassian.spring.scanner.version>
</properties>
<!--
<properties>
<confluence.version>5.9.7</confluence.version>
<confluence.data.version>5.9.7</confluence.data.version>
<amps.version>6.2.3</amps.version>
<plugin.testrunner.version>1.2.3</plugin.testrunner.version>
<atlassian.spring.scanner.version>1.2.6</atlassian.spring.scanner.version>
<atlassian.plugin.key>${project.groupId}.${project.artifactId}</atlassian.plugin.key>
</properties>
-->
</project>
5) Starting Eclipse
I imported the project into Eclipse
Version: Mars.1 Release (4.5.1)
Build id: 20150924-1200
Java JDK 1.8.0_60
6) Creating the ExampleMacro class
I created the class ExampleMacro:
package com.example.plugins.tutorial.confluence;
import com.atlassian.confluence.content.render.xhtml.ConversionContext;
import com.atlassian.confluence.content.render.xhtml.XhtmlException;
import com.atlassian.confluence.macro.Macro;
import com.atlassian.confluence.macro.MacroExecutionException;
import com.atlassian.confluence.xhtml.api.MacroDefinition;
import com.atlassian.confluence.xhtml.api.MacroDefinitionHandler;
import com.atlassian.confluence.xhtml.api.XhtmlContent;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
public class ExampleMacro implements Macro
{
private final XhtmlContent xhtmlUtils;
public ExampleMacro(XhtmlContent xhtmlUtils)
{
this.xhtmlUtils = xhtmlUtils;
}
@Override
public String execute(Map<String, String> parameters, String bodyContent, ConversionContext conversionContext) throws MacroExecutionException
{
String body = conversionContext.getEntity().getBodyAsString();
final List<MacroDefinition> macros = new ArrayList<MacroDefinition>();
try
{
xhtmlUtils.handleMacroDefinitions(body, conversionContext, new MacroDefinitionHandler()
{
@Override
public void handle(MacroDefinition macroDefinition)
{
macros.add(macroDefinition);
}
});
}
catch (XhtmlException e)
{
throw new MacroExecutionException(e);
}
StringBuilder builder = new StringBuilder();
builder.append("<p>");
if (!macros.isEmpty())
{
builder.append("<table width=\"50%\">");
builder.append("<tr><th>Macro Name</th><th>Has Body?</th></tr>");
for (MacroDefinition defn : macros)
{
builder.append("<tr>");
builder.append("<td>").append(defn.getName()).append("</td><td>").append(defn.hasBody()).append("</td>");
builder.append("</tr>");
}
builder.append("</table>");
}
else
{
builder.append("You've done built yourself a macro! Nice work.");
}
builder.append("</p>");
return builder.toString();
}
@Override
public BodyType getBodyType()
{
return BodyType.NONE;
}
@Override
public OutputType getOutputType()
{
return OutputType.BLOCK;
}
}
7) Modified the atlassian-plugin.xml file
<atlassian-plugin key="${atlassian.plugin.key}" name="${project.name}" plugins-version="2">
<plugin-info>
<description>${project.description}</description>
<version>${project.version}</version>
<vendor name="${project.organization.name}" url="${project.organization.url}" />
<param name="plugin-icon">images/pluginIcon.png</param>
<param name="plugin-logo">images/pluginLogo.png</param>
</plugin-info>
<!-- add our i18n resource -->
<resource type="i18n" name="i18n" location="tutorial-confluence-macro-demo"/>
<xhtml-macro name="tutorial-confluence-macro-demo" class="com.example.plugins.tutorial.confluence.ExampleMacro" key="my-macro">
<parameters/>
</xhtml-macro>
<!-- add our web resources -->
<web-resource key="tutorial-confluence-macro-demo-resources" name="tutorial-confluence-macro-demo Web Resources">
<dependency>com.atlassian.auiplugin:ajs</dependency>
<resource type="download" name="tutorial-confluence-macro-demo.css" location="/css/tutorial-confluence-macro-demo.css"/>
<resource type="download" name="tutorial-confluence-macro-demo.js" location="/js/tutorial-confluence-macro-demo.js"/>
<resource type="download" name="images/" location="/images"/>
<context>tutorial-confluence-macro-demo</context>
</web-resource>
</atlassian-plugin>
8) Starting confluence
atlas-clean
atlas-package
atlas-debug
9) Logging into Confluence
Here is the result on the Confluence administration page:
And now I can also find it in the macro browser, and it works.
Thanks
Hauke
Working with Atlassian plugins can be really frustrating :)
I checked out the macro source code from Bitbucket and made the following changes in the pom:
<properties>
<confluence.version>5.9.7</confluence.version>
<confluence.data.version>5.9.7</confluence.data.version>
<amps.version>6.2.4</amps.version>
<plugin.testrunner.version>1.1.1</plugin.testrunner.version>
</properties>
Then run
atlas-clean
atlas-package
atlas-debug
After that I was able to add the macro via the macro browser (on a Confluence 5.8.6 instance).
Unfortunately I had no time to check the differences between the source code and the tutorial, but my solution gives you a working state from which to try new stuff.
Your image is displaying ${atlassian.plugin.key} unresolved. Is your macro add-on working properly? It is displayed in the macro browser, but can you use it on the page? I also noticed you commented out atlassian.plugin.key in your pom.xml.
The use of <Atlassian-Plugin-Key> here tells the plugin system that you are a transformerless plugin and that it should skip the slow transformation step. This is VERY IMPORTANT. Without this entry in your Manifest, the plugin system will try to transform your plugin, and you will lose the load time speed benefits. You are also likely to see Spring-related errors. Do not forget to specify this entry.
See: Atlassian Spring Scanner
The new way of importing components is to use Atlassian Spring Scanner. It looks like you're mixing the old and new way of importing components by commenting out atlassian.plugin.key.
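In practice the property and the manifest instruction have to go together: define atlassian.plugin.key as a property (it is commented out in the pom above) and reference it in the maven-confluence-plugin instructions. A minimal sketch, taken from the commented-out block in that pom:
<!-- Define the plugin key once as a property... -->
<properties>
<atlassian.plugin.key>${project.groupId}.${project.artifactId}</atlassian.plugin.key>
</properties>
<!-- ...and reference it in the maven-confluence-plugin <instructions> block -->
<instructions>
<Atlassian-Plugin-Key>${atlassian.plugin.key}</Atlassian-Plugin-Key>
</instructions>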
Check out: Build a Macro Add-on
Confluence examples: Confluence Add-on Development examples