I have a library jar file in IBM COS which I want to add to a Scala notebook in Watson Studio. I want to use %AddJar, but I'm not sure what the URL should be to access the object.
When I right-click the object I get a URL starting with "cos://", which %AddJar does not recognize. Thanks
I'm assuming you are using a Python notebook with the Apache Spark service and that the jar file is in your project's COS bucket (please update your question if these assumptions are incorrect).
One option is to use project-lib to download the jar and write it to the libs folder on the Spark service:
from project_lib import Project
import os

project = Project(sc, "<ProjectId>", "<ProjectToken>")

# Get the file from the project's COS bucket as an in-memory file object
mem_jar_file = project.get_file("your.jar")
mem_jar_file.seek(0)

# Note: open() does not expand "~", so resolve the home directory explicitly
libs_dir = os.path.expanduser('~/data/libs')
os.makedirs(libs_dir, exist_ok=True)
with open(os.path.join(libs_dir, 'your.jar'), 'wb') as f:
    f.write(mem_jar_file.read())
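To verify the write worked, you can list the libs folder from the same notebook (assuming the same path as above):
import os
print(os.listdir(os.path.expanduser('~/data/libs')))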
Update
If you are using Scala, one option is to:
Use the IBM Cloud Object Storage Java library to retrieve the file
Write the jar file to the appropriate folder (~/data/libs, as above); a sketch of the download step follows below
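The Scala version follows the same retrieve-then-write pattern. For illustration, here is the download step using the COS S3-compatible API from a Python cell (a sketch only: ibm_boto3 ships with the ibm-cos-sdk package, and the endpoint, credentials, bucket, and key below are placeholders for your own values):
import os
import ibm_boto3
from ibm_botocore.client import Config

# All of these values are placeholders, taken from your COS service credentials
cos = ibm_boto3.client(
    "s3",
    ibm_api_key_id="<ApiKey>",
    ibm_service_instance_id="<ServiceInstanceId>",
    config=Config(signature_version="oauth"),
    endpoint_url="https://s3.us-south.cloud-object-storage.appdomain.cloud",
)

libs_dir = os.path.expanduser("~/data/libs")
os.makedirs(libs_dir, exist_ok=True)

# Download the jar object straight into the libs folder
cos.download_file("<BucketName>", "your.jar", os.path.join(libs_dir, "your.jar"))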
Related
I've built a jar that I can use from pyspark by adding it to ${SPARK_HOME}/jars and calling it using
spark._sc._jvm.com.mypackage.myclass.mymethod()
however what I'd like to do is bundle that jar into a Python wheel so someone can pip install the jar into their running pyspark/jupyter session. I'm not very familiar with Python packaging; is it possible to distribute jars inside a wheel and have that jar be automatically available to pyspark?
I want to put a jar inside a wheel or egg (not even sure if I can do that???) and, upon installation of said wheel/egg, put that jar in a place where it will be available to the JVM.
I guess what I'm really asking is, how do I make it easy for someone to install a 3rd party jar and use it from pyspark?
As you mentioned above, I hope you have already used the --jars option and are able to call the function from pyspark. If I understood your requirement correctly, you want to ship this jar in the install package so that the library is available on each node of the cluster.
There is a Databricks page that talks about adding third-party jar files to a pyspark Python wheel install. See if that is the information you are looking for:
https://docs.databricks.com/libraries.html#upload-a-jar-python-egg-or-python-wheel
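To expand on that: a wheel can carry a jar as package data, but nothing injects it into a running JVM automatically, so the usual pattern is to expose the jar's installed path from the package and pass it to Spark when the session is created. A minimal sketch (all names here, such as mypackage and myclass.jar, are placeholders):
# setup.py
from setuptools import setup

setup(
    name="mypackage",
    version="0.1.0",
    packages=["mypackage"],
    # Bundle the jar (stored at mypackage/jars/myclass.jar) into the wheel
    package_data={"mypackage": ["jars/*.jar"]},
)

# mypackage/__init__.py
import os

def jar_path():
    # Absolute path of the bundled jar inside the installed package
    return os.path.join(os.path.dirname(__file__), "jars", "myclass.jar")

After pip install, the user creates the session with spark.jars pointing at the bundled jar. As far as I know, that config has to be set before the JVM starts, so an already-running session would need a restart:
from pyspark.sql import SparkSession
import mypackage

spark = (SparkSession.builder
         .config("spark.jars", mypackage.jar_path())
         .getOrCreate())
spark._sc._jvm.com.mypackage.myclass.mymethod()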
I am looking for a tool that reads Java files with Swagger annotations and generates the API JSON spec file. Please let me know if there already is one.
You can use the swaggerdocgen Maven plugin to generate Swagger documents in both JSON and YAML formats. This assumes that you are able to use Maven for your application.
You can find more information about swaggerdocgen plugin here: https://github.com/WASdev/tool.swagger.docgen
Basically, you need to create a WAR Maven project and add your application files there. Then, as part of running mvn package, the Swagger document will be generated in the target directory of your project.
I want to make a custom buildpack on Bluemix; as part of it I am trying to add my own jar file as a javaagent. I used to work with Tomcat, where I just added the extra agent to the catalina.sh script.
On Bluemix, these are the steps I took:
I created a new project and uploaded my code.
I cloned the default Java buildpack to my own git repository.
In the repository I added the .jar file to the /lib/java_buildpack folder.
Now comes the step I have trouble with. I located the
java_opts.add_javaagent(@droplet.sandbox + 'javaagent.jar')
function call, which according to the comments should do exactly what I am looking for.
The issue is that when I check that function, I see that it calls the following:
qualify_path(path, root = @droplet_root)
"$PWD/#{path.relative_path_from(root)}"
I can't figure out where this @droplet_root location is; if I could find it, I could upload my jar file there.
I tried adding the relative position like this:
java_opts << "java_buildpack/myAgent.jar"
But it didn't work.
Any suggestions on how this might be achieved? Where should I place the file, or is there another way?
Forking the buildpack is one way to achieve this. You can implement the agent as a "framework" in the Java buildpack. Here are a couple of samples you can refer to which also add an agent jar:
https://github.com/cloudfoundry/java-buildpack/blob/master/lib/java_buildpack/framework/new_relic_agent.rb
https://github.com/cloudfoundry/java-buildpack/blob/master/lib/java_buildpack/framework/jrebel_agent.rb
Another slightly hacky way is to simply add the agent jar to your application package, and then add a Java option to enable the agent using the JAVA_OPTS environment variable. That requires you to find out the path where the agent jar ends up in the running application container; you can browse to it using "cf files". This has a dependency on the internal structure of the droplet, so it may break if the buildpack changes the droplet layout. For example:
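Assuming the jar sits at the root of your app package (the app name and path below are placeholders; verify the actual path with "cf files" first):
cf set-env myapp JAVA_OPTS "-javaagent:/home/vcap/app/myAgent.jar"
cf restage myapp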
I am currently learning Hadoop 2.5.
In order to modify some parts of HDFS, I checked out the HDFS project from the HDFS repository, but after importing it into Eclipse the compiler cannot find the package "org.apache.hadoop.hdfs.protocol.proto". This package is also empty in the SVN.
Any solutions?
Please follow the build process described in BUILDING.txt. The folder you're missing contains the protobuf sources that are generated during the usual Maven build.
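For Hadoop 2.x this usually comes down to the following (a sketch based on BUILDING.txt; it assumes protoc 2.5.0 is installed and on your PATH):
# From the root of the Hadoop source tree: build once so the
# org.apache.hadoop.hdfs.protocol.proto classes get generated
mvn install -DskipTests
# Then regenerate the Eclipse project files so the generated sources are visible
mvn eclipse:eclipse -DskipTests
After re-importing the project into Eclipse, the missing package should be populated.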
I have a Java-based App Engine endpoint project in Eclipse.
When I generate the client library using the command line tool
https://developers.google.com/appengine/docs/java/endpoints/endpoints_tool
I'm getting only a source-based jar file ('project_name_version'.java-1.18.0-rc-sources.jar). It does not work in Android Studio when I add it as a library.
How can I get a class-based jar client library (google-api-services-'project_name_version'-1.18.0-rc.jar)?
I tried searching online but no luck yet.
You could always zip up the sources and use them in Android Studio. However, note that in the build.gradle file you will have to reference the other dependent JAR files and versions that will be needed by the sources you generated in Eclipse via the Generate Cloud Endpoint Library option.
Build your App Engine back end with JRE 7. You can change this from Window -> Preferences -> Java -> Installed JREs. You'll find an Add button on the right side of the pane. For more detail, refer to this tutorial.
This will solve most of your problems.