I have run into a compilation error:
[warn] Class com.google.api.client.auth.oauth2.Credential not found - continuing with a stub.
[error] error while loading GoogleService, class file '....../gdata-core-1.0.jar(com/google/gdata/client/GoogleService.class)' is broken
I found this similar question but couldn't successfully use its solution for my case. How can I trace what is actually broken in this jar according to the Scala compiler (i.e. get the details of what's broken), so I can be sure what the solution ought to be?
The source from which I think this Google jar is built is here.
Note: unlike in the other questions, in my case the Google jars are included unmanaged in my project, in the lib directory, as per these Google setup instructions.
As described in your linked answer, this error happens when a class is annotated with annotations whose classes are no longer present on the classpath. Java considers this acceptable, but Scala does not. You'll generally only run into this problem with heavily optimized Java libraries where the "unnecessary" annotations have been deliberately excluded. Google does this for a lot of their code; to be perfectly honest I don't think I've ever seen the problem in any non-Google libraries.
The pragmatic solution is to use advanced search on maven central to find the jar that contains the missing class. If gdata-core had been built with maven (as most serious java libraries are these days), it would be easy to see from the <dependencies> section of pom.xml which dependencies had been declared optional, and therefore figure out where any classpath problems of this kind were likely to be. Unfortunately this particular library is still built with ant, so it's hard to determine the build classpath without reading the whole build.xml and figuring it out "by hand".
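If you want to pin down exactly which referenced classes are missing rather than guess, one option is to list every class that the broken class file refers to and check each name against your classpath. Here is a rough sketch of that idea; it assumes the ASM library (org.ow2.asm:asm plus asm-commons, ASM 5+) is available and that you run it from something like the sbt console so the project classpath is visible. The jar path and class name below are taken from the error message and may need adjusting.

import java.util.jar.JarFile
import org.objectweb.asm.{ClassReader, ClassWriter}
import org.objectweb.asm.commons.{ClassRemapper, Remapper}

object ListReferencedClasses extends App {
  // Open the suspect class file inside the unmanaged jar.
  val jar   = new JarFile("lib/gdata-core-1.0.jar")
  val entry = jar.getJarEntry("com/google/gdata/client/GoogleService.class")
  val in    = jar.getInputStream(entry)

  // Push the class through a remapper whose only job is to record every
  // class name it is asked to map, i.e. every class this class file references
  // (including annotation classes, which is what trips up scalac).
  val refs = scala.collection.mutable.Set[String]()
  val collector = new Remapper {
    override def map(internalName: String): String = {
      refs += internalName.replace('/', '.')
      internalName
    }
  }
  new ClassReader(in).accept(new ClassRemapper(new ClassWriter(0), collector), 0)
  in.close(); jar.close()

  // Print the referenced classes that cannot be resolved on the current classpath.
  refs.toSeq.sorted.foreach { name =>
    try Class.forName(name, false, getClass.getClassLoader)
    catch { case _: Throwable => println(s"missing: $name") }
  }
}

Any name printed as missing is a candidate for the jar you still need; searching Maven Central for that fully qualified class name then tells you which artifact ships it.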
I had to find a Google-supplied jar file containing the class mentioned in the warning (Credential), and put it in the lib directory used by sbt.
After quite a lot of heuristics, it turned out to be the jar file google-oauth-client-1.18.0-rc.jar, which I obtained here and which equally exists here. I figured out that the source file the error is reported for simply does not define that class itself, but rather imports it from a different package, com.google.api.client.auth.oauth2. I guess the latter package must have been present at compile time, and its compiled class is necessary for Scala to be able to use the classes contained in the former jar, at least when Scala is involved.
I'm not exactly sure how Google's build system produced all of this, nor how to pin down the annotations that made the additional jar required for Scala, but the missing class is no longer missing as far as Scala is concerned.
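For completeness, nothing special is needed on the sbt side for the extra jar: anything dropped into the directory pointed to by unmanagedBase (lib/ by default) ends up on the compile classpath. A minimal sketch, only needed if you keep the jars somewhere other than lib/ (the path below is just an example, sbt 0.13 syntax):

// build.sbt
unmanagedBase := baseDirectory.value / "google-libs"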
I hope someone will chime in to provide a deeper answer on how to pin down the details of a class being "broken" for Scala, and on how to trace where to obtain the missing pieces without my heuristic search.
The JetBrains Kotlin compiler in Eclipse outputs to a hidden folder inside the Eclipse compiler plugin. This hidden folder is then made available through the Eclipse Kotlin classpath container.
In bndtools we need a normal file system folder, since bnd can run both from the file system and in Eclipse. Since the folder is a linked resource, there is no known way to translate it outside Eclipse.
Does anybody know how to tell the Kotlin compiler to just output to the bin folder?
Currently, this is not possible in the Kotlin Eclipse plugin.
To make it possible for Kotlin code to be used from Java, the Kotlin plugin produces so-called lightweight class files into this folder. These class files do not contain method bodies, and they are stored in memory.
The actual class files that are used to run an application are built only just before launch, and they are produced into the default output folder. For now, we cannot produce class files on each save reasonably fast, as there is no incremental compilation in the plugin yet.
Feel free to upvote this issue.
From a short analysis of the Kotlin plugin's code, it looks like the relevant method is KotlinCompiler.compileKotlinFiles. It is called in two contexts:
1. KotlinBuilder.build — this is the one called on project build; it uses a call-stack trick (or rather a hack...) to check whether it is being called from the LaunchConfigurationDelegate and, depending on the result, either compiles the whole project (via its own private fun compileKotlinFiles) or just makes stubs in memory.
2. KotlinCompilerUtils.compileWholeProject — this is in fact called from 1.; a nice static method, perfect for abuse until the problem is properly solved in the plugin. :)
So I'd use the method from 2., wrapped in a similar way to the compileKotlinFiles call in the file from 1.
I'd like to write a plugin that makes its code available to the projects that use the plugin.
The plugin would be defined as follows:
package mypackage
object MyPlugin extends sbt.Plugin {
...
}
trait MyInterface {
...
}
Client code should be able to extend and instantiate mypackage.MyInterface, to make it possible for the plugin to distinguish MyInterface instances while parsing the Analysis API info.
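For illustration only, a hypothetical client implementation could look like this (the project and class names are made up; it just assumes the plugin's classes are on the client's classpath):

package clientproject

class MyTestProbe extends mypackage.MyInterface {
  // ...client-specific members implementing MyInterface
}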
I should add that I would like to create a separate configuration for doing some code testing (the existing test setup is not suited to my case), and the plugin's code would be exported only to this configuration's classpath.
In case someone asks whether this approach is legitimate: sbt itself uses this method for working with plugins. I've found almost no documentation on writing sbt plugins and was forced to peek inside the sbt code, where I found similar cases and some hints. But the code is complicated, full of macros and DSL and lacking documentation strings, so I grasped only part of it.
My rather limited knowledge of sbt still lets me argue about the merits of your question and what you're trying to achieve.
Since a plugin is a part of the build and should (merely) help the project's artifact(s) see the light of production release, it should in no way be a dependency of the artifact, as you'd then have to release that dependency (which is an sbt plugin in its current form) so others would be able to download it, too.
What you'd rather do is have a plugin that wraps a dependency and, when included in a build, makes that library a dependency of your project. That's acceptable in my opinion.
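A minimal sketch of that "plugin wraps a dependency" idea, using the old-style sbt.Plugin API from the question (sbt 0.13-era; the organization, artifact name and version are placeholders, not a real library):

import sbt._
import Keys._

object MyPlugin extends Plugin {
  // A build that adds these settings gets the wrapped library (the one that
  // ships mypackage.MyInterface) on its classpath as an ordinary dependency.
  val myPluginSettings: Seq[Setting[_]] = Seq(
    libraryDependencies += "mypackage" %% "my-interface" % "0.1.0"
  )
}

A project using the plugin would then add MyPlugin.myPluginSettings to its own settings.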
How much does this differ from your use case? I'd be happy to refine the answer after hearing some additional information.
This is my first post here. I'm currently working on a simple HTTP audio servlet in Scala on Apache Karaf 3.0.0. I'm deploying it as a feature consisting of a few bundles, which I've built using a Maven project. I'm using the javax.sound.sampled library to get the audio, and I'm loading the file for the AudioSystem with java.io.File.
import java.io.File
import javax.sound.sampled.AudioSystem
val file = new File("audioFile.wav")
val audioStream = AudioSystem.getAudioInputStream(file)
This is obviously not the actual code, as I've stripped out all of the trivial bits. But this is where it fails, on the 'getAudioInputStream' call.
When I deploy this code to Karaf it fails with an UnsupportedAudioFileException. The file does exist and is readable; I've already validated this. Also, I've made sure that this code runs fine under the following:
- Scala 2.10.2, 2.10.3
- Java 1.7.0_45 (this is the same JRE my Karaf instance is using)
- SBT 0.12.4 (with the different Scala versions)
The only place this fails is when I deploy it to Karaf. I don't know if Karaf has cut out some audio support, or what is going on, because this otherwise works when run through SBT or with the Scala command line. I've also looked into alternative libraries, but to no avail. Most other solutions seem to be based around actually playing the audio through a sound driver, which is useless to me; I need the actual byte data.
Also, keep in mind that just sending the file over is also useless to me. Another requirement is that I need to be able to merge multiple audio files into one seamless audio stream. I already have this done; I just need to port it to OSGi, and for some reason I am now getting this error. I don't know if Karaf has something to do with it, or if building it through a Maven project has broken something. I've looked around and have found very little hint as to where the problem might be.
The audio files I'm using are waveform audio (WAV): 8,000 Hz sampling rate, 16 bits per sample. I don't think this actually makes a difference, but I'm no expert on audio formats.
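For context, a minimal sketch of the kind of merging I mean (not my actual code), assuming plain javax.sound.sampled and that both WAV files share the same AudioFormat:

import java.io.{File, SequenceInputStream}
import javax.sound.sampled.{AudioInputStream, AudioSystem}

// Concatenate two PCM streams of identical format into one seamless stream.
def concat(a: AudioInputStream, b: AudioInputStream): AudioInputStream =
  new AudioInputStream(
    new SequenceInputStream(a, b),         // raw bytes of a followed by b
    a.getFormat,                           // formats must match for this to be valid
    a.getFrameLength + b.getFrameLength)   // total length in sample frames

val merged = concat(
  AudioSystem.getAudioInputStream(new File("part1.wav")),
  AudioSystem.getAudioInputStream(new File("part2.wav")))

It's the getAudioInputStream calls that blow up under Karaf.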
My pom.xml dependencies are as follows. The only plugin I'm using is the Scala compiler, and of course my root pom.xml is using the org.apache.felix maven-bundle-plugin. There's really not much magic going on here, yet the mystery remains.
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.3</version>
</dependency>
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>servlet-api</artifactId>
  <version>2.5</version>
</dependency>
Any clues would be greatly appreciated, thank you.
I think AudioSystem is not fully OSGi-ready. This is what I found in the Apache Aries SPI Fly documentation.
I'm not sure what exactly you have to do to make it work, but this might help.
https://aries.apache.org/modules/spi-fly.html
Special Cases
SPI Fly can be used for most SPI provider/lookup systems that use the TCCL pattern to obtain implementations. However in some cases, some special treatment is needed. This special treatment is often needed when the API itself does not match the name of the resources in META-INF/services, java.util.ServiceLoader is such a case, however SPI-Fly has built-in knowledge of ServiceLoader. Known APIs that require special treatment are listed below:
javax.sound.sampled.AudioSystem: This class uses sun.misc.Service under the covers (via com.sun.media.sound.JDK13Services), which is a predecessor to java.util.ServiceLoader. There is no special treatment for sun.misc.Service in SPI Fly (yet), but the AudioSystem.getAudioInputStream() API can be made to work by explicitly listing it in the provider bundle (the one that contains the relevant META-INF/services resources):
SPI-Provider: javax.sound.sampled.AudioSystem
On the consumer side you can use:
SPI-Consumer: javax.sound.sampled.AudioSystem#getAudioInputStream
Christian's answer is correct but I wanted to provide an updated link to the spifly documentation page. Specifically:
Java's java.util.ServiceLoader.load(), other similar methods such as sun.misc.Service.providers() and also other static finder methods such as the FactoryFinder.find() methods try to locate 'service' implementations by looking for resources in the META-INF/services directory of all the jars visible to the Thread Context ClassLoader (TCCL).
There are a number of issues with the above mechanisms when used in OSGi:
- The Thread Context ClassLoader is not defined in general in an OSGi context. It can and has to be set by the caller, and OSGi cannot generally enforce that.
- A bundle can't Import-Package META-INF/services, as potentially many bundles will contain this pseudo-package and the OSGi framework will only bind a single exporter to an importer for a given package.
- Instantiating an SPI provider generally requires access to internal implementation classes; by exporting these classes an implementing bundle would break its encapsulation.
- Even if an implementation class was exported, importing this class in a consumer bundle would bind it to the specific implementation package provided, which violates the principle of loose coupling.
- Bundles have a dynamic life-cycle, which means that provided services could disappear when a bundle is updated or uninstalled. The java.util.ServiceLoader API does not provide a mechanism to inform service consumers of such an event.
The SPI Fly project makes it possible to use existing code that uses ServiceLoader.load() and similar mechanisms under OSGi.
Please note that as of 2016-05-20, new versions of the com.googlecode.soundlibs artifacts were uploaded to the Maven Central repository. These new versions of the artifacts are proper OSGi bundles. This will help everyone who needs to use the Java Sound API within an OSGi container.
I created a simple example project on GitHub that demonstrates how to play an MP3 file inside an OSGi container using the Java Sound API. Check out the branches static-weaving and dynamic-weaving to see the respective solutions.
I've got a little problem. I want to use Hibernate in an Eclipse RCP application (I'm new to OSGi and Eclipse RCP). I added the jar to the plugin project folder, the build path, and the bundle build path, but when I try to use Hibernate from my bundle, it crashes with a ClassNotFoundException.
What is the proper way to do this?
Please look at the Eclipse buddy policy. It might help if you are facing classes not getting loaded because of OSGi classloading.
Hibernate, and many other classic Java programs, (ab)use dynamic class loading to connect their different parts. The classes they use are read from a file and then loaded with Class.forName. This is fundamentally non-modular, since these classes are by definition implementation classes, which should be hidden.
Since OSGi is a modularity framework, it puts fences around a module (a bundle) and refuses to load anything that is not properly exported and imported. So when Hibernate does its Class.forName, it runs right into this fence, as it should if you want the advantages of modularity.
The Eclipse buddy policy is like a huge hole in this fence, moving things back to the bad old linear classpath search. With a buddy policy, Eclipse will simply start searching for a class somewhere that has that name. Since this ignores versions, you can no longer rely on proper version handling. The good news is that it works most of the time. The bad news is that you lose privacy, and when it does not work you get weird errors.
With Hibernate, the only solution is to not use the text-file setup but to use the API and give Hibernate the actual classes. In that case Hibernate will use the class loader of those classes, and that works. In OSGi, as long as you follow the Java language rules, there are no problems.
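A minimal sketch of what "give Hibernate the actual classes" can look like with the classic Configuration API (the entity class and connection URL here are made-up placeholders):

import org.hibernate.cfg.Configuration

// Passing the Class object itself means Hibernate resolves the entity through
// that class's own bundle class loader, instead of doing Class.forName on a
// name read from a mapping file.
val sessionFactory =
  new Configuration()
    .addAnnotatedClass(classOf[com.example.model.Person])
    .setProperty("hibernate.connection.url", "jdbc:h2:mem:test")
    .buildSessionFactory()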
To handle the kind of problems that these class-loading hacks try to address, OSGi uses services.
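For comparison, a rough sketch of the service-based alternative (the GreetingService trait and its implementation are purely illustrative):

import org.osgi.framework.{BundleActivator, BundleContext}

trait GreetingService { def greet(name: String): String }

class Activator extends BundleActivator {
  def start(context: BundleContext): Unit = {
    // The implementing bundle publishes its implementation under the service
    // interface; consumers look the service up instead of loading impl classes.
    context.registerService(classOf[GreetingService].getName,
      new GreetingService { def greet(name: String) = s"Hello, $name" },
      null)
  }
  def stop(context: BundleContext): Unit = ()
}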
I'm sure I'm just being obtuse, but I've bought the OA book and a couple of others, and I'm still just as dense as before... I'm trying to build a Scala library with Maven and the Scala plugin, but I think this applies to Java as well.
It has no main code module; it's just a library. With a package such as com.busygeeks.binklebots and source files under it, I created:
src
  scala
    com
      busygeeks
        binklebots
          source files ...
When I try a maven:compile, it completes successfully, but doesn't actually build anything.
I know it's very basic -- but I'm missing it. How can I say "take everything under src/scala, build it, and jar it"?
It looks like you might just need a main directory in there between src and scala.
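For reference, the conventional Maven layout that the Scala plugin picks up without extra configuration looks like this:

src
  main
    scala
      com
        busygeeks
          binklebots
            source files ...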
If you really wanted to, you could specify a custom layout with the java and scala directories immediately under src. But you almost certainly don't want to, for the reasons given in the Maven documentation linked above.