I have a Gradle project that declares a test-only dependency on an XML data file and then loads the file from the classpath. When I run the tests directly from the Gradle command line, everything works fine. But when I run "gradlew eclipse", refresh the project in Eclipse, and then run the test from Eclipse (Debug As -> JUnit Test), the test fails because it can't find the XML file, and the classpath (as accessed from the Properties context menu item on the process in the Debug view) shows no indication of the XML file being included.
The behavior I'm seeing has some commonality with http://gradle.1045684.n5.nabble.com/gradle-junit-tests-resources-and-classpath-td4418753.html#a4420758, but Sean's problem was the reverse: his tests ran properly under Ant but not under Gradle (and he never mentioned trying to run them directly from the Eclipse JUnit plugin).
Here's the relevant part of build.gradle:
dependencies {
    testCompile group: 'com.mycompany', name: 'MyConfigFile', version: '0.0.0+dirty', ext: 'xml'
}
Because the only resources that URLClassLoader can load directly from the file system are JARs, I'm using the following static method to search the classpath for files that match the filename I need to load:
import java.net.URL;
import java.net.URLClassLoader;

public static String getFullPathForResourceDirectlyOnClasspath(String nameFragment) {
    // Assumes the system class loader is a URLClassLoader (true on Java 8 and earlier).
    ClassLoader cl = ClassLoader.getSystemClassLoader();
    for (URL url : ((URLClassLoader) cl).getURLs()) {
        String fullPath = url.getFile();
        // Return the first classpath entry whose path contains the fragment.
        if (fullPath.contains(nameFragment)) {
            return fullPath;
        }
    }
    return null;
}
I call that method as follows:
getFullPathForResourceDirectlyOnClasspath("/MyConfigFile-");
When I run that code from Gradle ("gradlew build"), it finds the file and my test succeeds. When I run it from Eclipse (Debug As -> JUnit Test), it fails to find that file on the classpath (because the Eclipse JUnit plugin doesn't put it there) and that call returns null.
I've tried changing the configuration from testCompile to compile to see if that made a difference, but it doesn't change anything (and, perhaps tellingly, my .classpath has no entry for the XML file even when the compile configuration is selected).
Does anyone know of a way to make this work? Am I just missing something that should be obvious?
It seems that you are abusing the class path to pass a single argument (the absolute file path of the XML file) to a test. Instead, you should either put the XML file on the (test) class path in the correct way (it needs to go into a directory or Jar file that's listed on the class path) and load it correctly (e.g. with getClass().getClassLoader().getResource("some/resource.xml")), or pass the file path to the test as a system property. Naturally, the latter will be harder to make work for different environments (say Gradle build and IDEs).
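For example, assuming the XML file ends up inside a jar or directory on the test classpath under the (hypothetical) resource path config/MyConfigFile.xml, loading it would look roughly like this:

import java.io.IOException;
import java.io.InputStream;

public class ConfigLoader {
    public InputStream openConfig() throws IOException {
        // "config/MyConfigFile.xml" is a placeholder; use whatever path the
        // file actually has relative to the classpath root.
        InputStream in = getClass().getClassLoader()
                                   .getResourceAsStream("config/MyConfigFile.xml");
        if (in == null) {
            throw new IOException("config/MyConfigFile.xml not found on classpath");
        }
        return in;
    }
}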
Related
I'm trying to run a test in sbt that uses a resource. The test depends on a utility that loads the resource via ClassLoader.getSystemResourceAsStream(...), which unexpectedly returns null. If I run the same test in IntelliJ or via bazel, it succeeds. I also created a main class to list all resources, following the ResourceList example given in an answer to this question, and this confirmed that the file (and many others) is inaccessible at runtime.
These resources are not in the resources directories that sbt usually uses; they are in a jar file included in the lib directory. The sources in my project depend heavily on this jar and compile successfully, so the issue appears to be specific to resources. One thing I noticed is that the resource can be loaded if I use a ClassLoader instance and call getResourceAsStream instead of the static method ClassLoader.getSystemResourceAsStream.
Does anyone know how to resolve this issue (short of copying out all resources from the jar file in lib)?
Try running the tests in a forked JVM:
Test / fork := true
Without forking, tests run inside sbt's own JVM, where the system class loader reflects sbt's launcher classpath rather than your test classpath, which is why ClassLoader.getSystemResourceAsStream returns null; in a forked JVM, the test classpath is the application classpath.
Unmanaged dependencies in lib/ should end up by default on all the classpaths, including test, without having to do anything special:
Dependencies in lib go on all the classpaths (for compile, test, run, and console).
Note that when running tests in IntelliJ you might be using IntelliJ's internal build system as opposed to the sbt shell, which could be why it worked in IntelliJ. To run tests in IntelliJ via the sbt shell, select the "Use sbt" checkbox in Edit Configurations...
I would suggest using getClass.getResourceAsStream, since it falls back to the system ClassLoader anyway:
public InputStream getResourceAsStream(String name) {
    name = resolveName(name);
    ClassLoader cl = getClassLoader0();
    if (cl == null) {
        // A system class.
        return ClassLoader.getSystemResourceAsStream(name);
    }
    return cl.getResourceAsStream(name);
}
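A typical call site, as a minimal sketch (the resource path /some/resource.xml is just a placeholder):

import java.io.IOException;
import java.io.InputStream;

public class ResourceReadExample {
    public void readResource() throws IOException {
        // With Class.getResourceAsStream, a leading slash resolves the name
        // against the classpath root instead of this class's package.
        try (InputStream in = getClass().getResourceAsStream("/some/resource.xml")) {
            if (in == null) {
                throw new IOException("/some/resource.xml not found on classpath");
            }
            // ... consume the stream ...
        }
    }
}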
I have a simple example:

import java.io.IOException;

public class FileSystemReadFile {
    public static void main(String[] args) throws IOException {
        System.out.println("Reading the file " + args[0]);
    }
}
which I created in IntelliJ and from which I want to build a JAR file. So here's what I did:
Added an artifact with dependencies (presumably I have some);
Ensured that MANIFEST.MF is located in src\main\resources\META-INF\, as already mentioned elsewhere on this site;
Ran the artifact build, which gave me a JAR file in the out folder; when I ran that jar file (java <name>.jar), it said "Could not find or load main class".
You can see that the main class is added to the MANIFEST and the location of the manifest is also fine.
When I open the created JAR file, I see the same MANIFEST content and lots of dependency modules, but I don't see my class!
I suspect that is the cause. Any ideas?
If you include any signed JARs in your app and then use IntelliJ to build artifacts, it will extract the JARs and bundle them with your compiled output. This then causes a Java security exception. I've seen this with Eclipse Paho and Bouncy Castle, which are signed.
You can check if any of the library JARs you are using are signed using the jarsigner tool.
jarsigner -verify -verbose <path to library JAR>
Change your IntelliJ artifact setup so that these get bundled as libraries instead of being extracted; extraction invalidates the signature, as you'd expect.
Try creating a dummy project with just Main. Add one library JAR (of those you are trying to build with) at a time, building an output JAR each time, until Main breaks. That's how I found this.
IntelliJ really should warn you about this.
Not sure what was wrong with IntelliJ, but I rebuilt the artifacts again and it was OK. Both of these now work fine:
hadoop jar <Jar-name>
java -jar <Jar-name>
I have imported a project (I am very new at this) and I get the following errors:
Project cannot be built until build path errors are resolved
Project FST is missing required library: 'C:/Program Files/Apache Group/Tomcat 4.1/common/lib/servlet.jar'
Project FST is missing required library: 'C:/Program Files/Apache Group/Tomcat 4.1/common/lib/struts.jar'
The project cannot be built until build path errors are resolved
Unbound classpath variable: 'TOMCAT_HOME/common/lib/jasper-runtime.jar' in project
Unbound classpath variable: 'TOMCAT_HOME' in project FST
I created a variable called TOMCAT_HOME and gave it the proper directory, but apparently I should also change the project classpath to use TOMCAT_HOME rather than the absolute paths, and I don't know how to do that.
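I'm guessing the relevant entries in my .classpath file need to change from absolute library entries to variable entries, something like this sketch (hypothetical, based on the jar names in the errors above):

<classpathentry kind="var" path="TOMCAT_HOME/common/lib/servlet.jar"/>
<classpathentry kind="var" path="TOMCAT_HOME/common/lib/struts.jar"/>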
Thanks!
To answer somewhat indirectly: if you configure your project build with something like Maven or Gradle, so that you can successfully build the project using the corresponding command-line tool, then it should be quite straightforward to import the project into eclipse using the Maven or Gradle eclipse plugin. I think doing so will be worth whatever trouble it causes you in the short run. Just take care to make your project structure conform to the usual layout Maven expects, or you'll be asking for trouble (this should be no problem for greenfield work).
I want to generate an eclipse project from a pom file that ignores resources (code and libraries) that are not required for compilation (like test resources).
Oh, and I can't modify the original pom file.
For now, I use:
mvn org.apache.maven.plugins:maven-eclipse-plugin:eclipse
And I get a project with the test sources and dependencies included.
Is there some parameter or config file that I can pass to that command to exclude the unwanted stuff?
Update: I don't want to have to use m2e or change my eclipse configuration.
I'm altering a hadoop map-reduce job that currently compiles and runs fine without my changes.
As part of the job, I will now be connecting to S3 to deliver a file.
I developed a (very simple) S3Connector class, tested and ran it in eclipse, then went to hook it into my reduce job. In order to run the job in hadoop, I have to export the project as a jar file and then call it from hadoop. The jar file seems to compile and export from eclipse without problems, but when I run it in hadoop, I get a java.lang.VerifyError exception:
java.lang.VerifyError: (class: com/extrabux/services/S3Connector, method: connectToS3 signature: ()V) Incompatible argument to function
Several other posts mention that conflicting jar version dependencies may be the cause, but in my eclipse build path I added all the latest jar files for the libraries in question and pushed them to the top of the build path order.
This is about as simple as I can isolate it down to:
import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.security.AWSCredentials;

public class S3Connector {
    protected RestS3Service s3Service;
    protected AWSCredentials awsCredentials;

    public S3Connector() {
        this.awsCredentials = new AWSCredentials("my secret 1", "my secret 2");
    }

    public void connectToS3() throws Exception {
        this.s3Service = new RestS3Service(this.awsCredentials);
    }
}
Even that simple class dies with the same message. As soon as I comment out the AWSCredentials in the constructor and the RestS3Service, the issue disappears. Basically, I think it's some kind of library export problem out of eclipse, but I'm not sure how to find it.
Figured this out. There was an old version of the jets3t jar in the hadoop lib dir. The hadoop command-line script loops over all jars in the lib dir and adds them to the classpath of the final exec'ed command it builds. The 0.6.0 jar on that classpath was overriding the good 0.8.0 jar that I was exporting inside my own jar file. Since the 0.6.0 version did not have the RestS3Service constructor I was calling, the java.lang.VerifyError was thrown. Removing the 0.6.0 jar from hadoop's lib dir fixed it.
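As a quick sanity check in situations like this, you can print where a class was actually loaded from; here's a minimal sketch (assuming the jets3t jar is on the classpath):

import java.net.URL;
import org.jets3t.service.impl.rest.httpclient.RestS3Service;

public class WhichJar {
    public static void main(String[] args) {
        // getCodeSource() can be null for JVM bootstrap classes, but for a
        // library class it points at the jar (or directory) it came from.
        URL location = RestS3Service.class.getProtectionDomain()
                                          .getCodeSource()
                                          .getLocation();
        System.out.println("RestS3Service loaded from: " + location);
    }
}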