I started with DataNucleus a couple of days ago.
I downloaded the JDO tutorial and am trying to run it.
I have collected all the jar files related to the tutorial here.
I am using Ant for building. The "compile" and "enhance" tasks work fine, but the "createschema" task spits out the following error:
C:\datanucleus\datanucleus-samples-jdo-tutorial-3.0\build.xml:123: taskdef class org.datanucleus.store.rdbms.SchemaToolTask cannot be found
I have checked datanucleus-rdbms.jar for SchemaToolTask and I didn't find that class in the jar file.
I downloaded it from here.
Why is that class not there? Am I using the wrong jar file?
Why would you download 3.0 and use it with a tutorial for v1.0?
In version 3.x the location of the SchemaToolTask changed from:
org.datanucleus.store.rdbms
to:
org.datanucleus.store.schema
reflecting the fact that the schema tool is datastore-agnostic.
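So a 3.x build file would declare the task against the new package. A minimal sketch; the task name and the classpath reference here are placeholders, only the class name comes from the above:

    <taskdef name="schematool"
             classname="org.datanucleus.store.schema.SchemaToolTask"
             classpathref="datanucleus.classpath"/>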
Found that I was using source jar files instead of the binaries.
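One quick way to tell the two apart is to list the jar's contents: a binary jar contains .class entries, a source jar only .java entries. A hypothetical check on Windows (the exact jar name depends on your download):

    jar tf datanucleus-rdbms-3.0.jar | findstr /L ".class"

If that prints nothing, the jar holds sources rather than compiled classes.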
[Closing the question by marking this as the answer]
I am attempting to work through the N+1 TipCalc tutorial and get the following errors when trying to compile the core project:
Error 1 Metadata file '1\packages\MvvmCross.HotTuna.CrossCore.3.0.4\lib\portable-win+net45+MonoAndroid16+MonoTouch40+sl40+wp71\Cirrious.CrossCore.dll' could not be found C:\Source.CS\trunk\Learning\Mono\TipCalc1\TipCalc.N=1\TipCalc.Core\CSC TipCalc.Core
Error 2 Metadata file '1\packages\MvvmCross.HotTuna.StarterPack.3.0.4\lib\portable-win+net45+MonoAndroid16+MonoTouch40+sl40+wp71\Cirrious.MvvmCross.dll' could not be found C:\Source.CS\trunk\Learning\Mono\TipCalc1\TipCalc.N=1\TipCalc.Core\CSC TipCalc.Core
Error 3 Metadata file '1\packages\MvvmCross.HotTuna.CrossCore.3.0.4\lib\portable-win+net45+MonoAndroid16+MonoTouch40+sl40+wp71\Cirrious.MvvmCross.Localization.dll' could not be found C:\Source.CS\trunk\Learning\Mono\TipCalc1\TipCalc.N=1\TipCalc.Core\CSC TipCalc.Core
NuGet 2.5 RC was installed from here: https://nuget.codeplex.com/
The Profile104 XML files were created as per the instructions and had been used for a few weeks with v3 before attempting to use NuGet for the MvvmCross components.
The pre-NuGet TipCalc tutorial was completed without problems.
The complete project downloaded from GitHub also compiles without issues.
I haven't been able to see any difference between the packages.config file in the downloaded project and the one in the project created from scratch.
The profile of the core project is also the same in each case.
The DLLs are at the location indicated.
There were no issues with using NuGet to download the package, only when attempting to compile the project.
Any suggestions as to what I have missed?
TIA
My guess is that the problem is in your path.
The path seems to be C:\Source.CS\trunk\Learning\Mono\TipCalc1\TipCalc.N=1\TipCalc.Core\CSC, and the message is that '1\packages\MvvmCross.HotTuna.CrossCore.3.0.4\lib\portable-win+net45+MonoAndroid16+MonoTouch40+sl40+wp71\Cirrious.CrossCore.dll' could not be found. Note that the relative path in the message starts at '1\packages', which is exactly what follows the '=' in 'TipCalc.N=1', so the '=' seems to be confusing the tooling.
So maybe try using a path without an '='?
Does anybody know of a good tutorial for getting started with NQUnit.NUnit?
I've installed it in my test project via NuGet and am unsure what the blank.js and async.js files are all about. Should I rename these to match my files under test, or do I just add my asynchronous and synchronous tests to the respective files?
Ta!
Find the answer in the following link:
NQUnit: JavaScript testing within .NET / CI
I have some Scala code written using IntelliJ with the SBT plugin, and I want to provide my code as a DLL for C++.
I already tried to use 'ikvmc': I packed all my classes via 'package' into one jar. Afterwards I manually set up one jar containing all the dependencies I use (scala-library, scama, jamtio, jama). Unfortunately I obtain a lot of warnings: 'IKVMC0119', 'Emitted java.lang.VerificationError' and 'IKVMC0104' (analogous to the example below).
Then I tried to convert a simple Scala class (no dependencies) using the method described above: package with sbt, add the scala-library.jar, and try to convert it via ikvmc -target:library simpleClass.jar. I obtain the same warnings/errors as you see below...
I would be very happy if someone could give me a step-by-step explanation of how to provide my Scala code as a DLL.
Thanks a lot in advance!
Which IKVM version do you use?
If you already use 7.1, then it sounds like a bug in IKVM. Contact the mailing list or the bug tracker with a sample for reproduction.
If you use an older version, then you should update.
After converting the hello.jar with the previous version of IKVM (7.0.4335.0) I could use the DLL in C# (even though I obtained warnings from ikvmc). It also worked for my Scala code: converting the sbt-packaged jar with its dependencies delivered a DLL. Afterwards I could use the classes in C#!
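For reference, the invocation that worked looked roughly like this; the jar and output names are illustrative:

    ikvmc -target:library -out:MyScalaLib.dll myapp.jar scala-library.jar

ikvmc folds all the listed jars into the single output DLL, so the Scala runtime travels with your classes.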
Thanks in advance for your attention.
It's the first time I am writing on this site (quite a newbie :) ).
I previously read a question from a user asking about the same problem as mine. Although I read a lot, I could not find a solution.
Problem:
I am trying to use the MatlabControl jar (http://code.google.com/p/matlabcontrol/) to "call" MATLAB from within my Java code.
When I try this API in a normal Java application (including matlabcontrol.jar in the build path), everything works perfectly.
My issue is making it work in an EJB module on JBoss 5 AS:
I can deploy the EJB module and I can see the classes of matlabcontrol.jar (which I put in the server/default/lib folder), but it is not working and returns the following exception:
Caused by: java.lang.NullPointerException
at java.io.File.<init>(File.java:251)
at matlabcontrol.Configuration.getSupportCodeLocation(Configuration.java:227)
at matlabcontrol.RemoteMatlabProxyFactory.createProcess(RemoteMatlabProxyFactory.java:278)
at matlabcontrol.RemoteMatlabProxyFactory.requestProxy(RemoteMatlabProxyFactory.java:116)
at matlabcontrol.RemoteMatlabProxyFactory.getProxy(RemoteMatlabProxyFactory.java:134)
at matlabcontrol.MatlabProxyFactory.getProxy(MatlabProxyFactory.java:81)
which led me to the following lines:
URL url = Configuration.class.getProtectionDomain().getCodeSource().getLocation();
File file = new File(url.toURI().getPath()).getCanonicalFile();
The very strange thing is that, very rarely, after restarting JBoss and re-deploying the EJB module, the system works!
I really don't know whether I have to modify the source code of those last two lines (as if it were a problem of not properly getting the location of the jar code) or to change some JBoss configuration files so that the classpath is set differently.
Thanks again in advance.
Any help would be very appreciated.
The mistake was in the code that finds the location of the jar at runtime.
I printed the path, which ended with "!" and a double slash, so I removed these characters and was finally able to make this API work.
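A minimal sketch of that workaround, assuming the same two lines as the starting point (the class and method names here are illustrative, not part of matlabcontrol):

    import java.io.File;
    import java.net.URL;

    public final class CodeSourceLocator {

        // Resolve the jar location the way matlabcontrol does, then strip the
        // "!" and doubled slashes that JBoss appends to in-container jar URLs.
        public static File locate(Class<?> anchor) throws Exception {
            URL url = anchor.getProtectionDomain().getCodeSource().getLocation();
            String path = url.toURI().getPath();
            path = path.replace("!", "").replace("//", "/");
            return new File(path).getCanonicalFile();
        }
    }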
I am trying to run MapReduce jobs using the hadoop-eclipse plugin with Eclipse Indigo, but I am getting the following error:
Error: failure to login
While looking for help, I found there is a problem with Hadoop 0.20.203.0, so I tried Hadoop 0.20.205.0, as the issues are fixed in that version.
I am still facing the same problem. Am I missing something or making a mistake?
Sorry for my poor English. As your question has no more detail, I guess that you met the same problem as me; if so, the following link resolved my problem. Please pay attention to step 4.
http://hi.baidu.com/wangyucao1989/blog/item/279cef87c4b37c34c75cc315.html
Sorry that it is a page in Chinese. It says the problem is that the file hadoop-eclipse-plugin-0.20.203.0.jar is missing 5 files: commons-configuration-1.6.jar, commons-httpclient-3.0.1.jar, commons-lang-2.4.jar, jackson-core-asl-1.0.1.jar and jackson-mapper-asl-1.0.1.jar. You should:
Extract hadoop-eclipse-plugin-0.20.203.0.jar,
Add the 5 files into hadoop-eclipse-plugin-0.20.203.0/lib,
Modify hadoop-eclipse-plugin-0.20.203.0/META-INF/MANIFEST.MF (update the Bundle-ClassPath),
Re-jar the package and replace the old hadoop-eclipse-plugin-0.20.203.0.jar.
The OS the page refers to is Linux; my OS is Win7.
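On Linux, the four steps might look roughly like this; $HADOOP_HOME and the file locations are assumptions about your setup:

    mkdir plugin && cd plugin
    jar xf ~/hadoop-eclipse-plugin-0.20.203.0.jar           # step 1: extract the plugin
    cp $HADOOP_HOME/lib/commons-configuration-1.6.jar \
       $HADOOP_HOME/lib/commons-httpclient-3.0.1.jar \
       $HADOOP_HOME/lib/commons-lang-2.4.jar \
       $HADOOP_HOME/lib/jackson-core-asl-1.0.1.jar \
       $HADOOP_HOME/lib/jackson-mapper-asl-1.0.1.jar lib/   # step 2: add the 5 jars
    # step 3: edit META-INF/MANIFEST.MF so Bundle-ClassPath also lists the new lib/*.jar entries
    zip -r ../hadoop-eclipse-plugin-0.20.203.0.jar .        # step 4: re-jar everything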
good luck!
Instead of adding the plugin, you can just add the required libraries in Eclipse and do your programming; a small driver sketch follows the list below.
Here is the list of libraries you will need. These files ship with the Apache Hadoop distribution in its lib folder.
hadoop-core-1.1.2.jar
log4j-1.2.15.jar
jackson-mapper-asl-1.8.8.jar
jackson-core-asl-1.8.8.jar
commons-logging-api-1.0.4.jar
commons-logging-1.1.1.jar
commons-lang-2.4.jar
commons-httpclient-3.0.1.jar
commons-configuration-1.6.jar
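With those jars on the build path, a job can be written and run directly from Eclipse. A minimal word-count sketch against the hadoop-core-1.1.2 API; the class name and the input/output paths passed as args are illustrative:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        public static class TokenMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Emit (token, 1) for every whitespace-separated token in the line.
                for (String token : value.toString().split("\\s+")) {
                    if (!token.isEmpty()) {
                        word.set(token);
                        context.write(word, ONE);
                    }
                }
            }
        }

        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                // Sum the per-token counts emitted by the mappers.
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenMapper.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }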