I am beginning work on development/maintenance of a J2ME MIDlet application that uses the Nokia N97 SDK. As a first step (I have never developed J2ME/Java applications before), I completed the following steps on Eclipse 3.6.0 Helios:
Imported the project using the following steps:
File -> Import -> Existing Projects into Workspace
Select root directory as TeleDB1 (the name of the directory that contains the files).
When I click on Finish after these steps, Eclipse automatically builds my new workspace. This build fails, and I get the following errors:
Errors running builder 'Preverification' on project 'TeleDB1'.
org.objectweb.asm.ClassReader.accept(Lorg/objectweb/asm/ClassVisitor;Z)V
This is what I have tried:
1. Right-click on the project, Properties -> Java Build Path shows the following libraries:
a. org-netbeans-modules-mobility-antext.jar - missing
b. J2ME Library (failed to get library information).
These are my questions:
1. How can I get this project to compile?
2. How do I resolve the errors in the Libraries/Build Path?
I think I have set up my environment for running J2ME applications correctly.
I had the same situation as yours, and I found a way to solve it.
Right-click on the project -> Properties -> J2ME -> Device -> Manage Devices, and choose a device you like.
I'm working on a JavaEE project, and I already have the skeleton of an old project. All I have done is rename the project and packages and change the name in the project file (.project). But now when I try to run it, the browser returns the error "The requested resource is not available". I've checked the web.xml and added a new JSP, but in vain; it doesn't work. In the browser's address bar, it gives me the old name of the project (http://localhost:8061/smsgate/). I've tried a lot to fix the problem. What can I do? Is there any other file that I have to update with the new name? Please give me your ideas (knowing that I have to use that old project).
Do Project -> Clean in your workspace, selecting your current project. Also, if you are using Maven, right-click your project and do Maven -> Clean.
Manually go to the target directory of your EAR module, delete everything inside it, and then rebuild your project to produce a new EAR.
If you have configured the web.xml properly, hit your app URL again. Also clear your browser history (Ctrl+Shift+Delete) and then check.
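One more place where the old name tends to survive a rename, assuming this is an Eclipse WTP project: the web context root is stored in .settings/org.eclipse.wst.common.component, and renaming the project does not necessarily update it. A sketch of what to look for (the values shown here are the stale old name you would replace with the new one):

<?xml version="1.0" encoding="UTF-8"?>
<project-modules id="moduleCoreId" project-version="1.5.0">
    <wb-module deploy-name="smsgate">
        <property name="context-root" value="smsgate"/>
    </wb-module>
</project-modules>

After editing it, remove the app from the server, clean the server, and redeploy so the new context root takes effect.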
Why not create a new project using a Maven archetype? Just type in the console:
mvn archetype:generate
It will show you a list of available archetypes from the web; just choose one of them. For JEE6/JEE7 you could choose between:
1019: remote -> org.codehaus.mojo.archetypes:webapp-javaee6 (Archetype for a web application using Java EE 6.)
1020: remote -> org.codehaus.mojo.archetypes:webapp-javaee7 (Archetype for a web application using Java EE 7.)
If you agree with me, just type 1019 or 1020 according to your needs and answer the prompts like:
artifactId: your-project-name
groupId: com.yourdomain.reverse
version: 1.0
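For reference, the same archetype can be generated non-interactively in a single command (the groupId/artifactId values below are the same placeholders as above):

mvn archetype:generate -DarchetypeGroupId=org.codehaus.mojo.archetypes -DarchetypeArtifactId=webapp-javaee6 -DgroupId=com.yourdomain.reverse -DartifactId=your-project-name -Dversion=1.0 -DinteractiveMode=false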
I have installed a Spark/Hadoop environment on my Red Hat 64-bit machine. I also want to read and write code in the Spark source project in IntelliJ IDEA. I downloaded the Spark source code and made everything ready, but I get some errors when compiling the Spark project in IntelliJ IDEA.
Here are the errors:
/home/xuch/IdeaProjects/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/CatalystQl.scala
Error:(809, 34) not found: value SparkSqlParser
case ast if ast.tokenType == SparkSqlParser.TinyintLiteral =>
Error:(812, 34) not found: value SparkSqlParser
case ast if ast.tokenType == SparkSqlParser.SmallintLiteral =>
... ...
But I actually did not find a file named SparkSqlParser.scala anywhere in the project, nor a Scala class named SparkSqlParser.
I did search the web for files named SparkSqlParser.scala, but they don't have members like "TinyintLiteral", "SmallintLiteral", etc.
Here are the file links:
https://github.com/yjshen/zzzzobspk/blob/master/sql/core/src/main/scala/org/apache/spark/sql/SparkSQLParser.scala
https://apache.googlesource.com/spark/+/c152dde78f73d5ce3a483fd60a47e7de1f1916da/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/SparkSQLParser.scala
I met the same problem. Here is my solution:
Download the antlr4 (i.e. ANTLR v4) plugin for IntelliJ. Then the file "spark-2.0.1\sql\catalyst\src\main\antlr4\org\apache\spark\sql\catalyst\parser\SqlBase.g4" can be recognized by IntelliJ IDEA.
Navigate to View -> Tool Windows -> Maven Projects, select the project "Spark Project Catalyst", right-click on it, and then select "Generate Sources and Update Folders".
After that you can see some files added under "spark-2.0.1\sql\catalyst\target\generated-sources\antlr4".
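If you prefer a terminal, the same generation step can be triggered with Maven; a sketch only, assuming you run it from the Spark source root and the catalyst module sits at sql/catalyst:

build/mvn -pl sql/catalyst -am generate-sources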
Then the project should build successfully.
Hope this helps.
None of the advice here worked for me. I noticed, however, that the generated code depends on ANTLR 3.x while ANTLR 4.x is what is in the dependencies (mvn dependency:tree). I don't know why this was the case; maybe because I had earlier built it from the command line(?).
Anyway, try cleaning your Catalyst sub-project, then rebuild the autogenerated sources. To do this in IntelliJ, go to View -> Tool Windows -> Maven Projects.
Then navigate to "Spark Project Catalyst" in the Maven Projects tab.
Navigate to clean -> clean:clean and double-click it. Then navigate to Plugins -> antlr4 -> antlr4:antlr4 and double-click it.
Now you'll see that the autogenerated ANTLR classes are different, and they should compile. YMMV.
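The terminal equivalent of those two double-clicks would be roughly the following (a sketch; the module path is assumed to be sql/catalyst):

build/mvn -pl sql/catalyst clean antlr4:antlr4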
1) First build Spark from the command line using the build instructions given at http://spark.apache.org/docs/latest/building-spark.html#building-with-buildmvn
2) Then check the $SPARK_HOME/sql/catalyst/target/generated-sources/antlr3/org/apache/spark/sql/catalyst/parser folder.
Some of the generated classes, like SparkSqlLexer.java, are there.
The list of classes it generates is:
SparkSqlLexer.java
SparkSqlParser.java
SparkSqlParser_ExpressionParser.java
SparkSqlParser_FromClauseParser.java
SparkSqlParser_IdentifiersParser.java
SparkSqlParser_KeywordParser.java
SparkSqlParser_SelectClauseParser.java
3) Open Module Settings, click on the spark-catalyst module, go to the Sources tab on the right, and mark target/generated-sources as a source folder.
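If you would rather have the POM register that folder instead of clicking through Module Settings, the stock build-helper-maven-plugin can do it. A sketch only; the path matches step 2 above:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>add-generated-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <!-- register the ANTLR output as a compile source root -->
          <source>target/generated-sources/antlr3</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>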
I also faced a similar problem when I updated my fork to the latest master. Unfortunately, I could not find a way to make it work from IDEA. What I did was compile the project from the command line; it generated the ANTLR classes that are required. I then added target/generated-source/antlr as a source directory. After that I could run tests from IDEA. Ideally IDEA's generate-sources step should have generated the code; I need to check more to see why it did not. Maybe it is because I have Maven 3.3.3 configured.
I did as the instructions from Rishitesh Mishra say and got stuck at the first step. I always get errors when executing "build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package".
I have tried both the source code from https://spark.apache.org and a fork on GitHub.
I have attached the log screenshot in a new answer below.
[error log image]
I am using libgdx to make an iOS game with RoboVM. I am using the latest versions of LibGDX and RoboVM, and my Eclipse is up to date.
Recently I have been trying to add Google Analytics via the RoboVM bindings.
I have manually imported the analytics project in Eclipse:
with File -> Import -> Gradle project
Everything works fine; I can import and use the classes in my iOS project.
But then if I right-click my iOS project -> Gradle -> Refresh All, the build is successful but it removes the analytics project from the Java build path. As a result, when I try to compile my iOS project from a terminal with the command line, it doesn't compile since it doesn't find the analytics classes. I am using "./gradlew -Probovm.device.name=myiPhone launchIOSDevice --stacktrace".
I guess there is a setting or property in Gradle or RoboVM that I should change; does anybody have an idea?
Answering myself, as it might help others... I ended up creating my own jar from the analytics project and added it as a library in a separate folder, as described here: https://github.com/BlueRiverInteractive/robovm-ios-bindings/blob/master/README.md
This way every new Gradle build correctly links it to my iOS project.
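For anyone doing the same, a sketch of the dependency entry in the iOS project's build.gradle (the jar name and the libs folder are just my layout, not anything the bindings project prescribes):

dependencies {
    // jar built from the analytics bindings project, copied into libs/
    compile files('libs/google-analytics-bindings.jar')
}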
If I create a test class, I can't run it from Eclipse until I have run it via Maven on the command line first. My project's build path output folder points to project/target/classes, and "Build Automatically" is checked in Eclipse.
Does anyone know why Eclipse doesn't create the classes automatically?
Here's how the layout and build path of your project should look:
Layout
Build path
In addition to what Marcel said, someone who runs into the same problem should look for dependencies between projects within Eclipse.
I was working on project A, which depended on project B. Project A built and ran normally, except for the problem mentioned in this post (running a JUnit test class). After digging for a while, I noticed that project B had a build error (a source folder had gone missing, who knows why) and Eclipse was not building project B, which caused an error in project A.
As soon as I fixed the project B build error, the problem with the JUnit class was solved.
I am currently trying to get my headless PDE build working, but I am stuck at a point where I do not know how to continue.
The problem is how to define the related target platform to compile the plugins against.
I have a build.bat with the following call (all in one line!):
java -jar D:\target\eclipse\plugins\org.eclipse.equinox.launcher_1.0.201.R35x_v20090715.jar
-application org.eclipse.ant.core.antRunner
-f D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
-Dbuilder=c:\pde-build\scripts %*
I tried to create the target Eclipse platform from different parts: the Eclipse SDK, RCP SDK, Delta Pack, and PDE SDK, in all combinations, but none of them worked well.
I got the following error:
BUILD FAILED
D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml:18: Cannot find ${eclipse.pdebuild.scripts}/build.xml imported from D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
where the variable ${eclipse.pdebuild.scripts} does not get resolved. I also tried to pass this parameter via the command line, but then I got another error about a missing svn task, which is absolutely confusing as this works when my local Eclipse installation is referenced.
When I replace the path d:/target/eclipse with my local Eclipse installation, the PDE build works as expected!
This leads me to believe that the configuration of the target Eclipse is not correct, but at the moment I have no idea how to configure it!
My goal is to automate the PDE build, first on my local machine without referencing my local Eclipse installation, and later to integrate this build process into our running CruiseControl instance.
As I have already seen another question about defining the target Eclipse, I would be happy if anyone could contribute hints or facts regarding the problem.
Regards,
Andreas
When performing a headless build, the target can be separate from the eclipse that is actually running the build itself. The problem you had here is that the eclipse that you were using to run the build did not have PDE/Build properly installed.
This is why ${eclipse.pdebuild.scripts} was not set: because PDE/Build was not installed into that Eclipse instance, the org.eclipse.pde.build bundle was not resolved, and the code that sets this property never got called. Similarly, the necessary Ant classpath entries for PDE/Build tasks would not have been set up properly either.
You need the Eclipse with PDE installed inside to run the build, but the target for the build can be separate from this.
In the build.properties file found under -Dbuilder=c:\pde-build\scripts you can set several properties:
baseLocation: a path to an Eclipse install that is your target.
buildDirectory: where the build will actually take place; source is fetched to plugins/ and features/ subfolders, but if there are already binary plugins located here then those become part of the target as well.
pluginPath: a list of paths (separated with ';' on Windows or ':' on Linux) containing other locations that should be considered part of your target. These locations can be several things:
The root of an eclipse-like install with plugins/ and features/ subfolders. This is a good way to provide the delta-pack instead of just unzipping it on top of an eclipse install.
The root of a workspace-like folder, where all subfolders are treated as plugins or features depending on the presence of a manifest or feature.xml.
The root of a bundle or feature, or the jar for a bundle.
If you are doing a p2 build (p2.gathering = true) you can also provide p2 repositories under a ${repoBaseLocation} which will be transformed and placed under ${transformedRepoLocation} and will become part of your target, and the p2 metadata there will get reused during the build.
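Putting those properties together, a minimal sketch of the relevant lines in the build.properties under c:\pde-build\scripts (all paths are illustrative only):

baseLocation=D:/target/eclipse
buildDirectory=D:/build/workdir
pluginPath=D:/deltapack/eclipse;D:/extra-bundles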
After some more time investigating, I found out what I did wrong. As I mentioned above, defining the target platform is not as easy as copying the SDK and plugins into one location (as it was in the early days of Eclipse development).
The working solution is now the following: copy the Eclipse SDK into the target location and run this version. Install into it the necessary PDE tools to enable plugin development. After that, close the IDE and copy the Delta Pack plus the respective svn plugin (I used org.eclipse.pde.build.svn-1.0.1RC2 from SourceForge) into the target platform, and you're done.
Now my automated PDE build is running as expected.
The only minor issue now is the following: the resulting product contains Eclipse-specific menu entries which are not there when I run it from inside my dev Eclipse.
Any hints on that?
I just posted an answer to my own question on this kind of topic; maybe it can help you:
Plugin product VS Feature product