I am using JDT to get ASTs and type resolvers for Java sources in Eclipse, is there a way to achieve the same for Scala sources in Scala projects?
No, it is not possible to parse Scala with the Java Development Tools. JDT's parser and type resolver understand only Java source; Scala has different syntax, different semantics, and a different type system. The closest Scala equivalent is the Scala compiler's presentation compiler, which Scala IDE for Eclipse uses to provide typed ASTs and type information.
Does sbt (Scala build tool) have the concept of archetypes, similar to the way that you can define an archetype in a maven project? Basically a blueprint for a new project that might have specific dependencies already defined for you, or perhaps some prebuilt classes and traits?
Yes, in sbt you can achieve this with plugins. One option is sbt-native-packager; as its documentation puts it:
SBT native packager lets you build application packages in native formats and offers different archetypes for common configurations, such as simple Java apps or server applications.
Hope this helps!
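As a minimal sketch (the plugin version below is illustrative, not necessarily current), you add the plugin and then enable one of its archetypes in your build:

```scala
// project/plugins.sbt -- version number is an assumption; check the latest release
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.3.25")

// build.sbt -- JavaAppPackaging is one of the archetypes the plugin ships;
// it sets up launcher scripts and packaging for a simple app
enablePlugins(JavaAppPackaging)
```

With that in place, `sbt universal:packageBin` produces a distributable package following the archetype's layout.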
I understand that cross-building across different Scala versions is easy with SBT: you put the files that fail to compile into scala-2.10 and scala-2.11 directories instead of scala. But if I want to cross-build for different versions of Scala and for different versions of a dependency (say, Spark 1.6 and 2.1), how can that be done?
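One common approach is to treat the dependency version as a second build axis alongside `crossScalaVersions`, selecting it from a system property and adding a matching source directory that mirrors sbt's scala-2.x convention. A sketch (the `spark.version` property name and the `scala-spark-*` directory scheme are my own choices here, not anything built into sbt):

```scala
// build.sbt -- sketch only
crossScalaVersions := Seq("2.10.6", "2.11.8")

val sparkVersion = sys.props.getOrElse("spark.version", "2.1.0")
val sparkBinary  = sparkVersion.split('.').take(2).mkString(".") // e.g. "2.1"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"

// Compile extra sources from src/main/scala-spark-1.6 or src/main/scala-spark-2.1,
// analogous to the built-in src/main/scala-2.10 / scala-2.11 directories
unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / s"scala-spark-$sparkBinary"
```

You would then run, e.g., `sbt -Dspark.version=1.6.3 +package` to cross-build every Scala version against a given Spark version.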
I downloaded confluent-2.0.0-2.10.5.tar.gz because I want the Scala 2.10 package, but the Kafka jar in /share/java/schema-registry is still kafka_2.11-0.9.0.0-cp1.jar.
Is there any way I can get a clean Scala 2.10 Confluent package?
The 2.10 refers to the Scala version of the Kafka subpackage; other subpackages may be built against a different Scala version.
The tar.gz packages use the 2.11 builds wherever another subpackage needs access to the core Kafka jar, which has a Scala dependency. (More precisely, they depend on whichever Scala version is supported by Kafka and considered most stable and well supported upstream.) This is necessary because Scala libraries are not necessarily binary compatible between Scala versions. Without it, we would need a separate copy of every service that uses the Kafka libraries, e.g. a schema-registry-2.10 and a schema-registry-2.11, which is especially painful on platforms like Debian and RPM-based distros. Instead, we effectively vendor the entire Kafka library for the services that depend on it.
Note that the files under /share/java/kafka only use Scala 2.10 and if you need to pull in the clients, you can safely add that to your classpath. The use of 2.10 or 2.11 for any of the other services shouldn't matter as they are simply that: services that you execute. Any libraries that you might need to put on your classpath (e.g. serializers) only depend on the pure Java libraries in Kafka and are therefore safe to use with Kafka libraries compiled with any Scala version.
I'm new to Scala, but I do know Java. Can Scala projects be compiled into web apps and deployed to web servers, the way Java projects can?
Thanks!
Scala compiles to JVM bytecode, so yes: a Scala project can be packaged as a WAR and deployed to a servlet container just like a Java one.
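For illustration, a servlet written in Scala is just an ordinary servlet class; once compiled, the container cannot tell it apart from Java. A minimal sketch, assuming the javax.servlet-api dependency is on the compile classpath:

```scala
import javax.servlet.http.{HttpServlet, HttpServletRequest, HttpServletResponse}

// A plain servlet implemented in Scala; compiles to regular JVM bytecode
class HelloServlet extends HttpServlet {
  override def doGet(req: HttpServletRequest, resp: HttpServletResponse): Unit = {
    resp.setContentType("text/plain")
    resp.getWriter.println("Hello from Scala")
  }
}
```

Packaged into a WAR (for example with the xsbt-web-plugin for sbt), this deploys to Tomcat or Jetty like any Java web app.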
I am new to web development, but quite familiar with both Java and Python. In my beginning experiments with web development using Java, I discovered Apache Wicket; around the same time I also discovered Jython (v 2.5). I am wondering if there's a way to integrate Wicket and Jython so I can write Jython scripts instead of Java classes to use Wicket. So far I haven't been able to do that.
Ideas?
I don't see why not.
Many people use Wicket successfully with the Scala language. The same should work with Jython, JRuby, Clojure, ...
See https://github.com/wicketstuff/core/tree/master/jdk-1.5-parent/scala-extensions-parent for example with Scala.
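To illustrate with Scala: a Wicket page is just a subclass of Wicket's Java classes, so any JVM language that can subclass Java classes can define one. In this sketch the component id "message" and the matching HomePage.html markup are assumptions:

```scala
import org.apache.wicket.markup.html.WebPage
import org.apache.wicket.markup.html.basic.Label

// Wicket resolves HomePage.html alongside this class; that markup
// must contain an element with wicket:id="message"
class HomePage extends WebPage {
  add(new Label("message", "Hello from Scala"))
}
```

In principle, a Jython class extending WebPage the same way should also work, since Jython classes can extend Java classes.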