Scalac doesn't find dependent classes - scala

I'm trying to compile a program with the 2 simplest classes:
class BaseClass
placed in BaseClass.scala, and
class Test extends BaseClass
placed in Test.scala. Issuing the command scalac Test.scala fails because BaseClass is not found.
I don't want to compile the classes one by one or by using scalac *.scala.
The same operation works in Java: javac Test.java. Where am I wrong?

Let's see first what Java does:
dcs@dcs-132-CK-NF79:~/tmp$ ls *.java
BaseClass.java Test.java
dcs@dcs-132-CK-NF79:~/tmp$ ls *.class
ls: cannot access *.class: No such file or directory
dcs@dcs-132-CK-NF79:~/tmp$ javac -cp . Test.java
dcs@dcs-132-CK-NF79:~/tmp$ ls *.class
BaseClass.class Test.class
So, as you can see, Java actually compiles BaseClass automatically when you do that. Which raises the question: how can it do that? How does it know which file to compile?
Well, when you write extends BaseClass in Java, the compiler actually knows a few things. It knows the directory where these files are found, from the package name. It also knows BaseClass is either in the current file or in a file called BaseClass.java. If you doubt either of these facts, try moving a file to another directory or renaming it, and see whether Java can still compile your code.
So, why can't Scala do the same? Because it assumes neither thing! Scala's files can be in any directory, irrespective of the package they declare. In fact, a single Scala file can even declare more than one package, which would make the directory rule impossible. Also, a Scala class can be in any file whatsoever, irrespective of its name.
So, while Java dictates what directory a file should be in and what the file is called, and then reaps the benefit by letting you omit filenames from the javac command line, Scala lets you organize your code in whatever way seems best to you, but requires you to tell it where that code is.
Take your pick.
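As a minimal sketch of that flexibility (all names here are hypothetical), a single Scala source file can legally look like the following, which is exactly why scalac cannot guess which file defines BaseClass:

```scala
// One hypothetical source file -- its name is irrelevant to scalac.
// It declares two top-level packages, and neither class name needs
// to match the file's name.
package alpha {
  class Base
}

package beta {
  // refers to a class declared in the same file, under another package
  class Derived extends alpha.Base
}
```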

You need to compile BaseClass.scala first:
$ scalac Test.scala
Test.scala:1: error: not found: type BaseClass
class Test extends BaseClass
^
one error found
$ scalac BaseClass.scala
$ scalac Test.scala
$
EDIT So the question is now: why do you have to compile the files one by one? Well, because the Scala compiler simply doesn't do this kind of dependency handling. Its authors probably expect you to use a build tool like sbt or Maven, so they don't have to bother.

Related

How to import in F#

I have a file called Parser.fs with module Parser at the top of the file. It compiles. I have another module in the same directory, Main, that looks like this:
module Main
open Parser
let _ = //do stuff
I tried to compile Main.fs with $ fsharpc Main.fs (I don't know if there's another way to compile). The first error is module or namespace 'Parser' is not defined; all the other errors stem from the fact that the functions in Parser are not in scope.
I don't know if it matters, but I did try compiling Main after Parser, and it still didn't work. What am I doing wrong?
F#, unlike Haskell, does not have separate compilation. Well, it does at the assembly level, but not at the module level. If you want both modules to be in the same assembly, you need to compile them together:
fsharpc Parser.fs Main.fs
Another difference from Haskell: order of compilation matters. If you reverse the files, it won't compile.
Alternatively, you could compile Parser into its own assembly:
fsharpc Parser.fs -o:Parser.dll
And then reference that assembly when compiling Main:
fsharpc Main.fs -r:Parser.dll
That said, I would recommend using an fsproj project file (analog of cabal file). Less headache, more control.

how does scala interpreter execute source file directly without creating class file

I have this scala code
object S extends App {
  println("This is trait program")
}
When I execute scala S.scala it executes fine.
Now I want to know how it can execute the code without compiling it and creating a class file.
Scala is a compiled language: it needs to compile the code, and the .class file is what gets executed.
Maybe you are thinking in using the REPL, where you can interactively code: https://www.scala-lang.org/documentation/getting-started.html#run-it-interactively
But, under the hood, the REPL is compiling your code, and executing the compiled .class
The command scala that you are launching is used to start the Scala REPL, and if you provide a file as an argument, it will execute the content of the file as if it had been bulk-pasted into a REPL.
It's true that Scala is a compiled language, but that does not mean a .class file is necessary. All the Scala compiler needs to do is generate the relevant JVM bytecode and invoke the JVM with it. It does not have to explicitly create a .class file in the directory from which you called it; it can do everything in memory and temporary storage and simply invoke the JVM with the generated bytecode.
If you are looking to explicitly generate class files with Scala that you can later execute by calling java manually, you should use Scala compiler CLI (command: scalac).
Please note that the Scala compiler has interfaces to check and potentially compile Scala code on the fly, which is very useful for IDEs (check out IntelliJ and Ensime).
Just call main() on the object (which inherits this method from App):
S.main(Array())
main() expects an Array[String], so you can just provide an empty array.
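A self-contained sketch of that answer (the Runner wrapper and the output capture are mine, not from the question): the App trait wires the object's body into an inherited main method, so invoking main runs the body.

```scala
import java.io.ByteArrayOutputStream

object S extends App {
  println("This is trait program")
}

object Runner {
  // Capture what S prints when its inherited main() is invoked.
  def run(): String = {
    val out = new ByteArrayOutputStream()
    // main() expects an Array[String]; an empty array is fine here
    Console.withOut(out) { S.main(Array.empty[String]) }
    out.toString
  }
}
```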
Scala is a compiled language in terms of the source-to-Java-bytecode transition, but a few tricks can make it resemble an interpreted language. A naive implementation of scala myscript.scala would follow these steps:
1. scalac Myscript.scala. This generates S.class (the entry class containing the main method) and potentially other class files.
2. scala -cp . S. This runs the main entry of the class file. -cp . specifies the classpath, and S names the entry class without the .class extension. Note that "run/interpret" here means interpreting (Java) bytecode rather than Scala/Java source code, which is done by the JVM runtime.
3. Remove all the temporarily generated class files. This step is optional as long as users never see the temporary files (i.e., they are transparent to users).
That is to say, scala acts as a driver that may handle 0) initialization, 1) compilation (scalac), 2) execution (scala), and 3) cleanup.
The actual procedure may differ (e.g., for performance reasons some files may exist only in memory or a cache, or never be generated or deleted, using lower-level APIs of the scala driver, etc.), but the general idea is similar.
On Linux machines, you might find some evidence in the /tmp folder. For me:
$ tree /tmp
/tmp
├── hsperfdata_hongxu
│   └── 64143
└── scala-develhongxu
    ├── output-redirects
    │   ├── scala-compile-server-err.log
    │   └── scala-compile-server-out.log
    └── scalac-compile-server-port
        └── 34751

4 directories, 4 files
It is also noteworthy that this way of running Scala is not full-fledged. For example, package declarations are not supported.
// MyScript.scala
package hw

object S extends App {
  println("Hello, world!")
}
And it emits an error:
$ scala Myscript.scala
.../Myscript.scala:1: error: illegal start of definition
package hw
^
one error found
Others have also mentioned the REPL (read–eval–print loop), which is quite similar. Essentially, almost every language can have an (interactive) interpreter. Here is a text from wikipedia:
REPLs can be created to support any language. REPL support for compiled languages is usually achieved by implementing an interpreter on top of a virtual machine which provides an interface to the compiler. Examples of REPLs for compiled languages include CINT (and its successor Cling), Ch, and BeanShell.
However, interpreted languages (Python, Ruby, etc.) typically lend themselves to this better, thanks to their dynamic nature and their runtime VMs/interpreters.
Additionally, the gap between compiled and interpreted languages is not that big, and you can see that Scala does have some interpreted characteristics (at least in appearance), since it makes you feel you can run it like a scripting language.

Why does the scala-ide not allow multiple package definitions at the top of a file?

In Scala it is common practice to stack package statements to allow shorter imports, but when I load a file using stacked packages into the Scala IDE and attempt an import starting with the same organization, I get a compiler error from what appears to be the presentation compiler. The code compiles fine in sbt outside of the IDE.
An example code snippet is as follows:
package com.coltfred
package util
package time
import com.github.nscala_time.time.Imports._
On the import I get the error object github is not a member of package com.coltfred.util.com.
If I move the import to a single line the error goes away, but we've used this practice frequently in our code base, so changing them all to single-line package statements would be a pain.
Why is this happening and is there anything I can do to fix it?
Edit:
I used the eclipse-sbt plugin to generate the eclipse project file for this. The directory structure is what it should be and all of the dependencies are in the classpath.
Edit 2:
It turns out there was a file in the test tree of the util package (which should have been in the same package) that had a duplicate package statement at the top. I hadn't checked the test tree because it shouldn't affect compilation of the main tree, but apparently I was wrong.
I'm not sure why Scala IDE dislikes this, but you can force the import to start at the top level using _root_:
import _root_.com.github.nscala_time.time.Imports._
See if that avoids irritating the IDE.
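Here is a compilable sketch of both the problem and the fix, using made-up package names: inside stacked packages, a simple name like com (or top below) resolves to the nearest enclosing package member, shadowing the top-level package, and _root_ restores absolute resolution.

```scala
package top {
  package lib { class Inner }
}

package project {
  // a nested package whose name collides with the top-level one
  package top { class Decoy }

  package use {
    // Here the simple name `top` resolves to project.top, so
    // `top.lib.Inner` would fail to compile -- just like the
    // com.github import in the question. _root_ forces lookup
    // to start from the top level instead.
    class Ok extends _root_.top.lib.Inner
  }
}
```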
This is a common annoyance, one that annoyed paulp into an attempt to fix it. His idea was that a directory that doesn't contribute class files shouldn't be taken as a package: if you can take util as scala.util, you should prefer that over a foo.util that is empty.
The util dir is the usual suspect, because who doesn't have a util dir lying around, and in particular, ./util?
apm@mara:~/tmp/coltfred$ mkdir -p com/coltfred/util/time
apm@mara:~/tmp/coltfred$ mkdir -p com/coltfred/util/com
apm@mara:~/tmp/coltfred$ vi com/coltfred/util/time/test.scala
apm@mara:~/tmp/coltfred$ scalac com/coltfred/util/time/test.scala
./com/coltfred/util/time/test.scala:5: error: object github is not a member of package com.coltfred.util.com
import com.github.nscala_time.time._
^
one error found
apm@mara:~/tmp/coltfred$ cat com/coltfred/util/time/test.scala
package com.coltfred
package util
package time
import com.github.nscala_time.time._
class Test
apm@mara:~/tmp/coltfred$
To debug, find out where the offending package is getting loaded from.

Buildr: How do I define a project which depends on another project in the same Buildfile?

This seems like a simple task in Buildr, so I must be missing something obvious, because I can't make it work. Suppose I have a directory with two files, like so:
test/lib/src/main/scala/MyLib.scala
test/app/src/main/scala/MyApp.scala
MyLib.scala is:
class MyLib {
  def hello() { println("Hello!") }
}
And MyApp.scala is:
object MyApp extends App {
  val ml = new MyLib
  ml.hello()
}
Building these with scalac is straightforward:
$ cd test
$ scalac lib/src/main/scala/MyLib.scala -d target/main/classes
$ scalac app/src/main/scala/MyApp.scala -cp target/main/classes -d target/main/classes
$ cd target/main/classes/
$ scala MyApp
Hello!
However, my naïve attempt to turn this into a Buildfile (in the test folder):
require 'buildr/scala'
lib_layout = Layout.new
lib_layout[:source, :main, :scala] = 'lib/src/main/scala'
app_layout = Layout.new
app_layout[:source, :main, :scala] = 'app/src/main/scala'
define 'mylib', :layout => lib_layout do
end

define 'myapp', :layout => app_layout do
  compile.with project('mylib')
end
fails with:
(in /test, development)
Building mylib
Compiling myapp into /test/target/main/classes
/test/app/src/main/scala/MyApp.scala:2: error: not found: type MyLib
val ml = new MyLib
^
one error found
Buildr aborted!
RuntimeError : Failed to compile, see errors above
and if I run buildr --trace it's pretty clear that the reason scalac is failing is because the classpath does not include target/main/classes.
How do I make this happen? I know that separating the two projects may seem contrived, but I have something much more sophisticated in mind, and this example boiled the problem down to its essential components.
The idiomatic way to describe your project with Buildr would be to use sub-projects.
Note: the buildfile below goes into the test/ directory.
require 'buildr/scala'
define "my-project" do
  define "lib" do
  end

  define "app" do
    compile.with project("lib").compile.target
  end
end
The two sub-projects lib and app are automatically mapped to the lib/ and app/ sub-directories, and Buildr will automatically look for sources under src/main/scala in each.
Buildr's common convention for project layout looks for compiled classes in target/classes and target/test/classes. I see your build is placing classes into target/main/classes, so you'll want to change your scala params to the expected location, or change the layout to include:
lib_layout[:target, :main, :scala] = 'target/main/classes'
If you also need to change the layout for test classes (I'm guessing you don't), use:
lib_layout[:target, :test, :scala] = ...

FSC recompiles every time

FSC recompiles my .scala files every time, even when there is no need: I can compile twice without editing anything between the attempts, and it still recompiles them!
For example, I have 2 files
Hello.scala
class Hello {
  print("hello")
}
And Tokens.scala:
abstract class Token(val str: String, val start: Int, val end: Int) {
  override def toString = getClass.getSimpleName + "(" + "[" + start + "-" + end + "]" + str + ")"
}

class InputToken(str: String, start: Int, end: Int)
  extends Token(str, start, end)

class ParsedToken(str: String, start: Int, end: Int, val invisible: Boolean)
  extends Token(str, start, end)
When I ask ant to compile the project from scratch I see the following output:
ant compile
init:
[mkdir] Created dir: D:\projects\Test\build\classes
[mkdir] Created dir: D:\projects\Test\build\test\classes
compile:
[fsc] Base directory is `D:\projects\Test`
[fsc] Compiling source files: somepackage\Hello.scala, somepackage\Tokens.scala to D:\projects\Test\build\classes
BUILD SUCCESSFUL
Then I don't edit anything and run ant compile again:
ant compile
init:
[mkdir] Created dir: D:\projects\Test\build\classes
[mkdir] Created dir: D:\projects\Test\build\test\classes
compile:
[fsc] Base directory is `D:\projects\Test`
[fsc] Compiling source files: somepackage\Tokens.scala to D:\projects\Test\build\classes
BUILD SUCCESSFUL
As you can see, fsc acts smart in the case of Hello.scala (no recompilation) and dumb in the case of Tokens.scala. I suspect the problem is somehow related to inheritance, but that is all I know.
So what is wrong?
Tokens.scala is recompiled because there isn't a class file matching its basename; that is, compiling it doesn't produce a Tokens.class file. When deciding whether a source file should be compiled, fsc looks for a class file with the same basename; if that class file does not exist, or the modification time of the source file is later than that of the class file, the source file is rebuilt. If you can, I suggest you look into Simple Build Tool (sbt): its continuous compile mode accurately tracks the source-to-classfile mapping and won't recompile Tokens.scala.
For extra laughs, think about what the compiler might do if you had a different source file that also defined a class Tokens.
Although scala allows arbitrary public classes/objects in any source file, there's still quite a bit of tooling that assumes you will somewhat follow the java convention and at least have one class/object in the file with the same name as the source file basename.
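The staleness check described above can be sketched like this (my own reconstruction of the heuristic, not fsc's actual code):

```scala
import java.io.File

object StalenessCheck {
  // Recompile unless a class file with the same basename exists in the
  // output directory and is at least as new as the source file.
  def needsRecompile(source: File, outputDir: File): Boolean = {
    val base = source.getName.stripSuffix(".scala")
    val classFile = new File(outputDir, base + ".class")
    !classFile.exists() || classFile.lastModified() < source.lastModified()
  }
}
```

Under this rule Tokens.scala always looks stale, because compiling it yields InputToken.class and ParsedToken.class but never a Tokens.class.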
I don't much like posting stuff written by others, but I think this question merits a more complete answer than what was strictly asked.
So, first of all, fsc recompiles everything by default, period. It is ant, not fsc, that is leaving Hello.scala out, because the file name matches the class name. It is not leaving Tokens.scala out because no class called Tokens was compiled; so, in the absence of a Tokens.class, it recompiles Tokens.scala.
That is the wrong thing to do with Scala. Scala differs from Java in one fundamental aspect: because of technical limitations of the JVM, a change in a trait requires recompilation of every class, object, or instantiation that uses it.
Now, one can make the ant task do something smarter starting with Scala 2.8. I'm taking this information from blogtrader.net, by Caoyuan of Scala-plugin-for-NetBeans fame. You define the scalac task on the build target like below:
<scalac srcdir="${src.dir}"
        destdir="${build.classes.dir}"
        classpathref="build.classpath"
        force="yes"
        addparams="-make:transitive -dependencyfile ${build.dir}/.scala_dependencies">
  <src path="${basedir}/src1"/>
  <!--include name="compile/**/*.scala"/-->
  <!--exclude name="forget/**/*.scala"/-->
</scalac>
This tells ant to recompile everything, as ant simply isn't smart enough to figure out what needs recompiling. It also tells Scala to build a file containing compilation dependencies and to use a transitive-dependency algorithm to figure out what needs to be recompiled.
You also need to change the init target to include the build directory in the build classpath, as Scala will need it to recompile other classes. It should look like this:
<path id="build.classpath">
  <pathelement location="${scala-library.jar}"/>
  <pathelement location="${scala-compiler.jar}"/>
  <pathelement location="${build.classes.dir}"/>
</path>
For more details, please refer to Caoyuan's blog.