How to import a user-defined package into the Scala REPL automatically when the REPL starts?

When the Scala REPL starts, some default imports such as java.lang._, scala._, and scala.Predef._ are automatically available. Suppose I have my own package, com.raghhuraamm.rUtils._.
How do I import this package automatically when the REPL starts? Is there a way, or do I just have to type "import com.raghhuraamm.rUtils._" every time I start the Scala REPL?

If you can use sbt console to launch the REPL, you can create a build.sbt containing this line:
initialCommands in Compile in console += "import com.raghhuraamm.rUtils._"
Source: https://www.scala-sbt.org/1.x/docs/Inspecting-Settings.html
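For example, a minimal build.sbt might look like this; the Scala version is an assumption, and the package itself is assumed to be compiled as part of the project (or declared as a dependency):
// build.sbt -- minimal sketch; adjust the Scala version to your setup
scalaVersion := "2.12.8"
// run this import every time `sbt console` starts the REPL
initialCommands in Compile in console := "import com.raghhuraamm.rUtils._"
With that in place, sbt console drops you at a scala> prompt where the members of com.raghhuraamm.rUtils are already in scope.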

Create a script (myPreload.scala, say) that contains all the imports that you want:
// in myPreload.scala
import com.raghhuraamm.rUtils._
Assuming that the classes are packaged in my.jar, start the Scala REPL as follows:
scala -cp path/to/my.jar -i some/other/path/to/myPreload.scala

Related

How do I make packages available to the Scala REPL?

I'm trying to get familiar with Scala. I am using macOS.
I've installed Scala using brew install scala, which is hassle-free, and once it completes I can launch the Scala REPL simply by issuing scala, which puts me at the scala> prompt.
I now want to import some packages, so I try:
import org.apache.spark.sql.Column
and unsurprisingly it fails with
error: object apache is not a member of package org
This makes sense; how would it know where to get that package from? The thing is, I don't know what I need to do to make that package available. Is there anything I can do from the command line that would allow me to import org.apache.spark.sql.Column?
I have googled around a little but not found anything that explains it in a jargon-free way. I'm a complete Scala noob, so jargon-free responses would be appreciated.
Here are two ways to start a REPL with dependencies that I'm aware of:
Use SBT to manage dependencies, use console to start a REPL with those dependencies
Use Ammonite REPL
You could create a separate directory with a build.sbt where you set
scalaVersion := "2.11.12"
and then copy the
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
snippets from Maven Central. Then you can run the REPL with sbt console. Note that this will create project and target subdirectories, so it "leaves traces"; you can't use it like the standalone Scala REPL. You could also omit the build.sbt and add the library dependencies by typing them into the sbt shell itself.
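For that last variant, a rough sketch of the sbt shell session might look like this, reusing the same versions as the snippets above (prompt abbreviated to >):
> set scalaVersion := "2.11.12"
> set libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
> console
After console starts, import org.apache.spark.sql.Column should resolve.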
Alternatively, you can just use the Ammonite REPL, which was created exactly for that purpose.
You can use the classpath to make the lib available, i.e. download the jar locally and use the command as follows (here I'm using the Apache Commons IO lib to move files from the Scala prompt):
C0:Desktop pvangala$ scala -cp /Users/pvangala/Downloads/commons-io-2.6/commons-io-2.6.jar
Welcome to Scala 2.12.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_161).
Type in expressions for evaluation. Or try :help.
scala> import java.io.File
import java.io.File
scala> val src = new File("/Users/pvangala/Downloads/commons-io-2.6-bin.tar")
src: java.io.File = /Users/pvangala/Downloads/commons-io-2.6-bin.tar
scala> val dst = new File("/Users/pvangala/Desktop")
dst: java.io.File = /Users/pvangala/Desktop
scala> org.apache.commons.io.FileUtils.moveFileToDirectory(src, dst, true)
If you want to use Spark stuff, I'd recommend you use the spark-shell that comes with the Spark installation. I don't know macOS, so I can't help you much with installing Spark there.
For normal Scala I recommend Ammonite (http://ammonite.io/#Ammonite-REPL), which has built-in syntax for pulling in packages/dependencies.
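For reference, a minimal Ammonite session that pulls a dependency straight from Maven Central looks roughly like this (the library and version are only examples):
@ import $ivy.`org.apache.spark::spark-sql:2.3.0`
@ import org.apache.spark.sql.Column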
If you want to use Spark, you should use the spark-shell instead of the Scala REPL. It has almost the same behaviour but includes all the Spark dependencies by default.
You should download the Spark binaries from here.
Then, if you are using Linux, you should create the environment variable SPARK_HOME pointing to the downloaded folder and include its bin folder in PATH.
Then you can start it in any console with the command spark-shell.
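A rough sketch of that setup on Linux/macOS, assuming the binaries were unpacked into ~/spark (the path is an assumption):
export SPARK_HOME=~/spark            # folder where the Spark binaries were unpacked
export PATH=$PATH:$SPARK_HOME/bin
spark-shell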
On Windows I'm not sure, but I think you should have a spark-shell.cmd file or something similar which you can use to start the spark-shell.
I did the following in Windows:
for /f "tokens=*" %%a in ('java -jar coursier fetch -p "com.lihaoyi::requests:0.2.0" "com.lihaoyi::upickle:0.7.5"') do set SCP=%%a
scala -nc -classpath %SCP% %1 %2 %3
Instead of the two libraries listed here you can use any number of other libraries you need. They must be available in Maven Central, though. The %1 could be a Scala script (".sc" extension), but you could leave it empty and the REPL will start with the libraries on the classpath.
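A rough Linux/macOS equivalent of the same trick, assuming the coursier launcher is installed as cs on the PATH, would be:
scala -nc -classpath "$(cs fetch -p com.lihaoyi::requests:0.2.0 com.lihaoyi::upickle:0.7.5)"   # cs = coursier launcher (assumed on PATH)
As above, you can append a .sc script after the classpath, or leave it off to just get the REPL with those libraries available.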

Why can't scala find what sbt can?

With sbt everything is fine:
» sbt console
[info] Loading project definition from /repos/myrepo/project
[info] Set current project to bpavscan (in build file:/repos/myrepo/)
[info] Starting scala interpreter...
[info]
Welcome to Scala 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_131).
Type in expressions for evaluation. Or try :help.
scala> import play.api.libs.json._
import play.api.libs.json._
scala>
But if I do it with the scala tool:
» scala
Welcome to Scala version 2.11.6 (OpenJDK 64-Bit Server VM, Java 1.8.0_131).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import play.api.libs.json._
<console>:7: error: not found: value play
import play.api.libs.json._
^
scala>
I need to run a simple script, which I usually do with:
scala myscript.scala
But since my script now has a Play dependency, I cannot run it with scala anymore, because scala does not find Play.
I need to either:
Be able to load the Play framework with the plain scala tool, or
Be able to run a simple script with sbt (sbt run runs my project, which I do not want; I want to run a simple script to try out some simple things).
sbt console will load the same console/REPL as scala, but with the additional dependencies defined in build.sbt loaded. Before the console starts, all the dependent libraries are added to the classpath, and this is the reason you can import the Play libraries when using sbt console.
On the other hand, scala starts the console with only the libraries that ship with the system's Scala installation. Additional jars therefore need to be added there (or put on the classpath) before they can be imported. In the above case, if the Play library jar had been included in the Scala installation's lib directory, then import play.api.libs.json._ would have worked in the scala console too.
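For the Play JSON case above, the build.sbt for the sbt console route only needs the one dependency; the version here is an assumption, so pick one published for your Scala version:
// build.sbt -- minimal sketch
scalaVersion := "2.11.8"
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.6.9"   // version is an assumption
With that in place, sbt console starts a REPL where import play.api.libs.json._ resolves.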

Scala cannot load a file in the interpreter

I am doing the Scala course from Coursera; currently, I am at the week 2 exercises. I want to load the code into the interpreter so I can check the methods I implemented like this:
:load FunSets.scala
However, I get the following error:
<console>:10: error: not found: value common
import common._
This appears because the source file imports another package like this:
package funsets
import common._
How can I make the interpreter see the other package as well?
Is there a way of importing the entire project?
Assuming your project uses sbt, you should be able to do the following.
From the root of your project, type sbt and press enter. Your project will be loaded in sbt.
Use the console task to load the REPL with all compiled classes and libraries. Use the consoleProject task to load the REPL with access to the project definition and sbt.
The sbt documentation has more details.
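For the course project above, such a session would look roughly like this (package names taken from the question):
$ sbt
> console
scala> import funsets._
import funsets._
scala> import common._
import common._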

How to fire up Scala interpreter with ScalaCheck in the classpath in Ubuntu 11.10?

Scala is installed and working fine.
scalacheck.jar is placed in /bin.
I used the following command
$ scala -cp scalacheck.jar
After that, when I tried the command below,
scala> import org.scalacheck.Prop.forAll
I got the following error.
<console>:7: error: object scalacheck is not a member of package org
import org.scalacheck.Properties
^
I might have made some mistake in using ScalaCheck; please correct me and give the proper commands so that I can work with ScalaCheck in Ubuntu in interpreter mode.
Putting an executable on the PATH isn't the same as putting a jar on the classpath, so your jar being in /bin didn't change anything.
Just use:
scala -cp path_to_your.jar
and you should be fine.
If, for example, your scalacheck.jar is in /bin, then use:
scala -cp /bin/scalacheck.jar
edit:
Putting jars in /bin probably isn't the best idea.
You can use it like this:
kjozsa@walrus:~$ scala -version
Scala code runner version 2.9.2 -- Copyright 2002-2011, LAMP/EPFL
kjozsa@walrus:~$ locate scalacheck.jar
/usr/share/scala/lib/scalacheck.jar
kjozsa@walrus:~$ scala -cp /usr/share/scala/lib/scalacheck.jar
Welcome to Scala version 2.9.2 (OpenJDK 64-Bit Server VM, Java 1.7.0_03-icedtea).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import org.scalacheck.Prop.forAll
import org.scalacheck.Prop.forAll
scala>
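From there you can do a quick sanity check that ScalaCheck actually works, for example (output abbreviated):
scala> val propConcat = forAll { (a: String, b: String) => (a + b).length >= a.length }
scala> propConcat.check
+ OK, passed 100 tests.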

"is not a member of package" error when importing package in Scala with SBT

(Relative beginner here, please be gentle...)
I've got a Scala program that I can build with sbt. I can (from within sbt) run compile and test-compile with no errors. I've defined a package by putting package com.mycompany.mypackagename at the top of several .scala files. When I do console to get a Scala REPL, this happens:
scala> import com.mycompany.mypackagename._
<console>:5: error: value mypackagename is not a member of package com.mycompany
import com.mycompany.mypackagename._
Any variation of this also fails. When I just do import com.mycompany I get no problems.
I thought that running the Scala console from within sbt would properly set the classpath based on the current project? What (completely obvious) thing am I missing?
I ran into this same problem, and then I realized I was running Scala 2.10.0 on the command line, while IDEA was using Scala 2.9.2. So the fix was to change both to use the same version, and then run:
sbt clean
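To confirm the two versions actually line up, you can compare what the standalone tool and sbt report:
$ scala -version          # version of the command-line scala tool
$ sbt scalaVersion        # version sbt uses for compile and console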
What happens if you import the actual class name instead of the wildcard?
import com.mycompany.mypackagename.ActualClassName