I have a set of Scala projects, and for all of them I would like to introduce consistent Scala source code formatting, for which I'm using the scalafmt sbt plugin. I have put together the config file, and this config file lives in a separate repo. I would now like to reuse it in all of the other Scala projects. I see two possibilities:
Use the repo where the conf file is located as a git submodule in all the other 10 projects where I want to run the scala formatter
Do nothing in the builds; just add README documentation saying that every user working on the codebase should download the scalafmt conf file into the project (I will pre-add a .gitignore entry to all projects to ignore the local conf file)
Is there any other approach? I definitely do not want the conf file to diverge across projects if I just leave a copy in each of them.
As per the documentation, one option is to build (and publish within your org) an sbt plugin with your configuration:
https://scalameta.org/scalafmt/docs/installation.html#share-configuration-between-builds
To share configuration across different sbt builds, create a custom sbt plugin that generates .scalafmt-common.conf on build reload, then include the generated file from .scalafmt.conf
// project/MyScalafmtPlugin.scala
import sbt._

object MyScalafmtPlugin extends AutoPlugin {
  override def trigger = allRequirements
  override def requires = plugins.JvmPlugin

  override def buildSettings: Seq[Def.Setting[_]] = {
    SettingKey[Unit]("scalafmtGenerateConfig") :=
      IO.write(
        // writes the file once whenever the build is (re)loaded
        file(".scalafmt-common.conf"),
        "maxColumn = 100".getBytes("UTF-8")
      )
  }
}
// .scalafmt.conf
include ".scalafmt-common.conf"
I'm trying to run certain tasks and start up servers after running sbt. I want to be able to run commands in the terminal to do this. How can I define them? Are plugins the right way to do this?
I see some code like this:
object DoThing extends AutoPlugin {
  object autoImport {
    val vpnCheck = taskKey[Boolean]("Check for a VPN connection.")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    vpnCheck := {
      doVpnCheck()
    }
  )
}
What is the projectSettings method doing? Are plugins the way?
From the plugins page:
A plugin is a way to use external code in a build definition. A plugin can be a library used to implement a task (you might use Knockoff to write a markdown processing task). A plugin can define a sequence of sbt settings that are automatically added to all projects or that are explicitly declared for selected projects. For example, a plugin might add a proguard task and associated (overridable) settings. Finally, a plugin can define new commands (via the commands setting).
But I can't seem to figure this out.
For your scenario, maybe you can just create a task in your sbt file to do this, like:
val hello = taskKey[Unit]("hello world")

hello := {
  println("hello")
}
and if you want to run it automatically at startup, you can create an .sbtrc file in the project directory, with contents like:
alias boot = ;reload ;hello ;iflast shell
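Regarding the projectSettings part of the question: an AutoPlugin's projectSettings is the Seq of settings that sbt appends to every project the plugin is enabled on, which is exactly how vpnCheck becomes available as a task in each project. A minimal self-contained sketch (doVpnCheck and the probed host are made up for illustration):

// project/DoThing.scala -- a sketch; intranet.example.com is a hypothetical host
import sbt._

object DoThing extends AutoPlugin {
  // enable on every project without requiring an explicit enablePlugins
  override def trigger = allRequirements

  object autoImport {
    val vpnCheck = taskKey[Boolean]("Check for a VPN connection.")
  }
  import autoImport._

  // Hypothetical check: probe a host that is only reachable over the VPN
  private def doVpnCheck(): Boolean =
    scala.util.Try(
      java.net.InetAddress.getByName("intranet.example.com").isReachable(1000)
    ).getOrElse(false)

  // Appended to every project, so `sbt vpnCheck` works anywhere in the build
  override lazy val projectSettings = Seq(
    vpnCheck := doVpnCheck()
  )
}

The task can then be chained into the .sbtrc alias above, e.g. alias boot = ;reload ;vpnCheck ;iflast shell.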
(Long question ahead. Simplified tl;dr at the bottom).
I have two ScalaJS projects built with SBT - "myapp" and "mylib" - in the following directory structure:
root/build.sbt
root/myapp/build.sbt
root/myapp/jvm/
root/myapp/js/
root/myapp/shared/
root/mylib/build.sbt
root/mylib/jvm
root/mylib/js
root/mylib/shared
mylib exports an artifact named "com.example:mylib:0.1", which is used as a libraryDependency for myapp.
myapp and mylib are in separate repositories, contain their own build files, and should be able to be built completely separately (i.e. they must contain their own individual build config).
In production, they will be built separately with mylib being first published as a maven artifact before building myapp separately.
In development however, I want to be able to merge these into a parent SBT project so that both can be developed in parallel without needing to use publishLocal after each change.
In a traditional (not scalajs) project this would be quite easy
$ROOT/build.sbt:
lazy val mylib = project
lazy val myapp = project.dependsOn(mylib)
However in ScalaJS, we actually have two projects inside each module - appJVM, appJS, libJVM and libJS. As such, the above configuration only finds the aggregate root project and does not correctly apply the dependsOn configuration to the actual JVM and JS projects.
(i.e. myapp and mylib build.sbt each contains two projects, and an aggregate root project)
Ideally I'd like to be able to do something like the following
lazy val mylibJVM = project
lazy val myappJVM = project.dependsOn(mylibJVM)
lazy val mylibJS = project
lazy val myappJS = project.dependsOn(mylibJS)
Unfortunately this just creates new projects within the root instead of importing the subprojects themselves.
I've also tried various combinations of paths (such as)
lazy val mylibJVM = project.in(file("mylib/jvm"))
But this doesn't pick up the configuration in the build.sbt file in mylib.
Ultimately I keep running up against the same problem: when importing an existing multi-project SBT build into a parent sbt file, it imports the root project, but does not seem to provide a way to import a subproject from an existing multi-module SBT build in a way that lets me add dependsOn configuration to it.
tl;dr
If I have
root/mylib/build.sbt with multiple projects defined and
root/myapp/build.sbt with multiple projects defined
Is it possible to import individual subprojects into root/build.sbt instead of the root project from the submodule?
i.e. Can I have two layers of multi-project builds?
After spending a lot of time digging through SBT source code, I managed to figure out a solution. This isn't clean, but it works. (For bonus points, it imports correctly into IntelliJ).
// Add this function to your root build.sbt file.
// It can be used to define a dependency between any
// `ProjectRef` without needing a full project definition.
def addDep(from: String, to: String) = {
  buildDependencies in Global <<= (
    buildDependencies in Global,
    thisProjectRef in from,
    thisProjectRef in to
  ) { (deps, fromref, toref) =>
    deps.addClasspath(fromref, ResolvedClasspathDependency(toref, None))
  }
}
// `project` will import the `build.sbt` file
// in the subdirectory of the same name as the `lazy val`
// (performed by an SBT macro). i.e. `./mylib/build.sbt`
//
// This won't reference the actual subprojects directly,
// but will import them into the namespace such that they
// can be referenced as "ProjectRef"s (strings are implicitly
// converted to project references).
//
// We then aggregate the JVM and JS ScalaJS projects
// into the new root project we've defined. (Which unfortunately
// won't inherit anything from the child build.sbt)
lazy val mylib = project.aggregate("mylibJVM", "mylibJS")
lazy val myapp = project.aggregate("myappJVM", "myappJS")

// Define a root project to aggregate everything
lazy val root = project.in(file(".")).aggregate(mylib, myapp)
// We now call our custom function to define a ClassPath dependency
// between `myapp` -> `mylib` for both JVM and JS subprojects.
// In particular, this will correctly find exported artifacts
// so that `myapp` can refer to `mylib` in libraryDependencies
// without needing to use `publishLocal`.
addDep("myappJVM", "mylibJVM")
addDep("myappJS","mylibJS")
Is there an equivalent to Leiningen's "checkouts" feature in sbt?
Here is what I want to accomplish:
I have two projects, an application "Foo" and a library "Bar". I want to publish each of these projects independently. Foo depends on Bar, and the sbt build directs sbt to download the jar for Bar from a repository whenever a third party builds Foo (which is typical behavior).
Now, say I want to hack on both Foo and Bar at the same time. For example, while working on Foo, I want to directly edit and debug some of the source for Bar so the edits affect Foo (and then later rebuild Bar when it is convenient).
How can I instruct sbt to satisfy its dependency on Bar from its source code on my machine (rather than my local repository) during this hack session?
(P.S. I asked a similar question for Clojure/Leiningen. Leiningen has the "checkouts" feature which accomplishes this. I am wondering if there is something similar in sbt...)
You can declare a source dependency from Foo to Bar via a project reference:
import sbt._
object FooBuild extends Build {
  lazy val root = Project(
    id = "foo",
    base = file(".")
  ) dependsOn (theBarBuild)

  lazy val theBarBuild = ProjectRef(
    base = file("/path/to/bar"),
    id = "bar"
  )
}
This should also recompile Bar (if it has changed) whenever you compile Foo. Please note that the id of the project reference must match the actual id of the Bar project, which might be something like e.g. default-edd2f8 if you use a simple build definition (.sbt files only).
This technique is especially useful for plug-ins (see my blog post about this topic).
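On newer sbt versions, where the Build trait has been removed, the same source dependency can be declared directly in Foo's build.sbt; a sketch (the path and the "bar" id are placeholders that must match your checkout and Bar's actual project id):

// Foo's build.sbt
lazy val theBarBuild = ProjectRef(file("/path/to/bar"), "bar")

lazy val root = (project in file("."))
  .dependsOn(theBarBuild)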
Edit:
You can kind of re-code the checkout behaviour like this:
import sbt._
object FooBuild extends Build {
  lazy val root = addCheckouts(Project(id = "foo", base = file(".")))

  def addCheckouts(proj: Project): Project = {
    val checkouts = proj.base.getCanonicalFile / "checkouts"
    if (!checkouts.exists) proj
    else proj.dependsOn(IO.listFiles(DirectoryFilter)(checkouts).map { dir =>
      ProjectRef(base = dir, id = dir.name): ClasspathDep[ProjectReference]
    }: _*)
  }
}
This checks your project directory for a checkouts directory, and if it exists, adds the directories therein (which should be symlinks to other projects) as project references to the project. It expects the symlink to be named like the actual ID of the linked project (e.g. default-edd2f8 or bar). If the directory doesn't exist, the build just works as before.
When you add or remove a symlink in the checkouts directory (or the directory itself), you must reload the Foo project to pick up the changes.
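For example, with Bar checked out elsewhere and its project id being bar, the expected layout would be:

foo/build.sbt (or project/Build.scala as above)
foo/checkouts/bar -> /path/to/bar (a symlink named after Bar's project id)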
Hope this helps.
I'm sure this is simple, but I haven't figured it out yet...
I've installed sbt-launch.jar and a shell script to execute it (named sbt).
How do I put multiple projects in the same directory?
When I run sbt the directories project and target get created and populated, and the current project is default-XXXXX. The compile command picks up source files in the top-level directory and jar files in the top-level 'lib' directory.
How do I add another project under the same directory? Every time I run sbt in an empty directory it creates a 20+ MB project directory.
Note 1: when I run sbt I am not asked "Create new project?" or any other questions.
Note 2: I am using sbt-launch.jar from this url: http://typesafe.artifactoryonline.com/typesafe/ivy-releases/org.scala-tools.sbt/sbt-launch/0.10.1/sbt-launch.jar
and I'm following the instructions at: http://code.google.com/p/simple-build-tool/wiki/Setup
Found the answer (for sbt 0.10.1):
Create the file project/Build.scala that looks like this:
import sbt._

object MyBuild extends Build {
  lazy val root = Project("root", file("."))
  lazy val sub1 = Project("proj1", file("dir1"))
  lazy val sub2 = Project("proj2", file("dir2"))
}
This creates three projects: 'root' (in the top-level directory), 'proj1' (in the sub-directory 'dir1'), and 'proj2' (in the sub-directory 'dir2').
For more info, see https://github.com/harrah/xsbt/wiki/Full-Configuration
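On current sbt versions (0.13.13 and later, including 1.x) the same structure no longer needs project/Build.scala; a plain build.sbt in the top-level directory is enough. A sketch:

// build.sbt
lazy val root = (project in file("."))
  .aggregate(proj1, proj2)

lazy val proj1 = project in file("dir1")
lazy val proj2 = project in file("dir2")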
According to the sbt tutorial on changing paths, I'm trying to change the "target" output directory to "someother":
override def outputDirectoryName = "someother"
Everything goes fine except for one thing: sbt automatically creates the target directory with a ".history" file inside. Why does sbt do this when it is supposed to create only the "someother" dir? I tried to override all the methods that are inherited from BasicProjectPaths (I use sbt.DefaultProject as the superclass of my project descriptor):
override def mainCompilePath = ...
override def testCompilePath = ...
...
But sbt creates the "target" folder in spite of the path overrides.
It certainly seems that it should use the overridden outputDirectoryName in trunk...
/** The path to the file that provides persistence for history. */
def historyPath: Option[Path] = Some(outputRootPath / ".history")
def outputPath = crossPath(outputRootPath)
def outputRootPath: Path = outputDirectoryName
def outputDirectoryName = DefaultOutputDirectoryName
(from SBT's current trunk).
It may have been different in a previous version. Have you considered raising a new bug?
In sbt 0.13.5, I found a way to change the target folder by just re-assigning target in the build.sbt file:
target := file("someotherParent") / "someotherSubdir"
This only modifies the directory for the built classes and artifacts, however; the .history file still always ends up in the project root directory.
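Since the history location is itself a setting (historyPath, of type Option[File]), it can be relocated as well; a small sketch:

// build.sbt -- also move the .history file under the custom target
historyPath := Some(target.value / ".history")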
Unfortunately, some other plugins (e.g. xsbt-web-plugin) seem to have problems with that: running the webapp via the SBT console produced weird errors, and when I switched back to the standard directory layout, these problems disappeared.
A better way to achieve my goal (all JARs in one directory, with names containing the Java VM version) seems to be to specify an appropriate target for publishing: there are fewer restrictions on "sbt publish", and other plugins are not disturbed by a different directory layout.
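Alternatively, a small custom task can collect the jars without touching target at all; a sketch where the dist directory and the naming scheme are my own invention:

// build.sbt -- copy the packaged jar into dist/ with the JVM version in its name
val collectJar = taskKey[File]("Copy the packaged jar into dist/")

collectJar := {
  val jar = (packageBin in Compile).value
  val vm  = System.getProperty("java.specification.version")
  val out = baseDirectory.value / "dist" / s"${name.value}-jvm${vm}.jar"
  IO.copyFile(jar, out)
  out
}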