Simplest way to generate Verilog code from Chisel code

What is the simplest way to generate Verilog code from existing Chisel code?
Would I have to create my own build file?
For example, from a standalone Scala file (AND.scala) like the following one:
import Chisel._

class AND extends Module {
  val io = IO(new Bundle {
    val a = Bool(INPUT)
    val b = Bool(INPUT)
    val out = Bool(OUTPUT)
  })
  io.out := io.a & io.b
}
I have the complete Chisel3 toolchain installed under Ubuntu 16.04.

See the answer here: Is there a simple example of how to generate verilog from Chisel3 module?
In short, create a build.sbt file at the root of your project with the following in it:
scalaVersion := "2.12.13"
resolvers ++= Seq(
Resolver.sonatypeRepo("snapshots"),
Resolver.sonatypeRepo("releases")
)
libraryDependencies += "edu.berkeley.cs" %% "chisel3" % "3.4.4"
Add this code to AND.scala
object ANDDriver extends App {
  (new chisel3.stage.ChiselStage).emitVerilog(new AND, args)
}
Type sbt run on the command line at the root of your project.
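The driver above forwards its command-line args to ChiselStage, so you can also control where the output lands. A minimal sketch using the standard --target-dir option (the generated directory name is arbitrary):
object ANDDriver extends App {
  // Write AND.v (plus the intermediate .fir and .anno.json files) into ./generated
  (new chisel3.stage.ChiselStage).emitVerilog(new AND, Array("--target-dir", "generated"))
}
Equivalently, keep the original driver and run sbt "run --target-dir generated".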

Related

Need to provide a SettingKey from a plugin I use in my sbt plugin

I am using the s3 resolver plugin and would like to override it in my AutoPlugin.
I have tried adding the value to projectSettings and globalSettings.
Error
not found: value s3CredentialsProvider
[error] s3CredentialsProvider := s3CredentialsProviderChain
Code
lazy val s3CredentialsProviderChain = { bucket: String =>
  new AWSCredentialsProviderChain(
    new EnvironmentVariableCredentialsProvider(),
    CustomProvider.create(bucket)
  )
}
override lazy val projectSettings = Seq(
  publishTo := {
    if (Keys.isSnapshot.value) {
      Some("my-snapshots" at "s3://rest-of-stuff")
    } else {
      Some("my-releases" at "s3://rest-of-stuff")
    }
  },
  s3CredentialsProvider := s3CredentialsProviderChain
)
The plugin code I'm working on does not define any custom settings of its own, and thus has no autoImport of its own.
Update
I have been unable to resolve fm.sbt.S3ResolverPlugin in MyPlugin, and the code won't compile.
I have tried adding it to enablePlugins in MyPlugin's build.sbt, as well as adding it to the dependencies like this:
libraryDependencies ++= Seq(
"com.amazonaws" % "aws-java-sdk-sts" % amazonSDKVersion,
"com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.17.0"
)
I get an error from sbt, which I've asked about below:
sbt fails to resolve a plugin as dependency
If you create an AutoPlugin in the project directory, you need to add this to plugins.sbt:
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
If you create an independent plugin, add this to the plugin's build.sbt:
sbtPlugin := true
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
autoImport does not work in Scala files that are compiled for sbt, i.e. in plugins. You have to specify import statements as in a plain Scala program. Something like this:
import fm.sbt.S3ResolverPlugin
import sbt._
object TestPlugin extends AutoPlugin {
  override def requires = S3ResolverPlugin
  override def trigger = allRequirements

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    S3ResolverPlugin.autoImport.s3CredentialsProvider := ???
  )
}
Note that to enable TestPlugin, you have to call enablePlugins(S3ResolverPlugin) in build.sbt.
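Tying this back to the question, the ??? above can be replaced with the credentials provider chain. A sketch, assuming the question's CustomProvider class is visible to the plugin:
import com.amazonaws.auth.{AWSCredentialsProviderChain, EnvironmentVariableCredentialsProvider}
import fm.sbt.S3ResolverPlugin
import sbt._

object TestPlugin extends AutoPlugin {
  override def requires = S3ResolverPlugin
  override def trigger = allRequirements

  // Same chain as in the question; CustomProvider is assumed to be defined elsewhere
  private val s3CredentialsProviderChain = { bucket: String =>
    new AWSCredentialsProviderChain(
      new EnvironmentVariableCredentialsProvider(),
      CustomProvider.create(bucket)
    )
  }

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    S3ResolverPlugin.autoImport.s3CredentialsProvider := s3CredentialsProviderChain
  )
}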

I cannot import filters in Play Framework 2.3.0

I use Play Framework 2.3.0, and recently I wanted to add the CSRFFilter.
When I import csrf in Global.scala:
import play.filters.csrf._
I get an error for this:
[error] G:\testprojects\app\Global.scala:7: object filters is not a member of package play
[error] import play.filters.csrf._
My plugins.sbt is:
...
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.0")
...
I use Build.scala instead of build.sbt
lazy val root = Project("root", base = file("."))
  .enablePlugins(PlayScala)
  .settings(baseSettings: _*)
  .settings(libraryDependencies ++= appDependencies)
  .settings(
    scalaVersion := "2.11.1",
    version := "1.0"
  )
According to the documentation, you have to add the filters dependency to your project:
libraryDependencies += filters
The documentation is for build.sbt but I guess it should work with Build.scala too.
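In a Build.scala definition the filters helper is not auto-imported as it is in build.sbt, so you have to import it yourself. A sketch of the equivalent, assuming Play 2.3's play.PlayImport object:
import play.PlayImport._ // brings filters (and cache, javaJdbc, ...) into scope

lazy val root = Project("root", base = file("."))
  .enablePlugins(PlayScala)
  .settings(libraryDependencies += filters)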
Play Framework's GzipFilter is working for me. My build.sbt file:
name := "GZIP"
version := "1.0-SNAPSHOT"
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
cache,
filters
)
play.Project.playJavaSettings
Steps to get the play.filters package:
1. play
2. update // important
3. clean
4. eclipse
5. compile
6. run
Finally it will work (the update command is important). If the IDE is not detecting play.filters, do the above steps one more time. Then copy-paste the code below:
import play.GlobalSettings;
import play.api.mvc.EssentialFilter;
import play.filters.gzip.GzipFilter;

public class Global extends GlobalSettings {
    public <T extends EssentialFilter> Class<T>[] filters() {
        return new Class[]{GzipFilter.class};
    }
}
In Play 2.4.3, the import is:
import play.filters.cors.CORSActionBuilder
It's no longer called csrf, but cors.

Compile with different settings in different commands

I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands, each of which compiles only some of the files inside this project. The testScalalib command added above, for instance, is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._
  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(
      _ filter (f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations like compile and test, and have the appropriate settings values that would suit your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands; I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
This is how you can do it using sbt 0.13.2 and Build.scala (you could of course do the same in build.sbt, or in older sbt versions with a different syntax):
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile, everything will get compiled, but when you call api:compile, only the files matching the filter predicate will be.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or defining an includeFilter.
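For instance, a sketch of those alternatives inside the same inConfig(Api) block (the src/api/scala directory and the filter values are hypothetical):
lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
  // Give the Api config its own source tree instead of filtering Compile's sources
  unmanagedSourceDirectories := Seq(baseDirectory.value / "src" / "api" / "scala"),
  // Or keep shared directories and narrow what gets picked up from them
  includeFilter in unmanagedSources := ("*.scala" -- "Typers.scala")
))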

Can I access my Scala app's name and version (as set in SBT) from code?

I am building an app with SBT (0.11.0) using a Scala build definition like so:
object MyAppBuild extends Build {
  import Dependencies._

  lazy val basicSettings = Seq[Setting[_]](
    organization := "com.my",
    version := "0.1",
    description := "Blah",
    scalaVersion := "2.9.1",
    scalacOptions := Seq("-deprecation", "-encoding", "utf8"),
    resolvers ++= Dependencies.resolutionRepos
  )

  lazy val myAppProject = Project("my-app-name", file("."))
    .settings(basicSettings: _*)
    [...]
I'm packaging a .jar at the end of the process.
My question is a simple one: is there a way of accessing the application's name ("my-app-name") and version ("0.1") programmatically from my Scala code? I don't want to repeat them in two places if I can help it.
Any guidance greatly appreciated!
sbt-buildinfo
I just wrote sbt-buildinfo.
After installing the plugin:
lazy val root = (project in file(".")).
  enablePlugins(BuildInfoPlugin).
  settings(
    buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
    buildInfoPackage := "foo"
  )
Edit: The above snippet has been updated to reflect a more recent version of sbt-buildinfo.
It generates foo.BuildInfo object with any setting you want by customizing buildInfoKeys.
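From application code, the generated object can then be used directly. A minimal sketch, assuming buildInfoPackage := "foo" as above:
object Main extends App {
  // name, version, scalaVersion and sbtVersion come from the buildInfoKeys setting
  println(s"${foo.BuildInfo.name} ${foo.BuildInfo.version}")
}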
Ad-hoc approach
I've been meaning to make a plugin for this (I wrote it; see above), but here's a quick script to generate a file:
sourceGenerators in Compile <+= (sourceManaged in Compile, version, name) map { (d, v, n) =>
  val file = d / "info.scala"
  IO.write(file,
    """package foo
      |object Info {
      |  val version = "%s"
      |  val name = "%s"
      |}
      |""".stripMargin.format(v, n))
  Seq(file)
}
You can get your version as foo.Info.version.
Name and version are inserted into the jar manifest. You can access them through the java.lang.Package class. Note that the manifest is only available when running from the packaged jar; under sbt run these methods return null.
val p = getClass.getPackage
val name = p.getImplementationTitle
val version = p.getImplementationVersion
You can also generate a config file at build time and read it back from Scala.
// generate config (to pass values from build.sbt to Scala)
Compile / resourceGenerators += Def.task {
  val file = baseDirectory.value / "conf" / "generated.conf"
  val contents = "app.version=%s".format(version.value)
  IO.write(file, contents)
  Seq(file)
}.taskValue
When you run sbt universal:packageBin the file will be there.
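To read it back you could use, e.g., Typesafe Config. A sketch, assuming the conf directory ends up on the runtime classpath (as it does in Play):
import com.typesafe.config.ConfigFactory

// Parse the generated resource and recover the version written at build time
val generated = ConfigFactory.parseResources("generated.conf")
val appVersion = generated.getString("app.version")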

How to get list of dependency jars from an sbt 0.10.0 project

I have an sbt 0.10.0 project that declares a few dependencies, somewhat like:
object MyBuild extends Build {
  val commonDeps = Seq(
    "commons-httpclient" % "commons-httpclient" % "3.1",
    "commons-lang" % "commons-lang" % "2.6"
  )

  val buildSettings = Defaults.defaultSettings ++ Seq(organization := "org")

  lazy val proj = Project("proj", file("src"),
    settings = buildSettings ++ Seq(
      name := "projname",
      libraryDependencies := commonDeps, ...)
  ...
}
I wish to create a build rule to gather all the jar dependencies of "proj", so that I can symlink them to a single directory.
Thanks.
Example SBT task to print full runtime classpath
Below is roughly what I'm using. The "get-jars" task is executable from the SBT prompt.
import sbt._
import Keys._

object MyBuild extends Build {
  // ...

  val getJars = TaskKey[Unit]("get-jars")

  val getJarsTask = getJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
    println("Target path is: " + target)
    println("Full classpath is: " + cp.map(_.data).mkString(":"))
  }

  lazy val project = Project(
    "project",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(getJarsTask)
  )
}
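Since the goal is to symlink the jars into a single directory, the same recipe can be extended into a task that does the linking. A sketch in the same style (the link-jars name and target/deps location are arbitrary; java.nio.file requires Java 7+):
val linkJars = TaskKey[Unit]("link-jars")

val linkJarsTask = linkJars <<= (target, fullClasspath in Runtime) map { (target, cp) =>
  val dir = target / "deps"
  IO.delete(dir) // start fresh so stale links don't collide
  IO.createDirectory(dir)
  cp.map(_.data).filter(_.getName.endsWith(".jar")).foreach { jar =>
    // create a symlink named after the jar, pointing at its resolved location
    java.nio.file.Files.createSymbolicLink((dir / jar.getName).toPath, jar.toPath)
  }
}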
Other resources
Unofficial guide to sbt 0.10.
Keys.scala defines predefined keys. For example, you might want to replace fullClasspath with managedClasspath.
This plugin defines a simple command to generate an .ensime file, and may be a useful reference.