How to create ROM with VecInit(Array()) in Chisel? - scala

I'm trying to declare a «rom» with VecInit(), like this:
val GbColors = VecInit(Array(GB_GREEN0, GB_GREEN1, GB_GREEN2, GB_GREEN3))
with GB_GREENx declared like this:
class VgaColors extends Bundle {
  val red = UInt(6.W)
  val green = UInt(6.W)
  val blue = UInt(6.W)
}
//...
import chisel3.experimental.BundleLiterals._ // needed for .Lit

object GbConst {
  //...
  /* "#9BBC0F" */
  val GB_GREEN0 = (new VgaColors()).Lit(_.red -> "h26".U(6.W),
                                        _.green -> "h2F".U(6.W),
                                        _.blue -> "h03".U(6.W))
  /* "#8BAC0F" */
  val GB_GREEN1 = (new VgaColors()).Lit(_.red -> "h1E".U(6.W),
                                        _.green -> "h27".U(6.W),
                                        _.blue -> "h03".U(6.W))
  /* "#306230" */
  val GB_GREEN2 = (new VgaColors()).Lit(_.red -> "h0C".U(6.W),
                                        _.green -> "h18".U(6.W),
                                        _.blue -> "h0C".U(6.W))
  /* "#0F380F" */
  val GB_GREEN3 = (new VgaColors()).Lit(_.red -> "h03".U(6.W),
                                        _.green -> "h0E".U(6.W),
                                        _.blue -> "h03".U(6.W))
}
I can't manage to use GbColors as an indexable Vec:
io.vga_color := GbColors(io.mem_data)
It generates a Java stack error:
[info] [0.004] Elaborating design...
[error] chisel3.internal.ChiselException: Connection between sink (VgaColors(IO in unelaborated MemVga)) and source (VgaColors(Wire in GbWrite)) failed #.blue: Sink or source unavailable to current module.
[error] ...
[error] at gbvga.MemVga.$anonfun$new$42(memvga.scala:87)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] at chisel3.WhenContext.<init>(When.scala:80)
[error] at chisel3.when$.apply(When.scala:32)
[error] at gbvga.MemVga.<init>(memvga.scala:86)
[error] at gbvga.GbVga.$anonfun$memvga$1(gbvga.scala:24)
[error] at chisel3.Module$.do_apply(Module.scala:54)
[error] at gbvga.GbVga.<init>(gbvga.scala:24)
[error] at gbvga.GbVgaDriver$.$anonfun$new$9(gbvga.scala:53)
[error] ... (Stack trace trimmed to user code only, rerun with --full-stacktrace if you wish to see the full stack trace)
...
To work around it, I have to use the switch(){is()} form:
switch(io.mem_data) {
  is("b00".U) {
    io.vga_color := GB_GREEN0
  }
  is("b01".U) {
    io.vga_color := GB_GREEN1
  }
  is("b10".U) {
    io.vga_color := GB_GREEN2
  }
  is("b11".U) {
    io.vga_color := GB_GREEN3
  }
}
But I think it's too verbose.
What is wrong with my VecInit() «rom»?
[edit]
My versions are:
$ java -version
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)
$ scala -version
Scala code runner version 2.11.7-20150420-135909-555f8f09c9 -- Copyright 2002-2013, LAMP/EPFL
In build.sbt :
val defaultVersions = Map(
"chisel3" -> "3.4.0-RC1",
"chisel-iotesters" -> "1.5.0-RC1",
"chisel-formal" -> "0.1-SNAPSHOT",
)

I think the problem here is that the Bundles in GbConst are created outside of a Module. One potential fix would be to make GbConst a trait and mix it into the Modules that need access to those values. (I have created a PR that seems to show this approach works, though it probably creates a lot of copies of the Bundles.) Another approach (that I have not tried) would be to create a Module that serves up all the Bundles as outputs, which should make fewer copies.
My PR also changed the chisel3 and chisel-testers dependencies to be SNAPSHOTS.
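For illustration, a minimal sketch of the trait approach (the VgaColors bundle and the mem_data/vga_color ports come from the question; everything else is my reconstruction). Using defs instead of vals means each literal is constructed during the elaboration of whichever Module mixes the trait in:

import chisel3._
import chisel3.experimental.BundleLiterals._

trait GbConstTrait {
  // defs, not vals: the literals are built inside the elaborating Module
  def GB_GREEN0 = (new VgaColors).Lit(
    _.red -> "h26".U(6.W), _.green -> "h2F".U(6.W), _.blue -> "h03".U(6.W))
  def GB_GREEN1 = (new VgaColors).Lit(
    _.red -> "h1E".U(6.W), _.green -> "h27".U(6.W), _.blue -> "h03".U(6.W))
  def GB_GREEN2 = (new VgaColors).Lit(
    _.red -> "h0C".U(6.W), _.green -> "h18".U(6.W), _.blue -> "h0C".U(6.W))
  def GB_GREEN3 = (new VgaColors).Lit(
    _.red -> "h03".U(6.W), _.green -> "h0E".U(6.W), _.blue -> "h03".U(6.W))
}

class MemVga extends Module with GbConstTrait {
  val io = IO(new Bundle {
    val mem_data  = Input(UInt(2.W))
    val vga_color = Output(new VgaColors)
  })
  // The Vec of literals is now created inside this Module, so indexing works:
  val GbColors = VecInit(GB_GREEN0, GB_GREEN1, GB_GREEN2, GB_GREEN3)
  io.vga_color := GbColors(io.mem_data)
}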

Related

SBT: automatic generation of allOutputFiles

SBT version 1.3.13. According to the SBT documentation:
Like allInputFiles, there is an allOutputFiles task of return type Seq[Path] that is automatically generated for a task, foo, if the return type of foo is one of Seq[Path], Path, Seq[File] or File.
This seems to work with Seq[Path] and Path as expected:
val outputTask = Def.taskKey[Seq[java.nio.file.Path]]("")
outputTask := Seq[java.nio.file.Path]()
val printOutputs = Def.taskKey[Unit]("")
printOutputs := println((outputTask / allOutputFiles).value) // result: Vector()
However, if I change java.nio.file.Path to java.io.File, it fails on loading:
[error] Reference to undefined setting:
[error]
[error] outputTask / allOutputFiles from printOutputs
I'm looking into the SBT source code but I haven't got a clue yet. Any insight on why it works with Path/Seq[Path] but not with File/Seq[File]?
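For reference, the failing variant would presumably look like this (my reconstruction; the question does not show it):

val outputTask = Def.taskKey[Seq[java.io.File]]("")
outputTask := Seq[java.io.File]()
val printOutputs = Def.taskKey[Unit]("")
printOutputs := println((outputTask / allOutputFiles).value) // fails: reference to undefined setting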

How to fix "Could not find proxy for val base" error in sbt 1.3.0 loading project

I upgraded to sbt 1.3.0 and updated plugins.sbt accordingly. When I try to start sbt for my project, it fails to initialize with the error
java.lang.IllegalArgumentException: Could not find proxy for val base: sbt.SettingKey in List(value base, method sbtdef$1, method $sbtdef, object $bd1712fb73ddc970045f, package <empty>, package <root>) (currentOwner= method $sbtdef )
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:316)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.proxy(LambdaLift.scala:330)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.proxyRef(LambdaLift.scala:370)
I did find the Stack Overflow question Could not find proxy for ... in Macro, but I don't think it helps with my error.
I think the offending code is
//Ensure that version.sbt is included with each package.
mappings in Universal ++= {
  val h = (packageBin in Compile, baseDirectory)
  val base = h._2
  val versionFile = (base.value / "version.sbt")
  versionFile.get.map(file => file -> file.name)
}
and for some reason base is not storing (packageBin in Compile, baseDirectory) properly?
Edit:
I'm not 100% sure, but I think I fixed it by removing the intermediate variables and one-lining it. So, something like this:
mappings in Universal ++= {
  ((packageBin in Compile, baseDirectory)._2.value / "version.sbt").get.map(file => file -> file.name)
}
I don't know why it fixed it though...
I think the OP has been confused by the ineffectual tuple use; there seems to be a misunderstanding of the SBT API/DSL here, in that packageBin in Compile is never actually used or resolved (for its side effect).
I believe the error, however, has more to do with expressing the mappings in Universal task value in a way the macro cannot process (it gets confused), for instance by expecting the macro/compiler to find the task key in a variable base, which is the _2 of a Tuple2.
The example could be rewritten as
mappings in Universal ++= {
  (baseDirectory.value / "version.sbt").get.map(file => file -> file.name)
}
or
mappings in Universal ++= {
  ((baseDirectory in (Compile, packageBin)).value / "version.sbt").get.map(file => file -> file.name)
}
Depending on what the intention was (probably the latter).
Of course the new syntax would be
mappings in Universal ++= {
  ((Compile / packageBin / baseDirectory).value / "version.sbt").get.map(file => file -> file.name)
}

How to run a Scala test in a Scala Native application?

I have a hello-world Scala Native app and wanted to run a small Scala test against it. I use the usual test command, but it throws an exception:
NativeMain.scala
object NativeMain {
  val p = new Person("xxxx")

  def main(args: Array[String]): Unit = {
    println("Hello world")
  }
}

class Person(var name: String)
NativeTest.scala
import org.scalatest.{FlatSpec, Matchers}
class NativeTest extends FlatSpec with Matchers {
"name" should "the name is set correctly in constructor" in {
assert(NativeMain.p.name == "xxxx")
}
}
I run the test command in the sbt shell and get this error:
[IJ]> test
[info] Compiling 1 Scala source to /home/****/Documents/ScalaNativeFresh/target/scala-2.11/classes...
[info] Compiling 1 Scala source to /home/****/Documents/ScalaNativeFresh/target/scala-2.11/test-classes...
[info] Compiling 1 Scala source to /home/****/Documents/ScalaNativeFresh/target/scala-2.11/test-classes...
[info] Linking (28516 ms)
[error] cannot link: #java.lang.Thread::getStackTrace_scala.scalanative.runtime.ObjectArray
[error] unable to link
[error] (nativetest:nativeLink) unable to link
[error] Total time: 117 s, completed Apr 2, 2019 3:04:24 PM
Any help or suggestions? Thank you :)
There is an open issue, Add support for Scala Native #1112, and according to cheeseng:
3.1.0-SNAP6 and 3.2.0-SNAP10 are the only 2 versions (as of the time of writing) that supports scala-native
Try adding the scalatest_native0.3_2.11 dependency like so:
libraryDependencies += "org.scalatest" % "scalatest_native0.3_2.11" % "3.2.0-SNAP10"
scalatest-native-example is a working example showing how to use scalatest with scala-native.
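As an aside, and purely my assumption rather than part of the answer above: link errors on stubbed JDK methods such as java.lang.Thread.getStackTrace are commonly worked around in sbt-scala-native by enabling link stubs in build.sbt:

// assumption: the project has enablePlugins(ScalaNativePlugin)
nativeLinkStubs := true // also link methods marked @stub, e.g. Thread.getStackTrace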

Not found: type Build (Unresolved 'Build')

I'm getting the following set of errors, which I believe are caused by the sbt-assembly plugin that is used.
In fact, it is the object declaration:
object Build extends Build { (here the type Build is unresolved).
The error is as follows,
Error: Error while importing SBT project:
[info] Loading settings from assembly.sbt,plugins.sbt ...
[info] Loading project definition from C:\Users\ShakthiW\IdeaProjects\TestProject\project
[error] <path>\Build.scala:4:22: not found: type Build
[error] object Build extends Build{
[error] ^
[error] <path>\Build.scala:8:80: value map is not a member of (sbt.TaskKey[sbt.librarymanagement.UpdateReport], sbt.SettingKey[java.io.File], sbt.SettingKey[String])
[error] def copyDepTask = copyDependencies <<= (update, crossTarget, scalaVersion) map {
[error] ^
[error] <path>\Build.scala:19:16: too many arguments (3) for method apply: (id: String, base: java.io.File)sbt.Project in object Project
[error] Note that 'settings' is not a parameter name of the invoked method.
[error] settings = Defaults.defaultSettings ++ Seq(
[error] ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
A quick resolution would be highly appreciated.
My Build.scala looks like this.
import sbt.Keys._
import sbt._
object MyBuild extends Build {
  lazy val copyDependencies = TaskKey[Unit]("copy-dependencies")

  def copyDepTask = copyDependencies <<= (update, crossTarget, scalaVersion) map {
    (updateReport, out, scalaVer) =>
      updateReport.allFiles foreach { srcPath =>
        val destPath = out / "lib" / srcPath.getName
        IO.copyFile(srcPath, destPath, preserveLastModified = true)
      }
  }

  lazy val root = Project(
    "root",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(
      copyDepTask
    )
  )
}
Also, I reckon there is an issue with the sbt-assembly upgrades as well, which I'm not entirely aware of.
In sbt 1.0.x, some key dependency operators (such as <<=) were removed. See the migration docs: https://www.scala-sbt.org/0.13/docs/Migrating-from-sbt-012x.html
Here is a short tutorial on writing Build.scala for sbt 1.0.x: https://alvinalexander.com/scala/sbt-how-to-use-build.scala-instead-of-build.sbt.
You can also refer to the build.scala of an existing project for more reference, e.g. scalaz.
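For illustration, a sketch of what the copy-dependencies task could look like after migration (my sketch based on the docs above, moved into build.sbt; <<= with map becomes := with .value, and the removed Build trait goes away entirely):

// build.sbt
lazy val copyDependencies = taskKey[Unit]("copy-dependencies")

copyDependencies := {
  val out = crossTarget.value
  update.value.allFiles.foreach { srcPath =>
    IO.copyFile(srcPath, out / "lib" / srcPath.getName, preserveLastModified = true)
  }
}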

Why does custom scaladoc task throw MissingRequirementError: object scala.annotation.Annotation in compiler mirror not found?

I hit a MissingRequirementError when I try to invoke scaladoc from within an sbt task.
Using any version of sbt 0.13.x, start with this build.sbt:
val scaladoc = taskKey[Unit]("run scaladoc")
scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  settings.usejavacp.value = true
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}
Then run sbt scaladoc, and behold (during makeUniverse):
[info] Set current project to test (in build file:...)
scala.reflect.internal.MissingRequirementError: object scala.annotation.Annotation in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
What is wrong here? I've already tried fork := true and different combinations of sbt/scala versions to no avail.
It seems you need to provide scala-library (and indeed, any other dependencies) directly to the DocFactory.
scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  val dependencyPaths = (update in Compile).value
    .select().map(_.absolutePath).mkString(java.io.File.pathSeparator)
  settings.classpath.append(dependencyPaths)
  settings.bootclasspath.append(dependencyPaths)
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}