Can a Vec be reduced in Chisel? - scala

I'm trying to reduce a Vec of Bundles into a UInt, but I get an error:
class DiffPair extends Bundle {
  val p = Bool()
  val n = Bool()
}
class TMDS extends Bundle {
  val clk = new DiffPair()
  val data = Vec(3, new DiffPair())
}
val io = IO(new Bundle {
  //...
  val tmds = Output(new TMDS())
})
val O_tmds_data_p = IO(Output(UInt(3.W)))
If I use this reduce code:
O_tmds_data_p := gbHdmi.io.tmds.data.reduce((a,b) => Cat(a.p, b.p))
I get this error:
[info] compiling 1 Scala source to /home/user/GbHdmi/target/scala-2.12/classes ...
[error] /home/user/GbHdmi/src/main/scala/topgbhdmi.scala:72:67: type mismatch;
[error] found : chisel3.UInt
[error] required: gbhdmi.DiffPair
[error] O_tmds_data_p := gbHdmi.io.tmds.data.reduce((a,b) => Cat(a.p, b.p))
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 0 s, completed Aug 31, 2021, 15:04:08
I know that a and b are DiffPair, but a.p and b.p are Bool, aren't they?

The solution I found is to take the 'p' value with map before reduce:
O_tmds_data_p := gbHdmi.io.tmds.data.map(_.p.asUInt).reduce((a,b) => Cat(a,b))
Or, in a simplified way:
O_tmds_data_p := gbHdmi.io.tmds.data.map(_.p.asUInt).reduce(_ ## _)
And this way didn't work, for a reason that is obscure to me:
O_tmds_data_p := gbHdmi.io.tmds.data.reduce(_.p.asUInt ## _.p.asUInt)
Error:
sbt:GbHdmi> runMain gbhdmi.TopGbHdmiDriver
[info] compiling 1 Scala source to /home/user/GbHdmi/target/scala-2.12/classes ...
[error] /home/user/GbHdmi/src/main/scala/topgbhdmi.scala:71:63: type mismatch;
[error] found : chisel3.UInt
[error] required: gbhdmi.DiffPair
[error] O_tmds_data_p := gbHdmi.io.tmds.data.reduce(_.p.asUInt ## _.p.asUInt)
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 0 s, completed Sep 1, 2021, 9:17:15 AM

Have you tried
O_tmds_data_p := gbHdmi.io.tmds.data.asUInt
I think that should work for you.
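As a side note on the type error itself: reduce on a Seq[DiffPair] expects a function (B, B) => B where B must be a supertype of DiffPair, so a lambda that takes two DiffPairs but returns a UInt (as both Cat(a.p, b.p) and _.p.asUInt ## _.p.asUInt do) cannot typecheck. Mapping to UInt first keeps the element and result types aligned, which is why the map-then-reduce version works. A minimal sketch reusing the names from the question:
// extract the single-bit p fields first, so reduce only ever combines UInts
val pBits: Seq[UInt] = gbHdmi.io.tmds.data.map(_.p.asUInt)
O_tmds_data_p := pBits.reduce(_ ## _) // (UInt, UInt) => UInt matches reduce's expected shape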

Related

Scala internal compiler error (possibly?)

Code runs fine in the Scala REPL, but will not compile as part of a Scala source code base. It seems like the compiler is throwing some kind of internal error.
I have a List[(String,(Int,Int))] I want to convert to a Map[String,List[(Int,Int)]] so that I can search it faster. It is basically a set of buckets indexed by the String component that I can then search over and over, perhaps with binary search, once I have zeroed in on a bucket. I am trying to build the Map structure from the List structure using a foldLeft operation.
val tpls:List[(String,(Int,Int))] = sm.compressedSegmentList.
  map(_.split("|")).
  map(s => (s(1), s(2).toInt, s(3).toInt)).
  sortBy(_._1).
  map(s => (s._1 -> (s._2,s._3)))
val chrBuckets:Map[String,List[(Int,Int)]] =
  tpls.
    foldLeft(Map().empty.asInstanceOf[String,List[(Int,Int)]])(
      (y,x) => {
        if (!y.contains(x._1)) y ++ Map(x._1->List(x._2))
        else y ++ Map(x._1 -> (y(x._1):+x._2))
      }
    )
In the scala REPL, I can do the following:
scala> val l1 = List(("a",(1,2)),("a", (2,3)), ("a",(3,4)), ("b", (17,18)), ("b", (18,19)), ("c", (20,21)), ("d",(0,0)))
l1: List[(String, (Int, Int))] = List((a,(1,2)), (a,(2,3)), (a,(3,4)), (b,(17,18)), (b,(18,19)), (c,(20,21)), (d,(0,0)))
scala> l1.foldLeft(Map().empty.asInstanceOf[Map[String,List[(Int,Int)]]])((y,x) => if (!y.contains(x._1)) y ++ Map(x._1->List(x._2)) else y ++ Map(x._1 -> (y(x._1):+x._2)))
res88: Map[String,List[(Int, Int)]] = Map(a -> List((1,2), (2,3), (3,4)), b -> List((17,18), (18,19)), c -> List((20,21)), d -> List((0,0)))
However, when I compile my project using "sbt compile", it fails, throwing up errors such as:
[error] last tree to typer: Ident(scala)
[error] tree position: line 26 of /usr/home/maketo/util/SegmentMatcher.scala
[error] tree tpe: scala.type
[error] symbol: final package scala
[error] symbol definition: final package scala (a ModuleSymbol)
[error] symbol package: <none>
[error] symbol owners: package scala
[error] call site: class SegmentMatcher in package util in package util
[error]
[error] == Source file context for tree position ==
[error]
[error] 23 // within the bucket we can do binary search on "segStart" and "segEnd" fields
[error] 24 // after we sort first by segStart and then segEnd
[error] 25 val chrBuckets:Map[String,List[(Int,Int)]] = tpls.
[error] 26 foldLeft(Map().empty.asInstanceOf[String,List[(Int,Int)]])((y,x) => { if (!y.contains(x._1)) y ++ Map(x._1->List(x._2)) else y ++ Map(x._1 -> (y(x._1):+x._2))})
[error] 27
[error] 28 // take a segment and match it against a segmentation
[error] 29 // return segment ID or None
[error] at scala.reflect.internal.SymbolTable.throwAssertionError(SymbolTable.scala:183)
[error] at scala.tools.nsc.typechecker.Typers$Typer.vanillaAdapt$1(Typers.scala:1189)
[error] at scala.tools.nsc.typechecker.Typers$Typer.adapt(Typers.scala:1243)
[error] at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5740)
[error] at scala.tools.nsc.typechecker.Typers$Typer.$anonfun$typed1$60(Typers.scala:5142)
[error] at scala.tools.nsc.typechecker.Typers$Typer.typedTypeSelectionQualifier$1(Typers.scala:5142)
[error] at scala.tools.nsc.typechecker.Typers$Typer.typedSelectOrSuperCall$1(Typers.scala:5150)
.. snipped for brevity, to this:
[error] (Compile / compileIncremental) java.lang.AssertionError: assertion failed:
[error] Context(SegmentMatcher.scala) {
[error] owner = value chrBuckets
[error] tree = Apply:tpls.foldLeft(Map().empty.asInstanceOf[String, List[scala.Tuple2[Int,
[error] scope = 0 decls
[error] contextMode = MacrosEnabled TypeConstructorAllowed
[error] outer.owner = value chrBuckets
[error] }
[error] while compiling: /usr/home/maketo/SegmentMatcher.scala
[error] during phase: typer
[error] library version: version 2.12.8
[error] compiler version: version 2.12.8
Yes, that's an internal compiler error. The following snippet crashes 2.12.5 - 2.12.8 and 2.13.0-M5:
Map().asInstanceOf[Int, List[(Int, Int)]]
Workaround (that's what you should have done anyway, compiler error or not):
Map.empty[Int, List[(Int, Int)]]
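For completeness, here is the questioner's foldLeft with that workaround applied (same logic, just a properly typed empty map; tpls is assumed to be the list built above):
val chrBuckets: Map[String, List[(Int, Int)]] =
  tpls.foldLeft(Map.empty[String, List[(Int, Int)]]) { (y, x) =>
    // start a new bucket for an unseen key, otherwise append to the existing one
    if (!y.contains(x._1)) y ++ Map(x._1 -> List(x._2))
    else y ++ Map(x._1 -> (y(x._1) :+ x._2))
  }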
I understand you are looking to figure out whether the above was a compiler error, but if you are looking for a workaround, see if one of these two snippets works for you.
// Initialize
val tpls: List[(String, (Int, Int))] = List(
  "foo" -> (1, 2),
  "bar" -> (3, 4),
  "foo" -> (5, 6)
)
// Option 1
tpls.groupBy(_._1).mapValues(_.map(_._2))
// returns Map(foo -> List((1,2), (5,6)), bar -> List((3,4)))
// Option 2
val emptyMap = Map.empty[String, List[(Int,Int)]].withDefaultValue(List.empty[(Int, Int)])
tpls.foldLeft(emptyMap) { case (mapSoFar, (segmentName, interval)) =>
  mapSoFar.updated(segmentName, interval::mapSoFar(segmentName))
}
// returns Map(foo -> List((5,6), (1,2)), bar -> List((3,4)))
The second result is in reverse order, but I just wanted to note that for lists :: is significantly less expensive than :+ (in case you don't care about the order).
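If the original order does matter, one possible tweak to Option 2 above is to keep the cheap :: during the fold and reverse each bucket once at the end:
tpls.foldLeft(emptyMap) { case (mapSoFar, (segmentName, interval)) =>
  mapSoFar.updated(segmentName, interval :: mapSoFar(segmentName))
}.map { case (name, intervals) => name -> intervals.reverse }
// returns Map(foo -> List((1,2), (5,6)), bar -> List((3,4)))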

Error after running "sbt test" in Chisel 3

I'm trying to use Chisel 3.
I tried to test the GCD.scala file in the chisel project template repo using the sbt test and sbt "test-only example.GCD" commands, following the answer to a previous question. But this gives errors whose cause I cannot find. I didn't make any changes to the build.sbt file or the repo layout. I'm posting only the last part of the error message since it is very long and repetitive.
[info] Loading project definition from /home/isuru/fyp/ChiselProjects/TrialProject/project
[info] Set current project to chisel-module-template (in build file:/home/isuru/fyp/ChiselProjects/TrialProject/)
[info] Compiling 1 Scala source to /home/isuru/fyp/ChiselProjects/TrialProject/target/scala-2.11/classes...
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:5: not found: object Chisel3
[error] import Chisel3._
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:7: not found: type Module
[error] class GCD extends Module {
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:8: not found: type Bundle
[error] val io = new Bundle {
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:9: not found: value UInt
[error] val a = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:9: not found: value INPUT
[error] val a = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:10: not found: value UInt
[error] val b = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:10: not found: value INPUT
[error] val b = UInt(INPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:11: not found: value Bool
[error] val e = Bool(INPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:11: not found: value INPUT
[error] val e = Bool(INPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:12: not found: value UInt
[error] val z = UInt(OUTPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:12: not found: value OUTPUT
[error] val z = UInt(OUTPUT, 16)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:13: not found: value Bool
[error] val v = Bool(OUTPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:13: not found: value OUTPUT
[error] val v = Bool(OUTPUT)
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:15: not found: value Reg
[error] val x = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:15: not found: value UInt
[error] val x = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:16: not found: value Reg
[error] val y = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:16: not found: value UInt
[error] val y = Reg(UInt())
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:17: not found: value when
[error] when (x > y) { x := x - y }
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:18: not found: value unless
[error] unless (x > y) { y := y - x }
[error] ^
[error] /home/isuru/fyp/ChiselProjects/TrialProject/src/main/scala/example/GCD.scala:19: not found: value when
[error] when (io.e) { x := io.a; y := io.b }
[error] ^
[error] 20 errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed Dec 1, 2016 8:26:25 PM
The errors you have shown suggest that sbt is somehow not finding Chisel. Could you by chance show the full list of errors (especially the early ones)? With the following sequence of commands I am unable to reproduce the errors you are seeing:
git clone git@github.com:ucb-bar/chisel-template.git
cd chisel-template
sbt test
It is not the cause of this issue, but to run the test in chisel-template you should actually run sbt "test-only examples.test.GCDTester". example.GCD is the top of the design, but to run the test you have to refer to the Tester class in src/test/scala/examples/test/GCDUnitTest.scala.
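For what it's worth, the very first error in the log, not found: object Chisel3, suggests the GCD.scala being compiled imports Chisel3._ (capital C) and uses old Chisel 2 style constructors such as UInt(INPUT, 16). In Chisel 3 the package is lowercase chisel3 and ports are wrapped in Input/Output. As a rough sketch only (the exact literal syntax depends on the Chisel version the template pins), the same ports would look like:
import chisel3._

class GCD extends Module {
  val io = IO(new Bundle {
    val a = Input(UInt(16.W))
    val b = Input(UInt(16.W))
    val e = Input(Bool())
    val z = Output(UInt(16.W))
    val v = Output(Bool())
  })
  // register declarations and the when-logic are omitted; only the ports are shown
}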
I just encountered this same problem when making my own Chisel project. However, in my case it was not that the import chisel3._ was wrong. The problem was that I did not have a build.sbt file in my directory.
I found my solution here.
https://chisel.eecs.berkeley.edu/2.0.6/getting-started.html

Specs2 won't print scalacheck counterexamples?

I'm using specs2 to run my tests. I'm able to get it running ScalaCheck, but in the code below (when I run with sbt test) it doesn't print the counterexample. This is almost useless without the counterexample:
import org.specs2.mutable.Specification
import org.scalacheck.Properties
import org.scalacheck.Prop
import org.specs2.ScalaCheck
import org.specs2.scalacheck.Parameters
import org.scalacheck.Gen

class StripeExportSpec extends Specification with ScalaCheck {
  import StripeExportJob._

  // .verbose makes no difference
  implicit val params = Parameters().setVerbosity(10)

  val p2: Properties = new Properties("dayIntervals") {
    val dayEpochs = for {
      n <- Gen.choose(1l, 500l)
      m <- Gen.choose(n, 500l)
    } yield (n*twentyFourHours, m*twentyFourHours)

    property("aligns start to first parameter") = Prop.forAll(dayEpochs) { x: (Long,Long) =>
      val (a, b) = x
      val result = dayIntervals(a, b)
      result.head._1 == a
    }

    property("aligns end correctly to 24 hours after b") = Prop.forAll(dayEpochs) { x: (Long,Long) =>
      val (a, b) = x
      val result = dayIntervals(a, b)
      result.last._2 == b+twentyFourHours
    }
  }

  //s2"dayIntervals respects ${properties(p2)}"
  "dayIntervals respects " >> addFragments(properties(p2))
}
Instead all I get is:
[info] StripeExportSpec
[info]
[info] dayIntervals respects
[info]
[error] ! dayIntervals.aligns start to first parameter
[error] java.lang.AssertionError: assertion failed (StripeExport.scala:157)
[error] com.handy.pipeline.jobs.StripeExportJob$.dayIntervals(StripeExport.scala:157)
[error] com.handy.pipeline.jobs.StripeExportSpec$$anon$1$$anonfun$2.apply(StripeExportSpec.scala:28)
[error] com.handy.pipeline.jobs.StripeExportSpec$$anon$1$$anonfun$2.apply(StripeExportSpec.scala:26)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$3.apply(Prop.scala:713)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$3.apply(Prop.scala:713)
[error] org.scalacheck.Prop$.secure(Prop.scala:457)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.org$scalacheck$Prop$$anonfun$$result$1(Prop.scala:713)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$4.apply(Prop.scala:720)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$4.apply(Prop.scala:720)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.getFirstFailure$1(Prop.scala:720)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.shrinker$1(Prop.scala:730)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.apply(Prop.scala:752)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.apply(Prop.scala:707)
[error] org.scalacheck.Prop$$anonfun$apply$5.apply(Prop.scala:292)
[error] org.scalacheck.Prop$$anonfun$apply$5.apply(Prop.scala:291)
[error] org.scalacheck.PropFromFun.apply(Prop.scala:22)
[error] org.scalacheck.Test$.org$scalacheck$Test$$workerFun$1(Test.scala:294)
[error] org.scalacheck.Test$$anonfun$3.apply(Test.scala:323)
[error] org.scalacheck.Test$$anonfun$3.apply(Test.scala:323)
[error] org.scalacheck.Platform$.runWorkers(Platform.scala:40)
[error] org.scalacheck.Test$.check(Test.scala:323)
[error] com.handy.pipeline.jobs.StripeExportSpec.check(StripeExportSpec.scala:14)
[info]
[error] ! dayIntervals.aligns end correctly to 24 hours after b
[error] java.lang.AssertionError: assertion failed (StripeExport.scala:157)
[error] com.handy.pipeline.jobs.StripeExportJob$.dayIntervals(StripeExport.scala:157)
[error] com.handy.pipeline.jobs.StripeExportSpec$$anon$1$$anonfun$5.apply(StripeExportSpec.scala:34)
[error] com.handy.pipeline.jobs.StripeExportSpec$$anon$1$$anonfun$5.apply(StripeExportSpec.scala:32)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$3.apply(Prop.scala:713)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$3.apply(Prop.scala:713)
[error] org.scalacheck.Prop$.secure(Prop.scala:457)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.org$scalacheck$Prop$$anonfun$$result$1(Prop.scala:713)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$4.apply(Prop.scala:720)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1$$anonfun$4.apply(Prop.scala:720)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.getFirstFailure$1(Prop.scala:720)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.shrinker$1(Prop.scala:730)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.apply(Prop.scala:752)
[error] org.scalacheck.Prop$$anonfun$forAllShrink$1.apply(Prop.scala:707)
[error] org.scalacheck.Prop$$anonfun$apply$5.apply(Prop.scala:292)
[error] org.scalacheck.Prop$$anonfun$apply$5.apply(Prop.scala:291)
[error] org.scalacheck.PropFromFun.apply(Prop.scala:22)
[error] org.scalacheck.Test$.org$scalacheck$Test$$workerFun$1(Test.scala:294)
[error] org.scalacheck.Test$$anonfun$3.apply(Test.scala:323)
[error] org.scalacheck.Test$$anonfun$3.apply(Test.scala:323)
[error] org.scalacheck.Platform$.runWorkers(Platform.scala:40)
[error] org.scalacheck.Test$.check(Test.scala:323)
[error] com.handy.pipeline.jobs.StripeExportSpec.check(StripeExportSpec.scala:14)
This is a bug in specs2 which occurs when you throw AssertionErrors (or any kind of java.lang.Error) in properties. This is fixed in 3.8.4-20160905063548-8470e96.
Also, since you are using a Specification you don't have to use ScalaCheck Properties. You can write:
"dayIntervals" >> {
"aligns start to first parameter" >> Prop.forAll(dayEpochs) { x: (Long,Long) =>
val (a, b) = x
val result = dayIntervals(a, b)
result.head._1 === a
}
// another way of using generators
"aligns end correctly to 24 hours after b" >> prop { x: (Long,Long) =>
val (a, b) = x
val result = dayIntervals(a, b)
result.last._2 === b+twentyFourHours
}.setGen(dayEpochs)
}
val dayEpochs = for {
n <- Gen.choose(1l, 500l)
m <- Gen.choose(n, 500l)
} yield (n*twentyFourHours,m*twentyFourHours)
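Until the fixed version is available to you, one possible workaround (only a sketch, assuming the AssertionError is thrown by an assert inside dayIntervals itself) is to catch it in the property body and turn it into an ordinary failure, so ScalaCheck can still shrink and report the counterexample:
property("aligns start to first parameter") = Prop.forAll(dayEpochs) { x: (Long, Long) =>
  val (a, b) = x
  // convert the thrown AssertionError into a plain false so the failing
  // arguments are reported instead of only the stack trace
  try dayIntervals(a, b).head._1 == a
  catch { case _: AssertionError => false }
}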

How can I use Scala macros to create an object?

I am trying to create a Scala macro that will generate an object - something like
object SomeEnum {
  sealed abstract class Enum(name: String)
  case object Option1 extends Enum("option1")
  case object Option2 extends Enum("option2")
  private val elements: Seq[Enum] = Seq(Option1, Option2)
  def apply(code: String): Enum = {
    ...
  }
}
I thought I might be able to create a macro createEnum, so I could just put
createEnum("SomeEnum", "Option1", "Option2") into my code and have it generate the object. It seems like it's calling out for a macro.
But I must not be understanding macros. I am using Scala 2.11.6, and just to try to get something working, I created the following:
object createEnumObj {
  def createEnumImpl(c: scala.reflect.macros.whitebox.Context)(ename: c.Expr[String]): c.universe.ModuleDef = {
    import c.universe._
    val Literal(Constant(s_ename: String)) = ename.tree
    val oname = TermName(s_ename)
    val barLine = q"val bar: Int = 5"
    q"object $oname { $barLine }"
  }
  def createEnum(ename: String): Unit = macro createEnumImpl
}
This is in a separate project - everything seems to be compiling for it OK.
If I stick a call to createEnumObj.createEnum into some source and try to compile that, I get a billion lines (give or take a few) of exception output, which seems to repeat something like this:
[error] (main/compile:compile) java.lang.AssertionError: assertion failed:
[error] object foo extends scala.AnyRef {
[error] def <init>() = {
[error] super.<init>();
[error] ()
[error] };
[error] val bar: Int = 5
[error] }
[error] while compiling: /Users/bob/ICL/ironcore-id/src/main/scala/package.scala
[error] during phase: typer
[error] library version: version 2.11.6
[error] compiler version: version 2.11.6
[error] reconstructed args: -Xfuture ...
[error]
[error] last tree to typer: term foo
[error] tree position: line 8 of /Users/bob/ICL/ironcore-id/src/main/scala/package.scala
[error] symbol: <none>
[error] symbol definition: <none> (a NoSymbol)
[error] symbol package: <none>
[error] symbol owners:
[error] call site: <none> in <none>
[error]
[error] == Source file context for tree position ==
[error]
[error] 5 type DateTime = Int
[error] 6
[error] 7 createEnumObj.createEnum("foo")
[error] 8
[error] 9 }
[error] Total time: 2 s, completed Jun 18, 2015 2:46:05 PM
What I am trying to do doesn't seem too dissimilar to this question, but I'm obviously missing something. Any ideas about how to accomplish this would be gratefully accepted.
Thanks,
Bob

I want to get a map list but there is still an error about a missing value

I use scala+play2+slick2:
val subject = TableQuery[Subjects]
subject is my TableQuery. In the controller I defined a function:
import scala.slick.driver.PostgresDriver.simple._
def index = DBAction { implicit rs =>
  Ok(views.html.list_subject.render("Hello from Scala", (subject.drop(0).map(i => (i.id, i.name, i.describe,i.sub_resource)).run)
  ))
}
the error is:
[info] Loading project definition from G:\testprojects\slickplay\project
[info] Set current project to slickplay (in build file:/G:/testprojects/slickplay/)
[info] Compiling 6 Scala sources to G:\testprojects\slickplay\target\scala-2.10\classes...
[error] G:\testprojects\slickplay\app\controllers\MainController.scala:12: could not find implicit value for parameter session: scala.slick.jdbc.JdbcBackend#SessionDef
[error] Ok(views.html.list_subject.render("Hello from Scala", (subject.drop(0).map(i => (i.id, i.name, i.describe,i.sub_resource)).run)
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
[error] Total time: 9 s, completed 11/04/2014 2:23:57 PM
Is there any example with scala+play2+slick2 that uses a Form? Does anyone know how to return the
(i.id, i.name, i.describe, i.sub_resource) list?
Try importing
import play.api.db.slick._
import play.api.db.slick.Config.driver.simple._
if you are using DBAction, which I presume comes from the play-slick plugin.
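For reference, here is a sketch of how the controller could look with those imports in scope (Subjects, subject and the list_subject view are the questioner's own definitions, and the exact import paths depend on the play-slick version in use):
import play.api.db.slick._
import play.api.db.slick.Config.driver.simple._
import play.api.mvc.Controller

object MainController extends Controller {
  val subject = TableQuery[Subjects]

  def index = DBAction { implicit rs =>
    // with play.api.db.slick._ imported, the implicit Session required by .run
    // is derived from the DBAction request, so the query can execute here
    val rows = subject.map(i => (i.id, i.name, i.describe, i.sub_resource)).run
    Ok(views.html.list_subject.render("Hello from Scala", rows))
  }
}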