Chisel Programming Error - scala

I am having a problem in my Chisel code. I tried the following approach:
deqReg := Cat((0 until ports).map(ownReg === Cat(io.configVal(portBits*(_) + 2),io.configVal(portBits*(_)+ 1), io.configVal(portBits*(_)))))
but I get the following error when compiling it:
[error] /home/jayant/Dropbox/FIFO/fifo.scala:24: missing parameter type for expanded function ((x$1) => portBits.$times(x$1).$plus(2))
[error] deqReg := Cat((0 until ports).map(ownReg === Cat(io.configVal(portBits*(_) + 2),io.configVal(portBits*(_)+ 1), io.configVal(portBits*(_)))))
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed 4 Sep, 2015 12:31:40 PM
Can anyone tell me what this error means and how to correct it?

You have multiple nested functions in your map, which makes it impossible for the Scala compiler to infer the type of the argument. In other words, you cannot use the "_" placeholder here: the placeholder only stands in for the argument of the innermost function within the expression, not for the argument of map. Try a fully specified anonymous function (or a partial function) like this:
deqReg := Cat((0 until ports).map{ case i:Int => ownReg === Cat(io.configVal(portBits*i + 2), io.configVal(portBits*i + 1), io.configVal(portBits*i))})
Scala is quite a powerful language, and you'd most probably be able to find a more elegant way to write that code.
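The same expansion rule can be demonstrated in plain Scala, independent of Chisel (a minimal sketch):
val xs = List(1, 2, 3)
xs.map(_ + 1)                 // fine: the placeholder expands to map's argument -> List(2, 3, 4)
// xs.map(math.abs(_ - 2))    // fails with "missing parameter type": the placeholder
//                            // expands inside abs's argument, i.e. math.abs(x => x - 2)
xs.map(i => math.abs(i - 2))  // spelling out the anonymous function works -> List(1, 0, 1)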

Related

Scala foreach loop unexpected compilation error while printing elements of a list N times

I just started learning Scala and I am trying to solve a very simple exercise on HackerRank.
I get a number and a list as input, and I want to print each element of the list that many times.
This is the function I wrote:
def f(num: Int, arr: List[Int]): List[Int] = {
  arr.foreach((element: Int) => print((s"$element" + "\n") * num))
}
Error:
Solution.scala:3: error: type mismatch;
found : Unit
required: List[Int]
arr.foreach((element:Int) => println((s"$element" + "\n") * num))
^
one error found
If I run the same thing in my scala (v2.13) console, I get the expected output
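The type mismatch comes from foreach returning Unit while the declared result type is List[Int]; a minimal sketch of two ways to make the types line up (the name printEach is just illustrative):
def f(num: Int, arr: List[Int]): List[Int] =
  arr.flatMap(element => List.fill(num)(element))   // build and return the repeated list

def printEach(num: Int, arr: List[Int]): Unit =     // or keep the side effect and declare Unit
  arr.foreach(element => print((s"$element" + "\n") * num))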

scala "Illegal start of simple expression" in for comprehension with if

I am in the process of implementing a simple in-memory, Redis-like KeyValue store and am experiencing a compilation failure on the if statement within the for comprehension in the following piece of code:
/*
Returns the specified elements of the list stored at key. The offsets start and
stop are zero-based indexes, with 0 being the first element of the list to n. */
def lrange(k: keyT, maxIdx: Int, minIdx: Int): List[valT] = {
  val l = lookup(k)
  //assert(maxIdx >= minIdx && maxIdx <= (l length) && minIdx >= 0, "invalid min or max argument. list size ")
  for {
    (x: valT, i: Int) <- l zipWithIndex //tried without explicit typing
    if i <= maxIdx && i >= minIdx //tried indenting if
  } yield x
}
The editor (IntelliJ) shows no errors, but I receive the following build error when attempting to build and run tests.
[INFO] --- scala-maven-plugin:3.3.2:compile (default) # DS4300Project3 ---
[INFO] .../Spring2019/DS4300/scala/DS4300Project3/src/main/scala:-1: info: compiling
[INFO] Compiling 3 source files to .../Spring2019/DS4300/scala/DS4300Project3/target/classes at 1550678144065
[ERROR] .../Spring2019/DS4300/scala/DS4300Project3/src/main/scala/com/rejevichb/homework3/KeyValStore.scala:70: error: illegal start of simple expression
[ERROR] if (i <= maxIdx) && (i >= minIdx) //tried indenting if
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
Specifically:
KeyValStore.scala:70: error: illegal start of simple expression
Any guidance or insight into what is going wrong here is appreciated, as the solution is not clear to me.
This is exactly the reason why you should use postfix operators with caution.
for {
i <- "a" zipWithIndex
if true
} yield i
is parsed as
for { i <- ("a" zipWithIndex if true) } yield i
because the compiler attempts to interpret zipWithIndex as a binary infix operator, but then runs into if true, which is indeed not a simple expression.
Workarounds:
just don't use postfix ops, use a period:
for {
i <- "a".zipWithIndex
if true
} yield i
add a semicolon to force zipWithIndex to be interpreted as postfix op:
for {
i <- "a" zipWithIndex;
if true
} yield i
and then enjoy your feature warning:
warning: postfix operator zipWithIndex should be enabled
by making the implicit value scala.language.postfixOps visible.
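Applied to the lrange method from the question (a sketch reusing its own keyT, valT and lookup), the first workaround looks like this:
def lrange(k: keyT, maxIdx: Int, minIdx: Int): List[valT] = {
  val l = lookup(k)
  for {
    (x, i) <- l.zipWithIndex        // period instead of postfix notation
    if i <= maxIdx && i >= minIdx   // the if now parses as a guard
  } yield x
}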

Adding an item to scala.collection.mutable.HashMap

I tried to add an element to a Scala HashMap
val c2 = new collection.mutable.HashMap[String,Int]()
c2 += ("hh",1)
but the above gives me a compile error.
[error] /scalathing/Main.scala:5: type mismatch;
[error] found : String("hh")
[error] required: (String, Int)
[error] c2 += ("hh", 1)
[error] ^
[error] /scalathing/Main.scala:5: type mismatch;
[error] found : Int(1)
[error] required: (String, Int)
[error] c2 += ("hh", 1)
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Sep 1, 2016 1:22:52 AM
The pair I'm adding seems to be of the correct type as demanded by the HashMap. Why do I get a compile error?
The += operator is overloaded to work with variadic arguments. Therefore, when the compiler sees c2 += ("hh", 1), it interprets that as two arguments being passed in, one of which is "hh" and the other of which is 1. You can fix that either by using the -> operator, i.e. c2 += ("hh" -> 1), or by enclosing the tuple in another pair of parentheses, i.e. c2 += (("hh", 1)).
Slightly gory details below as requested in the comments.
As for how all this works, in the case of mutable collections such as HashMap, += is simply an ordinary method called with operator syntax (i.e. spaces instead of a .) or "infix notation" as the Scala community calls it, as any method in Scala can be. It is provided by the Growable trait which mutable collections mix in. You can see on the documentation for Growable both the single argument += method and the variadic method. In other words the following code would have also worked.
c2.+=(("hh", 1))
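Since that += overload is variadic, several pairs can also be added in one call, as long as each argument really is a tuple (a quick sketch; the extra keys are only for illustration):
c2 += ("hh" -> 1, "jj" -> 2)   // two pairs at once via the variadic +=
c2 += (("kk", 3))              // a single pair still needs the extra parentheses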
Not all +=s are created equal, however. += commonly shows up with vars as well. Although it can be called with method syntax (a .), it is magic syntax sugar implemented directly by the Scala compiler: any non-alphanumeric name followed by an = gets desugared, so x $NAME= y becomes x = x.$NAME(y). In this case $NAME= is variadic if and only if $NAME is variadic.
var i = 0
i += 1
i.+=(1) // Also compiles
case class SuperInt(raw: Int) {
  def &*&(x: SuperInt) = SuperInt(raw + x.raw)
}
var x = SuperInt(1)
x &*&= SuperInt(1) // This works
x.&*&=(SuperInt(1)) // As does this
x &*&= (SuperInt(1), SuperInt(1)) // Does not compile because &*& is not variadic

Something like "Power assert" in Scala similar to Groovy?

I'm wondering if there is something that can give me results similar to Groovy's nice power assert statement.
> assert ["1", '2']*.size() == [2, 3]
Result: Assertion failed:
assert ["1", '2']*.size() == [2, 3]
                 |        |
                 [1, 1]   false
AFAIK there is no support for such a thing in the language, nor in ScalaTest, which I'm currently using.
But maybe someone can suggest some side library doing that? It's a pet project, so experimental and not well-supported libs are fine.
EDIT: I know about matchers (the ScalaTest ones, or even plain-Java Hamcrest matchers). I find them verbose to write, and their output lacks detail.
The example above shows the intermediate computation steps, which makes it much easier to see what is wrong with the tested code.
I expect that introducing such behaviour requires having information about the expression's AST at runtime, but I suppose that this information can be "baked in" at compile time using macros.
I.e. if we have the expression assert(a + b == c), Scala (or some macro extension I'm looking for) could rewrite it to something like:
if (!(a + b == c)) {
// detailed message is
// compute a
// compute b
// compute a + b
// compute c
// compute a + b == c
// Make it pretty.
throw new AssertionFailedException(prettyDetailedMessage)
}
So I'm looking if it's already implemented, and if yes - where.
In ScalaTest, you can use DiagrammedAssertions.
See http://www.scalatest.org/release_notes/2.2.0#diagrammedAssertions
This is based on Expecty which is a macro-based implementation of Spock's power assertions. See https://github.com/pniederw/expecty
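A minimal sketch of what that looks like with ScalaTest 2.2+ (in later ScalaTest releases the same functionality lives in org.scalatest.diagrams.Diagrams):
import org.scalatest.DiagrammedAssertions._

// On failure the assertion is reprinted with the intermediate values drawn
// underneath each sub-expression, much like Groovy's power assert.
assert(List("1", "2").map(_.length) == List(2, 3))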
Specs2 matchers do a good job with error messages:
class Specs2Specification extends Specification {
  "specs2 assertion" should {
    "fail" in {
      List("1", "2").map(_.length) must_=== List(2, 3)
    }
  }
}
run output:
[info] Specs2Specification
[info]
[info] specs2 assertion should
[error] x fail
[error] List(1, 1) is not equal to List(2, 3)
[info]
[error] Added (2)
[error] 1
[error] 1
[info]
[error] Missing (2)
[error] 2
[error] 3
or
List("1", "2").map(_.length) must contain(exactly(2, 3)).inOrder
which produces
[error] x fail
[error] the values 2, 3 are not in order
There are lots of them and you can create custom ones.
Your Groovy code snippet literally translates to the following Scala code (given that you're already using ScalaTest):
assert((List("1", "2") map (_.length)) === List(2, 3))
It produces the following error message:
*** FAILED ***
List(1, 1) did not equal List(2, 3)

Why can't a Scala lambda with _ use && to combine two boolean expressions?

As far as I understand, _ is a shorthand in a lambda that lets you omit the a =>.
I found this code (it can be found here: scala-function-true-power):
val file = List("warn 2013 msg", "warn 2012 msg", "error 2013 msg", "warn 2013 msg")
val size = file.filter(_.contains("warn")).filter(_.contains("2013")).size
//val size1 = file.filter(_.contains("warn") && _.contains("2013")).size
val size2 = file.filter( a=> a.contains("warn") && a.contains("2013")).size
println("cat file | grep 'warn' | grep '2013' | wc : " +size )
The line computing size1 has a syntax error; it looks like the compiler can't recognize the "_" as an element of the file list.
But when I use a =>, the normal form, it works fine.
So why does Scala work this way?
Is there more of a difference between _ and a => ?
In Scala, each _ placeholder is matched against a separate argument of the function being expanded. For example, if the signature of the function you are trying to use is f : A ⇒ B and you call something like collectionOfFunctA.map(_.f), the Scala compiler infers the correct type and substitutes the actual item from the collection for the underscore before calling f on it. But if you try to write collectionOfFunctA.map(_.f + _.size), that will fail: the compiler treats the first placeholder as one parameter and the second underscore as a second parameter that matches nothing in the context, so it ends up expecting a function that takes two parameters instead of one.
More on this
As jdevelop says, but here in the words of the compiler/REPL:
scala> val size1 = file.filter(_.contains("warn") && _.contains("2013")).size
<console>:8: error: missing parameter type for expanded function ((x$1, x$2) => x$1.contains("warn").$amp$amp(x$2.contains("2013")))
val size1 = file.filter(_.contains("warn") && _.contains("2013")).size
^
<console>:8: error: missing parameter type for expanded function ((x$1: <error>, x$2) => x$1.contains("warn").$amp$amp(x$2.contains("2013")))
val size1 = file.filter(_.contains("warn") && _.contains("2013")).size
^
You see that hint: for expanded function ((x$1, x$2) => x$1.contains("warn").$amp$amp(x$2.contains("2013")))
It is expecting 2 parameters, while filter supplies just one.
You can think of the placeholder as being matched with the lambda's arguments positionally.
The first occurrence of the _ is matched with the first argument, the second occurrence with the second argument, etc.
As the other answers have shown, this means that using the placeholder twice is desugared into passing a lambda with 2 arguments to filter, which expects only one.
In your example :
val size = file.filter(_.contains("warn") && _.contains("2013")).size
would be desugared as
val size = file.filter((a,b)=>a.contains("warn") && b.contains("2013")).size
which will not compile since filter expects a predicate p: A => Boolean
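If the goal is a single pass with the argument named only once, count is an alternative worth noting (a sketch):
val size1 = file.count(a => a.contains("warn") && a.contains("2013"))   // same result as filter(...).size, in one pass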
Now, one reason the placeholder is matched positionally is to avoid ambiguity in lambdas with more than one argument.
How could the compiler guess the correct implementation in the following case if the placeholder could be reused multiple times for the same argument:
file.fold("")(_++_)
Should it be desugared as :
file.fold("")((a,b)=> a++b )
or as
file.fold("")((a,b)=> a++a )
or as
file.fold("")((a,b)=> b++b )
and worse, what would you expect for
file.fold("")(_++_++_)
There is no general way for the compiler to infer the correct implementation.
One might argue for relaxing the constraint when the expected lambda only accepts one argument. I suggest doing more detailed research before taking the first steps toward the Scala improvement process, as it seems likely that this particular design decision has been challenged and explained before.
If you are worried about the performance of iterating over the list twice (which is the case when you write)
file.filter(_.contains("warn")).filter(_.contains("2013")).size
In theory it should be possible for the compiler to detect that both filters can be applied within the same iteration.
In Scala, collections are eager by default, but you can get lazy evaluation by using views.
The current implementation has known issues which are being worked on, and other collection implementations in Scala are actively being developed to combine transformations and computations by default (see psp-std for example).
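For example (a sketch), a view defers the intermediate collections, so both filters are applied in a single traversal when size forces the result:
val size = file.view
  .filter(_.contains("warn"))
  .filter(_.contains("2013"))
  .size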