I am trying to generate Verilog code from Chisel.
My code is:
class RegFifo[T <: Data](gen: T, depth: Int) extends Fifo(gen, depth) {
  def counter(depth: Int, incr: Bool): (UInt, UInt) = {
    val cntReg = RegInit(0.U(log2Ceil(depth).W))
    val nextVal = Mux(cntReg === (depth - 1).U, 0.U, cntReg + 1.U)
    when (incr) {
      cntReg := nextVal
    }
    (cntReg, nextVal)
  }
  // ...
}
object RegFifoDriver extends App {
  (new chisel3.stage.ChiselStage).emitVerilog(new RegFifo(args, args), args)//, depth)//, args)
}
The error message I get is:
[error] found : Array[String]
[error] required: Int
[error] (new chisel3.stage.ChiselStage).emitVerilog(new RegFifo(args, args), args)//, depth)//, args)
[error] ^
[error] one error found
I know the error comes from 'emitVerilog(new RegFifo(args, args))', but how can I fix it? Thanks.
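For what it's worth, here is a sketch of the usual shape of the fix (not the only one): the first argument to RegFifo must be a concrete Chisel data type and the second an Int, while args (the Array[String] of command-line options) should only be forwarded to ChiselStage. The 16-bit element width and depth of 4 below are purely illustrative assumptions; you could also parse them out of args.
object RegFifoDriver extends App {
  // Pass a concrete element type and depth; args only carries the
  // command-line options through to ChiselStage.
  (new chisel3.stage.ChiselStage).emitVerilog(new RegFifo(UInt(16.W), 4), args)
}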
I have an issue with Scala.
I get this error when I run scalastyle:
illegal start of simple expression
This question is linked to other questions about the scalastyle "illegal start of simple expression" error, but I tried all the proposed solutions and nothing has fixed it.
Here is the relevant piece of my code; the error is reported at the if-statement in the nexter function.
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema
import org.apache.spark.sql.types.{DoubleType, StructField, StructType}

case class EXT(ev: Seq[String])

val schema = StructType(Seq(
  StructField("label", DoubleType)
))

def hasVis(ev: Seq[String]): Boolean = ev.toSet.exists(videoV.contains)
def hasCom(ev: Seq[String]): Boolean = ev.toSet.exists(videoC.contains)

def nexter: EXT => Seq[Row] = (extension: EXT) => Seq(new GenericRowWithSchema(
  Array(if (hasVis(extension.ev) && hasCom(extension.ev)) 1.0 else 0.0), schema))
Thank you for your help
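In case it helps anyone hitting the same scalastyle parse error, one commonly suggested workaround is sketched below. It assumes the checker is tripping over the inline if-expression inside the Array(...) argument, which is only a guess; the logic is unchanged, the conditional is just hoisted into a named value first.
def nexter: EXT => Seq[Row] = (extension: EXT) => {
  // Bind the if-expression to a val so the argument list only
  // contains a simple expression.
  val label = if (hasVis(extension.ev) && hasCom(extension.ev)) 1.0 else 0.0
  Seq(new GenericRowWithSchema(Array(label), schema))
}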
I am using Play Framework 2.6 and I am getting this error:
/myproject/app/controllers/Application.scala:151: not enough arguments for method apply: (data: akka.stream.scaladsl.Source[akka.util.ByteString, _], contentLength: Option[Long], contentType: Option[String])play.api.http.HttpEntity.Streamed in object Streamed.
[error] Unspecified value parameters contentLength, contentType.
[error] body = HttpEntity.Streamed(responseStream)
[error] ^
Here is my code:
def prometheusMetrics = Action {
  val responseStream = Concurrent.unicast[Array[Byte]] { channel =>
    val writer = new WriterAdapter(channel)
    TextFormat.write004(writer, CollectorRegistry.defaultRegistry.metricFamilySamples())
    writer.close()
  }
  Result(
    header = ResponseHeader(200, Map.empty),
    body = HttpEntity.Streamed(responseStream)
  ).as(TextFormat.CONTENT_TYPE_004)
}
I researched this but did not find a suitable solution. Please guide me.
Update #1
After applying the answer given by user James Whiteley below, i.e. changing the code to
Result(
  header = ResponseHeader(200, Map.empty),
  body = HttpEntity.Streamed(responseStream, None, None)
).as(TextFormat.CONTENT_TYPE_004)
I am getting:
type mismatch;
[error] found : play.api.libs.iteratee.Enumerator[Array[Byte]]{implicit val pec: scala.concurrent.ExecutionContext}
[error] required: akka.stream.scaladsl.Source[akka.util.ByteString, _]
[error] body = HttpEntity.Streamed(responseStream, None, None)
HttpEntity.Streamed takes three parameters, not one. Try
body = HttpEntity.Streamed(responseStream, None, None)
if you don't want to specify contentLength and contentType. These parameters are Options, but they still have to be passed explicitly because they have no default values.
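Regarding the follow-up type mismatch: Concurrent.unicast produces an iteratee Enumerator, while HttpEntity.Streamed wants an akka.stream.scaladsl.Source[ByteString, _]. Below is a minimal sketch of one way to build the response without the iteratee API at all, by rendering the metrics into an in-memory StringWriter and wrapping the result in a single-element Source; the surrounding controller wiring is assumed, not taken from the question.
import java.io.StringWriter

import akka.stream.scaladsl.Source
import akka.util.ByteString
import io.prometheus.client.CollectorRegistry
import io.prometheus.client.exporter.common.TextFormat
import play.api.http.HttpEntity
import play.api.mvc._

def prometheusMetrics = Action {
  // Render the default registry into a plain String buffer.
  val writer = new StringWriter()
  TextFormat.write004(writer, CollectorRegistry.defaultRegistry.metricFamilySamples())
  val bytes = ByteString(writer.toString)

  Result(
    header = ResponseHeader(200, Map.empty),
    body = HttpEntity.Streamed(
      Source.single(bytes),              // Source[ByteString, _], as Streamed expects
      Some(bytes.length.toLong),         // contentLength
      Some(TextFormat.CONTENT_TYPE_004)  // contentType
    )
  )
}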
I am puzzled about why the following code, using Scala fastparse 0.4.3, fails to typecheck.
val White = WhitespaceApi.Wrapper {
  import fastparse.all._
  NoTrace(CharIn(" \t\n").rep)
}
import fastparse.noApi._
import White._

case class Term(tokens: Seq[String])
case class Terms(terms: Seq[Term])

val token = P[String](CharIn('a' to 'z', 'A' to 'Z', '0' to '9').rep(min = 1).!)
val term: P[Term] = P("[" ~ token.!.rep(sep = " ", min = 1) ~ "]").map(x => Term(x))
val terms = P("(" ~ term.!.rep(sep = " ", min = 1) ~ ")").map(x => Terms(x))
val parse = terms.parse("([ab bd ef] [xy wa dd] [jk mn op])")
The error messages:
[error] .../MyParser.scala: type mismatch;
[error] found : Seq[String]
[error] required: Seq[Term]
[error] val terms = P("(" ~ term.!.rep(sep=" ", min=1) ~")").map{x => Terms(x)}
[error] ^
I would imagine that since term is of type P[Term], and since the terms pattern uses term.!.rep(..., it should get a Seq[Term].
I figured it out. My mistake was capturing (with !) redundantly in terms. That line should instead be written:
val terms = P("(" ~ term.rep(sep = " ", min = 1) ~ ")").map(x => Terms(x))
Notice that term.!.rep( has been rewritten to term.rep(.
Apparently capturing (with !) in any rule returns the raw text matched by the captured subrule, overriding whatever value that subrule would otherwise produce. I guess this is a feature when used correctly. :)
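Here is a tiny standalone illustration of that behaviour (fastparse 0.4.x, using fastparse.all rather than the whitespace-aware API above; the digits parser is just an assumed example):
import fastparse.all._

// A parser that maps its captured text to an Int.
val digits: P[Int] = P(CharIn('0' to '9').rep(min = 1).!).map(_.toInt)

digits.parse("42")   // Parsed.Success(42, 2)   -- the mapped Int comes through
digits.!.parse("42") // Parsed.Success("42", 2) -- the outer capture overrides it with the raw text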
I am adding two additional fields to the Person table: a Date and a String. I built the Person table and mapped it with Play Slick by following olivebh's tutorial.
However, I get the following errors from the Slick data model trait Tables:
dao/Tables.scala:85: ambiguous implicit values:
[error] both value e3 of type slick.jdbc.GetResult[String]
[error] and value e1 of type slick.jdbc.GetResult[String]
[error] match expected type slick.jdbc.GetResult[String]
[error] ProjectRow.tupled((<<[Int], <<[String], <<[Date], <<[String]))
which refers to the following line:
implicit def GetResultPersonRow(implicit e0: GR[Int], e1: GR[String], e2: GR[Date], e3: GR[String]): GR[ProjectRow] = GR {
  prs =>
    import prs._
    PersonRow.tupled((<<[Int], <<[String], <<[Date], <<[String]))
}
where the "int, string, date, string" represent the "id, name, birthdate, language" fields respectively. Everything worked fine by following the tutorial that covers "id, name" as an example. But as soon as I added birthdate and language, I got the error quoted above.
Also, when creating the prototypes for the table rows:
class Person(_tableTag: Tag) extends Table[PersonRow](_tableTag, "person") {
  def * = (personId, name, birthdate, language) <> (PersonRow.tupled, PersonRow.unapply)
  def ? = (Rep.Some(personId), Rep.Some(name), Rep.Some(birthdate), Rep.Some(language)).shaped.<>({ r => import r._; _1.map(_ => ProjectRow.tupled((_1.get, _2.get, _3.get, _4.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))

  val personId: Rep[Int] = column[Int]("person_id", O.AutoInc, O.PrimaryKey)
  val name: Rep[String] = column[String]("name", O.Length(50, varying = true))
  val birthdate: Rep[Date] = column[Date]("birthdate", O.Length(50, varying = true))
  val language: Rep[String] = column[String]("language", O.Length(50, varying = true))
I get the following errors:
No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported type in a Query (e.g. scala List).
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: (slick.lifted.Rep[Int], slick.lifted.Rep[String], slick.lifted.Rep[java.util.Date], slick.lifted.Rep[String])
[error] Unpacked type: (Int, String, java.util.Date, String)
[error] Packed type: Any
[error] def * = (personId, name, birthdate, language) <>(PersonRow.tupled, PersonRow.unapply)
and also:
dao/Tables.scala:94: could not find implicit value for parameter od: slick.lifted.OptionLift[Tables.this.driver.api.Rep[java.util.Date],O]
[error] def ? = (Rep.Some(personId), Rep.Some(name), Rep.Some(birthdate), Rep.Some(language)).shaped.<>({ r => import r._; _1.map(_ => PersonRow.tupled((_1.get, _2.get, _3.get, _4.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))
dao/Tables.scala:94: not found: value _1
[error] def ? = (Rep.Some(personId), Rep.Some(name), Rep.Some(birthdate), Rep.Some(language)).shaped.<>({ r => import r._; _1.map(_ => PersonRow.tupled((_1.get, _2.get, _3.get, _4.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))
Any help in understanding these errors, and in changing the Slick data model trait so that it properly handles the additional Date and String fields, would be greatly appreciated. Thank you so much!
Slick can't handle java.util.Date because, through the JDBC drivers, databases only understand java.sql.Date (and the other java.sql date/time types). You can write your own mapper so that Slick knows how to convert between java.util.Date and java.sql.Date. But Java 8 has a better API for handling dates, times, and calendars (java.time), so a mapping like the following is usually preferable:
import java.sql.Timestamp
import java.time.LocalDateTime

implicit val localDateTimeColumnType = MappedColumnType.base[LocalDateTime, Timestamp](
  ldt => Timestamp.valueOf(ldt), // write: LocalDateTime -> java.sql.Timestamp
  t => t.toLocalDateTime         // read:  java.sql.Timestamp -> LocalDateTime
)
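With that mapping in scope (it relies on the profile API being imported, e.g. import profile.api._ in the generated Tables trait), the column can then be declared against the java.time type directly. A minimal sketch, reusing the names from the question rather than verified code:
case class PersonRow(personId: Int, name: String, birthdate: LocalDateTime, language: String)

class Person(_tableTag: Tag) extends Table[PersonRow](_tableTag, "person") {
  val personId: Rep[Int] = column[Int]("person_id", O.AutoInc, O.PrimaryKey)
  val name: Rep[String] = column[String]("name", O.Length(50, varying = true))
  val birthdate: Rep[LocalDateTime] = column[LocalDateTime]("birthdate") // uses localDateTimeColumnType
  val language: Rep[String] = column[String]("language", O.Length(50, varying = true))

  def * = (personId, name, birthdate, language) <> (PersonRow.tupled, PersonRow.unapply)
}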
See this question also. Btw, thanks for reading my article, hope it was useful! :)