How to select just one first or last record compliant to a where clause with ScalaQuery? - scala

Having the following query template to select all:
val q = for {
a <- Parameters[Int]
b <- Parameters[Int]
t <- T if t.a == a && t.b == b
_ <- Query.orderBy(t.c, t.d)
} yield t
I need to modify it to select only the very first record (minimum c, and minimum d for that c) or the very last record (maximum c, and maximum d for that c) of those matching the where condition. I'd strongly prefer that no records other than the first/last one be selected, as there are hundreds of thousands of them...

There's a potential danger here in how the OP's query is currently constructed. Run as is, fetching the first or last result of a 100K-row result set is not terribly efficient (unlikely to come up often, yes, but the point is that the query places no limit on the number of rows returned)
With straight SQL you would never do such a thing; instead you would tack on a LIMIT 1.
In ScalaQuery, LIMIT corresponds to take(n), so add take(1) to get a single record returned from the query itself:
val q = (for {
a <- Parameters[Int]
b <- Parameters[Int]
t <- T if t.a == a && t.b == b
_ <- Query.orderBy(t.c, t.d)
} yield t) take(1)
q.firstOption

There is a method firstOption defined on the Invoker trait, and by some magic it is available on the Query class. So maybe you can try it like this:
val q = for {
a <- Parameters[Int]
b <- Parameters[Int]
t <- T if t.a == a && t.b == b
_ <- Query.orderBy(t.c, t.d)
} yield t
q.firstOption
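The nice property of firstOption is that it returns an Option, so a query with no matching rows yields None rather than failing. The behaviour mirrors headOption on ordinary Scala collections (a plain-Scala analogy, not ScalaQuery itself):

```scala
// headOption: Some(first element) when non-empty, None when empty --
// no exception either way, analogous to firstOption on a query.
val rows = List("r1", "r2", "r3")
val empty = List.empty[String]

assert(rows.headOption == Some("r1"))
assert(empty.headOption == None)
```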

Related

How to access previous element when using yield in for loop chisel3

This is a mixed Chisel / Scala question.
Background: I need to sum up a lot of numbers (the number of input signals is configurable). Due to timing constraints I had to split the addition into groups of 4 and pipe (register) each group's result, which is then fed into the next stage (which will be 4 times smaller, until I reach one)
this is my code:
// log4 Aux function //
def log4(n : Int): Int = math.ceil(math.log10(n.toDouble) / math.log10(4.0)).toInt
// stage //
def Adder4PipeStage(len: Int,in: Vec[SInt]) : Vec[SInt] = {
require(in.length % 4 == 0) // will not work if the length is not a multiple of 4
val pipe = RegInit(VecInit(Seq.fill(len/4)(0.S(in(0).getWidth.W))))
pipe.zipWithIndex.foreach {case(p,j) => p := in.slice(j*4,(j+1)*4).reduce(_ +& _)}
pipe
}
// the pipeline
val adderPiped = for(j <- 1 to log4(len)) yield Adder4PipeStage(len/j,if(j==1) io.in else <what here ?>)
How do I access the previous stage? I am also open to hearing about other ways to implement the above.
There are several things you could do here:
You could just use a var for the "previous" value:
var prev: Vec[SInt] = io.in
val adderPiped = for(j <- 1 to log4(len)) yield {
prev = Adder4PipeStage(len/j, prev)
prev
}
It is a little weird using a var with a for yield (since the former is fundamentally mutable while the latter tends to be used with immutable-style code).
You could alternatively use a fold building up a List
// Build up backwards and reverse (typical in functional programming)
val adderPiped = (1 to log4(len)).foldLeft(io.in :: Nil) {
case (pipes, j) => Adder4PipeStage(len/j, pipes.head) :: pipes
}.reverse
.tail // Tail drops "io.in" which was 1st element in the result List
If you don't like the backwards construction of the previous fold, you could use a fold with a Vector (better for appending than a List):
val adderPiped = (1 to log4(len)).foldLeft(Vector(io.in)) {
case (pipes, j) => pipes :+ Adder4PipeStage(len/j, pipes.last)
}.tail // Tail drops "io.in" which was 1st element in the result Vector
Finally, if you don't like these immutable ways of doing it, you could always just embrace mutability and write something similar to what one would in Java or Python:
For loop and mutable collection
val pipes = new mutable.ArrayBuffer[Vec[SInt]] // requires import scala.collection.mutable
for (j <- 1 to log4(len)) {
pipes += Adder4PipeStage(len/j, if (j == 1) io.in else pipes.last)
}
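Whichever variant you pick, each stage is functionally just a grouped sum. Ignoring the registers and hardware types, the whole reduction can be sketched in plain Scala (the names stage and reduce4 are made up for this illustration):

```scala
// Plain-Scala sketch of the pipelined reduction, registers omitted.
// Each stage sums groups of up to 4; repeat until one value remains.
def stage(xs: Seq[Int]): Seq[Int] = xs.grouped(4).map(_.sum).toSeq

def reduce4(xs: Seq[Int]): Int =
  if (xs.length <= 1) xs.headOption.getOrElse(0)
  else reduce4(stage(xs))

// 16 inputs -> 4 partial sums -> 1 result: two "stages" deep
assert(reduce4(Seq.fill(16)(1)) == 16)
assert(reduce4(1 to 8) == 36)
```

Unlike the Chisel version, grouped also tolerates lengths that are not multiples of 4 (the last group is simply shorter).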

Reg of Vec of Valid(Vec

I am continually getting an error due to the use of different-sized Vecs. Is there a way I could implement this so that b, c, and d would each take just their respective number of Vec elements from a?
val a = Reg(Vec(10, Valid(Vec(32, UInt(8.W)))))
val b = Reg(Vec(10, Valid(Vec(16, UInt(8.W)))))
val c = Reg(Vec(8, Valid(Vec(24, UInt(8.W)))))
val d = Reg(Vec(7, Valid(Vec(32, UInt(8.W)))))
for (i <- 0 until 10) {
b(i).bits := a(i).bits
}
for (i <- 0 until 8) {
c(i).bits := a(i).bits
}
for (i <- 0 until 7) {
d(i).bits := a(i).bits
}
Here is the error message I am receiving.
Connection between sink (UInt<8>[16](Reg in file)) and source (UInt<8>[32](Reg in file)) failed #: Sink and Source are different length Vecs.
You can use the slice method to take a portion of any Scala Seq (Chisel's Vec extends Seq):
val a = Reg(Vec(10, Valid(Vec(32, UInt(8.W)))))
val b = Reg(Vec(10, Valid(Vec(16, UInt(8.W)))))
for (i <- 0 until 10) {
b(i).bits := a(i).bits.slice(0, 16)
}
All of this is very index-heavy, though, relying on knowing the sizes of your Vecs. Instead we can use the Seq method zip, which lets you iterate over corresponding elements of two Seqs. If one is longer than the other, zip just truncates to the shorter one. The below is functionally identical to the for loop with the slice:
for ((bb, aa) <- b zip a) { // I'm using infix notation: b zip a == b.zip(a)
for ((x, y) <- bb.bits zip aa.bits) {
x := y
}
}
Note the lack of any indices: this just works no matter the sizes, so we can use it in a reusable function:
def nestedConnect(x: Vec[Valid[Vec[UInt]]], y: Vec[Valid[Vec[UInt]]]): Unit = {
for ((xx, yy) <- x zip y) {
for ((l, r) <- xx.bits zip yy.bits) {
l := r
}
}
}
We can then do all of your connects by using this function:
nestedConnect(b, a)
nestedConnect(c, a)
nestedConnect(d, a)
Executable example showing that this works: https://scastie.scala-lang.org/SFh1PratTCWhxHc55VGeHg
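The truncation behaviour of zip mentioned above is easy to check in plain Scala: zipping always stops at the shorter of the two sequences:

```scala
val long  = Seq(10, 20, 30, 40)
val short = Seq("a", "b")

// zip pairs corresponding elements and silently drops the overhang
assert((long zip short) == Seq((10, "a"), (20, "b")))
assert((short zip long) == Seq(("a", 10), ("b", 20)))
```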

How to group items from an unsorted sequence with custom logic?

Having trouble with this... I'm not even sure where to start.
I have an unsorted list of objects:
myList = (A, Z, T, J, D, L, W...)
These objects have different types, but all share the same parent type.
Some of the objects "match" each other through custom business logic:
A.matches(B) = True
A.matches(C) = False
(Edit: Matching is commutative. X.matches(Y) = Y.matches(X))
I'm looking for a way in Scala to group those objects that match, so I end up with something like:
myMatches = [ [A,B,C], [D,Z,X], [H], ...]
Here's the catch -- matching is not transitive.
A.matches(B) = True
B.matches(C) = True
A.matches(C) = False <---- A and C can only be associated through their matches to B
I still want [A,B,C] to be grouped even though A and C don't directly match.
Is there an easy way to group together all the things that match each other? Is there a name for this kind of problem so I can Google more about it?
Under the assumptions that
matching is commutative, that is if A matches B, then B matches A
if A matches B, B matches C and C matches D, all of them should be in the same group.
you just need to do a search (DFS or BFS) through the graph of matches, starting with every element that is not yet in a group. The elements you find in one search form exactly one group.
Here is some example code:
import scala.collection.mutable
case class Element(name: Char) {
def matches(other: Element): Boolean = {
val a = name - 'A'
val b = other.name - 'A'
a * 2 == b || b * 2 == a
}
override def toString: String = s"$name (${name - 'A'})"
}
def matchingGroups(elements: Seq[Element]): Seq[Seq[Element]] = {
val notInGroup: mutable.Set[Element] = elements.to[mutable.Set]
val groups: mutable.ArrayBuilder[Seq[Element]] = mutable.ArrayBuilder.make()
val currentGroup: mutable.ArrayBuilder[Element] = mutable.ArrayBuilder.make()
def fillCurrentGroup(element: Element): Unit = {
notInGroup -= element
currentGroup += element
for (candidate <- notInGroup) {
if (element matches candidate) {
fillCurrentGroup(candidate)
}
}
}
while (notInGroup.nonEmpty) {
currentGroup.clear()
fillCurrentGroup(notInGroup.head)
groups += currentGroup.result()
}
groups.result()
}
matchingGroups('A' to 'Z' map Element) foreach println
This finds the following groups:
WrappedArray(M (12), G (6), D (3), Y (24))
WrappedArray(R (17))
WrappedArray(K (10), F (5), U (20))
WrappedArray(X (23))
WrappedArray(V (21))
WrappedArray(B (1), C (2), E (4), I (8), Q (16))
WrappedArray(H (7), O (14))
WrappedArray(L (11), W (22))
WrappedArray(N (13))
WrappedArray(J (9), S (18))
WrappedArray(A (0))
WrappedArray(Z (25))
WrappedArray(P (15))
WrappedArray(T (19))
If the matches relationship is non-commutative, this problem is a bit more complex. In that case, during a search you may run into several different groups discovered during previous searches, and you'll have to merge those groups into one. It may be useful to represent the groups with the disjoint-set data structure for faster merging.
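A minimal sketch of the disjoint-set (union-find) structure mentioned here, assuming hashable elements and skipping union-by-rank for brevity (the class name DisjointSet is made up for this example):

```scala
import scala.collection.mutable

// Minimal disjoint-set (union-find) sketch: `find` follows parent links
// (with path compression), `union` redirects one root to the other.
class DisjointSet[T] {
  private val parent = mutable.Map.empty[T, T]
  def find(x: T): T = {
    val p = parent.getOrElseUpdate(x, x)
    if (p == x) x
    else {
      val root = find(p)
      parent(x) = root // path compression
      root
    }
  }
  def union(a: T, b: T): Unit = parent(find(a)) = find(b)
}

val ds = new DisjointSet[Char]
ds.union('A', 'B'); ds.union('B', 'C'); ds.union('D', 'E')
assert(ds.find('A') == ds.find('C')) // A, B, C merged into one group
assert(ds.find('A') != ds.find('D')) // D, E stay separate
```

Two elements are in the same group exactly when find returns the same root, which makes merging groups discovered in different searches cheap.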
Here is a functional solution based on Scala Sets. It assumes that the unsorted list of objects does not contain duplicates, and that they all inherit from some type MatchT that contains an appropriate matches method.
This solution first groups all objects into sets that contain directly matching objects. It then checks each set in turn and combines it with any other sets that have any elements in common (non-empty intersection).
def groupByMatch[T <: MatchT](elems: Set[T]): Set[Set[T]] = {
@tailrec // requires import scala.annotation.tailrec
def loop(sets: Set[Set[T]], res: Set[Set[T]]): Set[Set[T]] =
sets.headOption match {
case None =>
res
case Some(h) =>
val (matches, noMatches) = res.partition(_.intersect(h).nonEmpty)
val newMatches = h ++ matches.flatten
loop(sets.tail, noMatches + newMatches)
}
val matchSets = elems.map(x => elems.filter(_.matches(x)) + x)
loop(matchSets, Set.empty[Set[T]])
}
There are a number of inefficiencies here, so if performance is an issue then a non-functional version based on mutable Maps is likely to be faster.

Scala list of tuples of different size zip issues?

Hi, my two lists are as follows:
val a = List((1430299869,"A",4200), (1430299869,"A",0))
val b = List((1430302366,"B",4100), (1430302366,"B",4200), (1430302366,"B",5000), (1430302366,"B",27017), (1430302366,"B",80), (1430302366,"B",9300), (1430302366,"B",9200), (1430302366,"A",5000), (1430302366,"A",4200), (1430302366,"A",80), (1430302366,"A",443), (1430302366,"C",4100), (1430302366,"C",4200), (1430302366,"C",27017), (1430302366,"C",5000), (1430302366,"C",80))
When I zip the two lists as below:
val c = a zip b
it returns results as
List(((1430299869,A,4200),(1430302366,B,4100)), ((1430299869,A,0),(1430302366,B,4200)))
That is not all of the tuples; how can I combine all of the above data?
EDIT
expected results as combine of two lists like :
List((1430299869,"A",4200), (1430299869,"A",0),(1430302366,"B",4100), (1430302366,"B",4200), (1430302366,"B",5000), (1430302366,"B",27017), (1430302366,"B",80), (1430302366,"B",9300), (1430302366,"B",9200), (1430302366,"A",5000), (1430302366,"A",4200), (1430302366,"A",80), (1430302366,"A",443), (1430302366,"C",4100), (1430302366,"C",4200), (1430302366,"C",27017), (1430302366,"C",5000), (1430302366,"C",80))
Second Edit
I tried this :
val d = for(((a,b,c),(d,e,f)) <- (a zip b)if(b.equals(e) && c.equals(f))) yield (d,e,f)
but it gives empty results because of (a zip b). When I replaced a zip b with a ++ b, it showed the following error:
constructor cannot be instantiated to expected type;
So how can I get the matching tuples?
Just add one list to another:
a ++ b
According to your 2nd edit, what you need is:
for {
(a1,b1,c) <- a //rename extracted to a1 and b1 to avoid confusion
(d,e,f) <- b
if b1.equals(e) && c.equals(f)
} yield (d,e,f)
Or:
for {
(a1, b1, c) <- a
(d, `b1`, `c`) <- b //enclosing it in backticks avoids capture and matches against already defined values
} yield (d, b1, c)
Zipping won't help since you need to compare all tuples in a with all tuples in b, it seems.
a zip b creates a list of pairs of elements from a and b.
What you're most likely looking for is list concatenation, which is a ++ b
On zipping (pairing) all data in the lists, consider first a briefer input to illustrate the case:
val a = (1 to 2).toList
val b = (10 to 12).toList
Then for instance a for comprehension may convey the needs,
for (i <- a; j <- b) yield (i,j)
which delivers
List((1,10), (1,11), (1,12),
(2,10), (2,11), (2,12))
Update
Following the OP's latest update, consider a dedicated filtering function:
type triplet = (Int,String,Int)
def filtering(key: triplet, xs: List[triplet]) =
xs.filter( v => key._2 == v._2 && key._3 == v._3 )
and so apply it with flatMap,
a.flatMap(filtering(_, b))
List((1430302366,A,4200))
One additional step is to encapsulate this in an implicit class,
implicit class OpsFilter(val keys: List[triplet]) extends AnyVal {
def filtering(xs: List[triplet]) = {
keys.flatMap ( key => xs.filter( v => key._2 == v._2 && key._3 == v._3 ))
}
}
and likewise,
a.filtering(b)
List((1430302366,A,4200))

Haskell GHCi prints lazy sequence but Scala REPL doesn't

I would like to print out a stream of numbers, but the following code only prints out the first number in the sequence:
for ( n <- Stream.from(2) if n % 2 == 0 ) yield println(n)
2
res4: scala.collection.immutable.Stream[Unit] = Stream((), ?)
In Haskell the following keeps printing out numbers until interrupted and I would like similar behaviour in Scala:
naturals = [1..]
[n | n <- naturals, even n]
[2,4,6,8,10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44,46,48,50,52,54,56,58,
Instead of yielding just println(n) (why would one want an infinite sequence of Units?):
for ( n <- Stream.from(2) if n % 2 == 0 ) println(n)
If you really want that infinite sequence of Units, force the result:
val infUnit = for ( n <- Stream.from(2) if n % 2 == 0 ) yield println(n)
infUnit.force // or convert to any other non-lazy collection
Though, eventually, it will crash the program (due to the large length of the materialized sequence).
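If only a finite prefix of the sequence is wanted, take(n) bounds how much of the stream is ever materialized, so forcing it is safe:

```scala
// take(n) on a Stream is itself lazy; converting to a List
// evaluates exactly the first n matching elements.
val evens = Stream.from(2).filter(_ % 2 == 0)

assert(evens.take(5).toList == List(2, 4, 6, 8, 10))
```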
The result type of a for comprehension is a collection of the same type as the collection in the first clause. See the flatMap function signature.
So the result of a
for ( n <- Stream.from(2) .....
is a collection of type Stream[_], which is lazy, so you have to pull the element values or actions out.
Look at the result types:
scala> :type for( n <- Stream.from(2)) yield n
scala.collection.immutable.Stream[Int]
scala> :type for( n <- List(1,2,3)) yield n
List[Int]
scala> :type for( n <- Set(1,2,3)) yield n
scala.collection.immutable.Set[Int]
To print out numbers until interrupted try this:
Stream.from(2).filter(_ % 2 == 0) foreach println
Its type guarantees that it will work:
scala> :type Stream.from(2).filter(_ % 2 == 0) foreach println
Unit
I think you meant:
for (n <- Stream.from(2) if n % 2 == 0) yield n
(because yield println(n) will always yield () with a side effect of printing n)
This gives you the collection you want. However, Scala, unlike Haskell, doesn't evaluate all members of a lazy list (a Stream) when you print it. You can convert it into a non-lazy list using .toList, but you won't see the same infinite printing behaviour as in Haskell: Scala will try to build the entire (infinite) list before printing anything at all.
Basically there is no way to get the exact same combination of semantics and behaviour in Scala compared to Haskell when printing infinite lists using the built-in toString infrastructure.
P.S.
for (n <- Stream.from(2) if n % 2 == 0) yield n
is expressed more succinctly as
Stream.from(2).filter(_ % 2 == 0)