Composing Nested Observables with Dependencies in RxJava - reactive-programming

I'm writing a small application to familiarize myself with the FRP paradigm and RxJava.
I have two methods, getUsers() and getNextTask(userId). They return Observable<User> and Observable<Task> respectively, and getNextTask depends on the User items emitted by the getUsers Observable.
I have written the following code which combines the results of the two calls into one object, UserSummary, using flatMap():
getUsers()
    .flatMap(u -> (Observable<UserSummary>) getNextTask(u.getId()).map(t -> new UserSummary(u, t)))
    .subscribe(System.out::println);
This code works and emits the values that I expect; however, I expected values of type UserSummary and instead received values of type Object, which I then had to cast to UserSummary.
My question is: is flatMap() a good place to call getNextTask(), or is there perhaps a more effective way of calling methods with dependencies?
Edit
With regard to flatMap returning Observable<Object> instead of Observable<UserSummary>, as I would have expected of a generic method, the following quick snippet illustrates the problem. The flatMap operator is expected to return Observable<String>; however, it returns Observable<Object>. Therefore, the returned values have to be cast to their respective types.
Integer[] numberArray = {1, 2, 3, 4, 5};
Observable.from(numberArray)
    .flatMap(i -> {
        String[] letterArray = {"a", "b", "c", "d", "e"};
        return Observable.from(letterArray)
            .map(x -> x + i);
    }).subscribe(System.out::println);

Related

New to Scala: can anyone explain this code for me?

def indexOf[T](seq: Seq[T], value: T, from: Int): Int = {
  for (i <- from until seq.length) {
    if (seq(i) == value) return i
  }
  -1
}
Can anyone explain the meaning of indexOf[T] to me? And what does (seq: Seq[T], value: T) do?
def indexOf - This is a method. We'll call it indexOf.
[T] - This method will make reference to an unspecified type. We'll call it T.
(seq:Seq[T], value:T, from:Int) - This method will take 3 passed parameters:
variable seq which is a Seq of elements of type T
variable value which is a single value of type T
variable from which is a single value of type Int
:Int - This method returns a value of type Int.
= { - Method code begins here.
This is related to Scala generics.
https://docs.scala-lang.org/tour/generic-classes.html
In simple terms, T acts as a placeholder for any data type.
The indexOf function takes a generic type T, which at runtime can be an Integer, a String, or a custom Employee object.
For example, you can pass a Seq of Employee or a Seq of String, together with a value of the same element type.
By using generics, you don't have to create a separate indexOf function for every data type.
How to call indexOf? As below:
val index = indexOf[String](stringSeq, "searchThis", 0)
or
val index = indexOf[Employee](employeeSeq, empObj, 0)
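For completeness, here is a minimal runnable sketch; the Employee case class and the values stringSeq, employeeSeq and empObj are hypothetical stand-ins, not from the original question:

// Hypothetical Employee type and sample data, just to make the calls above runnable.
case class Employee(name: String)

def indexOf[T](seq: Seq[T], value: T, from: Int): Int = {
  for (i <- from until seq.length) {
    if (seq(i) == value) return i
  }
  -1
}

val stringSeq   = Seq("skip", "searchThis", "other")
val employeeSeq = Seq(Employee("alice"), Employee("bob"))
val empObj      = Employee("bob")

println(indexOf[String](stringSeq, "searchThis", 0)) // 1
println(indexOf[Employee](employeeSeq, empObj, 0))   // 1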
This method is what we call a parametric method in Scala.
Parametric methods in Scala can be parameterized by type as well as value. The syntax is similar to that of generic classes. Type parameters are enclosed in square brackets, while value parameters are enclosed in parentheses.
Since T is a generic type, the indexOf method can be called with a variety of types.
Your method indexOf[T] takes a type parameter T and value parameters seq, value and from.
When calling your method, you can either explicitly set the type you will be manipulating by replacing T with your concrete type (see Example 1), or let the compiler infer it for you (type inference) from the types of the seq and value parameters (see Example 2).
Example 1
val index = indexOf[Int](Seq(3, 5, 4), 4, 0)
Example 2
val index = indexOf(Seq("alice", "bob", "yo"), "bob", 1)

Purpose of toIterable

In scala exercies I have found the following example:
val set = Set(4, 6, 7, 8, 9, 13, 14)
val result = set.toIterable
with the following description:
toIterable will convert any Traversable to an Iterable. This is a base trait for all Scala collections that define an iterator method to iterate through the collection's elements
But Set is already an Iterable, so what's the point of this method? If this isn't a valid use case, could you point me to one?
In Scala 2.13 there is no more Traversable:
Simpler type hierarchy
No more Traversable and TraversableOnce. They remain only as deprecated aliases for Iterable and IterableOnce.
Calling toIterable on a Set is redundant, as it will simply return the same collection:
This collection as an Iterable[A]. No new collection will be built if this is already an Iterable[A].
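A quick sketch of that redundancy (assuming Scala 2.13, where recent versions may flag toIterable as deprecated): the call returns the very same instance, and only the static type changes.

val set = Set(4, 6, 7, 8, 9, 13, 14)
val result = set.toIterable

// same underlying object; only the static type is now Iterable[Int]
println(result eq set) // true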
Examples where toIterable would have an effect would be
"Hello".toIterable
Array(1).toIterable
which implicitly convert to
wrapString("Hello").toIterable
wrapIntArray(Array(1)).toIterable
and turn these Java-like types into proper Scala collections.
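A small sketch of what you get back in each case, with the resulting static types written out explicitly (illustrative only):

val chars: Iterable[Char] = "Hello".toIterable  // a plain String is not itself a Scala Iterable[Char]
val nums: Iterable[Int]   = Array(1).toIterable // an Array is not itself a Scala Iterable[Int]

println(chars.toList) // List(H, e, l, l, o)
println(nums.toList)  // List(1)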
In addition to Mario Galic's answer, the other thing it does is change the static type. If you and the compiler knew it was a Set before the call, you no longer know that afterwards. Though the same can be achieved with a type ascription
val result: Iterable[Int] = set
(and this will work for strings and arrays as well), you then need to write out the type parameter, which may be much more complex than Int.
Why would I use it? If I know it's a Set, why would I change the type to Iterable?
It can be in a method which can be overridden and doesn't have to return a Set in subclasses:
class Super {
  def someValues = {
    val set = ... // you want to avoid duplicates
    set
  }
}
class Sub extends Super {
  override def someValues = {
    List(...) // happens to have duplicates this time
  }
}
doesn't compile, but would if Super#someValues returned set.toIterable (though it's generally good practice to have explicit return types).
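For instance, a sketch that does compile, with the return type written out explicitly as Iterable and concrete bodies standing in for the ... placeholders:

class Super {
  def someValues: Iterable[Int] = Set(1, 2, 3)           // deduplicated here
}

class Sub extends Super {
  override def someValues: Iterable[Int] = List(1, 1, 2) // duplicates allowed here
}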
It can influence later inferred types:
val arr = Array(set)
arr(0) = List(0, 1, 2, 3)
doesn't compile, but would with Array(set.toIterable).
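Putting that together as a runnable sketch (the commented-out line is the one that fails to compile):

val set = Set(4, 6, 7, 8, 9, 13, 14)

val arr1 = Array(set)            // inferred as Array[Set[Int]]
// arr1(0) = List(0, 1, 2, 3)    // does not compile: a List is not a Set

val arr2 = Array(set.toIterable) // inferred as Array[Iterable[Int]]
arr2(0) = List(0, 1, 2, 3)       // compiles: a List is an Iterable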

Scala: mutable HashMap does not update inside for loop

I have a var permutedTables = HashMap[List[Int], List[(String, String)]]() defined globally. I first populate the HashMap with the keys in a method, which works.
print(permutedTables) :
Map(List(1,2,3,4) -> List(),
List(2,4,5,6) -> List(), etc...)
The problem occurs when I want to update the values (empty lists) of the HashMap inside a for loop (inside a second method). In other words, I want to add (String, String) tuples to the List() for each key.
for (pi_k <- permutedTables.keySet) {
  var mask = emptyMask
  mask = pi_k.foldLeft(mask)((s, i) => s.updated(i, '1'))
  val maskB = Integer.parseInt(mask, 2)
  val permutedFP = (intFP & maskB).toBinaryString
  // attempt 1:
  // permutedTables(pi_k) :+ (url, permutedFP)
  // attempt 2:
  // permutedTables.update(pi_k, permutedTables(pi_k) ::: List((url, permutedFP)))
}
The values do not update. I still have empty lists as values. I don't understand what is wrong with my code.
EDIT 1: When I call print(permutedTables) after either of the two attempts (inside the loop), the values seem updated, but when I call it outside of the loop, the Lists are empty.
EDIT 2: The second attempt in my code seems to work now(!). But why does the first not work?
The second attempt in my code seems to work now(!). But why does the first not work?
Because what you do in the first case is get a list from permutedTables, add an element and throw away the result without storing it back. It would work if you mutated the value, but a List is immutable. With List, you need
permutedTables += pi_k -> (permutedTables(pi_k) :+ ((url, permutedFP)))
Or, as you saw, update.
You can use e.g. ArrayBuffer or ListBuffer as your value type instead (note that you append to them in place with += rather than :+), and convert to your desired type at the end. This is going to be rather more efficient than repeatedly appending to the end of an immutable list, unless the lists are quite small!
Finally, note that you generally want either var or a mutable type, not both at the same time.
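A minimal sketch of the buffer-based alternative (assuming Scala 2.13), with made-up keys and stand-in values for the url and fingerprint computed in the question's loop:

import scala.collection.mutable

// ListBuffer values are mutated in place with +=, then frozen into immutable Lists at the end.
val permutedTables = mutable.HashMap[List[Int], mutable.ListBuffer[(String, String)]](
  List(1, 2, 3, 4) -> mutable.ListBuffer.empty[(String, String)],
  List(2, 4, 5, 6) -> mutable.ListBuffer.empty[(String, String)]
)

for (pi_k <- permutedTables.keySet) {
  // stand-ins for the url and permuted fingerprint
  permutedTables(pi_k) += ("http://example.com" -> pi_k.mkString)
}

// snapshot with the original value type, List[(String, String)]
val frozen: Map[List[Int], List[(String, String)]] =
  permutedTables.view.mapValues(_.toList).toMap

println(frozen)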

Scala Slick Bulk Insert with Array

I'm splitting a long string into an array of strings, and then I want to insert all of them into the database. I can easily loop through the array and insert them one by one, but that seems very inefficient. Then I found that there is an insertAll() method. However, the insertAll() method is defined like this:
def insertAll(values: U*)
This only accepts multiple U, but not an Array or List.
/* Insert a new Tag */
def insert(insertTags: Array[String])(implicit s: Session) {
  var insertSeq: List[Tag] = List()
  for (tag <- insertTags) {
    insertSeq :+= new Tag(None, tag)
  }
  Tag.insertAll(insertSeq)
}
*Tag is the table Object
This is the preliminary code I have written. It doesn't work because insertAll() doesn't take a Seq. I hope there is a way to do this... so it won't generate one SQL insert clause per array element.
When a function expects repeated parameters such as U* and you want to pass a sequence of U instead, it must be marked as a sequence argument, which is done with : _*, as in
Tag.insertAll(insertSeq: _*)
This is described in the specification in section 6.6. It disambiguates some situations such as:
def f(x: Any*)
f(Seq(1, "a"))
There, f could be called with a single argument, Seq(1, "a"), or with two, 1 and "a". It will be the former; the latter is done with f(Seq(1, "a"): _*). Python has a similar syntax.
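A tiny sketch of that disambiguation, where the element counts make the difference visible:

def f(x: Any*): Int = x.length

println(f(Seq(1, "a")))     // 1: the Seq itself is the single argument
println(f(Seq(1, "a"): _*)) // 2: the Seq is expanded into two arguments
println(f(1, "a"))          // 2: two arguments passed directly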
Regarding the questions in your comment:
It works with Seq, collections that conform to Seq, and values that can be implicitly converted to a Seq (which includes Arrays). This covers a lot of collections, but not all of them. For instance, Sets are not allowed (but they have a toSeq method, so it is still easy to call with a Set).
It is not a method; it is more like a type ascription. It simply tells the compiler that the argument is itself the full expected Seq, and not the only item in a sequence argument.
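Applied to the code in the question, a sketch of the fixed insert method could look like the following; Tag, Session and insertAll are taken from the question, and the assumption that rows can be built with new Tag(None, tag) is carried over from there:

/* Bulk insert: build the rows first, then expand the Seq into
   insertAll's repeated parameter with : _* */
def insert(insertTags: Array[String])(implicit s: Session): Unit = {
  val insertSeq: Seq[Tag] = insertTags.toSeq.map(tag => new Tag(None, tag))
  Tag.insertAll(insertSeq: _*)
}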

Surprise casts needed when initializing MongoDB document from F# dict?

There is a micro question here, about why an upcast is needed in the final answer I came up with (at the bottom of this), and a macro question about whether I am just missing the "elephant in the room": some really obvious, succinct way to do what I want [please don't ask me -why- I want what I do want; just take it as a given that I want this, and it is...]
I want to initialize a BsonDocument from F# via the MongoDB.Bson CLR assembly. The particular overload of the BsonDocument constructor I think I should use is
MongoDB.Bson.BsonDocument.BsonDocument(IDictionary<string,object>)
and here is why I think this is the one I should use (the following is a long stroll through the garden of types ...)
The C# sample from the MongoDB site MongoDB CSharp Driver Tutorial uses collection-initializer syntax, which maps to one or more calls of .Add on the interface exposed by BsonDocument. The tutorial sample resembles the following:
var bdoc = new BsonDocument { { "a", "1" }, { "b", "2" }, };
I am not positive which overload of .Add is being used (and don't know how to check in Visual Studio), but all of the dictionary-based overloads are typed as <string, object>. In this case, the second value in each pair, namely "1" and "2", of type string, is automatically (by inheritance) also of type object, so everything is ok. The other overloads of .Add require the second item to be of type BsonValue, which is an abstract supertype of BsonString, and BsonString has an implicit conversion from .NET string; so everything is ok there, too, no matter which overload is used. It doesn't matter which overload of the constructor is called.
This is a little difficult to maneuver into an F# equivalent because it's difficult to get at the .Add method of BsonDocument. I thought of
[("a", "1");("b", "2");] |> Seq.iter BsonDocument.Add
but that doesn't work because BsonDocument.Add is not a static method; I could instantiate the BsonDocument and then write a fun lambda that calls the BsonDocument's .Add method, which would at least isolate mutability to the fun:
[("a", "1");("b", "2");] |> Seq.fold ...
but this turns out to be super ugly, both because the fun needs an explicit type annotation on the BsonDocument (the variable referring to the BsonDocument occurs before the (new BsonDocument()), so left-to-right type inference doesn't have enough info yet), and because the fun (at least, apparently) has no way to know that it should use the implicit conversion from string to BsonString for the second value in each pair...
let bdoc = [("a","1");("b","2");] |> Seq.fold (fun (doc:BsonDocument) pair -> doc.Add(fst pair, new BsonString(snd pair))) (new BsonDocument())
... no matter, I thought, I'll use the bigger overload of the constructor
BsonDocument(IDictionary<string, object>)
but this is forced to the following:
let bdoc = (new BsonDocument(dict
[("a", "1" :> Object);
("b", "2" :> Object);
]))
If I take out the upcasts
:> Object
then F# complains that it can't find an overload of BsonDocument.
(long stroll in the garden is over ...)
After all that, the micro question is why, in F#, can it not figure out that the "1" and "2" in the input dictionary are objects and thus find the appropriate overload?
And the bigger-picture macro question is whether I've just missed the appropriate, best-practices, supercool, succinct way to do this in F# ?
This is not a MongoDB issue. The issue is that ("a", "1") is a string * string, so you're creating an IDictionary<string,string> with your dict constructor. F# has inferred the types you've given it. Its type inference won't decide that you meant obj in this case, so you have to tell it.
> dict
[("a", "1" :> obj);
("b", "2" :> obj);
];;
val it : System.Collections.Generic.IDictionary<string,obj> =
  seq [[a, 1] {Key = "a"; Value = "1";}; [b, 2] {Key = "b"; Value = "2";}]
> dict
[("a", "1");
("b", "2");
];;
val it : System.Collections.Generic.IDictionary<string,string> =
  seq [[a, 1] {Key = "a"; Value = "1";}; [b, 2] {Key = "b"; Value = "2";}]