Mocking configuration objects with MockFactory - scala

I am writing some tests, and in many cases I have to deal with an FTP/HTTP configuration.
I am working with Scala and the following libraries in my sbt:
"org.scalatest" %% "scalatest" % "3.0.1" % Test,
"org.scalamock" %% "scalamock" % "4.1.0" % Test,
As an example, inside my test I currently build a configuration by hand like this:
val someConfig = SomeConfig(
  endpoint = "",
  user = "",
  password = "",
  companyName = "",
  proxy = ProxyConfig("", 2323)
)
But it does not feel nice to do this for every configuration I am going to deal with...
I would like to create the following:
val someConfig = mock[SomeConfig]
but when my code tries to reach the proxy property, which is a case class, it fails with a NullPointerException.
I would like to know how to mock case classes that contain other case classes, and make my code a bit clearer. Is there a way to do this with MockFactory?

You can try to mock it like this (assuming ScalaTest's MockitoSugar for the mock[...] syntax and mockito-core on the test classpath):
import org.mockito.Mockito.when
import org.scalatest.mockito.MockitoSugar.mock

val someConfig = mock[SomeConfig]
when(someConfig.proxy).thenReturn(ProxyConfig("", 2323))
So it will return ProxyConfig("", 2323) whenever someConfig.proxy is accessed.
The above code uses Mockito rather than ScalaMock, due to a known limitation of ScalaMock.

Parameters of case classes are translated into val fields, and ScalaMock has a known limitation in that it cannot mock vals, so I think it is not possible to do this directly with ScalaMock.
Mockito does have this capability.
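Put together, a minimal sketch of a full test (assuming the SomeConfig and ProxyConfig case classes from the question, ScalaTest 3.0.1's MockitoSugar trait, and a test dependency on mockito-core):

import org.mockito.Mockito.when
import org.scalatest.FlatSpec
import org.scalatest.mockito.MockitoSugar

class SomeConfigSpec extends FlatSpec with MockitoSugar {

  "a mocked SomeConfig" should "return the stubbed proxy" in {
    // Only the properties the code under test touches need to be stubbed
    val someConfig = mock[SomeConfig]
    when(someConfig.proxy).thenReturn(ProxyConfig("", 2323))

    assert(someConfig.proxy == ProxyConfig("", 2323))
  }
}

Keep in mind that any property you do not stub will still come back as null (or a zero default), so this only pays off when the code under test reads a few fields of a large configuration.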

Related

How to read a file via sftp in scala

I am looking for a simple way to read a file (and maybe a directory) via sftp protocol in scala.
What I have tried:
I've looked at the Alpakka library, which is part of Akka. But it works with streams, which is a complex topic I am not familiar with, and it seems like too much effort for this task.
Then there is spark-sftp: this needs Spark, which would be a bit much just to load a file.
There is the jsch library for Java that could do the job, but I could not get it to work.
I am looking for actual working code that uses a library and sftp, instead of the plain scp I am currently forced to use. I've found that there are not many examples for this on the web, and the ones I have found are much more complex.
Here is a working example, using sshj:
import net.schmizz.sshj.SSHClient
import net.schmizz.sshj.sftp.SFTPClient

object Main extends App {
  val hostname = "myServerName"
  val username = "myUserName"
  val password = "thePassword"
  val destinationFile = "C:/Temp/Test.txt"
  val sourceFile = "./Test.txt"

  val ssh = new SSHClient()
  // Verify the server against its known key fingerprint instead of a known_hosts file
  ssh.addHostKeyVerifier("xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx")
  ssh.connect(hostname)
  ssh.authPassword(username, password)

  // Download sourceFile from the server to destinationFile on the local machine
  val sftp: SFTPClient = ssh.newSFTPClient()
  sftp.get(sourceFile, destinationFile)
  sftp.close()
  ssh.disconnect()
}
I tested this on scala version 2.13.4 with the following entries in build.sbt:
libraryDependencies += "com.hierynomus" % "sshj" % "0.31.0"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
I would not recommend actually using it this way. Some of these steps should be wrapped in a Try, and then some error checking should be done in case the file doesn't exist, the connection fails, and so on. I intentionally left that out for clarity.
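For example, a rough sketch of that kind of error handling (the same sshj calls as above, with the cleanup moved into finally blocks and any failure surfaced as a scala.util.Try):

import net.schmizz.sshj.SSHClient
import scala.util.Try

def downloadViaSftp(hostname: String, username: String, password: String,
                    fingerprint: String, source: String, destination: String): Try[Unit] = Try {
  val ssh = new SSHClient()
  ssh.addHostKeyVerifier(fingerprint)
  ssh.connect(hostname)
  try {
    ssh.authPassword(username, password)
    val sftp = ssh.newSFTPClient()
    try sftp.get(source, destination) // any exception here becomes a Failure
    finally sftp.close()
  } finally ssh.disconnect()
}

The caller can then pattern match on Success/Failure instead of letting exceptions escape.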
I am not saying that this is the only or the right library for this task. It is just the first one that did work for me. Especially the addHostKeyVerifier method was very helpful in my case.
There are also other libraries like JSch, jassh, scala-ssh and scala-ftp which could very well do the job too.

Working with ReactiveMongo (for play framework apps) in 2020

I've set up a new Play Framework 2.8 project, and my dilemmas are:
1. Which dependency should I use:
"org.reactivemongo" %% "reactivemongo" % "1.0"
OR
"org.reactivemongo" %% "play2-reactivemongo" // I don't even think there is a 1.0 for Play 2.8; is it deprecated?
2. Up until now I used play-json to serialize/deserialize the objects that I insert into or fetch from Mongo, for example:
object MongoSerializer {
  implicit val InstantFormat = CommonSerializers.InstantSerializers.BSONFormat
  implicit val MetadataFormat: OFormat[Metadata] = Json.format[Metadata]
  implicit val PairingFormat: OFormat[Pairing] = Json.format[Pairing]
  implicit val pairTypeFormat: Format[PairType] = EnumFormats.formats(PairType)
}
In my db config I used _.collection[JSONCollection], but I remember someone wrote that JSONCollection is about to be deprecated and only BSONCollection will be supported, so I wanted to switch to BSONCollection.
So as you can see, I'm a bit confused. If someone can help me understand which setup I should use, and which serialization approach goes best with it, I would appreciate it a lot. Thanks!
I will go for the first option, because some of my results are aggregations/customizations of different collections, so I will have to write custom BSON/JSON converters myself anyway.
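For the BSON side, a minimal sketch of what replaces the play-json formats (using a hypothetical Metadata case class; with the standalone 1.0 driver, Macros.handler plays roughly the same role that Json.format plays for play-json):

import reactivemongo.api.bson.{BSONDocumentHandler, Macros}

case class Metadata(id: String, createdAt: Long)

object MongoSerializer {
  // Derives both the BSON reader and the writer, analogous to Json.format[Metadata]
  implicit val metadataHandler: BSONDocumentHandler[Metadata] = Macros.handler[Metadata]
}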

Can't get the banana-rdf to work with scalajs

From my understanding, I should be able to use the banana-rdf library in my Scala.js code. I followed the instructions on the website and added the following to my build.sbt:
val banana = (name: String) => "org.w3" %% name % "0.8.4" excludeAll (ExclusionRule(organization = "org.scala-stm"))
and added the following to my common settings:
resolvers += "bblfish-snapshots" at "http://bblfish.net/work/repo/releases"
libraryDependencies ++= Seq("banana", "banana-rdf", "banana-sesame").map(banana)
It all compiles fine until it gets to the point where it does the fast optimizing. Then I get the following error:
Referring to non-existent class org.w3.banana.sesame.Sesame$
I tried changing Sesame to Plantain but got the same outcome.
Am I missing something?
I was using the "%%" notation, which is for JVM modules.
I changed it to "%%%" and it was able to find the correct library.
NOTE: I had to use Plantain, as this is the only backend currently compiled for Scala.js.
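For reference, the working definition ends up looking like this (the same lines as in the question, but with %%% so sbt resolves the Scala.js artifact; I am assuming the Plantain module is published as banana-plantain):

val banana = (name: String) => "org.w3" %%% name % "0.8.4" excludeAll (ExclusionRule(organization = "org.scala-stm"))
libraryDependencies ++= Seq("banana", "banana-rdf", "banana-plantain").map(banana)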

Issue with Kafka stream filtering

I'm trying to run a basic app from the following example:
https://github.com/confluentinc/examples/blob/3.3.x/kafka-streams/src/main/scala/io/confluent/examples/streams/MapFunctionScalaExample.scala
However I'm getting an exception at this line:
// Variant 1: using `mapValues`
val uppercasedWithMapValues: KStream[Array[Byte], String] = textLines.mapValues(_.toUpperCase())
Error:(33, 25) missing parameter type for expanded function ((x$1) =>
x$1.toUpperCase())
textLines.mapValues(_.toUpperCase())
Error I'm getting if I hover cursor over the code:
Type mismatch, expected: ValueMapper[_ >: String, _ <: NotInferedVR],
actual: (Any) => Any Cannot resolve symbol toUpperCase
Contents of my sbt file:
name := "untitled1"
version := "0.1"
scalaVersion := "2.11.11"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.11
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/kafka-streams
libraryDependencies += "org.apache.kafka" % "kafka-streams" % "0.11.0.0"
// https://mvnrepository.com/artifact/org.apache.kafka/connect-api
libraryDependencies += "org.apache.kafka" % "connect-api" % "0.11.0.0"
I'm really not sure how to proceed with that as I'm quite new to Scala. I'd like to know what's the issue and how to fix it.
From http://docs.confluent.io/current/streams/faq.html#scala-compile-error-no-type-parameter-java-defined-trait-is-invariant-in-type-t
The root cause of this problem is Scala-Java interoperability – the Kafka Streams API is implemented in Java, but your application is written in Scala. Notably, this problem is caused by how the type systems of Java and Scala interact. Generic wildcards in Java, for example, often cause such Scala issues.
To fix the problem you would need to declare types explicitly in your Scala application in order for the code to compile. For example, you may need to break a single statement that chains multiple DSL operations into multiple statements, where each statement explicitly declares the respective return types. The StreamToTableJoinScalaIntegrationTest demonstrates how the types of return variables are explicitly declared.
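Applied to the line from the question, the workaround looks roughly like this (a sketch assuming textLines: KStream[Array[Byte], String] as in the linked example; the underscore lambda is replaced by an explicit ValueMapper so no parameter type needs to be inferred):

import org.apache.kafka.streams.kstream.{KStream, ValueMapper}

val uppercasedWithMapValues: KStream[Array[Byte], String] =
  textLines.mapValues(new ValueMapper[String, String] {
    override def apply(value: String): String = value.toUpperCase()
  })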
Update
Kafka 2.0 (to be released in June) contains a proper Scala API that avoids these issues. Compare https://cwiki.apache.org/confluence/display/KAFKA/KIP-270+-+A+Scala+Wrapper+Library+for+Kafka+Streams
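With that Scala wrapper (the kafka-streams-scala module), the original one-liner compiles unchanged, roughly like this (a sketch assuming Kafka 2.0 artifacts and a "TextLinesTopic" input topic):

import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.kstream.KStream
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes._

val builder = new StreamsBuilder()
val textLines: KStream[Array[Byte], String] =
  builder.stream[Array[Byte], String]("TextLinesTopic")
// The Scala DSL takes plain Scala functions, so the lambda's type is inferred
val uppercased: KStream[Array[Byte], String] = textLines.mapValues(_.toUpperCase())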

Abandon calling `get` on Option and generate compile error

If I want to generate a compile-time error when calling .get on any Option value, how do I go about doing this?
I haven't written any custom macros, but I guess it's about time for it? Any pointers?
There is a compiler plugin called wartremover that provides what you want.
https://github.com/typelevel/wartremover
It has error messages and warnings for some Scala functions that should be avoided for safety.
This is the description of the OptionPartial wart from the github readme page:
scala.Option has a get method which will throw if the value is
None. The program should be refactored to use scala.Option#fold to
explicitly handle both the Some and None cases.
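A quick illustration of that suggested refactoring, with a hypothetical maybeName value:

val maybeName: Option[String] = Some("scala")

// Instead of maybeName.get, which throws on None, handle both cases explicitly:
val greeting: String = maybeName.fold("Hello, stranger")(name => s"Hello, $name")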
compiler plugin
To add wartremover, as a plugin, to scalac, you need to add this to your project/plugins.sbt:
resolvers += Resolver.sonatypeRepo("releases")
addSbtPlugin("org.brianmckenna" % "sbt-wartremover" % "0.11")
And activate it in your build.sbt:
wartremoverErrors ++= Warts.unsafe
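Once that is active (Warts.unsafe includes the OptionPartial wart), code like the following no longer compiles, failing with the same "Option#get is disabled - use Option#fold instead" error shown in the macro example below:

val port: Option[Int] = Some(8080)
val p: Int = port.get // compile error with wartremover enabled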
macro
https://github.com/typelevel/wartremover/blob/master/OTHER-WAYS.md describes other ways you can use the plugin; one of them is using it as a macro, as mentioned in the question.
Add wartremover as a library to your build.sbt:
resolvers += Resolver.sonatypeRepo("releases")
libraryDependencies += "org.brianmckenna" %% "wartremover" % "0.11"
You can make any wart into a macro, like so:
scala> import language.experimental.macros
import language.experimental.macros
scala> import org.brianmckenna.wartremover.warts.Unsafe
import org.brianmckenna.wartremover.warts.Unsafe
scala> def safe(expr: Any):Any = macro Unsafe.asMacro
safe: (expr: Any)Any
scala> safe { 1.some.get }
<console>:10: error: Option#get is disabled - use Option#fold instead
safe { 1.some.get }
The example is adapted from the wartremover github page.
Not strictly an answer to your question, but you might prefer to use Scalaz's Maybe type, which avoids this problem by not having a .get method.