I've set up a new Play Framework 2.8 project, and my dilemmas are:
1. which dependency should I use:
"org.reactivemongo" %% "reactivemongo" % "1.0"
OR
"org.reactivemongo" %% "play2-reactivemongo" // dont even think that there is 1.0 for play 2.8, is it deprecated?
2. Up until now I used play-json to serialize/deserialize the objects that I insert into or fetch from Mongo, for example:
object MongoSerializer {
  implicit val InstantFormat = CommonSerializers.InstantSerializers.BSONFormat
  implicit val MetadataFormat: OFormat[Metadata] = Json.format[Metadata]
  implicit val PairingFormat: OFormat[Pairing] = Json.format[Pairing]
  implicit val pairTypeFormat: Format[PairType] = EnumFormats.formats(PairType)
}
In my DB config I used _.collection[JSONCollection], but I remember reading that JSONCollection is about to be deprecated and only BSONCollection will be supported, so I want to work with BSONCollection.
As you can see, I'm a bit confused. If someone can help me understand which setup I should use and which serialization approach goes best with it, I would appreciate it a lot. Thanks!
I will go for the first option because some of my results are aggregations/customizations of different collections. Thus, I will have to write custom BSON/JSON converters myself.
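For the BSON side, something like this is what I have in mind with the reactivemongo 1.0 BSON API (the case class fields below are made up just to make the sketch self-contained; only the names come from my original code):

import reactivemongo.api.bson.{BSONDocumentHandler, Macros}

// Hypothetical shapes for my case classes, only for illustration
case class Metadata(source: String, version: Int)
case class Pairing(left: String, right: String)

object MongoBsonSerializer {
  // Macros.handler derives both the BSON reader and the writer for a case class;
  // these handlers replace the play-json formats when working with BSONCollection
  implicit val metadataHandler: BSONDocumentHandler[Metadata] = Macros.handler[Metadata]
  implicit val pairingHandler: BSONDocumentHandler[Pairing] = Macros.handler[Pairing]
}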
I bumped the scalikejdbc version and got an error like this:
[error] Implicit ParameterBinderFactory[org.joda.time.LocalDateTime] for the parameter type org.joda.time.LocalDateTime is missing.
[error] You need to define ParameterBinderFactory for the type or use AsIsParameterBinder.
def toUpdaters: Seq[(SQLSyntax, ParameterBinder)] = {
  val ses = SeasonProjection.column
  Seq(
    start.map(ses.start -> _) //<- here
How can I solve this problem? I have not found any examples.
You can check out the documentation at http://scalikejdbc.org/documentation/operations.html, section "Using joda-time library".
You need to add a library to allow scalikejdbc to work with Joda:
libraryDependencies += "org.scalikejdbc" %% "scalikejdbc-joda-time" % "3.5.0"
Then use appropriate imports in your code like:
// If you need ParameterBinderFactory for joda-time classes
import scalikejdbc.jodatime.JodaParameterBinderFactory._
// If you need TypeBinder for joda-time classes
import scalikejdbc.jodatime.JodaTypeBinder._
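With those imports in scope, a binding like the one in your toUpdaters compiles. A minimal sketch of the pattern with a made-up table mapping (I am not reproducing your SeasonProjection):

import org.joda.time.LocalDateTime
import scalikejdbc._
// Brings the implicit ParameterBinderFactory instances for joda-time types into scope
import scalikejdbc.jodatime.JodaParameterBinderFactory._

// Hypothetical table mapping, only for illustration
case class Season(id: Long, start: LocalDateTime)
object Season extends SQLSyntaxSupport[Season]

def updateStart(id: Long, newStart: LocalDateTime)(implicit session: DBSession): Int = {
  val c = Season.column
  // Without the joda-time import, `c.start -> newStart` is where the
  // "ParameterBinderFactory ... is missing" error is reported
  withSQL {
    update(Season).set(c.start -> newStart).where.eq(c.id, id)
  }.update.apply()
}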
That being said, you might want to get rid of Joda in favour of java.time, which is nowadays the standard for date/time representation and is supported out of the box by most libraries.
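If you do switch, no extra module is needed, since scalikejdbc core ships with JSR-310 support. A one-line sketch of the same kind of binding (column name again made up):

import java.time.LocalDateTime
import scalikejdbc._

// ParameterBinderFactory[java.time.LocalDateTime] comes with scalikejdbc core,
// so this compiles without any additional import or module
def startUpdater(start: LocalDateTime): (SQLSyntax, ParameterBinder) =
  sqls"start" -> start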
I am looking for a simple way to read a file (and maybe a directory) via the SFTP protocol in Scala.
What I have tried so far:
I've looked at the Alpakka library, which is a part of akka.
But it works with streams, which is a complex topic I am not familiar with, and it seems too much effort for this.
Then there is spark-sftp: this needs Spark, which would be a bit much just to load a file.
There is the JSch library for Java that could do the job, but I could not get it to work.
I am looking for actual working code that uses a library and SFTP, instead of the plain scp I am otherwise forced to use. There are not many examples for this on the web, and the ones I have found are much more complex.
Here is a working example, using sshj:
import net.schmizz.sshj.SSHClient
import net.schmizz.sshj.sftp.SFTPClient
object Main extends App {
  val hostname = "myServerName"
  val username = "myUserName"
  val password = "thePassword"
  val destinationFile = "C:/Temp/Test.txt"
  val sourceFile = "./Test.txt"

  val ssh = new SSHClient()
  // Only accept the server if its host key matches this fingerprint
  ssh.addHostKeyVerifier("xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx")
  ssh.connect(hostname)
  ssh.authPassword(username, password)

  val sftp: SFTPClient = ssh.newSFTPClient()
  // Download the remote sourceFile to the local destinationFile
  sftp.get(sourceFile, destinationFile)
  sftp.close()
  ssh.disconnect()
}
I tested this on scala version 2.13.4 with the following entries in build.sbt:
libraryDependencies += "com.hierynomus" % "sshj" % "0.31.0"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
I would not recommend actually using it this way. Some of these steps should be wrapped in a Try, and some error checking should be done in case the file does not exist or the connection fails, and so on. I intentionally left that out for clarity.
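For example, one way to add that error handling is to wrap the resources in scala.util.Using, which closes them even if the transfer fails (just a sketch, not the only way to do it):

import scala.util.{Try, Using}
import net.schmizz.sshj.SSHClient

def download(hostname: String, username: String, password: String,
             sourceFile: String, destinationFile: String): Try[Unit] =
  Using(new SSHClient()) { ssh =>
    ssh.addHostKeyVerifier("xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx")
    ssh.connect(hostname)
    ssh.authPassword(username, password)
    // SFTPClient is Closeable as well, so Using.resource closes it for us
    Using.resource(ssh.newSFTPClient()) { sftp =>
      sftp.get(sourceFile, destinationFile)
    }
  }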
I am not saying that this is the only or the right library for this task. It is just the first one that worked for me. The addHostKeyVerifier method in particular was very helpful in my case.
There are also other libraries like JSch, jassh, scala-ssh and scala-ftp which could very well do the job.
From my understanding, I should be able to use the banana-rdf library in my Scala.js code? I followed the instructions on the website and added the following to my build.sbt:
val banana = (name: String) => "org.w3" %% name % "0.8.4" excludeAll (ExclusionRule(organization = "org.scala-stm"))
and added the following to my common settings:
resolvers += "bblfish-snapshots" at "http://bblfish.net/work/repo/releases"
libraryDependencies ++= Seq("banana", "banana-rdf", "banana-sesame").map(banana)
It all compiles fine until it gets to the fast optimizing stage. Then I get the following error:
Referring to non-existent class org.w3.banana.sesame.Sesame$
I tried changing Sesame for Plantain but got the same outcome.
Am I missing something?
I was using the "%%" notation, which is for a JVM module.
I changed it to "%%%" and it was able to find the correct library.
Note: I had to use Plantain, as this is the only backend currently compiled for Scala.js.
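For reference, the corrected build.sbt lines look roughly like this (banana-plantain is my guess at the module name, following the same naming pattern as banana-sesame):

resolvers += "bblfish-snapshots" at "http://bblfish.net/work/repo/releases"

// %%% resolves the Scala.js artifact instead of the JVM one
val banana = (name: String) => "org.w3" %%% name % "0.8.4" excludeAll (ExclusionRule(organization = "org.scala-stm"))

// Plantain instead of Sesame, since it is the only backend compiled for Scala.js
libraryDependencies ++= Seq("banana", "banana-rdf", "banana-plantain").map(banana)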
I am writing some tests, and in many cases I have a configuration for FTP/HTTP.
I am working with Scala and the following libraries in my sbt:
"org.scalatest" %% "scalatest" % "3.0.1" % Test,
"org.scalamock" %% "scalamock" % "4.1.0" % Test,
I am writing the following code as an example of a mocked configuration inside my test:
val someConfig = SomeConfig(
  endpoint = "",
  user = "",
  password = "",
  companyName = "",
  proxy = ProxyConfig("", 2323)
)
But I feel it is not nice to do this for each configuration that I am going to be dealing with...
I would like to create the following:
val someConfig = mock[SomeConfig]
but when my code tries to access the proxy property, which is a case class, it fails with a NullPointerException.
I would like to know how to mock case classes that contain other case classes and make my code a bit clearer. Is there a way to do this with MockFactory?
You can try to mock it like this:
val someConfig = mock[SomeConfig]
when(someConfig.proxy).thenReturn(ProxyConfig("", 2323))
So it will return ProxyConfig("", 2323) when you try to get someConfig.proxy.
The above code uses Mockito due to a known limitation of ScalaMock: parameters of case classes are translated into val fields, and ScalaMock is not able to mock vals, so I think it is not possible to do this directly with ScalaMock.
Mockito does have this capability.
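A fuller sketch with plain mockito-core and ScalaTest 3.0.x (the case class fields mirror the question; mockito-core has to be added as a test dependency, and the mock[T] sugar used above typically comes from a MockitoSugar trait, while this version calls Mockito's API directly):

import org.mockito.Mockito.{mock, when}
import org.scalatest.{FlatSpec, Matchers}

// Case classes mirroring the ones in the question
case class ProxyConfig(host: String, port: Int)
case class SomeConfig(endpoint: String, user: String, password: String,
                      companyName: String, proxy: ProxyConfig)

class SomeConfigSpec extends FlatSpec with Matchers {
  "a mocked SomeConfig" should "return the stubbed proxy" in {
    val someConfig = mock(classOf[SomeConfig])
    // Stub only the property the test actually needs
    when(someConfig.proxy).thenReturn(ProxyConfig("", 2323))
    someConfig.proxy.port shouldBe 2323
  }
}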
In the tutorial you find two versions of Scalameta.
lazy val MetaVersion = "3.7.2"
lazy val MetaVersion1 = "1.8.0"
I am a bit confused, as they seem to refer to the same project:
lazy val scalameta1 = "org.scalameta" %% "scalameta" % MetaVersion1
lazy val scalameta = "org.scalameta" %% "scalameta" % MetaVersion
Can somebody point out the difference, and when you use which one of these?
The Tutorial only mentions "3.7.2", but with that I got the exception
ERROR: new-style ("inline") macros require scala.meta
explained here: new-style-inline-macros-require-scala-meta
3.7.2 is the current version of scalameta (actually already 3.7.4).
1.8.0 is the last version of scalameta that worked with scalameta macro annotations through the scalameta paradise compiler plugin.
So if you need the latest version of scalameta, use 3.7.4. If you need scalameta macro annotations, use 1.8.0.
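As a sketch, the two alternative setups in build.sbt could look like this (the paradise plugin coordinates and version are my assumption; check the paradise documentation for the release matching your exact Scala version):

// (a) Latest scalameta, e.g. for tooling or source analysis
libraryDependencies += "org.scalameta" %% "scalameta" % "3.7.4"

// (b) scalameta macro annotations: old scalameta plus the paradise compiler plugin
libraryDependencies += "org.scalameta" %% "scalameta" % "1.8.0"
addCompilerPlugin("org.scalameta" % "paradise" % "3.0.0-M11" cross CrossVersion.full)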