How do I post a non-string value in a Scala Http request?

From a Scala program, I need to send values with different primitive types to a Node server.
Http("http://middleware-api:3000/createPromoCodeInCms?")
.postForm
.param("lifetime", 5)
.method("POST")
.asString
This does not compile:
[error] /Users/corbac/Desktop/spark-projects/App.scala:192:24: type mismatch;
[error] found : Int(5)
[error] required: String
[error] .param("lifetime", 5)
[error] ^
However, it seems that only strings are accepted as form parameters. Is it possible to post non-string types such as a number or a boolean? If so, how do I do it?
I am using Scala 2.11.
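The snippet looks like scalaj-http, whose form parameters are (String, String) pairs, so non-string values have to be converted explicitly. A minimal sketch, assuming scalaj-http is the library in use (the extra active parameter is only an illustration, not part of the original request):

import scalaj.http.Http

// Form parameters are strings, so non-string values are converted explicitly
val lifetime: Int = 5
val active: Boolean = true   // hypothetical extra parameter, for illustration

val response = Http("http://middleware-api:3000/createPromoCodeInCms")
  .postForm(Seq(
    "lifetime" -> lifetime.toString,  // "5"
    "active"   -> active.toString     // "true"
  ))
  .asString

println(response.code)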

Related

ambiguous implicit values - twirl form error

With Play Framework 2.6, I am getting the following error. I have only found reports of this error for older versions, where it was easily resolved.
[error] /Users/vishalupadhyay/Work/app/views/login_form.scala.html:12:22: ambiguous implicit values:
[error] both method implicitJavaMessages in object PlayMagicForJava of type => play.api.i18n.Messages
[error] and value request of type play.api.mvc.MessagesRequestHeader
[error] match expected type play.api.i18n.MessagesProvider
[error] #helper.inputText(loginForm("password"))
[error] ^
I could not find any answer that helps. Please see the complete code at this link.
Remove the implicit Lang and Messages arguments
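As a hedged sketch of that suggestion, following the Play 2.6 documentation pattern for form helpers, the template declares only the request as its MessagesProvider (the LoginData form type is an assumption, not taken from the linked code):

@* hypothetical login_form.scala.html header after dropping the implicit Lang/Messages arguments *@
@(loginForm: Form[LoginData])(implicit request: play.api.mvc.MessagesRequestHeader)

@helper.inputText(loginForm("password"))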

Slick generic query with distinctOn

I want to write a generic Slick query that uses distinctOn on a table to count the distinct elements in a column.
def countDistinct(table: TableQuery[_], column: Rep[_]): DBIO[Int] =
  table.distinctOn(_ => column).length.result
The code above doesn't compile. The error is:
No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection,
[error] you use an unsupported type in a Query (e.g. scala List),
[error] or you forgot to import a driver api into scope.
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: slick.lifted.Rep[_]
[error] Unpacked type: T
[error] Packed type: Any
[error] table.distinctOn(_ => column).length.result
Using FlatShapeLevel instead of Rep[_] also doesn't work. I'm using Slick 3.
distinctOn doesn't work properly in Slick because of an incorrect projection onto a particular field/column. The bug was reported five years ago and, surprisingly, still hasn't been resolved.
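Until that is fixed, one workaround is to project the column first and call distinct on the resulting query. A sketch against a concrete table (the Users table and the H2 profile are assumptions; a fully generic version over TableQuery[_] and Rep[_] still runs into the Shape problem shown above):

import slick.jdbc.H2Profile.api._

class Users(tag: Tag) extends Table[(Long, String)](tag, "users") {
  def id   = column[Long]("id", O.PrimaryKey)
  def city = column[String]("city")
  def *    = (id, city)
}

val users = TableQuery[Users]

// Count distinct cities without distinctOn: project, deduplicate, then count
val countDistinctCities: DBIO[Int] =
  users.map(_.city).distinct.length.result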

Spark example won't compile

Trying to run one of Apache Spark's example programs (https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/graphx/AggregateMessagesExample.scala), I get the following compile error:
too many arguments for method sendToDst: (msg: (Int, Double))Unit
[error] Error occurred in an application involving default arguments.
[error] triplet.sendToDst(1, triplet.srcAttr)
[error] ^
[error] one error found
But looking at the methods, it seems to be correct. I'm not sure what is wrong here.
It looks like the method you are calling expects a single argument (a Tuple2) and you are passing in 2 arguments.
Try
triplet.sendToDst((1, triplet.srcAttr))
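For context, here is a hedged sketch of the surrounding aggregateMessages call with the tuple parenthesized, assuming a Graph[Double, Int] whose vertex attributes are ages, as in the linked example:

import org.apache.spark.graphx.{Graph, VertexRDD}

def olderFollowers(graph: Graph[Double, Int]): VertexRDD[(Int, Double)] =
  graph.aggregateMessages[(Int, Double)](
    triplet => {
      if (triplet.srcAttr > triplet.dstAttr) {
        // sendToDst takes a single message argument, hence the extra parentheses around the tuple
        triplet.sendToDst((1, triplet.srcAttr))
      }
    },
    // merge function: add the counts and sum the ages
    (a, b) => (a._1 + b._1, a._2 + b._2)
  )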

Why can't _ be used inside of string interpolation?

This works
(x => s"$x")
but this
(s"${_}")
results in
[error] ...: unbound placeholder parameter
[error] (s"${_}")
Is this a case of a leaky abstraction?
Furthermore, (s"$_") fails with completely different output:
[error] ...: invalid string interpolation: `$$', `$'ident or `$'BlockExpr expected
[error] (s"$_")
[error] ^
[error] ...: unclosed string literal
[error] (s"$_")
Calling string interpolation a leaky abstraction is right, in my opinion. While it works fine in most cases, there are many edge cases where it just doesn't work the way one expects. This is another incarnation of such an edge case.
I don't know why s"$_" is not accepted by the compiler. Some time ago there was a pull request that introduced this syntax for pattern matching: PR 2823
Interestingly, this PR also contains test cases checking that an underscore outside of a pattern match produces an error.
Unfortunately, there is no further explanation of why it is implemented the way it is.
Som Snytt, who implemented the PR, is active on SO; hopefully he can tell us more.
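Until such syntax exists, the parameter has to be named. A small sketch of the working alternatives:

// name the parameter explicitly inside the interpolation
val f: Int => String = x => s"value = $x"

// or skip interpolation and use the ordinary placeholder syntax
val g: Int => String = "value = " + _

f(1) // "value = 1"
g(2) // "value = 2"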

Why are Map and Set aliased in scala.Predef?

Nine times out of ten, simply using Map and Set behaves as I expect, but occasionally I am unexpectedly hit with
error: type mismatch;
[INFO] found : scala.collection.Set[String]
[INFO] required: Set[String]
As an example, from the REPL:
scala> case class Calculator[+T](name: String, parameters: Set[String])
defined class Calculator
scala> val binding=Map.empty[String, String]
binding: scala.collection.immutable.Map[String,String] = Map()
scala> Calculator("Hello",binding.keySet)
<console>:9: error: type mismatch;
found : scala.collection.Set[String]
required: Set[String]
Calculator("Hello",binding.keySet)
^
I think I understand the error: the method calls on the aliased types return the actual, non-aliased types.
So it seems the solution is to import the un-aliased types, at which point every other file in my project generates type mismatch errors, so I have to add the import to each file. Which leads to the question I ask in the title: what is the purpose of the alias in Predef, if eventually I need to import the actual package anyway?
Is my understanding flawed, or is my use case not the typical one, or both?
You have misdiagnosed the problem. It isn't that the compiler fails to recognize that the type alias is the same type as what it aliases. It's that the alias refers to scala.collection.immutable.Set, and that is not the same as scala.collection.Set.
Edit: by the way, I thought I'd fixed this, as evinced by the comment in the type diagnostics:
... Also, if the
* type error is because of a conflict between two identically named
* classes and one is in package scala, fully qualify the name so one
* need not deduce why "java.util.Iterator" and "Iterator" don't match.
Apparently needs more work.
Edit 7/17/2010: OK, it took me a shockingly long time, but now at least it says something hard to misunderstand.
files/neg/type-diagnostics.scala:4: error: type mismatch;
found : scala.collection.Set[String]
required: scala.collection.immutable.Set[String]
def f = Calculator("Hello",binding.keySet)
^
The real problem is that scala.collection.immutable.Map#keySet returns a scala.collection.Set (a read-only Set) instead of a scala.collection.immutable.Set (an immutable Set). I'll leave it for someone else to explain why that is...
Edit
Someone asks for an explanation of the return type of Map#keySet in this thread, but doesn't get an answer.
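Two hedged ways to reconcile the types at the call site, given that keySet comes back as a read-only scala.collection.Set in the Scala versions the question refers to (newer versions return an immutable Set directly):

case class Calculator[+T](name: String, parameters: Set[String]) // Predef.Set is immutable.Set

val binding = Map.empty[String, String]

// 1) force an immutable Set at the call site
val a = Calculator("Hello", binding.keySet.toSet)

// 2) or widen the parameter type to the read-only supertype
case class Calculator2[+T](name: String, parameters: scala.collection.Set[String])
val b = Calculator2("Hello", binding.keySet)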