Having trouble with implicit conversion in Scala

Having this code
case class Workspace(ident: Long, name: String)
case class Project(ident: Long, name: String)

implicit def workspaceJSON: JSONR[Workspace] = new JSONR[Workspace] {
  def read(json: JValue) =
    Workspace.applyJSON(field[Long]("id"), field[String]("name"))(json)
}

implicit def projectJSON: JSONR[Project] = new JSONR[Project] {
  def read(json: JValue) =
    Project.applyJSON(field[Long]("id"), field[String]("name"))(json)
}

def parseEnt[T: JSONR](json: JValue): Either[String, T] =
  fromJSON[T](json).toEither.left.map { _.toString }

def fetchProjects(ws: Workspace): Either[String, Project] = {
  parseEnt(parse("some text"))
}
Which fails to compile on parseEnt(parse("some text")) with
ambiguous implicit values: both method taskJSON in class Fetcher of type =>
Fetcher.this.JSONR[types.Task] and method workspaceJSON in class Fetcher of type =>
Fetcher.this.JSONR[Fetcher.this.Workspace] match expected type Fetcher.this.JSONR[T]
Is there a way to assure Scala that in this case I want the type variable T to be Project, so that it chooses the projectJSON function to parse it? Or, if I'm doing it wrong, what is the right way to do it?

The compiler is trying to automatically infer the type T, which must be something that can be produced starting from a String (more or less; the details are unimportant here).
Unfortunately it can't succeed, since you're providing multiple implicit conversions going from String to Project and to Workspace. The simple solution is to explicitly indicate the type you're trying to produce:
parseEnt[Project](parse("some text"))
This pattern is fairly common with serialization type classes, where you are mapping multiple specific types to a generic one (String in this case).
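For a stand-alone illustration of the same ambiguity and its fix, here is a minimal sketch with made-up Decoder names (not the JSONR API above; it assumes Scala 2.13 for toIntOption and SAM syntax):

trait Decoder[T] { def decode(s: String): Either[String, T] }

object Decoder {
  implicit val intDecoder: Decoder[Int]   = s => s.toIntOption.toRight(s"not an Int: $s")
  implicit val longDecoder: Decoder[Long] = s => s.toLongOption.toRight(s"not a Long: $s")
}

def parseEnt[T: Decoder](s: String): Either[String, T] = implicitly[Decoder[T]].decode(s)

// parseEnt("42")      // ambiguous implicit values: both decoders match Decoder[T]
// parseEnt[Int]("42") // compiles: T is fixed to Int, so only intDecoder is eligible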

Related

Updating case classes from field-name strings and values

Is there a less-verbose way to achieve this?
case class MyClass(
  a: A,
  b: B,
  c: C,
  ...
)

def updatedFromString(m: MyClass, field: String, value: String) = field match {
  case "A" => m.withA(value)
  case "B" => m.withB(value)
  case "C" => m.withC(value)
  ...
}

implicit class FromStrings(m: MyClass) {
  def withA(v: String) = m.copy(a = A.fromString(v))
  def withB(v: String) = m.copy(b = B.fromString(v))
  def withC(v: String) = m.copy(c = C.fromString(v))
  ...
}
MyClass has a lot of fields - a,b,c, etc - all of which are instances of different case classes.
This leads to a lot of case statements above and a lot of updater methods named withXXX, which look fairly repetitive.
You could extract the logic:
// repetitive and generic enough to make it easier to generate
val setters: Map[String, String => MyClass => MyClass] = Map(
  "A" -> (v => _.copy(a = A.fromString(v))),
  "B" -> (v => _.copy(b = B.fromString(v))),
  "C" -> (v => _.copy(c = C.fromString(v))),
  ...
)

def updatedFromString(m: MyClass, field: String, value: String) =
  setters(field)(value)(m)
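Hypothetical usage, given some existing instance m of MyClass:

val updated = updatedFromString(m, "A", "rawValueForA") // looks up the "A" setter, parses the value, copies m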
If it is still too much, you could generate setters using macros or runtime reflection, but I am not sure it is worth the effort.
EDIT: An alternative solution which changes how you deal with code:
sealed trait PatchedField
object PatchedField {
  // field names corresponding to names from MyClass
  case class FieldA(a: A) extends PatchedField
  case class FieldB(b: B) extends PatchedField
  ...
}

// removes stringiness and creates some type-level information
def parseKV(key: String, value: String): PatchedField = key match {
  case "A" => FieldA(A.fromString(value))
  case "B" => FieldB(B.fromString(value))
  ...
}

import io.scalaland.chimney.dsl._

def updatedFromString(m: MyClass, field: String, value: String) =
  parseKV(field, value) match {
    // performs exhaustivity check
    case fieldA: FieldA => m.patchUsing(fieldA)
    case fieldB: FieldB => m.patchUsing(fieldB)
    ...
  }
If you don't like it... well, then you have to write your own macro, some very obscure Shapeless, and/or codegen:
there is no way to generate x.copy(y = z) without a macro; even if some library does it, it does it with a macro underneath. At best you could use some lens library, but AFAIK no lens library out of the box provides a Setter for a field by the singleton type of the field name (that is, without writing something like Lens[Type](_.field) explicitly) - that, I believe, would be doable with some Shapeless black magic mapping over LabelledGeneric
you might still need to convert one singleton type into another singleton type at compile time - I am not sure that is possible in Shapeless, so you might need to push it down to the value level, summoning a Witness and then .toUpperCase-ing it
you would have to make each field aware of the Type.fromString functionality - is it different for each field by its name, or by its type? If the latter, you could use a normal parser type class. If the former, that type class would have to be dependently typed on a singleton type with the field name. Either way, you would most likely have to define these type classes yourself
then you would have to combine all of that together
It could be easier if you did it at runtime (scanning classes for methods and fields) instead of at compile time... but then you would have no check that a conversion actually exists from a field's string to its value, and type erasure would kick in (Option[Int] and Option[String] being the same thing, null not being anything).
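The erasure point can be shown with a generic snippet (added here for illustration, unrelated to MyClass):

val xs: List[Option[Int]] = List(Some(1))
// the element types are erased at runtime, so a runtime check cannot tell these apart:
println(xs.isInstanceOf[List[Option[String]]]) // prints true (with an "unchecked" warning at compile time)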
With the compile-time approach you would have to at least define a type class per type, and then manually create the code that puts it all together. With some fast prototyping I arrived at:
import shapeless._
import shapeless.labelled._

trait StringParser[A] { def parse(string: String): A }
object StringParser {
  implicit val string: StringParser[String] = s => s
  implicit val int: StringParser[Int] = s => java.lang.Integer.parseInt(s)
  implicit val double: StringParser[Double] = s => java.lang.Double.parseDouble(s)
  // ...
}

trait Mapper[X] { def mapper(): Map[String, StringParser[_]] }
object Mapper {
  implicit val hnilMapper: Mapper[HNil] = () => Map.empty
  implicit def consMapper[K <: Symbol, H, Repr <: HList](
    implicit
    key: Witness.Aux[K],
    parser: StringParser[H],
    mapper: Mapper[Repr]
  ): Mapper[FieldType[K, H] :: Repr] = () => mapper.mapper() + (key.value.name -> (parser: StringParser[_]))
  implicit def hlistMapper[T, Repr <: HList](
    implicit gen: LabelledGeneric.Aux[T, Repr],
    mapper: Mapper[Repr]
  ): Mapper[T] = () => mapper.mapper()
  def apply[T](implicit mapper: Mapper[T]): Map[String, StringParser[_]] = mapper.mapper()
}

val mappers = Mapper[MyClass]
Which you could use like this:
(1) convert a field String into an actual field name
(2) extract a parser from the map using that field name
(3) pass the value to the parser (steps (1)-(3) are sketched below)
(4) use runtime reflection to simulate copy, or generate the copy calls using macros
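A rough sketch of steps (1)-(3), assuming the incoming field strings ("A", "B", ...) are simply the upper-cased MyClass field names (that mapping is an assumption, not something stated above):

def parseField(field: String, value: String): Option[Any] =
  mappers.get(field.toLowerCase).map(_.parse(value)) // (1) "A" -> "a", (2) look up its StringParser, (3) run it

// parseField("A", "42") yields Some(parsed value) when a StringParser for that field's type
// was in scope while Mapper[MyClass] was being summoned; wiring the result back into MyClass
// (step (4)) still needs reflection or a macro, as discussed below.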
The last part simply cannot be done "magically" - as far as I am aware, there is no library where you would require an implicit Lens[Type, fieldName] and obtain Lens[Type, fieldName] { type Input; def setter(input: Input): Type => Type }, so there is nothing which would generate that .copy for you. As a result it would require some form of manually written reflection.
If you want to have compile-time safety at this step, you might as well do the rest compile-time safe as well and implement everything as a macro which verifies the presence of the right typeclasses and things.

Scala implicit search of covariant type class replaces the type argument with Nothing. Why?

Let's use a real world example. A string parser type class whose implicit instances are created by a function that delegates the creation to a factory.
import scala.reflect.runtime.universe.TypeTag

object Test {
  trait Parser[+T] { def parse(input: String): T }

  implicit def summonParserOf[T](implicit factory: ParserFactory[T]): Parser[T] = factory.build

  trait ParserFactory[T] { def build: Parser[T] }

  implicit def summonFactoryOfParsersOf[T](implicit t: TypeTag[T]): ParserFactory[T] =
    new ParserFactory[T] {
      def build: Parser[T] = new Parser[T] {
        def parse(input: String) = {
          println("T = " + t.tpe) // this outputs "T = Int" if Parser is non-variant, and "T = Nothing" if Parser is covariant on T. Why?
          null.asInstanceOf[T]
        }
      }
    }

  def main(args: Array[String]): Unit = {
    val parserOfInt = implicitly[Parser[Int]]
    parserOfInt.parse("")
  }
}
The type parameter T received by the factory is Int when Parser is non-variant, and Nothing when it is covariant. Why?
Edit 1:
The factory is not necessary. The replacement occurs before. So the test can be reduced to:
package jsfacile.test

import scala.reflect.runtime.universe.TypeTag

object Probando {
  trait Parser[+T] { def parse(input: String): T }

  implicit def summonParserOf[T](implicit t: TypeTag[T]): Parser[T] = new Parser[T] {
    def parse(input: String): T = {
      println("summon parser: T = " + t.tpe) // this outputs "T = Int" if Parser is non-variant, and "T = Nothing" if Parser is covariant on T. Why?
      null.asInstanceOf[T]
    }
  }

  def main(args: Array[String]): Unit = {
    val parserOfInt = implicitly[Parser[Int]]
    parserOfInt.parse("")
  }
}
Parser[Nothing] is assignable to Parser[Int], but what is the purpose of choosing the lower bound instead of the upper one?
Edit 2: The answer given by @Dmytro Mitin and the useful comments below, translated into my own words and limited scope of thinking, for future reference to myself.
What stopped me from understanding was the wrong idea that, when the implicit value provider is a def with a parameterized result type, there is no set of existing values from which the compiler has to pick one. In that case, I thought, it just skips that step (the one that chooses the value with the most specific declared type).
And given that the summoner function grants the compiler the power to build a value of any type, why not fill the implicit parameter with a value that makes it happy? If the implicit parameter demands something assignable to a type T, then give it a value of type T. Giving it Nothing, which is assignable to everything, would be neither nice nor useful.
The problem with that idea arises when there is more than one summoner providing values assignable to the implicit parameter's type. In that case, the only consistent way to decide which summoner to choose is to deduce the set of types of the values they produce, pick a type from that set based on an established criterion (the most specific, for instance), and choose the summoner that produces it.
Scala spec says
If there are several eligible arguments which match the implicit parameter's type, a most specific one will be chosen using the rules of static overloading resolution
https://scala-lang.org/files/archive/spec/2.11/07-implicits.html#implicit-parameters
Since you defined instances like
implicit def summonParserOf[T](implicit t: TypeTag[T]): Parser[T] = ...
for covariant
trait Parser[+T] { ... }
when you look for implicitly[Parser[T]], all summonParserOf[S] (S <: T) are eligible candidates, so the compiler selects the most specific one.
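As a side note, a sketch (not from the original answer) of one way to avoid the issue, assuming declaration-site covariance is not strictly needed: keep Parser invariant, so only summonParserOf[Int] can satisfy Parser[Int], and widen explicitly where covariant reuse is required.

import scala.reflect.runtime.universe.TypeTag

object InvariantVariant {
  trait Parser[T] { def parse(input: String): T } // invariant on purpose

  implicit def summonParserOf[T](implicit t: TypeTag[T]): Parser[T] = new Parser[T] {
    def parse(input: String): T = {
      println("summon parser: T = " + t.tpe) // prints "T = Int" for implicitly[Parser[Int]]
      null.asInstanceOf[T]
    }
  }

  // widen explicitly where a Parser[T] must be used as a Parser[U] for some U >: T
  def widen[T, U >: T](p: Parser[T]): Parser[U] = new Parser[U] {
    def parse(input: String): U = p.parse(input)
  }

  def main(args: Array[String]): Unit = implicitly[Parser[Int]].parse("")
}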

Scala: `ambiguous implicit values` but the right value is not even found

I am writing a small Scala program which should:
Read a file (line by line) from a local FS
Parse from each line three double values
Make instances of a case class based on those three values
Pass those instances to a Binary Heap
To be able to parse Strings into both Doubles and CoordinatePoints, I've come up with this trait:
trait Parseable[T] {
def parse(input: String): Either[String, T]
}
and I have a number of type class instance implementations of it for the latter:
object Parseable {
  implicit val parseDouble: Parseable[Double] = new Parseable[Double] {
    override def parse(input: String): Either[String, Double] = {
      val simplifiedInput = input.replaceAll("[ \\n]", "").toLowerCase
      try Right(simplifiedInput.toDouble) catch {
        case _: NumberFormatException =>
          Left(input)
      }
    }
  }

  implicit val parseInt: Parseable[Int] = new Parseable[Int] {
    override def parse(input: String): Either[String, Int] = {
      val simplifiedInput = input.replaceAll("[ \\n]", "").toLowerCase
      try Right(simplifiedInput.toInt) catch {
        case _: NumberFormatException =>
          Left(input)
      }
    }
  }

  implicit val parseCoordinatePoint: Parseable[CoordinatePoint] = new Parseable[CoordinatePoint] {
    override def parse(input: String): Either[String, CoordinatePoint] = {
      val simplifiedInput = input.replaceAll("[ \\n]", "").toLowerCase
      val unparsedPoints: List[String] = simplifiedInput.split(",").toList
      val eithers: List[Either[String, Double]] = unparsedPoints.map(parseDouble.parse)
      val sequence: Either[String, List[Double]] = eithers.sequence // .sequence comes from cats syntax (import cats.implicits._)
      sequence match {
        case Left(value) => Left(value)
        case Right(doublePoints) => Right(CoordinatePoint(doublePoints.head, doublePoints(1), doublePoints(2)))
      }
    }
  }
}
I have a common object that delegates the call to a corresponding implicit Parseable (in the same file):
object InputParser {
def parse[T](input: String)(implicit p: Parseable[T]): Either[String, T] = p.parse(input)
}
and just for reference - this is the CoordinatePoint case class:
case class CoordinatePoint(x: Double, y: Double, z: Double)
In my main program (after having validated that the file is there, and is not empty, etc..) I want to transform each line into an instance of CoordinatePoint as follows:
import Parseable._
import CoordinatePoint._
...
private val bufferedReader = new BufferedReader(new FileReader(fileName))
private val streamOfMaybeCoordinatePoints: Stream[Either[String, CoordinatePoint]] = Stream
.continually(bufferedReader.readLine())
.takeWhile(_ != null)
.map(InputParser.parse(_))
and the error I get is this:
[error] /home/vgorcinschi/data/eclipseProjects/Algorithms/Chapter 2 Sorting/algorithms2_1/src/main/scala/ca/vgorcinschi/algorithms2_4/selectionfilter/SelectionFilter.scala:42:27: ambiguous implicit values:
[error] both value parseDouble in object Parseable of type => ca.vgorcinschi.algorithms2_4.selectionfilter.Parseable[Double]
[error] and value parseInt in object Parseable of type => ca.vgorcinschi.algorithms2_4.selectionfilter.Parseable[Int]
[error] match expected type ca.vgorcinschi.algorithms2_4.selectionfilter.Parseable[T]
[error] .map(InputParser.parse(_))
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 1 s, completed Sep 1, 2020 10:38:18 PM
I don't understand, nor know where to look to find out, why the compiler is finding Parseable[Int] and Parseable[Double] but not the only right one - Parseable[CoordinatePoint].
So I thought, ok, let me give the compiler a hand by specifying the transformation function beforehand:
private val bufferedReader = new BufferedReader(new FileReader(fileName))
val stringTransformer: String => Either[String, CoordinatePoint] = s => InputParser.parse(s)
private val streamOfMaybeCoordinatePoints: Stream[Either[String, CoordinatePoint]] = Stream
.continually(bufferedReader.readLine())
.takeWhile(_ != null)
.map(stringTransformer)
Alas, this yields the same error, just a bit higher up in the code - in the function declaration.
I would love to learn what it is that causes this behavior, both to rectify the code and for personal knowledge. At this point I am very curious.
One fix is to specify the type parameter explicitly:
InputParser.parse[CoordinatePoint](_)
Another is to prioritize implicits. For example
trait LowPriorityParseable1 {
  implicit val parseInt: Parseable[Int] = ...
}

trait LowPriorityParseable extends LowPriorityParseable1 {
  implicit val parseDouble: Parseable[Double] = ...
}

object Parseable extends LowPriorityParseable {
  implicit val parseCoordinatePoint: Parseable[CoordinatePoint] = ...
}
By the way, since you put implicits into the companion object it doesn't make much sense now to import them.
In the call site of
object InputParser {
def parse[T](input: String)(implicit p: Parseable[T]): Either[String, T] = p.parse(input)
}
the type parameter T is not inferred (if not specified explicitly) before the implicit is resolved (type inference and implicit resolution impact each other). Otherwise the following code wouldn't compile:
trait TC[A]
object TC {
implicit val theOnlyImplicit: TC[Int] = null
}
def materializeTC[A]()(implicit tc: TC[A]): TC[A] = tc
materializeTC() // compiles, A is inferred as Int
So during implicit resolution the compiler tries not to infer types too early (otherwise, in the example with TC, the type A would be inferred as Nothing and the implicit wouldn't be found). By the way, an exception is implicit conversions, where the compiler tries to infer types eagerly (sometimes this can cause trouble too):
// try to infer implicit parameters immediately in order to:
// 1) guide type inference for implicit views
// 2) discard ineligible views right away instead of risking spurious ambiguous implicits
https://github.com/scala/scala/blob/2.13.x/src/compiler/scala/tools/nsc/typechecker/Implicits.scala#L842-L854
The problem is that the compiler does not infer and fix the type parameter T in .map(InputParser.parse(_)) before trying to find the implicit in the second parameter list.
In the compiler there is a concrete algorithm that infers types, with its own logic, constraints, and tradeoffs. In the concrete compiler version that you use, it first goes through the parameter lists, inferring and checking types list by list, and only at the end does it infer the type parameter from the return type (I do not imply that other versions differ; I only point out that this is implementation behavior, not a fundamental constraint).
More precisely, what is going on is that the type parameter T is not inferred or otherwise fixed at the point where the second parameter list is typechecked. At that point T is existential, it could be any/every type, and there are 3 different implicit objects suitable for such a type.
It is just how the compiler and its type inference works for now.
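For completeness, this is roughly how the first fix looks when applied to the original pipeline:

private val streamOfMaybeCoordinatePoints: Stream[Either[String, CoordinatePoint]] = Stream
  .continually(bufferedReader.readLine())
  .takeWhile(_ != null)
  .map(InputParser.parse[CoordinatePoint](_))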

Why doesn't scala compiler infer type parameter from superclass?

I want to understand why the Scala compiler cannot infer a type parameter passed to a superclass, so that I can come up with a workaround. Workaround suggestions are also very welcome! Here's a contrived example of what I'm stuck on (the comments in the code explain the issues):
Code is also in a scala fiddle.
/** A Svc is a function that responds to requests
  * @tparam Req[_] a request ADT whose instances specify their response type
  */
trait Svc[Req[_]] {
  def apply[Resp](req: Req[Resp]): Resp
}

/** Service request ADT */
sealed trait MyReq[_]
// two requests have the same response type of String (i.e. MyReq[String]):
case class GetString(id: String) extends MyReq[String]
case class GetAltString(id: String) extends MyReq[String]
// this one is the only MyReq[Int]
case class GetInt(id: String) extends MyReq[Int]

/** Type class for marshalling a response for a concrete request type.
  * This lets us handle marshalling differently for different requests
  * that have the same response type (such as GetString and GetAltString above).
  *
  * @tparam ReqImpl concrete MyReq type. This is required to enforce a unique marshaller
  * per request when there are multiple request types with the same response type.
  */
trait ReqMarshaller[ReqImpl <: MyReq[Resp], Resp] {
  def marshal(r: Resp): String
}

class MySvc extends Svc[MyReq] {
  // this apply function compiles and works just fine.
  override def apply[Resp](req: MyReq[Resp]): Resp = req match {
    case GetString(id) => id
    case GetAltString(id) => id + id
    case GetInt(id) => id.length
  }

  // This is the problem. I want to specify the request is a subclass so
  // we get the specific marshaller for the request type and avoid
  // ambiguous implicit errors.
  // However, the Resp type parameter is always inferred as Nothing
  // instead of the correct response type.
  def marshal[ReqImpl <: MyReq[Resp], Resp](req: ReqImpl)(
    implicit
    marshaller: ReqMarshaller[ReqImpl, Resp]
  ): String = marshaller.marshal(apply(req))

  // this method is just here to show that it won't work as a solution
  // because it doesn't work when there are multiple request types with
  // the same response type (causes ambiguous implicits errors)
  def marshalGeneric[Resp](req: MyReq[Resp])(
    implicit
    marshaller: ReqMarshaller[_ <: MyReq[Resp], Resp]
  ): String = marshaller.marshal(apply(req))
}

implicit val getIntMarshaller: ReqMarshaller[GetInt, Int] = new ReqMarshaller[GetInt, Int] {
  def marshal(i: Int): String = (i * i).toString
}

implicit val getStrMarshaller: ReqMarshaller[GetString, String] = new ReqMarshaller[GetString, String] {
  def marshal(s: String): String = s
}

implicit val getAltStrMarshaller: ReqMarshaller[GetAltString, String] = new ReqMarshaller[GetAltString, String] {
  def marshal(s: String): String = s + s
}

val svc = new MySvc

val myLength = svc(GetInt("me")) // 2
println(s"myLength: $myLength")

svc.marshalGeneric(GetInt("me")) // compiles and works
//svc.marshal(GetInt("me")) // fails to compile due to inferring the Resp type as Nothing
//svc.marshalGeneric(GetAltString("me")) // fails to compile because of ambiguous implicits
The problem is that Scala is trying to infer both ReqImpl and Resp type parameters at once instead of inferring ReqImpl first and getting Resp from that. Since Resp doesn't actually appear in the parameter list, it's inferred to Nothing and then Scala notices type bounds are violated. A workaround (I don't remember where I saw it first) is to give an equivalent type to req but one which depends on Resp explicitly:
def marshal[ReqImpl <: MyReq[Resp], Resp](req: ReqImpl with MyReq[Resp])(
implicit marshaller: ReqMarshaller[ReqImpl, Resp]
): String = marshaller.marshal(apply(req))
svc.marshal(GetInt("me")) now compiles.
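For what it's worth, with that signature the two String-response requests should no longer be ambiguous either, since ReqImpl is now pinned to the concrete case class (an expectation, not verified against the fiddle):

svc.marshal(GetInt("me"))       // ReqImpl = GetInt, Resp = Int -> getIntMarshaller
svc.marshal(GetAltString("me")) // ReqImpl = GetAltString, Resp = String -> getAltStrMarshaller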
I think you will need to capture the relationship between the type parameter of Req and the type parameter of your apply function in your Svc trait. Then you can modify the rest accordingly:
trait Svc[Req[_ <: XX], XX] {
def apply[Resp <: XX](req: Req[Resp]): Resp
}
One way of doing this is to explicitly mention that your ReqImpl is a parametrised type (the type that was being inferred as Nothing in Scala). In your case it will look like this:
def marshal[ReqImpl[Resp] <: MyReq[Resp], Resp](req: ReqImpl[Resp])(
implicit
marshaller: ReqMarshaller[ReqImpl[Resp], Resp]
): String = marshaller.marshal(apply(req))
But there are two problems with this approach:
(1) In svc.marshal(GetInt("me")) Scala will infer the type of ReqImpl as MyReq[Int], which kind of makes sense, but then ReqMarshaller[GetInt, Int] won't match. So you need to define it as:
implicit val getIntMarshaller = new ReqMarshaller[MyReq[Int], Int] {
def marshal(i: Int): String = (i * i).toString
}
(2) Now you immediately have another problem: you can't define two ReqMarshaller[MyReq[String], String] instances at the same time. And maybe it's a bad idea to define two endpoints with the same type parameter (just a guess, but something doesn't fit here; it doesn't work with Alexey Romanov's solution either).
UPDATE
(1) is solved by making ReqMarshaller covariant:
trait ReqMarshaller[+ReqImpl <: MyReq[Resp], Resp] { ...
(2) still fails with ambiguous implicits.

Scala - How can I exclude my function's generic type until use?

I have a map of String to Functions which details all of the valid functions that are in a language. When I add a function to my map, I am required to specify the type (in this case Int).
var functionMap: Map[String, (Nothing) => Any] = Map[String, (Nothing) => Any]()
functionMap += ("Neg" -> expr_neg[Int])

def expr_neg[T](value: T)(implicit n: Numeric[T]): T = {
  n.negate(value)
}
Instead, how can I do something like:
functionMap += ("Neg" -> expr_neg)
without the [Int] and add it in later on when I call:
(unaryFunctionMap.get("abs").get)[Int](-45)
You're trying to build your function using type classes (in this case, Numeric). Type classes rely on implicit parameters. Implicits are resolved at compile time. Your function name string values are only known at runtime, therefore you shouldn't build your solution on top of type classes like this.
An alternative would be to store a separate function object in your map for each parameter type. You could store the parameter type with a TypeTag:
import scala.reflect.runtime.universe._

var functionMap: Map[(String, TypeTag[_]), (Nothing) => Any] = Map()

def addFn[T: TypeTag](name: String, f: T => Any) =
  functionMap += ((name, typeTag[T]) -> f)

def callFn[T: TypeTag](name: String, value: T): Any =
  functionMap((name, typeTag[T])).asInstanceOf[T => Any](value)

addFn[Int]("Neg", expr_neg)
addFn[Long]("Neg", expr_neg)
addFn[Double]("Neg", expr_neg)

val neg10 = callFn("Neg", 10)
No type class implicit needs to be resolved to call callFn(), because the implicit Numeric was already resolved on the call to addFn.
What happens if we try to resolve the type class when the function is called?
The first problem is that a Function1 (or Function2) can't have implicit parameters. Only a method can. (See this other question for more explanation.) So if you want something that acts like a Function1 but takes an implicit parameter, you'll need to create your own type that defines the apply() method. It has to be a different type from Function1, though.
Now we get to the main problem: all implicits must be able to be resolved at compile time. At the location in code where the method is run, all the type information needed to choose the implicit value needs to be available. In the following code example:
unaryFunctionMap("abs")(-45)
We don't really need to specify that our value type is Int, because it can be inferred from the value -45 itself. But the fact that our method uses a Numeric implicit value can't be inferred from anything in that line of code. We need to specify the use of Numeric somewhere at compile time.
If you can have a separate map for unary functions that take a numeric value, this is (relatively) easy:
trait UnaryNumericFn {
  def apply[T](value: T)(implicit n: Numeric[T]): Any
}

var unaryNumericFnMap: Map[String, UnaryNumericFn] = Map()

object expr_neg extends UnaryNumericFn {
  override def apply[T](value: T)(implicit n: Numeric[T]): T = n.negate(value)
}

unaryNumericFnMap += ("Neg" -> expr_neg)

val neg3 = unaryNumericFnMap("Neg")(3)
You can make the function trait generic on the type class it requires, letting your map hold unary functions that use different type classes. This requires a cast internally, and moves the specification of Numeric to where the function is finally called:
trait UnaryFn[-E[X]] {
  def apply[T](value: T)(implicit ev: E[T]): Any
}

object expr_neg extends UnaryFn[Numeric] {
  override def apply[T](value: T)(implicit n: Numeric[T]): T = n.negate(value)
}

var privateMap: Map[String, UnaryFn[Nothing]] = Map()

def putUnary[E[X]](key: String, value: UnaryFn[E]): Unit =
  privateMap += (key -> value)

def getUnary[E[X]](key: String): UnaryFn[E] =
  privateMap(key).asInstanceOf[UnaryFn[E]]

putUnary("Neg", expr_neg)

val pos5 = getUnary[Numeric]("Neg")(-5)
But you still have to specify Numeric somewhere.
Also, neither of these solutions, as written, support functions that don't need type classes. Being forced to be this explicit about which functions take implicit parameters, and what kinds of implicits they use, starts to defeat the purpose of using implicits in the first place.
You can't, because expr_neg is a method with a type parameter T and an implicit argument n depending on that parameter. For Scala to lift that method to a function, it needs to capture the implicit, and therefore it must know what kind of type you want.
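To illustrate that last point (a sketch added for clarity, not from the original thread): eta-expanding expr_neg forces the Numeric instance to be resolved at the expansion site, so T must already be known there.

val negInt: Int => Int = expr_neg[Int] // fine: Numeric[Int] is resolved and captured here
// val negAny = expr_neg _             // does not compile: T stays undetermined, so no Numeric[T] can be found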