I need to implement cats Show instances for an enum that expresses the basic arithmetic operators.
enum Expr[T]:
  case Plus(left: Expr[Double], right: Expr[Double]) extends Expr[Double]
  case Minus(left: Expr[Double], right: Expr[Double]) extends Expr[Double]
  case Num(value: Double) extends Expr[Double]

object Expr {
  def eval[T](expr: Expr[T]): T =
    expr match {
      case Plus(left, right) => eval(left) + eval(right)
      case Minus(left, right) => eval(left) - eval(right)
      case Num(value) => value
    }
}
After reading the docs, I tried to implement it as follows:
object ExprShow {
  implicit val numShow: Show[Expr.Num] = Show.show(
    num => num.value.toString
  )

  implicit val minusShow: Show[Expr.Minus] = Show.show(
    minus => show"${minus.left} - ${minus.right}"
  )

  implicit val plusShow: Show[Expr.Plus] = Show.show(
    plus => show"${plus.left} + ${plus.right}"
  )
}
But I get the following errors when I try to call the show method:
val test = Expr.Num(3.0)
test.show
[error] -- [E007] Type Mismatch Error:
[error] 69 | minus => show"${minus.left} - ${minus.right}"
[error] | ^^^^^^^^^^
[error] | Found: (minus.left : grox.Expr[Double])
[error] | Required: cats.Show.Shown
[error] Explanation
[error] ===========
[error]
[error] Tree: minus.left
[error]
[error] I tried to show that
[error] (minus.left : grox.Expr[Double])
[error] conforms to
[error] cats.Show.Shown
[error] but the comparison trace ended with `false`:
[error]
[error] ==> (minus.left : grox.Expr[Double]) <: cats.Show.Shown
[error] ==> (minus.left : grox.Expr[Double]) <: cats.Show.Shown (recurring)
[error] ==> grox.Expr[Double] <: cats.Show.Shown (left is approximated)
[error] ==> grox.Expr[Double] <: cats.Show.Shown (recurring)
[error] <== grox.Expr[Double] <: cats.Show.Shown (recurring) = false
[error] <== grox.Expr[Double] <: cats.Show.Shown (left is approximated) = false
[error] <== (minus.left : grox.Expr[Double]) <: cats.Show.Shown (recurring) = false
[error] <== (minus.left : grox.Expr[Double]) <: cats.Show.Shown = false
[error]
[error] The tests were made under a constraint with:
[error] uninstantiated variables: A
[error] constrained types: [A](f: A => String): cats.Show[A],
[error] [A](f: A => String): cats.Show[A]
...
----------
[error] -- [E008] Not Found Error:
[error] 15 | test.show
[error] | ^^^^^^^^^
[error] | value show is not a member of grox.Expr[Double]
Is there a best-practice approach to implementing cats Show for an enum? What is the root cause of my issue? Any suggestions or doc recommendations would be appreciated. Thanks a bunch.
implicit val minusShow: Show[Expr.Minus] = Show.show(
  minus => show"${minus.left} - ${minus.right}"
)
The type of minus.left is Expr[Double], so you need to define a Show[Expr[Double]] instance first; otherwise the show interpolator cannot find the implicit it needs. Note also that Expr.Num(3.0) is typed as Expr[Double] (an enum case's apply method returns the enum type), which is why test.show reports that show is not a member of grox.Expr[Double].
Here is the solution:
implicit def exprShow[T]: Show[Expr[T]] = Show.show(
  num => Expr.eval(num).toString
)
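If you want the rendered expression rather than its evaluated result, a single recursive instance also works. Here is a minimal self-contained sketch, with a hand-rolled Show trait standing in for cats.Show so the snippet compiles on its own (render and demo are illustrative names, not from the original):

```scala
trait Show[A] { def show(a: A): String }

enum Expr[T]:
  case Plus(left: Expr[Double], right: Expr[Double]) extends Expr[Double]
  case Minus(left: Expr[Double], right: Expr[Double]) extends Expr[Double]
  case Num(value: Double) extends Expr[Double]

// One recursive renderer covers every case, so nested Expr values resolve too.
def render(e: Expr[?]): String = e match
  case Expr.Plus(l, r)  => s"${render(l)} + ${render(r)}"
  case Expr.Minus(l, r) => s"${render(l)} - ${render(r)}"
  case Expr.Num(v)      => v.toString

given exprShow[T]: Show[Expr[T]] = (e: Expr[T]) => render(e)

@main def demo(): Unit =
  val expr = Expr.Minus(Expr.Plus(Expr.Num(1.0), Expr.Num(2.0)), Expr.Num(3.0))
  println(summon[Show[Expr[Double]]].show(expr)) // 1.0 + 2.0 - 3.0
```

The key point is the same as the accepted answer's: one instance for Expr[T] (not one per case) lets the recursion resolve the implicit for the Expr-typed fields.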
Related
When I use a Scala 3 Mirror to generate a list of typeclass instances, an exception occurs. I know this is a hard JVM limit on method size, but how can I circumvent the issue?
PS: when I delete some fields of the Data class it works, but is there any other solution?
info
sbt: 1.6.0
scala: 3.1.0
error
scala.tools.asm.MethodTooLargeException: Method too large: parse/Main$.<clinit> ()V while compiling
stack trace
[error] scala.tools.asm.MethodTooLargeException: Method too large: parse/Main$.<clinit> ()V
[error] scala.tools.asm.MethodWriter.computeMethodInfoSize(MethodWriter.java:2087)
[error] scala.tools.asm.ClassWriter.toByteArray(ClassWriter.java:489)
[error] dotty.tools.backend.jvm.GenBCodePipeline$Worker2.getByteArray$1(GenBCode.scala:478)
[error] dotty.tools.backend.jvm.GenBCodePipeline$Worker2.addToQ3(GenBCode.scala:484)
[error] dotty.tools.backend.jvm.GenBCodePipeline$Worker2.run(GenBCode.scala:461)
[error] dotty.tools.backend.jvm.GenBCodePipeline.buildAndSendToDisk(GenBCode.scala:562)
[error] dotty.tools.backend.jvm.GenBCodePipeline.run(GenBCode.scala:525)
[error] dotty.tools.backend.jvm.GenBCode.run(GenBCode.scala:63)
[error] dotty.tools.dotc.core.Phases$Phase.runOn$$anonfun$1(Phases.scala:308)
[error] scala.collection.immutable.List.map(List.scala:246)
[error] dotty.tools.dotc.core.Phases$Phase.runOn(Phases.scala:309)
[error] dotty.tools.backend.jvm.GenBCode.runOn(GenBCode.scala:71)
[error] dotty.tools.dotc.Run.runPhases$4$$anonfun$4(Run.scala:261)
[error] scala.runtime.function.JProcedure1.apply(JProcedure1.java:15)
[error] scala.runtime.function.JProcedure1.apply(JProcedure1.java:10)
[error] scala.collection.ArrayOps$.foreach$extension(ArrayOps.scala:1323)
[error] dotty.tools.dotc.Run.runPhases$5(Run.scala:272)
[error] dotty.tools.dotc.Run.compileUnits$$anonfun$1(Run.scala:280)
[error] scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
[error] dotty.tools.dotc.util.Stats$.maybeMonitored(Stats.scala:68)
[error] dotty.tools.dotc.Run.compileUnits(Run.scala:289)
[error] dotty.tools.dotc.Run.compileSources(Run.scala:222)
[error] dotty.tools.dotc.Run.compile(Run.scala:206)
[error] dotty.tools.dotc.Driver.doCompile(Driver.scala:39)
[error] dotty.tools.xsbt.CompilerBridgeDriver.run(CompilerBridgeDriver.java:88)
[error] dotty.tools.xsbt.CompilerBridge.run(CompilerBridge.java:22)
[error] sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:91)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$7(MixedAnalyzingCompiler.scala:192)
[error] scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:247)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4(MixedAnalyzingCompiler.scala:182)
[error] sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$4$adapted(MixedAnalyzingCompiler.scala:163)
[error] sbt.internal.inc.JarUtils$.withPreviousJar(JarUtils.scala:239)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:163)
[error] sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:210)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:528)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:528)
[error] sbt.internal.inc.Incremental$.$anonfun$apply$5(Incremental.scala:177)
[error] sbt.internal.inc.Incremental$.$anonfun$apply$5$adapted(Incremental.scala:175)
[error] sbt.internal.inc.Incremental$$anon$2.run(Incremental.scala:461)
[error] sbt.internal.inc.IncrementalCommon$CycleState.next(IncrementalCommon.scala:116)
[error] sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:56)
[error] sbt.internal.inc.IncrementalCommon$$anon$1.next(IncrementalCommon.scala:52)
[error] sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:263)
[error] sbt.internal.inc.Incremental$.$anonfun$incrementalCompile$8(Incremental.scala:416)
[error] sbt.internal.inc.Incremental$.withClassfileManager(Incremental.scala:503)
[error] sbt.internal.inc.Incremental$.incrementalCompile(Incremental.scala:403)
[error] sbt.internal.inc.Incremental$.apply(Incremental.scala:169)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:528)
[error] sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:482)
[error] sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:332)
[error] sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:420)
[error] sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:137)
[error] sbt.Defaults$.compileIncrementalTaskImpl(Defaults.scala:2366)
[error] sbt.Defaults$.$anonfun$compileIncrementalTask$2(Defaults.scala:2316)
[error] sbt.internal.server.BspCompileTask$.$anonfun$compute$1(BspCompileTask.scala:30)
[error] sbt.internal.io.Retry$.apply(Retry.scala:46)
[error] sbt.internal.io.Retry$.apply(Retry.scala:28)
[error] sbt.internal.io.Retry$.apply(Retry.scala:23)
[error] sbt.internal.server.BspCompileTask$.compute(BspCompileTask.scala:30)
[error] sbt.Defaults$.$anonfun$compileIncrementalTask$1(Defaults.scala:2314)
[error] scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error] sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error] sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] sbt.Execute.work(Execute.scala:291)
[error] sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error] sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] java.lang.Thread.run(Thread.java:748)
[error]
[error] stack trace is suppressed; run last Compile / compileIncremental for the full output
[error] (Compile / compileIncremental) scala.tools.asm.MethodTooLargeException: Method too large: parse/Main$.<clinit> ()V
[error] Total time: 2 s, completed 2022-1-5 14:11:19
code
import scala.compiletime.*
import scala.deriving.Mirror

object Main extends App {

  trait FromString[A] {
    def convert(str: String): A
  }

  object FromString {
    given FromString[Int] = (str) => str.toInt
    given FromString[Double] = (str) => str.toDouble
    given FromString[String] = (str) => str
  }

  inline def getTypeclassInstances[F[_], A <: Tuple]: List[F[Any]] =
    inline erasedValue[A] match {
      case _: EmptyTuple => Nil
      case _: (head *: tail) =>
        val headTypeClass =
          summonInline[F[head]]
        val tailTypeClasses =
          getTypeclassInstances[F, tail]
        headTypeClass.asInstanceOf[F[Any]] :: getTypeclassInstances[F, tail]
    }

  inline def summonInstancesHelper[F[_], A](using
    m: Mirror.Of[A]
  ): List[F[Any]] =
    getTypeclassInstances[F, m.MirroredElemTypes]

  case class Data(
    ip: String,
    method: String,
    uri: String,
    protocal: String,
    httpStatus: Int,
    byteSent: Double,
    reqLength: Double,
    reqTime: Double,
    respTime: Double,
    referer: String,
    device: String
  )

  val types =
    summonInstancesHelper[FromString, Data]

  println(types.mkString("\r\n"))
}
Thanks to @bishabosha, this has been figured out.
Reference: https://github.com/lampepfl/dotty/issues/14213
Hope it helps others.
inline def getTypeclassInstances[F[_], A <: Tuple]: List[F[Any]] =
  inline erasedValue[A] match {
    case _: EmptyTuple => Nil
    case _: (head *: tail) =>
      val headTypeClass =
        summonInline[F[head]]
      val tailTypeClasses =
        getTypeclassInstances[F, tail]
-     headTypeClass.asInstanceOf[F[Any]] :: getTypeclassInstances[F, tail]
+     headTypeClass.asInstanceOf[F[Any]] :: tailTypeClasses
  }
In general, though, this could still crash with a very large case class; perhaps you can change headTypeClass and tailTypeClasses to def instead of val.
inline def getTypeclassInstances[F[_], A <: Tuple]: List[F[Any]] =
  inline erasedValue[A] match {
    case _: EmptyTuple => Nil
    case _: (head *: tail) =>
      // use def rather than val
      def headTypeClass =
        summonInline[F[head]]
      def tailTypeClasses =
        getTypeclassInstances[F, tail]
      headTypeClass.asInstanceOf[F[Any]] :: tailTypeClasses
  }
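As a sanity check, here is a minimal self-contained version of the def-based pattern. The two-field Row case class and its FromString givens are illustrative stand-ins for the large Data class, not from the original:

```scala
import scala.compiletime.{erasedValue, summonInline}
import scala.deriving.Mirror

trait FromString[A] { def convert(str: String): A }
object FromString {
  given FromString[Int] = (s: String) => s.toInt
  given FromString[String] = (s: String) => s
}

inline def getTypeclassInstances[F[_], A <: Tuple]: List[F[Any]] =
  inline erasedValue[A] match {
    case _: EmptyTuple => Nil
    case _: (head *: tail) =>
      // def instead of val keeps the generated <clinit> body small
      def headTypeClass = summonInline[F[head]]
      def tailTypeClasses = getTypeclassInstances[F, tail]
      headTypeClass.asInstanceOf[F[Any]] :: tailTypeClasses
  }

inline def summonInstancesHelper[F[_], A](using m: Mirror.Of[A]): List[F[Any]] =
  getTypeclassInstances[F, m.MirroredElemTypes]

case class Row(name: String, id: Int)

@main def check(): Unit =
  val instances = summonInstancesHelper[FromString, Row]
  assert(instances.size == 2)
```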
I have the following code snippet:
def determineProducerType(keySerializer: KkSerializer)(valueSerializer: KkSerializer)(props: Properties)
    : Eval[KafkaProducer[java.lang.Object, java.lang.Object]] = (keySerializer, valueSerializer) match {
  case (KkStringSeDe, KkStringSeDe) => Later(new KafkaProducer[String, String](props))
  case (KkStringSeDe, KkByteArraySeDe) => Later(new KafkaProducer[String, Byte](props))
  case (KkStringSeDe, KkIntegerSeDe) => Later(new KafkaProducer[String, Integer](props))
  case (KkStringSeDe, KkLongSeDe) => Later(new KafkaProducer[String, Long](props))
}
The compiler complains:
[info] Compiling 2 Scala sources to /home/developer/Desktop/scala/PureProducer/target/scala-2.12/classes ...
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:113:48: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,String]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] Note: String <: Object, but Java-defined class KafkaProducer is invariant in type K.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] Note: String <: Object, but Java-defined class KafkaProducer is invariant in type V.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] case (KkStringSeDe, KkStringSeDe) => Later(new KafkaProducer[String, String](props))
[error] ^
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:114:51: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,Byte]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] case (KkStringSeDe, KkByteArraySeDe) => Later(new KafkaProducer[String, Byte](props))
[error] ^
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:115:49: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,Integer]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] Note: String <: Object, but Java-defined class KafkaProducer is invariant in type K.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] Note: Integer <: Object, but Java-defined class KafkaProducer is invariant in type V.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] case (KkStringSeDe, KkIntegerSeDe) => Later(new KafkaProducer[String, Integer](props))
[error] ^
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:116:46: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,Long]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] case (KkStringSeDe, KkLongSeDe) => Later(new KafkaProducer[String, Long](props))
[error] ^
[error] four errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed Nov 12, 2017 10:39:14 AM
What I am trying to do: I have defined a sum type:
sealed trait KkSerializer
case object KkStringSeDe extends KkSerializer
case object KkByteArraySeDe extends KkSerializer
case object KkIntegerSeDe extends KkSerializer
case object KkLongSeDe extends KkSerializer
When it matches the appropriate case, it should return the correspondingly typed producer.
Creating an instance of KafkaProducer looks like this:
val producer = new KafkaProducer[String, String](props)
How to solve it?
I think in this case you can just use path-dependent types to get what you want:
sealed trait KkSerializer { type Out }

case object KkStringSeDe extends KkSerializer {
  type Out = String
}

case object KkByteArraySeDe extends KkSerializer {
  type Out = Byte
}

def determineProducerType(k: KkSerializer)(v: KkSerializer)(props: Properties): Eval[KafkaProducer[k.Out, v.Out]] =
  Later(new KafkaProducer[k.Out, v.Out](props))
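Here is a self-contained sketch of the same idea, with a bare-bones Producer class standing in for KafkaProducer (so it compiles without the Kafka dependency) and the remaining serializer cases filled in as an illustration:

```scala
class Producer[K, V] // stand-in for KafkaProducer[K, V]

sealed trait KkSerializer { type Out }
case object KkStringSeDe extends KkSerializer { type Out = String }
case object KkByteArraySeDe extends KkSerializer { type Out = Byte }
case object KkIntegerSeDe extends KkSerializer { type Out = Integer }
case object KkLongSeDe extends KkSerializer { type Out = java.lang.Long }

// The result type is computed from the arguments' path-dependent Out members,
// so no pattern match (and no Object-typed producer) is needed.
def producerFor(k: KkSerializer)(v: KkSerializer): Producer[k.Out, v.Out] =
  new Producer[k.Out, v.Out]

@main def demoProducer(): Unit =
  // The compiler statically knows this is a Producer[String, Integer]:
  val p: Producer[String, Integer] = producerFor(KkStringSeDe)(KkIntegerSeDe)
  println(p.getClass.getSimpleName)
```

Because each case object is a singleton type, passing KkStringSeDe fixes k.Out = String at the call site, which is exactly the precision the Object-typed match was losing.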
Could you please tell me what is wrong with this Scala code?
package com.user.common

class Notification(message: String, next: Option[Notification]) {

  def write(): String = {
    message
  }

  def getAll(): Stream[Notification] = {
    next match {
      case Some(n) => Stream.cons(n, n.getAll())
      case None => Stream.empty
    }
  }
}

case class Email(msg: String)
  extends Notification(msg, None)

case class SMS(msg: String)
  extends Notification(msg, Option(Email))

case class VoiceRecording(msg: String)
  extends Notification(msg, Option(SMS))
The compiler errors are as follows:
[error] /common/Test.scala:15: type mismatch;
[error] found : Some[A]
[error] required: Option[com.user.common.Notification]
[error] case Some(n) => Stream.cons(n, n.getAll())
[error] ^
[error] /common/Test.scala:15: type mismatch;
[error] found : A
[error] required: com.user.common.Notification
[error] case Some(n) => Stream.cons(n, n.getAll())
[error] ^
[error] /common/Test.scala:15: value getAll is not a member of type parameter A
[error] case Some(n) => Stream.cons(n, n.getAll())
[error] ^
[error] /common/Test.scala:25: type mismatch;
[error] found : com.user.common.Email.type
[error] required: com.user.common.Notification
[error] extends Notification(msg, Option(Email))
[error] ^
[error] /common/Test.scala:28: type mismatch;
[error] found : com.user.common.SMS.type
[error] required: com.user.common.Notification
[error] extends Notification(msg, Option(SMS))
[error] ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed
I cannot understand the problem, and I am not sure how to restructure the code. My basic idea is to hold a reference from one case class to the next and iterate over them until I reach None, from the top-level case class down to the lowest one.
case class SMS(msg: String)
  extends Notification(msg, Option(Email))

case class VoiceRecording(msg: String)
  extends Notification(msg, Option(SMS))
In your second parameter, you are passing the companion object of the class (Email, SMS) wrapped in an Option, whereas an instance of the class is expected.
Maybe what you want is:
case class SMS(msg: String)
  extends Notification(msg, Option(Email(msg)))

case class VoiceRecording(msg: String)
  extends Notification(msg, Option(SMS(msg)))
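Putting the fix together, here is a self-contained corrected sketch. It uses LazyList instead of Stream (Stream is deprecated in Scala 2.13+), but the behavior is otherwise the same; demoChain is an illustrative name:

```scala
class Notification(message: String, next: Option[Notification]) {
  def write(): String = message

  // Walk the chain of linked notifications until None is reached.
  def getAll(): LazyList[Notification] = next match {
    case Some(n) => n #:: n.getAll()
    case None    => LazyList.empty
  }
}

case class Email(msg: String) extends Notification(msg, None)
case class SMS(msg: String) extends Notification(msg, Option(Email(msg)))
case class VoiceRecording(msg: String) extends Notification(msg, Option(SMS(msg)))

@main def demoChain(): Unit = {
  // VoiceRecording links to SMS, which links to Email, which ends the chain.
  val all = VoiceRecording("hello").getAll().map(_.write()).toList
  println(all) // List(hello, hello)
}
```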
As the title suggests, I am trying to create a Scala macro that generates a class definition. Basically I want to eventually do the following:
classify("StandardEvent", Map("name" -> String, "id" -> Int))
// should at compile-time expand to
case class StandardEvent(name: String, id: Int) extends Event
Is that even possible? And if so, could anyone point me in the right direction on how to do so? Actually, I cannot even get the following simple macro to work:
// Macro definition:
def classify(): Unit = macro classifyImpl

def classifyImpl(c: Context)(): c.Tree = {
  import c.universe._
  q"class SomeClass"
}

// Macro usage in a separate compilation unit:
classify()
val s = new SomeClass
This gets me the following error message:
[error] ClassifyTest.scala:6: not found: type SomeClass
[error] val s = new SomeClass
[error] ^
[trace] Stack trace suppressed: run last app/compile:compileIncremental for the full output.
[error] (app/compile:compileIncremental) java.lang.AssertionError: assertion failed:
[error] class SomeClass extends scala.AnyRef {
[error] def <init>() = {
[error] super.<init>();
[error] ()
[error] }
[error] }
[error] while compiling: ClassifyTest.scala
[error] during phase: typer
[error] library version: version 2.11.8
[error] compiler version: version 2.11.8
[error] reconstructed args: -bootclasspath // ...
[error]
[error] last tree to typer: type SomeClass
[error] tree position: line 5 of ClassifyTest.scala
[error] symbol: <none>
[error] symbol definition: <none> (a NoSymbol)
[error] symbol package: <none>
[error] symbol owners:
[error] call site: <none> in <none>
[error]
[error] == Source file context for tree position ==
[error]
[error] 2 object ClassifyTest extends App {
[error] 3
[error] 4 classify()
[error] 5 val s = new SomeClass
[error] 6
[error] 7 }
Can anyone make sense of this? Help is much appreciated.
I'm not sure whether this is a duplicate of Type Parameters on Scala Macro Annotations or not.
I'm trying to get the type parameter of a macro annotation:
class builder extends StaticAnnotation {
  def macroTransform(annottees: Any*) = macro builderMacro.impl
}

// ....
val q"new $_[$tpt]().macroTransform(..$_)" = c.macroApplication
val tpe = c.typecheck(tpt).tpe
// also tried
// val tpe = c.typecheck(q"None.asInstanceOf[$tpt]").tpe
Code that uses the macro:
object Test2 {
  trait TestBuilders

  @builder[TestBuilders]
  case class TestClass(x: Int, opt1: Option[String], opt2: Option[String]) {
    val opts = (opt1, opt2)
  }
}
and the exception I get:
[error] scala.reflect.macros.TypecheckException: not found: type TestBuilders
[error] at scala.reflect.macros.contexts.Typers$$anonfun$typecheck$2$$anonfun$apply$1.apply(Typers.scala:34)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$typecheck$2$$anonfun$apply$1.apply(Typers.scala:28)
[error] at scala.tools.nsc.typechecker.Contexts$Context.withMode(Contexts.scala:374)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$3.apply(Typers.scala:24)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$3.apply(Typers.scala:24)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$withContext$1$1.apply(Typers.scala:25)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$withContext$1$1.apply(Typers.scala:25)
[error] at scala.tools.nsc.typechecker.Contexts$Context.withMode(Contexts.scala:374)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$1.apply(Typers.scala:23)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$1.apply(Typers.scala:23)
[error] at scala.reflect.macros.contexts.Typers$class.withContext$1(Typers.scala:25)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$typecheck$2.apply(Typers.scala:28)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$typecheck$2.apply(Typers.scala:28)
[error] at scala.reflect.internal.Trees$class.wrappingIntoTerm(Trees.scala:1691)
[error] at scala.reflect.internal.SymbolTable.wrappingIntoTerm(SymbolTable.scala:16)
[error] at scala.reflect.macros.contexts.Typers$class.withWrapping$1(Typers.scala:26)
[error] at scala.reflect.macros.contexts.Typers$class.typecheck(Typers.scala:28)
[error] at scala.reflect.macros.contexts.Context.typecheck(Context.scala:6)
[error] at scala.reflect.macros.contexts.Context.typecheck(Context.scala:6)
[error] at builderMacro$.impl(Macros.scala:55)
What am I doing wrong?
This is a known issue in current macro paradise: https://github.com/scalamacros/paradise/issues/14. Note that if TestBuilders is declared in a different scope, everything should work out.