UUID Path Bindable - Play Framework - scala

In my build.sbt I have
routesImport += "play.api.mvc.PathBindable.bindableUUID"
And in my routes I have:
GET /groups/:id controllers.GroupController.get(id)
And in my controller I have
class GroupController {
  ....
  def get(id: UUID)
}
I am getting the following error for the above route
type mismatch;
found : String
required: java.util.UUID
How can I use a UUID as a path parameter in the routes file in Play? I am using Play 2.4.2 with Scala 2.11.7.

String is the default type for parameters in the routes file. To change this, you need to explicitly specify the type of the id parameter:
GET /groups/:id controllers.GroupController.get(id: java.util.UUID)
If you do that, you should find you can also delete the import of bindableUUID in your build file.
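Putting both pieces together, a minimal sketch of the working setup might look like this (the action body is an illustrative assumption, not from the question):

```scala
// conf/routes -- the type annotation makes Play pick its built-in UUID binder:
// GET /groups/:id controllers.GroupController.get(id: java.util.UUID)

// app/controllers/GroupController.scala
package controllers

import java.util.UUID
import play.api.mvc._

class GroupController extends Controller {
  // Sketch only: the body below is a placeholder assumption.
  def get(id: UUID) = Action {
    Ok(s"group $id")
  }
}
```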

Related

Avoiding an error when accessing a class whose type information is missing from the classpath

Some classes in a library have an import that cannot be resolved.
For example, org.scalatest.tools.Framework in ScalaTest.
If I add ScalaTest as a dependency, it is added to the test classpath, but the following import cannot be resolved there, because the sbt module itself is not on the test classpath:
import sbt.testing.{Event => SbtEvent, Framework => SbtFramework, Runner => SbtRunner, Status => SbtStatus, _}
In a macro, I need to scan the classes under a specific package and find the ones with specific characteristics.
def collectXxx(targets: List[c.Symbol]): List[c.Symbol] = {
  targets.filter { x =>
    (
      x.isModule || (
        x.isClass &&
          !x.isAbstract &&
          x.asClass.primaryConstructor.isMethod
      )
    ) && x.typeSignature.baseClasses.contains(XxxTag.typeSymbol)
  }
}
This filters down to symbols that are an object or a class and that inherit from Xxx.
It works in most cases, but if targets contains a class that cannot be compiled as-is, such as the ScalaTest one above, the compiler error cannot be avoided: the moment baseClasses is accessed, the macro expansion fails with a global error.
[error] <macro>:1:26: Symbol 'type sbt.testing.Framework' is missing from the classpath.
[error] This symbol is required by 'class org.scalatest.tools.Framework'.
[error] Make sure that type Framework is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'Framework.class' was compiled against an incompatible version of sbt.testing.
[error] type `fresh$macro$612` = org.scalatest.tools.Framework
[error] ^
Looking at the stack trace in debug mode, a global error is raised each time a property of a StubClassSymbol is accessed.
java.lang.Throwable
at scala.reflect.internal.Symbols$StubSymbol.fail(Symbols.scala:3552)
at scala.reflect.internal.Symbols$StubSymbol.info(Symbols.scala:3563)
at scala.reflect.internal.Symbols$StubSymbol.info$(Symbols.scala:3563)
at scala.reflect.internal.Symbols$StubClassSymbol.info(Symbols.scala:3567)
at scala.reflect.internal.Types$TypeRef.baseClasses(Types.scala:2593)
at scala.reflect.internal.Types.computeBaseClasses(Types.scala:1703)
at scala.reflect.internal.Types.computeBaseClasses$(Types.scala:1680)
at scala.reflect.internal.SymbolTable.computeBaseClasses(SymbolTable.scala:28)
at scala.reflect.internal.Types.$anonfun$defineBaseClassesOfCompoundType$2(Types.scala:1781)
at scala.reflect.internal.Types$CompoundType.memo(Types.scala:1651)
at scala.reflect.internal.Types.defineBaseClassesOfCompoundType(Types.scala:1781)
at scala.reflect.internal.Types.defineBaseClassesOfCompoundType$(Types.scala:1773)
at scala.reflect.internal.SymbolTable.defineBaseClassesOfCompoundType(SymbolTable.scala:28)
at scala.reflect.internal.Types$CompoundType.baseClasses(Types.scala:1634)
at refuel.internal.AutoDIExtractor.$anonfun$recursivePackageExplore$3(AutoDIExtractor.scala:119)
I thought of a way to work around this.
It appears that when an import fails to resolve, the corresponding TypeSymbol becomes a StubClassSymbol.
So I inspected the structure of the failing Symbol and added a condition that filters it out when a StubClassSymbol is found. This worked:
!x.typeSignature.asInstanceOf[ClassInfoTypeApi].parents.exists { pr =>
  pr.typeSymbol.isClass &&
    pr.typeSymbol.asClass.isInstanceOf[scala.reflect.internal.Symbols#StubClassSymbol]
}
But this feels like a real hack. Is there another way around it? And does this really cover all cases?
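One possibly more general variant (an untested sketch: the stack trace above shows frames on the internal StubSymbol trait, which StubClassSymbol extends, so matching on the trait should also catch other kinds of stub symbols) would test for the common trait instead:

```scala
// Sketch only: assumes the same macro context `c` and the filter shown above.
// scala.reflect.internal.Symbols#StubSymbol is the parent trait of
// StubClassSymbol (see the StubSymbol frames in the stack trace).
def hasStubParent(x: c.universe.Symbol): Boolean =
  x.typeSignature match {
    case ct: c.universe.ClassInfoTypeApi =>
      ct.parents.exists(_.typeSymbol.isInstanceOf[scala.reflect.internal.Symbols#StubSymbol])
    case _ => false
  }
```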

How to use Gremlin in Scala script?

I'm trying to use JanusGraph from a Scala script with TinkerPop 3. I use the gremlin.scala library (https://github.com/mpollmeier/gremlin-scala), but I get an error about HNil (see below). How can I use Gremlin and JanusGraph from a Scala script?
import gremlin.scala._
import org.apache.commons.configuration.BaseConfiguration
import org.janusgraph.core.JanusGraphFactory
import org.apache.tinkerpop.gremlin.structure.Graph
object Janus {
  def main(args: Array[String]): Unit = {
    val conf = new BaseConfiguration()
    conf.setProperty("storage.backend", "inmemory")
    val graph = JanusGraphFactory.open(conf)
    val v1 = graph.graph.addV("test")
  }
}
Error:(11, 14) Symbol 'type scala.ScalaObject' is missing from the classpath.
This symbol is required by 'trait shapeless.HNil'.
Make sure that type ScalaObject is in your classpath and check for conflicting dependencies with -Ylog-classpath.
A full rebuild may help if 'HNil.class' was compiled against an incompatible version of scala.
val v1 = graph.graph.addV("test")
Not sure what you mean by 'scala script', but it looks like you're missing many (all?) dependencies. Did you have a look at https://github.com/mpollmeier/gremlin-scala-examples/ ? It contains an example setup for janusgraph.
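For reference, a minimal build.sbt in the spirit of those examples might look like this (the version numbers here are assumptions; take the tested combination from the gremlin-scala-examples repository):

```scala
// build.sbt -- sketch only; the versions below are illustrative assumptions,
// see gremlin-scala-examples for known-good combinations.
scalaVersion := "2.12.8"

libraryDependencies ++= Seq(
  "com.michaelpollmeier" %% "gremlin-scala"   % "3.4.1.5",
  "org.janusgraph"        % "janusgraph-core" % "0.3.1"
)
```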

Type mismatch when utilising a case class in a package object

I am receiving the following error when I try to run my code:
Error:(104, 63) type mismatch;
found : hydrant.spark.hydrant.spark.IPPortPair
required: hydrant.spark.(some other)hydrant.spark.IPPortPair
IPPortPair(it.getHeader.getDestinationIP, it.getHeader.getDestinationPort))
My code uses a case class defined in the package object spark to set up the IP/Port map for each connection.
The package object looks like this:
package object spark {
  case class IPPortPair(ip: Long, port: Long)
}
And the code using the package object like the below:
package hydrant.spark
import java.io.{File,PrintStream}
object identifyCustomers {
  ……………
  def mapCustomers(suspectTraffic: RDD[Generic]) = {
    suspectTraffic.filter(
      it => !it.getHeader.isEmpty
    ).map(
      it => IPPortPair(it.getHeader.getDestinationIP, it.getHeader.getDestinationPort)
    )
  }
}
I am also puzzled by the strange way my packages are being displayed, since the error makes it seem that I am in hydrant.spark.hydrant.spark, which does not exist.
I am also using Intellij if that makes a difference.
You need to run sbt clean (or the IntelliJ equivalent). You changed something in the project (e.g. Scala version) and this is how the incompatibility manifests.

Why is my object not a member of package <root> if it's in a separate source file?

I'm having a problem accessing an object defined in the root package. If I have all my code in one file, it works fine, but when I split it across two files, I can't get it past the compiler.
This works fine:
All in one file called packages.scala:
object Foo {
  val name = "Brian"
}
package somepackage {
  object Test extends App {
    println(Foo.name)
  }
}
Witness:
$ scalac packages.scala
$ scala -cp . somepackage.Test
Brian
But if I split the code across two files:
packages.scala
object Foo {
  val name = "Brian"
}
packages2.scala
package somepackage {
  object Test extends App {
    println(Foo.name)
  }
}
it all fails:
$ scalac packages.scala packages2.scala
packages2.scala:3: error: not found: value Foo
So I try to make the reference to Foo absolute:
...
println(_root_.Foo.name)
...
But that doesn't work either:
$ scalac packages.scala packages2.scala
packages2.scala:3: error: object Foo is not a member of package <root>
If Foo is not a member of the root package, where on earth is it?
I think this is the relevant part in the spec:
Top-level definitions outside a packaging are assumed to be injected into a special empty package. That package cannot be named and therefore cannot be imported. However, members of the empty package are visible to each other without qualification.
Source Scala Reference §9.2 Packages.
But don’t ask me why it works if you have the following in packages2.scala:
object Dummy
package somepackage {
  object Test extends App {
    println(Foo.name)
  }
}
Foo is a member of the root package, but you can't refer to it. It's a generic restriction on JVM languages (see How to access java-classes in the default-package? for Groovy, and What's the syntax to import a class in a default package in Java? for Java). It's the same for Scala.
From the Java answer:
You can't import classes from the default package. You should avoid
using the default package except for very small example programs.
From the Java language specification:
It is a compile time error to import a type from the unnamed package.
The reason it works in one single file is because everything is available to the compiler at once, and the compiler copes with it. I suspect that this is to allow scripting.
Moral of the story: don't use the default package if you're not scripting.
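A minimal sketch of that advice applied to the example above (the package name common is an illustrative assumption):

```scala
// packages.scala -- give Foo a named package instead of the empty one
package common {
  object Foo {
    val name = "Brian"
  }
}

// packages2.scala -- Foo can now be referred to by its qualified name,
// or imported with `import common.Foo`
package somepackage {
  object Test extends App {
    println(common.Foo.name)
  }
}
```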
Ran into this when trying to import the main App entrypoint into a test. Putting package scala at the top of the entrypoint definition made the object globally available. This may be an evil hack, but it works.
E.g.
/src/main/scala/EntryPoint.scala
package scala

object EntryPoint extends App {
  val s = "Foo"
}
/src/test/scala/integration/EntryPointSuite.scala
package integration

import org.scalatest.FlatSpec

class EntryPointSuite extends FlatSpec {
  "EntryPoint" should "have property s" in {
    EntryPoint.main(Array.empty)
    assert(EntryPoint.s == "Foo")
  }
}

Scala + stax compile problem during deploy process

I developed an app in scala-ide (eclipse plugin), no errors or warnings. Now I'm trying to deploy it to the stax cloud:
$ stax deploy
But it fails to compile it:
compile:
[scalac] Compiling 2 source files to /home/gleontiev/workspace/rss2lj/webapp/WEB-INF/classes
error: error while loading FlickrUtils, Scala signature FlickrUtils has wrong version
expected: 4.1
found: 5.0
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:8: error: com.folone.logic.FlickrUtils does not have a constructor
val dispatcher = new FlickrUtils("8196243#N02")
^
error: error while loading Photo, Scala signature Photo has wrong version
expected: 4.1
found: 5.0
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:9: error: value link is not a member of com.folone.logic.Photo
val linksGetter = (p:Photo) => p.link
^
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:15: error: com.folone.logic.FlickrUtils does not have a constructor
val dispatcher = new FlickrUtils("8196243#N02")
^
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:16: error: value medium1 is not a member of com.folone.logic.Photo
val picsGetter = (p:Photo) => p.medium1
^
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/RefreshSnippet.scala:12: error: com.folone.logic.FlickrUtils does not have a constructor
val dispatcher = new FlickrUtils("8196243#N02")
^
7 errors found
ERROR: : The following error occurred while executing this line:
/home/gleontiev/workspace/rss2lj/build.xml:61: Compile failed with 7 errors; see the compiler error output for details.
I see two kinds of errors. The first complains about the FlickrUtils class constructor, which is defined like this:
class FlickrUtils(val userId: String) {
  //...
}
The second claims that two fields are missing from the Photo class, which is:
class Photo(val photoId: String, val userId: String, val secret: String, val server: String) {
  private val _medium1 = "/sizes/m/in/photostream"
  val link = "http://flickr.com/photos/" + userId + "/" + photoId
  val medium1 = link + _medium1
}
It seems like the stax SDK uses the wrong compiler: a "Scala signature has wrong version" error usually means the classes were compiled with a different major Scala version than the one doing the current compile (signature version 5.0 corresponds to Scala 2.8, 4.1 to 2.7). How do I make it use the right one? If that is not the issue, what is the problem here, and how can I resolve it?
Edit: $ scala -version says
Scala code runner version 2.8.0.final -- Copyright 2002-2010, LAMP/EPFL
I tried compiling everything with scalac manually, putting everything in its place, and running stax deploy afterwards -- same result.
I actually resolved this by moving the FlickrUtils and Photo classes into the packages where the snippets originally are, but I still don't understand why it was not able to compile and use them from the other package.