Object is not callable error - class

import math

class Word:
    '''A class that represents a word'''
    def __init__(self, word):
        self._word = word
    def __str__(self):
        return self._word

c = Word("Hello")
print(c())
This results in: builtins.TypeError: 'Word' object is not callable
How can I check the value of c using print(c())?


Create instance of child from parent class

I am building a simple DSL for length operations. I want the domain operations to be extensible, so I am using them as mixins along with the implicit conversions for my domain components.
Below is my App:
package com.shasank.funWithLengths

object LengthAdditionApp extends App {
  val length1 = 11 inches
  val length2 = 15 inches
  val length3 = 2 feet
  println(length1)
  println(length2)
  println(length3)
  println(length1 + length2) // all ok
  println(length1 + length3) // all ok
  println(length3 - length1) // all ok
  println(length1 + length2 + length2) // breaks: the object returned from the first operation doesn't have the adder
}
Below is my base class. I would have loved it to be abstract, but since I could not find a way to create an instance of the subclass Inches, I marked the constructor protected so that only subclasses can extend it and nothing else can create an instance.
package com.shasank.funWithLengths

class Length protected (val measure: Int, val unit: String) {
  private def convertToInches(length: Length) = length.unit match {
    case "feet"   => length.measure * 12
    case "inches" => length.measure
  }

  protected def operateOnMeasures(other: Length, op: (Int, Int) => Int): Length = {
    val thisInches = convertToInches(this)
    val otherInches = convertToInches(other)
    val operatedMeasure = op(thisInches, otherInches)
    new Length(operatedMeasure, "inches") // object created does not have adder & subtracter capabilities
  }

  override def toString = {
    val measureInInches = convertToInches(this)
    val (feetMeasure, inchesMeasure) = BigInt(measureInInches) /% 12
    val feetMeasureString = s"$feetMeasure feet and"
    val inchesMeasureString = s"$inchesMeasure inches"
    s"$feetMeasureString $inchesMeasureString"
  }
}
Below are my domain components.
package com.shasank

package object funWithLengths {
  implicit class Inches(measure: Int) extends Length(measure, "inches") with Adder with Subtracter {
    def inches = this
  }

  implicit class Feet(measure: Int) extends Length(measure, "feet") with Adder with Subtracter {
    def feet = this
  }
}
Below are my domain operators.
package com.shasank.funWithLengths

trait Adder extends Length {
  def +(other: Length) = super.operateOnMeasures(other, _ + _)
}

package com.shasank.funWithLengths

trait Subtracter extends Length {
  def -(other: Length) = super.operateOnMeasures(other, _ - _)
}
Question: Is there a way to create an instance of Inches (so that I can get all the goodies of it) while returning from the method operateOnMeasures in my base class Length?
I was able to resolve this by moving the Length class inside the package object where I declared the implicit classes Inches and Feet.
Here is a link to my working code.
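For reference, here is a minimal sketch of that arrangement: everything lives in the package object, and operateOnMeasures is changed to return an Inches (an assumption on my part, not from the original code) so that chained expressions like length1 + length2 + length2 keep the Adder and Subtracter mixins:

package com.shasank

package object funWithLengths {

  class Length protected (val measure: Int, val unit: String) {
    private def convertToInches(length: Length) = length.unit match {
      case "feet"   => length.measure * 12
      case "inches" => length.measure
    }

    // Returning Inches keeps the Adder/Subtracter mixins on the result,
    // so chained additions and subtractions still type-check.
    protected def operateOnMeasures(other: Length, op: (Int, Int) => Int): Inches =
      new Inches(op(convertToInches(this), convertToInches(other)))

    override def toString = {
      val (feetMeasure, inchesMeasure) = BigInt(convertToInches(this)) /% 12
      s"$feetMeasure feet and $inchesMeasure inches"
    }
  }

  trait Adder extends Length {
    def +(other: Length) = operateOnMeasures(other, _ + _)
  }

  trait Subtracter extends Length {
    def -(other: Length) = operateOnMeasures(other, _ - _)
  }

  implicit class Inches(measure: Int) extends Length(measure, "inches") with Adder with Subtracter {
    def inches = this
  }

  implicit class Feet(measure: Int) extends Length(measure, "feet") with Adder with Subtracter {
    def feet = this
  }
}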

Scala Skinny ORM - Obligatory Relation

I want the belongsTo relationship to be mandatory. I don't want to define typeWine as an optional value, but if I don't make it an Option, I have to provide typeWine in the extract method and I don't know how to do that.
The Skinny ORM documentation doesn't describe how to do this, and I'm getting stuck.
package app.models.wine

import scalikejdbc._
import skinny.orm.SkinnyCRUDMapper

case class Wine(id: Option[Long], typeWine: Option[Type] = None, name: String)

object Wine extends SkinnyCRUDMapper[Wine] {
  override def defaultAlias = createAlias("w")

  override def extract(rs: WrappedResultSet, n: ResultName[Wine]): Wine = new Wine(
    id = rs.get(n.id),
    name = rs.get(n.name)
  )

  belongsTo[Type](Type, (w, t) => w.copy(typeWine = t)).byDefault
}
package app.models.wine

import scalikejdbc._
import skinny.orm.SkinnyCRUDMapper

case class Type(id: Option[Long], typeName: String)

object Type extends SkinnyCRUDMapper[Type] {
  override def defaultAlias = createAlias("t")
  override def columnNames = Seq("id", "type_name")

  override def extract(rs: WrappedResultSet, n: ResultName[Type]): Type = new Type(
    id = rs.get(n.id),
    typeName = rs.get(n.typeName)
  )
}
Defining a belongsTo/byDefault relationship for typeWine and extracting the typeWine value with its resultName should work for you.
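For illustration, a rough sketch of what that could look like with typeWine no longer optional. The fold in the merge function and the direct Type.extract call through Type.defaultAlias.resultName are assumptions about how to wire this up, not verified Skinny ORM code:

package app.models.wine

import scalikejdbc._
import skinny.orm.SkinnyCRUDMapper

// typeWine is now mandatory
case class Wine(id: Option[Long], typeWine: Type, name: String)

object Wine extends SkinnyCRUDMapper[Wine] {
  override def defaultAlias = createAlias("w")

  // keep the default join so the type columns are always selected
  belongsTo[Type](Type, (w, t) => t.fold(w)(tw => w.copy(typeWine = tw))).byDefault

  override def extract(rs: WrappedResultSet, n: ResultName[Wine]): Wine = new Wine(
    id = rs.get(n.id),
    name = rs.get(n.name),
    // assumption: read the joined type row through its own resultName
    typeWine = Type.extract(rs, Type.defaultAlias.resultName)
  )
}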

Scala macro for shortcut

I have defined the following macros to get the file, line, and object/class of the current location:
http://pastebin.com/UsNLemnK
Using SBT, I have defined two projects so that the macros are compiled first, followed by the actual project that uses them.
The purpose of these macros is to be used in a log method:
def log( msg: Any, srcFile: String = "", srcLine: String = "", srcClass:String = "")
I am then using this log method as follows:
log(msg, s"$F_",s"$L_",s"$C_")
where F_, L_ and C_ are defined in the macro.
Now, I would like to create a shortcut to avoid this boilerplate and just call:
log(msg)
which should automatically be replaced by
log(msg, s"$F_",s"$L_",s"$C_")
I could define a macro to do this:
def log_(msg: String): Unit = macro logImpl

def logImpl(c: Context)(msg: c.Expr[String]): c.Expr[Unit] = {
  import c.universe._
  reify( log(msg.splice, srcFile = s"$F_", srcLine = s"$L_", srcClass = s"$C_") )
}
but again, this macro needs to be compiled before the project where the log function itself is defined... so I don't see how to break the compilation dependency cycle.
Any suggestions on how to do this?
Thanks
Barring the use of macro annotations (which would necessarily and significantly alter your API's syntax), the problem you have to face is that you need the type-checked identifier of your log function.
Since you can't import the entire log implementation, a solution would be to:
- wrap the method into a trait,
- define this trait in the "macro" project,
- add an implicit parameter to the log_ method,
- in your "main" project, create an implementation of this trait, and instantiate this implementation in an implicit val visible everywhere you'd like to use the log_ macro (in the package object, for example).
Of course, you could also use a simple FunctionN here and avoid the trait definition and implementation, but this way you'll avoid potential conflicts with other same-typed implicits.
In general, your code would resemble the following:
//"macro" project
trait EncapsulatingTrait {
def yourMethod(...)
}
object Macros {
def myMacro(...)(implicit param: EncapsulatingTrait) = macro myMacroImpl
def myMacroImpl( c: Context )(...)
(param: c.Expr[EncapsulatingTrait]): c.Expr[...] = {
import c.universe._
reify(param.splice.yourMethod(...))
}
}
//--------------------------
//"main" project
class Impl extends EncapsulatingTrait {
def yourMethod(...)
}
...
implicit val defaultParam = new Impl
import Macros.myMacro
myMacro(...)
In your specific case, here's how an implementation could look:
//"macro" project
package yourpackage
import java.io.File
import language.experimental.macros
import scala.reflect.macros.Context
trait LogFunction {
def log( msg: Any, srcFile: String = "", srcLine: Int = -1, srcClass:String = "")
}
object Macros {
// get current line in source code
def L_ : Int = macro lineImpl
def lineImpl( c: Context ): c.Expr[Int] = {
import c.universe._
val line = Literal( Constant( c.enclosingPosition.line ) )
c.Expr[Int]( line )
}
// get current file from source code (relative path)
def F_ : String = macro fileImpl
def fileImpl( c: Context ): c.Expr[String] = {
import c.universe._
val absolute = c.enclosingPosition.source.file.file.toURI
val base = new File( "." ).toURI
val path = Literal( Constant( c.enclosingPosition.source.file.file.getName() ) )
c.Expr[String]( path )
}
// get current class/object (a bit sketchy)
def C_ : String = macro classImpl
def classImpl( c: Context ): c.Expr[String] = {
import c.universe._
val class_ = Literal( Constant( c.enclosingClass.toString.split(" ")( 1 ) ) )
c.Expr[String]( class_ )
}
def log_(msg: String)(implicit logFunc: LogFunction) : Unit = macro logImpl
def logImpl( c: Context )(msg: c.Expr[String])(logFunc: c.Expr[LogFunction]): c.Expr[Unit] = {
import c.universe._
reify( logFunc.splice.log(msg.splice, srcFile=fileImpl(c).splice, srcLine=lineImpl(c).splice, srcClass=classImpl(c).splice) )
}
}
//--------------------------
// "main" project
import yourpackage.LogFunction

class LogImpl extends LogFunction {
  def log(msg: Any, srcFile: String = "", srcLine: Int = -1, srcClass: String = "") {
    println(List(msg, srcFile, srcLine, srcClass).mkString("|"))
  }
}

object testLog {
  def main(args: Array[String]): Unit = {
    implicit val defaultLog = new LogImpl
    import yourpackage.Macros.log_
    log_("blah")
  }
}
(note that I had to correct the signature of log_ and tweak the macro call a bit)

Slick 2.0: How to convert lifted query results to a case class?

In order to implement a RESTful API stack, I need to convert data extracted from a DB to JSON. I think the best way is to extract the data from the DB and then convert the row set to JSON using Json.toJson(), passing a case class as the argument after having defined an implicit serializer (Writes).
Here's my case class and companion object:
package deals.db.interf.slick2

import scala.slick.driver.MySQLDriver.simple._
import play.api.libs.json.Json

case class PartnerInfo(
  id: Int,
  name: String,
  site: String,
  largeLogo: String,
  smallLogo: String,
  publicationSite: String
)

object PartnerInfo {
  def toCaseClass( ?? ) = {   // what type are the arguments to be passed?
    PartnerInfo( fx(??) )     // how to transform the input types (Slick) to Scala types?
  }

  // Notice I'm using Slick 2.0.0 RC1
  class PartnerInfoTable(tag: Tag) extends Table[(Int, String, String, String, String, String)](tag, "PARTNER") {
    def id = column[Int]("id")
    def name = column[String]("name")
    def site = column[String]("site")
    def largeLogo = column[String]("large_logo")
    def smallLogo = column[String]("small_logo")
    def publicationSite = column[String]("publication_site")
    def * = (id, name, site, largeLogo, smallLogo, publicationSite)
  }
  val partnerInfos = TableQuery[PartnerInfoTable]

  def qPartnerInfosForPuglisher(publicationSite: String) = {
    for (
      pi <- partnerInfos if ( pi.publicationSite == publicationSite )
    ) yield toCaseClass( _ ) // Pass all the table columns to toCaseClass()
  }

  implicit val partnerInfoWrites = Json.writes[PartnerInfo]
}
What I cannot work out is how to implement the toCaseClass() method so that it transforms the Slick 2 column types into Scala types; the function fx() in the body of toCaseClass() is only meant to emphasize that.
I'm wondering whether it is possible to get the Scala type from the Slick column type, since it is clearly given in the table definition, but I cannot find how to do it.
Any idea?
I believe the simplest method here would be to map PartnerInfo in the table schema:
class PartnerInfoTable(tag: Tag) extends Table[PartnerInfo](tag, "PARTNER") {
  def id = column[Int]("id")
  def name = column[String]("name")
  def site = column[String]("site")
  def largeLogo = column[String]("large_logo")
  def smallLogo = column[String]("small_logo")
  def publicationSite = column[String]("publication_site")
  def * = (id, name, site, largeLogo, smallLogo, publicationSite) <> (PartnerInfo.tupled, PartnerInfo.unapply)
}
val partnerInfos = TableQuery[PartnerInfoTable]

def qPartnerInfosForPuglisher(publicationSite: String) = {
  for (
    pi <- partnerInfos if pi.publicationSite === publicationSite // === for a column comparison
  ) yield pi
}
Otherwise PartnerInfo.tupled should do the trick:
def toCaseClass(pi:(Int, String, String, String, String, String)) = PartnerInfo.tupled(pi)
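With the table mapped to PartnerInfo, serializing query results to JSON reduces to running the query and calling Json.toJson on the returned list. A rough sketch, assuming the imports from the question, a Slick Session in scope, and the partnerInfoWrites defined above (the helper name partnersAsJson is just an illustration):

// hypothetical helper; relies on the implicit Writes[PartnerInfo] defined above
def partnersAsJson(publicationSite: String)(implicit s: Session) =
  Json.toJson(qPartnerInfosForPuglisher(publicationSite).list)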

Comparing type mapped values in Slick queries

Consider the Favorites table object below. We want to write a query that finds Favorites by their type (defined below). We have also defined a TypeMapper to map a FavoriteType to a String for the database.
import scala.slick.driver.PostgresDriver.simple._
// Other imports have been omitted in this question

object Favorites extends Table[Favorite]("favorites") {
  // Convert the favoriteTypes to strings for the database
  implicit val favoriteMapping: TypeMapper[FavorietType] = MappedTypeMapper.base[FavorietType, String](
    favType => FavorietType.values.find(_ == favType).get.mapping,
    mapping => FavorietType.values.find(_.mapping == mapping).get
  )

  def favoriteType = column[FavoriteType]("type")
  // other columns here
This is the query I want to write (however, it does not compile):
  def queryByFavoriteType(ftype: FavoriteType)(implicit s: Session) = {
    for (
      f <- Favorieten if f.favoriteType === ftype
    ) yield f
  }
}
Here I have defined the different FavoriteType objects (this is outside the Favorieten object):
sealed case class FavorietType(mapping: String) {
  override def toString = mapping.capitalize
}

object FavoriteType {
  object Exam extends FavoriteType("examen")
  object Topic extends FavoriteType("onderwerp")
  object Paper extends FavoriteType("profielwerkstuk")
  val values = Seq(Exam, Topic, Paper)
}
The problem I have here is that the query does not compile:
value === is not a member of scala.slick.lifted.Column[models.gebruiker.FavorietType]
It appears that === cannot be used to compare a user-defined type. Is this true? Is there an alternative way to do this?
Edit
Related issue: before, I had my TypeMapper without an explicit type; it was defined as implicit val favoriteMapping = MappedTypeMapper.base[FavorietType, String]( ...
When I wrote a query that compared against FavorietType.Exam (for example), such as
def queryByFavoriteExam()(implicit s: Session) = {
  for (f <- Favorieten if f.favorietType === FavorietType.Exam) yield f
}
This would result in the error could not find implicit value for evidence parameter of type scala.slick.lifted.TypeMapper[models.gebruiker.FavorietType.Exam.type]
The solution for this is the same as the one presented below
When in doubt with Slick, go check out the unit tests. After reading their docs on mapping in a custom type and then looking at their unit tests, I got your query code to compile by changing it to:
def queryByFavoriteType(ftype: FavoriteType)(implicit s: Session) = {
  for (f <- Favorites if f.favoriteType === (ftype: FavoriteType)) yield f
}
Also, I imported the H2Driver just to get things to compile (import scala.slick.driver.H2Driver.simple._); I assume you have already imported whatever driver you need for your DB.
EDIT
My full code example is as follows:
import scala.slick.driver.PostgresDriver.simple._
import scala.slick.session.Session

sealed case class FavoriteType(mapping: String) {
  override def toString = mapping.capitalize
}

case class Favorite(ft: FavoriteType, foo: String)

object FavoriteType {
  object Exam extends FavoriteType("examen")
  object Topic extends FavoriteType("onderwerp")
  object Paper extends FavoriteType("profielwerkstuk")
  val values = Seq(Exam, Topic, Paper)
}

object Favorites extends Table[Favorite]("favorites") {
  // Convert the favoriteTypes to strings for the database
  implicit val favoriteMapping = MappedTypeMapper.base[FavoriteType, String](
    { favType => FavoriteType.values.find(_ == favType).get.mapping },
    { mapping => FavoriteType.values.find(_.mapping == mapping).get }
  )

  def favoriteType = column[FavoriteType]("type")
  def foo = column[String]("foo")
  def * = favoriteType ~ foo <> (Favorite.apply _, Favorite.unapply _)

  def queryByFavoriteType(ftype: FavoriteType)(implicit s: Session) = {
    for (f <- Favorites if f.favoriteType === (ftype: FavoriteType)) yield f
  }
}
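For completeness, a rough usage sketch, continuing from the imports above. The connection URL, credentials, and object name are placeholders, and the Database.forURL parameters are an assumption on my part:

import scala.slick.session.Database

object FavoritesApp extends App {
  // hypothetical connection settings
  val db = Database.forURL("jdbc:postgresql://localhost/favorites_db",
    user = "user", password = "pass", driver = "org.postgresql.Driver")

  db.withSession { implicit s: Session =>
    // run the query defined above and print the matching rows
    val exams = Favorites.queryByFavoriteType(FavoriteType.Exam).list
    exams.foreach(println)
  }
}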