NullPointerException with Plain SQL and String Interpolation - Scala

I'm having some trouble running a plain SQL query using string interpolation. I'm trying to query an MS SQL Server 2008 database using Microsoft's sqljdbc4.jar.
The table is simple:
[serv_num]
number (nvarchar(15)) | name (nvarchar(50))
=============================================
1234 | AA
+345 | BB
The code I'm using is as follows:
import java.sql.{Connection, DriverManager}
import org.slf4j.LoggerFactory
import scala.collection.JavaConversions._
import scala.slick.session.Database
import Database.threadLocalSession
import scala.slick.jdbc.{GetResult, StaticQuery => Q}
import Q.interpolation

object NumImport extends App {
  def logger = LoggerFactory.getLogger("NumImport")

  implicit val getServNumResult = GetResult(r => ServNum(r.<<, r.<<))

  override def main(args: Array[String]) {
    val uri = "jdbc:sqlserver://192.168.1.2;databaseName=slicktestdb;user=yyyy;password=xxxxxx"
    val drv = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    Database.forURL(uri, driver = drv) withSession {
      def getServNum(n: String) = sql"select number, name from serv_num where number = $n".as[ServNum]
      val result = getServNum("1234")
      println(result.firstOption)
    }
  }

  case class ServNum(num: String, name: String)
}
My issue is that I'm getting a java.lang.NullPointerException on result.firstOption (I think!). This happens when a row is found. If I try to select a number that doesn't exist, say getServNum("34"), then result.firstOption is None, as expected.
Stacktrace:
08:54:47.932 TKD [run-main] DEBUG scala.slick.session.BaseSession - Preparing statement: select number, name from serv_num where number = ?
[error] (run-main) java.lang.NullPointerException
java.lang.NullPointerException
at scala.slick.jdbc.StaticQuery.extractValue(StaticQuery.scala:16)
at scala.slick.jdbc.StatementInvoker$$anon$1.extractValue(StatementInvoker.scala:37)
at scala.slick.session.PositionedResultIterator.foreach(PositionedResult.scala:212)
at scala.slick.jdbc.Invoker$class.foreach(Invoker.scala:91)
at scala.slick.jdbc.StatementInvoker.foreach(StatementInvoker.scala:10)
at scala.slick.jdbc.Invoker$class.firstOption(Invoker.scala:41)
at scala.slick.jdbc.StatementInvoker.firstOption(StatementInvoker.scala:10)
at scala.slick.jdbc.UnitInvoker$class.firstOption(Invoker.scala:148)
at scala.slick.jdbc.StaticQuery0.firstOption(StaticQuery.scala:95)
I tried debugging and following the stack trace, but I'm quite new to both Scala and Slick, so I'm lost. If anyone could help me or point me in the right direction, I would appreciate it a lot.
Thanks.

Looking at the Slick source where the exception happens, I could see that it was trying to invoke apply on the GetResult object, but that object was null. Thinking about what might have caused it, I suggest you move the implicit inside the main method. The way things are structured, Slick loses track of the implicit ResultSet-to-case-class converter (the GetResult): because NumImport extends App but also overrides main, the object's constructor body, where the implicit val is defined, is handed off to App's delayedInit and never executed, so the val is still null when the query tries to use it.
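A minimal sketch of that restructuring, reusing the imports, schema, and connection string from the question (alternatively, dropping either the App trait or the main override would also let the object body run):

object NumImport extends App {
  case class ServNum(num: String, name: String)

  override def main(args: Array[String]) {
    // Defined inside main, so it is initialized before the query runs.
    implicit val getServNumResult = GetResult(r => ServNum(r.<<, r.<<))

    val uri = "jdbc:sqlserver://192.168.1.2;databaseName=slicktestdb;user=yyyy;password=xxxxxx"
    val drv = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    Database.forURL(uri, driver = drv) withSession {
      def getServNum(n: String) = sql"select number, name from serv_num where number = $n".as[ServNum]
      println(getServNum("1234").firstOption)
    }
  }
}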

Related

Convert PreparedStatement object to JSON in Scala

I'm trying to convert a PreparedStatement (the object used for sending SQL statements to the database) to JSON in Scala.
So far, I've found that the best way to convert an object to JSON in Scala is with the net.liftweb library.
But when I tried it, I got empty JSON.
This is the code:
import java.sql.DriverManager
import net.liftweb.json._
import net.liftweb.json.Serialization.write

object Main {
  def main(args: Array[String]): Unit = {
    implicit val formats = DefaultFormats
    val jdbcSqlConnStr = "sqlserverurl**"
    val conn = DriverManager.getConnection(jdbcSqlConnStr)
    val statement = conn.prepareStatement("exec select_all")
    val piedPierJSON2 = write(statement)
    println(piedPierJSON2)
  }
}
This is the result:
{}
When I used an object I created myself, the conversion worked:
case class Person(name: String, address: Address)
case class Address(city: String, state: String)
val p = Person("Alvin Alexander", Address("Talkeetna", "AK"))
val piedPierJSON3 = write(p)
println(piedPierJSON3)
This is the result:
{"name":"Alvin Alexander","address":{"city":"Talkeetna","state":"AK"}}
I figured out where the problem was: PreparedStatement is an interface, and none of its implementations are serializable...
I'm going to try wrapping it up in a different class.
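One way to sketch that wrapper (the SqlStatementInfo case class and the fields pulled out of the statement are my own illustration, not from the original post): copy the plain data you care about into a serializable case class and write that instead.

import java.sql.DriverManager
import net.liftweb.json._
import net.liftweb.json.Serialization.write

// Hypothetical wrapper holding only plain, serializable fields.
case class SqlStatementInfo(sql: String, fetchSize: Int, maxRows: Int)

object Main {
  def main(args: Array[String]): Unit = {
    implicit val formats = DefaultFormats
    val conn = DriverManager.getConnection("sqlserverurl**")
    val sql = "exec select_all"
    val statement = conn.prepareStatement(sql)
    // Serialize the wrapper instead of the driver's PreparedStatement.
    val info = SqlStatementInfo(sql, statement.getFetchSize, statement.getMaxRows)
    println(write(info))
  }
}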

Code working in Spark-Shell but not in Eclipse

I have a small piece of Scala code which works properly in Spark-Shell but not in Eclipse with the Scala plugin. I can access HDFS using the plugin; I tried writing another file and it worked.
FirstSpark.scala
package bigdata.spark

import org.apache.spark.SparkConf
import java.io._
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object FirstSpark {
  def main(args: Array[String]) = {
    val conf = new SparkConf().setMaster("local").setAppName("FirstSparkProgram")
    val sparkcontext = new SparkContext(conf)
    val textFile = sparkcontext.textFile("hdfs://pranay:8020/spark/linkage")
    val m = new Methods()
    val q = textFile.filter(x => !m.isHeader(x)).map(x => m.parse(x))
    q.saveAsTextFile("hdfs://pranay:8020/output")
  }
}
Methods.scala
package bigdata.spark

import java.util.function.ToDoubleFunction

class Methods {
  def isHeader(s: String): Boolean = {
    s.contains("id_1")
  }

  def parse(line: String) = {
    val pieces = line.split(',')
    val id1 = pieces(0).toInt
    val id2 = pieces(1).toInt
    val matches = pieces(11).toBoolean
    val mapArray = pieces.slice(2, 11).map(toDouble)
    MatchData(id1, id2, mapArray, matches)
  }

  def toDouble(s: String) = {
    if ("?".equals(s)) Double.NaN else s.toDouble
  }
}

case class MatchData(id1: Int, id2: Int, scores: Array[Double], matched: Boolean)
Error Message:
Exception in thread "main" org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:294)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:122)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2032)
at org.apache.spark.rdd.RDD$$anonfun$filter$1.apply(RDD.scala:335)
at org.apache.spark.rdd.RDD$$anonfun$filter$1.apply(RDD.scala:334)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:310)
Can anyone please help me with this?
Try changing class Methods { .. } to object Methods { .. }.
I think the problem is at val q = textFile.filter(x => !m.isHeader(x)).map(x => m.parse(x)). When Spark sees the filter and map functions, it tries to serialize the functions passed to them (x => !m.isHeader(x) and x => m.parse(x)) so that it can dispatch the work of executing them to all of the executors (this is the Task referred to). However, to do this it needs to serialize m, since this object is referenced inside the functions (it is in the closure of the two anonymous functions) - but it cannot do this, since Methods is not serializable. You could add extends Serializable to the Methods class, but in this case an object is more appropriate (and is already Serializable).
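A minimal sketch of the suggested change, keeping the MatchData case class as above; a top-level object is referenced statically from the closures, so Spark has no Methods instance to serialize:

package bigdata.spark

// object instead of class: no instance is captured by the closures
// that Spark serializes and ships to the executors.
object Methods {
  def isHeader(s: String): Boolean = s.contains("id_1")

  def parse(line: String) = {
    val pieces = line.split(',')
    val id1 = pieces(0).toInt
    val id2 = pieces(1).toInt
    val matches = pieces(11).toBoolean
    val mapArray = pieces.slice(2, 11).map(toDouble)
    MatchData(id1, id2, mapArray, matches)
  }

  def toDouble(s: String) = if ("?".equals(s)) Double.NaN else s.toDouble
}

In FirstSpark you would then drop val m = new Methods() and call Methods.isHeader(x) and Methods.parse(x) directly.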

Scala Slick - Infinite recursion (StackOverflowError) on query results

I am playing around with Slick on top of a Twitter Finatra application. Finally I thought I had it working, but now, when I want to process a result, I always get a recursion error. I looked around but did not find anything helpful for this particular problem. The code I have is actually quite simple:
Map the Database class to a custom Type:
package com.configurationManagement.library

package object Types {
  type SlickDatabase = slick.driver.MySQLDriver.api.Database
}
Model:
package com.configurationManagement.app.domain

import slick.lifted.Tag
import slick.driver.MySQLDriver.api._
import slick.profile.SqlProfile.ColumnOption.NotNull

case class ConfigurationTemplate(id: Option[Int], name: String)

class ConfigurationTemplates(tag: Tag) extends Table[ConfigurationTemplate](tag, "configuration_template") {
  def id = column[Int]("id", O.AutoInc, O.PrimaryKey)
  def name = column[String]("name", NotNull)
  def uniqueNameIndex = index("unique_name", name, unique = true)
  def * = (id.?, name) <> (ConfigurationTemplate.tupled, ConfigurationTemplate.unapply)
}
Controller:
package com.configurationManagement.app.controller

import com.google.inject.{Inject, Singleton}
import com.configurationManagement.app.domain.ConfigurationTemplates
import com.configurationManagement.app.dto.request.RequestConfigurationTemplateDto
import com.configurationManagement.library.Types._
import com.twitter.finatra.http.Controller
import com.twitter.inject.Logging
import slick.driver.MySQLDriver.api._

@Singleton
class ConfigurationTemplateController @Inject()(database: SlickDatabase)
  extends Controller with Logging with FutureConverter {

  post("/configurations/templates") { dto: RequestConfigurationTemplateDto =>
    val templates = TableQuery[ConfigurationTemplates]
    val query = templates.filter(_.id === 1)
    response.ok(query.map(_.name))
  }
}
And here comes the error
Infinite recursion (StackOverflowError) (through reference chain: com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast
Obviously this line causes the error:
query.map(_.name)
Two things I see: first, you need to add .result to the query to transform it into a FixedSqlStreamingAction; second, you need a database to run that query on:
private[this] val database: slick.driver.MySQLDriver.backend.DatabaseDef = Database.forDataSource(...)
database.run(templates.filter(_.id === 1).map(_.name).result)
This returns a Future[Seq[String]], which is probably the type response.ok expects.
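Put together in the controller above, a hedged sketch of the corrected route (assuming the FutureConverter mixin bridges the resulting Scala Future to Finatra's Twitter futures; in real code you would inject an execution context rather than use the global one):

import scala.concurrent.ExecutionContext.Implicits.global // placeholder pool

post("/configurations/templates") { dto: RequestConfigurationTemplateDto =>
  val templates = TableQuery[ConfigurationTemplates]
  // .result turns the lifted Query into a runnable DBIO action;
  // database.run executes it and yields a Future[Seq[String]],
  // instead of the bare Query that Jackson was trying to serialize.
  database.run(templates.filter(_.id === 1).map(_.name).result)
    .map(names => response.ok(names))
}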

Add data type from PostgreSQL extension in Slick

I'm using the PostGIS extension for PostgreSQL and I'm trying to retrieve a PGgeometry object from a table.
This version works fine:
import java.sql.DriverManager
import java.sql.Connection
import org.postgis.PGgeometry

object PostgersqlTest extends App {
  val driver = "org.postgresql.Driver"
  val url = "jdbc:postgresql://localhost:5432/gis"
  var connection: Connection = null
  try {
    Class.forName(driver)
    connection = DriverManager.getConnection(url)
    val statement = connection.createStatement()
    val resultSet = statement.executeQuery("SELECT geom FROM table;")
    while (resultSet.next()) {
      val geom = resultSet.getObject("geom").asInstanceOf[PGgeometry]
      println(geom)
    }
  } catch {
    case e: Exception => e.printStackTrace()
  }
  connection.close()
}
I need to be able to do the same thing using a Slick custom query. But this version doesn't work:
Q.queryNA[PGgeometry]("SELECT geom FROM table;")
and gives me this compilation error:
Error:(50, 40) could not find implicit value for parameter rconv: scala.slick.jdbc.GetResult[org.postgis.PGgeometry]
val query = Q.queryNA[PGgeometry](
^
Is there a simple way to add the PGgeometry data type in Slick without having to convert the returned object to a String and parse it?
To use it successfully, you need to define a GetResult, and maybe a SetParameter if you want to insert/update it in the db.
Here's some code extracted from the Slick tests (p.s. I assume you're using Slick 2.1.0):
implicit val getUserResult = GetResult(r => new User(r.<<, r.<<))
case class User(id: Int, name: String)
val userForID = Q[Int, User] + "select id, name from USERS where id = ?"
But, if your java/scala type is jts.Geometry instead of PGgeometry, you can try to use slick-pg, which has built-in support for jts.Geometry and PostGIS for slick Lifted and Plain SQL.
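If you want to stay with PGgeometry directly, here is a hedged sketch of the missing implicit, assuming the driver materializes the geom column as a PGgeometry exactly as the plain-JDBC version above demonstrates:

import org.postgis.PGgeometry
import scala.slick.jdbc.{GetResult, StaticQuery => Q}

// Assumption: getObject returns a PGgeometry, as in the working JDBC code.
implicit val getPGgeometry = GetResult(r => r.nextObject().asInstanceOf[PGgeometry])

val query = Q.queryNA[PGgeometry]("SELECT geom FROM table;")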
To overcome the same issue, I used slick-pg (0.8.2) and JTS's Geometry classes, as tminglei mentioned in the previous answer. There are two steps to use slick-pg to handle PostGIS's geometry types: (i) extend Slick's PostgresDriver with PgPostGISSupport and (ii) define an implicit converter for your plain query, as shown below.
As shown in this page, you should first extend the PostgresDriver with PgPostGISSupport:
object MyPostgresDriver extends PostgresDriver with PgPostGISSupport {
  override lazy val Implicit = new Implicits with PostGISImplicits
  override val simple = new Implicits with SimpleQL with PostGISImplicits with PostGISAssistants
  val plainImplicits = new Implicits with PostGISPlainImplicits
}
Using the implicit conversions defined in plainImplicits in the extended driver, you can write your query as:
import com.vividsolutions.jts.geom.LineString // Or any other JTS geometry type.
import MyPostgresDriver.plainImplicits._
import scala.slick.jdbc.GetResult

case class Row(id: Int, geom: LineString)

implicit val geomConverter = GetResult[Row](r => {
  Row(r.nextInt, r.nextGeometry[LineString])
})

val query = Q.queryNA[Row](
  """SELECT id, geom FROM table;"""
)
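For completeness, a small hedged usage sketch in Slick 2.x plain-query style (the connection details are placeholders taken from the JDBC example above):

import scala.slick.jdbc.JdbcBackend.Database

val database = Database.forURL("jdbc:postgresql://localhost:5432/gis", driver = "org.postgresql.Driver")

database.withSession { implicit session =>
  // Each element is a Row(id, geom) built by geomConverter above.
  query.list.foreach(println)
}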

Slick (Scalaquery) - insert gives type error

I am using Slick (ScalaQuery) in my Play Framework application. According to the Slick tutorial, I can define a projection for the columns I want to insert into. (I am defining this projection because my index is an auto-incrementing column.) But when I use this projection in an insert, I get an error saying:
[NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;].
My model is defined like this:
package models

import play.api.db._
import play.api.Play.current
import scala.slick.driver.PostgresDriver.simple._
import scala.slick.ql.MappedTypeMapper
import scala.slick.session.{Session, Database}

case class Book(name: String, filename: String)

object Book extends Table[(Long, String, String)]("book") {
  lazy val database = Database.forDataSource(DB.getDataSource())

  def id = column[Long]("id", O PrimaryKey, O AutoInc)
  def name = column[String]("name", O NotNull)
  def filename = column[String]("filename", O NotNull)

  def * = id ~ name ~ filename
  def withoutId = name ~ filename

  def findAll(): Seq[Book] = database.withSession { implicit db: Session =>
    (for (t <- this) yield t.name ~ t.filename).list.map(attrs => Book(attrs._1, attrs._2))
  }

  def create(book: Book): Unit = database.withSession { implicit db: Session =>
    this.withoutId insert (book.name, book.filename)
  }
}
Here is the stack trace:
! #6b1eg7f2d - Internal server error, for request [POST /addBook] ->
play.core.ActionInvoker$$anonfun$receive$1$$anon$1: Execution exception [[NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;]]
at play.core.ActionInvoker$$anonfun$receive$1.apply(Invoker.scala:134) [play_2.9.1-2.0.2.jar:2.0.2]
at play.core.ActionInvoker$$anonfun$receive$1.apply(Invoker.scala:115) [play_2.9.1-2.0.2.jar:2.0.2]
at akka.actor.Actor$class.apply(Actor.scala:318) [akka-actor-2.0.2.jar:2.0.2]
at play.core.ActionInvoker.apply(Invoker.scala:113) [play_2.9.1-2.0.2.jar:2.0.2]
at akka.actor.ActorCell.invoke(ActorCell.scala:626) [akka-actor-2.0.2.jar:2.0.2]
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:197) [akka-actor-2.0.2.jar:2.0.2]
Caused by: java.lang.NoSuchMethodError: scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;
at scala.slick.driver.BasicSQLUtilsComponent$class.quoteIdentifier(BasicSQLUtilsComponent.scala:12) ~[slick_2.10.0-M4-0.10.0-M2.jar:0.10.0-M2]
at scala.slick.driver.PostgresDriver$.quoteIdentifier(PostgresDriver.scala:69) ~[slick_2.10.0-M4-0.10.0-M2.jar:0.10.0-M2]
at scala.slick.driver.BasicStatementBuilderComponent$InsertBuilder.appendNamedColumn(BasicStatementBuilderComponent.scala:400) ~[slick_2.10.0-M4-0.10.0-M2.jar:0.10.0-M2]
at scala.slick.driver.BasicStatementBuilderComponent$InsertBuilder.scala$slick$driver$BasicStatementBuilderComponent$InsertBuilder$$f$1(BasicStatementBuilderComponent.scala:387) ~[slick_2.10.0-M4-0.10.0-M2.jar:0.10.0-M2]
at scala.slick.driver.BasicStatementBuilderComponent$InsertBuilder$$anonfun$scala$slick$driver$BasicStatementBuilderComponent$InsertBuilder$$f$1$1.apply$mcVI$sp(BasicStatementBuilderComponent.scala:377) ~[slick_2.10.0-M4-0.10.0-M2.jar:0.10.0-M2]
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:75) ~[scala-library.jar:0.11.3]
Can someone help me by pointing out what's wrong with my insert?
SLICK is not supported on Scala 2.9 (as used by Play). All "proper" SLICK builds (in the scala.slick namespace and with the new frontend) require Scala 2.10.0-M4 or -M5. The latest supported version on Scala 2.9 is ScalaQuery 0.10.0-M1.
Judging by the features you're using in the snippet, everything should work the same way with ScalaQuery 0.10.0-M1 or even 0.9 (except for the different imports).
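For reference, a hedged sketch of the corresponding dependency change in the Play project's build (the exact artifact coordinates are my assumption; check the ScalaQuery downloads for the build matching your Scala version):

// Swap the Slick 2.10.0-M4 build for ScalaQuery, which runs on the
// Scala 2.9.x that Play 2.0 uses.
libraryDependencies += "org.scalaquery" % "scalaquery_2.9.1" % "0.10.0-M1"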