Parse multiple ResultSets from select query with Anorm 2.4 - postgresql

I am continuing my journey with Play, Scala and Anorm, and I ran into the following problem:
One of my repository classes holds the logic to fetch a user's list of emails from the DB, as follows:
val sql =
  """
    |select * from user_email where user_id = {user_id}
    |;--
  """.stripMargin
SQL(sql).on(
  'user_id -> user.id
).as(UserEmail.simpleParser *)
while the parser looks something like this:
val simpleParser: RowParser[UserEmail] = (
  SqlParser.get[Muid](qualifiedColumnNameOf("", Identifiable.Column.Id)) ~
  AuditMetadata.generateParser("") ~
  SqlParser.get[Muid]("user_id") ~
  SqlParser.get[String]("email") ~
  SqlParser.get[Boolean]("verified") ~
  SqlParser.get[Muid]("verification_code_id") ~
  SqlParser.get[Option[DateTime]]("verification_sent_date")
) map {
  case id ~ audit ~ userId ~ email ~ emailVerified ~ emailVerificationCode ~ emailVerificationSentDate =>
    UserEmail(
      id,
      audit,
      userId,
      email,
      emailVerified,
      emailVerificationCode,
      emailVerificationSentDate
    )
}
When I execute this in a test, I am getting the following error:
[error] Multiple ResultSets were returned by the query. (AbstractJdbc2Statement.java:354)
...
It is expected that there is more than a single result; however, I am puzzled about how to parse this case correctly.
I was under the assumption that:
UserEmail.simpleParser.single is for a single row
and
UserEmail.simpleParser.* is to handle multiple rows
I could not figure this out based on the documentation and, at least for now, haven't found anything useful anywhere else.
How do I parse multiple rows from the result set?
Update: I just found this gist (https://gist.github.com/davegurnell/4b432066b39949850b04) with a pretty good explanation and created a ResultSetParser like so:
val multipleParser: ResultSetParser[List[UserEmail]] = UserEmail.simpleParser.*
And... that didn't help!
Thanks,
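For reference, both combinators do exist on RowParser in Anorm 2.4; here is a minimal sketch of the usual usages against the query above (the names mirror the question's code). Given that, the error likely comes from the statement itself rather than the parser; the trailing ;-- is worth checking, since the JDBC driver may treat what follows the semicolon as an additional statement.

// exactly one row expected; fails on zero or multiple rows
val one: UserEmail =
  SQL(sql).on('user_id -> user.id).as(UserEmail.simpleParser.single)
// zero or one row
val maybeOne: Option[UserEmail] =
  SQL(sql).on('user_id -> user.id).as(UserEmail.simpleParser.singleOpt)
// zero or more rows, collected into a List
val all: List[UserEmail] =
  SQL(sql).on('user_id -> user.id).as(UserEmail.simpleParser.*)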

Related

Anorm returning 0 results while psql returns 2 results

I'm powering a search bar via AJAX; it passes a selected filter (a radio button) that corresponds to a database column, plus a search string for whatever is entered in the search bar. The Scala/Play/Anorm code I am using is this:
def searchDB(searchString: String, filter: String): List[DatabaseResult] = {
  DB.withConnection { implicit c =>
    SQL(
      """
        SELECT name, email, emailsecondary, picture, linkedin, title, company, companylink, companydesc, location, github, stackoverflow, twitter, blog
        FROM mailinglistperson
        WHERE {filter} LIKE '%{searchString}%'
      """).on(
      'filter -> filter,
      'searchString -> searchString
    ).as(databaseResultParser.*)
  }
}
When I run a query against the database (PostgreSQL) using psql that is equivalent to the above Anorm code, it returns 2 results, i.e.:
select id, name, email from mailinglistperson where company like '%kixer%';
But the Anorm code returns 0 results when passed the exact same values (I've verified the values via printlns).
EDIT: When I switch the Anorm code to use string interpolation, I get:
[error] - play.core.server.netty.PlayDefaultUpstreamHandler - Cannot invoke the action
java.lang.RuntimeException: No parameter value for placeholder: 3
EDIT2: I also tried passing the '%...%' along with searchString into LIKE and still got 0 results.
There are two issues: the column name, and the filter value.
As for the filter value: you have to omit the single quotes in the SQL command and instead include the "%" wildcards in the bound argument; quoting is handled automatically for strings.
As for the column name: it is bound as a string parameter too, so it also ends up quoted:
[debug] c.j.b.PreparedStatementHandle - select ... from ... where 'filter' like '%aaa%'
One solution: Use normal string interpolation s"""... $filter ...""".
All together:
SQL(
  s"""
    SELECT name, email, ...
    FROM mailinglistperson
    WHERE $filter LIKE {searchString}
  """).on(
  'searchString -> "%" + searchString + "%"
).as(databaseResultParser.*)
but that should be accompanied by a check beforehand, something like
val validColumns = List("name", "email")
if (!validColumns.contains(filter)) {
  throw new IllegalArgumentException("...")
}
to guard against SQL injection.
Update
As pointed out by cchantep: with Anorm >= 2.4 you can use mixed interpolation (both for column names and values):
SQL"... WHERE #$filter LIKE $searchString"
This is only partially safe against SQL injection: the binding covers the values, but not the column name spliced in with #$.
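Putting both together, a sketch of the whitelist check combined with Anorm 2.4 interpolation (the column list here is an assumption; #$ splices the column name in verbatim, while $ binds the value safely):

// only allow known column names to be spliced into the statement
val validColumns = Set("name", "email", "company")
if (!validColumns.contains(filter))
  throw new IllegalArgumentException(s"invalid column: $filter")
val pattern = "%" + searchString + "%"
SQL"""
  SELECT name, email
  FROM mailinglistperson
  WHERE #$filter LIKE $pattern
""".as(databaseResultParser.*)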
Update 2
As for logging the SQL statements, see Where to see the logged sql statements in play2?
But since you are using PostgreSQL, I suggest the definitive source, the PostgreSQL log. In postgresql.conf:
log_statement = 'all' # none, ddl, mod, all
then you will see something like this in the PostgreSQL log file:
LOG: select * from test50 where name like $1
DETAIL: Parameter: $1 = '%aaa'

Scala Postgres IN Operator

I'm developing a web application based on Activator and Postgres.
I am trying to perform the following SQL query:
SELECT *
FROM table t_0
WHERE t_0.category IN (?)
then, to populate the query, I am using the following Scala code:
val readQuery = """ SELECT * FROM table t_0 WHERE t_0.category IN (?) """
val categories = Array("free time", "living")
val insertValues = Array(categories)
val queryResult = await { connection.sendPreparedStatement(readQuery, insertValues) }
Even though there are records in the database, I always get an empty set. I have already tried some forms of Array[Byte], but I have never managed to get results.
Does anybody have tips or tricks I can use?
Thanks!
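A common workaround, sketched under the assumption that sendPreparedStatement takes the query string plus a sequence of values (as in the code above): a single ? cannot bind a whole collection, so expand one placeholder per element and pass the elements individually.

val categories = Seq("free time", "living")
// build "IN (?, ?)" with one placeholder per category
val placeholders = categories.map(_ => "?").mkString(", ")
val readQuery = s"SELECT * FROM table t_0 WHERE t_0.category IN ($placeholders)"
val queryResult = await { connection.sendPreparedStatement(readQuery, categories) }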

Dynamic Auto Sharding support for Scala Slick

This post is related to the issue already raised at Dynamically changing the database shard that I am connecting to.
Pointers to code that should be changed to implement this feature are given at https://github.com/slick/slick/issues/703
I am a newbie to Scala and Slick. Can I get some help on how to proceed with implementing this feature? Is there a Slick/Scala pattern to do this at the application level?
My problem is: I have a pool of connections to different MySQL shards, and when I write a query involving IDs (sharding keys), Slick should dynamically run that query on the respective database shard.
For example, if I write a query like this:
val q = for {
  user    <- users.filter(_.name === "cat")
  post    <- posts.filter(_.postedBy === user.id)
  comment <- comments.filter(_.postId === post.id)
} yield comment.content
q.run
a trivial case should be like the one below:
users += User(id = 1, name = "cat", email = "cat@mat.com") => hits shard no 1
Even if the user, post, and comment IDs are produced dynamically, Slick should hit the correct database shard using some sharding criterion (key (ID) % 3), and everything should happen in the background just like a single-database query.
To implement the feature at the application level:
Is there any way to read the Query object's state dynamically, so that I can write a function like
def func(q: Query[Something], shards: Map[Int, Database], num: Int): Unit = {
  shards(q.getId % num).withSession { implicit session =>
    q.run
  }
}
Usage:
val q = users.insert(User(id = 1, name = "cat", email = "cat@cat.com"))
func(q, shards, 10) => q executes on one of the 10 shards.
Thanks.
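A rough application-level sketch of this idea (a hypothetical helper, not a Slick feature; since a Slick Query does not expose its sharding key, the key is passed in explicitly alongside the action):

// route an action to the shard owning the given key,
// assuming shards are keyed 0 until num
def onShard[T](shardKey: Int, shards: Map[Int, Database], num: Int)(action: Session => T): T =
  shards(shardKey % num).withSession { session => action(session) }

// usage: the caller supplies the sharding key (here, the user id)
onShard(1, shards, 10) { implicit session =>
  users += User(id = 1, name = "cat", email = "cat@mat.com")
}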

Slick issue when going with PostgreSQL

I'm using slick in a scala project to query some tables.
//define table
object Addresses extends Table[Address]("assetxs.address") {
  def id = column[Int]("id", O.PrimaryKey)
  def street = column[String]("street")
  def number = column[String]("number")
  def zipcode = column[String]("zipcode")
  def country = column[String]("country")
  def * = id ~ street ~ number ~ zipcode ~ country <> (Address, Address.unapply _)
}
If I use any query on this table it does not work (it says it cannot find my table), so I went further and printed out the query:
implicit val session = Database.forURL("jdbc:postgresql://localhost:5432/postgres",
  driver = "org.postgresql.Driver", user = "postgres", password = "postgres").createSession()
session.withTransaction {
  val query = Query(Addresses)
  println("Addresses: " + query.selectStatement)
}
I noticed that the schema.table name appears in double quotes, so the statement is:
select x2."id", x2."street", x2."number", x2."zipcode", x2."country"
from "assetxs.address" x2
which of course does not work (I tried to run it in a PostgreSQL tool and needed to remove the quotes around the table name to get it working).
Can you please tell me if there is any slick option to not include "" in any query when using table names?
You've put the schema into the table name. A (quoted) table name containing a dot character is valid in SQL but it's not what you want here. You have to specify the schema separately:
object Addresses extends Table[Address](Some("assetxs"), "address")
In the end I was able to solve this issue.
I specify the table name only:
object Addresses extends Table[Address]("address")
and changed my postgresql.conf to include my schema in the search path (it seems that Slick looks only at the public schema):
search_path = '"$user",assetxs,public'
and now it works.
The solution I found when wanting to work with both H2 (testing) and Postgres (production) using liquibase and Slick:
Stick with lowercase in your Slick Table objects:
class MyTable(tag: Tag) extends Table[MyRecord](tag, Some("my_schema"), "my_table")
In your H2 url config you'll need to specify DATABASE_TO_UPPER=false (this prevents the table and column names from being upper cased) and put quotation marks around the INIT schema (this prevents the schema from being upper cased):
url = "jdbc:h2:mem:test;MODE=PostgreSQL;DATABASE_TO_UPPER=false;INIT=create schema if not exists \"my_schema\"\;SET SCHEMA \"my_schema\""
When specifying schema names in liquibase scripts, they must also be quoted so that H2 won't try to capitalize them.
Since this problem still bothers Scala newcomers (like me), I did a little research and found that the following application.conf was successful with Slick 3.1.1 and PostgreSQL 9.5:
postgres.devenv = {
  url = "jdbc:postgresql://localhost:5432/dbname?currentSchema=customSchema"
  user = "user"
  password = "password"
  driver = org.postgresql.Driver
}
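For completeness, a minimal sketch of picking that entry up from code with Slick 3.x (assuming the postgres.devenv path shown above):

import slick.driver.PostgresDriver.api._

// reads url/user/password/driver from the "postgres.devenv" block
val db = Database.forConfig("postgres.devenv")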
You're just using the wrong driver; check your imports:
import scala.slick.driver.PostgresDriver.simple._

How can I limit my query by date using ScalaQuery

I'm using ScalaQuery and have run into a problem when I attempt to limit my query based on a date field. I'm using Scala 2.9.2 and ScalaQuery 2.9.1:0.10.0-M1. Consider the code below:
case class MyCase(opr_date: Date)

object MyClass extends BasicTable[MyCase]("MYTABLE") {
  def opr_date = column[Date]("OPR_DATE")
  def * = opr_date <> (MyCase, MyCase.unapply _)

  def test(date: Date) = db.withSession {
    logDebug("test date: " + date)
    val qry = for {
      d <- MyClass if (d.opr_date === date)
    } yield d.opr_date
    logDebug(qry.selectStatement)
    qry.list
  }
}
This query never returns any rows. Here is the calling code:
"The data" should {
"be available " in {
val testDate = CommonFormat.parseDate("2012-10-27", CommonFormat.EURO_SHORT).getTime
val records = MyClass.test2(new java.sql.Date(testDate))
records.size must be_>(0)
}
}
The query returns 0 rows and produces the following SQL when I print the select:
SELECT "t1"."OPR_DATE" FROM "MYTABLE" "t1" WHERE ("t1"."OPR_DATE"={d '2012-10-27'})
I have data available for the test date. If I paste the SQL into a SQL editor and edit the date so that it's not in the JDBC template format ('27-Oct-2012'), the query returns the expected rows. Can anyone tell me what I'm doing wrong? Shouldn't this work?
I found out this morning that this was a data problem. The query works fine. It turns out I was connecting to the wrong server. We have a confusing setup of multiple environments and backup systems that share the same database name. After connecting to the correct server, the query works as expected. I saw different results between my code and the editor tool because they were pointing at different servers (same database name, ugh). Thank you to all who took the time to look into this; I appreciate your efforts.