How can I limit my query by date using ScalaQuery - scala

I'm using ScalaQuery and have run into a problem when I attempt to limit my query based on a date field. I'm using Scala 2.9.2 and ScalaQuery 2.9.1:0.10.0-M1. Consider the code below:
case class MyCase(opr_date: Date)

object MyClass extends BasicTable[MyCase]("MYTABLE") {
  def opr_date = column[Date]("OPR_DATE")
  def * = opr_date <> (MyCase, MyCase.unapply _)

  def test(date: Date) = db.withSession {
    logDebug("test date: " + date)
    val qry = for {
      d <- MyClass if (d.opr_date === date)
    } yield d.opr_date
    logDebug(qry.selectStatement)
    qry.list
  }
}
This query never returns any rows. Here is the calling code:
"The data" should {
"be available " in {
val testDate = CommonFormat.parseDate("2012-10-27", CommonFormat.EURO_SHORT).getTime
val records = MyClass.test2(new java.sql.Date(testDate))
records.size must be_>(0)
}
}
The query returns 0 rows and produces the following SQL when I print the select:
SELECT "t1"."OPR_DATE" FROM "MYTABLE" "t1" WHERE ("t1"."OPR_DATE"={d '2012-10-27'})
I have data available for the test date. If I paste the SQL into a SQL editor and change the date so that it's not in the JDBC escape format (e.g. '27-Oct-2012'), the query returns the expected rows. Can anyone tell me what I'm doing wrong? Shouldn't this work?

I found out this morning that this was a data problem: the query works fine. It turns out I was connecting to the wrong server. We have a confusing setup of multiple environments and backup systems that share the same database name. After connecting to the correct server, the query works as expected. I saw different results between my code and the editor tool because they were pointing at different servers (same database name, ugh). Thank you to all who took time to look into this for me; I appreciate your efforts.
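For anyone hitting a similar "works in the SQL editor but not in code" mismatch, a quick sanity check is to log which server and user the session is actually using. This is only a sketch, assuming ScalaQuery's Session exposes the underlying java.sql.Connection as conn, and reusing the db and logDebug from the question:

def logConnectionInfo(): Unit = db.withSession { session: Session =>
  // DatabaseMetaData reports the JDBC URL and user for this session, which makes
  // "same database name, different server" mix-ups visible immediately.
  val md = session.conn.getMetaData
  logDebug("connected to " + md.getURL + " as " + md.getUserName)
}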

Related

Parse PGobject in Groovy

I'm trying to parse a selected result from PostgreSQL in Groovy, but something goes wrong when I try to parse it.
I've tried a lot of things, but haven't reached the goal.
Here's what I've tried so far:
I have three columns in Postgres. One of the columns (document) is of jsonb type and I want to parse it.
// instance of the connection to PostgreSQL
PgAdmin pg = new
List<GroovyRowResult> result = pg.getSelectResult(/* select script goes here */)
for (Map oneRow in result) {
    def document = oneRow.get("document")
    println document // Up to here it works fine: it prints the jsonb data stored in the document column.
    def customerName = document.get("CustomerName") // This fails with an error; CustomerName is a key inside the jsonb.
}
I've also tried to use JsonSlurper, but it also fails.
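No answer is recorded here, but since the rest of this thread is Scala, here is a hedged sketch in plain Scala/JDBC of the usual approach: the jsonb column comes back from the driver as an org.postgresql.util.PGobject, whose getValue returns the raw JSON text, and that text then has to be parsed with a JSON library (in Groovy, JsonSlurper on document.toString()) rather than calling get on the object directly. The connection URL, table, and column names below are placeholders:

import java.sql.DriverManager
import org.postgresql.util.PGobject

// Placeholder connection details; adjust to your environment.
val conn = DriverManager.getConnection("jdbc:postgresql://localhost/mydb", "user", "pass")
val rs = conn.createStatement().executeQuery("SELECT document FROM mytable")

while (rs.next()) {
  // A jsonb column arrives as a PGobject; getValue returns the raw JSON string.
  val json = rs.getObject("document") match {
    case pg: PGobject => pg.getValue
    case other        => String.valueOf(other)
  }
  // json is plain text at this point; parse it with any JSON library
  // (e.g. play-json's Json.parse) before reading "CustomerName".
  println(json)
}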

Slick - Update query with multiple tables

I'm currently facing an issue with an update query in my Scala/Slick 3 project. I have a Report class, which contains multiple Products, and each Product contains multiple Parts. I want to implement a function that marks every Part of every Product within this Report as assessed.
I thought about doing something like this:
def markProductPartsForReportAsAssessed(reportId: Int) = {
  val query = for {
    (products, parts) <- report_product_query.filter(_.reportId === reportId)
                           .join(part_query.filter(_.isAssessed === false))
                           .on(_.productId === _.productId)
  } yield parts.isAssessed
  db.run(query.update(true))
}
Now, when I run this code, Slick throws this exception:
SlickException: A query for an UPDATE statement must resolve to a comprehension with a single table.
I already looked at similar problems, whose solutions (like this or this) weren't really satisfying to me.
Why does Slick throw this exception, and why is it a problem to begin with? I was under the impression that my yield already takes care of not "updating multiple tables".
Thanks in advance!
I guess it's because an UPDATE statement requires just one table. If you write the SQL for the above query, it would be something like:
UPDATE parts a SET isAssessed = 'true'
WHERE a.isAssessed = 'false' and
      exists(select 'x' from products b
             where a.productId = b.productId and b.reportId = reportId)
Therefore, you can put the conditions related to the Product table into the filter as an exists subquery, as follows:
val reportId = "123" // some variable

val subQuery = (reportId: Rep[String], productId: Rep[String]) =>
  report_product_query.filter(r => r.reportId === reportId && r.productId === productId)

val query = part_query.filter(p => p.isAssessed === (false: Rep[Boolean]) &&
  subQuery(reportId, p.productId).exists).map(_.isAssessed)

db.run(query.update(true))
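As a quick check that the rewritten query really compiles to a single-table UPDATE, you can print the statement Slick generates before running it (a sketch, assuming Slick 3, where the action returned by update exposes its SQL via statements):

val updateAction = query.update(true)
// The generated SQL should reference only the parts table, with products inside an EXISTS subquery.
updateAction.statements.foreach(println)
db.run(updateAction)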

Anorm returning 0 results while psql returns 2 results

I'm powering a search bar via AJAX. It passes a selected filter (a radio button that maps to a database column) and a search string for whatever was entered in the search bar. The Scala/Play/Anorm code I am using is this:
def searchDB(searchString: String, filter: String): List[DatabaseResult] = {
  DB.withConnection { implicit c =>
    SQL(
      """
      SELECT name, email, emailsecondary, picture, linkedin, title, company, companylink, companydesc, location, github, stackoverflow, twitter, blog
      FROM mailinglistperson
      WHERE {filter} LIKE '%{searchString}%'
      """).on(
      'filter -> filter,
      'searchString -> searchString
    ).as(databaseResultParser.*)
  }
}
When I run an equivalent query against the database (PostgreSQL) using psql, it returns 2 results, i.e.:
select id, name, email from mailinglistperson where company like '%kixer%';
But the Anorm code returns 0 results when passed the exact same values (I've verified the values via printlns).
EDIT: When I switch the anorm code to use String Interpolation I get:
[error] - play.core.server.netty.PlayDefaultUpstreamHandler - Cannot invoke the action
java.lang.RuntimeException: No parameter value for placeholder: 3
EDIT2: I also tried passing the '%...%' along with searchString into LIKE and still got 0 results.
There are two issues: the name of the column, and the filter value.
As for the filter value: you have to omit the single quotes in the SQL command and pass the "%" wildcards as part of the argument; the quotes are added automatically for a string parameter.
As for the column name: it is bound like a string parameter, so it too gets quoted automatically, which turns the column name into a string literal:
[debug] c.j.b.PreparedStatementHandle - select ... from ... where 'filter' like '%aaa%'
One solution: Use normal string interpolation s"""... $filter ...""".
All together:
SQL(
  s"""
  SELECT name, email, ...
  FROM mailinglistperson
  WHERE $filter LIKE {searchString}
  """).on(
  'searchString -> "%" + searchString + "%"
).as(databaseResultParser.*)
but that should be accompanied by a check beforehand, something like
val validColumns = List("name", "email")
if (!validColumns.contains(filter)) {
  throw new IllegalArgumentException("...")
}
to guard against SQL injection.
Update
As pointed out by cchantep: If Anorm >= 2.4 is used, one can use mixed interpolation (both for column names and values):
SQL"... WHERE #$filter LIKE $searchString"
In this case it's only partially safe against SQL injection: the interpolation covers the values, but not the column name.
Update 2
As for logging the SQL statements, see Where to see the logged sql statements in play2?
But as you are using PostgreSQL, I suggest the definitive source, the PostgreSQL log. In postgresql.conf:
log_statement = 'all' # none, ddl, mod, all
then you will see in the PostgreSQL log file something like this:
LOG: select * from test50 where name like $1
DETAIL: Parameter: $1 = '%aaa'

Scala Postgres IN Operator

I'm developing a web application based on Activator and Postgres.
I am trying to perform the following SQL query:
SELECT *
FROM table t_0
WHERE t_0.category IN (?)
Then, to populate the query, I am using the following Scala code:
val readQuery = """ SELECT * FROM table t_0 WHERE t_0.category IN (?) """
val categories = Array("free time", "living")
val insertValues = Array(categories)
val queryResult = await { connection.sendPreparedStatement(readQuery, insertValues) }
Even though there are matching records in the database, I always get an empty result set. I have already tried some variations with Array[Byte], but I have never managed to get any results.
Does anybody have some tips or trick that I can use?
Thanks!
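One common workaround (a sketch, not from the original post, and assuming the async driver's sendPreparedStatement accepts a Seq of values): a single "?" cannot be expanded into a list by the driver, so build one placeholder per category and pass the values individually instead of nesting the array:

val categories = Seq("free time", "living")

// One "?" per element, e.g. "IN (?, ?)" for two categories.
val placeholders = categories.map(_ => "?").mkString(", ")
val readQuery = s"SELECT * FROM table t_0 WHERE t_0.category IN ($placeholders)"

// connection and await are the same helpers used in the question.
val queryResult = await { connection.sendPreparedStatement(readQuery, categories) }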

Anorm broken in PostgreSQL 9.0 Selects with Order By?

I'm trying to build a list page like the one in the "Computers" sample. My environment is Play 2.0 and PostgreSQL 9.0.
I have the following method in my User object:
def list(page: Int = 0, pageSize: Int = 10, orderBy: Int = 1, filter: String = "%"): Page[User] = {
  val offset = pageSize * page
  val mode = if (orderBy > 0) "ASC NULLS FIRST" else "DESC NULLS LAST"
  Logger.debug("Users.list with params: page[%d] pageSize[%d] orderBy[%d] filter[%s] order[%s]".format(page, pageSize, orderBy, filter, mode))
  DB.withConnection { implicit connection =>
    val users = SQL(
      """
      select * from publisher
      where name ilike {filter}
      order by {orderBy} %s
      limit {pageSize} offset {offset}
      """.format(mode)
    ).on(
      'pageSize -> pageSize,
      'offset -> offset,
      'filter -> filter,
      'orderBy -> scala.math.abs(orderBy)
    ).as(User.simple *)
    val totalRows = SQL(
      """
      select count(*) from publisher
      where name like {filter}
      """
    ).on(
      'filter -> filter
    ).as(scalar[Long].single)
    Page(users, page, offset, totalRows)
  }
}
No matter which value of orderBy I provide, the order is always based on the id of the entities.
The query generated by Anorm is valid PostgreSQL and works fine when run against the database directly. But it seems as if the Anorm parser were ignoring the order in which the results are returned and instead returning a list ordered by id.
I've even tried simplifying the query to "select * from publisher order by 2 ASC/DESC", but nothing changes; the ordering is still ignored on return.
Any suggestion on how to solve this issue?
Thanks to Guillaume on the Play mailing list I found a workaround.
All placeholders work except the one in the order by. The worst part is that when you follow the logs, the driver generates the correct query and PostgreSQL receives it. I'm not sure what the deal is, very confusing, but if I remove that placeholder, it just works.
Depressing :(
I solved it like this:
val users = SQL(
  """
  select * from publisher
  where name ilike {filter}
  order by %d %s
  limit {pageSize} offset {offset}
  """.format(scala.math.abs(orderBy), mode)
).on(
  'pageSize -> pageSize,
  'offset -> offset,
  'filter -> filter
).as(User.simple *)
Now you'll be screaming "SQL INJECTION". Relax. Although it may be possible somehow, orderBy is an integer (which we turn into its absolute value for extra safety). If you try to call the controller that provides orderBy with a string, Play returns a 404 error, so only integers are allowed. And if there is no column corresponding to the given integer, the order by is ignored. So, not ideal, but not so bad.
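(A likely explanation, not stated in the original answer: a bound placeholder in ORDER BY is sent as a constant value rather than a column reference, so every row sorts equally and the clause has no visible effect.) If you prefer not to interpolate the raw integer, a hedged alternative is to map it to a whitelisted column name first; the mapping below is hypothetical and only sketches the idea:

// Hypothetical whitelist: map the incoming integer to a known column name.
val sortableColumns = Map(1 -> "id", 2 -> "name", 3 -> "email")
val orderColumn = sortableColumns.getOrElse(scala.math.abs(orderBy), "id")

val users = SQL(
  """
  select * from publisher
  where name ilike {filter}
  order by %s %s
  limit {pageSize} offset {offset}
  """.format(orderColumn, mode) // only whitelisted names ever reach the SQL string
).on(
  'pageSize -> pageSize,
  'offset -> offset,
  'filter -> filter
).as(User.simple *)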