I am trying to pass a List[String] into a query and then match on possibly multiple nodes by a property, where the property value is one of the strings passed into the query.
I get an error: Expected parameter(s): list
import org.neo4j.driver.v1._
import scala.concurrent.Future

def getNodesByPropertyValue(list: List[String]): Future[List[String]] = {
  val getNodes =
    s"""
       | UNWIND $$list AS propValue
       | MATCH (i: item {id: propValue})<-[:CONTAINS]-(c: Collection)
       | RETURN i.originalID AS OID
       |""".stripMargin

  storeAPI.NeoQuery(getNodes).resultList().map { result =>
    result.map { record =>
      record.get("OID").toString
    }
  }.recoverWith {
    case e: Exception =>
      logger.error("Failure in getNodesByProperty: ", e)
      throw e
  }
}
Also, when I use $list instead, I get an error saying Neo4j doesn't recognise the function List().
A solution to this would be appreciated.
Also, what is the difference between passing a variable into a query with $ and passing it with $$? I thought $$ might be used for collections, but I am unsure; I haven't found information on it yet.
Thanks.
In my storeAPI.NeoQuery call I was missing the parameter map that binds the $list placeholder in the query to the val list outside of the query.
As for $ vs $$: inside a Scala s-interpolated string, $list splices the Scala value's toString (e.g. List(a, b, c)) directly into the Cypher text, which is why Neo4j complains about an unknown List() function. $$ is the interpolator's escape for a literal $, so the text sent to Neo4j contains $list and is treated as a query parameter.
Working version below.
import org.neo4j.driver.v1._
import scala.collection.JavaConverters._
import scala.concurrent.Future

def getNodesByPropertyValue(list: List[String]): Future[List[String]] = {
  val getNodes =
    s"""
       | UNWIND $$list AS propValue
       | MATCH (i: item {id: propValue})<-[:CONTAINS]-(c: Collection)
       | RETURN i.originalID AS OID
       |""".stripMargin

  // "list" in the parameter map matches the $list placeholder in the query text.
  storeAPI.NeoQuery(getNodes, Map("list" -> list.asJava)).resultList().map { result =>
    result.map { record =>
      record.get("OID").toString
    }
  }.recoverWith {
    case e: Exception =>
      logger.error("Failure in getNodesByProperty: ", e)
      throw e
  }
}
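For reference, passing the list as a parameter with the raw Neo4j Java driver looks roughly like the sketch below. This is a minimal, untested example; the Bolt URL, credentials and session handling are assumptions for illustration, and storeAPI.NeoQuery presumably does the equivalent under the hood.
import org.neo4j.driver.v1.{AuthTokens, GraphDatabase, Values}
import scala.collection.JavaConverters._

// Assumed connection details for illustration only.
val driver = GraphDatabase.driver("bolt://localhost:7687", AuthTokens.basic("neo4j", "password"))
val session = driver.session()
try {
  val query =
    """UNWIND $list AS propValue
      |MATCH (i: item {id: propValue})<-[:CONTAINS]-(c: Collection)
      |RETURN i.originalID AS OID""".stripMargin

  // "list" here must match the $list placeholder in the Cypher text.
  val result = session.run(query, Values.parameters("list", list.asJava))
  val oids = result.list().asScala.map(_.get("OID").asString()).toList
  println(oids)
} finally {
  session.close()
  driver.close()
}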
Related
I have a generic repository with a method like this:
object Queries {
  def getByFieldId(field: String, id: Int): String = {
    s"""
       |SELECT
       |  DF.id AS fileId,
       |  DF.name AS fileName,
       |  AG.id AS groupId,
       |  AG.name AS groupName
       |FROM $tableName DFG
       |INNER JOIN directory_files DF on DF.id = DFG.file_id
       |INNER JOIN ad_groups AG on AG.id = DFG.group_id
       |WHERE DFG.$field = $id
       |""".stripMargin
  }
}
def getByFieldId(field: String, id: Int): Try[List[Object]] = {
  try {
    val sqlQuery = Queries.getByFieldId("ad_group", 1)
    statement = conn.getPreparedStatement(sqlQuery)
    setParameters(statement, params)
    resultSet = statement.executeQuery()

    val metadata = resultSet.getMetaData
    val columnCount = metadata.getColumnCount
    val columns: ListBuffer[String] = ListBuffer.empty
    for (i <- 1 to columnCount) {
      columns += metadata.getColumnName(i)
    }

    var item: List[Object] = List.empty
    while (resultSet.next()) {
      val row = columns.toList.map(x => resultSet.getObject(x))
      item = row
    }
    Success(item)
  } catch {
    case e: Any => Failure(errorHandler(e))
  } finally conn.closeConnection(resultSet, statement)
}
The problem is that my result set ignores the query aliases and returns the columns as (id, name, id, name) instead of (fileId, fileName, groupId, groupName).
One solution I found is to use column indexes instead of column names, but I'm not sure whether that would work across the entire app without breaking other queries.
Another solution I found suggests that I can keep using column names but fetch them together with their column types, and then, inside resultSet.next(), call the matching getter for each:
// this part of code is not tested
// this idea came to me writing this topic
while (resultSet.next()) {
  val row = columns.toList.map(x => {
    x.colType match {
      case "string"  => resultSet.getString(x.colName)
      case "integer" => resultSet.getInt(x.colName)
      case "decimal" => resultSet.getBigDecimal(x.colName)
      case _         => resultSet.getString(x.colName)
    }
  })
  item = row
}
You may try to use getColumnLabel instead of getColumnName
as documented
Gets the designated column's suggested title for use in printouts and displays. The suggested title is usually specified by the SQL AS clause. If a SQL AS is not specified, the value returned from getColumnLabel will be the same as the value returned by the getColumnName method.
Note that this behaviour is highly dependent on the RDBMS used.
For Oracle both methods return the alias and there is no chance to get the original column name.
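A minimal sketch of the change in the column-reading loop from the question (same metadata, columnCount and columns variables as above):
for (i <- 1 to columnCount) {
  // getColumnLabel returns the SQL AS alias (e.g. fileId) when one is given,
  // falling back to the plain column name otherwise.
  columns += metadata.getColumnLabel(i)
}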
I have a Neo4j query that returns a list of nodes that most likely won't be empty, but in some cases could return null. How can I check for a null result within a map or flatMap operation?
val nodes = {
  storeAPI.NeoQuery(parentNodesIDs).resultList().map {
    _.flatMap { record =>
      record.get("assetList").asList.asScala.map(_.toString).toSet
    }
  }.recover {
    case e: Exception =>
      logger.error(s"Failure in getSimplifiedAssetListFromContainer: ", e)
      throw e
  }
}
I have tried pulling the storeAPI.NeoQuery(parentNodesIDs).resultList() out into a val outside of the above block, but then val nodes goes out of scope.
val nodes = {...} is of type Future[List[String]]
Any help would be great!
Try flatMap(Option(_)) like so
record
.get("assetList")
.asList
.asScala
.flatMap(Option(_))
.map(_.toString)
.toSet
For example
List(1,null,3).flatMap(Option(_)).map(_.toString).foreach(println)
outputs
1
3
This works because Option(null) is None, and Nones get discarded by flatMap.
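Applied to the query from the question, that looks roughly like the sketch below (same assumptions about storeAPI, parentNodesIDs, logger and the record shape as in the question):
val nodes = storeAPI.NeoQuery(parentNodesIDs).resultList().map {
  _.flatMap { record =>
    record
      .get("assetList")
      .asList
      .asScala
      .flatMap(Option(_)) // drops any nulls before the toString conversion
      .map(_.toString)
      .toSet
  }
}.recover {
  case e: Exception =>
    logger.error("Failure in getSimplifiedAssetListFromContainer: ", e)
    throw e
}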
I have two case classes like this:
case class ClassTeacherWrapper(
success: Boolean,
classes: List[ClassTeacher]
)
and a second one:
case class ClassTeacher(
clid: String,
name: String
)
And a query like this:
val query =
  SQL"""
    SELECT
      s.section_sk::text AS clid,
      s.name AS name
    from
      ********************
  """
P.S. I put * in place of the query for security reasons.
So my query returns two values. How do I map them to the case class ClassTeacher?
Currently I am doing something like this:
def getClassTeachersByInstructor(instructor: String, section: String): ClassTeacherWrapper = {
  implicit var conn: Connection = null
  try {
    conn = datamartDatasourceConnectionPool.getDBConnection()
    // Define query
    val query =
      SQL"""
        SELECT
          s.section_sk::text AS clid,
          s.name AS name
        ********
      """
    logger.info("Read from DB: " + query)
    // create a List containing all the datasets from the resultset and return
    new ClassTeacherWrapper(
      success = true,
      query.as(Macro.namedParser[ClassTeacher].*)
    )
    // Trying a new approach
    // val users = query.map(user => new ClassTeacherWrapper(true, user[Int]("clid"), user[String]("name"))).toList
  } catch {
    case NonFatal(e) =>
      logger.error("getGradebookScores: error getting/parsing data from DB", e)
      throw e
  }
}
With this I am getting this exception:
{
"error": "ERROR: operator does not exist: uuid = character varying\n
Hint: No operator matches the given name and argument type(s). You
might need to add explicit type casts.\n Position: 324"
}
Can anyone help with where I am going wrong? I am new to Scala and Anorm.
What should I modify in the query.as part of the code?
Do you need the success field? Often an empty list would suffice.
I find parsers very useful (and reusable), so something like the following in the ClassTeacher singleton (or similar location):
val fields = "s.section_sk::text AS clid, s.name"

val classTeacherP =
  get[String]("clid") ~   // clid is selected as text, so parse it as a String
  get[String]("name") map {
    case clid ~ name => ClassTeacher(clid, name)
  }

def allForInstructorSection(instructor: String, section: String): List[ClassTeacher] =
  DB.withConnection { implicit c => //-- or injected db
    SQL(s"""select $fields from ******""")
      .on('instructor -> instructor, 'section -> section)
      .as(classTeacherP *)
  }
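If you do keep the wrapper, it then becomes a thin layer on top of that parser-based method, along the lines of this sketch:
def getClassTeachersByInstructor(instructor: String, section: String): ClassTeacherWrapper =
  ClassTeacherWrapper(
    success = true,
    classes = allForInstructorSection(instructor, section)
  )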
I have a simple database consisting of two tables, movie and comment, where comments are related to movies, and I have the following piece of Scala Anorm code:
case class Comment(commentId: Long, comment: String)
case class Movie(movieId: Long, name: String, movieType: String)

object MovieDao {

  val movieParser: RowParser[Movie] = {
    long("movieId") ~
      str("name") ~
      str("movieType") map {
      case movieId ~ name ~ movieType => Movie(movieId, name, movieType)
    }
  }

  val commentParser: RowParser[Comment] = {
    long("commentId") ~
      str("comment") map {
      case commentId ~ comment => Comment(commentId, comment)
    }
  }

  def getAll(movieType: String) = DB.withConnection {
    implicit connection =>
      SQL(
        """
          |SELECT
          |movie.movieId,
          |movie.name,
          |movie.movieType,
          |comment.commentId,
          |comment.comment
          |FROM movie
          |LEFT JOIN comment USING(movieId)
          |WHERE movieType = {movieType}
        """.stripMargin)
        .on("movieType" -> movieType)
        .as(((movieParser ~ (commentParser ?)) map (flatten)) *)
        .groupBy(_._1) map { (mc: (Movie, List[(Movie, Option[Comment])])) =>
          mc match {
            case (a, b) => (a, b filter { // filter rows with no comments
              case (c, Some(d)) => true
              case _            => false
            } map (_._2))
          }
        } toList
  }
}
My goal is to return List[(Movie, Option[List[Comment]])] from the getAll method, so I can iterate over movies and check whether there are any comments as simply as possible, i.e. match None or Some on the comments List. I'm currently returning List[(Movie, Option[List[Option[Comment]]])] and I'm only able to check the size of the comments List (thanks to the filter method), which I don't consider the right way to do it in Scala.
My second question is about parsing the query itself; I think the way I did it is just too complicated. Is there a simpler and nicer way to parse a 0..N relation using Anorm?
Peter, it's possibly more style than anything dramatically different, but with a MovieComments case class, you could write something like:
case class MovieComments(movie: Movie, comments: List[Comment])

val movieCommentsP =
  movieParser ~ (commentParser ?) map {
    case movie ~ comment =>
      MovieComments(movie, if (comment.isEmpty) List() else List(comment.get))
  }

val movieSqlSelector = "m.movieId, m.name, m.movieType"
val commentSqlSelector = "c.commentId, c.comment"

def getAll(movieType: String): List[MovieComments] = DB.withConnection {
  implicit connection =>
    (SQL(
      s"""
         |SELECT
         |$movieSqlSelector,
         |$commentSqlSelector
         |FROM movie m
         |LEFT JOIN comment c USING(movieId)
         |WHERE m.movieType = {movieType}
       """.stripMargin)
      .on('movieType -> movieType)
      .as(movieCommentsP *)
      .groupBy(_.movie.movieId) map {
        case (movieId, movieComments) =>
          MovieComments(
            movieComments.head.movie,
            movieComments.flatMap(_.comments))
      }
    ).toList
}
You may really need an Option[List[Comment]], but wouldn't a List[Comment] do? List() is the "no comment" case after all. (P.S. I find the use of sqlSelector variables helps with refactoring.)
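For instance, a caller could pattern match on the (possibly empty) comment list directly; a small sketch with a hypothetical movieType value:
getAll("horror").foreach {
  case MovieComments(movie, Nil)      => println(s"${movie.name}: no comments")
  case MovieComments(movie, comments) => println(s"${movie.name}: ${comments.size} comment(s)")
}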
Using Casbah, I query Mongo.
val mongoClient = MongoClient("localhost", 27017)
val db = mongoClient("test")
val coll = db("test")
val results: MongoCursor = coll.find(builder)
var matchedDocuments = List[DBObject]()
for (result <- results) {
  matchedDocuments = matchedDocuments :+ result
}
Then, I convert the List[DBObject] into JSON via:
val jsonString: String = buildJsonString(matchedDocuments)
Is there a better way to convert from "results" (MongoCursor) to JSON (JsValue)?
private def buildJsonString(list: List[DBObject]): Option[String] = {
  def go(list: List[DBObject], json: String): Option[String] = list match {
    case Nil                    => Some(json)
    case x :: xs if json == ""  => go(xs, x.toString)
    case x :: xs                => go(xs, json + "," + x.toString)
    case _                      => None
  }
  go(list, "")
}
Assuming you want implicit conversion (like in flavian's answer), the easiest way to join the elements of your list with commas is:
private implicit def buildJsonString(list: List[DBObject]): String =
list.mkString(",")
Which is basically the answer given in Scala: join an iterable of strings
If you want to include the square brackets to properly construct a JSON array you'd just change it to:
list.mkString("[", ",", "]") // punctuation madness
However if you'd actually like to get to Play JsValue elements as you seem to indicate in the original question, then you could do:
list.map { x => Json.parse(x.toString) }
Which should produce a List[JsValue] instead of a String. However, if you're just going to convert it back to a string again when sending a response, then it's an unneeded step.
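For example, a minimal sketch that goes all the way to a single Play JsValue, assuming the matchedDocuments list from the question and Play's Json API:
import play.api.libs.json._

// Parse each DBObject's JSON text and wrap the results in a JsArray.
val jsValues: List[JsValue] = matchedDocuments.map(doc => Json.parse(doc.toString))
val jsArray: JsValue = JsArray(jsValues)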