I have a problem using the OrientDB Lucene index. When I query through it, it returns an incomplete result set. Here is an example:
create class Foo extends V
create property Foo.text string
create index Foo.text_spanish on Foo(text) fulltext engine lucene metadata
{ "analyzer": "org.apache.lucene.analysis.es.SpanishAnalyzer",
"index": "org.apache.lucene.analysis.es.SpanishAnalyzer",
"query": "org.apache.lucene.analysis.es.SpanishAnalyzer",
"allowLeadingWildcard": true
}
insert into Foo (text) values ("axxx")
insert into Foo (text) values ("áxxx")
insert into Foo (text) values ("xxxa")
insert into Foo (text) values ("xxxá")
insert into Foo (text) values ("xxaxx")
insert into Foo (text) values ("xxáxx")
Now when I run this query:
select from Foo where text lucene "*a*"
I get:
xxáxx
xxaxx
xxxa
axxx
And it misses:
áxxx
xxxá
And if I run this:
select from Foo where text lucene "*á*"
I get:
áxxx
xxxá
And it misses the rest. Even in this case it should at least show xxáxx.
What am I doing wrong?
By default, OrientDB supports all the analyzers listed here; however, accented characters such as á are not part of the "Basic Latin" block, and they are only matched against their unaccented forms if you create a custom analyzer that applies a folding filter such as ASCIIFoldingFilter.
Once you have created and compiled the class, put its .jar in OrientDB's lib directory and then create the index with your custom analyzer.
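For illustration, here is a minimal sketch of such an analyzer (written in Scala, which also compiles to a jar; the class name FoldingSpanishAnalyzer is made up, and the imports assume the Lucene 5.x line that older OrientDB releases ship with):
import org.apache.lucene.analysis.Analyzer
import org.apache.lucene.analysis.Analyzer.TokenStreamComponents
import org.apache.lucene.analysis.core.LowerCaseFilter
import org.apache.lucene.analysis.miscellaneous.ASCIIFoldingFilter
import org.apache.lucene.analysis.standard.StandardTokenizer

// Lower-cases each token and folds accented characters to their Basic Latin
// equivalents (á -> a), so "*a*" also matches "áxxx" and "xxxá".
class FoldingSpanishAnalyzer extends Analyzer {
  override protected def createComponents(fieldName: String): TokenStreamComponents = {
    val tokenizer = new StandardTokenizer()
    val folded = new ASCIIFoldingFilter(new LowerCaseFilter(tokenizer))
    new TokenStreamComponents(tokenizer, folded)
  }
}
You would then reference the fully qualified name of this class in the index metadata instead of SpanishAnalyzer (note that plain folding gives up the Spanish stemming and stop words that SpanishAnalyzer provides).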
In the meantime, a quick solution would be:
SELECT FROM Foo WHERE text LUCENE "*a*" OR text LUCENE "*á*";
There is a simple database entity:
case class Foo(id: Option[UUID], keywords: Seq[String])
I want to implement a search function which returns all entities of type Foo which have at least one keyword that contains the search string.
I'm using Slick and tried this:
def searchKeywords(txt: String): Future[Seq[Foo]] = {
  val action = Foos.filter(p => p.keywords.any like s"%$txt%").result
  db.run(action)
}
This piece of code compiles, but when executing, I get this SQL error:
PSQLException: ERROR: syntax error at or near "any"
The generated sql statement looks like:
select "id", "title", "tagline", "logo", "short_desc", "keywords", "initial_condition", "work_process", "end_result", "ts", "lm", "v" from "projects" where any("keywords") like '%foo%'
And it does not work with PostgreSQL (I'm using v12).
Schema for the table looks like this:
CREATE TABLE foos
(
    id UUID NOT NULL PRIMARY KEY,
    keywords varchar[] NOT NULL
);
How can I achieve to search in a list of strings using the like operator?
From a pure SQL point of view, you need a derived table to achieve that. I hope some expert corrects me if I'm wrong, but you can't use the SQL LIKE operator directly on an array.
Supposing your table definition is:
CREATE TABLE foos
(
    id UUID NOT NULL PRIMARY KEY,
    keywords varchar[] NOT NULL
);
Then an SQL way of retrieving the results would be:
select * from (
    select id, unnest(keywords) as keyw from foos
) myTable where keyw like '%foo%'
Otherwise, the syntax you're using for the like operator seems correct:
myProperty like s"%$myVariable%"
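If you want to keep this inside Slick, one option is to drop down to plain SQL for this one query, since the lifted embedding has no direct "LIKE over an array element" operator. A rough sketch (an assumption, not tested: it uses the foos table above, a db in scope as in the question, and returns only the matching ids rather than full Foo rows):
import scala.concurrent.Future
import slick.jdbc.PostgresProfile.api._

def searchKeywordIds(txt: String): Future[Seq[String]] = {
  val pattern = "%" + txt + "%"
  // Unnest the keywords array in a derived table and apply LIKE to each element.
  val action =
    sql"""
      SELECT DISTINCT t.id::text
      FROM (SELECT id, unnest(keywords) AS keyw FROM foos) AS t
      WHERE t.keyw LIKE $pattern
    """.as[String]
  db.run(action)
}
The ids can then be used to load the corresponding Foo rows, or the derived table can be joined back to foos in the same statement.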
I'm trying to return a simple, scalar string value from a Postgres DB using Knex. So far, everything I do returns a JSON object with a key (the column name) and the value, so I have to reach into the object to get the value. If I return multiple rows, then I get multiple JSON objects, each one repeating the key.
I could be returning multiple columns, in which case each row would at least need to be an array. I'm not looking for a special case where specifying a single column returns the value without the array -- I'm OK reaching into the array. I want to avoid the JSON object with the repetitive listing of column names as keys.
I've scoured the Knex docs but don't see how to control the output.
My table is a simple mapping table with two string columns:
CREATE TABLE public._suite
(
    piv_id character(18) NOT NULL,
    sf_id character(18) NOT NULL,
    CONSTRAINT _suite_pkey PRIMARY KEY (piv_id)
)
When I build a query using Knex methods like
let myId = 'foo', table = '_suite';
return db(table).where('piv_id', myId).first(['sf_id'])
.then( function(id) { return(id); });
I get {"sf_id":"a4T8A0000009PsfUAE"} ; what I want is just "a4T8A0000009PsfUAE"
If I use a raw query, like
return db.raw(`select sf_id from ${table} where piv_id = '${myId}'`);
I get a much larger JSON object describing the result:
{"command":"SELECT","rowCount":1,"oid":null,"rows":[{"sf_id":"a4T8A0000009Q9HUAU"}],"fields":[{"name":"sf_id","tableID":33799,"columnID":2,"dataTypeID":1042,"dataTypeSize":-1,"dataTypeModifier":22,"format":"text"}],"_parsers":[null],"RowCtor":null,"rowAsArray":false}
What do I have to do to just get the value itself? (Again, I'm OK if it's in an array -- I just don't want the column names.)
Take a look at the pluck method.
db(table).where('piv_id', myId).pluck('sf_id'); // => resolves to ["a4T8A0000009PsfUAE"]
I have a bunch of similar temp tables which I am trying to query using go-pg's ORM. I can't find a way to dynamically change the queried table during a select:
import "gopkg.in/pg.v4"
type MyModel struct {
    TableName struct{} `sql:"temp_table1"`
    Id        int64
    Name      string
}
var mymodels []MyModel
err := db.Model(&mymodels).Column("mymodel.id", "mymodel.name").Select()
This will query temp_table1 as defined in the model's TableName. Is there a way to pass the table name as a parameter so I can query temp_table_X?
(I can just not use ORM and go with raw db.Query(), but I wanted to see if there is a way to use ORM).
Got an answer on github:
err := db.Model().TableExpr("temp_table_999 AS mymodel").Column("mymodel.id", "mymodel.name").Select(&mymodels)
Seems you can specify the table directly: db.Model(&mymodels).Table("temp_table1").Column("mymodel.id", "mymodel.name").Select()
With OrientDB you can insert/update directly using JSON as the input via the CONTENT keyword, which is great; ODB takes care of the mappings.
I'd like to be able to return JSON from SELECT queries as well - is this possible?
Kurt
You can use the .toJSON() method.
Syntax: <value>.toJSON([<format>])
Example:
create class Test extends V
insert into Test content {"attr1": "value 1", "attr2": "value 2"}
select @this.toJSON('rid,version,fetchPlan:in_*:-2 out_*:-2') from Test
Ref.: SQL Methods - .toJSON()
I'm powering a search bar via AJAX. It passes a selected filter (a radio button that maps to a database column) and a search string for whatever is entered in the search bar. The Scala/Play/Anorm code I am using is this:
def searchDB(searchString: String, filter: String): List[DatabaseResult] = {
  DB.withConnection { implicit c =>
    SQL(
      """
      SELECT name, email, emailsecondary, picture, linkedin, title, company, companylink, companydesc, location, github, stackoverflow, twitter, blog
      FROM mailinglistperson
      WHERE {filter} LIKE '%{searchString}%'
      """).on(
      'filter -> filter,
      'searchString -> searchString
    ).as(databaseResultParser.*)
  }
}
When I run a query on the database (PostgreSQL) using psql that is equivalent to the above Anorm code, it returns 2 results, i.e.:
select id, name, email from mailinglistperson where company like '%kixer%';
But the Anorm code returns 0 results when passed the exact same values (I've verified the values via printlns).
EDIT: When I switch the anorm code to use String Interpolation I get:
[error] - play.core.server.netty.PlayDefaultUpstreamHandler - Cannot invoke the action
java.lang.RuntimeException: No parameter value for placeholder: 3
EDIT2: I also tried passing the '%...%' along with searchString into LIKE and still got 0 results.
There are two issues: the name of the column, and the filter value.
As for the filter value: you have to omit the single quotes in the SQL command and instead include the "%" wildcards in the bound argument; the quotes are handled automatically for string parameters.
As for the column name: it is bound like a string parameter, so it gets quoted automatically as well, which produces a statement like:
[debug] c.j.b.PreparedStatementHandle - select ... from ... where 'filter' like '%aaa%'
One solution: Use normal string interpolation s"""... $filter ...""".
All together:
SQL(
  s"""
    SELECT name, email, ...
    FROM mailinglistperson
    WHERE $filter LIKE {searchString}
  """).on(
  'searchString -> "%" + searchString + "%"
).as(databaseResultParser.*)
but that should be accompanied by a check beforehand, something like
val validColumns = List("name", "email")
if (!validColumns.contains(filter)) {
  throw new IllegalArgumentException("...")
}
to guard against SQL injection.
Update
As pointed out by cchantep: If Anorm >= 2.4 is used, one can use mixed interpolation (both for column names and values):
SQL"... WHERE #$filter LIKE $searchString"
In this case it is only partially safe against SQL injection: the interpolation covers the values, but not the column name, which #$ splices in verbatim.
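Putting the pieces together, the original method could look roughly like this (a sketch only: it reuses DB, databaseResultParser and the column list from the question, and the whitelist contents are just an example):
def searchDB(searchString: String, filter: String): List[DatabaseResult] = {
  // Whitelist of columns the radio buttons are allowed to search on.
  val validColumns = Set("name", "email", "company", "location")
  if (!validColumns.contains(filter))
    throw new IllegalArgumentException(s"Illegal filter column: $filter")

  DB.withConnection { implicit c =>
    val pattern = "%" + searchString + "%"
    SQL"""
      SELECT name, email, emailsecondary, picture, linkedin, title, company,
             companylink, companydesc, location, github, stackoverflow, twitter, blog
      FROM mailinglistperson
      WHERE #$filter LIKE $pattern
    """.as(databaseResultParser.*)
  }
}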
Update 2
As for logging the SQL statements, see Where to see the logged sql statements in play2?
But as you are using PostgreSQL, I suggest the definitive source, the PostgreSQL log. In postgresql.conf:
log_statement = 'all' # none, ddl, mod, all
then you will see in the PostgreSQL log file something like this:
LOG: select * from test50 where name like $1
DETAIL: Parameter: $1 = '%aaa'