How to perform full-text search on a GIN Postgres index in Doctrine 2.2?

Is there a simple way of doing this in doctrine, or am I going to have to do a native sql query? If someone could point me in the right direction, it would be appreciated.

The quick answer is no, there is not a "simple way" of doing this.
You need to use Doctrine Native SQL, which can be found here. It lets you map the results into usable entities as you're used to, while still using database-specific features.
You could also use a DQL user-defined function, described here. This is probably the "correct" way.
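For reference, the raw SQL that such a native query would ultimately run might look like the sketch below (the table `articles` and `tsvector` column `document` are hypothetical names, not from the question):

```sql
-- Assumes a precomputed tsvector column with a GIN index:
--   CREATE INDEX idx_articles_document ON articles USING gin (document);
SELECT id, title
FROM articles
WHERE document @@ to_tsquery('english', 'search & terms');
```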

Related

Using Full-Text Search with kotlin-exposed

I am trying to implement FULL-TEXT SEARCH (PostgreSQL) in Kotlin using kotlin-exposed. I have my queries in raw SQL but cannot write queries containing to_tsvector() or to_tsquery() in Kotlin. I couldn't actually find anything similar anywhere. After a bit of reading, I understood that complex queries could be written as raw SQL here (I couldn't get that working either), but there is a chance of SQL injection in that. Is there a way to tackle this?
I am not posting any code since what I've tried is just trial and error; actually, the methods are not even available in the IDE. Any help is appreciated. My DB is PostgreSQL.

How to create custom aggregate functions using SQLAlchemy ORM?

SQLAlchemy 1.4 ORM using an AsyncSession, Postgres backend, Python 3.6
I am trying to create a custom aggregate function using the SQLAlchemy ORM. The SQL query would look something like:
COUNT({group_by_function}), {group_by_function} AS {aggregated_field_name}
I've been searching for information on this.
I know this can be created internally within the Postgres db first, and then used by SA, but this will be problematic for the way the codebase I'm working with is set up.
I know SQLAlchemy-Utils has functionality for this, but I would prefer not to use an external library.
The most direct post on this topic I can find says "The creation of new aggregate functions is backend-dependent, and must be done directly with the API of the underlying connection." But this is from quite a few years ago, and I thought there might have been updates since.
Am I missing something in the SA ORM docs that discusses this or is this not supported by SA, full stop?
You can try a query like this (note that it uses the built-in sum aggregate rather than defining a new one):
from sqlalchemy import func

query = (
    db.session.query(Model)
    .with_entities(
        Model.id,
        func.sum(Model.number).label('total_sum'),
    )
    .group_by(Model.id)
)
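The quoted answer's point, that truly custom aggregates are registered on the underlying driver connection rather than through the ORM, can be illustrated with Python's stdlib sqlite3 driver. This is a sketch: the aggregate name `geomean` and the table are made up, and the psycopg2/asyncpg equivalents for Postgres differ.

```python
import sqlite3

# A custom aggregate is a class the driver calls once per row (step)
# and once at the end (finalize) -- registered on the *connection*.
class GeometricMean:
    def __init__(self):
        self.product = 1.0
        self.count = 0

    def step(self, value):
        self.product *= value
        self.count += 1

    def finalize(self):
        return self.product ** (1.0 / self.count) if self.count else None

conn = sqlite3.connect(":memory:")
conn.create_aggregate("geomean", 1, GeometricMean)

conn.execute("CREATE TABLE t (grp TEXT, n REAL)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [("a", 2.0), ("a", 8.0), ("b", 3.0)])

rows = conn.execute(
    "SELECT grp, geomean(n) FROM t GROUP BY grp ORDER BY grp"
).fetchall()
print(rows)
```

Once registered on each raw connection (e.g. from a SQLAlchemy connect event listener), the function can be invoked from the ORM side like any other via `func.geomean(...)`.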

AnsiDialect not available in spring-data-jpa tree

I'm interested in using the AnsiDialect from spring-data-jdbc to connect to my Oracle database using a proprietary driver. However, this dialect is not present in spring-data-jpa (which only contains Hibernate dialects).
Is there a way to do this, or do I need to implement my own dialect?
By the looks of it, it seems that the equivalent of AnsiDialect is simply org.hibernate.dialect.Dialect. Don't know why I didn't think of this :)

Hibernate indexing for like expression

I am using Hibernate 3.3.1 and a PostgreSQL 9.2.2 server. The tables for my application are generated automatically by Hibernate, and I would now like to optimize a very frequently used "like" expression on one table, which looks like this:
"where path like 'RootFolder_FirstSubfolder%'"
By default, Hibernate only creates an index for the "id" column I defined via annotation. Are there any recommendations for how I could speed up my "like" expression using additional indexes?
Thanks very much in advance for helping me
Kind regards
Shannon
Hibernate can use the Index annotation to automatically create an additional index:
@org.hibernate.annotations.Index(name = "IDX_PATH")
private String path;
BUT it won't help, since the created index is not suitable for LIKE clauses.
Read the most upvoted answer here for a better solution. Unfortunately, it requires custom SQL, and AFAIK there is no easy way to integrate custom SQL into the script generated by the Hibernate schema update tool.
As an alternative to Hibernate auto-update, you can use a tool like Liquibase to manage schema updates. It requires more setup, but it gives you full control of the schema update scripts.
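For reference, the custom SQL in question is typically a B-tree index declared with a pattern operator class, which PostgreSQL can use for left-anchored LIKE patterns such as 'RootFolder_FirstSubfolder%' (a sketch; the index and table names are hypothetical):

```sql
-- Plain B-tree indexes aren't used for LIKE under most locales;
-- the pattern operator class makes left-anchored LIKE indexable.
CREATE INDEX idx_path ON folder_table (path varchar_pattern_ops);
```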

Increase security of DB INSERT

Currently, I have a PostGIS DB, and I have the basics working. However, I am inserting directly into the database using pg_query. I have been recommended to use pg_query_params to help prevent against SQL injections, but am unsure how to implement this. Below is a cut-down example of my current insert statement for a site location. How would I, for example, utilise pg_query_params with this example? I know I will have to implement further security, but it is a starting point.
EDIT: I was going to use the drupal form API but it gave me headaches. I realize that would do a lot of this stuff automatically.
$sql = "INSERT INTO sites_tbl (river_id ,sitename ,the_geom) VALUES ('$_POST[river_id]','$_POST[sitename]',st_geomfromtext('POINT($geomstring)',27700))";
$result = pg_query($sql);
Because you are splicing strings into the SQL rather than binding parameters, your example is vulnerable to SQL injection. It's best to avoid building queries this way. In your case there are a few options to take into account:
Learn the Drupal API (considering you are using Drupal, this would be best for code consistency),
or
Use stored procedures,
or
Use parameterized queries, via a library like PDO or via pg_query_params.
Normally you would use stored procedures in addition to PDO; unfortunately this is sometimes unmanageable because there is too much code. My advice is to use stored procedures as much as possible.
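With pg_query_params, the statement uses $1, $2, ... placeholders and the values travel separately from the SQL text, so user input is never spliced into the query string. The same idea is sketched below with Python's stdlib sqlite3 for a self-contained illustration (the table and the malicious payload are made up, not the questioner's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sites_tbl (river_id INTEGER, sitename TEXT)")

# Untrusted input -- note the embedded quote and injection attempt.
river_id = "7"
sitename = "St. Mary's'); DROP TABLE sites_tbl; --"

# Placeholders (?) keep the data out of the SQL text,
# so the payload is stored as plain data, never executed.
conn.execute(
    "INSERT INTO sites_tbl (river_id, sitename) VALUES (?, ?)",
    (river_id, sitename),
)

row = conn.execute("SELECT river_id, sitename FROM sites_tbl").fetchone()
print(row)
```

In the question's PHP, the equivalent shape is a query string with $1/$2/$3 placeholders passed to pg_query_params alongside an array of the POSTed values, with the POINT string supplied as one of those parameters.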