My application uses the SQLAlchemy ORM exclusively to define the database schema. For the database, two use cases exist. For the sake of simplicity, I'll call the first the "simple" use case and the second the "complex" use case.
Both use cases go through my application, and both generate JSON data. However, the use cases differ in how the JSON data is queried later on; consequently, the reports that run on the database make use of the same application/library code, but construct different queries on the JSON attributes of the schema.
Now, the simple use case uses SQLite exclusively, while the complex use case relies exclusively on PostgreSQL. I would like to use JSONB on PostgreSQL, because all reports that run against the PostgreSQL database cast all JSON fields to JSONB. SQLite, however, obviously does not have JSONB. Still, I would like to use the same ORM code in both cases.
How can I make SQLAlchemy use JSONB when my code connects to a PostgreSQL database, but JSON in all other cases? I.e., can I change the facade of JSON for the PostgreSQL dialect?
As per the helpful people in the SQLAlchemy forums, the definition is actually pretty simple:
Column(
    "my_column",
    sqlalchemy.JSON().with_variant(
        sqlalchemy.dialects.postgresql.JSONB(),
        "postgresql"
    )
)
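For completeness, a rough sketch of how that column might sit in a declarative model, assuming a recent SQLAlchemy (1.4+); the Report model and table name are made up for illustration, and against a PostgreSQL engine the same metadata emits JSONB instead of JSON:

import sqlalchemy
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Report(Base):
    # Hypothetical model; only the JSON/JSONB column matters here.
    __tablename__ = "report"
    id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
    my_column = sqlalchemy.Column(
        sqlalchemy.JSON().with_variant(JSONB(), "postgresql")
    )

# On SQLite this creates a JSON column; on PostgreSQL the same call creates JSONB.
Base.metadata.create_all(sqlalchemy.create_engine("sqlite://"))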
Related
PostgreSQL has excellent support for evaluating JSONPath expressions against JSON data.
For example, this query returns true because the value of the nested field is indeed "foo".
select '{"header": {"nested": "foo"}}'::jsonb #? '$.header ? (#.nested == "foo")'
Notably this query does not reference any schemas or tables. Ideally, I would like to use this functionality of PostgreSQL without creating or connecting to a full database instance. Is it possible to run PostgreSQL in such a way that it doesn't have schemas or tables, but is still able to evaluate "standalone" queries?
Some other context on the project: we need to evaluate JSONPath expressions against JSON data both in a Postgres database and in a Python application. Unfortunately, Python does not have any JSONPath libraries that support enough of the spec to be useful to us.
Ideally, I would like to use this functionality of PostgreSQL without creating or connecting to a full database instance.
Well, it is open source. You can always pull out the source code for the functionality you want and adapt it to compile on its own. But that seems like a large and annoying undertaking, and I probably wouldn't do it. And short of that, no.
Why do you need this? Are you worried about scalability, ease of installation, performance, or what? If you are already using PostgreSQL anyway, firing up a dummy connection just to send some queries to the JSONB engine doesn't seem too hard.
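For what it's worth, a minimal sketch of that dummy-connection approach, assuming psycopg2 and whatever PostgreSQL instance happens to be reachable (the connection string is a placeholder); no schemas or tables are referenced:

import json
import psycopg2  # assumed driver; any PostgreSQL client library would do

conn = psycopg2.connect("dbname=postgres user=postgres host=localhost")

def jsonpath_match(document, path):
    # Evaluate a JSONPath expression against a literal JSON value.
    with conn.cursor() as cur:
        cur.execute("SELECT %s::jsonb @? %s::jsonpath", (json.dumps(document), path))
        return cur.fetchone()[0]

print(jsonpath_match({"header": {"nested": "foo"}}, '$.header ? (@.nested == "foo")'))  # True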
I have a sequence created using Flyway in Postgres which should start from 10000.
I want to get the next value of the sequence using JPA and not a native query, since I have different database platforms running at different cloud providers.
I'm not able to find a JPA query to get the next value of a sequence; please redirect me to the right page if I am missing something.
Thanks for any help in this area!
P.S.: I found this link, which helps me do the same with a native query:
postgresql sequence nextval in schema
I don't think this is possible in a direct way.
JPA doesn't know about sequences.
Only the implementation knows about them and uses them to create IDs.
I see the following options to get it to work anyway:
Create a view in the database with a single row and a single column containing the next value. You can query that with native SQL, which should be the same for all databases since it is a trivial select.
Create a dummy entity using the sequence for id generation, save a new instance and let JPA populate the id.
A horrible workaround but pure JPA.
Bite the bullet and create a simple class that provides the correct native SQL statement to use for the current environment and execute it via JdbcTemplate.
How do I use Slick to call a stored procedure?
I want it to be type safe / injection safe. (I.e., I don't want any SQL query strings in my code...)
According to the docs, Slick includes "Type-safe support of stored procedures" (http://slick.typesafe.com/doc/1.0.0/introduction.html)
But I do not see any example in the docs of how to do it.
That's a typo. It should read "Type-safe support of scalar database functions". At the moment you have to use plain SQL to call stored procedures.
Also see https://groups.google.com/forum/#!searchin/scalaquery/procedure/scalaquery/BUB2-ryR0bY/EFZGX663tRYJ
When trying to write this question, I found this one. It uses Java, the answer gives a Ruby example, and it seems that the injection happens only when using JSON? I ask because I have a presentation where I'll try to compare NoSQL and SQL, and I was going to say: be happy, NoSQL has no SQL injection since it's not SQL ...
Can you please explain to me:
how SQL injection happens when using the Python driver (pymongo),
how to avoid it,
and how it compares to the old-style SQL injection using a comment in the login form.
There are a couple of concerns with injection in MongoDB:
$where JS injection - Building JavaScript functions from user input can result in a query that behaves differently from what you expect. JavaScript functions in general are not a responsible way to program MongoDB queries, and it is highly recommended not to use them unless absolutely needed.
Operator injection - If you allow users to build (from the front end) a $or or similar operator, they could easily abuse this ability to change your queries (see the sketch after this list). This of course does not apply if you just take data from a set of text fields and manually build a $or from that data.
JSON injection - Quite a few people recently have been trying to convert a full JSON document sent from some client-side source (I saw this first in Java, ironically) into a document for insertion into MongoDB. I shouldn't need to even go into why this is bad. A JSON value for a field is fine since, of course, MongoDB is BSON.
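To make the operator-injection point concrete, here is a small pymongo sketch; the collection, field names, and payload shape are invented for the example (password handling is simplified for the illustration):

from pymongo import MongoClient

client = MongoClient()          # assumes a local mongod
users = client["app"]["users"]  # hypothetical database/collection

# Dangerous: if payload comes straight from a JSON request body, a caller can
# send {"username": "admin", "password": {"$ne": None}} and match without
# knowing any password, because the dict is interpreted as a query operator.
def find_user_unsafe(payload):
    return users.find_one({"username": payload["username"],
                           "password": payload["password"]})

# Safer: coerce the inputs to plain strings so they can only ever be compared
# as literal values, never interpreted as operators.
def find_user_safe(payload):
    return users.find_one({"username": str(payload["username"]),
                           "password": str(payload["password"])})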
As @Burhan stated, injection comes from unsanitized input. Fortunately for MongoDB, it has object-oriented querying.
The problem with SQL injection comes from the word "SQL". SQL is a query language built up of strings. MongoDB, on the other hand, actually uses a BSON document (an object) to specify a query. If you keep to the basic common-sense rules I gave you above, you should never have a problem with an attack vector like:
SELECT * FROM tbl_user WHERE name = ''; DROP TABLE tbl_user; --'
Also, MongoDB only supports one operation per command at the moment (without using eval; don't ever do that, though), so that wouldn't work anyway...
I should add that this applies only to injection, not to data validation.
SQL injection has nothing to do with the database. It is a type of vulnerability that allows for execution of arbitrary SQL commands because the target system does not sanitize the input that ends up in the SQL given to the SQL server.
It doesn't matter if you are on NoSQL or not. If you have a system running on MongoDB (or CouchDB, or XYZ db), and you provide a front end where users can enter records, and you don't correctly escape and sanitize the input coming from that front end, you are open to SQL injection.
I need to get a database connection through a firewall, and also limit what queries can be run. DBD::Proxy seems to be the perfect solution for this. However, I'm currently using DBIx::Class, and can't figure out how to hook them together.
In particular, DBD::Proxy doesn't take SQL; it takes particular named queries. But DBIx::Class doesn't seem to have a way to invoke those named queries.
This is inside a Catalyst-based webapp.
DBD::Proxy does take SQL. It allows for named queries as a convenience.
There is no convenient way to use DBIx::Class with DBD::Proxy named queries, since the purpose of the DBIx::Class Object-Relational Mapper (ORM) is to present an object-oriented view of SQL's Data Manipulation Language (DML) statements. The named query feature of DBD::Proxy is not a DML statement, so DBIx::Class does not have a feature that suits your needs: passing a literal string directly to the prepare() function of your DBD::Proxy driver.
Some inconvenient ways:
Don't use DBIx::Class. Just do it in DBI. You could use Catalyst::Model::DBI, or plain DBI + Catalyst::Model::Adaptor + your own model class.
Don't use named queries. This means that if you were planning to use named queries as a way to control access to the database, then you'll need to move the query authorization
logic into the code that makes the call to the database inside your controller or model, depending on how you built your application.