Is ArangoDB AQL affected by injection issues (like SQL injection)?

I'm working with AQL these days, and I'm building a library that dynamically creates AQL queries.
Because I couldn't find anything about a parameter injection issue for AQL (like SQL injection), do you think it is safe to put my FILTER value directly inside the AQL query string?

If you use bind parameters for all user-defined input, the inserted value will not be evaluated by the AQL parser, and hence injected code will not be executed.
Safe query:
FOR x IN items FILTER x.name == @name RETURN x
Unsafe query:
"FOR x IN items FILTER x.name == " + name + " RETURN x"
Inserting something like
'a' LET t = (FOR h IN items REMOVE h IN items)
as name will, in the safe query, simply return the documents whose name is exactly that string (not harmful).
In the unsafe query it will remove all documents from items (harmful).
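For completeness, a rough sketch of passing the bind value separately from arangosh (db._query takes the bind values as its second argument; the items collection is the one from the example above):
// hostile input -- harmless here, because it is never parsed as AQL
var name = "'a' LET t = (FOR h IN items REMOVE h IN items)";
db._query("FOR x IN items FILTER x.name == @name RETURN x", { name: name }).toArray();
// returns only documents whose name is literally that string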

Can't declare a variable in ArangoDB

I'm using ArangoDB 3.4.6-1 and would like to delete vertices with AQL (as stated here) in the online console.
According to the first step of the tutorial, you are supposed to save your edges into a variable. In my case the statement looks like this:
LET edgeKeys = (FOR v, e, p IN 1..100 INBOUND 'Node/N3' GRAPH 'graph' RETURN e._key)
The FOR itself, without the surrounding brackets, returns the correct result:
[
"E3"
]
Yet, running the whole statement with the brackets just throws the following error:
Query: AQL: syntax error, unexpected end of query string near ')' at position 1:83 (while parsing)
I tried using a comparable command with other graphs or other returned values and objects, but always get the same error.
So far I wasn't able to find a proper solution online. The tutorial provides the following example code (copied 1:1):
LET edgeKeys = (FOR v, e IN 1..1 ANY 'persons/eve' GRAPH 'knows_graph' RETURN e._key)
And I'm getting exactly the same error; it doesn't even get as far as checking the collections.
What am I doing wrong?
Defining a variable with LET on its own is not a valid AQL query.
From the AQL Syntax documentation:
An AQL query must either return a result (indicated by usage of the RETURN keyword) or execute a data-modification operation (indicated by usage of one of the keywords INSERT, UPDATE, REPLACE, REMOVE or UPSERT). The AQL parser will return an error if it detects more than one data-modification operation in the same query or if it cannot figure out if the query is meant to be a data retrieval or a modification operation.
Using the full AQL block stated in the tutorial, execution works as expected, since that query performs a data-modification operation with REMOVE. A RETURN inside the LET variable declaration alone is not sufficient to run an AQL query. Removing the LET and running only the inner FOR ... RETURN also works, since in that case the query directly returns the result.
Complete AQL query:
LET edgeKeys = (FOR v, e IN 1..1 ANY 'persons/eve' GRAPH 'knows_graph' RETURN e._key)
LET r = (FOR key IN edgeKeys REMOVE key IN knows)
REMOVE 'eve' IN persons
An additional RETURN also makes the query work:
LET edgeKeys = (FOR v, e IN 1..1 ANY 'persons/eve' GRAPH 'knows_graph' RETURN e._key)
RETURN edgeKeys

Add SQL restriction to predicates in JPA query

I have to filter entities according to some parameters sent by the client. For this purpose, I create a list of predicates like this:
List<Predicate> predicates = new ArrayList<Predicate>();
if (filters != null && StringUtils.hasText(filters.getName())) {
    predicates.add(cb.like(cb.upper(root.get("name")),
            "%" + filters.getName().trim().toUpperCase() + "%"));
}
// OTHER FILTERS
Now the problem is to add a criterion that is expressed in SQL. In particular, I have to find entities that are located inside a particular polygon, so I should add a restriction like this:
and within(point, :bounds) = true // bounds is the geometry drawn by the client
I have read that I could use @Formula to define an SQL filter, but that annotation is applied every time, even when the location filter is not set in the input.
Can anyone help me?
You should be able to call database functions using CriteriaBuilder.function(...). Try cb.equal(cb.function("within", Boolean.class, root.get("point"), bounds), cb.literal(true)).
This may or may not work out of the box, though, depending on the type of the bounds parameter, because Hibernate needs to recognize that type to know its SQL representation. You might need to create a custom LiteralType.
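As a rough sketch, this could slot into the existing predicate list like the following (filters.getBounds() is a hypothetical accessor for the client-drawn geometry, not part of the original code):
// hypothetical wiring; getBounds() and the handling of the geometry literal are assumptions
if (filters != null && filters.getBounds() != null) {
    predicates.add(cb.equal(
            cb.function("within", Boolean.class,
                    root.get("point"), cb.literal(filters.getBounds())),
            Boolean.TRUE));
}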

Access locally scoped variables from within a string using parse or value (KDB / Q)

The following lines of Q code all throw an error, because when the statement "local" is parsed, the local variable is not in the correct scope.
{local:1; value "local"}[]
{[local]; value "local"}[1]
{local:1; eval parse "local"}[]
{[local]; eval parse "local"}[1]
Is there a way to reach the local variable from inside the parsed string?
Note: This is a simplification of the actual problem I'm grappling with, which is to write a function that executes a query, accepting a list of columns which it should return. I imagine the finished product looking something like this:
getData:{[requiredColumns;condition]
    value "select ",(", " sv string[requiredColumns])," from myTable where someCol=condition"
    }
The condition parameter in this query is the one that isn't recognised, and I do realise I could append its value rather than reference it inside the string, but the real query uses lots of local variables, including tables etc., so it's not as easy as just pulling all the variables out of the string before calling value on it.
I'm new to KDB and Q, so if anyone has a better way to achieve the same effect I'm happy to be schooled on the proper way to achieve this outcome in Q. I'd still be interested to know if the variable-access thing is possible, though.
In the first example, you are right that local is not within the correct scope, as value is looking for the global variable local.
One way to get around this is to use a namespace, which still defines the variable globally but keeps it accessible only through that namespace. In the modified example below I have defined local in the .ns namespace:
{.ns.local:1; value ".ns.local"}[]
For the problem you are facing with selecting, if requiredColumns is a symbol list of columns you can just use the take operator # to select them.
getData:{[requiredColumns] requiredColumns#myTable}
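For example, with a made-up table (column names are only illustrative):
myTable:([] id:`a`b`c; val:1 2 3; other:10 20 30)
getData[`id`val]     / returns a table containing only the id and val columns
getData[enlist `id]  / a single column still has to be passed as a list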
For more advanced queries using variables you may have to use the functional select form, explained here. This will allow you to include variables in the where and by clauses of the select statement.
The same example in functional form would be (no by clause, only select and where):
getData:{[requiredColumns;condition]
    requiredColumns:(),requiredColumns;
    ?[myTable;enlist (=;`someCol;condition);0b;requiredColumns!requiredColumns]}
The first line ensures that requiredColumns is a list even if the user enters a single column name.
value looks for variables in the global scope; that's why you are getting an error. Inside your function you can use local variables directly, as you are already doing.
Your function is mostly correct; it just needs a slight correction to append the condition (shown below). However, a better approach in this case would be to use a functional select.
Using functional select:
q) t:([]id:`a`b; val:3 4)
q) gd: {?[`t;enlist (=;`val;y);0b;((),x)!(),x]}
q) gd[`id;3] / for single column
Output:
id
--
a
q) gd[`id`val;3] / for multiple columns
If your condition column is of type symbol, then enlist your condition value, like:
q) gd: {?[`t;enlist (=;`id;y);0b;((),x)!(),x]}
q) gd[`id;enlist `a]
You can use parse to get a functional form of qsql queries:
q) parse " select id,val from t where id=`a"
?
`t
,,(=;`id;,`a)
0b
`id`val!`id`val
Using string concatenation (your function):
q)getData:{[requiredColumns;condition] value "select ",(", " sv string[requiredColumns])," from t where id=", .Q.s1 condition}
q) getData[enlist `id;`a] / for single column
q) getData[`id`val;`a] / for multi columns

Erlang mnesia equivalent of "select * from Tb"

I'm a total erlang noob and I just want to see what's in a particular table I have. I want to just "select *" from a particular table to start with. The examples I'm seeing, such as the official documentation, all have column restrictions which I don't really want. I don't really know how to form the MatchHead or Guard to match anything (aka "*").
A very simple primer on how to just get everything out of a table would be very appreciated!
For example, you can use qlc:
F = fun() ->
        Q = qlc:q([R || R <- mnesia:table(foo)]),
        qlc:e(Q)
    end,
mnesia:transaction(F).
The simplest way to do it is probably mnesia:dirty_match_object:
mnesia:dirty_match_object(foo, #foo{_ = '_'}).
That is, match everything in the table foo that is a foo record, regardless of the values of the fields (every field is '_', i.e. wildcard). Note that since it uses record construction syntax, it will only work in a module where you have included the record definition, or in the shell after evaluating rr(my_module) to make the record definition available.
(I expected mnesia:dirty_match_object(foo, '_') to work, but that fails with a bad_type error.)
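For illustration, assuming the foo table is backed by a record like the following (the field names are invented), the call can be used from any module that includes this definition:
-record(foo, {id, name}).   %% hypothetical record definition for the foo table

all_foo() ->
    mnesia:dirty_match_object(foo, #foo{_ = '_'}).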
To do it with select, call it like this:
mnesia:dirty_select(foo, [{'_', [], ['$_']}]).
Here, MatchHead is '_', i.e. match anything. The guards are [], an empty list, i.e. no extra limitations. The result spec is ['$_'], i.e. return the entire record. For more information about match specs, see the match specifications chapter of the ERTS user guide.
If an expression is too deep and gets printed with ... in the shell, you can ask the shell to print the entire thing by evaluating rp(EXPRESSION). EXPRESSION can either be the function call once again, or v(-1) for the value returned by the previous expression, or v(42) for the value returned by the expression preceded by the shell prompt 42>.

How do I dynamically build a search block in sunspot?

I am converting a Rails app from using acts_as_solr to sunspot.
The app uses the field search capability in solr that was exposed in acts_as_solr. You could give it a query string like this:
title:"The thing to search"
and it would search for that string in the title field.
In converting to sunspot I am parsing out field specific portions of the query string and I need to dynamically generate the search block. Something like this:
Sunspot.search(table_clazz) do
  keywords(first_string, :fields => :title)
  keywords(second_string, :fields => :description)
  ...
  paginate(:page => page, :per_page => per_page)
end
This is complicated by also needing to do duration (seconds, integer) ranges and negation if the query requires it.
On the current system users can search for something in the title, excluding records with something else in another field and scoping by duration.
In a nutshell, how do I generate these blocks dynamically?
I recently did this kind of thing using instance_eval to evaluate procs (created elsewhere) in the context of the Sunspot search block.
The advantage is that these procs can be created anywhere in your application yet you can write them with the same syntax as if you were inside a sunspot search block.
Here's a quick example to get you started for your particular case:
def build_sunspot_query(conditions)
  condition_procs = conditions.map { |c| build_condition c }
  Sunspot.search(table_clazz) do
    condition_procs.each { |c| instance_eval(&c) }
    paginate(:page => page, :per_page => per_page)
  end
end

def build_condition(condition)
  Proc.new do
    # write this code as if it were inside the sunspot search block
    keywords condition[:words], :fields => condition[:field].to_sym
  end
end

conditions = [{words: "tasty pizza", field: "title"},
              {words: "cheap", field: "description"}]

build_sunspot_query conditions
By the way, if you need to, you can even instance_eval a proc inside of another proc (in my case I composed arbitrarily-nested 'and'/'or' conditions).
Sunspot provides a method called Sunspot.new_search which lets you build the search conditions incrementally and execute it on demand.
An example from Sunspot's source code:
search = Sunspot.new_search do
  with(:blog_id, 1)
end

search.build do
  keywords('some keywords')
end

search.build do
  order_by(:published_at, :desc)
end

search.execute

# This is equivalent to:
Sunspot.search do
  with(:blog_id, 1)
  keywords('some keywords')
  order_by(:published_at, :desc)
end
With this flexibility, you should be able to build your query dynamically. Also, you can extract common conditions to a method, like so:
def blog_facets
  lambda { |s|
    s.facet(:published_year)
    s.facet(:author)
  }
end

search = Sunspot.new_search(Blog)
search.build(&blog_facets)
search.execute
I have solved this myself. The solution I used was to compile the required scopes as strings, concatenate them, and then eval them inside the search block.
This required a separate query-builder library that interrogates the Solr indexes to ensure that a scope is not created for a non-existent index field.
The code is very specific to my project, and too long to post in full, but this is what I do:
1. Split the search terms
This gives me an array of the terms, or terms plus fields:
['field:term', 'non field terms']
2. This is passed to the query builder.
The builder converts the array to scopes, based on which indexes are available. As an example, this method takes the model class, field and value and returns a scope if the field is indexed:
def convert_text_query_to_search_scope(model_clazz, field, value)
  if field_is_indexed?(model_clazz, field)
    escaped_value = value.gsub(/'/, "\\\\'")
    "keywords('#{escaped_value}', :fields => [:#{field}])"
  else
    ""
  end
end
3. Join all the scopes
The generated scope strings are joined with join("\n") and the result is evaled inside the search block.
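Roughly, it looks like this (the names scopes, models_to_search, page and per_page are illustrative, not my actual code):
scope_code = scopes.join("\n")
Sunspot.search(*models_to_search) do
  # each generated line is a DSL call such as keywords('...', :fields => [:title])
  eval(scope_code)
  paginate(:page => page, :per_page => per_page)
end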
This approach allows the user to select the models they want to search, and optionally to do field-specific searching. The system will then only search the models that have the specified fields (or common fields), ignoring the rest.
The method to check if the field is indexed is:
# based on http://blog.locomotivellc.com/post/6321969631/sunspot-introspection
def field_is_indexed?(model_clazz, field)
# first part returns an array of all indexed fields - text and other types - plus ':class'
Sunspot::Setup.for(model_clazz).all_field_factories.map(&:name).include?(field.to_sym)
end
And if anyone needs it, a check for sortability:
def field_is_sortable?(classes_to_check, field)
  if field.present?
    classes_to_check.each do |table_clazz|
      return false if !Sunspot::Setup.for(table_clazz).field_factories.map(&:name).include?(field.to_sym)
    end
    return true
  end
  false
end