Is it possible to do something like this in Data Factory (data flow expression builder)? I am trying to create an array within an array, but with a filter so that it doesn't produce an array containing an empty item.
When I do the following, it works, but it produces an array that looks like fooBar: [{ }]; note that the array contains a single empty item.
collect(@(
    optionNumber=OptionNo,
    price=OptionPrice,
    salePricePeriods=array(@(
        salePrice=OptionSalePrice,
        priceActiveFrom=toString(OptionSaleFrom),
        priceActiveTo=toString(OptionSaleTo)))))
Ideally, I want to filter this data by using an expression:
collect(@(
    optionNumber=OptionNo,
    price=OptionPrice,
    salePricePeriods=
        filter(
            collect(@(
                salePrice=OptionSalePrice,
                priceActiveFrom=toString(OptionSaleFrom),
                priceActiveTo=toString(OptionSaleTo))),
            and(not(isNull(#item.salePrice)), and(not(isNull(#item.priceActiveFrom)), not(isNull(#item.priceActiveTo)))))))
When I do the above, I get an error stating:
Job failed due to reason: It is not allowed to use an aggregate function in the argument of another aggregate function. Please use the inner aggregate function in a sub-query.;;
Aggregate [958abb16-5236-430c-9af6-497495d60469#23243], [958abb16-5236-430c-9af6-497495d60469#23243, first(DataSet#22183, false) AS DataSet#23295, first(Realm#22184, false) AS Realm#23297, first(Territory#22185, false) AS Territory#23299, first(ItemNo#22186, false) AS ItemNo#23301, first(PriceGroupCode#22187, false) AS PriceGroupCode#23303, first(MinOptionPrice#22196, false) AS MinOptionPrice#23305, first(MaxOptionPrice#22197, false) AS MaxOptionPrice#23307, min(MinOptionSalePriceForPeriod#22198) AS MinOptionSalePrice#23309, max(MaxOptionSalePriceForPeriod#22199) AS MaxOptionSalePrice#23311, first(OldestDatePointMinPrice#22203, false) AS OldestDatePointMinPrice#23313, first(OldestDatePointMaxPrice#22204, false) AS OldestDatePointMaxPrice#23315, collect_list(named_struct(optionNumber, OptionNo#22189, price, OptionPrice#22191, salePrice
Figured it out: annoyingly, the filter should have been applied to the incoming fields, like so. The filter expression seems to be evaluated before the item is added to the array.
collect(@(
    optionNumber=OptionNo,
    price=OptionPrice,
    salePricePeriods=
        filter(
            array(@(
                salePrice=OptionSalePrice,
                priceActiveFrom=toString(OptionSaleFrom),
                priceActiveTo=toString(OptionSaleTo))),
            not(isNull(toString(OptionSaleFrom))))))
Related
I am listing users in a CustomScrollView/SliverList with ListTiles. I have a String field in my Firestore documents and only want to return a ListTile for a user whose String field contains specific words (more than 2). For example, a user's field contains: "Apples, Ice, Bananas, Soup, Peaches, ..." and I want to list all users who have apples and bananas inside that field. How can I achieve this?
The only way to do it at the moment (with the way you have it set up) is to actually pull the value and do a string "contains", or split the string into an array and check whether the value is within that array. Otherwise, I'd advise refactoring that field into an array so that you can perform a native arrayContainsAny query against it.
For you it will look like this (with your current implementation):
// ... after pulling all users' documents
// let's say your field is called 'foodField':
var criteria = 'Banana';
var fieldContent = doc.data()['foodField'];
// you can either do this:
if (fieldContent.toLowerCase().contains(criteria.toLowerCase())) {
  // ...
}
// or you can tokenize it, depending on your purposes
// (trim each token, since the field is comma-and-space separated):
var foodTokens = fieldContent.split(',').map((f) => f.trim().toLowerCase());
if (foodTokens.contains(criteria.toLowerCase())) {
  // ...
}
If your Firestore field were an array type, then you could have just done that directly while querying:
FirebaseFirestore.instance.collection('users').where('foodField', arrayContainsAny: ['Banana', 'Apples'])
Which would then give you only the users whose foodField contains any of those values.
As you can see from previous questions on querying where text contains a substring, Firestore does not currently support such text searches. The typical solutions are to either perform part of your filtering in your application code, as Roman answered, or to integrate a third-party full-text search solution.
In your specific case though, your string seems to be a list of words, so I'd recommend considering changing your data model to an array of the individual values:
"foodFields": ["Apples", "Ice", "Banana", "Soup", "Peaches"]
You can then use array field operators in the query.
While there is no array-contains-all operator, using array-contains you can at least filter on one value in the database, and with array-contains-any you can do an OR-like condition.
Another data model would be to store the individual values in a map field with value true for each of them:
"foodFields": {
"Apples": true,
"Ice": true,
"Banana": true,
"Soup": true,
"Peaches": true
}
With such a structure you can perform an AND-like query with:
collectionRef
    .where('foodFields.Apples', isEqualTo: true)
    .where('foodFields.Bananas', isEqualTo: true)
I have in my index a list of objects, each of which has an objectID value.
On some searches, I want to filter OUT a certain number of them using their objectID.
For the moment it works with one value as a string; I would like to know how to do it for multiple values.
filters = 'NOT objectID:' + objectIDToFilter;
This works for one object. What can I do to apply this to an array of objectIDs? Because:
filters = 'NOT objectID:' + arrayObjectID;
does not work.
I was thinking of generating a huge string with an arrayId.map producing all my 'NOT objectID:1 AND NOT objectID:2 ...', but I wanted to know if there is a cleaner way to do it.
I unfortunately misunderstood this line in the Algolia docs:
Array Attributes: Any attribute set up as an array will match the filter as soon as one of the values in the array match.
This apparently refers to the attribute value itself in Algolia and not to the filter.
So I did not find a solution in the Algolia docs and went for the long string; I hope there is no limit on how many filters we can add to a query (I found nothing about that).
Here is what I did, if someone needs it:
let filters = `NOT objectID:${userID}`;
blockedByUsers.forEach((blockedByUser) => {
  filters = filters + ` AND NOT objectID:${blockedByUser}`;
});
If you need to add multiple filters but don't have a starting point like I do, you can't start the query with an AND. A solution I found to bypass that:
let filters = `NOT objectID:${blockedByUsers[0]}`;
blockedByUsers.forEach((blockedByUser, i) => {
  if (i > 0) filters = filters + ` AND NOT objectID:${blockedByUser}`;
});
There is probably a cleaner solution, but this works. If you have found another solution to this problem, I'll be happy to see it :)
We can apply a simple query filter in Vapor with:
// User is my model object connecting corresponding MySQL database table
let aUser = try User.query().filter("user_email", "asd@example.com")
How can we chain multiple query filters with AND or OR conditions like we do in SQL queries?
Say, for example, we need to join filter("user_email", "asd@example.com") and filter("user_password", "123456") with an AND condition; how can we achieve that?
As far as I know, the .filter function takes a condition that returns a Bool for each element, so you can try to combine two conditions with the logical AND operator (&&). For a proper solution it would be good to know the structure and properties of your User object, but a suggestion would be:
let aUser = User.query().filter { condition1 && condition2 }
Filter would only pass an element of User.query() to aUser if BOTH conditions are true. Make sure that User.query() is an array and that you refer to the current object with $0 in your conditions. The filter function will also return an array.
Let's say I have a Postgres database (9.3) and there is a table called Resources. In the Resources table I have the fields id which is an int and data which is a JSON type.
Let's say I have the following records in said table.
1, {'firstname':'Dave', 'lastname':'Gallant'}
2, {'firstname':'John', 'lastname':'Doe'}
What I want to do is write a query that would return all the records in which the data column has a json element with the lastname equal to "Doe"
I tried to write something like this:
records = db_session.query(Resource).filter(Resources.data->>'lastname' == "Doe").all()
PyCharm, however, is giving me a compile error on the "->>".
Does anyone know how I would write the filter clause to do what I need?
Try using astext:
records = db_session.query(Resource).filter(
    Resource.data["lastname"].astext == "Doe"
).all()
Please note that the column MUST have a type of JSONB. The regular JSON column will not work.
Also, you could explicitly cast the string to JSON (see the Postgres JSON type doc):
from sqlalchemy.dialects.postgresql import JSON
from sqlalchemy.sql.expression import cast

db_session.query(Resource).filter(
    Resource.data["lastname"] == cast("Doe", JSON)
).all()
If you are using the JSON type (not JSONB), the following worked for me.
Note the '"object"': casting the JSON element to db.String yields its JSON-encoded text, so string values keep their surrounding double quotes:
query = db.session.query(ProductSchema).filter(
    cast(ProductSchema.ProductJSON["type"], db.String) != '"object"'
)
I have some GeoJSON in a JSON (not JSONB) type column and none of the existing solutions worked, but as it turns out, in version 1.3.11 some new data casters were added, so now you can:
records = db_session.query(Resource).filter(Resource.data["lastname"].as_string() == "Doe").all()
Reference: https://docs.sqlalchemy.org/en/14/core/type_basics.html#sqlalchemy.types.JSON
Casting JSON Elements to Other Types
Index operations, i.e. those invoked by calling upon the expression using the Python bracket operator as in some_column['some key'], return an expression object whose type defaults to JSON by default, so that further JSON-oriented instructions may be called upon the result type. However, it is likely more common that an index operation is expected to return a specific scalar element, such as a string or integer. In order to provide access to these elements in a backend-agnostic way, a series of data casters are provided:
Comparator.as_string() - return the element as a string
Comparator.as_boolean() - return the element as a boolean
Comparator.as_float() - return the element as a float
Comparator.as_integer() - return the element as an integer
These data casters are implemented by supporting dialects in order to assure that comparisons to the above types will work as expected, such as:
# integer comparison
data_table.c.data["some_integer_key"].as_integer() == 5
# boolean comparison
data_table.c.data["some_boolean"].as_boolean() == True
According to sqlalchemy.types.JSON, you can do it like this:
from sqlalchemy import JSON
from sqlalchemy import cast
records = db_session.query(Resource).filter(Resource.data["lastname"] == cast("Doe", JSON)).all()
According to this, pre-1.3.11 the most robust way should be as follows, as it works for multiple database types, e.g. SQLite, MySQL, and Postgres:
from sqlalchemy import cast, JSON, type_coerce, String

db_session.query(Resource).filter(
    cast(Resource.data["lastname"], String) == type_coerce("Doe", JSON)
).all()
From version 1.3.11 onward, type-specific casters are the new and neater way to handle this:
db_session.query(Resource).filter(
    Resource.data["lastname"].as_string() == "Doe"
).all()
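For completeness, here is a minimal, self-contained sketch of the as_string() approach. This is only a sketch under stated assumptions: the Resource model definition, the in-memory SQLite engine, and the sample rows are illustrative additions (not from the original question), and it assumes SQLAlchemy 1.4+ with a dialect that supports the generic JSON type.
# Hedged sketch: model, engine URL, and sample rows are assumptions for illustration only.
from sqlalchemy import Column, Integer, JSON, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Resource(Base):
    __tablename__ = "resources"
    id = Column(Integer, primary_key=True)
    data = Column(JSON)  # generic JSON type; supported on SQLite, MySQL, and Postgres

engine = create_engine("sqlite://")  # in-memory database, just for the demo
Base.metadata.create_all(engine)

with Session(engine) as db_session:
    db_session.add_all([
        Resource(data={"firstname": "Dave", "lastname": "Gallant"}),
        Resource(data={"firstname": "John", "lastname": "Doe"}),
    ])
    db_session.commit()

    # Extract the JSON element as a string and compare it (SQLAlchemy 1.3.11+ data caster).
    records = db_session.query(Resource).filter(
        Resource.data["lastname"].as_string() == "Doe"
    ).all()
    print([r.data["firstname"] for r in records])  # ['John']
Against an actual Postgres database, only the create_engine URL should need to change; the filter itself stays the same.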
I am trying to store the values of a select list in an array variable.
a = b.options.each {|option| puts option.attribute_value "value" }
Output:
IN PROGRESS
UPCOMING
FINAL
POSTPONED
CANCELLED
a.to_i
Is it possible to store all the values retrieved from the attribute in an array?
The element collections in Watir include the Enumerable module, which gives a lot of useful methods for iterating. In particular, it includes a map method that will perform a block on each element and collect the result in an array.
To store the value of all options, you can simply do:
total_list_values = @browser.options.map(&:value)
#=> ["IN PROGRESS", "UPCOMING", "FINAL", "POSTPONED", "CANCELLED"]
I coded it like this and it worked; posting it in case anyone wants it:
total_list_values = Array.new
body = @browser.options
body.each do |option|
  total_list_values << option.value
end