I'm experimenting with Ecto's embeds_many, and it works great until I need to query on some data in the embedded field.
So I have something like a product that embeds_many categories
schema "products" do
field :code, :string, null: false
embeds_many :categories, Category,
on_replace: :delete,
primary_key: {:id, :binary_id, autogenerate: false}
do
field :name, :string
end
end
def create_changeset(%Product{} = product, attrs) do
product
|> cast(attrs, [:code])
|> cast_embed(:categories, with: &attributes_changeset/2)
end
def attributes_changeset(%{} = product_attribute, attrs) do
product_attribute
|> cast(attrs, [:id, :name])
end
After creating products, I end up with something like this in the Postgres table:

id | code | categories
---+------+------------------------------------------------------------------------------------
 1 |  11  | {"{\"id\": \"dress\", \"name\": \"Dress\"}","{\"id\": \"shirt\", \"name\": \"Shirt\"}"}
 2 |  22  | {"{\"id\": \"dress\", \"name\": \"Dress\"}"}
So now I want to query all products where id == "dress", and of course I would like to get the 2 results above.
I have experimented with something like this:
q = from p in Product, where: fragment("? #> ?", p.categories, '{"id": "dress"}') but transforms the array in integers: operator does not exist: jsonb[] #> integer[] ... WHERE (p0."categories" #> ARRAY[123,34,105,100,34,58,32,34,100,114,101,115,115,34,125])
or this:
q = from p in Product, where: fragment("? #> ?", p.categories, "[{\"id\": \"dress\"}]")
which fails with: malformed array literal: "[{"id": "dress"}]"
What I hoped for was something like:
q = from p in Product, where: fragment("? -> 'id' = ?", p.categories, "rochie")
but I'm not at all sure that will work.
Since categories is a jsonb[] here, not plain jsonb, the #> operator won't work on it directly. You can use ANY with the containment operator <@ to do what you want:
where: fragment("? <@ ANY(?)", ~s|{"id": "dress"}|, p.categories)
This runs '{"id": "dress"}' <@ category for each category in the categories array and returns true if any of them match.
(~s|"a"| is just a cleaner way of writing "\"a\"".)
I have the following query:
SELECT
u.*,
array_agg(
json_build_object(
'id', usn.id,
'schemaName', usn.schema_name
)
) AS schemas
FROM dev.users u
FULL OUTER JOIN dev.user_schema_names usn ON u.id = usn.user_id
WHERE u.email = $1
GROUP BY u.id
But for some odd reason, this returns the following:
{
id: 1,
email: 'test@test.com',
schemas: [ { id: null, schemaName: null } ]
}
The dev.user_schema_names table has no records in it. I am expecting schemas to be an empty array []. However, if I insert records into dev.user_schema_names, then it works just fine.
What am I doing wrong? Should I be using something else instead of json_build_object?
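For what it's worth, the same behavior can be reproduced with SQLite's JSON aggregates from Python (a sketch, not the original Postgres setup, and using a LEFT JOIN instead of the FULL OUTER JOIN): an outer join against an empty table still emits one all-NULL right-hand row, which the aggregate dutifully wraps in an object. The usual fix, which also works in Postgres with json_agg, is a FILTER clause on the aggregate plus COALESCE for the empty case:

```python
import sqlite3
import json

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE user_schema_names (id INTEGER, user_id INTEGER, schema_name TEXT);
    INSERT INTO users VALUES (1, 'test@test.com');
""")

# An outer join against the empty table still yields one all-NULL
# right-hand row per user, so the aggregate builds one object of nulls:
with_nulls = con.execute("""
    SELECT json_group_array(json_object('id', usn.id, 'schemaName', usn.schema_name))
    FROM users u
    LEFT JOIN user_schema_names usn ON u.id = usn.user_id
    GROUP BY u.id
""").fetchone()[0]
print(json.loads(with_nulls))  # [{'id': None, 'schemaName': None}]

# Filtering the aggregate's input skips the NULL row; COALESCE covers the
# now-empty aggregate so the result is an empty array:
empty = con.execute("""
    SELECT coalesce(json_group_array(json_object('id', usn.id, 'schemaName', usn.schema_name))
                    FILTER (WHERE usn.id IS NOT NULL), '[]')
    FROM users u
    LEFT JOIN user_schema_names usn ON u.id = usn.user_id
    GROUP BY u.id
""").fetchone()[0]
print(json.loads(empty))  # []
```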
I am using Whoosh to index and search through my documents. I developed a multi-field search, but I want to specify some "MUST" fields.
What I want: when I search for a book with a query q1, it should search on title and summary, but I also want to specify filters like author = 'name of author' and category = 'books category'.
The results must take the two 'MUST' fields into account and search on the two others.
Thank you for your help
Searcher's search() has a filter parameter:
https://whoosh.readthedocs.io/en/latest/searching.html#filtering-results
For filtering you can use a query like
'author:"Name of Author" AND category:Genre'
To build filtering queries automatically, let's assume we have sets authors and categories:
from whoosh.query import And, Or, Term
from whoosh.qparser import MultifieldParser

if categories:
    # Filter the search by selected categories:
    if len(categories) > 1:
        # Must be: Category1 OR Category2 OR Category3...
        cat_q = Or([Term('category', x) for x in categories])
    else:
        # If there's just 1 category in the set:
        cat_q = Term('category', next(iter(categories)))
    print('Query to filter categories:', cat_q)

if authors:
    # Filter the search by authors:
    if len(authors) > 1:
        # Must be: Author1 OR Author2 OR Author3...
        aut_q = Or([Term('author', x) for x in authors])
    else:
        # If there's just 1 author in the set:
        aut_q = Term('author', next(iter(authors)))
    print('Query to filter authors:', aut_q)

    # Now combine the two filters:
    if categories:
        # Both fields are used for filtering:
        final_filter = And([cat_q, aut_q])
    else:
        # Only authors:
        final_filter = aut_q
elif categories:
    # Only categories:
    final_filter = cat_q
else:
    # None:
    final_filter = None

print('final_filter:', final_filter)

# Now parse the user query for 2 fields:
parser = MultifieldParser(["title", "summary"], ix.schema)
query = parser.parse(q1)

if final_filter is None:
    results = s.search(query)
else:
    # Using the filter:
    results = s.search(query, filter=final_filter)
You can use Whoosh's MultifieldParser for this scenario:
from whoosh.qparser import MultifieldParser, OrGroup

fields = ["title", "summary", "author", "category"]
query = MultifieldParser(fields, schema=idx.schema, group=OrGroup).parse(q1)

with idx.searcher() as searcher:
    results = searcher.search(query, limit=limit)
The above uses OrGroup, which searches all fields with the OR operator. You can customize this as needed; the Whoosh documentation covers the other operators such as AND and NOT.
I have a table foo:
id | items
---+-----------------------------------
 1 | {"item_1": {"status": "status_1"}}
 2 | {"item_2": {"status": "status_2"}}
...
I need to update all rows in the items column, which is jsonb, and add the new pair ("new_value": "new_value") after {"status": "status"}. After the update the result must look like this:
id | items
---+-------------------------------------------------------------
 1 | {"item_1": {"status": "status_1", "new_value": "new_value"}}
 2 | {"item_2": {"status": "status_2", "new_value": "new_value"}}
...
I've tried to do this:
WITH result AS (
INSERT INTO foo (id, items)
SELECT id, options || newvalue as res
FROM foo AS bar,
jsonb_each(bar.items::jsonb) AS item,
to_jsonb(item.value) AS options,
jsonb_build_object('new_value', 'new_value') as newvalue
WHERE id IN ('1', '2'...)
ON CONFLICT (id)
DO UPDATE
SET items = foo.items || Excluded.items::jsonb RETURNING *)
SELECT item.key AS itemkey
FROM result AS res,
jsonb_each(res.items) AS item,
to_jsonb(item.value) AS options;
but when I run this script, Postgres shows this error message:
ON CONFLICT DO UPDATE command cannot affect row a second time
I don't understand what I am doing wrong.
UPDATE #1
Postgres version: 9.6
Table foo: id = TEXT UNIQUE NOT NULL
As for why I used INSERT and not just UPDATE: that was my first mistake.
After some reading on Postgres functions, I finally figured it out:
UPDATE foo as t
SET items = result.new_value
FROM (SELECT st.id, new_value
FROM foo AS st
CROSS JOIN LATERAL (
SELECT jsonb_object_agg(the_key, the_value || '{"new_value": "some_new_value"}'::jsonb) AS new_value
FROM jsonb_each(st.items) AS x(the_key, the_value)
LIMIT 1) AS n) AS result
WHERE result.id = t.id;
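What the lateral subquery does per row can be sketched in plain Python: jsonb_each corresponds to iterating the object's key/value pairs, || to a dict merge, and jsonb_object_agg to rebuilding the object (hypothetical data shaped like the foo table above):

```python
def add_new_value(items: dict, new_value: str) -> dict:
    # jsonb_each -> items.items(); value || '{"new_value": ...}' -> dict merge;
    # jsonb_object_agg -> rebuilding the single-object column value
    return {key: {**value, "new_value": new_value}
            for key, value in items.items()}

row = {"item_1": {"status": "status_1"}}
print(add_new_value(row, "some_new_value"))
# {'item_1': {'status': 'status_1', 'new_value': 'some_new_value'}}
```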
I'm trying to select multiple matches from a field. I have the query working as SQL but am at a loss when it comes to the jOOQ version. Here's the SQL:
SELECT
array_to_string(array(select array_to_string(regexp_matches(m.content, '#([a-zA-Z0-9]+)', 'g'), ' ')), ' ') hashtags
FROM tweets
But I can't work out how to pass in the 'g' flag to regexp_matches, and Postgres doesn't support g: style embedded flags. At the moment I'm using (in Scala):
val hashtags = DSL.field(
"array_to_string(array(select array_to_string(regexp_matches(tweets.content, '#([a-zA-Z0-9]+)', 'g'), ' ')), ' ')",
classOf[String])
but that seems kind of gross (but it works, so there's that! 😀).
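For reference, all that SQL expression does is collect every #hashtag capture and join them with spaces; a plain-Python sketch of the same transformation (re.findall is global by default, which is why the 'g' flag has no direct counterpart here):

```python
import re

def hashtags(content: str) -> str:
    # regexp_matches(content, '#([a-zA-Z0-9]+)', 'g') -> re.findall,
    # then the two array_to_string calls join the captures with spaces
    return " ".join(re.findall(r"#([a-zA-Z0-9]+)", content))

print(hashtags("loving #Scala and #jOOQ today"))  # Scala jOOQ
```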
Current Approach
I have an enum with fields associated with it, like this:
MentionUri(MENTIONS.URL),
Content(MENTIONS.CONTENT),
Hashtags(DSL.field(
"array(select array_to_string(regexp_matches(mentions.content, '#([a-zA-Z0-9]+)', 'g'), ' '))").as("hashtags")),
MediaLinks(DSL.field("json_agg(twitter_media)").as("media_links")),
Location(MENTIONS.LOCATION),
the 'g' flag is needed because I want to get all of the matching fragments from the target text (all of the hashtags from the mention content in this case, where mentions are Tweets, Facebook posts, &c.)
and a class with a bunch of optional search parameters, like this:
minFollowing: Option[Int] = None,
maxFollowing: Option[Int] = None,
workflowStates: Set[WorkflowState] = WorkflowState.values.toSet,
onlyVerifiedUsers: Boolean = false,
and then I build up the query based on a list of these fields:
private val fields = v.fields.toSet
override def query(sql: DSLContext): Query = {
val fields = v.fields.map(_.field) ++ Seq(M.PROJECT_ID)
var query = (sql
select fields.toSet.asJava
from M
leftJoin MTM on (M.PROJECT_ID === MTM.PROJECT_ID) and (M.ID === MTM.MENTION_ID)
leftJoin TM on (MTM.URL === TM.URL)
where (M.PROJECT_ID in v.criteria.projectIds.asJava))
if (v.criteria.channels.size < numChannels)
query = query and (M.CHANNEL in v.criteria.channels.asJava)
if (v.criteria.mentionTypes.size < numMentionTypes)
query = query and (M.MENTION_TYPE in v.criteria.mentionTypes.asJava)
// ... more critera get added here, finally ...
if (v.max.isDefined)
query groupBy (M.PROJECT_ID, M.ID, M.CHANNEL, M.USERNAME) limit v.max.get
else
query groupBy (M.PROJECT_ID, M.ID, M.CHANNEL, M.USERNAME)
Table users:
userid | username
-------+--------------------
     1 | venkatesh duggirala
     2 | deviprasad
     3 | dhanu
If the user sends username = "d" then I need to get all records. Using "contains" I am getting 2 and 3 as the result, but the 1st record also has "d" in "duggirala".
query:
var result = from p in cxt.users
where p.Users.username.Contains(name)
select new
{
p.Userid
};
Rewrite your query like this:
var result = from p in cxt.users
where p.Users.username.ToLower().Contains(name.ToLower())
select new
{
p.Userid
};
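The fix boils down to a case-insensitive substring match; in Python terms (with the sample rows from the table above):

```python
def contains_ci(haystack: str, needle: str) -> bool:
    # mirror of username.ToLower().Contains(name.ToLower())
    return needle.lower() in haystack.lower()

users = ["venkatesh duggirala", "deviprasad", "dhanu"]
# With an uppercase search term, a case-sensitive Contains would miss all
# three rows; lowercasing both sides matches every username with a "d":
print([u for u in users if contains_ci(u, "D")])
# ['venkatesh duggirala', 'deviprasad', 'dhanu']
```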