Sum a field in a jsonb column with an Ecto query - postgresql

Say I have a jsonb column in a Postgres DB, called info. One of its fields is bytes, which is stored as an integer inside info.
If I try to sum the values of the info->>'bytes' field in an Ecto query, as below:
total_bytes = Repo.one(from job in FilesTable,
  select: sum(fragment("info->>'bytes'")))
I get the error: function sum(text) does not exist.
Is there a way to write the query above so that info->>'bytes' can be summed, or would I have to select that field from each row in the database and then add up the values in Elixir?

The error message says that it can't sum a text field. You need to explicitly cast the field to an integer so that sum works.
Also, it's not a good idea to hardcode a column name in a fragment. It only works here because you're selecting from a single table; if the query joined other tables with a column of the same name, the reference would be ambiguous and the query wouldn't work. Use ? in the fragment string and pass the column as an argument instead.
Here's the final version that should work:
select: sum(fragment("(?->>'bytes')::integer", job.info))
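For reference, this is roughly the SQL that Ecto should emit for that select (the table name files_table is an assumption here, derived from the schema name):

```sql
-- Cast the text value extracted by ->> to integer before summing.
-- sum() skips NULLs, so rows whose info has no 'bytes' key are ignored.
SELECT sum((info->>'bytes')::integer) AS total_bytes
FROM files_table;
```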

Related

Postgres get all fields that are not certain values (including nulls)

I'm looking to filter a table, but the number of rows I'm expecting differs from the result.
SELECT *
FROM table
WHERE name NOT IN ('matrix', 'filters')
That name column contains string values and nulls. It seems like the nulls are being filtered out, but I would like them included in the result.
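NOT IN follows SQL's three-valued logic: comparing NULL with any value yields unknown rather than true, so NULL rows never pass the filter. One way to include them is to test for NULL explicitly (a sketch against the query above):

```sql
-- Rows where name is NULL fail NOT IN, so admit them separately.
SELECT *
FROM table
WHERE name NOT IN ('matrix', 'filters')
   OR name IS NULL;
```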

Postgresql parse string values to integers

Is there a way in Postgres I can parse string values to integers? Basically I'm trying to query one table (let's call it table_one) using values from another table (table_two) in a character varying column.
Say SELECT char_column FROM table_two results in "2,4,6,8"; I'd like to use this result in a second query as:
SELECT column FROM table_one WHERE some_id IN (2,4,6,8)
How can I get the string "2,4,6,8" to values 2,4,6,8 so as to be able to use it in the second query?
I've tried casting and to_number functions to no success.
SELECT column
FROM table
WHERE other_column = ANY(string_to_array('2,4,6,8', ',')::INT[])
Please try this:
SELECT column FROM table WHERE other_column IN (
  SELECT NULLIF(i, '')::int
  FROM regexp_split_to_table('2,4,6,8', ',') t(i)
)
Explanation:
The regexp_split_to_table('2,4,6,8', ',') call splits the string into a set of rows, and NULLIF turns any empty string into NULL so the cast to integer doesn't fail.
Hopefully this helps.
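Putting either answer to work with the two tables from the question might look like this (a sketch; the exact join condition between table_one and table_two is an assumption):

```sql
-- Parse the comma-separated string from table_two into an int array,
-- then match table_one rows whose some_id appears in it.
SELECT t1.column
FROM table_one t1
JOIN table_two t2
  ON t1.some_id = ANY (string_to_array(t2.char_column, ',')::int[]);
```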

Only getting 1 result from postgres tsvector

I am using PostgreSQL 9.3. I have built a dataset with a tsvector field called vector.
Then I execute a query against it
SELECT id, vector, relative_path, title
FROM site_server.indexed_url, plainto_tsquery('english', 'booking') query
WHERE vector @@ query;
Only 1 row is returned. When I look at the data there are at least 6 rows that would match. How do I get it to retrieve all matching records?
Values in the vector column of your data sample are not normalized, and COPY does not normalize them, as per the docs:
It is important to understand that the tsvector type itself does not
perform any word normalization; it assumes the words it is given are
normalized appropriately for the application
If you run:
SELECT id, vector, relative_path, title
FROM site_server.indexed_url
WHERE to_tsvector(vector::text) @@ plainto_tsquery('english', 'booking');
it should produce the expected result.
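The longer-term fix is to normalize when the column is populated rather than at query time (a sketch; that vector is derived from title is an assumption, as the real source text isn't shown):

```sql
-- Rebuild the tsvector with a configuration so words are
-- stemmed and lowercased before they are stored.
UPDATE site_server.indexed_url
SET vector = to_tsvector('english', title);
```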

How to turn a column of ints into one array in postgres

I currently have a table with one column and 400 rows; each row has an integer. How can I create an int array with all of these integers that preserves order?
I am using PostgreSQL 9.2.
select array_agg(int_column order by some_column) as int_array_column
from the_table;
Here some_column is the column that defines the "order" of the integer values. Rows in a relational database do not have "an order", so your request "that preserves order" only makes sense if you have a column that defines the sort order you are trying to preserve.
SELECT array_agg(column_name ORDER by sort_column_name) AS ints
FROM table
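A self-contained illustration with an inline table (the values and names are made up):

```sql
-- array_agg with ORDER BY collects the values in the stated order.
SELECT array_agg(val ORDER BY id) AS int_array_column
FROM (VALUES (1, 10), (2, 20), (3, 30)) AS t(id, val);
-- → {10,20,30}
```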

T-SQL LEFT JOIN on bigint id return only ids lower than 101 on right table

I have two tables on a Sql Server 2008.
ownership with 3 fields and case with another 3 fields. I need to join them on the ID field (bigint).
For testing purposes I'm only using one field from each table. This field is bigint and has values from 1 to 170 (for now).
My query is:
SELECT DISTINCT
ownership.fCase,
case.id
FROM
ownership LEFT JOIN case ON (case.id=ownership.fCase)
WHERE
ownership.dUser='demo'
This was expected to return 4 rows with the same values in both columns. The problem is that the row for fCase = 140 comes back with null on the right side, and this is the only value above 100.
If I run the query without the WHERE clause it shows all rows from the left table, but the values on the right only appear if they are below 101; otherwise it shows null.
Can someone help me, am I doing something wrong or is this a limitation or a bug?
CASE is also a reserved keyword, so the parser may be getting confused. Try wrapping your table and column names in brackets, e.g. [case].[id] = [ownership].[fCase]. Also, are you certain that [case].[id] and [ownership].[fCase] are both bigint? If your current values are only 1 to 170, why bigint (range up to 9,223,372,036,854,775,807)? And does that column accept NULLs?
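The suggestion above applied to the original query might look like this (a sketch; the aliases are added for readability and are not in the original):

```sql
-- Bracket [case] so the reserved keyword is treated as a table name.
SELECT DISTINCT
    o.fCase,
    c.id
FROM [ownership] AS o
LEFT JOIN [case] AS c
    ON c.id = o.fCase
WHERE o.dUser = 'demo';
```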