jsonb with psycopg2 RealDictCursor - postgresql

I have a PostgreSQL 9.4 database (aka the MongoDB killer ;-) ) and this simple schema:
CREATE TABLE test (id SERIAL, name text, misc jsonb);
Now I populate it. If I make a SELECT it will show something like:
id | name  | misc
1  | user1 | { "age" : 23, "size" : "M" }
2  | user2 | { "age" : 30, "size" : "XL" }
Now, if I make a request with psycopg2,
cur.execute("SELECT * FROM test;")
rows = list(cur)
I'll end up with
[ { 'id' : 1, 'name' : 'user1', 'misc' : '{ "age" : 23, "size" : "M" }' },
  { 'id' : 2, 'name' : 'user2', 'misc' : '{ "age" : 30, "size" : "XL" }' } ]
What's wrong, you ask? Well, misc is of type str. I would expect it to be recognized as JSON and converted to a Python dict.
The psycopg2 docs (psycopg2/extras page) state that "Reading from the database, json values will be automatically converted to Python objects."
With RealDictCursor it seems that this is not the case.
That means I cannot access rows[0]['misc']['age'], as would be convenient...
OK, I could do it manually with
for r in rows:
    r['misc'] = json.loads(r['misc'])
but I'd like to avoid that if there's a nicer solution...
PS: someone with 1500+ rep could create the postgresql9.4 tag ;-)

The current psycopg2 version (2.5.3) doesn't know the OID for the jsonb type. To support it, it's enough to call:
import psycopg2.extras
psycopg2.extras.register_json(oid=3802, array_oid=3807, globally=True)
once in your project.
You can find further information in this ML message.

This works out of the box with psycopg2 2.7.1 (no need for json.loads() -- dictionaries are what come out of queries):
sudo apt-get install libpq-dev
sudo pip install psycopg2 --upgrade
python -c "import psycopg2 ; print psycopg2.__version__"
2.7.1 (dt dec pq3 ext lo64)
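If you're stuck on a version where neither fix applies, the manual conversion from the question works fine. A self-contained sketch (the rows are hardcoded to stand in for RealDictCursor output, so no database is needed to run it):

```python
import json

# Simulated RealDictCursor rows: on psycopg2 2.5.x without register_json,
# the jsonb column 'misc' arrives as a plain str.
rows = [
    {'id': 1, 'name': 'user1', 'misc': '{"age": 23, "size": "M"}'},
    {'id': 2, 'name': 'user2', 'misc': '{"age": 30, "size": "XL"}'},
]

# The manual conversion from the question: parse each str into a dict.
for r in rows:
    r['misc'] = json.loads(r['misc'])

print(rows[0]['misc']['age'])  # nested access now works: 23
```

Once register_json (or a newer psycopg2) is in place, the loop becomes unnecessary.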

How to read table data row by row of postgres db using shell script

How can I read table data row by row from a Postgres DB using a shell script?
I tried this:
psql -d db_name -t -c "SELECT * FROM table_name" | while read -a Record ; do
    echo "${Record[0]}"
    echo "${Record[1]}"
done
but this approach is giving me data like:
Apple
|
Why is this | appearing when I'm only fetching row data?
Actually, I want to create JSON objects out of the DB table data, in the format:
column-name : value,
column-name : value
..... ;
Something like that
Table name -> student
Fields:
id : string
name : string
age : int
inSchool : boolean
Table data:
ID  Name   Age  inSchool
1   Amit   18   Yes
2   Sunil  21   No
3   Anil   17   Yes
The output i want :
[
{
id : 1,
name : Amit,
age : 18,
inSchool : 1;
},
{
id : 2,
name : Sunil,
age : 21,
inSchool : 0;
},
{
id : 3,
name : Anil,
age : 17,
inSchool : 1;
}
]
If there's any good way, please help me.
Let Postgres do the aggregation and spool the output into a file after turning off header formatting:
postgres=> \t
Tuples only is on.
postgres=> \o data.json
postgres=> select jsonb_agg(to_jsonb(s)) from student s;
Or in a single line:
psql -d db_name -t -c "select jsonb_agg(to_jsonb(s)) from student s" -o data.json
After that, the file data.json will contain the entire table as a huge JSON array.
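If you'd rather build the JSON client-side, the same result is a few lines of Python. A minimal sketch, with sample rows hardcoded in place of a real cursor's fetchall() output (column names and data taken from the question):

```python
import json

# Sample rows standing in for a real cursor's fetchall() output.
columns = ["id", "name", "age", "inSchool"]
rows = [
    (1, "Amit", 18, True),
    (2, "Sunil", 21, False),
    (3, "Anil", 17, True),
]

# Pair each value with its column name, then serialize the whole list.
records = [dict(zip(columns, row)) for row in rows]
output = json.dumps(records, indent=2)
print(output)
```

Unlike the psql one-liner, this gives you control over key names and value types (e.g. real booleans for inSchool).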

Change Json text to json array

Currently the data in my table looks like this:
Field name: author
Field data: in JSON form
When we run the select query
SELECT bs.author FROM books bs;
it returns data like this:
"[{\"author_id\": 1, \"author_name\": \"It happend once again\", \"author_slug\": \"fiction-books\"}]"
But I need the selected data to look like this:
[
{
"author_id": 1,
"author_name": "It happend once again",
"author_slug": "fiction-books"
}
]
Database: PostgreSQL
Note: please avoid PHP code or iteration done in PHP.
The answer depends on the version of PostgreSQL you are using and ALSO what client you are using, but PostgreSQL has lots of built-in JSON processing functions:
https://www.postgresql.org/docs/10/functions-json.html
Your goal is also not clearly defined... If all you want to do is pretty-print the JSON, this is included:
# select jsonb_pretty('[{"author_id": 1,"author_name":"It happend once again","author_slug":"fiction-books"}]') as json;
json
-------------------------------------------------
[ +
{ +
"author_id": 1, +
"author_name": "It happend once again",+
"author_slug": "fiction-books" +
} +
]
If instead you're looking for how to populate a postgres record set from json, this is also included:
# select * from json_to_recordset('[{"author_id": 1,"author_name":"It happend once again","author_slug":"fiction-books"}]')
as x(author_id text, author_name text, author_slug text);
author_id | author_name | author_slug
-----------+-----------------------+---------------
1 | It happend once again | fiction-books
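The escaped value in the question is simply JSON stored as text. Client-side, one parse turns it into the desired array; a sketch in Python, with the string hardcoded from the question:

```python
import json

# The author column's value as returned: JSON stored as plain text.
value = '[{"author_id": 1, "author_name": "It happend once again", "author_slug": "fiction-books"}]'

# One json.loads() call yields the list of author objects.
authors = json.loads(value)
print(authors[0]['author_slug'])  # fiction-books
```

Server-side, the equivalent step is casting the text column to jsonb, after which the functions above apply directly.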

MongoDB Loop Query

db.fs.files.distinct( "metadata.user" )
[
"5027",
"6048",
"6049",
]
The below X represents where I would like the numbers from the above query to appear.
db.fs.files.find({ 'metadata.user' : X, 'metadata.folder' : 'inbox' }).count()
I'm trying to find a way to iterate through each of the users in the first query and count the total number of results in the second query. Is there an easy way to craft this query in the MongoDB Shell?
The output I would be looking for would be (just looking for pure numbers):
User Inbox
5027 9872
6048 12
6049 125
Update:
I was able to accomplish something pretty close to what I was looking for:
# for x in $(psql -d ****** postgres -c "select user_name from users where user_name ~ '^[0-9]+$' order by user_name;" | tail -n +3 | head -n -2); do mongo vmdb --quiet --eval "db.fs.files.find({ 'metadata.user' : '$x'}).count()"; done| sort -nr
1381
1073
982
However, I'm missing out on the username part. The point is to generate a list of users with the number of messages in their mailboxes.
Please try this:
db.fs.files.distinct("metadata.user").forEach(function (user) {
    var count = db.fs.files.find({ 'metadata.user' : user, 'metadata.folder' : 'inbox' }).count();
    print(user + "\t" + count);
});
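The underlying operation is just a group-by count. To illustrate the logic independently of MongoDB, here is the same grouping in plain Python over sample documents (hypothetical data, field names from the question):

```python
from collections import Counter

# Sample documents standing in for db.fs.files.
docs = [
    {'metadata': {'user': '5027', 'folder': 'inbox'}},
    {'metadata': {'user': '5027', 'folder': 'inbox'}},
    {'metadata': {'user': '6048', 'folder': 'sent'}},
    {'metadata': {'user': '6049', 'folder': 'inbox'}},
]

# Count inbox documents per user.
inbox_counts = Counter(
    d['metadata']['user']
    for d in docs
    if d['metadata']['folder'] == 'inbox'
)
for user, count in sorted(inbox_counts.items()):
    print(user, count)
```

In MongoDB itself, an aggregation with $match and $group would do this server-side in one round trip instead of one query per user.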

How can I export a PostgreSQL table to HTML?

How can I save a PostgreSQL table to HTML?
I'll take a stab at assuming what you mean. In psql, \H toggles HTML output format and \o spools results to a file:
dbname=# \H
dbname=# \o tablename.html
dbname=# SELECT * FROM tablename;
Here is an example of an XML "forest":
SELECT xmlforest (
"FirstName" as "FName", "LastName" as "LName",
'string' as "str", "Title", "Region" )
FROM "Demo"."demo"."Employees";
With some data in the employees table, this might result in:
<FName>Nancy</FName>
<LName>Davolio</LName>
<str>string</str>
<Title>Sales Representative</Title>
<Region>WA</Region>
...
<FName>Anne</FName>
<LName>Dodsworth</LName>
<str>string</str>
<Title>Sales Representative</Title>
http://wiki.postgresql.org/wiki/XML_Support
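If the psql route doesn't fit, generating the HTML table yourself is straightforward. A minimal Python sketch, with sample rows hardcoded in place of a query result (names borrowed from the xmlforest example above):

```python
from html import escape

# Sample rows standing in for a query result.
columns = ["FirstName", "LastName", "Title"]
rows = [
    ("Nancy", "Davolio", "Sales Representative"),
    ("Anne", "Dodsworth", "Sales Representative"),
]

# Escape every cell so data containing <, >, & stays valid HTML.
header = "".join("<th>%s</th>" % escape(c) for c in columns)
body = "".join(
    "<tr>" + "".join("<td>%s</td>" % escape(str(v)) for v in row) + "</tr>"
    for row in rows
)
table = "<table>\n<tr>%s</tr>\n%s\n</table>" % (header, body)
print(table)
```

The escaping step is the part psql's \H handles for you; don't skip it if cell values can contain markup characters.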

Does SELECT DISTINCT work with Perl's DBD::CSV?

I found a SELECT-example on the web.
When I try it in my script I get this error-message:
Specifying DISTINCT when using aggregate functions isn't reasonable - ignored. at /usr/lib/perl5/site_perl/5.10.0/SQL/Parser.pm line 496.
#!/usr/bin/perl
use warnings;
use strict;
use DBI;

my $dbh = DBI->connect( "DBI:CSV:", undef, undef, { RaiseError => 1, AutoCommit => 1 } );
my $table = 'artikel';
my $array_ref = [
    [ 'a_nr', 'a_name', 'a_preis' ],
    [ 12, 'Oberhemd',  39.80 ],
    [ 22, 'Mantel',   360.00 ],
    [ 11, 'Oberhemd',  44.20 ],
    [ 13, 'Hose',     119.50 ],
];
$dbh->do( "CREATE TEMP TABLE $table AS IMPORT(?)", {}, $array_ref );
my $sth = $dbh->prepare( "SELECT DISTINCT a_name FROM $table" );
$sth->execute();
$sth->dump_results();
$dbh->disconnect();
Does SELECT DISTINCT not work with DBD::CSV or is something wrong with my script?
edit:
The output is
'Oberhemd'
'Mantel'
'Oberhemd'
'Hose'
4 rows
I thought it should be
'Oberhemd'
'Mantel'
'Hose'
3 rows
Installed versions:
Perl : 5.010000 (x86_64-linux-thread-multi)
OS : linux (2.6.31)
DBI : 1.609
DBD::Sponge : 12.010002
DBD::SQLite : 1.25
DBD::Proxy : 0.2004
DBD::Gofer : 0.011565
DBD::File : 0.37
DBD::ExampleP : 12.010007
DBD::DBM : 0.03
DBD::CSV : 0.26
Hi, this is an easily reproducible bug. SELECT data_display_mask FROM test.csv returns 200-plus rows. SELECT DISTINCT data_display_mask FROM test.csv returns the warning message and the same 200 rows.
If I run the values through awk and sort -u to get the unique ones, I get 36 values, which is what I would expect.
Certainly a bug in the code.
-Kanwar
perl -V
Summary of my perl5 (revision 5 version 10 subversion 0) configuration:
Platform:
osname=linux, osvers=2.2.24-6.2.3, archname=i686-linux-thread-multi
DBD::CSV 0.26
SQL::Parser 1.23
DBI 1.609
example:
Specifying DISTINCT when using aggregate functions isn't reasonable - ignored. at /opt/perl2exe/perl5/lib/site_perl/5.10.0/SQL/Parser.pm line 496.
87060
87060
87060
87060
SQL used is SELECT DISTINCT entry_id FROM test.csv
Note that the message about something being "not reasonable" is only a warning; your script works nevertheless.
It is confusing and nonsensical: you don't use any aggregate functions.
I smell a bug in either DBD::CSV or SQL::Statement.
Edit: DISTINCT is explicitly allowed in SQL::Statement
my $sth = $dbh->prepare("SELECT DISTINCT $attributeName1, COUNT( $attributeName2 ) FROM tableName GROUP BY $attributeName1, $attributeName2");
This gave me attributeName1 and a distinct count of attributeName2.
This is an example of a more general phenomenon with DBD::CSV, namely that it accepts a lot of SQL syntax whose meaning is silently ignored.
I have seen cases of SELECT DISTINCT that actually filter out duplicates, so the case you mention here seems to be a bug,
but I haven't found a way to make the DISTINCT in SELECT COUNT(DISTINCT foo) FROM bar do anything.
Works for me. I get 3 rows back.
$ perl x.pl
'Oberhemd'
'Mantel'
'Hose'
3 rows
perl -MDBI -le 'DBI->installed_versions;'
Perl : 5.010001 (i686-linux-gnu-thread-multi)
OS : linux (2.6.24-28-server)
DBI : 1.617
DBD::mysql : 4.020
DBD::Sys : 0.102
DBD::Sponge : 12.010002
DBD::SQLite : 1.33
DBD::Proxy : 0.2004
DBD::Pg : 2.17.2
DBD::Oracle : 1.38
DBD::ODBC : 1.33
DBD::Multiplex : 2.014122
DBD::Gofer : 0.015057
DBD::File : 0.40
DBD::ExampleP : 12.014310
DBD::DBM : 0.06
DBD::CSV : 0.30
Added:
perl -MSQL::Statement -le 'print $SQL::Statement::VERSION'
1.31
Version 1.23, release November 20th, 2009
* Correct handling of DISTINCT in aggregate functions
I met the same problem.
You can work around it by using a GROUP BY clause instead of DISTINCT.
This is just a workaround while waiting for the bug to be fixed...
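The GROUP BY workaround rests on the fact that grouping by a single column yields exactly one row per distinct value. A quick sanity check of that equivalence, using stdlib SQLite in place of the CSV backend, with the data from the Perl script above:

```python
import sqlite3

# SQLite stands in for the CSV backend; data from the earlier Perl script.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE artikel (a_nr INT, a_name TEXT, a_preis REAL)")
con.executemany("INSERT INTO artikel VALUES (?, ?, ?)", [
    (12, 'Oberhemd', 39.80),
    (22, 'Mantel', 360.00),
    (11, 'Oberhemd', 44.20),
    (13, 'Hose', 119.50),
])

# DISTINCT and single-column GROUP BY return the same set of values.
distinct = [r[0] for r in con.execute("SELECT DISTINCT a_name FROM artikel")]
grouped = [r[0] for r in con.execute("SELECT a_name FROM artikel GROUP BY a_name")]
print(sorted(distinct))
print(sorted(grouped))
```

On an engine without the bug, both queries return the same three names, which is why swapping one for the other is a safe stopgap.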