How to get the PostgreSQL version using psycopg2? - postgresql

I have the following statements:
from alembic import op
conn = op.get_bind()
Now I want to get the PostgreSQL version.

According to the documentation, it is the server_version property of the connection:
conn = psycopg2.connect(settings.DB_DSN)
>>> conn.server_version
90504
The number is formed by converting the major, minor, and revision numbers into two-decimal-digit numbers and appending them together. For example, version 8.1.5 will be returned as 80105. (Since PostgreSQL 10, the format is major * 10000 + minor, so version 10.1 is returned as 100001.)
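The integer can be unpacked back into its parts. Here is a minimal sketch (the helper name is my own; the two encodings are the documented ones). From the alembic bind in the question, the underlying DB-API connection that carries server_version is usually reachable via conn.connection, though the exact attribute path depends on the SQLAlchemy version:

```python
def parse_pg_version(num: int) -> tuple:
    """Unpack psycopg2's conn.server_version integer.

    Before PostgreSQL 10 the value is major*10000 + minor*100 + revision
    (e.g. 90504 -> 9.5.4); from 10 on it is major*10000 + minor
    (e.g. 100001 -> 10.1).
    """
    if num >= 100000:
        return (num // 10000, num % 10000)
    return (num // 10000, (num // 100) % 100, num % 100)

print(parse_pg_version(90504))   # (9, 5, 4)
print(parse_pg_version(100001))  # (10, 1)
```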

If you can run raw SQL queries, just run the following:
SELECT version();

If you are using Ubuntu, you can run
psql -V
to check the PostgreSQL client version (note that the V must be a capital letter).
To check the version of psycopg2, you can run the following command:
pip freeze | grep psycopg2
Alternatively, you can run
pip freeze
to list all installed packages with their respective versions.
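If grep is unavailable (e.g. on Windows), the installed version can also be read from Python itself. A small sketch using only the standard library (Python 3.8+; the helper name is my own):

```python
from importlib import metadata

def pkg_version(name: str):
    """Return the installed version of a package, or None if it is absent."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None

print(pkg_version("psycopg2"))  # e.g. '2.8.4', or None if not installed
```

psycopg2 also exposes its own version string as psycopg2.__version__ once it is importable.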

import os
import psycopg2

conn_str = {
    'user': os.environ['USERNAME'].lower(),
    'host': 'POSTGRES_SERVER_NAME',  # replace with your server's host name
    'port': 5432,
    'database': 'database_name'
}
conn = psycopg2.connect(**conn_str)
cursor = conn.cursor()
cursor.execute('SELECT VERSION()')
row = cursor.fetchone()
print(row)
cursor.close()
conn.close()
# output:
# ('PostgreSQL 10.1, compiled by Visual C++ build 1800, 64-bit',)

Related

Pyodbc connection with amazon rds postgres database produces error when executing SQL commands (syntax error)

I have set up a connection between pyodbc and an AWS RDS PostgreSQL database and have installed psqlodbc (which is what the PostgreSQL Unicode(x64) ODBC driver is). Everything looks fine until I run a SQL query. It returns a syntax error, but there is nothing wrong with my syntax. I'm not exactly sure what the issue could be.
This is Python 3.7, by the way.
import pyodbc
mypw = 'skjhaf234234dkjhkjx'
string = 'Driver={PostgreSQL Unicode(x64)};Server=myfakeserveraddress.rds.amazonaws.com;Database=mydb;UID=myusername;PWD='+mypw+';'
connection = pyodbc.connect(string)
c = connection.cursor()
c.execute("SELECT * FROM schema_table.test_table;")
Error Message:
Traceback (most recent call last):
File "", line 1, in
pyodbc.ProgrammingError: ('42601', '[42601] ERROR: syntax error at or near "'schema_table.test_table'";\nError while executing the query (1) (SQLExecDirectW)')
Without the single quotation marks ' surrounding the table name, I get this error
c.execute("SELECT * from schema_table.test_table")
Traceback (most recent call last):
File "", line 1, in
pyodbc.ProgrammingError: ('25P02', '[25P02] ERROR: current transaction is aborted, commands ignored until end of transaction block;\nError while executing the query (1) (SQLExecDirectW)')
PS My company has disabled pip installs so I cannot upgrade my packages and am limited to using only a few packages (including this one).
How can I execute my commands without errors?
It seems I have figured it out: I added autocommit=False to the connection initialization and it seems fine now. Perhaps it has something to do with the underlying parsing of the SQL commands. Keeping the question in case it helps someone.
import pyodbc
mypw = 'skjhaf234234dkjhkjx'
string = 'Driver={PostgreSQL Unicode(x64)};Server=myfakeserveraddress.rds.amazonaws.com;Database=mydb;UID=myusername;PWD='+mypw+';'
connection = pyodbc.connect(string, autocommit=False)
c = connection.cursor()
c.execute("SELECT * FROM schema_table.test_table;")

PostGIS: function ST_AsRaster does not exist. Even using examples from the docs

I'm trying to convert geometries to images, and the functions to do so don't seem to exist.
The following example is from the ST_AsRaster docs, which specify the requirements as: Availability: 2.0.0 - requires GDAL >= 1.6.0.
SELECT ST_AsPNG(ST_AsRaster(ST_Buffer(ST_Point(1,5),10),150, 150));
This results in:
ERROR: function st_asraster(geometry, integer, integer) does not exist
LINE 1: SELECT ST_AsPNG(ST_AsRaster(ST_Buffer(ST_Point(1,5),10),150,...
I found some info that points towards needing GDAL drivers, however, when I try:
SELECT short_name, long_name FROM ST_GdalDrivers();
I get:
ERROR: function st_gdaldrivers() does not exist
LINE 1: SELECT short_name, long_name FROM ST_GdalDrivers();
I have no idea where to even start solving this. Why don't the functions exist? Was there some config I needed to add, or some doc I didn't read?
Even https://postgis.net/docs/RT_reference.html seems to suggest that it should "just work".
This is installed from the package manager on Ubuntu 20.04.
Version Info SELECT PostGIS_Full_Version();:
POSTGIS="3.0.0 r17983" [EXTENSION]
PGSQL="120"
GEOS="3.8.0-CAPI-1.13.1 "
PROJ="6.3.1"
LIBXML="2.9.4"
LIBJSON="0.13.1"
LIBPROTOBUF="1.3.3"
WAGYU="0.4.3 (Internal)"
You must have forgotten to install the postgis_raster extension:
CREATE EXTENSION postgis_raster;
This extension is new in PostGIS 3.0; before that, its objects were part of the postgis extension.
The documentation mentions that:
Once postgis is installed, it needs to be enabled in each individual database you want to use it in.
psql -d yourdatabase -c "CREATE EXTENSION postgis;"
-- if you built with raster support and want to install it --
psql -d yourdatabase -c "CREATE EXTENSION postgis_raster;"
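To verify which extensions a database actually has, the pg_extension catalog can be queried. A sketch (the helper name is my own) that works with any DB-API cursor, e.g. one from psycopg2:

```python
def missing_extensions(cur, wanted=("postgis", "postgis_raster")):
    """Return the names in `wanted` that are not installed as extensions."""
    cur.execute("SELECT extname FROM pg_extension")
    installed = {row[0] for row in cur.fetchall()}
    return [name for name in wanted if name not in installed]
```

If "postgis_raster" shows up in the result, running CREATE EXTENSION postgis_raster in that database is the missing step.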

How to stop 'import psycopg2' from causing an Exception when starting an Azure Container?

I am trying to deploy a Django REST API using Azure App Service on Linux. I am using a PostgreSQL database and deploy via a pipeline. Azure has PostgreSQL 9.6. After running my pipeline, the website shows a Server Error (500).
The app logs show that the container couldn't be started due to a failed import of psycopg2.
[ERROR] Exception in worker process
Traceback (most recent call last):
File "/home/site/wwwroot/antenv/lib/python3.7/site-packages/django/db/backends/postgresql/base.py", line 25, in
import psycopg2 as Database
File "/home/site/wwwroot/antenv/lib/python3.7/site-packages/psycopg2/__init__.py", line 50, in
from psycopg2._psycopg import ( # noqa
ImportError: /home/site/wwwroot/antenv/lib/python3.7/site-packages/psycopg2/_psycopg.cpython-37m-x86_64-linux-gnu.so: undefined symbol: PQencryptPasswordConn
In the Build-stage of my pipeline, I set up my environment (python3.7) like this:
- script: |
python -m venv antenv
source antenv/bin/activate
python -m pip install --upgrade pip
pip install setup
pip install -r requirements.txt
Where requirements.txt looks like this:
Django==3.0.2
djangorestframework==3.11.0
psycopg2-binary==2.8.4
pandas==0.25.3
pytest==5.3.5
pytest-django==3.8.0
pytest-mock==2.0.0
python-dateutil==2.8.1
sqlparse==0.3.0
whitenoise==5.0.1
BuildJob and DeploymentJob seem to run flawlessly. The build logs indicate that psycopg2_binary-2.8.4-cp37-cp37m-manylinux1_x86_64.whl was correctly downloaded and installed.
Also the App runs fine on my machine when using the database on azure by configuring in the settings.py:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'databasename',
'USER': 'user#postgresqlserver',
'PASSWORD': 'Password',
'HOST': 'postgresqlserver.postgres.database.azure.com',
'PORT': '',
'OPTIONS': {'sslmode': 'require'}
}
} # Of course the info is actually saved in environment variables
This gives me the feeling that something with the psycopg2 installation is not working. For others, psycopg2-binary seemed to do the trick, but unfortunately not for me.
Am I right to assume that on Azure I'm neither able to install PostgreSQL 10 as suggested here https://github.com/psycopg/psycopg2/issues/983 nor able to install from source as suggested here https://github.com/psycopg/psycopg2/issues/1018?
There must be something I am missing, I would be grateful for any advice!
EDIT:
Taking a look at the library (as suggested here https://stackoverflow.com/a/59652816/13183775) I found that I don't have a PQencryptPasswordConn function but only a PQencryptPassword function. I have the feeling that this is expected for Postgresql9.6 (https://github.com/psycopg/psycopg2/blob/cb3353be1f10590cdc2a894ada42c3b4c171feb7/psycopg/psycopgmodule.c#L466).
To check whether there are multiple versions of libpq:
/>find . -name "libpq*"
./var/lib/dpkg/info/libpq5:amd64.symbols
./var/lib/dpkg/info/libpq5:amd64.shlibs
./var/lib/dpkg/info/libpq5:amd64.list
./var/lib/dpkg/info/libpq5:amd64.triggers
./var/lib/dpkg/info/libpq-dev.list
./var/lib/dpkg/info/libpq5:amd64.md5sums
./var/lib/dpkg/info/libpq-dev.md5sums
./usr/share/doc/libpq5
./usr/share/doc/libpq-dev
./usr/share/locale/ko/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/it/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/pl/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/zh_TW/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/tr/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/cs/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/de/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/ru/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/sv/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/pt_BR/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/fr/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/es/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/zh_CN/LC_MESSAGES/libpq5-9.6.mo
./usr/share/locale/ja/LC_MESSAGES/libpq5-9.6.mo
./usr/lib/x86_64-linux-gnu/pkgconfig/libpq.pc
./usr/lib/x86_64-linux-gnu/libpq.so.5
./usr/lib/x86_64-linux-gnu/libpq.so
./usr/lib/x86_64-linux-gnu/libpq.so.5.9
./usr/lib/x86_64-linux-gnu/libpq.a
./usr/include/postgresql/libpq-events.h
./usr/include/postgresql/libpq-fe.h
./usr/include/postgresql/libpq
./usr/include/postgresql/libpq/libpq-fs.h
./usr/include/postgresql/internal/libpq
./usr/include/postgresql/internal/libpq-int.h>
Sadly, I'm not able to see here whether there are multiple libpq versions...
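One way to test the suspicion directly: PQencryptPasswordConn was added in the libpq shipped with PostgreSQL 10, and the libpq.so.5.9 in the listing is the 9.6-era library, which would not export it. A hedged sketch (the helper name is my own) that probes the system libpq with ctypes:

```python
import ctypes
import ctypes.util

def libpq_has_symbol(symbol: str):
    """Return True/False if the system libpq exports `symbol`,
    or None if no libpq can be located or loaded."""
    path = ctypes.util.find_library("pq")
    if path is None:
        return None
    try:
        lib = ctypes.CDLL(path)
    except OSError:
        return None
    # CDLL attribute lookup raises AttributeError for missing symbols,
    # so hasattr doubles as a symbol probe.
    return hasattr(lib, symbol)

print(libpq_has_symbol("PQencryptPasswordConn"))
```

On the 9.6 image this should print False, which would confirm that the psycopg2 build expects a newer libpq than the one the container provides.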

Chef PostgreSQL Cookbook Installs Wrong Version on RHEL 7

Running the latest version (6.1.1) of the postgresql Chef cookbook (https://supermarket.chef.io/cookbooks/postgresql) with
node.default['postgresql']['enable_pgdg_yum'] = 'true'
node.default['postgresql']['version'] = '9.3'
This installs postgresql in /var/lib/pgsql/9.3, but running
psql -V
returns
psql (PostgreSQL) 9.2.33
You have to override more than just the version: at least the version, dir, and the client, contrib, and server package attributes:
node.default["postgresql"]["version"] = "9.3"
node.default["postgresql"]["dir"] = "/etc/postgresql/9.3/main"
node.default["postgresql"]["client"]["packages"] = ["postgresql-client-9.3", "libpq-dev"]
node.default["postgresql"]["server"]["packages"] = ["postgresql-9.3"]
node.default["postgresql"]["contrib"]["packages"] = ["postgresql-contrib-9.3"]
This is just an example; I am not sure about the package names, so double-check them. The issue is due to the way Ruby evaluates the attribute strings: the dependent defaults are built from the version string when the attribute files are loaded, so overriding the version alone does not update them.

Loading Data from PostgreSQL into Stata

When I load data from PostgreSQL into Stata, some of the data has unexpected characters appended. How can I avoid this?
Here is the Stata code I am using:
odbc query mydatabase, schema $odbc
odbc load, exec("SELECT * FROM my_table") $odbc allstring
Here is an example of the output I see:
198734/0 one/0/r April/0/0/0
893476/0 two/0/r May/0/0/0
324192/0 three/0/r June/0/0/0
In Postgres the data is:
198734 one April
893476 two May
324192 three June
I see this mostly in larger tables and with fields of all datatypes in PostgreSQL. If I export the data to a csv there are no trailing characters.
The odbc.ini file I am using looks like this:
[ODBC Data Sources]
mydatabase = PostgreSQL
[mydatabase]
Debug = 1
CommLog = 1
ReadOnly = no
Driver = /usr/lib64/psqlodbcw.so
Servername = myserver
Servertype = postgres
FetchBufferSize = 99
Port = 5432
Database = mydatabase
[Default]
Driver = /usr/lib64/psqlodbcw.so
I am using odbc version unixODBC 2.3.1 and PostgreSQL version 9.4.9 with server encoding UTF8 and Stata version 14.1.
What is causing the unexpected characters in the data imported into Stata? I know that I can clean the data once it’s in Stata but I would like to avoid this.
I was able to fix this by adding the line
set odbcdriver ansi
to the Stata code.