Delete a rule within PostgreSQL

I have created a rule using the following syntax:
CREATE [ OR REPLACE ] RULE name AS ON event
TO table [ WHERE condition ]
DO [ ALSO | INSTEAD ] { NOTHING | command | ( command ; command ... ) }
I now want to delete this rule and have been searching the documentation for how to do so, but I cannot seem to find the answer.
Any advice would be appreciated.
Thanks.

Starting with PostgreSQL 8.2, the syntax is:
DROP RULE [ IF EXISTS ] name ON relation [ CASCADE | RESTRICT ]
Example:
DROP RULE newrule ON mytable;
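The optional clauses can be combined when the rule may already be gone or when other objects depend on it (using the same hypothetical names as above):

```sql
-- Succeeds even if the rule is absent; CASCADE also drops objects that depend on it
DROP RULE IF EXISTS newrule ON mytable CASCADE;
```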

The syntax is:
DROP RULE
ref: https://www.postgresql.org/docs/current/static/sql-droprule.html

Related

Is there a way to upload 212-column csv files in PostgreSQL?

I have a csv file with 122 columns that I am trying to import into Postgres. I am trying this:
create table appl_train ();
\copy appl_train FROM '/path/ to /file' DELIMITER ',' CSV HEADER;
I get this error
ERROR: extra data after last expected column
CONTEXT: COPY application_train, line 2: "0,100001,Cash loans,F,N,Y,0,135000.0,568800.0,20560.5,450000.0,Unaccompanied,Working,Higher educatio..."
The error message means that your table has fewer columns than your csv file.
If the DDL of your table is exactly what you reported, you created a table with no columns. You have to list (at least) every column name and data type when creating a table, as shown in the documentation:
CREATE [ [ GLOBAL | LOCAL ] { TEMPORARY | TEMP } | UNLOGGED ] TABLE [ IF NOT EXISTS ] table_name ( [
{ column_name data_type [ COLLATE collation ] [ column_constraint [ ... ] ]
| table_constraint
| LIKE parent_table [ like_option ... ] }
[, ... ]
] )
[ INHERITS ( parent_table [, ... ] ) ]
[ WITH ( storage_parameter [= value] [, ... ] ) | WITH OIDS | WITHOUT OIDS ]
[ ON COMMIT { PRESERVE ROWS | DELETE ROWS | DROP } ]
[ TABLESPACE tablespace ]
In your code you should have something like this:
create table appl_train (
first_column_name integer,
second_column_name integer,
third_column_name character varying (20),
-- more columns here
)
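A minimal end-to-end sketch of the fix; the column names here are only placeholders, since the real file has 122 columns that must be listed in the same order as the CSV:

```sql
-- Column list must match the CSV columns in order (only a few shown here)
CREATE TABLE appl_train (
    sk_id_curr    integer,
    target        integer,
    contract_type text
    -- ... remaining columns omitted
);

-- HEADER skips the first line of the file (the column names)
\copy appl_train FROM '/path/to/file.csv' DELIMITER ',' CSV HEADER;
```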

Change Json text to json array

Currently the data in my table looks like this:
Field name: author
Field data: in JSON form
When we run the select query
SELECT bs.author FROM books bs;
it returns data like this:
"[{\"author_id\": 1, \"author_name\": \"It happend once again\", \"author_slug\": \"fiction-books\"}]"
But I need the selected data to look like this:
[
{
"author_id": 1,
"author_name": "It happend once again",
"author_slug": "fiction-books"
}
]
Database : PostgreSql
Note : Please avoid PHP code or iteration by PHP code
The answer depends on the version of PostgreSQL you are using, and also on what client you are using, but PostgreSQL has many built-in JSON processing functions:
https://www.postgresql.org/docs/10/functions-json.html
Your goal is also not clearly defined. If all you want to do is pretty-print the JSON, that is included:
# select jsonb_pretty('[{"author_id": 1,"author_name":"It happend once again","author_slug":"fiction-books"}]') as json;
json
-------------------------------------------------
[ +
{ +
"author_id": 1, +
"author_name": "It happend once again",+
"author_slug": "fiction-books" +
} +
]
If instead you're looking for how to populate a postgres record set from json, this is also included:
# select * from json_to_recordset('[{"author_id": 1,"author_name":"It happend once again","author_slug":"fiction-books"}]')
as x(author_id text, author_name text, author_slug text);
author_id | author_name | author_slug
-----------+-----------------------+---------------
1 | It happend once again | fiction-books
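The escaped output in the question (`"[{\"author_id\": ...}]"`) suggests the array was stored as a JSON-encoded *string* rather than as a JSON array. If that is the case, one sketch of an unwrapping query, assuming `author` is a `json`/`jsonb` column (if it is `text`, cast it to `jsonb` first):

```sql
-- #>> '{}' extracts the scalar string value; the outer ::jsonb
-- then parses that string as a real JSON array
SELECT (bs.author #>> '{}')::jsonb AS author
FROM books bs;
```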

Postgresql grant command syntax

I'm having trouble with the syntax to grant a developer the ability to create or replace a function. The syntax guide doesn't seem to show how to do this. Can anyone provide the correct syntax?
Sorry, still earning the reps for comments.
Can you please share your GRANT statement and the complete error message?
I created a test table with a few rows of data, and a test function that returns a rowcount on the test table.
I granted execute to user2 with no errors:
grant execute on function f_totrecords() to user2 with grant option ;
It's in the docs - search the page for "function" and notice the "with grant option" in the syntax:
GRANTS
GRANT { EXECUTE | ALL [ PRIVILEGES ] }
ON { FUNCTION function_name ( [ [ argmode ] [ arg_name ] arg_type [, ...] ] ) [, ...]
| ALL FUNCTIONS IN SCHEMA schema_name [, ...] }
TO { [ GROUP ] role_name | PUBLIC } [, ...] [ WITH GRANT OPTION ]
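For example, to grant execute on every function in a schema at once, and to cover functions created later, something like this could work (the role and schema names are placeholders):

```sql
-- All functions that currently exist in the schema
GRANT EXECUTE ON ALL FUNCTIONS IN SCHEMA public TO developer_role;

-- Functions created in the schema in the future
ALTER DEFAULT PRIVILEGES IN SCHEMA public
    GRANT EXECUTE ON FUNCTIONS TO developer_role;
```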

Import csv file beginning on specific line number

I want to import a csv file into a table beginning on line 9 of the csv file. How do I specify this condition in postgresql?
The first 8 lines have a bunch of irrelevant text describing the data below. This is a screenshot of the file imported into Excel.
And this is the table in my db I am trying to insert the data into.
CREATE TABLE trader.weather
(
station text NOT NULL,
"timestamp" timestamp with time zone NOT NULL,
temp numeric(6,2),
wind numeric(6,2)
)
It can't be done in PostgreSQL alone; you would have to do it with an external tool or process before the data reaches Postgres.
According to the manual, the only CSV-specific processing COPY can do is mostly QUOTE- or NULL-related:
COPY table_name [ ( column_name [, ...] ) ]
FROM { 'filename' | STDIN }
[ [ WITH ]
[ BINARY ]
[ OIDS ]
[ DELIMITER [ AS ] 'delimiter' ]
[ NULL [ AS ] 'null string' ]
[ CSV [ HEADER ]
[ QUOTE [ AS ] 'quote' ]
[ ESCAPE [ AS ] 'escape' ]
[ FORCE NOT NULL column_name [, ...] ] ] ]
COPY { table_name [ ( column_name [, ...] ) ] | ( query ) }
TO { 'filename' | STDOUT }
[ [ WITH ]
[ BINARY ]
[ OIDS ]
[ DELIMITER [ AS ] 'delimiter' ]
[ NULL [ AS ] 'null string' ]
[ CSV [ HEADER ]
[ QUOTE [ AS ] 'quote' ]
[ ESCAPE [ AS ] 'escape' ]
[ FORCE QUOTE { column_name [, ...] | * } ] ] ]
There are many ways to alter a CSV automatically before using it in PostgreSQL, you should check other options.
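One workaround that stays inside Postgres (9.3 and later) is COPY ... FROM PROGRAM, which lets a shell command strip the leading lines before COPY parses anything. It requires superuser privileges (or, on newer versions, membership in pg_execute_server_program), and `tail -n +9` here is an assumption that the data rows start at line 9:

```sql
-- tail -n +9 prints the file starting at line 9, skipping the first 8 lines
COPY trader.weather (station, "timestamp", temp, wind)
FROM PROGRAM 'tail -n +9 /absolute/path/to/file.csv'
WITH (FORMAT csv);
```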
It can be done with Postgres, just not with COPY directly.
Use a temporary staging table like this:
CREATE TEMP TABLE target_tmp AS
TABLE target_tbl LIMIT 0; -- create temp table with same columns as target table
COPY target_tmp FROM '/absolute/path/to/file' (FORMAT csv);
INSERT INTO target_tbl
TABLE target_tmp
OFFSET 8; -- start with line 9
DROP TABLE target_tmp; -- optional, else it's dropped at end of session automatically
The skipped rows must still be valid rows, since COPY parses them before they are discarded.
Obviously, this is more expensive, which should not matter much for small to medium tables. It does matter for big tables; there you really should trim the surplus rows from the input file before importing.
Make sure your temp_buffers setting is big enough to hold the temp table, to minimize the performance penalty.
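Note that temp_buffers can only be changed before the first use of temporary tables in a session, so set it up front; 64MB here is an arbitrary example value:

```sql
-- Must run before the session touches any temp table
SET temp_buffers = '64MB';
```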
Related (with instructions for \copy without superuser privileges):
How to update selected rows with values from a CSV file in Postgres?

How to create session level table in PostgreSQL?

I am working on an application using Spring, Hibernate, and PostgreSQL 9.1. The requirement is user can upload bulk data from the browser.
Now the data uploaded by each user is very crude and requires a lot of validation before it can be put into the actual transaction table. I want a temporary table to be created whenever a user uploads; after the data is successfully dumped into this temp table, I will call a procedure to do the actual work of validating and moving the data from the temp table to the transaction table. If an error is encountered anywhere, I will dump logs to another table so the user can see the status of their upload from the browser.
In PostgreSQL do we have anything like temporary, session-level table?
From the 9.1 manual:
CREATE [ [ GLOBAL | LOCAL ] { TEMPORARY | TEMP } | UNLOGGED ] TABLE [ IF NOT EXISTS ] table_name ( [
{ column_name data_type [ COLLATE collation ] [ column_constraint [ ... ] ]
| table_constraint
| LIKE parent_table [ like_option ... ] }
[, ... ]
] )
[ INHERITS ( parent_table [, ... ] ) ]
[ WITH ( storage_parameter [= value] [, ... ] ) | WITH OIDS | WITHOUT OIDS ]
[ ON COMMIT { PRESERVE ROWS | DELETE ROWS | DROP } ]
[ TABLESPACE tablespace ]
The key word here is TEMPORARY, although the table does not necessarily have to be temporary: it could be a permanent table that you truncate before inserting. The whole operation (inserting and validating) would have to be wrapped in a transaction.
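For the upload workflow described above, a per-session staging table could look like this sketch (the table and column names are only placeholders):

```sql
-- Visible only to the current session; ON COMMIT DELETE ROWS empties it
-- at the end of each transaction, which fits a per-upload validation pass
CREATE TEMP TABLE upload_staging (
    user_id  integer,
    raw_line text
) ON COMMIT DELETE ROWS;
```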