ERROR: could not open file "file_name" for reading: Operation not permitted SQL state: 42501 - import

I'm using pgAdmin and can create the schema and table.
I then right-click on the table, open the Import/Export tool, choose the file, set the format to CSV, tick the Header checkbox, set the delimiter to ',', and click 'Import', but I get an error:
ERROR: could not open file "file_name" for reading: Operation not permitted SQL state: 42501
1.) Tried importing the file manually by right-clicking on my table and using the Import/Export dialog.
Did not work
2.) Used copy 'table_name' from 'file_path' DELIMITER ',' CSV HEADER
Did not work
3.) Went to the file itself and granted PostgreSQL permission to read and write the file.
Did not work
4.) Made sure the CSV was not open in another program while trying all of these methods.
Did not work
5.) Used the \copy method
Did not work; the error said:
ERROR: syntax error at or near ""
SQL state: 42601
Character: 1
6.) What are other ways around this?
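For what it's worth, the 42501 error usually arises because a server-side COPY ... FROM 'file' is read by the PostgreSQL server process under its own Windows account, which often cannot see files in your user profile. A client-side copy avoids that: psql's \copy reads the file as you and streams it to the server. A minimal sketch, run from a psql prompt rather than pgAdmin's Query Tool (table_name and file_path are the placeholders from above):
\copy table_name FROM 'file_path' DELIMITER ',' CSV HEADER
The syntax error reported in attempt 5 (at character 1) is consistent with pasting a psql meta-command into something that only understands plain SQL, such as the Query Tool.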

Related

Problems encountered in recovering three tables from a dump file

I'm trying to restore tables from a dump file. A footnote in the paper "VCCFinder: Finding Potential Vulnerabilities in Open-Source Projects to Assist Code Audits" states that the dump file the team created with pg_dump can be read with pg_restore (the footnote is emphasized with a red underline in the paper). That's where I started.
1. Use pg_restore command
By typing the command mentioned in the paper VCCFinder: Finding Potential Vulnerabilities in Open-Source Projects to Assist Code Audits:
pg_restore -f vcc_base I:\OneDrive\PractiseProject\x_prjs\m_firmware_scan\m_firmware_scan.ref\vcc-database\vccfinder-database.dump
Windows CMD returned an error message:
pg_restore: error: input file appears to be a text format dump. Please use psql.
I tried the operation with different versions, including v14.4, v9.6, v9.4 and v9.3, and the outcome was the same error message.
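As an aside, pg_restore decides the format from the file itself, so listing the archive's table of contents is a quick sanity check; a sketch using the same path:
pg_restore -l I:\OneDrive\PractiseProject\x_prjs\m_firmware_scan\m_firmware_scan.ref\vcc-database\vccfinder-database.dump
On a plain-text SQL dump this fails with the same "input file appears to be a text format dump" message, which already tells you the file is meant for psql.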
2. Use psql command
Then I turned in another direction: using psql. After typing the command
psql -v ON_ERROR_STOP=1 -U postgres < I:\OneDrive\PractiseProject\x_prjs\m_firmware_scan\m_firmware_scan.ref\vcc-database\vccfinder-database.dump
in every environment apart from PostgreSQL 14.4, the returned error message was:
psql: SCRAM authentication requires libpq version 10 or above
Under the PostgreSQL 14.4 environment, the returned message became:
SET
SET
SET
SET
SET
SET
ERROR: schema "export" already exists
If I remove the -v ON_ERROR_STOP=1 option, the returned message looks like this:
SET
SET
SET
SET
SET
SET
ERROR: schema "export" already exists
SET
SET
SET
ERROR: type "public.hstore" does not exist
LINE 27: patch_keywords public.hstore
^
ERROR: relation "cves" already exists
ERROR: relation "repositories" already exists
ERROR: relation "commits" does not exist
invalid command \n
invalid command \N
invalid command \N
...
(Solved) I tried to solve the garbled-output problem shown in the above error messages by typing chcp 65001, chcp 437, etc. to change the character set to UTF-8 or American English in Windows CMD, but it didn't help. After viewing the source of the dump file in Visual Studio, though, it's not difficult to infer that those error messages were caused by psql commands embedded in the dump file.
After the error messages became understandable, I focused on one particular error message:
ERROR: type "public.hstore" does not exist
LINE 27: patch_keywords public.hstore
So I manually created an "hstore" type under the "public" schema; after that, the error messages turned into these:
SET
SET
SET
SET
SET
SET
SET
ERROR: schema "export" already exists
SET
SET
SET
ERROR: relation "commits" already exists
ERROR: relation "cves" already exists
ERROR: relation "repositories" already exists
ERROR: malformed record literal: ""do"=>"1", "if"=>"0", "asm"=>"41", "for"=>"5", "int"=>"13", "new"=>"0", "try"=>"0", "auto"=>"0", "bool"=>"0", "case"=>"0", "char"=>"1", "else"=>"0", "enum"=>"0", "free"=>"0", "goto"=>"0", "long"=>"15", "this"=>"0", "true"=>"0", "void"=>"49", "alloc"=>"0", "break"=>"0", "catch"=>"0", "class"=>"0", "const"=>"0", "false"=>"0", "float"=>"0", "short"=>"0", "throw"=>"0", "union"=>"0", "using"=>"0", "while"=>"1", "alloca"=>"0", "calloc"=>"0", "delete"=>"0", "double"=>"0", "extern"=>"4", "friend"=>"0", "inline"=>"18", "malloc"=>"0", "public"=>"0", "return"=>"4", "signed"=>"1", "sizeof"=>"0", "static"=>"32", "struct"=>"4", "switch"=>"0", "typeid"=>"0", "default"=>"0", "mutable"=>"0", "private"=>"0", "realloc"=>"0", "typedef"=>"0", "virtual"=>"0", "wchar_t"=>"0", "continue"=>"0", "explicit"=>"0", "operator"=>"0", "register"=>"0", "template"=>"0", "typename"=>"0", "unsigned"=>"23", "volatile"=>"23", "namespace"=>"0", "protected"=>"0", "const_cast"=>"0", "static_cast"=>"0", "dynamic_cast"=>"0", "reinterpret_cast"=>"0""
DETAIL: Missing left parenthesis.
CONTEXT: COPY commits, line 1, column patch_keywords: ""do"=>"1", "if"=>"0", "asm"=>"41", "for"=>"5", "int"=>"13", "new"=>"0", "try"=>"0", "auto"=>"0", "bo..."
ERROR: syntax error at or near "l022_save"
LINE 1: l022_save, pl022_load, s);
^
invalid command \n
invalid command \N
invalid command \N
...
Now the three tables have been created, but there is no content in them.
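A side note: because the earlier attempts had already created the export schema and its tables, every re-run of the dump trips over the "already exists" errors first. One way to start from a clean slate before re-running psql is to drop the half-created schema (a sketch; note that CASCADE removes everything under export):
DROP SCHEMA export CASCADE;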
3. Install hstore
After searching for "hstore type does not exist with hstore installed postgresql", I realized that hstore needs to be installed as an extension rather than created manually. So I typed this at the psql command line:
postgres=# CREATE EXTENSION hstore;
And there were new error messages:
SET
SET
SET
SET
SET
SET
SET
ERROR: schema "export" already exists
SET
SET
SET
CREATE TABLE
ERROR: relation "cves" already exists
ERROR: relation "repositories" already exists
ERROR: missing data for column "hunk_count"
CONTEXT: COPY commits, line 23201: "11388700 178 \N other_commit 1d6198c3b01619151f3227c6461b3d53eeb711e5\N blueswir1#c046a42c-6fe2-441..."
ERROR: syntax error at or near "l022_save"
LINE 1: l022_save, pl022_load, s);
^
invalid command \n
invalid command \N
invalid command \N
...
And still, there was no content in those three tables.
4. Generate and view tables
After looking into the source of the dump file and trying to fix the "hunk_count" problem without success, it occurred to me that the error messages above were caused by one particular row of data. So I deleted that row; the old error messages were gone, but there were new error messages caused by another row. Eventually I deleted 10 rows in total. Compared to the total row count of 351409, those deleted parts are negligible, and the three tables weren't empty anymore, as shown in pgAdmin 4.
However, pgAdmin only showed the structure of those tables; I still didn't know how to view their content. Referring to 2 Ways to View the Structure of a Table in PostgreSQL, I typed
SELECT *
FROM export.repositories -- or export.cves / export.commits
WHERE TRUE
to generate and view the corresponding tables in pgAdmin 4. For example, the final cves table:
5. In the end
Looking back, these are all easy steps, but for someone who was not familiar with the tools or operations, it could cost several days of searching and typing, step by step, for one simple purpose. I hope this post is useful to someone like me.
However, I am not very familiar with psql commands or anything else about PostgreSQL; as a matter of fact, I had never used them before. So I'm wondering if someone could point out mistakes I may have made in those attempts, or offer some suggestions for my dilemma.
First, determine your dump format.
Read the header (the first 5 characters) of the dump file.
If it starts with PGDMP, it is a binary/custom-format dump; otherwise it is a plain SQL (human-readable) dump.
- use pg_restore to import a binary/custom-format dump.
$ pg_restore -U postgres -d <dbname> file.dump
- use psql to import a plain-text SQL dump.
$ psql -U postgres -d <dbname> < file.dump
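A quick way to read those first 5 characters, for example from a Unix-like shell or Git Bash (on Windows you can simply open the file in a text editor):
$ head -c 5 file.dump
A custom-format archive begins with the magic string PGDMP, while a plain SQL dump starts with ordinary SQL text and comments.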
Solved, as I've demonstrated above.

PostgreSQL COPY ERROR Invalid syntax. How to set data from dump?

I have a database dump, and I need to import it into a new empty database.
COPY public.accounts_account (id, username, password, first_name, last_name, street_address, city, state, zip, daytime_phone, evening_phone, email, membership, total_purchase_amount, current_discount, registered_at, membership_approved) FROM stdin;
53 user53 password53 Name53 Last53 afd462740737a3801e90c6d050e81b88 Wilmette IL 60091 123.456.786 user53#obfuscated.com 7590 102.00 0 2011-03-24 03:52:23+00 t
I get this error:
ERROR: syntax error at or near "53"
LINE 4566: 53 user53 password53 Name53 Last53 afd462740737a3801e90c6d05...
^
********** Error **********
ERROR: syntax error at or near "53"
SQL state: 42601
Character: 132900
Your COPY command failed for some reason (table doesn't exist, column doesn't exist, you don't have permissions to insert into it, etc.). Since PostgreSQL did not go into COPY mode, it tried to interpret the next line as another command, rather than as data. Look earlier in your log file to see what the initial error is.
This database dump looks like a query you can run from pgAdmin's Query Tool, but it is not, and that is why you're getting this error.
Copy and paste everything in that database dump into a psql session and it should run correctly.
You can usually launch a psql session from pgAdmin's Tools drop-down menu.
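If the dump is saved as a file, a minimal sketch of running it through psql from the command line (dump.sql and mydb are hypothetical names for your dump file and target database):
psql -U postgres -d mydb -f dump.sql
Run this way, psql switches into COPY mode when it reaches COPY public.accounts_account ... FROM stdin; and reads the tab-separated rows that follow as data instead of trying to parse them as SQL.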

Multi line command (to export .csv) not working in Apache Drill (web interface)

I am trying to use Apache Drill to export a .csv file. This other question indicated that this is achieved by:
use dfs.tmp;
alter session set `store.format`='csv';
create table dfs.tmp.my_output as select * from cp.`employee.json`;
I tried running this block (of three commands) at once in the Apache Drill web interface but got the error below. It somehow is not recognizing the ; or not accepting multiple commands.
I also tried running each line separately, without the ;, but the settings from the first two commands did not persist (and the export command, the third one, defaulted back to exporting a Parquet file, the configured default).
How can I run this in Drill?
Query Failed: An Error Occurred
org.apache.drill.common.exceptions.UserRemoteException: PARSE ERROR: Encountered ";" at line 1, column 12.
Was expecting one of: <EOF> "." ... "[" ...
SQL Query use dfs.tmp; ^ alter session set `store.format`='csv'; create table dfs.tmp.`elos_cnis` as select * from dfs.tmp.`/bases_parquet/elos_cnis`
[Error Id: 00493fbe-924e-43e9-a684-f7d1abfed04e on sbsb35.ipea.gov.br:31010]
(org.apache.calcite.sql.parser.SqlParseException) Encountered ";" at line 1, column 12. Was expecting one of: <EOF> "." ... "[" ...
org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.convertException():391
org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.normalizeException():121
org.apache.calcite.sql.parser.SqlParser.parseStmt():149
org.apache.drill.exec.planner.sql.SqlConverter.parse():157
org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():104
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():79
org.apache.drill.exec.work.foreman.Foreman.runSQL():1017
org.apache.drill.exec.work.foreman.Foreman.run():289
java.util.concurrent.ThreadPoolExecutor.runWorker():1142
java.util.concurrent.ThreadPoolExecutor$Worker.run():617
java.lang.Thread.run():748
Caused By (org.apache.drill.exec.planner.sql.parser.impl.ParseException) Encountered ";" at line 1, column 12. Was expecting one of: <EOF> "." ... "[" ...
org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.generateParseException():17963
org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.jj_consume_token():17792
org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.SqlStmtEof():861
org.apache.drill.exec.planner.sql.parser.impl.DrillParserImpl.parseSqlStmtEof():180
org.apache.drill.exec.planner.sql.parser.impl.DrillParserWithCompoundIdConverter.parseSqlStmtEof():59
org.apache.calcite.sql.parser.SqlParser.parseStmt():142
org.apache.drill.exec.planner.sql.SqlConverter.parse():157
org.apache.drill.exec.planner.sql.DrillSqlWorker.getQueryPlan():104
org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan():79
org.apache.drill.exec.work.foreman.Foreman.runSQL():1017
org.apache.drill.exec.work.foreman.Foreman.run():289
java.util.concurrent.ThreadPoolExecutor.runWorker():1142
java.util.concurrent.ThreadPoolExecutor$Worker.run():617
java.lang.Thread.run():748
Drill Web-UI does not support submitting multiple queries within the same query page. Please try using SqlLine (see the sketch below), or submit them in the Web-UI one by one:
alter system set `store.format`='csv';
This query sets store.format at the system level, since the Web-UI does not keep session options by default. After that, submit the following query:
create table dfs.tmp.my_output as select * from cp.`employee.json`;
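Alternatively, a sketch of running all three statements in one SqlLine session (the connection string assumes Drill in embedded mode; adjust the ZooKeeper address for a cluster):
sqlline -u "jdbc:drill:zk=local"
use dfs.tmp;
alter session set `store.format`='csv';
create table dfs.tmp.my_output as select * from cp.`employee.json`;
Because SqlLine keeps the session open, the alter session setting is still in effect when the create table statement runs.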

Error writing to database in Moodle in Find and Replace tool

I have moved my Moodle database files from the old server URLold (https://example1.com) to the new server URLnew (https://example2.com). Now I want to replace URLold with URLnew in the database tables using the Find and Replace tool provided by Moodle, but when I perform the operation I get this error. What should I do? Please help.
The error I am getting:
Debug info: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'table = REPLACE(table, 'https://example1.com', 'https://example2.com')' at line 1
UPDATE mdl_pma_history SET table = REPLACE(table, ?, ?)
[array (
0 => 'https://example1.com',
1 => 'https://example2.com',
)]
Error code: dmlwriteexception
Stack trace:
line 426 of /lib/dml/moodle_database.php: dml_write_exception thrown
line 895 of /lib/dml/mysqli_native_moodle_database.php: call to moodle_database->query_end()
line 6787 of /lib/adminlib.php: call to mysqli_native_moodle_database->execute()
line 74 of /admin/tool/replace/index.php: call to db_replace()
So I found the answer on my own.
I had to delete the mdl_pma_history table that was causing the error. The steps I followed are as follows (a sketch of the equivalent commands is given after the list):
Exported the table to a .sql file
Deleted the table because it was not allowing the script to run
Once the script (Find and Replace) ran successfully, imported the table back
Done.
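For anyone following the same route, a rough sketch of the equivalent MySQL commands (the database name moodle and the root credentials are hypothetical; adjust to your setup):
# 1. Export the table
mysqldump -u root -p moodle mdl_pma_history > mdl_pma_history.sql
# 2. Drop the table so the Find and Replace script can run
mysql -u root -p -e "DROP TABLE mdl_pma_history;" moodle
# 3. Run the Moodle Find and Replace tool, then import the table back
mysql -u root -p moodle < mdl_pma_history.sql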

Error on using copy Command in Postgres (ERROR: invalid input syntax for type date: "")

I have a CSV file and am trying to use the Postgres COPY command to populate a table from it. One of the table columns, NEXT_VISIT, is of the date data type. Some of the corresponding fields in the CSV file which are supposed to go into this date column have null values.
The COPY command I am running is like so:
COPY "VISIT_STAGING_TABLE" from E'C:\\Users\\Sir Codealot\\Desktop\\rufijihdss-2007-2010\\rufijihdss\\VISIT_TEST.CSV' CSV HEADER
When I run this command I get the error:
ERROR: invalid input syntax for type date: ""
CONTEXT: COPY VISIT_STAGING_TABLE, line 2, column NEXT_VISIT: ""
********** Error **********
ERROR: invalid input syntax for type date: ""
SQL state: 22007
Context: COPY VISIT_STAGING_TABLE, line 2, column NEXT_VISIT: ""
How can I run the COPY command and get Postgres to accept that some of the fields in the CSV file corresponding to NEXT_VISIT have the value ""?
Add WITH NULL AS '' to your command (COPY expects NULLs to be represented as "\N" (backslash-N) by default).
COPY "VISIT_STAGING_TABLE" from E'C:\\Users\\Sir Codealot\\Desktop\\rufijihdss-2007-2010\\rufijihdss\\VISIT_TEST.CSV' WITH CSV HEADER NULL AS ''
More details here: postgresql COPY
I was having the exact same problem, and what solved it for me was to use the statement WITH NULL ''. It is important not to have a space between the apostrophes.
I originally used the statement WITH NULL ' ' and got the same error message you did (ERROR: syntax error at or near "WITH NULL").
But when I eliminated the space between the apostrophes it worked.
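For reference, the same fix written with the newer parenthesized option syntax (a sketch reusing the path from the question; this form has been available since PostgreSQL 9.0):
COPY "VISIT_STAGING_TABLE" FROM E'C:\\Users\\Sir Codealot\\Desktop\\rufijihdss-2007-2010\\rufijihdss\\VISIT_TEST.CSV' WITH (FORMAT csv, HEADER true, NULL '');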