Copy data from CSV into PostgreSQL - postgresql

I have a CSV file:
CLASS 1;;;;;;
1010;74;14;56;6;5;3
1011;58;0;689;5;6;7
CLASS 2;;;;;;
1030;74;14;6;3;4;5
1031;58;0;689;6;5;4
CLASS 3;;;;;;
1030;74;14;4;1;2;3
1031;58;0;689;7;6;5
How can I import the data so that CLASS 1, CLASS 2, etc. become separate tables and the numbers written after each marker become their fields? In other words, how can I import data from the same file into different tables?
I tried this, but it is not correct:
"COPY openingBalance FROM 'D:\\test\\OSV.csv' CSV DELIMITER ';'"

Related

Postgres copy command to import .csv using quote option creates more double quotes in destination table

I want to import a .csv file ('|'-delimited) into a Postgres table. Below is sample data from the CSV file; the data contains a double quote ("):
45926052|3|70092111944|ketamine" Syringe|25180|5902629|1.00000|||4|Pham|2018-08-30|9999-12-31
I use the code below to import this data into the Postgres table:
copy public.destination_tbl
from 'D:\Target\sample_data.csv'
DELIMITER '|'
QUOTE E'$'
csv header
After the import, the destination table shows a double quote in the first column, plus two extra double quotes next to the actual double quote.
I want the data imported into the table to be the same as it appears in the source CSV file.
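Since the file itself uses no quoting at all, a common workaround (a sketch, assuming your setup matches the description) is to give COPY a quote character that can never occur in the data, such as the backspace character E'\b', so the literal " in ketamine" Syringe is loaded verbatim instead of being treated as a CSV quote:

copy public.destination_tbl
from 'D:\Target\sample_data.csv'
WITH (FORMAT csv, HEADER, DELIMITER '|', QUOTE E'\b');

The assumption is that backspace never appears in the file; any other character guaranteed to be absent works as well.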

Import particular nodes into Neo4J from CSV file

I'm doing some tests to figure out how best to import data from a CSV file into Neo4j. I have the following data in the Person.csv file (3 headers: Person_ID, Name, and Type):
Person_ID Name Type
HUA001 Jaap Layperson
HUA002 Teems Priest
HUA003 Frank Layperson
I want to import the nodes which are of a particular type (e.g. 'Layperson').
I thought about creating a LOAD CSV command with a WHERE clause (see below), but Neo4j doesn't particularly like that WHERE. Any ideas how to get this (or a query with a similar result) working?
LOAD CSV WITH HEADERS FROM 'file:///Person.csv' AS row
WHERE row.Type='Layperson'
CREATE (p:Person:Layperson {ID: row.Person_ID, name: row.Name})
You can combine WITH and WHERE to filter the required rows and pass the filtered rows on to the next clause, which creates the nodes:
LOAD CSV WITH HEADERS FROM 'file:///Person.csv' AS row
WITH row
WHERE row.Type='Layperson'
CREATE (p:Person:Layperson {ID: row.Person_ID, name: row.Name})

Neo4j import tool with csv files: how to import labels column as list of labels

I am importing some nodes and relationships from CSV files using the neo4j-import tool, and I have a column in my CSV file that is a list of strings that are to be my node labels. Is it possible to do this? If so, what is the correct CSV format for the list of strings? Will [string1, string2, ..., stringn] work?
For the neo4j-import tool:
The :LABEL column of the CSV file also supports multiple labels, separated by the provided array separator (the default is ;):
:LABEL
Label1;Label2

How can I export images from a PostgreSQL database?

I have a simple data table.
Schemas>resources>tables>image_data contains the columns
image_id (integer), raw_data (bytea)
How can I export this data to files named based on the image_id? Once I figure that out, I want to use the image_id to name the files based on a reference in another table.
So far I've got this:
SELECT image_id, encode(raw_data, 'hex')
FROM resources.image_data;
It generates a CSV with all the images as hex, but I don't know what to do with it.
Data-sample:
"image_id";"encode"
166;"89504e470d0a1a0a0000000d49484452000001e0000003160806000000298 ..."

Import a CSV file into MySQL workbench into a new table dynamically

I can import CSV file data into an existing table in MySQL Workbench using LOAD DATA INFILE, but if I have a file with 25 columns, it becomes a pain to create the table structure before importing.
Is there a way to import CSV files without creating the structure first, like PROC IMPORT in SAS?
Yes, there is. Try the new 6.3 release (currently in RC, soon to be GA), which comes with a new table data import/export feature that supports CSV and JSON data. It creates the table on the fly during import.