I have two databases, sample1 and sample2, and both contain the same attendance table.
I retrieve data from the attendance table in sample1 into a GridView. I want to move the data from the GridView into the attendance table in sample2 using VB.NET.
Is this possible? Please point me to some links.
Sorry for my English.
Select from one table and insert into another.
A file-based export/import will work even if the two databases are not on the same server.
Export from the first database using the query below:
SELECT * FROM tablename
INTO OUTFILE '../file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
Import into the other database using the query below:
LOAD DATA INFILE '../file.csv'
INTO TABLE tablename
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
If both databases are on the same server, you can insert from one table into the other directly with a query like this:
INSERT INTO TABLENAME1 SELECT * FROM TABLENAME2
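For the original two-database setup (sample1 and sample2, each with an attendance table), a minimal sketch of that direct insert, assuming both databases live on the same MySQL server and the attendance tables have identical columns:
-- Hedged sketch: copy every row from sample1.attendance into sample2.attendance
INSERT INTO sample2.attendance
SELECT * FROM sample1.attendance;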
Facing kind of a mini challenge here today.
I want to create a CSV string from a column of a table in PostgreSQL, using a SQL query inside a stored function, and store the result in another table as a single value (for further processing on that table).
My database engine is PostgreSQL.
I have seen lots of examples using COPY TO and COPY FROM, but they either return to STDOUT or save to a file.
Copy (Select id From product limit 10) To STDOUT With CSV DELIMITER ',';
Source Data:
Product
id | Name
10 | Product1
21 | Product1
34 | Product1
45 | Product1
17 | Product1
Required/Target Data:
TempTable
value
10,21,34,45,17
Neither of the above fits my requirement: I want to store the generated CSV in a column of another table.
Similar Code for SQL Server:
I used to do this in SQL Server using the following code.
CREATE FUNCTION [dbo].[CreateCSV] (@MyXML XML)
RETURNS VARCHAR(MAX)
AS
BEGIN
    DECLARE @listStr VARCHAR(MAX);
    SELECT
        @listStr =
            COALESCE(@listStr + ',', '') +
            c.value('@Value[1]', 'nvarchar(max)')
    FROM @MyXML.nodes('/row') AS T(c);
    RETURN @listStr;
END
In SQL Server, I would generate the CSV by calling the CreateCSV() function within a stored procedure. I am trying to replicate the process in PostgreSQL.
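For context, a minimal sketch of how that function might be called in T-SQL; the XML construction below is an assumption for illustration, using the product table from the example data:
DECLARE @xml XML;
-- Build a <row Value="..."/> element per id so CreateCSV can read the @Value attribute
SET @xml = (SELECT id AS [@Value] FROM product FOR XML PATH('row'), TYPE);
SELECT dbo.CreateCSV(@xml) AS csv_value;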
I must admit I am new to PostgreSQL, so I need your help with this.
Appreciate a helpful response.
Thanks
Steve
Thanks @a_horse_with_no_name.
Turns out I needed:
SELECT string_agg(id,',') FROM (Select cast (id as varchar(100)) From product limit 10) AS tab;
Thanks for helping me with that. :)
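If the generated CSV also needs to be stored in the target table from the question, a minimal sketch, assuming TempTable has a single text column named value as shown above:
-- Hedged sketch: aggregate the ids to a CSV string and store it as a single row
INSERT INTO TempTable(value)
SELECT string_agg(id::text, ',')
FROM (SELECT id FROM product LIMIT 10) AS tab;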
I've come across a problem with loading some CSV files into my Postgres tables. I have data that looks like this:
ID,IS_ALIVE,BODY_TEXT
123,true,Hi Joe, I am looking for a new vehicle, can you help me out?
Now, the problem here is that the text in what is supposed to be the BODY_TEXT column is unstructured email data that can contain any sort of characters, and when I run the following COPY command it fails because there are extra , characters within the BODY_TEXT.
COPY sent FROM 'my_file.csv' DELIMITER ',' CSV;
How can I resolve this so that everything in the BODY_TEXT column gets loaded as-is without the load command potentially using characters within it as separators?
In addition to fixing the source file format, you can handle this in PostgreSQL itself.
Load all lines from the file into a temporary table:
create temporary table t (x text);
copy t from 'foo.csv';
Then you can split each string using a regexp, like this:
select regexp_matches(x, '^([0-9]+),(true|false),(.*)$') from t;
regexp_matches
---------------------------------------------------------------------------
{123,true,"Hi Joe, I am looking for a new vehicle, can you help me out?"}
{456,false,"Hello, honey, there is what I want to ask you."}
(2 rows)
You can use a query like this to load the data into your destination table:
insert into sent(id, is_alive, body_text)
select x[1]::int, x[2]::boolean, x[3]
from (
  select regexp_matches(x, '^([0-9]+),(true|false),(.*)$') as x
  from t) t;
Note the explicit casts: regexp_matches returns text values, while id and is_alive are presumably integer and boolean columns.
Is there a query equivalent to SQL Server's OPENQUERY or OPENROWSET that can be used in PostgreSQL to query from Excel or CSV files?
You can use PostgreSQL's COPY.
As per the docs:
COPY moves data between PostgreSQL tables and standard file-system
files. COPY TO copies the contents of a table to a file, while COPY
FROM copies data from a file to a table (appending the data to
whatever is in the table already). COPY TO can also copy the results
of a SELECT query.
COPY works like this:
Importing a table from CSV
Assuming you already have a table in place with the right columns, the command is as follows:
COPY tblemployee FROM '~/empsource.csv' DELIMITERS ',' CSV;
Exporting a CSV from a table.
COPY (select * from tblemployee) TO '~/exp_tblemployee.csv' DELIMITERS ',' CSV;
It's important to mention that if your data is in Unicode or needs strict encoding, you should always set client_encoding before running any of the above commands.
To set CLIENT_ENCODING parameter in PostgreSQL
set client_encoding to 'UTF8'
or
set client_encoding to 'latin1'
Another thing to guard against is nulls. While exporting, if some fields are null, PostgreSQL will write '\N' to represent a null field. This is fine, but it may cause issues if you then try to import that data into, say, SQL Server.
A quick fix is to modify the export command and specify what you would prefer as the null placeholder in the exported CSV:
COPY (select * from tblemployee ) TO '~/exp_tblemployee.csv' DELIMITERS ',' NULL as E'';
Another common requirement is importing or exporting with a header row.
Import a CSV into a table when the column names are present in the first row of the CSV file:
COPY tblemployee FROM '~/empsource.csv' DELIMITERS ',' CSV HEADER
Export a table to CSV with the headers in the first row:
COPY (select * from tblemployee) TO '~/exp_tblemployee.csv' DELIMITERS ',' CSV HEADER
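To get something closer to OPENROWSET for the original question (querying a CSV ad hoc), one approach is to stage the file in a temporary table with COPY and then query it like any other table. A minimal sketch, reusing the hypothetical tblemployee and empsource.csv names from the examples above:
-- Stage the CSV in a temporary table with the same structure as tblemployee
CREATE TEMPORARY TABLE emp_stage (LIKE tblemployee);
-- Path reused from the examples above; COPY expects a path the server can read
COPY emp_stage FROM '~/empsource.csv' DELIMITER ',' CSV HEADER;
-- Now the file contents can be queried like a normal table
SELECT * FROM emp_stage;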
I am having problems with my query.
Basically, what I am trying to do is empty out a table and copy the records from the same table in another database.
I used SET IDENTITY_INSERT to allow explicit values for the identity column before performing my insert. But somehow, it still throws this error message:
Msg 8101, Level 16, State 1, Line 3
An explicit value for the identity column in table 'dbo.UI_PAGE' can only be specified when a column list is used and IDENTITY_INSERT is ON.
Below is my query:
DELETE FROM [DB1].[dbo].[MY_TABLE]
SET IDENTITY_INSERT [DB1].[dbo].[MY_TABLE] ON
INSERT INTO [DB1].[dbo].[MY_TABLE]
SELECT *
FROM [DB2].[dbo].[MY_TABLE]
SET IDENTITY_INSERT [DB1].[dbo].[MY_TABLE] OFF
Can someone point me as to which step I am doing wrong?
Thanks a lot!
You have to specify all the column names when inserting with IDENTITY_INSERT ON using INSERT INTO:
INSERT INTO [DB1].[dbo].[MY_TABLE] (TableID, Field1, Field2, Field3, ...)
SELECT * FROM [DB2].[dbo].[MY_TABLE]
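Putting that together with the original script, a hedged sketch of the full corrected sequence (the Field column names are placeholders, assuming the same layout in both databases):
DELETE FROM [DB1].[dbo].[MY_TABLE];
SET IDENTITY_INSERT [DB1].[dbo].[MY_TABLE] ON;
-- The explicit column list is what allows the identity values to be inserted
INSERT INTO [DB1].[dbo].[MY_TABLE] (TableID, Field1, Field2, Field3)
SELECT TableID, Field1, Field2, Field3
FROM [DB2].[dbo].[MY_TABLE];
SET IDENTITY_INSERT [DB1].[dbo].[MY_TABLE] OFF;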
In case you did not know, there is a nifty little trick in SSMS: if you select a table and expand its nodes, you can Ctrl+C on the Columns node, and that places a comma-delimited list of the field names on your clipboard.
In addition to the first answer given by Ross Bush:
If your table has many columns, you can get their names with this command.
SELECT column_name + ','
FROM information_schema.columns
WHERE table_name = 'TableName'
for xml path('')
After removing the last comma (','), just copy-paste the column names.
We've got a system (MS SQL 2008 R2-based) that has a number of "input" databases and one "output" database. I'd like to write a query that reads from the output DB and JOINs it to data in one of the source DBs. However, the source table may be any one of a number of individual tables :( The name of the source DB is stored in the output DB; ideally, I'd like to do something like the following (pseudo-SQL ahoy):
select o.[UID]
,o.[description]
,i.[data]
from [output].dbo.[description] as o
left join (select [UID]
,[data]
from
[output.sourcedb].dbo.datatable
) as i
on i.[UID] = o.[UID];
Is there any way to do something like the above, i.e. "dynamically" specify the database and table to be joined for each row in the query?
Try using EXEC: specify the SELECT as a string, inserting variables for the database and table names where appropriate. Simple example:
DECLARE @dbName VARCHAR(255), @tableName VARCHAR(255), @colName VARCHAR(255)
...
EXEC('SELECT * FROM ' + @dbName + '.dbo.' + @tableName + ' WHERE ' + @colName + ' = 1')
No, the table must be known at the time you prepare the query. Otherwise, how would the query optimizer know what indexes it might be able to use? Or whether the table you reference even has a UID column?
You'll have to do this in stages:
Fetch the sourcedb value from your output database in one query.
Build an SQL query string, interpolating the value you fetched in the first query into the FROM clause of the second query.
Be careful to check that this value contains a legitimate database name. For instance, filter out non-alpha characters or apply a regular expression or look it up in a whitelist. Otherwise you're exposing yourself to a SQL Injection risk.
Execute the new SQL string you built with exec(), as @user353852 suggests (see the sketch below).
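A minimal sketch of those stages in T-SQL, assuming the source database name lives in a hypothetical sourcedb column of [output].dbo.[description]:
DECLARE @sourceDb sysname, @sql nvarchar(max);
-- Stage 1: fetch the source database name recorded in the output database
SELECT TOP (1) @sourceDb = sourcedb
FROM [output].dbo.[description];
-- Stage 2: whitelist check against the server's real databases to avoid SQL injection
IF EXISTS (SELECT 1 FROM sys.databases WHERE name = @sourceDb)
BEGIN
    -- Stage 3: build the query with the validated, quoted name and execute it
    SET @sql = N'SELECT o.[UID], o.[description], i.[data]
                 FROM [output].dbo.[description] AS o
                 LEFT JOIN ' + QUOTENAME(@sourceDb) + N'.dbo.datatable AS i
                   ON i.[UID] = o.[UID];';
    EXEC (@sql);
END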