mysql import sql file without overwrite but update current value + backup value - command-line

I have 2 databases:
1. db_temporary
2. db_primary
In db_temporary I have a table containing a bunch of data that I want to keep; I don't want to overwrite it, but rather update it from an imported MySQL file.
I dump db_primary and import the backup into db_temporary with these commands:
D:\mysql4.0.27\bin\mysqldump.exe --add-drop-table db_primary tb_wantomodify > "backupfile.sql"
D:\mysql4.0.27\bin\mysql.exe db_temporary < "backupfile.sql"
I have tried this solution; it does not overwrite, but what I want is to update (add to) the current field of db_temporary with the new value from the backup.
Technically, something similar to UPDATE ... SET curvalue = curvalue + 'newvaluefrombackup'.
Is it possible to do this?
Thank you

Firstly, you can put both of those tables in the same database; there's no reason to create two separate files. Secondly, what you want here is the SQL UPDATE command. First create a database object and set it to your database.
SQLiteDatabase database = SQLiteDatabase.openDatabase(myPath,
        null, SQLiteDatabase.OPEN_READWRITE);
database.execSQL("UPDATE " + yourTableNameHere + " SET " + theColumnYouWantToUpdate + "='" + theNewValue + "' WHERE " + theColumnToMatch + "='" + theValueToMatch + "'");
This may seem confusing at first, but what you need to understand is that SQL commands read as strings. This example assumes you're using String constants for your table data, as you should. The + sign before and after is the concatenation operator. Make sure you add spaces, and don't forget the quotes around the values you want checked. There's a pretty good SQL commands tutorial here: http://www.1keydata.com/sql/sqlselect.html
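Back to the MySQL question itself: a minimal command-line sketch would be to import the backup into a staging table inside db_temporary and then add its values onto the current ones with a multi-table UPDATE (supported since MySQL 4.0.4). The names tb_staging, id and val below are hypothetical placeholders for a staging copy of the dumped table, its key column and its numeric column; tb_staging is assumed to have been created and filled from backupfile.sql first (e.g. by renaming the table in the dump before importing).
-- add the backed-up value onto the current value for every matching key
UPDATE tb_wantomodify, tb_staging
SET tb_wantomodify.val = tb_wantomodify.val + tb_staging.val
WHERE tb_wantomodify.id = tb_staging.id;
-- drop the staging table once the values have been merged
DROP TABLE tb_staging;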

Related

LONG VARCHAR - Read from table to front end (C#) then INSERT / UPDATE value to a table

I'm reading table info from a SqlBase DB using a DataAdapter.Fill in C#. It works perfectly for every variable type except LONG VARCHAR; in that case it converts to a String in C#, and if I add a watch on the object variable I see some weird chars in it, so later, when I try to insert/update another table (in another database), it fails.
I know that even if the value were OK in C# I couldn't insert it as it is. The documentation says I should bind the value to a variable to be able to insert it into a table, but I'm not sure how to do that, since I'm creating the scripts in C# to be run in SqlBase rather than taking direct action from C#. And even if I could, I'm not able to read the value correctly, since it converts to a string with weird digits in it. Is this LONG VARCHAR like a VARBINARY in SQL Server? I assume so, because the column I have problems with is a LOGO, i.e. a picture.
So in short, is there any way to
Read a long varchar from .NET and then..
..Use it when inserting / updating values to a table?
(1) is .NET but (2) is an SQL script to be run on SQLBase using SQLTalk.
Thanks!
Suggest you UNLOAD the LONG data to a flat file using an SQLTalk command; that way you'll get readable data. Read the flat file using C# if you need to and do whatever you want with it, but to re-load the data into another table using SQLTalk you need specific syntax. Go here: SQLBase Manuals (all versions), extract the manual appropriate to the version of SQLBase you are using, and 1) read up on UNLOAD in the 'SQLBase Language Reference' to get the LONG data out into a flat file (there are different syntaxes giving different results), then 2) read up on 'Examples of Bind Variables for Long data' in the 'SQLTalk Command Reference', as you have to set LONG VARCHAR data to bind variables.
When inserting long data into a LONG VARCHAR or LONG NVARCHAR or LONG BINARY column, precede it with the $LONG keyword. You can then start entering data on the next line, and continue entering on successive lines. To mark the end of text, enter a double slash on a new line (//). e.g.
INSERT INTO BIO (NAME, BIO) VALUES (:1,:2)
\
SHAKESPEARE, $LONG
William Shakespeare was born in Stratford-on-Avon on
April 16, 1564. He was England's most famous poet and
dramatist. . . . .. . .
He died in 1616, leaving his second best bed to his wife.
//
If the data for the LONG (N)VARCHAR or LONG VARBINARY column comes from a file, enter the name of the file after the $LONG keyword. e.g.
INSERT INTO BIO (NAME, BIO) VALUES (:1,:2)
\
SHAKESPEARE, $LONG shakes.txt
JONSON,$LONG jonson.txt
O'NEILL,$LONG oneill.txt
/
To update LONG data, e.g.:
UPDATE EXPENSES SET COMMENTS = :1 WHERE DATE = :2
\
"Beltran Tree Service", 1/1/94 "Hercules", 1/2/94
"Checkup", 1/3/94
/

PostgreSQL: Import columns into table, matching key/ID

I have a PostgreSQL database. I had to extend an existing, big table with a few more columns.
Now I need to fill those columns. I thought I could create a .csv file (out of Excel/Calc) which contains the IDs / primary keys of the existing rows and the data for the new, empty fields. Is it possible to do so? If so, how?
I remember doing exactly this pretty easily using Microsoft SQL Server Management Studio, but for PostgreSQL I am using pgAdmin (though I am of course willing to switch tools if it would be helpful). I tried the import function of pgAdmin, which uses PostgreSQL's COPY, but it seems COPY isn't suitable, as it can only create whole new rows.
Edit: I guess I could write a script which loads the CSV and iterates over the rows, using UPDATE. But I don't want to reinvent the wheel.
Edit 2: I've found this question here on SO which provides an answer using a temp table. I guess I will use it, although it's more of a workaround than an actual solution.
PostgreSQL can import data directly from CSV files with COPY statements; however, as you stated, this only works for new rows.
Instead of creating a CSV file, you could just generate the necessary SQL UPDATE statements.
Suppose this were the CSV file:
PK;ExtraCol1;ExtraCol2
1;"foo";42
4;"bar";21
Then just produce the following:
UPDATE my_table SET ExtraCol1 = 'foo', ExtraCol2 = 42 WHERE PK = 1;
UPDATE my_table SET ExtraCol1 = 'bar', ExtraCol2 = 21 WHERE PK = 4;
You seem to work under Windows, so I don't really know how to accomplish this there (probably with PowerShell), but under Unix you could generate the SQL from a CSV easily with tools like awk or sed. An editor with regular expression support would probably suffice too.
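Alternatively, the temp-table approach from the question's second edit can be done almost entirely in SQL. A minimal sketch, assuming the real table is called my_table with key column pk and new columns extracol1 and extracol2 (all hypothetical names), and that the CSV is loaded into a staging table first (with COPY, psql's \copy, or pgAdmin's import dialog):
-- staging table shaped like the CSV
CREATE TEMP TABLE import_data (pk integer, extracol1 text, extracol2 integer);
-- server-side load (9.0+ option syntax); use \copy or pgAdmin's import for a client-side file
COPY import_data FROM '/path/to/new_columns.csv' WITH (FORMAT csv, HEADER, DELIMITER ';');
-- copy the imported values onto the existing rows, matching on the primary key
UPDATE my_table m
SET extracol1 = i.extracol1,
    extracol2 = i.extracol2
FROM import_data i
WHERE m.pk = i.pk;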

replacing characters in a CLOB column (db2)

I have a CLOB(2000000) field in a DB2 (v10) database, and I would like to run a simple UPDATE query on it to replace each occurrence of "foo" with "baaz".
Since the contents of the field are more than 32k, I get the following error:
"{some char data from field}" is too long.. SQLCODE=-433, SQLSTATE=22001
How can I replace the values?
UPDATE:
The query was the following (I changed UPDATE into SELECT for easier testing):
SELECT REPLACE(my_clob_column, 'foo', 'baaz') FROM my_table WHERE id = 10726
UPDATE 2
As mustaccio pointed out, REPLACE does not work on CLOB fields (or at least not without casting the data to VARCHAR, which in my case is not possible since the size of the data is more than 32k). The question is about finding an alternative way to achieve the REPLACE functionality for CLOB fields.
Thanks,
krisy
Finally, since I found no way to do this with an SQL query, I ended up exporting the table, editing its LOB content in Notepad++, and importing the table back again.
Not sure if this applies to your case: there are 2 different REPLACE functions offered by DB2, SYSIBM.REPLACE and SYSFUN.REPLACE. The version of REPLACE in SYSFUN accepts CLOBs and supports values up to 1 MByte. In case your values are longer than that, you would need to write your own (SQL-based?) function.
BTW: You can check function resolution by executing "values(current path)"
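A hedged sketch of what that looks like applied to the query from the question, schema-qualifying the call so function resolution picks the SYSFUN version (valid only while the CLOB values stay under the 1 MByte limit):
-- force the SYSFUN variant of REPLACE, which accepts CLOB arguments
UPDATE my_table
SET my_clob_column = SYSFUN.REPLACE(my_clob_column, 'foo', 'baaz')
WHERE id = 10726;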

How to insert statements that contains apostrophes into Sqlite database

In my iPhone app I am using an SQLite database. I have a requirement to store text in the database. The text contains apostrophes.
For example:
Insert into tbl_insert values ('It is Steve's Shirt');
How to store this kind of statements in Sqlite database?
This is something that I come across in SQL Server and MySQL as well. You should definitely use parameterised SQL queries.
See this page for examples in many languages.
I strongly discourage the use of literal strings in the update statement. Use parameterized queries. There's no reason to compromise security
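In SQL terms a parameterized statement contains only a placeholder; the value, apostrophes and all, is supplied separately through the driver's bind call (sqlite3_bind_text in the C API the iPhone SDK exposes), so nothing in the text needs escaping. A minimal sketch of the statement itself:
-- the ? is filled in by the binding API; the text never becomes part of the SQL string
Insert into tbl_insert values (?);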
You can write a function which replaces each instance of character ' with ''
http://www.kamath.com/codelibrary/cl003_apostrophe.asp
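With the apostrophe doubled, the statement from the question becomes valid SQL:
Insert into tbl_insert values ('It is Steve''s Shirt');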
Simply replace ' characters with ` :)
text = text.replace("'", "`");
With Python and sqlite3 I found that the following line worked perfectly (replacing ' with ''):
myString = myString.replace('\'', '\'\'')
The string can then be concatenated into an UPDATE command.
The line is stored and displayed correctly. It also works great with Grafana.
I'm not yet sure if this is specific to the sqlite3 Python module or if it can be generalized.

How to import file into sqlite?

On a Mac, I have a txt file with two columns; the first corresponds to an autoincrement column in an SQLite table:
, "mytext1"
, "mytext2"
, "mytext3"
When I try to import this file, I get a datatype mismatch error:
.separator ","
.import mytextfile.txt mytable
How should the txt file be structured so that it uses the autoincrement?
Also, how do I enter in text that will have line breaks? For example:
"this is a description of the code below.
The text might have some line breaks and indents. Here's
the related code sample:
foreach (int i = 0; i < 5; i++){
//do some stuff here
}
this is a little more follow up text."
I need the above inserted into one row. Is there anything special I need to do to the formatting?
For one particular table, I want each of my rows as a file and import them that way. I'm guessing it is a matter of creating some sort of batch file that runs multiple imports.
Edit
That's exactly the syntax I posted, minus a tab since I'm using a comma. The missing line break in my post didn't make it as apparent. Anyways, that gives the mismatch error.
I was looking at the same problem. Looks like I've found an answer to the first part of your question, about importing a file into a table with an ID field.
So yes, create a temporary table without the ID column, import your file into it, then do an INSERT...SELECT to copy its data into your target table. (Remove the leading commas from mytextfile.txt.)
-- assuming your table is called Strings and
-- was created like this:
-- create table Strings( ID integer primary key, Code text )
create table StringsImport( Code text );
.import mytextfile.txt StringsImport
insert into Strings ( Code ) select * from StringsImport;
drop table StringsImport;
I do not know what to do with the newlines. I've read some mentions that importing in CSV mode will do the trick (.mode csv), but when I tried it, it did not seem to work.
In case anyone is still having issues with this, you can download an SQLite manager.
There are several that allow importing from a CSV file.
Here is one, but a Google search should reveal a few: http://sqlitemanager.en.softonic.com/
I'm in the process of moving data containing long text fields with various punctuation marks (they are actually articles on coding) into SQLite and I've been experimenting with various text imports.
I created a database in SQLite with a table:
CREATE TABLE test (id INTEGER PRIMARY KEY AUTOINCREMENT, textfield TEXT);
then do a backup with .dump.
I then add the text below the "CREATE TABLE" line manually in the resulting .dump file as such:
INSERT INTO test (id, textfield) VALUES (1,'Isn''t it great to have
really long text with various punctuation marks and
newlines');
Change any single quotes to two single quotes (change ' to ''). Note that the index number needs to be added manually (I'm sure there is an AWK/SED command to do it automatically). Change the autoincrement number in the "sequence" line of the dump file to one above the last index number you added (I don't have SQLite in front of me to give you the exact line, but it should be obvious).
With the new file, I can then do a restore onto the database.
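The whole round trip from the command line would look roughly like this (file and database names are hypothetical):
sqlite3 test.db .dump > test_dump.sql
# edit test_dump.sql by hand: add the INSERT statements and bump the sqlite_sequence value
sqlite3 test_restored.db < test_dump.sql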