MyBatis annotation batch insert

I am using spring-mybatis.
What is the best way to insert thousands of records in a batch?
I am currently using @InsertProvider and looping through thousands of records, inserting them into the DB.
The current approach causes connection issues when the insert runs into the millions.
Example:
@Insert({
"<script>",
"insert into mybatis_demo (name, age)",
"values ",
"<foreach collection='dmoList' item='dmo' separator=','>",
"( #{dmo.name,jdbcType=VARCHAR}, #{dmo.age,jdbcType=INTEGER})",
"</foreach>",
"</script>"
})
int insertBatch(@Param("dmoList") List<MybatisDemoDMO> dmoList);
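One way to keep this pattern from producing a single enormous statement (or holding a connection open for millions of rows) is to feed the existing mapper in fixed-size chunks. Below is a minimal sketch assuming a Spring service wrapper; the service class, chunk size, and mapper interface name are illustrative assumptions, not part of the original question:

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical service: chunks a large list so each generated INSERT stays small.
@Service
public class MybatisDemoBatchService {

    private static final int CHUNK_SIZE = 1000; // assumed batch size; tune for your DB

    @Autowired
    private MybatisDemoMapper mapper; // assumed mapper interface declaring insertBatch(...)

    @Transactional
    public int insertAll(List<MybatisDemoDMO> dmoList) {
        int inserted = 0;
        for (int i = 0; i < dmoList.size(); i += CHUNK_SIZE) {
            int end = Math.min(i + CHUNK_SIZE, dmoList.size());
            // Each call produces one multi-row INSERT with at most CHUNK_SIZE value tuples.
            inserted += mapper.insertBatch(dmoList.subList(i, end));
        }
        return inserted;
    }
}

For very large loads, MyBatis also offers a batch executor (ExecutorType.BATCH, or a batch-configured SqlSessionTemplate in Spring), which buffers the JDBC statements instead of building one large multi-row INSERT string.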

Related

MyBatis with upperCase while select and insert

I would like to insert a user's email address in upper case. Is there any way to do this in MyBatis 3.1.x?
#Insert("INSERT INTO USERSIGNUP(FIRSTNAME, LASTNAME,EMAILID, COUNTRY) " +
"VALUES (#{firstName,jdbcType=VARCHAR}, #{lastName,jdbcType=VARCHAR}, #{email.toUpperCase(),jdbcType=VARCHAR}, #{countryCode,jdbcType=VARCHAR})")
As shown in the insert above, I modified the statement to call toUpperCase, but with no success:
#{email.toUpperCase()}
Any hints?
It worked:
@Insert("INSERT INTO USERSIGNUP(FIRSTNAME, LASTNAME,EMAILID, COUNTRY) " +
"VALUES (#{firstName,jdbcType=VARCHAR}, #{lastName,jdbcType=VARCHAR}, UPPER(#{email,jdbcType=VARCHAR}), #{countryCode,jdbcType=VARCHAR})")
We have to use the database function instead: UPPER(#{email,jdbcType=VARCHAR})

Psycopg2 insert python dictionary in postgres database

In Python 3+, I want to insert values from a dictionary (or pandas DataFrame) into a database. I have opted for psycopg2 with a Postgres database.
The problem is that I cannot figure out the proper way to do this. I can easily concatenate a SQL string to execute, but the psycopg2 documentation explicitly warns against this. Ideally I wanted to do something like this:
cur.execute("INSERT INTO table VALUES (%s);", dict_data)
and hoped that execute could figure out that the keys of the dict match the columns in the table. This did not work. From the examples in the psycopg2 documentation I arrived at this approach:
cur.execute("INSERT INTO table (" + ", ".join(dict_data.keys()) + ") VALUES (" + ", ".join(["%s" for pair in dict_data]) + ");", dict_data)
from which I get a
TypeError: 'dict' object does not support indexing
What is the most Pythonic way of inserting a dictionary into a table with matching column names?
Two solutions:
Solution 1, building the column list with AsIs and passing the values as a tuple:
from psycopg2.extensions import AsIs

d = {'k1': 'v1', 'k2': 'v2'}
insert = 'insert into table (%s) values %s'
l = [(c, v) for c, v in d.items()]
columns = ','.join([t[0] for t in l])
values = tuple([t[1] for t in l])
cursor = conn.cursor()
print(cursor.mogrify(insert, [AsIs(columns)] + [values]))
Solution 2, using named placeholders:
keys = d.keys()
columns = ','.join(keys)
values = ','.join(['%({})s'.format(k) for k in keys])
insert = 'insert into table ({0}) values ({1})'.format(columns, values)
print(cursor.mogrify(insert, d))
Output:
insert into table (k2,k1) values ('v2', 'v1')
insert into table (k2,k1) values ('v2','v1')
I sometimes run into this issue, especially with respect to JSON data, which I naturally want to deal with as a dict. Very similar... but maybe a little more readable?
def do_insert(rec: dict):
    cols = rec.keys()
    cols_str = ','.join(cols)
    vals = [rec[k] for k in cols]
    vals_str = ','.join(['%s' for i in range(len(vals))])
    sql_str = """INSERT INTO some_table ({}) VALUES ({})""".format(cols_str, vals_str)
    cur.execute(sql_str, vals)
I typically call this sort of thing from inside an iterator, and usually wrap it in a try/except. Either the cursor (cur) is already defined in an outer scope, or you can amend the function signature and pass a cursor instance in. I rarely insert just a single row. Like the other solutions, this allows for missing cols/values, provided the underlying schema allows for it too. As long as the dict backing the keys view is not modified while the insert is taking place, there is no need to specify the keys by name, since the values will be ordered exactly as they are in the keys view.
[Suggested answer/workaround - better answers are appreciated!]
After some trial/error I got the following to work:
sql = "INSERT INTO table (" + ", ".join(dict_data.keys()) + ") VALUES (" + ", ".join(["%("+k+")s" for k in dict_data]) + ");"
This gives the sql string
"INSERT INTO table (k1, k2, ... , kn) VALUES (%(k1)s, %(k2)s, ... , %(kn)s);"
which may be executed by
with psycopg2.connect(database='deepenergy') as con:
    with con.cursor() as cur:
        cur.execute(sql, dict_data)
Pros/cons?
Using %(name)s placeholders may solve the problem:
dict_data = {'key1':val1, 'key2':val2}
cur.execute("""INSERT INTO table (field1, field2)
VALUES (%(key1)s, %(key2)s);""",
dict_data)
You can find the usage in the psycopg2 docs under "Passing parameters to SQL queries".
Here is another solution, inserting a dictionary directly.
The Product model has the following database columns:
name
description
price
image
digital - (defaults to False)
quantity
created_at - (defaults to current date)
Solution:
data = {
    "name": "product_name",
    "description": "product_description",
    "price": 1,
    "image": "https",
    "quantity": 2,
}
cur = conn.cursor()
cur.execute(
    "INSERT INTO products (name,description,price,image,quantity) "
    "VALUES(%(name)s, %(description)s, %(price)s, %(image)s, %(quantity)s)", data
)
conn.commit()
conn.close()
Note: the columns to be inserted are specified in the execute statement: ... INTO products (column names to be filled) VALUES ..., data <- the dictionary. Because the VALUES clause uses named %(key)s placeholders, the dictionary keys only have to match the placeholder names; the order of the keys does not matter.

How can I combine these two statements?

I'm currently trying to insert data into a database from text boxes, with $enter and $enter2 holding the entered text.
The database consists of three columns ID, name and nametwo
ID is auto incrementing and works fine
Both statements work fine on their own, but because they are issued separately, the first leaves nametwo blank and the second leaves name blank.
I've tried combining them but haven't had much luck; I hope someone can help.
$dbh->do("INSERT INTO $table(name) VALUES ('".$enter."')");
$dbh->do("INSERT INTO $table(nametwo) VALUES ('".$enter2."')");
To paraphrase what others have said:
my $sth = $dbh->prepare("INSERT INTO $table(name,nametwo) values (?,?)");
$sth->execute($enter, $enter2);
So you don't have to worry about quoting.
You should read the database manual.
The query should be:
$dbh->do("INSERT INTO $table(name,nametwo) VALUES ('".$enter."', '".$enter2."')");
The SQL syntax is
INSERT INTO MyTable (
name_one,
name_two
) VALUES (
"value_one",
"value_two"
)
Your way of generating SQL statements is very fragile. For example, it will fail if the table name is Values or the value is Jester's.
Solution 1:
$dbh->do("
INSERT INTO ".$dbh->quote_identifier($table_name)."
name_one,
name_two
) VALUES (
".$dbh->quote($value_one).",
".$dbh->quote($value_two)."
)
");
Solution 2: Placeholders
$dbh->do(
" INSERT INTO ".$dbh->quote_identifier($table_name)."
name_one,
name_two
) VALUES (
?, ?
)
",
undef,
$value_one,
$value_two,
);

Prevent sql injection, Activerecord #where, with multiple AND/OR clause

I am fairly new to Rails. I have a Rails model 'Message' with 'belongs_to :sender' and 'belongs_to :receiver' relations.
I am trying to create a message thread between two users: 'current_user' and the user identified by 'params[:id]'.
In the show controller action of the MessagesController, I want to use the equivalent of this sql query:
Message.find_by_sql(
"SELECT *
FROM messages
WHERE
(sender_id = #{current_user.id} OR sender_id = #{params[:id]})
AND
(receiver_id = #{current_user.id} OR receiver_id = #{params[:id]});"
)
If I were looking for one Message, I would use this ActiveRecord query to prevent SQL injection:
Message.where('sender_id = ? OR receiver_id = ?', current_user.id, current_user.id).find(params[:id])
My current query is:
Message.where(sender_id: [current_user.id, params[:id]], receiver_id: [current_user.id, params[:id]])
Is this query currently guarded against SQL injection?
It's safe. The final query would be something like sender_id IN (1, 2) AND receiver_id IN (3, 4), and all the integer values are sanitized. You can simply run a test:
Message.where(sender_id: [current_user.id, "' is dangerous"], receiver_id: [current_user.id, params[:id]])
and see the raw SQL output in the console. Illegal integers will be converted to 0.

IF EXISTS not recognized in Derby

DROP TABLE IF EXISTS Pose ;
results in the error
Error code -1, SQL state 42X01: Syntax error: Encountered "EXISTS" at line 1, column 15.
I'm running this from inside NetBeans 7.3 using the default Derby sample db.
Derby does not currently support IF EXISTS
Are you trying to create a table? If yes, this is what you should do:
public void createTables() throws SQLException {
    Statement statement = getConnection().createStatement();
    System.out.println("Checking database for table");
    DatabaseMetaData databaseMetadata = getConnection().getMetaData();
    ResultSet resultSet = databaseMetadata.getTables(null, null, "PATIENT", null);
    if (resultSet.next()) {
        System.out.println("TABLE ALREADY EXISTS");
    } else {
        // table does not exist yet, so create it
        statement.execute("CREATE TABLE Patient (" +
                "CardNumber CHAR(10) NOT NULL PRIMARY KEY, " +
                " FirstName CHAR(50)," +
                " MiddleName CHAR(50)," +
                " LastName CHAR(50) )");
    }
}
Remember to use all caps for the table name you pass into databaseMetadata.getTables(...)
The MySQL syntax for creating a table is:
CREATE TABLE [IF NOT EXISTS] tableName ...
and the MySQL syntax for removing a table is:
DROP TABLE [IF EXISTS] tableName ...
These clauses are extensions that are not part of the ANSI/ISO SQL Standard. Some other databases (PostgreSQL, for example) offer similar clauses, but I can't find anything like them documented for Derby, Oracle, or DB2.
The best alternative I can find is to query the system tables to see if the table exists.
select count(*) from sys.systables where tablename = 'YOUR_TABLE_NAME'
I had a similar issue dropping stored procedures. They can be queried using this statement.
select count(*) from sys.sysaliases where alias = 'YOUR_STORED_PROCEDURE_NAME'
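Putting the check and the drop together, a conditional drop via JDBC might look roughly like the sketch below. The helper class, method name, and connection handling are assumptions for illustration, not part of the answers above:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public final class DerbyDropHelper {

    // Hypothetical helper: drops the table only if sys.systables says it exists.
    public static void dropTableIfExists(Connection conn, String tableName) throws SQLException {
        String check = "select count(*) from sys.systables where tablename = ?";
        try (PreparedStatement ps = conn.prepareStatement(check)) {
            ps.setString(1, tableName.toUpperCase()); // Derby stores unquoted identifiers in upper case
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next() && rs.getInt(1) > 0) {
                    try (Statement st = conn.createStatement()) {
                        st.executeUpdate("DROP TABLE " + tableName);
                    }
                }
            }
        }
    }
}

Catching and ignoring the "table does not exist" SQLException on the DROP is another common workaround.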
If someone is looking to drop and create a table in an SQL file that is run with the Spring test framework, check https://stackoverflow.com/a/47459214/3584693 for an answer that ensures no exception is thrown when DROP TABLE is invoked on a table that doesn't exist.