OracleBulkCopy: different behavior between Oracle 12c and 19c

Crossposting here from the official Oracle forum:
I'm transferring data from an SQLite database to an Oracle database with fully managed ODP.NET 4.122.19.1. So far the target has been 12c, which worked flawlessly, but when writing to Oracle 19c I get the error "ORA-39822: A new direct path operation is not allowed in the current transaction."
The program flow is basically like this (details omitted for brevity):
using (var transaction = oracleConnection.BeginTransaction())
{
    foreach (var dataTable in dataTables) // dataTables is a collection of - well, DataTables
    {
        using (var bulkCopy = new OracleBulkCopy(oracleConnection))
        {
            bulkCopy.WriteToServer(dataTable);
        }
    }
    transaction.Commit();
}
When trying to write the second DataTable I get the aforementioned error, which is pretty self-explanatory: apparently I'm supposed to start a new transaction for each DataTable. As I want the whole transfer to be an all-or-nothing operation, I hesitate to change the code accordingly.
So - is there any setting in Oracle 19c that re-enables the behavior that I see in Oracle 12c?

Alex Keh from Oracle confirmed that each WriteToServer() call must run in its own transaction, so the program flow has to be:
foreach (var dataTable in dataTables)
{
    using (var bulkCopy = new OracleBulkCopy(connection))
    {
        using (var transaction = connection.BeginTransaction())
        {
            bulkCopy.WriteToServer(dataTable);
            transaction.Commit();
        }
    }
}

Related

EF Core loads all data before any query takes place

I have a console app to test EF Core with SQL Server Express edition. I have the following code in Program.cs
using EFCoreTutorials;
using EFCoreTutorials.Entities;
using Microsoft.EntityFrameworkCore;
using (var context = new SalesManagementContext())
{
    var employees = context.Employees.ToList();
}
I wonder why context.Employees brings all the data from my Employees table without my even asking for it with a query. If I have many tables in my database and my context class holds a lot of data, are all of these available from the context variable at the start of the program, without even specifying which one I actually want to use?
Thank you
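For what it's worth, EF Core queries are deferred: context.Employees only describes a query, and the SQL is sent when the query is enumerated, which is exactly what ToList() does. A minimal sketch (the Employee entity comes from the question; the City property and filter are illustrative assumptions):

// requires: using System.Collections.Generic; using System.Linq;
using (var context = new SalesManagementContext())
{
    // Nothing has been sent to SQL Server yet: this only builds up an expression tree.
    IQueryable<Employee> query = context.Employees
        .Where(e => e.City == "London");   // City is an assumed property, for illustration

    // The SELECT (with the WHERE translated to SQL) runs only here, on enumeration.
    List<Employee> employees = query.ToList();
}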

Add PostgreSQL database as data source for WinForms DataGridView

I've successfully connected to a PostgreSQL database by manually coding it (create connection string for IP address, port, credentials and database name, create the NpgsqlConnection object, and open the connection).
Now I need to add that database as a Data Source for a DataGridView in a WinForms project. I ran across Devart's dotConnect for PostgreSQL and downloaded the Express version. They have a documentation page, but for the life of me I can't figure out how to add the database as a data source (I also reached out to their support email three days ago, but they haven't responded).
When I click Add New Data Source in the Data Sources tab and the Data Source Configuration Wizard opens, I'm not sure whether I should select Database or Object. In any case, I'm not seeing how to add the PostgreSQL database connection information as a data source through the wizard.
I believe the best way to do this is not to use the wizards: they hide code in .Designer.cs and .resx files and make maintenance difficult.
I recommend doing something like this programmatically:
DataGridView1.DataSource = GetData("Your sql");
public DataTable GetData(string selectSql)
{
    // Requires the Npgsql package (using Npgsql; using System.Data;)
    string connstring = "Your conn string";
    using (var conn = new NpgsqlConnection(connstring))
    using (var da = new NpgsqlDataAdapter(selectSql, conn))
    {
        var ds = new DataSet();
        conn.Open();
        da.Fill(ds);
        return ds.Tables[0];
    }
}

EF Core Bulk Delete on PostgreSQL

I'm trying to do a potentially large-scale delete operation on a single table (think 100,000 rows in a 1M-row table).
I’m using PostgreSQL and EntityFrameworkCore.
Details: the application code has a predicate to match and knows nothing about how many rows potentially match it. It could be 0 rows or a very large number.
Research indicates that EF Core is incapable of handling this efficiently (i.e. the following code produces a DELETE statement for each row!):
using (var db = new DbContext())
{
    var queryable = db.Table.AsQueryable()
        .Where(o => o.ForeignKey == fKey)
        .Where(o => o.OtherColumn == false);
    db.Table.RemoveRange(queryable);
    await db.SaveChangesAsync();
}
So here is the SQL I would prefer to run in a sort of batched operation:
delete from Table
where ForeignKey = 1234
and OtherColumn = false
and PK in (
select PK
from Table
where ForeignKey = 1234
and OtherColumn = false
limit 500
)
There are extension libraries out there, but I've yet to find an active one that supports Postgres. I'm currently executing the raw SQL above through EF Core.
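A minimal sketch of how that raw SQL can be driven from EF Core in a loop until nothing is left to delete (the db context and fKey variable are taken from the snippet above; the rest is illustrative):

// Sketch only: re-run the batched DELETE until a batch affects no rows.
// ExecuteSqlInterpolatedAsync sends fKey as a bound parameter, not as inline SQL.
int affected;
do
{
    affected = await db.Database.ExecuteSqlInterpolatedAsync($@"
        delete from Table
        where ForeignKey = {fKey}
          and OtherColumn = false
          and PK in (
              select PK
              from Table
              where ForeignKey = {fKey}
                and OtherColumn = false
              limit 500)");
} while (affected > 0);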
This leads to a couple questions:
Is there any way to get EF Core to delete these rows more efficiently on Postgres using LINQ, etc.?
(It seems to me that handing the context a queryable should give it everything it needs to make the proper decision here.)
If not, what are your opinions on deleting in batches vs handing the DB just the predicate?
I think you are trying to do something you should not use EntityFrameworkCore for. The purpose of EntityFrameworkCore is to provide a nice way to move data between a .NET Core application and a database, and the typical use case is a single object or a small number of objects. For bulk operations there are some NuGet packages; there is this package for inserting and updating with Postgres, and this article by the creator explains how it uses temporary tables and the Postgres COPY command to do bulk operations. That shows us a way to delete rows in bulk by id:
var toDelete = GetIdsToDelete();
using (var conn = new NpgsqlConnection(connectionString))
{
    conn.Open();
    // An explicit transaction is needed so that ON COMMIT DROP does not remove the
    // temp table as soon as the CREATE statement's own implicit transaction ends.
    using (var tx = conn.BeginTransaction())
    {
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "CREATE TEMP TABLE temp_ids_to_delete (id int NOT NULL) ON COMMIT DROP";
            cmd.ExecuteNonQuery();
        }
        using (var writer = conn.BeginBinaryImport("COPY temp_ids_to_delete (id) FROM STDIN (FORMAT BINARY)"))
        {
            foreach (var id in toDelete)
            {
                writer.StartRow();
                writer.Write(id);
            }
            writer.Complete();
        }
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "delete from myTable where id in (select id from temp_ids_to_delete)";
            cmd.ExecuteNonQuery();
        }
        tx.Commit();
    }
}
With some small changes this can be generalized further.
But you want to do something different. You don't want to move data or information between the application and the database; you want to use EF Core to create a SQL procedure on the fly and run it on the server. The problem is that EF Core is not really built to do that. But maybe there are ways around it. One way I can think of is to use EF Core to build a query, get the query string, and then insert that string into another SQL string to run on the server.
Getting the query string is currently not easy, but apparently it will be with EF Core 5.0. Then you could do this:
var queryable = db.Table.AsQueryable()
    .Where(o => o.ForeignKey == fKey)
    .Where(o => o.OtherColumn == false);
var queryString = queryable.ToQueryString();
db.Database.ExecuteSqlRaw("delete from Table where PK in (" + queryString + ")");
And yes, that is terribly hacky and I would not recommend it. I would recommend writing procedures and functions on the database server, because this is not something EF Core should be used for. You can still run those functions from EF Core and pass parameters.
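Calling such a server-side routine from EF Core is then a one-liner. A minimal sketch, assuming a PostgreSQL procedure named delete_matching(fkey integer, batch_size integer) has been created to hold the batched DELETE from the question (the name and signature are illustrative, not part of the answer above):

// Sketch only: delete_matching is an assumed server-side procedure; fKey and the
// batch size are passed as real parameters by ExecuteSqlInterpolatedAsync.
await db.Database.ExecuteSqlInterpolatedAsync($"CALL delete_matching({fKey}, {500})");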
I would suggest using temp tables for an operation like this. You would create a mirror temp table, bulk-add the records to keep or delete to the temp table, and then execute a delete operation that looks for records in (or not in) that temp table. Try using a library such as PgPartner to accomplish bulk additions and temp table creation very easily.
Check out PgPartner: https://www.nuget.org/packages/PgPartner/
https://github.com/SourceKor/PgPartner
Disclaimer: I'm the owner of the project Entity Framework Plus
Your scenario looks like something our Batch Delete feature could handle: https://entityframework-plus.net/batch-delete
using (var db = new DbContext())
{
    var queryable = db.Table.AsQueryable()
        .Where(o => o.ForeignKey == fKey)
        .Where(o => o.OtherColumn == false);
    queryable.Delete();
}
Entities are not loaded into the application, and only a single SQL statement is executed, as you specified.

Bulk Insert/Update with EF6?

I'm looking for a way to insert or update about 155,000 records using EF6. It has become obvious that EF6 out of the box is going to take way too long to look up a record, decide whether it's an insert or an update, create or update an object, and then commit it to the database.
Looking around I've seen third-party libraries like EntityFramework.Extended, but it looks like they are designed to do mass updates like "UPDATE Table WHERE Field = value", which doesn't quite fit what I'm looking to do.
In my case I read in an XML document, create a list of objects from it, and then use EF to either insert or update to a table. Would I be better off going back to plain ADO.NET and using bulk inserts that way?
BTW: this is using an Oracle database, not SQL Server.
You may use the EntityFramework.BulkInsert-ef6 package:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using EntityFramework.BulkInsert.Extensions;

class Program
{
    static void Main(string[] args)
    {
        var data = new List<Demo>();
        for (int i = 0; i < 1000000; i++)
        {
            data.Add(new Demo { InsertDate = DateTime.Now, Key = Guid.NewGuid(), Name = "Example " + i });
        }

        Stopwatch sw = Stopwatch.StartNew();
        using (Model1 model = new Model1())
        {
            model.BulkInsert(data);
            sw.Stop();
        }

        Console.WriteLine($"Elapsed time for {data.Count} rows: {sw.Elapsed}");
        Console.ReadKey();
    }
}
Running this on my local HDD drive gives
Elapsed time for 1000000 rows: 00:00:24.9646688
P.S. The package provider claims that this version of the bulk package is outdated. Anyhow, it has been fitting my needs for years now, and the package proposed by the author is no longer free of charge.
If you are looking for a "free" way to do it, I recommend going back to ADO.NET and using array binding, which is what I do under the hood in my library.
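For reference, ODP.NET array binding looks roughly like this. A minimal sketch: the table and column names (DEMO, NAME, VALUE) and the connection string are placeholders, not anything from the question:

// Sketch only: one INSERT statement executed once, with arrays bound to its
// parameters, so all rows go to the server in a single round trip.
using Oracle.ManagedDataAccess.Client;

string[] names  = { "a", "b", "c" };
int[]    values = { 1, 2, 3 };

using (var conn = new OracleConnection("User Id=...;Password=...;Data Source=..."))
using (var cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText = "INSERT INTO DEMO (NAME, VALUE) VALUES (:name, :value)";
    cmd.ArrayBindCount = names.Length;   // number of rows in this batch
    cmd.Parameters.Add(":name", OracleDbType.Varchar2, names, System.Data.ParameterDirection.Input);
    cmd.Parameters.Add(":value", OracleDbType.Int32, values, System.Data.ParameterDirection.Input);
    cmd.ExecuteNonQuery();               // inserts all rows at once
}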
Disclaimer: I'm the owner of Entity Framework Extensions
This library supports all major providers, including Oracle:
Oracle DevArt
Oracle DataAccess
Oracle DataAccessManaged
This library allows you to perform all bulk operations you need for your scenarios:
Bulk SaveChanges
Bulk Insert
Bulk Delete
Bulk Update
Bulk Merge
Example
// Easy to use
context.BulkSaveChanges();
// Easy to customize
context.BulkSaveChanges(bulk => bulk.BatchSize = 100);
// Perform Bulk Operations
context.BulkDelete(customers);
context.BulkInsert(customers);
context.BulkUpdate(customers);
// Customize Primary Key
context.BulkMerge(customers, operation => {
    operation.ColumnPrimaryKeyExpression = customer => customer.Code;
});
This library will save you a ton of time without requiring you to write any ADO.NET!

Model is not reading from my database (Entity Framework)

I created a simple MVC application with a database.
First I created the database, initialized it, and then created a model from the database, and the application worked.
Then I decided to load the database with totally different values (but the definitions of the tables/fields stayed the same).
After reloading the database, my application does not show any data from my DB. Using the debugger I saw that my application cannot get any data from the table.
Worse: I noticed that in addition to my database TestDB, the database explorer is showing a TestDB.mdf1 database with the same definitions as TestDB.mdf, but its table is empty...
Here is the code:
public ActionResult ShowQuestions()
{
    TestDBEntities _db = new TestDBEntities();      // this is the database
    ObjectSet<question> all_quest = _db.questions;  // this is the table
    foreach (var x in all_quest)
    {
        ..... // this part was never executed
    }
    return View(q_list);
}
Any ideas what I am doing wrong?
1- Check the connection string in your MVC application (web.config). This will tell you which database is being used (the short sketch after this list shows one way to verify it at runtime). Then go to that database and check whether your tables contain data.
If it's the wrong database, just amend the connection string.
2- If it's the correct database, but without data, just add data.
It may have been recreated by your ORM (who knows?).
3- What is the error you are getting? Or are you getting any error?
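For point 1, a small runtime check can also help. A sketch, using the ObjectContext API from the question (the Count() call needs using System.Linq;):

// Print the connection string the context actually resolved and a row count,
// to confirm which database the application is really hitting.
using (var db = new TestDBEntities())
{
    Console.WriteLine(db.Connection.ConnectionString);
    Console.WriteLine("questions rows: " + db.questions.Count());
}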