Multiple Result Sets in EF Core 1.0 - entity-framework-core

Does Entity Framework Core 1.0 support multiple result sets?
If so, can you please give an example based on the stored procedure below?
CREATE PROCEDURE uspGetProductInfo
AS
BEGIN
SELECT ID,PRODUCT_NAME FROM PRODUCT
SELECT ID,CATEGORY_NAME FROM PRODUCT_CATEGORY
END

What kind of support are you looking for? You can certainly drop down to ADO.NET:
var command = db.Database.GetDbConnection().CreateCommand();
command.CommandText = "uspGetProductInfo";
command.CommandType = CommandType.StoredProcedure;

db.Database.OpenConnection();
try
{
    using (var reader = command.ExecuteReader())
    {
        // First result set: PRODUCT
        while (reader.Read())
        {
            // TODO: Read products
        }

        // Second result set: PRODUCT_CATEGORY
        reader.NextResult();
        while (reader.Read())
        {
            // TODO: Read product categories
        }
    }
}
finally
{
    db.Database.CloseConnection();
}
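To actually materialize the rows, the TODO sections above can be filled in along these lines. This is only a sketch: Product and ProductCategory are hypothetical POCOs, and the column types (int ID, string name) are assumptions about the underlying tables.
// Replaces the using(reader) block above (hypothetical POCOs assumed)
var products = new List<Product>();
var categories = new List<ProductCategory>();

using (var reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        products.Add(new Product
        {
            Id = reader.GetInt32(0),           // ID
            ProductName = reader.GetString(1)  // PRODUCT_NAME
        });
    }

    reader.NextResult();
    while (reader.Read())
    {
        categories.Add(new ProductCategory
        {
            Id = reader.GetInt32(0),            // ID
            CategoryName = reader.GetString(1)  // CATEGORY_NAME
        });
    }
}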

Related

Force Entity Framework to read each line with all details

I am having trouble with an EF method returning duplicate rows of data. When I run it, in my example, it returns four rows from a database view, but the fourth row contains details from the third row.
The same query in SSMS returns four individual rows with the correct details. I have read somewhere about EF having problems when there is no identity column. But is there any way to alter the code below to force EF to read all records with all details?
public List<vs_transactions> GetTransactionList(int cID)
{
    using (StagingDataEntities db = new StagingDataEntities())
    {
        var res = from trans in db.vs_transactions
                  where trans.CreditID == cID
                  orderby trans.ActionDate descending
                  select trans;
        return res.ToList();
    }
}
Found the solution :) MergeOption.NoTracking
public List<vs_transactions> GetTransactionList(int cID)
{
    using (StagingDataEntities db = new StagingDataEntities())
    {
        // NoTracking must be set on the set after the context has been created
        db.vs_transactions.MergeOption = MergeOption.NoTracking;

        var res = from trans in db.vs_transactions
                  where trans.CreditID == cID
                  orderby trans.ActionDate descending
                  select trans;
        return res.ToList();
    }
}
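If the context had been a DbContext rather than an ObjectContext, the equivalent would be AsNoTracking(); a sketch assuming the same vs_transactions set:
public List<vs_transactions> GetTransactionList(int cID)
{
    using (var db = new StagingDataEntities())
    {
        // AsNoTracking skips identity resolution, so each row from the view
        // is materialized as-is instead of being merged with a "matching" tracked entity.
        return db.vs_transactions
                 .AsNoTracking()
                 .Where(trans => trans.CreditID == cID)
                 .OrderByDescending(trans => trans.ActionDate)
                 .ToList();
    }
}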

EF Core multiple datasets

I am new to EF and want to return multiple datasets (like we do in stored procedures) in one database trip. I have 3 dropdowns to populate on my form, plus the main page fields (which would be another dataset).
Any help would be appreciated.
Thanks
In this MSDN sample you can find the full solution (note that it uses the EF6 ObjectContext.Translate API):
using (var db = new BloggingContext())
{
    // If using Code First we need to make sure the model is built before we open the connection
    // This isn't required for models created with the EF Designer
    db.Database.Initialize(force: false);

    // Create a SQL command to execute the sproc
    var cmd = db.Database.Connection.CreateCommand();
    cmd.CommandText = "[dbo].[GetAllBlogsAndPosts]";

    try
    {
        db.Database.Connection.Open();

        // Run the sproc
        var reader = cmd.ExecuteReader();

        // Read Blogs from the first result set
        var blogs = ((IObjectContextAdapter)db)
            .ObjectContext
            .Translate<Blog>(reader, "Blogs", MergeOption.AppendOnly);

        foreach (var item in blogs)
        {
            Console.WriteLine(item.Name);
        }

        // Move to second result set and read Posts
        reader.NextResult();

        var posts = ((IObjectContextAdapter)db)
            .ObjectContext
            .Translate<Post>(reader, "Posts", MergeOption.AppendOnly);

        foreach (var item in posts)
        {
            Console.WriteLine(item.Title);
        }
    }
    finally
    {
        db.Database.Connection.Close();
    }
}

Entity Framework 6: Disable Lazy Loading and specifically load included tables

Our current system uses lazy loading by default (this is something I am going to disable, but it can't be done right now).
For this basic query I want to return two tables, CustomerNote and Note.
This is my query
using (var newContext = new Entities(true))
{
    newContext.Configuration.LazyLoadingEnabled = false;

    var result = from customerNotes in newContext.CustomerNotes.Include(d => d.Note)
                 join note in newContext.Notes
                     on customerNotes.NoteId equals note.Id
                 where customerNotes.CustomerId == customerId
                 select customerNotes;

    return result.ToList();
}
My result, however, only contains the data from the CustomerNote table.
The linked entities Customer and Note are both null. What am I doing wrong here?
I got it working with the following, which is much simpler than what I've found elsewhere:
Context.Configuration.LazyLoadingEnabled = false;
var result = Context.CustomerNotes.Where<CustomerNote>(d => d.CustomerId == customerId)
                    .Include(d => d.Note)
                    .Include(d => d.Note.User);
return result.ToList();
This returns my CustomerNote table, related Notes and related Users from the Notes.
What you want to achieve is called eager loading.
var customerNotes = newContext.CustomerNotes.Include(t => t.Note).ToList();
This should work; I don't really understand the query (keyword) syntax.
If the code above doesn't work, try this:
var customerNotes = newContext.CustomerNotes.Include(t => t.Note).Select(t => new {
    Note = t.Note,
    Item = t
}).ToList();
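If you eventually want lazy loading off for every query from this context, rather than setting it per instance as above, one common EF6 approach is a small helper on the context. This is a sketch: the partial class name Entities matches the question's context, and DisableLazyLoading is a hypothetical helper you would add yourself.
public partial class Entities
{
    // Hypothetical helper: call once after constructing the context to turn
    // lazy loading and proxy creation off for every query issued by that instance.
    public void DisableLazyLoading()
    {
        this.Configuration.LazyLoadingEnabled = false;
        this.Configuration.ProxyCreationEnabled = false;
    }
}
Then newContext.DisableLazyLoading(); replaces the per-call Configuration line in the query above.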

Batch update/delete EF5

What is the best way to deal with batch updates using Entity Framework 5 (EF5)?
I have 2 particular cases I'm interested in:
Updating a field (e.g. UpdateDate) for a list of between 100 and 100,000 IDs, which are the primary key. Calling each update separately seems to be too much overhead and takes a long time.
Inserting many objects of the same type (e.g. Users), also between 100 and 100,000, in a single go.
Any good advice?
There are two open source projects allowing this: EntityFramework.Extended and Entity Framework Extensions. You can also check the discussion about bulk updates on EF's CodePlex site.
Inserting 100k records through EF is, in the first place, the wrong application architecture; you should choose a different, lightweight technology for data imports. Even EF's internal processing of such a big record set will cost you a lot of time. There is currently no built-in solution for batch inserts in EF, but there is a broad discussion about this feature on EF's CodePlex site.
I see the following options:
1. The simplest way - create your SQL request by hand and execute it through ObjectContext.ExecuteStoreCommand:
context.ExecuteStoreCommand("UPDATE TABLE SET FIELD1 = {0} WHERE FIELD2 = {1}", value1, value2);
2. Use EntityFramework.Extended:
context.Tasks.Update(
    t => t.StatusId == 1,
    t => new Task { StatusId = 2 });
3. Make your own extension for EF. There is an article, Bulk Delete, where this goal was achieved by inheriting from the ObjectContext class. It's worth taking a look. Bulk insert/update can be implemented in the same way.
You may not want to hear it, but your best option is not to use EF for bulk operations. For updating a field across a table of records, use an UPDATE statement in the database (possibly called through a stored proc mapped to an EF function). You can also use the Context.ExecuteStoreCommand method to issue an UPDATE statement to the database.
For massive inserts, your best bet is to use Bulk Copy or SSIS. EF will require a separate hit to the database for each row being inserted.
Bulk inserts should be done using the SqlBulkCopy class. Please see the existing Stack Overflow Q&A on integrating the two: SqlBulkCopy and Entity Framework
SqlBulkCopy is a lot more user-friendly than bcp (the bulk copy command-line utility) or even OPENROWSET.
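For completeness, here is a minimal SqlBulkCopy sketch. The dbo.Users table with Name and Email columns, the usersToInsert list, and the connection string taken from the EF context are all assumptions for illustration.
// Sketch: bulk-insert a list of users with SqlBulkCopy (table/column names are assumptions)
var table = new DataTable();
table.Columns.Add("Name", typeof(string));
table.Columns.Add("Email", typeof(string));

foreach (var user in usersToInsert)
{
    table.Rows.Add(user.Name, user.Email);
}

using (var bulkCopy = new SqlBulkCopy(context.Database.Connection.ConnectionString))
{
    bulkCopy.DestinationTableName = "dbo.Users";
    bulkCopy.ColumnMappings.Add("Name", "Name");
    bulkCopy.ColumnMappings.Add("Email", "Email");
    bulkCopy.WriteToServer(table);
}
SqlBulkCopy streams all rows in a single bulk operation instead of issuing one INSERT per row.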
Here's what I've done successfully:
private void BulkUpdate()
{
    var oc = ((IObjectContextAdapter)_dbContext).ObjectContext;

    var updateQuery = myIQueryable.ToString(); // This MUST be above the call to get the parameters.
    var updateParams = GetSqlParametersForIQueryable(myIQueryable).ToArray();
    var updateSql = $@"UPDATE dbo.myTable
                       SET col1 = x.alias2
                       FROM dbo.myTable
                       JOIN ({updateQuery}) x(alias1, alias2) ON x.alias1 = dbo.myTable.Id";

    oc.ExecuteStoreCommand(updateSql, updateParams);
}

private void BulkInsert()
{
    var oc = ((IObjectContextAdapter)_dbContext).ObjectContext;

    var insertQuery = myIQueryable.ToString(); // This MUST be above the call to get the parameters.
    var insertParams = GetSqlParametersForIQueryable(myIQueryable).ToArray();
    var insertSql = $@"INSERT INTO dbo.myTable (col1, col2)
                       SELECT x.alias1, x.alias2
                       FROM ({insertQuery}) x(alias1, alias2)";

    oc.ExecuteStoreCommand(insertSql, insertParams);
}
private static IEnumerable<SqlParameter> GetSqlParametersForIQueryable<T>(IQueryable<T> queryable)
{
    var objectQuery = GetObjectQueryFromIQueryable(queryable);
    return objectQuery.Parameters.Select(x => new SqlParameter(x.Name, x.Value));
}

private static ObjectQuery<T> GetObjectQueryFromIQueryable<T>(IQueryable<T> queryable)
{
    var dbQuery = (DbQuery<T>)queryable;
    var iqProp = dbQuery.GetType().GetProperty("InternalQuery", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
    var iq = iqProp.GetValue(dbQuery, null);
    var oqProp = iq.GetType().GetProperty("ObjectQuery", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
    return (ObjectQuery<T>)oqProp.GetValue(iq, null);
}
public static bool BulkDelete(string tableName, string columnName, List<object> val)
{
    bool ret = true;
    var max = 2000; // SQL Server allows roughly 2100 parameters per command, so batch the IN list
    var pages = (int)Math.Ceiling((double)val.Count / max);

    for (int i = 0; i < pages; i++)
    {
        var count = Math.Min(max, val.Count - i * max);
        var args = val.GetRange(i * max, count);
        var cond = string.Join(",", args.Select((t, index) => $"@p{index}"));
        var sql = $"DELETE FROM {tableName} WHERE {columnName} IN ({cond})";
        ret &= Db.ExecuteSqlCommand(sql, args.ToArray()) > 0;
    }

    return ret;
}
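Usage would look something like this (the table and column names are examples only):
// Example call: delete dbo.Users rows whose Id is in the list
var ids = new List<object> { 1, 2, 3 };
var allBatchesDeleted = BulkDelete("dbo.Users", "Id", ids);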
I agree with the accepted answer that EF is probably the wrong technology for bulk inserts.
However, I think it's worth having a look at EntityFramework.BulkInsert.
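For reference, that package exposes a BulkInsert extension method on the context. Treat the exact overloads as an assumption and verify against the project's documentation; a sketch:
// Sketch: EntityFramework.BulkInsert usage (verify against the package's docs)
using (var context = new MyContext())
{
    context.BulkInsert(usersToInsert);
    // No SaveChanges call is needed; the rows are written via SqlBulkCopy under the hood.
}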

Returning a DataTable using Entity Framework ExecuteStoreQuery

I am working with a system that has many stored procedures whose results need to be displayed. Creating entities for each of my objects is not practical.
Is it possible, and if so how, to return a DataTable using ExecuteStoreQuery?
public ObjectResult<DataTable> MethodName(string fileSetName)
{
    using (var dataContext = new DataContext(_connectionString))
    {
        var returnDataTable = ((IObjectContextAdapter)dataContext).ObjectContext.ExecuteStoreQuery<DataTable>("SP_NAME", "SP_PARAM");
        return returnDataTable;
    }
}
Yes, it's possible, but it should only be used for a dynamic result set or raw SQL.
public DataTable ExecuteStoreQuery(string commandText, params Object[] parameters)
{
    DataTable retVal = new DataTable();
    retVal = context.ExecuteStoreQuery<DataTable>(commandText, parameters).FirstOrDefault();
    return retVal;
}
Edit: It's better to use classic ADO.NET to get the data rather than Entity Framework, because most probably you cannot use a DataTable even though you can run the method context.ExecuteStoreQuery<DataTable>(commandText, parameters).FirstOrDefault();
ADO.NET Example:
public DataSet GetResultReport(int questionId)
{
    DataSet retVal = new DataSet();

    EntityConnection entityConn = (EntityConnection)context.Connection;
    SqlConnection sqlConn = (SqlConnection)entityConn.StoreConnection;

    SqlCommand cmdReport = new SqlCommand("[YourSpName]", sqlConn);
    SqlDataAdapter daReport = new SqlDataAdapter(cmdReport);
    using (cmdReport)
    {
        SqlParameter questionIdPrm = new SqlParameter("@QuestionId", questionId);
        cmdReport.CommandType = CommandType.StoredProcedure;
        cmdReport.Parameters.Add(questionIdPrm);
        daReport.Fill(retVal);
    }
    return retVal;
}
No, I don't think that'll work - Entity Framework is geared towards returning entities and isn't meant to return DataTable objects.
If you need DataTable objects, use straight ADO.NET instead.
This method uses the connection string from Entity Framework to establish an ADO.NET connection, in this example to a MySQL database.
using MySql.Data.MySqlClient;

public DataSet GetReportSummary(int RecordID)
{
    var context = new catalogEntities();
    DataSet ds = new DataSet();
    using (MySqlConnection connection = new MySqlConnection(context.Database.Connection.ConnectionString))
    {
        using (MySqlCommand cmd = new MySqlCommand("ReportSummary", connection))
        {
            MySqlDataAdapter adapter = new MySqlDataAdapter(cmd);
            adapter.SelectCommand.CommandType = CommandType.StoredProcedure;
            adapter.SelectCommand.Parameters.Add(new MySqlParameter("@ID", RecordID));
            adapter.Fill(ds);
        }
    }
    return ds;
}
Yes, it can easily be done like this:
var table = new DataTable();
using (var ctx = new SomeContext())
{
    var cmd = ctx.Database.Connection.CreateCommand();
    cmd.CommandText = "Select Col1, Col2 from SomeTable";
    cmd.Connection.Open();
    table.Load(cmd.ExecuteReader());
}
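If the query needs a parameter, the same pattern works with DbCommand.CreateParameter; a sketch where the @id parameter and the Col1 filter are assumptions:
var table = new DataTable();
using (var ctx = new SomeContext())
{
    var cmd = ctx.Database.Connection.CreateCommand();
    cmd.CommandText = "Select Col1, Col2 from SomeTable where Col1 = @id";

    // Provider-agnostic way to add a parameter
    var p = cmd.CreateParameter();
    p.ParameterName = "@id";
    p.Value = 42;
    cmd.Parameters.Add(p);

    cmd.Connection.Open();
    table.Load(cmd.ExecuteReader());
}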
As a rule, you shouldn't use a DataSet inside an EF application. But if you really need to (for instance, to feed a report), this solution should work (it's EF6 code):
DataSet GetDataSet(string sql, CommandType commandType, Dictionary<string, Object> parameters)
{
    // creates resulting dataset
    var result = new DataSet();

    // creates a data access context (DbContext descendant)
    using (var context = new MyDbContext())
    {
        // creates a Command
        var cmd = context.Database.Connection.CreateCommand();
        cmd.CommandType = commandType;
        cmd.CommandText = sql;

        // adds all parameters
        foreach (var pr in parameters)
        {
            var p = cmd.CreateParameter();
            p.ParameterName = pr.Key;
            p.Value = pr.Value;
            cmd.Parameters.Add(p);
        }

        try
        {
            // executes
            context.Database.Connection.Open();
            var reader = cmd.ExecuteReader();

            // loop through all result sets (considering that it's possible to have more than one)
            do
            {
                // loads the DataTable (schema will be fetched automatically)
                var tb = new DataTable();
                tb.Load(reader);
                result.Tables.Add(tb);
            } while (!reader.IsClosed);
        }
        finally
        {
            // closes the connection
            context.Database.Connection.Close();
        }
    }

    // returns the DataSet
    return result;
}
In my Entity Framework based solution I need to replace one of my LINQ queries with SQL, for efficiency reasons.
Also, I want my results in a DataTable from one stored procedure so that I can create a table-valued parameter to pass into a second stored procedure. So:
I'm using SQL
I don't want a DataSet
Iterating an IEnumerable probably isn't going to cut it, for efficiency reasons
Also, I am using EF6, so I would prefer DbContext.Database.SqlQuery over ObjectContext.ExecuteStoreQuery as the original poster requested.
However, I found that this just didn't work:
_Context.Database.SqlQuery<DataTable>(sql, parameters).FirstOrDefault();
This is my solution. It returns a DataTable that is fetched using an ADO.NET SqlDataReader, which I believe is faster than a SqlDataAdapter for read-only data. It doesn't strictly answer the question because it uses ADO.NET, but it shows how to do that after getting hold of the connection from the DbContext.
protected DataTable GetDataTable(string sql, params object[] parameters)
{
    // didn't work - table had no columns or rows
    // return Context.Database.SqlQuery<DataTable>(sql, parameters).FirstOrDefault();

    DataTable result = new DataTable();

    SqlConnection conn = Context.Database.Connection as SqlConnection;
    if (conn == null)
    {
        throw new InvalidCastException("SqlConnection is invalid for this database");
    }

    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddRange(parameters);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            result.Load(reader);
        }
        return result;
    }
}
The easiest way to return a DataTable using Entity Framework is to do the following:
MetaTable metaTable = Global.DefaultModel.GetTable("Your EntitySetName");
For example:
MetaTable metaTable = Global.DefaultModel.GetTable("Employees");
Maybe your stored procedure could return a complex type?
http://blogs.msdn.com/b/somasegar/archive/2010/01/11/entity-framework-in-net-4.aspx