Consider the following code.
var items = from i in context.Items
            select i;
var item = items.FirstOrDefault();
item.This = "that";
item.That = "this";

var items2 = from i in context.Items
             where i.This == "that"
             select i;
var data = items2.FirstOrDefault();
context.SaveChanges();
I'm trying to confirm that items2 will not include my modifications to item. In other words, items2's copy of item will not include the unsaved changes.
Have you tried it? =)
By default, your objects are tracked and cached by the context, so the objects returned by your second query actually do reflect the changes you made after the first.
You may want to call context.Items.AsNoTracking() in one of your two queries to get the behavior you want.
Edit: One subtlety: items2 doesn't hit the database until you enumerate it, e.g. with ToList() or your FirstOrDefault() call; until then it remains an IQueryable.
HOWEVER, once it does execute, you'll run into the caching scenario I outlined above, and the result will contain the changed value.
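For illustration, a minimal sketch of the difference, reusing the hypothetical This property from the snippet above (EF6 API):

using System.Data.Entity; // EF6: provides AsNoTracking()
using System.Linq;

var item = context.Items.FirstOrDefault(); // tracked: the context caches this instance
item.This = "that";                        // unsaved, in-memory change

// Tracked query: for any row the database returns, identity resolution hands
// back the already-tracked instance, including its unsaved changes. (The WHERE
// itself is still evaluated against the database values.)
var tracked = context.Items.Where(i => i.This == "that").ToList();

// No-tracking query: entities are materialized fresh from the database and
// are not merged with the modified, cached instance.
var fresh = context.Items.AsNoTracking()
                         .Where(i => i.This == "that")
                         .ToList();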
In the case of var item, the query is executed the moment you call FirstOrDefault(). For items2, the query is not executed until it is enumerated. Either way, in your case the results of items2 will be affected by the updates you made after the first query.
It will contain the modifications; the only way around that is to create a new context and query the new context.
I want to duplicate a list of objects, then do some things to the duplicate without affecting the original list.
Here's my code:
var tempHolidays = <Holiday>[];
tempHolidays = holidays;
tempHolidays[widget.index].start = widget.start;
tempHolidays[widget.index].end = widget.end;
The results I'm seeing suggest that the actions carried out on tempHolidays are mirrored on holidays. Is this possible, or do I have a bug elsewhere?
What you are doing is just assigning the reference of the original list to the second variable, so both names point at the same List.
To duplicate a list, use toList(); it returns a different List object containing the same elements as the original:
tempHolidays = holidays.toList()
Note that this is a shallow copy: both lists still reference the same Holiday objects, so mutating fields like start and end on an element will still show up through the original list. To fully isolate the copies, you'd need to clone each Holiday as well.
I'm using Entity Framework 6.1.3 with a SQLite database.
During page load I'm initializing some properties in a loop (see below).
foreach (var trade in model.Trades)
{
    trade.ExchangeRates = Db.ExchangeRates.Local;
    trade.BaseCurrency = Prj_TradAc.Properties.Settings.Default.BaseCurrency;
}
Db.ExchangeRates.Local never hits the database, which is expected.
So I was expecting the loop to do nothing more than assign a reference to Db.ExchangeRates.Local, which should be fast.
However, with only ~500 trades the loop takes almost 10 s!
When I do the following
var ers = Db.ExchangeRates.Local;
foreach (var trade in model.Trades)
{
    trade.ExchangeRates = ers;
    trade.BaseCurrency = Prj_TradAc.Properties.Settings.Default.BaseCurrency;
}
the same loop with the same amount of data takes ~40ms
So why is accessing DBSet.Local so slow?
EDIT:
Setting Db.Configuration.AutoDetectChangesEnabled = false also makes the assignment fast. However, I still don't understand why this is an issue here. The properties I'm assigning to are simple wrappers around fields, so no extra work is going on there, and nothing should be changing the DbSet during the assignment.
Any time you access the Local property (via its getter) while DbContext.Configuration.AutoDetectChangesEnabled is true (the default), EF calls the ObjectContext.DetectChanges method, which scans all tracked entities for changes; doing that on every iteration is what slows the loop down.
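If you can't simply hoist the Local access out of the loop as shown above, another common pattern is to suspend change detection around the block and restore it afterwards (a sketch against the question's code, using the stock EF6 API):

bool detectChanges = Db.Configuration.AutoDetectChangesEnabled;
try
{
    // Suppress the DetectChanges scan that the Local getter would otherwise
    // trigger on every iteration.
    Db.Configuration.AutoDetectChangesEnabled = false;
    foreach (var trade in model.Trades)
    {
        trade.ExchangeRates = Db.ExchangeRates.Local;
        trade.BaseCurrency = Prj_TradAc.Properties.Settings.Default.BaseCurrency;
    }
}
finally
{
    // Restore the previous setting so SaveChanges() still picks up edits later.
    Db.Configuration.AutoDetectChangesEnabled = detectChanges;
}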
I have a very small Entity Framework setup containing only a few related classes/tables and a view. I need to be able to pull a specific record from this view: namely, I need to grab the record that meets two criteria, a specific ProfileID and a specific QuoteID.
This line is what's causing the problem:
TWProfileUpchargeTotal upchargeTotals = _context.TWProfileUpchargeTotals.Where(p => p.Profileid == profile.id && p.quoteid == _quote.quoteid).First();
I'm looping through the profiles I know about and getting their information from the view, so profile.id changes each time.
The first time this code executes it gets the correct record from the view.
The second and third (and presumably beyond that) time it executes, it retrieves the exact same record.
Any idea why or what I'm doing wrong here?
Thanks in advance.
You've been bitten by the LINQ "gotcha" called closure. The following post (and many others) on SO details this:
closure
What you need to do is declare a variable WITHIN the foreach (which you've omitted from the code above), assign profile to it, and use that variable in the Where clause.
foreach (Profile profile in ListOfProfiles)
{
    var localProfile = profile;
    TWProfileUpchargeTotal upchargeTotals = _context.TWProfileUpchargeTotals
        .Where(p => p.Profileid == localProfile.id && p.quoteid == _quote.quoteid)
        .First();
}
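For background, the gotcha usually bites when a query is built inside a loop but only executed (enumerated) later; under compilers before C# 5, the foreach variable was a single variable shared across all iterations. A simplified sketch reusing the names from the question:

// All lambdas capture the VARIABLE 'profile', not its value at this point.
var queries = new List<IQueryable<TWProfileUpchargeTotal>>();
foreach (Profile profile in ListOfProfiles)
{
    queries.Add(_context.TWProfileUpchargeTotals
                        .Where(p => p.Profileid == profile.id));
}
// Deferred execution: the SQL runs only when each query is enumerated, after
// the loop, so every query sees the value 'profile' last held. Copying it to
// a local inside the loop gives each lambda its own captured variable.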
I need to delete all the rows in a DataTable (ADO.NET). I don't want to use foreach here.
How can I delete all the rows in a single call?
Then I need to push the update from the DataTable to the database.
Note: I have tried dataSet.Tables[0].Rows.Clear() and dataSet.Clear(), but neither works.
Also, I don't want to use a SQL DELETE query.
Please post only if you can provide an answer other than these.
'foreach' isn't such a simple answer to this question either -- you run into the problem where the enumerator is no longer valid for a changed collection.
The hassle here is that Delete() behaves differently for a row in DataRowState.Added vs. Unchanged or Modified. If you Delete() an added row, it is simply removed from the collection, since the data store presumably never knew about it anyway. Delete() on an unchanged or modified row just marks it as deleted, as others have indicated.
So I've ended up implementing an extension method like this to handle deleting all rows. If anyone has any better solutions, that would be great.
public static class DataTableExtensions
{
    public static void DeleteAllRows(this DataTable table)
    {
        int idx = 0;
        while (idx < table.Rows.Count)
        {
            int curCount = table.Rows.Count;
            table.Rows[idx].Delete();
            // Added rows are removed outright, shrinking the collection;
            // only advance when Delete() merely marked the row as Deleted.
            if (curCount == table.Rows.Count) idx++;
        }
    }
}
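Usage might then look like this (a sketch; the dataAdapter and dataSet names are hypothetical and assumed to be configured elsewhere):

dataSet.Tables[0].DeleteAllRows();     // flags rows as Deleted (Added rows are removed)
dataAdapter.Update(dataSet.Tables[0]); // pushes the deletions to the database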
I'll just repost my comment here in case you want to close off this question, because I don't believe it's possible to "bulk delete" all the rows in a DataTable:
There's a difference between removing rows from a DataTable and deleting them. Deleting flags them as deleted, so that when you apply changes to your database, the actual db rows are deleted. Removing them (with Clear()) just takes them out of the in-memory DataTable.
You'll have to iterate over the rows and delete them one by one. It's only a couple of lines of code.
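For example (a sketch, with table being the DataTable in question; Select() with no arguments returns a snapshot array of all rows, so deleting while iterating is safe):

foreach (DataRow row in table.Select())
{
    row.Delete(); // marks Unchanged/Modified rows as Deleted; removes Added rows
}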
This works:
for (int i = dtParts.Rows.Count - 1; i >= 0; i--)
    dtParts.Rows[i].Delete();
This works with a lambda expression. You still end up with a ForEach under the hood, but at least it's a single line to delete everything from the DataTable :)
myTable.AsEnumerable().ToList().ForEach(m => m.Delete());
db.table.AsEnumerable().ToList().ForEach(e => db.ProductGroupAgreemets.Remove(e));
Works for me.
dataTable = dataTable.Clone();
That should do it. Clone() copies the table structure but not the data (DataTable.Copy() would copy the data too). Note, though, that this only empties the in-memory table; like Clear(), it won't flag any rows for deletion in the database.
The problem simplified:
I have a DataSet with some datatables...
I have a Winforms DataGrid bound to one of the datatables.
The user sticks some rows into said datatable via the DataGrid; let's say 3 rows.
All three rows now have their RowState = DataRowState.Added.
I now begin a SQL Server transaction.
Then I call dataAdapter1.Update(dataSet1) to push the rows to SQL Server.
row 1.. OK
row 2.. OK
row 3.. error at the SQL Server level (by design: I enforced a unique index)
Upon detecting this error, I roll back the SQL Server transaction.
I also try to "roll back" the datatable/dataset changes, using either of Dataset1.RejectChanges() and/or Datatable1.RejectChanges().
The problem is that neither RejectChanges() works the way I envisaged. My datatable now has two rows (row1, row2) whose RowState = DataRowState.Unchanged; row3 has disappeared altogether.
What I want is for all 3 rows in the datatable to remain in the SAME STATE they were in just prior to the call to the dataAdapter1.Update() method when I roll back the SQL Server transaction.
(The reason is so that the user can look at the error in the bound DataGrid, take corrective action, and attempt the update again.)
Any ideas, anyone? I.e., I am looking for something equivalent to rolling back the state at the ADO DataTable level.
OK, so I figured out a way to get around this.
Get a clone of the original datatable, and update the clone.
If an error occurs, you still have the original datatable with its original DataRowStates; furthermore, you can copy any errors that occur in the clone over to the original datatable, thus reflecting the errors in the datagrid for the user to see.
If the update is successful, you simply refresh the original datatable from the clone.
VB Code:
Try
    'daMyAdapter.Update(dsDataset, "MyDatatable") <-- replace original with below lines.
    _dtMyDatatableClone = dsDataset.MyDatatable.Copy()
    If _dtMyDatatableClone IsNot Nothing Then
        daMyAdapter.Update(_dtMyDatatableClone)
        'If you get here, the update was successful - refresh now!
        dsDataset.MyDatatable.Clear()
        dsDataset.MyDatatable.Merge(_dtMyDatatableClone, False, MissingSchemaAction.Ignore)
    End If
Catch
    'Uh oh - put error handler here.
End Try
I had a similar issue trying to roll back changes to a DataTable bound to an Xceed DataGrid. Once edits are made in the DataGrid, the edited values all become part of the DataRow's Current state, and RejectChanges is only applicable for preventing the Proposed row state from becoming Current.
In order to revert the changes for a given row, I wrote a method to overwrite the current row version with the original version. In order to set a version as the Original, you simply call AcceptChanges() on the datatable.
public static void RevertToOriginalValues(DataRow row)
{
    if (row.HasVersion(DataRowVersion.Original) && row.HasVersion(DataRowVersion.Current))
    {
        for (int colIndex = 0; colIndex < row.ItemArray.Length; colIndex++)
        {
            // Overwrite the Current value with the Original version.
            var original = row[colIndex, DataRowVersion.Original];
            row[colIndex] = original;
        }
    }
}
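After a failed Update, you could then walk the table and revert every row with it (a sketch using the method above; dataTable is a hypothetical name for your table):

foreach (DataRow row in dataTable.Rows)
{
    RevertToOriginalValues(row); // rows without an Original version are left alone
}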