I'm currently inserting/updating fields like this (if there's a better way, please say so - we're always learning)
public void UpdateChallengeAnswers(List<ChallengeAnswerInfo> model, Decimal field_id, Decimal loggedUserId)
{
    JK_ChallengeAnswers o;
    foreach (ChallengeAnswerInfo a in model)
    {
        o = this.FindChallengeAnswerById(a.ChallengeAnswerId);
        if (o == null) o = new JK_ChallengeAnswers();
        o.answer = FilterString(a.Answer);
        o.correct = a.Correct;
        o.link_text = "";
        o.link_url = "";
        o.position = FilterInt(a.Position);
        o.updated_user = loggedUserId;
        o.updated_date = DateTime.UtcNow;
        if (o.challenge_id == 0)
        {
            // New record
            o.challenge_id = field_id; // FK
            o.created_user = loggedUserId;
            o.created_date = DateTime.UtcNow;
            db.JK_ChallengeAnswers.AddObject(o);
        }
        else
        {
            // Update record
            this.Save();
        }
    }
    this.Save(); // Commit changes
}
As you can see, this.Save() (which invokes db.SaveChanges()) appears twice.
When adding, the AddObject method only places the new object in a holding area; in other words, the new object is not committed right away, and we can stage as many objects as we want.
But when it's an update, I need to Save before moving on to the next object. Is there a method I could use so that, let's say:
        if (o.challenge_id == 0)
        {
            // New record
            o.challenge_id = field_id;
            o.created_user = loggedUserId;
            o.created_date = DateTime.UtcNow;
            db.JK_ChallengeAnswers.AddObject(o);
        }
        else
        {
            // Update record
            db.JK_ChallengeAnswers.RetainObject(o);
        }
    }
    this.Save(); // Only save once when all objects are ready to commit
}
So if there are 5 updates, I don't need to save to the database 5 times, only once at the end.
Thank you.
Well, if you have an object that is attached to the graph and you modify its values, the entity is marked as Modified.
If you simply call .AddObject, the entity is marked as Added.
Nothing has happened yet - only staging of the graph.
Then, when you execute SaveChanges(), EF translates the entries in the ObjectStateManager into the relevant store queries.
Your code looks a bit strange. Have you debugged through it (and run a SQL trace) to see what is actually being executed? I can't see why you need that first .Save: in line with the points above, since you're modifying the entities in the first few lines of the method, an UPDATE statement will most likely be executed regardless of the ID.
I suggest you refactor your code to handle new/modified entities in separate methods (ideally via a Repository).
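In other words, a single SaveChanges at the end should be enough for both cases. A minimal sketch of the refactored loop, keeping your ObjectContext-style API (method and property names are taken from your snippet; treat it as an illustration, not a drop-in):
public void UpdateChallengeAnswers(List<ChallengeAnswerInfo> model, Decimal field_id, Decimal loggedUserId)
{
    foreach (ChallengeAnswerInfo a in model)
    {
        JK_ChallengeAnswers o = this.FindChallengeAnswerById(a.ChallengeAnswerId);
        bool isNew = (o == null);
        if (isNew) o = new JK_ChallengeAnswers();

        o.answer = FilterString(a.Answer);
        o.correct = a.Correct;
        o.position = FilterInt(a.Position);
        o.updated_user = loggedUserId;
        o.updated_date = DateTime.UtcNow;

        if (isNew)
        {
            o.challenge_id = field_id;
            o.created_user = loggedUserId;
            o.created_date = DateTime.UtcNow;
            db.JK_ChallengeAnswers.AddObject(o); // staged as Added
        }
        // Entities you fetched are already tracked; editing them marks them
        // Modified, so nothing else is needed for the update case.
    }
    this.Save(); // one SaveChanges flushes all INSERTs and UPDATEs together
}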
Taken from the Employee Info Starter Kit, you can consider a code snippet like the one below:
public void UpdateEmployee(Employee updatedEmployee)
{
    // attaching and making ready for persistence
    if (updatedEmployee.EntityState == EntityState.Detached)
        _DatabaseContext.Employees.Attach(updatedEmployee);
    _DatabaseContext.ObjectStateManager.ChangeObjectState(updatedEmployee, System.Data.EntityState.Modified);
    _DatabaseContext.SaveChanges();
}
Related
I'm creating a change log for my records, and change tracking only picks up the main class's records.
I've already added lazy loading, and the child classes are being loaded inside my EntityEntry, but I can't capture them through entry.Properties.
My class:
private void OnBeforeSaveChanges()
{
    ChangeTracker.DetectChanges();
    var auditEntries = new List<AuditEntry>();
    foreach (var entry in ChangeTracker.Entries())
    {
        if (entry.Entity is AuditLog || entry.State == EntityState.Detached || entry.State == EntityState.Unchanged)
            continue;
        var auditEntry = new AuditEntry(entry)
        {
            TableName = entry.Entity.GetType().Name,
            UserId = _aplicationUser.GetId
        };
        auditEntries.Add(auditEntry);
        foreach (var property in entry.Properties)
        {
            string propertyName = property.Metadata.Name;
            if (property.Metadata.IsPrimaryKey())
            {
                auditEntry.KeyValues[propertyName] = property.CurrentValue;
                continue;
            }
            switch (entry.State)
            {
                case EntityState.Added:
                    auditEntry.AuditType = AuditType.Create;
                    auditEntry.NewValues[propertyName] = property.CurrentValue;
                    break;
                case EntityState.Deleted:
                    auditEntry.AuditType = AuditType.Delete;
                    auditEntry.OldValues[propertyName] = property.OriginalValue;
                    break;
                case EntityState.Modified:
                    if (property.IsModified)
                    {
                        auditEntry.ChangedColumns.Add(propertyName);
                        auditEntry.AuditType = AuditType.Update;
                        auditEntry.OldValues[propertyName] = property.OriginalValue;
                        auditEntry.NewValues[propertyName] = property.CurrentValue;
                    }
                    break;
            }
        }
    }
    foreach (var auditEntry in auditEntries)
    {
        var auditDto = auditEntry.ToAudit();
        AuditLogs.Add(new AuditLog(auditDto.Id, auditDto.UserId, auditDto.Type, auditDto.TableName, auditDto.DateTime,
            auditDto.OldValues, auditDto.NewValues, auditDto.AffectedColumns, auditDto.PrimaryKey));
    }
}
The change tracker reports on a per-entity basis. If you have a Parent entity with a Child collection beneath it and you modify the parent and one or more children, the parent shows up in the change tracker only because of changes to the parent's own properties; its entry does not list the properties of related/child entities. Any modified children appear in the change tracker as separate Child entries.
For example:
var parent = context.Parents.Include(x => x.Children).Single(x => x.Id == parentId);
var oldestChild = parent.Children.OrderByDescending(x => x.Age).FirstOrDefault();
parent.SomeValue = newValue;
if (oldestChild != null)
    oldestChild.OtherValue = someOtherValue;
context.SaveChanges();
In this case the change tracker will include Parent if, and only if, newValue differs from the existing SomeValue. It will also include the Child denoted by oldestChild if such a child actually exists and someOtherValue differs from that child's OtherValue. If the child's value was the only one that actually changed, the change tracker will not include that child's Parent, only the Child reference. This means you cannot easily capture a "complete" before-and-after snapshot of an entire object graph (the Parent and all of its children before and after, or including the parent when one of its children happens to change). You can do it, but it will involve piecing the deltas together from what's in the change tracker.
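For example, a minimal sketch of piecing a child delta back to its parent (a Child entity with a ParentId foreign key is assumed here; adapt the names to your model):
foreach (var childEntry in ChangeTracker.Entries<Child>())
{
    if (childEntry.State != EntityState.Modified)
        continue;

    // The child's FK tells you which parent this delta belongs to,
    // even when the parent itself has no entry in the change tracker.
    var parentId = childEntry.Property(c => c.ParentId).CurrentValue;

    foreach (var property in childEntry.Properties.Where(p => p.IsModified))
    {
        // record property.OriginalValue / property.CurrentValue
        // under the audit record keyed by parentId
    }
}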
When it comes to writing the audit records themselves, I would suggest using a bounded DbContext solely for that purpose, one that only manages the audit log entities and keeps the entity structure as simple as possible. That way the main app context can intercept and inspect changes without worrying about audit changes, and the audit DbContext avoids that overhead and just performs the raw writes.
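A minimal sketch of such a bounded context (the shape is assumed; only the audit entity is mapped):
public class AuditContext : DbContext
{
    public AuditContext(DbContextOptions<AuditContext> options) : base(options)
    {
        // Append-only writes: change tracking buys nothing here.
        ChangeTracker.AutoDetectChangesEnabled = false;
    }

    public DbSet<AuditLog> AuditLogs => Set<AuditLog>();
}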
Please help me convert my after trigger to Batch Apex.
This trigger fires when an opportunity's stage changes to Closed Won.
It runs through the line items and checks whether a forecast (a custom object) exists for that account. If yes, it links to it; if not, it creates a new forecast.
My trigger works fine for some records, but on mass updates I get a timeout error, so I'm opting for Batch Apex. I've never written one before, so please help.
trigger Accountforecast on Opportunity (after insert, after update) {
    List<Acc__c> accproductList = new List<Acc__c>();
    List<OpportunityLineItem> opplinitemlist = new List<OpportunityLineItem>();
    List<OpportunityLineItem> oppdate = new List<OpportunityLineItem>();
    List<Acc__c> accquery = new List<Acc__c>();
    List<Date> dt = new List<Date>();
    Set<Id> sProductIds = new Set<Id>();
    Set<Id> sAccountIds = new Set<Id>();
    Set<Id> saccprodfcstids = new Set<Id>();
    Acc__c accpro = new Acc__c();
    String aname;
    Integer i;
    Integer myIntMonth;
    Integer myIntYear;
    Integer myIntDate;

    opplinitemlist = [SELECT Id, PricebookEntry.Product2.Name, opp_account__c, Opp_account_name__c,
                      PricebookEntry.Product2.Id, Quantity, ServiceDate, Acc_Product_Fcst__c
                      FROM OpportunityLineItem
                      WHERE OpportunityId IN :Trigger.newMap.keySet() AND Acc__c = ''];
    for (OpportunityLineItem oli : opplinitemlist) {
        sProductIds.add(oli.PricebookEntry.Product2.Id);
        sAccountIds.add(oli.opp_account__c);
    }
    accquery = [SELECT Id, Total_Qty_Ordered__c, Last_Order_Qty__c, Last_Order_Date__c, Fcst_Days_Period__c
                FROM Acc__c
                WHERE Acc__c.product__c IN :sProductIds AND Acc__c.Account__c IN :sAccountIds];
    for (Acc__c apf1 : accquery) {
        saccprodfcstids.add(apf1.Id);
    }
    if (!saccprodfcstids.isEmpty()) {
        oppdate = [SELECT ServiceDate FROM OpportunityLineItem WHERE Acc__c IN :saccprodfcstids];
        i = [SELECT COUNT() FROM OpportunityLineItem WHERE Acc_Product_Fcst__c IN :saccprodfcstids];
    }
    for (Opportunity opp : Trigger.new) {
        if (opp.StageName == 'Closed Won') {
            for (OpportunityLineItem opplist : opplinitemlist) {
                if (!accquery.isEmpty()) {
                    for (OpportunityLineItem opldt : oppdate) {
                        String myDate = String.valueOf(opldt);
                        myDate = myDate.substring(myDate.indexOf('ServiceDate=') + 12);
                        myDate = myDate.substring(0, 10);
                        String[] strDate = myDate.split('-');
                        myIntMonth = Integer.valueOf(strDate[1]);
                        myIntYear = Integer.valueOf(strDate[0]);
                        myIntDate = Integer.valueOf(strDate[2]);
                        Date d = Date.newInstance(myIntYear, myIntMonth, myIntDate);
                        dt.add(d);
                    }
                    dt.add(opp.CloseDate);
                    dt.sort();
                    Integer TDays = 0;
                    System.debug('*************dt:' + dt.size());
                    for (Integer c = 0; c < dt.size() - 1; c++) {
                        TDays = TDays + dt[c].daysBetween(dt[c + 1]);
                    }
                    for (Acc__c apf : accquery) {
                        apf.Fcst_Days_Period__c = TDays / i;
                        apf.Total_Qty_Ordered__c = apf.Total_Qty_Ordered__c + opplist.Quantity;
                        apf.Last_Order_Qty__c = opplist.Quantity;
                        apf.Last_Order_Date__c = opp.CloseDate;
                        apf.Fcst_Qty_Avg__c = apf.Total_Qty_Ordered__c / (i + 1);
                        opplist.Acc__c = apf.Id;
                    }
                }
                else {
                    accpro.Account__c = opplist.opp_account__c;
                    accpro.product__c = opplist.PricebookEntry.Product2.Id;
                    accpro.opplineitemid__c = opplist.Id;
                    accpro.Total_Qty_Ordered__c = opplist.Quantity;
                    accpro.Last_Order_Qty__c = opplist.Quantity;
                    accpro.Last_Order_Date__c = opp.CloseDate;
                    accpro.Fcst_Qty_Avg__c = opplist.Quantity;
                    accpro.Fcst_Days_Period__c = 7;
                    accproductList.add(accpro);
                }
            }
        }
    }
    if (!accproductList.isEmpty()) {
        insert accproductList;
    }
    update opplinitemlist;
    update accquery;
}
First of all, you should take a look at this: Apex Batch Processing
Once you get a better idea on how batches work, we need to take into account the following points:
Identify the object that requires the most processing. Account? Opportunity?
Should data be maintained across batch calls (i.e., does the batch need to be stateful)?
Use the correct data structures in terms of performance: Map or List?
From your code, we can see you have three objects: OpportunityLineItem, Account, and Opportunity. It seems your Account object is doing the most processing here.
It seems you're just keeping track of dates and not doing any aggregations, so you don't need to maintain state across batch calls.
Your code has the potential to hit governor limits, especially heap memory limits: you have four nested loops. Our suggestion would be to keep the opportunity line items related to each Opportunity in a Map rather than a List. Plus, we can get rid of the unnecessary for loops by refactoring the code as follows:
Note: This is just a template for the batch you will need to construct.
global class AccountforecastBatch implements Database.Batchable<sObject>
{
    global Database.QueryLocator start(Database.BatchableContext BC)
    {
        // 1. Do some initialization here, e.g.:
        //    for (OpportunityLineItem oli : opplinitemlist) { sProductIds.add(oli.PricebookEntry.Product2.Id); ... }
        // 2. Return the query locator here, e.g.:
        //    return Database.getQueryLocator([SELECT Id, Total_Qty_Ordered__c, Last_Order_Qty__c, ... FROM ...]);
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope)
    {
        // 1. Traverse your scope, which at this point will be a list of Accounts.
        // 2. You're adding dates inside the Opportunity Line Item processing. See if you can isolate
        //    that work outside the for loops with a Map data structure.
        // 3. You have three potential database transactions here (insert accproductlist; update opplinitemlist;
        //    update accquery;). Ideally, you will only need one DML operation per batch. If you can complete
        //    step 2 above, you might only need to update your opportunity line items. Otherwise, you're trying
        //    to do more than one thing in a method and will need to redesign your solution.
    }

    global void finish(Database.BatchableContext BC)
    {
        // send email or do other wrap-up tasks here
    }
}
I have a form that allows the user to make changes to a widget, then enter a list of additional widgets that the same changes will be applied to. Using Entity Framework, I have the following working but it's slow and doesn't seem very efficient:
// objectToSave: widgets - array of widgets to save changes to
//               pcram   - changes to apply to each widget
public HttpResponseMessage PutPcram(ObjectToSave objectToSave)
{
    if (!ModelState.IsValid)
    {
        return Request.CreateErrorResponse(HttpStatusCode.BadRequest, ModelState);
    }
    // loop through the widgets we want to save changes to
    for (var i = 0; i < objectToSave.widgets.Length; i++)
    {
        var e = db.PcramChanges.Find(objectToSave.widgets[i]);
        var excluded = new[] { "widgetID" };
        var x = db.Entry(e);
        foreach (var name in x.CurrentValues.PropertyNames.Except(excluded))
        {
            x.Property(name).IsModified = true;
            db.Entry(e).Property(name).CurrentValue = db.Entry(objectToSave.pcram).Property(name).CurrentValue;
        }
    }
    try
    {
        db.SaveChanges();
    }
    catch (DbUpdateConcurrencyException ex)
    {
        return Request.CreateErrorResponse(HttpStatusCode.NotFound, ex);
    }
    return Request.CreateResponse(HttpStatusCode.OK);
}
I basically want to apply the changes made to the selected widget to multiple widgets, and this was the only way I could come up with to exclude "widgetID", which is the primary key. Any suggestions on how to improve this?
I know I'm late, but what about ForEach?
objectToSave.widgets.ToList().ForEach(o => { o.Prop1 = Val1; o.Prop2 = Val2; });
db.SaveChanges();
Entity Framework (at least up to EF 5.0; the situation may have changed in EF 6.0) cannot perform batch updates: it generates a single query for each row update. I can suggest taking a look at the EntityFramework.Extended library, which has extension methods that allow batch updates and batch deletes.
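For instance, a hedged sketch of what that looks like against your model (the entity/property names and the widgetIds list are assumptions; the Update extension comes from the EntityFramework.Extended package):
using EntityFramework.Extensions; // from the EntityFramework.Extended package

// One UPDATE statement for all matching rows, no entities materialized.
db.PcramChanges
  .Where(w => widgetIds.Contains(w.widgetID))                   // widgetIds: keys posted by the client
  .Update(w => new PcramChange { Prop1 = val1, Prop2 = val2 }); // assumed entity/property names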
EF's change-tracking engine is slow, and it fires on every property change.
If that is the cause in your case, try wrapping the loop like this:
db.Configuration.AutoDetectChangesEnabled = false;
for (int i = 0; i < objectToSave.widgets.Length; i++)
{
    // here is the logic
}
db.ChangeTracker.DetectChanges(); // run detection once, after the loop
db.Configuration.AutoDetectChangesEnabled = true;
You can try performing the operations on several threads; that can be much faster. Grab one group of widgets and make the changes on one thread, another group on another thread, and so on. The smaller the group (and the more threads), the faster it will probably be, up to the point where your machine says otherwise. See here for one way to do it.
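A hedged sketch of that idea; note that a DbContext is not thread-safe, so each thread gets its own (MyDbContext and the change-applying logic are assumptions):
// Partition the widget keys into one group per worker.
var groups = objectToSave.widgets
    .Select((id, index) => new { id, index })
    .GroupBy(x => x.index % Environment.ProcessorCount, x => x.id);

Parallel.ForEach(groups, group =>
{
    using (var ctx = new MyDbContext()) // hypothetical context type; one per thread
    {
        foreach (var id in group)
        {
            var entity = ctx.PcramChanges.Find(id);
            // apply the same property copying as in the single-threaded version
        }
        ctx.SaveChanges(); // each thread commits its own group
    }
});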
Ok, I must be working too hard because I can't get my head around what it takes to use the Entity Framework correctly.
Here is what I am trying to do:
I have two tables: HeaderTable and DetailTable. DetailTable has 1-to-many records for each row in HeaderTable. In my EDM I set up a relationship between these two tables to reflect this.
Since there is now a relationship set up between the tables, I thought that by querying all the records in HeaderTable I would be able to access the DetailTable collection created by the EDM (I can see the property when querying, but it's null).
Here is my query (this is a Silverlight app, so I am using the DomainContext on the client):
// myContext is instantiated with class scope
EntityQuery<Project> query = _myContext.GetHeadersQuery();
_myContext.Load<Project>(query);
Since these calls are asynchronous, I check the values after the callback has completed. When checking _myContext.HeaderTable, I have all the rows expected. However, the DetailTable property within each HeaderTable row is empty.
foreach (var h in _myContext.HeaderTable) // Has records
{
    foreach (var d in h.DetailTable) // No records
    {
        string test = d.Description;
    }
}
I'm assuming my query returning all HeaderTable objects needs to be modified to somehow return the Detail collections for each HeaderTable row as well. I just don't understand how this modeling stuff works yet.
What am I doing wrong? Any help is greatly appreciated. If you need more information, just let me know; I will be happy to provide anything you need.
Thanks,
-Scott
What you're probably missing is the Include(), which I think is outside the scope of the code you provided.
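Since you're going through a DomainContext, the eager loading has to happen on the server side. A hedged sketch of what the domain service query might look like (the entity and association names are assumed from your description; in WCF RIA Services the association also needs an [Include] attribute in the entity metadata for the details to be serialized to the client):
// Hypothetical domain service method on the server.
public IQueryable<HeaderTable> GetHeaders()
{
    // Eagerly load the child rows so they travel with the headers.
    return this.ObjectContext.HeaderTables.Include("DetailTable");
}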
Check out this cool video; it explained everything about EDM and Linq-to-Entities to me:
http://msdn.microsoft.com/en-us/data/ff628210.aspx
In case you can't view the video right now, check out this piece of code I based on those videos (sorry it's not Silverlight, but it's the same basic idea, I hope).
The retrieval:
public List<Story> GetAllStories()
{
    return context.Stories.Include("User").Include("StoryComments").Where(s => s.HostID == CurrentHost.ID).ToList();
}
Loading the data:
private void LoadAllStories()
{
    lvwStories.DataSource = TEContext.GetAllStories();
    lvwStories.DataBind();
}
Using the data:
protected void lvwStories_ItemDataBound(object sender, ListViewItemEventArgs e)
{
    if (e.Item.ItemType == ListViewItemType.DataItem)
    {
        Story story = e.Item.DataItem as Story;
        // blah blah blah....
        hlStory.Text = story.Title;
        hlStory.NavigateUrl = "StoryView.aspx?id=" + story.ID;
        lblStoryCommentCount.Text = "(" + story.StoryComments.Count.ToString() + " comment" + (story.StoryComments.Count > 1 ? "s" : "") + ")";
        lblStoryBody.Text = story.Body;
        lblStoryUser.Text = story.User.Username;
        lblStoryDTS.Text = story.AddedDTS.ToShortTimeString();
    }
}
I'm new to the Entity Framework and I'm really confused about how SaveChanges works. There's probably a lot of code in my example that could be improved, but here's the problem I'm having.
The user enters a bunch of picks. I make sure the user hasn't already entered those picks.
Then I add the picks to the database.
var db = new myModel();
var predictionArray = ticker.Substring(1).Split(','); // Get rid of the initial comma.
var user = Membership.GetUser();
var userId = Convert.ToInt32(user.ProviderUserKey);

// Get the member with all his predictions for today.
var memberQuery = (from member in db.Members
                   where member.user_id == userId
                   select new
                   {
                       member,
                       predictions = from p in member.Predictions
                                     where p.start_date == null
                                     select p
                   }).First();

// Load all the company ids.
foreach (var prediction in memberQuery.predictions)
{
    prediction.CompanyReference.Load();
}

var picks = from prediction in predictionArray
            let data = prediction.Split(':')
            let companyTicker = data[0]
            where !(from i in memberQuery.predictions
                    select i.Company.ticker).Contains(companyTicker)
            select new Prediction
            {
                Member = memberQuery.member,
                Company = db.Companies.Where(c => c.ticker == companyTicker).First(),
                is_up = data[1] == "up", // This turns up and down into true and false.
            };

// Save the records to the database.
// HERE'S THE PART I DON'T UNDERSTAND.
// This saves the records, even though I don't have db.AddToPredictions(pick)
foreach (var pick in picks)
{
    db.SaveChanges();
}
// This does not save records when the db.SaveChanges is outside of a loop over picks.
db.SaveChanges();
foreach (var pick in picks)
{
}

// This saves records, but it will insert all the picks exactly once no matter how many picks you have.
// The fact that you're skipping a pick makes no difference in what gets inserted.
var counter = 1;
foreach (var pick in picks)
{
    if (counter == 2)
    {
        db.SaveChanges();
    }
    counter++;
}
I've tested it, and the SaveChanges doesn't even have to be in the loop.
The code below works, too.
foreach (var pick in picks)
{
    break;
}
db.SaveChanges();
There's obviously something going on with the context that I don't understand. I'm guessing I've somehow loaded my new picks as pending changes, but even if that's true, I don't understand why I have to loop over them to save the changes.
Can someone explain this to me?
Here's updated working code based on Craig's responses:
1) Remove the type, then loop over the results and populate new objects.
var picks = (from prediction in predictionArray
             let data = prediction.Split(':')
             let companyTicker = data[0]
             where !(from i in memberQuery.predictions
                     select i.Company.ticker).Contains(companyTicker)
             select new // NO TYPE HERE
             {
                 Member = memberQuery.member,
                 Company = db.Companies.Where(c => c.ticker == companyTicker).First(),
                 is_up = data[1] == "up", // This turns up and down into true and false.
             }).ToList();

foreach (var prediction in picks)
{
    if (includePrediction)
    {
        var p = new Prediction
        {
            Member = prediction.Member,
            Company = prediction.Company,
            is_up = prediction.is_up
        };
        db.AddToPredictions(p);
    }
}
2) Or if I don't want the predictions to be saved, I can detach the predictions.
foreach (var prediction in picks)
{
    if (excludePrediction)
    {
        db.Detach(prediction);
    }
}
The reason is here:
select new Prediction
{
    Member = memberQuery.member,
These lines will (once the IEnumerable is iterated; LINQ is lazy):
Instantiate a new Prediction.
Associate that Prediction with an existing Member, which is attached to db.
Associating an instance of an entity with an attached entity automatically adds the new instance to the context of the attached entity it is associated with.
So as soon as you start iterating over picks, the code above executes and you have new entities in your context.
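A stripped-down sketch of the gotcha (names taken from the question; the point is that the selector runs during iteration, not when the query is defined):
// 'picks' is a lazy IEnumerable<Prediction>; nothing has run yet.
var picks = from prediction in predictionArray
            select new Prediction { Member = memberQuery.member /* attached */ };

foreach (var pick in picks) { } // iterating executes the selector; each new
                                // Prediction is attached through its Member

db.SaveChanges(); // so all of them are INSERTed here, AddToPredictions or not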