Entity Framework Update Multiple Rows - entity-framework

I have a form that allows the user to make changes to a widget, then enter a list of additional widgets that the same changes will be applied to. Using Entity Framework, I have the following working but it's slow and doesn't seem very efficient:
// objectToSave: widgets - array of widgets to save changes to
//               pcram   - changes to apply to each widget
public HttpResponseMessage PutPcram(ObjectToSave objectToSave)
{
    if (!ModelState.IsValid)
    {
        return Request.CreateErrorResponse(HttpStatusCode.BadRequest, ModelState);
    }
    // loop through the widgets we want to save changes to
    for (var i = 0; i < objectToSave.widgets.Length; i++)
    {
        var e = db.PcramChanges.Find(objectToSave.widgets[i]);
        var excluded = new[] { "widgetID" };
        var x = db.Entry(e);
        foreach (var name in x.CurrentValues.PropertyNames.Except(excluded))
        {
            x.Property(name).IsModified = true;
            db.Entry(e).Property(name).CurrentValue = db.Entry(objectToSave.pcram).Property(name).CurrentValue;
        }
    }
    try
    {
        db.SaveChanges();
    }
    catch (DbUpdateConcurrencyException ex)
    {
        return Request.CreateErrorResponse(HttpStatusCode.NotFound, ex);
    }
    return Request.CreateResponse(HttpStatusCode.OK);
}
I basically want to apply the changes made to the selected widget to multiple other widgets, and this was the only way I could come up with to exclude the "widgetID", which is the primary key. Any suggestions on how to improve this?

I know I'm late, but what about ForEach?
objectToSave.widgets.ToList().ForEach(o => { o.Prop1 = Val1; o.Prop2 = Val2; });
db.SaveChanges();
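Note that ForEach is defined on List&lt;T&gt;; since widgets is an array in the question, Array.ForEach works without the ToList() copy (Prop1/Prop2/Val1/Val2 are placeholders):
Array.ForEach(objectToSave.widgets, o => { o.Prop1 = Val1; o.Prop2 = Val2; });
db.SaveChanges();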

Entity Framework (at least up to EF 5.0; this may have changed in EF 6.0) cannot perform batch updates: it generates a separate query for each row update. I suggest taking a look at the EntityFramework.Extended library, which provides extension methods for batch update and batch delete.
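For illustration, a batch update with that library might look roughly like this (a sketch; PcramChange, widgetIds, SomeColumn, and newValue are assumed names, not from the question):
using EntityFramework.Extensions; // from the EntityFramework.Extended NuGet package

// Issues a single UPDATE statement for all matching rows,
// without loading the entities into the context first.
db.PcramChanges
  .Where(w => widgetIds.Contains(w.widgetID))   // widgetIds: the ids posted by the client (assumed)
  .Update(w => new PcramChange                  // PcramChange: assumed name of the entity behind db.PcramChanges
  {
      SomeColumn = newValue                     // the columns to change have to be listed explicitly
  });
Note that the update expression has to be known at compile time, so this suits a fixed set of columns rather than the dynamic property copy in the question.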

EF's change-tracking engine is fairly slow, and it is fired on every property change.
If this is the cause in your case, try wrapping the loop like this:
db.Configuration.AutoDetectChangesEnabled = false;
for (int i = 0; i < objectToSave.widgets.Length; i++)
{
    // here is the logic
}
db.ChangeTracker.DetectChanges();
db.Configuration.AutoDetectChangesEnabled = true;

You can try splitting the work across several threads, which can be considerably faster. Grab one group of widgets and apply the changes in one thread, another group in another thread, and so on. Smaller groups (more threads) tend to be faster, up to the point where your machine says otherwise. Keep in mind that a DbContext instance is not thread-safe, so each thread needs its own context. See here, it's one way to do it.
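A minimal sketch of that idea (WidgetContext, ApplyChanges, and the group size of 50 are placeholders, not names from the question; each thread gets its own context because DbContext is not thread-safe):
// Split the widget ids into groups and update each group on its own thread.
var groups = objectToSave.widgets
    .Select((id, index) => new { id, index })
    .GroupBy(x => x.index / 50, x => x.id);      // groups of ~50 widgets

Parallel.ForEach(groups, group =>
{
    using (var ctx = new WidgetContext())        // hypothetical context type; one instance per thread
    {
        foreach (var id in group)
        {
            var entity = ctx.PcramChanges.Find(id);
            ApplyChanges(entity, objectToSave.pcram); // placeholder for the property-copy logic above
        }
        ctx.SaveChanges();
    }
});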

Related

Entity Framework is too slow during mapping data up to 100k

I have at least 100,000 rows in a Job_Details table, and I'm using Entity Framework to map the data.
This is the code:
public GetJobsResponse GetImportJobs()
{
    GetJobsResponse getJobResponse = new GetJobsResponse();
    List<JobBO> lstJobs = new List<JobBO>();
    using (NSEXIM_V2Entities dbContext = new NSEXIM_V2Entities())
    {
        var lstJob = dbContext.Job_Details.ToList();
        foreach (var dbJob in lstJob.Where(ie => ie.IMP_EXP == "I" && ie.Job_No != null))
        {
            JobBO job = MapBEJobforSearchObj(dbJob);
            lstJobs.Add(job);
        }
    }
    getJobResponse.Jobs = lstJobs;
    return getJobResponse;
}
I found that this line takes about 2-3 minutes to execute:
var lstJob = dbContext.Job_Details.ToList();
How can I solve this issue?
To outline the performance issues with your example: (see inline comments)
public GetJobsResponse GetImportJobs()
{
    GetJobsResponse getJobResponse = new GetJobsResponse();
    List<JobBO> lstJobs = new List<JobBO>();
    using (NSEXIM_V2Entities dbContext = new NSEXIM_V2Entities())
    {
        // Loads *ALL* entities into memory. This effectively takes all fields for all rows
        // across from the database to your app server, even though you don't want it all.
        var lstJob = dbContext.Job_Details.ToList();
        // Filters from the data in memory.
        foreach (var dbJob in lstJob.Where(ie => ie.IMP_EXP == "I" && ie.Job_No != null))
        {
            // Maps the entity to a DTO and adds it to the return collection.
            JobBO job = MapBEJobforSearchObj(dbJob);
            lstJobs.Add(job);
        }
    }
    // Returns the DTOs.
    getJobResponse.Jobs = lstJobs;
    return getJobResponse;
}
First, pass your Where clause to EF so it is executed on the DB server, rather than loading all entities into memory:
public GetJobsResponse GetImportJobs()
{
    GetJobsResponse getJobResponse = new GetJobsResponse();
    using (NSEXIM_V2Entities dbContext = new NSEXIM_V2Entities())
    {
        // Passes the Where expression to the DB server to be executed.
        // Note: no .ToList() yet, so this stays an IQueryable.
        var jobs = dbContext.Job_Details.Where(ie => ie.IMP_EXP == "I" && ie.Job_No != null);
Next, use a Select projection to load your DTOs. Typically these won't contain as much data as the full entity, and as long as you're working with IQueryable you can load related data as needed. Again, this will be sent to the DB server, so you cannot use functions like MapBEJobforSearchObj here, because the DB server does not know that function. You can Select into a simple DTO object, or into an anonymous type to pass to a dynamic mapper.
        var dtos = jobs.Select(ie => new JobBO
        {
            JobId = ie.JobId,
            // ... populate remaining DTO fields here.
        }).ToList();
        getJobResponse.Jobs = dtos;
        return getJobResponse;
    }
}
Moving the .ToList() to the end will materialize the data into your JobBO DTOs/ViewModels, pulling just enough data from the server to populate the desired rows with the desired fields.
In cases where you may have a large amount of data, you should also consider supporting server-side pagination where you pass a page # and page size, then utilize a .Skip() + .Take() to load a single page of entries at a time.
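For example, a rough sketch of such a page query, building on the IQueryable above (page and pageSize are hypothetical parameters; LINQ to Entities requires an explicit OrderBy before Skip/Take):
var dtos = jobs
    .OrderBy(ie => ie.JobId)            // Skip/Take needs a stable ordering
    .Skip((page - 1) * pageSize)        // page is 1-based
    .Take(pageSize)
    .Select(ie => new JobBO
    {
        JobId = ie.JobId,
        // ... remaining DTO fields
    })
    .ToList();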

Manatee.Trello Moving Cards

I'm writing a small application to manage Trello Boards in only a few aspects such as sorting Cards on a List, moving/copying Cards based on Due Date and/or Labels, archiving Lists on a regular basis and generating reports based on Labels, etc. As such, I've been putting together a facade around the Manatee.Trello library to simplify the interface for my services.
I've been getting comfortable with the library and things have been relatively smooth. However, I wrote an extension method on the Card class to move Cards within or between Lists, and another method that calls this extension method repeatedly to move all Cards from one List to another.
My issue is that when running the code on a couple of dummy lists with 7 cards in one, it completes without error, but at least one card doesn't actually get moved (though as many as 3 cards have failed to move). I can't tell if this is because I'm moving things too rapidly, or if I need to adjust the TrelloConfiguration.ChangeSubmissionTime, or what. I've tried playing around with delays but it doesn't help.
Here is my calling code:
public void MoveCardsBetweenLists(
    string originListName,
    string destinationListName,
    string originBoardName,
    string destinationBoardName = null)
{
    var fromBoard = GetBoard(originBoardName); // returns a Manatee.Trello.Board
    var toBoard = destinationBoardName == null
        || destinationBoardName.Equals(originBoardName, StringComparison.OrdinalIgnoreCase)
        ? fromBoard
        : GetBoard(destinationBoardName);
    var fromList = GetListFromBoard(originListName, fromBoard); // returns a Manatee.Trello.List from the specified Board
    var toList = GetListFromBoard(destinationListName, toBoard);
    for (int i = 0; i < fromList.Cards.Count(); i++)
    {
        fromList.Cards[i].Move(1, toList);
    }
}
Here is my extension method on Manatee.Trello.Card:
public static void Move(this Card card, int position, List list = null)
{
    if (list != null && list != card.List)
    {
        card.List = list;
    }
    card.Position = position;
}
I've created a test that replicates the functionality you want. Basically, I create 7 cards on my board, move them to another list, then delete them (just to maintain initial state).
private static void Run(System.Action action)
{
    var serializer = new ManateeSerializer();
    TrelloConfiguration.Serializer = serializer;
    TrelloConfiguration.Deserializer = serializer;
    TrelloConfiguration.JsonFactory = new ManateeFactory();
    TrelloConfiguration.RestClientProvider = new WebApiClientProvider();
    TrelloAuthorization.Default.AppKey = TrelloIds.AppKey;
    TrelloAuthorization.Default.UserToken = TrelloIds.UserToken;
    action();
    TrelloProcessor.Flush();
}

#region http://stackoverflow.com/q/39926431/878701

private static void Move(Card card, int position, List list = null)
{
    if (list != null && list != card.List)
    {
        card.List = list;
    }
    card.Position = position;
}

[TestMethod]
public void MovingCards()
{
    Run(() =>
    {
        var list = new List(TrelloIds.ListId);
        var cards = new List<Card>();
        for (int i = 0; i < 10; i++)
        {
            cards.Add(list.Cards.Add("test card " + i));
        }
        var otherList = list.Board.Lists.Last();
        for (var i = 0; i < cards.Count; i++)
        {
            Move(cards[i], i, otherList);
        }
        foreach (var card in cards)
        {
            card.Delete();
        }
    });
}
#endregion
Quick question: Are you calling TrelloProcessor.Flush() before your execution ends? If you don't, then some changes will likely remain in the request processor queue when the application ends, so they'll never be sent. See my wiki page on processing requests for more information.
Also, I've noticed that you're using 1 as the position for each move. Doing that will give you an unreliable ordering. The position data that Trello uses is floating point; to position a card between two other cards, it simply takes the average of those cards' positions. In your case (if the destination list is empty), I'd suggest passing in the loop index for the ordering. If the destination list isn't empty, you'll need to calculate a new position based on the other cards in the list (using the same averaging method Trello uses).
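Applied to the calling loop in the question, that could look roughly like this (a sketch assuming the destination list starts out empty; snapshotting the cards also avoids mutating fromList.Cards while it is being enumerated):
var cardsToMove = fromList.Cards.ToList(); // snapshot so moving cards doesn't disturb the enumeration
for (var i = 0; i < cardsToMove.Count; i++)
{
    cardsToMove[i].Move(i + 1, toList);    // use the index so cards keep their relative order
}
TrelloProcessor.Flush(); // make sure queued changes are sent before the application ends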
Finally, I like the extension code you have. If you have ideas that you think would be useful to add to the library, please feel free to fork the GitHub repo and create a pull request.

Please help me convert a trigger to batch Apex

Please help me convert my after trigger to batch Apex.
This trigger fires when an opportunity's stage changes to Won.
It runs through the line items and checks whether a forecast (custom object) exists for that account. If yes, it links to them; if no, it creates a new forecast.
My trigger works fine for some records, but on mass updates I get a timed-out error, so I'm opting for batch Apex, which I've never written. Please help me.
trigger Accountforecast on Opportunity (after insert, after update) {
    List<Acc__c> AccproductList = new List<Acc__c>();
    List<OpportunityLineItem> opplinitemlist = new List<OpportunityLineItem>();
    List<OpportunityLineItem> oppdate = new List<OpportunityLineItem>();
    List<Acc__c> accquery = new List<Acc__c>();
    List<Date> dt = new List<Date>();
    Set<Id> sProductIds = new Set<Id>();
    Set<Id> sAccountIds = new Set<Id>();
    Set<Id> saccprodfcstids = new Set<Id>();
    Acc__c accpro = new Acc__c();
    String aname;
    Integer i;
    Integer myIntMonth;
    Integer myIntYear;
    Integer myIntDate;

    opplinitemlist = [SELECT Id, PricebookEntry.Product2.Name, opp_account__c, Opp_account_name__c,
                             PricebookEntry.Product2.Id, Quantity, ServiceDate, Acc_Product_Fcst__c
                      FROM OpportunityLineItem
                      WHERE OpportunityId IN :Trigger.newMap.keySet() AND Acc__c = ''];
    for (OpportunityLineItem oli : opplinitemlist) {
        sProductIds.add(oli.PricebookEntry.Product2.Id);
        sAccountIds.add(oli.opp_account__c);
    }
    accquery = [SELECT Id, Total_Qty_Ordered__c, Last_Order_Qty__c, Last_Order_Date__c, Fcst_Days_Period__c
                FROM Acc__c
                WHERE Acc__c.product__c IN :sProductIds AND Acc__c.Account__c IN :sAccountIds];
    for (Acc__c apf1 : accquery) {
        saccprodfcstids.add(apf1.Id);
    }
    if (saccprodfcstids != null) {
        oppdate = [SELECT ServiceDate FROM OpportunityLineItem WHERE Acc__c IN :saccprodfcstids];
        i = [SELECT count() FROM OpportunityLineItem WHERE Acc_Product_Fcst__c IN :saccprodfcstids];
    }
    for (Opportunity opp : Trigger.new) {
        if (opp.StageName == 'Closed Won') {
            for (OpportunityLineItem opplist : opplinitemlist) {
                if (!accquery.isEmpty()) {
                    for (OpportunityLineItem opldt : oppdate) {
                        String myDate = String.valueOf(opldt);
                        myDate = myDate.substring(myDate.indexOf('ServiceDate=') + 12);
                        myDate = myDate.substring(0, 10);
                        String[] strDate = myDate.split('-');
                        myIntMonth = Integer.valueOf(strDate[1]);
                        myIntYear = Integer.valueOf(strDate[0]);
                        myIntDate = Integer.valueOf(strDate[2]);
                        Date d = Date.newInstance(myIntYear, myIntMonth, myIntDate);
                        dt.add(d);
                    }
                    dt.add(opp.CloseDate);
                    dt.sort();
                    Integer TDays = 0;
                    System.debug('*************dt:' + dt.size());
                    for (Integer c = 0; c < dt.size() - 1; c++) {
                        TDays = TDays + dt[c].daysBetween(dt[c + 1]);
                    }
                    for (Acc_product_fcst__c apf : accquery) {
                        apf.Fcst_Days_Period__c = TDays / i;
                        apf.Total_Qty_Ordered__c = apf.Total_Qty_Ordered__c + opplist.Quantity;
                        apf.Last_Order_Qty__c = opplist.Quantity;
                        apf.Last_Order_Date__c = opp.CloseDate;
                        apf.Fcst_Qty_Avg__c = apf.Total_Qty_Ordered__c / (i + 1);
                        opplist.Acc__c = apf.Id;
                    }
                } else {
                    accpro.Account__c = opplist.opp_account__c;
                    accpro.product__c = opplist.PricebookEntry.Product2.Id;
                    accpro.opplineitemid__c = opplist.Id;
                    accpro.Total_Qty_Ordered__c = opplist.Quantity;
                    accpro.Last_Order_Qty__c = opplist.Quantity;
                    accpro.Last_Order_Date__c = opp.CloseDate;
                    accpro.Fcst_Qty_Avg__c = opplist.Quantity;
                    accpro.Fcst_Days_Period__c = 7;
                    AccproductList.add(accpro);
                }
            }
        }
    }
    if (!AccproductList.isEmpty()) {
        insert AccproductList;
    }
    update opplinitemlist;
    update accquery;
}
First of all, you should take a look at this: Apex Batch Processing
Once you get a better idea on how batches work, we need to take into account the following points:
Identify the object that requires more processing. Account? Opportunity?
Should the data be maintained across batch calls? Stateful?
Use correct data structure in terms of performance. Map, List?
From your code, we can see you have three objects: OpportunityLineItems, Accounts, and Opportunities. It seems that your account object is using the most processing here.
It seems you're just keeping track of dates and not doing any aggregations. Thus, you don't need to maintain state across batch calls.
Your code has the potential of hitting governor limits, especially heap memory limits. You have four nested loops. Our suggestion would be to maintain the opportunity line items related to each Opportunity in a Map rather than in a List. Plus, we can get rid of those unnecessary for loops by refactoring the code as follows:
Note: This is just a template for the batch you will need to construct.
global class AccountforecastBatch implements Database.Batchable<sObject>
{
    global Database.QueryLocator start(Database.BatchableContext BC)
    {
        // 1. Do some initialization here
        //    (i.e. for (OpportunityLineItem oli : opplinitemlist) { sProductIds.add(oli.PricebookEntry.Product2.Id); ... })
        // 2. Return the query here:
        //    return Database.getQueryLocator([select id, Total_Qty_Ordered__c, Last_Order_Qty__c, ...]);
    }

    global void execute(Database.BatchableContext BC, List<sObject> scope)
    {
        // 1. Traverse your scope, which at this point will be a list of Accounts.
        // 2. You're adding dates inside the processing of the Opportunity Line Items. See if you can isolate
        //    that work outside the for loops with a Map data structure.
        // 3. You have 3 potential database transactions here (insert accproductlist; update opplinitemlist;
        //    update accquery;). Ideally, you only need one DML operation per batch. If you can complete
        //    step 2 above, you might only need to update your opportunity line items. Otherwise, you're
        //    trying to do more than one thing in a method and you will need to redesign your solution.
    }

    global void finish(Database.BatchableContext BC)
    {
        // send email or do some other tasks here
    }
}

Entity Framework - Issue returning Relationship Entity

Ok, I must be working too hard because I can't get my head around what it takes to use the Entity Framework correctly.
Here is what I am trying to do:
I have two tables: HeaderTable and DetailTable. The DetailTable will have 1 to Many records for each row in HeaderTable. In my EDM I set up a Relationship between these two tables to reflect this.
Since there is now a relationship set up between these tables, I thought that by querying all the records in HeaderTable, I would be able to access the DetailTable collection created by the EDM (I can see the property when querying, but it's null).
Here is my query (this is a Silverlight app, so I am using the DomainContext on the client):
// myContext is instantiated with class scope
EntityQuery<Project> query = _myContext.GetHeadersQuery();
_myContext.Load<Project>(query);
Since these calls are asynchronous, I check the values after the callback has completed. When checking the value of _myContext.HeaderTable I have all the rows expected. However, the DetailsTable property within _myContext.HeaderTable is empty.
foreach (var h in _myContext.HeaderTable) // Has records
{
    foreach (var d in h.DetailTable) // No records
    {
        string test = d.Description;
    }
}
I'm assuming my query to return all HeaderTable objects needs to be modified to somehow return all the HeaderDetail collections for each HeaderTable row. I just don't understand how this modeling stuff works yet.
What am I doing wrong? Any help is greatly appreciated. If you need more information, just let me know. I will be happy to provide anything you need.
Thanks,
-Scott
What you're probably missing is the Include(), which I think is out of scope of the code you provided.
Check out this cool video; it explained everything about EDM and Linq-to-Entities to me:
http://msdn.microsoft.com/en-us/data/ff628210.aspx
In case you can't view the video now, check out this piece of code I have based on those videos (sorry it's not in Silverlight, but it's the same basic idea, I hope).
The retrieval:
public List<Story> GetAllStories()
{
    return context.Stories.Include("User").Include("StoryComments").Where(s => s.HostID == CurrentHost.ID).ToList();
}
Loading the data:
private void LoadAllStories()
{
    lvwStories.DataSource = TEContext.GetAllStories();
    lvwStories.DataBind();
}
Using the data:
protected void lvwStories_ItemDataBound(object sender, ListViewItemEventArgs e)
{
    if (e.Item.ItemType == ListViewItemType.DataItem)
    {
        Story story = e.Item.DataItem as Story;
        // blah blah blah....
        hlStory.Text = story.Title;
        hlStory.NavigateUrl = "StoryView.aspx?id=" + story.ID;
        lblStoryCommentCount.Text = "(" + story.StoryComments.Count.ToString() + " comment" + (story.StoryComments.Count > 1 ? "s" : "") + ")";
        lblStoryBody.Text = story.Body;
        lblStoryUser.Text = story.User.Username;
        lblStoryDTS.Text = story.AddedDTS.ToShortTimeString();
    }
}
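For the Silverlight/WCF RIA Services setup in the question, the Include has to happen on the server side. A rough sketch under assumptions (the domain service method behind GetHeadersQuery, the HeaderTable/DetailTable type names, and the DetailTables navigation property are guesses based on the question, not known code):
// In the domain service (server side): eager-load the child collection.
public IQueryable<HeaderTable> GetHeaders()
{
    return this.ObjectContext.HeaderTables.Include("DetailTables");
}

// In the metadata ("buddy") class: mark the association so RIA Services sends it to the client.
[MetadataType(typeof(HeaderTable.HeaderTableMetadata))]
public partial class HeaderTable
{
    internal sealed class HeaderTableMetadata
    {
        [Include] // System.ServiceModel.DomainServices.Server.IncludeAttribute
        public EntityCollection<DetailTable> DetailTables { get; set; }
    }
}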

Is there an Update Object holder on Entity Framework?

I'm currently inserting/updating fields like this (if there's a better way, please say so - we're always learning)
public void UpdateChallengeAnswers(List<ChallengeAnswerInfo> model, Decimal field_id, Decimal loggedUserId)
{
    JK_ChallengeAnswers o;
    foreach (ChallengeAnswerInfo a in model)
    {
        o = this.FindChallengeAnswerById(a.ChallengeAnswerId);
        if (o == null) o = new JK_ChallengeAnswers();
        o.answer = FilterString(a.Answer);
        o.correct = a.Correct;
        o.link_text = "";
        o.link_url = "";
        o.position = FilterInt(a.Position);
        o.updated_user = loggedUserId;
        o.updated_date = DateTime.UtcNow;
        if (o.challenge_id == 0)
        {
            // New record
            o.challenge_id = field_id; // FK
            o.created_user = loggedUserId;
            o.created_date = DateTime.UtcNow;
            db.JK_ChallengeAnswers.AddObject(o);
        }
        else
        {
            // Update record
            this.Save();
        }
    }
    this.Save(); // Commit changes
}
As you can see, this.Save() (which invokes db.SaveChanges()) appears twice.
When adding, we place the new object into a placeholder with the AddObject method; in other words, the new object is not committed right away, and we can stage as many objects as we want.
But when it's an update, I need to Save first before moving on to the next object. Is there a method I can use in order to do, let's say:
    if (o.challenge_id == 0)
    {
        // New record
        o.challenge_id = field_id;
        o.created_user = loggedUserId;
        o.created_date = DateTime.UtcNow;
        db.JK_ChallengeAnswers.AddObject(o);
    }
    else
    {
        // Update record
        db.JK_ChallengeAnswers.RetainObject(o);
    }
}
this.Save(); // Only save once when all objects are ready to commit
}
So if there are 5 updates, I don't need to save into the database 5 times, but only once at the end.
Thank you.
Well, if you have an object that is attached to the graph and you modify its values, the entity is marked as Modified.
If you simply do .AddObject, then the entity is marked as Added.
Nothing has happened yet - only staging of the graph.
Then, when you execute SaveChanges(), EF will translate the entries in the Object State Manager (OSM) into the relevant store queries.
Your code looks a bit strange. Have you debugged through it (and run a SQL trace) to see what is actually getting executed? I can't see why you need that first .Save: in line with my points above, since you're modifying the entities in the first few lines of the method, an UPDATE statement will most likely always get executed, regardless of the ID.
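To make that concrete, here is a sketch of the loop with the per-update Save removed (assuming FindChallengeAnswerById returns entities attached to the same db context):
foreach (ChallengeAnswerInfo a in model)
{
    var o = this.FindChallengeAnswerById(a.ChallengeAnswerId);
    if (o == null)
    {
        o = new JK_ChallengeAnswers();
        o.challenge_id = field_id; // FK, only set for new records
        o.created_user = loggedUserId;
        o.created_date = DateTime.UtcNow;
        db.JK_ChallengeAnswers.AddObject(o); // staged as Added
    }
    // Attached entities are tracked, so these assignments mark existing rows as Modified.
    o.answer = FilterString(a.Answer);
    o.correct = a.Correct;
    o.link_text = "";
    o.link_url = "";
    o.position = FilterInt(a.Position);
    o.updated_user = loggedUserId;
    o.updated_date = DateTime.UtcNow;
}
this.Save(); // one SaveChanges() commits all Added and Modified entities together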
I suggest you refactor your code to handle new/modified entities in separate methods (ideally via a repository).
Taken from the Employee Info Starter Kit, you can consider the code snippet below:
public void UpdateEmployee(Employee updatedEmployee)
{
    // attaching and making ready for persistence
    if (updatedEmployee.EntityState == EntityState.Detached)
        _DatabaseContext.Employees.Attach(updatedEmployee);
    _DatabaseContext.ObjectStateManager.ChangeObjectState(updatedEmployee, System.Data.EntityState.Modified);
    _DatabaseContext.SaveChanges();
}