I have at least 100,000 rows in a Job_Details table and I'm using Entity Framework to map the data.
This is the code:
public GetJobsResponse GetImportJobs()
{
GetJobsResponse getJobResponse = new GetJobsResponse();
List<JobBO> lstJobs = new List<JobBO>();
using (NSEXIM_V2Entities dbContext = new NSEXIM_V2Entities())
{
var lstJob = dbContext.Job_Details.ToList();
foreach (var dbJob in lstJob.Where(ie => ie.IMP_EXP == "I" && ie.Job_No != null))
{
JobBO job = MapBEJobforSearchObj(dbJob);
lstJobs.Add(job);
}
}
getJobResponse.Jobs = lstJobs;
return getJobResponse;
}
I found that this line takes about 2-3 minutes to execute:
var lstJob = dbContext.Job_Details.ToList();
How can I solve this issue?
To outline the performance issues with your example (see inline comments):
public GetJobsResponse GetImportJobs()
{
GetJobsResponse getJobResponse = new GetJobsResponse();
List<JobBO> lstJobs = new List<JobBO>();
using (NSEXIM_V2Entities dbContext = new NSEXIM_V2Entities())
{
// Loads *ALL* entities into memory. This pulls every field of every row from the database across to your app server, even though you don't need it all.
var lstJob = dbContext.Job_Details.ToList();
// Filters from the data in memory.
foreach (var dbJob in lstJob.Where(ie => ie.IMP_EXP == "I" && ie.Job_No != null))
{
// Maps the entity to a DTO and adds it to the return collection.
JobBO job = MapBEJobforSearchObj(dbJob);
lstJobs.Add(job);
}
}
// Returns the DTOs.
getJobResponse.Jobs = lstJobs;
return getJobResponse;
}
First: pass your WHERE clause to EF so it is executed on the DB server, rather than loading all entities into memory.
public GetJobsResponse GetImportJobs()
{
GetJobsResponse getJobResponse = new GetJobsResponse();
using (NSEXIM_V2Entities dbContext = new NSEXIM_V2Entities())
{
// Passes the Where expression to the DB server to be executed. Note: no .ToList() yet, so this stays an IQueryable.
var jobs = dbContext.Job_Details.Where(ie => ie.IMP_EXP == "I" && ie.Job_No != null);
Next, use Select to load your DTOs. Typically these won't contain as much data as the main entity, and as long as you're working with IQueryable you can load related data as needed. Again, this is sent to the DB server, so you cannot use methods like MapBEJobforSearchObj here because the DB server does not know that function. You can select into a simple DTO object, or into an anonymous type to pass to a dynamic mapper.
var dtos = jobs.Select(ie => new JobBO
{
JobId = ie.JobId,
// ... populate remaining DTO fields here.
}).ToList();
getJobResponse.Jobs = dtos;
}
return getJobResponse;
}
Moving the .ToList() to the end will materialize the data into your JobBO DTOs/ViewModels, pulling just enough data from the server to populate the desired rows with the desired fields.
In cases where you have a large amount of data, you should also consider supporting server-side pagination, where you pass a page number and page size and then use .Skip() + .Take() to load a single page of entries at a time.
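A rough sketch of how that paging could look, building on the projection above (the pageNumber/pageSize parameters and the JobId ordering key are illustrative, not part of the original code):
public GetJobsResponse GetImportJobs(int pageNumber, int pageSize)
{
    using (var dbContext = new NSEXIM_V2Entities())
    {
        var dtos = dbContext.Job_Details
            // Filter on the server, not in memory.
            .Where(ie => ie.IMP_EXP == "I" && ie.Job_No != null)
            // Skip/Take need a stable ordering to page reliably.
            .OrderBy(ie => ie.JobId)
            .Skip((pageNumber - 1) * pageSize)
            .Take(pageSize)
            // Project into the DTO on the server as well.
            .Select(ie => new JobBO
            {
                JobId = ie.JobId,
                // ... populate remaining DTO fields here.
            })
            .ToList();

        return new GetJobsResponse { Jobs = dtos };
    }
}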
Our current system uses lazy loading by default (it is something I am going to disable, but it can't be done right now).
For this basic query I want to return two tables, CustomerNote and Note.
This is my query
using (var newContext = new Entities(true))
{
newContext.Configuration.LazyLoadingEnabled = false;
var result = from customerNotes in newContext.CustomerNotes.Include(d=>d.Note)
join note in newContext.Notes
on customerNotes.NoteId equals note.Id
where customerNotes.CustomerId == customerId
select customerNotes;
return result.ToList();
}
My result, however, only contains the data from the CustomerNote table.
The linked entities Customer and Note are both null. What am I doing wrong here?
I got it working with the following, which is much simpler than what I've found elsewhere:
Context.Configuration.LazyLoadingEnabled = false;
var result = Context.CustomerNotes.Where<CustomerNote>(d => d.CustomerId == customerId)
.Include(d=>d.Note)
.Include(d=>d.Note.User);
return result.ToList();
This returns my CustomerNote table, related Notes and related Users from the Notes.
What you want to achieve is called eager loading.
var customerNotes = newContext.CustomerNotes.Include(t => t.Note).ToList();
This should work; I don't really understand the query (keyword) syntax.
If the code above doesn't work, try this:
var customerNotes = newContext.CustomerNotes.Include(t => t.Note).Select(t => new {
Note = t.Note,
Item = t
}).ToList();
I'm using WCF RIA in a Lightswitch project to create some query results. This query brings back all results regardless. I cannot make it filter the records based on the parameter passed (string Town).
public IQueryable<Enquiries> TestQuery(string Town)
{
List<Enquiries> riaenqs = new List<Enquiries>();
var enqs = this.Context.ClientEnquiries
.Include("Client")
.Include("Client.Town")
.OrderBy(enq => enq.Id);
if (Town != null)
{
enqs.Where(enq => enq.Client.Town.TownName == Town);
}
foreach (ClientEnquiry item in enqs.ToList())
{
Enquiries enq = new Enquiries();
enq.Id = item.Id;
enq.ClientName = item.Client.FirstName + " " + item.Client.Surname;
enq.Town = item.Client.Town != null ? item.Client.Town.TownName : null;
riaenqs.Add(enq);
}
return riaenqs.AsQueryable();
}
During debugging I can see that Town is correctly populated, and I can see that the query is built accordingly if Town is not null. However, when I hit the foreach statement, where the LINQ to Entities query is executed, I always get all the results. I just cannot figure out where I'm slipping up.
LINQ methods like Where do not modify the collection/expression; they always return a new one.
So you need to reassign the result of the Where to your original variable enqs:
if (Town != null)
{
enqs = enqs.Where(enq => enq.Client.Town.TownName == Town);
}
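For completeness, a minimal sketch of the corrected method body, using only the types and navigation properties from the question. Because the original variable was created with OrderBy (an ordered query type) via var, the reassignment from Where may not compile as-is, so this sketch declares the variable as IQueryable<ClientEnquiry> and applies the ordering last:
IQueryable<ClientEnquiry> enqs = this.Context.ClientEnquiries
    .Include("Client")
    .Include("Client.Town");

if (Town != null)
{
    // Where returns a new query; reassign it to enqs.
    enqs = enqs.Where(enq => enq.Client.Town.TownName == Town);
}

// Apply the ordering and execute; the foreach mapping from the question then runs over these items.
var items = enqs.OrderBy(enq => enq.Id).ToList();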
I am trying to look up a record: if I have the key, use Find; otherwise use Where.
private ApplicationDbContext db = new ApplicationDbContext();
public bool DeactivatePrice(int priceId = 0, string sponsorUserName = "")
{
var prices = db.BeveragePrices;
// if we have an id then find
if (priceId != 0)
{
prices = prices.Find(priceId);
}
else
{
prices = prices.Where(b => b.UserCreated == sponsorUserName);
}
if (prices != null)
{
// do something
}
return true;
}
I get the following error for
prices = prices.Find(priceId);
Cannot convert app.Model.BeveragePrices from system.data.entity.dbset
I am copying the pattern from this answer but something must be different.
It seems you forgot to put a predicate inside the Find call; you would also need to call ToList() on the collection first. Of the two options below, the second is a lot more efficient: the first pulls the whole collection into memory before selecting.
Another note, pointed out by @Alla: Find returns a single element, so I assume a separate 'price' variable has been declared for the first option below.
price = prices.ToList().Find(b => b.PriceId == priceId);
Or
prices = prices.Where(b => b.PriceId == priceId);
I assume the field name is PriceId.
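For reference, a minimal sketch of the key-versus-filter pattern the question describes, assuming the entity type is BeveragePrice and the key property is PriceId; DbSet.Find looks up a single entity by primary key, while Where builds a query that FirstOrDefault materializes:
BeveragePrice price;
if (priceId != 0)
{
    // Find returns the entity with that key, or null if it does not exist.
    price = db.BeveragePrices.Find(priceId);
}
else
{
    // Where filters on the server; FirstOrDefault returns the first match or null.
    price = db.BeveragePrices
        .Where(b => b.UserCreated == sponsorUserName)
        .FirstOrDefault();
}
if (price != null)
{
    // do something
}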
A newbie question. I am using EntityFramework 4.0. The backend database has a function that will return a subset of records based on time.
Example of working code is:
var query = from rx in context.GetRxByDate(tencounter,groupid)
select rx;
var result = context.CreateDetachedCopy(query.ToList());
return result;
I need to verify that a record does not exist in the database before inserting a new record. Before performing the "Any" filter, I would like to populate the context.Rxes with a subset of the larger backend database using the above "GetRxByDate()" function.
I do not know how to populate "Rxes" before performing any further filtering since Rxes is defined as
IQueryable<Rx> Rxes
and does not allow "Rxes =.. ". Here is what I have so far:
using (var context = new EnityFramework())
{
if (!context.Rxes.Any(c => c.Cform == rx.Cform ))
{
// Insert new record
Rx r = new Rx();
r.Trx = realtime;
context.Add(r);
context.SaveChanges();
}
}
I am fully prepared to kick myself since I am sure the answer is simple.
All help is appreciated. Thanks.
Edit:
If I do it this way, "Any" seems to return the opposite of what is expected:
var g = context.GetRxByDate(tencounter, groupid).ToList();
if (g.Any(c => c.Cform == rx.Cform)) {....}
I'm new to Entity Framework and I'm really confused about how SaveChanges works. There's probably a lot of code in my example which could be improved, but here's the problem I'm having.
The user enters a bunch of picks. I make sure the user hasn't already entered those picks.
Then I add the picks to the database.
var db = new myModel();
var predictionArray = ticker.Substring(1).Split(','); // Get rid of the initial comma.
var user = Membership.GetUser();
var userId = Convert.ToInt32(user.ProviderUserKey);
// Get the member with all his predictions for today.
var memberQuery = (from member in db.Members
where member.user_id == userId
select new
{
member,
predictions = from p in member.Predictions
where p.start_date == null
select p
}).First();
// Load all the company ids.
foreach (var prediction in memberQuery.predictions)
{
prediction.CompanyReference.Load();
}
var picks = from prediction in predictionArray
let data = prediction.Split(':')
let companyTicker = data[0]
where !(from i in memberQuery.predictions
select i.Company.ticker).Contains(companyTicker)
select new Prediction
{
Member = memberQuery.member,
Company = db.Companies.Where(c => c.ticker == companyTicker).First(),
is_up = data[1] == "up", // This turns up and down into true and false.
};
// Save the records to the database.
// HERE'S THE PART I DON'T UNDERSTAND.
// This saves the records, even though I don't have db.AddToPredictions(pick)
foreach (var pick in picks)
{
db.SaveChanges();
}
// This does not save the records when db.SaveChanges() is called outside of (before) the loop over picks.
db.SaveChanges();
foreach (var pick in picks)
{
}
// This saves the records, but it will insert all the picks exactly once no matter how many picks you have.
// The fact that you're skipping a pick makes no difference to what gets inserted.
var counter = 1;
foreach (var pick in picks)
{
if (counter == 2)
{
db.SaveChanges();
}
counter++;
}
I've tested it, and the SaveChanges doesn't even have to be in the loop.
The code below works, too:
foreach (var pick in picks)
{
break;
}
db.SaveChanges();
There's obviously something going on with the context that I don't understand. I'm guessing I've somehow loaded my new picks as pending changes, but even if that's true, I don't understand why I have to loop over them to save the changes.
Can someone explain this to me?
Here's updated working code based on Craig's responses:
1) Remove the type, then loop over the results and populate new objects:
var picks = (from prediction in predictionArray
let data = prediction.Split(':')
let companyTicker = data[0]
where !(from i in memberQuery.predictions
select i.Company.ticker).Contains(companyTicker)
select new //NO TYPE HERE
{
Member = memberQuery.member,
Company = db.Companies.Where(c => c.ticker == companyTicker).First(),
is_up = data[1] == "up", // This turns up and down into true and false.
}).ToList();
foreach (var prediction in picks)
{
if (includePrediction)
{
var p = new Prediction{
Member = prediction.Member,
Company = prediction.Company,
is_up = prediction.is_up
};
db.AddToPredictions(p);
}
}
2) Or, if I don't want the predictions to be saved, I can detach them:
foreach (var prediction in picks) {
if (excludePrediction)
{
db.Detach(prediction);
}
}
The reason is here:
select new Prediction
{
Member = memberQuery.member,
These lines will (once the IEnumerable is iterated; LINQ is lazy):
Instantiate a new Prediction
Associate that Prediction with an existing Member, which is attached to db.
Associating an entity instance with an attached entity automatically adds that instance to the same context as the attached entity.
So as soon as you start iterating over picks (and thus over predictionArray), the code above executes and you have new entities in your context.
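A minimal sketch of that rule, using the names from the question:
// memberQuery.member is attached to db because it came from a query on db.
var pick = new Prediction();
pick.Member = memberQuery.member; // associating the new entity with an attached one
                                  // also registers 'pick' with db as a pending insert
db.SaveChanges();                 // inserts 'pick' even though AddToPredictions was never called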