QuickBooks CustomerQuery with Iterator returning duplicate records

I have a strange issue, and I am not sure where the error is.
I am working in C#, and while running a query with filters against the Customer object, I get a strange series of results.
If the filter I set should return 8 records, my implementation returns 8, 8, 3, 1, 1, 1 records (in that order, one count per subsequent call, and all of them are duplicates)...
This is how I implemented the Iterator:
var customerQuery = new CustomerQuery()
{
    ChunkSize = "8",
    Item = Guid.NewGuid().ToString("N"),
    IncludeFinancialIndicator = false,
    IncludeFinancialIndicatorSpecified = true,
    MinimumBalanceSpecified = true,
    MinimumBalance = 92,
    SynchronizedFilterSpecified = true,
    SynchronizedFilter = SynchronizedFilterEnumType.Synchronized,
};
var results = new List<Customer>();
int count;
// Loop until all the results have been found.
do
{
    IEnumerable<Customer> partialResult = customerQuery.ExecuteQuery<Customer>(context);
    count = partialResult.Count();
    // First pass here returns 8 records
    // Second pass returns the same 8 records again
    // Third pass returns 3 records
    // Three more passes with 1 record each
    results.AddRange(partialResult);
} while (count > 0);
Am I doing something wrong or missing anything?
Thanks!
Edit:
This is the code implementing the paging option:
var customerSet = new List<Customer>();
List<Customer> customerQueryResult = null;
int startPage = 1;
var qbdCustomerQuery = new CustomerQuery();
do
{
    qbdCustomerQuery.ChunkSize = "10";
    qbdCustomerQuery.MinimumBalanceSpecified = true;
    qbdCustomerQuery.MinimumBalance = 92;
    qbdCustomerQuery.ItemElementName = ItemChoiceType4.StartPage;
    qbdCustomerQuery.Item = startPage.ToString();
    customerQueryResult = qbdCustomerQuery.ExecuteQuery<Customer>(context).ToList();
    if (customerQueryResult.Count > 0) { customerSet.AddRange(customerQueryResult); }
    startPage++;
} while (customerQueryResult.Count > 0);
If I set ChunkSize to "100" I get the 8 expected records, but when I run it with ChunkSize = "10" I get 8, 8, 1 (duplicate records).
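Until the ChunkSize behavior is explained, one workaround is to de-duplicate while paging. A minimal sketch, assuming the Customer entity exposes some unique key (the Id property used here is an assumption; substitute whatever identifier your entity actually has):

var seen = new HashSet<string>();
var distinctCustomers = new List<Customer>();
foreach (var c in customerSet)
{
    // Id is a placeholder for the entity's real unique key.
    if (seen.Add(c.Id.ToString()))
    {
        distinctCustomers.Add(c);
    }
}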


or-tools - add constraint to jobshop problem

I would like to extend the job shop example given here a bit.
Each machine has a power consumption, and at any given time unit there is a maximum total power consumption across all machines; I would like to prevent overlap only if the sum of all machine consumptions exceeds a fixed value.
var allJobs =
    new[] {
        new[] {
            // job0
            new { machine = 0, duration = 3, power = 5 }, // task0
            new { machine = 1, duration = 2, power = 2 }, // task1
            new { machine = 2, duration = 2, power = 1 }, // task2
        }.ToList(),
        new[] {
            // job1
            new { machine = 0, duration = 2, power = 5 }, // task0
            new { machine = 2, duration = 1, power = 2 }, // task1
            new { machine = 1, duration = 4, power = 1 }, // task2
        }.ToList(),
        new[] {
            // job2
            new { machine = 1, duration = 4, power = 1 }, // task0
            new { machine = 2, duration = 3, power = 2 }, // task1
        }.ToList(),
    }.ToList();
The "flexible job shop" example covers interval overlap, but not the sum of a new variable, with the added requirement that this must be checked at every time unit.
Any pointers on how I can achieve this?
Regards
For each task, create an interval. Use this interval in the no_overlap constraint of the corresponding machine.
Then add a cumulative constraint with the fixed value from the description as capacity.
Then for each task, add a demand (interval of the task, power of the task) to the cumulative constraint using this method: https://google.github.io/or-tools/dotnet/classGoogle_1_1OrTools_1_1Sat_1_1CumulativeConstraint.html#a922ab6a70999c054e9e94aa2dfb96be4
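Putting the steps together, a minimal sketch (mirroring the API usage in the code below; the capacity of 11, the horizon and the single task are illustrative):

CpModel model = new CpModel();
int horizon = 10; // in a real model: sum of all durations
// One cumulative constraint whose capacity is the fixed max total power.
CumulativeConstraint power = model.AddCumulative(model.NewConstant(11));
var machineToIntervals = new Dictionary<int, List<IntervalVar>>();
// One task: machine 0, duration 3, power 5.
IntVar start = model.NewIntVar(0, horizon, "start_0_0");
IntVar end = model.NewIntVar(0, horizon, "end_0_0");
IntervalVar interval = model.NewIntervalVar(start, 3, end, "interval_0_0");
if (!machineToIntervals.ContainsKey(0)) { machineToIntervals[0] = new List<IntervalVar>(); }
machineToIntervals[0].Add(interval); // reuse the same interval for the machine's NoOverlap...
power.AddDemand(interval, 5);        // ...and as the demand in the cumulative constraint
// After all tasks are created: per-machine disjunctive constraints.
foreach (var kv in machineToIntervals) { model.AddNoOverlap(kv.Value); }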
Indeed, that's the way to go.
I was able to implement it using your directions.
Then I tried to extend it a little further using the same method.
Each machine has a 'type', and the sum of power per 'type' must not exceed a given value.
I grouped the intervals by type and then called NoOverlap on them.
But I am making a mistake somewhere, as it isn't calculating correctly, despite giving a solution.
// id, type
var machinesTypes = new[] {
    1, 1, 2
};
var allJobs =
    new[] {
        new[] {
            // job0
            new { machine = 0, duration = 3, power = 5, type = 1 }, // task0
            new { machine = 1, duration = 2, power = 4, type = 1 }, // task1
            new { machine = 2, duration = 1, power = 2, type = 2 }, // task2
        }.ToList()
    }.ToList();
int numMachines = 0;
foreach (var job in allJobs)
{
    foreach (var task in job)
    {
        numMachines = Math.Max(numMachines, 1 + task.machine);
    }
}
int[] allMachines = Enumerable.Range(0, numMachines).ToArray();
// Computes horizon dynamically as the sum of all durations.
int horizon = 0;
foreach (var job in allJobs)
{
    foreach (var task in job)
    {
        horizon += task.duration;
    }
}
// Creates the model.
CpModel model = new CpModel();
// Full power capacity constraint.
IntVar full_power_capacity = model.NewIntVar(0, 11, $"full_power_capacity");
CumulativeConstraint cc_full_power = model.AddCumulative(full_power_capacity);
// Type power constraint.
Dictionary<int, CumulativeConstraint> cc_all_machines = new Dictionary<int, CumulativeConstraint>();
foreach (int machine in allMachines)
{
    int machineType = machinesTypes[allMachines[machine]];
    if (!cc_all_machines.ContainsKey(machineType))
    {
        IntVar type_power_capacity = model.NewIntVar(0, 6, $"type_power_capacity_{machineType}");
        CumulativeConstraint cc_type_power = model.AddCumulative(type_power_capacity);
        cc_all_machines.Add(machineType, cc_type_power);
    }
}
Dictionary<Tuple<int, int>, Tuple<IntVar, IntVar, IntervalVar>> allTasks =
    new Dictionary<Tuple<int, int>, Tuple<IntVar, IntVar, IntervalVar>>(); // (start, end, interval)
Dictionary<int, List<IntervalVar>> machineToIntervals = new Dictionary<int, List<IntervalVar>>();
Dictionary<int, List<IntervalVar>> machineTypesToIntervals = new Dictionary<int, List<IntervalVar>>();
for (int jobID = 0; jobID < allJobs.Count(); ++jobID)
{
    var job = allJobs[jobID];
    for (int taskID = 0; taskID < job.Count(); ++taskID)
    {
        var task = job[taskID];
        String suffix = $"_{jobID}_{taskID}";
        IntVar start = model.NewIntVar(0, horizon, "start" + suffix);
        IntVar end = model.NewIntVar(0, horizon, "end" + suffix);
        IntervalVar interval = model.NewIntervalVar(start, task.duration, end, "interval" + suffix);
        var key = Tuple.Create(jobID, taskID);
        allTasks[key] = Tuple.Create(start, end, interval);
        if (!machineToIntervals.ContainsKey(task.machine))
        {
            machineToIntervals.Add(task.machine, new List<IntervalVar>());
        }
        machineToIntervals[task.machine].Add(interval);
        cc_full_power.AddDemand(interval, task.power);
        string suffix_2 = $"_{task.type}";
        IntVar start_2 = model.NewIntVar(0, horizon, "start_2" + suffix_2);
        IntVar end_2 = model.NewIntVar(0, horizon, "end_2" + suffix_2);
        IntervalVar interval_2 = model.NewIntervalVar(start_2, task.duration, end_2, "interval_2" + suffix_2);
        if (!machineTypesToIntervals.ContainsKey(task.type))
        {
            machineTypesToIntervals.Add(task.type, new List<IntervalVar>());
        }
        machineTypesToIntervals[task.type].Add(interval_2);
        cc_all_machines[task.type].AddDemand(interval_2, task.power);
    }
}
// Create and add disjunctive constraints.
foreach (int machine in allMachines)
{
    model.AddNoOverlap(machineToIntervals[machine]);
}
foreach (var item in cc_all_machines)
{
    model.AddNoOverlap(machineTypesToIntervals[item.Key]);
}
// Makespan objective.
IntVar objVar = model.NewIntVar(0, horizon, "makespan");
model.Minimize(objVar);
// Solve.
CpSolver solver = new CpSolver();
CpSolverStatus status = solver.Solve(model);
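Two things stand out as likely causes, offered as suggestions rather than a confirmed diagnosis. First, interval_2 is built from fresh start_2/end_2 variables that are never tied to the task's actual start and end, so the per-type cumulative is constraining intervals the solver may place anywhere. Second, calling AddNoOverlap on all intervals of a type forbids any concurrency within that type, which is stronger than the power cap you describe; the per-type cumulative alone should express the cap. A sketch of the idea, reusing the one real interval per task inside the task loop:

// Instead of creating start_2 / end_2 / interval_2, reuse the task's interval:
if (!machineTypesToIntervals.ContainsKey(task.type))
{
    machineTypesToIntervals.Add(task.type, new List<IntervalVar>());
}
machineTypesToIntervals[task.type].Add(interval);            // same interval as the machine constraint
cc_all_machines[task.type].AddDemand(interval, task.power);  // demand on the real interval
// ...and drop the AddNoOverlap over machineTypesToIntervals.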

Google chart data from external source

I am using a C# WebMethod to provide data to a Google line chart. However, I am stuck on the format of the data.
Currently I can only send data in a list (object) format, but I want to send data in another format.
Is this possible? Please find my code snippet below:
ASP Page
// Call to get data from the ASP.NET web method.
$.ajax({
    type: "POST",
    contentType: 'application/json',
    data: '{}',
    url: 'AdminDashboard.aspx/GetChartData',
    beforeSend: function () { alert("before send"); },
    complete: function () { alert("complete"); },
    success: function (data) {
        var linedata1 = new google.visualization.arrayToDataTable(data.d);
        linedata1.insertColumn(0, 'date', linedata1.getColumnLabel(0));
        // Copy values from column 1 (old column 0) to column 0, converted to Date.
        for (var i = 0; i < linedata1.getNumberOfRows(); i++) {
            var val = linedata1.getValue(i, 1);
            if (val != '' && val != null) {
                var dateArray = val.split('/');
                var year = dateArray[2];
                var month = dateArray[0] - 1; // convert to JavaScript's 0-indexed months
                var day = dateArray[1];
                linedata1.setValue(i, 0, new Date(year, month, day));
            }
        }
        // Remove column 1 (the old column 0).
        linedata1.removeColumn(1);
        dashboard.bind(programmaticSlider, programmaticChart);
        dashboard.draw(linedata1);
    }
});
ASP.NET WebMethod
[WebMethod]
public static List<object> GetChartData()
{
    DataTable chartData = new DataTable();
    DataTable tktData = new DataTable();
    Array arrdata = null;
    SPList configList, tktList;
    SPQuery dataQuery;
    SPListItemCollection configColl;
    SPSecurity.RunWithElevatedPrivileges(delegate()
    {
        using (SPSite oSite = new SPSite("http://dsknomoe11:9696/"))
        {
            using (SPWeb oWeb = oSite.OpenWeb())
            {
                // Get Config list values for Ticket Priority and Ticket Status.
                configList = oWeb.Lists.TryGetList("SCMS_Configuration");
                dataQuery = new SPQuery();
                dataQuery.Query = "<Where><Or><Eq><FieldRef Name='Title' /><Value Type='Text'>Ticket Priority</Value></Eq><Eq><FieldRef Name='Title' /><Value Type='Text'>Ticket Status</Value></Eq></Or></Where>";
                configColl = configList.GetItems(dataQuery);
                // Ticket priority and status.
                foreach (SPListItem oItem in configColl)
                {
                    // Get Ticket Priority.
                    if (oItem["Title"].ToString().Equals("Ticket Priority"))
                    {
                        tktPriority = oItem["value"].ToString().Split(';');
                    }
                    // Get Ticket Status.
                    else if (oItem["Title"].ToString().Equals("Ticket Status"))
                    {
                        tktStatus = oItem["value"].ToString().Split(';');
                    }
                }
                // Add columns to DataTable chartData.
                foreach (string s in tktStatus)
                {
                    chartData.Columns.Add(s);
                }
                // Get the active tickets.
                tktList = oWeb.Lists.TryGetList("SCMS_Tickets");
                dataQuery = new SPQuery();
                dataQuery.Query = "<Where><Eq><FieldRef Name='isAct' /><Value Type='Choice'>Yes</Value></Eq></Where><OrderBy><FieldRef Name='Created' Ascending='True' /></OrderBy><GroupBy Collapse='True'><FieldRef Name='Created' /></GroupBy>";
                dataQuery.ViewFields = "<FieldRef Name='Created' /><FieldRef Name='tckPrty' /><FieldRef Name='tckStat' />";
                dataQuery.ViewFieldsOnly = true;
                tktData = tktList.GetItems(dataQuery).GetDataTable();
                var grpdata = tktData.AsEnumerable().Select(x => new { Date = Convert.ToDateTime(x[0]).Date.ToString("MM/dd/yyyy"), Status = x[2] }).ToArray();
                arrdata = grpdata.GroupBy(l => l.Date).Select(g => new
                {
                    Date = g.Key,
                    Open = g.Count(l => (string)l.Status == "Open"),
                    Closed = g.Count(l => (string)l.Status == "Closed"),
                    InProgress = g.Count(l => (string)l.Status == "In-Progress"),
                    Total = g.Count(l => ((string)l.Status == "Open") || ((string)l.Status == "Closed") || ((string)l.Status == "In-Progress"))
                }).ToArray();
            }
        }
    });
    List<object> list = new List<object>();
    list.Add(new object[] { "Date", "Open", "Closed", "In-Progress", "Total" });
    string strDt, strOpen, strClosed, strInP, strTotal;
    for (int i = 0; i < arrdata.Length; i++)
    {
        var spltData = arrdata.GetValue(i).ToString().Replace('{', ' ').Replace('}', ' ').Split(',');
        strDt = spltData[0];
        strOpen = spltData[1];
        strClosed = spltData[2];
        strInP = spltData[3];
        strTotal = spltData[4].Trim();
        list.Add(new object[] { Convert.ToDateTime(strDt.Split('=').Last()).Date.ToString("MM/dd/yyyy"),
            Convert.ToInt32(strOpen.Split('=').Last()),
            Convert.ToInt32(strClosed.Split('=').Last()),
            Convert.ToInt32(strInP.Split('=').Last()),
            Convert.ToInt32(strTotal.Split('=').Last())
        });
    }
    return list;
}
Please help me with this.
My basic requirement is to create a DataTable (or any other format) with dynamic columns and then bind the data to a Google multi-line chart.
For clarity, my current data format, i.e. the list data, is:
Date     Open  Closed  InProgress
1/2/13   2     0       0
2/2/13   1     2       0
3/3/13   0     0       1
4/3/13   0     1       0
You can serve the data in many formats (JSON, CSV, TSV, HTML). I strongly recommend that you use JSON; you can see the Google default structure here:
I got it working by using the JSON format.
I have posted the snippet; it might help someone.
// For dynamic data binding.
string jsonData = @"{ ""cols"" :[ { ""label"" : ""Type"" , ""type"" : ""string"" },{ ""label"" : ""Count"" , ""type"" : ""number"" }], ""rows"" :[{""c"":[{""v"":""Mushrooms""},{""v"":3}]}," +
    @"{""c"":[{""v"":""Total""},{""v"":1}]}," +
    @"{""c"":[{""v"":""Open""},{""v"":1}]}," +
    @"{""c"":[{""v"":""Closed""},{""v"":1}]}," +
    @"{""c"":[{""v"":""In-Progress""},{""v"":2}]}]}";
The catch was to use "double quotes" for every name and value, since JSON only recognizes "double quotes".
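The same structure can also be produced without hand-concatenating strings, by serializing anonymous objects whose property names match the cols/rows/c/v schema above. A sketch, assuming the classic ASP.NET JavaScriptSerializer is available:

using System.Collections.Generic;
using System.Linq;
using System.Web.Script.Serialization;

public static string BuildChartJson(IDictionary<string, int> counts)
{
    var table = new
    {
        cols = new object[]
        {
            new { label = "Type", type = "string" },
            new { label = "Count", type = "number" }
        },
        // One row per name/count pair, in Google's {"c":[{"v":...},...]} shape.
        rows = counts.Select(kv => new
        {
            c = new object[] { new { v = kv.Key }, new { v = kv.Value } }
        }).ToArray()
    };
    return new JavaScriptSerializer().Serialize(table);
}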
Thanks

When using NumericField, I get absolutely nothing back every time

Been playing with Lucene.NET for the last two days.
After reading up on dates, I was led to believe that dates are best converted to milliseconds and stored in a NumericField, with Index = true and Store = NO.
But now nothing ever returns - it must be something basic, but I'm just not seeing it.
The saving code is as follows:
...
else if (type == typeof(DateTime?))
{
    var typedValue = (DateTime?)value;
    field = numericField = new NumericField(documentFieldName, 4, Field.Store.YES, true);
    long milliseconds = typedValue.HasValue ? (typedValue.Value.Date.Ticks / TimeSpan.TicksPerMillisecond) : 0;
    numericField.SetLongValue(milliseconds);
    doc.Add(numericField);
}
...
else
{
    field = stringField = new Field(
        documentFieldName,
        (value != null) ? value.ToString() : string.Empty,
        Field.Store.YES,
        Field.Index.ANALYZED);
    doc.Add(stringField);
}
// Write the Document to the catalog.
indexWriter.AddDocument(doc);
When I query for docs against the values saved in a Field... no problem.
When I query for documents by matching against the values in NumericFields, nothing returns.
Where did I go wrong?
Thanks for your help.
Lucene.Net.Analysis.Analyzer analyzer = new Lucene.Net.Analysis.Standard.StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);
var q2 = NumericRangeQuery.NewLongRange("Val", 3, 3, true, true);
var uxy2 = documentSearchManagementService.Search("Students", termQuery, "Id");
Using:
public ScoredDocumentResult[] Search(string indexName, Query query, params string[] hitFieldNamesToReturn)
{
    if (_configuration.IndexRootDirectory.IsNullOrEmpty())
    {
        throw new Exception("Configuration.IndexRootDirectory has not been configured yet.");
    }
    indexName.ValidateIsNotNullOrEmpty("indexName");
    hitFieldNamesToReturn.ValidateIsNotDefault("hitFieldNamesToReturn");
    // Specify the index file location where the indexes are to be stored.
    string indexFileLocation = Path.Combine(_configuration.IndexRootDirectory, indexName);
    Lucene.Net.Store.Directory luceneDirectory = Lucene.Net.Store.FSDirectory.Open(indexFileLocation);
    IndexSearcher indexSearcher = new IndexSearcher(luceneDirectory);
    TopScoreDocCollector topScoreDocCollector = TopScoreDocCollector.Create(10, true);
    indexSearcher.Search(query, topScoreDocCollector);
    List<ScoredDocumentResult> results = new List<ScoredDocumentResult>();
    foreach (var scoreDoc in topScoreDocCollector.TopDocs(0, 10).ScoreDocs)
    {
        ScoredDocumentResult resultItem = new ScoredDocumentResult();
        Lucene.Net.Documents.Document doc = indexSearcher.Doc(scoreDoc.Doc);
        resultItem.Score = scoreDoc.Score;
        List<ScoredDocumentFieldResult> fields = new List<ScoredDocumentFieldResult>();
        foreach (string fieldName in hitFieldNamesToReturn)
        {
            string fieldValue = doc.Get(fieldName);
            fields.Add(new ScoredDocumentFieldResult { Key = fieldName, Value = fieldValue });
        }
        resultItem.FieldValues = fields.ToArray();
        results.Add(resultItem);
    }
    indexSearcher.Close();
    return results.ToArray();
}
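Two things worth checking, as guesses rather than a confirmed diagnosis: the snippet above builds q2 (a NumericRangeQuery) but then passes termQuery to Search, and a value written with SetLongValue must be queried via NewLongRange over the same units and with a matching precision step (the NumericField above uses 4, which is also NewLongRange's default). A minimal sketch, where "CreateDate" is an illustrative field name:

// Milliseconds computed exactly as at index time.
long ms = new DateTime(2012, 1, 1).Ticks / TimeSpan.TicksPerMillisecond;
// Long-range query over the same field, same numeric type, same units.
var dateQuery = NumericRangeQuery.NewLongRange("CreateDate", ms, ms, true, true);
var hits = documentSearchManagementService.Search("Students", dateQuery, "Id");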

StreamInsight, event matching

I ran into a situation with StreamInsight where I have one input source with different types of events that need to be treated differently, but are eventually matched with other events from the same source.
I created a (MUCH) simpler scenario below where an input source generates random numbers (for simplicity's sake, 1s and 0s). If the number is even, we want to save it until further notice (unknown duration). If the number is odd, we want to match it with n-1 from the even stream and then remove n-1 from the even stream. If there is no match, the odd number is simply passed through as-is with no further calculation.
I have everything working as expected up to the point of removing a matching n-1 from the even stream. A match is made, and the match gets pushed to the output adapter, but it remains available for further joins. From several days' worth of experimentation and research, I have gathered that I somehow need to clip the even stream's event duration (ClipEventDuration), presumably as part of the filter in GenerateOddEvenJoinQuery; however, everything I have tried has produced either no change or undesired results. I have also tried changing the evenStream to an Interval shape, with even less luck. Any help or suggestions would be greatly appreciated.
For example, given a simplified list:
[ 1, 0, 1, 1, 0, 0, 1, 1, 1 ]
I would expect the output to look like:
[ 1, 100, 1, 100, 100, 1 ]
I would also accept the following, since in the real scenario I'm working with, the first output isn't actually possible (note that in it, the 2nd and 3rd 0s are joined to a single 1):
[ 1, 100, 1, 100, 1, 1 ]
...
CepStream<int> inputStream = CepStream<int>.Create(
    app
    , "ATGInputStream"
    , typeof(InputAdapterFactory)
    , new InputConfig()
    , EventShape.Point);
var everythingFilter = from e in inputStream select e;
Query everythingQuery = GenerateEverythingQuery(app, inputStream);
var everythingStream = everythingQuery.ToStream<int>("everythingStream");
Query oddQuery = GenerateOddQuery(app, everythingStream);
var oddAts = new AdvanceTimeSettings(new AdvanceTimeGenerationSettings(1, TimeSpan.FromTicks(-1), false), null, AdvanceTimePolicy.Drop);
var oddStream = oddQuery.ToStream<int>("oddStream", oddAts);
// Only inject a CTI into the even stream when we need it.
var ats = new AdvanceTimeSettings(null, new AdvanceTimeImportSettings("oddStream"), AdvanceTimePolicy.Adjust);
Query evenQuery = GenerateEvenQuery(app, everythingStream);
var evenStream = evenQuery.ToStream<int>("evenStream", ats);
Query joinQuery = GenerateOddEvenJoinQuery(app, evenStream, oddStream);
var joinStream = joinQuery.ToStream<int>("joinStream");
...
private Query GenerateOddEvenJoinQuery(Application app, CepStream<int> evenStream, CepStream<int> oddStream) {
    // (o * e) + 100 is an easy way to tell we had a match.
    var filter = (from o in oddStream
                  from e in evenStream
                  where e == (o - 1)
                  select (o * e) + 100);
    // LEFT ANTI SEMI JOIN
    var filter2 = from o in oddStream
                  where (from e in evenStream where e == o - 1 select e).IsEmpty()
                  select o;
    var joinFilter = filter.Union(filter2);
    return joinFilter.ToQuery(
        app
        , "Number Join Query"
        , "Joins number streams."
        , EventShape.Point
        , StreamEventOrder.FullyOrdered);
}
private Query GenerateEvenQuery(Application app, CepStream<int> stream) {
    var evenFilter = (from e in stream where e % 2 == 0 select e).AlterEventDuration(e => TimeSpan.MaxValue);
    return evenFilter.ToQuery(
        app
        , "EvenQuery"
        , ""
        , EventShape.Edge
        , StreamEventOrder.FullyOrdered);
}
private Query GenerateOddQuery(Application app, CepStream<int> stream) {
    var filter = (from f in stream where (f % 2) == 1 select f);
    return filter.ToQuery(
        app
        , "OddQuery"
        , "Queries for odd numbers in stream."
        , EventShape.Point
        , StreamEventOrder.FullyOrdered);
}
private Query GenerateEverythingQuery(Application app, CepStream<int> stream) {
    var everythingFilter = from e in stream select e;
    return everythingFilter.ToQuery(
        app
        , "EverythingQuery"
        , "Queries everything from the input stream."
        , EventShape.Point
        , StreamEventOrder.FullyOrdered);
}
SOLUTION:
While I was hoping for something a little more elaborate and potentially faster, the delayed processing may help with performance.
public class Program {
    public Program() {
        ...
        var stream = CepStream<RandomNumber>.Create(
            app
            , "StaticInputStream"
            , typeof(StaticInputAdapterFactory)
            , new InputConfig()
            , EventShape.Point);
        var processedStream = stream.Scan(new StreamMatcher());
        Query consoleQuery = GenerateConsoleOutputQuery(app, processedStream);
        ...
    }
    private Query GenerateConsoleOutputQuery(Application app, CepStream<int> stream) {
        var filter = from f in stream select f;
        return filter.ToQuery(
            app
            , "Console Output Query"
            , "Queries for messages to output to the console."
            , typeof(OutputAdapterFactory)
            , new OutputConfig()
            , EventShape.Point
            , StreamEventOrder.FullyOrdered);
    }
    public class StreamMatcher : CepPointStreamOperator<RandomNumber, int> {
        private List<int> unmatched = new List<int>();
        public override bool IsEmpty {
            get { return false; }
        }
        public override IEnumerable<int> ProcessEvent(PointEvent<RandomNumber> inputEvent) {
            if (inputEvent.Payload.value % 2 == 0) {
                unmatched.Add(inputEvent.Payload.value);
            } else {
                var result = inputEvent.Payload.value;
                int match = -1;
                try {
                    match = (from f in unmatched where f == result - 1 select f).Take(1).Single();
                    unmatched.Remove(match);
                } catch { }
                if (match > -1) {
                    result += match + 100;
                }
                yield return result;
            }
        }
    }
}
public class RandomNumber {
    public int value { get; set; }
    public DateTime timeStamp { get; set; }
}
You might consider using a UDSO (user-defined stream operator), where you can keep the state of your zeros.
void Main()
{
    var randomNumbers = new[]
    {
        new RandomNumber() { Value = 1, TimeStamp = DateTime.Parse("2012-01-01 10:01:00 AM") },
        new RandomNumber() { Value = 0, TimeStamp = DateTime.Parse("2012-01-01 10:02:00 AM") },
        new RandomNumber() { Value = 0, TimeStamp = DateTime.Parse("2012-01-01 10:02:00 AM") },
        new RandomNumber() { Value = 1, TimeStamp = DateTime.Parse("2012-01-01 10:03:00 AM") },
        new RandomNumber() { Value = 1, TimeStamp = DateTime.Parse("2012-01-01 10:04:00 AM") },
        new RandomNumber() { Value = 0, TimeStamp = DateTime.Parse("2012-01-01 10:05:00 AM") },
    };
    var stream = randomNumbers.ToPointStream(Application,
        e => PointEvent.CreateInsert(e.TimeStamp, e),
        AdvanceTimeSettings.IncreasingStartTime);
    var query = stream.Scan(new MyCalculation());
    query.Dump();
}
public class MyCalculation : CepPointStreamOperator<RandomNumber, string>
{
    private Queue<int> _queue = new Queue<int>();
    public override bool IsEmpty
    {
        get { return false; }
    }
    public override IEnumerable<string> ProcessEvent(PointEvent<RandomNumber> inputEvent)
    {
        if (inputEvent.Payload.Value % 2 == 0)
        {
            _queue.Enqueue(inputEvent.Payload.Value);
        }
        else
        {
            var result = inputEvent.Payload.Value.ToString();
            var list = _queue.ToArray();
            for (int i = 0; i < list.Count(); i++)
            {
                result += list[i];
            }
            yield return result;
        }
    }
}
public class RandomNumber
{
    public int Value { get; set; }
    public DateTime TimeStamp { get; set; }
}

Need more efficient way to convert Entities to DataTable

So, I have a method that converts entities into a DataTable. The only problem is that it is very slow. I made sure to call .ToList() on the IQueryable so the data is loaded before the results are processed into a DataTable, and it takes hardly any time to load the 3000+ rows into memory. However, the real time slayer is the following iteration in the method:
foreach (var index in imgLeaseIndexes)
{
    DataRow dataRow = dataTable.NewRow();
    dataRow["StateCode"] = index.StateCode;
    dataRow["CountyCode"] = index.CountyCode;
    dataRow["EntryNumber"] = index.EntryNumber;
    dataRow["Volume"] = index.Volume;
    dataRow["Page"] = index.Page;
    dataRow["PageCount"] = index.ImgLocation.PageCount;
    dataRow["CreateDate"] = index.ImgLocation.CreateDate;
    dataTable.Rows.Add(dataRow);
}
And here is the complete method, for what it's worth:
private DataTable buildImgLeaseIndexDataTable(List<ImgLeaseIndex> imgLeaseIndexes)
{
    var dataTable = new DataTable();
    var dataColumns = new List<DataColumn>();
    var tdiReportProperties =
        new List<string>() { "StateCode", "CountyCode", "EntryNumber", "Volume", "Page", "PageCount", "CreateDate" };
    Type imgLeaseIndexType = imgLeaseIndexes.FirstOrDefault().GetType();
    PropertyInfo[] imgLeaseIndexPropertyInfo = imgLeaseIndexType.GetProperties();
    dataColumns.AddRange(
        (from propertyInfo in imgLeaseIndexPropertyInfo
         where tdiReportProperties.Contains(propertyInfo.Name)
         select new DataColumn()
         {
             ColumnName = propertyInfo.Name,
             DataType = (propertyInfo.PropertyType.IsGenericType &&
                 propertyInfo.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>)) ?
                 propertyInfo.PropertyType.GetGenericArguments()[0] : propertyInfo.PropertyType
         })
        .ToList());
    Type imgLocationType = imgLeaseIndexes.FirstOrDefault().ImgLocation.GetType();
    PropertyInfo[] imgLocationPropertyInfo = imgLocationType.GetProperties();
    dataColumns.AddRange(
        (from propertyInfo in imgLocationPropertyInfo
         where tdiReportProperties.Contains(propertyInfo.Name)
         select new DataColumn()
         {
             ColumnName = propertyInfo.Name,
             DataType = (propertyInfo.PropertyType.IsGenericType &&
                 propertyInfo.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>)) ?
                 propertyInfo.PropertyType.GetGenericArguments()[0] : propertyInfo.PropertyType
         })
        .ToList());
    dataTable.Columns.AddRange(dataColumns.ToArray());
    foreach (var index in imgLeaseIndexes)
    {
        DataRow dataRow = dataTable.NewRow();
        dataRow["StateCode"] = index.StateCode;
        dataRow["CountyCode"] = index.CountyCode;
        dataRow["EntryNumber"] = index.EntryNumber;
        dataRow["Volume"] = index.Volume;
        dataRow["Page"] = index.Page;
        dataRow["PageCount"] = index.ImgLocation.PageCount;
        dataRow["CreateDate"] = index.ImgLocation.CreateDate;
        dataTable.Rows.Add(dataRow);
    }
    return dataTable;
}
Does anyone have ideas on how I can make this more efficient, and why it is so slow as is?
UPDATE:
I removed the reflection and explicitly set the data columns at compile time, per the feedback I've received so far, but it is still really slow. This is what the updated code looks like:
private DataTable buildImgLeaseIndexDataTable(List<ImgLeaseIndex> imgLeaseIndexes)
{
    var dataTable = new DataTable();
    var stateCodeDataColumn = new DataColumn();
    stateCodeDataColumn.ColumnName = "StateCode";
    stateCodeDataColumn.Caption = "State Code";
    stateCodeDataColumn.DataType = typeof(Int16);
    dataTable.Columns.Add(stateCodeDataColumn);
    var countyCodeDataColumn = new DataColumn();
    countyCodeDataColumn.ColumnName = "CountyCode";
    countyCodeDataColumn.Caption = "County Code";
    countyCodeDataColumn.DataType = typeof(Int16);
    dataTable.Columns.Add(countyCodeDataColumn);
    var entryNumberDataColumn = new DataColumn();
    entryNumberDataColumn.ColumnName = "EntryNumber";
    entryNumberDataColumn.Caption = "Entry Number";
    entryNumberDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(entryNumberDataColumn);
    var volumeDataColumn = new DataColumn();
    volumeDataColumn.ColumnName = "Volume";
    volumeDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(volumeDataColumn);
    var pageDataColumn = new DataColumn();
    pageDataColumn.ColumnName = "Page";
    pageDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(pageDataColumn);
    var pageCountDataColumn = new DataColumn();
    pageCountDataColumn.ColumnName = "PageCount";
    pageCountDataColumn.Caption = "Page Count";
    pageCountDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(pageCountDataColumn);
    var createDateDataColumn = new DataColumn();
    createDateDataColumn.ColumnName = "CreateDate";
    createDateDataColumn.Caption = "Create Date";
    createDateDataColumn.DataType = typeof(DateTime);
    dataTable.Columns.Add(createDateDataColumn);
    foreach (var index in imgLeaseIndexes)
    {
        DataRow dataRow = dataTable.NewRow();
        dataRow["StateCode"] = index.StateCode;
        dataRow["CountyCode"] = index.CountyCode;
        dataRow["EntryNumber"] = index.EntryNumber;
        dataRow["Volume"] = index.Volume;
        dataRow["Page"] = index.Page;
        dataRow["PageCount"] = index.ImgLocation.PageCount;
        dataRow["CreateDate"] = index.ImgLocation.CreateDate;
        dataTable.Rows.Add(dataRow);
    }
    return dataTable;
}
Any ideas on what else might be causing this?
UPDATE 2:
So it looks like other people have had this problem, specifically with creating and setting the DataRows. My co-worker came across this:
The DataRow value setter is slow!
I'm going to try some of the stuff suggested in the link.
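One of the commonly suggested fixes from that discussion, sketched against this method (unverified against the real schema): wrap the load in BeginLoadData/EndLoadData and add each row as a single object array, which bypasses the per-column DataRow indexer:

dataTable.BeginLoadData(); // turns off notifications, index maintenance and constraints during the load
foreach (var index in imgLeaseIndexes)
{
    // Values must be supplied in column order (StateCode ... CreateDate).
    dataTable.Rows.Add(new object[]
    {
        index.StateCode,
        index.CountyCode,
        index.EntryNumber,
        index.Volume,
        index.Page,
        index.ImgLocation.PageCount,
        index.ImgLocation.CreateDate
    });
}
dataTable.EndLoadData();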
As Thiago mentioned in the comments, the problem is the use of reflection.
The reflection part is where you are using PropertyInfo: you are getting the structure of the data types at runtime, but these are known at compile time.