When using NumericField, I get absolutely nothing back every time (Lucene.NET)

I've been playing with Lucene.NET for the last two days.
After reading up on dates, I was led to believe that dates are best converted to milliseconds and stored in a NumericField, with Index=true and Store=No.
But now nothing ever returns; it must be something basic, but I'm just not seeing it.
The saving code is as follows:
...
else if (type == typeof (DateTime?))
{
    var typedValue = (DateTime?) value;
    field = numericField = new NumericField(documentFieldName, 4, Field.Store.YES, true);
    long milliseconds = typedValue.HasValue ? (typedValue.Value.Date.Ticks / TimeSpan.TicksPerMillisecond) : 0;
    numericField.SetLongValue(milliseconds);
    doc.Add(numericField);
}
...
else
{
    field = stringField = new Field(
        documentFieldName,
        (value != null) ? value.ToString() : string.Empty,
        Field.Store.YES,
        Field.Index.ANALYZED);
    doc.Add(stringField);
}
// Write the Document to the catalog
indexWriter.AddDocument(doc);
When I query for docs against the values saved in Field ... no problem.
When I query for documents by matching against the values in NumericFields, nothing returns.
Where did I go wrong?
Thanks for your help.
Lucene.Net.Analysis.Analyzer analyzer = new Lucene.Net.Analysis.Standard.StandardAnalyzer(Lucene.Net.Util.Version.LUCENE_30);
var q2 = NumericRangeQuery.NewLongRange("Val", 3, 3, true, true);
var uxy2 = documentSearchManagementService.Search("Students", termQuery, "Id");
Using:
public ScoredDocumentResult[] Search(string indexName, Query query, params string[] hitFieldNamesToReturn)
{
    if (_configuration.IndexRootDirectory.IsNullOrEmpty())
    {
        throw new Exception("Configuration.IndexRootDirectory has not been configured yet.");
    }
    indexName.ValidateIsNotNullOrEmpty("indexName");
    hitFieldNamesToReturn.ValidateIsNotDefault("hitFieldNamesToReturn");

    // Specify the index file location where the indexes are to be stored
    string indexFileLocation = Path.Combine(_configuration.IndexRootDirectory, indexName);
    Lucene.Net.Store.Directory luceneDirectory = Lucene.Net.Store.FSDirectory.Open(indexFileLocation);

    IndexSearcher indexSearcher = new IndexSearcher(luceneDirectory);
    TopScoreDocCollector topScoreDocCollector = TopScoreDocCollector.Create(10, true);
    indexSearcher.Search(query, topScoreDocCollector);

    List<ScoredDocumentResult> results = new List<ScoredDocumentResult>();
    foreach (var scoreDoc in topScoreDocCollector.TopDocs(0, 10).ScoreDocs)
    {
        ScoredDocumentResult resultItem = new ScoredDocumentResult();
        Lucene.Net.Documents.Document doc = indexSearcher.Doc(scoreDoc.Doc);
        resultItem.Score = scoreDoc.Score;

        List<ScoredDocumentFieldResult> fields = new List<ScoredDocumentFieldResult>();
        foreach (string fieldName in hitFieldNamesToReturn)
        {
            string fieldValue = doc.Get(fieldName);
            fields.Add(new ScoredDocumentFieldResult { Key = fieldName, Value = fieldValue });
        }
        resultItem.FieldValues = fields.ToArray();
        results.Add(resultItem);
    }
    indexSearcher.Close();
    return results.ToArray();
}
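Aside (a hedged observation, not from the original post): NumericRangeQuery only matches when the field name and the numeric scale agree exactly with what was indexed. Since the save path indexes milliseconds, a date query would need its bounds in those same milliseconds, not small literals like 3. A minimal sketch, where "CreatedOn" is a hypothetical field name standing in for whatever documentFieldName actually was:
// Sketch only: query a NumericField the same way it was written.
DateTime day = new DateTime(2012, 6, 1);
long ms = day.Date.Ticks / TimeSpan.TicksPerMillisecond; // same conversion as the save path

// Field name must match documentFieldName exactly; bounds are inclusive.
var dateQuery = NumericRangeQuery.NewLongRange("CreatedOn", ms, ms, true, true);
var hits = documentSearchManagementService.Search("Students", dateQuery, "Id");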

Related

Error VS403357 when batch creating many work items in Azure DevOps using .NET API

I'm trying to use the Azure DevOps .NET API to batch create work items in an Azure DevOps repository, but when I submit the batch request, I get back an error message: "VS403357: Work items in the batch are expected to be unique, but found work item with ID -1 in more than one request."
Here's my code:
public void ExecuteWorkItemMigration(int[] workItemIds, IProgress<ProgressResult> progress = null)
{
    var wiql = "SELECT * FROM WorkItems";
    var query = new Query(_workItemStore, wiql, workItemIds);
    var workItemCollection = query.RunQuery();
    string projectName = MainSettings.AzureDevOpsSettings.ProjectName;
    List<WitBatchRequest> batchRequests = new List<WitBatchRequest>();
    foreach (WorkItemTfs tfsWorkItem in workItemCollection)
    {
        JsonPatchDocument document = CreateJsonPatchDocument(tfsWorkItem);
        string workItemType = GetWorkItemType(tfsWorkItem);
        WitBatchRequest wibr = _azureDevopsWorkItemTrackingClient.CreateWorkItemBatchRequest(
            projectName, workItemType, document, true, true);
        batchRequests.Add(wibr);
    }
    List<WitBatchResponse> results = _azureDevopsWorkItemTrackingClient.ExecuteBatchRequest(batchRequests).Result;
}
private static JsonPatchDocument CreateJsonPatchDocument(WorkItemTfs tfsWorkItem, int id = -1)
{
    var document = new JsonPatchDocument();
    document.Add(
        new JsonPatchOperation
        {
            Path = "/id",
            Operation = Operation.Add,
            Value = id
        });
    document.Add(
        new JsonPatchOperation
        {
            Path = "/fields/System.Title",
            Operation = Operation.Add,
            Value = tfsWorkItem.Title
        });
    if (tfsWorkItem.Fields.Contains("ReproSteps"))
    {
        document.Add(
            new JsonPatchOperation
            {
                Path = "/fields/Microsoft.VSTS.TCM.ReproSteps",
                Operation = Operation.Add,
                Value = tfsWorkItem.Fields["ReproSteps"].Value
            });
    }
    return document;
}
Any suggestions about what I need to do to get this working properly?
I have tried submitting different unique IDs, but it doesn't seem to prevent the error from happening.
You need to use a unique negative ID for each work item you create.
Something like this:
public void ExecuteWorkItemMigration(int[] workItemIds, IProgress<ProgressResult> progress = null)
{
    var wiql = "SELECT * FROM WorkItems";
    var query = new Query(_workItemStore, wiql, workItemIds);
    var workItemCollection = query.RunQuery();
    string projectName = MainSettings.AzureDevOpsSettings.ProjectName;
    List<WitBatchRequest> batchRequests = new List<WitBatchRequest>();
    int id = -1;
    foreach (WorkItemTfs tfsWorkItem in workItemCollection)
    {
        // Pass a distinct negative id for every new work item in the batch.
        JsonPatchDocument document = CreateJsonPatchDocument(tfsWorkItem, id--);
        string workItemType = GetWorkItemType(tfsWorkItem);
        WitBatchRequest wibr = _azureDevopsWorkItemTrackingClient.CreateWorkItemBatchRequest(
            projectName, workItemType, document, true, true);
        batchRequests.Add(wibr);
    }
    List<WitBatchResponse> results = _azureDevopsWorkItemTrackingClient.ExecuteBatchRequest(batchRequests).Result;
}
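A possible follow-up (my own hedged sketch, not part of the original answer): each WitBatchResponse lines up positionally with the request that produced it, so checking the returned status codes makes per-item failures visible instead of silently discarding results:
// Sketch: surface per-item failures from the batch response.
for (int i = 0; i < results.Count; i++)
{
    if (results[i].Code < 200 || results[i].Code >= 300)
    {
        Console.WriteLine($"Batch item {i} failed with HTTP {results[i].Code}: {results[i].Body}");
    }
}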

Trying to Move Cards

I am trying to move a card I have found using the search function from one list to another on the same board.
I have tried to set up a new list using a new factory.List, and searching for the list using a FirstOrDefault LINQ query, but I keep getting an error.
Below is some code I have tried to move my cards.
string query = jnum;
var search = factory.Search(query, 1, SearchModelType.Cards, new IQueryable[] { board });
await search.Refresh();
var CardList = search.Cards.ToList();
foreach (var card in CardList)
{
    string tName = card.Name.Substring(0, 6);
    if (tName == jnum.Trim())
    {
        cardid = card.Id;
    }
}
var FoundCard = factory.Card(cardid);
string FoundListid = FoundCard.List.Id;
var fromlist = factory.List(FoundListid);
if (Item != null)
{
    if (mail.Body.ToUpper().Contains("Approved for Print".ToUpper()))
    {
        //var ToList = factory.List("5db19603e4428377d77963b4");
        var ToList = board.Lists.FirstOrDefault(l => l.Name == "Signed Off");
        FoundCard.List = ToList;
        // from on proof
        //MessageBox.Show("Approved for Print");
    }
    else if (mail.Body.ToUpper().Contains("Awaiting Review".ToUpper()))
    {
        //var ToList = factory.List("5db19603e4428377d77963b3");
        var ToList = board.Lists.FirstOrDefault(l => l.Name == "On Proof");
        FoundCard.List = ToList;
        // from in progress or to start
        //MessageBox.Show("Awaiting Review");
    }
    else if (mail.Body.ToUpper().Contains("Amends".ToUpper()))
    {
        var ToList = factory.List("5dc9442eb245e60a39b3d4a7");
        FoundCard.List = ToList;
        // from on proof
        //MessageBox.Show("Amends");
    }
    else
    {
        // non job mail
    }
}
I keep getting an "is not a valid value" error.
Thanks for any help.
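For what it's worth, one hedged guess based on the snippet above: FirstOrDefault returns null when no list on the board matches the name, and assigning a null list to a card is one way to provoke an "is not a valid value" error. A minimal guard, reusing the same identifiers:
// Sketch: verify the lookup succeeded before assigning the list.
var ToList = board.Lists.FirstOrDefault(l => l.Name == "Signed Off");
if (ToList == null)
{
    // No list with that name was found on this board, so assigning it
    // to FoundCard.List would fail with "is not a valid value".
    throw new InvalidOperationException("List 'Signed Off' not found on board.");
}
FoundCard.List = ToList;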

Google chart data from external source

I am using a C# WebMethod to provide data to a Google line chart. However, I am stuck on the format of the data.
Currently I can only send the data as a list of objects, but I want to send the data in another format.
Is this possible? Please find my code snippet below:
ASP Page
// call to get data from ASP web method
$.ajax({
    type: "POST",
    contentType: 'application/json',
    data: '{}',
    url: 'AdminDashboard.aspx/GetChartData',
    beforeSend: function () { alert("before send"); },
    complete: function () { alert("complete"); },
    success: function (data) {
        var linedata1 = new google.visualization.arrayToDataTable(data.d);
        linedata1.insertColumn(0, 'date', linedata1.getColumnLabel(0));
        // copy values from column 1 (old column 0) to column 0, converted to Date
        for (var i = 0; i < linedata1.getNumberOfRows(); i++) {
            var val = linedata1.getValue(i, 1);
            if (val != '' && val != null) {
                var dateArray = val.split('/');
                var year = dateArray[2];
                var month = dateArray[0] - 1; // convert to JavaScript's 0-indexed months
                var day = dateArray[1];
                linedata1.setValue(i, 0, new Date(year, month, day));
            }
        }
        // remove column 1 (the old column 0)
        linedata1.removeColumn(1);
        dashboard.bind(programmaticSlider, programmaticChart);
        dashboard.draw(linedata1);
    }
});
ASP webmethod
[WebMethod]
public static List<object> GetChartData()
{
    DataTable chartData = new DataTable();
    DataTable tktData = new DataTable();
    Array arrdata = null;
    SPList configList, tktList;
    SPQuery dataQuery;
    SPListItemCollection configColl;
    SPSecurity.RunWithElevatedPrivileges(delegate()
    {
        using (SPSite oSite = new SPSite("http://dsknomoe11:9696/"))
        {
            using (SPWeb oWeb = oSite.OpenWeb())
            {
                // Get the Config list values for Ticket Priority and Ticket Status
                configList = oWeb.Lists.TryGetList("SCMS_Configuration");
                dataQuery = new SPQuery();
                dataQuery.Query = "<Where><Or><Eq><FieldRef Name='Title' /><Value Type='Text'>Ticket Priority</Value></Eq><Eq><FieldRef Name='Title' /><Value Type='Text'>Ticket Status</Value></Eq></Or></Where>";
                configColl = configList.GetItems(dataQuery);
                // Ticket Priority and Status
                foreach (SPListItem oItem in configColl)
                {
                    // Get Ticket Priority
                    if (oItem["Title"].ToString().Equals("Ticket Priority"))
                    {
                        tktPriority = oItem["value"].ToString().Split(';');
                    }
                    // Get Ticket Status
                    else if (oItem["Title"].ToString().Equals("Ticket Status"))
                    {
                        tktStatus = oItem["value"].ToString().Split(';');
                    }
                }
                // Add columns to DataTable chartData
                foreach (string s in tktStatus)
                {
                    chartData.Columns.Add(s);
                }
                // Get the active tickets, grouped by creation date
                tktList = oWeb.Lists.TryGetList("SCMS_Tickets");
                dataQuery = new SPQuery();
                dataQuery.Query = "<Where><Eq><FieldRef Name='isAct' /><Value Type='Choice'>Yes</Value></Eq></Where><OrderBy><FieldRef Name='Created' Ascending='True' /></OrderBy><GroupBy Collapse='True'><FieldRef Name='Created' /></GroupBy>";
                dataQuery.ViewFields = "<FieldRef Name='Created' /><FieldRef Name='tckPrty' /><FieldRef Name='tckStat' />";
                dataQuery.ViewFieldsOnly = true;
                tktData = tktList.GetItems(dataQuery).GetDataTable();
                var grpdata = tktData.AsEnumerable().Select(x => new { Date = Convert.ToDateTime(x[0]).Date.ToString("MM/dd/yyyy"), Status = x[2] }).ToArray();
                arrdata = grpdata.GroupBy(l => l.Date).Select(g => new
                {
                    Date = g.Key,
                    Open = g.Count(l => (string)l.Status == "Open"),
                    Closed = g.Count(l => (string)l.Status == "Closed"),
                    InProgress = g.Count(l => (string)l.Status == "In-Progress"),
                    Total = g.Count(l => ((string)l.Status == "Open") || ((string)l.Status == "Closed") || ((string)l.Status == "In-Progress"))
                }).ToArray();
            }
        }
    });
    List<object> list = new List<object>();
    list.Add(new object[] { "Date", "Open", "Closed", "In-Progress", "Total" });
    string strDt, strOpen, strClosed, strInP, strTotal;
    for (int i = 0; i < arrdata.Length; i++)
    {
        var spltData = arrdata.GetValue(i).ToString().Replace('{', ' ').Replace('}', ' ').Split(',');
        strDt = spltData[0];
        strOpen = spltData[1];
        strClosed = spltData[2];
        strInP = spltData[3];
        strTotal = spltData[4].Trim();
        list.Add(new object[] { Convert.ToDateTime(strDt.Split('=').Last()).Date.ToString("MM/dd/yyyy"),
            Convert.ToInt32(strOpen.Split('=').Last()),
            Convert.ToInt32(strClosed.Split('=').Last()),
            Convert.ToInt32(strInP.Split('=').Last()),
            Convert.ToInt32(strTotal.Split('=').Last())
        });
    }
    return list;
}
Please help me with this.
My basic requirement is to create a DataTable (or any other format) with dynamic columns and then bind the data to a Google multi-line chart.
For clarity, my current data format (the list data) is:
Date    Open  Closed  InProgress
1/2/13  2     0       0
2/2/13  1     2       0
3/3/13  0     0       1
4/3/13  0     1       0
You can serve the data in many formats (JSON, CSV, TSV, HTML), but I strongly recommend using JSON; you can see Google's default JSON structure in the Google Charts documentation.
I got it working by using the JSON format.
I have posted the snippet below; it might help someone.
// for dynamic data binding
string jsonData = @"{ ""cols"" :[ { ""label"" : ""Type"" , ""type"" : ""string"" },{ ""label"" : ""Count"" , ""type"" : ""number"" }], ""rows"" :[{""c"":[{""v"":""Mushrooms""},{""v"":3}]}," +
    @"{""c"":[{""v"":""Total""},{""v"":1}]}," +
    @"{""c"":[{""v"":""Open""},{""v"":1}]}," +
    @"{""c"":[{""v"":""Closed""},{""v"":1}]}," +
    @"{""c"":[{""v"":""In-Progress""},{""v"":2}]}]}";
The catch was to use "double quotes" for every value, since JSON only recognizes double quotes.
Thanks
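As a side note (my own sketch, not from the original answer): the same structure can be produced without hand-concatenated strings by serializing anonymous objects, assuming Json.NET is available:
using Newtonsoft.Json; // assumption: Json.NET is referenced

var table = new
{
    cols = new object[]
    {
        new { label = "Type", type = "string" },
        new { label = "Count", type = "number" }
    },
    rows = new object[]
    {
        new { c = new object[] { new { v = "Open" }, new { v = 1 } } },
        new { c = new object[] { new { v = "Closed" }, new { v = 1 } } },
        new { c = new object[] { new { v = "In-Progress" }, new { v = 2 } } }
    }
};

// The serializer emits the double quotes correctly; no manual escaping needed.
string jsonData = JsonConvert.SerializeObject(table);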

Quickbooks CustomerQuery with Iterator returning duplicate records

I have a strange issue, and I am not sure where the error is.
I am working in C#, and while running a query with filters against the Customer object, I get a strange series of results.
If the filter I set should return 8 records, my implementation returns 8, 8, 3, 1, 1, 1 records (in that order, on every subsequent call, and all of them duplicates)...
This is how I implemented the Iterator:
var customerQuery = new CustomerQuery()
{
    ChunkSize = "8",
    Item = Guid.NewGuid().ToString("N"),
    IncludeFinancialIndicator = false,
    IncludeFinancialIndicatorSpecified = true,
    MinimumBalanceSpecified = true,
    MinimumBalance = 92,
    SynchronizedFilterSpecified = true,
    SynchronizedFilter = SynchronizedFilterEnumType.Synchronized,
};
var results = new List<Customer>();
int count;
// Loop until all the results are found.
do
{
    IEnumerable<Customer> partialResult = customerQuery.ExecuteQuery<Customer>(context);
    count = partialResult.Count();
    // First pass here returns 8 records
    // Second pass returns the same 8 records again
    // Third pass returns 3 records
    // Three more passes with 1 record each
    results.AddRange(partialResult);
} while (count > 0);
Am I doing something wrong, or missing anything?
Thanks!
Edit:
This is the code implementing the paging option:
var customerSet = new List<Customer>();
List<Customer> customerQueryResult = null;
int startPage = 1;
var qbdCustomerQuery = new CustomerQuery();
do
{
    qbdCustomerQuery.ChunkSize = "10";
    qbdCustomerQuery.MinimumBalanceSpecified = true;
    qbdCustomerQuery.MinimumBalance = 92;
    qbdCustomerQuery.ItemElementName = ItemChoiceType4.StartPage;
    qbdCustomerQuery.Item = startPage.ToString();
    customerQueryResult = qbdCustomerQuery.ExecuteQuery<Customer>(context).ToList();
    if (customerQueryResult.Count > 0) { customerSet.AddRange(customerQueryResult); }
    startPage++;
} while (customerQueryResult.Count > 0);
If I set ChunkSize to "100" I get the 8 expected records, but when I run it with ChunkSize = "10" I get 8, 8, 1 (duplicate records).
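One hedged observation (my own, since no accepted answer appears here): with StartPage paging, a common termination pattern is to stop as soon as a page comes back smaller than the chunk size, rather than looping until an empty page; that avoids the extra trailing calls where the overlaps show up. A minimal sketch against the same query object, assuming the filters above are already set:
// Sketch: stop when a short page signals the last page of results.
const int pageSize = 10;
var customerSet = new List<Customer>();
int startPage = 1;
List<Customer> page;
do
{
    qbdCustomerQuery.ChunkSize = pageSize.ToString();
    qbdCustomerQuery.ItemElementName = ItemChoiceType4.StartPage;
    qbdCustomerQuery.Item = startPage.ToString();
    page = qbdCustomerQuery.ExecuteQuery<Customer>(context).ToList();
    customerSet.AddRange(page);
    startPage++;
} while (page.Count == pageSize); // a short page means we've reached the end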

Need more efficient way to convert Entities to DataTable

So, I have a method that converts entities into a DataTable. The only problem is that it is very slow. I made sure to call .ToList() on the IQueryable so the data loads before the results are processed into a DataTable, and it takes hardly any time to load the 3000+ rows into memory. However, the real time killer is the following iteration in the method:
foreach (var index in imgLeaseIndexes)
{
    DataRow dataRow = dataTable.NewRow();
    dataRow["StateCode"] = index.StateCode;
    dataRow["CountyCode"] = index.CountyCode;
    dataRow["EntryNumber"] = index.EntryNumber;
    dataRow["Volume"] = index.Volume;
    dataRow["Page"] = index.Page;
    dataRow["PageCount"] = index.ImgLocation.PageCount;
    dataRow["CreateDate"] = index.ImgLocation.CreateDate;
    dataTable.Rows.Add(dataRow);
}
And here is the complete method, for what it's worth:
private DataTable buildImgLeaseIndexDataTable(List<ImgLeaseIndex> imgLeaseIndexes)
{
    var dataTable = new DataTable();
    var dataColumns = new List<DataColumn>();
    var tdiReportProperties =
        new List<string>() { "StateCode", "CountyCode", "EntryNumber", "Volume", "Page", "PageCount", "CreateDate" };

    Type imgLeaseIndexType = imgLeaseIndexes.FirstOrDefault().GetType();
    PropertyInfo[] imgLeaseIndexPropertyInfo = imgLeaseIndexType.GetProperties();
    dataColumns.AddRange(
        (from propertyInfo in imgLeaseIndexPropertyInfo
         where tdiReportProperties.Contains(propertyInfo.Name)
         select new DataColumn()
         {
             ColumnName = propertyInfo.Name,
             DataType = (propertyInfo.PropertyType.IsGenericType &&
                         propertyInfo.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>)) ?
                 propertyInfo.PropertyType.GetGenericArguments()[0] : propertyInfo.PropertyType
         })
        .ToList());

    Type imgLocationType = imgLeaseIndexes.FirstOrDefault().ImgLocation.GetType();
    PropertyInfo[] imgLocationPropertyInfo = imgLocationType.GetProperties();
    dataColumns.AddRange(
        (from propertyInfo in imgLocationPropertyInfo
         where tdiReportProperties.Contains(propertyInfo.Name)
         select new DataColumn()
         {
             ColumnName = propertyInfo.Name,
             DataType = (propertyInfo.PropertyType.IsGenericType &&
                         propertyInfo.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>)) ?
                 propertyInfo.PropertyType.GetGenericArguments()[0] : propertyInfo.PropertyType
         })
        .ToList());

    dataTable.Columns.AddRange(dataColumns.ToArray());

    foreach (var index in imgLeaseIndexes)
    {
        DataRow dataRow = dataTable.NewRow();
        dataRow["StateCode"] = index.StateCode;
        dataRow["CountyCode"] = index.CountyCode;
        dataRow["EntryNumber"] = index.EntryNumber;
        dataRow["Volume"] = index.Volume;
        dataRow["Page"] = index.Page;
        dataRow["PageCount"] = index.ImgLocation.PageCount;
        dataRow["CreateDate"] = index.ImgLocation.CreateDate;
        dataTable.Rows.Add(dataRow);
    }
    return dataTable;
}
Does anyone have ideas on how I can make this more efficient and why it is so slow as is?
UPDATE:
I removed the reflection and explicitly set the data columns at compile time per the feedback I've received so far, but it is still really slow. This is what the updated code looks like:
private DataTable buildImgLeaseIndexDataTable(List<ImgLeaseIndex> imgLeaseIndexes)
{
    var dataTable = new DataTable();

    var stateCodeDataColumn = new DataColumn();
    stateCodeDataColumn.ColumnName = "StateCode";
    stateCodeDataColumn.Caption = "State Code";
    stateCodeDataColumn.DataType = typeof(Int16);
    dataTable.Columns.Add(stateCodeDataColumn);

    var countyCodeDataColumn = new DataColumn();
    countyCodeDataColumn.ColumnName = "CountyCode";
    countyCodeDataColumn.Caption = "County Code";
    countyCodeDataColumn.DataType = typeof(Int16);
    dataTable.Columns.Add(countyCodeDataColumn);

    var entryNumberDataColumn = new DataColumn();
    entryNumberDataColumn.ColumnName = "EntryNumber";
    entryNumberDataColumn.Caption = "Entry Number";
    entryNumberDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(entryNumberDataColumn);

    var volumeDataColumn = new DataColumn();
    volumeDataColumn.ColumnName = "Volume";
    volumeDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(volumeDataColumn);

    var pageDataColumn = new DataColumn();
    pageDataColumn.ColumnName = "Page";
    pageDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(pageDataColumn);

    var pageCountDataColumn = new DataColumn();
    pageCountDataColumn.ColumnName = "PageCount";
    pageCountDataColumn.Caption = "Page Count";
    pageCountDataColumn.DataType = typeof(string);
    dataTable.Columns.Add(pageCountDataColumn);

    var createDateDataColumn = new DataColumn();
    createDateDataColumn.ColumnName = "CreateDate";
    createDateDataColumn.Caption = "Create Date";
    createDateDataColumn.DataType = typeof(DateTime);
    dataTable.Columns.Add(createDateDataColumn);

    foreach (var index in imgLeaseIndexes)
    {
        DataRow dataRow = dataTable.NewRow();
        dataRow["StateCode"] = index.StateCode;
        dataRow["CountyCode"] = index.CountyCode;
        dataRow["EntryNumber"] = index.EntryNumber;
        dataRow["Volume"] = index.Volume;
        dataRow["Page"] = index.Page;
        dataRow["PageCount"] = index.ImgLocation.PageCount;
        dataRow["CreateDate"] = index.ImgLocation.CreateDate;
        dataTable.Rows.Add(dataRow);
    }
    return dataTable;
}
Any ideas on what else might be causing this?
UPDATE 2:
So it looks like other people have had this problem, specifically with creating and setting the DataRows. My co-worker came across this:
The DataRow value setter is slow!
I'm going to try some of the stuff suggested in the link.
As Thiago mentioned in the comments, the problem is the use of reflection.
The reflection part is where you are using PropertyInfo: you are discovering the structure of the data types at runtime, but they are known at compile time.
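To make that concrete, a hedged sketch of the usual fast-load pattern (my example, not the answerer's): BeginLoadData suspends constraint and event overhead while loading, and Rows.Add(object[]) avoids the per-column string-indexer lookups that the linked post identifies as slow. Note the object[] order must match the column order defined in buildImgLeaseIndexDataTable above:
// Sketch: bulk-load rows instead of setting each column through the indexer.
dataTable.BeginLoadData(); // suspends notifications, index maintenance and constraints
foreach (var index in imgLeaseIndexes)
{
    dataTable.Rows.Add(new object[]
    {
        index.StateCode,
        index.CountyCode,
        index.EntryNumber,
        index.Volume,
        index.Page,
        index.ImgLocation.PageCount,
        index.ImgLocation.CreateDate
    });
}
dataTable.EndLoadData();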