Extending TokenStream - lucene.net

I am trying to index into a document a field with one term that has a payload.
Since the only constructor of Field that can work for me takes a TokenStream, I decided to inherit from this class and give the most basic implementation for what I need:
public class MyTokenStream : TokenStream
{
    TermAttribute termAtt;
    PayloadAttribute payloadAtt;
    bool moreTokens = true;

    public MyTokenStream()
    {
        termAtt = (TermAttribute)GetAttribute(typeof(TermAttribute));
        payloadAtt = (PayloadAttribute)GetAttribute(typeof(PayloadAttribute));
    }

    public override bool IncrementToken()
    {
        if (moreTokens)
        {
            termAtt.SetTermBuffer("my_val");
            payloadAtt.SetPayload(new Payload(/*byte[] data*/));
            moreTokens = false;
        }
        return false;
    }
}
The code which was used while indexing:
IndexWriter writer = //init index writer...
Document d = new Document();
d.Add(new Field("field_name", new MyTokenStream()));
writer.AddDocument(d);
writer.Commit();
And the code that was used during the search:
IndexSearcher searcher = //init index searcher
Query query = new TermQuery(new Term("field_name", "my_val"));
TopDocs result = searcher.Search(query, null, 10);
I used the debugger to verify that the call to IncrementToken() actually sets the TermBuffer.
My problem is that the returned TopDocs instance contains no documents, and I can't understand why... Actually I started with TermPositions (which gives me access to the Payload...), but it also gave me no results.
Can someone explain what I am doing wrong?
I am currently using Lucene.NET 2.9.2

After you set the TermBuffer you need to return true from IncrementToken; return false only when you have nothing left to feed the TermBuffer with.
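For reference, a minimal sketch of the corrected method (the byte array here is only placeholder payload data):

    public override bool IncrementToken()
    {
        if (!moreTokens)
            return false;                    // stream exhausted, nothing more to emit

        termAtt.SetTermBuffer("my_val");                        // the single term
        payloadAtt.SetPayload(new Payload(new byte[] { 1 }));   // placeholder payload bytes
        moreTokens = false;
        return true;                         // a token was produced on this call
    }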


Entity Framework 6: is it possible to update specific object property without getting the whole object?

I have an object with several really large string properties. In addition, it has a simple timestamp property.
What I am trying to achieve is to update only the timestamp property without pulling the whole huge object from the database.
Eventually, I would like to use EF to do, in the most performant way, something equivalent to this:
update [...]
set [...] = [...]
where [...]
Using the following, you can update a single column:
var yourEntity = new YourEntity() { Id = id, DateProp = dateTime };
using (var db = new MyEfContextName())
{
    db.YourEntities.Attach(yourEntity);
    db.Entry(yourEntity).Property(x => x.DateProp).IsModified = true;
    db.SaveChanges();
}
OK, I managed to handle this. The solution is the same as the one proposed by Seany84, with the only addition of disabling validation in order to overcome an issue with required fields. Basically, I had to add the following line just before SaveChanges():
db.Configuration.ValidateOnSaveEnabled = false;
So, the complete solution is:
var yourEntity = new YourEntity() { Id = id, DateProp = dateTime };
using (var db = new MyEfContextName())
{
    db.YourEntities.Attach(yourEntity);
    db.Entry(yourEntity).Property(x => x.DateProp).IsModified = true;
    db.Configuration.ValidateOnSaveEnabled = false;
    db.SaveChanges();
}
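If attaching a stub entity feels heavy, another option is a parameterized raw UPDATE through EF6. This is just a sketch (it assumes the table is dbo.YourEntities and the key column is Id); it touches only that one column and loads nothing into the context:

    using (var db = new MyEfContextName())
    {
        // EF6 turns the positional arguments into @p0/@p1 parameters.
        db.Database.ExecuteSqlCommand(
            "UPDATE dbo.YourEntities SET DateProp = @p0 WHERE Id = @p1",
            dateTime, id);
    }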

Testing With A Fake DbContext and Autofixture and Moq

So I followed this example on how to make a fake DbContext for testing, and my test using just that works fine:
[Test]
public void CiudadIndex()
{
    var ciudades = new FakeDbSet<Ciudad>
    {
        new Ciudad {CiudadId = 1, EmpresaId = 1, Descripcion = "Santa Cruz", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1},
        new Ciudad {CiudadId = 2, EmpresaId = 1, Descripcion = "La Paz", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1},
        new Ciudad {CiudadId = 3, EmpresaId = 1, Descripcion = "Cochabamba", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1}
    };

    // Create mock unit of work
    var mockData = new Mock<IContext>();
    mockData.Setup(m => m.Ciudades).Returns(ciudades);

    // Setup controller
    var homeController = new CiudadController(mockData.Object);

    // Invoke
    var viewResult = homeController.Index();
    var ciudades_de_la_vista = (IEnumerable<Ciudad>)viewResult.Model;

    // Assert..
}
I am now trying to use AutoFixture with Moq to create "ciudades", but I can't. I tried this:
var fixture = new Fixture();
var ciudades = fixture.Build<FakeDbSet<Ciudad>>().CreateMany<FakeDbSet<Ciudad>>();
var mockData = new Mock<IContext>();
mockData.Setup(m => m.Ciudades).Returns(ciudades);
and I get this error (I can't write "<>" here, so I replaced it with "()" in the message):
Cannot convert System.Collections.Generic.IEnumerable(FakeDbSet(Ciudad)) to System.Data.Entity.IDbSet(Ciudad)
Implementation of IContext and FakeDbSet
public interface IContext
{
    IDbSet<Ciudad> Ciudades { get; }
}
public class FakeDbSet<T> : IDbSet<T> where T : class
How can I make this work?
A minor point... In stuff like:
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany<Ciudad>();
The second type arg is unnecessary and should be:
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany();
I can't say I really understand why you need a FakeDbSet, and the article is a bit TL;DR... In general, I try to avoid faking and mucking with ORM bits and instead deal with interfaces returning POCOs to the maximum degree possible.
That aside... The reason the normal syntax for initialising the list works is that there is an Add (and IEnumerable) in FakeDbSet. AutoFixture doesn't have a story for that pattern directly (after all, it is compiler syntactic sugar and not particularly amenable to reflection or in line with any other conventions), but you can use AddManyTo as long as there is an ICollection in play. Luckily, within the impl of FakeDbSet as in the article, the following gives us an in:
public ObservableCollection<T> Local
{
    get { return _data; }
}
As ObservableCollection<T> derives from ICollection<T>, you should be able to:
var ciudades = new FakeDbSet<Ciudad>();
fixture.AddManyTo(ciudades.Local);
var mockData = new Mock<IContext>();
mockData.Setup(m => m.Ciudades).Returns(ciudades);
It's possible to wire up a customization to make this prettier, but at least you have a way to manage it. The other option is to have something implement ICollection (or add a prop with a setter taking IEnumerable<T>) and have AF generate the parent object, causing said collection to be filled in.
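For example, a rough sketch of such a wiring (assuming the FakeDbSet<T> from the article, with its Local property shown above): once a factory is registered, any resolved FakeDbSet<Ciudad> comes back pre-populated.

    var fixture = new Fixture();

    // Teach the fixture how to build a populated FakeDbSet<Ciudad>.
    fixture.Register(() =>
    {
        var set = new FakeDbSet<Ciudad>();
        fixture.AddManyTo(set.Local);   // Local is an ObservableCollection<Ciudad>, i.e. an ICollection<Ciudad>
        return set;
    });

    var mockData = new Mock<IContext>();
    mockData.Setup(m => m.Ciudades).Returns(fixture.Create<FakeDbSet<Ciudad>>());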
Long superseded side note: In your initial question, you effectively have:
fixture.Build<FakeDbSet<Ciudad>>().CreateMany()
The problem becomes clearer then - you are asking AF to generate Many FakeDbSet<Ciudad>s, which is not what you want.
I haven't used AutoFixture in a while, but shouldn't it be:
var ciudades = new FakeDbSet<Ciudad>();
fixture.AddManyTo(ciudades);
For the moment I ended up doing this; I will keep reading about how to use AutoMoq, since I'm new to this:
var fixture = new Fixture();
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany<Ciudad>();
var ciudades = new FakeDbSet<Ciudad>();
foreach (var item in ciudades_fixture)
{
    ciudades.Add(item);
}
var mockData = new Mock<IContext>();
fixture.Create<Mock<IContext>>();
mockData.Setup(r => r.Ciudades).Returns(ciudades);

How do you increment a field in mongodb using c#

Thought this would be pretty straightforward, but my value is remaining the same (0).
What I'd like to do is increment my UnreadMessages field when the user receives a message they haven't read and then decrement it when they have. So I thought code like this would work:
var userHelper = new MongoHelper<User>();
//increment
userHelper.Collection.Update(Query.EQ("Id", userId.ToId()), Update.Inc("UnreadMessages", 1));
//decrement
userHelper.Collection.Update(Query.EQ("Id", userId.ToId()), Update.Inc("UnreadMessages", -1));
After running these, no errors are thrown but the value doesn't change either. And no, I'm not running one right after the other, as the code above might suggest :)
Update
Here's my helper class:
public class MongoHelper<T> : Sandbox.Services.IMongoHelper<T> where T : class
{
    public MongoCollection<T> Collection { get; private set; }

    public MongoHelper()
    {
        var con = new MongoConnectionStringBuilder(ConfigurationManager.ConnectionStrings["MongoDB"].ConnectionString);
        var server = MongoServer.Create(con);
        var db = server.GetDatabase(con.DatabaseName);
        Collection = db.GetCollection<T>(typeof(T).Name.ToLower());
    }
}
and thanks to Travis' answer I was able to pull this off:
MongoHelper<UserDocument> userHelper = new MongoHelper<UserDocument>();
var user = userHelper.Collection.FindAndModify(Query.EQ("Username", "a"), SortBy.Null, Update.Inc("MessageCount", 1), true).GetModifiedDocumentAs<UserDocument>();
Not sure what your helper does. Here is a working snippet I use:
var query = Query.And(Query.EQ("_id", keyName));
var sortBy = SortBy.Null;
var update = Update.Inc("KeyValue", adjustmentAmount);
var result = collection.FindAndModify(query, sortBy, update, true);
So, "query" finds the document, update does the increment, and FindAndModify puts them together and actually hits the database.

DataSet does not support System.Nullable<>

I have an app with a button to preview a report made in Crystal Reports. I added a DataSet as the data source of the report, dragged a DataTable from the toolbox, and added the fields I need as columns. I got the instructions from this link: http://aspalliance.com/2049_Use_LINQ_to_Retrieve_Data_for_Your_Crystal_Reports.2. This is my second report; the first one works and I did not encounter any problem at all, which is why I am confused, not to mention it also has a nullable column. The error says: DataSet does not support System.Nullable<>.
private void ShowReportView()
{
    string reportFile = "JudgeInfoFMReport.rpt";
    ObservableCollection<tblJudgeFileMaint> judgeFileMaintList;
    judgeFileMaintList = GenerateReport();
    if (judgeFileMaintList.Count > 0)
    {
        CrystalReportViewerUC crview2 = new CrystalReportViewerUC();
        crview2.SetReportPathFile(reportFile, judgeFileMaintList);
        crview2.ShowDialog();
    }
    else
    {
        System.Windows.MessageBox.Show("No record found.", module, MessageBoxButton.OK, MessageBoxImage.Information);
    }
}

private ObservableCollection<tblJudgeFileMaint> GenerateReport()
{
    var result = FileMaintenanceBusiness.Instance.GetAllJudgeInfoList();
    return new ObservableCollection<tblJudgeFileMaint>(result);
}
The error occurs in the part where I set the data source, report.SetDataSource:
public bool SetReportPathFile(string reportPathFile, IEnumerable enumerable)
{
    bool bRet = false;
    string reportFolder = @"\CrystalReportViewer\Reports\";
    string filename = System.Windows.Forms.Application.StartupPath + reportFolder + reportPathFile; // "\\Reports\\CrystalReports\\DateWiseEmployeeInfoReport.rpt";
    ReportPathFile = filename;
    report.Load(ReportPathFile);
    report.SetDataSource(enumerable);
    report.SetDatabaseLogon("sa", "admin007");
    bRet = true;
    _IsLoaded = bRet;
    return bRet;
}
I read some answers saying I should set the null value to DBNull, which I did in the properties window of each column that is nullable. Can anyone help me please? Thanks.
Your question is addressed in this post, but in a generic way... that way you can pass an object collection to a typed DataSet:
.NET - Convert Generic Collection to DataTable
Figured it out, by using a collection extension copied from somewhere; I forgot the link. To whoever it is who made the class, credits to you.
The class method looks like this:
public static class CollectionExtensions
{
    public static DataSet ToDataSet<T>(this IEnumerable<T> collection, string dataTableName)
    {
        if (collection == null)
        {
            throw new ArgumentNullException("collection");
        }
        if (string.IsNullOrEmpty(dataTableName))
        {
            throw new ArgumentNullException("dataTableName");
        }

        DataSet data = new DataSet("NewDataSet");
        data.Tables.Add(FillDataTable(dataTableName, collection));
        return data;
    }
}
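The class above calls a FillDataTable helper that was not part of what I copied. A rough sketch of what it presumably does (a private static method in the same class, needing System.Data and System.ComponentModel: it reflects over T's public properties and unwraps Nullable<T> so the DataSet accepts the column types):

    private static DataTable FillDataTable<T>(string tableName, IEnumerable<T> collection)
    {
        var table = new DataTable(tableName);
        var properties = TypeDescriptor.GetProperties(typeof(T));

        // DataSet cannot store Nullable<> column types, so use the underlying type instead.
        foreach (PropertyDescriptor property in properties)
        {
            table.Columns.Add(property.Name,
                Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType);
        }

        foreach (var item in collection)
        {
            var row = table.NewRow();
            foreach (PropertyDescriptor property in properties)
            {
                row[property.Name] = property.GetValue(item) ?? DBNull.Value;
            }
            table.Rows.Add(row);
        }

        return table;
    }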
Then you can use it like this to get the source for your report:
private DataSet GenerateNeutralContEducReport(string dsName)
{
    var contEduHistoryList = FileMaintenanceBusiness.Instance.GetManyNeutralFMContEducHistoryInfobyKeyword(CurrentNeutralFM.NeutralID, "NeutralID").ToList();
    return CollectionExtensions.ToDataSet<tblContinuingEducationHistory>(contEduHistoryList, dsName);
}
I found little help from the other proposed answers but this solution worked.
A different way to solve this problem is to make the data column allow DBNull values:
DataColumn column = new DataColumn("column", Type.GetType("System.Int32"));
column.AllowDBNull = true;
dataTable.Columns.Add(column);
https://learn.microsoft.com/en-us/dotnet/api/system.data.datacolumn.allowdbnull?view=netcore-3.1
Alternatively, when building the DataTable columns from the object's properties, unwrap nullable types so each column uses the underlying type:
foreach (PropertyDescriptor property in properties)
{
    dt.Columns.Add(property.Name, Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType);
}

StandardAnalyzer seems not to be involved when indexing data, NHibernate.Search

I'm building a search function for an application with Lucene.NET and NHibernate.Search. To index the existing data I am using this method:
public void SynchronizeIndexForAllUsers()
{
    var fullTextSession = Search.CreateFullTextSession(m_session);
    var users = GetAll();
    foreach (var user in users)
    {
        if (!user.IsDeleted)
        {
            fullTextSession.Index(user);
        }
    }
}
Where I have marked the fields I want to index with the following attribute:
[Field(Index.Tokenized, Store = Store.Yes, Analyzer = typeof(StandardAnalyzer))]
public virtual string FirstName
{
    get { return m_firstName; }
    set { m_firstName = value; }
}
But when I then inspect the indices in Luke, the field values still contain uppercase letters, commas, etc., which should have been removed by the StandardAnalyzer.
Does anyone know what I am doing wrong?
I had a similar problem to yours, but I was trying to use the WhitespaceAnalyzer. Setting it in the Field attribute didn't work for me either.
I ended up setting it globally. I am using FluentNHibernate for configuration and it looks like this:
this._sessionFactory =
    Fluently.Configure()
        .Database(MsSqlConfiguration.MsSql2005
            .ConnectionString(cs => cs
                // cut
            )
            .ShowSql()
        )
        .Mappings(m => m.FluentMappings
            // cut
        )
        .ExposeConfiguration(cfg =>
        {
            // important part: lucene.net and nhibernate.search
            cfg.SetProperty("hibernate.search.default.directory_provider", typeof(NHibernate.Search.Store.FSDirectoryProvider).AssemblyQualifiedName);
            cfg.SetProperty("hibernate.search.default.indexBase", @"~\Lucene");
            cfg.SetProperty("hibernate.search.indexing_strategy", "event");
            cfg.SetProperty(NHibernate.Search.Environment.AnalyzerClass, typeof(WhitespaceAnalyzer).AssemblyQualifiedName);
            cfg.SetListener(NHibernate.Event.ListenerType.PostUpdate, new FullTextIndexEventListener());
            cfg.SetListener(NHibernate.Event.ListenerType.PostInsert, new FullTextIndexEventListener());
            cfg.SetListener(NHibernate.Event.ListenerType.PostDelete, new FullTextIndexCollectionEventListener());
        })
        .BuildSessionFactory();
Take a look at NHibernate.Search.Environment.AnalyzerClass. The funny thing is that it won't work for generic fulltext queries (I think Lucene will use StandardAnalyzer there), but that's another story :).
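On that last point, a rough sketch of forcing the same analyzer at query time, so the query terms are tokenized the same way the indexed values were (it assumes Lucene.NET 2.9 and the User/FirstName mapping from your question, with the WhitespaceAnalyzer from this answer):

    // Build the Lucene query with the same analyzer that was used for indexing.
    var analyzer = new WhitespaceAnalyzer();
    var parser = new QueryParser(Lucene.Net.Util.Version.LUCENE_29, "FirstName", analyzer);
    var luceneQuery = parser.Parse("john");

    var users = fullTextSession
        .CreateFullTextQuery(luceneQuery, typeof(User))
        .List<User>();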
Hope this helps.