AddToSetEach operator won't return the correct query (Builders v 2.4.4) - mongodb

I have some code that updates an array ($addToSet). The code currently uses the legacy builder:
var data = new[] { 10, 20, 30 };
var fieldName = "data";
var oldWay = Update.AddToSetEach(fieldName, new BsonArray(data));
Console.WriteLine($"Old way:\n {oldWay.ToJson()} \n\n");
This works perfectly and prints:
Old way:
{ "$addToSet" : { "users" : { "$each" : [10, 20, 30] } } }
But when trying to use the new Builders class, I can't get it to work correctly. I'm using MongoDB.Driver 2.4.4. My code:
var data = new[] { 10, 20, 30 };
var fieldName = "data";
var newWay = Builders<BsonDocument>.Update.AddToSetEach(fieldName, new BsonArray(data)).ToJson();
Console.WriteLine($"New way:\n {newWay} \n\n");
The output is:
New way:
{ "_t" : "AddToSetUpdateDefinition`2" }
Any thoughts?
Thank you!

I don't think .ToJson() renders the string the way you want on an UpdateDefinition. It used to be a little easier in the old driver.
Someone wrote an extension method though:
public static class MongoExtensions
{
    public static BsonDocument RenderToBsonDocument<T>(this UpdateDefinition<T> update)
    {
        var serializerRegistry = BsonSerializer.SerializerRegistry;
        var documentSerializer = serializerRegistry.GetSerializer<T>();
        return update.Render(documentSerializer, serializerRegistry);
    }
}
Then you can do:
var data = new[] { 10, 20, 30 };
var fieldName = "data";
var newWay = Builders<BsonDocument>.Update.AddToSetEach(fieldName, new BsonArray(data));
var newWayJson = newWay.RenderToBsonDocument().ToJson();
Console.WriteLine($"New way:\n {newWayJson} \n\n");


Using Dynamic LINQ with EF.Functions.Like

On the Dynamic LINQ website there's an example using the Like function.
I am unable to get it to work with EF Core 3.1.
[Test]
public void DynamicQuery()
{
    using var context = new SamDBContext(Builder.Options);
    var config = new ParsingConfig { ResolveTypesBySimpleName = true };
    var lst = context.Contacts.Where(config, "DynamicFunctions.Like(FirstName, \"%Ann%\")").ToList();
    lst.Should().HaveCountGreaterThan(1);
}
Example from the Dynamic LINQ website
var example1 = Cars.Where(c => EF.Functions.Like(c.Brand, "%t%"));
example1.Dump();
var config = new ParsingConfig { ResolveTypesBySimpleName = true };
var example2 = Cars.Where(config, "DynamicFunctions.Like(Brand, \"%t%\")");
example2.Dump();
It looks just like my code, but I am getting the following error:
System.Linq.Dynamic.Core.Exceptions.ParseException : No property or field 'DynamicFunctions' exists in type 'Contact'
You don't need ResolveTypesBySimpleName; implement your own custom type provider instead.
The piece below allows people to use PostgreSQL ILike with unaccent:
public class LinqCustomProvider : DefaultDynamicLinqCustomTypeProvider
{
    public override HashSet<Type> GetCustomTypes()
    {
        var result = base.GetCustomTypes();
        result.Add(typeof(NpgsqlFullTextSearchDbFunctionsExtensions));
        result.Add(typeof(NpgsqlDbFunctionsExtensions));
        result.Add(typeof(DbFunctionsExtensions));
        result.Add(typeof(DbFunctions));
        result.Add(typeof(EF));
        return result;
    }
}
// ....
var expressionString = $"EF.Functions.ILike(EF.Functions.Unaccent(People.Name), \"%{value}%\")";
var config = new ParsingConfig()
{
    DateTimeIsParsedAsUTC = true,
    CustomTypeProvider = new LinqCustomProvider()
};
return query.Where(config, expressionString);
Hope this helps people, took me some time to get this sorted.
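For the original question, once typeof(EF) is registered in a custom type provider like the one above, the expression can presumably be written against EF.Functions directly instead of DynamicFunctions. A minimal sketch, assuming the same Contacts context from the question:
var config = new ParsingConfig
{
    CustomTypeProvider = new LinqCustomProvider() // provider must include typeof(EF)
};
var lst = context.Contacts
    .Where(config, "EF.Functions.Like(FirstName, \"%Ann%\")")
    .ToList();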

Dynamic list using array from another list

My application is ASP.NET MVC 5 / SQL Server.
I am trying to select specific columns from a list based on an array:
First list has 200 columns: Age, Gender, .....
var list1 = _reportRepository.ShowMasteView().ToList();
Second list has 20 columns: Age, Gender, ......
From the view I select the items to be displayed:
string[] lits2 = showColumn.Where(c => c.Value == true).Select(c=> c.Key).ToArray();
I get back an array of the selected column names, for example ["Age", "Gender"].
To get these two specific columns, I tried
var nList = list1.Select(t2 => lits2.Any(t1 => t2.Contains(t1)));
I get an error
Can not resolve symbol "Contains"
I was able to do it using the following
var keys = "Age,Gender";
var connection =
ConfigurationManager.ConnectionStrings["DALEntities"].ConnectionString;
using (var dataAdapter = new SqlDataAdapter("SELECT " + keys
+ " from dbo.vw_MasterView", connection))
{
var dataTable = new DataTable();
dataAdapter.Fill(dataTable);
dataAdapter.FillSchema(dataTable, SchemaType.Mapped);
return dataTable;
}
Is there a better way in linq?
From my understanding, it appears you are trying to extract/select a dynamic object that only has the desired properties/columns.
This can be achieved by building a dynamic expression/function to apply to the Select.
The following builds an expression based on the model type and the provided properties:
static class DynamicExtensions {
    public static IQueryable<dynamic> SelectDynamic<TModel>(this IQueryable<TModel> query, ISet<string> propertyNames) {
        var selector = query.BuildSelectorFor(propertyNames);
        return query.Select(selector);
    }

    static Expression<Func<TModel, dynamic>> BuildSelectorFor<TModel>(this IQueryable<TModel> query, ISet<string> propertyNames) {
        var modelType = typeof(TModel);
        var properties = modelType.GetProperties().Where(p => propertyNames.Contains(p.Name));

        // Manually build the expression tree for
        // the lambda expression v => new { PropertyName = v.PropertyName, ... }

        // (TModel v) =>
        var parameter = Expression.Parameter(modelType, "v");
        // v.PropertyName
        var members = properties.Select(p => Expression.PropertyOrField(parameter, p.Name));
        var addMethod = typeof(IDictionary<string, object>).GetMethod(
            "Add", new Type[] { typeof(string), typeof(object) });
        // { { "PropertyName", v.PropertyName }, ... }
        var elementInits = members.Select(m =>
            Expression.ElementInit(addMethod, Expression.Constant(m.Member.Name), Expression.Convert(m, typeof(object))));
        // new ExpandoObject()
        var newExpando = Expression.New(typeof(ExpandoObject));
        // new ExpandoObject() { { "PropertyName", v.PropertyName }, ... }
        var expando = Expression.ListInit(newExpando, elementInits);
        // (TModel v) => new ExpandoObject() { { "PropertyName", v.PropertyName }, ... }
        var lambdaExpression = Expression.Lambda<Func<TModel, dynamic>>(expando, parameter);
        return lambdaExpression;
    }
}
This takes advantage of ExpandoObject whose members can be dynamically added and removed at run time.
The following test was used as an example of how the above function is invoked.
[TestMethod]
public void DynamicList() {
    var list1 = new List<Person>
    {
        new Person{ Gender = "Male", Age = 10, FirstName = "Nama1", SampleNumber = 12},
        new Person{ Gender = "Male", Age = 12, FirstName = "Nama2", SampleNumber = 13},
        new Person{ Gender = "Female", Age = 13, FirstName = "Nama3", SampleNumber = 14},
        new Person{ Gender = "Male", Age = 14, FirstName = "Nama4", SampleNumber = 15},
    };
    var keys = new string[] { "Age", "Gender", };
    var nList = list1.AsQueryable().SelectDynamic(new HashSet<string>(keys));
    foreach (IDictionary<string, object> row in nList) {
        var msg = $"{{ {keys[0]} = {row[keys[0]]}, {keys[1]} = {row[keys[1]]} }}";
        Debug.WriteLine(msg);
    }
}
and produces the following output
{ Age = 10, Gender = Male }
{ Age = 12, Gender = Male }
{ Age = 13, Gender = Female }
{ Age = 14, Gender = Male }
The dynamic objects can be used in the View and it is a simple matter of calling the desired members.
For example suppose you have a model as follows
public class MyViewModel {
    public string MyProperty { get; set; }
    public string[] Keys { get; set; }
    public List<dynamic> MyDynamicProperty { get; set; }
}
that was populated with data and given to the view
var list1 = _reportRepository.ShowMasteView();
var keys = new string[] { "Age", "Gender", };
var nList = list1.AsQueryable().SelectDynamic(new HashSet<string>(keys));
var viewModel = new MyViewModel {
    MyProperty = "Hello World",
    MyDynamicProperty = nList.ToList(),
    Keys = keys
};
return View(viewModel);
Then in the view you can use the model as desired, casting to get access to members in the expando object.
@model MyViewModel
...
<h2>@Model.MyProperty</h2>
<table>
    <tr>
        @foreach (string key in Model.Keys) {
            <th>@key</th>
        }
    </tr>
    @foreach (IDictionary<string, object> row in Model.MyDynamicProperty) {
        <tr>
            @foreach (string key in Model.Keys) {
                <td>@row[key]</td>
            }
        </tr>
    }
</table>
I think you just need to use Contains on your list2.
var nList = list1.Where(t => lits2.Contains(t));
Contains is a method for Lists. The code you had was trying to use it on a string.
If you have two lists of a Person class:
public class Person
{
    public int id { get; set; }
    public string name { get; set; }
}
If the lists are as below:
var list1 = new List<Person>
{
    new Person{ id = 1, name = "Nama1"},
    new Person{ id = 2, name = "Nama2"},
    new Person{ id = 3, name = "Nama3"},
    new Person{ id = 4, name = "Nama4"},
};
var list2 = new List<Person>
{
    new Person{ id = 1, name = "Nama1"},
    new Person{ id = 2, name = "Nama2"},
};
You can filter in the following ways
var keys = list2.Select(x => x.id).ToList();
var filter1 = list1.Where(x => keys.Contains(x.id)).ToList();
var filter2 = list1.Where(x => keys.Contains(x.id)).Select(x => new { x.name }).ToList();
var filter3 = list1.Select(x => new
{
    id = x.id,
    name = x.name,
    check = keys.Contains(x.id)
}).Where(x => x.check).ToList();
If you have arrays of strings, you can use the code below.
Given two string arrays:
var lis1 = new string[] {"name1", "name2","name3" };
var lis2 = new string[] { "name1" };
You can filter an array of strings in the following ways:
var items1 = lis1.Where(x => lis2.Contains(x)).ToList();
var items = lis1.Select(x => new { x, check = lis2.Contains(x) }).Where(x => x.check == true).ToList();
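As a side note, when lis1 has no duplicate values the first filter behaves like a set intersection, so Enumerable.Intersect should presumably give the same result. A minimal sketch using the arrays above:
// Intersect removes duplicates, so this matches Where + Contains
// only when lis1 contains no repeated values.
var items3 = lis1.Intersect(lis2).ToList();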

How to use GeoWithin with the MongoDB C# Driver 2.4

I need to use GeoWithin to find points near the user after I get their location. Has anybody used it before?
By the way, this is a sample for Near and GeoWithin:
var point = GeoJson.Point(GeoJson.Geographic(-73.97, 40.77));
var filter1 = Builders<BsonDocument>.Filter.Near("location", point, 10);
var a = collection.Find(filter1).Any();
var filter2 = Builders<BsonDocument>.Filter.GeoWithin("location", point);
var b = collection.Find(filter2).Any();
You can also use GeoWithinPolygon and GeoWithinCenter:
double[,] polygon = new double[,] { { -73.97, 40.77 }, { -73.9928, 40.7193 }, { -73.9375, 40.8303 }, { -73.97, 40.77 } };
var filter3 = Builders<BsonDocument>.Filter.GeoWithinPolygon("location", polygon);
var c = collection.Find(filter3).Any();
var filter4 = Builders<BsonDocument>.Filter.GeoWithinCenter("location", -73.97, 40.77, 10);
var d = collection.Find(filter4).Any();
Hope it helps.
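GeoWithin is normally used with an area rather than a single point. A minimal sketch of the GeoJSON polygon variant, reusing the same coordinates as the polygon above and assuming a 2dsphere index on location (filter5 and e are just illustrative names):
var geoPolygon = GeoJson.Polygon(
    GeoJson.Geographic(-73.97, 40.77),
    GeoJson.Geographic(-73.9928, 40.7193),
    GeoJson.Geographic(-73.9375, 40.8303),
    GeoJson.Geographic(-73.97, 40.77)); // the ring must close on its first point
var filter5 = Builders<BsonDocument>.Filter.GeoWithin("location", geoPolygon);
var e = collection.Find(filter5).Any();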

Equivalent for SetUnion in C# driver

I have a field in a MongoDB document which contains an array of numbers. I want to update this field with newly received numbers, but I need to add a number only if it is not already present in the array. In MongoDB we can use $setUnion, but I am not sure about the C# driver side. Can anyone please suggest a solution?
$setUnion is used to produce aggregation output.
You need to use AddToSetEach from the C# driver.
Here is a full code snippet with a check after the insert:
public static void Main()
{
    var client = new MongoClient("mongodb://localhost:27017");
    var database = client.GetDatabase("test");
    var collection = database.GetCollection<KalaimaniData>("kalaimani");

    // create arrays to insert and merge
    var arrayToInsert = new[] { 1, 4, 5, 6 };
    var arrayToMerge = new[] { 2, 3, 4, 5 };
    var arrayExpected = new[] { 1, 4, 5, 6, 2, 3 };

    var doc = new KalaimaniData { Numbers = arrayToInsert };
    collection.InsertOne(doc);

    var filter = Builders<KalaimaniData>.Filter.Eq(x => x.Id, doc.Id);
    var updateDef = new UpdateDefinitionBuilder<KalaimaniData>().AddToSetEach(x => x.Numbers, arrayToMerge);
    collection.UpdateOne(filter, updateDef);

    // retrieve and compare
    var changed = collection.Find(filter).First();
    var matched = 0;
    foreach (var element in arrayExpected)
    {
        if (changed.Numbers.Contains(element))
        {
            matched++;
        }
    }
    if (changed.Numbers.Length == matched)
    {
        Console.WriteLine("OK");
    }
    else
    {
        Console.WriteLine("NOK");
    }
    Console.ReadLine();
}
/// <summary>TODO The kalaimani data.</summary>
class KalaimaniData
{
    /// <summary>Gets or sets the id.</summary>
    public ObjectId Id { get; set; }

    /// <summary>Gets or sets the numbers.</summary>
    public int[] Numbers { get; set; }
}
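For reference, the AddToSetEach definition above should render to the same $addToSet/$each shape discussed in the first question. A minimal sketch of how to inspect it, placed right after updateDef is built inside Main (the expected output shown in the comment is an assumption based on the default serialization of Numbers):
// Render the update definition to a BsonDocument to see the shell-style update.
// Expected shape (assumption): { "$addToSet" : { "Numbers" : { "$each" : [2, 3, 4, 5] } } }
var registry = BsonSerializer.SerializerRegistry;
var rendered = updateDef.Render(registry.GetSerializer<KalaimaniData>(), registry);
Console.WriteLine(rendered.ToJson());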

How to group the results of a Lucene.Net search?

I have managed to create documents and do some complex searching too, but I am facing a problem grouping some search results.
Books are displayed after the search, which is fine. Along with this, author grouping with a count needs to be done, based on the same search query.
Example:
Author Name | Count
A | 12
B | 2
I am using Lucene.Net 3.0.3.0, which does not support grouping, but there might be some workaround. I need the same feature for price ranges too.
Everything is possible if you write a custom Collector. What you describe are facets, and they can easily be computed by counting the document values yourself. The core part is calling the IndexSearcher.Search overload that accepts a collector. The collector should read the values, usually through a field-cache implementation, and do the calculations needed.
This is a short demonstration using some classes from my demo-project Corelicious.Lucene.
var postTypes = new Dictionary<Int32, Int32>();
searcher.Search(query, new DelegatingCollector((reader, doc, scorer) => {
    var score = scorer.Score();
    if (score > 0) {
        var postType = SingleFieldCache.Default.GetInt32(reader, "PostTypeId", doc);
        if (postType.HasValue) {
            if (postTypes.ContainsKey(postType.Value)) {
                postTypes[postType.Value]++;
            } else {
                postTypes[postType.Value] = 1;
            }
        }
    }
}));
Full code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;
using System.Xml;
using Corelicious.Lucene;
using Lucene.Net.Analysis;
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.QueryParsers;
using Lucene.Net.Search;
using Lucene.Net.Store;
using Directory = Lucene.Net.Store.Directory;
using Version = Lucene.Net.Util.Version;
namespace ConsoleApplication {
    public static class Program {
        public static void Main(string[] args) {
            Console.WriteLine("Creating directory...");
            var directory = new RAMDirectory();
            var analyzer = new StandardAnalyzer(Version.LUCENE_30);
            CreateIndex(directory, analyzer);

            var userQuery = "calculate pi";
            var queryParser = new QueryParser(Version.LUCENE_30, "Body", analyzer);
            var query = queryParser.Parse(userQuery);
            Console.WriteLine("Query: '{0}'", query);

            var indexReader = IndexReader.Open(directory, readOnly: true);
            var searcher = new IndexSearcher(indexReader);
            var postTypes = new Dictionary<Int32, Int32>();
            searcher.Search(query, new DelegatingCollector((reader, doc, scorer) => {
                var score = scorer.Score();
                if (score > 0) {
                    var postType = SingleFieldCache.Default.GetInt32(reader, "PostTypeId", doc);
                    if (postType.HasValue) {
                        if (postTypes.ContainsKey(postType.Value)) {
                            postTypes[postType.Value]++;
                        } else {
                            postTypes[postType.Value] = 1;
                        }
                    }
                }
            }));

            Console.WriteLine("Post type summary");
            Console.WriteLine("Post type | Count");
            foreach (var pair in postTypes.OrderByDescending(x => x.Value)) {
                var postType = (PostType)pair.Key;
                Console.WriteLine("{0,-10} | {1}", postType, pair.Value);
            }
            Console.ReadLine();
        }

        public enum PostType {
            Question = 1,
            Answer = 2,
            Tag = 4
        }

        public static void CreateIndex(Directory directory, Analyzer analyzer) {
            using (var writer = new IndexWriter(directory, analyzer, true, IndexWriter.MaxFieldLength.UNLIMITED))
            using (var xmlStream = File.OpenRead("/Users/sisve/Downloads/Stack Exchange Data Dump - Sept 2011/Content/092011 Mathematics/posts.xml"))
            using (var xmlReader = XmlReader.Create(xmlStream)) {
                while (xmlReader.ReadToFollowing("row")) {
                    var tags = xmlReader.GetAttribute("Tags") ?? String.Empty;
                    var title = xmlReader.GetAttribute("Title") ?? String.Empty;
                    var body = xmlReader.GetAttribute("Body");

                    var doc = new Document();
                    // tags are stored as <tag1><tag2>
                    foreach (Match match in Regex.Matches(tags, "<(.*?)>")) {
                        doc.Add(new Field("Tags", match.Groups[1].Value, Field.Store.NO, Field.Index.NOT_ANALYZED));
                    }
                    doc.Add(new Field("Title", title, Field.Store.NO, Field.Index.ANALYZED));
                    doc.Add(new Field("Body", body, Field.Store.NO, Field.Index.ANALYZED));
                    doc.Add(new Field("PostTypeId", xmlReader.GetAttribute("PostTypeId"), Field.Store.NO, Field.Index.NOT_ANALYZED));
                    writer.AddDocument(doc);
                }
                writer.Optimize();
                writer.Commit();
            }
        }
    }
}
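The question also asks about price ranges; the same collector pattern should work by bucketing a numeric field instead of counting discrete values. A minimal sketch, reusing the DelegatingCollector and SingleFieldCache helpers from the demo project above and assuming a Price field indexed as an integer (the bucket boundaries are made up for illustration):
// Count matching documents into hypothetical price buckets.
var priceBuckets = new Dictionary<string, int> { { "0-9", 0 }, { "10-49", 0 }, { "50+", 0 } };
searcher.Search(query, new DelegatingCollector((reader, doc, scorer) => {
    if (scorer.Score() <= 0) return;
    var price = SingleFieldCache.Default.GetInt32(reader, "Price", doc); // assumes an int-valued Price field
    if (!price.HasValue) return;
    var bucket = price.Value < 10 ? "0-9" : price.Value < 50 ? "10-49" : "50+";
    priceBuckets[bucket]++;
}));
foreach (var pair in priceBuckets) {
    Console.WriteLine("{0,-10} | {1}", pair.Key, pair.Value);
}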