Querying OCB from JavaScript (WireCloud) - fiware-orion

I'm trying to get the type field of each attribute of my entities. Querying Orion and getting the entities is not the problem (I do this through the NGSI Source widget); the problem is how to get these parameters.
From the NGSI Source operator (the usual subscription to an Orion instance):
var doInitialSubscription = function doInitialSubscription() {
    this.subscriptionId = null;

    this.ngsi_server = MashupPlatform.prefs.get('ngsi_server');
    this.ngsi_proxy = MashupPlatform.prefs.get('ngsi_proxy');

    this.connection = new NGSI.Connection(this.ngsi_server, {
        ngsi_proxy_url: this.ngsi_proxy
    });

    var types = MashupPlatform.prefs.get('ngsi_entities').split(new RegExp(',\\s*'));
    var entityIdList = [];
    var entityId;
    for (var i = 0; i < types.length; i++) {
        entityId = {
            id: '.*',
            type: types[i],
            isPattern: true
        };
        entityIdList.push(entityId);
    }

    var attributeList = null;
    var duration = 'PT3H';
    var throttling = null;
    var notifyConditions = [{
        'type': 'ONCHANGE',
        'condValues': MashupPlatform.prefs.get('ngsi_update_attributes').split(new RegExp(',\\s*'))
    }];
    var options = {
        flat: true,
        onNotify: handlerReceiveEntity.bind(this),
        onSuccess: function (data) {
            this.subscriptionId = data.subscriptionId;
            this.refresh_interval = setInterval(refreshNGSISubscription.bind(this), 1000 * 60 * 60 * 2); // each 2 hours
            window.addEventListener("beforeunload", function () {
                this.connection.cancelSubscription(this.subscriptionId);
            }.bind(this));
        }.bind(this)
    };

    this.connection.createSubscription(entityIdList, attributeList, duration, throttling, notifyConditions, options);
};

var handlerReceiveEntity = function handlerReceiveEntity(data) {
    for (var entityId in data.elements) {
        MashupPlatform.wiring.pushEvent("entityOutput", JSON.stringify(data.elements[entityId]));
    }
};
In MyWidget:
MashupPlatform.wiring.registerCallback("entityInput", function (entityString) {
    var entity = JSON.parse(entityString);
    id = entity.id;
    type = entity.type;
    for (var attr in entity) {
        attribute = entity[attr];
    }
});
I'm trying to code something similar to obtain the value of the type fields. How can I do that? (I'm sure it's quite easy...)

You cannot use the current NGSI source operator implementation (at least v3.0.2) to get the type metadata of attributes, as the NGSI source creates its subscriptions using the flat option (which discards that info).
We are considering updating this operator to allow creating subscriptions without the flat option. The main problem is that other components expect the data provided by this operator to be in the format returned when the flat option is used. I will update this answer after analysing the issue in more depth.
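For reference, this is roughly what a notification handler could look like once subscriptions are created without the flat option. A minimal, hypothetical sketch (the element/attribute field names follow the NGSI v1 notification format and may differ depending on the library version):

var handlerReceiveEntity = function handlerReceiveEntity(data) {
    // Without flat: true, each element keeps its attributes as
    // {name, type, value} objects, so the attribute type is preserved.
    data.elements.forEach(function (element) {
        element.attributes.forEach(function (attribute) {
            MashupPlatform.wiring.pushEvent("entityOutput", JSON.stringify({
                id: element.id,
                entityType: element.type,
                attrName: attribute.name,
                attrType: attribute.type, // the info discarded by flat: true
                attrValue: attribute.value
            }));
        });
    });
};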


Using Dynamic LINQ with EF.Functions.Like

On the Dynamic LINQ website there's an example using the Like function. I am unable to get it to work with EF Core 3.1.
[Test]
public void DynamicQuery()
{
    using var context = new SamDBContext(Builder.Options);
    var config = new ParsingConfig { ResolveTypesBySimpleName = true };
    var lst = context.Contacts.Where(config, "DynamicFunctions.Like(FirstName, \"%Ann%\")").ToList();
    lst.Should().HaveCountGreaterThan(1);
}
Example from the Dynamic LINQ website:
var example1 = Cars.Where(c => EF.Functions.Like(c.Brand, "%t%"));
example1.Dump();
var config = new ParsingConfig { ResolveTypesBySimpleName = true };
var example2 = Cars.Where(config, "DynamicFunctions.Like(Brand, \"%t%\")");
example2.Dump();
It looks like my code, but I am getting the following error:
System.Linq.Dynamic.Core.Exceptions.ParseException : No property or field 'DynamicFunctions' exists in type 'Contact'
You don't need ResolveTypesBySimpleName; implement your own custom type provider instead.
The snippet below allows the use of PostgreSQL ILike together with unaccent:
public class LinqCustomProvider : DefaultDynamicLinqCustomTypeProvider
{
    public override HashSet<Type> GetCustomTypes()
    {
        var result = base.GetCustomTypes();
        result.Add(typeof(NpgsqlFullTextSearchDbFunctionsExtensions));
        result.Add(typeof(NpgsqlDbFunctionsExtensions));
        result.Add(typeof(DbFunctionsExtensions));
        result.Add(typeof(DbFunctions));
        result.Add(typeof(EF));
        return result;
    }
}

// ....

var expressionString = $"EF.Functions.ILike(EF.Functions.Unaccent(People.Name), \"%{value}%\")";
var config = new ParsingConfig()
{
    DateTimeIsParsedAsUTC = true,
    CustomTypeProvider = new LinqCustomProvider()
};
return query.Where(config, expressionString);
Hope this helps people; it took me some time to get this sorted.

CosmosDB Paging Return Value

I am trying to return paged results from a CosmosDB request. I saw this example from here, but I am not sure what to do with the response variable.
// Fetch query results 10 at a time.
var queryable = client.CreateDocumentQuery<Book>(collectionLink, new FeedOptions { MaxItemCount = 10 });
while (queryable.HasResults)
{
    FeedResponse<Book> response = await queryable.ExecuteNext<Book>();
}
Am I supposed to return it directly, or do I have to do something further with the response variable? I tried to return the response variable directly and it's not working. Here's my code:
public async Task<IEnumerable<T>> RunQueryAsync(string queryString)
{
    var feedOptions = new FeedOptions { MaxItemCount = 3 };
    IQueryable<T> filter = _client.CreateDocumentQuery<T>(_collectionUri, queryString, feedOptions);
    IDocumentQuery<T> query = filter.AsDocumentQuery();
    var response = new FeedResponse<T>();
    while (query.HasMoreResults)
    {
        response = await query.ExecuteNextAsync<T>();
    }
    return response;
}
Update:
After reading @Evandro Paula's answer, I followed the URL and changed my implementation to the code below. But it is still giving me a 500 status code:
public async Task<IEnumerable<T>> RunQueryAsync(string queryString)
{
    var feedOptions = new FeedOptions { MaxItemCount = 1 };
    IQueryable<T> filter = _client.CreateDocumentQuery<T>(_collectionUri, queryString, feedOptions);
    IDocumentQuery<T> query = filter.AsDocumentQuery();
    List<T> results = new List<T>();
    while (query.HasMoreResults)
    {
        foreach (T t in await query.ExecuteNextAsync())
        {
            results.Add(t);
        }
    }
    return results;
}
And here's the exception message:
Cross partition query is required but disabled. Please set
x-ms-documentdb-query-enablecrosspartition to true, specify
x-ms-documentdb-partitionkey, or revise your query to avoid this
exception., Windows/10.0.17134 documentdb-netcore-sdk/1.9.1
Update 2:
I set EnableCrossPartitionQuery to true and I am now able to get the response from CosmosDB. But I do not get the 1 item that I defined; instead, I got 11 items.
Find below a simple example of how to use CosmosDB/SQL paged queries:
private static async Task Query()
{
    Uri uri = new Uri("https://{CosmosDB/SQL Account Name}.documents.azure.com:443/");
    DocumentClient documentClient = new DocumentClient(uri, "{CosmosDB/SQL Account Key}");
    int currentPageNumber = 1;
    int documentNumber = 1;
    IDocumentQuery<Book> query = documentClient.CreateDocumentQuery<Book>("dbs/{CosmoDB/SQL Database Name}/colls/{CosmoDB/SQL Collection Name}", new FeedOptions { MaxItemCount = 10 }).AsDocumentQuery();
    while (query.HasMoreResults)
    {
        Console.WriteLine($"----- PAGE {currentPageNumber} -----");
        foreach (Book book in await query.ExecuteNextAsync())
        {
            Console.WriteLine($"[{documentNumber}] {book.Id}");
            documentNumber++;
        }
        currentPageNumber++;
    }
}
Per the exception described in your question (Cross partition query is required but disabled), update the feed options as follows:
var feedOptions = new FeedOptions { MaxItemCount = 1, EnableCrossPartitionQuery = true};
Find a more comprehensive example at https://github.com/Azure/azure-documentdb-dotnet/blob/d17c0ca5be739a359d105cf4112443f65ca2cb72/samples/code-samples/Queries/Program.cs#L554-L576.
You are not specifying any where criteria for your specific item, so you are getting all results; try specifying criteria for the item (id, name, etc.) you are looking for. Note that MaxItemCount is the maximum number of items per page, not a limit on the total number of results, which is why you still receive 11 items across pages. Also keep in mind that cross-partition queries consume much more RUs and time; you may want to revisit the architecture of your data model. Ideally, always query within a single partition.

Resolving Promise Angular 2

I have the following problem. In a function I have a promise as the return type. This function is in the class Hierarchy.
updateNodeValues(entity: String, data: {}): Promise<any> {
    let jsonBody = JSON.stringify(data);
    let url = environment.endpointCore + '/api/' + entity + '/' + data['id'];
    return this.http.put(url, jsonBody, this.options)
        .toPromise()
        .then(response => {
            return response;
        })
        .catch(this.handleError);
}
This function is in the class Node.
onSubmit(): void {
    var currentForm = this.form.value;
    var entityName = this.inflection.classify(this.node.type).toLowerCase();
    var requiredData = {};
    for (var i = 0; i < this.formItems.length; i++) {
        this.formItems[i].value = currentForm[Object.keys(currentForm)[i]];
    }
    for (var i = 0; i < this.formItems.length; i++) {
        requiredData[this.globalService.camelize(this.formItems[i].label)] = this.formItems[i].value;
    }
    Promise.resolve(this.hierarchyService.updateNodeValues(entityName, requiredData)).then(response => {
        alert(response.ok);
        if (response.ok) {
            this.globalService.showSuccessMessage('Values updated');
            this.refreshGui(requiredData);
        }
    });
    this.editMode = false;
}
The problem is that when I try to resolve the promise and invoke this.refreshGui(requiredData), nothing happens. I have read about how the fat arrow preserves the 'context' of this, and I do not understand why invoking this method does nothing, while invoking showSuccessMessage produces the expected outcome.
The method that I am invoking looks like this, and it is also in the class Node.
private refreshGui(data: {}) {
    this._node.data = data;
    this.objectProperties = new Array();
    this.nodeChildren = new Array();
    for (var property in data) {
        var propertyValue = data[property];
        if (propertyValue instanceof Array) {
            this.nodeChildren.push({label: property, value: "total: ".concat(propertyValue.length.toString())});
        } else {
            this.objectProperties.push({label: property, value: propertyValue});
        }
    }
}
The solution that I found to work was to implement a custom event. The problem was that within the async callback resolution, the context of this would "get lost". The fat arrow let me invoke the class method with this, but the properties within it would be "lost". For this reason I took the logic out of the method, put it in the callback, and stored the expected results in a variable. This variable was passed to my custom event and assigned to the class variable in the custom event handler, as sketched below.
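For illustration, a minimal sketch of that approach using Angular's EventEmitter (the names NodeComponent, refreshRequired and notifyUpdated are hypothetical, not the exact code used):

import { Component, EventEmitter, Output } from '@angular/core';

@Component({ selector: 'app-node', template: '' })
export class NodeComponent {
    // Hypothetical custom event carrying the data needed to refresh the GUI.
    @Output() refreshRequired = new EventEmitter<{}>();

    constructor() {
        // The arrow function keeps the component as `this`, so class
        // properties like _node and objectProperties stay reachable.
        this.refreshRequired.subscribe((data: {}) => this.refreshGui(data));
    }

    notifyUpdated(requiredData: {}): void {
        // Called from the promise callback after a successful update:
        // raise the event instead of calling refreshGui directly.
        this.refreshRequired.emit(requiredData);
    }

    private refreshGui(data: {}): void {
        // rebuild objectProperties / nodeChildren as in the question
    }
}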

How should I update documents, each with a different update data set, in MongoDB collections

I have MongoDB with three huge collections, say 'A', 'B' and 'C'. Each collection contains about 2 million documents.
Each document has certain properties, and each document needs to be updated based on the values of those properties, from which I can determine what the '$set' for that document should be.
Currently I am using the same approach for each collection: find all documents in batches and collect them in memory (which I think is the culprit in the current approach), then update them one by one.
For the first collection (which has data similar to the other collections), the task takes 10 minutes to complete; the next two collections take approximately 2 hours each, or the MongoDB client crashes before finishing. Something is clearly wrong and undesirable in the current approach.
Model.collection.find({}).batchSize(BATCH).toArray(function(err, docs) {
    if (err || !docs || !docs.length)
        return afterCompleteOneCollection(err);
    var spec = function(index) {
        if (index % 1000 === 0) console.log('at index : ' + index);
        var toSet = { };
        var toUnset = { };
        var over = function() {
            var afterOver = function(err) {
                if (err) return afterCompleteOneCollection(err);
                if (index < docs.length - 1) spec(index + 1);
                else afterCompleteOneCollection(null);
            };
            var sb = Object.keys(toSet).length;
            var ub = Object.keys(toUnset).length;
            if (sb || ub) {
                var all = {};
                if (sb) all.$set = toSet;
                if (ub) all.$unset = toUnset;
                Model.collection.update({ _id : docs[index]._id }, all, {}, afterOver);
            } else afterOver(null);
        };
        forEachOfDocument(docs[index], toSet, toUnset, over);
    };
    spec(0);
});
Is there any better solution for this?
The streaming approach from http://mongodb.github.io/node-mongodb-native/api-generated/cursor.html#stream worked for me.
This is what I am doing:
var stream = Model.collection.find().stream();
stream.on('data', function(data) {
    if (data) {
        var toSet = { };
        var toUnset = { };
        var over = function() {
            var afterOver = function(err) {
                if (err) console.log(err);
            };
            var sb = Object.keys(toSet).length;
            var ub = Object.keys(toUnset).length;
            if (sb || ub) {
                var all = {};
                if (sb) all.$set = toSet;
                if (ub) all.$unset = toUnset;
                Model.collection.update({ _id : data._id }, all, {}, afterOver);
            } else afterOver(null);
        };
        forEachOfDocument(data, toSet, toUnset, over);
    }
});
stream.on('close', function() {
    afterCompleteOneCollection();
});
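A refinement worth considering (a sketch under the assumption that the cursor stream is a standard pausable Node.js readable stream; untested against the original code): pause the stream while each update is in flight, so update operations are not issued faster than MongoDB can apply them:

var stream = Model.collection.find().stream();
stream.on('data', function (data) {
    // Stop reading until the update for this document has finished.
    stream.pause();
    var toSet = { };
    var toUnset = { };
    forEachOfDocument(data, toSet, toUnset, function () {
        var all = {};
        if (Object.keys(toSet).length) all.$set = toSet;
        if (Object.keys(toUnset).length) all.$unset = toUnset;
        if (all.$set || all.$unset) {
            Model.collection.update({ _id: data._id }, all, {}, function (err) {
                if (err) console.log(err);
                stream.resume(); // continue with the next document
            });
        } else {
            stream.resume();
        }
    });
});
stream.on('close', function () {
    afterCompleteOneCollection();
});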

Merge Self-tracking entities

A graph of objects is stored in the database, and the same object graph is serialized into a binary package. The package is transmitted over the network to the client; then it is necessary to merge the data from the package with the data from the database.
Source code of the merge:
// objList - data from the package
var objectIds = objList.Select(row => row.ObjectId).ToArray();
// result - data from the database
var result = SomeService.Instance.LoadObjects(objectIds);
foreach (var OSobj in objList)
{
    var obj = result.Objects.ContainsKey(OSobj.ObjectId)
        ? result.Objects[OSobj.ObjectId]
        : result.Objects.CreateNew(OSobj.ObjectId);
    var targetObject = result.DataObjects.Where(x => x.ObjectId == OSobj.ObjectId).FirstOrDefault();
    targetObject.StopTracking();
    var importedProperties = ImportProperties(targetObject.Properties, OSobj.Properties);
    targetObject.Properties.Clear();
    foreach (var property in importedProperties)
    {
        targetObject.Properties.Add(property);
    }
    targetObject.StartTracking();
}
return result;
And the code of the ImportProperties method:
static List<Properties> ImportProperties(
    IEnumerable<Properties> targetProperties,
    IEnumerable<Properties> sourceProperties)
{
    Func<Guid, bool> hasElement = targetProperties
        .ToDictionary(e => e.PropertyId, e => e)
        .ContainsKey;
    var tempTargetProperties = new List<Properties>();
    foreach (var sourceProperty in sourceProperties)
    {
        if (!hasElement(sourceProperty.PropertyId))
        {
            sourceProperty.AcceptChanges();
            tempTargetProperties.Add(sourceProperty.MarkAsAdded());
        }
        else
        {
            sourceProperty.AcceptChanges();
            tempTargetProperties.Add(sourceProperty.MarkAsModified());
        }
    }
    return tempTargetProperties;
}
The server saves incoming changes like this:
_context.ApplyChanges("OSEntities.Objects", entity);
_context.SaveChanges(SaveOptions.DetectChangesBeforeSave);
When the server tries to save the changes, an exception occurs:
AcceptChanges cannot continue because the object's key values conflict with another object in the ObjectStateManager. Make sure that the key values are unique before calling AcceptChanges.
But if I change the code of the ImportProperties method as follows, the error does not occur and the changes are saved successfully:
static List<Properties> ImportProperties(
    IEnumerable<Properties> targetProperties,
    IEnumerable<Properties> sourceProperties)
{
    Func<Guid, bool> hasElement = targetProperties.ToDictionary(e => e.PropertyId, e => e).ContainsKey;
    var tempTargetProperties = new List<Properties>();
    foreach (var sourceProperty in sourceProperties)
    {
        if (!hasElement(sourceProperty.PropertyId))
        {
            var newProp = new Properties
            {
                ElementId = sourceProperty.ElementId,
                Name = sourceProperty.Name,
                ObjectId = sourceProperty.ObjectId,
                PropertyId = sourceProperty.PropertyId,
                Value = sourceProperty.Value
            };
            tempTargetProperties.Add(newProp);
        }
        else
        {
            var modifiedProp = new Properties
            {
                ElementId = sourceProperty.ElementId,
                Name = sourceProperty.Name,
                ObjectId = sourceProperty.ObjectId,
                PropertyId = sourceProperty.PropertyId,
                Value = sourceProperty.Value
            };
            modifiedProp.MarkAsModified();
            tempTargetProperties.Add(modifiedProp);
        }
    }
    return tempTargetProperties;
}
Why is there an exception?
When you transport an object graph (an entity with n-level deep navigation properties) to a client application, the entities record any changes made in their respective change trackers. When the entity (or object graph) is sent back to the server side of the application, basically all you need to do is:
try
{
    using (Entities context = new Entities())
    {
        context.ApplyChanges(someEntity);
        context.SaveChanges();
    }
}
catch
{
    ...
}
I don't see the need for all the code you posted above; what are you trying to achieve with it? As for the exception itself: most likely the first version of ImportProperties re-uses the very same Properties instances from the incoming graph (marking them Added/Modified) while the context already tracks objects with those key values, so two tracked objects end up sharing a key; the second version creates fresh instances, which avoids the conflict.