Why is my filter not working in v2.ODataModel "read"? - sapui5

I am using the OData model to read data, but the filter is not applied. Check the code below:
getGuid: function (pernr) {
  var self = this;
  var url = "/PersonalDetailSet?$filter=Pernr eq '00000001'";
  self.setBusy(true);
  this.oModel.read(url, {
    success: function (res) {
      // ...
    },
    error: function () {
      // ...
    }
  });
}
I don't know why the filter in the URL is not working.

Check if your OData service supports the $filter query in the first place.
Use the read method correctly:
myV2ODataModel.read("/PersonalDetailSet"/* No $filter queries here! */, {
  filters: [ // <-- Should be an array, not a Filter instance!
    new Filter({ // required from "sap/ui/model/Filter"
      path: "myField",
      operator: FilterOperator.EQ, // required from "sap/ui/model/FilterOperator"
      value1: "..."
    })
  ],
  // ...
});
API reference: sap.ui.model.odata.v2.ODataModel#read
API reference: sap.ui.model.Filter
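Putting the pieces together, a minimal sketch of what the asker's getGuid could look like with the filters parameter (the controller name is illustrative; this.oModel and setBusy are taken from the question's code):
sap.ui.define([
  "sap/ui/core/mvc/Controller",
  "sap/ui/model/Filter",
  "sap/ui/model/FilterOperator"
], function (Controller, Filter, FilterOperator) {
  "use strict";
  return Controller.extend("my.app.controller.Detail", { // name is illustrative
    getGuid: function (pernr) {
      var self = this;
      self.setBusy(true);
      this.oModel.read("/PersonalDetailSet", {
        filters: [
          new Filter({
            path: "Pernr",
            operator: FilterOperator.EQ,
            value1: pernr // e.g. "00000001"
          })
        ],
        success: function (res) {
          self.setBusy(false);
          // ...
        },
        error: function () {
          self.setBusy(false);
        }
      });
    }
  });
});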

First, check whether you are actually getting the model in scope. As far as I can see, this.oModel is not the proper way of getting the model. Better to use this.getModel() or this.getView().getModel() and then check the call. Passing the filter in the URL is not the right way, but it should still work.
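For example, a minimal sketch of retrieving the model from the view (this assumes the OData model was set as the default model on the view or component):
// In a controller method; names are illustrative.
var oModel = this.getView().getModel(); // or this.getOwnerComponent().getModel()
oModel.read("/PersonalDetailSet", { /* filters, success, error, ... */ });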

If you want to apply additional URL Parameters in the read function you have to do this via the "urlParameters" parameter:
getGuid: function (pernr) {
  var self = this;
  var url = "/PersonalDetailSet";
  self.setBusy(true);
  this.oModel.read(url, {
    urlParameters: {
      "$filter": "Pernr eq '00000001'"
    },
    success: function (res) {
      self.setBusy(false);
      self.guid = res.results[0].Guid;
    },
    error: function () {
      self.setBusy(false);
    }
  });
}

Related

mongodb, express.js. Add new doc to array of documents, selector is id

I want to add a new document to an array of documents. I pass in my param, which is the _id of the document I want to add to, and then I just need to add the new document to the array. I thought I had it working, but it was actually adding a nested array to that array; I realized this because I am also trying to sort it so newly added documents are at the top. So I went back to try to fix my add query, and as of now it basically just says it cannot add values. I am using the mongodb client, express, and await.
I have been looking at the MongoDB manual and trying what they have, but I cannot get it to work; obviously something is wrong with how I am adding the new document. Can anyone see the issue or show me an example? Thanks!
app.post("/addComment/:id", async (request, response) => {
  let mongoClient = new MongoClient(URL, { useUnifiedTopology: true });
  try {
    await mongoClient.connect();
    let id = new ObjectId(request.sanitize(request.params.id));
    request.body.comments = { $push: { "comments.author": "myTestPOSTMAN - 1", "comments.comment": "myTestCommPostMan - 1" } };
    let selector = { "_id": id };
    //let newValues = {$push: {"comments.comment": "myTestCommPostMan - 1", "comments.author": "myTestPOSTMAN - 1"}};
    let newValues = request.body.comments;
    let result = await mongoClient.db(DB_NAME).collection("photos").updateOne(selector, newValues);
    if (JSON.parse(result).n <= 0) {
      response.status(404);
      response.send({ error: "No documents found with ID" });
      mongoClient.close();
      return;
    }
    response.status(200);
    response.send(result);
  } catch (error) {
    response.status(500);
    response.send({ error: error.message });
    throw error;
  } finally {
    mongoClient.close();
  }
});
Using Postman, this is what my JSON looks like, and the array of documents I am trying to add to:
{"comments": [
{
"comment": "pm - test3",
"author": "pm - test4"
}
]
}
Do the MongoDB connection outside the function; there is no need to connect and disconnect on every call, and don't create too many unnecessary variables.
To push an object you need to provide the main key name and assign the object to it.
let mongoClient = new MongoClient(URL, { useUnifiedTopology: true });
await mongoClient.connect();

app.post("/addComment/:id", async (request, response) => {
  try {
    let result = await mongoClient.db(DB_NAME).collection("photos").updateOne(
      { "_id": new ObjectId(request.sanitize(request.params.id)) },
      { $push: { comments: request.body.comments } }
    );
    // updateOne resolves to a result object, not a JSON string,
    // so check matchedCount instead of JSON.parse(result).n
    if (result.matchedCount <= 0) {
      response.status(404).send({ error: "No documents found with ID" });
      return;
    }
    response.status(200).send(result);
  } catch (error) {
    response.status(500).send({ error: error.message });
  }
});
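One caveat: in the question's Postman body, comments is an array, and $push with an array value inserts the whole array as a single element. If the client really sends an array of comment objects, the $each modifier appends them one by one; a sketch using the same collection and field names as above:
// $each pushes each element of the incoming array as its own entry
// in the "comments" array, instead of nesting the array itself.
await mongoClient.db(DB_NAME).collection("photos").updateOne(
  { "_id": new ObjectId(request.sanitize(request.params.id)) },
  { $push: { comments: { $each: request.body.comments } } }
);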

Perform a facet search query with Algolia autocomplete

My index objects have a city field and I'd like to retrieve these with autocomplete, but the documentation seems to be missing an explanation of how to perform such a query (only basic search documentation is available). I found a prototype IndexCore.prototype.searchForFacetValues in autocomplete.js, but I have no idea how to use it.
You should be able to use the following source:
var client = algoliasearch("YourApplicationID", "YourSearchOnlyAPIKey");
var index = client.initIndex("YourIndex");
autocomplete("#search-input", { hint: false }, [
  {
    source: function (query, callback) {
      index
        .searchForFacetValues({
          facetName: "countries",
          facetQuery: query
        })
        .then(function (answer) {
          // facet queries return facetHits, not hits
          callback(answer.facetHits);
        })
        .catch(function () {
          callback([]);
        });
    },
    // each facet hit has the shape { value, highlighted, count }
    displayKey: "value",
    templates: {
      suggestion: function (suggestion) {
        return suggestion.highlighted;
      }
    }
  }
]);
This uses the searchForFacetValues method to get the results.
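Note that searchForFacetValues only works on attributes declared as searchable facets in the index settings. If the call errors out, a one-time settings change like the sketch below may be needed (it requires an API key with settings rights, not the search-only key):
// Declare "countries" as a searchable facet so that
// searchForFacetValues can query it.
index.setSettings({
  attributesForFaceting: ["searchable(countries)"]
});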

Handling nested callbacks/promises with Mongoose

I am a beginner with Node.js and Mongoose. I spent an entire day trying to resolve an issue by scouring through SO, but I just could not find the right solution. Basically, I am using the retrieved values from one collection to query another. In order to do this, I am iterating through a loop of the previously retrieved results.
With the iteration, I am able to populate the results that I need. Unfortunately, the area where I am having an issue is that the response is being sent back before the required information is gathered in the array. I understand that this can be handled by callbacks/promises. I tried numerous ways, but I just haven't been successful with my attempts. I am now trying to make use of the Q library to facilitate the callbacks. I'd really appreciate some insight. Here's a snippet of the portion where I'm currently stuck:
var length = Object.keys(purchasesArray).length;
var jsonArray = [];
var getProductDetails = function () {
  var deferred = Q.defer();
  for (var i = 0; i < length; i++) {
    var property = Object.keys(purchasesArray)[i];
    if (purchasesArray.hasOwnProperty(property)) {
      var productID = property;
      var productQuery = Product.find({ asin: productID });
      productQuery.exec(function (err, productList) {
        jsonArray.push({
          "productName": productList[0].productName,
          "quantity": purchasesArray[productID]
        });
      });
    }
  }
  return deferred.promise;
};
getProductDetails().then(function sendResponse() {
  console.log(jsonArray);
  response = {
    "message": "The action was successful",
    "products": jsonArray
  };
  res.send(response);
  return;
}).fail(function (err) {
  console.log(err);
})
});
In particular, I am only able to send one of the two objects in the jsonArray array, as the response is sent right after the first element.
Update
Thanks to Roamer-1888's answer, I have been able to construct a valid JSON response without having to worry about the error of setting headers after sending a response.
Basically, in the getProductDetails() function, I am trying to retrieve product names from the Mongoose query while mapping the quantity for each of the items in purchasesArray. From the function, eventually, I would like to form the following response:
response = {
"message": "The action was successful",
"products": jsonArray
};
where jsonArray would be in the following form from getProductDetails:
jsonArray.push({
"productName": products[index].productName,
"quantity": purchasesArray[productID]
});
On the assumption that purchasesArray is the result of an earlier query, it would appear that you are trying to:
query your database once per purchasesArray item,
form an array of objects, each containing data derived from the query AND the original purchasesArray item.
If so, and with a few other guesses, the following pattern should do the job:
var getProductDetails = function () {
  // map purchasesArray to an array of promises
  var promises = purchasesArray.map(function (item) {
    return Product.findOne({
      asin: item.productID // some property of the desired item
    }).exec()
      .then(function (product) {
        // Here you can freely compose an object comprising data from:
        // * the synchronously derived `item` (an element of `purchasesArray`)
        // * the asynchronously derived `product` (from the database).
        // `item` is still available thanks to "closure".
        // For example:
        return {
          'productName': product.name,
          'quantity': item.quantity,
          'unitPrice': product.unitPrice
        };
      })
      // By catching here, no individual error will cause the whole response to fail.
      .then(null, (err) => null);
  });
  return Promise.all(promises); // a promise that resolves once every promise in `promises` has settled (individual errors were converted to null above).
};
getProductDetails().then(results => {
  console.log(results); // `results` is an array of the objects composed in getProductDetails(), with properties 'productName', 'quantity' etc.
  res.json({
    'message': "The action was successful",
    'products': results
  });
}).catch(err => {
  console.log(err);
  res.sendStatus(500); // or similar
});
Your final code will differ in detail, particularly in the composition of the composed object. Don't rely on my guesses.
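One version-dependent note: .exec() without a callback only returns a promise if Mongoose has a promise implementation available. On older Mongoose 4.x releases you may want to plug in native promises first, roughly:
// Mongoose 4.x: use native ES6 promises instead of the
// deprecated built-in mpromise library.
var mongoose = require('mongoose');
mongoose.Promise = global.Promise;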

Take advantage of blueprints / waterline findWhere inside custom controller

I have a bear model and I'm using it with blueprint REST.
// api/models/Bear.js
module.exports = {
  attributes: {
    name: {
      type: 'string',
      required: true
    }
  }
};
I'd like to perform some calculations on bears based on exactly the same criteria as the standard findWhere. Indeed, I'd like to be able to request
GET /bear/details
exactly just like I request
GET /bear
So I could find bear details with:
complex query like ?where={}
fields like ?name=
but also sending json in body like {name: ''}
or maybe even using ?limit= etc.
The controller looks like this:
// api/controllers/BearController.js
module.exports = {
  getDetails: function (req, res) {
    Bear.find().exec(function (err, bears) {
      if (err) return res.serverError(err);
      var bearsDetails = _.map(bears, function (bear) {
        return {
          id: bear.id,
          nameLength: bear.name.length,
          reversedName: bear.name.split('').reverse().join('')
        };
      });
      return res.json(bearsDetails);
    });
  }
};
And I have a custom route that looks like this
// config/routes.js
module.exports.routes = {
  'get /bear/details': 'BearController.getDetails'
};
=> How to automatically filter models exactly like in a findWhere request, in a custom controller, without reinventing the wheel?
Apparently I figured it out myself by digging into Sails' find() source code. One can use actionUtil's parseCriteria(req). I personally wrapped it into a service for cleanliness.
Roughly:
api/services/ActionUtilService.js
module.exports = require('../../node_modules/sails/lib/hooks/blueprints/actionUtil');
api/controllers/BearController.js
module.exports = {
  getDetails: function (req, res) {
    let criteria = ActionUtilService.parseCriteria(req);
    Bear.find(criteria).exec(function (err, bears) {
      if (err) return res.serverError(err);
      var bearsDetails = _.map(bears, function (bear) {
        return {
          id: bear.id,
          nameLength: bear.name.length,
          reversedName: bear.name.split('').reverse().join('')
        };
      });
      return res.json(bearsDetails);
    });
  }
};
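With that in place, the custom action should accept the same query shapes the question lists for the blueprint find, for example (values are illustrative):
GET /bear/details?name=Paddington
GET /bear/details?where={"name":{"contains":"a"}}
GET /bear/details?limit=5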

Model attributes destroyed on form data save

I'm having a problem saving an existing model after a form submit in my backbone/marionette app.
So, I submit the form and use backbone.syphon to create a representation of the form data. The object I build looks like this:
{
  languages: {
    de_DE: {
      default: false,
      enabled: true
    },
    en_US: {
      ...
    },
    ...
  }
}
I'm trying to save it to a backbone model with attributes that look like this:
attributes: {
  id: "5",
  languages: {
    de_DE: {
      default: false,
      label: "German",
      language: "de_DE",
      selected: false
    },
    en_CA: {
      ...
    },
    ...
  }
}
The problem is that when I save the existing model using model.save(data) with the above data structure as data, the default and label properties are completely removed from my model. They're not even sent to the server; they are just completely removed, though they do still sit in the previous attributes object.
My model instance's sync setup looks like this:
sync: function (method, model, options) {
  var baseUrl = window.location.origin + '/qp-api/v1/master-app/';
  var config = {};
  switch (method) {
    case "create":
      break;
    case "read":
      config = _.extend(config, {
        method: "GET",
        url: baseUrl + this.id + '/languages'
      });
      break;
    case "update":
      config = _.extend(config, {
        method: "PUT",
        url: baseUrl + this.id + '/languages'
      });
      break;
    case "delete":
      break;
  }
  options = _.extend(options, config);
  return Backbone.Model.prototype.sync.call(this, method, model, options);
},
What am I doing wrong? I thought Backbone's save function would only update the changed attributes. It looks to me like my data object should map onto the setup of my model's attributes. Shouldn't they just update? Am I not understanding something about how saving an existing model works?
First, I want to mention that it's not a good idea to make checks such as if (languages.save(data)) { ... }: model.save() will return a promise object, so your if condition will not work as expected.
One of the solutions for your issue is to override the languages model's save method.
var Languages = Backbone.Model.extend({
  // Declaration of your model
  save: function (attrs, options) {
    // merge attrs here with the defaults/initial values
    return this.constructor.__super__.save.call(this, attrs, options);
  }
});
Hope this helps!
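For the merge step, one possibility is a deep extend of the incoming form data over the current attributes, so that nested keys the form does not submit (label, language, ...) survive the save. A sketch assuming jQuery is available (as is usual in a Marionette app):
var Languages = Backbone.Model.extend({
  save: function (attrs, options) {
    // Deep-merge the form data over the current attributes so that
    // nested keys missing from the form (e.g. "label") are preserved.
    // $.extend(true, ...) performs a deep merge.
    var merged = $.extend(true, {}, this.toJSON(), attrs);
    return Backbone.Model.prototype.save.call(this, merged, options);
  }
});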