LoopBack REST connector: mapping the response data to a model?

I've set up a simple "product" model (i.e. {id: "string", name: "string", etc.}) and a datasource using the REST connector pointing at a remote URL that returns a JSON blob containing dozens of fields. How do I go about mapping the fields from the remote response onto my local model? Whenever I execute my method I get back the raw response from the remote service... I was expecting, at a minimum, to get back an empty version of my model.

I'm pretty sure you will have to override the find() method on your model and perform this mapping work manually.
Something like this:
module.exports = function(app) {
  var Product = app.models.Product;
  var find = Product.find;
  Product.find = function(filter, cb) {
    // invoke the default method, forwarding the filter
    find.call(Product, filter, function(err, original_results) {
      if (err) return cb(err);
      var results = {}; // a placeholder for your expected results
      results.id = original_results.id;
      results.name = original_results.name;
      results.description = original_results.long_description;
      // and so on
      cb(null, results);
    });
  };
};
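With the override in place, callers keep using the model's normal API; a quick usage sketch (the empty filter is just an illustration):
// the mapping happens transparently inside the overridden find()
Product.find({}, function (err, product) {
  if (err) return console.error(err);
  console.log(product.name, product.description);
});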


What is the difference between setJSON, setData and loadData?

This is regarding the mentioned methods of sap.ui.model.json.JSONModel in SAPUI5:
setJSON
setData
loadData
What is the difference between these three methods? When do we use each of them, and can more than one of them be used for the same purpose?
Have a look at the well-documented API Reference for JSONModel.
In summary (from SAP Documentation):
setData: Sets the data, passed as a JS object tree, to the model.
e.g.:
var data = {
  "ProductCollection": [{
    "titleId": 0,
    "Name": "Olayinka O",
    "ProductId": "001",
    "chartValue": 75,
    "ProductPicUrl": "sap-icon://competitor"
  }]
};
var oModel = new sap.ui.model.json.JSONModel(data);
// OR
var oModel = new sap.ui.model.json.JSONModel();
oModel.setData(data);
// setData could also take data retrieved from an OData URL in JSON format
loadData:
Load JSON-encoded data from the server using a GET HTTP request and store the resulting JSON data in the model. Note: due to browser security restrictions, most "Ajax" requests are subject to the same-origin policy; the request cannot successfully retrieve data from a different domain, subdomain, or protocol.
E.g. you can use this to load/GET changes to the data/model and have the view update automatically (if it is bound to that model) by reloading the URL. If you use loadData, you don't need the other two in my opinion, but loadData will not work on local JSON data (i.e. data you already have as a JS object).
var sURL = "https://cors-anywhere.herokuapp.com/https://services.odata.org/V3/Northwind/Northwind.svc/Products?$format=json";
var oModel = new sap.ui.model.json.JSONModel();
// if called in setInterval, backend changes show up in the bound view, in this case every second
setInterval(function () {
  oModel.loadData(sURL, true);
}, 1000);
setJSON:
Sets the data, passed as a string in JSON format, to the model.
i.e. the same as setData, but it takes a strict JSON string.
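For illustration, the same data object shown above could be handed over as a string instead (a small sketch):
var oModel = new sap.ui.model.json.JSONModel();
// same ProductCollection data as above, serialized to a JSON string
oModel.setJSON(JSON.stringify(data));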
Luckily, the source code of UI5 is quite readable and often better documentation than most of the API descriptions. Here is what each of the APIs does, basically:
setJSON
"Parse the JSON text and call setData"
JSONModel.prototype.setJSON = function(sJSON, bMerge) {
  var oJSONData;
  try {
    oJSONData = jQuery.parseJSON(sJSON);
    this.setData(oJSONData, bMerge);
  } catch (e) {
    // ...
  }
};
Source
setData
"Store the data and notify all dependent bindings (checkUpdate)"
JSONModel.prototype.setData = function(oData /* plain JS object */, bMerge) {
  if (bMerge) {
    this.oData = /* merge with existing data */;
  } else {
    this.oData = oData;
  }
  // ...
  this.checkUpdate(); // notifies dependent bindings
};
Source
loadData
"Load data from the given remote URL and call setData" --> Please check the source here.
In short, they all call setData at some point.
Which API to call in which situation depends on the format in which you have the data available:
The data are in JSON text --> setJSON
The data are somewhere else (remote) --> loadData
The data are already available as a JS object / array --> setData
setData
You have a JavaScript object and want to use this data as your model
const oJSONData = {
  data: {
    id: 4,
    first_name: "Eve",
    last_name: "Holt",
    avatar: "https://s3.amazonaws.com/uifaces/faces/twitter/marcoramires/128.jpg"
  }
};
oJSONModel.setData(oJSONData);
setJSON
You have a String that when parsed represents a JavaScript object and want to use this data as your model
const sJSONData = '{"data":{"id":4,"first_name":"Eve","last_name":"Holt","avatar":"https://s3.amazonaws.com/uifaces/faces/twitter/marcoramires/128.jpg"}}';
oJSONModel.setJSON(sJSONData);
loadData
You want to access a remote API which returns data as JSON and want to use this data as your model
const sURL = "https://reqres.in/api/users/4";
oJSONModel.loadData(sURL);
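Note that loadData is asynchronous; if you need to act once the response has arrived, you can attach to the model's requestCompleted event (a small sketch, assuming the payload shape shown in the setData example above):
oJSONModel.attachRequestCompleted(function () {
  // the model now holds the fetched data
  console.log(oJSONModel.getProperty("/data/first_name"));
});
oJSONModel.loadData(sURL);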

How do you get a syncfusion custom adapter to work with the feathers socket.io client

feathers-client 2.3.0
syncfusion-javascript 15.3.29
I have been trying for a while to create a Syncfusion custom adapter for the feathers socket.io version of its client. I know I can use REST to get data, but in order to do offline sync I need to use the feathers-offline-realtime plugin.
Also, I am using this in an Aurelia project, so I am using ES6 imports with Babel.
Here is a code snippet I have tried; I can post the whole thing if needed.
I am also not sure if just using the Adapter vs. the UrlAdapter is correct, as I need sorting and paging to hit the server and not just be done locally. I think I can figure that part out if I can at least get some data back.
Note: Per Prince Oliver, I am adding a clarification to the question: I need to be able to call any of the adapter's methods besides just processQuery, such as onSort. When the data grid calls the onSort method, I need to be able to call my API using the feathers socket.io client, since it handles socket.io in a special manner for offline capabilities.
import io from 'socket.io-client';
import * as feathers from 'feathers-client';

const baseUrl = 'http://localhost:3030';
const socket = io.connect(baseUrl);
const client = feathers.default()
  .configure(feathers.hooks())
  .configure(feathers.socketio(socket));
const customers = client.service('customers');

export class FeathersAdapter {
  feathersAdapter = new ej.Adaptor().extend({
    processQuery: function (ds, query) {
      let results;
      makeMeLookSync(function* () {
        results = yield customers.find();
        console.log(results);
      });
The result is undefined. I have tried several other ways but this one seems like it should work.
REVISED CODE:
I am now getting data, but also a strange error (as noted in the picture) when I call
let results = await customers.find();
The process then continues and I get data, but when the result variable is returned there is still no data in the grid.
async processQuery(ds, query) {
  let baseUrl = 'http://localhost:3030';
  let socket = io.connect(baseUrl);
  let client = feathers.default()
    .configure(feathers.hooks())
    .configure(feathers.socketio(socket));
  let customers = client.service('customers');
  let results = await customers.find();
  var result = results, count = result.length, cntFlg = true, ret, key, agg = {};
  for (var i = 0; i < query.queries.length; i++) {
    key = query.queries[i];
    ret = this[key.fn].call(this, result, key.e, query);
    if (key.fn == "onAggregates")
      agg[key.e.field + " - " + key.e.type] = ret;
    else
      result = ret !== undefined ? ret : result;
    if (key.fn === "onPage" || key.fn === "onSkip" || key.fn === "onTake" || key.fn === "onRange") cntFlg = false;
    if (cntFlg) count = result.length;
  }
  return result;
}
The processQuery method in the DataManager is used to process the parameters which are set in the ej.Query (like skip, take, page) before fetching the data. The data is then fetched asynchronously based on these parameters, and the fetched data is processed in the processResponse method to perform operations like filtering or modifying. The processQuery function operates synchronously and does not wait for the asynchronous call to complete. Hence the data returned from the API does not get bound to the Grid, and you get the undefined error.
So, if you are using socket.io to fetch the data from the API, the data can be bound directly to the Grid control using the dataSource property. Once the dataSource is updated with the result, it will be reflected in the Grid automatically through two-way binding.
[HTML]
<template>
  <div>
    <ej-grid e-data-source.bind="gridData" e-columns.bind="cols"></ej-grid>
  </div>
</template>
[JS]
// inside an async method of the Aurelia view model (e.g. async activate()),
// since await is only valid in an async function
let baseUrl = 'http://localhost:3030';
let socket = io.connect(baseUrl);
let client = feathers.default()
  .configure(feathers.hooks())
  .configure(feathers.socketio(socket));
let customers = client.service('customers');
let results = await customers.find();
this.gridData = results; // bind the data to the Grid

Calling an XSJS file in SAPUI5 for writing data to HANA

I'm trying to create a service to write data from a SAPUI5 frontend to HANA tables.
As far as I've checked, I believe this is not possible via OData services, so I've found another way: using an XSJS file with a SQL INSERT statement.
Now, my problem is using this in UI5. With OData I would use something like oModel.create, but I think that doesn't work here.
Does anyone have a clue?
Thanks!
Eva
UPDATE:
After using the first answer, I tried to create an entry in a HANA Table, but I'm getting a 500 error. Here's the code of the xsjs file:
var data = '', conn = $.db.getConnection(), pstmt;
if ($.request.body) {
  data = $.request.parameters.get("firstName");
}
var conn = $.db.getConnection();
var pstmt = conn.prepareStatement('INSERT INTO "Z003HB1N"."T_TEST" (FIRSTNAME) VALUES(?)');
pstmt.setString(1, data);
pstmt.execute();
pstmt.close();
conn.commit();
conn.close();
doResponse(200, '');
$.response.contentType = 'text/plain';
$.response.setBody('Upload ok');
$.response.status = 200;
Any clue what might be wrong?
You can simply use AJAX calls to call your new service on HANA and use URL parameters to hand over the desired values to your .xsjs service.
Example:
var query = "firstName=Sherlock&lastName=Holmes";
jQuery.ajax({
  url: "url/to/your/Service.xsjs?" + query,
  success: function(response) {
    // will be called once the xsjs file sends a response
    console.log(response);
  },
  error: function(e) {
    // will be called in case of any errors
    console.log(e);
  }
});
On HANA you can access the provided parameters in your service like this:
var firstName = $.request.parameters.get("firstName");
var lastName = $.request.parameters.get("lastName");
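For completeness, a minimal sketch of a full .xsjs handler combining the parameter access above with the INSERT from the question (the table and column names are taken from the question; the try/catch/finally structure is an assumption, not verified code):
var firstName = $.request.parameters.get("firstName");
var conn = $.db.getConnection();
try {
  // assumed table/column, mirroring the question's T_TEST example
  var pstmt = conn.prepareStatement('INSERT INTO "Z003HB1N"."T_TEST" (FIRSTNAME) VALUES(?)');
  pstmt.setString(1, firstName);
  pstmt.execute();
  pstmt.close();
  conn.commit();
  $.response.contentType = 'text/plain';
  $.response.setBody('Upload ok');
  $.response.status = 200;
} catch (e) {
  $.response.contentType = 'text/plain';
  $.response.setBody('Error: ' + e.message);
  $.response.status = 500;
} finally {
  conn.close();
}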

Meteor: Unique MongoDB URL for different users

I'm very keen to utilize Meteor as the framework for my next project. However, there is a requirement to keep customer data separated into different MongoDB instances for users from different customers.
I have read on this thread that it could be as simple as using this:
var d = new MongoInternals.RemoteCollectionDriver("<mongo url>");
C = new Mongo.Collection("<collection name>", { _driver: d });
However, I was dished this error in my server/server.js. I'm using Meteor 0.9.2.2 with meteor-platform 1.1.0.
Exception from sub Ep9DL57K7F2H2hTBz Error: A method named '/documents/insert' is already defined
at packages/ddp/livedata_server.js:1439
at Function._.each._.forEach (packages/underscore/underscore.js:113)
at _.extend.methods (packages/ddp/livedata_server.js:1437)
at Mongo.Collection._defineMutationMethods (packages/mongo/collection.js:888)
at new Mongo.Collection (packages/mongo/collection.js:208)
at Function.Documents.getCollectionByMongoUrl (app/server/models/documents.js:9:30)
at null._handler (app/server/server.js:12:20)
at maybeAuditArgumentChecks (packages/ddp/livedata_server.js:1594)
at _.extend._runHandler (packages/ddp/livedata_server.js:943)
at packages/ddp/livedata_server.js:737
Can anyone be so kind as to enlighten me whether or not I have made a mistake somewhere?
Thanks.
Br,
Ethan
Edit: This is my server.js
Meteor.publish('userDocuments', function () {
  // Get the company data store's mongo URL here. Simulate by matching the domain of the user's email.
  var user = Meteor.users.findOne({ _id: this.userId });
  if (!user || !user.emails) return;
  var email = user.emails[0].address;
  var mongoUrl = (email.indexOf('@gmail.com') >= 0) ?
    'mongodb://localhost:3001/company-a-db' :
    'mongodb://localhost:3001/company-b-db';
  // Return documents
  return Documents.getCollectionByMongoUrl(mongoUrl).find();
});
and this is the server side model.js
Documents = function () { };
var documentCollections = { };
Documents.getCollectionByMongoUrl = function (url) {
  if (!(url in documentCollections)) {
    var driver = new MongoInternals.RemoteCollectionDriver(url);
    documentCollections[url] = new Meteor.Collection("documents", { _driver: driver });
  }
  return documentCollections[url];
};
Observation: The first attempt to new a Meteor.Collection works fine. I can continue to use that collection multiple times. But when I log out and log in as another user from another company (in this example by using an email that is not from @gmail.com), the error above is thrown.
I downloaded Meteor's source code and peeked into the mongo package. There is a way to hack around having to declare different collection names on the MongoDB server, based on Hubert's suggestion.
In the server-side model.js, I've made these adaptations:
Documents.getCollectionByMongoUrl = function (userId, url) {
  if (!(userId in documentCollections)) {
    var driver = new MongoInternals.RemoteCollectionDriver(url);
    documentCollections[userId] = new Meteor.Collection("documents" + userId, { _driver: driver });
    documentCollections[userId]._connection = driver.open("documents", documentCollections[userId]._connection);
  }
  return documentCollections[userId];
};
Super hack job here. Be careful when using this!!!!
I believe Meteor distinguishes its collections internally by the name you pass as the first argument, so when you create the "documents" collection a second time, it tries to override the existing structure. Hence the error when it tries to define the /documents/insert method again.
To work around this, you could apply a suffix to your collection name. So instead of:
new Meteor.Collection('documents', { _driver: driver });
you should try:
new Meteor.Collection('documents_' + userId, { _driver: driver })

Application with fake data source for UI development

I have a web application with an Angular / Breeze client side calling into a Breeze Web API, which uses an Entity Framework code-first model. I have a datacontext (Angular service) responsible for all communication with the server.
I would like to completely separate the server development from the client-side development so that developers need not even have .NET installed on their systems. I would like the solution to require very little coding in the way of creating fakes, because the app changes frequently and I do not want to rewrite the fakes every time my implementation changes. I have a bunch of test data in the database that I would like to make available on the client.
What is a good way (standard way?) to achieve this?
Just create mocks. You don't even have to make a RESTful call if you don't want to; just have your service decide whether to hit the server or pull from the cache, and load up your cache locally on start:
function loadMocks(manager) {
  var personMockOne = manager.createEntity('Person', { id: 1, firstName: 'John', lastName: 'Smith' });
  var companyMockOne = manager.createEntity('Company', { id: 1, name: 'Acme Inc.' });
  companyMockOne.employees.push(personMockOne);
}
http://pwkad.wordpress.com/2014/02/02/creating-mocks-with-breeze-js/
To Expand...
Doing this requires a bit of extra setup. I personally always write my queries separately from my controller / view-model logic, in a service which takes parameters. A few example parameters are typically something like parameters and forceRemote. The idea is that when you go to execute the query, you can decide whether to hit the server or query locally. A quick example:
function queryHereOrThere(manager, parameters, forceRemote) {
  // EntityQuery is immutable, so chain the calls and keep the result
  var query = breeze.EntityQuery.from('EntityName')
    .where(parameters)
    .using(manager);
  if (!forceRemote) {
    // synchronous: returns the matching entities from the local cache
    return query.executeQueryLocally();
  }
  // asynchronous: returns a promise resolving to { results: [...] }
  return query.executeQuery();
}
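A hypothetical call site might look like this (the entity name above and the predicate here are placeholders):
// usage sketch: force a server round trip, then render the results
queryHereOrThere(manager, breeze.Predicate.create('lastName', '==', 'Smith'), true)
  .then(function (data) {
    console.log(data.results);
  });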
Here is my current solution.
Get data from the server with a 'unit test' that creates a Breeze Web API controller and uses it to gather the breeze metadata and all the test data from the database, then writes that data to testData.json and breezeMetadata.json.
Abstract the creation of the Breeze Entity Manager to an Angular service entityManager.
Create a fakeEntityManager Angular service, which: 1) creates the entity manager, 2) overrides the EntityManager.executeQuery function to always use the local version, and 3) loads up the mgr with the test data. The code for that service is below.
In the datacontext service, use the $injector service to conditionally inject a real or a fake entity manager.
datacontext.js
angular.module('app').factory('datacontext', ['$injector', 'config', datacontext]);

function datacontext($injector, config) {
  var mgr;
  if (config.useLocalData === true) {
    mgr = $injector.get('fakeEntityManager');
  } else {
    mgr = $injector.get('entityManager');
  }
  ...
fakeEntityManager.js
(function () {
  'use strict';

  var serviceId = 'fakeEntityManager';
  angular.module('app').factory(serviceId, ['breeze', 'common', em]);

  function em(breeze, common) {
    var $q = common.$q;
    var mgr = getMgr();
    populateManager(["Projects", "People", "Organizations"]);
    return mgr;

    function getMgr() {
      breeze.EntityManager.prototype.executeQuery = function (query) {
        return $q.when(this.executeQueryLocally(query)).then(function (results) {
          var data = {
            results: results
          };
          if (query.inlineCountEnabled == true) data.inlineCount = results.length;
          return data;
        });
      };

      var metaData = < PASTE JSON HERE >;

      new breeze.ValidationOptions({ validateOnAttach: false }).setAsDefault();

      var metadataStore = new breeze.MetadataStore();
      metadataStore.importMetadata(metaData, true);

      return new breeze.EntityManager({
        dataService: new breeze.DataService({
          serviceName: "fakeApi",
          hasServerMetadata: false // don't ask the server for metadata
        }),
        metadataStore: metadataStore
      });
    }

    function populateManager(resources) {
      var testData = < PASTE JSON HERE >;
      resources.forEach(function (resource) {
        testData[resource].forEach(function (entity) {
          mgr.createEntity(mgr.metadataStore.getEntityTypeNameForResourceName(resource), entity);
        });
      });
    }
  }
})();
If you don't use inlineCount queries, there is no need to override executeQuery. You can just add the following property to the EntityManager constructor's parameter object:
queryOptions: new breeze.QueryOptions({ fetchStrategy: breeze.FetchStrategy.FromLocalCache })
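In context, the constructor call from getMgr() above would then look something like this sketch:
return new breeze.EntityManager({
  dataService: new breeze.DataService({
    serviceName: "fakeApi",
    hasServerMetadata: false
  }),
  metadataStore: metadataStore,
  // always satisfy queries from the local cache instead of the server
  queryOptions: new breeze.QueryOptions({ fetchStrategy: breeze.FetchStrategy.FromLocalCache })
});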
Todo: Override the EntityManager.saveChanges() function (or somehow configure the entity manager) to prevent calls to the server while still allowing entities to be edited and saved locally.
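One possible direction for that todo, sketched under the assumption that simply accepting the changes locally is enough (not part of the solution above, and it ignores keyMappings for server-generated keys):
breeze.EntityManager.prototype.saveChanges = function (entities) {
  var changes = entities || this.getChanges();
  changes.forEach(function (entity) {
    entity.entityAspect.acceptChanges(); // mark each entity as Unchanged locally
  });
  // mimic the shape of a real save result without hitting the server
  return $q.when({ entities: changes, keyMappings: [] });
};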