I have a web application with an Angular / Breeze client calling into a Breeze Web API, which uses an Entity Framework code-first model. A datacontext (Angular service) is responsible for all communication with the server.
I would like to completely separate server development from client-side development, so that client developers need not even have .NET installed on their systems. The solution should require very little coding in the way of creating fakes, because the app changes frequently and I do not want to rewrite fakes every time my implementation changes. I also have a bunch of test data in the database that I would like to make available on the client.
What is a good way (standard way?) to achieve this?
Just create mocks. You don't even have to make a RESTful call if you don't want to; just have your service decide whether to hit the server or pull from cache, and load up your cache locally on start -
function loadMocks(manager) {
    var personMockOne = manager.createEntity('Person', { id: 1, firstName: 'John', lastName: 'Smith' });
    var companyMockOne = manager.createEntity('Company', { id: 1, name: 'Acme Inc.' });
    companyMockOne.employees.push(personMockOne);
}
http://pwkad.wordpress.com/2014/02/02/creating-mocks-with-breeze-js/
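Once the cache is loaded, a local query returns those mocks without touching the network. A minimal sketch (the 'Persons' resource name is an assumption; use whatever your metadata defines):

var query = breeze.EntityQuery.from('Persons'); // 'Persons' is a hypothetical resource name
var people = manager.executeQueryLocally(query); // synchronous, cache-only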
To Expand...
Doing this requires a bit of extra setup. I personally always write my queries separately from my controller / view-model logic, in a service whose query functions take arguments such as parameters and forceRemote. The idea is that when you go to execute the query you can decide whether to hit the server or query locally. A quick example -
function queryHereOrThere(manager, parameters, forceRemote) {
    // Breeze queries are immutable, so chain the calls and keep the result
    var query = breeze.EntityQuery.from('EntityName')
        .where(parameters)
        .using(manager);
    if (!forceRemote) {
        return query.executeQueryLocally(); // synchronous, cache-only
    }
    return query.executeQuery(); // asynchronous, returns a promise
}
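Hypothetical usage, assuming a Person entity with a lastName property (names are illustrative):

var pred = breeze.Predicate.create('lastName', '==', 'Smith');
// First load: force a round trip; later calls can hit the cache
queryHereOrThere(manager, pred, true).then(function (data) {
    console.log(data.results.length + ' people fetched from the server');
});
var cached = queryHereOrThere(manager, pred, false); // synchronous array of cached entities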
Here is my current solution.
Get data from the server with a 'unit test' that creates a Breeze Web API controller and uses it to gather the Breeze metadata and all the test data from the database, then writes that data to testData.json and breezeMetadata.json.
Abstract the creation of the Breeze entity manager into an Angular service, entityManager.
Create a fakeEntityManager Angular service, which: 1) creates the entity manager, 2) overrides the EntityManager.executeQuery function to always use the local version, and 3) loads the manager with the test data. The code for that service is below.
In the datacontext service, use the $injector service to conditionally inject a real or a fake entity manager.
datacontext.js
angular.module('app').factory('datacontext', ['$injector', 'config', datacontext]);

function datacontext($injector, config) {
    var mgr = config.useLocalData
        ? $injector.get('fakeEntityManager')
        : $injector.get('entityManager');
    ...
fakeEntityManager.js
(function () {
    'use strict';

    var serviceId = 'fakeEntityManager';
    angular.module('app').factory(serviceId, ['breeze', 'common', em]);

    function em(breeze, common) {
        var $q = common.$q;
        var mgr = getMgr();
        populateManager(['Projects', 'People', 'Organizations']);
        return mgr;

        function getMgr() {
            // Force every query through the local cache and wrap the result in the
            // same { results, inlineCount } shape that a server query resolves with.
            breeze.EntityManager.prototype.executeQuery = function (query) {
                return $q.when(this.executeQueryLocally(query)).then(function (results) {
                    var data = { results: results };
                    if (query.inlineCountEnabled === true) data.inlineCount = results.length;
                    return data;
                });
            };

            var metaData = < PASTE JSON HERE >;

            new breeze.ValidationOptions({ validateOnAttach: false }).setAsDefault();
            var metadataStore = new breeze.MetadataStore();
            metadataStore.importMetadata(metaData, true);

            return new breeze.EntityManager({
                dataService: new breeze.DataService({
                    serviceName: 'fakeApi',
                    hasServerMetadata: false // don't ask the server for metadata
                }),
                metadataStore: metadataStore
            });
        }

        function populateManager(resources) {
            var testData = < PASTE JSON HERE >;
            resources.forEach(function (resource) {
                testData[resource].forEach(function (entity) {
                    mgr.createEntity(mgr.metadataStore.getEntityTypeNameForResourceName(resource), entity);
                });
            });
        }
    }
})();
If you don't use inlineCount queries there is no need to override executeQuery. You can just add the following property to the EntityManager constructor's configuration object:
queryOptions: new breeze.QueryOptions({ fetchStrategy: breeze.FetchStrategy.FromLocalCache })
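In context, the manager construction would look something like this (a sketch based on the getMgr code above):

return new breeze.EntityManager({
    dataService: new breeze.DataService({ serviceName: 'fakeApi', hasServerMetadata: false }),
    metadataStore: metadataStore,
    queryOptions: new breeze.QueryOptions({ fetchStrategy: breeze.FetchStrategy.FromLocalCache })
});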
Todo: Override the EntityManager.saveChanges() function (or somehow configure the entity manager) to prevent calls to the server while still allowing entities to be edited and saved locally.
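One possible sketch for that todo (untested; it marks changed entities as saved locally and resolves with a fake save result):

mgr.saveChanges = function (entities) {
    var toSave = entities || mgr.getChanges();
    toSave.forEach(function (entity) {
        entity.entityAspect.acceptChanges(); // mark as Unchanged, as if saved
    });
    return $q.when({ entities: toSave, keyMappings: [] }); // mimic a SaveResult
};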
The common approach we follow today to get data in a client script is GlideAjax:
OnChange client script:
function onChange(control, oldValue, newValue, isLoading, isTemplate) {
    if (isLoading || newValue === '') {
        return;
    }
    var user = g_form.getValue('u_user');

    // Call the script include
    var ga = new GlideAjax('global.sampleUtils'); // script include
    ga.addParam('sysparm_name', 'getUserDetails'); // method to call
    ga.addParam('userId', user); // parameter
    ga.getXMLAnswer(getResponse);

    function getResponse(response) {
        console.log(response);
        var res = JSON.parse(response);
        console.log(res);
        g_form.setValue('u_phone', res.mobile_phone);
        g_form.setValue('u_email', res.email);
    }
}
Script include (must be marked client callable):
var sampleUtils = Class.create();
sampleUtils.prototype = Object.extendsObject(AbstractAjaxProcessor, {

    getUserDetails: function () {
        var userId = this.getParameter('userId'); // parameter sent from the client
        var obj = {};
        var grSysUser = new GlideRecord('sys_user');
        if (grSysUser.get(userId)) {
            obj.mobile_phone = grSysUser.getValue('mobile_phone');
            obj.email = grSysUser.getValue('email');
        }
        gs.addInfoMessage(JSON.stringify(obj)); // debugging only; remove in production
        return JSON.stringify(obj);
    },

    type: 'sampleUtils'
});
DEMO Link: https://youtu.be/nNUsfglmj_M
As an alternative to GlideAjax you can use EfficientGlideRecord:
new EfficientGlideRecord('sys_user')
    .addQuery('sys_id', newValue) // in an onChange client script, newValue holds the user's sys_id
    .addField('mobile_phone', true) // true = get the display value
    .query(function (egrSysUser) {
        if (egrSysUser.next()) {
            g_form.setValue('u_phone', egrSysUser.getDisplayValue('mobile_phone'));
        }
    });
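A sketch extending this to set the email field as well, mirroring the GlideAjax version above (field names as used there; API per the snprotips docs linked below):

new EfficientGlideRecord('sys_user')
    .addQuery('sys_id', newValue)
    .addField('mobile_phone', true) // display value
    .addField('email')
    .query(function (egrSysUser) {
        if (egrSysUser.next()) {
            g_form.setValue('u_phone', egrSysUser.getDisplayValue('mobile_phone'));
            g_form.setValue('u_email', egrSysUser.getValue('email'));
        }
    });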
What is EfficientGlideRecord?
EfficientGlideRecord is an alternative to GlideAjax.
It is a client-side API class that performs asynchronous, client-side GlideRecord-style queries while maximizing performance.
Benefits:
Low-code configuration with a large performance improvement.
No need to worry about security loopholes, because it enforces ACLs.
No more concerns about creating new client-callable script includes and maintaining the logic there.
Dependencies:
To use EfficientGlideRecord, commit the update set from the latest release here: https://github.com/thisnameissoclever/ServiceNow-EfficientGlideRecord/releases.
Add the package to the Portal record -> JS Includes.
That's it; you are good to use the EfficientGlideRecord syntax.
To learn more about EfficientGlideRecord, refer to the link below:
https://snprotips.com/efficientgliderecord
I have the following scenario, with this template:
<ul>
{{#each persons}}
{{Name}}
{{/each}}
</ul>
where persons = new ReactiveVar([]) in the template's .js file,
and I'm updating the persons variable in the callback of an HTTP REST API:
var instance = Template.instance();
API(url, (error, result) => instance.persons.set(result)) // result is an array
Nothing happens in the UI. How can I fix this? (I am willing to use a simple array as well, but the requirement is to populate the array from an API callback.)
Binding external APIs to a template can be solved with a classic Template-instance ReactiveVar / ReactiveDict (let's call it a reactive source). Note that you should not make these calls, or update a reactive source, in a helper, but rather inside an event handler or inside onCreated.
Let's take your template:
<ul>
{{#each persons}}
{{Name}}
{{/each}}
</ul>
We then make the call inside the onCreated function:
Template.myTemplate.onCreated(function () {
  const instance = this
  instance.state = new ReactiveDict()
  instance.state.set('persons', [])

  // Template's internal tracker
  instance.autorun(() => {
    API(url, (error, result) => instance.state.set('persons', result)) // result is an array
  })
})
And return the data only via the reactive source in the helper:
Template.myTemplate.helpers({
  persons() {
    return Template.instance().state.get('persons')
  }
})
Now this brings up another problem: the external API is usually not reactive, so the autorun will not trigger again when the data in the external API changes. If the source were a Mongo collection, the Template's internal Tracker would automatically re-run and update your persons state.
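For contrast, here is a minimal sketch with a reactive source (LocalPersons is a hypothetical client-only collection), where the autorun does re-run on every change:

const LocalPersons = new Mongo.Collection(null) // hypothetical client-only collection

Template.myTemplate.onCreated(function () {
  const instance = this
  instance.state = new ReactiveDict()
  instance.autorun(() => {
    // find().fetch() is a reactive data source, so this re-runs whenever LocalPersons changes
    instance.state.set('persons', LocalPersons.find().fetch())
  })
})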
If you only need to fetch the external data once, this is fine. However, in order to scan the external API for changes you have a few options:
Easy way: use a timer (setInterval):
let timerId

Template.myTemplate.onCreated(function () {
  const instance = this
  instance.state = new ReactiveDict()
  instance.state.set('persons', [])

  timerId = setInterval(() => {
    API(url, (error, result) => instance.state.set('persons', result)) // result is an array
  }, 5000) // scans every 5 seconds for updates
})

Template.myTemplate.onDestroyed(function () {
  if (timerId) {
    clearInterval(timerId)
    timerId = null
  }
})
Pros
simple to implement
fine grained tuning of timing for a fluent experience
Cons
setInterval is a sink
you have to clean it up to prevent memory leaks (in onDestroyed; see the variant after this list)
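A small variant (my suggestion, not part of the original approach) stores the timer on the template instance rather than in a module-level variable, so multiple instances of the template don't overwrite one shared timerId:

Template.myTemplate.onCreated(function () {
  // poll exactly as above, but keep the handle on the instance
  this.timerId = setInterval(() => { /* API(...) call as above */ }, 5000)
})

Template.myTemplate.onDestroyed(function () {
  if (this.timerId) clearInterval(this.timerId)
})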
Hard way: Let the external API's service call you!
If you have the option to let the external service connect to and call your app via DDP, the external service can decide when the data has changed and push it, so your app updates automatically.
You need a method and a collection for this:
server and client:
export const ExternalData = new Mongo.Collection('externalData')
server:
import { ExternalData } from 'path/to/externalData'
Meteor.methods({
  'myApp.updateExternalData' (args) {
    // check permissions...
    // check data integrity...
    const { url } = args
    const { data } = args
    ExternalData.update({ url }, { $set: data })
  }
})

Meteor.publish('myApp.externalData', function (url) {
  return ExternalData.find({ url })
})
Now on the client you just need to subscribe to the data and update the reactive var automatically:
client:
import { ExternalData } from 'path/to/externalData'
Template.myTemplate.onCreated(function () {
  const instance = this

  // subscribe to changes; url is assumed to be in scope (e.g. from the template's data context)
  instance.autorun(() => {
    const subscription = instance.subscribe('myApp.externalData', url)
    if (subscription.ready()) {
      console.log('myApp.externalData is ready')
    }
  })
})
Template.myTemplate.helpers({
  persons() {
    return ExternalData.find({})
  }
})
External Service / APP:
// if the external app is a meteor app you are lucky and can go with:
// https://docs.meteor.com/api/connections.html#DDP-connect
// Otherwise you can use the npm package:
// https://www.npmjs.com/package/ddp
// For authentication you can use:
// https://github.com/reactioncommerce/meteor-ddp-login
// or
// https://www.npmjs.com/package/ddp-login
const connection = // create a ddp connection

function onDataChanged () {
  const data = // ... get data from the backend of your ext. service
  const url = // ... and the url for which the data is relevant

  // call the app to update the data:
  connection.call('myApp.updateExternalData', { url, data })
}
Pros:
Template automatically updates when the collection updates
No timers = no sinks!
Requires no additional reactive variable
You can use the collection to make external data persistent, cache it or create a revision / history
You can plug / unplug the external services (better scaling, less dependencies)
Cons:
High learning curve (but it's worth the effort)
Works only if you have control over the external service
More code = more potential errors so more tests to write
In my table controller, I have:
public IQueryable<MyTable> GetAllMyTable()
I would like to replace the above with my own:
[HttpGet, Route("tables/MyTable")]
public IEnumerable<MyTable> GetAllMyTable()
But I get this response when I call it:
HTTP/1.1 405 Method Not Allowed
Somehow the Web API routing does not reach my method.
Why I'm doing this: the original method produces an inefficient Entity Framework SQL query that takes 3 seconds per call in my local test environment; that figure is from running the query captured with SQL Profiler directly in SQL Management Studio. An equivalent hand-written query takes less than a second to run. Terrible.
Worse, the inefficient EF queries consume lots of Azure SQL DTUs, tempting you to up your Azure subscription level if you want a quick fix.
Azure Mobile Apps is wonderful, but the multiple layers of abstraction makes it hard to really see what's going on under the hood, and therefore harder to tune.
Any help would be much appreciated.
HTTP/1.1 405 Method Not Allowed
Per my understanding, you should send the GET HTTP verb to your endpoint tables/MyTable to retrieve the data. Check the request your client actually issues against your mobile app backend via Fiddler.
Azure Mobile Apps is wonderful, but the multiple layers of abstraction makes it hard to really see what's going on under the hood, and therefore harder to tune.
For the common table controller, it would look like this:
public IQueryable<Message> GetAllMessage()
{
    return Query();
}
The Query() method in EntityDomainManager.cs amounts to the following:
IQueryable<TData> query = this.Context.Set<TData>();
if (!includeDeleted)
{
    query = query.Where(item => !item.Deleted);
}
return query;
When the request carries OData query options (e.g. $top, $skip, $filter, etc.), a nested SQL statement is generated. We could modify the action to apply the options explicitly, as follows:
public IEnumerable<Message> GetAllMessage(ODataQueryOptions opt)
{
    var message = context.Set<Message>();
    var query2 = opt.ApplyTo(message, new ODataQuerySettings());
    return query2.Cast<Message>().ToList();
}
Here's my rather crude attempt at bypassing the Entity Framework/OData plumbing and using direct SQL. (Wouldn't it be great if Dapper were supported!) This one works well, and is faster than the nested SQL that EF produces. The handling of OData is hacky; I have not had time to investigate using the OData libraries to extract the values for UpdatedAt, skip, and top.
I'm only using this approach for the one method that needs optimisation: the method the Azure Mobile Apps client calls when doing a pull.
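For reference, an incremental-sync pull from the client arrives as a request along these lines (values illustrative; this is what the regex below picks apart):

GET /tables/MyTable?$filter=(updatedAt ge datetimeoffset'2017-02-20T10:15:00.000Z')&$orderby=updatedAt&$skip=0&$top=50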
public IEnumerable<MyTable> GetAllMyTable()
{
    var qryValues = HttpUtility.ParseQueryString(Request.RequestUri.Query);
    var updatedAtFilter = qryValues["$filter"];
    var skip = qryValues["$skip"];
    var top = qryValues["$top"];

    if (updatedAtFilter != null)
    {
        var r = new Regex(@"^.+datetimeoffset'(?<time>.+)'.+$", RegexOptions.None);
        var m = r.Match(updatedAtFilter);
        if (m.Success)
        {
            var updatedAt = m.Groups["time"].Value.Replace("T", " ");
            var sqlString = @"SELECT T0.*
                              FROM MyTable T0
                              WHERE T0.UpdatedAt >= @UpdatedAt
                              ORDER BY UpdatedAt, Id
                              OFFSET @Skip ROWS
                              FETCH NEXT @Top ROWS ONLY";
            var updatedAtParam = new SqlParameter("UpdatedAt", SqlDbType.DateTimeOffset);
            updatedAtParam.Value = updatedAt;
            var skipParam = new SqlParameter("Skip", SqlDbType.Int);
            skipParam.Value = int.Parse(skip);
            var topParam = new SqlParameter("Top", SqlDbType.Int);
            topParam.Value = int.Parse(top);
            var data = _context.Database.SqlQuery<MyTable>(sqlString, new object[] { updatedAtParam, skipParam, topParam }).AsEnumerable<MyTable>();
            return data;
        }
    }
    return null;
}
I've set up a simple "product" model (i.e. {id: string, name: string, etc.}) and a datasource using the REST connector pointing at a remote URL that returns a JSON blob containing dozens of fields. How do I go about mapping the fields from the remote response to my local model? Whenever I execute my method I get back the raw response from the remote; I was expecting, at a minimum, an empty version of my model.
I'm pretty sure you will have to override the find() method on your model and perform this mapping work manually.
Something like this:
module.exports = function (app) {
  var Product = app.models.Product;
  var find = Product.find;

  Product.find = function (filter, cb) {
    // invoke the default method, forwarding the caller's filter
    find.call(Product, filter, function (err, original_results) {
      if (err) return cb(err);
      var results = {}; // a placeholder for your expected results
      results.id = original_results.id;
      results.name = original_results.name;
      results.description = original_results.long_description;
      // and so on
      cb(null, results);
    });
  };
};
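Hypothetical usage once the override is in place (long_description stands in for whatever field the remote actually returns):

Product.find({}, function (err, product) {
  if (err) throw err;
  console.log(product.description); // mapped from the remote long_description field
});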
I'm very keen to utilize Meteor as the framework for my next project. However, there is a requirement to keep customer data separated into different MongoDB instances for users from different customers.
I have read on this thread that it could be as simple as using this:
var d = new MongoInternals.RemoteCollectionDriver("<mongo url>");
C = new Mongo.Collection("<collection name>", { _driver: d });
However, I get the error below in my server/server.js. I'm using Meteor 0.9.2.2 with meteor-platform 1.1.0.
Exception from sub Ep9DL57K7F2H2hTBz Error: A method named '/documents/insert' is already defined
at packages/ddp/livedata_server.js:1439
at Function._.each._.forEach (packages/underscore/underscore.js:113)
at _.extend.methods (packages/ddp/livedata_server.js:1437)
at Mongo.Collection._defineMutationMethods (packages/mongo/collection.js:888)
at new Mongo.Collection (packages/mongo/collection.js:208)
at Function.Documents.getCollectionByMongoUrl (app/server/models/documents.js:9:30)
at null._handler (app/server/server.js:12:20)
at maybeAuditArgumentChecks (packages/ddp/livedata_server.js:1594)
at _.extend._runHandler (packages/ddp/livedata_server.js:943)
at packages/ddp/livedata_server.js:737
Can anyone be so kind as to enlighten me whether or not I have made a mistake somewhere?
Thanks.
Edit: This is my server.js
Meteor.publish('userDocuments', function () {
  // Get the company data store's mongo URL here. Simulated by matching the domain of the user's email.
  var user = Meteor.users.findOne({ _id: this.userId });
  if (!user || !user.emails) return;

  var email = user.emails[0].address;
  var mongoUrl = (email.indexOf('@gmail.com') >= 0) ?
    'mongodb://localhost:3001/company-a-db' :
    'mongodb://localhost:3001/company-b-db';

  // Return documents
  return Documents.getCollectionByMongoUrl(mongoUrl).find();
});
and this is the server side model.js
Documents = function () { };

var documentCollections = { };

Documents.getCollectionByMongoUrl = function (url) {
  if (!(url in documentCollections)) {
    var driver = new MongoInternals.RemoteCollectionDriver(url);
    documentCollections[url] = new Meteor.Collection("documents", { _driver: driver });
  }
  return documentCollections[url];
};
Observation: the first attempt to new up a Meteor.Collection works fine, and I can continue to use that collection multiple times. But when I log out and log in as a user from another company (in this example, by using an email that is not from @gmail.com), the error above is thrown.
I downloaded Meteor's source code and peeked into the mongo package. Building on Hubert's suggestion, there is a way to hack around having to declare different collection names on the MongoDB server.
In the server-side model.js, I've made these adaptations:
Documents.getCollectionByMongoUrl = function (userId, url) {
  if (!(userId in documentCollections)) {
    var driver = new MongoInternals.RemoteCollectionDriver(url);
    documentCollections[userId] = new Meteor.Collection("documents" + userId, { _driver: driver });
    documentCollections[userId]._connection = driver.open("documents", documentCollections[userId]._connection);
  }
  return documentCollections[userId];
};
Super hack job here. Be careful when using this!!!!
I believe Meteor distinguishes its collections internally by the name you pass as the first argument, so when you create the "documents" collection the second time, it tries to override the existing structure. Hence the error when trying to define the /documents/insert method the second time.
To work around this, you could apply a suffix to your collection name. So instead of:
new Meteor.Collection('documents', { _driver: driver });
you should try:
new Meteor.Collection('documents_' + userId, { _driver: driver })