Loki.js loses data in Ionic apps - ionic-framework

I am developing an Ionic app using Loki.js, but every time I refresh the app (press F5) I lose all the data stored in the Loki database.
Why does this happen?
I am not using any cache in my Ionic app.

It could be that when you press F5 the data held in memory has not been written to the target JSON file yet.
You can try to explicitly set an autosave interval when you instantiate Loki:
var _db = new Loki('./database/db.json', {
  autoload: true,
  autosave: true,
  autosaveInterval: 5000 // 5 secs
});
function add(newPatient) {
  return $q(function (resolve, reject) {
    try {
      var _patientsColl = GetCollection(dbCollections.PATIENTS);
      if (!_patientsColl) {
        _patientsColl = _db.addCollection(dbCollections.PATIENTS, { indices: ['firstname', 'lastname'] });
      }
      _patientsColl.insert(newPatient);
      console.log('Collection data: ', _patientsColl.data);
      resolve();
    }
    catch (err) {
      reject(err);
    }
  });
}

function GetCollection(collectionName) {
  return _db.getCollection(collectionName);
}
With "autosaveInterval" the data in memory will be written to the JSON file every 5 seconds (you can adjust this value as you prefer).
EDIT
I have added the code I use to save a new document into my collection, and even with a page refresh the data is stored correctly. I use "autosave" among the db settings; maybe you can set it as well, in case there is a code path that is not correctly caught when you trigger saving explicitly.

I put the database file in the /Users/xxx directory, and then, magically, the program started running normally.

Related

How to update the service worker when online and reloading the page

Hello all, I have the following code to load the page when offline. I have a requirement where I need to cache a few pages and exclude a few from the cache. The code I wrote works fine, but whenever I push updates to the site it still loads from the cache. How can I reload from the server instead of the cache when online?
I referred to this blog and summed it up with some modifications:
https://www.charistheo.io/blog/2021/03/cache-handling-with-service-workers-and-the-cache-api/
self.addEventListener("install", function (e) {
self.skipWaiting();
e.waitUntil(async function () {
const cache = await caches.open("app");
await cache.addAll([
"/Scripts/jquery-3.6.0.min.js",
"/ReportableEvent/Index",
"/NearMissReport/Index",
"/ReportableEvent/InternalReview?ReportableEventId=0"
)];
}());
});
var uncachedPaths = new Set(["/Reportable/Reports", "/Dashboard/Index"]);

self.addEventListener("fetch", function (e) {
  if (e.request.method !== "GET") {
    return;
  }
  e.respondWith(
    caches.match(e.request, { ignoreSearch: true }).then(async cachedResponse => {
      if (cachedResponse) {
        return cachedResponse;
      } else {
        if (e.request.url.includes("service-worker.js") && !navigator.onLine) {
          return;
        }
        // Fetch the requested resource from the network if it does not exist in the cache.
        const networkResponse = await fetch(e.request);
        // The response needs to be cloned if it is going to be used more than once.
        const clonedResponse = networkResponse.clone();
        if (!uncachedPaths.has(new URL(e.request.url).pathname)) {
          // Save the response to the runtime cache for later use.
          const runtimeCache = await caches.open("app");
          runtimeCache.put(e.request, networkResponse);
        }
        // Respond with the cloned network response.
        return clonedResponse;
      }
    })
  );
});
Also, is it correct to cache a page that has query strings, such as "/ReportableEvent/InternalReview?ReportableEventId=0"?
There are various approaches to serving files from the cache and to updating/invalidating the cache; please refer here.
Also, if you are updating your service worker JS files and want to update the cache as well, try appending a version to the cache name. For instance:
var version = '1.0';
var cacheName = 'app' + version;
Every time you update the files, increment the version. This way each update creates a new cache, and the files are stored afresh.
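Old caches are not deleted automatically, so a versioned cache name is usually paired with cleanup in the activate handler. A minimal sketch along those lines (cacheName is the variable defined above):
self.addEventListener('activate', function (e) {
  e.waitUntil(
    caches.keys().then(function (keys) {
      // Delete every cache whose name does not match the current version.
      return Promise.all(
        keys
          .filter(function (key) { return key !== cacheName; })
          .map(function (key) { return caches.delete(key); })
      );
    })
  );
});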

How to catch an error when data is not sent with AngularFire when adding data to Firebase?

I'm using AngularFire to save data into my Firebase. Here is the code in brief:
$scope.users.$add({
  Name: $scope.username,
  Age: $scope.newage,
  Contact: $scope.newcontact,
});
alert('Saved to firebase');
I am able to send this data to my Firebase successfully, but how can I catch an error if the data is not saved successfully? Any ideas?
EDIT
So, after implementing the then() function:
$scope.users.$add({
  Name: 'Frank',
  Age: '20',
  Contact: $scope.newcontact,
}).then(function(ref) {
  alert('Saved.');
}).catch(function(error) {
  console.error(error); // or
  console.log(error);
  alert('Not Saved.');
});
When connected to the internet, the then() function works fine: it waits for the data to be saved to Firebase before showing the alert.
What I want is for it to tell me when the data is not saved. The catch handler does not fire when I turn off my internet connection and submit the data.
When you call $add() it returns a promise. To detect when the data was saved, implement then(). To detect when saving failed, implement catch():
var list = $firebaseArray(ref);
list.$add({ foo: "bar" }).then(function(ref) {
  var id = ref.key;
  console.log("added record with id " + id);
  list.$indexFor(id); // returns location in the array
}).catch(function(error) {
  console.error(error);
});
See the documentation for $add().
Update
Detecting that the data cannot be saved due to not having a network connection is a very different problem when it comes to the Firebase Database. Not being able to save in this case is not an error, but merely a temporary condition: the condition doesn't apply just to this $add() operation, but to all read/write operations. For this reason, you should handle it more globally, by detecting the connection state:
var connectedRef = firebase.database().ref(".info/connected");
connectedRef.on("value", function(snap) {
  if (snap.val() === true) {
    alert("connected");
  } else {
    alert("not connected");
  }
});
By listening to .info/connected, your code can tell when the user is not connected to the Firebase Database and handle it according to your app's needs.
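As a rough sketch of combining the two (saveRecord is a hypothetical helper; note that Firebase queues writes made while offline and sends them once the connection returns):
var isConnected = false;
firebase.database().ref(".info/connected").on("value", function (snap) {
  isConnected = snap.val() === true;
});

// Hypothetical helper: warn the user up front, instead of waiting on a
// catch() that will not fire for a mere loss of connectivity.
function saveRecord(list, record) {
  if (!isConnected) {
    alert("You are offline; the record will be synced once the connection returns.");
  }
  return list.$add(record);
}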

Make the sails.js server monitor file changes using chokidar and then emit a socket message

I use chokidar to monitor whether files have been changed in a folder. At the moment it is triggered when a user updates an Experiment model in my ExperimentController.
var chokidar = require('chokidar');
// ...
var watcher = chokidar.watch('assets/output-model-files', { ignored: /[\/\\]\./, persistent: true });
watcher.on('change', function(path) {
  // ...read my changed file, update the content of my database,
  // and send a socket publishUpdate message...
  // read the file content asynchronously
  fs.readFile(path, "utf-8", function (err, data) {
    // update the experiment object with the content of the changed file
    Experiment.update(req.param('id'), ExpObj, function expUpdated(err) {});
    // send a message via socket saying that the experiment object has been updated
    Experiment.publishUpdate(req.param('id'), {
      name: exp.name,
      results: JSON.stringify(myres),
      action: ('file has just been updated. nb of trajectories: ' + totalNbTrajectories)
    });
  });
});
But I would like to constantly monitor any change in the target folder and send Experiment.publishUpdate messages when a change happens, from the moment the sails.js server starts, not only when a user updates an experiment object.
Where should I place that chokidar.watch(...) code on the server side so that it updates an experiment object on a file change? socket.js?
OK, I found that placing the code in bootstrap.js seems to do the job perfectly regarding the event triggering.
My bootstrap.js now looks like this:
var chokidar = require('chokidar');
var fs = require('fs');
var os = require('os');
var sys = require('sys');

module.exports.bootstrap = function(cb) {
  // It's very important to trigger this callback method when you are finished
  // with the bootstrap! (otherwise your server will never lift, since it's waiting on the bootstrap)
  User.update({}, { online: false },
    function userUpdated(err, users) {
      if (err) {
        console.log(err);
      } else {
        var watcher = chokidar.watch('assets/output-model-files', { ignored: /[\/\\]\./, persistent: true });
        watcher.on('change', function(path) {
          console.log('File', path, 'has been changed');
          // do file reading and presumably Experiment publishUpdate here
        });
      }
      cb();
    }
  );
};
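Note that inside the bootstrap watcher there is no req object, so the experiment id has to come from somewhere else. As a rough sketch, under the (hypothetical) assumption that each watched file is named after its experiment id:
var pathUtil = require('path');

watcher.on('change', function (filePath) {
  // Assumption: the file is named <experimentId>.json
  var experimentId = pathUtil.basename(filePath, '.json');
  fs.readFile(filePath, 'utf-8', function (err, data) {
    if (err) { return console.error(err); }
    Experiment.update(experimentId, { results: data }, function (err) {
      if (err) { return console.error(err); }
      Experiment.publishUpdate(experimentId, { results: data });
    });
  });
});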

Exporting large data in CSV for users

I am trying to build a system where users are allowed to export data in CSV format. Formatting the data as CSV is no problem; the problem is processing the export request. Suppose a user requests an export of all their data and it is huge. I think it would not be best to ask the user to wait until the request is complete, right? Should I tell users that the export is in progress and notify them once it is complete? If so, should I use a background process for it?
If possible, define some kind of configurable setting, which might default to 10 MB. If you can tell the export will be more than 10 MB, ask the user whether they want to export such a large row set; if possible, estimate the size and let them know how large it will be. If you show them a progress bar while a background export runs, you will need some sort of process-to-process communication to pass progress back to the user's form. Have an option on the form to change that setting to a number closer to the one they want as a warning level.
I finally went with streaming the data to users in CSV format using Node.js:
function csvExport(req, res) {
  mysqlPool.getConnection(function(err, connection) {
    if (err) {
      throw err;
    }
    res.header('Content-disposition', 'attachment; filename=connects.csv');
    res.header('Content-Type', 'text/csv');
    var csv_header_row = "Email,First Name,Last Name,Status,Created\n";
    res.write(csv_header_row);
    var query = connection.query('SELECT * FROM contacts WHERE user_id = ? AND deleted = 0', [req.params.user_id]);
    query
      .on('error', function(err) {
        // handle error
      })
      .on('fields', function(fields) {
      })
      .on('result', function(row) {
        var csv_row = row.email + ',' + row.first_name + ',' + row.last_name + ',' + row.status + ',' + row.created + "\n";
        res.write(csv_row);
      })
      .on('end', function() {
        connection.release(); // return the connection to the pool
        res.end('', function() {
          console.log('done');
        });
      });
  });
}
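One caveat with this approach: res.write() buffers in memory whenever the client downloads more slowly than MySQL produces rows. A sketch of the same export honouring backpressure, using the mysql driver's connection.pause()/resume() (names and columns follow the example above):
function csvExportWithBackpressure(req, res) {
  mysqlPool.getConnection(function (err, connection) {
    if (err) {
      throw err;
    }
    res.header('Content-disposition', 'attachment; filename=connects.csv');
    res.header('Content-Type', 'text/csv');
    res.write("Email,First Name,Last Name,Status,Created\n");
    connection.query('SELECT * FROM contacts WHERE user_id = ? AND deleted = 0', [req.params.user_id])
      .on('error', function (err) {
        connection.release();
        res.end();
      })
      .on('result', function (row) {
        var csv_row = row.email + ',' + row.first_name + ',' + row.last_name + ',' + row.status + ',' + row.created + "\n";
        if (!res.write(csv_row)) {
          // The response buffer is full: pause the MySQL stream until it drains.
          connection.pause();
          res.once('drain', function () { connection.resume(); });
        }
      })
      .on('end', function () {
        connection.release();
        res.end();
      });
  });
}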

Ember - clear form after submitting

I created a very simple Ember app, using Ember Data. There is one form where the user creates the entity and submits it. It lives in BandsNewView (created automatically by Ember), controlled by BandsNewController:
App.BandsNewController = Ember.Controller.extend({
  cancel: function() {
    this.transitionTo('bands');
  },
  save: function() {
    App.Band.createRecord(this);
    this.get('store').commit();
    this.set('name');
    this.set('description');
    this.transitionTo('bands');
  }
});
I wonder whether there is a simpler solution to "clean up" (i.e. empty) the form after saving a new Band entity. Can I say something like this.set(), which would empty all the fields? Or is my approach essentially wrong, and should I do it completely differently?
The pattern that I've been enjoying is creating and destroying the object on enter and exit of the route itself.
App.BandsNewRoute = Ember.Route.extend({
  model: function(params) {
    return App.Band.createRecord({});
  },
  save: function() {
    this.get('currentModel.store').commit();
    return this.transitionTo('bands');
  },
  exit: function() {
    var model = this.get('currentModel');
    if (model.get("isNew") && !model.get("isSaving")) {
      return model.get('transaction').rollback();
    }
  }
});
As you can see, it makes the exit function a little more complex, but it will be exactly the same for every object-creation route, so you can factor it out (see the sketch below). Now your templates can bind straight to the model's properties, and the model will be saved on save, or rolled back on exit (which will clear the form).
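A rough sketch of factoring it out into a mixin (App.RollbackOnExitRoute is a hypothetical name; this targets the same pre-1.0 Ember Data API as the code above):
App.RollbackOnExitRoute = Ember.Mixin.create({
  exit: function() {
    this._super();
    var model = this.get('currentModel');
    if (model.get('isNew') && !model.get('isSaving')) {
      model.get('transaction').rollback();
    }
  }
});

App.BandsNewRoute = Ember.Route.extend(App.RollbackOnExitRoute, {
  model: function(params) {
    return App.Band.createRecord({});
  },
  save: function() {
    this.get('currentModel.store').commit();
    return this.transitionTo('bands');
  }
});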
If you are planning on possibly changing other data models and not saving them, or if you have unsaved models, a way to safely clear the model away is to put it in its own transaction. I only tend to use this for objects that are not the main focus of my current flow.
App.BandsNewRoute = Ember.Route.extend({
  model: function(params) {
    var transaction = this.get('store').transaction();
    return transaction.createRecord(App.Band, {});
  }
});
Everything else can stay the same.