Is it possible to execute a long running function before the browser is reloaded? - progressive-web-apps

I prevent page reloads in my web application with the following function:
window.onbeforeunload = (event) => {
  const e = event || window.event;
  // Cancel the event
  e.preventDefault();
  save_user_data_to_indexed_db();
  if (e) {
    e.returnValue = ''; // Legacy method for cross-browser support
  }
  return ''; // Legacy method for cross-browser support
};
However, the save_user_data_to_indexed_db() function is not executed while the "Reload site?" message is displayed. I thought that if I could run my function while the dialog is shown, I could perhaps answer the dialog programmatically and let the browser continue reloading the page.
Is there a way to make the browser wait for this kind of operation?

Generally, there is no way to make the browser wait. What I often do in this case is synchronously write the data to an intermediate place, such as localStorage, and then asynchronously copy that data over to IndexedDB later, when there is time: when the page is next loaded, or from within a service worker.
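A minimal sketch of that pattern (collect_user_data() and the 'pending_user_data' key are hypothetical placeholders):
window.onbeforeunload = () => {
  // Synchronous write: localStorage blocks until the data is persisted,
  // so it survives the unload.
  localStorage.setItem('pending_user_data', JSON.stringify(collect_user_data()));
};

// On the next page load, migrate the staged data into IndexedDB asynchronously.
window.addEventListener('DOMContentLoaded', async () => {
  const pending = localStorage.getItem('pending_user_data');
  if (pending) {
    await save_user_data_to_indexed_db(JSON.parse(pending));
    localStorage.removeItem('pending_user_data');
  }
});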

Related

How to do actions when MongoDB Realm Web SDK change stream closes or times out?

I want to delete all of a user's inserts in a collection when they stop watching a change stream from a React client. I'm using the Realm Web SDK for this.
Here's a summary of my code with what I want to do at the end of it:
import * as Realm from "realm-web";

const realmApp: Realm.App = new Realm.App({ id: realmAppId });
const credentials = Realm.Credentials.anonymous();
const user: Realm.User = await realmApp.logIn(credentials);
const mongodb = realmApp?.currentUser?.mongoClient("mongodb-atlas");
const users = mongodb?.db("users").collection("users");
const changeStream = users.watch();

for await (const change of changeStream) {
  switch (change.operationType) {
    case "insert": {
      ...
      break;
    }
    case ...
  }
}

// This pseudo-code shows what I want to do
changeStream.on("close", () => /* delete all user's inserts */);
changeStream.on("timeout", () => /* delete all user's inserts */);
changeStream.on("user closes app thus also closing stream", () => ...);
Realm Web SDK patterns seem rather different from the Node.js ones and do not appear to include a method for closing a stream or for running a callback when it closes. In any case, they don't fit my use case.
These MongoDB Realm Web docs lead to more docs about Realm. Unless I'm missing it, neither set talks about how to monitor a change stream watcher instantiated from the Realm Web SDK for closing or timing out, nor how to do something when that happens.
I thought another way to do this would be with Realm Triggers, but it doesn't seem likely from their docs.
Can this even be done from a front end client? Is there a way to do this on MongoDB itself in a "serverless" way?
If you want to delete the inserts specifically when a (client-side) listener of a change stream stops listening, you have to implement some logic on the client side. There is currently no way to get notified of such an event within MongoDB Realm.
Since a watcher could be closed because the app / browser is closed, I would recommend against running the deletion logic on your client. Instead, notify a server (or call a MongoDB Realm function / HTTP endpoint) to make the deletions.
You can use the Beacon API to reliably send a request to trigger the delete, even when the window unloads.
Client side
const inserts = [];
for await (const change of changeStream) {
  switch (change.operationType) {
    case 'insert':
      inserts.push(change);
      break;
  }
}
// This point is only reached once the generator returns / the stream closes
navigator.sendBeacon('url/to/endpoint', JSON.stringify(inserts));

// Might also add a handler to catch users closing the app.
window.addEventListener('unload', () => {
  navigator.sendBeacon('url/to/endpoint', JSON.stringify(inserts));
});
Note that the unload event is not reliable (see MDN), but there are some alternatives which may be good enough for your use case.
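For instance, pagehide (or visibilitychange) tends to fire more reliably than unload, especially on mobile; a sketch reusing the inserts array from above:
// Alternative to 'unload': flush the collected inserts when the page is hidden.
window.addEventListener('pagehide', () => {
  navigator.sendBeacon('url/to/endpoint', JSON.stringify(inserts));
});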
Inside a Realm function you could then delete the documents.
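A minimal sketch of such a function, assuming it backs an HTTPS endpoint that receives the beacon payload above (the payload shape and the db/collection names are assumptions):
// App Services / Realm function behind the endpoint (sketch, untested).
exports = async function({ body }) {
  // The beacon body is the JSON array of insert change events sent above.
  const inserts = JSON.parse(body.text());
  const users = context.services
    .get("mongodb-atlas")
    .db("users")
    .collection("users");
  const ids = inserts.map((change) => change.documentKey._id);
  // Delete every document inserted during the watch session.
  return users.deleteMany({ _id: { $in: ids } });
};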
That being said, maybe there is a better way to achieve what you want. Is it really the timeout of the change stream listener that has to trigger the delete, or some other user event?

Preventing router from navigating

I need to prevent the router from navigating to another page (navigation is done by changing the hash) if some changes have been made. I tried HashChanger, but it just fires 'hashChanged' events with no way to prevent them from bubbling. The answer may lie inside the JS-Signals library, but it's not directly available to user-created SAP components.
There is a stop function on the router: https://sapui5.hana.ondemand.com/#/api/sap.ui.core.routing.Router/methods/stop
If you call it, the router will stop listening to hash changes.
There is also the function isStopped().
To (re-)activate the router, call initialize(...).
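A minimal sketch, assuming a controller with access to the owner component:
// Stop the router while there are unsaved changes (sketch).
const oRouter = this.getOwnerComponent().getRouter();
oRouter.stop();

// Later, once the changes are saved or discarded:
if (oRouter.isStopped()) {
  // Pass true (bIgnoreInitialHash) so the router does not immediately
  // navigate to the current hash on reactivation.
  oRouter.initialize(true);
}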
Instead of stopping the router entirely, navigation can be prevented by event.preventDefault() within the navigate event handler.
<App xmlns="sap.m" navigate=".onNavigate">

onNavigate: function(event) {
  if (/* Pending changes, no authorization [1], etc. */) {
    event.preventDefault();
    const { isBack, isBackToPage, isBackToTop } = event.getParameters();
    const isBackNavigation = isBack || isBackToPage || isBackToTop;
    window.history.go(isBackNavigation ? 1 : -1);
    // Inform the user ...
  }
},
Demo: https://embed.plnkr.co/wp6yes
Further discussion about interrupting navigation: https://github.com/SAP/openui5/issues/3411
[1]: If the reason for preventing navigation is the lack of authorization, it's not enough to block the user on the client-side only. The server needs to make sure that no unauthorized resource is sent to the client in the first place. See the response from #matz3 on GitHub.

What are the options for offline registration and forms?

I have a project that caters for individuals with poor internet connections in predominantly rural areas. I need to allow users to download forms (or obtain them by any other applicable means) and fill out details offline; then, when they are ready and an internet connection is available, the data filled out offline should sync with the online database and produce a report.
The offline form also needs the same validation as online, to ensure no time is wasted.
What are the options? I know that HTML5 has an offline application capability. I would prefer an open-source option that allows people with intermittent internet issues to continue filling out a form or series of forms even after the connection has dropped, with the data syncing when the internet reconnects.
So what are the best options? Requiring the user to download a large application is also not ideal; I would prefer a browser-based or small-download solution. Maybe even a way of downloading a validatable form in some format for re-upload.
This is something I've been muddling through myself, as some of the users of the site I am currently tasked with building have poor connections or would like to fill in forms away from a network for various reasons. Depending on your precise needs and your customers' browser compatibility, the solution I've decided to go with is the HTML5 cache capability you mention in your post.
The amount of data stored is not that great, and it means that the page you want them to fill in is available offline.
If you couple this with the localStorage interface, you can keep all form submissions until they regain connection.
As an example of my current solution:
The cache.php file, to write the manifest
<?php
header("Content-Type: text/cache-manifest");
echo "CACHE MANIFEST\n";
$pages = array(
    // an array of the pages you want cached for later
);
foreach ($pages as $page) {
    echo $page."\n";
}
$time = new DateTime("now");
// this makes sure that the cache is different when the browser checks it,
// otherwise the cache will not be rebuilt even if you change a cached page
echo "#Last Build Time: ".$time->format("d m Y H:i:s T");
You can then have a simple ajax script checking for a connection:
setInterval(function() {
  $.ajax({
    url: 'testconnection.php',
    type: 'post',
    data: { 'test': 'true' },
    success: function() {
      // connection is back up
      noCon = false;
    },
    error: function(XHR, textStatus, errorThrown) {
      if (textStatus === 'timeout') {
        // update a global var saying connection is down
        noCon = true;
      }
    }
  });
  if (hasUnsavedData) {
    // using the key/value pairs in localStorage, put together a data object and ajax it into the database
    // once complete, set hasUnsavedData back to false to prevent refiring this until we have new data
    // also use localStorage.removeItem(key) to clear out all localStorage info
  }
}, 20000 /* medium gap between calls, do whatever works best for you here */);
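The sync step described in the comments above might look something like this (saveform.php is a hypothetical endpoint):
function syncSavedData() {
  // gather everything staged in localStorage into one data object
  var data = {};
  for (var i = 0; i < localStorage.length; i++) {
    var key = localStorage.key(i);
    data[key] = localStorage.getItem(key);
  }
  $.post('saveform.php', data, function() {
    // once saved server-side, clear the local copies and stop refiring
    for (var key in data) {
      localStorage.removeItem(key);
    }
    hasUnsavedData = false;
  });
}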
Then, for your form submission script, use localStorage if that noCon variable is set to true:
$(/* submit button */).on("click", function(event) {
  event.preventDefault();
  if (noCon) {
    // go through all inputs in some way and put them in localStorage; your method is up to you
    $("input").each(function() {
      var key = $(this).attr("name"), val = $(this).val();
      localStorage[key] = val;
    });
    // update a global variable to let the script above know to save information
    hasUnsavedData = true;
  } else {
    // or if there's a connection, submit the form in some manner
    $("form").submit();
  }
});
I've not tested every script on this page, but they're written based on the skeleton of my current solution, minus a lot of error checking etc., so hopefully they'll give you some ideas on how to approach this.
Suggestions for improvements are welcome.

node.js and socket.io: different connections for different "sessions"

I've got a node.js application that 'streams' tweets to users. At the moment, it just searches Twitter for a hard-coded string, but I'd like to allow users to configure this in the URL (eg. by visiting /?q=stackoverflow).
At the moment, my code looks a bit like this:
app.get('/', function (req, res) {
  // page rendering skipped
  io.sockets.on('connection', function (socket) {
    twit.stream('user', {track: 'stackoverflow'}, function(stream) {
      stream.on('data', function (data) {
        socket.volatile.emit('tweet', data);
      });
    });
  });
});
The question is, how do I make it so that each user can see a different stream of tweets simultaneously? At the moment, it works fine in a single browser tab, but it falls over as soon as a second one is opened - and the error is fairly deep down inside socket.io. Am I misusing it?
I haven't fully got my head around socket.io yet, so that could be the issue.
Thanks in advance!
Every time a new request comes in, you are redefining the connection callback with io.sockets.on. You should move that block of code outside of app.get, after the initialization statement of the io object.
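A sketch of that restructuring, which also reads the search term from the socket handshake so each user gets their own stream (the q query parameter and the ntwitter-style stream.destroy() are assumptions):
app.get('/', function (req, res) {
  // page rendering only; no socket.io wiring inside the route handler
});

// Register the connection callback exactly once, after creating `io`.
io.sockets.on('connection', function (socket) {
  // e.g. the client connects with io.connect('/', { query: 'q=stackoverflow' })
  var track = socket.handshake.query.q || 'stackoverflow';
  twit.stream('user', { track: track }, function (stream) {
    stream.on('data', function (data) {
      socket.volatile.emit('tweet', data);
    });
    // tear the Twitter stream down when this client disconnects
    socket.on('disconnect', function () {
      stream.destroy();
    });
  });
});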

Ajax.Updater output garbled only on iPhone first load

When a user loads my page for the first time on an iPhone (it works fine on Android, IE, FF, Opera, Chrome, and Safari), the two portions of the page generated by a Prototype/Scriptaculous Ajax.Updater call are garbled: they look as if a binary file had been injected into the page or the character map had been scrambled. If the user then reloads the page, or uses the page's tabs to navigate around via Ajax.Updater requests, everything is fine. It's only the very first time the page is loaded in a browser session that this occurs. Here are the relevant calls with a bit of context:
soundManager.onready(function() {
  new Ajax.Updater('PlayerSet', 'http://' + location.host + playerHTMLloc,
    {method: 'post', onComplete: startPlayer});
});
This is only called once per site visit (so the user has to reload in order to get it to display correctly). It calls a Python script that writes HTML to stdout.
Here's the other:
show: function(elm) {
  var id = elm.identify();
  elm.addClassName(id.sub('-html', '-selected'));
  var link = 'ajax/' + id.sub('-', '.');
  $('centercontent').update('<div id="floaterForSpinner"></div>' +
    '<div id="centerSpinner"><img src="images/ajax-loader.gif"></div>');
  new Ajax.Updater('centercontent', link, {evalScripts: true, method: 'post'});
}
This is part of a small class that handles tabs on the page. Again, only the first time show() is called does the error occur. After that the tabber works normally. The updater is just pulling html text files from the server.
The issue occurs with both Prototype/Scripty 1.6.1/1.8.3 and 1.7/1.9.0.
The post and receive headers are identical for the first and subsequent loads, and the acceptable charset is Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7 in all cases per Firebug.
I don't have an iPhone myself, and none of the off or online iPhone simulators I've tried reproduce the problem, so testing this is going to be a nightmare. Hence, anything anyone could do to help, would be, uh, very... helpful.
UPDATE based on questions I received on the GG prototype list:
All the code above is called after the DOM is loaded:
document.observe('dom:loaded', function() {
  Ajax.Responders.register({onCreate: removeListeners});
  Ajax.Responders.register({onComplete: postAJAX});
  new Lightbox();
  initMailList();
  AT = new AjaxTabber('tablist');
  initInternalLinkListener();
  initIE6msgClose();
  $('PlayerSet').update('<div style="text-align:center">' +
    '<img src="images/ajax-loader.gif"></div>');
  soundManager.onready(function() {
    new Ajax.Updater('PlayerSet', 'http://' + location.host + playerHTMLloc,
      {method: 'post', onComplete: startPlayer});
  });
});
AjaxTabber is the tab class that contains the show() function I mentioned earlier. The document.observe function above is in the last js file in the header.
UPDATE #2:
Replacing
document.observe('dom:loaded', function() {
with
Event.observe(window, 'load', function() {
in the 3rd code block fixes the garbled loads. However, the fix raises new questions/issues:
Why do the Ajax.Updater loads need to have the entire page loaded to work correctly? A DOM load should be all that's necessary. There's no reason to need the images loaded for an ajax load to work.
My overall page performance is now substantially degraded to fix an iPhone only problem. I'd really like to go back to loading once the DOM load is complete.
Calling update() and then Ajax.Updater on the same element one after another like this can introduce timing problems that are difficult to diagnose. I recommend doing this instead (to add a "loading" indicator to your Ajax-loading element):
new Ajax.Updater('elementID', '/path/to/server', {
  parameters: {},
  method: 'get',
  onCreate: function() {
    $('elementID').update('placeholder html here');
  },
  onSuccess: function() {
    // any other cleanup here
  }
});
The onCreate callback hook is guaranteed to run and complete before the request is sent and before the element is updated by Ajax.Updater.