Workbox Precaching / RuntimeCaching hybrid - progressive-web-apps

I have converted my app into a PWA with Workbox, using the precaching strategy.
Right now I reload the page when the Workbox worker has finished refetching the cache:
// Register service worker extract
import { register } from 'register-service-worker';

if (process.env.NODE_ENV === 'production') {
  register(`${process.env.BASE_URL}service-worker.js`, {
    updatefound() {
      // New content is downloading.
    },
    updated() {
      // New content is available; refresh.
      setTimeout(() => {
        window.location.reload(true);
      }, 500);
    },
  });
}
// Service worker extract
import { precacheAndRoute } from 'workbox-precaching/precacheAndRoute';
precacheAndRoute(self.__WB_MANIFEST);
self.skipWaiting();
But I find it really bothersome to have a stale version for 2-5 seconds and then the page reloaded with the new version.
What I would like to achieve is RuntimeCaching behaviour when an update is found: the new files are used directly, instead of refetching the cache in the background.
Is there a way to configure Workbox for that, so that I can reload the page straight away?
// Register service worker extract
updatefound() {
  window.location.reload(true);
},
updated() {
},
And the Workbox worker would not serve the cache on the reloaded page, instead making the network requests on the fly: basically a Precaching and RuntimeCaching hybrid to get the best of both worlds?
I couldn't find anything that achieves that anywhere.

What you're describing is using a NetworkFirst strategy, along with optionally "warming" the runtime cache with the content that you want to make sure is available offline.
Precaching, with its cache-first approach to serving content, doesn't sound like an appropriate solution to your use case.
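Here's a minimal sketch of that approach, assuming Workbox v5+ module imports; the cache name and URL list are placeholders to replace with your own:

import { NetworkFirst } from 'workbox-strategies';
import { registerRoute } from 'workbox-routing';

// Placeholder names; use your own cache name and the URLs you
// want guaranteed to be available offline.
const CACHE_NAME = 'runtime-cache';
const URLS_TO_WARM = ['/', '/index.html'];

// "Warm" the runtime cache at install time so these URLs work
// offline, without precaching's cache-first serving behaviour.
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(URLS_TO_WARM))
  );
});

// Serve same-origin requests network-first: fresh responses
// whenever the network is reachable, cached copies when it isn't.
registerRoute(
  ({ url }) => url.origin === self.location.origin,
  new NetworkFirst({ cacheName: CACHE_NAME })
);

With this setup a deployed update takes effect on the next load while online, because the network response always wins; the warmed cache is only consulted when the network is unavailable.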

Related

Service worker goes to redundant phase because pre-cached files are blocked by an ad blocker (a Chrome extension)

All the logic fails because a Chrome extension blocks one of my JS files.
Is there a way to make pre-caching more robust? Even if I get an error when caching some files, I can still cache most of them correctly, and I have runtime caching.
If you're using workbox-precaching, then the answer is no—it's designed so that service worker installation will only proceed if all of the items in the precache manifest were successfully added to the cache. That way, you're guaranteed to have a functional set of resources available offline. (There's a longstanding feature request for adding support for "optional" precaching, but it's not clear how that would work in practice.)
I'd recommend using runtime caching for URLs that are optional, and might be blocked by browser extensions. If you want to "warm" the cache in this scenario, and you don't care if the cache population fails, you can add your own logic along the lines of:
import {CacheFirst} from 'workbox-strategies';
import {registerRoute} from 'workbox-routing';
import {precacheAndRoute} from 'workbox-precaching';

const OPTIONAL_CACHE_NAME = 'optional-resources';
const OPTIONAL_URLS = [
  // Add URLs here that might be blocked.
];

self.addEventListener('install', (event) => {
  event.waitUntil((async () => {
    const cache = await caches.open(OPTIONAL_CACHE_NAME);
    for (const url of OPTIONAL_URLS) {
      try {
        await cache.add(url);
      } catch (e) {
        // Ignore failures due to, e.g., a content blocker.
      }
    }
  })());
});

// Precache everything in the manifest, which you need to
// configure to exclude your "optional" URLs.
precacheAndRoute(self.__WB_MANIFEST);

// Use a cache-first runtime strategy.
registerRoute(
  // Check url.pathname, or url.href, if OPTIONAL_URLS
  // contains full URLs.
  ({url}) => OPTIONAL_URLS.includes(url.pathname),
  new CacheFirst({cacheName: OPTIONAL_CACHE_NAME}),
);

How RestBase wiki handles caching

Following the installation of RestBase using the standard config, I have a working version of the summary API.
The problem is that the caching mechanism seems strange to me.
The piece of code below decides whether to look at a table cache for a fast response. But I cannot make the server cache depend on some time constraint (the max-age at the time the cache entry was written, for example). It means that the decision whether to use the cache depends entirely on clients.
Can someone explain the workflow of the RestBase caching mechanism?
// Inside key.value.js
getRevision(hyper, req) {
    // This gets the header from the client request and decides
    // whether to use the cache depending on the value. Does it
    // mean server caching is non-existent?
    if (mwUtil.isNoCacheRequest(req)) {
        throw new HTTPError({ status: 404 });
    }
    // If the cache should be used, the following runs
    const rp = req.params;
    const storeReq = {
        uri: new URI([rp.domain, 'sys', 'table', rp.bucket, '']),
        body: {
            table: rp.bucket,
            attributes: {
                key: rp.key
            },
            limit: 1
        }
    };
    return hyper.get(storeReq).then(returnRevision(req));
}
Cache invalidation is done by the change propagation service, which is triggered on page edits and similar events. Cache control headers are probably set in the Varnish VCL logic. See here for a full Wikimedia infrastructure diagram - it is outdated but gives you the generic idea of how things are wired together.
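For illustration, here is how a client forces regeneration per-request, presumably via a Cache-Control: no-cache header, which is what mwUtil.isNoCacheRequest() in the extract above inspects; the endpoint URL is just an example:

// Hypothetical client call: a no-cache request makes
// mwUtil.isNoCacheRequest(req) true, so the stored revision is
// skipped and the content is re-generated and re-stored.
fetch('https://en.wikipedia.org/api/rest_v1/page/summary/Earth', {
    headers: { 'Cache-Control': 'no-cache' }
})
    .then((res) => res.json())
    .then((summary) => console.log(summary));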

self.addEventListener('fetch', function(e) { }) is not working

I have a question about PWAs and would be glad if someone could help me with it. In my PWA I don't have any problem with storing static files like HTML, JS & CSS, but I am facing issues with dynamic data: my self.addEventListener('fetch', function(e) { }) is not getting called, while other functionality works fine, i.e. the 'install' and 'activate' events.
To be more specific, I am using @angular/service-worker, which worked fine, but I created another SW file called sw.js. In my sw.js I'm listening to the 'install', 'activate' and 'fetch' events. My sw.js fetch handler is not getting called, whereas the other two work well; while fetching, only ngsw-worker.js's fetch handler gets called.
The thing I need is to make CRUD operations work in a PWA with Angular.
Thanks in advance!
You can do dynamic caching like below; the service worker will intercept every request and add the response to the cache.
self.addEventListener("fetch", function (event) {
  event.respondWith(
    caches.open("dynamiccache").then(function (cache) {
      return fetch(event.request).then(function (res) {
        cache.put(event.request, res.clone());
        return res;
      });
    })
  );
});
Note: you can't cache POST requests.
Can service workers cache POST requests?
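Since the Cache API rejects non-GET requests, a safer variant of the handler above skips them entirely and also falls back to the cache when the network fails (a sketch, reusing the same cache name):

self.addEventListener("fetch", function (event) {
  // The Cache API only stores GET requests, so let POST/PUT/DELETE
  // fall through to the browser's default network handling.
  if (event.request.method !== "GET") {
    return;
  }
  event.respondWith(
    caches.open("dynamiccache").then(function (cache) {
      return fetch(event.request)
        .then(function (res) {
          cache.put(event.request, res.clone());
          return res;
        })
        // If the network fails, serve the cached copy, if any.
        .catch(function () {
          return cache.match(event.request);
        });
    })
  );
});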

What are the options for offline registration and forms?

I have a project that caters for individuals with poor internet connections in predominantly rural areas. I need to allow users to download (or obtain through any other applicable means) and fill out details offline; then, when they are ready and the internet connection is available, the data filled out offline should sync with the online database and produce a report.
The offline form also needs the same validation as online, to ensure no time wastage.
What are the options? I know that HTML5 has an offline application capability. I would prefer an open source option which allows people with intermittent internet issues to continue filling out a form or series of forms even though the internet has dropped, with the data syncing when the internet reconnects.
So what are the best options? Requiring the user to download a large application is also not ideal; I would prefer a browser-based or small-download solution. Maybe even a way of downloading a validatable form in some format for re-upload.
This is something I've been muddling through myself as some of the users of the site I am currently tasked with building have poor connections or would like to fill in forms away from a network for various reasons. Depending on your precise needs and your customer's browser compatibility, the solution I've decided to go with is to use the HTML5 cache capability you mention in your post.
The amount of data stored is not that great, and it will mean that the webpage you want them to fill in is available offline.
If you couple this with the localStorage interface you can keep all form submissions until they regain connection.
As an example of my current solution:
The cache.php file, to write the manifest:
<?php
header("Content-Type: text/cache-manifest");
echo "CACHE MANIFEST\n";

$pages = array(
    //an array of the pages you want cached for later
);

foreach($pages as $page) {
    echo $page."\n";
}

$time = new DateTime("now");
//this makes sure that the cache is different when the browser checks it
//otherwise the cache will not be rebuilt even if you change a cached page
echo "#Last Build Time: ".$time->format("d m Y H:i:s T");
You can then have a simple AJAX script checking for the connection:
setInterval( function() {
    $.ajax({
        url: 'testconnection.php',
        type: 'post',
        data: { 'test' : 'true' },
        error: function(XHR, textStatus, errorThrown) {
            if(textStatus === 'timeout') {
                //update a global var saying connection is down
                noCon = true;
            }
        }
    });
    if(hasUnsavedData) {
        //using the key/value pairs in localstorage, put together a data object and ajax it into the database
        //once complete, return unsavedData to false to prevent refiring this until we have new data
        //also using localStorage.removeItem(key) to clear out all localstorage info
    }
}, 20000 /*medium gap between calls, do whatever works best for you here*/);
Then, for your form submission script, use localStorage if that noCon variable is set to true:
$(/*submit button*/).on("click", function(event) {
    event.preventDefault();
    if(noCon) {
        //go through all inputs in some way and put to localstorage, your method is up to you
        $("input").each( function() {
            var key = $(this).attr("name"), val = $(this).val();
            localStorage[key] = val;
        });
        //update a global variable to let the script above know to save information
        hasUnsavedData = true;
    } else {
        //or if there's connection, submit the form in some manner
        $("form").submit();
    }
});
I've not tested every script on this page, but they're written based on the skeleton of what my current solution is doing, minus a lot of error checking etc., so hopefully they will give you some ideas on how to approach this.
Suggestions for improvements are welcome.
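One addition in the same vein: on page load, repopulate the form from any localStorage entries left over from an offline session, so the sync interval above can pick them up (a sketch using the same hypothetical globals):

$(function() {
    //on load, restore values saved while offline so the user can
    //review them and the sync interval above can submit them
    $("input").each( function() {
        var key = $(this).attr("name");
        var saved = localStorage.getItem(key);
        if(saved !== null) {
            $(this).val(saved);
            hasUnsavedData = true;
        }
    });
});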

My app can't update the cache while the PWA is active on Laravel

I'm building an application as a PWA, but when I update something in the database, no changes appear at the main URL until I clear the cache held by the service worker; on mobile, the end user shouldn't have to clear the cache first in order to use the application. Is there a solution to my problem?
I have tried several alternatives, such as masking the URL to avoid the service worker's cache, but I know that's not efficient.
Here is my service worker file:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/4.3.1/workbox-sw.js');

if (workbox) {
    workbox.setConfig({
        debug: true
    });

    // top-level routes we want to precache
    workbox.precaching.precacheAndRoute(['/123', '/123']);

    // injected assets by Workbox CLI
    workbox.precaching.precacheAndRoute([
        //my precache data is here
    ]);

    // match routes for homepage, blog and any sub-pages of blog
    workbox.routing.registerRoute(
        /^\/(?:()?(\/.*)?)$/,
        new workbox.strategies.NetworkFirst({
            cacheName: 'static-resources',
        })
    );

    // js/css files
    workbox.routing.registerRoute(
        /\.(?:js|css)$/,
        new workbox.strategies.NetworkFirst({
            cacheName: 'static-resources',
        })
    );

    // images
    workbox.routing.registerRoute(
        // Cache image files.
        /\.(?:png|jpg|jpeg|svg|gif)$/,
        // Use the cache if it's available.
        new workbox.strategies.NetworkFirst({
            // Use a custom cache name.
            cacheName: 'image-cache',
            plugins: [
                new workbox.expiration.Plugin({
                    // Cache up to 50 images.
                    maxEntries: 50,
                    // Cache for a maximum of a week.
                    maxAgeSeconds: 7 * 24 * 60 * 60,
                })
            ],
        })
    );
}
I don't see any logic in your service worker to invalidate cached data/responses when you post updates to the server. You will have to add that logic to your fetch handler.
FYI, Workbox does not include that by default; you will have to create your own logic for it. Workbox only ships some common caching strategies; it can't provide modules for every site's custom caching scenario. That is why the tool is extensible.
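A rough sketch of what that could look like, assuming your app mutates data through non-GET requests and reusing the 'static-resources' cache name from your config (in practice you would scope the deletion to the affected URLs rather than dropping the whole cache):

self.addEventListener('fetch', (event) => {
    if (event.request.method !== 'GET') {
        event.respondWith(
            fetch(event.request).then(async (response) => {
                // After a successful write, drop the cached GET responses
                // so a later offline fallback can't serve stale data.
                if (response.ok) {
                    await caches.delete('static-resources');
                }
                return response;
            })
        );
    }
});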