I'm building an application as a PWA, but when I update something in the database, no changes appear at the main URL until I clear the service worker's cache. On mobile, the end user can't be expected to clear the cache first in order to use the application. Is there a solution to my problem?
I have tried several alternatives, such as masking the URL to avoid the service worker cache, but I know that isn't efficient.
Here is my service worker file:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/4.3.1/workbox-sw.js');

if (workbox) {
    workbox.setConfig({
        debug: true
    });

    // top-level routes we want to precache
    workbox.precaching.precacheAndRoute(['/123', '/123']);

    // injected assets by Workbox CLI
    workbox.precaching.precacheAndRoute([
        // my precache data is here
    ]);

    // match routes for homepage, blog and any sub-pages of blog
    workbox.routing.registerRoute(
        /^\/(?:()?(\/.*)?)$/,
        new workbox.strategies.NetworkFirst({
            cacheName: 'static-resources',
        })
    );

    // js/css files
    workbox.routing.registerRoute(
        /\.(?:js|css)$/,
        new workbox.strategies.NetworkFirst({
            cacheName: 'static-resources',
        })
    );

    // images
    workbox.routing.registerRoute(
        // Cache image files.
        /\.(?:png|jpg|jpeg|svg|gif)$/,
        // Use the cache if it's available.
        new workbox.strategies.NetworkFirst({
            // Use a custom cache name.
            cacheName: 'image-cache',
            plugins: [
                new workbox.expiration.Plugin({
                    // Cache up to 50 images.
                    maxEntries: 50,
                    // Cache for a maximum of a week.
                    maxAgeSeconds: 7 * 24 * 60 * 60,
                })
            ],
        })
    );
}
I don't see any logic in your service worker to invalidate cached data/responses when you post updates to the server. You will have to add that logic to your fetch handler.
FYI, Workbox does not include that by default; you will have to create your own logic for it. Workbox only provides some common caching strategies in place. It can't provide modules for every site's custom caching scenario, which is why the tool is extensible.
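As a rough sketch of such logic (the cache name 'static-resources' is taken from your config above; everything else here is an illustrative assumption, not a Workbox feature), you could intercept mutating requests and drop the affected cached pages once the server confirms the write:

```javascript
// Hypothetical sketch: invalidate cached pages after a successful
// mutation so the next GET goes to the network.

// A request that can change server state should trigger invalidation.
function isMutation(method) {
    return !['GET', 'HEAD', 'OPTIONS'].includes(method.toUpperCase());
}

// Guarded so the sketch is inert outside a service worker context.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
    self.addEventListener('fetch', (event) => {
        if (!isMutation(event.request.method)) {
            return; // let the registered Workbox routes handle reads
        }
        event.respondWith(
            fetch(event.request).then(async (response) => {
                if (response.ok) {
                    // Drop cached pages; assumes the runtime cache
                    // name used by your NetworkFirst routes.
                    const cache = await caches.open('static-resources');
                    const keys = await cache.keys();
                    await Promise.all(keys.map((key) => cache.delete(key)));
                }
                return response;
            })
        );
    });
}
```

A more surgical variant would delete only the cache entries for the pages a given mutation actually affects, rather than clearing the whole cache.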
I am working on a PWA project. Per the requirements, the cached data is dynamic, which means we are not using self.__WB_MANIFEST to cache it. Instead, all the URLs are sent over the API and stored under one cache.
Now the issue is: for an HTML page that is on the same domain and same origin but not cached, we want to show an offline.html page.
Usually this would work if we used self.__WB_MANIFEST, but per the requirements we are not using it.
We do have the offline.html page in the cached data, and it is present in the folder structure, but it is not being triggered.
Can anybody please help guide me on how to pick up the URL from the cache and show it as an offline page when the app is offline and the particular page is not cached?
Currently using Workbox v6.
addEventListener('fetch', async (event) => {
    console.log("add event listener fetch" + event);
    event.respondWith(
        caches.match(event.request).then(async (cachedResponse) => {
            console.log("inside cached response" + event.request.mode);
            if (cachedResponse) {
                console.log("fetch event if");
                return cachedResponse;
            } else if (!cachedResponse) {
                console.log("fetch event else");
                return fetch(event.request);
            } else {
                const cache = await caches.open(OFFLINE_PAGES_RESPONSE);
                return cache.match('/offline.html');
            }
        }).catch(async () => {
            console.log("inside cache");
            const cache = await caches.open(OFFLINE_PAGES_RESPONSE);
            return cache.match('/offline.html');
        })
    );
});
All my logic fails because a Chrome extension blocks one of my JS files.
Is there a way to make precaching more robust? Even if I get an error when caching some files, I can still cache most of them correctly, and I have runtime caching.
If you're using workbox-precaching, then the answer is no: it's designed so that service worker installation will only proceed if all of the items in the precache manifest were successfully added to the cache. That way, you're guaranteed to have a functional set of resources available offline. (There's a longstanding feature request for adding support for "optional" precaching, but it's not clear how that would work in practice.)
I'd recommend using runtime caching for URLs that are optional, and might be blocked by browser extensions. If you want to "warm" the cache in this scenario, and you don't care if the cache population fails, you can add your own logic along the lines of:
import {CacheFirst} from 'workbox-strategies';
import {registerRoute} from 'workbox-routing';
import {precacheAndRoute} from 'workbox-precaching';

const OPTIONAL_CACHE_NAME = 'optional-resources';
const OPTIONAL_URLS = [
    // Add URLs here that might be blocked.
];

self.addEventListener('install', (event) => {
    event.waitUntil((async () => {
        const cache = await caches.open(OPTIONAL_CACHE_NAME);
        for (const url of OPTIONAL_URLS) {
            try {
                await cache.add(url);
            } catch (e) {
                // Ignore failures due to, e.g., a content blocker.
            }
        }
    })());
});

// Precache everything in the manifest, which you need to
// configure to exclude your "optional" URLs.
precacheAndRoute(self.__WB_MANIFEST);

// Use a cache-first runtime strategy.
registerRoute(
    // Check url.pathname, or url.href, if OPTIONAL_URLS
    // contains full URLs.
    ({url}) => OPTIONAL_URLS.includes(url.pathname),
    new CacheFirst({cacheName: OPTIONAL_CACHE_NAME}),
);
I have converted my app into a PWA with Workbox, using the precaching strategy.
Right now I reload the page when the Workbox worker has finished refetching the cache:
// Register service worker extract
import { register } from 'register-service-worker';

if (process.env.NODE_ENV === 'production') {
    register(`${process.env.BASE_URL}service-worker.js`, {
        updatefound() {
            // New content is downloading.
        },
        updated() {
            // New content is available; refresh.
            setTimeout(() => {
                window.location.reload(true);
            }, 500);
        },
    });
}

// Service worker extract
import { precacheAndRoute } from 'workbox-precaching/precacheAndRoute';

precacheAndRoute(self.__WB_MANIFEST);
self.skipWaiting();
But I find it really bothersome to have a stale version for 2-5 seconds and then have the page reloaded with the new version.
What I would like to achieve is runtime caching: when an update is found, the new files are used directly instead of the cache being refetched in the background.
Is there a way to configure Workbox for that, so that I can reload the page straight away?
// Register service worker extract
updatefound() {
    window.location.reload(true);
},
updated() {
},
And then the Workbox worker would not serve the cache on the reloaded page and would instead make the network requests on the fly: basically a precaching and runtime-caching hybrid, to get the best of both worlds?
I couldn't find anything that achieves this anywhere.
What you're describing is using a NetworkFirst strategy, along with optionally "warming" the runtime cache with the content that you want to make sure is available offline.
Precaching, with its cache-first approach to serving content, doesn't sound like an appropriate solution to your use case.
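A sketch of that combination, written in the classic importScripts/workbox-global style; the cache name and warm-URL list here are assumptions you would adapt to your app:

```javascript
// Sketch: NetworkFirst for navigations plus cache "warming" at
// install time. Cache name and URL list are illustrative.
const PAGE_CACHE = 'pages';
const URLS_TO_WARM = ['/']; // content that must be available offline

// Top-level page loads are the navigations we want NetworkFirst for.
function isNavigationRequest(request) {
    return request.mode === 'navigate';
}

// Guarded so the sketch is inert outside a service worker context.
if (typeof workbox !== 'undefined') {
    // Always try the network first; fall back to the cache offline.
    workbox.routing.registerRoute(
        ({ request }) => isNavigationRequest(request),
        new workbox.strategies.NetworkFirst({ cacheName: PAGE_CACHE })
    );
}

if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
    self.addEventListener('install', (event) => {
        // Warm the runtime cache so the offline fallback exists
        // even before the user has visited those pages.
        event.waitUntil(
            caches.open(PAGE_CACHE).then((cache) => cache.addAll(URLS_TO_WARM))
        );
    });
}
```

With this setup there is no stale-then-reload window: every navigation fetches fresh content when the network is available, and the warmed cache only serves when it is not.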
My clients are reporting a serious user-experience issue in a web application after converting it into a PWA.
I need to refresh the browser twice after any dynamic update in this PHP application: sharing a new post, deleting, adding a comment, etc. Everything works fine in the database, but a cached version of the same page appears after the first refresh. It even brings up the page of a logged-out user if we access the app from a new account.
I have also found a few fixes suggesting modifying the service worker file to fetch updates directly from the server. The service worker code is shown below:
importScripts('https://storage.googleapis.com/workbox-cdn/releases/4.3.1/workbox-sw.js');

if (workbox) {
    console.log("Yay! Workbox is loaded !");
    workbox.precaching.precacheAndRoute([]);

    /* cache images in, e.g., the others folder; edit for the other
       folders you have and configure in the sw-config.js file
    */
    workbox.routing.registerRoute(
        /(.*)others(.*)\.(?:png|gif|jpg)/,
        new workbox.strategies.CacheFirst({
            cacheName: "images",
            plugins: [
                new workbox.expiration.Plugin({
                    maxEntries: 50,
                    maxAgeSeconds: 30 * 24 * 60 * 60, // 30 Days
                })
            ]
        })
    );

    /* Make your JS and CSS ⚡ fast by returning the assets from the cache,
       while making sure they are updated in the background for the next use.
    */
    workbox.routing.registerRoute(
        // cache js, css, scss files
        /.*\.(?:css|js|scss|)/,
        // use cache but update in the background ASAP
        new workbox.strategies.StaleWhileRevalidate({
            // use a custom cache name
            cacheName: "assets",
        })
    );

    // cache google fonts
    workbox.routing.registerRoute(
        new RegExp("https://fonts.(?:googleapis|gstatic).com/(.*)"),
        new workbox.strategies.CacheFirst({
            cacheName: "google-fonts",
            plugins: [
                new workbox.cacheableResponse.Plugin({
                    statuses: [0, 200],
                }),
            ],
        })
    );

    // add offline analytics
    workbox.googleAnalytics.initialize();

    /* Install a new service worker and have it update
       and control a web page as soon as possible
    */
    workbox.core.skipWaiting();
    workbox.core.clientsClaim();
} else {
    console.log("Oops! Workbox didn't load 👺");
}
Following the installation of RestBase using the standard config, I have a working version of the summary API.
The problem is that the caching mechanism seems strange to me.
The piece of code below decides whether to look at a table cache for a fast response, but I cannot make the server-side cache depend on a time constraint (e.g., a max-age from when the cache entry was written). It means that the decision to use the cache or not depends entirely on clients.
Can someone explain the workflow of the RestBase caching mechanism?
// Inside key.value.js
getRevision(hyper, req) {
    // This gets the header from the client request and decides whether
    // to use the cache based on its value. Does it mean server-side
    // caching is non-existent?
    if (mwUtil.isNoCacheRequest(req)) {
        throw new HTTPError({ status: 404 });
    }

    // If the cache should be used, the code below runs.
    const rp = req.params;
    const storeReq = {
        uri: new URI([rp.domain, 'sys', 'table', rp.bucket, '']),
        body: {
            table: rp.bucket,
            attributes: {
                key: rp.key
            },
            limit: 1
        }
    };
    return hyper.get(storeReq).then(returnRevision(req));
}
Cache invalidation is done by the change propagation service, which is triggered on page edits and similar events. Cache control headers are probably set in the Varnish VCL logic. See here for a full Wikimedia infrastructure diagram; it is outdated, but it gives you a general idea of how things are wired together.
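For illustration only (this is a guess at the shape of the check, not RestBase's actual implementation), a test like mwUtil.isNoCacheRequest typically just inspects the client's Cache-Control header, which is why the use-cache decision rests with the client rather than with any server-side time constraint:

```javascript
// Hypothetical sketch of a client-driven no-cache check; the real
// mwUtil.isNoCacheRequest may differ.
function isNoCacheRequest(req) {
    const cacheControl = (req.headers && req.headers['cache-control']) || '';
    return /no-cache/i.test(cacheControl);
}
```

Server-side, time-based expiry would instead have to be enforced when writing or reading the table entry, for example via a TTL at the storage layer.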