Why is my Workbox GenerateSW showing my offline page while connected? - progressive-web-apps

I'm trying to set up my offline page using Workbox generateSW() and running into an issue: the first load after I clear site data and hard refresh displays my homepage, but subsequent loads show the offline page I set up, even though I'm online. I have a multi-page PHP app whose assets are served from a CDN. I run the generateSW() task in a JS file called by an npm script.
Here is my generateSW() code...
// Pull in .env file values...
const dotEnv = require('dotenv').config({ path: '/var/www/my-project/sites/www/.env' });
if (dotEnv.error) {
  throw dotEnv.error;
}

const {generateSW} = require('workbox-build');

// Used to break the cache when generating a new SW, since the file is composed of multiple pieces that can't be watched.
const genRanHex = (size = 24) => [...Array(size)].map(() => Math.floor(Math.random() * 16).toString(16)).join('');

const mode = 'development';

generateSW({
  swDest: './sites/www/public/service-worker.js',
  skipWaiting: true,
  clientsClaim: true,
  cleanupOutdatedCaches: true,
  cacheId: genRanHex(),
  mode: mode,
  navigateFallback: '/offline',
  offlineGoogleAnalytics: mode === 'production',
  globDirectory: './sites/assets/public',
  globPatterns: [
    'img/shell/**/*.{svg,png}',
    'dist/**/*.{js,css}',
    'manifest.json'
  ],
  modifyURLPrefix: {
    'dist/': `${dotEnv.parsed.APPLICATION_ASSETS_CDN}/dist/`,
    'img/shell/': `${dotEnv.parsed.APPLICATION_ASSETS_CDN}/img/shell/`,
  },
  ignoreURLParametersMatching: [/v/],
  additionalManifestEntries: [
    {
      "url": "/offline",
      "revision": genRanHex()
    }
  ],
  runtimeCaching: []
}).then(({count, size}) => {
  console.log(`Generated service worker, which will precache ${count} files, totaling ${size} bytes.`);
}).catch(console.error);

navigateFallback is not actually an offline page. From the Workbox docs:
If specified, all navigation requests for URLs that aren't precached will be fulfilled with the HTML at the URL provided. You must pass in the URL of an HTML document that is listed in your precache manifest. This is meant to be used in a Single Page App scenario, in which you want all navigations to use common App Shell HTML.
For an offline page, this question might help.
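In other words, navigateFallback is for the SPA app-shell case, something like this (a minimal sketch, assuming index.html is the shell and gets precached):
generateSW({
  // ...
  globPatterns: ['index.html', 'dist/**/*.{js,css}'],
  navigateFallback: '/index.html' // must be listed in the precache manifest;
                                  // every navigation then gets the shell HTML, online or offline
});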

So the accepted answer was right about my misuse of navigateFallback, which I was trying to use as an offline fallback for non-cached routes. After some digging and tinkering, I found the correct way to go about it. The important part that I missed, or that isn't documented well enough in Workbox, is that the offline fallback happens at the runtimeCaching level...
// Pull in .env file values...
const dotEnv = require('dotenv').config({ path: '/var/www/my-project/sites/www/.env' });
if (dotEnv.error) {
  throw dotEnv.error;
}

const {generateSW} = require('workbox-build');

// Used to break the cache when generating a new SW, since the file is composed of multiple pieces that can't be watched.
const genRanHex = (size = 24) => [...Array(size)].map(() => Math.floor(Math.random() * 16).toString(16)).join('');

const mode = 'development';

generateSW({
  swDest: './sites/www/public/service-worker.js',
  skipWaiting: true,
  clientsClaim: true,
  cleanupOutdatedCaches: true,
  cacheId: genRanHex(),
  mode: mode,
  offlineGoogleAnalytics: mode === 'production',
  globDirectory: './sites/assets/public',
  globPatterns: [
    'img/shell/**/*.{svg,png}',
    'dist/**/*.{js,css}',
    'manifest.json'
  ],
  modifyURLPrefix: {
    'dist/': `${dotEnv.parsed.APPLICATION_ASSETS_CDN}/dist/`,
    'img/shell/': `${dotEnv.parsed.APPLICATION_ASSETS_CDN}/img/shell/`,
  },
  ignoreURLParametersMatching: [/v/],
  additionalManifestEntries: [
    {
      "url": "/offline",
      "revision": genRanHex()
    }
  ],
  runtimeCaching: [
    {
      urlPattern: /^https:\/\/([\w+\.\-]+www\.mysite\.tv)(|\/.*)$/,
      handler: 'StaleWhileRevalidate',
      options: {
        cacheName: 'core',
        precacheFallback: {
          fallbackURL: '/offline' // THIS IS THE KEY
        }
      }
    }
  ]
}).then(({count, size}) => {
  console.log(`Generated service worker, which will precache ${count} files, totaling ${size} bytes.`);
}).catch(console.error);
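As an aside, for anyone hand-writing their service worker with Workbox modules instead of using generateSW, the precacheFallback option corresponds to workbox-precaching's PrecacheFallbackPlugin. A rough equivalent of the runtime route above (a sketch, assuming Workbox v6 module imports):
import {registerRoute} from 'workbox-routing';
import {StaleWhileRevalidate} from 'workbox-strategies';
import {PrecacheFallbackPlugin} from 'workbox-precaching';

registerRoute(
  /^https:\/\/([\w+\.\-]+www\.mysite\.tv)(|\/.*)$/,
  new StaleWhileRevalidate({
    cacheName: 'core',
    plugins: [
      // Serve the precached /offline page whenever the strategy fails
      // to produce a response (network down and nothing in the cache).
      new PrecacheFallbackPlugin({fallbackURL: '/offline'}),
    ],
  })
);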

Related

Android Enterprises Device Enrollment Stuck with NodeJs Generated QR Code with Service Account Authentication

As mentioned in the Google documentation, I have tested the following process:
URL to quick start: https://colab.research.google.com/github/google/android-management-api-samples/blob/master/notebooks/quickstart.ipynb#scrollTo=pjHfDSb8BoBP
Create Enterprise
Create Policy
Enroll the device
Then I used the Node.js API for Android Enterprise to develop a server-based solution, which works fine as per the documentation for all the functions such as getting, creating, and deleting policies, devices, and enterprises.
The issue I am facing is with the QR code generated by the Node.js application: when I scan it, the device gets stuck at the system update step.
Here is my policy update function:
router.post('/update/:id', async function(req, res) {
  const {title, policy_body, update_mask, enroll_url} = req.body;
  // Call the Android Management API, then save the response to the database.
  const amApiBody = {
    name: policy_body.name,
    updateMask: update_mask,
    requestBody: policy_body
  };
  const policy_update_response = await amApi.updatePolicy(amApiBody);
  const p = await policyModel.update(req.params.id, title, policy_update_response, enroll_url);
  res.json(p);
});
AmAPI file
this.updatePolicy = async function (body) {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/androidmanagement'],
  });
  const authClient = await auth.getClient();
  google.options({auth: authClient});
  // Patch (update) the policy.
  const res = await androidmanagement.enterprises.policies.patch(body);
  console.log('requestFinalBody=', body);
  return res.data;
};
Here is the policy data obtained by running the above function:
policy_create_response = {
  name: 'enterprises/LC019rjnor/policies/policy1',
  version: '14',
  applications: [
    {
      packageName: 'com.google.samples.apps.iosched',
      installType: 'FORCE_INSTALLED',
      autoUpdateMode: 'AUTO_UPDATE_HIGH_PRIORITY'
    },
    {
      packageName: 'com.dekaisheng.courier',
      installType: 'FORCE_INSTALLED',
      autoUpdateMode: 'AUTO_UPDATE_HIGH_PRIORITY'
    }
  ],
  keyguardDisabledFeatures: [ 'KEYGUARD_DISABLED_FEATURE_UNSPECIFIED' ],
  defaultPermissionPolicy: 'GRANT',
  uninstallAppsDisabled: true,
  keyguardDisabled: true,
  tetheringConfigDisabled: true,
  dataRoamingDisabled: true,
  networkEscapeHatchEnabled: true,
  bluetoothDisabled: true,
  debuggingFeaturesAllowed: true,
  funDisabled: true,
  kioskCustomLauncherEnabled: true
}
Note: I exported the following variable in the terminal before running the app; auth.json is the service account credential file.
export GOOGLE_APPLICATION_CREDENTIALS="/Users/Mac/Projects/wajid/mdm/server/env/auth.json"
Thanks for the help in advance
I figured out that in the Node.js API I was passing the wrong property name for the policy value in the request body.
Code before fix:
parent: this.getParent(policyName),
requestBody: {
  "name": "my_policy"
}
Code after fix:
parent: this.getParent(policyName),
requestBody: {
  "policyName": "my_policy"
}
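For context, this matches the shape of the enrollment-token call in the googleapis Node.js client, where the policy is referenced through a policyName field and the QR payload comes back on the token (a sketch; the enterprise and policy names are taken from the question):
const res = await androidmanagement.enterprises.enrollmentTokens.create({
  parent: 'enterprises/LC019rjnor', // what this.getParent(policyName) resolves to
  requestBody: {
    policyName: 'enterprises/LC019rjnor/policies/policy1',
  },
});
// res.data.qrCode contains the JSON payload to encode into the QR image.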

Swagger Tools Production Build Node js

We implemented Swagger in our Node.js application. Currently we create a production build using webpack and remove the controller and service files.
bin/www.js
const path = require('path');
const YAML = require('yamljs');
const swaggerTools = require('swagger-tools');
// yamljs reads a file with YAML.load(); safeLoad() is js-yaml's API and takes a string, not a path.
const swaggerDoc = YAML.load('./swagger.yaml');

// swaggerRouter configuration
const swaggerOptions = {
  controllers: path.join(__dirname, '../public/javascripts/controllers'),
  useStubs: true, // Conditionally turn on stubs (mock mode)
};

// Initialize the Swagger middleware (`app` and `auth` are defined elsewhere in bin/www.js)
swaggerTools.initializeMiddleware(swaggerDoc, (middleware) => {
  // Interpret Swagger resources and attach metadata to the request - must be first in the swagger-tools middleware chain
  app.use(middleware.swaggerMetadata());
  // Validate security using the JWT token
  app.use(middleware.swaggerSecurity({
    Bearer: auth.verifyToken
  }));
  // Validate Swagger requests
  app.use(middleware.swaggerValidator({
    validateResponse: true
  }));
  // Route validated requests to the appropriate controller
  app.use(middleware.swaggerRouter(swaggerOptions));
  // Serve the Swagger documents and Swagger UI
  app.use(middleware.swaggerUi());
});
In the production build the Swagger middleware expects the same controller path to resolve, but after the build we delete the public folder.
Webpack code
const path = require('path');
const nodeExternals = require('webpack-node-externals');

module.exports = {
  entry: {
    server: './bin/www',
  },
  output: {
    path: path.join(__dirname, 'dist'),
    publicPath: '/',
    filename: 'server.build.js',
  },
  target: 'node',
  node: {
    // Needed when working with express, otherwise the build fails:
    __dirname: false,  // without these, __dirname and __filename
    __filename: false, // return blank or '/'
  },
  externals: [nodeExternals()],
  module: {
    rules: [
      {
        // Transpiles ES6-8 into ES5
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
        },
      },
    ],
  },
};
Please help us create a production build that works with the Swagger middleware.
Thanks in advance.
swagger-tools is not a package bundler like webpack, so you will still need to provide it the controller files. Since you are deleting /public from prod, there is no way for the swagger-tools middleware to get the files it needs. Webpack in this case is basically building a dist from your code, which is why deleting the controllers and services seems fine for the bundle itself, but swaggerRouter still resolves them from disk at runtime.
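One way to keep those files available in production (not from the original answer, just a sketch assuming copy-webpack-plugin v6+) is to copy the controllers into the build output so swaggerRouter's filesystem lookup still resolves:
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  // ...the webpack config from the question...
  plugins: [
    new CopyWebpackPlugin({
      patterns: [
        // Keep the controller files on disk next to the bundle; the target path
        // must match what swaggerOptions.controllers resolves to in production.
        { from: 'public/javascripts/controllers', to: 'public/javascripts/controllers' },
      ],
    }),
  ],
};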

How to exclude url in workbox runtime caching?

I am using workbox-build with Gulp in my Django project. Everything works correctly, but there are some problems with admin URLs. As far as I can see, /admin/* URLs are cached at runtime; I can see them in Chrome DevTools > Application > Cache. How can I exclude admin URLs from runtime caching?
gulp.js:
gulp.task('service-worker', () => {
  return workboxBuild.injectManifest({
    globDirectory: '/var/www/example.com/',
    swSrc: '/var/www/example.com/core/templates/core/serviceWorker/sw-dev.js',
    swDest: '/var/www/example.com/core/templates/core/serviceWorker/sw.js',
    globPatterns: ['**/*.{html,js,css,jpg,png,ttf,otf}'],
    globIgnores: ['admin/**', 'media/**', 'core/**', 'static/admin/**', 'static/core/scripts/plugins/**']
  }).then(({count, size, warnings}) => {
    // handle warnings / log the result here
  });
});
sw.js:
importScripts("https://storage.googleapis.com/workbox-cdn/releases/3.4.1/workbox-sw.js");
workbox.precaching.precacheAndRoute([]);
workbox.googleAnalytics.initialize();

workbox.routing.registerRoute(
  // A match pattern is required as the first argument; the original snippet
  // was missing one. Something like this for images (an assumption based on
  // the cache name):
  /\.(?:png|jpg|jpeg|svg|gif)$/,
  workbox.strategies.cacheFirst({
    // Use a custom cache name
    cacheName: 'image-cache',
    plugins: [
      new workbox.expiration.Plugin({
        // Cache at most 30 images
        maxEntries: 30,
        // Cache for a maximum of a week
        maxAgeSeconds: 7 * 24 * 60 * 60,
      })
    ],
  })
);

workbox.routing.registerRoute(
  /.*\.(?:ttf|otf)/,
  workbox.strategies.cacheFirst({
    cacheName: 'font-cache',
  })
);

workbox.routing.registerRoute(
  new RegExp('/$'),
  workbox.strategies.staleWhileRevalidate()
);

workbox.routing.registerRoute(
  new RegExp('contacts/$'),
  workbox.strategies.staleWhileRevalidate()
);

workbox.routing.registerRoute(
  new RegExp('pricelist/$'),
  workbox.strategies.staleWhileRevalidate()
);
In addition to providing RegExps for routing, Workbox's registerRoute() method supports matchCallback functions. I think that they're easier to make sense of, and recently most of the examples in the public documentation have migrated to use them.
workbox.routing.registerRoute(
  // Match all navigation requests, except those for URLs whose
  // path starts with '/admin/'
  ({request, url}) => request.mode === 'navigate' &&
    !url.pathname.startsWith('/admin/'),
  new workbox.strategies.StaleWhileRevalidate()
);
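If you are still on the v3 API used in the question's sw.js, the same idea would look roughly like this (a sketch, untested; v3 matchCallbacks receive {url, event} rather than {request, url}):
workbox.routing.registerRoute(
  ({url, event}) => event.request.mode === 'navigate' &&
    !url.pathname.startsWith('/admin/'),
  workbox.strategies.staleWhileRevalidate()
);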

Using react-hot-loader with custom babel preset

My app doesn't support older browsers, and I like to trim down the set of Babel transforms to make the code easier to debug (so that the code in the debugger looks more like the original source).
However, when I migrate to react-hot-loader 3, this no longer works. That is, I can get RHL 3 to work with the standard es2015 preset, but not with my custom set of transforms. What happens is that the React components are rendered but never mounted, and won't respond to any events.
The set of transforms I am trying to use is:
var babel_plugins = [
  'transform-runtime',
  'transform-object-rest-spread',
  // Transforms needed for modern browsers only
  'babel-plugin-check-es2015-constants',
  'babel-plugin-transform-es2015-block-scoping',
  'babel-plugin-transform-es2015-function-name',
  'babel-plugin-transform-es2015-parameters',
  'babel-plugin-transform-es2015-destructuring',
  // No longer needed with Webpack 2
  // 'babel-plugin-transform-es2015-modules-commonjs',
  'react-hot-loader/babel',
];
In response to the comments, here's more information:
Here's how I'm using the AppContainer:
export default (
  <AppContainer>
    <Router history={browserHistory}>
      (My routes here...)
    </Router>
  </AppContainer>
);
And here's my dev server setup:
// Adjust the config for hot reloading.
config.entry = {
  main: [
    'react-hot-loader/patch',
    'webpack-dev-server/client?http://127.0.0.1:8000', // WebpackDevServer host and port
    'webpack/hot/only-dev-server', // "only" prevents reload on syntax errors
    './src/main.js', // Your app's entry point
  ],
  frame: './src/frame_main.js', // Entry point for popup tab
};
config.plugins.push(new webpack.HotModuleReplacementPlugin());

const compiler = webpack(config);
const server = new WebpackDevServer(compiler, {
  contentBase: path.resolve(__dirname, '../builds/'),
  historyApiFallback: true,
  stats: 'errors-only',
  hot: true,
});
server.listen(8000, '127.0.0.1', () => {});
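For reference, the usual RHL 3 entry wiring re-renders on hot updates along these lines (a sketch; the ./app module name is an assumption for wherever the AppContainer tree above is exported from):
import ReactDOM from 'react-dom';

const render = () => {
  // Re-require so the hot-updated module graph is picked up.
  const App = require('./app').default;
  ReactDOM.render(App, document.getElementById('root'));
};

render();

if (module.hot) {
  // Re-render whenever ./app (or anything it imports) is hot-updated.
  module.hot.accept('./app', render);
}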
Here's the relevant portion of my webpack config:
test: /\.jsx?$/,
include: __dirname + '/src',
exclude: __dirname + '/src/libs',
use: [
  {
    loader: 'babel-loader',
    options: {
      plugins: babel_plugins,
      presets: babel_presets
    },
  },
  {
    loader: 'eslint-loader',
  },
]

Using Grunt to Mock Endpoints

I'm using Yeoman, Grunt, and Bower to construct a platform for building a frontend independently of a backend. The idea is that all of my (AngularJS) controllers, services, factories, etc. live in this project and get injected afterwards into my server-side codebase based on the result of grunt build.
My question is:
How can I mock endpoints so that the Grunt server responds to the same endpoints as my (Rails) App will?
At the moment I am using:
angular.module('myApp', ['ngResource'])
  .run(['$rootScope', function ($rootScope) {
    $rootScope.testState = 'test';
  }]);
And then in each of my individual services:
mockJSON = {'foo': 'myMockJSON'}
And on every method:
if ($rootScope.testState == 'test') {
  return mockJSON;
} else {
  // real service logic with $q/$http goes here
}
Then after grunt build, testState = 'test' gets removed.
This is clearly a relatively janky architecture. How can I avoid it? How can I have Grunt respond to the same endpoints as my app (some of which have dynamic params), apply some logic (if necessary), and serve out a JSON file (possibly dependent on path params)?
I've fixed this issue by using express to write a server that responds with static JSON.
First I created a directory in my project called 'api'. Within that directory I have the following files:
package.json:
{
  "name": "mockAPI",
  "version": "0.0.0",
  "dependencies": {
    "express": "~3.3.4"
  }
}
Then I run npm install in this directory.
index.js:
module.exports = require('./lib/server');
lib/server.js:
var express = require('express');
var app = express();

app.get('/my/endpoint', function(req, res) {
  res.json({'foo': 'myMockJSON'});
});

module.exports = app;
and finally in my global Gruntfile.js:
connect: {
  options: {
    port: 9000,
    hostname: 'localhost',
  },
  livereload: {
    options: {
      middleware: function (connect, options) {
        return [
          lrSnippet,
          mountFolder(connect, '.tmp'),
          mountFolder(connect, yeomanConfig.app),
          require('./api')
        ];
      }
    }
  },
Then the services make the requests, and the express server serves the correct JSON.
After grunt build, the express server is simply replaced by a Rails server.
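The question also asked about endpoints with dynamic params; Express route parameters cover that case (a sketch, with a made-up /users/:id endpoint):
app.get('/users/:id', function (req, res) {
  // The JSON served can depend on the path parameter.
  res.json({ id: req.params.id, name: 'Mock user ' + req.params.id });
});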
As of grunt-contrib-connect v0.7.0 you can also just add your custom middleware to the existing middleware stack, without having to rebuild the stack manually.
livereload: {
  options: {
    open: true,
    base: [
      '.tmp',
      '<%= config.app %>'
    ],
    middleware: function (connect, options, middlewares) {
      // inject a custom middleware into the array of default middlewares
      middlewares.push(function (req, res, next) {
        if (req.url !== '/my/endpoint') {
          return next();
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        // Note: serialize with JSON.stringify so the body is valid JSON
        // (the original single-quoted string was not).
        res.end(JSON.stringify({ foo: 'myMockJSON' }));
      });
      return middlewares;
    }
  }
},
See https://github.com/gruntjs/grunt-contrib-connect#middleware for the official documentation.
Alternatively you can use grunt-connect-proxy to proxy everything that is missing from your test server to an actual backend.
It's quite easy to install; the one thing to remember when adding the proxy to your livereload connect middleware is to add it last, like this:
middleware: function (connect) {
  return [
    lrSnippet,
    mountFolder(connect, '.tmp'),
    mountFolder(connect, yeomanConfig.app),
    proxySnippet
  ];
}
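For reference, proxySnippet comes from grunt-connect-proxy, and the proxied routes are declared in a proxies block; a sketch (the backend host and port are assumed):
// var proxySnippet = require('grunt-connect-proxy/lib/utils').proxyRequest;
connect: {
  proxies: [
    {
      context: '/service',  // everything under /service goes to the backend
      host: 'localhost',
      port: 3000,           // assumed Rails dev port
      changeOrigin: true
    }
  ],
  // ...livereload config with the middleware above...
}
Remember to run the plugin's configureProxies task before connect so the proxies are actually wired up.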
grunt-connect-prism is similar to the Ruby project VCR. It provides an easy way for front end developers to record HTTP responses returned by their API (or some other remote source) and replay them later. It's basically an HTTP cache, but for developers working on a Single Page Application (SPA). You can also generate stubs for API calls that don't exist, and populate them the way you want.
It's useful for mocking complex & high latency API calls during development. It's also useful when writing e2e tests for your SPA only, removing the server from the equation. This results in much faster execution of your e2e test suite.
Prism works by adding a custom connect middleware to the connect server provided by the grunt-contrib-connect plugin. While in 'record' mode it will generate a file per response on the filesystem with content like the following:
{
  "requestUrl": "/api/ponies",
  "contentType": "application/json",
  "statusCode": 200,
  "data": {
    "text": "my little ponies"
  }
}
DISCLAIMER: I'm the author of this project.
You can use an Apache proxy to connect your REST server with gruntjs.
Apache would do this:
proxy / -> gruntjs
proxy /service -> REST server
You would point your application at Apache, and the angular.js application would think it is talking to itself, so there is no cross-domain problem.
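In Apache terms that is a pair of ProxyPass rules, roughly like this (a sketch; the ports are assumed, and the more specific /service path must come before /):
ProxyPass        /service http://localhost:8080/service
ProxyPassReverse /service http://localhost:8080/service
ProxyPass        /        http://localhost:9000/
ProxyPassReverse /        http://localhost:9000/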
Here is a great tutorial on how to set this up:
http://alfrescoblog.com/2014/06/14/angular-js-activiti-webapp-with-activiti-rest/
Just my alternative way, based on Abraham P's answer. It does not need express installed within the 'api' folder, and it lets me separate the mock services into one file per area. For example, my 'api' folder contains 3 files:
api/
  index.js   // assigns all the "modules" so we can simply require them
  user.js    // all mocking for user
  product.js // all mocking for product
file user.js
var user = function(req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/user') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'role': 'admin'
      })
    );
  } else {
    next();
  }
};

module.exports = user;
file product.js
var product = function(req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/product') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'name': 'test',
        'category': 'test'
      })
    );
  } else {
    next();
  }
};

module.exports = product;
index.js just assigns all the "modules" and we simply require that.
module.exports = {
  product: require('./product.js'),
  user: require('./user.js')
};
My Gruntfile.js file
connect: {
  options: {
    port: 9000,
    // Change this to '0.0.0.0' to access the server from outside.
    hostname: 'localhost',
    livereload: 35729
  },
  livereload: {
    options: {
      open: true,
      middleware: function (connect) {
        return [
          connect.static('.tmp'),
          connect().use(
            '/bower_components',
            connect.static('./bower_components')
          ),
          connect.static(appConfig.app),
          require('./api').user,
          require('./api').product
        ];
      }
    }
  }