Progressive Web Application with webpack

In webpack version 5 I wanted to build a PWA. According to the webpack documentation:
https://webpack.js.org/guides/progressive-web-application/
I should use Workbox, but Workbox only covers offline support. If I want to use other features such as Notifications, IndexedDB, etc., what code should I add to webpack.config.js, and how do I use manifest.json with webpack?

Assuming you want the precaching behavior described in https://webpack.js.org/guides/progressive-web-application/ and you want to customize your service worker, the cleanest approach is to use the InjectManifest plugin from workbox-webpack-plugin and pass the path to your custom service worker file as the swSrc option.
The InjectManifest plugin will take care of bundling your swSrc file, and it will replace self.__WB_MANIFEST with a precache manifest based on your webpack compilation's assets.
// webpack.config.js
const path = require('path');
const {InjectManifest} = require('workbox-webpack-plugin');

module.exports = {
  // Your other webpack config goes here.
  plugins: [
    new InjectManifest({
      swSrc: './service-worker.js',
      swDest: 'service-worker.js',
      // Any other config if needed.
    }),
  ],
};
// service-worker.js
import {precacheAndRoute} from 'workbox-precaching';
// Add in additional imports here as needed.
// This line of code ensures that everything in your webpack
// compilation gets precached and served.
precacheAndRoute(self.__WB_MANIFEST);
// Your custom service worker code goes here.
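As for the other parts of the question: Notifications and IndexedDB don't need anything special in webpack.config.js; they are plain browser APIs that you call from your (bundled) service worker or page code. As a minimal sketch, a push notification handler in the same custom service-worker.js might look like this (the JSON payload shape here is an assumption, and notification permission must already have been granted):

// service-worker.js (continued)
self.addEventListener('push', (event) => {
  // Assumes the push payload is JSON with optional title/body fields.
  const data = event.data ? event.data.json() : {};
  event.waitUntil(
    self.registration.showNotification(data.title || 'My PWA', {
      body: data.body || '',
    })
  );
});

For manifest.json, one common approach (a sketch, assuming the file lives at public/manifest.json and copy-webpack-plugin is installed) is to copy it into the output directory unchanged and reference it from your HTML:

// webpack.config.js (additional plugin)
const CopyPlugin = require('copy-webpack-plugin');

module.exports = {
  // ...
  plugins: [
    // Copies the web app manifest into the build output as-is.
    new CopyPlugin({
      patterns: [{from: 'public/manifest.json', to: 'manifest.json'}],
    }),
  ],
};

Then link it from your HTML template with <link rel="manifest" href="/manifest.json">.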

Caching external downloads with Workbox

I'm working on a GatsbyJS site using gatsby-plugin-offline, which is available at example.com, and I would like to make the PDF files that I link to from example.com, but which are hosted at download.example.com/example.pdf, available offline. Is that possible?
Yes, it's possible. I'm not 100% familiar with gatsby-plugin-offline's configuration, but it looks like https://www.gatsbyjs.org/packages/gatsby-plugin-offline/#available-options describes a process for appending additional service worker logic to the end of its default configuration:
plugins: [{
  resolve: `gatsby-plugin-offline`,
  options: {
    appendScript: require.resolve(`src/custom-sw-code.js`),
  },
}]
Then in src/custom-sw-code.js:
workbox.routing.registerRoute(
  ({url}) => url.pathname.endsWith('.pdf'),
  // Use StaleWhileRevalidate, CacheFirst, etc. as desired.
  new workbox.strategies.StaleWhileRevalidate({cacheName: 'pdfs'})
);
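One caveat: if the PDFs on download.example.com are served without CORS headers, the cross-origin responses will be opaque, and a CacheFirst strategy, for instance, won't cache them unless you opt in explicitly. A hedged sketch (the hostname check is illustrative, and in older Workbox versions the plugin class is workbox.cacheableResponse.Plugin rather than CacheableResponsePlugin):

workbox.routing.registerRoute(
  ({url}) => url.hostname === 'download.example.com' && url.pathname.endsWith('.pdf'),
  new workbox.strategies.CacheFirst({
    cacheName: 'pdfs',
    plugins: [
      // Opt in to caching opaque (status 0) responses from the other origin.
      new workbox.cacheableResponse.CacheableResponsePlugin({statuses: [0, 200]}),
    ],
  })
);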

How to implement a service worker in SFCC (Demandware)

I was wondering if anyone here has experience with implementing a service worker in SFCC/Demandware.
I generate a service worker with webpack using sw-precache-webpack-plugin.
The problem is that a service worker should be served from the root of the domain, i.e. site.com/sw.js.
JS files normally end up in the static/ folder.
Does anyone have an idea how to serve this JS file from the root of the project in Demandware/SFCC?
Unfortunately, registering a service worker with a scope that is higher up the path hierarchy than the service worker file itself does not work (as stated on MDN):
The service worker will only catch requests from clients under the service worker's scope.
The max scope for a service worker is the location of the worker.
(Source: https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers)
Solution
Here is a suggestion for a working approach for serving "/sw.js" in Demandware (Salesforce):
Create a new controller (or pipeline), e.g. "ServiceWorker-GetFile"; the response should be the file content, which can be read from whatever source you wish:
a content asset (dw.content.ContentMgr.getContent());
a library file (dw.content.ContentMgr.getContent(), or directly reading a file with dw.io.File / dw.io.FileReader);
or even a site preference (although I wouldn't recommend it).
Create an entry in Business Manager / Merchant Tools / SEO / Aliases to route "/sw.js" to "ServiceWorker-GetFile", i.e. use something along the lines of:
{
  ...
  "your-host" : [
    ...,
    {
      "if-site-path": "/sw.js",
      "pipeline": "ServiceWorker-GetFile"
    }
  ]
}
This may seem like unnecessary overhead, but it was the only way I could find to serve files with a root path in the URI.
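For illustration, here is a minimal sketch of what such a controller might look like in SFRA-style JavaScript. All names, the file location, and the res helper methods used here are assumptions, not verified against a specific SFCC version:

// controllers/ServiceWorker.js (hypothetical sketch)
'use strict';

var server = require('server');

server.get('GetFile', function (req, res, next) {
    var File = require('dw/io/File');
    var FileReader = require('dw/io/FileReader');

    // Assumed location: a sw.js stored under the site's library folder.
    var file = new File(File.LIBRARIES + '/ServiceWorker/sw.js');

    if (file.exists()) {
        var reader = new FileReader(file);
        res.setContentType('application/javascript');
        res.print(reader.readString());
        reader.close();
    } else {
        res.setStatusCode(404);
    }
    next();
});

module.exports = server.exports();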
Serving other root files as well
By expanding the controller (renaming it to, say, "Content-GetFile" and adding GET/POST parameters like "name" and/or "source"), this could conveniently be used for other root files as well ("/manifest.json", "/.well-known/assetlinks.json", etc.). In the next example of Business Manager / ... / Aliases, Content-GetFile accepts two parameters: "name" (a file name in the content library or a content asset ID) and "source" ("file" or "asset"):
...
{
  "if-site-path": "/sw.js",
  "pipeline": "Content-GetFile",
  "params": {
    "name": "/ServiceWorker/sw.js",
    "source": "file"
  }
},
{
  "if-site-path": "/manifest.json",
  "pipeline": "Content-GetFile",
  "params": {
    "name": "MANIFEST_JSON",
    "source": "asset"
  }
}
Note that your code should handle the base paths of the resources appropriately (e.g. "/ServiceWorker/sw.js" from the above example does not say much by itself; you should know whether this is a path in a content library or a path relative to "cartridges//static/default/js/").
Dynamic content
Since the suggested approach uses a controller, you can dynamically process the content before serving it to the user (e.g. if you need to add/remove the "/v12435145145/" part from Demandware links). The sky is the limit. :)
I'm currently messing around with the service workers on DW as well.
In my case I added the script directly inside a footer.isml file, like this:
<script>
  // Init service worker
  if ('serviceWorker' in navigator) {
    window.addEventListener('load', () => {
      navigator.serviceWorker
        .register("${URLUtils.staticURL('/lib/sw/sw.js')}")
        .then(registration => {
          console.log(
            `Service Worker registered! Scope: ${registration.scope}`
          );
        })
        .catch(err => {
          console.log(`Service Worker registration failed: ${err}`);
        });
    });
  }
</script>
This works for me; at least I can see the "Service Worker registered" message.
I also had some issues with the SSL certificate: my development environment doesn't have a proper SSL certificate but uses HTTPS routes, so Chrome was complaining about it. I needed to run Chrome from the terminal using this command:
/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --user-data-dir=/tmp/foo --ignore-certificate-errors --unsafely-treat-insecure-origin-as-secure=[YOUR DOMAIN]
Unfortunately I'm not able to make any line of code inside that service worker file work. I even tried Safari, which has a Service Workers option in the Develop menu, but it doesn't show any service worker running.
I hope this helps you.

How can I know the base URL used in a running test in Protractor?

I'm trying to do navigation tests in Protractor and don't see any consistency between the baseUrl in the config and the URL used in the test.
protractor.conf.js
exports.config = {
  baseUrl: 'http://localhost:4200/'
};
navbar.e2e-spec.ts
import { NavbarPage } from './navbar.po';
import * as protractor from './../protractor.conf.js';

describe('navbar', () => {
  let navbar: NavbarPage;
  const baseUrl = protractor.config.baseUrl;

  beforeEach(() => {
    navbar = new NavbarPage();
    browser.get('/');
  });

  it(`should see showcase nav item, be able to (click) it,
      and expect to be navigated to showcase page`, () => {
    const anchorShowcase = navbar.anchorShowcase;
    expect(anchorShowcase.isDisplayed()).toBe(true);
    anchorShowcase.click();
    browser.waitForAngular();
    // baseUrl already ends with a slash, so don't prepend another one.
    expect(browser.getCurrentUrl()).toBe(baseUrl + 'showcase');
  });
});
However, when I run the e2e test it uses a different port:
** NG Live Development Server is listening on localhost:49154, open your browser on http://localhost:49154/ **
Why is the test URL set to port 49154? This apparently is what you get if you start a new angular-cli project: https://github.com/angular/angular-cli
How can I get control over the baseUrl? Or is http://localhost:49154/ safe to use for all my Angular CLI projects?
By default, when you run ng e2e, the command treats the --serve value as true. That means it builds and serves the app at a particular URL, not at the baseUrl you passed in protractor.conf.js.
That is why you are getting a random URL like http://localhost:49154/ when testing your app.
If you don't want a build during the test and want to test an existing build (URL) like http://localhost:4200/, you need to pass --no-serve on the command line, and the baseUrl will be picked up from protractor.conf.js.
You can also pass a base URL on the command line, as below. Note that this is not baseUrl but --base-href=:
ng e2e --no-serve --base-href=https://someurl.com:8080
When running Angular CLI's ng e2e command, the wiki states that the default port will be random, as seen here:
https://github.com/angular/angular-cli/wiki/e2e
(under the serve submenu).
The e2e command can take all the same arguments as serve, so to keep the port the same, just pass --port my-port-number to the ng e2e command.
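For example (the port number here is just an illustration):
ng e2e --port 4200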
As for whether that port is safe to use: I wouldn't rely on it; it is just a random port, after all. I would stick to the default unless you have a use case for changing it. The port is mainly relevant for the dev server, not so much for wherever the production code runs.
Aniruddha Das's solution doesn't work anymore, as that option is gone as of Angular CLI 6.x; you can try the following:
ng e2e --dev-server-target=
Please see the following reference.

Static resource reload with akka-http

In short: is it possible to reload static resources using akka-http?
A bit more:
I have a Scala project.
I'm using the App object to launch my Main class.
I'm using getFromResourceDirectory to locate my resource folder.
What I would like to have is to hot-swap my static resources during development.
For example, I have index.html or application.js which I change, and I want to see the changes after refreshing my browser, without restarting the server. What is the best practice for doing such a thing?
I know that Play! allows this, but I don't want to base my project on Play! just because of that.
Two options:
Easiest: use the getFromDirectory directive instead when running locally and point it to the path where the files you want to 'hotload' live. It serves them directly from the file system, so every time you change a file and load it through Akka HTTP, you get the latest version.
getFromResourceDirectory loads files from the classpath; the resources are available because SBT copies them into the class directory under target every time you build (copyResources). You could configure sbt using unmanagedClasspath to make it include the static resource directory in the classpath. If you also want to package the resources in the artifact when running package, however, this requires some more sbt trickery (if you just put src/resources in unmanagedClasspath, it will depend on classpath ordering whether the copied files or the modified ones are used).
I couldn't get it to work by adding to unmanagedClasspath, so I instead used getFromDirectory. You can use getFromDirectory as a fallback if getFromResourceDirectory fails, like this:
val route =
  pathSingleSlash {
    getFromResource("static/index.html") ~
      getFromFile("../website/static/index.html")
  } ~
  getFromResourceDirectory("static") ~
  getFromDirectory("../website/static")
First it tries to look up the file in the static resource directory and if that fails, then checks if ../website/static has the file.
The code below tries to find the file in the directory "staticContentDir". If the file is found, it is sent back to the client. If it is not found, it falls back to fetching the file from the directory "site" on the classpath.
The user-facing URL is: http://server:port/site/path/to/file.ext
The /site/ part comes from "staticPath".
val staticContentDir = calculateStaticPath()
val staticPath = "site"

val routes = pathPrefix(staticPath) {
  entity(as[HttpRequest]) { requestData =>
    val fullPath = requestData.uri.path
    encodeResponse {
      if (Files.exists(staticContentDir.resolve(fullPath.toString().replaceFirst(s"/$staticPath/", "")))) {
        getFromBrowseableDirectory(staticContentDir.toString)
      } else {
        getFromResourceDirectory("site")
      }
    }
  }
}
I hope it is clear.

Sails.js HOWTO: implement logging for HTTP requests

Sails.js's default logging is poor and doesn't show HTTP request logs (even on verbose). What is the best way to implement HTTP request logging to the console so I can see whether I am getting malformed requests? Express's default logging would be enough.
I would prefer a Sails.js configuration way of doing it rather than a change-the-source-code approach, if possible.
Has anyone had experience with this? My Google searches seem oddly lacking in information.
Running Sails v0.9.8 on Mac OS X.
There's no Sails config option to log every request, but you can add a quick logging route at the top of config/routes.js that should do the trick:
// config/routes.js
'/*': function(req, res, next) {sails.log.verbose(req.method, req.url); next();}
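In context, the top of config/routes.js would look something like this (a sketch; the catch-all must come before your other routes so it runs for every request):

module.exports.routes = {

  // Log every request, then pass control on to the matching route.
  '/*': function (req, res, next) {
    sails.log.verbose(req.method, req.url);
    next();
  },

  // ... your existing routes go below.
};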
Maybe too late, but for future reference: I'm using Sails 0.11, and you can configure this in the middleware section of the config/http.js file.
Add this function (in fact it comes as an example in that file):
// Logs each request to the console
requestLogger: function (req, res, next) {
  console.log("Requested :: ", req.method, req.url);
  return next();
},
And add it to the order array:
order: [
  'startRequestTimer',
  'cookieParser',
  'session',
  'requestLogger', // Just here
  'bodyParser',
  'handleBodyParserError',
  'compress',
  'methodOverride',
  'poweredBy',
  '$custom',
  'router',
  'www',
  'favicon',
  '404',
  '500'
]
I forked the sails-hook-requestlogger module to write all the request logs (access logs) to a file:
sails-hook-requestlogger-file
All you have to do is npm install sails-hook-requestlogger-file and you are good to go!
Usage
Just lift your app as normal and all your server requests will be logged, with useful information such as response time, straight to your console. By default it is activated in your dev environment but deactivated in production.
Configuration
By default, configuration lives in sails.config.requestloggerfile
You can create config/requestlogger.js and override these defaults:
format ((string)): Defines which logging format to use. Defaults to dev.
logLocation ((string)): Defines where to log: console or file. Defaults to console.
fileLocation ((string)): Location of the log file relative to the project root (when file is specified in logLocation). This has no effect if console is specified in logLocation.
inDevelopment ((boolean)): Whether or not to log requests in the development environment. Defaults to true.
inProduction ((boolean)): Whether or not to log requests in the production environment. Defaults to false.
Example config/requestlogger.js file:
module.exports.requestloggerfile = {
  // See https://github.com/expressjs/morgan#predefined-formats for more formats
  format: ':remote-addr - [:date[clf]] ":method :url" :status :response-time ms ":user-agent"',
  logLocation: 'file',
  fileLocation: '/var/log/myapp/access.log',
  inDevelopment: true,
  inProduction: true
};
Hope it helps someone :)
I found this matched my needs - it uses the Morgan module for Express and hooks it all up for you: https://www.npmjs.com/package/sails-hook-requestlogger