use workbox without using cdn - progressive-web-apps

Does anybody know how to use workbox without getting it from the CDN? I tried this...
add workbox-cli to my dependencies:
"workbox-cli": "^3.6.3"
which gets me all of the following dependencies
$ ls node_modules | grep workbox
workbox-background-sync
workbox-broadcast-cache-update
workbox-build
workbox-cacheable-response
workbox-cache-expiration
workbox-cli
workbox-core
workbox-google-analytics
workbox-navigation-preload
workbox-precaching
workbox-range-requests
workbox-routing
workbox-strategies
workbox-streams
workbox-sw
Then I replaced this line in the examples
importScripts('https://storage.googleapis.com/workbox-cdn/releases/3.6.1/workbox-sw.js');
with this
importScripts('workbox-sw.js');
after copying node_modules/workbox-sw/build/workbox-sw.js to the public folder
But now I realise, looking at the network tab, that this file still fetches all the other modules from the CDN.
(I thought it would be built with everything inside it.)
Can anybody tell me if there is an npm package somewhere that already has everything inside it? Or should I copy the modules I need from the npm folder, and somehow tie them all together myself? Or do I have to use the webpack plugin? (which I guess will only bundle the modules that I use)

(Update: Workbox v5 makes the process of using a local copy of the Workbox runtime much simpler, and in most cases, it's the default.)
There's one more step that's required. The "Using Local Workbox Files Instead of CDN" section of the docs has the details:
If you don’t want to use the CDN, it’s easy enough to switch to
Workbox files hosted on your own domain.
The simplest approach is to get the files via workbox-cli's copyLibraries
command or from a GitHub Release, and then tell workbox-sw where to find
these files via the modulePathPrefix config option.
If you put the files under /third_party/workbox/, you would use them
like so:
importScripts('/third_party/workbox/workbox-sw.js');
workbox.setConfig({modulePathPrefix: '/third_party/workbox/'});
With this, you’ll use only the local Workbox files.
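Since workbox-cli is already in the dependencies above, its copyLibraries command can copy those files into place for you. A minimal sketch, assuming the service worker is served from a public/ folder (the output path is just an example):
npx workbox copyLibraries public/third_party/workbox/
Note that the command may nest the files in a versioned subdirectory (e.g. workbox-v3.6.3/); if so, modulePathPrefix should point at that subdirectory.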

Related

Add a local dependency to an ACI in acbuild

I'm currently experimenting with ACI construction for rkt-containers. During my experiments I've built some containers especially for the use as a dependency. I now want to use these .aci images as a dependency for other images. As these files are fetched by name (for example "quay.io/alpine-sh"), I wonder if there is a way to refer to actual local .aci files.
Is there a way to import these .aci files from the local filesystem or do I have to set up a local webserver to serve as a repository?
Dependencies in acbuild (at least up to version 0.3) can be defined only as HTTP links,
so you need to make your ACI available over HTTP to use it as a dependency in acbuild.
It's not hard to publish your ACI to make it available over HTTP; the image archive can actually be hosted on GitHub or Bitbucket.
Recent versions of acbuild seem to support caching,
since the related issue (cache dependencies across acbuild invocations #144) is closed.
Cached ACIs are stored in the depstore-tar and depstore-expanded directories inside $CONTEXT_ROOT/.acbuild. If the contents of those directories are somehow preserved between acbuild invocations,
ACIs won't be downloaded over and over again.
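A minimal sketch of that idea (untested; the cache location is arbitrary, and the .acbuild paths are the ones mentioned above, assuming the build runs from the context root):
CACHE=/var/cache/acbuild-deps   # any persistent directory will do
mkdir -p "$CACHE"
acbuild begin
# restore a previously saved dependency cache into the build context
cp -r "$CACHE"/depstore-tar "$CACHE"/depstore-expanded .acbuild/ 2>/dev/null || true
# ... the usual acbuild dependency add / copy / run steps ...
# save the cache again for the next invocation
cp -r .acbuild/depstore-tar .acbuild/depstore-expanded "$CACHE"/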
When I played with acbuild I was annoyed that it re-downloads dependencies on every build,
so I've written a script, https://bitbucket.org/legeyda/anyorigin/src/tip/acbuild-plus,
which configures symbolic links inside $CONTEXT_ROOT/.acbuild to point to
persistent directories inside /var/lib/acbuild/hack. The usage is simple:
acbuild begin
acbuild-plus init target
After that, all dependencies will be cached by acbuild.
You can also manually install an ACI file so that it is available to acbuild.
This is as simple as
acbuild-plus install <your-image.aci>
I've tested the script with acbuild v0.3.0.
You can find an example of its use in the Makefile next to acbuild-plus in the repository.

Babel transpiler global from CLI

I'm trying to get to grips with the Babel transpiler. Its docs start by telling you how to install it globally, then shortly thereafter tell you that you should never do this, and never explain how to run it that way. Well, I believe I wish to run it that way (because the presence of the node_modules directory, or possibly the .babelrc file, cripples Brackets, which is the editor I'm currently needing to use).
I can run babel from the global installation easily enough, but it doesn't do anything. The only way I've succeeded in getting it to do any actual translation has been using the local invocation with the .babelrc file, which of course kills my editor (and yes, I actually do have to use that, and I'm not creating a node-based project in any other way, just plain ES6).
Is there some way to use the command line to provide the information that the .babelrc file specifies (and thereby have something other than simply file copying)? Or some other way to get babel to do what I need without physical presence in my source tree?
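For reference, the Babel CLI does accept preset and plugin options directly, so the sort of .babelrc-free invocation being asked about might look roughly like this (the preset name and directories are placeholders, and a globally installed preset may need to be referenced by its full filesystem path):
npm install -g babel-cli babel-preset-es2015
babel --presets es2015 src --out-dir build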

Ember cli - use sass addon in less project

I use broccoli-less in my ember cli project and would like to use an addon (ember-cli-materialize), which uses broccoli-sass.
After installing the addon, I get: File to read not found or unreadable ../app.scss, because I also have an app.less file in my styles dir.
As I understand it, this commit (Allow multiple preprocessors per type) should make it possible, although I might be missing something. Has anyone managed to use ember-cli with multiple preprocessors, and what changes are needed?
Ember-cli version: 1.13.1
Ember version: 1.12.0
Thanks
I know your circumstance is different than mine but this may help others or spur a better solution. I was added to a dev team to polish up an app already styled using LESS. I favor SASS and tried to use ember-cli-sass alongside ember-cli-less without any success.
You may want to look further into Ember CLI's app.import.
By adding your import configuration to ember-cli-build.js with the above, you can leverage either your bower-components directory (if used) or vendor directory, to import a compiled CSS file (from Sass source files) that will build alongside the project quite nicely with a simple sass --watch <input:output> command.
The LESS files are ultimately compiled to app.css, and your SASS files to vendor.css (make sure you link to the stylesheet in your index page/template).
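As a rough illustration of that setup (the paths here are only examples): keep the Sass source outside the styles directory that the LESS preprocessor owns, compile it into vendor/, and pull the result in with app.import('vendor/app-sass.css') in ember-cli-build.js:
sass --watch app/styles-sass/app.scss:vendor/app-sass.css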

how to run doctrine.php in command prompt

I'm trying to use Doctrine with Zend. I have copied the doctrine.php and doctrine files into the scripts folder in the Source Files folder.
However, when I type the command "php doctrine.php" at the command prompt from inside the scripts folder, nothing happens: there is no error printed, the cursor just goes to the next line. Can someone please tell me how I can use doctrine.php?
When using Guilherme's integration suite, you need to do a couple of things.
Download / clone the Doctrine Common, DBAL and ORM libraries and make sure they're available in your include path. For this, I usually just copy the lib/Doctrine code from each into my project's library folder. If using git, you can add them as subtree splits but that's a topic for another time ;-)
You also need the Symfony Console and Yaml namespaces. Again, it's easiest to place them in your project's library folder under library/Symfony/Component/Console and library/Symfony/Component/Yaml. These usually come as submodule dependencies in the Doctrine libraries, but you can also get them from their GitHub pages (Console and Yaml).
Remove the bootstrap('Config') call from the doctrine.php script. Don't know what Guilherme was thinking there :-)
That's it, from there it should work as expected.
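With the libraries laid out as described (Doctrine under library/Doctrine and the two Symfony components under library/Symfony/Component), running the script from the scripts folder should now print the Symfony Console command listing instead of exiting silently, and the registered Doctrine tasks become usable, for example (orm:schema-tool:create is just one of the commands Doctrine 2 registers):
cd scripts
php doctrine.php
php doctrine.php orm:schema-tool:create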

Developing with Qooxdoo and multiple developers

I'm interested in Qooxdoo as a possible web development framework. I have downloaded the SDK and installed it in a central location on my PC as I expect to use it on multiple projects. I used the create-application.py script to make a new test application and added all the generated files to my version control system.
I would like to be able to collaborate on this with other developers on other PCs. They are likely to have the SDK installed in a different location. The auto-generated files in Qooxdoo seem to include the SDK path in both config.json and generator.py: if the SDK path moves, the generator.py script stops working. generator.py doesn't seem to be too much of a problem as it looks in config.json for an updated path, but I'm not sure how best to handle config.json.
The only options I've thought of so far are:
1. Exclude it from the VCS, but there doesn't seem to be a script to regenerate it automatically, so this could be dangerous.
2. Add it to the VCS but have each developer modify the path line and accept that it might need to be adjusted whenever changes are merged.
3. Change config.json to be a path and a single 'include' line that points to a second file that contains all the non-SDK-path related information.
4. Use a relative path to the SDK and keep a separate, closely located copy of the SDK for every project that uses it.
Approach 1 would be ideal if the generation script existed; approach 2 is really nasty; I couldn't get approach 3 to work and approach 4 is a bit messy as it means multiple copies of the SDK littered about the place.
The Android SDK seems to deal with this very well (using approach 1), with the SDK path in its own file with a script that automatically generates that file. As far as I can tell, Qooxdoo puts lots of other important information in config.json and the only way to automatically generate that file is to create a new project.
Is there a better/recommended way to deal with this?
As an alternative to using symlinks, you can override the QOOXDOO_PATH macro on the command line:
./generate.py source -m QOOXDOO_PATH:<local_path_to_qooxdoo>
(Depending on the shell you are using you might have to apply some proper quoting of the -m argument). This way, every programmer can use his locally installed qooxdoo SDK. You can even drop the QOOXDOO_PATH entry from config.json to enforce this.
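For example, in bash the whole -m value can be passed as one quoted word (the SDK path is a placeholder):
./generate.py source -m 'QOOXDOO_PATH:/home/alice/devel/qooxdoo-sdk'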
We work with a symbolic link pointing to the SDK; config.json contains just the path of the link.
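A sketch of that arrangement (the SDK location is a placeholder): each developer creates the link once in the project root, and config.json's QOOXDOO_PATH only ever refers to the link name, so the real SDK location can differ per machine.
ln -s /home/alice/sdks/qooxdoo-sdk ./qooxdoo-sdk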