Reputation: 6113
I have a progressive web app (PWA) consisting of several files including index.html, manifest.json, bundle.js and serviceWorker.js. I update my app by uploading all these files to my host. In case it matters, I am using Firebase, so I use firebase deploy to upload the files.
Usually everything works correctly: when an existing user opens the app they still see the old version, but in the background the new service worker installs and adds any changed files to the cache. Then, when the user next opens the app, it activates and they see the new version.
But I have a problem when a user opens the app a short time after I deploy it. What seems to happen is: the host delivers the new serviceWorker.js but the old bundle.js, and so the install step puts the old bundle.js in its new cache. The user gets the old functionality or, worse, might get an app made up of an inconsistent mixture of new and old files.
I guess it could be argued that it is the host's fault for not updating atomically, but I have no control over Firebase. And it does not sound possible anyway because the browser is sending a series of independent fetches and there can be no guarantee that they will all return a consistent version.
In case it helps, here is my serviceWorker.js. The cacheName strings such as "app1-a0f43550e414" are generated by my build pipeline. Here a0f43550e414 is the hash of the latest bundle.js, so that the cache is only updated if the content of bundle.js has changed.
"use strict";
const appName = "app1";
const cacheLookup = {
"app1-aefa820f62d2": "/",
"app1-a0f43550e414": "bundle.js",
"app1-23d94a4a7388": "manifest.json"
};
self.addEventListener("install", function (event) {
event.waitUntil(
Promise.all(Object.keys(cacheLookup).map(cacheName =>
caches.open(cacheName)
.then(cache => cache.add(cacheLookup[cacheName]))
))
);
});
self.addEventListener("activate", event => {
event.waitUntil(
caches.keys().then(cacheNames =>
Promise.all(cacheNames.map(cacheName => {
if (cacheLookup[cacheName]) {
// cacheName holds a file still needed by this version
} else if (!cacheName.startsWith(appName + "-")) {
// Do not delete the cache of other apps at same scope
} else {
console.log("Deleting out of date cache:", cacheName);
return caches.delete(cacheName);
}
}))
)
);
});
const handleCacheMiss = request =>
new Promise((_, reject) => {
reject(Error("Not in service worker cacheLookup: " + request.url));
});
self.addEventListener("fetch", event => {
const request = event.request;
event.respondWith(
caches.match(request).then(cachedResponse =>
cachedResponse || handleCacheMiss(request)
)
);
});
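For reference, the hash suffixes such as a0f43550e414 come from hashing each file's contents at build time. A simplified sketch of how such a cacheLookup can be generated (a hypothetical Node.js script, not my exact pipeline):

// build-hashes.js - simplified sketch, not my exact build pipeline
const crypto = require("crypto");
const fs = require("fs");

const appName = "app1";
// Assumes the root URL "/" is served from index.html
const files = { "/": "index.html", "bundle.js": "bundle.js", "manifest.json": "manifest.json" };

const cacheLookup = {};
for (const [url, path] of Object.entries(files)) {
  // Hash the file contents so the cache name only changes when the content changes
  const hash = crypto.createHash("sha256").update(fs.readFileSync(path)).digest("hex").slice(0, 12);
  cacheLookup[appName + "-" + hash] = url;
}

// The resulting object is then injected into serviceWorker.js
console.log(JSON.stringify(cacheLookup, null, 2));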
I have considered bundling all my HTML, CSS and JavaScript into a giant file so it cannot be inconsistent. But a PWA needs several supporting files that cannot be bundled, including the service worker, manifest and icons. If I bundle all that I can, the user can still get stuck with an old version of the bundle and still have inconsistent supporting files. And anyway, in the future I would like to increase granularity by doing less bundling and having more files, so that on a typical update only a few small files would need to be fetched.
I have also considered uploading bundle.js and the other files with a different filename for each version. The service worker's fetch handler could hide the name change, so other files like index.html can still refer to it as bundle.js. But I don't see how this works the first time a browser loads the app. And I don't think you can rename index.html or manifest.json.
Upvotes: 1
Views: 1642
Reputation: 56044
It sounds like your request for bundle.js inside of your install handler might be fulfilled by the HTTP cache, instead of via the network.
You can try changing this current snippet:
cache.add(cacheLookup[cacheName])
to explicitly create a Request object that has its cache mode set to 'reload', to ensure that the response isn't provided by the HTTP cache:
cache.add(new Request(cacheLookup[cacheName], {cache: 'reload'}))
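In the context of your existing install handler, that change would look roughly like this (only the cache.add() call changes):

self.addEventListener("install", function (event) {
  event.waitUntil(
    Promise.all(Object.keys(cacheLookup).map(cacheName =>
      caches.open(cacheName)
        // 'reload' bypasses the HTTP cache and fetches a fresh copy from the network
        .then(cache => cache.add(new Request(cacheLookup[cacheName], {cache: 'reload'})))
    ))
  );
});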
Alternatively, if you're concerned about non-atomic global deployments and you can go through the effort to generate sha256 or better hashes as part of your build process, you can make use of subresource integrity (SRI) to ensure that you're getting the correct response bytes from the network that your new service worker expects.
Adapting your code would look something like the following, where you'd actually have to generate the correct sha256 hashes for each file you care about at build time:
const cacheLookup = {
  "sha256-[hash]": "/",
  "sha256-[hash]": "bundle.js",
  "sha256-[hash]": "manifest.json"
};
// Later...
cache.add(new Request(cacheLookup[cacheName], {
  cache: 'reload',
  integrity: cacheName,
}));
If there's an SRI mismatch, then the request for a given resource will fail, and that will cause the cache.add() to reject, which will in turn cause the overall service worker installation to fail. Service worker installation will be retried the next time an update check happens, at which point (hopefully!) the deployment will be finished and the SRI will be valid.
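If it's useful, SRI-compatible values (the "sha256-" prefix followed by a base64-encoded digest) can be computed at build time with a small script along these lines (a rough Node.js sketch; adapt it to your own build tooling):

// generate-sri.js - rough sketch for computing SRI values at build time
const crypto = require("crypto");
const fs = require("fs");

function sriHash(path) {
  // An SRI value is "sha256-" plus the base64-encoded SHA-256 digest of the file contents
  const digest = crypto.createHash("sha256").update(fs.readFileSync(path)).digest("base64");
  return "sha256-" + digest;
}

// Assumes "/" is served from index.html
const cacheLookup = {
  [sriHash("index.html")]: "/",
  [sriHash("bundle.js")]: "bundle.js",
  [sriHash("manifest.json")]: "manifest.json"
};

console.log(JSON.stringify(cacheLookup, null, 2));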
Upvotes: 2