Reputation: 3275
I have a Service Worker that receives push messages from Firebase FCM. Each message either shows a notification or cancels one. The user can have multiple devices (that's what the cancel is for: when the user has already acted on notification A, I try to dismiss it on all their devices).
The problem I have is when one of the user's devices is offline or turned off altogether. Once the device goes online, Firebase delivers all the messages it couldn't deliver before, so you'd get, for example, a burst of several 'show' messages immediately followed by the 'cancel' for one of them.
The SW receives these messages in rapid succession. The problem is that cancelling a notification is a lot faster than showing one (~2ms vs ~16ms). So the fourth message is handled before the first (or second) message has actually created the notification, with the result that the notification is never cancelled.
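To illustrate, here's a minimal sketch of the kind of naive handler that exhibits this race; the data fields (action, tag, title) are hypothetical stand-ins, not the actual message format:
// Naive push handler: every message is handled independently,
// so a fast 'cancel' can race past a slower 'show'.
self.addEventListener('push', function(event) {
    var data = event.data.json().data; // assumed shape: {action, tag, title}
    if (data.action === 'cancel') {
        // Finding and closing a notification is quick (~2ms)..
        event.waitUntil(
            self.registration.getNotifications({tag: data.tag})
                .then(function(notifications) {
                    notifications.forEach(function(n) { n.close(); });
                })
        );
    } else {
        // ..while showing one is comparatively slow (~16ms), so the
        // cancel above may find nothing to close yet.
        event.waitUntil(self.registration.showNotification(data.title, {tag: data.tag}));
    }
});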
// EDIT: Heavily edited question below. Added example code and broke down my questions. Also edited the title to better reflect my actual underlying question.
I tried pushing the messages onto a queue and handling them one by one. It turns out this can become a bit complicated, because everything in a SW is async and, to make matters worse, the SW can be killed at any time the browser thinks it has finished its work. I tried to store the queue persistently, but since localStorage is unavailable in a SW I had to use the async IndexedDB API, which means even more async calls that could cause problems (like losing queue items).
It's also possible that event.waitUntil() thinks my worker is done before it's actually done, because I'm not correctly 'passing the torch' from promise to promise..
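The 'torch passing' concern boils down to returning each intermediate promise, so that the promise handed to event.waitUntil() only settles once the last step has finished. A minimal illustration, with stepOne/stepTwo as hypothetical placeholders:
// Inside a push handler -- broken: stepTwo()'s promise is not returned,
// so the chain resolves as soon as stepOne() finishes and the browser
// may stop the SW while stepTwo() is still running:
event.waitUntil(stepOne().then(function() { stepTwo(); }));

// Correct: returning stepTwo() links it into the chain, keeping the
// SW alive until both steps are done:
event.waitUntil(stepOne().then(function() { return stepTwo(); }));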
Here's a (heavily) simplified version of the code I tried:
// Use localforage, a simplified promise API on top of IndexedDB
importScripts("localforage.min.js");

// In memory..
var mQueue = []; // only accessed through get-/setQueue()
var mQueueBusy = false;

// Receive push messages..
self.addEventListener('push', function(event) {
    var data = event.data.json().data;
    event.waitUntil(addToQueue(data));
});

// Add to queue
function addToQueue(data) {
    return new Promise(function(resolve, reject) {
        // Get queue..
        getQueue()
            .then(function(queue) {
                // Push + store..
                queue.push(data);
                setQueue(queue)
                    .then(function(queue) {
                        handleQueue()
                            .then(function() {
                                resolve();
                            });
                    });
            });
    });
}

// Handle queue
function handleQueue(force) {
    return new Promise(function(resolve, reject) {
        // Check if busy
        if (mQueueBusy && !force) {
            resolve();
        } else {
            // Set busy..
            mQueueBusy = true;
            // Get queue..
            getQueue()
                .then(function(queue) {
                    // Check if we're done..
                    if (!queue || queue.length <= 0) {
                        // Release the busy flag so the next push starts a new run
                        mQueueBusy = false;
                        resolve();
                    } else {
                        // Shift first item
                        var queuedData = queue.shift();
                        // Store before continuing..
                        setQueue(queue)
                            .then(function(queue) {
                                // Now do work here..
                                doSomething(queuedData)
                                    .then(function() {
                                        // Call handleQueue with 'force=true' to go past (mQueueBusy)
                                        resolve(handleQueue(true));
                                    });
                            });
                    }
                });
        }
    });
}

// Get queue
function getQueue() {
    return new Promise(function(resolve, reject) {
        // Get from memory if it's there..
        if (mQueue && mQueue.length > 0) {
            resolve(mQueue);
        }
        // Read from IndexedDB..
        else {
            localforage.getItem("queue")
                .then(function(val) {
                    // localforage deserializes for us; no JSON.parse needed
                    mQueue = val || [];
                    resolve(mQueue);
                });
        }
    });
}

// Set queue
function setQueue(queue) {
    return new Promise(function(resolve, reject) {
        // Store queue to memory..
        mQueue = queue;
        // Write to IndexedDB..
        localforage.setItem("queue", mQueue)
            .then(function() {
                resolve(mQueue);
            });
    });
}

// Do something..
function doSomething(queuedData) {
    return new Promise(function(resolve, reject) {
        // just print something and resolve
        console.log(queuedData);
        resolve();
    });
}
The short version of my question, with my particular use case in mind, is: how do I handle push messages synchronously without having to use even more async APIs?
And if I were to split that question into multiple ones:
Is it correct to call resolve(handleQueue()) inside handleQueue() to keep it going? Or should I do return handleQueue()? Or..?
Just to preempt the "why not use collapse_key?" suggestion: it's a chat app and every chat room has its own tag. A user can participate in more than 4 chat rooms, and since Firebase limits the number of collapse_keys to 4, I can't use that.
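For reference, the per-room tag is what lets a later notification replace an earlier one, and lets a cancel find the right notification to close. A small sketch (the "room-42" tag is a hypothetical example):
// Showing a second notification with the same tag replaces the first
// instead of stacking a duplicate..
self.registration.showNotification("New message in room 42", {tag: "room-42"});

// ..and a cancel can look up notifications by that same tag.
self.registration.getNotifications({tag: "room-42"})
    .then(function(notifications) {
        notifications.forEach(function(n) { n.close(); });
    });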
Upvotes: 3
Views: 2660
Reputation: 56034
So I'm going to go out on a limb and say that serializing things to IDB could be overkill. As long as you wait until all your pending work is done before you resolve the promise passed to event.waitUntil(), the service worker should be kept alive. (If it takes minutes to finish that work, there's the chance that the service worker would be killed anyway, but for what you describe I'd say the risk of that is low.)
Here's a rough sketch of how I'd structure your code, taking advantage of native async/await support in all browsers that currently support service workers. (I haven't actually tested any of this, but conceptually I think it's sound.)
// In your service-worker.js:
let isPushMessageHandlerRunning = false;
const queue = [];

self.addEventListener('push', event => {
    const data = event.data.json().data;
    event.waitUntil(queueData(data));
});

async function queueData(data) {
    queue.push(data);
    if (!isPushMessageHandlerRunning) {
        await handlePushDataQueue();
    }
}

async function handlePushDataQueue() {
    isPushMessageHandlerRunning = true;
    let data;
    while (data = queue.shift()) {
        // Await on something asynchronous, based on data.
        // e.g. showNotification(), getNotifications() + notification.close(), etc.
        await ...;
    }
    isPushMessageHandlerRunning = false;
}
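For instance, assuming the same show/cancel distinction and data shape as in the question (an action and a tag field, both hypothetical here), the body of that loop might look something like:
async function handlePushDataQueue() {
    isPushMessageHandlerRunning = true;
    let data;
    while (data = queue.shift()) {
        if (data.action === 'cancel') {
            // Close any notification previously shown with this tag.
            const notifications = await self.registration.getNotifications({tag: data.tag});
            notifications.forEach(notification => notification.close());
        } else {
            await self.registration.showNotification(data.title, {tag: data.tag});
        }
    }
    isPushMessageHandlerRunning = false;
}
Because each iteration awaits completion before shifting the next item, a queued 'cancel' can no longer overtake the 'show' it's meant to undo.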
Upvotes: 2