Koushik R

Reputation: 33

Passing FormData/File Object from content script to background script in chrome extension with Manifest V3

I'm building a Chrome extension where I take a file as input from the user and pass it to my background.js (a service worker in Manifest V3) to save it to my backend. Since cross-origin requests are blocked in content scripts, I have to pass the file to background.js and use the Fetch API to save it. But when I pass the FormData or File object through the chrome.runtime.sendMessage API, it gets JSON-serialized, and what I receive in background.js is an empty object. Refer to the snippet below.

//content-script.js

attachFile(event) {
 let file = event.target.files[0];

 // file has `File` object uploaded by the user with required contents. 
 chrome.runtime.sendMessage({ message: 'saveAttachment', attachment: file }); 
}

//background.js

chrome.runtime.onMessage.addListener((request, sender) => {
 if (request.message === 'saveAttachment') {
   let file = request.attachment; // here the value is an empty plain object {}
 }
});

The same happens even when we pass the FormData from the content script.
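This is expected with plain JSON serialization: Blob (and File, which extends it) has no enumerable own properties, so it stringifies to an empty object. A quick check (runnable in Node 18+, where Blob is a global):

```javascript
// Blob (and File, which extends Blob) exposes size/type only as
// non-enumerable prototype getters, so JSON serialization yields {} --
// exactly what the background script receives.
const blob = new Blob(['hello'], { type: 'text/plain' });
console.log(JSON.stringify(blob));     // "{}"
console.log(Object.keys(blob).length); // 0
```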

I referred to multiple solutions suggested in older StackOverflow questions: use URL.createObjectURL(myfile); and pass the URL to background.js, then fetch the same file there. However, the Fetch API cannot fetch a blob: URL created in another context, and XMLHttpRequest is not supported in service workers, as noted here. Can someone help me solve this? I'm completely blocked by this behaviour.

Upvotes: 2

Views: 3004

Answers (3)

woxxom

Reputation: 73506

Currently only Firefox can transfer such types directly. Chrome might be able to do it in the future.

Workaround 1.

Serialize the object's contents manually to a string, send it (in several messages if the length exceeds the 64MB message-size limit), then rebuild the object in the background script. Below is a simplified example without splitting, adapted from Violentmonkey. It's rather slow (encoding and decoding 50MB takes several seconds), so you may want to write your own version that builds a multipart/form-data string in the content script and sends it directly in the background script's fetch.
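The splitting itself is simple string slicing; a sketch of the idea (the chunk size, message shape, and helper names below are illustrative, not part of the original answer):

```javascript
// Hypothetical helpers for sending a long serialized string across several
// chrome.runtime.sendMessage calls (the 32MB chunk size is an arbitrary
// value safely below the 64MB message limit).
const CHUNK_SIZE = 32 * 1024 * 1024;

function toChunks(str, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let i = 0; i < str.length; i += chunkSize) {
    chunks.push(str.slice(i, i + chunkSize));
  }
  return chunks;
}

// Content script side (sketch):
// toChunks(serialized).forEach((data, index, arr) =>
//   chrome.runtime.sendMessage({
//     message: 'saveAttachmentChunk', index, total: arr.length, data,
//   }));

// Background side: once all chunks have arrived, concatenate them.
function fromChunks(chunks) {
  return chunks.join('');
}
```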

  • content script:

    async function serialize(src) {
      const wasBlob = src instanceof Blob;
      const blob = wasBlob ? src : await new Response(src).blob();
      const reader = new FileReader();
      return new Promise(resolve => {
        reader.onload = () => resolve([
          reader.result,
          blob.type,
          wasBlob,
        ]);
        reader.readAsDataURL(blob);
      });
    }
    
  • background script, inside onMessage listener:

    const [body, type] = deserialize(message.body);
    fetch(message.url, {
      method: 'POST', // fetch defaults to GET, which cannot carry a body
      body,
      headers: {
        'Content-Type': type,
      },
    }).then(/*........*/);
    function deserialize([base64, type, wasBlob]) {
      const str = atob(base64.slice(base64.indexOf(',') + 1));
      const len = str.length;
      const arr = new Uint8Array(len);
      for (let i = 0; i < len; i += 1) arr[i] = str.charCodeAt(i);
      if (!wasBlob) {
        type = base64.match(/^data:(.+?);base64/)[1].replace(/(boundary=)[^;]+/,
          (_, p1) => p1 + String.fromCharCode(...arr.slice(2, arr.indexOf(13))));
      }
      return [arr, type];
    }
    

Workaround 2.

Use an iframe pointing to an HTML file in your extension, exposed via web_accessible_resources.
The iframe will be able to do everything an extension page can, like making a CORS request.

File, Blob, and other cloneable types can be transferred directly from the content script via postMessage. FormData is not cloneable, but you can pass its entries as [...formData] and then reassemble them with a new FormData() object.
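The entries round-trip looks like this (runnable in Node 18+ or any browser; the field names are illustrative):

```javascript
// FormData itself is not structured-cloneable, but its entries are plain
// [name, value] pairs, which are -- so spread them, post them, and rebuild.
const original = new FormData();
original.append('title', 'report');
original.append('year', '2023');

const entries = [...original]; // cloneable array of [name, value] pairs
// iframe.contentWindow.postMessage({ type: 'formdata', entries }, origin);

// Receiving side: reassemble into a fresh FormData.
const rebuilt = new FormData();
for (const [name, value] of entries) rebuilt.append(name, value);
```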

The iframe can also pass the data directly to the background script via navigator.serviceWorker messaging.

Example: see "Web messaging (two-way MessagePort)" in that answer.

Upvotes: 5

malininss

Reputation: 320

I found another way to pass files from a content page (or a popup page) to a service worker, though it probably isn't suitable for every situation.

You can intercept, in the service worker, a fetch request sent from a content or popup page, then re-send that request from the service worker, optionally modifying it along the way.

popup.js:

// simple fetch, but with a header indicating that the request should be intercepted
fetch(url, {
    headers: {
        'Some-Marker': 'true',
    },
});

background.js:

self.addEventListener('fetch', (event) => {
    // You can check that the request should be intercepted in other ways, for example, by the request URL
    if (event.request.headers.get('Some-Marker')) {
        event.respondWith((async () => {
            // event.request contains the data from the original fetch sent by the content or popup page.
            // Here we re-send the request from background.js (the service worker) and pass the response back to the content page, if it is still active.
            // You can also modify the request here however you want.
            const result = await self.fetch(event.request);
            return result;
        })());
    }
    return null;
});

Upvotes: 0

Stainz42

Reputation: 1013

I have a better solution: you can store the Blob in IndexedDB.

// client side (browser action or any page)
import { openDB } from 'idb';

const db = await openDB('upload', 1, {
  upgrade(openedDB) {
    openedDB.createObjectStore('files', {
      keyPath: 'id',
      autoIncrement: true,
    });
  },
});
await db.clear('files');

const fileID = await db.add('files', {
  uploadURL: 'https://yours3bucketendpoint',
  blob: file, // `file` is the File/Blob obtained from the user
});

navigator.serviceWorker.controller.postMessage({
  type: 'UPLOAD_MY_FILE_PLEASE',
  payload: { fileID }
});


// Background Service worker
addEventListener('message', async (messageEvent) => {
  if (messageEvent.data?.type === 'UPLOAD_MY_FILE_PLEASE') {
    const db = await openDB('upload', 1);
    const file = await db.get('files', messageEvent.data?.payload?.fileID);
    const blob = file.blob;
    const uploadURL = file.uploadURL;
    
    // it's important here to use self.fetch
    // so the service worker stays alive as long as the request is not finished
    const response = await self.fetch(uploadURL, {
      method: 'put',
      body: blob,
    });
    if (response.ok) {
      // Bravo!
    }
  }
});

Upvotes: 1
