cooldude101

Reputation: 1405

How to pipe a readable stream into URL.createObjectURL without waiting for the whole file?

I know it's doable with MediaSource, but MediaSource doesn't support all video formats (it needs fragmented MP4, for example), which is a problem because my application doesn't have a server that could fix the file. It's a client-side-only application.

const blob = await ipfs.getBlobFromStream(hash)
const url = URL.createObjectURL(blob)
this.setState({...this.state, videoSrc: url})

const getBlobFromStream = async (hash) => {
  return new Promise(async resolve => {
    let entireBuffer

    const s = await stream(hash)

    // accumulate every incoming chunk into one growing typed array
    s.on('data', buffer => {
      console.log(buffer)
      if (!entireBuffer) {
        entireBuffer = buffer
      }
      else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer)
      }
    })

    // only once the stream has ended, wrap everything in a Blob
    s.on('end', () => {
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer)
      const blob = new Blob([arrayBuffer])
      resolve(blob)
    })
  })
}

This is the code I'm using right now: it waits for the entire file, concatenates it into a single array, wraps that in a Blob, and only then passes it to URL.createObjectURL.

Upvotes: 13

Views: 4052

Answers (2)

vvv

Reputation: 31

Unfortunately, it is not currently possible to create a generally readable blob URL whose content is determined asynchronously.

If the goal is specifically media playback, then there is the MediaSource API, which you mention you already know about. You imply that it requires server-side processing, but that is not always true: you can generate fragmented MP4 from a normal MP4 file with client-side code, for example with something like mux.js (the last time I used it, it generated a wrong/buggy fMP4 header, so I needed some custom code to fix its output), an Emscripten-compiled ffmpeg, or something else.
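For illustration, a minimal sketch of that mux.js route, assuming mux.js is loaded as a global muxjs, that stream(hash) delivers Uint8Array chunks of a plain MP4, and that the codec string below matches the actual file (it is only a guess here):

const video = document.querySelector('video')
const mediaSource = new MediaSource()
video.src = URL.createObjectURL(mediaSource)

mediaSource.addEventListener('sourceopen', async () => {
  // the codec string is an assumption; it has to match the real file
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f, mp4a.40.2"')
  const transmuxer = new muxjs.mp4.Transmuxer()
  const queue = []

  const appendNext = () => {
    if (queue.length && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift())
    }
  }
  sourceBuffer.addEventListener('updateend', appendNext)

  // mux.js turns plain MP4 bytes into fragmented MP4 segments
  transmuxer.on('data', segment => {
    const bytes = new Uint8Array(segment.initSegment.byteLength + segment.data.byteLength)
    bytes.set(segment.initSegment, 0)
    bytes.set(segment.data, segment.initSegment.byteLength)
    queue.push(bytes)
    appendNext()
  })

  const s = await stream(hash)
  s.on('data', chunk => transmuxer.push(chunk))
  s.on('end', () => transmuxer.flush())
})

The init-segment handling is simplified (a real player would append it only once) and there is no error handling for unsupported codecs, but it shows that no server is strictly needed.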

I agree with you that the MediaSource API has many drawbacks/differences compared to a generic stream concept:

  • the data cannot be in arbitrary formats or split into arbitrary chunks; it must be in one of a few specific formats (i.e. fragmented MP4 or WebM), and its fragmentation must follow that format's specific requirements;
  • it cannot be read by generic URL-reading methods like XHR or fetch; it is only usable by audio/video elements;
  • it can only be assigned to a single media element, and only once;
  • it can be read non-sequentially by the corresponding media element;
  • you cannot control the data flow with stream-like mechanisms such as backpressure or pull events; instead you have to manually monitor the media element's current position in seconds and figure out the corresponding data segments (a rough sketch of this follows the list);
  • it buffers a copy of the data added to it, doubling memory usage in some use cases (you can manually remove data from its buffer to try to mitigate this).
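For example, the manual flow control from the last two points might look roughly like this, where segmentForTime is a hypothetical helper that returns the fragmented-MP4 segment covering a given time:

const manageBuffer = (video, sourceBuffer) => {
  video.addEventListener('timeupdate', () => {
    const t = video.currentTime

    // append the segment the player will need soon (segmentForTime is hypothetical)
    const next = segmentForTime(t + 10)
    if (next && !next.appended && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(next.bytes)
      next.appended = true
    }

    // drop data far behind the playhead to keep memory usage bounded
    if (!sourceBuffer.updating && sourceBuffer.buffered.length && sourceBuffer.buffered.start(0) < t - 30) {
      sourceBuffer.remove(0, t - 30)
    }
  })
}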

Unfortunately for now that is the only option.

Upvotes: 0

Serkan Sipahi

Reputation: 691

You can do it if you restructure your code a bit:

await ipfs.startBlobStreaming(hash);
this.setState({...this.state, videoComplete: true});

const startBlobStreaming = async (hash) => {
  return new Promise(async (resolve) => {
    let entireBuffer;
    let previousUrl;

    const s = await stream(hash);
    s.on('data', buffer => {
      // grow the accumulated buffer with every chunk
      if (!entireBuffer) {
        entireBuffer = buffer;
      } else {
        entireBuffer = concatTypedArrays(entireBuffer, buffer);
      }

      // rebuild the blob URL from everything received so far
      const arrayBuffer = typedArrayToArrayBuffer(entireBuffer);
      const blob = new Blob([arrayBuffer]);
      if (previousUrl) URL.revokeObjectURL(previousUrl); // avoid leaking the previous URL
      previousUrl = URL.createObjectURL(blob);
      this.setState({...this.state, videoSrc: previousUrl});
    });
    s.on('end', _ => resolve());
  });
}

I don't know how fast the buffers arrive in s.on('data'), but you could also collect buffers over a certain interval (e.g. 1000 ms) and only then rebuild the blob URL, as sketched below.
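A rough sketch of that throttling idea, reusing the same helpers as above and an assumed 1000 ms interval:

const startBlobStreaming = async (hash) => {
  return new Promise(async (resolve) => {
    let entireBuffer;
    let previousUrl;

    // rebuild the blob URL at most once per second instead of on every chunk
    const interval = setInterval(() => {
      if (!entireBuffer) return;
      const blob = new Blob([typedArrayToArrayBuffer(entireBuffer)]);
      if (previousUrl) URL.revokeObjectURL(previousUrl);
      previousUrl = URL.createObjectURL(blob);
      this.setState({...this.state, videoSrc: previousUrl});
    }, 1000);

    const s = await stream(hash);
    s.on('data', buffer => {
      entireBuffer = entireBuffer ? concatTypedArrays(entireBuffer, buffer) : buffer;
    });
    s.on('end', () => {
      clearInterval(interval);
      resolve();
    });
  });
}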

Upvotes: 2
