Setop

Reputation: 2510

Can't disable buffering on a cgi using nginx, fastcgi and bash

I want to make a CGI that launches a program and displays the output in the web-browser in real-time. Real-time here means as soon as a line of output is produced by the program it should be displayed on the browser page.

I chose to write it in bash to wrap the program execution so I can process the request parameters.

webbrowser -> nginx+fastcgi -> bash -> program

I have a testing program that outputs one line every half a second, 10 times.
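Something like this stands in for the test program (the real one is not shown):

```shell
#!/bin/sh
# Stand-in test program: prints one line every half second, 10 times
for i in $(seq 1 10); do
    echo "line $i"
    sleep 0.5
done
```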

I thought I could declare a plain-text content type in the response header and then exec the program.

Unfortunately the output appears in the browser only at the end of the execution all at once. I tested it in Firefox and curl.

I have tested many options, and every combination of them, to work around the issue.

Nothing works.

I guess the buffering happens between nginx and bash, but I can find no way to disable it.
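For reference, these are the kind of nginx-side switches I mean (location and socket path are illustrative); none of this helped in my case:

```nginx
location /cgi-bin/ {
    include       fastcgi_params;
    fastcgi_pass  unix:/run/fcgiwrap.socket;  # socket path is illustrative
    fastcgi_buffering off;  # disable nginx's buffering of FastCGI responses
    gzip off;               # gzip would re-buffer the response
}
```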

What can I try?

Upvotes: 0

Views: 726

Answers (3)

ElFishi

Reputation: 53

Thanks for this! I was having the same problem and couldn't find any other mention of it on the net. I took the liberty of dumbing down your approach to fit in one file:

#!/usr/bin/bash
# fcgiwrap does not flush its output
# send bash output as HTML, padding each line with &zwnj; (zero-width non-joiner) entities to fill the 4 KiB buffer

cat << \
~~~
HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: no-cache
X-Accel-Buffering: no

<!DOCTYPE html>
<html>
 <body>
  <div style="max-height: 97vh;  margin: 0 15px; overflow: hidden auto; display: flex; flex-direction: column-reverse;">
   <div>
    <pre>
~~~

subprogram | while read -r l    # read -r keeps backslashes in the output intact
do
    printf '%s' "${l}      "
    # pad to just under 4 KiB: 682 spaces become 682 six-byte "&zwnj;" entities
    printf '%*s\r\n' 682 | sed 's/ /\&zwnj;/g'
done

cat << \
~~~
    </pre>
   </div>
  </div>
 </body>
</html>
~~~
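A quick sanity check on the padding size: the 682 spaces expand to 682 six-byte &zwnj; entities, which with the CRLF comes to 4094 bytes, just under the 4 KiB buffer (`%682s` below is the portable spelling of `%*s` with width 682):

```shell
# each space becomes the 6-byte entity "&zwnj;"; 682 * 6 + 2 (CRLF) = 4094 bytes
printf '%682s\r\n' '' | sed 's/ /\&zwnj;/g' | wc -c   # → 4094
```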

The two divs, with the outer one having flex-direction: column-reverse;, make sure the screen keeps scrolling when new output arrives.

Maybe this is useful for others who come here.

Upvotes: 1

fnclovers

Reputation: 127

It's been a while, but to resolve this issue, I implemented a custom modification in the fcgiwrap code.

The root of the problem in fcgiwrap was the absence of an explicit call to FCGI_fflush(), even when the stdout pipe had been flushed.

To address this, I suggest cloning my fcgiwrap repository from GitHub using this link: fcgiwrap repository. This should help in resolving the issue.

Upvotes: 0

Setop

Reputation: 2510

I found no way to fix the buffering issue in my chain. I tried Perl instead of bash, with no luck.

So I chose to fill the buffers instead: after each line of output from the controlled program, I emit a bunch of '\0' bytes. Since this content cannot be processed as plain text by the web browser, I use the server-sent events (SSE) approach.

#!/bin/sh

printf "HTTP/1.0 200 OK\r\n"
printf "Content-type: text/event-stream\r\n"
printf "Cache-Control: no-cache\r\n"
printf "X-Accel-Buffering: no\r\n"
printf "\r\n"

flush() {
    # emit 4100 NUL bytes: enough to overflow the ~4 KiB buffer and force a flush downstream
    padding=4100
    dd if=/dev/zero bs=$padding count=1 2>/dev/null
}

subprogram | while read -r l    # read -r keeps backslashes intact
do
    # SSE framing: a "data:" field, then a blank line to end the event
    printf 'data: %s\n\n' "${l}"
    flush
done
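To sanity-check the framing and padding without a browser, the loop can be run against a stand-in (seq plays the role of subprogram here) while counting the bytes: each event is 9 bytes of "data: N" plus two newlines, followed by 4100 bytes of padding:

```shell
# 3 events of (7-byte "data: N" + 2 newlines) + 4100 NUL bytes each = 12327 bytes
flush() {
    dd if=/dev/zero bs=4100 count=1 2>/dev/null
}
seq 3 | while read -r l; do
    printf 'data: %s\n\n' "$l"
    flush
done | wc -c   # → 12327
```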

The wrapping page looks like this:

<html>
<head>
   <meta charset="UTF-8">
   <title>Server-sent events demo</title>
</head>
<body>
  <pre></pre>
<script>
var evtSource = new EventSource('/sse.sh?subprogram');
var pre = document.querySelector('pre');
evtSource.onmessage = function(e) {
  pre.textContent += e.data + '\n';
}
</script>
</body>
</html>

The web browser, in that case, takes care of discarding the extra '\0' bytes.

The drawback is that the CGI output is far larger than the program's output.

Upvotes: 0
