Gachl

Reputation: 75

On-the-fly stream decompression causes artifacts when the buffer is larger than 1 byte

I am currently testing several decompression libraries for a project I'm involved with, to decompress HTTP file streams on the fly. I have tried two very promising libraries and found an issue that appears in both of them.

This is what I am doing:

1. Download the compressed video (a ZIP archive) over HTTP in chunks.
2. Append each chunk to a loopback MemoryStream without disturbing the current read position.
3. Let the decompression library read from the loopback stream.
4. Write the decompressed bytes directly to VLC's stdin.

The whole idea works fine: I'm able to decompress the video and stream it directly into VLC's stdin, where it renders just fine. However, I have to use a one-byte read buffer on the decompression library. Any buffer larger than one byte causes parts of the uncompressed data stream to be cut off. As a test, I wrote the decompressed stream to a file and compared it with the original video.avi: some data is simply skipped during decompression. Streaming this broken data into VLC causes a lot of video artifacts, and the playback speed is also greatly reduced.

If I knew how much data was available to read, I could size my buffer accordingly, but neither library exposes this information, so all I can do is read the data with a one-byte buffer. Maybe my approach is wrong? Or maybe I'm overlooking something?

Here is some example code (requires VLC):

ICSharpCode.SharpZipLib (http://icsharpcode.github.io/SharpZipLib/)

    static void Main(string[] args)
    {
        // Initialise VLC
        Process vlc = new Process()
        {
            StartInfo =
            {
                FileName = @"C:\Program Files\VideoLAN\vlc.exe", // Adjust as required to test the code
                RedirectStandardInput = true,
                UseShellExecute = false,
                Arguments = "-"
            }
        };
        vlc.Start();
        Stream outStream = vlc.StandardInput.BaseStream;

        // Get source stream
        HttpWebRequest stream = (HttpWebRequest)WebRequest.Create("http://codefreak.net/~daniel/apps/stream60s-large.zip");
        Stream compressedVideoStream = stream.GetResponse().GetResponseStream();

        // Create local decompression loop
        MemoryStream compressedLoopback = new MemoryStream();
        ZipInputStream zipStream = new ZipInputStream(compressedLoopback);
        ZipEntry currentEntry = null;

        byte[] videoStreamBuffer = new byte[8192]; // 8 KB read buffer
        int read = 0;
        long totalRead = 0;
        while ((read = compressedVideoStream.Read(videoStreamBuffer, 0, videoStreamBuffer.Length)) > 0)
        {
            // Write compressed video stream into compressed loopback without affecting current read position
            long previousPosition = compressedLoopback.Position; // Store current read position
            compressedLoopback.Position = totalRead; // Jump to last write position
            totalRead += read; // Increase last write position by current read size
            compressedLoopback.Write(videoStreamBuffer, 0, read); // Write data into loopback
            compressedLoopback.Position = previousPosition; // Restore reading position

            // If not already, move to first entry
            if (currentEntry == null)
                currentEntry = zipStream.GetNextEntry();

            byte[] outputBuffer = new byte[1]; // Decompression read buffer, this is the bad one!
            int zipRead = 0;
            while ((zipRead = zipStream.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
                outStream.Write(outputBuffer, 0, outputBuffer.Length); // Write directly to VLC stdin
        }
    }

SharpCompress (https://github.com/adamhathcock/sharpcompress)

    static void Main(string[] args)
    {
        // Initialise VLC
        Process vlc = new Process()
        {
            StartInfo =
            {
                FileName = @"C:\Program Files\VideoLAN\vlc.exe", // Adjust as required to test the code
                RedirectStandardInput = true,
                UseShellExecute = false,
                Arguments = "-"
            }
        };
        vlc.Start();
        Stream outStream = vlc.StandardInput.BaseStream;

        // Get source stream
        HttpWebRequest stream = (HttpWebRequest)WebRequest.Create("http://codefreak.net/~daniel/apps/stream60s-large.zip");
        Stream compressedVideoStream = stream.GetResponse().GetResponseStream();

        // Create local decompression loop
        MemoryStream compressedLoopback = new MemoryStream();
        ZipReader zipStream = null;
        EntryStream currentEntry = null;

        byte[] videoStreamBuffer = new byte[8192]; // 8 KB read buffer
        int read = 0;
        long totalRead = 0;
        while ((read = compressedVideoStream.Read(videoStreamBuffer, 0, videoStreamBuffer.Length)) > 0)
        {
            // Write compressed video stream into compressed loopback without affecting current read position
            long previousPosition = compressedLoopback.Position; // Store current read position
            compressedLoopback.Position = totalRead; // Jump to last write position
            totalRead += read; // Increase last write position by current read size
            compressedLoopback.Write(videoStreamBuffer, 0, read); // Write data into loopback
            compressedLoopback.Position = previousPosition; // Restore reading position

            // Open stream after writing to it because otherwise it will not be able to identify the compression type
            if (zipStream == null)
                zipStream = (ZipReader)ReaderFactory.Open(compressedLoopback); // Cast to ZipReader, as we know the type

            // If not already, move to first entry
            if (currentEntry == null)
            {
                zipStream.MoveToNextEntry();
                currentEntry = zipStream.OpenEntryStream();
            }

            byte[] outputBuffer = new byte[1]; // Decompression read buffer, this is the bad one!
            int zipRead = 0;
            while ((zipRead = currentEntry.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
                outStream.Write(outputBuffer, 0, outputBuffer.Length); // Write directly to VLC stdin
        }
    }

To test this code, I recommend setting the output buffer to 2 bytes for SharpZipLib and 8 bytes for SharpCompress. You will see the artifacts, and also that the playback speed of the video is wrong; the seek time should always be aligned with the number counting up in the video.

I haven't found a good explanation of why a larger outputBuffer reading from the decompression library causes these problems, or a way to solve it other than using the smallest possible buffer.

So my question is: what am I doing wrong, or is this a general issue when reading compressed files from streams? How can I increase the outputBuffer while still reading the correct data?

Any help is greatly appreciated!

Regards, Gachl

Upvotes: 1

Views: 463

Answers (1)

Dark Falcon

Reputation: 44201

You need to write only as many bytes as you read. Writing the entire buffer adds extra bytes (whatever happened to be in the buffer from a previous read). zipStream.Read is not required to fill the buffer; it may return fewer bytes than you requested, and its return value tells you how many bytes are actually valid.

while ((zipRead = zipStream.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
    outStream.Write(outputBuffer, 0, zipRead); // Write directly to VLC stdin
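More generally, any Stream copy loop should treat the return value of `Read` as authoritative: it may be anywhere from 1 up to the buffer size, and 0 means end of stream. A minimal sketch of that pattern as a reusable helper (the name `CopyStream` is just for illustration; .NET's built-in `Stream.CopyTo` does the same thing internally):

```csharp
using System.IO;

static class StreamUtil
{
    // Copy everything from input to output, writing only the bytes
    // that Read actually produced on each iteration.
    public static void CopyStream(Stream input, Stream output, int bufferSize = 8192)
    {
        byte[] buffer = new byte[bufferSize];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            output.Write(buffer, 0, read); // 'read', not buffer.Length
    }
}
```

With this pattern the decompression buffer can be any size: a short read from ZipInputStream (or SharpCompress's EntryStream) no longer corrupts the output, because stale bytes beyond `read` are never written.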

Upvotes: 1
