user19638253


Are algorithms that can compress data dramatically (like 70%-80%) a big deal? Are there any?

Note: I apologize if it isn't right to post this here


Hello,

As the title says, are algorithms that can compress data dramatically a big deal? Are there any data compressors out there that can compress any type of data by 70%-80%, or even 99% in some cases? I know of JPEG, but it is only for images.

I think mine would be able to do that, but it is still a prototype and currently very slow (90 kb -> 11.284 kb takes 3 minutes) in both compression and decompression. You are welcome to say whether this sounds like a fraud or a fake, because I was told this was impossible as well. However, I will not talk about how I built it, as I am afraid I would lose my leverage.

If I can make this algorithm much, much faster, would it be worth anything? I would like to make some money with it; are there ways to monetize it? I am currently in financial need, so I could drop out (I am still in college) and start a small startup I have in mind.

Also, if this is worth anything and I manage to fix its flaws, and I decide to monetize it or even sell it, I would also like it to be open source, as I think it would be of great help to the public. Is that possible?

Any insight about this would be appreciated! :)

Upvotes: 0

Views: 273

Answers (1)

Mark Adler

Reputation: 112394

Yes, most lossless compression algorithms can compress by a factor of a thousand or more if presented with, for example, a long sequence of zero bytes.
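
As a minimal sketch (using Python's standard zlib module; the exact output size will vary by settings and library version), a megabyte of zero bytes shrinks by roughly a factor of a thousand:

```python
import zlib

data = bytes(1_000_000)                    # one million zero bytes
compressed = zlib.compress(data, 9)        # level 9 = best compression

ratio = len(data) / len(compressed)
print(f"{len(data)} bytes -> {len(compressed)} bytes "
      f"(about {ratio:.0f}x smaller)")
# Typically prints a reduction on the order of 1000x for this input.
```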

No compressor can compress "any type of data" by even one bit, and then decompress it losslessly. If some inputs are compressed, then necessarily some other inputs are expanded, by at least one bit.
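
The flip side can be seen with incompressible input (again a minimal sketch with Python's zlib): feed it random bytes, which already carry maximum entropy, and the "compressed" output comes out slightly larger than the input because of the format's header, checksum, and block overhead.

```python
import os
import zlib

data = os.urandom(1_000_000)               # 1 MB of random bytes, effectively incompressible
compressed = zlib.compress(data, 9)

print(f"{len(data)} bytes -> {len(compressed)} bytes")
# The output is slightly *larger* than the input, consistent with the
# counting argument: there are more n-bit inputs than shorter outputs,
# so no lossless compressor can shrink every possible input.
```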

Upvotes: 3
