Fixes the Build: Thinking Outside the Box

08/01/2018

Author: Alex Rybakov, Senior Software Engineer

Cyclical Nature of Progress

What is invention? How does progress “work”? What allowed humans to get from A to B and then to C?

Let’s consider a problem area. At some point in time we know exactly what can and cannot be done there; the rules are understood and well defined. We remember why the limits have been set where they are – enough experiments have been performed by the best experts in the area, and here are their results – anybody can reproduce them. We know multiple possible methods of solving any solvable problem within these well-defined limits and can easily choose the most efficient one. We can teach kids, explaining to them the rules, the limitations, and the correct way of achieving results in the area. This problem area is in a state of efficient stagnation.

The good thing about this state is that it doesn’t require a lot of “creative energy” to maintain, so inventors can start looking at new stuff. Essentially, they can begin thinking about breaking the rules.

Then something forces us to start questioning the limits again and looking for ways to redefine the rules. This “something” can be a crisis (for instance, a sudden deficit of a valuable resource or a major war), an achievement in a seemingly unrelated area, an experimental result that doesn’t fit the boundaries of the theory, or just a new generation of inventors with a bunch of fresh ideas.

So we’ve got an idea – or many new ideas – worth trying. We do more experiments, and at some point we have a breakthrough.

We find that there is indeed a way to redefine the problem, or to apply a totally new method, or a new experiment proves that some of the old rules are not quite as strict as they looked before. Something drastically new is possible. We have built a machine that can fly.

The device that broke the rules is usually a mix of legacy frameworks and repurposed pre-existing parts. It’s ugly, inefficient, and somehow incomplete and excessive at the same time. It can barely fly – but it actually can fly.

In many cases the mere existence of such a device allows many inventors to start building other devices (similar or different). We broke the rules, flying is possible, and (temporarily) there are no limits. We switch to a state of rapid development, where many generations of never-before-seen designs will appear and become obsolete in a brief period of time. At this stage inventors from other areas can be attracted to the booming field, allowing mankind to increase the amount of “available creative energy” and to move forward faster.

Rapid development is the stage for a lot of smaller ideas and the time for finding new limits. Eventually it hits new limitations. Some will not hold, but eventually enough new rules will be defined. Now it’s time to switch to the optimization stage, where we learn how to achieve the result with less effort and less complexity. At this stage we see a rapidly shrinking variety of designs, with only a few of the most efficient ones eventually surviving the competition. We learn how to mass-produce our now-well-understood invention, and at some point we optimize so well that we can happily switch back to efficient stagnation (moving most of the remaining inventors to different areas).

Essentially, progress is a loop of these four stages – stagnation, breakthrough, rapid development, and optimization. Each stage is important – we have to break the rules periodically, then we go through the rapid development phase, then we have to optimize what we’ve got, and then we can prepare to move on to the next breakthrough.

Here are a few more breakthroughs in a more relevant area, just for illustration.

In the mid-1940s, digital computers were a niche, super-secret military thing for dealing with specific computational problems; essentially they were big “programmable calculators”. On the other hand, analog signal representation allowed for transmission, recording, and playback of audio and video information (the first video tape recorder came a bit later – in 1952). It was possible to perform fairly complex data manipulation (like frequency modulation or even encryption) and implement different algorithms using only an analog signal representation. Today it is hard to imagine a world without bits and bytes, where plain analog signals are used everywhere, but such a world existed not so long ago, and it felt totally sufficient. Think about it – an analog computer was controlling the rocket that sent the first man into space.

At some point (the late forties, if I’m not mistaken), inventors realized that representing a signal as a sequence of digits is useful, convenient, and allows for much more powerful and complex data manipulation (breakthrough). At that point people didn’t yet know that the byte (a “smallest general-purpose digital data container”) was needed (some computers used problem-specific data types natively), or that the byte would eventually have eight bits (six or seven seemed sufficient for a while); we weren’t even 100% sure that the bit would win over the trit (a digital element capable of representing -1, 0, and 1; see https://en.wikipedia.org/wiki/Balanced_ternary and https://en.wikipedia.org/wiki/Ternary_numeral_system for historical details) – this was a period of rapid development.
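
As a side note, here is a minimal sketch (in Python, with hypothetical helper names) of how balanced ternary can represent any integer using only the trit values -1, 0, and 1:

    def to_balanced_ternary(n):
        # Convert an integer to a list of balanced-ternary digits (-1, 0, +1),
        # least significant digit first.
        if n == 0:
            return [0]
        digits = []
        while n != 0:
            r = n % 3              # in Python this is always 0, 1, or 2
            if r == 2:             # represent 2 as -1 plus a carry into the next trit
                digits.append(-1)
                n = (n + 1) // 3
            else:
                digits.append(r)
                n = n // 3
        return digits

    def from_balanced_ternary(digits):
        # Inverse: sum of digit * 3**position.
        return sum(d * 3 ** i for i, d in enumerate(digits))

    # Round-trip check for a range of values, including negative ones.
    for value in range(-40, 41):
        assert from_balanced_ternary(to_balanced_ternary(value)) == value

Note that no sign bit is needed – negative numbers come for free, which was part of the trit’s appeal.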

Eventually, by the mid-sixties, we got to the stage where the necessary amount of bits and bytes (only eight-bit bytes survived, by the way) to represent any information was well defined. For example, you can teach a kid that a single byte is needed to represent a character, and anyone can easily calculate the number of bytes required to encode a book containing a million characters. We’ve defined the rules and the limitations, and we can efficiently assess, for example, the amount of computer memory needed to store a particular set of data.

But everybody (well, almost everybody) can now compress a file containing the book. In 1949, MIT professor Robert Fano invented a way to encode a stream of characters (8-bit bytes didn’t quite exist yet) so that more frequent characters would be encoded with fewer bits, at the price of less frequent characters requiring more bits. On average, the file with the book becomes smaller (obviously this only works when individual characters have different probabilities (frequencies); essentially we use the fact that all books represent only a very small subset of all possible data files). This is an example of a breakthrough.

Very soon Fano’s student David Huffman invented a better algorithm (both static and adaptive versions of Huffman encoding are still widely used). There is even a mathematical proof that, as long as the only thing known about a stream of characters is the individual character frequencies and the resulting encoding must use an integer number of bits per character, Huffman’s encoding is the best possible one. A mathematical proof is a serious thing, so for a while we had our new limits. (A note for advanced readers – arithmetic coding allowed for better compression by removing the limitation that a character should be encoded with an integer number of bits, but it wasn’t quite enough to count as a breakthrough. It’s one of the things we tried during the rapid development phase.)
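
Just for illustration, here is a minimal Python sketch of the Huffman construction – a toy version, not the canonical-code form used in production codecs:

    import heapq
    from collections import Counter

    def huffman_code(text):
        # Build a Huffman code (symbol -> bit string) from character frequencies.
        freq = Counter(text)
        # Each heap entry: (total weight, tie-breaker, {symbol: code so far}).
        heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        if len(heap) == 1:                      # degenerate case: one distinct symbol
            return {sym: "0" for sym in heap[0][2]}
        tie = len(heap)
        while len(heap) > 1:
            w1, _, left = heapq.heappop(heap)   # the two least frequent subtrees
            w2, _, right = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in left.items()}
            merged.update({s: "1" + c for s, c in right.items()})
            heapq.heappush(heap, (w1 + w2, tie, merged))
            tie += 1
        return heap[0][2]

    text = "abracadabra"
    code = huffman_code(text)
    encoded = "".join(code[ch] for ch in text)
    print(code)                                 # the frequent 'a' gets the shortest code
    print(len(encoded), "bits vs", 8 * len(text), "bits uncompressed")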

So digital compression happily settled into efficient stagnation for many years. Only one thing was quite peculiar – while mathematically Huffman encoding was the best, in some very specific cases a simple thing like RLE (run-length encoding) could beat Huffman 100:1. This works by ignoring the rule that the source stream is modeled by the frequencies of individual characters, which should give us a clue about how to break the limits.
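
A quick sketch of why that is (naive RLE, not any particular production format): on a highly repetitive stream the “independent characters with fixed frequencies” model simply doesn’t apply, and run-length encoding wins by a huge margin.

    def rle_encode(data):
        # Naive run-length encoding: (run length, byte value) pairs, runs capped at 255.
        out = bytearray()
        i = 0
        while i < len(data):
            run = 1
            while i + run < len(data) and data[i + run] == data[i] and run < 255:
                run += 1
            out += bytes([run, data[i]])
            i += run
        return bytes(out)

    blob = b"\x00" * 100000              # an extremely repetitive stream
    packed = rle_encode(blob)
    print(len(blob), "->", len(packed))  # 100000 -> 786 bytes, roughly 127:1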

In 1977 Abraham Lempel and Jacob Ziv invented the LZ77 compression algorithm, which models the stream as a finite state machine, so that the probability of a character depends on the previous characters in the stream. This is another example of a breakthrough, which immediately allowed for much better compression ratios compared to Huffman’s encoding. It was followed by rapid development in the area that lasted for 20 or so years. Many dozens of algorithms have been derived from LZ77; some of them use Huffman as a secondary compression stage. At least three (LZ77+Huffman (zlib), LZ4, LZMA) are supported by the engine I’m currently using.
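
LZ4 isn’t in the Python standard library, but zlib and LZMA are, so here is a tiny comparison sketch (the exact ratios depend entirely on the input data):

    import zlib, lzma

    # Some mildly repetitive "textual" data; real-world results will vary a lot.
    data = b"the quick brown fox jumps over the lazy dog. " * 2000

    for name, packed in [
        ("zlib (LZ77 + Huffman)", zlib.compress(data, level=9)),
        ("LZMA",                  lzma.compress(data, preset=9)),
    ]:
        ratio = len(data) / len(packed)
        print("%-22s %d -> %d bytes (%.0f:1)" % (name, len(data), len(packed), ratio))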

The amount of digitized data kept growing, as did the CPU power available for data compression and decompression. People started digitizing music, large photos, and even movies. While LZ-based algorithms are extremely efficient at compressing textual data, they weren’t producing such great results with audio or video streams.

At some point another breakthrough happened. Think about it – we don’t need to restore exactly what was compressed. It just has to be close enough that a human will consider it sufficiently similar to the original. Call it cheating, or call it another example of thinking outside the box. Modern audio, image, and video compression algorithms look at the problem of data compression from a totally different angle than classical “general purpose” compression algorithms. These algorithms are currently somewhere in the transition from the “rapid development” to the “efficient stagnation” phase.
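
To make the idea concrete, here is a toy sketch (not any real audio or image codec): we deliberately throw away precision before compressing, so the reconstruction is no longer exact – just close enough.

    import math, zlib

    # Fake 8-bit "audio": a sine wave, 50,000 samples.
    samples = bytes(int(127 + 120 * math.sin(i / 10)) for i in range(50000))

    lossless = zlib.compress(samples, level=9)
    quantized = bytes(s & 0xF0 for s in samples)   # keep only the top 4 bits of each sample
    lossy = zlib.compress(quantized, level=9)

    print("lossless:", len(lossless), "bytes; lossy:", len(lossy), "bytes")
    # Each reconstructed sample differs from the original by at most 15 out of 255,
    # which is often "close enough" for human ears and eyes.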

That’s probably enough illustrations – similar examples could be found in many different areas of rapid progress, including board game design or the aerospace industry.

So, what is the point? Progress is cyclical; all phases are important and cannot be skipped (for example, it’s hard to expect rapid development in an area without a recent breakthrough). It’s awesome to be efficient within well-defined boundaries. It’s absolutely awesome to look at a problem from a different angle, think “outside the box”, redefine the rules, and make a breakthrough. It’s quite interesting to follow recent breakthroughs – possibly we can participate in the rapid development, or it can give us a competitive advantage, or we simply learn new things and become a bit smarter. Or we can spend time simplifying existing solutions in the optimization phase. The main message is – keep moving forward!
