Antifragile: Measuring Complexity

This post forms part of a series I’m doing following the two-day course I attended in Boston, where Yaneer Bar-Yam spoke alongside Nassim N. Taleb.

It might be easier to think about the simple first. What makes something simple? A square is simple. It has 4 straight sides, all the same length and at right angles to each other. Perhaps not the most concise description but you get the idea. Simple things don’t need many words to explain them.

Yaneer asserts that the length of an object’s description can be used as a measure of its complexity. There is a study showing that, roughly speaking, the number of bits of information conveyed by a sentence is equal to the number of bits required to store it (once you take vocabulary and grammar into account). In science these kinds of order-of-magnitude comparisons are easily made. Some may find the ambiguity worrying.

It also turns out that the medium used for the description is not important. A photo or video is just as good, and the total information can be calculated in the same way. This makes even more sense once compression algorithms have done their magic. I guess any data can be used to provide the description. There are clear parallels here with information theory and the idea of Shannon entropy. Entropy came up a few times in the course; however, because it is generally treated as an absolute measure based on the state of the atoms in a system, it isn’t general enough for this discussion.
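This compression view can be sketched in a few lines of Python. The example below uses zlib as a crude stand-in for an ideal compressor (the true shortest description, Kolmogorov complexity, is uncomputable); the function and variable names are my own, not from the course:

```python
import random
import zlib

def description_length(data: bytes) -> int:
    """Approximate the description length of data (in bytes) by compressing it.
    zlib is only a rough proxy for the shortest possible description."""
    return len(zlib.compress(data, 9))

# A highly regular string: a short rule ("repeat AB") generates all of it.
simple = b"AB" * 500

# Bytes with no pattern for the compressor to exploit.
random.seed(0)
messy = bytes(random.randrange(256) for _ in range(1000))

print(description_length(simple))  # tiny: the pattern compresses away
print(description_length(messy))   # close to the raw 1000 bytes
```

Both inputs are 1,000 bytes long, but the regular one needs far fewer bytes to describe, which matches the intuition that simple things don’t need many words.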

So when describing the state of a system, the description needs to be long enough to distinguish between all possible states. If you have a simple system like a traffic light, the state can be expressed simply as red, green, etc. Something with more possible states, like a given position in a game of chess, will require more data. This is fine and fits well with Shannon entropy.
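That “long enough to distinguish all possible states” requirement has a simple form: for N equally likely states you need about log2(N) bits. A quick sketch (the chess figure is a commonly cited rough estimate of the number of legal positions, not an exact count):

```python
import math

def bits_to_describe(n_states: int) -> float:
    """Minimum number of bits needed to single out one of n equally likely states."""
    return math.log2(n_states)

print(bits_to_describe(3))       # traffic light (red/amber/green): about 1.6 bits
print(bits_to_describe(10**43))  # chess positions (rough estimate): about 143 bits
```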

What Yaneer does next is, for me, a stroke of genius. Just as Einstein moved away from absolute measures and made things relative to the observer, Yaneer does the same for the description of the system. This means the perspective you take when looking at a system changes the length of the description required. Yaneer uses the term scale to describe this effect. The complexity of a system isn’t an absolute measure.

For example, let us consider a gas. At one level, the information required to describe the gas will be equal to its classical entropy and will be based on the position and velocity of every particle: high entropy, lots of data required to describe the system, and so very complex. It is hard to predict where any given particle will be. However, if we move away from the micro towards the macro state of the gas, it becomes simple. A few variables such as density, temperature and pressure are enough to describe the gas at this scale. It is also easy to predict what will happen over time.
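Here is a toy version of that micro/macro contrast, assuming a simulated one-dimensional gas; the macro variables (mean velocity, a variance as a temperature proxy) are my stand-ins for the real thermodynamic quantities:

```python
import random
import statistics

random.seed(1)

# Micro scale: reproducing the exact state means recording every particle.
velocities = [random.gauss(0.0, 1.0) for _ in range(10_000)]
micro_values_needed = len(velocities)   # 10,000 numbers

# Macro scale: at the observer's level a handful of aggregates suffice.
macro_state = {
    "particle_count": len(velocities),
    "mean_velocity": statistics.fmean(velocities),
    "temperature_proxy": statistics.pvariance(velocities),
}
macro_values_needed = len(macro_state)  # 3 numbers

print(micro_values_needed, macro_values_needed)
```

Same gas, two scales: the micro description needs ten thousand values, the macro one just three.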

The purpose of this post, and of the course itself, is to consider how complexity forces us to reconsider how we work with organisations of people. Consider a Roman attack formation. At the small scale it looks complex and would take discipline and training to execute correctly; at the large scale, however, it makes things simpler for the commanding officers. It is possible for one person to direct and control the outcome of a battle.

Contrast this with the way the Viet Cong fought a guerrilla war against the overwhelming firepower of the US and South Vietnamese forces. In that case the same tactics that had worked since Roman times failed catastrophically. The hypothesis given by Yaneer is that the complexity on the ground made it impossible for a small group of commanders to give orders that could result in success. When using a command-and-control structure, the complexity of the problem you can solve is limited to the complexity the person in control can handle.

If you want to read more on this topic then Yaneer’s book “Making Things Work” is a great read.
