Thursday, August 21, 2008

What are Complex and Perpetually Novel Outcomes?

So, at long last, we come to the product of our initial formula. As a refresher, let's recall that formula:

Rules x Action in the face of Uncertainty x Interaction = Complex and Perpetually Novel Outcomes

(It should be apparent to most readers by now that this "formula" is slightly tongue-in-cheek. We'll explore that in a future post.)

What do we mean by "complex and perpetually novel"? Let's start with complex, and let me first say what I don't mean by complex. I don't mean the increasing complexity of, say, tying a shoelace knot. You start out with simplicity, you do a loop, you do another loop, more loops at random, until you're left with a knot of such Gordian proportions that you simply take off the shoe. I don't mean that type of complexity. 

I also don't mean the complexity of Calvinball, the game made up and continuously changed by Calvin and Hobbes. With no fixed rules, and the unpredictability of a new rule at any time, the game was immediately complex--impenetrable to even Calvin and Hobbes, generating plenty of frustration and fisticuffs.

What I mean by complexity is that, in any system with a finite and (relatively) immutable set of rules and actions, the number of possible outcomes far exceeds what you would expect from the seemingly simple starting point. A light bulb flickered on for Peter Albin, an economic pioneer in the field of complexity, when he ran across the "game of Life," the cellular automaton devised by the brilliant John Conway (working in the field of cellular automata theory pioneered by John von Neumann). Here's Albin, as quoted by David Warsh: "What struck me was that in working with automata that derived from 'Life,' a very few operating principles generated model behavior which seemed to be as interesting as those produced by quite massive constructions." (Emphasis added.)

That appears to be quite an apt description of baseball in its essence. In baseball, there's really a limited number of actions you can take in any given situation. The pitcher delivers; the batter can swing or take; the fielders either make the play or it's a hit; the runners go or stay. There are always three outs. Runs score. Games end. (I'll obligingly note the worn but ever-insightful observation that, theoretically, a baseball game could never end: it's not necessarily temporally confined like other sports.)

OK, that's the "complex" part of the product. What about "perpetually novel"? This one is easier. Many of you are likely familiar with Tim Kurkjian's regular feature on "Baseball Tonight," in which he gleefully notes all the things that happened in baseball over the past week that had never, ever happened before. On first impression, it may be surprising that new outcomes could consistently happen in a game with a finite number of rules and possible actions. But if you think about all the elements that go into creating an outcome--the rules, the uncertainty, the interactions--it's not so surprising, after all.

What we find, furthermore, is a close connection between complexity (in our sense here) and perpetually novel outcomes. Here, for example, is Warsh, on his view of economic complexity: "My own sense of the meaning of the word had derived from the use of the word complexity in a famous paper by the economist Allyn Young ('Increasing Returns and Economic Progress') to describe the growing variety of goods and services, their apparently ever-increasing specialization and differentiation, what we mean today by the supremely hazy term 'development.'"

Some perceptive readers will at this point likely notice the debt that this site owes to complexity theory (or chaos theory or whatever you want to label it). And while it's true that the ideas behind Box Score were semi-inspired by complexity and ideas related to it (including some of the books listed on this site), that's only part of the story.

Let's face it: complexity theory is sometimes really hard to understand. Most of us, if we're seeking different ways to understand the economy or if we're just trying to get a grasp on basic concepts, are not going to learn all the terminology of a new branch of science. So we at Box Score searched for a simpler way to understand things: a system that generated complexity but that most people could understand. Voilà: baseball.

1 comment:

Unknown said...

John Conway developed the "game of Life." See the Wikipedia Cellular Automata article:

"In the 1970s a two-state, two-dimensional cellular automaton named Game of Life became very widely known, particularly among the early computing community. Invented by John Conway, and popularized by Martin Gardner in a Scientific American article, its rules are as follows: If a black cell has 2 or 3 black neighbors, it stays black. If a black cell has less than 2 or more than 3 black neighbors it becomes white. If a white cell has 3 black neighbors, it becomes black. Despite its simplicity, the system achieves an impressive diversity of behavior, fluctuating between apparent randomness and order. One of the most apparent features of the Game of Life is the frequent occurrence of gliders, arrangements of cells that essentially move themselves across the grid. It is possible to arrange the automaton so that the gliders interact to perform computations, and after much effort it has been shown that the Game of Life can emulate a universal Turing machine. Possibly because it was viewed as a largely recreational topic, little follow-up work was done outside of investigating the particularities of the Game of Life and a few related rules."

Von Neumann did work in cellular automata, but his models were much more complex than Conway's.