Book Club: The Code Economy, Chs. 6-7: Learning Curves

Welcome back to EvX’s Book Club: The Code Economy, by Philip Auerswald. Today’s entry is going to be quick, because summer has started and I don’t have much time. Ch 6 is titled Information, and can perhaps be best summarized:

The challenge is to build a reliable economy with less reliable people. In this way, the economy is an information processing organism. …

When I assert that economics must properly be understood as a branch of information theory, I am referring to the centrality of the communication problem that exists whenever one person attempts to share know-how with another. I am referring, in other words, to the centrality of code.

Auerswald goes on to sketch some relevant background on “code” as a branch of economics:

The economics taught in undergraduate courses is a great example of history being written by the victors. Because the methodologies of neoclassical economics experienced numerous triumphs in the middle of the twentieth century, the study of the distribution of resources within the economy–choice rather than code and, to a lesser extent, consumption rather than production–came to be taught as the totality of economics. However, the reality is that choice and code have always coexisted, not only in the economy itself but in the history of economic thought.

And, an aside, but interesting:

Indeed, from 1807 to the time Jevons was born, the volume of shipping flowing through the port of Liverpool more than doubled.

However:

And yet, while both the city’s population and workers’ wages increased steadily, a severe economic rift occurred that separated the haves from the have-nots. As wealthy Liverpudlians moved away from the docks to newly fashionable districts on the edges of the city, those left behind in the center of town faced miserable conditions. From 1830 through 1850, life expectancy actually decreased in Liverpool proper from an already miserable 32 years to a shocking 25 years.

(You should see what happened to life expectancies in Ireland around that time.)

A reliable economy built with less reliable people is one in which individuals have very little autonomy, because autonomy means unreliable people messing things up.

Thereafter, a series of economists, Herbert Simon foremost among them, put the challenges of gathering, sharing, and processing economically relevant information at the center of their work.

Taken together and combined with foundational insights from other fields–notably evolutionary biology and molecular biology–the contributions of these economists constitute a distinct domain of inquiry within economics. These contributions have focused on people as producers and on the algorithms that drive the development of the economy.

This domain of inquiry is Code Economics.

I am rather in love with taking the evolutionary model and applying it to other fields, like the spread of ideas (memes) or the growth of cities, companies, economies, or whole countries. That is kind of what we do, here at EvolutionistX.

Chapter 7 is titled Learning: The Dividend of Doing. It begins with an amusing tale about Julia Child, who did not even learn to cook until her mid to late thirties, and then became a rather famous chef/cookbook writer. (Cooking recipes are one of Auerswald’s favored examples of “code” in action.)

Next Auerswald discusses Francis Walker, first president of the American Economic Association. Walker disagreed with the “wage fund theory” and with Jevons’s simplifying assumption that a firm can be modeled as simply hiring workers until it can no longer make more money by hiring another worker than by investing in more capital.

Jevons’s formulation pushes production algorithms–how businesses are actually being run–into the background and brings tradeoffs between labor and capital to the foreground. But as Walker notes:

“We have the phenomenon in every community and in every trade, in whatever state of the market,” Walker observes, “of some employers realizing no profits at all, while others are making fair profits; others, again, large profits; others, still, colossal profits. Side by side, in the same business, with equal command of capital, with equal opportunities, one man is gradually sinking a fortune, while another is doubling or trebling his accumulations.”

The relevant economic data, when it finally became available, confirmed Walker’s belief about the distribution of profits, yet the difference between the high-profit and low-profit firms does not appear to hinge primarily on the question of how much labor should be substituted for capital and vice versa.

Walker argued that more profitable entrepreneurs are that way because they are able to solve a difficult problem more effectively than other entrepreneurs. … three core mechanisms for the advance of code: learning, evolution, and the layering of complexity through the development of platforms.

Moreover:

…in the empirical economics of production, few discoveries have been more universal or significant than that of the firm-level learning curve. As economist James Bessen notes, “Developing the knowledge and skills needed to implement new technologies on a large scale is a difficult social problem that takes a long time to resolve… A new technology typically requires more than an invention in order to be designed, built, installed, operated, and maintained. Initially, much of this new technical knowledge develops slowly because it is learned through experience, not in the classroom.”

Those of you who are familiar with business economics probably find learning curves and firm growth curves boring and old-hat, but they’re new and quite fascinating to me. Auerswald has an interesting story about the development of airplanes and a challenge to develop cheaper, two-seat planes during the Depression–could a plane be built for under $1,000? $700?

(Make sure to read the footnote on the speed of production of “Liberty Ships.”)
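For readers to whom these curves are as new as they are to me, the classic formulation is Wright’s 1936 experience curve, originally fitted to airplane production data: unit costs fall by a roughly constant percentage every time cumulative output doubles. Here is a minimal sketch in Python, using a hypothetical $1,000 first unit and an assumed 80% learning rate (my illustrative numbers, not figures from the book):

    # Wright's experience curve: cost of the n-th unit = first_unit_cost * n**b,
    # where b = log(learning_rate) / log(2). All numbers below are hypothetical.
    import math

    def unit_cost(first_unit_cost, n, learning_rate=0.8):
        b = math.log(learning_rate) / math.log(2)  # exponent implied by the doubling rule
        return first_unit_cost * n ** b

    for n in [1, 2, 4, 8, 16]:
        print(n, round(unit_cost(1000, n)))        # 1000, 800, 640, 512, 410

Each doubling of accumulated output (1 plane to 2, 2 to 4, and so on) cuts unit cost by the same fraction; that regularity, learned through experience rather than in the classroom, is the firm-level learning curve the quote above describes.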

The rest of the chapter discusses the importance of proper firm management for maximizing efficiency and profits. Now, I have an instinctual aversion to managers, due to my perception that they tend to be parasitic on their workers or at least in competition with them for resources/effort, but I can admit that a well-run company is likely more profitable than a badly run one. Whether it is more pleasant for the workers is another matter, as the folks working in Amazon’s warehouses can tell you.

So why are some countries rich and others poor?

Whereas dominant variants of the neoclassical production model emphasize categories such as public knowledge and organization, which can be copied and implemented at zero cost, code economics suggests that such categories are unlikely to be significantly relevant in the practical work of creating the business entities that drive the progress of human society. This is because code at the level of a single company–what I term a “production algorithm”–includes firm-specific components. Producers far from dominant production clusters must learn to produce through a costly process of trial and error. Market-driven advances in production recipes, from which ventures with proprietary value can be created, require a tenacious will to experiment, to learn, and to document carefully the results of that learning. Heterogeneity among managers… is thus central to understanding observed differences between regions and nations. …

Management and the development of technical standards combined to enable not just machines but organizations to be interoperable and collaborative. Companies thus could become far bigger and supply chains far more complex than ever before.

As someone who actually likes shopping at Ikea, I guess I should thank a manager somewhere.

Auerswald points out that if communication of production algorithms and company methods were perfect and costless, then learning curves wouldn’t exist:

All of these examples underscore the following point, core to code economics: The imperfection of communication is not a theory. It is a ubiquitous and inescapable physical reality.

That’s all for now, but how are you enjoying the book? Do you have any thoughts on these chapters? I enjoyed them quite a bit–especially the part about Intel and the graphs of the distribution of management scores by country. What do you think of France’s and the UK’s rather lower “management” scores, compared to the US and Germany?

Join us next week for Ch. 8: Evolution–should be exciting!

Book Club: The Code Economy, Chs. 3-4: Machines and Computers

Machines are fascinating.

Imagine if we discovered, by some chance, in a previously unexplored niche of the world, a group of “people” who looked exactly like us, but had no technology at all: no fire, no pots, no textiles, not even an Acheulean hand axe. Let us further assume that they were not merely feral children who’d gotten lost in the woods, but an actual community that had sustained itself for generations, and that all attempts to introduce them to the art of tool-making failed. They could not, despite often watching others build fires or throw pots, make their own–much less understand and join in a modern economy. (A bulldog can learn to ride a skateboard, but it cannot learn to make a skateboard.)

What would we think of them? Would they be “human”? Even though they look like us, they could only live in houses and wear clothes if we gave those to them; if not given food, they would have to stay in their native environment and hunt.

It is hard to imagine a “human” without technology, or to explain the course of human history without the advance of technology. The rise of trans-Atlantic discovery, trade, and migration that so fundamentally shaped the past 500 years cannot be explained without noting the development of superior European ships, maps, and navigational techniques necessary to make the journey. Why did Europe discover America and not the other way around? Many reasons, but fundamentally, because the Americans of 1492 didn’t have ships that could make the journey and the Europeans did.

The Romans were a surprisingly advanced society, technologically, and even during the High Middle Ages, European technology continued to advance, as with the spread of wind and water mills. But I think it was the adoption of (Hindu-)Arabic numerals, popularized among mathematicians in the 1200s by Fibonacci, but only adopted more widely around the 1400s, that really allowed the Industrial and Information Revolutions to occur. (Roman numerals are just awful for any maths.)

(Image: an 18th century set of Napier’s Bones.)

From the abacus and Fibonacci’s Liber Abaci, Napier developed his “bones,” an early European calculating device, in 1617. These were followed by Pascal’s Calculator in 1642, the slide rule (early 1600s), and Leibniz’s Stepped Reckoner in 1672. After that, progress on the adding machine problem was so rapid that it does not do to list all of the devices and prototypes devised, but we may mention Gaspard de Prony’s impressive logarithmic and trigonometric tables, computed by teams of human “calculators”, and Babbage’s difference and analytical engines. The Arithmometer, patented in 1820, was the first commercially successful mechanical calculator, used in many an office for nearly a century.

The history of mechanical computing devices wouldn’t be complete without reference to the parallel development of European clocks and watches, which pioneered the use of gears to translate movement into numbers, not to mention the development of an industry capable of manufacturing small, high-quality gears to reasonably high tolerances.

Given this context, I find our culture’s focus on Babbage–whose machines, while impressive, were never actually completed–and his collaborator Ada Lovelace, a bit limited. Their contributions were interesting, but taken as a whole, the history is almost an avalanche of innovations.

Along the way, the computer has absorbed many technological innovations from outside computing–the Jacquard Loom pioneered the use of punch cards; the lightbulb pioneered the vacuum tubes that eventually filled Colossus and ENIAC.

But these early computers had a problem: vacuum tubes broke often.

During and immediately after World War II a phenomenon named “the tyranny of numbers” was noticed, that is, some computational devices reached a level of complexity at which the losses from failures and downtime exceeded the expected benefits.[2] Each Boeing B-29 (put into service in 1944) carried 300–1000 vacuum tubes and tens of thousands of passive components.[notes 4] The number of vacuum tubes reached thousands in advanced computers and more than 17,000 in the ENIAC (1946).[notes 5] Each additional component reduced the reliability of a device and lengthened the troubleshooting time.[2] Traditional electronics reached a deadlock and a further development of electronic devices required reducing the number of their components.

Also:

…the 1946 ENIAC, with over 17,000 tubes, had a tube failure (which took 15 minutes to locate) on average every two days. The quality of the tubes was a factor, and the diversion of skilled people during the Second World War lowered the general quality of tubes.[29]
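The arithmetic behind that failure rate is worth making explicit. Here is a back-of-the-envelope sketch using only the figures quoted above (roughly 17,000 tubes, one failure about every two days), plus my own simplifying assumption that any single tube failure stops the machine:

    # Rough reliability arithmetic for ENIAC, from the figures quoted above.
    tubes = 17_000
    machine_mtbf_days = 2                        # one tube failure roughly every two days
    tube_mtbf_years = tubes * machine_mtbf_days / 365
    print(round(tube_mtbf_years))                # ~93: each individual tube lasted decades on average

Each tube, taken alone, was remarkably reliable; it was the sheer number of components wired in series that produced a machine that broke down every couple of days. That is the “tyranny of numbers” in miniature.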

The invention of the semiconductor transistor further revolutionized computing–bringing us a long way from the abacus of yesterday.

Chapter 3 takes a break from the development of beautiful machines to examine their effects on humans. Auerswald writes:

By the twentieth century, the systematic approach to analyzing the division of labor that de Prony developed would have a name: management science. The first and foremost proponent of management science was Frederick Winslow Taylor, a child of privilege who found his calling in factories. …

This first experience of factory work gave Taylor an understanding of the habits of workers that was as intimate as it was, ultimately, unfavorable. Being highly organized and precise by nature, Taylor was appalled at the lax habits and absence of structure that characterized the early twentieth-century factory floor. … However, Taylor ultimately concluded that the blame did not lie with the workers but in the lack of rigorously considered management techniques. At the center of management, Taylor determined, was the capacity to precisely define the tasks of which a “job” was comprised.

What distinguished Taylor was his absolute conviction that workers could not be left on their own to define, much less refine, the tasks that comprised their work. He argued that authority must be fully vested in scientifically determined routine–that is to say, code.

Sounds hellish.

I know very little of management science beyond what can be found in Charlie Chaplin’s Modern Times. According to Wikipedia, Vladimir Lenin described Taylorism as a “‘scientific’ system of sweating” more work from laborers.[3] However, in Taylor’s defense, I don’t think the adoption of Taylorism ever resulted in the mass starvation of millions of people, so maybe Lenin was wrong. Further:

In the course of his empirical studies, Taylor examined various kinds of manual labor. For example, most bulk materials handling was manual at the time; material handling equipment as we know it today was mostly not developed yet. He looked at shoveling in the unloading of railroad cars full of ore; lifting and carrying in the moving of iron pigs at steel mills; the manual inspection of bearing balls; and others. He discovered many concepts that were not widely accepted at the time. For example, by observing workers, he decided that labor should include rest breaks so that the worker has time to recover from fatigue, either physical (as in shoveling or lifting) or mental (as in the ball inspection case). Workers were allowed to take more rests during work, and productivity increased as a result.[11]

Also:

By factoring processes into discrete, unambiguous units, scientific management laid the groundwork for automation and offshoring, prefiguring industrial process control and numerical control in the absence of any machines that could carry it out. Taylor and his followers did not foresee this at the time; in their world, it was humans that would execute the optimized processes. (For example, although in their era the instruction “open valve A whenever pressure gauge B reads over value X” would be carried out by a human, the fact that it had been reduced to an algorithmic component paved the way for a machine to be the agent.) However, one of the common threads between their world and ours is that the agents of execution need not be “smart” to execute their tasks. In the case of computers, they are not able (yet) to be “smart” (in that sense of the word); in the case of human workers under scientific management, they were often able but were not allowed. Once the time-and-motion men had completed their studies of a particular task, the workers had very little opportunity for further thinking, experimenting, or suggestion-making. They were forced to “play dumb” most of the time, which occasionally led to revolts.
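That valve-and-gauge example is easy to make concrete. Here is a toy rendering of the same instruction as code; read_gauge_b and open_valve_a are hypothetical stand-ins for whatever sensor and actuator a real plant would expose, and the threshold is arbitrary:

    # "Open valve A whenever pressure gauge B reads over value X," written as an algorithm.
    # The functions passed in are hypothetical placeholders, not a real plant API.
    PRESSURE_LIMIT_X = 100.0

    def control_step(read_gauge_b, open_valve_a):
        if read_gauge_b() > PRESSURE_LIMIT_X:
            open_valve_a()

Once the rule is written this unambiguously, it makes no difference to the process whether a worker or a controller executes it, which is exactly the point of the passage above.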

While farming has its rhythms–the cows must be milked when the cows need to be milked, and not before or after; the crops must be harvested when they are ripe–much of the farmer’s day is left to his own discretion. Whether he wants to drive fence in the morning and hoe the peas in the afternoon, or attend to the peas first and the fence later is his own business. If he wants to take a nap or pause to fish during the heat of the day it is, again, his own business.

A factory can’t work like that. The guy who is supposed to bolt the doors onto the cars can’t just wander off to eat a sandwich and use the restroom whenever he feels like it, nor can he decide that today he feels like installing windshields. Factories and offices allow many men to work together by limiting the scope of each one’s activities.

Is this algorithmisation of labor inhuman, and should we therefore welcome its mechanization and automation?