If something is true in one field it’s probably true in others. Restricting your attention to your own field blinds you to how many important things people from other fields have figured out that are relevant to your own.
Here are a few laws – some scientific, some not – from specific fields that hold universal truths.
1. Littlewood’s law: We can expect “miracles” to happen regularly, because in a world with 7 billion people the odds of a one-in-a-billion event are pretty good.
John Littlewood was a mathematician who sought to debunk the idea of miracles being anything more than simple statistics.
Physicist Freeman Dyson – who, from what I gather, named the law – explains:
Littlewood’s law of miracles states that in the course of any normal person’s life, miracles happen at the rate of roughly one per month. The proof of the law is simple. During the time that we are awake and actively engaged in living our lives, roughly for eight hours each day, we see and hear things happening at a rate of one per second. So the total number of events that happen to us is about 30,000 per day, or about a million per month. With few exceptions, these events are not miracles because they are insignificant. The chance of a miracle is about one per million events. Therefore we should expect about one miracle to happen, on the average, every month.
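Dyson's arithmetic is easy to check. A minimal sketch, using his own assumptions (eight waking hours, one event per second, a one-in-a-million bar for "miracle"):

```python
# Littlewood's law: back-of-the-envelope check of Dyson's arithmetic.
SECONDS_PER_HOUR = 3600
AWAKE_HOURS_PER_DAY = 8        # Dyson's assumption
EVENTS_PER_SECOND = 1          # Dyson's assumption
MIRACLE_ODDS = 1 / 1_000_000   # one event in a million qualifies as a miracle
DAYS_PER_MONTH = 30

events_per_day = EVENTS_PER_SECOND * SECONDS_PER_HOUR * AWAKE_HOURS_PER_DAY
events_per_month = events_per_day * DAYS_PER_MONTH

print(events_per_day)    # 28800 -- "about 30,000 per day"
print(events_per_month)  # 864000 -- "about a million per month"
print(events_per_month * MIRACLE_ODDS)  # ~0.86 expected miracles per month
```

The numbers land slightly under Dyson's rounded figures, which is the point: even with conservative inputs, roughly one "miracle" per person per month falls out of plain multiplication.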
Littlewood did most of his work in the 20th century. He would, I think, double down on the law today because social media has opened the door into other people’s lives and given tail events a spotlight like never before.
Daniel Kahneman has a related take: “Human beings cannot comprehend very large or very small numbers. It would be useful for us to acknowledge that fact.”
2. Gibson’s law: “For every PhD there is an equal and opposite PhD.” In law and public policy, the observation that equally qualified expert witnesses can come to opposite conclusions.
There is no field this doesn’t apply to, and it happens for three reasons.
One is that there’s nuance and context to almost everything involving people, so experts can seem like they’re coming to different conclusions when discussing a variation of the same topic. Harry Truman said he just wanted a one-handed economist – “Every time you come in here you say, ‘On the one hand this, on the other hand that.’” But that’s how most things work. Gibson’s law is triggered when an expert – often in an innocent attempt to simplify for a lay audience – tells one side of a story that has many sides, offsets, and counterbalances.
A second is that training and data can be overwhelmed by ideological beliefs and life experiences. This is especially true in fields that study people. There are no conservative meteorologists or liberal geologists, but we happily accept the equivalent in economics and sociology.
A third is that incentives are the most powerful force in the world. They not only get people to say things that aren’t true, but to actually believe those things if it’s in their career interest to do so.
3. Brandolini’s law: “The amount of energy needed to refute bullsh*t is an order of magnitude bigger than to produce it.”
Coined by Italian software developer Alberto Brandolini, who also refers to it as the Bullsh*t Asymmetry Principle.
Prevalent in every field, the non-satirical version acknowledges four truths:
People don’t like to admit not understanding something, so when confronted with nonsense they are more likely to nod their heads than say “I don’t get it” – especially in a group setting.
In law, the reason the burden of proof lies with the prosecution is that it’s often impossible to prove something didn’t happen. Outside of the courtroom the opposite rule prevails, and the commentator is allowed to give an opinion but the critic must debunk him with evidence.
There is a thriving market for bad commentary because it gives readers intellectual cover for their own biases, prejudices, and incentives. When many people want bad commentary to be right it becomes harder to convince them that it’s wrong.
The barriers to entry to publishing an opinion have dropped precipitously in the last two decades.
4. Goodhart’s law: When a measure becomes a target, it stops being a good measure.
Charles Goodhart is an economist who recognized that once a central bank sets a specific monetary target, the historical relationship between that target and the outcome it was meant to produce breaks down:
[T]hose subject to new policies and regulations will react in different, and often unexpected ways, [and] also takes cognisance of the fact that, having set a new policy target, the authority involved has some reputational credibility attached to successfully meeting that target, and thus may adjust its own behaviour and procedures to that end.
One reason this happens in other fields: once a goal is set, people will optimize for that goal in a way that neglects equally important parts of a system. Task your company with hitting a big sales target and customer service may wither as the goal cannibalizes employees’ attention. Or they’ll game the system to meet a goal in a way that distorts the benefit of achieving that goal. Investors set quarterly earnings goals for a CEO to meet, with a huge incentive if they’re exceeded. Then stuff like this happens:
[General Electric] for two years in a row “sold” locomotives to unnamed financial partners instead of end users in transactions that left most of the risks of ownership with GE.
The sales in 2003 and 2004 padded revenue by $381 million … critical to meeting GE’s end-of-year numbers.
This is a cousin of observer effects in physics: It’s hard to know how some things operate in the real world because the act of measuring them changes them.
5. Dollo’s law: In evolution, organisms can’t re-evolve to a former state, because the path that led to that state was so complicated that the odds of retracing it exactly round to zero.
Say an animal has a tail, and then it evolves to lose its tail. The odds that it will ever evolve to regain a tail are nil, because the path that originally gave it a tail was so complex.
This affects businesses, too. There are things that, once lost, will likely never be regained, because the chain of events that created them in the first place can’t easily be replicated.
Brand is one. Good brands are hard to build, requiring the right product at the right time targeted to the right users who want a specific thing, produced the right way by the right people, all done with consistency. Once lost, a brand is very hard to regain, because the odds of building it in the first place were so low.
Teams can be another. Success is often attributed to one person, discounting how important the rest of the team was to winning. Many star employees have joined another firm, or gone out on their own, only to realize how much of their prior success came from the unique team they were on, not individual skill they could replicate elsewhere.
6. Parkinson’s Law: Work expands to fill the time available for its completion.
In 1955 historian Cyril Parkinson wrote in The Economist:
It is a commonplace observation that work expands so as to fill the time available for its completion. Thus, an elderly lady of leisure can spend the entire day in writing and despatching a postcard to her niece at Bognor Regis. An hour will be spent in finding the postcard, another in hunting for spectacles, half-an-hour in a search for the address, an hour and a quarter in composition, and twenty minutes in deciding whether or not to take an umbrella when going to the pillar-box in the next street. The total effort which would occupy a busy man for three minutes all told may in this fashion leave another person prostrate after a day of doubt, anxiety and toil.
His point was that resources can exceed needs without people noticing. The number of employees in an organization is not necessarily related to the amount of work that needs to be done in that organization. Workers will find something to do – or the appearance of doing something – regardless of what needs to be done.
Several corollaries exist. One is that expenses expand to fill an income. Same for expectations and success. In IT, data can expand to fill a given level of storage. My phone used to hold a few hundred photos; now it holds many thousands. I’ve taken advantage of that storage increase by filling it with many thousands of stupid photos I’ll never care about.
7. Wiio’s laws: “Communication usually fails, except by accident.”
Osmo Wiio, a Finnish journalist and member of parliament, coined several laws of communication, including:
“If a message can be understood in different ways, it will be understood in just that way which does the most harm.”
“The more communication there is, the more difficult it is for communication to succeed.”
“In mass communication, the important thing is not how things are but how they seem to be.”
Wiio made these laws in the era of carefully hand-written letters. Multiply them by 10 in the emoji and social media intern era.
I could elaborate further but no one would understand.
8. Sayre’s law: In a dispute, emotions are inversely related to what’s at stake.
In 1973 the Wall Street Journal wrote:
Academics love to lay down laws. One of the more famous is attributed to the late Wallace Sayre of Columbia University. Sayre’s Third Law of Politics – no one seems to know the first two, or whether there even were a first two – holds that “academic politics is the most vicious and bitter form of politics, because the stakes are so low.”
As far as I can tell no one quotes Sayre saying the line himself. But like many smart sayings it found a deceased owner and never let go.
The logic might go something like this:
When the stakes are actually high people within a culture have a pretty good track record of putting more of their differences aside for a common cause. You bicker when there’s little downside to doing so.
The part of your brain that deals with threats doesn’t like to sit still. There’s a baseline level of stress people need in their lives to keep their minds alert, and if they don’t get it from legitimate sources they’ll find something meaningless to fret about. Many of you know a trust-funder who validates this theory.
9. Stigler’s law: No scientific discovery is named after its original discoverer.
University of Chicago statistician Stephen Stigler coined the law. For consistency he says he stole it from sociologist Robert Merton.
Stigler writes in his book Statistics on the Table:
Examples affirming this principle must be known to every scientist with even a passing interest in the history of his subject; in fact, I suspect that most historians of science, both amateur and professional, have had their interest fueled early in their studies by the discovery (usually accompanied by an undisguised chortle) that some famous named result was known (and better understood) by a worker a generation before the result’s namesake.
I think this happens for two reasons.
One is that few discoveries happen in isolation. Most are combinations of existing discoveries that solve a new problem with an old invention. In his book How We Got to Now, Steven Johnson writes:
Innovations usually begin life with an attempt to solve a specific problem, but once they get into circulation, they end up triggering other changes that would have been extremely difficult to predict … An innovation, or cluster of innovations, in one field ends up triggering changes that seem to belong to a different domain altogether.
Combining other people’s work into something you get credit for happens within companies, too. As Bill Gates put it: “Steve [Jobs] and I will always get more credit than we deserve, because otherwise the story’s too complicated.”
The other – and more applicable to Stigler’s law – is the long history of the crowned winner being the person who communicates an idea best, not the person whose idea is best. A pop psychology book will always sell better than deep academic research with original discoveries because people are busy and lazy and want to learn about a topic with the least amount of effort required. My impression is also that 90% of “viral” content that gets recognized is luck, the product of just the right promotion by just the right person at just the right time.
10. Mill Mistakes: Assuming the familiar is the optimal.
James Mill was a 19th century Scottish economist who reasoned that a constitutional monarchy is the highest natural form of government. He had his logic, and arguing whether it’s right isn’t the point. In his book At Home in the Universe, Stuart Kauffman makes a good observation:
James Mill once deduced from what he considered indubitable first principles that a constitutional monarchy remarkably like that of England in his day was obviously the highest form of government. One is always in danger of deducing the optimality of the familiar. Let’s call this a Mill-mistake. God knows we all suffer the danger.
Assuming the familiar is the optimal requires extra skepticism because what you’re familiar with will create the most coherent story in your head, giving it extra credit points over other ideas that might hold more water but are harder to contextualize. Daniel Kahneman writes:
Neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is.
11. Hickam’s dictum: Problems in complex systems rarely have one cause.
Occam’s razor in medicine guides doctors to a diagnostic rule of thumb along the lines of, “If there are several explanations for a patient’s symptoms, choose the one that makes the fewest assumptions.” It’s known as diagnostic parsimony.
Doctor John Hickam once pointed out the limitations of this rule: “Patients can have as many diseases as they damn well please.”
His observation was that a patient is statistically more likely to have a few common ailments than a single rare one, so the push to get to one grand underlying cause can lead to false precision at best, misdiagnosis at worst.
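A toy probability comparison makes the point. The prevalence numbers below are hypothetical, chosen only for illustration, and the two common ailments are assumed to be independent:

```python
# Hickam's dictum, illustrated with made-up prevalence numbers:
# two common conditions versus one rare condition explaining the same symptoms.
p_common = 0.05    # hypothetical: each common ailment affects 5% of patients
p_rare = 0.0001    # hypothetical: the rare disease affects 1 in 10,000

# Probability a patient has both common ailments, assuming independence.
p_two_common = p_common * p_common

# Roughly 0.0025 vs 0.0001: the two-common-ailment explanation is
# about 25x more probable than the single elegant rare diagnosis.
print(round(p_two_common, 6), p_rare, round(p_two_common / p_rare))
```

Even two fairly ordinary conditions occurring together can be far more likely than one rare disease, which is why forcing every symptom under a single grand cause can mislead.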
The human body has 11 systems, 79 organs, 206 bones, and 600 muscles. The global economy has 7 billion people and 200 million businesses.
So, you do the math.