I’ve been thinking about the diffusion of responsibility in the modern world, and I’ve come to the conclusion that it has a new and disturbing dimension, one I haven’t seen discussed before. I want to work that out here. First, some history.
After World War 2, a lot of people were asking this question: “how could ordinary German citizens participate in mind-boggling atrocities like the Holocaust?” It’s one thing to say that Hitler, Himmler, Höss, and the other top bosses were all murderous psychopaths, but it strains credulity to insist that everyone who put on a Nazi uniform was. That’s effectively saying that during the Weimar Republic all the Germans just decided to turn evil. Maybe something else was going on.
The horizontal diffusion of accountability
Enter Stanley Milgram. I suspect that most people reading this know about his experiments, but bear with this short account for those who don’t.1 Milgram was a psychologist interested in testing to what extent the average Joe would obey nominal authority. At the time he was on the Yale faculty, so he posted notices around New Haven, offering $4.00 (plus $.50 carfare) for volunteers to come to his lab and participate in a psychological study. Naive subjects showed up and were told that Milgram was testing the effects of punishment on learning, whether people learn better from an older or a younger person, and things like that.
Obviously that was all a ruse. Never tell your subjects what you are really looking for—that will affect their behavior and skew the results. There would always be two people in the lab to serve as test subjects. They would draw their roles out of a hat, having been told that one of them would be the teacher and the other the learner. In fact, both slips of paper said “teacher.” The supposed learner was a confederate of the experimenter.
The teacher was shown a scary-looking shock generator and told that the learner had to memorize word-pairs the teacher would give him, or be shocked for his mistakes. But the real experiment was about the teacher: how far would they go?
The learner (remember, a stooge who was never actually shocked at all) was strapped to a chair with electrodes attached. Things would always start off well, with the learner getting the word-pairs right. Boy/girl, grass/hat, clown/face, etc. Then he would start to systematically make mistakes. With every error the teacher would have to flip a switch to shock the learner into compliance. The shock would start off small: just 15 volts. But with each new mistake the teacher would increase the voltage in 15-volt increments, moving up into the red zone: intense shock (255v), extreme intensity shock (315v), DANGER: severe shock (375v), until finally, ominously, X X X (450v).
As the voltage went up, the learner grunted, then complained, then begged to be released from the experiment, then screamed. Finally, at the highest levels, the learner was dead silent. Milgram wrote that no scholar he consulted in advance of the experiments expected the teachers to go very far on the shock generator. But they were the ones who got shocked: many teachers went all the way to the final switch on the machine.2
They didn’t do so happily. Often the subjects said they wanted to stop, or expressed concern for the learner. “I don’t want to be responsible for harming that man in there!” they’d say after shocking some innocent stranger they just met. “Don’t worry,” the experimenter, a skinny nerdy guy in a white lab coat with no weapons or real authority, would tell them, “The responsibility’s mine. The experiment requires that you continue.” And so, often, they did.
After the experiment ended and the teacher and learner had a happy reconciliation, the subjects were relieved and grateful everyone was all right. Still, in the moment, their moral scruples tended to have little effect on their actions.
Milgram wrote,
Even Eichmann was sickened when he toured the concentration camps, but to participate in mass murder all he had to do was sit at his desk and shuffle papers. At the same time, the man in the camp who actually dropped Zyklon-B into the gas chambers was able to justify his behavior on the grounds that he was only following orders from above. Thus there is a fragmentation of the total human act: no one man decides to carry out the evil act and is confronted with its consequences. The person who assumes full responsibility for the act has evaporated.3
It’s a mistake to think the horizontal diffusion of responsibility, spread across many people so there’s no one villainous actor, is somehow unique to the Nazis. That’s actually the opposite of the true lesson. As Milgram noted, “the studies are principally concerned with the ordinary and routine destruction carried out by everyday people following orders.”4
The vertical diffusion of accountability
Ever go into a store, have a terrible experience and then complain to an employee about it? Nine times out of ten the response is “that’s not my fault, I didn’t make things that way.” They’re right of course, but who do you complain to?
For example, I hate self-checkout. Items that refuse to scan, or don’t have proper barcodes, or aren’t in the system’s databases. The endless “please put your item in the bagging area.” The constant retort “please remove item from the bagging area.” The system’s total meltdown if you want to use your own bags. The need to wait for an actual human to come over and fix things because three leeks is an item of deep mystery to the scanner. “Please wait, system processing.” I really want to go Office Space on those machines.
But there’s no one to complain to. Are you going to bitch to the two or three remaining cashiers? The stockers? The store manager will just tell you it was corporate’s decision, and corporate will say it was a committee decision. Even if the proper committee could be identified, accountability is horizontally spread around and diluted to such an extent that everyone on it can rightly say “that’s not my fault.” Customer experience gets worse and no one’s there to blame.
I want to identify another problem: the vertical diffusion of responsibility. Here’s a personal example. I work at a regional public university in the US that offers bachelor’s and master’s degrees. We are part of a state system of higher education along with about a dozen sister schools spread around the state. When the regional universities were founded in horse-and-buggy days, people really did need a small hometown college every 100 miles. Nowadays students can jump on the highway or live in a dorm, and that need no longer exists.
Now tiny schools in the middle of nowhere are insolvent—the state won’t fund them properly and enrollments are cratering, two factors that exacerbate each other in a death spiral to extinction. So our last system chancellor decided that Bold Action is Needed. On the basis of no studies, no estimate of return-on-investment, no nothing, he decided to merge three universities into one.
The three in question are about 90 miles from each other. One has failed, one is failing, and the strongest one (a struggling swimmer trying to save two drowning swimmers) is mine. Approximately 100% of my colleagues opposed this idiotic merger and knew it would lead to catastrophe. Now catastrophe—enrollments down, faculty jumping like rats from a sinking ship, no new hires, majors eliminated, physical plant degrading, subterranean morale—is upon us.
So where’s the chancellor who ordered this mass suicide? Hahaha! C’mon, we all know the answer. He’s gone to an even bigger and cushier sinecure! That’s the way it works. The big boss put “took Bold and Decisive Action in troubled times with anticipated glorious results” on his CV and used it to get another job before the wheels came off entirely.
Here’s the best part. If anyone tries to call him out on it now (“aren’t you the guy who destroyed those universities with that moronic merger?”), he can correctly and honestly say, “no, when I left, things were moving in the right direction. It must be the fault of the new person.” The consequences of what his past self did cannot be wholly ascribed to his present self. That’s the vertical diffusion of accountability: Milgram’s “total human act” is spread out over time like butter over bread, and any time slice of an actor is responsible for only a small portion of it.
We see this all the time. A geriatric narcissist takes control of government and immediately destroys democratic norms and procedures, all for self-aggrandizement and self-enrichment. But he will be exempt from accountability because by the time it rolls around, old age will have claimed its prize. A private-equity firm buys a company, dismantles it, and sells it for parts. But by the time the human cost is manifest, the actors have changed, the firm has moved on, and the current harm can only be assigned to some past person who is no longer there. When a malevolent actor can distance himself from the negative consequences of his actions faster than those consequences appear, accountability becomes impossible. The greater the temporal distance, the less blame can be effectively assigned.
Milgram claimed that the horizontal diffusion of responsibility “is the most common characteristic of socially organized evil in modern society.” That’s only part of the story. The other part is that accountability becomes diluted across time rather than just across multiple people. The decision-consequence gap, the often substantial delay between when a decision is made and when its negative consequences materialize, gives decision-makers a temporal escape: they evade accountability by moving on before the bill comes due. This vertical diffusion of responsibility is also a common characteristic of socially organized evil in modern society.
1. Milgram’s work has held up well through the replication crisis. A more in-depth discussion by a psychologist is here.
2. Milgram tested numerous different protocols. In some the learner was in a completely different room from the teacher but could be heard through the wall, in others they were in the same room, in others the teacher had to press the learner’s hand onto a shock plate, in some the experimenter was not even present, etc. While this did lead to somewhat varying results, the overall conclusions about obedience held across the board.
3. Stanley Milgram, Obedience to Authority (New York: Harper and Row, 1974), p. 11.
4. Ibid., p. 178.