by Mark Gutman, July 18, 2014

Adolf Eichmann was a key Nazi in charge of the logistics of Hitler’s “final solution.” He arranged for the rounding up and deporting of hundreds of thousands of Jews and other out-of-favor groups to concentration camps to be murdered. Hitler and other Nazi leaders committed suicide rather than allow themselves to be captured, but Eichmann escaped, eventually settling in Argentina. There he lived as Ricardo Klement until 1960, when Israeli agents pulled off a daring capture-and-transport plan, and hauled him to Jerusalem to face trial. 
In what became known as the Eichmann Defense, Eichmann claimed that he had no choice about his wartime actions; he was “just following orders.” His defense was unsuccessful, and he was hanged, but his trial refocused attention on how the Holocaust could ever have happened. Surely the blame could not be limited to a few leaders who planned everything. Soldiers did the killing, but civilians’ willingness to turn a blind eye was crucial to pulling off such a genocide. The German people were looked down on as moral inferiors because they had supported such a killing machine. In the United States, we wouldn’t have let such a plot develop so far. Oh?
Stanley Milgram, a Yale psychologist, researched how people do at “just following orders.”1 In one particular study, he arranged for a “teacher” and a “learner” to be paired with an “experimenter.” (The experiment, which was conducted in various cities, involved hundreds of these groupings of three.) The teacher (who was the actual, and unwitting, subject of the test) was to ask questions of the learner and flip a switch to administer a shock to the learner for each incorrect answer. The switches showed the voltage that would be delivered, ranging from 15 volts to 450 volts. The only actual electrical shocks in the experiment were the mild ones the teachers themselves felt before the questioning began, to convince them of how “real” the shocks to the learners would be. The switches were dummies, and the learners were confederates playing a role.
Before Milgram began conducting his experiment, experts predicted that only one to three percent of the teachers would administer the maximum 450 volts, and that those people would be pathological or psychopathic. In the experiment, though, 65% of the test subjects (i.e., the teachers) went clear to 450 volts. A learner (an actor) would yelp in pain, complain of a heart condition, or beg to quit the experiment. The teacher would look at the experimenter standing close by, directing the experiment, and express his or her discomfort. The experimenter would simply say, “Please continue,” and then issue progressively colder prods, ending (on a teacher’s fourth protest) with “You have no other choice; you must go on.”
Back to Eichmann. Could a Hitler-type program be carried out again, using civilized people? Milgram’s experiment aside, we have plenty of evidence that we tend to submit to, or at least accept, pronouncements by official-looking, uniformed people. Experiments by Leonard Bickman, Rank and Jacobson, and others have shown that we are more likely to take orders from someone in uniform than from someone in street clothes.2
Never mind experiments. In July 2011, Anders Behring Breivik used a Glock pistol and a Ruger rifle to kill 69 people, mostly teenagers, at a summer camp on an island in Norway. He used a uniform to draw a crowd. “The youth on the island gathered easily around the man in a police uniform wearing two guns.” “We thought it was great how quickly the police had come to reassure us of our safety because we had heard of the bombing in Oslo” (which Breivik had carried out two hours earlier). “We have an instinct to obey authority, especially when we have little time to think through our choices.”3
“We have an instinct to obey authority.” Whether the “authority” is a (real or phony) police officer or doctor or pastor, we have an instinct to rely on presumed authority to tell us what’s what. We have to trust that some of the structures set up by society are for our good. We generally learn to respect parents, teachers, and bosses, trusting that they see a bigger picture than we do, and that society feels more secure if we and everyone else obey authority.
We often find it easy to let authorities and others tell us what to do and what to think. Much of what we believe, we believe because someone else told us. We don’t have time to learn everything firsthand. We also don’t have time to think through everything we learn secondhand. But we can become too dependent on others’ thinking. Many a harmful financial, medical, or religious decision has been made simply by accepting what someone else asserted or demanded or pleaded.
Variations on Milgram’s experiment show that subjects are more likely to resist orders to carry out unethical actions if they have time to think and can consult with others. So, given time to think and discuss, you’re not quite as likely to turn into an agent of genocide as Milgram suggested.4 But if you have time to think and discuss, do you use it well? “Don’t go along with the crowd in doing evil. . . .”5 can be very difficult advice to follow. “Just say no” is not easy when you’re dealing with certain friends or co-workers. Many who cooperated with the Nazi program had plenty of time to think and discuss with others.
If two people agree on everything, one of them is unnecessary or has stopped thinking. Disagreement is unavoidable when two thinking people work together. The discussion of differences can be healthy, helping take the rough edges off proponents on both sides, and perhaps even leading some to change to a healthier or more realistic belief. When we have time to think, to reason, we need to use that gift. Allowing others to do all our thinking leaves us open to deception, or even to Nazi-style cooperation. Con men (and women) depend on people’s willingness to trust with insufficient evidence.6
Long ago, Ellen White counseled, “It is the work of true education to . . . train the youth to be thinkers, and not mere reflectors of other men’s thought. Instead of confining their study to that which men have said or written, let students be directed to the sources of truth, to the vast fields opened for research in nature and revelation.”7 So: “No, governor, I cannot obey that law you just signed. I don’t believe it is ethical or moral.” “Pardon me, pastor, if I respectfully disagree with you.” “Pardon me, Mom and Dad, if I don’t see it your way. I don’t blame you for seeing things the way you do, but I’m not a mere reflector of your thoughts.”
Even the apostle Paul could rightfully be challenged. The writer of Acts compliments the Bereans for investigating what Paul taught.8 Can you imagine Paul scolding those Bereans? “Why don’t you just believe what I tell you? After all, I’m an apostle. You don’t need to investigate; I’m a trustworthy source.” Paul didn’t say that, and we shouldn’t be persuaded by that type of talk.
There is a place for following orders. We’d have chaos in our society if nobody obeyed police or elected officials. But we run serious risks when we surrender our thinking processes to someone else, whether in the government, at the office, or in the church. The scenario of Revelation 13, in which good people cannot buy or sell, will not be aided by people who have learned to refuse to go along with authority figures when conscience does not allow it. Healthy thinkers will not excuse their own actions with the claim that they are or were “just following orders.”

1The experiment, which Milgram wrote about in Obedience to Authority: An Experimental View, has been conducted all over the world, with all kinds of variations. This column gives only a skeleton description. The experiment also drew criticism, most notably in Gina Perry’s Behind the Shock Machine. Perry found that the experiment wasn’t as carefully conducted as Milgram described, and she presents various other problems with the experiment and its conclusions. Regardless of Perry’s recent findings, Milgram’s experiment reignited the issue of “just following orders,” which had been used by Nazi defendants in 1945 and 1946 but had largely been forgotten since then.
2Bickman, Leonard. 1974. “The Social Power of a Uniform.” Journal of Applied Social Psychology 4 (1): 47-61. Abstract available online (full article can be purchased).
Rank, Stephen G., and Cardell K. Jacobson. 1977. “Hospital Nurses’ Compliance with Medication Overdose Orders: A Failure to Replicate.” Journal of Health and Social Behavior 18:188-93. 

4Hofling, Charles K., Evelyne Brotzman, Sarah Dalrymple, Nancy Graves, and Chester M. Pierce. 1966. “An Experimental Study of Nurse-Physician Relations.” Journal of Nervous and Mental Disease 143, 171-80.
Rank, Stephen G., and Cardell K. Jacobson. 1977. “Hospital Nurses’ Compliance with Medication Overdose Orders: A Failure to Replicate.” Journal of Health and Social Behavior 18:188-93.
Miller, Arthur G. 1986. The Obedience Experiments: A Case Study of Controversy in Social Science. Westport, Connecticut: Praeger.
5Exodus 23:2, The Message.
6Note: Breivik depended partly on that, although the campers can’t be blamed for being fooled. Even trusting for legitimate reasons sometimes leads to disaster. We don’t live in a perfect world.
7Ellen White, Education, p. 17.
8Acts 17:11