The Power of Two: How Shared Dissent Can Make All the Difference
First published in 2011, this article feels especially urgent in 2018. Erin is now a junior in college.
A few days ago, Erin, my eighth-grader, made me proud. That alone is not news. But in this case she showed courage in someone else’s defense, and when that happens, my shirt buttons grab their crash helmets and wince.
“Guess what happened today,” she said.
I gave up.
“I was at the table in the cafeteria with these three other kids, and two of them asked the other girl where she went to church. She said ‘We don’t go to church,’ and their eyes got big, and the one guy leaned forward and said, ‘But you believe in God, right?'”
Oh here we go. I shifted in my seat.
“So the girl says, ‘Not really, no.’ And their eyes got all big, and they said, ‘Well what DO you believe in then??’ And she said, ‘I believe in the universe.’ And they said, ‘So you’re like an atheist?’ And she said ‘Yes, I guess I am.'”
I looked around for popcorn and a five-dollar Coke. Nothing. “Then what??”
“Then they turned to meee…and they said, ‘What about YOU? What do YOU believe?'” Another pause. “And I said, ‘Well…I’m an atheist too. An atheist and a humanist.'”
She’s 13, old enough to try on labels, as long as she keeps thinking. She knows that. And she’s recently decided that her current thoughts add up to an atheist and a humanist.
“And I looked at the other girl, and…like this wave of total relief comes over her face.”
Oh my word. What a thing that is.
“Erin that’s so great,” I said. “Imagine how she would have felt if you weren’t there!!”
“Yeah, I know!!”
I’ll tell you who else knows — Solomon Asch.
The Asch experiment is one of the great studies in conformity. Asch asked subjects to say which of three lines matched a reference line — a task so easy that people tested alone, without group consensus pressures, got it wrong less than 1 percent of the time. But when a subject sat in a room full of confederates who unanimously gave the wrong answer, the pressure to conform was enormous: subjects went along with the group about a third of the time, and three in four conformed at least once. The lesson of Solomon Asch is that most people, at least some of the time, will defy the clear evidence of their own senses or reason to follow the herd.
One variation in the design of the study provides a profound lesson about dissent. This is the one that Erin’s situation reminded me of. And it’s a crucial bit of knowledge for any parent wishing to raise an independent thinker and courageous dissenter.
In this version, all but one of the researcher’s confederates would give the wrong answer. The presence of just one other person who saw the evidence the same way the subject did cut subjects’ error rates by 75 percent. The implication: if a group is embarking on a bad course of action, a lone dissenter may turn it around by energizing ambivalent group members to join the dissent instead of following the crowd into disaster. Just one other person resisting the norm can help others with a minority opinion find their voices.
This plays out on stages even larger than the school cafeteria. On April 17, 1961, the US government sent 1,500 Cuban exiles to invade Cuba at the Bay of Pigs. The idea was to give the US plausible deniability—barely plausible, but still. It was supposed to look like the exiles did it on their own.
Well, it did end up looking like that. The invasion was a mess of lousy planning and execution. Most of the 1,500 were killed or captured by a force of 20,000 Cuban soldiers, and the US government was forced to essentially pay a ransom of 53 million dollars for the release of the prisoners. And that’s in Mad Men dollars—it would be $510 million today. Cuba’s ties with the Soviet Union were strengthened, and the stage was set for the Cuban Missile Crisis eighteen months later.
In short, it was a complete disaster. And in retrospect, that should have been obvious to those who planned it. But among President Kennedy’s senior advisers, the vote to go ahead had been unanimous. Why? It came out later that several of them had serious doubts beforehand but were unwilling to express those doubts since they thought everybody else was on board. It was the height of the Cold War, and nobody wanted to look “soft.” The climate of the discussions made real dissent too difficult to articulate, so a really bad idea went unchallenged.
The presidential historian Arthur Schlesinger was there for most of the discussions, and he later said that he was convinced that even one dissenter could have caused Kennedy to call off the invasion. ONE. He said he wished most of all that he had found the strength to be that dissenter.
At least Kennedy learned his lesson. During the Missile Crisis the following year, he made a point of fostering dissent and encouraging the collision of ideas among his advisers. The resulting policy led to the peaceful conclusion of what may have been the most dangerous crisis in human history (so far).
Many think that times of crisis and war are the worst possible times for argument and dissent. Hitler certainly thought so. He often said the mess of conflicting opinion in democracies would cause the Western powers to crumble before the single-minded focus of his military machine. He got the difference right but misdirected the praise. Military historians are pretty much agreed that the stifling of dissent in the Third Reich’s military decision-making was its fatal flaw. It was entirely top-down. Only if Hitler’s plans were flawless could that system be stronger than one in which ideas contend for supremacy.
So Montgomery and Patton’s pissing contests, MacArthur and Truman’s showdowns, and the constant whirl of debate among the Allies, even among the branches of the American armed services, were a better approach to running a war than the single-minded dictates of dictators, from Napoleon to Hitler to Saddam Hussein. Crush dissent and you will most often end up shooting yourself in the foot. United We Stand is bad policy, even in wartime.
Dissent is often discouraged in the corporate world as well. Jeffrey Sonnenfeld’s research found that corporate boards that punish dissent and stress unity among their members are the most likely to wind up in bad business patterns. It’s corporations with highly contentious boards that tend to be successful. Not always—it depends on the nature of the contention—but when boards generate a wide range of viewpoints and tough questions are asked about the prevailing orthodoxy, they tend to make better decisions in the end. All ideas have to withstand a crossfire of challenge so the bad has a chance of being recognized and avoided.
A list of corporations with boards that valued conformity and punished dissent reads like a Who’s Who of corporate malfeasance: Tyco, WorldCom, Enron.
There’s something so counter-intuitive about all this. It seems on the face of it that uniting behind an idea or position or plan is the best way to ensure success. And it can be, if the idea or position or plan is good in the first place. And the best way to ensure that it is good is by fostering dissent from the beginning.
And “from the beginning” really means long before the meeting even begins — while the decision makers are still in the eighth grade cafeteria, learning to accept the presence of difference in their midst.
Had the other girl in my daughter’s story not mustered the courage to self-identify first as a person with a different perspective — in this case an atheist — Erin would have been statistically less likely to share her own non-majority view. Once the girl spoke up, Erin’s ability to join the dissent went up about 75 percent. And once Erin shared the same view, the other girl enjoyed a wave of retroactive relief at not being alone.
The other two kids also won a parting gift. They learned that the assumed default doesn’t always hold, and that the world still spins despite the presence of difference. They’re also likely to be less afraid and less astonished the next time they learn that someone doesn’t believe as they do, which can translate into greater tolerance of all kinds of difference.
Age Stories: How My Kids Met Me One Year at a Time
“Twenty-eight!”
“Hmm, okay, 28. Ooh, that’s a good one.”
Despite living with him for 13 years, I knew very little about my dad. He worked three jobs and traveled a lot. When he was in town, he came home exhausted from a hundred-mile round-trip commute. I didn’t even know he was a nonbeliever until long after his death at 45.
My mom spoke very little of him, consumed as she was with the lonely and impossible task of working full-time while raising three kids by herself two time zones away from any other family.
I’ve wondered how much my kids would remember of me if I died today. The situation is different — I’m more involved in their lives than my dad was able to be in mine, for several reasons — but I wanted a way of sharing my life with my kids that was natural and unforced.
At some point, without even meaning to, I found a way, starting a tradition in our family called “age stories.” Simple premise: At bedtime, in addition to books or songs, the kids could pick an age (“Twenty-eight!”) and I would tell them about something that happened to me at that age. For a long time it was one of their favorite bedtime options.
Through age stories, they now know about my life at age 4 (broken arm from walking on a row of metal trash cans), age 9 (stole a pack of Rolos from Target and felt so bad I fed them to my dog, nearly killing her), age 21 (broke up with my first girlfriend and got dumped by the second one), 23 (my crushing uncertainty on graduating college), 25 (the strange and cool job in LA that allowed me to meet Nixon, Reagan, Bush Sr., Jimmy Stewart, Elton John, and a hundred other celebs), 26 (when I pursued and stole their mother’s affections from the Air Force pilot she was allllmost engaged to), what happened on the days they were born, and everything — eventually just about everything — in-between.
They know how I tricked a friend into quitting pot (for a night anyway, at 15), the surreal week that followed my dad’s death (13), how I nearly cut off two fingers by reaching under a running lawnmower (17, shut up), my battles with the administration of the Catholic college where I taught (40), the time I was nearly hit by a train in Germany (38) and nearly blown off a cliff in a windstorm in Scotland (42).
Age stories can also open up important issues in an unforced way. Delaney happened to ask for 11 — my age when my parents moved us from St. Louis to LA — right before we moved her from Minneapolis to Atlanta. It was a very difficult time for her. I described my own tears and rage at 11, and the fact that I held on to my bedpost the day of the move — and how well it turned out in the end. I wasn’t surprised when she said “11” again and again during that hard transition in her own life.
We’ve talked about love, lust, death, fear, joy, lying, courage, cowardice, mistakes, triumphs, uncertainty, embarrassment, and the personal search for meaning in ways that no lecture could ever manage. They’ve come to know their dad not just as the aging monkey he is now, but as a little boy, a teenager, a twenty-something, stumbling up the very path they’re on now.
And they keep coming back for more.
Give it a try. Make it dramatic. Include lots of details and dialogue. Have fun. Then come back here and tell us how it went.
https://www.youtube.com/watch?v=tR-qQcNT_fY
Q&A: Lean on me
(Here’s the first in my new occasional Q&A series. Click Ask a Question in the sidebar to submit your own question.)
Q: I saw a note on Pinterest recently that really grabbed me, and I’ve not been able to shake it. It was a list of suggestions for parents. One of the entries was “give your children something to believe in – because there will come a time when they are alone and scared or sad, and they’re going to need something to believe in.”
My husband and I are, at the least, agnostic….But I do want to know that if something really shakes the lives of my children, they will have some way of comforting themselves, some way of (eventually) coming to know that everything will be all right. How is this accomplished?
A: How I love this question. It cuts right to the core of the ultimate reprieve that religion offers from fear and vulnerability. Life may be incredibly hard and unfair at times, but believing that Someone Somewhere who is all-powerful and all-good has a handle on things and will see to it that justice prevails in the end… I can easily see how that idea can make life bearable, especially for those who are in much closer touch with the raw human condition than I am.
It brings to mind the Russell quote I’ve written about before: “Ever since puberty I have believed in the value of two things: kindness and clear thinking….When I felt triumphant I believed most in clear thinking, and in the opposite mood I believed most in kindness.” And there’s the key to the question. If I can’t offer them the kindness of God to lean on, what can I give my kids to help them through the inevitable times they will feel the opposite of triumphant?
You may have heard the Christian acronym J-O-Y, which stands for “Jesus, then Others, then Yourself” — the supposed formula for true happiness. Take away Jesus and you have the real-world resources I hope to build in my kids: the support of other people, and a strong self-concept.
Kids need to develop the ability to connect emotionally and meaningfully with others, and that’s a skill that starts at home when they are young. You care for your child and encourage their natural empathy for others. They become the kind of people who attract others to them in mutually supportive relationships.
As they get older, peers overtake family as the leaning posts. It’s no coincidence that teenagers often become obsessively centered on their peer group for identity and support as they are pulled through a period of rapid change, and that they focus more on those who are going through the same transition than on the all-too-familiar family they are transitioning away from.
They’ll also make connections based on interests and passions. In addition to a really tight group of friends, my daughter Erin (15) is passionately involved in photography, volunteering, volleyball, animal rescue, and acting. She’s in specific clubs that connect her to others with the same interests, and if those interests continue, she can continue to be connected to those larger passion communities throughout her life.
Those interests won’t all continue, of course, nor will all of her current friendships. Some will fall away as she grows older and her circumstances change, but she’ll retain the ability to connect. It’s not a static belief she needs, but that ability, that skill. Those mutually supportive connections with other human beings, connections she has built herself, will get her through hard times, as well as the strong self-concept on which those relationships are based.
And, when she’s 21 or 31, if we’ve built the right kinds of connections between us and earned it ourselves, her family will be that ultimate connection she can always lean on. To paraphrase Tim Minchin, we are the people who’ll make her feel safe in this world.
But I’m not headed into White Wine in the Sun here. There’s another song that captures this humanistic idea of people caring for each other better than any other.
R&B legend Bill Withers wrote it after he moved to LA in the late 1960s following a stint in the Navy. He was really alone for the first time in his life, feeling vulnerable, away from the personal connections that had made him feel safe growing up in a small coal mining town in West Virginia. He sat down and wrote one of the great songs of all time about what he was missing. Not a particular belief, not God, but somebody to lean on. And unlike God, that human relationship can be mutual — which to my mind is SO much more satisfying and meaningful.
Being Toto
The Wizard of Oz is a secular humanist parable.
I’m not the first to suggest this possibility. But the eye roll I got from my 17-year-old son when I said it at dinner the other night could have cleared the dishes from the table. He’s currently soldiering through an AP Lit class in which the teacher earnestly insists that no cigar is ever, ever just a cigar. When one of the short stories they read described a red ovarian cyst in a jar, the teacher looked searchingly at the ceiling. “Red,” she said, drawing out the syllable and shaping her next thought with her hands. “Passion.”
“OR,” said my boy in the exasperated retelling, “red — the color of an ovarian cyst!!”
So I knew I was in for it when I claimed that The Wizard of Oz isn’t just a story about a girl and her weird dream.
But it isn’t.
L. Frank Baum (who wrote the book) was a religious skeptic and Ethical Culturist. Yip Harburg (who wrote the film’s lyrics) was an atheist. That doesn’t mean a thing by itself, of course. But it takes very little ceiling-gazing and hand-gesturing to see the Oz story as a direct reflection of a humanistic worldview.
Dorothy and her friends have deep, yearning human needs — for home, knowledge, heart and courage. When they express these needs, they’re told that only the omnipotent Wizard of Oz can fulfill them. They seek an audience with the Wizard, tremble in fear and awe, then are unexpectedly ordered to do battle with Sata… sorry, the Witch, who turns out pretty feeble in the end. (Water, seriously?)
When they return, having confronted their fears, the Wizard dissembles, and Toto pulls back the curtain to reveal a mere human behind all the smoke and holograms — at which point they learn that all the brains, courage, heart, and home they sought from the Wizard had always been right in their own hands.
It’s really not much of a stretch to see the whole thing as a direct debunk of religion and a celebration of humanistic self-reliance. And as a bonus, Connor actually granted me the point.
Scooby meets The Shining
- July 02, 2012
- By Dale McGowan
Back from an EPIC two-week family vacation in California, probably our last big trip as a family unit.
We ended in Yosemite, the most sock-off-knocking place on Earth, staying outside of the park in the tiny Gold Rush town of Coulterville at the Hotel Jeffery. It was an unmissable opportunity. The Jeffery, you see, is haunted. In my enthusiasm for the idea, I even booked Room 22, “the most haunted room in the hotel.”
Right after I booked and paid for it, I ran and told the kids about this fun thing I’d done, thinking they’d jump up and down. What a putz. Connor (16) thought it was cool, but the girls pretty much jumped up and down on my head.
“What were you THINKING?!” Laney asked. “Seriously, Dad, jeez!”
“Well most of the hotels near the park are already booked!” I said defensively. “And this one had a lot of rooms available, and they’re uh…they’re cheap.”
“Gosh I wonder why.”
It shouldn’t have surprised me. My kids have a healthy skepticism, partly because I’ve been pulling their legs continually since birth. (Hey, they were having a hard time out of the canal.) But their well of experience and reading and thinking about the supernatural isn’t much deeper than mine was at their ages, and I would NOT have jumped at this chance if my dad had come up with it. Hell no. I’ve worked it all out since then, so I no longer register more than a distant, limbic twinge at this stuff. Oh yes, still that.
But I’d already handed over my gold nuggets for the rooms, so we were going to be staying at the Jeffery. But to avoid a revolt in the parking lot, I knew I’d have to offer the girls something from my own well.
My biggest breakthrough in thinking about religion was realizing I didn’t have to search for the deity to decide whether I believed; I just had to look at the reasons other people believed and decide whether they were any good. (SPOILER ALERT: Nope.) The same thing works with the paranormal. So before the trip, I showed Erin and Delaney the first two minutes of this video:
“Oh, please!” Laney said when the door opened (1:30). Erin laughed with relief. Now they were dipping into their own wells of experience. Both of them grew up in a 115-year-old Victorian house in Minnesota. Like the Jeffery’s, none of our door frames were quite square, and the slightest change of air pressure would cause a door to drift open, even if you couldn’t feel it. The silliness of somebody else’s evidence helped their concerns melt away. “You’re just people like us in the universe” became one of the catch phrases of our trip.
So I thought we were done. Oy, putz!
The last leg of the trip arrived. We drove straight from LA and pulled up in front of the Jeffery, which has a very cool, fairly authentic, unpolished feeling. Gaudy wallpaper, dim lighting. Wood creaks and paint peels. No check-in counter — you get your keys from the bartender at the period saloon downstairs, which was nicely filled with bikers. And upstairs we went.
The doors of unoccupied rooms are left ajar. Of the 22 rooms in the hotel, 22 were ajar. We were the only guests for the night, and we had one room on each of the two floors — at opposite ends. Becca noticed there were no phones. And we hadn’t had cell reception for twenty miles. This was getting good.
And then it got better. Once the saloon emptied out, even the staff left. Locked the door and left. We were now the only people in the building.
Despite all this, and the sun going down, everybody was still fine fine fine…until Becca opened a little black case we’d been given with the key for Room 22. It was a ghost detection kit, with instruments like a “GaussMaster electromagnetic field meter,” a motion detector, and a laser thermometer.
Delaney had been sitting on the bed, reading the instructions, which she slowly lowered into her lap.
“I don’t want to do this.”
All of her earlier fear was right back on her face. It’s easy to dismiss mediums cooing over a door that opens by itself — literally kid’s stuff. But this looked an awful lot like science.
I wasn’t going to force her to do it, of course. But I also thought we should try to defuse her fears before the lights went out.
I picked up the instructions and read. “Hmm. Um hmm. Looks all official and sciencey.” She nodded. “Well there’s a word for that. It’s called pseudoscience. Guess what ‘pseudo’ means.”
“Fake,” she said.

“Right. Pseudoscience means fake science. Something pretending to be science that isn’t.”
Now this was interesting. From nothing more than that, she suddenly looked visibly relieved. Not completely, but better. Somehow knowing there was a word for the fakery, a whole category, gave skepticism a form of its own, something she could hold on to.
Of course having this long, fancy word didn’t really confer legitimacy any more than the sciencey words in the instructions did, any more than calling something “transubstantiation” makes it less goofy. But in that moment, having a name for “fake science” helped her see that it might be exactly that.
I read the instructions aloud for one of the gizmos. “‘If the reading is between 0.3 and 0.5, you may be in the presence of a spirit.'” We turned on the meter and pointed it at a corner. The needle went up and down from 0.0 to 0.6. “They said that means there’s a ghost there. How do we know that isn’t the normal variation?” She shrugged. “We don’t. And they know we don’t know that, so they make up numbers to freak us out and sell ghost detection kits.”
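The point about normal variation can be made concrete with a toy simulation — all numbers here are invented for illustration, not taken from any real meter. If a needle simply drifts at random between 0.0 and 0.6, a big chunk of its readings will land in the 0.3–0.5 “spirit” band by pure chance:

```python
import random

def fraction_in_spirit_band(n_readings=10_000, lo=0.3, hi=0.5, seed=42):
    """Simulate a noisy meter drifting uniformly between 0.0 and 0.6
    and count how often a reading lands in the 'spirit' band by chance."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_readings) if lo <= rng.uniform(0.0, 0.6) <= hi)
    return hits / n_readings

# With uniform noise over 0.0-0.6, roughly a third of readings fall
# inside the 0.3-0.5 band. No ghost required.
print(fraction_in_spirit_band())
```

Any band the kit’s authors chose would get “hits” from ordinary background fluctuation, which is exactly why unstated baselines make for good pseudoscience and bad evidence.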
Two minutes later, we each had a device and were tiptoeing, Scooby-style, down the intentionally dark hallway, humming scary organ music, pointing at shadows and giggling. We went into dark guest rooms, scanning everything as we went, needles bouncing and lights flashing. By the time we got back to our rooms, they were back to the reaction they’d had to the video of the self-opening door.
The next day we talked about the incentive the Jeffery has to bill itself as haunted — hell, it’s what snared me! — and came up with a few ways they could do it better. I think their skeptical wells are a little fuller for the experience. And it was damn fun.
(If you have a minute, go back and enjoy the video of “orbs” around 4:00.)
Coming (Religiously) Unglued: How the American Church is Blowing Its Lead
The religious shall inherit the earth.
Last sentence of Shall the Religious Inherit the Earth? by Eric Kaufmann
Between his titular question and confident answer, Kaufmann lays out his reasons for thinking recent gains of secularism and liberalism in the developed world will gradually be reversed, an argument captured in an article of his that’s currently meming its way around.
The arguments are brutally simple: (1) Children tend to adopt the religious identity in which they were raised; (2) The religious have more children on average than seculars; and (3) The more conservatively religious they are, the more children they tend to have.
All true.
Now I’ve rather enjoyed the progressive achievements of the last 50 years and was looking forward to more. But math says no. As long as the assumptions in those statements remain unchanged, we’re stuck with a more conservative and more religious future, even in the developed world.
Fiendishly clever, that Darwin fella.
I’ve seen it suggested with varying degrees of seriousness that secular progressives need to get busy indoctrinating their kids and having more of them. I’ve already written at length about the misguided lunacy of the first idea and will again soon. But the second one is a particular knee-slapper. Talk about your Pyrrhic victories! We have fewer kids for good reasons, thangyavurrymush, including the desire to focus parental attention on fewer kids, financial constraints (including the high cost of education), awareness of population issues, and access to family planning resources. We’re not going to reverse that sensible progress to win some fuzzy demographic struggle by pumping out more puppies.
Fortunately we don’t have to go into Shockley mode after all, in part because…well, because it’s a weird and creepy suggestion, first of all, but also because the assumptions underlying Kaufmann’s work are shifting on their own, and by a lot.
A Pew study from 2009 on “faith switching” included an under-reported finding that the glue of family faith is losing its stick. While just 7 percent of respondents 65 and older have ever left the faith in which they were raised to become unaffiliated, that number rises to 13 percent for those in their 30s and 40s and 18 percent for those currently under 30. That’s 18 percent who have already left religion at a pretty darn young age, and it doesn’t even count the millennials who will go on to leave in their 30s and 40s, as the numbers for the older brackets show many will.
Another assumption shift: Kaufmann points to the high religiosity and birthrate of recent immigrants, especially Hispanics, as a key driver. But the birthrate of US immigrants drops dramatically once they are here — presumably as they and their children gain more of the advantages listed above, including improved access to family planning resources. And as the Pew study shows, they are much more likely with each generation to dissolve the glue that holds them to their family religion.
Finally, it’s silly to think an increase in diversity is ultimately going to make us more conservative. The increasing nonwhite slice of the American pie has a strong progressive effect that overwhelms the residue of family-of-origin conservatism for everyone. Conservatism thrives on sameness. The more we are surrounded by genuine difference, the less able we each are to cling to fantasies of the One True Faith or the master race. It’s harder to keep the cartoons in place when you are cheek by jowl with real people of other cultures, creeds, and colors.
Here in my Atlanta suburb, for example, which a generation ago was easily 95 percent white conservative Baptist, my five most immediate neighbors are from Indonesia, Turkish Armenia, Korea, India, and Ukraine. Last week, my daughter’s Saudi-born fourth grade teacher taught her students how to write their names in Arabic. This is Atlanta, folks. And the same thing is happening pretty much everywhere I go.
So when you see articles like Kaufmann’s, relax. The picture is much more complex and promising than a simple birthrate analysis suggests. And rather than throw out our own family planning, do the obvious — support family planning for everybody.
As for religious identity, it’s becoming less of an automatic inheritance, thanks in large part to the churches themselves, which are falling over themselves to alienate their young folks and succeeding at an incredible rate. If we want to help the process of dissolving that glue, there’s no better way than creating a happy, normal place for those leaving religion to land and thrive.
6000 days
Part 3 of 3.
Go to Part 1 or Part 2.
The aim that the child should grow up to become confidently independent is synonymous with the aim that the child should grow up mentally healthy.
Psychologist John Bowlby (1956)
We’re born with brains wired up for the Paleolithic, not for the world as it is today. We’ve developed better ways of knowing and controlling the world around us, but the fears and behaviors that protected us in that era — fear of difference, hypervigilance, out-group aggression, love of clear categories and authority, magical thinking — are still with us, even though they’ve now become either pointless or dangerous.
I want to help my kids let go of those fears so they can have a better life.
Religious and social conservatism are symptoms of those fears, reactions to the problem of being a Stone Age human. For the half of the planet still living in marginal conditions, that problem is mostly unsolved. The rest of us — thanks to agriculture, germ theory, separating our drinking water from our poop, the scientific method, and a thousand other advances — have made some serious progress. And that partial solution has made all the difference, freeing us up to live better lives than we once did.
I want my kids to get that very good news.
Education, experience, and parenting take a child from Stone Age newborn to modern adult in about 6,000 days. Or so we hope. In addition to shoe tying, the five-paragraph essay, algebra, good oral hygiene, the age of the universe, the French Revolution, and how to boil an egg, there’s something else we need to help them learn, or better yet, feel — that life is better and you have more control than your factory settings would have you believe.
At a convention five years back, author/filmmaker (and Darwin great-great-grandson) Matthew Chapman was asked why Europe rapidly secularized after the Second World War while the U.S. remained devout. He paused for a moment. “Honestly,” he said, “I think socialized medicine had a lot to do with it.”
Not the answer we were expecting.
For most of the history of our species, he said, we’ve been haunted by an enormous sense of personal insecurity, and for good reason. The threat of death or incapacity was always hanging over us. Religion offered a sense of security, the illusion of control. Once the states of Europe began to relieve some of those basic fears, people began to feel a greater sense of control and security, and the need for traditional religion began to wane.
Whether that’s the whole answer or not, I think he’s on to something here. Traditional religion is driven by human insecurity. I have a good number of friends and relations in the deep and toxic end of the religious pool, and I can’t think of one who truly jumped in unpushed. Some were born into it and raised to believe they couldn’t live without it. Others experienced some kind of life crisis resulting in a terrifying loss of control that pushed those ancient buttons — and they jumped in with both feet.
I feel immense empathy for these people — even as their beliefs make me nauseous.
I also have many friends who genuinely chose religion instead of needing it. And lo and behold, these folks tend to end up in more liberal expressions, doing little harm and a lot of good. They aren’t hostages to their innate fears. In fact, they have a lot more in common with me than with the people hyperventilating and clinging to Jesus in the deep end.
I really don’t care if my kids end up identifying with religion so long as it’s a choice, not a need. And the best way I can ensure that is by using these 6,000 days to give them not just knowledge but also confidence and security.
Turns out we know how to do this. You start with a sensitive, responsive, and consistent home life. Build a strong attachment with parents and other significant adults. Don’t hit or humiliate them or let others do so. Encourage them to challenge authority, including your own. Make them comfortable with difference. Use knowledge to drive out fear. Build a sense of curiosity and wonder that will keep them self-educating for life. Let them know that your love and support are unconditional. Teach and expect responsibility and maturity. Encourage self-reliance. Help them find and develop “flow” activities and lose themselves in them.
These aren’t off the top of my head, you know — they’re straight out of the best child development research, which strongly supports attachment theory and authoritative parenting, about which more later. Bottom line, the best practices for nonreligious parenting are in sync with the best practices for…parenting.
Now isn’t THAT nice.
We may have to contend with a lot of noise in our culture and even our own extended families, but when it comes to raising “confidently independent, mentally healthy” kids, the best current knowledge is on our side. And our additional hope of keeping our kids in charge of their own worldview decisions comes along in the bargain.
Conservative religious parents have to close their eyes and swim hard upstream against this research consensus, following James Dobson et al. back to the Paleolithic. But liberal religious parents, who share most of my parenting goals, have the same advantage I do. They can even claim one of the foremost advocates of attachment theory as their own — William Sears, a sane and sensible Christian parenting author who opposes almost every major parenting position of James Dobson.
I bang on and on about how and why to let our kids intersect with religion. They’re good and important questions. But every one of those questions rests on the much more fundamental question of confidence and security. Build that foundation first, and the rest is icing.
Unnatural
(Part 2, continued from “Born This Way?”)
You’ve got to be taught to be afraid / Of people whose eyes are oddly made,
And people whose skin is a diff’rent shade / You’ve got to be carefully taught.
You’ve got to be taught before it’s too late / Before you are six or seven or eight,
To hate all the people your relatives hate / You’ve got to be carefully taught!
—from the Rodgers and Hammerstein musical South Pacific (1949)
It’s a riveting horror — no caption required, just the immensely sad, unaware eyes of the younger girl. There’s no reason to believe they’ve embraced the messages on their shirts yet, but every reason to assume their environment is primed to lead them there.
But is it really true that we’ve got to be taught to hate those who are different from us? Answer one way and parents can simply decline to teach them to hate. Answer the other way and there’s something we need to actively do to help them avoid it.
I think we’re more naturally inclined to hate and fear difference than not. Religion isn’t the only parting gift we got from the Paleolithic. A lot of the things we are, including some of our worst pathologies, were once strongly adaptive traits. Evolution just hasn’t had time to catch up to our circumstances. As a result, we’re a whole panel of buttons waiting to be pushed. And one of the best things a parent can do is to help those buttons rust.
Before I get to that, let’s look at more of our inheritance:
GOT TO BE TAUGHT?
A million years ago, food was desperately hard to come by, and cooperation within a small group was advantageous. But cooperating with the group next door would have doubled the mouths to feed without moving the needle much on available food. Genetic tendencies toward in-group cooperation and out-group hostility would have provided a selective advantage, as would distrust of people who dressed, looked, or acted differently from you. The more different they were, the more likely their interests conflicted with yours.
Aggressive nationalism, militarism, racism, and the exaggerated fear of immigrants and of all things foreign are modern expressions of what was once a sensible approach to staying alive. But in an interdependent world, these same characteristics can be downright harmful.
BE AFRAID
It’s a sunny Wednesday afternoon a million years ago. Two Homo erectuseses are walking through the high grass on the African savannah. Suddenly there’s movement off to the left. One of them assumes it’s something fun and goes in for a hug. The other jumps 15 feet straight up and grabs a tree limb. Even if it’s just a fluffy bunny nine times out of ten, which of these guys is more likely to pass on his genes to the next generation?
In a world bent on killing you, no characteristic would have been more useful for survival than perpetual, sweaty hypervigilance. We’ve inherited a strong tendency to assume that every shadow and sound is a threat, which in turn kept us alive and reproducing. By the time elevated blood pressure killed you off at 22, you’d already have several jittery, paranoid offspring pounding espressos and cradling stone shotguns all through the long, terrifying night.
Fast forward to a world of 7 billion people in close quarters. Suddenly it’s no longer quite so adaptive to have everybody all edgy and shooty all the time. But our brains don’t know that. One of the resulting paradoxes is that fear often increases as actual danger diminishes. If you can’t see and name it, it must be hiding, you see, which is ever so much worse. Violent crime in the U.S. recently hit the lowest level since records have been kept — in every category — but who’d ever know? Instead, we take every violent news story as proof of the opposite. We insist things are worse than ever in “this day and age,” keep cradling those shotguns…and keep forwarding those urban legends.
When you get an email warning that rapists are using $5 bills or recordings of crying babies or ether disguised as perfume to lure and capture their victims, or that child abduction rates have risen 444% since 1982 — all untrue — you’ve just received a message from the Paleolithic. But by constantly naming dangers and sounding the alarm, we feel safer.
(Think for a minute about how 9/11 — a death-dealing sneak attack by the Other — pushed our collective Paleolithic button. It was a massive confirmation of our oldest unarticulated fears, and we dropped to our collective knees.)
I could go on and on. In addition to magical thinking, fear of difference, and hypervigilance, we can add categorical thinking, enforced gender divisions, the love of weapons and authority, and much more, all of which had clear adaptive advantages during the long, dark night of our species. These things are, in a word, natural.
Which is not to say good. Rape is also natural. “From an evolutionary perspective,” says biologist/philosopher David Lahti, “considering other social species on this earth, it is remarkable that a bunch of unrelated adult males can sit on a plane together for seven hours in the presence of fertile females, with everyone arriving alive and unharmed at the end of it.” Yet it happens, ten thousand times a day, because we’ve developed a frankly unnatural social morality that trumps the natural a gratifyingly high percentage of the time.
Secularism, comfort with difference, a reasonable relaxation of vigilance, the blurring of categories (sex, gender, race, etc.), the willingness to disarm ourselves and to challenge authority — these are all unnatural, recent developments, born in fits and starts out of the relative luxury of a post-Paleolithic world. I’m sure you’ll agree that they are also better responses to the world we live in now — at least for those of us privileged to live in non-Paleolithic conditions.
Of course our limbic brain differs on that, but it would, wouldn’t it?
Now — the astute reader may have noticed that the things that kept us alive a million years ago line up incredibly well with the nationalistic, anti-immigrant, pro-gun, pro-authority, pro-gender-role, anti-diversity talking points of social conservatives. But if you think my point is to belittle conservatives by calling them cavemen, not so. I think there’s a lot to be gained by recognizing social conservatism, including religious conservatism, as the activation of ancient and natural fears, and to respond accordingly.
My circumstances have allowed my Paleolithic buttons to remain unpushed. That’s why I’m not a social conservative. Growing up, I was made to feel safe. I was not frightened with Satan or hell or made to question my own worth or worthiness. I was given an education, allowed to think freely, encouraged to explore the world around me and to find it wonderful. Unlike the vast majority of the friends I have who are religious conservatives, I never passed through a disempowering life crisis — a hellish divorce, a drug or alcohol spiral, the loss of a child — that may have triggered that feeling of abject helplessness before I had developed my own personal resources. So I never had to retreat into the cave of my innate fears.
In short, I’ve been lucky.
A lot of people with the same luck are religious. But in my experience, those strongly tend toward what Bruce Bawer has called the “church of love” — the tolerant, diverse, justice-oriented side of the religious spectrum, grounded in a more modern perspective but still responding to the human problem that science, admittedly, has only partly solved.
It’s rare for a person with all of the advantages listed above to freely choose the “church of law” — the narrow, hateful, Paleolithic end we rightly oppose. Those folks, one way or another, are generally thrown there, like the girls in the photo. Sometimes they find their way out, but their road is tougher than mine was.
Seeing things this way has made me more empathetic to conservative religious believers, even as I oppose the malign consequences of their beliefs. Understanding our natural inheritance also makes me frankly amazed that we ever do anything right. Given the profound mismatch between what we are and what the world is, we should all have vanished in a smoking heap by now. Instead, we create art and cure disease and write symphonies and figure out the age of the universe and somehow, despite ourselves, hang on to an essentially secular government in a predominantly religious country.
Okay, I just have to stop writing, even though I haven’t reached the punchline — what this all means for parents. So there will be a Part 3.
[EDITOR’S NOTE: After further research and smart reader input, I’ve yanked the section “Every Sperm is Sacred” from this post, which was based on hypotheses that have apparently been superseded. Science marches on!]
Born this way?
It is an interesting and demonstrable fact that all children are Atheists, and were religion not inculcated into their minds they would remain so…[T]here is no religion in human nature, nor human nature in religion. It is purely artificial, the result of education, while Atheism is natural, and, were the human mind not perverted and bewildered by the mysteries and follies of superstition, would be universal. —ERNESTINE ROSE, “A Defence of Atheism” (1861)
Boy do we secular parents love us a quote like that. It says my atheism is just a return to my natural condition, a rejection of something artificial that had been blown into my head by human culture. Like!
But in the last few years, I’ve come to think of the idea that we are born atheists as a seriously misleading one, and correcting it as Job One for secular parents.
It’s obviously true that we are born without religious belief. That amounts to what is called weak or negative atheism, the simple absence of belief in a god or gods. But what about the other major assertion there — that without inculcation, the absence would remain?
This gets at the very basic question of what religion is. The Rose quote implies that it’s a cultural construction, pure and simple. But if Ernestine Rose was right and atheism is so damn natural, why is the inculcation of religion received so eagerly and pried loose with such difficulty?
I’ve spent years chasing this question through the work of EO Wilson, Pinker, Boyer, Dennett, Diamond, Shermer and more. The result has made me less angry and frustrated and more empathetic toward the religious impulse, even as I continue to find most religious ideas both incorrect and problematic. It has also deeply informed my secular parenting in a very good way. Yet I’d never expressed it out loud until a few months ago, when I reworked part of my parenting seminar to include it.
Thinking about religion anthropologically has made me a better proponent of my own worldview, a more effective challenger of toxic religious ideas, and a much better secular parent.
Why (the hell) we are the way we are
If you want to understand why we are the way we are, there’s no better place to look than the Paleolithic Era (2.4 million years ago – 11,000 years ago). Over 99.5 percent of the history of the genus Homo — 120,000 generations — took place during the Paleolithic. For the last 10,000 of those generations, we were anatomically modern. Same body, same brain. The brain you are carrying around in your head was evolved in response to conditions in that era, not this one. The mere 500 generations that have passed since the Paleolithic ended represent a virtual goose egg in evolutionary time.
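For anyone who wants to check the arithmetic, here is a quick sanity check in Python. It assumes the 20-year generation length implied by the life-span figure below; the variable names are mine, not the author’s.

```python
# Rough check of the generation arithmetic, assuming a 20-year generation.
GENERATION_YEARS = 20
GENUS_HOMO_YEARS = 2_400_000   # start of the Paleolithic, years before present
PALEOLITHIC_END = 11_000       # end of the Paleolithic, years before present

paleolithic_years = GENUS_HOMO_YEARS - PALEOLITHIC_END
paleolithic_generations = paleolithic_years // GENERATION_YEARS
post_paleolithic_generations = PALEOLITHIC_END // GENERATION_YEARS
paleolithic_share = paleolithic_years / GENUS_HOMO_YEARS

print(paleolithic_generations)       # 119450 -- roughly the 120,000 in the text
print(post_paleolithic_generations)  # 550 -- roughly the "mere 500 generations"
print(f"{paleolithic_share:.1%}")    # 99.5%
```

The numbers come out close to the round figures in the text, which is all the claim requires.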
To put it simply: we are born in the Stone Age. Childhood is a period during which we are brought — by parenting, experience, and education — into the modern world. Or not.
So if we were evolved for the Paleolithic, it seems worth asking: What was it like then? In short, it sucked to be us.
In the Lower Paleolithic, starting around 2.4 million years ago, there were an estimated 26,000 hominids on Earth. The climate was affected by frequent glacial periods that would lock up global water, leading to severe arid conditions in the temperate zones and scarce plant and animal life, making food hard to come by.
The average hominid life span was about 20 years. We lived in small bands competing for negligible resources. For two million years, our genus was balancing on the edge of extinction.
Then it got worse.
About 77,000 years ago, a supervolcano erupted in what is now Lake Toba in Indonesia. On the Volcanic Explosivity Index (apparently created by a seven-year-old boy), this eruption was a “mega-colossal” — the highest category. Earth was plunged into a volcanic winter lasting at least a decade. The human population dropped to an estimated 5,000 individuals, each living a terrifying, marginal existence.
Now remember that these humans had the same thirsty and capable brain you and I enjoy, but few reliable methods for filling it up. The most common cause of death was infectious disease. If someone is gored by a mammoth, you can figure out how to avoid that in the future. But most people died for no apparent reason. Just broke out in bloody boils, then keeled over dead.
Imagine how terrifying such a world would be to a mind fully capable of comprehending the situation but utterly lacking in answers, and worse yet, lacking the ability to control it. It’s not hard to picture the human mind simply rebelling against that reality, declaring it unacceptable, and creating an alternate reality in its place, neatly packaged for the grateful relief of subsequent generations.
The first evidence of supernatural religion appears 130,000 years ago.
Religion solves our central problem: that we are human (to quote Jennifer Hecht), and the universe is not. It’s not really about explanation or even comfort, not exactly. It’s about seizing control, or at least imagining we have. To be fully conscious of our frailty and mortality in a hostile and indifferent universe and powerless to do anything about it would have been simply unacceptable to the human mind. So we created powerful beings whom we could ultimately control — through prayer, sacrifice, behavior changes, ritual, spinning around three times, what have you.
Conservative, traditional religion is a natural response to being human in the Paleolithic. Whether it was a good response or not is beside the point — it was the only one we had.
But we’re not in the Paleolithic anymore, you say. You certainly have the calendar on your side. We began to climb out of our situation about 500 generations ago when agriculture made it possible to stand still and live a little longer. Eventually we had the time and security to develop better responses to the problem, better ways of interrogating and controlling the world around us. But the Scientific Revolution, our biggest step forward in that journey, was just 20 generations ago. Think of that. It just happened. Our species is still suffering from the post-traumatic stress of 120,000 generations in hell. And like the battle veteran who hits the dirt when he hears a backfiring car, it takes very little to push the Paleolithic button in our heads.
Yes, your kids are born without religious belief. But they are also born with the problem of being human, which includes a strong tendency to hit the dirt when the universe backfires. One of the best things a secular parent can do is know that the Paleolithic button is there so we can help our kids resist the deeply natural urge to push it.
(Part 1 of 3. Go to Part 2.)
Hitchens’ best moment
Christopher Hitchens (1949-2011) has had a profound influence on me for years. It’s hard to think of a greater artist with the language or a more incisive thinker. He took a different approach than I do to religion and atheism, but it irritates the crap out of me when interviewers set me up as a nice-guy foil to the Horsemen. It’s not an either-or. Hitchens speaks to me, and often for me, while I’m busy reaching across aisles. I wouldn’t for a minute want to do without that voice. And when his conclusions were different from mine, he gave me serious pause. It’s damn hard to wave Hitchens away with a casual hand.
When my son Connor told me this morning that Hitchens had died, my mind went straight to what I think is his greatest moment — not one of his debates, and not a written polemic. It was what he did when he was wrong.
Several times, including in an article in Slate in late 2007, Hitchens defended U.S. interrogation methods in the “War on Terror,” saying they fell short of torture. Instead of just bloviating for applause, he agreed to test his claim by undergoing the experience himself. He relented in mortal terror after 16 seconds, then went on to write a Vanity Fair piece titled, “Believe Me, It’s Torture.”
Several liberal commentators went all John 20:29 on him at the time, saying duh, they figured out it was torture without getting under the towel themselves. A lot of conservative fans of the technique apparently need, but for some reason decline to volunteer for, the experience.
Hitchens made a false claim, then put his money where his mouth was, changed his mind, and gave me a lesson in intellectual integrity I won’t forget. It’s one of many gifts from Hitchens that I’m grateful for.