Inside Charlie’s Playhouse
Guest column by Kate Miller
President, Charlie’s Playhouse
KATE MILLER is a mother and scientist with a PhD in demography from the University of Pennsylvania and a Master's in Public Health from Columbia University. In response to the terrible scarcity of toys and games to help kids understand evolution, she launched CHARLIE'S PLAYHOUSE this very month. I wrote a brief but glowing review of the company for Raising Freethinkers. In this column, Kate describes the process that led to the creation of this exciting and brilliantly conceived resource for science-jazzed parents and their lucky kids.
Dinosaur-mania washed over my two boys a couple of years back, and in its wake came some wonderful discussions about evolution, natural selection and Charles Darwin. We turned toy boats into the Beagle and sailed around the playroom collecting plastic animals for inspection. We unfurled a roll of paper on the floor and drew ancient animals along a billion-year timeline.
Delighted by their interest, I went online one day in search of some educational games or toys on the subject. I easily found fun stuff for kids about physics, chemistry, astronomy and every other branch of science you can think of, but nothing on evolution. Yes, some wonderful children’s books about evolution, and some great videos for grownups, but no toys, no manipulatives, nothing involving physical movement or the sheer insane joy of the history of life on this planet.
I dug deeper into the market. I checked out natural history museums, suppliers of teaching materials, professional biology associations. Nothing. I made phone calls, read toy industry publications, inquired at specialty stores. Nothing. Some toys that focus on the natural world walk right up to an invisible line but will not cross over to actually use the words “evolution” or “natural selection.” Even the vast dinosaur-industrial complex doesn’t touch it. Check out the next dino toy you pick up.
My curiosity rose, along with my indignation. Why is there no infrastructure for presenting evolutionary ideas to young children? No doubt it’s due to political concerns in corporate America, yet for most people evolution does not contradict their beliefs in any way. Many parents who have been looking for evolution-themed toys have found their way to me; these parents are religious, they are secular, they are homeschoolers, they are mainstream, they are everyone. Why should this majority be deprived of educational fun stuff for their kids because of the few who politicize the issue? At the very least, kids have to be aware of evolutionary ideas for the same reason that they need to know about religion: it’s basic cultural literacy.
I also discovered that our national science standards recommend that students not be exposed to evolution until high school, or middle school at the earliest. I was raised in a household where evolution was normal, like gravity, so hearing about evolution for the first time in high school strikes me as odd, like learning that the Earth revolves around the Sun sometime around your junior prom. As a member of the standards panel later told me over coffee, that recommendation was driven not by children's inability to grasp the concepts but by elementary teachers' discomfort with the material.
So my kids and I stumbled upon this vacant market niche, and I had what one friend calls “the entrepreneurial seizure.” Against my better judgment, we decided to start a business. Of course I hope to make a buck with this venture (wouldn’t it be nice if the kids get to go to college?) but I also hope to contribute to the scientific literacy of future generations. Oh, and also have some laughs with the kids along the way. So here is Charlie’s Playhouse. Welcome!
Put down the knife! Now back away, slowly…
The leaves are falling, temperatures are falling…and foreskins, apparently, are falling as well.
Circumcision is in the air! I received two emails recently asking for my thoughts on the procedure, both from fathers who are making the decision soon for a newborn son. Then yesterday I came across a very thoughtful post about it on the Domestic Father blog. He says most of what I would say, but I’ll go on the record here as well.
We had our son circumcised, and I wish we hadn’t. The question just snuck up on me in the form of a nurse and a clipboard when I was exhausted. “Most people do,” she said. Baaaaaa, I replied.
It was originally a religious ceremony, a (quite strange, if you think about it) symbol of faithfulness to God. But interestingly, circumcision was not common outside of Jewish and Muslim practice until the 1890s, when a few religious enthusiasts, including the strange character JH Kellogg, recommended it as a cure for “masturbatory insanity.” Kellogg spent much of his professional effort combating the sexual impulse and helping others to do the same, claiming a plague of masturbation-related deaths in which “a victim literally dies by his own hand” and offering circumcision as a vital defense. “Neither the plague, nor war, nor small-pox, nor similar diseases, have produced results so disastrous to humanity as this pernicious habit,” warned a Dr. Alan Clarke (referring to masturbation, not circumcision).
Given these jeremiads by well-titled professionals, the attitudes of American parents in the 1890s turned overnight from horror at the barbarity of this “un-Christian” practice to immediate conviction that it would save their boys from short and insane lives. It was even reverse-engineered as a symbol of Christian fidelity and membership in the church.
(Isn’t it a relief that we’ve left this kind of mass gullibility so very far behind?)
The supposed health benefits and other red herrings were created after the fact, in the early 20th century, to undergird sexual repression with a firm foundation of pseudoscience.
Anyone interested in the non-pseudo variety might look to the American Medical Association's Council on Scientific Affairs and dozens of similar organizations around the world that have issued statements calling the practice of circumcision "not recommended" because of associated risks. Others, including the British Medical Association, have articulated a slight possibility of slight benefits. Even so, the U.S. is the only remaining developed country in which the practice is still somewhat common — though many American HMOs no longer cover it.
The practice almost completely ended in the UK with the publication of a 1949 paper noting that 16-19 infant deaths per year were attributable to complications from the procedure.
One of my correspondents told me that “all the doctors we talk to say that it doesn’t matter one way or the other.” This seems to answer the question. No invasive medical procedure should be undertaken that does not have demonstrable benefits.
Add to that the strong possibility that sexual sensitivity is diminished, and I’d advise against it. It’s a form of genital mutilation, after all — just a more familiar one.
There’s also no rush. The boy can choose to go under the knife at 18 if he wishes. Considering just how likely that is should give any parent serious pause before greenlighting a pointless ritual relic when he’s an infant.
Best Practices 1: Widening circles of empathy
[First in a nine-part series on best practices for nonreligious parenting.]
“I feel your pain.”
–BILL CLINTON at a campaign rally in 1992

“We need to…pass along the value of empathy to our children. Not sympathy, but empathy – the ability to stand in somebody else’s shoes; to look at the world through their eyes.”
–BARACK OBAMA in a speech on Father’s Day 2008
In the Preface of Raising Freethinkers I offer a list of nine best practices for nonreligious parenting. The list is drawn largely from the growing consensus of nonreligious parents and grounded when possible in the social and developmental sciences. Between now and the release, I’ll try to draw attention to all nine. They are not commandments but an attempt to capture the consensus regarding effective practices. They’re intended to be the starting point of the conversation, not the end, carved in butter, not stone. So grab a spatula and shape away!
In today’s “On Language” column in the New York Times Magazine, William Safire identifies empathy as one of the buzzwords of the current campaign. He notes that the issue of whether a given candidate could really empathize with everyday folks is nothing new. George H.W. Bush was (unfairly, but effectively) excoriated for not knowing the price of a gallon of milk in 1992. John McCain’s uncertain number of houses is assumed to undercut his empathy quotient, as Obama’s Ivy education and taste for arugula are said to undercut his.
Safire echoes Obama’s distinction between empathy and sympathy:
If you think empathy is the synonym of sympathy, I’m sorry for your confusion. Back to the Greeks: pathos is “emotion.” Sympathy feels pity for another person’s troubles…empathy identifies with whatever is going on in another’s mind…The Greek prefix sym means “together with, alongside”; the verbal prefix em goes deeper, meaning “within, inside.” When you’re sympathetic, your arm goes around the shoulders of others; when you’re empathetic, your mind lines up with what’s going on inside their heads. Big difference.
We talk about empathy as if it’s either something magical or something that can be willed into existence by saying, in essence, “Feel empathy! It’s what good people do.” Empathy is neither as easy nor as hard as we make it seem.
One school of thought in psychology (Carolyn Zahn-Waxler, Nancy Eisenberg, et al.) suggests infants are largely self-centered, putting the first twitches of empathy between 18 and 36 months. Another (led by Harry Stack Sullivan, Martin Hoffman and others) has recently made a case for “infantile empathy” toward the mother — something that would certainly make sense.
In either case, by age three, kids are reliably exhibiting empathy, which Eisenberg defines as “an affective response that stems from comprehension of another’s emotional state or condition and is similar to what the other person would be expected to feel.”
That sentence might ring some bells if you’ve followed the recent work on mirror neurons. I wrote about this in July of last year:
In your head are neurons that fire whenever you experience something. Pick up a marble, yawn, or slam your shin into a trailer hitch, and these neurons get busy. No news there. But these neurons also fire when you see someone else picking up a marble, yawning, or slamming a shin. They are called mirror neurons, and they have the powerful capacity to make you feel, quite directly, what somebody else is feeling…The implications are gi-normous, since it means we’re not completely self-contained after all…
It takes very little to see, in this remarkable neural system, the root of empathy, sympathy, compassion, conscience, cooperation, guilt, and a whole lot of other useful tendencies. It explains my kids’ tendency to wither under disapproval…Thanks to mirror neurons, the accused feels the condemnation all the more intensely. Empathizing with someone else’s rage toward you translates into a kind of self-loathing that we call guilt or conscience. Once again, no need for a supernatural agent.
So what are those “ever-wider circles” about?
Our natural tendency is to feel empathy for those who are most like us. Empathy extends outward from Mom to the rest of the family to the local tribe — all those who look and act essentially like us. And I’d argue that moral development is measurable in part by how far outward your concentric circles extend. I encourage my kids not just to think about how a person of a different gender, color, nationality, or worldview feels or thinks, but to see themselves in that person — to get those mirror neurons dancing to the tune of a shared humanity.
And why stop at the species? One of the biggest implications of evolution is a profound connectedness to the rest of life on Earth. As a recent interviewer put it, “It seems like you could be positively paralyzed” by the realization that walking the dog, eating a burger, and climbing a tree are literally walking, eating, and climbing distant cousins. True enough.
I applaud religious ideas that reinforce and sanctify connectedness, as well as seeing self in others. “See the Buddha in all things” is an example. “Love your neighbor as yourself” is another. But so many traditional ideas — religious, cultural, political — instead draw lines between people, defining in-groups and out-groups and outlining colorful punishments for those on the wrong side of that line. Having “dominion over the earth” doesn’t help matters, and Deuteronomy and Revelation are dedicated almost entirely to defining, judging, and annihilating the hated Other. Bad news for empathy, don’t you think?
Free of religious orthodoxy, nonreligious and progressive religious parents alike can encourage their kids to push the concentric circles of their empathy as far and wide as possible. That includes, of course, people who believe differently from us. I don’t have to buy what they’re selling, nor do I have to refrain from challenging it. But I want my kids to work hard at understanding why people believe as they do. And if I expect it of them, I damn well better achieve it myself. Sometimes I do all right at that. Other times…meh.
So then…how are y’all doing with empathy for religious believers?
the legend of squishsquish
“Can I read you something from my Monster Museum book?”
I said sure, not knowing that we were launching a mini-obsession that so far has lasted a week. Delaney (6) flipped to the back of the book, which offers a short “bio” of each monster mentioned in the bad kiddie poetry that fills the rest of the book.
“‘BIGFOOT,’” she read. “‘Called Squishsquish in North America…’ Squishsquish?”
“Oh. Sasquatch.”
“You know about Bigfoot?” she asked, mighty impressed.
“A little,” I said. “Keep going. I want to hear.”
“‘Called Sasquatch in North America and Yeti in Asia. A huge, hairy, shy creature. Bigfoot prefers mountains, valleys, and cool weather. Many people claim to have seen and even photographed Squish…Squishkatch or Yeti or his footprints, but so far, no one has had a conversation with him.’ Haha! That’s funny.”
Each biographical entry has a little cartoon picture of the beast in question, with the exception of Bigfoot. We apparently know what a banshee looks like, and The Blob, and a poltergeist. But when it comes to Bigfoot, they simply put a question mark. I’m willing to bet it was the question mark that drew her attention to Bigfoot.
“If people took pictures,” she asked, “why is there a question mark?”
“I don’t know.”
“So is he real?”
“Some people think so, and some people think it’s a fake. Wanna see the pictures?”
We Googled up a few choice photos. Delaney gasped, launching into an enthralled monologue as I took furtive notes:
“It would be so interesting if Bigfoot was real. I really wonder if he is. It would be so cool if he was real! But maybe the picture is somebody in a gorilla suit. And maybe somebody went out with a big footprint maker and made footprints in the woods. Or maybe it’s real. But I’ll bet if he is real, he’s nice.”
“Why?”
“Because if he was mean, he’d be attacking people, and then we’d know he exists! But…there can’t be a person that big in a costume, so it seems like he has to be real somehow. Even the tallest person isn’t that tall.”
“So do you think it’s probably real, or probably not?”
She paused and thunk. “I’m not sure. I’m really, really not sure. I’ll bet scientists are trying to figure out. It’s just so cool to think about. It makes you curious.”
Yesterday she had a friend over—the pseudonymous Kaylee of a long-ago post—and dove right into the quest as I quietly transcribed the conversation on my laptop:
DELANEY: Have you ever heard of Bigfoot?
KAYLEE: No. What’s Bigfoot?
D: You have got to see this. You have got to see this. [Types BIGFOOT into Google.] Look, there it is. It’s called Bigfoot, but some people say Squishsquish.
K: What is it?!
D, with didactic precision: Some people say it’s like a gorilla man who lives in the forest. But you don’t have to worry. He wouldn’t be in any forest near us. Some people think it’s not even real.
K: So that’s Bigfoot??
D: Well it’s a picture.
K: So Bigfoot is real!
D: Nope, we don’t know that for sure. (Reads from website.) “An appeal to protect Bigfoot as an in-danger species has also been made to the U.S. Congress.”
K (reading ahead): Look, it says right here, “Bigfoot is not real.” So he’s not real.
D: But we don’t know for sure. That’s just what the person says who has that website. That doesn’t make it for sure.
[Laney switches to image search, pulling up a full page of yetis.]
K: I hope it isn’t real. That would be so scary.
D: I hope it is. It would be so cool!
K (looking at one photo): Does he only live in snow?
D: No look, there are pictures with no snow. It seems like he would hibernate. I wonder what he would eat.
K: Probably people.
D: I just wonder everything about him. Doesn’t it just make you so curious?
K: No. It makes me freaky.
I have a particular favorite moment in that dialogue; I’ll let you guess. But my favorite thing overall is Laney’s Saganistic approach to knowledge. Just as Carl Sagan wanted more than anything for intelligent life to exist elsewhere in the universe, Laney really wants Bigfoot to be real. It would, in both cases, be “so cool.” But that wish has no effect on her belief, or his, about whether the beloved possibility is real. Neither can see much joy or point in pretending that a wish makes it so. Both are happy to wait for the much greater thrill of knowledge, of the discovery that something wonderful turns out to be not just cool, but true.
is nothing sacred? epilogue
I recently offered my thoughts on the difference between pointless and pointful challenges to sacredness:
Why does the David Mills video I’ve denounced strike me instantly as a profoundly stupid gesture, while [Webster Cook’s removal of a communion wafer from a mass] strikes me just as instantly as an interesting and thought-provoking transgression?
The reason, I think, is that the act of crossing the church threshold with that wafer (whether he intended this or not) is a kind of Gandhian gesture. Doing something so seemingly innocuous and eliciting an explosive, violent, even homicidal response is precisely the way Gandhi drew attention to cruel policies and actions of the British Raj, the way black patrons in the deep South asserted their right to sit on a bar stool, while whites (enforcing a kind of sacred tradition) went ballistic….
Mills’ feces-and-obscenity-strewn video, on the other hand, had offense not as a byproduct but as its intentional essence. Of Cook, one can say, “he just walked out the door with a wafer,” and the contrast with the fireworks that followed is clear. But saying, with sing-song innocence, that Mills was “just smearing dogshit on a book while swearing, gah,” doesn’t achieve quite the same clarity. Even though it shares the act of questioning the sacred, it’s much less interesting and much less defensible.
When PZ Myers of the science blog Pharyngula made known his intention to desecrate a communion wafer, I held my breath a tad, wondering which way it would go. Would he do something stupid or something thought-provoking? Pointless or pointful?
Now Myers has made his gesture — and I couldn’t be more thrilled:
This fascinates me even more than Wafergate because it is so achingly close to the Mills video on the surface, yet light years away in substance.
Had Myers theatrically smashed a pile of communion wafers with a hammer while laughing hysterically, he’d have undercut his own point that it is just a “frackin’ cracker.” Instead, he made use of that old and brilliant insight that the opposite of love is not hate but indifference.
So he quite simply threw it out, along with the coffee grounds.
Granted, he put a nail through it, a subtle and ironic comic touch that I’m doomed to love. But the real brilliance is in the background. Myers has also thrown out pages of the Koran and The God Delusion. He isn’t allowing anything to be held sacred. ALL ideas must be exposed to disrespect, disconfirmation, and disinterest. The good ones can take the abuse, and the bad ones, to quote Twain, will be “[blown] to rags and atoms at a blast.” If instead we shield a set of beliefs or ideas from scrutiny or attack, the bad bits survive along with the good.
Myers is also making the important point that these are NOT ideas in the garbage — they are paper and wheat, which must not be confused with the things they represent any more than a flag should be revered in lieu of the principles for which it stands.
Toss in a wink at Ray Comfort’s banana argument against atheism and the whole tableau simply rocks with meaning, power, humor and intelligence. And pointfulness.
Myers’ post is long, but please take a few minutes to read it. I can’t recommend it highly enough for its provision of context and just plain smarts. The final paragraph drives it all home:
Nothing must be held sacred. Question everything. God is not great, Jesus is not your lord, you are not disciples of any charismatic prophet. You are all human beings who must make your way through your life by thinking and learning, and you have the job of advancing humanity’s knowledge by winnowing out the errors of past generations and finding deeper understanding of reality. You will not find wisdom in rituals and sacraments and dogma, which build only self-satisfied ignorance, but you can find truth by looking at your world with fresh eyes and a questioning mind.
When it comes to challenging sacredness, if I can get my kids to grasp the difference between Mills and Myers, I’ll count myself proud.
no cats were harmed
- July 24, 2008
- By Dale McGowan
Mother: “Don’t ask so many questions, child. Curiosity killed the cat.”
Willie: “What did the cat want to know, Mom?”
—The Portsmouth Daily Times, March 1915
An interesting character pops up in religion and folklore around the world and throughout history: the curious and disobedient woman. Here’s the story: A god/wizard gives a woman total freedom, with one exception—one thing she must not do/eat/see. She battles briefly with her curiosity and loses, opening/eating the door/jar/box/apple and thereby spoiling everything for everybody.
Curiosity didn’t just kill the cat, you see. It unleashed disease, misery, war and death on the world and got us evicted from a Paradise of blank incuriosity and unthinking obedience.
Bummer.
That this cautionary tale is found in religions worldwide leads me once again to conclude that religion isn’t the source of human hatreds, fears, and prejudices—it’s the expression of those fundamental human hatreds, fears, and prejudices, the place we put them for safekeeping against the sniffing nose of inquiry. And since the story includes three things powerfully reviled by most religious traditions (curiosity, disobedience, and women), it’s not surprising to find them conveniently bundled into a single high-speed cable running straight to our cultural hearts.
I could do pages on Eve alone and her act of disobedient curiosity with the fruit of the tree of knowledge of good and evil (Was she really punished for wanting to know the difference between right and wrong, or just for disobedience? How could she know it was wrong to disobey if she didn’t yet have knowledge of good and evil? etc). Then there’s Lot’s wife, poor nameless thing, a woman (check!) who was curious (check!) and therefore disobeyed (check!) instructions to not look back at her brimstoned friends and loved ones. Islam even coined a word for a disobedient woman – nashiz – and decreed a passel of human punishments for her in sharia law.
But neither Eve nor Mrs. Lot was the first nashiz woman to cross my path. Lovely, nosy Pandora was my first.
Pandora was designed for revenge on humanity by the gods, who were angry at the theft of fire by Prometheus. According to Hesiod, each of the Olympians gave her a gift (Pandora = “all-gifted”). She was created by Hephaestus in the very image of Aphrodite (rrrrrrowww). Hermes gave her “a shameful mind and deceitful nature” and filled her mouth with “lies and crafty words.” Poseidon gave her a pearl necklace, which (unlike the deceitful nature, for example) was at least on her registry. But the real drivers of the story were the last two gifts: Hermes gave her an exquisitely beautiful jar (or box) with instructions not to open it, while Hera, queen of the gods, blessed her with insatiable curiosity.
Nice.
Long story short, once on Earth, Pandora’s god-given curiosity consumed her and she opened the jar/box, releasing war, disease, famine, and talk radio into the world. Realizing what she had done, she clamped the lid on at last, with Hope alone left inside.
(This is usually interpreted as Hope being preserved for humankind as a comfort in the face of the terrors, but even at the age of ten I realized that by trapping Hope in the jar, she kept it out of the world. Are there no mythic traditions with continuity editors?)
This week I came across the anti-curiosity tale in yet another form, one I’d never seen before–a Grimms’ fairy tale called Fitcher’s Bird:
Once upon a time there was a sorcerer who disguised himself as a poor man, went begging from house to house, and captured beautiful girls. No one knew where he took them, for none of them ever returned.
One day he came to the door of a man who had three beautiful daughters. He asked for a bit to eat, and when the oldest daughter came out to give him a piece of bread, he simply touched her, and she was forced to jump into his pack basket. Then he hurried away with powerful strides and carried her to his house, which stood in the middle of a dark forest.
He gave her everything that she wanted. So it went for a few days, and then he said to her, “I have to go away and leave you alone for a short time. Here are the house keys. You may go everywhere and look at everything except for the one room that this little key here unlocks. I forbid you to go there on the penalty of death.”
He also gave her an egg, saying, “Take good care of this egg. If you should lose it, great misfortune would follow.”
She took the keys and the egg, and promised to take good care of everything.
As soon as he had gone she walked about in the house, examining everything. The rooms glistened with silver and gold. She had never seen such splendor.
Finally she came to the forbidden door. She wanted to pass it by, but curiosity gave her no rest. She put the key into the lock and the door sprang open.
What did she see when she stepped inside? A large bloody basin stood in the middle, inside which there lay the cut up parts of dead girls. Nearby there was a wooden block with a glistening ax lying on it.
She was so terrified that the egg slipped from her hand into the basin. She got it out again and wiped off the blood, but it was to no avail, for it always came back. She wiped and scrubbed, but she could not get rid of the stain.
Not long afterward the man returned from his journey and asked for the key and the egg. She handed them to him, and he saw from the red stain that she had been in the blood chamber.
“You went into that chamber against my will,” he said, “and now against your will you shall go into it once again. Your life is finished.”
He threw her down, dragged her by her hair into the chamber, cut off her head, then cut her up into pieces, and her blood flowed out onto the floor. Then he threw her into the basin with the others.
It gets worse, believe it or not, this charming children’s tale. Now I have to go away and leave you for a short time. You may read anything you wish on the Internet, but you may NOT, on pain of death, click on this link to read the rest of the story.
Given this glimpse into our cultural terror of curiosity, is it any wonder that religion and science are so often at loggerheads? One is fueled by the very thing the other has traditionally feared—the opening of interesting and forbidden jars.
_______________
NOTES
- Fitcher’s Bird is very closely related to (and probably the source of) the tale of Bluebeard.
- The phrase “Curiosity killed the cat” is in fact a much later corruption of the original “Care (i.e., worry) will kill a cat,” which appears in a Ben Jonson play of 1598.
thinking by druthers 2
[Second installment in a series on confirmation bias. Back to Part 1.]
An audience member at my Austin talk asked a good and common question. In The End of Faith, Sam Harris apparently made the case that those who do not hold religious beliefs must be willing to challenge the irrational beliefs of their friends and neighbors. (I say “apparently” because I started but didn’t finish EOF. I am the choir, he had me at hello, and I had other fish to fry.)
“So,” asked Audience Guy, “do you agree that we should more actively challenge the irrational beliefs of friends and neighbors?”
I said no.
I know this will strike a lot of y’all as heresy, and it depends on the relationship in question — but I don’t think we should make a general practice of confronting people we know and challenging their beliefs uninvited. I am opposed to aggressive evangelism of ALL kinds. And not because it isn’t “nice.” The reason is that uninvited personal critiques of belief, especially of irrational ones, are almost never effective. Of the scores of people I know who have given up religious beliefs, approximately zero did so as the result of an uninvited challenge by another person.
There are all sorts of things we can and should do to make it more likely that they challenge themselves, but you can’t force another person to think. You can help another person become curious enough to invite the discussion, in part by being a visibly contented nonbeliever yourself. Once you have an invitation from the other side, a lot is possible. Otherwise, forget it.
“But but but…I have such a great argument!” You crack me up. Sit down and listen. The very idea of argumentation is based on the premise that you’re after the truth. It works brilliantly when a person is convinced of the virtues of the scientific method, convinced that there is nothing so beautiful as reality and nothing so ugly as self-deception.
But traditional religious belief isn’t arrived at by a critical determination to avoid error. It is arrived at by the focused determination to confirm one’s biases. Now, quite suddenly, you are asking a person to switch pole stars — to reorient his or her entire way of thinking from confirmation bias to a love of reality wherever it lies.
You’re funny. No no, in a good way.
“It is useless to attempt to reason a man out of a thing he was never reasoned into,” said Jonathan Swift, supposedly. If you have ever tried to argue a religious point with a fervent believer, only to see the goalposts move and terms redefine themselves in midair, you know what he was talking about. But you may not have known why: the other person is working from an entirely incompatible operating system. Stop being surprised that he can’t open your attachments.
A lifetime of cherry-picking evidence on the basis of its confirmation value rather than assessing its value as evidence can lead people into unintentional hilarity. The more they surround themselves with nodding people who are busily confirming the same biases, the more hilarious it gets. The nonreligious are by no means excluded from this disease — more on that in part 3. But traditional religion, founded as it was on the principle of confirmation bias, is an especially fun source of rib-tickling.
During some down time in my room before my May presentation at the Center for Inquiry in Amherst, NY, I indulged in one of my favorite masochistic pastimes: watching EWTN, the Global Catholic Network. A panel discussion was under way, and a priest was going off on the evils of condoms, of homosexuality, of abortion — anything, really, other than unprotected-face-to-face-one-man-on-top-of-one-woman-he-is-married-to-resulting-in-baby-sex. (You know…like the kind priests have.) There was never a risk that the rest of the panel would do anything but nod, so of course his statements got ever-stranger and ever-less-supportable.
Finally he hit bottom. “And why do you think there is a priest shortage?” he asked. “That’s right: abortion! Nothing could be more obvious.”
Nod, nod, nod.
The next topic was end-of-life care. “Too many doctors are woefully ignorant of Catholic bioethics,” said an expert on, presumably, Catholic bioethics. “They will, for example, pull the plug on a patient merely because all brain activity has ceased.”
Nod, nod.
“What they fail to realize is that the suffering of the body in those final hours may be necessary to get that person into Heaven.”
Nod, nod.
“By denying the person that suffering, the doctors, in their ignorance, may be contravening God’s will by denying a chance at redemption.”
Nod, nod.
“And by moving so quickly, they may be denying God the chance to intervene miraculously to bring that person back.”
Nod, nod.
These are very close to verbatim. I was writing as fast as my little paw could push the pen.
An outsider looks at such a fatuously silly misuse of the neocortex with astonishment — and out spill the arguments. Wasn’t the plug contravening God’s will, and the removal of the plug restoring God’s intended situation? Does God, who exists outside of time and space, actually need “time” to perform a miracle? How much, exactly? Yes, yes, yes. Fine.
But those around her are having their own biases confirmed — so nod go the many heads, and she digs deeper and deeper for nonsense.
WE ALL DO THIS, myself included, as noted in the last installment. The key is to make yourself vulnerable to disconfirmation, to be in the room with people who will call you on it when you make a bias error, and to be properly embarrassed when it happens.
Need more? Enjoy this, remembering all the while that the arguments apply only to bananas — especially at 0:19, 0:41, and 0:51:
“Seriously, Kirk,” he says — which is how you know he’s serious.
Yes, fine, these are fairly extreme examples. But I think the essence of religious thought as confirmation bias is nicely captured, as is the essence of the difference between religion and science. Next time I’ll finish up by showing what it is that makes science work differently. And psst…it isn’t the superior moral or even intellectual fiber of scientists.
[On to Part 3.]
thinking by druthers 1
First installment in a series on confirmation bias.
“I disagree with what you’re saying, frankly. Strongly disagree.”
I guess I ought to delight in this kind of challenge, critical thinking enthusiast that I am. But I’m a chimp, too, which means instead of delighting, I have to suppress an urge to fling feces and hoot.
The disagreement came from a gentleman in one of my early seminars. I had suggested we allow our kids to try on different worldviews without pushing one direction or another. I put it this way in an earlier post:
I encourage my kids to try on as many beliefs as they wish and to switch back and forth whenever they feel drawn toward a different hat, confident that in the long run they will be better informed not only of the identity they choose, but of those they have declined. Were I to disown my kids each time they passed through a religious identity, I’d have to keep a lawyer on retainer.
He didn’t like this one bit. “Children need to be made to recognize the difference between faith-based thinking and EVIDENCE-based thinking,” he said. “They need to hear the word EVIDENCE from the very earliest age, as often as possible. ‘What’s your EVIDENCE? What is the EVIDENCE for that?’ Allow them to ‘try on the hat’ of mythical thinking and they just might not take it off!”
Hoo boy.
I gave my usual answer about having confidence in reason, but I knew there was more to it than that. I know my kids really well, and despite my failure to sprinkle the word EVIDENCE throughout my parenting, I know that all three would laugh at the idea that an opinion without evidence is worth squat.
One anecdotal exception doesn’t disprove his assertion, of course. Maybe my kids lucked into their rational hats despite my dippy incompetence. But I had the nagging feeling that this guy had made a more fundamental error — and that night, on the plane home, I realized what it was.
The evidence-free worldview is a straw man. A myth.
It’s the rare believer indeed who tethers belief to faith alone. Religious folks have evidence to support their beliefs — mountains and mountains of evidence. No one says, “I have absolutely no evidence for the existence of God, but I believe anyway.” If the man in the seminar were to offer his challenge (“What’s your EVIDENCE?”) to these folks, they’d offer the human eye, a sunrise, a seemingly answered prayer, a feeling of transcendence, a near-death experience, the Bible, a random act of kindness, Mother Teresa, “the starry heavens above and the moral universe within.” These add up to evidence of a particular kind: bad. It’s all gift-wrapped and insured by statements of faith, but it’s also evidence.
I’m not playing word games here. If we really want to understand the difference, it’s crucial to recognize that both the religious and scientific worldviews are evidence-based. That science does so well at uncovering reality and religion does so poorly is mostly due to the different ways in which the two approaches handle evidence.
The scientific method is largely devoted to neutralizing a single fallacy called confirmation bias — our strong tendency to find and collect whatever evidence supports our preconceptions and desires while ignoring the rest. Francis Bacon and the rest didn’t invent the idea of evidence — they laid the foundation for a systematic method of controlling the incredibly strong human tendency we all have to cherry-pick evidence to confirm our biases.
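The cherry-picking mechanism described above is easy to see in a toy simulation. This is a hypothetical sketch (not from the post): we estimate whether a coin is fair twice, once tallying every flip honestly, and once quietly discarding most of the flips that disconfirm a hunch that the coin favors heads.

```python
import random

def estimate_heads_prob(n_flips, cherry_pick=False, seed=42):
    """Estimate P(heads) for a fair coin.

    With cherry_pick=True, disconfirming (tails) flips are mostly
    ignored -- only 1 in 4 survives into the tally, mimicking a
    believer who notices hits and forgets misses.
    """
    random.seed(seed)
    kept = []
    for _ in range(n_flips):
        heads = random.random() < 0.5
        if cherry_pick and not heads and random.random() > 0.25:
            continue  # evidence against the hunch: conveniently overlooked
        kept.append(heads)
    return sum(kept) / len(kept)

print(estimate_heads_prob(10_000))                    # ~0.50: honest tally
print(estimate_heads_prob(10_000, cherry_pick=True))  # ~0.80: biased tally
```

Nothing about the coin changed between the two runs; only the handling of evidence did, which is the whole point.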
_________________________
In a post two months back I demonstrated my own ability to put on the blinders of confirmation bias. I had come across the most amazing statistic…(wavy lines and harp music)…
I recently came across a statistic about scientists that, given my own background, ranks as the single most thought-provoking stat I have ever seen.
As I’ve mentioned before, my dad died when I was thirteen. It was, and continues to be, the defining event in my life, the beginning of my deepest and most honest thinking about the world and my place in it. My grief was instantly matched by a profound sense of wonder and a consuming curiosity. It was the start of the intensive wondering and questioning that led me (among other things) to reject religious answers on the way to real ones.
Now I learn that the loss of a parent shows a robust correlation to an interest in science. A study by behavioral scientist William Woodward was published in the July 1974 issue of Science Studies. The title, “Scientific Genius and Loss of a Parent,” hints at the statistic that caught my attention. About 5 percent of Americans lose a parent before the age of 18. Among eminent scientists, however, that number is higher. Much higher.
According to the study, 39.6 percent of top scientists experienced the death of a parent while growing up—eight times the average.
While researching the chapter of Raising Freethinkers on dealing with death, I had come across some random website [RED FLAG 1!] that mentioned the claim that 39.6 percent of scientists had lost a parent as a child. The website also cited the 1974 Woodward study.
“Wow!” I thought. “This precisely bears out my own personal narrative as a person whose thirst for knowledge was fueled by my father’s death! [RED FLAG 2!] Better still, it joins me at the hip to the great scientists I admire! [RED FLAG 3!] In short, this huge and unexpected percentage [RED FLAG 4!] dramatically confirms all of my dearest biases!”
If I had actually thought that in those words, or thought for a moment about what Huxley might say (“Science warns me to be careful how I adopt a view which jumps with my preconceptions, and to require stronger evidence for such belief than for one to which I was previously hostile. My business is to teach my aspirations to conform themselves to fact, not to try and make facts harmonize with my aspirations”), I might have spared myself the public error. Instead, I made a halfhearted attempt to confirm the stat, couldn’t easily access the original article, and decided to swallow the thing whole without looking further.
The Woodward article, it turns out, was largely devoted to debunking the claim. As blogreader Ryan pointed out, the parent-loss stat was a rough estimate based on a small sampling of scientists in the 500-year period from 1400 to 1900 — a span during which 40 percent of garbage collectors and astrologers also surely lost parents when they were young. The same article notes that 20th century records show little difference between scientists and non-scientists in parent loss.
We ALL do it. The trick isn’t to lead our children into a magical life free of confirmation bias, but to get them to fall so deeply in love with reality that they work hard to fight this tendency in themselves and others — precisely because it deludes us and blinds us to reality more than any other error.
[More on confirmation bias next week.]
the giddy geek
- June 11, 2008
- By Dale McGowan
- In Science, wonder
We live in a universe made of a curved fabric woven of space and time in which hydrogen, given the proper conditions, eventually evolves into Yo Yo Ma. — from Parenting Beyond Belief
Last year I wrote about Major Tom and the way the Apollo program lit up my imagination and fueled my wonder in the 70s. I always shook my head in pity at anyone who shook his head in pity at the “coldness” and “sterility” of the scientific worldview.
I touched on this in one of my essays in Parenting Beyond Belief called “Teaching Kids to Yawn at Counterfeit Wonder”:
Religious wonder—the wonder we’re said to be missing out on—is counterfeit wonder. As each complex and awe-inspiring explanation of reality takes the place of “God did it,” the flush of real awe quickly overwhelms the memory of whatever it was we considered so wondrous in religious mythology. Most of the truly wonder-inducing aspects of our existence—the true size and age of the universe, the relatedness of all life, microscopic worlds, and more—are not, to paraphrase Hamlet, even dreamt of in our religions. Our new maturity brings with it some real challenges, of course, but it also brings astonishing wonder beyond the imaginings of our infancy.
I offered a short list of the kinds of scientific revelations that make me woozy with awe:
If you condense the history of the universe to a single year, humans would appear on December 31st at 10:30 pm. That means 99.98 percent of the history of the universe happened before humans even existed.
Look at a gold ring. As the core collapsed in a dying star, a gravity wave collapsed inward with it. As it did so, it slammed into the thundering sound wave heading out of the collapse. In that moment, as a star died, the gold in that ring was formed.
We are star material that knows it exists.
Our planet is spinning at 900 miles an hour beneath our feet while coursing through space at 68,400 miles per hour.
The continents are moving under our feet at 3 to 6 inches a year. But a snail’s pace for a million millennia has been enough to remake the face of the world several times over, build the Himalayas and create the oceans.
Through the wonder of DNA, you are literally half your mom and half your dad.
A complete blueprint to build you exists in each and every cell of your body.
The faster you go, the slower time moves.
Your memories, your knowledge, even your identity and sense of self exist entirely in the form of a constantly recomposed electrochemical symphony playing in your head.
All life on Earth is directly related by descent. You are a cousin not just of apes, but of the sequoia and the amoeba, of mosses and butterflies and blue whales.
Now that, my friends, is wonder.
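For the curious, the cosmic-calendar figure in the first item checks out with back-of-the-envelope arithmetic (a sketch assuming a ~13.7-billion-year universe and the 10:30 pm Dec 31 placement popularized by Sagan's cosmic calendar):

```python
# Condense the universe's history into one 365-day year.
hours_in_year = 365 * 24               # 8,760 calendar hours in the condensed year
human_hours = 1.5                      # 10:30 pm on Dec 31 to midnight

fraction_before_humans = 1 - human_hours / hours_in_year
print(f"{fraction_before_humans:.2%}")  # 99.98% of cosmic history pre-human

# Those same 1.5 calendar hours, converted back to real time:
universe_age_years = 13.7e9
human_span_years = universe_age_years * human_hours / hours_in_year
print(f"{human_span_years / 1e6:.1f} million years")  # ~2.3 million years
```

That ~2.3-million-year span lines up roughly with the appearance of the genus Homo, which is what the calendar's "humans" entry usually refers to.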
I’ve tried to pay attention when geeks (a term of genuine endearment from me) of one stripe or another are enraptured at the poetry or wonder of something I can’t see. I know they are experiencing something transcendent, something that I lack the language or knowledge to apprehend directly.
I remember a student of mine, a math major/violist, walking into a rehearsal with a look of utter bliss, as if drunk on a mantra.
“What on Earth happened to you?” I asked.
“Laplace transforms,” she said. “Laplace transforms happened to me. They are so beautiful I can hardly stand it.”
I knew she was right, and that I would never know why. I was envious.
So imagine the fellow-feeling when I saw this wonderful video by Phil Plait at Bad Astronomy. Next time someone starts in on the drone about the cold, passionless world of science, show them this:
The awe-inspiring picture isn’t even my main point — it’s what the picture has done to Phil, someone who knows what it means, and better still, takes the time to share his amazement with the rest of us. Thanks, Phil!
[Thanks to Tim Mills at Friendly Humanist for leading me to this video.]
go ahead, judge the book by it
A first glimpse of the cover for Raising Freethinkers. I think the folks at Amacom did a very nice job, wouldn’t you say?
I’m now at work on a blog series that’s gone completely out of control. It’s been years since I taught courses and workshops in critical thinking, but this topic has it all flooding back. It’s confirmation bias, the one critical thinking error at the heart of most of our worst thinking.
A comment from a parent in the Minneapolis seminar in March got me thinking about confirmation bias again. My own thinking error in a mid-April post was a classic example of it. An idle comment I heard while watching Global Catholic Network EWTN during my May visit to Amherst NY brought it up again. The presidential campaign is laced with it. My son is tripping over it. And I'm just tucking into David Linden's fascinating book The Accidental Mind, which among other things looks at the biology and neurology of it.
In short, I don’t know where to begin. But I’m having a ball. Becca’s also finishing Part 2 of her post today, so watch for that as well.
Other fun: The Meming of Life is undergoing a secret facelift by one of my favorite web artists. Stay tuned…