Posted by Jay Livingston
Been here long?
Gallup regularly asks this question:
Which of the following statements comes closest to your views on the origin and development of human beings --
- Human beings have developed over millions of years from less advanced forms of life, but God guided this process,
- Human beings have developed over millions of years from less advanced forms of life, but God had no part in this process,
- God created human beings pretty much in their present form at one time within the last 10,000 years or so?
Here are the results:
For better or worse, Godless evolutionism has been rising steadily if slowly for the past decade – 16%, and counting. And “only” 40% of us Americans, down from 47%, believe that humans are johnnies-come-lately. Scientific fact is making some headway. But a lot of people still believe in something that’s just not true.
Andrew Gelman explains it in psycho-economic terms. The “belief in young-earth creationism . . . is costless.” What you hear from religion contradicts what you hear from science class in school. The cost (“discomfort” in Andrew’s terms) of rejecting one belief outweighs the cost of rejecting the other. That’s probably true, and it helps explain the popularity of the have-it-both-ways choice – evolution guided by God.
I think there’s something else – the law of ungraspably large numbers. For example, I know how far it is to California (3000 miles), and I even think I know how far it is to the moon (240,000 miles – and I’m not looking this up on the Internet; if I’m wrong, I’ll let my ignorance stand since that’s partly the point I’m trying to make). But once you get past that – how far is it to the sun or to Jupiter or to Betelgeuse? – you could tell me any number up in the millions or more – a number so wrong as to make any astronomer chuckle – and I’d think it sounded reasonable.
Those big numbers and the differences between them are meaningful only to people who are familiar with them. They are so large that they lie outside the realm of everyday human experience. The same holds for distances in time. Ten thousand years – that seems like a long, long time ago, long enough for any species to have been around. But “millions of years” is like those millions or hundreds of millions of miles – ungraspably large.
Because the number lies outside the realm of human experience, it just doesn’t seem plausible that humans, or anything resembling them, or even this familiar planet, could have existed that long ago.
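To put rough figures on that gap (the numbers below are ballpark approximations I’m supplying for scale, not anything from Gallup):

```python
# Ballpark figures, for scale only.
moon = 240_000          # miles to the moon (the figure used above)
sun = 93_000_000        # approximate miles to the sun
print(sun / moon)       # the sun is nearly 400 times farther than the moon

ten_thousand_years = 10_000
one_million_years = 1_000_000
print(one_million_years / ten_thousand_years)  # even a single million is 100 times
                                               # the span that already feels enormous
```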
I suspect that it’s this same law of ungraspably large numbers that allows politicians to posture as doing something about “the huge deficit” by attacking a wasteful government program that costs $3 million. If I spend a few thousand dollars for something, that’s a big-ticket item, so three million sounds like a lot. Millions and billions both translate to the same thing, “a lot of money,” just as distances in millions of miles and billions of miles are both “a long way away.” The difference between them is hard to grasp.*
*How many such programs would the government have to cancel to cover the revenue losses we just signed on for by extending the tax cuts on incomes over $250,000? And if you think those tax cuts for the rich will pay for themselves or increase revenue, there’s a lovely piece of 1883 pontine architecture I’d like to show you for possible purchase.
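The arithmetic behind that kind of posturing is easy to sketch. The deficit figure below is a placeholder of my own, somewhere in the neighborhood of a trillion dollars, since the exact number isn’t the point:

```python
# One "wasteful" $3 million program against a deficit measured in trillions.
# The deficit figure is a placeholder for illustration, not an official number.
program_cost = 3_000_000
deficit = 1_000_000_000_000
print(deficit / program_cost)        # ~333,000 such programs to close the gap
print(100 * program_cost / deficit)  # each one is about 0.0003% of the total
```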
There was something I heard not too long ago on the radio (maybe "Radiolab"?) that humans naturally think of numbers logarithmically.
Only with "modern" arithmetic do we force ourselves to unlearn this innate skill and learn to count in 1s, 2s, and 3s. Of course, it just so happens that arithmetic is damn useful for day-to-day life and accounting.
But the idea was that there is nothing innate about counting 1, 2, 3, 4, 5. This system does not come naturally. If I remember correctly, the supporting evidence had something to do with an Amazon tribe (naturally) that only had numbers for one and two (or something like that). But what was interesting was their perception of what "half" of something was. When you asked them to say what "twice" as much of something was, they used a logarithmic system (maybe base-3 or something).
So if Duki was being chased by three jaguars and Naku was being chased by "twice" as many jaguars, how many jaguars would be eating Naku for dinner? Answer: nine.
Similarly, if you asked them to "halve" a distance, they picked a point that we would consider a quarter or a third of the way there.
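If I’m reading this right, the pattern is multiplicative rather than additive: "twice" repeats a ratio, and "halfway" is a geometric midpoint. A quick sketch of that arithmetic (the numbers here are my own toy examples, not figures from the study):

```python
import math

# "Twice as many" on a log scale doubles the logarithm, which squares the count:
jaguars = 3
print(jaguars ** 2)            # 9 -- the commenter's answer to "twice" three jaguars

# "Halfway there" on a log scale is the geometric midpoint, not the arithmetic one.
# The 1-to-10 span is a toy example, not a figure from the study.
start, end = 1, 10
print(math.sqrt(start * end))  # ~3.16 -- roughly a third of the way, not 5.5
```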
There may be something to this. And I think it's especially relevant when we start considering very large numbers. Large numbers don't make sense. Perhaps because our brains aren't actually hard-wired to think this way.
Maybe large distances would be obvious if we used a logarithmic scale for miles (as we already do for decibels and earthquakes).
So let's see how this works. It turns out that you are right about the distance to the moon(!). The distance to Jupiter varies greatly (makes sense, if you think about two planets orbiting the sun out of sync), but let's use 500,000,000 miles, 500 million, for a nice round number. (And I am using Google.)
By a base-10 logarithmic scale (and correct me if I'm wrong, I was never too good with logarithms, and I'm doing this without a calculator), if one mile equals zero and 10 miles equals one, then the distance from New York to California is about 3.5. The distance to the moon would be about 5.4. And the distance to Jupiter is about 8.7.
Much more manageable, isn't it?
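For what it's worth, the arithmetic checks out. A quick sketch using the same round figures (3,000 miles to California, 240,000 to the moon, 500 million to Jupiter):

```python
import math

# Base-10 log scale: 1 mile -> 0, 10 miles -> 1, 100 miles -> 2, and so on.
distances = {
    "New York to California": 3_000,
    "Earth to the moon": 240_000,
    "Earth to Jupiter": 500_000_000,   # rough round figure; the real distance varies
}
for trip, miles in distances.items():
    print(f"{trip}: {math.log10(miles):.1f}")
# Prints roughly 3.5, 5.4, and 8.7 -- matching the comment above
```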
And if we switched to a logarithmic scale, just think about how much more reasonable all those obscene year-end Wall Street bonuses would appear. It's not a million-dollar bonus... it's just a little bonus of six. And hell, I'm making almost five.
But man, making change would be a bitch.
"humans naturally think of numbers logarithmically." Finally, I understand why they call it the "natural" logarithm. Thanks.
The Amazon tribe might have been using something very close to the natural log (base 2.7). Interesting. I hadn't thought of that.