Tag Archives: myth

Area 51 Lives On


What to believe about Area 51? Over the decades it has inspired hundreds of conspiracy theories — extraterrestrial spaceship landings, alien abductions, illegal government experimentation. Indeed, an entire tourist industry has arisen to attract myth-seekers to the Nevada desert. One thing does seem to be true: there’s a lot going on behind the barbed-wire fences and security gates, and it’s probably all military.

From StumbleUpon:

In the middle of the barren Nevada desert, there’s a dusty unmarked road that leads to the front gate of Area 51. It’s protected by little more than a chain link fence, a boom gate, and intimidating trespassing signs. One would think that America’s much mythicized top secret military base would be under closer guard, but make no mistake. They are watching.

Beyond the gate, cameras see every angle. On the distant hilltop, there’s a white pickup truck with a tinted windshield peering down on everything below. Locals say the base knows every desert tortoise and jackrabbit that hops the fence. Others claim there are sensors embedded in the approaching road.

What exactly goes on inside of Area 51 has led to decades of wild speculation. There are, of course, the alien conspiracies that galactic visitors are tucked away somewhere inside. One of the more colorful rumors insists the infamous 1947 Roswell crash was actually a Soviet aircraft piloted by mutated midgets and the wreckage remains on the grounds of Area 51. Some even believe that the U.S. government filmed the 1969 moon landing in one of the base’s hangars.

For all the myths and legends, what’s true is that Area 51 is real and still very active. There may not be aliens or a moon-landing movie set inside those fences, but something is going on, and only a select few are privy to what’s happening further down that closely monitored, wind-swept Nevada road. “The forbidden aspect of Area 51 is what makes people want to know what’s there,” says aerospace historian and author Peter Merlin, who’s been researching Area 51 for more than three decades.

“And there sure is still a lot going on there.”

Read the entire article here.

Image courtesy of Google Search.

Heroes Only Die at the Top of Hills

We all need heroes. So, if you wish to become one, you would stand a better chance if you took your dying breaths atop a hill. Also, it would really help your cause if you arrived via virgin birth.

Accordingly, please refer to the Rank-Raglan Mythotype — a list of 22 cross-cultural traits that are prerequisites to your becoming a hero of mythological proportions (far beyond being a YouTube sensation):

  1. Hero’s mother is a royal virgin;
  2. His father is a king, and
  3. Often a near relative of his mother, but
  4. The circumstances of his conception are unusual, and
  5. He is also reputed to be the son of a god.
  6. At birth an attempt is made, usually by his father or his maternal grandfather, to kill him, but
  7. He is spirited away, and
  8. Reared by foster-parents in a far country.
  9. We are told nothing of his childhood, but
  10. On reaching manhood he returns or goes to his future kingdom.
  11. After a victory over the king and/or a giant, dragon, or wild beast,
  12. He marries a princess, often the daughter of his predecessor and
  13. Becomes king.
  14. For a time he reigns uneventfully and
  15. Prescribes laws, but
  16. Later he loses favor with the gods and/or his subjects, and
  17. Is driven from the throne and city, after which
  18. He meets with a mysterious death,
  19. Often at the top of a hill,
  20. His children, if any, do not succeed him.
  21. His body is not buried, but nevertheless
  22. He has one or more holy sepulchres.

By far the most heroic fit to date is Mithradates the Great, with 22 out of a possible 22 cross-cultural traits. Jesus comes in with a score of 18-20 (depending on interpretation) out of 22, beaten by Krishna with 21, while Robin Hood only manages a paltry 13. Interestingly, Buddha collects 15 points, followed closely by Czar Nicholas II with 14.
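For the curious, the tally behind those scores is simple arithmetic: one point per trait matched, out of a possible 22. Here is a minimal illustrative sketch in Python (the trait names paraphrase Raglan's 22 points above; the scoring function and the sample figure are hypothetical, not drawn from Raglan's book):

# A minimal sketch of Rank-Raglan scoring: one point per matched trait.
# The trait names paraphrase the 22 points above; the sample tally below
# is purely illustrative, not Raglan's own breakdown for any figure.
RAGLAN_TRAITS = [
    "mother is a royal virgin", "father is a king",
    "parents are near relatives", "unusual conception",
    "reputed son of a god", "attempt on his life at birth",
    "spirited away", "reared by foster-parents in a far country",
    "nothing told of his childhood", "returns to his future kingdom",
    "victory over king, giant, dragon, or beast", "marries a princess",
    "becomes king", "reigns uneventfully", "prescribes laws",
    "loses favor with gods or subjects", "driven from throne and city",
    "mysterious death", "dies at the top of a hill",
    "children do not succeed him", "body is not buried",
    "has one or more holy sepulchres",
]

def raglan_score(matched):
    """Count how many of the 22 traits a figure matches."""
    return len(set(matched) & set(RAGLAN_TRAITS))

# A figure matching only these four traits scores 4 out of 22.
example = ["father is a king", "mysterious death",
           "dies at the top of a hill", "body is not buried"]
print(raglan_score(example), "out of", len(RAGLAN_TRAITS))  # -> 4 out of 22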

The mythotype comes from the book The Hero: A Study in Tradition, Myth and Drama by Lord Raglan.

List courtesy of Professor Thomas J. Sienkewicz of Monmouth College, Monmouth, Illinois. It is based upon material used in his mythology classes for many years, first at Howard University in Washington, D.C., and then at Monmouth College.

Image courtesy of Google Search.

The Messiah Myth


Now that almost two weeks have gone by since Christmas, it’s time to reflect on its (historical) meaning beyond the shopping discounts, Santa hats and incessant cheesy music.

We know that Christmas falls on two different dates depending on whether you follow the Gregorian or the Julian (Orthodox) calendar.

We know that many Christmas traditions were poached and re-purposed from rituals and traditions that predate the birth of Jesus, regardless of which calendar you adhere to: the 12 days of Christmas (Christmastide) originated in the ancient Germanic, and later Norse, mid-winter festival of Yule; the tradition of gift-giving and partying came from the ancient Roman festival of Saturnalia; the Western Christian church settled on December 25 based on the ancient Roman date of the winter solstice; and holiday lights came from the ancient pagans who lit bonfires and candles on the winter solstice to celebrate the return of the light.

And, let’s not forget the now ubiquitous westernized Santa Claus. We know that Santa has evolved over the centuries from a melting pot of European traditions, including those surrounding Saint Nicholas, who was born to a Greek family in Asia Minor (Greek Anatolia, in present-day Turkey), and the white-bearded Norse god Odin.

So, what of Jesus? We know that the gospels describing him are contradictory, written by different, anonymous and usually biased authors at different times, often decades after the reported fact. We have no eye-witness accounts. We lack a complete record — there is no account of Jesus’s years 12-30. Indeed, religion aside, many scholars now question the historical existence of Jesus the man.

From Big Think:

Today, several books approach the subject, including Zealot by Reza Aslan, Nailed: Ten Christian Myths That Show Jesus Never Existed at All by David Fitzgerald, and How Jesus Became God by Bart Ehrman. Historian Richard Carrier, in his 600-page monograph On the Historicity of Jesus, writes that the story may have derived from earlier semi-divine beings of Near East myth, who were murdered by demons in the celestial realm. This would develop over time into the gospels, he said. Another theory is that Jesus was a historical figure who became mythicized later on.

Carrier believes the pieces added to the work of Josephus were done by Christian scribes. In one particular passage, Carrier says that the execution by Pilate of Jesus was obviously lifted from the Gospel of Luke. Similar problems, such as miscopying and misrepresentations, are found throughout Tacitus. So where do all the stories in the New Testament come from? According to Carrier, Jesus may be as much a mythical figure as Hercules or Oedipus.

Ehrman focuses on the lack of witnesses. “What sorts of things do pagan authors from the time of Jesus have to say about him? Nothing. As odd as it may seem, there is no mention of Jesus at all by any of his pagan contemporaries. There are no birth records, no trial transcripts, no death certificates; there are no expressions of interest, no heated slanders, no passing references – nothing.”

One biblical scholar, Joseph Atwill, holds an even more radical idea: that the Jesus story was an early form of psychological warfare to help quell a violent insurgency. The Great Revolt against Rome occurred in 66 CE. Fierce Jewish warriors known as the Zealots won two decisive victories early on. But Rome returned with 60,000 heavily armed troops. What resulted was a bloody war of attrition that raged for three decades.

Atwill contends that the Zealots were awaiting the arrival of a warrior messiah to throw off the interlopers. Knowing this, the Roman court under Titus Flavius decided to create its own, competing messiah, one who promoted pacifism among the populace. According to Atwill, the story of Jesus was taken from many sources, including the campaigns of a previous Caesar.

Of course, there may very well have been a Rabbi Yeshua ben Yosef (as Jesus would actually have been named) who gathered a flock around his teachings in the first century. Most scholars of antiquity believe a real man existed and became mythicized. But the historical record itself is thin.

Read the entire article here.

Image: “Merry Old Santa Claus” by Thomas Nast, from the January 1, 1881 edition of Harper’s Weekly. Public Domain.

50 Years Later, Texas Moves Backwards (Again)

This 1966 file photo shows Charles J. Whitman, a 24-year-old student at the University of Texas, a sniper who killed 16 and wounded 31 from the tower of the University of Texas administration building in Austin, Texas, Aug. 1, 1966. Until the carnage by a student gunman at Virginia Tech in Blacksburg, Va., on Monday, April 16, 2007, the sniping rampage by Whitman from the Austin school’s landmark 307-foot tower had remained the deadliest campus shooting in U.S. history. (AP Photo, File)

On August 1, 2016, Texas’ new “Campus Carry” law went into effect. This means that licensed gun holders will generally be allowed to carry concealed handguns at the University of Texas (UT) at Austin and other public colleges throughout Texas.

On August 1, 1966, Charles Whitman, a non-brown-skinned, non-Muslim domestic terrorist, killed his wife and mother in their homes, and then went on to murder a further 14 people at the UT Austin campus. Before being shot and killed by an Austin police officer, Whitman seriously wounded an additional 32 people.

Ironically and sadly, many believe that Campus Carry will make their university campuses safer. History and real data show otherwise.

Evidence does show that legally armed citizens can prevent some crime. But this would make no serious dent in the annual 32,000-plus death toll from guns in the US. Sensible gun control, with thorough and exhaustive background checks, is a more rational answer. The good guy with a gun is a myth — go ask your local police department.

Image: Charles Whitman, c. 1963. Source: Cactus, the student yearbook of the University of Texas. Courtesy of the Austin History Center, Austin Public Library (Reference AR.2000.002).

The Tech Emperor Has No Clothes


Bill Hewlett. David Packard. Bill Gates. Paul Allen. Steve Jobs. Larry Ellison. Gordon Moore. Tech titans. Moguls of the microprocessor. Their names hold a key place in the founding and shaping of our technological evolution. That they catalyzed and helped create entire economic sectors is beyond doubt. Yet a deeper, objective analysis of market innovation shows that the view of the lone great man (or two) — combating and succeeding against all-comers — may be more of a self-perpetuating myth than actual reality. The idea that a single, visionary individual drives history and shapes the future is a long and enduring invention.

From Technology Review:

Since Steve Jobs’s death, in 2011, Elon Musk has emerged as the leading celebrity of Silicon Valley. Musk is the CEO of Tesla Motors, which produces electric cars; the CEO of SpaceX, which makes rockets; and the chairman of SolarCity, which provides solar power systems. A self-made billionaire, programmer, and engineer—as well as an inspiration for Robert Downey Jr.’s Tony Stark in the Iron Man movies—he has been on the cover of Fortune and Time. In 2013, he was first on the Atlantic’s list of “today’s greatest inventors,” nominated by leaders at Yahoo, Oracle, and Google. To believers, Musk is steering the history of technology. As one profile described his mystique, his “brilliance, his vision, and the breadth of his ambition make him the one-man embodiment of the future.”

Musk’s companies have the potential to change their sectors in fundamental ways. Still, the stories around these advances—and around Musk’s role, in particular—can feel strangely outmoded.

The idea of “great men” as engines of change grew popular in the 19th century. In 1840, the Scottish philosopher Thomas Carlyle wrote that “the history of what man has accomplished in this world is at bottom the history of the Great Men who have worked here.” It wasn’t long, however, before critics questioned this one-dimensional view, arguing that historical change is driven by a complex mix of trends and not by any one person’s achievements. “All of those changes of which he is the proximate initiator have their chief causes in the generations he descended from,” Herbert Spencer wrote in 1873. And today, most historians of science and technology do not believe that major innovation is driven by “a lone inventor who relies only on his own imagination, drive, and intellect,” says Daniel Kevles, a historian at Yale. Scholars are “eager to identify and give due credit to significant people but also recognize that they are operating in a context which enables the work.” In other words, great leaders rely on the resources and opportunities available to them, which means they do not shape history as much as they are molded by the moments in which they live.

Musk’s success would not have been possible without, among other things, government funding for basic research and subsidies for electric cars and solar panels. Above all, he has benefited from a long series of innovations in batteries, solar cells, and space travel. He no more produced the technological landscape in which he operates than the Russians created the harsh winter that allowed them to vanquish Napoleon. Yet in the press and among venture capitalists, the great-man model of Musk persists, with headlines citing, for instance, “His Plan to Change the Way the World Uses Energy” and his own claim of “changing history.”

The problem with such portrayals is not merely that they are inaccurate and unfair to the many contributors to new technologies. By warping the popular understanding of how technologies develop, great-man myths threaten to undermine the structure that is actually necessary for future innovations.

Space cowboy

Elon Musk, the best-selling biography by business writer Ashlee Vance, describes Musk’s personal and professional trajectory—and seeks to explain how, exactly, the man’s repeated “willingness to tackle impossible things” has “turned him into a deity in Silicon Valley.”

Born in South Africa in 1971, Musk moved to Canada at age 17; he took a job cleaning the boiler room of a lumber mill and then talked his way into an internship at a bank by cold-calling a top executive. After studying physics and economics in Canada and at the Wharton School of the University of Pennsylvania, he enrolled in a PhD program at Stanford but opted out after a couple of days. Instead, in 1995, he cofounded a company called Zip2, which provided an online map of businesses—“a primitive Google maps meets Yelp,” as Vance puts it. Although he was not the most polished coder, Musk worked around the clock and slept “on a beanbag next to his desk.” This drive is “what the VCs saw—that he was willing to stake his existence on building out this platform,” an early employee told Vance. After Compaq bought Zip2, in 1999, Musk helped found an online financial services company that eventually became PayPal. This was when he “began to hone his trademark style of entering an ultracomplex business and not letting the fact that he knew very little about the industry’s nuances bother him,” Vance writes.

When eBay bought PayPal for $1.5 billion, in 2002, Musk emerged with the wherewithal to pursue two passions he believed could change the world. He founded SpaceX with the goal of building cheaper rockets that would facilitate research and space travel. Investing over $100 million of his personal fortune, he hired engineers with aeronautics experience, built a factory in Los Angeles, and began to oversee test launches from a remote island between Hawaii and Guam. At the same time, Musk cofounded Tesla Motors to develop battery technology and electric cars. Over the years, he cultivated a media persona that was “part playboy, part space cowboy,” Vance writes.

Musk sells himself as a singular mover of mountains and does not like to share credit for his success. At SpaceX, in particular, the engineers “flew into a collective rage every time they caught Musk in the press claiming to have designed the Falcon rocket more or less by himself,” Vance writes, referring to one of the company’s early models. In fact, Musk depends heavily on people with more technical expertise in rockets and cars, more experience with aeronautics and energy, and perhaps more social grace in managing an organization. Those who survive under Musk tend to be workhorses willing to forgo public acclaim. At SpaceX, there is Gwynne Shotwell, the company president, who manages operations and oversees complex negotiations. At Tesla, there is JB Straubel, the chief technology officer, responsible for major technical advances. Shotwell and Straubel are among “the steady hands that will forever be expected to stay in the shadows,” writes Vance. (Martin Eberhard, one of the founders of Tesla and its first CEO, arguably contributed far more to its engineering achievements. He had a bitter feud with Musk and left the company years ago.)

Likewise, Musk’s success at Tesla is undergirded by public-sector investment and political support for clean tech. For starters, Tesla relies on lithium-ion batteries pioneered in the late 1980s with major funding from the Department of Energy and the National Science Foundation. Tesla has benefited significantly from guaranteed loans and state and federal subsidies. In 2010, the company reached a loan agreement with the Department of Energy worth $465 million. (Under this arrangement, Tesla agreed to produce battery packs that other companies could benefit from and promised to manufacture electric cars in the United States.) In addition, Tesla has received $1.29 billion in tax incentives from Nevada, where it is building a “gigafactory” to produce batteries for cars and consumers. It has won an array of other loans and tax credits, plus rebates for its consumers, totaling another $1 billion, according to a recent series by the Los Angeles Times.

It is striking, then, that Musk insists on a success story that fails to acknowledge the importance of public-sector support. (He called the L.A. Times series “misleading and deceptive,” for instance, and told CNBC that “none of the government subsidies are necessary,” though he did admit they are “helpful.”)

If Musk’s unwillingness to look beyond himself sounds familiar, Steve Jobs provides a recent antecedent. Like Musk, who obsessed over Tesla cars’ door handles and touch screens and the layout of the SpaceX factory, Jobs brought a fierce intensity to product design, even if he did not envision the key features of the Mac, the iPod, or the iPhone. An accurate version of Apple’s story would give more acknowledgment not only to the work of other individuals, from designer Jonathan Ive on down, but also to the specific historical context in which Apple’s innovation occurred. “There is not a single key technology behind the iPhone that has not been state funded,” says economist Mariana Mazzucato. This includes the wireless networks, “the Internet, GPS, a touch-screen display, and … the voice-activated personal assistant Siri.” Apple has recombined these technologies impressively. But its achievements rest on many years of public-sector investment. To put it another way, do we really think that if Jobs and Musk had never come along, there would have been no smartphone revolution, no surge of interest in electric vehicles?

Read the entire story here.

Image: The Titan Oceanus, Trevi Fountain, Rome. Public Domain.

13.6 Billion Versus 4004 BCE

The first number, 13.6 billion, is the age in years of the oldest known star in the cosmos. It was discovered recently by astronomers at the Australian National University, using the SkyMapper telescope. The star is located in our Milky Way galaxy, about 6,000 light years away. A little closer to home, in Kentucky at the aptly named Creation Museum, the Synchronological Chart places the beginning of time and all things at 4004 BCE.

Interestingly enough, both Australia and Kentucky should not exist according to the flat-earth myth, or the widespread pre-Columbus view of our world with an edge at the visible horizon. But the evolution-versus-creationism debates continue unabated. The chasm between the two camps remains a mere 13.6 billion years, give or take a handful of millennia. But perhaps, over time, those who subscribe to reason and the scientific method are likely to prevail — an apt example of survival of the most adaptable at work.

Hitch, we still miss you!

From ars technica:

In 1878, the American scholar and minister Sebastian Adams put the final touches on the third edition of his grandest project: a massive Synchronological Chart that covers nothing less than the entire history of the world in parallel, with the deeds of kings and kingdoms running along together in rows over 25 horizontal feet of paper. When the chart reaches 1500 BCE, its level of detail becomes impressive; at 400 CE it becomes eyebrow-raising; at 1300 CE it enters the realm of the wondrous. No wonder, then, that in their 2013 book Cartographies of Time: A History of the Timeline, authors Daniel Rosenberg and Anthony Grafton call Adams’ chart “nineteenth-century America’s surpassing achievement in complexity and synthetic power… a great work of outsider thinking.”

The chart is also the last thing that visitors to Kentucky’s Creation Museum see before stepping into the gift shop, where full-sized replicas can be purchased for $40.

That’s because, in the world described by the museum, Adams’ chart is more than a historical curio; it remains an accurate timeline of world history. Time is said to have begun in 4004 BCE with the creation of Adam, who went on to live for 930 more years. In 2348 BCE, the Earth was then reshaped by a worldwide flood, which created the Grand Canyon and most of the fossil record even as Noah rode out the deluge in an 81,000 ton wooden ark. Pagan practices at the eight-story high Tower of Babel eventually led God to cause a “confusion of tongues” in 2247 BCE, which is why we speak so many different languages today.

Adams notes on the second panel of the chart that “all the history of man, before the flood, extant, or known to us, is found in the first six chapters of Genesis.”

Ken Ham agrees. Ham, CEO of Answers in Genesis (AIG), has become perhaps the foremost living young Earth creationist in the world. He has authored more books and articles than seems humanly possible and has built AIG into a creationist powerhouse. He also made national headlines when the slickly modern Creation Museum opened in 2007.

He has also been looking for the opportunity to debate a prominent supporter of evolution.

And so it was that, as a severe snow and sleet emergency settled over the Cincinnati region, 900 people climbed into cars and wound their way out toward the airport to enter the gates of the Creation Museum. They did not come for the petting zoo, the zip line, or the seasonal camel rides, nor to see the animatronic Noah chortle to himself about just how easy it had really been to get dinosaurs inside his ark. They did not come to see The Men in White, a 22-minute movie that plays in the museum’s halls in which a young woman named Wendy sees that what she’s been taught about evolution “doesn’t make sense” and is then visited by two angels who help her understand the truth of six-day special creation. They did not come to see the exhibits explaining how all animals had, before the Fall of humanity into sin, been vegetarians.

They came to see Ken Ham debate TV presenter Bill Nye the Science Guy—an old-school creation v. evolution throwdown for the PowerPoint age. Even before it began, the debate had been good for both men. Traffic to AIG’s website soared by 80 percent, Nye appeared on CNN, tickets sold out in two minutes, and post-debate interviews were lined up with Piers Morgan Live and MSNBC.

While plenty of Ham supporters filled the parking lot, so did people in bow ties and “Bill Nye is my Homeboy” T-shirts. They all followed the stamped dinosaur tracks to the museum’s entrance, where a pack of AIG staffers wearing custom debate T-shirts stood ready to usher them into “Discovery Hall.”

Security at the Creation Museum is always tight; the museum’s security force is made up of sworn (but privately funded) Kentucky peace officers who carry guns, wear flat-brimmed state trooper-style hats, and operate their own K-9 unit. For the debate, Nye and Ham had agreed to more stringent measures. Visitors passed through metal detectors complete with secondary wand screenings, packages were prohibited in the debate hall itself, and the outer gates were closed 15 minutes before the debate began.

Inside the hall, packed with bodies and the blaze of high-wattage lights, the temperature soared. The empty stage looked—as everything at the museum does—professionally designed, with four huge video screens, custom debate banners, and a pair of lecterns sporting Mac laptops. 20 different video crews had set up cameras in the hall, and 70 media organizations had registered to attend. More than 10,000 churches were hosting local debate parties. As AIG technical staffers made final preparations, one checked the YouTube-hosted livestream—242,000 people had already tuned in before start time.

An AIG official took the stage eight minutes before start time. “We know there are people who disagree with each other in this room,” he said. “No cheering or—please—any disruptive behavior.”

At 6:59pm, the music stopped and the hall fell silent but for the suddenly prominent thrumming of the air conditioning. For half a minute, the anticipation was electric, all eyes fixed on the stage, and then the countdown clock ticked over to 7:00pm and the proceedings snapped to life. Nye, wearing his traditional bow tie, took the stage from the left; Ham appeared from the right. The two shook hands in the center to sustained applause, and CNN’s Tom Foreman took up his moderating duties.

Ham had won the coin toss backstage and so stepped to his lectern to deliver brief opening remarks. “Creation is the only viable model of historical science confirmed by observational science in today’s modern scientific era,” he declared, blasting modern textbooks for “imposing the religion of atheism” on students.

“We’re teaching people to think critically!” he said. “It’s the creationists who should be teaching the kids out there.”

And we were off.

Two kinds of science

Digging in the fossil fields of Colorado or North Dakota, scientists regularly uncover the bones of ancient creatures. No one doubts the existence of the bones themselves; they lie on the ground for anyone to observe or weigh or photograph. But in which animal did the bones originate? How long ago did that animal live? What did it look like? One of Ham’s favorite lines is that the past “doesn’t come with tags”—so the prehistory of a stegosaurus thigh bone has to be interpreted by scientists, who use their positions in the present to reconstruct the past.

For mainstream scientists, this is simply an obvious statement of our existential position. Until a real-life Dr. Emmett “Doc” Brown finds a way to power a DeLorean with a 1.21 gigawatt flux capacitor in order to shoot someone back through time to observe the flaring-forth of the Universe, the formation of the Earth, or the origins of life, the prehistoric past can’t be known except by interpretation. Indeed, this isn’t true only of prehistory; as Nye tried to emphasize, forensic scientists routinely use what they know of nature’s laws to reconstruct past events like murders.

For Ham, though, science is broken into two categories, “observational” and “historical,” and only observational science is trustworthy. In the initial 30-minute presentation of his position, Ham hammered the point home.

“You don’t observe the past directly,” he said. “You weren’t there.”

Ham spoke with the polish of a man who has covered this ground a hundred times before, has heard every objection, and has a smooth answer ready for each one.

When Bill Nye talks about evolution, Ham said, that’s “Bill Nye the Historical Science Guy” speaking—with “historical” being a pejorative term.

In Ham’s world, only changes that we can observe directly are the proper domain of science. Thus, when confronted with the issue of speciation, Ham readily admits that contemporary lab experiments on fast-breeding creatures like mosquitoes can produce new species. But he says that’s simply “micro-evolution” below the family level. He doesn’t believe that scientists can observe “macro-evolution,” such as the alteration of a lobe-finned fish into a tiger over millions of years.

Because they can’t see historical events unfold, scientists must rely on reconstructions of the past. Those might be accurate, but they simply rely on too many “assumptions” for Ham to trust them. When confronted during the debate with evidence from ancient trees that have more rings than there are years on the Adams Synchronological Chart, Ham simply shrugged.

“We didn’t see those layers laid down,” he said.

To him, the calculus of “one ring, one year” is merely an assumption when it comes to the past—an assumption possibly altered by cataclysmic events such as Noah’s flood.

In other words, “historical science” is dubious; we should defer instead to the “observational” account of someone who witnessed all past events: God, said to have left humanity an eyewitness account of the world’s creation in the book of Genesis. All historical reconstructions should thus comport with this more accurate observational account.

Mainstream scientists don’t recognize this divide between observational and historical ways of knowing (much as they reject Ham’s distinction between “micro” and “macro” evolution). Dinosaur bones may not come with tags, but neither does observed contemporary reality—think of a doctor presented with a set of patient symptoms, who then has to interpret what she sees in order to arrive at a diagnosis.

Given that the distinction between two kinds of science provides Ham’s key reason for accepting the “eyewitness account” of Genesis as a starting point, it was unsurprising to see Nye take generous whacks at the idea. You can’t observe the past? “That’s what we do in astronomy,” said Nye in his opening presentation. Since light takes time to get here, “All we can do in astronomy is look at the past. By the way, you’re looking at the past right now.”

Those in the present can study the past with confidence, Nye said, because natural laws are generally constant and can be used to extrapolate into the past.

“This idea that you can separate the natural laws of the past from the natural laws you have now is at the heart of our disagreement,” Nye said. “For lack of a better word, it’s magical. I’ve appreciated magic since I was a kid, but it’s not what we want in mainstream science.”

How do scientists know that these natural laws are correctly understood in all their complexity and interplay? What operates as a check on their reconstructions? That’s where the predictive power of evolutionary models becomes crucial, Nye said. Those models of the past should generate predictions which can then be verified—or disproved—through observations in the present.

Read the entire article here.

Dragons of the Mind

From the Wall Street Journal:

Peter Jackson’s “Hobbit” movie is on its way, and with it will come the resurrection of the vile dragon Smaug. With fiery breath, razor-sharp claws, scales as hard as shields and a vast underground lair, Smaug is portrayed in J.R.R. Tolkien’s text as a merciless killer. But where did the idea for such a bizarre beast—with such an odd mixture of traits—come from in the first place?

Historically, most monsters were spawned not from pure imagination but from aspects of the natural world that our ancestors did not understand. Whales seen breaking the surface of the ancient oceans were sea monsters, the fossilized bones of prehistoric humans were the victims of Medusa, the roars of earthquakes were thought to emanate from underground beasts. The list goes on. But tracing Smaug’s draconic heritage is more complicated.

At first glance, dinosaurs seem the obvious source for the dragon myth. Our ancestors simply ran into Tyrannosaur skulls, became terrified and came up with the idea that such monsters must still be around. It all sounds so logical, but it’s unlikely to be true.

Dragon myths were alive and well in the ancient Mediterranean world, despite the fact that the region is entirely bereft of dinosaur fossils. The Assyrians had Tiamat, a giant snake with horns (leading some to dispute whether it even qualifies as a dragon). The Greeks, for their part, had a fierce reptilian beast that guarded the golden fleece. In depicting it, they oscillated between a tiny viper and a huge snake capable of swallowing people whole. But even in this latter case, there was no fire-breathing or underground hoard, just a big reptile.

For decades, zoologists have argued that the only snakes humans ever had to seriously worry about were of the venomous variety. Last year, however, a study published in the Proceedings of the National Academy of Sciences revealed that members of Philippine tribes are regularly eaten by enormous constrictors and that this was likely a common problem throughout human evolution. Moreover, reports by Pliny the Elder and others describe snakes of such size existing in the ancient Mediterranean world and sometimes attacking people. It seems likely that the early dragon myths were based on these real reptilian threats.

But Tolkien’s Smaug lives below the Lonely Mountain and breathes fire. Some reptiles live below ground, but none breathes anything that looks remotely like flame. Yet as strange as this trait may seem, it too may have natural origins.

Among the earliest mythical dragons that lived underground are those found in the 12th-century tales of Geoffrey of Monmouth. Geoffrey recounts the story of Vortigern, an ancient British king who was forced to flee to the hills of Wales as Saxons invaded. Desperate to make a final stand, Vortigern orders a fortress to be built, but its walls keep falling over. Baffled, Vortigern seeks the advice of his wise men, who tell him that the ground must be consecrated with the blood of a child who is not born from the union between a man and a woman. Vortigern agrees and sends the wise men off to find such a child.

Not far away, in the town of Carmarthen, they come across two boys fighting. One insults the other as a bastard who has no father, and the wise men drag him back to Vortigern.

When the boy learns that he is to be killed, he tells Vortigern that his advisers have got things wrong. He declares that there are dragons below the ground and that their wrestling with one another is what keeps the walls from standing. Vortigern tests the boy’s theory out, and sure enough, as his men dig deeper, they discover the dragons’ “panting” flames.

Read the entire article following the jump.

Image: Zmey Gorynych, the Russian three-headed dragon. Courtesy of Wikipedia.