Tag Archives: history

A Vinyl-head’s Dream

vinyl-LPs

If you’ve ever owned vinyl — the circular, black, 12-inch kind — you will know that there are certain pleasures associated with it: the different quality of sound from the spiraling grooves; the (usually) gorgeous album cover art; the printed lyrics and liner notes; sometimes an included wall poster.

Cassette tapes and then CDs shrank these pleasures. Then came the death knell, tolled first by MP3 (MPEG-1 Audio Layer III), then MP4, and finally streaming.

Fortunately, some of us have managed to hold on to our precious vinyl collections: our classic LPs and rare 12-inch singles, though not so much the 45s. And, to some extent, vinyl is having a small — but probably temporary — renaissance.

So, I must admit to awe and a little envy over Zero Freitas’ collection. Over the years he has amassed more than 6 million records, and during his 40-plus years of collecting he’s evolved from a mere vinyl junkie into a global curator and preservationist.

From the Vinyl Factory:

Nearly everyone interested in records will have, at some point heard, the news that there is a Brazilian who owns millions of records. Fewer seem to know, however, that Zero Freitas, a São Paulo-based businessman now in his sixties, plans to turn his collection into a public archive of the world’s music, with special focus on the Americas. Having amassed over six million records, he manages a collection similar to the entire Discogs database. Given the magnitude of this enterprise, Freitas deals with serious logistical challenges and, above all, time constraints. But he strongly believes it is worth his while. After all, no less than a vinyl library of global proportions is at stake.

How to become a part of this man’s busy timetable – that was the question that remained unanswered almost until the very end of my stay in São Paulo in April 2015. It was 8 am on my second last morning in the city, when Viviane Riegel, my Brazilian partner in crime, received a terse message: ‘if you can make it by 10am to his warehouse, he’ll have an hour for you’. That was our chance. We instantly took a taxi from the city’s south-west part called Campo Belo to a more westerly neighbourhood of Vila Leopoldina. We were privileged enough to listen to Freitas’ stories for what felt like a very quick hundred minutes. His attitude and life’s work provoked compelling questions.

The analogue record in the digital age
What makes any vinyl collection truly valuable? How to tell a mere hoarder from a serious collector? And why is vinyl collectable now, at a time of intensive digitalization of life and culture?

Publically pronounced dead by the mainstream industry in the 1990s, vinyl never really ceased to live and has proved much more resilient than the corporate prophets of digital ‘progress’ would like us to believe. Apart from its unique physical properties, vinyl records contain a history that’s longer than any digital medium can ever hope to replicate. Zero Freitas insists that this history has not been fully told yet. Indeed, when acquired and classified with a set of principles in mind, records may literally offer a record of culture, for they preserve not just sounds, but also artistic expression, visual sensibility, poetry, fashion, ideas of genre differentiation and packaging design, and sometimes social commentary of a given time and place. If you go through your life with records, then your collection might be a record of your life. Big record collections are private libraries of cultural import and aesthetic appeal. They are not so very different from books, a medium we still hold in high regard. Books and records invite ritualistic experience, their digital counterparts offer routine convenience.

The problem is that many records are becoming increasingly rare. As Portuguese musicologist Rui Vieira Nery writes, reflecting on the European case of Fado music, “the truth is that, strange as it may seem, collections of Fado recordings as recent as the ’50s to ’70s are difficult to get hold of.” Zero Freitas emphasizes that the situation of collections from other parts of the world may be even worse.

We have to ask then, what we lose if we don’t get hold of them? For one thing, records preserve the past. They save something intangible from oblivion, where a tune or a cover can suddenly transport us back in time to a younger version of ourselves and the feelings we once had. Rare and independently released records can provide a chance for genuine discovery and learning. They help acquire new tastes, delve into different under-represented stories.

What Thomas Carlyle once wrote about books applies to vinyl perhaps with even greater force: ‘in books lies the soul of the whole past time, the articulate audible voice of the past when the body and material substance of it has altogether vanished like a dream’. This quote is inscribed in stone on the wall of the Mitchell Library in Sydney. Having listened to Zero Freitas, this motto could just as easily apply to his vinyl library project. Focusing on rare Brazilian music, he wants to save some endangered species of vinyl, and thus to raise awareness of world’s vast but jeopardised musical ecologies. This task seems urgent now as our attention span gets ever shorter and more distracted, as reflected in the uprooted samples and truncated snippets of music scattered all over the internet.

Read the entire article here.

Image: Vinyl LPs. Courtesy of the author.

Documenting the Self

Samuel_Pepys

Is Nicolas Felton the Samuel Pepys of our digital age?

They both chronicled their observations over a period of 10 years, but separated by 345 years. However, that’s where the similarity between the two men ends.

Samuel Pepys was a 17th-century member of the British Parliament and naval bureaucrat, famous for his decade-long private diary. Pepys kept detailed personal notes from 1660 to 1669. The diary was subsequently published in the 19th century, and is now regarded as one of the principal sources of information about the Restoration period (the return of the monarchy under Charles II). Many a British school kid [myself included] has been exposed to Pepys’ observations of momentous events, including his tales of the plague and the Great Fire of London.

Nicolas Felton, a graphic designer and ex-Facebook employee, cataloged his life from 2005 to 2015. Based in New York, Felton began obsessively recording the minutiae of his life in 2005. He first tracked his locations and the time spent in each, followed by his music-listening habits. Then he began counting his emails, correspondence, calendar entries and photos. Felton eventually compiled his detailed digital tracks into a visually fascinating annual Feltron Report.

So, Felton is certainly no Pepys, but his data trove remains interesting nonetheless — for different reasons. Pepys recorded history during a tumultuous time in England; his very rare, detailed first-person account across an entire decade has no parallel. His diary is now an invaluable literary chronicle for scholars and history buffs.

Our world is rather different today. Our technologies now enable institutions and individuals to record and relate their observations ad nauseam. Thus Felton’s data is not unique per se, though his decade-long obsession certainly provides a rich quantitative trove, one less useful as history than as raw material for those who study our tracks and needs, and market to us.

Read Samuel Pepys’ diary here. Read more about Nicolas Felton here.

Image: Samuel Pepys by John Hayls, oil on canvas, 1666. National Portrait Gallery. Public Domain.

Digital Forensics and the Wayback Machine

Amazon-Aug1999

Many of us see history — the school subject — as rather dull and boring. After all, how can the topic be made interesting when it’s usually taught by a coach who has other things on his or her mind [no joke, I have evidence of this from both sides of the Atlantic!].

Yet we also know that history’s lessons are essential to shaping our current world view and our vision for the future, in myriad ways. Ever since humans could speak, and later write, our ancestors have recorded and transmitted their histories: first through oral storytelling, then through books and assorted media.

Then came the internet. The explosion of content, media formats and related technologies over the last quarter-century has led to an immense challenge for archivists and historians intent on cataloging our digital stories. One facet of this challenge is the tremendous volume of information and its accelerating growth. Another is the dynamic nature of the content — much of it being constantly replaced and refreshed.

But, all is not lost. The Internet Archive, founded in 1996, has been quietly archiving text, pages, images, audio and, more recently, entire web sites from the Tubes of the vast Internets. Currently the non-profit has archived around half a trillion web pages. It’s our modern-day equivalent of the Library of Alexandria.

Please say hello to the Internet Archive Wayback Machine, and give it a try. The Wayback Machine took the screenshot above of Amazon.com in 1999, in case you’ve ever wondered what Amazon looked like before it swallowed or destroyed entire retail sectors.
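
If you want to poke at the archive programmatically, the Internet Archive also exposes a simple public availability endpoint (archive.org/wayback/available). The short Python sketch below is a minimal, illustrative query for the snapshot closest to a given date; it assumes the endpoint still returns its documented JSON shape, and the example URL and timestamp are just stand-ins.

```python
import json
import urllib.parse
import urllib.request

def closest_snapshot(url, timestamp="19990801"):
    """Ask the Wayback Machine's availability API for the archived
    snapshot nearest to the given YYYYMMDD timestamp. Returns the
    snapshot URL, or None if nothing has been captured."""
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    endpoint = "https://archive.org/wayback/available?" + query
    with urllib.request.urlopen(endpoint) as response:
        data = json.load(response)
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None

if __name__ == "__main__":
    # Roughly when the screenshot above was taken (illustrative only).
    print(closest_snapshot("amazon.com", "19990801"))
```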

From the New Yorker:

Malaysia Airlines Flight 17 took off from Amsterdam at 10:31 A.M. G.M.T. on July 17, 2014, for a twelve-hour flight to Kuala Lumpur. Not much more than three hours later, the plane, a Boeing 777, crashed in a field outside Donetsk, Ukraine. All two hundred and ninety-eight people on board were killed. The plane’s last radio contact was at 1:20 P.M. G.M.T. At 2:50 P.M. G.M.T., Igor Girkin, a Ukrainian separatist leader also known as Strelkov, or someone acting on his behalf, posted a message on VKontakte, a Russian social-media site: “We just downed a plane, an AN-26.” (An Antonov 26 is a Soviet-built military cargo plane.) The post includes links to video of the wreckage of a plane; it appears to be a Boeing 777.

Two weeks before the crash, Anatol Shmelev, the curator of the Russia and Eurasia collection at the Hoover Institution, at Stanford, had submitted to the Internet Archive, a nonprofit library in California, a list of Ukrainian and Russian Web sites and blogs that ought to be recorded as part of the archive’s Ukraine Conflict collection. Shmelev is one of about a thousand librarians and archivists around the world who identify possible acquisitions for the Internet Archive’s subject collections, which are stored in its Wayback Machine, in San Francisco. Strelkov’s VKontakte page was on Shmelev’s list. “Strelkov is the field commander in Slaviansk and one of the most important figures in the conflict,” Shmelev had written in an e-mail to the Internet Archive on July 1st, and his page “deserves to be recorded twice a day.”

On July 17th, at 3:22 P.M. G.M.T., the Wayback Machine saved a screenshot of Strelkov’s VKontakte post about downing a plane. Two hours and twenty-two minutes later, Arthur Bright, the Europe editor of the Christian Science Monitor, tweeted a picture of the screenshot, along with the message “Grab of Donetsk militant Strelkov’s claim of downing what appears to have been MH17.” By then, Strelkov’s VKontakte page had already been edited: the claim about shooting down a plane was deleted. The only real evidence of the original claim lies in the Wayback Machine.

The average life of a Web page is about a hundred days. Strelkov’s “We just downed a plane” post lasted barely two hours. It might seem, and it often feels, as though stuff on the Web lasts forever, for better and frequently for worse: the embarrassing photograph, the regretted blog (more usually regrettable not in the way the slaughter of civilians is regrettable but in the way that bad hair is regrettable). No one believes any longer, if anyone ever did, that “if it’s on the Web it must be true,” but a lot of people do believe that if it’s on the Web it will stay on the Web. Chances are, though, that it actually won’t. In 2006, David Cameron gave a speech in which he said that Google was democratizing the world, because “making more information available to more people” was providing “the power for anyone to hold to account those who in the past might have had a monopoly of power.” Seven years later, Britain’s Conservative Party scrubbed from its Web site ten years’ worth of Tory speeches, including that one. Last year, BuzzFeed deleted more than four thousand of its staff writers’ early posts, apparently because, as time passed, they looked stupider and stupider. Social media, public records, junk: in the end, everything goes.

Web pages don’t have to be deliberately deleted to disappear. Sites hosted by corporations tend to die with their hosts. When MySpace, GeoCities, and Friendster were reconfigured or sold, millions of accounts vanished. (Some of those companies may have notified users, but Jason Scott, who started an outfit called Archive Team—its motto is “We are going to rescue your shit”—says that such notification is usually purely notional: “They were sending e-mail to dead e-mail addresses, saying, ‘Hello, Arthur Dent, your house is going to be crushed.’ ”) Facebook has been around for only a decade; it won’t be around forever. Twitter is a rare case: it has arranged to archive all of its tweets at the Library of Congress. In 2010, after the announcement, Andy Borowitz tweeted, “Library of Congress to acquire entire Twitter archive—will rename itself Museum of Crap.” Not long after that, Borowitz abandoned that Twitter account. You might, one day, be able to find his old tweets at the Library of Congress, but not anytime soon: the Twitter Archive is not yet open for research. Meanwhile, on the Web, if you click on a link to Borowitz’s tweet about the Museum of Crap, you get this message: “Sorry, that page doesn’t exist!”

The Web dwells in a never-ending present. It is—elementally—ethereal, ephemeral, unstable, and unreliable. Sometimes when you try to visit a Web page what you see is an error message: “Page Not Found.” This is known as “link rot,” and it’s a drag, but it’s better than the alternative. More often, you see an updated Web page; most likely the original has been overwritten. (To overwrite, in computing, means to destroy old data by storing new data in their place; overwriting is an artifact of an era when computer storage was very expensive.) Or maybe the page has been moved and something else is where it used to be. This is known as “content drift,” and it’s more pernicious than an error message, because it’s impossible to tell that what you’re seeing isn’t what you went to look for: the overwriting, erasure, or moving of the original is invisible. For the law and for the courts, link rot and content drift, which are collectively known as “reference rot,” have been disastrous. In providing evidence, legal scholars, lawyers, and judges often cite Web pages in their footnotes; they expect that evidence to remain where they found it as their proof, the way that evidence on paper—in court records and books and law journals—remains where they found it, in libraries and courthouses. But a 2013 survey of law- and policy-related publications found that, at the end of six years, nearly fifty per cent of the URLs cited in those publications no longer worked. According to a 2014 study conducted at Harvard Law School, “more than 70% of the URLs within the Harvard Law Review and other journals, and 50% of the URLs within United States Supreme Court opinions, do not link to the originally cited information.” The overwriting, drifting, and rotting of the Web is no less catastrophic for engineers, scientists, and doctors. Last month, a team of digital library researchers based at Los Alamos National Laboratory reported the results of an exacting study of three and a half million scholarly articles published in science, technology, and medical journals between 1997 and 2012: one in five links provided in the notes suffers from reference rot. It’s like trying to stand on quicksand.

The footnote, a landmark in the history of civilization, took centuries to invent and to spread. It has taken mere years nearly to destroy. A footnote used to say, “Here is how I know this and where I found it.” A footnote that’s a link says, “Here is what I used to know and where I once found it, but chances are it’s not there anymore.” It doesn’t matter whether footnotes are your stock-in-trade. Everybody’s in a pinch. Citing a Web page as the source for something you know—using a URL as evidence—is ubiquitous. Many people find themselves doing it three or four times before breakfast and five times more before lunch. What happens when your evidence vanishes by dinnertime?

The day after Strelkov’s “We just downed a plane” post was deposited into the Wayback Machine, Samantha Power, the U.S. Ambassador to the United Nations, told the U.N. Security Council, in New York, that Ukrainian separatist leaders had “boasted on social media about shooting down a plane, but later deleted these messages.” In San Francisco, the people who run the Wayback Machine posted on the Internet Archive’s Facebook page, “Here’s why we exist.”

Read the entire story here.

Image: Wayback Machine’s screenshot of Amazon.com’s home page, August 1999.

Past Experience is Good; Random Decision-Making is Better

We all know that making decisions from past experience is wise. We learn from the benefit of hindsight. We learn to make small improvements or radical shifts in our thinking and behaviors based on history and previous empirical evidence. Stock market gurus and investment mavens will tell you time after time that they have a proven method — based on empirical evidence and a lengthy, illustrious track record — for picking the next great stock or investing your hard-earned retirement funds.

Yet empirical evidence shows that chimpanzees throwing darts at the WSJ stock pages pick stocks just as well as we humans (and the “masters of the universe”) do. So it seems that random decision-making can be just as good as, if not better than, wisdom and experience.
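
As an aside, the dart-throwing experiment is easy to mimic in a few lines of code. The sketch below is purely illustrative: it uses synthetic, randomly generated returns rather than real market data and compares random picks against a simple equal-weighted average, so it does not reproduce the published monkey-portfolio studies; it just shows how little machinery a random-selection baseline needs.

```python
import random

random.seed(42)

# Synthetic universe: 500 "stocks", each with a made-up annual return.
# These numbers are invented for illustration, not real market data.
universe = [random.gauss(0.07, 0.25) for _ in range(500)]
market_return = sum(universe) / len(universe)

def dart_portfolio(returns, picks=20):
    """Pick `picks` stocks uniformly at random and average their returns."""
    chosen = random.sample(returns, picks)
    return sum(chosen) / picks

# Simulate many dart-throwing "monkeys" and see how they fare.
monkeys = [dart_portfolio(universe) for _ in range(10_000)]
beat_market = sum(r > market_return for r in monkeys) / len(monkeys)

print(f"Equal-weighted market return: {market_return:.2%}")
print(f"Monkeys beating that average: {beat_market:.1%}")
```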

From the Guardian:

No matter how much time you spend reading the recent crop of books on How To Decide or How To Think Clearly, you’re unlikely to encounter glowing references to a decision-making system formerly used by the Azande of central Africa. Faced with a dilemma, tribespeople would force poison down the neck of a chicken while asking questions of the “poison oracle”; the chicken answered by surviving (“yes”) or expiring (“no”). Clearly, this was cruel to chickens. That aside, was it such a terrible way to choose among options? The anthropologist EE Evans-Pritchard, who lived with the Azande in the 1920s, didn’t think so. “I always kept a supply of poison [and] we regulated our affairs in accordance with the oracle’s decisions,” he wrote, adding drily: “I found this as satisfactory a way of running my home and affairs as any other I know of.” You could dismiss that as a joke. After all, chicken-poisoning is plainly superstition, delivering random results. But what if random results are sometimes exactly what you need?

The other day, US neuroscientists published details of experiments on rats, showing that in certain unpredictable situations, they stop trying to make decisions based on past experience. Instead, a circuit in their brains switches to “random mode”. The researchers’ hunch is that this serves a purpose: past experience is usually helpful, but when uncertainty levels are high, it can mislead, so randomness is in the rats’ best interests. When we’re faced with the unfamiliar, experience can mislead humans, too, partly because we filter it through various irrational biases. According to those books on thinking clearly, we should strive to overcome these biases, thus making more rational calculations. But there’s another way to bypass our biased brains: copy the rats, and choose randomly.

In certain walks of life, the usefulness of randomness is old news: the stock market, say, is so unpredictable that, to quote the economist Burton Malkiel, “a blindfolded monkey throwing darts at a newspaper’s financial pages could select a portfolio that would do as well as one carefully selected by experts”. (This has been tried, with simulated monkeys, and they beat the market.) But, generally, as Michael Schulson put it recently in an Aeon magazine essay, “We take it for granted that the best decisions stem from empirical analysis and informed choice.” Yet consider, he suggests, the ancient Greek tradition of filling some government positions by lottery. Randomness disinfects a process that might be dirtied by corruption.

Randomness can be similarly useful in everyday life. For tiny choices, it’s a time-saver: pick randomly from a menu, and you can get back to chatting with friends. For bigger ones, it’s an acknowledgment of how little one can ever know about the complex implications of a decision. Let’s be realistic: for the biggest decisions, such as whom to marry, trusting to randomness feels absurd. But if you can up the randomness quotient for marginally less weighty choices, especially when uncertainty prevails, you may find it pays off. Though kindly refrain from poisoning any chickens.

Read the entire article here.

The Future of History

[tube]f3nJOCfkerI[/tube]

Take an impassioned history professor and a mediocre U.S. high school history curriculum, add Bill Gates, and you get an opportunity to inject fresh perspectives and new ideas into young minds.

Not too long ago Professor David Christian’s collection of Big History DVDs caught Gates’ attention, leading to a broad mission to overhaul the boring history lesson — one school at a time. Professor Christian takes a thoroughly holistic approach to the subject, spanning broad and interconnected topics such as culture, biochemistry, astronomy, agriculture and physics. The sweeping narrative fundamental to Christian’s delivery reminds me somewhat of Kenneth Clark’s Civilisation and Jacob Bronowski’s The Ascent of Man, two landmark U.K. television series.

From the New York Times:

In 2008, shortly after Bill Gates stepped down from his executive role at Microsoft, he often awoke in his 66,000-square-foot home on the eastern bank of Lake Washington and walked downstairs to his private gym in a baggy T-shirt, shorts, sneakers and black socks yanked up to the midcalf. Then, during an hour on the treadmill, Gates, a self-described nerd, would pass the time by watching DVDs from the Teaching Company’s “Great Courses” series. On some mornings, he would learn about geology or meteorology; on others, it would be oceanography or U.S. history.

As Gates was working his way through the series, he stumbled upon a set of DVDs titled “Big History” — an unusual college course taught by a jovial, gesticulating professor from Australia named David Christian. Unlike the previous DVDs, “Big History” did not confine itself to any particular topic, or even to a single academic discipline. Instead, it put forward a synthesis of history, biology, chemistry, astronomy and other disparate fields, which Christian wove together into nothing less than a unifying narrative of life on earth. Standing inside a small “Mr. Rogers”-style set, flanked by an imitation ivy-covered brick wall, Christian explained to the camera that he was influenced by the Annales School, a group of early-20th-century French historians who insisted that history be explored on multiple scales of time and space. Christian had subsequently divided the history of the world into eight separate “thresholds,” beginning with the Big Bang, 13 billion years ago (Threshold 1), moving through to the origin of Homo sapiens (Threshold 6), the appearance of agriculture (Threshold 7) and, finally, the forces that gave birth to our modern world (Threshold 8).

Christian’s aim was not to offer discrete accounts of each period so much as to integrate them all into vertiginous conceptual narratives, sweeping through billions of years in the span of a single semester. A lecture on the Big Bang, for instance, offered a complete history of cosmology, starting with the ancient God-centered view of the universe and proceeding through Ptolemy’s Earth-based model, through the heliocentric versions advanced by thinkers from Copernicus to Galileo and eventually arriving at Hubble’s idea of an expanding universe. In the worldview of “Big History,” a discussion about the formation of stars cannot help including Einstein and the hydrogen bomb; a lesson on the rise of life will find its way to Jane Goodall and Dian Fossey. “I hope by the end of this course, you will also have a much better sense of the underlying unity of modern knowledge,” Christian said at the close of the first lecture. “There is a unified account.”

As Gates sweated away on his treadmill, he found himself marveling at the class’s ability to connect complex concepts. “I just loved it,” he said. “It was very clarifying for me. I thought, God, everybody should watch this thing!” At the time, the Bill & Melinda Gates Foundation had donated hundreds of millions of dollars to educational initiatives, but many of these were high-level policy projects, like the Common Core Standards Initiative, which the foundation was instrumental in pushing through. And Gates, who had recently decided to become a full-time philanthropist, seemed to pine for a project that was a little more tangible. He was frustrated with the state of interactive coursework and classroom technology since before he dropped out of Harvard in the mid-1970s; he yearned to experiment with entirely new approaches. “I wanted to explore how you did digital things,” he told me. “That was a big issue for me in terms of where education was going — taking my previous skills and applying them to education.” Soon after getting off the treadmill, he asked an assistant to set a meeting with Christian.

A few days later, the professor, who was lecturing at San Diego State University, found himself in the lobby of a hotel, waiting to meet with the billionaire. “I was scared,” Christian recalled. “Someone took me along the corridor, knocks on a door, Bill opens it, invites me in. All I remember is that within five minutes, he had so put me at my ease. I thought, I’m a nerd, he’s a nerd and this is fun!” After a bit of small talk, Gates got down to business. He told Christian that he wanted to introduce “Big History” as a course in high schools all across America. He was prepared to fund the project personally, outside his foundation, and he wanted to be personally involved. “He actually gave me his email address and said, ‘Just think about it,’ ” Christian continued. ” ‘Email me if you think this is a good idea.’ ”

Christian emailed to say that he thought it was a pretty good idea. The two men began tinkering, adapting Christian’s college course into a high-school curriculum, with modules flexible enough to teach to freshmen and seniors alike. Gates, who insisted that the course include a strong digital component, hired a team of engineers and designers to develop a website that would serve as an electronic textbook, brimming with interactive graphics and videos. Gates was particularly insistent on the idea of digital timelines, which may have been a vestige of an earlier passion project, Microsoft Encarta, the electronic encyclopedia that was eventually overtaken by the growth of Wikipedia. Now he wanted to offer a multifaceted historical account of any given subject through a friendly user interface. The site, which is open to the public, would also feature a password-protected forum for teachers to trade notes and update and, in some cases, rewrite lesson plans based on their experiences in the classroom.

Read the entire article here.

Video: Clip from Threshold 1, The Big Bang. Courtesy of Big History Project, David Christian.

Time Traveling Camels

camels_at_giza

Camels had no place in the Middle East of biblical times. Forensic scientists, biologists, archeologists, geneticists and paleontologists all seem to agree that camels could not have been present in the early Jewish stories of Genesis and the Old Testament — camels trotted into the land many hundreds of years later.

From the NYT:

There are too many camels in the Bible, out of time and out of place.

Camels probably had little or no role in the lives of such early Jewish patriarchs as Abraham, Jacob and Joseph, who lived in the first half of the second millennium B.C., and yet stories about them mention these domesticated pack animals more than 20 times. Genesis 24, for example, tells of Abraham’s servant going by camel on a mission to find a wife for Isaac.

These anachronisms are telling evidence that the Bible was written or edited long after the events it narrates and is not always reliable as verifiable history. These camel stories “do not encapsulate memories from the second millennium,” said Noam Mizrahi, an Israeli biblical scholar, “but should be viewed as back-projections from a much later period.”

Dr. Mizrahi likened the practice to a historical account of medieval events that veers off to a description of “how people in the Middle Ages used semitrailers in order to transport goods from one European kingdom to another.”

For two archaeologists at Tel Aviv University, the anachronisms were motivation to dig for camel bones at an ancient copper smelting camp in the Aravah Valley in Israel and in Wadi Finan in Jordan. They sought evidence of when domesticated camels were first introduced into the land of Israel and the surrounding region.

The archaeologists, Erez Ben-Yosef and Lidar Sapir-Hen, used radiocarbon dating to pinpoint the earliest known domesticated camels in Israel to the last third of the 10th century B.C. — centuries after the patriarchs lived and decades after the kingdom of David, according to the Bible. Some bones in deeper sediments, they said, probably belonged to wild camels that people hunted for their meat. Dr. Sapir-Hen could identify a domesticated animal by signs in leg bones that it had carried heavy loads.

The findings were published recently in the journal Tel Aviv and in a news release from Tel Aviv University. The archaeologists said that the origin of the domesticated camel was probably in the Arabian Peninsula, which borders the Aravah Valley. Egyptians exploited the copper resources there and probably had a hand in introducing the camels. Earlier, people in the region relied on mules and donkeys as their beasts of burden.

“The introduction of the camel to our region was a very important economic and social development,” Dr. Ben-Yosef said in a telephone interview. “The camel enabled long-distance trade for the first time, all the way to India, and perfume trade with Arabia. It’s unlikely that mules and donkeys could have traversed the distance from one desert oasis to the next.”

Dr. Mizrahi, a professor of Hebrew culture studies at Tel Aviv University who was not directly involved in the research, said that by the seventh century B.C. camels had become widely employed in trade and travel in Israel and through the Middle East, from Africa as far as India. The camel’s influence on biblical research was profound, if confusing, for that happened to be the time that the patriarchal stories were committed to writing and eventually canonized as part of the Hebrew Bible.

“One should be careful not to rush to the conclusion that the new archaeological findings automatically deny any historical value from the biblical stories,” Dr. Mizrahi said in an email. “Rather, they established that these traditions were indeed reformulated in relatively late periods after camels had been integrated into the Near Eastern economic system. But this does not mean that these very traditions cannot capture other details that have an older historical background.”

Read the entire article here.

Image: Camels at the Great Pyramid of Giza, Egypt. Courtesy of Wikipedia.

Apocalypse Now or Later?

Armageddon-poster06

Americans love their apocalypses. So, should demise come at the hands of a natural catastrophe, hastened by human (in)action, or should it come courtesy of an engineered biological or nuclear disaster? You choose. Isn’t this so much fun, thinking about absolute extinction?

Ira Chernus, Professor of Religious Studies at the University of Colorado at Boulder, brings us a much-needed scholarly account of our love affair with all things apocalyptic. But our fascination with Armageddon — often driven by hope — does nothing to resolve the ultimate conundrum: regardless of the type of ending, it is unlikely that Bruce Willis will be featured.

From TomDispatch / Salon:

Wherever we Americans look, the threat of apocalypse stares back at us.

Two clouds of genuine doom still darken our world: nuclear extermination and environmental extinction. If they got the urgent action they deserve, they would be at the top of our political priority list.

But they have a hard time holding our attention, crowded out as they are by a host of new perils also labeled “apocalyptic”: mounting federal debt, the government’s plan to take away our guns, corporate control of the Internet, the Comcast-Time Warner mergerocalypse, Beijing’s pollution airpocalypse, the American snowpocalypse, not to speak of earthquakes and plagues. The list of topics, thrown at us with abandon from the political right, left, and center, just keeps growing.

Then there’s the world of arts and entertainment where selling the apocalypse turns out to be a rewarding enterprise. Check out the website “Romantically Apocalyptic,” Slash’s album “Apocalyptic Love,” or the history-lite documentary “Viking Apocalypse” for starters. These days, mathematicians even have an “apocalyptic number.”

Yes, the A-word is now everywhere, and most of the time it no longer means “the end of everything,” but “the end of anything.” Living a life so saturated with apocalypses undoubtedly takes a toll, though it’s a subject we seldom talk about.

So let’s lift the lid off the A-word, take a peek inside, and examine how it affects our everyday lives. Since it’s not exactly a pretty sight, it’s easy enough to forget that the idea of the apocalypse has been a container for hope as well as fear. Maybe even now we’ll find some hope inside if we look hard enough.

A Brief History of Apocalypse

Apocalyptic stories have been around at least since biblical times, if not earlier. They show up in many religions, always with the same basic plot: the end is at hand; the cosmic struggle between good and evil (or God and the Devil, as the New Testament has it) is about to culminate in catastrophic chaos, mass extermination, and the end of the world as we know it.

That, however, is only Act I, wherein we wipe out the past and leave a blank cosmic slate in preparation for Act II: a new, infinitely better, perhaps even perfect world that will arise from the ashes of our present one. It’s often forgotten that religious apocalypses, for all their scenes of destruction, are ultimately stories of hope; and indeed, they have brought it to millions who had to believe in a better world a-comin’, because they could see nothing hopeful in this world of pain and sorrow.

That traditional religious kind of apocalypse has also been part and parcel of American political life since, in Common Sense, Tom Paine urged the colonies to revolt by promising, “We have it in our power to begin the world over again.”

When World War II — itself now sometimes called an apocalypse – ushered in the nuclear age, it brought a radical transformation to the idea. Just as novelist Kurt Vonnegut lamented that the threat of nuclear war had robbed us of “plain old death” (each of us dying individually, mourned by those who survived us), the theologically educated lamented the fate of religion’s plain old apocalypse.

After this country’s “victory weapon” obliterated two Japanese cities in August 1945, most Americans sighed with relief that World War II was finally over. Few, however, believed that a permanently better world would arise from the radioactive ashes of that war. In the 1950s, even as the good times rolled economically, America’s nuclear fear created something historically new and ominous — a thoroughly secular image of the apocalypse.  That’s the one you’ll get first if you type “define apocalypse” into Google’s search engine: “the complete final destruction of the world.” In other words, one big “whoosh” and then… nothing. Total annihilation. The End.

Apocalypse as utter extinction was a new idea. Surprisingly soon, though, most Americans were (to adapt the famous phrase of filmmaker Stanley Kubrick) learning how to stop worrying and get used to the threat of “the big whoosh.” With the end of the Cold War, concern over a world-ending global nuclear exchange essentially evaporated, even if the nuclear arsenals of that era were left ominously in place.

Meanwhile, another kind of apocalypse was gradually arising: environmental destruction so complete that it, too, would spell the end of all life.

This would prove to be brand new in a different way. It is, as Todd Gitlin has so aptly termed it, history’s first “slow-motion apocalypse.” Climate change, as it came to be called, had been creeping up on us “in fits and starts,” largely unnoticed, for two centuries. Since it was so different from what Gitlin calls “suddenly surging Genesis-style flood” or the familiar “attack out of the blue,” it presented a baffling challenge. After all, the word apocalypse had been around for a couple of thousand years or more without ever being associated in any meaningful way with the word gradual.

The eminent historian of religions Mircea Eliade once speculated that people could grasp nuclear apocalypse because it resembled Act I in humanity’s huge stock of apocalypse myths, where the end comes in a blinding instant — even if Act II wasn’t going to follow. This mythic heritage, he suggested, remains lodged in everyone’s unconscious, and so feels familiar.

But in a half-century of studying the world’s myths, past and present, he had never found a single one that depicted the end of the world coming slowly. This means we have no unconscious imaginings to pair it with, nor any cultural tropes or traditions that would help us in our struggle to grasp it.

That makes it so much harder for most of us even to imagine an environmentally caused end to life. The very category of “apocalypse” doesn’t seem to apply. Without those apocalyptic images and fears to motivate us, a sense of the urgent action needed to avert such a slowly emerging global catastrophe lessens.

All of that (plus of course the power of the interests arrayed against regulating the fossil fuel industry) might be reason enough to explain the widespread passivity that puts the environmental peril so far down on the American political agenda. But as Dr. Seuss would have said, that is not all! Oh no, that is not all.

Apocalypses Everywhere

When you do that Google search on apocalypse, you’ll also get the most fashionable current meaning of the word: “Any event involving destruction on an awesome scale; [for example] ‘a stock market apocalypse.’” Welcome to the age of apocalypses everywhere.

With so many constantly crying apocalyptic wolf or selling apocalyptic thrills, it’s much harder now to distinguish between genuine threats of extinction and the cheap imitations. The urgency, indeed the very meaning, of apocalypse continues to be watered down in such a way that the word stands in danger of becoming virtually meaningless. As a result, we find ourselves living in an era that constantly reflects premonitions of doom, yet teaches us to look away from the genuine threats of world-ending catastrophe.

Oh, America still worries about the Bomb — but only when it’s in the hands of some “bad” nation. Once that meant Iraq (even if that country, under Saddam Hussein, never had a bomb and in 2003, when the Bush administration invaded, didn’t even have a bomb program). Now, it means Iran — another country without a bomb or any known plan to build one, but with the apocalyptic stare focused on it as if it already had an arsenal of such weapons — and North Korea.

These days, in fact, it’s easy enough to pin the label “apocalyptic peril” on just about any country one loathes, even while ignoring friends, allies, and oneself. We’re used to new apocalyptic threats emerging at a moment’s notice, with little (or no) scrutiny of whether the A-word really applies.

What’s more, the Cold War era fixed a simple equation in American public discourse: bad nation + nuclear weapon = our total destruction. So it’s easy to buy the platitude that Iran must never get a nuclear weapon or it’s curtains. That leaves little pressure on top policymakers and pundits to explain exactly how a few nuclear weapons held by Iran could actually harm Americans.

Meanwhile, there’s little attention paid to the world’s largest nuclear arsenal, right here in the U.S. Indeed, America’s nukes are quite literally impossible to see, hidden as they are underground, under the seas, and under the wraps of “top secret” restrictions. Who’s going to worry about what can’t be seen when so many dangers termed “apocalyptic” seem to be in plain sight?

Environmental perils are among them: melting glaciers and open-water Arctic seas, smog-blinded Chinese cities, increasingly powerful storms, and prolonged droughts. Yet most of the time such perils seem far away and like someone else’s troubles. Even when dangers in nature come close, they generally don’t fit the images in our apocalyptic imagination. Not surprisingly, then, voices proclaiming the inconvenient truth of a slowly emerging apocalypse get lost in the cacophony of apocalypses everywhere. Just one more set of boys crying wolf and so remarkably easy to deny or stir up doubt about.

Death in Life

Why does American culture use the A-word so promiscuously? Perhaps we’ve been living so long under a cloud of doom that every danger now readily takes on the same lethal hue.

Psychiatrist Robert Lifton predicted such a state years ago when he suggested that the nuclear age had put us all in the grips of what he called “psychic numbing” or “death in life.” We can no longer assume that we’ll die Vonnegut’s plain old death and be remembered as part of an endless chain of life. Lifton’s research showed that the link between death and life had become, as he put it, a “broken connection.”

As a result, he speculated, our minds stop trying to find the vitalizing images necessary for any healthy life. Every effort to form new mental images only conjures up more fear that the chain of life itself is coming to a dead end. Ultimately, we are left with nothing but “apathy, withdrawal, depression, despair.”

If that’s the deepest psychic lens through which we see the world, however unconsciously, it’s easy to understand why anything and everything can look like more evidence that The End is at hand. No wonder we have a generation of American youth and young adults who take a world filled with apocalyptic images for granted.

Think of it as, in some grim way, a testament to human resiliency. They are learning how to live with the only reality they’ve ever known (and with all the irony we’re capable of, others are learning how to sell them cultural products based on that reality). Naturally, they assume it’s the only reality possible. It’s no surprise that “The Walking Dead,” a zombie apocalypse series, is their favorite TV show, since it reveals (and revels in?) what one TV critic called the “secret life of the post-apocalyptic American teenager.”

Perhaps the only thing that should genuinely surprise us is how many of those young people still manage to break through psychic numbing in search of some way to make a difference in the world.

Yet even in the political process for change, apocalypses are everywhere. Regardless of the issue, the message is typically some version of “Stop this catastrophe now or we’re doomed!” (An example: Stop the Keystone XL pipeline or it’s “game over”!) A better future is often implied between the lines, but seldom gets much attention because it’s ever harder to imagine such a future, no less believe in it.

No matter how righteous the cause, however, such a single-minded focus on danger and doom subtly reinforces the message of our era of apocalypses everywhere: abandon all hope, ye who live here and now.

Read the entire article here.

Image: Armageddon movie poster. Courtesy of Touchstone Pictures.

Abraham Lincoln Was a Sham President

 

This is not the opinion of theDiagonal. Rather, it’s the view of the revisionist thinkers over at the so-called “News Leader”, Fox News. I purposefully avoid commenting on news and political events, but once in a while a story is so jaw-droppingly incredible that your friendly editor cannot keep away from his keyboard. Which brings me to Fox News.

The latest diatribe from the 24/7 conservative think tank is that Lincoln actually caused the Civil War. According to Fox analyst Andrew Napolitano the Civil War was an unnecessary folly, and could have been avoided by Lincoln had he chosen to pay off the South or let slavery come to a natural end.

This is yet another example of the mindless, ideological drivel dished out on a daily basis by Fox. Will we next see Fox defend Hitler’s “cleansing” of Europe as fine economic policy that the Allies should have let run its course? Ugh! One has to suppose that the present-day statistic of 30 million enslaved humans around the world is just as much a figment of the collective imaginarium that is Fox.

The one bright note to ponder about Fox and its finely-tuned propaganda machine comes from looking at its commercials. When the majority of its TV ads are for the over-60s — think Viagra, statins and catheters — you can sense that its aging demographic will soon sublimate to meet its alternate, heavenly reality.

From Salon:

“The Daily Show” had one of its best segments in a while on Monday night, ruthlessly and righteously taking Fox News legal analyst and libertarian Andrew Napolitano to task for using the airwaves to push his clueless and harmful revisionist understanding of the Civil War.

Jon Stewart and “senior black correspondent” Larry Wilmore criticized Napolitano for a Feb. 14 appearance on the Fox Business channel during which he called himself a “contrarian” when it comes to estimating former President Abraham Lincoln’s legacy and argued that the Civil War was unnecessary — and may not have even been about slavery, anyway!

“At the time that [Lincoln] was the president of the United States, slavery was dying a natural death all over the Western world,” Napolitano said. “Instead of allowing it to die, or helping it to die, or even purchasing the slaves and then freeing them — which would have cost a lot less money than the Civil War cost — Lincoln set about on the most murderous war in American history.”

Stewart quickly shredded this argument to pieces, noting that Lincoln spent much of 1862 trying (and failing) to convince border states to accept compensatory emancipation, as well as the fact that the South’s relationship with chattel slavery was fundamentally not just an economic but also a social system, one that it would never willingly abandon.

Soon after, Stewart turned to Wilmore, who noted that the Confederacy was “so committed to slavery that Lincoln didn’t die of natural causes.” Wilmore next pointed out that people who “think Lincoln started the Civil War because the North was ready to kill to end slavery” are mistaken. “[T]he truth was,” Wilmore said, “the South was ready to die to keep slavery.”

Stewart and Wilmore next highlighted that Napolitano doesn’t hate all wars, and in fact has a history of praising the Revolutionary War as necessary and just. “So it was heroic to fight for the proposition that all men are created equal, but when there’s a war to enforce that proposition, that’s wack?” Wilmore asked. “You know, there’s something not right when you feel the only black thing worth fighting for is tea.”

As the final dagger, Stewart and Wilmore noted that Napolitano has ranted at length on Fox about how taxation is immoral and unjust, prompting Wilmore to elegantly outline the problems with Napolitano-style libertarianism in a single paragraph. Speaking to Napolitano, Wilmore said:

You think it’s immoral for the government to reach into your pocket, rip your money away from its warm home and claim it as its own property, money that used to enjoy unfettered freedom is now conscripted to do whatever its new owner tells it to. Now, I know this is going to be a leap, but you know that sadness and rage you feel about your money? Well, that’s the way some of us feel about people.

Read the entire story here.

Video courtesy of The Daily Show with Jon Stewart, Comedy Central.

 

MondayMap: Human History

How does one condense four thousand years of human history into a single view? Well, John B. Sparks did just that with his Histomap in 1931.

From Slate:

This “Histomap,” created by John B. Sparks, was first printed by Rand McNally in 1931. (The David Rumsey Map Collection hosts a fully zoomable version here.) (Update: Click on the image below to arrive at a bigger version.)

This giant, ambitious chart fit neatly with a trend in nonfiction book publishing of the 1920s and 1930s: the “outline,” in which large subjects (the history of the world! every school of philosophy! all of modern physics!) were distilled into a form comprehensible to the most uneducated layman.

The 5-foot-long Histomap was sold for $1 and folded into a green cover, which featured endorsements from historians and reviewers. The chart was advertised as “clear, vivid, and shorn of elaboration,” while at the same time capable of “holding you enthralled” by presenting:

the actual picture of the march of civilization, from the mud huts of the ancients thru the monarchistic glamour of the middle ages to the living panorama of life in present day America.

The chart emphasizes domination, using color to show how the power of various “peoples” (a quasi-racial understanding of the nature of human groups, quite popular at the time) evolved throughout history.

It’s unclear what the width of the colored streams is meant to indicate. In other words, if the Y axis of the chart clearly represents time, what does the X axis represent? Did Sparks see history as a zero-sum game, in which peoples and nations would vie for shares of finite resources? Given the timing of his enterprise—he made this chart between two world wars and at the beginning of a major depression—this might well have been his thinking.

Sparks followed up on the success of this Histomap by publishing at least two more: the Histomap of religion (which I’ve been unable to find online) and the Histomap of evolution.

Read the entire article and check out the zoomable Histomap here.

Nineteenth Century Celebrity

You could be forgiven for believing that celebrity is a peculiar and pervasive symptom of our contemporary culture. After all, in our multi-channel, always-on, 24×7, event-driven, media-obsessed pop-culture maelstrom, celebrities come and go in the blink of an eye. This is the age of celebrity.

Well, the U.S. had its own national and international celebrity almost two hundred years ago, and he wasn’t an auto-tuned pop star or a viral internet sensation with a cute cat. His name — Marie-Joseph Paul Yves Roch Gilbert du Motier, the Marquis de La Fayette, a French nobleman and officer, and a major general in the Continental Army.

From Slate:

The Marquis de Lafayette, French nobleman and officer, was a major general in the Continental Army by the age of nineteen. When he returned for a comprehensive tour of the United States in 1824-1825, Lafayette was 67, and was the last man still living who had served at his rank in the Continental Army.

Americans loved the aging soldier for his role in the Revolutionary War, and for his help after the war in smoothing diplomatic relations between the United States and France. Moreover, he was a living connection to his friend and mentor George Washington. The combination made him a celebrity who enjoyed a frenzied reception as he made his way through all 24 states.

Women, especially, poured forth affection for the Marquis. In one beautifully lettered address, the “Young Ladies of the Lexington Female Academy” (Kentucky) showered their visitor with assurances that he was remembered by the new generation of Americans: “Even the youngest, gallant Warrior, know you; even the youngest have been taught to lisp your name.”

Lafayette’s visit inspired the production of souvenir merchandise embroidered, painted, or printed with his face and name. This napkin and glove are two examples of such products.

In his book Souvenir Nation: Relics, Keepsakes, and Curios from the Smithsonian’s National Museum of American History, William L. Bird, Jr. reports that Lafayette was uncomfortable when he encountered ladies wearing these gloves—particularly because a gentleman was expected to kiss a lady’s hand upon first meeting. Bird writes:

When offered a gloved hand at a ball in Philadelphia, Lafayette “murmur[ed] a few graceful words to the effect that he did not care to kiss himself, he [then] made a very low bow, and the lady passed on.”

Read the entire article here.

Image: La Fayette as a Lieutenant General, in 1791. Portrait by Joseph-Désiré Court. Courtesy of Wikipedia.

Sounds of Extinction

Camera aficionados will find themselves lamenting the demise of the film advance. Now that the world has moved on from film to digital you will no longer hear that distinctive mechanical sound as you wind on the film, and hope the teeth on the spool engage the plastic of the film.

Hardcore computer buffs will no doubt miss the beep-beep-hiss sound of the 56K modem — that now seemingly ancient box that once connected us to… well, who knows what it actually connected us to at that speed.

Our favorite arcane sounds, soon to become relegated to the audio graveyard: the telephone handset slam, the click and carriage return of the typewriter, the whir of reel-to-reel tape, the crackle of the diamond stylus as it first hits an empty groove on a 33.

More sounds you may (or may not) miss below.

From Wired:

The forward march of technology has a drum beat. These days, it’s custom text-message alerts, or your friend saying “OK, Glass” every five minutes like a tech-drunk parrot. And meanwhile, some of the most beloved sounds are falling out of the marching band.

The boops and beeps of bygone technology can be used to chart its evolution. From the zzzzzzap of the Tesla coil to the tap-tap-tap of Morse code being sent via telegraph, what were once the most important nerd sounds in the world are now just historical signposts. But progress marches forward, and for every irritatingly smug Angry Pigs grunt we have to listen to, we move further away from the sound of the Defender ship exploding.

Let’s celebrate the dying cries of technology’s past. The following sounds are either gone forever, or definitely on their way out. Bow your heads in silence and bid them a fond farewell.

The Telephone Slam

Ending a heated telephone conversation by slamming the receiver down in anger was so incredibly satisfying. There was no better way to punctuate your frustration with the person on the other end of the line. And when that receiver hit the phone, the clack of plastic against plastic was accompanied by a slight ringing of the phone’s internal bell. That’s how you knew you were really pissed — when you slammed the phone so hard, it rang.

There are other sounds we’ll miss from the phone. The busy signal died with the rise of voicemail (although my dad refuses to get voicemail or call waiting, so he’s still OG), and the rapid click-click-click of the dial on a rotary phone is gone. But none of those compare with hanging up the phone with a forceful slam.

Tapping a touchscreen just does not cut it. So the closest thing we have now is throwing the pitifully fragile smartphone against the wall.

The CRT Television

The only TVs left that still use cathode-ray tubes are stashed in the most depressing places — the waiting rooms of hospitals, used car dealerships, and the dusty guest bedroom at your grandparents’ house. But before we all fell prey to the magical resolution of zeros and ones, boxy CRT televisions warmed (literally) the living rooms of every home in America. The sounds they made when you turned them on warmed our hearts, too — the gentle whoosh of the degaussing coil as the set was brought to life with the heavy tug of a pull-switch, or the satisfying mechanical clunk of a power button. As the tube warmed up, you’d see the visuals slowly brighten on the screen, giving you ample time to settle into the couch to enjoy latest episode of Seinfeld.

Read the entire article here.

Image courtesy of Wired.

Worst Job in the World

Would you rather be a human automaton inside a Chinese factory making products for your peers, or a banquet attendant in ancient Rome? Thanks to Lapham’s Quarterly for this disturbing infographic, which shows how, for the average worker, times may not have changed as much over the last 2,000 years as we might like to believe.

Visit the original infographic here.

Infographic courtesy of Lapham’s Quarterly.

Social Media and Vanishing History

Social media is great for notifying the members of one’s circle about events in the here and now. Of course, most events turn out to be rather trivial, of the “what I ate for dinner” kind. However, social media also has a role in spreading word of more momentous social and political events; the Arab Spring comes to mind.

But, while Twitter and its peers may be a boon for those who live in the present moment and need to transmit their current status, it seems that our social networks are letting go of the past. Will history become lost and irrelevant to the Twitter generation?

A terrifying thought.

From Technology Review:

On 25 January 2011, a popular uprising began in Egypt that led to the overthrow of the country’s brutal president and to the first truly free elections. One of the defining features of this uprising and of others in the Arab Spring was the way people used social media to organise protests and to spread news.

Several websites have since begun the task of curating this content, which is an important record of events and how they unfolded. That led Hany SalahEldeen and Michael Nelson at Old Dominion University in Norfolk, Virginia, to take a deeper look at the material to see how many of the shared links were still live.

What they found has serious implications. SalahEldeen and Nelson say a significant proportion of the websites that this social media points to has disappeared. And the same pattern occurs for other culturally significant events, such as the H1N1 virus outbreak, Michael Jackson’s death and the Syrian uprising.

In other words, our history, as recorded by social media, is slowly leaking away.

Their method is straightforward. SalahEldeen and Nelson looked for tweets on six culturally significant events that occurred between June 2009 and March 2012. They then filtered the URLs these tweets pointed to and checked to see whether the content was still available on the web, either in its original form or in an archived form.

They found that the older the social media, the more likely its content was to be missing. In fact, they found an almost linear relationship between time and the percentage lost.

The numbers are startling. They say that 11 per cent of the social media content had disappeared within a year and 27 per cent within 2 years. Beyond that, SalahEldeen and Nelson say the world loses 0.02 per cent of its culturally significant social media material every day.

That’s a sobering thought.
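The check the researchers describe, whether a shared URL is still resolvable or at least preserved in an archive, is simple enough to sketch in code. The Python below is a minimal, hypothetical illustration rather than the authors’ actual pipeline: it probes each URL on the live web and then queries the Internet Archive’s public Wayback “availability” endpoint, and the function names, status-code threshold, and error handling are assumptions of my own.

```python
# A rough sketch (not the study's actual code) of classifying shared URLs
# as live, archived-only, or missing. Assumes the `requests` library.
import requests

WAYBACK_API = "https://archive.org/wayback/available"

def is_live(url: str, timeout: float = 10.0) -> bool:
    """Return True if the original URL still responds with a non-error status."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=timeout)
        return resp.status_code < 400
    except requests.RequestException:
        return False

def is_archived(url: str, timeout: float = 10.0) -> bool:
    """Return True if the Wayback Machine reports at least one snapshot."""
    try:
        resp = requests.get(WAYBACK_API, params={"url": url}, timeout=timeout)
        snapshots = resp.json().get("archived_snapshots", {})
        return bool(snapshots)
    except (requests.RequestException, ValueError):
        return False

def classify(urls):
    """Bucket URLs into live, archived-only, or missing."""
    results = {"live": [], "archived_only": [], "missing": []}
    for url in urls:
        if is_live(url):
            results["live"].append(url)
        elif is_archived(url):
            results["archived_only"].append(url)
        else:
            results["missing"].append(url)
    return results
```

Run over a sample of URLs harvested from old tweets, a script along these lines would reproduce the study’s basic measure: the share of links that are neither live nor archived.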

Read the entire article after the jump.

Image: Movie poster for The Man Without a Past (Finnish: Mies vailla menneisyyttä), a 2002 Finnish comedy-drama film directed by Aki Kaurismäki. Courtesy of Wikipedia.

Famous Artworks Inspired by Other Famous Works

The Garden of Earthly Delights. Hieronymus Bosch.

The Tilled Field. Joan Miró.

From Flavorwire:

We tend to think of appropriation as a postmodern thing, with artists in all media drawing on, referring to, and mashing up the most influential works of the past. But we forget that this has been happening for centuries — millennia, actually — as Renaissance painters paid tribute to Greek art, ideas circulated within the 19th-century French art scene, and Dada hijacked the course of art history, mocking and inverting everything that came before it. After the jump, we round up some of the best, most famous, and all-around strangest artworks inspired by other artworks. Some are homages, some are parodies, some are responses, and a few seem to function as all three.

Joan Miró’s The Tilled Field, inspired by Hieronymus Bosch’s The Garden of Earthly Delights

The resemblance between Joan Miró’s Surrealist painting and Bosch’s Early Netherlandish triptych may not be as clear as the parallels between some of the other works on this list, but when you know what to look for, it is certainly there. Besides the colors, which do echo The Garden of Earthly Delights, Miró placed in his painting many objects that appear in Bosch’s — crudely sexualized figures, disembodied ears, flocks of birds. Although the styles are different, both have the same busy, chaotic energy.

More from this top 10 list after the jump.

Once Not So Crazy Ideas About Our Sun

Some wacky ideas about our sun from not so long ago help us appreciate the importance of a healthy dose of skepticism combined with good science. In fact, as you’ll see from the timestamp on the image from NASA’s Solar and Heliospheric Observatory (SOHO), science can now bring us – the public – near-realtime images of our nearest star.

From Slate:

The sun is hell.

The 18th-century English clergyman Tobias Swinden argued that hell couldn’t lie below Earth’s surface: The fires would soon go out, he reasoned, due to lack of air. Not to mention that the Earth’s interior would be too small to accommodate all the damned, especially after making allowances for future generations of the damned-to-be. Instead, wrote Swinden, it’s obvious that hell stares us in the face every day: It’s the sun.

The sun is made of ice.

In 1798, Charles Palmer—who was not an astronomer, but an accountant—argued that the sun can’t be a source of heat, since Genesis says that light already existed before the day that God created the sun. Therefore, he reasoned, the sun must merely focus light upon Earth—light that exists elsewhere in the universe. Isn’t the sun even shaped like a giant lens? The only natural, transparent substance that it could be made of, Palmer figured, is ice. Palmer’s theory was published in a widely read treatise that, its title crowed, “overturn[ed] all the received systems of the universe hitherto extant, proving the celebrated and indefatigable Sir Isaac Newton, in his theory of the solar system, to be as far distant from the truth, as any of the heathen authors of Greece or Rome.”

Earth is a sunspot.

Sunspots are magnetic regions on the sun’s surface. But in 1775, mathematician and theologian J. Wiedeberg said that the sun’s spots are created by the clumping together of countless solid “heat particles,” which he speculated were constantly being emitted by the sun. Sometimes, he theorized, these heat particles stick together even at vast distances from the sun—and this is how planets form. In other words, he believed that Earth is a sunspot.

The sun’s surface is liquid.

Throughout the 18th and 19th centuries, textbooks and astronomers were torn between two competing ideas about the sun’s nature. Some believed that its dazzling brightness was caused by luminous clouds and that small holes in the clouds, which revealed the cool, dark solar surface below, were the sunspots. But the majority view was that the sun’s body was a hot, glowing liquid, and that the sunspots were solar mountains sticking up through this lava-like substance.

The sun is inhabited.

No less distinguished an astronomer than William Herschel, who discovered the planet Uranus in 1781, often stated that the sun has a cool, solid surface on which human-like creatures live and play. According to him, these solar citizens are shielded from the heat given off by the sun’s “dazzling outer clouds” by an inner protective cloud layer — like a layer of haz-mat material — that perfectly blocks the solar emissions and allows for pleasant grassy solar meadows and idyllic lakes.

Will Nostalgia Destroy Pop Culture?

Thomas Rogers for Slate:

Over the last decade, American culture has been overtaken by a curious, overwhelming sense of nostalgia. Everywhere you look, there seems to be some new form of revivalism going on. The charts are dominated by old-school-sounding acts like Adele and Mumford & Sons. The summer concert schedule is dominated by reunion tours. TV shows like VH1’s “I Love the 90s” allow us to endlessly rehash the catchphrases of the recent past. And, thanks to YouTube and iTunes, new forms of music and pop culture are facing increasing competition from the ever-more-accessible catalog of older acts.

In his terrific new book, “Retromania,” music writer Simon Reynolds looks at how this nostalgia obsession is playing itself out everywhere from fashion to performance art to electronic music — and comes away with a worrying prognosis. If we continue looking backward, he argues, we’ll never have transformative decades, like the 1960s, or bold movements like rock ‘n’ roll, again. If all we watch and listen to are things that we’ve seen and heard before, and revive trends that have already existed, culture becomes an inescapable feedback loop.

Salon spoke to Reynolds over the phone from Los Angeles about the importance of the 1960s, the strangeness of Mumford & Sons — and why our future could be defined by boredom.

In the book you argue that our culture has increasingly been obsessed with looking backward, and that’s a bad thing. What makes you say that?

Every day, some new snippet of news comes along that is somehow connected to reconsuming the past. Just the other day I read that the famous Reading Festival in Britain is going to be screening a 1992 Nirvana concert during their festival. These events are like cultural antimatter. They won’t be remembered 20 years from now, and the more of them there are, the more alarming it is. I can understand why people want to go to them — they’re attractive and comforting. But this nostalgia seems to have crept into everything. The other day my daughter, who is 5 years old, was at camp, and they had an ’80s day. How can my daughter even understand what that means? She said the counselors were dressed really weird.

More from the source here.

Image courtesy of Slate.

Book Review: The First Detective

A new book by James Morton examines the life and times of cross-dressing burglar, prison escapee, and snitch turned super-detective Eugène-François Vidocq.

From The Barnes & Noble Review:

The daring costumed escapes and bedsheet-rope prison breaks of the old romances weren’t merely creaky plot devices; they were also the objective correlatives of the lost politics of early modern Europe. Not yet susceptible to legislative amelioration, rules and customs that seemed both indefensible and unassailable had to be vaulted over like collapsing bridges or tunneled under like manor walls. Not only fictional musketeers but such illustrious figures as the young Casanova and the philosopher Jean-Jacques Rousseau spent their early years making narrow escapes from overlapping orthodoxies, swimming moats to marriages of convenience and digging their way out of prisons of privilege by dressing in drag or posing as noblemen’s sons. If one ran afoul of the local clergy or some aristocratic cuckold, there were always new bishops and magistrates to charm in the next diocese or département.

In 1775, roughly a generation after the exploits of Rousseau and Casanova, a prosperous baker’s son named Eugène-François Vidocq was born in Arras, in northern France. Indolent and adventuresome, he embarked upon a career that in its early phase looked even more hapless and disastrous than those of his illustrious forebears. An indifferent soldier in the chaotic, bloody interregnum of revolutionary France, Vidocq quickly fell into petty crime (at one point even adopting the name Rousseau as an alias and nom de guerre). A hapless housebreaker and a credulous co-conspirator, he had criminal misadventures equaled only by his skill at escaping from the dungeons and bagnes that passed for a penal system in the pre-Napoleonic era.

By 1809, his canniness as an informer landed him a job with the police; with his old criminal comrades as willing foot soldiers, Vidocq organized a brigade de sûreté, a unit of plainclothes police, which in 1813 Napoleon made an official organ of state security. Throughout his subsequent career he would lay much of the foundation of modern policing, and he may be considered a forebear not only of the Dupins and Holmeses of modern detective literature but also of swashbuckling, above-the-law policemen like Eliot Ness and J. Edgar Hoover.

More from the source here.