Tag Archives: new media

Fahrenheit 2451? Ray Bradbury Comes to the eReader

Fahrenheit 2451 may well be the temperature at which the glass in your Kindle or Nook eReader melts. This may give Ray Bradbury mixed feelings.

In one of his masterworks, Fahrenheit 451, Bradbury warned of the displacement and destruction of books by newer means of distribution such as television. Of the novel’s central idea Bradbury says, “It’s about the moronic influence of popular culture through local TV news, the proliferation of giant screens and the bombardment of factoids… We’ve moved in to this period of history that I described in Fahrenheit 50 years ago.”

So, it’s rather a surprise to see his work available in full digital form through an eReader such as the Kindle or Nook. More over at Wired on Bradbury’s reasoning.

From Wired:

Ray Bradbury’s Fahrenheit 451 is now officially available as an e-book. Simon & Schuster are publishing both the hardcover and digital editions in the United States for a deal reportedly worth millions of dollars, according to the Associated Press.

Bradbury has been vocal about his dislike for e-books and the internet, calling it “a big distraction.” In order to get him to relent, the publisher had to both pay a premium price and play a little hardball.

Bradbury’s agent Michael Congdon told the AP that renewing the book’s hardcover rights, whether with Simon & Schuster or any other publisher, had to include digital rights as well.

“We explained the situation to [Bradbury] that a new contract wouldn’t be possible without e-book rights,” said Congdon. “He understood and gave us the right to go ahead.”

Unfortunately for hard-core Bradbury fans, according to Simon & Schuster’s press release [PDF], only Fahrenheit 451 is currently being released as an e-book. The deal includes the mass-market rights to The Martian Chronicles and The Illustrated Man, but not their digital rights.

Like the Harry Potter books before them, samizdat digital copies of Bradbury’s books edited by fans have been floating around for years. (I don’t know anyone who’s actually memorized Fahrenheit, like the novel’s “Book People” do with banned books.)

Bradbury is far from the last digital holdout. Another K-12 classic, Harper Lee’s To Kill A Mockingbird, is only available in print. None of Thomas Pynchon’s novels are available as e-books, although Pynchon has been characteristically quiet on the subject. Nor are any English translations of Gabriel García Márquez, and only a few of Márquez’s story collections and none of his classic novels are even available in Spanish. Early editions of James Joyce’s books are in the public domain, but Finnegans Wake, whose rights are tightly controlled by Joyce’s grandson, is not.

Most of the gaps in the digital catalog, however, don’t stem from individual authors or rightsholders holding out like Bradbury. They’re structural: whole presses whose catalogs haven’t been digitized, whose rights aren’t extended to certain countries, or whose contracts didn’t anticipate some of the newer innovations in e-reading, such as book lending, whether from a retailer, another user, or a public library.

In light of Bradbury’s lifelong advocacy for libraries, I asked Simon & Schuster whether Fahrenheit 451 would be made available for digital lending; their representatives did not respond. [Update: Simon & Schuster’s Emer Flounders says the publisher plans to make Fahrenheit 451 available as an e-book to libraries in the first half of 2012.]

In a 2009 interview, Bradbury said he rebuffed an offer from Yahoo to publish a book or story on the internet. “You know what I told them? ‘To hell with you. To hell with you and to hell with the Internet.’”

Read the entire article here.

Image: Fahrenheit 451. Courtesy of Panther.
Mind Over Mass Media

From the New York Times:

New forms of media have always caused moral panics: the printing press, newspapers, paperbacks and television were all once denounced as threats to their consumers’ brainpower and moral fiber.

So too with electronic technologies. PowerPoint, we’re told, is reducing discourse to bullet points. Search engines lower our intelligence, encouraging us to skim on the surface of knowledge rather than dive to its depths. Twitter is shrinking our attention spans.

But such panics often fail basic reality checks. When comic books were accused of turning juveniles into delinquents in the 1950s, crime was falling to record lows, just as the denunciations of video games in the 1990s coincided with the great American crime decline. The decades of television, transistor radios and rock videos were also decades in which I.Q. scores rose continuously.

For a reality check today, take the state of science, which demands high levels of brainwork and is measured by clear benchmarks of discovery. These days scientists are never far from their e-mail, rarely touch paper and cannot lecture without PowerPoint. If electronic media were hazardous to intelligence, the quality of science would be plummeting. Yet discoveries are multiplying like fruit flies, and progress is dizzying. Other activities in the life of the mind, like philosophy, history and cultural criticism, are likewise flourishing, as anyone who has lost a morning of work to the Web site Arts & Letters Daily can attest.

Critics of new media sometimes use science itself to press their case, citing research that shows how “experience can change the brain.” But cognitive neuroscientists roll their eyes at such talk. Yes, every time we learn a fact or skill the wiring of the brain changes; it’s not as if the information is stored in the pancreas. But the existence of neural plasticity does not mean the brain is a blob of clay pounded into shape by experience.

Experience does not revamp the basic information-processing capacities of the brain. Speed-reading programs have long claimed to do just that, but the verdict was rendered by Woody Allen after he read “War and Peace” in one sitting: “It was about Russia.” Genuine multitasking, too, has been exposed as a myth, not just by laboratory studies but by the familiar sight of an S.U.V. undulating between lanes as the driver cuts deals on his cellphone.

Moreover, as the psychologists Christopher Chabris and Daniel Simons show in their new book “The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us,” the effects of experience are highly specific to the experiences themselves. If you train people to do one thing (recognize shapes, solve math puzzles, find hidden words), they get better at doing that thing, but almost nothing else. Music doesn’t make you better at math, conjugating Latin doesn’t make you more logical, brain-training games don’t make you smarter. Accomplished people don’t bulk up their brains with intellectual calisthenics; they immerse themselves in their fields. Novelists read lots of novels, scientists read lots of science.

The effects of consuming electronic media are also likely to be far more limited than the panic implies. Media critics write as if the brain takes on the qualities of whatever it consumes, the informational equivalent of “you are what you eat.” As with primitive peoples who believe that eating fierce animals will make them fierce, they assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.

Yes, the constant arrival of information packets can be distracting or addictive, especially to people with attention deficit disorder. But distraction is not a new phenomenon. The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life. Turn off e-mail or Twitter when you work, put away your BlackBerry at dinner time, ask your spouse to call you to bed at a designated hour.

And to encourage intellectual depth, don’t rail at PowerPoint or Google. It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate. They are not granted by propping a heavy encyclopedia on your lap, nor are they taken away by efficient access to information on the Internet.

The new media have caught on for a reason. Knowledge is increasing exponentially; human brainpower and waking hours are not. Fortunately, the Internet and information technologies are helping us manage, search and retrieve our collective intellectual output at different scales, from Twitter and previews to e-books and online encyclopedias. Far from making us stupid, these technologies are the only things that will keep us smart.

Steven Pinker, a professor of psychology at Harvard, is the author of “The Stuff of Thought.”

More from theSource here.
Why I Blog

By Andrew Sullivan for the Atlantic

The word blog is a conflation of two words: Web and log. It contains in its four letters a concise and accurate self-description: it is a log of thoughts and writing posted publicly on the World Wide Web. In the monosyllabic vernacular of the Internet, Web log soon became the word blog.

This form of instant and global self-publishing, made possible by technology widely available only for the past decade or so, allows for no retroactive editing (apart from fixing minor typos or small glitches) and removes from the act of writing any considered or lengthy review. It is the spontaneous expression of instant thought—impermanent beyond even the ephemera of daily journalism. It is accountable in immediate and unavoidable ways to readers and other bloggers, and linked via hypertext to continuously multiplying references and sources. Unlike any single piece of print journalism, its borders are extremely porous and its truth inherently transitory. The consequences of this for the act of writing are still sinking in.

A ship’s log owes its name to a small wooden board, often weighted with lead, that was for centuries attached to a line and thrown over the stern. The weight of the log would keep it in the same place in the water, like a provisional anchor, while the ship moved away. By measuring the length of line used up in a set period of time, mariners could calculate the speed of their journey (the rope itself was marked by equidistant “knots” for easy measurement). As a ship’s voyage progressed, the course came to be marked down in a book that was called a log.
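The arithmetic behind the chip log is simple enough to sketch. The traditional figures below are illustrative assumptions, not from the essay: knots spaced roughly 47.25 feet apart on the line and a 28-second sandglass, a pairing chosen so that the number of knots paid out in one glass approximates the ship’s speed in nautical miles per hour, i.e., in “knots.”

```python
# Illustrative sketch of the chip-log calculation described above.
# The spacing and timing values are assumptions (traditional figures),
# not taken from the essay itself.

FEET_PER_NAUTICAL_MILE = 6076.12
SECONDS_PER_HOUR = 3600.0

def speed_in_knots(knots_counted, knot_spacing_ft=47.25, interval_s=28.0):
    """Speed = line paid out / time, converted to nautical miles per hour."""
    distance_nm = knots_counted * knot_spacing_ft / FEET_PER_NAUTICAL_MILE
    return distance_nm * SECONDS_PER_HOUR / interval_s

# Five knots counted in one 28-second glass works out to roughly
# five nautical miles per hour.
print(round(speed_in_knots(5), 2))
```

With those traditional values the count and the speed nearly coincide, which is why the unit itself came to be called the knot.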

In journeys at sea that took place before radio or radar or satellites or sonar, these logs were an indispensable source for recording what actually happened. They helped navigators surmise where they were and how far they had traveled and how much longer they had to stay at sea. They provided accountability to a ship’s owners and traders. They were designed to be as immune to faking as possible. Away from land, there was usually no reliable corroboration of events apart from the crew’s own account in the middle of an expanse of blue and gray and green; and in long journeys, memories always blur and facts disperse. A log provided as accurate an account as could be gleaned in real time.

As you read a log, you have the curious sense of moving backward in time as you move forward in pages—the opposite of a book. As you piece together a narrative that was never intended as one, it seems—and is—more truthful. Logs, in this sense, were a form of human self-correction. They amended for hindsight, for the ways in which human beings order and tidy and construct the story of their lives as they look back on them. Logs require a letting-go of narrative because they do not allow for a knowledge of the ending. So they have plot as well as dramatic irony—the reader will know the ending before the writer did.

More from theSource here.