Banned and Challenged Books: A Summer Reading List

Each year the American Library Association publishes a list of attempts by groups and individuals to have books banned from classrooms, libraries and other public places in the United States. The list includes classics such as Ulysses, 1984, Beloved, Gone With the Wind, and The Lord of the Rings. So, if you’re at a loss this summer for a good book in which to get lost, pick one (or three) from the list below and chalk one up for the freedom of ideas.

[div class=attrib]From American Library Association:[end-div]

The titles below represent banned or challenged books on that list (see the entire list here). For more information on why these books were challenged, visit challenged classics and the Banned Books Week Web site.

1. The Great Gatsby, by F. Scott Fitzgerald
2. The Catcher in the Rye, by J.D. Salinger
3. The Grapes of Wrath, by John Steinbeck
4. To Kill a Mockingbird, by Harper Lee
5. The Color Purple, by Alice Walker
6. Ulysses, by James Joyce
7. Beloved, by Toni Morrison
8. Lord of the Flies, by William Golding
9. 1984, by George Orwell

11. Lolita, by Vladimir Nabokov
12. Of Mice and Men, by John Steinbeck

15. Catch-22, by Joseph Heller
16. Brave New World, by Aldous Huxley
17. Animal Farm, by George Orwell
18. The Sun Also Rises, by Ernest Hemingway
19. As I Lay Dying, by William Faulkner
20. A Farewell to Arms, by Ernest Hemingway

23. Their Eyes Were Watching God, by Zora Neale Hurston
24. Invisible Man, by Ralph Ellison
25. Song of Solomon, by Toni Morrison
26. Gone with the Wind, by Margaret Mitchell
27. Native Son, by Richard Wright
28. One Flew Over the Cuckoo’s Nest, by Ken Kesey
29. Slaughterhouse-Five, by Kurt Vonnegut
30. For Whom the Bell Tolls, by Ernest Hemingway

33. The Call of the Wild, by Jack London

36. Go Tell It on the Mountain, by James Baldwin

38. All the King’s Men, by Robert Penn Warren

40. The Lord of the Rings, by J.R.R. Tolkien

45. The Jungle, by Upton Sinclair

48. Lady Chatterley’s Lover, by D.H. Lawrence
49. A Clockwork Orange, by Anthony Burgess
50. The Awakening, by Kate Chopin

53. In Cold Blood, by Truman Capote

55. The Satanic Verses, by Salman Rushdie

57. Sophie’s Choice, by William Styron

64. Sons and Lovers, by D.H. Lawrence

66. Cat’s Cradle, by Kurt Vonnegut
67. A Separate Peace, by John Knowles

73. Naked Lunch, by William S. Burroughs
74. Brideshead Revisited, by Evelyn Waugh
75. Women in Love, by D.H. Lawrence

80. The Naked and the Dead, by Norman Mailer

84. Tropic of Cancer, by Henry Miller

88. An American Tragedy, by Theodore Dreiser

97. Rabbit, Run, by John Updike

The Arrow of Time

No, not a cosmologist’s convoluted hypothesis as to why time moves in only one direction (so far as anyone has discovered). The arrow of time here is a thoroughly personal look at the linearity of the fourth dimension, and an homage to the family portrait in the process.

The family takes a “snapshot” of each member at the same time each year; we’ve just glimpsed the latest for 2011. And, in so doing, they give us much to ponder on the nature of change and the nature of stasis.

[div class=attrib]From Diego Goldberg and family:[end-div]

Catch all the intervening years between 1976 and 2011 at theSource here.

More subatomic spot changing

[div class=attrib]From the Economist:[end-div]

IN THIS week’s print edition we report a recent result from the T2K collaboration in Japan which has found strong hints that neutrinos, the elusive particles theorists believe to be as abundant in the universe as photons, but which almost never interact with anything, are as fickle as they are coy.

It has been known for some time that neutrinos switch between three types, or flavours, as they zip through space at a smidgen below the speed of light. The flavours are distinguished by the particles which emerge on the rare occasion a neutrino does bump into something. And so, an electron-neutrino conjures up an electron, a muon-neutrino, a muon, and a tau-neutrino, a tau particle (muons and taus are a lot like electrons, but heavier and less stable). Researchers at T2K observed, for the first time, muon-neutrinos transmuting into the electron variety—the one sort of spot-changing that had not been seen before. But their result, with a 0.7% chance of being a fluke, was, by the elevated standards of particle physics, tenuous.

Now, T2K’s rival across the Pacific has made it less so. MINOS beams muon-neutrinos from Fermilab, America’s biggest particle-physics lab located near Chicago, to a 5,000-tonne detector sitting in the Soudan mine in Minnesota, 735km (450 miles) to the north-west. On June 24th its researchers announced that they, too, had witnessed some muon-neutrinos change to the electron variety along the way. To be precise, the experiment recorded 62 events which could have been caused by electron-neutrinos. If the proposed transmutation does not occur in nature, it ought to have seen no more than 49 (the result of electron-neutrinos streaming in from space or radioactive rocks on Earth). Were the T2K figures spot on, as it were, it should have seen 71.
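As an illustrative aside, the event counts above can be turned into a rough measure of how surprising the MINOS excess is. This back-of-the-envelope Poisson estimate is not the collaboration’s actual statistical analysis, which is far more sophisticated:

```python
import math

observed = 62        # candidate electron-neutrino events seen by MINOS
background = 49      # events expected if no transmutation occurs in nature
t2k_prediction = 71  # events expected were the T2K figures spot on

# Naive significance: the excess over background, in units of the
# Poisson standard deviation (the square root of the expected count).
excess = observed - background
sigma = excess / math.sqrt(background)
print(f"Excess: {excess} events, roughly {sigma:.1f} standard deviations")  # roughly 1.9
```

An excess of around two standard deviations is suggestive but well short of the five-sigma threshold particle physicists demand for a discovery, which is why the MINOS result firms up, rather than settles, the transmutation hypothesis.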

As such, the result from MINOS, which uses different methods to study the same phenomenon, puts the transmutation hypothesis on a firmer footing. This advances the search for a number known as delta (δ). This is one of the parameters of the formula which physicists think describes neutrinos’ spot-changing antics. Physicists are keen to pin it down, since it also governs the description of the putative asymmetry between matter and antimatter that left matter as the dominant feature of the universe after the Big Bang.

In light of the latest result, it remains unclear whether either the American or the Japanese experiment is precise enough to measure delta. In 2013, however, MINOS will be supplanted by NOvA, a fancier device located in another Minnesota mine 810km from Fermilab’s muon-neutrino cannon. That ought to do the trick. Then again, nature has the habit of springing surprises.

And in more ways than one. Days after T2K’s run was cut short by the earthquake that shook Japan in March, devastating the muon-neutrino source at J-PARC, the country’s main particle-accelerator complex, MINOS had its own share of woe when the Soudan mine sustained significant flooding. Fortunately, the experiment itself escaped relatively unscathed. But the eerie coincidence spurred some boffins, not a particularly superstitious bunch, to speak of a neutrino curse. Fingers crossed that isn’t the case.

[div class=attrib]More from theSource here.[end-div]

[div]Image courtesy of Fermilab.[end-div]

Solar power from space: Beam it down, Scotty

[div class=attrib]From the Economist:[end-div]

THE idea of collecting solar energy in space and beaming it to Earth has been around for at least 70 years. In “Reason”, a short story by Isaac Asimov that was published in 1941, a space station transmits energy collected from the sun to various planets using microwave beams.

The advantage of intercepting sunlight in space, instead of letting it find its own way through the atmosphere, is that so much gets absorbed by the air. By converting it to the right frequency first (one of the so-called windows in the atmosphere, in which little energy is absorbed) a space-based collector could, enthusiasts claim, yield on average five times as much power as one located on the ground.

The disadvantage is cost. Launching and maintaining suitable satellites would be ludicrously expensive. But perhaps not, if the satellites were small and the customers specialised. Military expeditions, rescuers in disaster zones, remote desalination plants and scientific-research bases might be willing to pay for such power from the sky. And a research group based at the University of Surrey, in England, hopes that in a few years it will be possible to offer it to them.

This summer, Stephen Sweeney and his colleagues will test a laser that would do the job which Asimov assigned to microwaves. Certainly, microwaves would work: a test carried out in 2008 transmitted useful amounts of microwave energy between two Hawaiian islands 148km (92 miles) apart, so penetrating the 100km of the atmosphere would be a doddle. But microwaves spread out as they propagate. A collector on Earth that was picking up power from a geostationary satellite orbiting at an altitude of 35,800km would need to be spread over hundreds of square metres. Using a laser means the collector need be only tens of square metres in area.

[div class=attrib]More from theSource here.[end-div]

Largest cosmic structures ‘too big’ for theories

[div class=attrib]From New Scientist:[end-div]

Space is festooned with vast “hyperclusters” of galaxies, a new cosmic map suggests. It could mean that gravity or dark energy – or perhaps something completely unknown – is behaving very strangely indeed.

We know that the universe was smooth just after its birth. Measurements of the cosmic microwave background radiation (CMB), the light emitted 370,000 years after the big bang, reveal only very slight variations in density from place to place. Gravity then took hold and amplified these variations into today’s galaxies and galaxy clusters, which in turn are arranged into big strings and knots called superclusters, with relatively empty voids in between.

On even larger scales, though, cosmological models say that the expansion of the universe should trump the clumping effect of gravity. That means there should be very little structure on scales larger than a few hundred million light years.

But the universe, it seems, did not get the memo. Shaun Thomas of University College London (UCL), and colleagues have found aggregations of galaxies stretching for more than 3 billion light years. The hyperclusters are not very sharply defined, with only a couple of per cent variation in density from place to place, but even that density contrast is twice what theory predicts.

“This is a challenging result for the standard cosmological models,” says Francesco Sylos Labini of the University of Rome, Italy, who was not involved in the work.

Colour guide

The clumpiness emerges from an enormous catalogue of galaxies called the Sloan Digital Sky Survey, compiled with a telescope at Apache Point, New Mexico. The survey plots the 2D positions of galaxies across a quarter of the sky. “Before this survey people were looking at smaller areas,” says Thomas. “As you look at more of the sky, you start to see larger structures.”

A 2D picture of the sky cannot reveal the true large-scale structure in the universe. To get the full picture, Thomas and his colleagues also used the colour of galaxies recorded in the survey.

More distant galaxies look redder than nearby ones because their light has been stretched to longer wavelengths while travelling through an expanding universe. By selecting a variety of bright, old elliptical galaxies whose natural colour is well known, the team calculated approximate distances to more than 700,000 objects. The upshot is a rough 3D map of one quadrant of the universe, showing the hazy outlines of some enormous structures.
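The redshift-to-distance step described above can be sketched crudely with Hubble’s law, d = cz/H0, which holds only for small redshifts. This is an illustration, not the survey team’s method; their photometric-redshift calibration for bright elliptical galaxies is far more careful, and the Hubble constant value here is an assumed round number:

```python
# Crude distance estimate from redshift via Hubble's law (d = c*z / H0).
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per megaparsec (assumed value)
MPC_TO_LY = 3.262e6    # light years per megaparsec

def approx_distance_ly(z: float) -> float:
    """Approximate distance in light years for a small redshift z."""
    d_mpc = C_KM_S * z / H0
    return d_mpc * MPC_TO_LY

# A galaxy reddened to z ~ 0.2 sits a few billion light years away --
# the scale of the hyperclusters described above.
print(f"{approx_distance_ly(0.2):.2e} light years")
```

Repeating such an estimate for 700,000 galaxies is what yields the rough 3D map; the "rough" is important, since photometric distances carry much larger errors than spectroscopic ones.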

[div class=attrib]More from theSource here.[end-div]

Life of a Facebook Photo

Before photo-sharing, photo blogs, photo friending, “PhotoShopping” and countless other photo-enabled apps and services, there was compose, point, focus, click, develop, print. The process seemed a lot simpler way back then. Perhaps this was due to the lack of options for both input and output. Input? Simple. Go buy a real camera. Output? Simple. Slides or prints. The end.

The options for input and output have exploded by orders of magnitude over the last couple of decades. Nowadays, even my toaster can take pictures and I can output them on my digital refrigerator, sans, of course, real photographs with that limp, bendable magnetic backing. The entire end-to-end process of taking a photograph and sharing it with someone else is now replete with so many choices and options that today it seems to have become inordinately more complex.

So, to help all prehistoric photographers like me, here’s an interesting process flow for your digital images in the age of Facebook.

[div class=attrib]From Pixable:[end-div]

Evolution machine: Genetic engineering on fast forward

[div class=attrib]From the New Scientist:[end-div]

Automated genetic tinkering is just the start – this machine could be used to rewrite the language of life and create new species of humans

IT IS a strange combination of clumsiness and beauty. Sitting on a cheap-looking worktop is a motley ensemble of flasks, trays and tubes squeezed onto a home-made frame. Arrays of empty pipette tips wait expectantly. Bunches of black and grey wires adorn its corners. On the top, robotic arms slide purposefully back and forth along metal tracks, dropping liquids from one compartment to another in an intricately choreographed dance. Inside, bacteria are shunted through slim plastic tubes, and alternately coddled, chilled and electrocuted. The whole assembly is about a metre and a half across, and controlled by an ordinary computer.

Say hello to the evolution machine. It can achieve in days what takes genetic engineers years. So far it is just a prototype, but if its proponents are to be believed, future versions could revolutionise biology, allowing us to evolve new organisms or rewrite whole genomes with ease. It might even transform humanity itself.

These days everything from your food and clothes to the medicines you take may well come from genetically modified plants or bacteria. The first generation of engineered organisms has been a huge hit with farmers and manufacturers – if not consumers. And this is just the start. So far organisms have only been changed in relatively crude and simple ways, often involving just one or two genes. To achieve their grander ambitions, such as creating algae capable of churning out fuel for cars, genetic engineers are now trying to make far more sweeping changes.

[div class=attrib]More from theSource here.[end-div]

MondayPoem: Morning In The Burned House

[div class=attrib]Morning In The Burned House, Margaret Atwood[end-div]

In the burned house I am eating breakfast.
You understand: there is no house, there is no breakfast,
yet here I am.

The spoon which was melted scrapes against
the bowl which was melted also.
No one else is around.

Where have they gone to, brother and sister,
mother and father? Off along the shore,
perhaps. Their clothes are still on the hangers,

their dishes piled beside the sink,
which is beside the woodstove
with its grate and sooty kettle,

every detail clear,
tin cup and rippled mirror.
The day is bright and songless,

the lake is blue, the forest watchful.
In the east a bank of cloud
rises up silently like dark bread.

I can see the swirls in the oilcloth,
I can see the flaws in the glass,
those flares where the sun hits them.

I can’t see my own arms and legs
or know if this is a trap or blessing,
finding myself back here, where everything

in this house has long been over,
kettle and mirror, spoon and bowl,
including my own body,

including the body I had then,
including the body I have now
as I sit at this morning table, alone and happy,

bare child’s feet on the scorched floorboards
(I can almost see)
in my burning clothes, the thin green shorts

and grubby yellow T-shirt
holding my cindery, non-existent,
radiant flesh. Incandescent.

Nick Risinger’s Photopic Sky Survey

Big science, covering scales from the microscopic to the vastness of the universe, continues to deliver stunning new insights, now on a daily basis. It takes huge machines such as the Tevatron at Fermilab, CERN’s Large Hadron Collider, NASA’s Hubble Telescope and the myriad other detectors, arrays, spectrometers and particle smashers to probe some of our ultimate questions. The results from these machines bring us fantastic new perspectives and often show us remarkable pictures of the very small and the very large.

Then there is Nick Risinger’s Photopic Sky Survey. No big science, no vast machines — just Nick Risinger, accompanied by his retired father, camera equipment and 45,000 miles of travel, capturing our beautiful night sky as never before.

[div class=attrib]From Nick Risinger:[end-div]

The Photopic Sky Survey is a 5,000 megapixel photograph of the entire night sky stitched together from 37,440 exposures. Large in size and scope, it portrays a world far beyond the one beneath our feet and reveals our familiar Milky Way with unfamiliar clarity.

It was clear that such a survey would be quite difficult, visually hopping from one area of the sky to the next—not to mention possible lapses in coverage—so this called for a more systematic approach. I divided the sky into 624 uniformly spaced areas and entered their coordinates into the computer, which gave me assurance that I was on target and would finish without any gaps. Each frame received a total of 60 exposures: 4 short, 4 medium, and 4 long shots for each camera, which would help to reduce the amount of noise, overhead satellite trails and other unwanted artifacts.

And so it was with this blueprint that I worked my way through the sky, frame by frame, night after night. The click-clack of the shutters opening and closing became a staccato soundtrack for the many nights spent under the stars. Occasionally, the routine would be pierced by a bright meteor or the cry of a jackal, each compelling a feeling of eerie beauty that seemed to hang in the air. It was an experience that will stay with me a lifetime.
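As an aside, the exposure arithmetic in Risinger’s account checks out, and even hints at the size of his rig. The camera count below is an inference from the stated figures, not something the passage says outright:

```python
frames = 624               # uniformly spaced sky areas
exposures_per_frame = 60   # total shots per frame
total = frames * exposures_per_frame
print(total)  # 37440 -- matching the 37,440 exposures in the final mosaic

# 4 short + 4 medium + 4 long = 12 exposures per camera; 60 per frame
# would therefore imply a 5-camera rig (an inference, not a stated fact).
cameras = exposures_per_frame // (4 + 4 + 4)
print(cameras)  # 5
```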

A truly remarkable and beautiful achievement. This is what focus and passion can achieve.

[div class=attrib]More from theSource here.[end-div]

Susan Wolf and Meaningfulness

[div class=attrib]From PEA Soup:[end-div]

A lot of interesting work has been done recently on what makes lives meaningful. One brilliant example of this is Susan Wolf’s recent wonderful book Meaning in Life and Why It Matters. It consists of two short lectures, critical commentaries by John Koethe, Robert M. Adams, Nomy Arpaly, and Jonathan Haidt, and responses by Wolf herself. What I want to do here is to introduce quickly Wolf’s ‘Fitting Fulfillment’ View, and then I’ll raise a potential objection to it.

According to Wolf, all meaningful lives have both a ‘subjective’ and an ‘objective’ element to them. These elements can make lives meaningful only together. Wolf’s view of the subjective side is highly complex. The starting point is the idea that an agent’s projects and activities ultimately make her life meaningful. However, this happens only when the projects and activities satisfy two conditions on the subjective side and one on the objective side.

Firstly, in order for one’s projects and activities to make one’s life meaningful, one must be at least somewhat successful in carrying them out. This does not mean that one must fully complete one’s projects and excel in the activities; but, other things being equal, the more successful one is in one’s projects and activities, the more they can contribute to the meaningfulness of one’s life.

Secondly, one must have a special relation to one’s projects and activities. This special relation has several overlapping elements, which seem to fall under two main aspects. I’ll call one of them the ‘loving relation’. Thus, Wolf often seems to claim that one must love the relevant projects and activities, experience subjective attraction towards them, and be gripped and excited by them. This seems to imply that one must be passionate about the relevant projects and activities. It also seems to entail that our willingness to pursue the relevant projects must be diachronically stable (and even constitute ‘volitional necessities’).

The second aspect could be called the ‘fulfilment side’. This means that, when one is successfully engaged in one’s projects and activities, one must experience some positive sensations – fulfilment, satisfaction, feeling good and happy and the like. Wolf is careful to emphasise that there need not be a single felt quality present in all cases. Rather, there is a range of positive experiences, some of which need to be present in each case.

Finally, on the objective side, one’s projects and activities must be objectively worthwhile. One way to think about this is to start from the idea that one can be more or less successful in the relevant projects and activities. This seems to entail that the relevant projects and activities are difficult to complete and master in the beginning. As a result, one can become better in them through practice.

The objective element of Wolf’s view requires that some objective values are promoted either during this process or as a consequence of completion. There are some basic reasons to take part in the activities and to try to succeed in the relevant projects. These reasons are neither purely prudential nor necessarily universal moral reasons. Wolf is a pluralist about which projects and activities are objectively worthwhile (she takes no substantial stand in order to avoid any criticism of elitism). She also emphasises that saying all of this is fairly neutral metaethically.

[div class=attrib]More from theSource here.[end-div]

How your dad’s music influences your taste

[div class=attrib]From Sonos:[end-div]

There’s no end to the reasons why you listen to the music you do today, but we’re willing to bet that more than a few of you were subjected to your father’s music at some point in the past (or present). So that leads to the question: what do dear old dad’s listening habits say about the artists in your repertoire? In honor of Father’s Day, we tried our hand at finding out.

[div class=attrib]More from the Source here.[end-div]

The Technology of Personalization and the Bubble Syndrome

A decade ago, in another place and era, during my days as director of technology research for a Fortune X company, I tinkered with a cool array of then-new personalization tools. The aim was simple: use some of these emerging technologies to deliver a more customized and personalized user experience for our customers and suppliers. What could be wrong with that? Surely, custom tools and more personalized data could do nothing but improve knowledge and enhance business relationships for all concerned. Our customers would benefit from seeing only the information they asked for, our suppliers would benefit from better analysis and filtered feedback, and we, the corporation in the middle, would benefit from making everyone in our supply chain more efficient and happy. Advertisers would be even happier since with more focused data they would be able to deliver messages that were increasingly more precise and relevant based on personal context.

Fast forward to the present. Customization, or filtering, technologies have indeed helped optimize the supply chain; personalization tools and services have made customer experiences more focused and efficient. In today’s online world it’s so much easier to find, navigate and transact when the supplier at the other end of our browser knows who we are, where we live, what we earn, what we like and dislike, and so on. After all, if a supplier knows my needs, requirements, options, status and even personality, I’m much more likely to only receive information, services or products that fall within the bounds that define “me” in the supplier’s database.

And, therein lies the crux of the issue that has helped me to realize that personalization offers a false promise despite the seemingly obvious benefits to all concerned. The benefits are outweighed by two key issues: erosion of privacy and the bubble syndrome.

Privacy as Commodity

I’ll not dwell too long on the issue of privacy since in this article I’m much more concerned with the personalization bubble. However, as we have increasingly seen in recent times, privacy in all its forms is becoming a scarce and tradable commodity. Much of our data is now in the hands of a plethora of suppliers, intermediaries and their partners, ready for continued monetization. Our locations are constantly pinged and polled; our internet browsers note our web surfing habits and preferences; our purchases generate genius suggestions and recommendations to further whet our consumerist desires. Now in digital form, this data is open to legitimate sharing and highly vulnerable to discovery by hackers, phishers, spammers and anyone else with technical or financial resources.

Bubble Syndrome

Personalization technologies filter content at various levels, minutely and broadly, both overtly and covertly. For instance, I may explicitly signal my preferences for certain types of clothing deals at my favorite online retailer by answering a quick retail survey or checking a handful of specific preference buttons on a website.

However, my previous online purchases, browsing behaviors, time spent on various online pages, visits to other online retailers and a range of other flags deliver a range of implicit or “covert” information to the same retailer (and others). This helps the retailer filter, customize and personalize what I get to see even before I have made a conscious decision to limit my searches and exposure to information. Clearly, this is not too concerning when my retailer knows I’m male and usually purchase size 32 inch jeans; after all, why would I need to see deals or product information for women’s shoes?

But, this type of covert filtering becomes more worrisome when the data being filtered and personalized is information, news, opinion and comment in all its glorious diversity. Sophisticated media organizations, information portals, aggregators and news services can deliver personalized and filtered information based on your overt and covert personal preferences as well. So, if you subscribe only to a certain type of information based on topic, interest, political persuasion or other dimension your personalized news services will continue to deliver mostly or only this type of information. And, as I have already described, your online behaviors will deliver additional filtering parameters to these news and information providers so that they may further personalize and narrow your consumption of information.

Increasingly, we will not be aware of what we don’t know. Whether explicitly or not, our use of personalization technologies will have the ability to build a filter, a bubble, around us, which will permit only information that we wish to see or that which our online suppliers wish us to see. We’ll not even get exposed to peripheral and tangential information — that information which lies outside the bubble. This filtering of the rich oceans of diverse information to a mono-dimensional stream will have profound implications for our social and cultural fabric.
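The bubble mechanism described above can be made concrete with a toy model: a feed ranked purely by topics the reader has already clicked on. All names and data here are hypothetical, and no real recommender is this crude, but the narrowing effect is the same:

```python
from collections import Counter

# Hypothetical click history: the reader's "covert" behavioral signal.
click_history = ["politics-left", "politics-left", "tech", "politics-left"]

articles = [
    {"title": "Tax plan analysis", "topic": "politics-left"},
    {"title": "Opposing view on the tax plan", "topic": "politics-right"},
    {"title": "New phone review", "topic": "tech"},
    {"title": "Poetry corner", "topic": "culture"},
]

# Infer interest weights from past clicks; unclicked topics score zero.
weights = Counter(click_history)

# Rank by inferred interest and cut anything with zero weight -- the
# "filter" that silently removes tangential and opposing material.
feed = [a for a in sorted(articles, key=lambda a: -weights[a["topic"]])
        if weights[a["topic"]] > 0]

for a in feed:
    print(a["title"])
```

The opposing-view and culture pieces never surface, and, crucially, the reader gets no signal that they existed: we are not aware of what we don’t know.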

I assume that our increasingly crowded planet will require ever more creativity, insight, tolerance and empathy as we tackle humanity’s many social and political challenges in the future. And, these very seeds of creativity, insight, tolerance and empathy are those that are most at risk from the personalization filter. How are we to be more tolerant of others’ opinions if we are never exposed to them in the first place? How are we to gain insight when disparate knowledge is no longer available for serendipitous discovery? How are we to become more creative if we are less exposed to ideas outside of our normal sphere, our bubble?

For some ideas on how to punch a few holes in your online filter bubble read Eli Pariser’s practical guide, here.

Filter Bubble image courtesy of TechCrunch.

Cosmic Smoothness

Simulations based on the standard cosmological model, as shown here, indicate that on very large distance scales, galaxies should be uniformly distributed. But observations show a clumpier distribution than expected. (The length bar represents about 2.3 billion light years.)

[div class=attrib]From American Physical Society, Michael J. Hudson:[end-div]

The universe is expected to be very nearly homogeneous in density on large scales. In Physical Review Letters, Shaun Thomas and colleagues from University College London analyze measurements of the density of galaxies on the largest spatial scales so far—billions of light years—and find that the universe is less smooth than expected. If it holds up, this result will have important implications for our understanding of dark matter, dark energy, and perhaps gravity itself.

In the current standard cosmological model, the average mass-energy density of the observable universe consists of 5% normal matter (most of which is hydrogen and helium), 23% dark matter, and 72% dark energy. The dark energy is assumed to be uniform, but the normal and dark matter are not. The balance between matter and dark energy determines both how the universe expands and how regions of unusually high or low matter density evolve with time.

The same cosmological model predicts the statistics of the nonuniform structure and their dependence on spatial scale. On scales that are small by cosmological standards, fluctuations in the matter density are comparable to its mean, in agreement with what is seen: matter is clumped into galaxies, clusters of galaxies, and filaments of the “cosmic web.” On larger scales, however, the contrast of the structures compared to the mean density decreases. On the largest cosmological scales, these density fluctuations are small in amplitude compared to the average density of the universe and so are well described by linear perturbation theory (see simulation results in Fig. 1). Moreover, these perturbations can be calibrated at early times directly from the cosmic microwave background (CMB), a snapshot of the universe from when it was only 380,000 years old. Despite the fact that only 5% of the Universe is well understood, this model is an excellent fit to data spanning a wide range of spatial scales as the fluctuations evolved from the time of the CMB to the present age of the universe, some 13.8 billion years. On the largest scales, dark energy drives accelerated expansion of the universe. Because this aspect of the standard model is least understood, it is important to test it on these scales.

Thomas et al. use publicly-released catalogs from the Sloan Digital Sky Survey to select more than 700,000 galaxies whose observed colors indicate a significant redshift and are therefore presumed to be at large cosmological distances. They use the redshift of the galaxies, combined with their observed positions on the sky, to create a rough three-dimensional map of the galaxies in space and to assess the homogeneity on scales of a couple of billion light years. One complication is that Thomas et al. measure the density of galaxies, not the density of all matter, but we expect the fluctuations of these two densities about their means to be proportional; the constant of proportionality can be calibrated by observations on smaller scales. Indeed, on small scales the galaxy data are in good agreement with the standard model. On the largest scales, the fluctuations in galaxy density are expected to be of order a percent of the mean density, but Thomas et al. find fluctuations double this prediction. This result then suggests that the universe is less homogeneous than expected.

This result is not entirely new: previous studies based on subsets of the data studied by Thomas et al. showed the same effect, albeit with a lower statistical significance. In addition, there are other ways of probing the large-scale mass distribution. For example, inhomogeneities in the mass distribution lead to inhomogeneities in the local rate of expansion. Some studies have suggested that, on very large scales, this expansion too is less homogeneous than the model predictions.

Future large-scale surveys will produce an avalanche of data. These surveys will allow the methods employed by Thomas et al. and others to be extended to still larger scales. Of course, the challenge for these future surveys will be to correct for the systematic effects to even greater accuracy.

[div class=attrib]More from theSource here.[end-div]

Lemonade without the Lemons: New Search Engine Looks for Uplifting News

[div class=attrib]From Scientific American:[end-div]

Good news, if you haven’t noticed, has always been a rare commodity. We all have our ways of coping, but the media’s pessimistic proclivity presented a serious problem for Jurriaan Kamp, editor of the San Francisco-based Ode magazine—a must-read for “intelligent optimists”—who was in dire need of an editorial pick-me-up, last year in particular. His bright idea: an algorithm that can sense the tone of daily news and separate the uplifting stories from the Debbie Downers.

Talk about a ripe moment: a Pew survey last month found that the number of Americans hearing “mostly bad” news about the economy and other issues is at its highest since the downturn in 2008. That is unlikely to change anytime soon: global obesity rates are climbing, the Middle East is unstable, and campaign 2012 vitriol is only just beginning to spew in the U.S. The problem is not trivial. A handful of studies, including one published in the Clinical Psychology Review in 2010, have linked positive thinking to better health. Another, published the year before in the Journal of Economic Psychology, found that upbeat people can even make more money.

Kamp, realizing he could be a purveyor of optimism in an untapped market, partnered with Federated Media Publishing, a San Francisco–based company that leads the field in search semantics. The aim was to create an automated system for Ode to sort and aggregate news from the world’s 60 largest news sources based on solutions, not problems. The system, released last week in public beta testing online and to be formally introduced in the next few months, runs thousands of directives to find a story’s context. “It’s kind of like playing 20 questions, building an ontology to find either optimism or pessimism,” says Tim Musgrove, the chief scientist who designed the broader system, which has been dubbed a “slant engine.” Think of the word “hydrogen” paired with “energy” rather than “bomb.”

Web semantics developers in recent years have trained computers to classify news topics based on intuitive keywords and recognizable names. But the slant engine dives deeper into algorithmic programming. It starts by classifying a story’s topic as either a world problem (disease and poverty, for example) or a social good (health care and education). Then it looks for revealing phrases: “efforts against” a world problem would signal something good; “setbacks to” a social good, likely bad. Thousands of questions later, every story is assigned a score between 0 and 1: above 0.95 fast-tracks the story to Ode’s Web interface, called OdeWire; below that, a score higher than 0.6 is reviewed by a human. The system is trained to collect only themes that are “meaningfully optimistic,” meaning it throws away flash-in-the-pan stories about things like sports or celebrities.
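The final routing step described above (auto-publish above 0.95, human review above 0.6, discard otherwise) is simple enough to sketch; the function name and structure below are illustrative only, not Federated Media’s actual code, and the scoring model itself is the proprietary part:

```python
def route_story(score: float) -> str:
    """Route a story by its optimism score in [0, 1],
    using the thresholds quoted in the article."""
    if score > 0.95:
        return "publish"       # fast-tracked to OdeWire
    elif score > 0.6:
        return "human_review"  # borderline: a person decides
    else:
        return "discard"       # not meaningfully optimistic
```

For example, a story scored 0.97 would go straight to the site, while one scored 0.8 would wait for an editor.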

[div class=attrib]More from theSource here.[end-div]

Self-Published Author Sells a Million E-Books on Amazon

[div class=attrib]From ReadWriteWeb:[end-div]

Since the Kindle’s launch, Amazon has heralded each new arrival into what it calls the “Kindle Million Club,” the group of authors who have sold over 1 million Kindle e-books. There have been seven authors in this club up ’til now – some of the big names in publishing: Stieg Larsson, James Patterson, and Nora Roberts for example.

But the admission today of the eighth member of this club is really quite extraordinary. Not because John Locke is a 60-year-old former insurance salesman from Kentucky with no writing or publishing background, but because he has accomplished the feat of selling one million e-books as a completely self-published author.

Rather than being published by a major publishing house – with all the perks that have long been associated with that (marketing, book tours, prime shelf space in retail stores) – Locke has sold 1,010,370 Kindle books (as of yesterday), using Kindle Direct Publishing to get his e-books into the Amazon store. No major publisher. No major marketing.

Locke writes primarily crime and adventure stories, including Vegas Moon, Wish List, and the New York Times e-book bestseller Saving Rachel. Most of his e-books sell for $0.99, and he says he makes 35 cents on every sale. That sort of per-book profit is something authors would never get from a traditional book deal.
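For a sense of scale, the figures quoted above imply substantial total royalties; a back-of-the-envelope calculation (assuming, which the article does not state, that all of the sales were at the $0.99 price with a 35-cent royalty):

```python
copies_sold = 1_010_370   # Kindle copies sold as of the article
royalty_per_copy = 0.35   # 35 cents per $0.99 sale, per the article

total_royalties = copies_sold * royalty_per_copy
print(f"${total_royalties:,.2f}")  # roughly $353,629.50
```

Even at a 99-cent list price, the volume alone puts the earnings well into six figures.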

[div class=attrib]More from theSource here.[end-div]

Book Review: Solar. Ian McEwan

Solar is a timely, hilarious novel from the author of Atonement that examines the self-absorption and (self-)deceptions of Nobel Prize-winning physicist Michael Beard. With his best work many decades behind him, Beard trades on his professional reputation to earn continuing financial favor and to maintain influence and respect among his peers. And with his personal life in an ever-tightening spiral and his fifth marriage coming to an end, Beard manages to entangle himself in an improbable accident that has the power to reshape his own world, and the planet in the process.

Ian McEwan’s depiction of Michael Beard is engaging and thoroughly entertaining. Beard hops from relationship to relationship in his singular quest for “love,” though very much on his own terms. This self-centered outlook extends to his science, where his personal contributions are not all they appear to be. Satire and climate science make a stylish and witty combination in McEwan’s hands.

Book Review: The Social Animal. David Brooks

David Brooks brings us a detailed journey through the building blocks of the self in his new book, The Social Animal: A Story of Love, Character and Achievement. With his insight and gift for narrative, Brooks weaves an engaging and compelling story of Erica and Harold, two characters he uses as platforms on which to visualize the results of numerous psychological, social and cultural studies. Set in the present day, the two characters show us, in practical terms, a holistic picture of the unconscious effects of physical and social context on behavior and character. The narrative takes us through typical life events and stages: infancy, childhood, school, parenting, work life, attachment, aging. At each stage, Brooks illustrates his views of the human condition with a flurry of facts and anecdotal studies.

The psychologist in me would say that this is a rather shallow attempt at synthesizing profoundly complex issues. Brooks certainly draws on many studies from the brain and social sciences, but he never dwells on any long enough to give us a detailed sense of the major underlying implications or of competing scientific positions. So too, the character development of Erica and Harold lacks the depth and breadth one would expect: Brooks fails to explore much of what typically motivates human behavior, such as greed, ambition, lust, violence and empathy. Despite these flaws in execution, Brooks’ attempt is praiseworthy; perhaps in the hands of a more skilled social scientist, or of Rousseau, who used this technique much more effectively, this type of approach would earn a better grade.