Tag Archives: particle physics

The Anomaly

Is the smallest, lightest, most ghostly particle about to upend our understanding of the universe? Recently, the ephemeral neutrino has begun to give up some of its secrets. Beginning in 1998, experiments at Super-Kamiokande and the Sudbury Neutrino Observatory showed for the first time that neutrinos oscillate between their three flavors. In 2015, two physicists were awarded the Nobel Prize for this discovery, which also proved that neutrinos must have mass. More recently, a small anomaly has surfaced at the Super-Kamiokande detector which, it is hoped, could shed light on why the universe is built primarily from matter rather than antimatter.
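For the curious, the flavor change follows a compact textbook formula. Below is a minimal Python sketch of the standard two-flavor vacuum-oscillation probability; the mixing angle, mass splitting, baseline, and energy plugged in are illustrative round numbers, not any experiment's fitted values.

```python
import math

def oscillation_probability(theta, delta_m2_ev2, length_km, energy_gev):
    """Two-flavor vacuum appearance probability P(nu_a -> nu_b).

    Standard formula: P = sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E),
    with dm^2 in eV^2, L in km, and E in GeV.
    """
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * delta_m2_ev2 * length_km / energy_gev) ** 2)

# Illustrative round numbers: near-maximal mixing, dm^2 ~ 2.5e-3 eV^2,
# a T2K-like 295 km baseline, and a 0.6 GeV beam.
p = oscillation_probability(math.pi / 4, 2.5e-3, 295, 0.6)
```

With these numbers the phase sits close to its first maximum, so the transition probability comes out near 1; shorter baselines or higher energies drive it back toward zero.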

From Quanta:

The anomaly, detected by the T2K experiment, is not yet pronounced enough to be sure of, but it and the findings of two related experiments “are all pointing in the same direction,” said Hirohisa Tanaka of the University of Toronto, a member of the T2K team who presented the result to a packed audience in London earlier this month.

“A full proof will take more time,” said Werner Rodejohann, a neutrino specialist at the Max Planck Institute for Nuclear Physics in Heidelberg who was not involved in the experiments, “but my and many others’ feeling is that there is something real here.”

The long-standing puzzle to be solved is why we and everything we see are made of matter. More to the point, why does anything — matter or antimatter — exist at all? The reigning laws of particle physics, known as the Standard Model, treat matter and antimatter nearly equivalently, respecting (with one known exception) so-called charge-parity, or “CP,” symmetry: For every particle decay that produces, say, a negatively charged electron, the mirror-image decay yielding a positively charged antielectron occurs at the same rate. But this cannot be the whole story. If equal amounts of matter and antimatter were produced during the Big Bang, equal amounts should have existed shortly thereafter. And since matter and antimatter annihilate upon contact, such a situation would have led to the wholesale destruction of both, resulting in an empty cosmos.

Somehow, significantly more matter than antimatter must have been created, such that a matter surplus survived the annihilation and now holds sway. The question is, what CP-violating process beyond the Standard Model favored the production of matter over antimatter?

Many physicists suspect that the answer lies with neutrinos — ultra-elusive, omnipresent particles that pass unfelt through your body by the trillions each second.

Read the entire article here.

Neutrinos in the News

Something’s up. Perhaps there is hope that we are reversing the tide of “dumbeddownness” in the stories that the media pumps through its many tubes to reach us. So it comes as a welcome surprise to see articles about the very, very small making big news in publications like the New Yorker. Stories about neutrinos, no less. Thank you, New Yorker, for dumbing us up. And kudos to the latest Nobel laureates — Takaaki Kajita and Arthur B. McDonald — for helping us understand just a little bit more about our world.

From the New Yorker:

This week the 2015 Nobel Prize in Physics was awarded jointly to Takaaki Kajita and Arthur B. McDonald for their discovery that elementary particles called neutrinos have mass. This is, remarkably, the fourth Nobel Prize associated with the experimental measurement of neutrinos. One might wonder why we should care so much about these ghostly particles, which barely interact with normal matter.

Even though the existence of neutrinos was predicted in 1930, by Wolfgang Pauli, none were experimentally observed until 1956. That’s because neutrinos almost always pass through matter without stopping. Every second of every day, more than six trillion neutrinos stream through your body, coming directly from the fiery core of the sun—but most of them go right through our bodies, and the Earth, without interacting with the particles out of which those objects are made. In fact, on average, those neutrinos would be able to traverse more than one thousand light-years of lead before interacting with it even once.
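That light-years-of-lead figure is easy to sanity-check with a mean-free-path estimate, λ = 1/(nσ). The cross-section below is an assumed round number typical of MeV-scale neutrinos, so this is an order-of-magnitude sketch, not the article's exact calculation.

```python
# Order-of-magnitude check of the "light-years of lead" claim.
# The cross-section is an assumed round number (~1e-47 m^2,
# representative of MeV-scale neutrinos), not a measured value.

AVOGADRO = 6.02214076e23      # atoms per mole
LIGHT_YEAR_M = 9.4607e15      # metres per light-year

rho_lead = 11340.0            # kg/m^3, density of lead
molar_mass_lead = 0.2072      # kg/mol
sigma = 1e-47                 # m^2, assumed neutrino-nucleus cross-section

n = rho_lead / molar_mass_lead * AVOGADRO      # atoms per m^3
mean_free_path_ly = 1.0 / (n * sigma) / LIGHT_YEAR_M
# Comes out at a few hundred light-years -- the same order of
# magnitude as the article's "more than one thousand light-years".
```

Varying the assumed cross-section by a factor of a few moves the answer between hundreds and thousands of light-years, which is the point: on human scales, lead is essentially transparent to neutrinos.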

The very fact that we can detect these ephemeral particles is a testament to human ingenuity. Because the rules of quantum mechanics are probabilistic, we know that, even though almost all neutrinos will pass right through the Earth, a few will interact with it. A big enough detector can observe such an interaction. The first detector of neutrinos from the sun was built in the nineteen-sixties, deep within a mine in South Dakota. An area of the mine was filled with a hundred thousand gallons of cleaning fluid. On average, one neutrino each day would interact with an atom of chlorine in the fluid, turning it into an atom of argon. Almost unfathomably, the physicist in charge of the detector, Raymond Davis, Jr., figured out how to detect these few atoms of argon, and, four decades later, in 2002, he was awarded the Nobel Prize in Physics for this amazing technical feat.

Because neutrinos interact so weakly, they can travel immense distances. They provide us with a window into places we would never otherwise be able to see. The neutrinos that Davis detected were emitted by nuclear reactions at the very center of the sun, escaping this incredibly dense, hot place only because they so rarely interact with other matter. We have been able to detect neutrinos emerging from the center of an exploding star more than a hundred thousand light-years away.

But neutrinos also allow us to observe the universe at its very smallest scales—far smaller than those that can be probed even at the Large Hadron Collider, in Geneva, which, three years ago, discovered the Higgs boson. It is for this reason that the Nobel Committee decided to award this year’s Nobel Prize for yet another neutrino discovery.

Read the entire story here.

What’s Next For the LHC?

As CERN’s Large Hadron Collider gears up for a restart in March 2015 after a refit that doubled its particle smashing power, researchers are pondering what may come next. During its previous run scientists uncovered signals identifying the long-sought Higgs boson. Now, particle physicists have their eyes and minds on more exotic, but no less significant, particle discoveries. And — of course — these come with suitably exotic names: gluino, photino, selectron, squark, axion — the list goes on. But beyond these creative names lie possible answers to some very big questions: What is the composition of dark matter (and even dark energy)? How does gravity fit in with all the other identified forces? Do other fundamental particles exist?

From the Smithsonian:

The Large Hadron Collider, the world’s biggest and most famous particle accelerator, will reopen in March after a years-long upgrade. So what’s the first order of business for the rebooted collider? Nothing less than looking for a particle that forces physicists to reconsider everything they think they know about how the universe works.

Since the second half of the twentieth century, physicists have used the Standard Model of physics to describe how particles look and act. But though the model explains pretty much everything scientists have observed using particle accelerators, it doesn’t account for everything they can observe in the universe, including the existence of dark matter.

That’s where supersymmetry, or SUSY, comes in. Supersymmetry predicts that each particle has what physicists call a “superpartner”—a more massive sub-atomic partner particle that acts like a twin of the particle we can observe. Each observable particle would have its own kind of superpartner, pairing bosons with fermions: electrons with “selectrons,” quarks with “squarks,” photons with “photinos,” and gluons with “gluinos.”

If scientists could identify a single superparticle, they could be on track for a more complete theory of particle physics that accounts for strange inconsistencies between existing knowledge and observable phenomena. Scientists used the Large Hadron Collider to identify the Higgs boson in 2012, but the particle didn’t behave quite as they expected. One surprise was that its mass was much smaller than predicted—an inconsistency that would be explained by the existence of a supersymmetric particle.

Scientists hope that the rebooted—and more powerful—LHC will reveal just such a particle. “Higher energies at the new LHC could boost the production of hypothetical supersymmetric particles called gluinos by a factor of 60, increasing the odds of finding it,” reports Emily Conover for Science.

If the LHC were to uncover a single superparticle, it wouldn’t just be a win for supersymmetry as a theory—it could be a step toward understanding the origins of our universe. But it could also create a lot of work for scientists—after all, a supersymmetric universe is one that would hold at least twice as many particles.

Read the entire article here.


Universal Amniotic Fluid

Another day, another physics paper describing the origin of the universe. This is no wonder. Since the development of general relativity and quantum mechanics — two mutually incompatible descriptions of our reality — theoreticians have been scurrying to come up with a grand theory, a rapprochement of sorts. This one describes the universe as a quantum fluid, perhaps made up of hypothesized gravitons.

From Nature Asia:

The prevailing model of cosmology, based on Einstein’s theory of general relativity, puts the universe at around 13.8 billion years old and suggests it originated from a “singularity” – an infinitely small and dense point – at the Big Bang.

To understand what happened inside that tiny singularity, physicists must marry general relativity with quantum mechanics – the laws that govern small objects. Applying both of these disciplines has challenged physicists for decades. “The Big Bang singularity is the most serious problem of general relativity, because the laws of physics appear to break down there,” says Ahmed Farag Ali, a physicist at Zewail City of Science and Technology, Egypt.

In an effort to bring together the laws of quantum mechanics and general relativity, and to solve the singularity puzzle, Ali and Saurya Das, a physicist at the University of Lethbridge in Alberta, Canada, employed an equation that predicts the development of singularities in general relativity. That equation had been developed by Das’s former professor, Amal Kumar Raychaudhuri, when Das was an undergraduate student at Presidency University, in Kolkata, India, so Das was particularly familiar with it and fascinated by it.

 When Ali and Das made small quantum corrections to the Raychaudhuri equation, they realised it described a fluid, made up of small particles, that pervades space. Physicists have long believed that a quantum version of gravity would include a hypothetical particle, called the graviton, which generates the force of gravity. In their new model — which will appear in Physics Letters B in February — Ali and Das propose that such gravitons could form this fluid.

To understand the origin of the universe, they used this corrected equation to trace the behaviour of the fluid back through time. Surprisingly, they found that it did not converge into a singularity. Instead, the universe appears to have existed forever. Although it was smaller in the past, it never quite crunched down to nothing, says Das.

“Our theory serves to complement Einstein’s general relativity, which is very successful at describing physics over large distances,” says Ali. “But physicists know that to describe short distances, quantum mechanics must be accommodated, and the quantum Raychaudhuri equation is a big step towards that.”

The model could also help solve two other cosmic mysteries. In the late 1990s, astronomers discovered that the expansion of the universe is accelerating due to the presence of a mysterious dark energy, the origin of which is not known. The model has the potential to explain it, since the fluid creates a minor but constant outward force that expands space. “This is a happy offshoot of our work,” says Das.

Astronomers also now know that most matter in the universe is in an invisible mysterious form called dark matter, only perceptible through its gravitational effect on visible matter such as stars. When Das and a colleague set the mass of the graviton in the model to a small level, they could make the density of their fluid match the universe’s observed density of dark matter, while also providing the right value for dark energy’s push.

Read the entire article here.


The Next (and Final) Doomsday Scenario

Personally, I love dystopian visions and apocalyptic nightmares. So, news that the famed Higgs boson may ultimately cause our demise, and incidentally the end of the entire cosmos, caught my attention.

Apparently, theoreticians have calculated that the Higgs potential, of which the Higgs boson is a manifestation, has characteristics that make the universe unstable. (The Higgs was discovered in 2012 by teams at CERN’s Large Hadron Collider.) Luckily for those wishing to avoid the final catastrophe, this instability may leave the universe intact for billions of years yet; and if the Higgs were suddenly to trigger the final apocalypse, the collapse would spread at the speed of light.

From Popular Mechanics:

In July 2012, when scientists at CERN’s Large Hadron Collider culminated decades of work with their discovery of the Higgs boson, most physicists celebrated. Stephen Hawking did not. The famed theorist expressed his disappointment that nothing more unusual was found, calling the discovery “a pity in a way.” But did he ever say the Higgs could destroy the universe?

That’s what many reports in the media said earlier this week, quoting a preface Hawking wrote to a book called Starmus. According to The Australian, the preface reads in part: “The Higgs potential has the worrisome feature that it might become metastable at energies above 100 [billion] gigaelectronvolts (GeV). This could mean that the universe could undergo catastrophic vacuum decay, with a bubble of the true vacuum expanding at the speed of light. This could happen at any time and we wouldn’t see it coming.”

What Hawking is talking about here is not the Higgs boson but what’s called the Higgs potential, which are “totally different concepts,” says Katie Mack, a theoretical astrophysicist at Melbourne University. The Higgs field permeates the entire universe, and the Higgs boson is an excitation of that field, just like an electron is an excitation of an electric field. In this analogy, the Higgs potential is like the voltage, determining the value of the field.

Once physicists began to close in on the mass of the Higgs boson, they were able to work out the Higgs potential. That value seemed to reveal that the universe exists in what’s known as a meta-stable vacuum state, or false vacuum, a state that’s stable for now but could slip into the “true” vacuum at any time. This is the catastrophic vacuum decay in Hawking’s warning, though he is not the first to posit the idea.

Is he right?

“There are a couple of really good reasons to think that’s not the end of the story,” Mack says. There are two ways for a meta-stable state to fall off into the true vacuum—one classical way, and one quantum way. The first would occur via a huge energy boost, the 100 billion GeVs Hawking mentions. But, Mack says, the universe already experienced such high energies during the period of inflation just after the big bang. Particles in cosmic rays from space also regularly collide with these kinds of high energies, and yet the vacuum hasn’t collapsed (otherwise, we wouldn’t be here).

“Imagine that somebody hands you a piece of paper and says, ‘This piece of paper has the potential to spontaneously combust,’ and so you might be worried,” Mack says. “But then they tell you 20 years ago it was in a furnace.” If it didn’t combust in the furnace, it’s not likely to combust sitting in your hand.

Of course, there’s always the quantum world to consider, and that’s where things always get weirder. In the quantum world, where the smallest of particles interact, it’s possible for a particle on one side of a barrier to suddenly appear on the other side of the barrier without actually going through it, a phenomenon known as quantum tunneling. If our universe was in fact in a meta-stable state, it could quantum tunnel through the barrier to the vacuum on the other side with no warning, destroying everything in an instant. And while that is theoretically possible, predictions show that if it were to happen, it’s not likely for billions of billions of years. By then, the sun and Earth and you and I and Stephen Hawking will be a distant memory, so it’s probably not worth losing sleep over it.

What’s more likely, Mack says, is that there is some new physics not yet understood that makes our vacuum stable. Physicists know there are parts of the model missing; mysteries like quantum gravity and dark matter that still defy explanation. When two physicists published a paper documenting the Higgs potential conundrum in March, their conclusion was that an explanation lies beyond the Standard Model, not that the universe may collapse at any time.

Read the article here.

Non-Spooky Action at a Distance

Albert Einstein famously called quantum entanglement “spooky action at a distance”. It refers to the notion that measuring the state of one of two entangled particles makes the state of the second particle known instantaneously, regardless of the distance separating the two particles. Entanglement seems to link these particles and make them behave as one system. This peculiar characteristic has been a core element of the counterintuitive world of quantum theory. Yet while experiments have verified this spookiness, other theorists maintain that both theory and experiment are flawed, and that a different interpretation is required. However, one such competing theory — the many worlds interpretation — makes equally spooky predictions.
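The correlation at the heart of the debate can be mimicked with a toy model: a spin-singlet pair measured along a common axis always yields opposite results. The sketch below is my own illustration, not from the article, and it captures only that perfect anti-correlation; it says nothing about measurements along different axes, which is where the genuinely quantum (Bell) correlations appear.

```python
import random

def measure_singlet_pair(trials=10000, seed=0):
    """Toy collapse model for a spin-singlet pair.

    The shared state is (|up,down> - |down,up>)/sqrt(2): when both
    particles are measured along the same axis, the outcomes are
    opposite, each combination occurring with equal probability.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        spin_a = rng.choice(("up", "down"))          # collapse of the shared wave function
        spin_b = "down" if spin_a == "up" else "up"  # partner outcome fixed instantly
        results.append((spin_a, spin_b))
    return results

pairs = measure_singlet_pair()
anti_correlated = all(a != b for a, b in pairs)
up_fraction = sum(1 for a, _ in pairs if a == "up") / len(pairs)
```

Note that this classical mock-up reproduces the same-axis correlations exactly; it is the statistics at relatively rotated measurement axes that no such local model can match.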

From ars technica:

Quantum nonlocality, perhaps one of the most mysterious features of quantum mechanics, may not be a real phenomenon. Or at least that’s what a new paper in the journal PNAS asserts. Its author claims that nonlocality is nothing more than an artifact of the Copenhagen interpretation, the most widely accepted interpretation of quantum mechanics.

Nonlocality is a feature of quantum mechanics where particles are able to influence each other instantaneously regardless of the distance between them, an impossibility in classical physics. Counterintuitive as it may be, nonlocality is currently an accepted feature of the quantum world, apparently verified by many experiments. It’s achieved such wide acceptance that even if our understandings of quantum physics turn out to be completely wrong, physicists think some form of nonlocality would be a feature of whatever replaced it.

The term “nonlocality” comes from the fact that this “spooky action at a distance,” as Einstein famously called it, seems to put an end to our intuitive ideas about location. Nothing can travel faster than the speed of light, so if two quantum particles can influence each other faster than light could travel between the two, then on some level, they act as a single system—there must be no real distance between them.

The concept of location is a bit strange in quantum mechanics anyway. Each particle is described by a mathematical quantity known as the “wave function.” The wave function describes a probability distribution for the particle’s location, but not a definite location. These probable locations are not just scientists’ guesses at the particle’s whereabouts; they’re actual, physical presences. That is to say, the particles exist in a swarm of locations at the same time, with some locations more probable than others.

A measurement collapses the wave function so that the particle is no longer spread out over a variety of locations. It begins to act just like objects we’re familiar with—existing in one specific location.

The experiments that would measure nonlocality, however, usually involve two particles that are entangled, which means that both are described by a shared wave function. The wave function doesn’t just deal with the particle’s location, but with other aspects of its state as well, such as the direction of the particle’s spin. So if scientists can measure the spin of one of the two entangled particles, the shared wave function collapses and the spins of both particles become certain. This happens regardless of the distance between the particles.

The new paper calls all this into question.

The paper’s sole author, Frank Tipler, argues that the reason previous studies apparently confirmed quantum nonlocality is that they were relying on an oversimplified understanding of quantum physics in which the quantum world and the macroscopic world we’re familiar with are treated as distinct from one another. Even large structures obey the laws of quantum physics, Tipler points out, so the scientists making the measurements must be considered part of the system being studied.

It is intuitively easy to separate the quantum world from our everyday world, as they appear to behave so differently. However, the equations of quantum mechanics can be applied to large objects like human beings, and they essentially predict that you’ll behave just as classical physics—and as observation—says you will. (Physics students who have tried calculating their own wave functions can attest to this). The laws of quantum physics do govern the entire Universe, even if distinctly quantum effects are hard to notice at a macroscopic level.

When this is taken into account, according to Tipler, the results of familiar nonlocality experiments are altered. Typically, such experiments are thought to involve only two measurements: one on each of two entangled particles. But Tipler argues that in such experiments, there’s really a third measurement taking place when the scientists compare the results of the two.

This third measurement is crucial, Tipler argues, as without it, the first two measurements are essentially meaningless. Without comparing the first two, there’s no way to know that one particle’s behavior is actually linked to the other’s. And crucially, in order for the first two measurements to be compared, information must be exchanged between the particles, via the scientists, at a speed less than that of light. In other words, when the third measurement is taken into account, the two particles are not communicating faster than light. There is no “spooky action at a distance.”

Tipler has harsh criticism for the reasoning that led to nonlocality. “The standard argument that quantum phenomena are nonlocal goes like this,” he says in the paper. “(i) Let us add an unmotivated, inconsistent, unobservable, nonlocal process (collapse) to local quantum mechanics; (ii) note that the resulting theory is nonlocal; and (iii) conclude that quantum mechanics is [nonlocal].”

He’s essentially saying that scientists are arbitrarily adding nonlocality, which they can’t observe, and then claiming they have discovered nonlocality. Quite an accusation, especially for the science world. (The “collapse” he mentions is the collapse of the particle’s wave function, which he asserts is not a real phenomenon.) Instead, he claims that the experiments thought to confirm nonlocality are in fact confirming an alternative to the Copenhagen interpretation called the many-worlds interpretation (MWI). As its name implies, the MWI predicts the existence of other universes.

The Copenhagen interpretation has been summarized as “shut up and calculate.” Even though the consequences of a wave function-based world don’t make much intuitive sense, it works. The MWI tries to keep particles concrete at the cost of making our world a bit fuzzy. It posits that rather than becoming a wave function, particles remain distinct objects but enter one of a number of alternative universes, which recombine to a single one when the particle is measured.

Scientists who thought they were measuring nonlocality, Tipler claims, were in fact observing the effects of alternate universe versions of themselves, also measuring the same particles.

Part of the significance of Tipler’s claim is that he’s able to mathematically derive the same experimental results from the MWI without use of nonlocality. But this does not necessarily make for evidence that the MWI is correct; either interpretation remains consistent with the data. Until the two can be distinguished experimentally, it all comes down to whether you personally like or dislike nonlocality.

Read the entire article here.

Questioning Quantum Orthodoxy

Physics works very well in explaining our world, yet it is also broken: it cannot, at the moment, reconcile our views of the very small (quantum theory) with those of the very large (relativity theory).

So although the probabilistic underpinnings of quantum theory have done wonders in allowing physicists to construct the Standard Model, gaps remain.

Back in the mid-1920s, the probabilistic worldview proposed by Niels Bohr and others gained favor and took hold. A competing theory, known as the pilot wave theory, proposed by a young Louis de Broglie, was given short shrift. Yet some theorists have maintained that it may do a better job of reconciling this core gap in our understanding — so it is time to revisit and breathe fresh life into pilot wave theory.

From Wired / Quanta:

For nearly a century, “reality” has been a murky concept. The laws of quantum physics seem to suggest that particles spend much of their time in a ghostly state, lacking even basic properties such as a definite location and instead existing everywhere and nowhere at once. Only when a particle is measured does it suddenly materialize, appearing to pick its position as if by a roll of the dice.

This idea that nature is inherently probabilistic — that particles have no hard properties, only likelihoods, until they are observed — is directly implied by the standard equations of quantum mechanics. But now a set of surprising experiments with fluids has revived old skepticism about that worldview. The bizarre results are fueling interest in an almost forgotten version of quantum mechanics, one that never gave up the idea of a single, concrete reality.

The experiments involve an oil droplet that bounces along the surface of a liquid. The droplet gently sloshes the liquid with every bounce. At the same time, ripples from past bounces affect its course. The droplet’s interaction with its own ripples, which form what’s known as a pilot wave, causes it to exhibit behaviors previously thought to be peculiar to elementary particles — including behaviors seen as evidence that these particles are spread through space like waves, without any specific location, until they are measured.

Particles at the quantum scale seem to do things that human-scale objects do not do. They can tunnel through barriers, spontaneously arise or annihilate, and occupy discrete energy levels. This new body of research reveals that oil droplets, when guided by pilot waves, also exhibit these quantum-like features.

To some researchers, the experiments suggest that quantum objects are as definite as droplets, and that they too are guided by pilot waves — in this case, fluid-like undulations in space and time. These arguments have injected new life into a deterministic (as opposed to probabilistic) theory of the microscopic world first proposed, and rejected, at the birth of quantum mechanics.

“This is a classical system that exhibits behavior that people previously thought was exclusive to the quantum realm, and we can say why,” said John Bush, a professor of applied mathematics at the Massachusetts Institute of Technology who has led several recent bouncing-droplet experiments. “The more things we understand and can provide a physical rationale for, the more difficult it will be to defend the ‘quantum mechanics is magic’ perspective.”

Magical Measurements

The orthodox view of quantum mechanics, known as the “Copenhagen interpretation” after the home city of Danish physicist Niels Bohr, one of its architects, holds that particles play out all possible realities simultaneously. Each particle is represented by a “probability wave” weighting these various possibilities, and the wave collapses to a definite state only when the particle is measured. The equations of quantum mechanics do not address how a particle’s properties solidify at the moment of measurement, or how, at such moments, reality picks which form to take. But the calculations work. As Seth Lloyd, a quantum physicist at MIT, put it, “Quantum mechanics is just counterintuitive and we just have to suck it up.”

A classic experiment in quantum mechanics that seems to demonstrate the probabilistic nature of reality involves a beam of particles (such as electrons) propelled one by one toward a pair of slits in a screen. When no one keeps track of each electron’s trajectory, it seems to pass through both slits simultaneously. In time, the electron beam creates a wavelike interference pattern of bright and dark stripes on the other side of the screen. But when a detector is placed in front of one of the slits, its measurement causes the particles to lose their wavelike omnipresence, collapse into definite states, and travel through one slit or the other. The interference pattern vanishes. The great 20th-century physicist Richard Feynman said that this double-slit experiment “has in it the heart of quantum mechanics,” and “is impossible, absolutely impossible, to explain in any classical way.”
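The fringes-versus-no-fringes contrast described here comes down to whether amplitudes or probabilities are added. The sketch below is a far-field toy model, not a simulation of any particular experiment; the slit separation and wavelength are illustrative numbers I have chosen for convenience.

```python
import cmath
import math

def screen_intensity(x, slit_sep=1e-5, wavelength=5e-11,
                     distance=1.0, which_path=False):
    """Intensity at screen position x for a two-slit toy model.

    which_path=False: the two slit amplitudes add, then square ->
                      interference fringes (intensity swings 0 to 4).
    which_path=True : probabilities add instead, as a which-path
                      detector effectively forces -> flat intensity 2.
    """
    r1 = math.hypot(distance, x - slit_sep / 2)   # path from slit 1
    r2 = math.hypot(distance, x + slit_sep / 2)   # path from slit 2
    k = 2 * math.pi / wavelength                  # wavenumber
    a1 = cmath.exp(1j * k * r1)                   # unit-amplitude phasors
    a2 = cmath.exp(1j * k * r2)
    if which_path:
        return abs(a1) ** 2 + abs(a2) ** 2        # no cross term
    return abs(a1 + a2) ** 2                      # cross term -> fringes

xs = [i * 1e-7 for i in range(-250, 251)]
fringes = [screen_intensity(x) for x in xs]
flat = [screen_intensity(x, which_path=True) for x in xs]
```

The whole quantum puzzle is compressed into that one branch: with the cross term the pattern oscillates between dark and bright; without it, the screen is uniformly lit.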

Some physicists now disagree. “Quantum mechanics is very successful; nobody’s claiming that it’s wrong,” said Paul Milewski, a professor of mathematics at the University of Bath in England who has devised computer models of bouncing-droplet dynamics. “What we believe is that there may be, in fact, some more fundamental reason why [quantum mechanics] looks the way it does.”

Riding Waves

The idea that pilot waves might explain the peculiarities of particles dates back to the early days of quantum mechanics. The French physicist Louis de Broglie presented the earliest version of pilot-wave theory at the 1927 Solvay Conference in Brussels, a famous gathering of the founders of the field. As de Broglie explained that day to Bohr, Albert Einstein, Erwin Schrödinger, Werner Heisenberg and two dozen other celebrated physicists, pilot-wave theory made all the same predictions as the probabilistic formulation of quantum mechanics (which wouldn’t be referred to as the “Copenhagen” interpretation until the 1950s), but without the ghostliness or mysterious collapse.

The probabilistic version, championed by Bohr, involves a single equation that represents likely and unlikely locations of particles as peaks and troughs of a wave. Bohr interpreted this probability-wave equation as a complete definition of the particle. But de Broglie urged his colleagues to use two equations: one describing a real, physical wave, and another tying the trajectory of an actual, concrete particle to the variables in that wave equation, as if the particle interacts with and is propelled by the wave rather than being defined by it.

For example, consider the double-slit experiment. In de Broglie’s pilot-wave picture, each electron passes through just one of the two slits, but is influenced by a pilot wave that splits and travels through both slits. Like flotsam in a current, the particle is drawn to the places where the two wavefronts cooperate, and does not go where they cancel out.

De Broglie could not predict the exact place where an individual particle would end up — just like Bohr’s version of events, pilot-wave theory predicts only the statistical distribution of outcomes, or the bright and dark stripes — but the two men interpreted this shortcoming differently. Bohr claimed that particles don’t have definite trajectories; de Broglie argued that they do, but that we can’t measure each particle’s initial position well enough to deduce its exact path.

In principle, however, the pilot-wave theory is deterministic: The future evolves dynamically from the past, so that, if the exact state of all the particles in the universe were known at a given instant, their states at all future times could be calculated.

At the Solvay conference, Einstein objected to a probabilistic universe, quipping, “God does not play dice,” but he seemed ambivalent about de Broglie’s alternative. Bohr told Einstein to “stop telling God what to do,” and (for reasons that remain in dispute) he won the day. By 1932, when the Hungarian-American mathematician John von Neumann claimed to have proven that the probabilistic wave equation in quantum mechanics could have no “hidden variables” (that is, missing components, such as de Broglie’s particle with its well-defined trajectory), pilot-wave theory was so poorly regarded that most physicists believed von Neumann’s proof without even reading a translation.

More than 30 years would pass before von Neumann’s proof was shown to be false, but by then the damage was done. The physicist David Bohm resurrected pilot-wave theory in a modified form in 1952, with Einstein’s encouragement, and made clear that it did work, but it never caught on. (The theory is also known as de Broglie-Bohm theory, or Bohmian mechanics.)

Later, the Northern Irish physicist John Stewart Bell went on to prove a seminal theorem that many physicists today misinterpret as rendering hidden variables impossible. But Bell supported pilot-wave theory. He was the one who pointed out the flaws in von Neumann’s original proof. And in 1986 he wrote that pilot-wave theory “seems to me so natural and simple, to resolve the wave-particle dilemma in such a clear and ordinary way, that it is a great mystery to me that it was so generally ignored.”

The neglect continues. A century down the line, the standard, probabilistic formulation of quantum mechanics has been combined with Einstein’s theory of special relativity and developed into the Standard Model, an elaborate and precise description of most of the particles and forces in the universe. Acclimating to the weirdness of quantum mechanics has become a physicist’s rite of passage. The old, deterministic alternative is not mentioned in most textbooks; most people in the field haven’t heard of it. Sheldon Goldstein, a professor of mathematics, physics and philosophy at Rutgers University and a supporter of pilot-wave theory, blames the “preposterous” neglect of the theory on “decades of indoctrination.” At this stage, Goldstein and several others noted, researchers risk their careers by questioning quantum orthodoxy.

A Quantum Drop

Now at last, pilot-wave theory may be experiencing a minor comeback — at least, among fluid dynamicists. “I wish that the people who were developing quantum mechanics at the beginning of last century had access to these experiments,” Milewski said. “Because then the whole history of quantum mechanics might be different.”

The experiments began a decade ago, when Yves Couder and colleagues at Paris Diderot University discovered that vibrating a silicone oil bath up and down at a particular frequency can induce a droplet to bounce along the surface. The droplet’s path, they found, was guided by the slanted contours of the liquid’s surface generated by the droplet’s own bounces — a mutual particle-wave interaction analogous to de Broglie’s pilot-wave concept.

Read the entire article here.

Image: Louis de Broglie. Courtesy of Wikipedia.

c2=e/m

Particle physicists will soon attempt to reverse the direction of Einstein’s famous equation delineating energy-matter equivalence, E=mc2. Next year, they plan to crash quanta of light into each other to create matter. Cool or what!

From the Guardian:

Researchers have worked out how to make matter from pure light and are drawing up plans to demonstrate the feat within the next 12 months.

The theory underpinning the idea was first described 80 years ago by two physicists who later worked on the first atomic bomb. At the time they considered the conversion of light into matter impossible in a laboratory.

But in a report published on Sunday, physicists at Imperial College London claim to have cracked the problem using high-powered lasers and other equipment now available to scientists.

“We have shown in principle how you can make matter from light,” said Steven Rose at Imperial. “If you do this experiment, you will be taking light and turning it into matter.”

The scientists are not on the verge of a machine that can create everyday objects from a sudden blast of laser energy. The kind of matter they aim to make comes in the form of subatomic particles invisible to the naked eye.

The original idea was written down by two US physicists, Gregory Breit and John Wheeler, in 1934. They worked out that – very rarely – two particles of light, or photons, could combine to produce an electron and its antimatter equivalent, a positron. Electrons are particles of matter that form the outer shells of atoms in the everyday objects around us.
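
The kinematics behind Breit and Wheeler’s rarity claim can be sketched in a few lines. For two photons colliding head-on, pair production is only possible when the product of their energies reaches the square of the electron’s rest energy (about 511 keV); the helper below is an illustrative check, not part of the Imperial team’s calculation:

```python
# Threshold check for the Breit-Wheeler process (two photons -> electron +
# positron), a back-of-the-envelope sketch: for a head-on collision, pair
# production is kinematically allowed when E1 * E2 >= (m_e c^2)^2.

M_E_C2_KEV = 511.0  # electron rest energy, ~511 keV

def pair_production_possible(e1_kev: float, e2_kev: float) -> bool:
    """Head-on photon-photon collision: True if pair creation is allowed."""
    return e1_kev * e2_kev >= M_E_C2_KEV ** 2

# Two 511 keV photons meeting head-on sit exactly at threshold:
print(pair_production_possible(511.0, 511.0))   # True
# A gamma ray against a 1 keV X-ray photon falls far short:
print(pair_production_possible(1000.0, 1.0))    # False
```

This is why the hohlraum step matters: the starlight-bright photon bath supplies partners energetic enough for the high-energy beam to pass the threshold.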

But Breit and Wheeler had no expectations that their theory would be proved any time soon. In their study, the physicists noted that the process was so rare and hard to produce that it would be “hopeless to try to observe the pair formation in laboratory experiments”.

Oliver Pike, the lead researcher on the study, said the process was one of the most elegant demonstrations of Einstein’s famous relationship that shows matter and energy are interchangeable currencies. “The Breit-Wheeler process is the simplest way matter can be made from light and one of the purest demonstrations of E=mc2,” he said.

Writing in the journal Nature Photonics, the scientists describe how they could turn light into matter through a number of separate steps. The first step fires electrons at a slab of gold to produce a beam of high-energy photons. Next, they fire a high-energy laser into a tiny gold capsule called a hohlraum, from the German for “empty room”. This produces light as bright as that emitted from stars. In the final stage, they send the first beam of photons into the hohlraum where the two streams of photons collide.

The scientists’ calculations show that the setup squeezes enough particles of light with high enough energies into a small enough volume to create around 100,000 electron-positron pairs.

The process is one of the most spectacular predictions of a theory called quantum electrodynamics (QED) that was developed in the run-up to the second world war. “You might call it the most dramatic consequence of QED and it clearly shows that light and matter are interchangeable,” Rose told the Guardian.

The scientists hope to demonstrate the process in the next 12 months. There are a number of sites around the world that have the technology. One is the huge Omega laser in Rochester, New York. But another is the Orion laser at Aldermaston, the atomic weapons facility in Berkshire.

A successful demonstration will encourage physicists who have been eyeing the prospect of a photon-photon collider as a tool to study how subatomic particles behave. “Such a collider could be used to study fundamental physics with a very clean experimental setup: pure light goes in, matter comes out. The experiment would be the first demonstration of this,” Pike said.

Read the entire story here.

Image: Feynman diagram for gluon radiation. Courtesy of Wikipedia.

Wolfgang Pauli’s Champagne

Austrian theoretical physicist Wolfgang Pauli dreamed up neutrinos in 1930, and famously bet a case of fine champagne that these ghostly elementary particles would never be found. Pauli lost the bet in 1956. Since then researchers have made great progress, both theoretically and experimentally, in trying to delve into the neutrino’s secrets. Two new books describe the ongoing quest.

From the Economist:

Neutrinos are weird. The wispy particles are far more abundant than the protons and electrons that make up atoms. Billions of them stream through every square centimetre of Earth’s surface each second, but they leave no trace and rarely interact with anything. Yet scientists increasingly agree that they could help unravel one of the biggest mysteries in physics: why the cosmos is made of matter.

Neutrinos’ scientific history is also odd, as two new books explain. The first is “Neutrino Hunters” by Ray Jayawardhana, a professor of astrophysics at the University of Toronto (and a former contributor to The Economist). The second, “The Perfect Wave”, is by Heinrich Päs, a neutrino theorist from the Technical University in the German city of Dortmund.

The particles were dreamed up in 1930 by Wolfgang Pauli, an Austrian, to account for energy that appeared to go missing in a type of radioactivity known as beta decay. Pauli apologised for what was a bold idea at a time when physicists knew of just two subatomic particles (protons and electrons), explaining that the missing energy was carried away by a new, electrically neutral and, he believed, undetectable subatomic species. He bet a case of champagne that it would never be found.

Pauli lost the wager in 1956 to two Americans, Frederick Reines and Clyde Cowan. The original experiment they came up with to test the hypothesis was unorthodox. It involved dropping a detector down a shaft within 40 metres of an exploding nuclear bomb, which would act as a source of neutrinos. Though Los Alamos National Laboratory approved the experiment, the pair eventually chose a more practical approach and buried a detector near a powerful nuclear reactor at Savannah River, South Carolina, instead. (Most neutrino detectors are deep underground to shield them from cosmic rays, which can cause similar signals.)

However, as other experiments, in particular those looking for neutrinos in the physical reactions which power the sun, strove to replicate Reines’s and Cowan’s result, they hit a snag. The number of solar neutrinos they recorded was persistently just one third of what theory said the sun ought to produce. Either the theorists had made a mistake, the thinking went, or the experiments had gone awry.

In fact, both were right all along. It was the neutrinos that, true to form, behaved oddly. As early as 1957 Bruno Pontecorvo, an Italian physicist who had defected to the Soviet Union seven years earlier, suggested that neutrinos could come in different types, known to physicists as “flavours”, and that they morph from one type to another on their way from the sun to Earth. Other scientists were sceptical. Their blueprint for how nature works at the subatomic level, called the Standard Model, assumed that neutrinos have no mass. This, as Albert Einstein showed, is the same as saying they travel at the speed of light. On reaching that speed time stops. If neutrinos switch flavours they would have to experience change, and thus time. That means they would have to be slower than light. In other words, they would have mass. (A claim in 2011 by Italian physicists working with CERN, Europe’s main physics laboratory, that neutrinos broke Einstein’s speed limit turned out to be the result of a loose cable.)

Pontecorvo’s hypothesis was proved only in 1998, in Japan. Others have since confirmed the phenomenon known as “oscillation”. The Standard Model had to be tweaked to make room for neutrino mass. But scientists still have little idea about how much any of the neutrinos actually weigh, besides being at least a million times lighter than an electron.
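
The flavour “morphing” can be quantified. In the simplified two-flavour picture (the full story involves all three flavours), the probability of a neutrino changing flavour over a distance L follows a standard formula; the function and parameter values below are illustrative:

```python
import math

# Two-flavour neutrino oscillation probability (a simplified sketch; the full
# three-flavour treatment is more involved):
#   P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with the mass-squared difference dm2 in eV^2, baseline L in km, energy E in GeV.

def oscillation_probability(theta: float, dm2_ev2: float, l_km: float, e_gev: float) -> float:
    return math.sin(2.0 * theta) ** 2 * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

# Illustrative, roughly atmospheric-scale parameters:
p = oscillation_probability(theta=math.pi / 4, dm2_ev2=2.5e-3, l_km=500.0, e_gev=1.0)
print(f"oscillation probability = {p:.3f}")
```

With maximal mixing (theta = π/4) the conversion can be nearly complete at the right distance-to-energy ratio, which is why oscillation experiments choose their baselines so carefully.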

The answer to the weight question, as well as a better understanding of neutrino oscillations, may help solve the puzzle of why the universe is full of matter. One explanation, which boffins like a lot because of its elegant maths, invokes a whole new category of “heavy” neutrino that decays more readily into matter than antimatter. If that happened a lot when the universe began, then there would have been more matter around than antimatter, and when the matter and antimatter annihilated each other, as they are wont to do, some matter (ie, everything now visible) would be left over. The lighter the known neutrinos, according to this “seesaw” theory, the heftier the heavy sort would have to be. A heavy neutrino has yet to be observed, and may well, as Pauli described it, be unobservable. But a better handle on the light variety, Messrs Jayawardhana and Päs both agree, may offer important clues.

These two books complement each other. Mr Jayawardhana’s is stronger on the history (though his accounts of the neutrino hunters’ personal lives can read a little too much like a professional CV). It is also more comprehensive on the potential use of neutrinos in examining the innards of the sun, of distant exploding stars or of Earth, as well as more practical uses such as fingering illicit nuclear-enrichment programmes (since they spew out a telltale pattern of the particles).

Read the entire article here.

Image: Wolfgang Pauli, c1945. Courtesy of Wikipedia.

God Is a Thermodynamicist

Physicists and cosmologists are constantly postulating and testing new ideas to explain the universe and everything within. Over the last hundred years or so, two such ideas have grown to explain much about our cosmos, and do so very successfully — quantum mechanics, which describes the very small, and relativity, which describes the very large. However, these two views do not reconcile, leaving theoreticians and researchers looking for a more fundamental theory of everything. One possible idea banishes the notions of time and gravity — treating them both as emergent properties of a deeper reality.

From New Scientist:

As revolutions go, its origins were haphazard. It was, according to the ringleader Max Planck, an “act of desperation”. In 1900, he proposed the idea that energy comes in discrete chunks, or quanta, simply because the smooth delineations of classical physics could not explain the spectrum of energy re-radiated by an absorbing body.

Yet rarely was a revolution so absolute. Within a decade or so, the cast-iron laws that had underpinned physics since Newton’s day were swept away. Classical certainty ceded its stewardship of reality to the probabilistic rule of quantum mechanics, even as the parallel revolution of Einstein’s relativity displaced our cherished, absolute notions of space and time. This was complete regime change.

Except for one thing. A single relict of the old order remained, one that neither Planck nor Einstein nor any of their contemporaries had the will or means to remove. The British astrophysicist Arthur Eddington summed up the situation in 1928. “If your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation,” he wrote.

In this essay, I will explore the fascinating question of why, since their origins in the early 19th century, the laws of thermodynamics have proved so formidably robust. The journey traces the deep connections that were discovered in the 20th century between thermodynamics and information theory – connections that allow us to trace intimate links between thermodynamics and not only quantum theory but also, more speculatively, relativity. Ultimately, I will argue, those links show us how thermodynamics in the 21st century can guide us towards a theory that will supersede them both.

In its origins, thermodynamics is a theory about heat: how it flows and what it can be made to do (see diagram). The French engineer Sadi Carnot formulated the second law in 1824 to characterise the mundane fact that the steam engines then powering the industrial revolution could never be perfectly efficient. Some of the heat you pumped into them always flowed into the cooler environment, rather than staying in the engine to do useful work. That is an expression of a more general rule: unless you do something to stop it, heat will naturally flow from hotter places to cooler places to even up any temperature differences it finds. The same principle explains why keeping the refrigerator in your kitchen cold means pumping energy into it; only that will keep warmth from the surroundings at bay.
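
Carnot’s observation can be captured in one line of arithmetic. The sketch below (illustrative temperatures, not historical data) computes the maximum fraction of heat any engine can turn into useful work between a hot and a cold reservoir:

```python
# Carnot's limit in code: no heat engine operating between a hot reservoir at
# t_hot and a cold one at t_cold (both in kelvin) can convert more than a
# fraction  eta = 1 - t_cold / t_hot  of its heat into work. The rest must
# flow into the cooler environment, as the steam engineers found.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum efficiency of a heat engine between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

# A boiler at ~450 K exhausting to a ~300 K environment can convert at most
# about a third of its heat into work, however cleverly it is built:
print(round(carnot_efficiency(450.0, 300.0), 3))  # 0.333
```

Note that the limit depends only on the two temperatures, not on the working fluid or the machinery — which is one reason the second law has proved so hard to evade.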

A few decades after Carnot, the German physicist Rudolf Clausius explained such phenomena in terms of a quantity characterising disorder that he called entropy. In this picture, the universe works on the back of processes that increase entropy – for example dissipating heat from places where it is concentrated, and therefore more ordered, to cooler areas, where it is not.

That predicts a grim fate for the universe itself. Once all heat is maximally dissipated, no useful process can happen in it any more: it dies a “heat death”. A perplexing question is raised at the other end of cosmic history, too. If nature always favours states of high entropy, how and why did the universe start in a state that seems to have been of comparatively low entropy? At present we have no answer, and later I will mention an intriguing alternative view.

Perhaps because of such undesirable consequences, the legitimacy of the second law was for a long time questioned. The charge was formulated with the most striking clarity by the British physicist James Clerk Maxwell in 1867. He was satisfied that inanimate matter presented no difficulty for the second law. In an isolated system, heat always passes from the hotter to the cooler, and a neat clump of dye molecules readily dissolves in water and disperses randomly, never the other way round. Disorder as embodied by entropy does always increase.

Maxwell’s problem was with life. Living things have “intentionality”: they deliberately do things to other things to make life easier for themselves. Conceivably, they might try to reduce the entropy of their surroundings and thereby violate the second law.

Information is power

Such a possibility is highly disturbing to physicists. Either something is a universal law or it is merely a cover for something deeper. Yet it was only in the late 1970s that Maxwell’s entropy-fiddling “demon” was laid to rest. Its slayer was the US physicist Charles Bennett, who built on work by his colleague at IBM, Rolf Landauer, using the theory of information developed a few decades earlier by Claude Shannon. An intelligent being can certainly rearrange things to lower the entropy of its environment. But to do this, it must first fill up its memory, gaining information as to how things are arranged in the first place.

This acquired information must be encoded somewhere, presumably in the demon’s memory. When this memory is finally full, or the being dies or otherwise expires, it must be reset. Dumping all this stored, ordered information back into the environment increases entropy – and this entropy increase, Bennett showed, will ultimately always be at least as large as the entropy reduction the demon originally achieved. Thus the status of the second law was assured, albeit anchored in a mantra of Landauer’s that would have been unintelligible to the 19th-century progenitors of thermodynamics: that “information is physical”.
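
Landauer’s slogan has a number attached. A minimal sketch of the bound he derived — at least kT ln 2 of heat must be dissipated for every bit of memory erased:

```python
import math

# Landauer's principle, quantified: erasing one bit of information at
# temperature T dissipates at least k * T * ln(2) of heat into the
# environment. This is the entropy price the demon eventually pays.

K_BOLTZMANN = 1.380649e-23  # J/K, Boltzmann's constant

def landauer_limit_joules(temperature_k: float) -> float:
    """Minimum heat dissipated when erasing one bit at temperature T."""
    return K_BOLTZMANN * temperature_k * math.log(2)

# At room temperature (~300 K) the bound is a few zeptojoules per bit:
print(f"{landauer_limit_joules(300.0):.2e} J per bit erased")
```

The number is tiny, but it is strictly positive — which is all the second law needs to survive the demon’s bookkeeping.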

But how does this explain that thermodynamics survived the quantum revolution? Classical objects behave very differently to quantum ones, so the same is presumably true of classical and quantum information. After all, quantum computers are notoriously more powerful than classical ones (or would be if realised on a large scale).

The reason is subtle, and it lies in a connection between entropy and probability contained in perhaps the most profound and beautiful formula in all of science. Engraved on the tomb of the Austrian physicist Ludwig Boltzmann in Vienna’s central cemetery, it reads simply S = k log W. Here S is entropy – the macroscopic, measurable entropy of a gas, for example – while k is a constant of nature that today bears Boltzmann’s name. Log W is the mathematical logarithm of a microscopic, probabilistic quantity W – in a gas, this would be the number of ways the positions and velocities of its many individual atoms can be arranged.

On a philosophical level, Boltzmann’s formula embodies the spirit of reductionism: the idea that we can, at least in principle, reduce our outward knowledge of a system’s activities to basic, microscopic physical laws. On a practical, physical level, it tells us that all we need to understand disorder and its increase is probabilities. Tot up the number of configurations the atoms of a system can be in and work out their probabilities, and what emerges is nothing other than the entropy that determines its thermodynamical behaviour. The equation asks no further questions about the nature of the underlying laws; we need not care if the dynamical processes that create the probabilities are classical or quantum in origin.
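
A toy example makes Boltzmann’s formula concrete. Below, N coin flips stand in for the atoms of a gas: the macrostate “n heads” has W = C(N, n) microstates, and the evenly mixed macrostate dwarfs the ordered ones (an illustrative sketch, not a physical gas):

```python
import math

# Illustrating S = k log W with coins: each macrostate "n heads out of N"
# corresponds to W = C(N, n) microscopic arrangements. The 50/50 macrostate
# has overwhelmingly the most arrangements, hence the highest entropy --
# disorder wins on sheer counting.

K_BOLTZMANN = 1.380649e-23  # J/K, Boltzmann's constant

def boltzmann_entropy(w: int) -> float:
    """Entropy of a macrostate with w microstates: S = k log W."""
    return K_BOLTZMANN * math.log(w)

N = 100
for n_heads in (0, 25, 50):
    w = math.comb(N, n_heads)  # number of microstates for this macrostate
    print(f"{n_heads:>3} heads: W = {w:.3e}, S = {boltzmann_entropy(w):.3e} J/K")
```

The all-heads macrostate has exactly one arrangement and zero entropy; the 50/50 macrostate has about 10^29 arrangements, and random shuffling will find it essentially every time.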

There is an important additional point to be made here. Probabilities are fundamentally different things in classical and quantum physics. In classical physics they are “subjective” quantities that constantly change as our state of knowledge changes. The probability that a coin toss will result in heads or tails, for instance, jumps from ½ to 1 when we observe the outcome. If there were a being who knew all the positions and momenta of all the particles in the universe – known as a “Laplace demon”, after the French mathematician Pierre-Simon Laplace, who first countenanced the possibility – it would be able to determine the course of all subsequent events in a classical universe, and would have no need for probabilities to describe them.

In quantum physics, however, probabilities arise from a genuine uncertainty about how the world works. States of physical systems in quantum theory are represented in what the quantum pioneer Erwin Schrödinger called catalogues of information, but they are catalogues in which adding information on one page blurs or scrubs it out on another. Knowing the position of a particle more precisely means knowing less well how it is moving, for example. Quantum probabilities are “objective”, in the sense that they cannot be entirely removed by gaining more information.

That casts thermodynamics, as originally and classically formulated, in an intriguing light. There, the second law is little more than impotence written down in the form of an equation. It has no deep physical origin itself, but is an empirical bolt-on to express the otherwise unaccountable fact that we cannot know, predict or bring about everything that might happen, as classical dynamical laws suggest we can. But this changes as soon as you bring quantum physics into the picture, with its attendant notion that uncertainty is seemingly hardwired into the fabric of reality. Rooted in probabilities, entropy and thermodynamics acquire a new, more fundamental physical anchor.

It is worth pointing out, too, that this deep-rooted connection seems to be much more general. Recently, together with my colleagues Markus Müller of the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada, and Oscar Dahlsten at the Centre for Quantum Technologies in Singapore, I have looked at what happens to thermodynamical relations in a generalised class of probabilistic theories that embrace quantum theory and much more besides. There too, the crucial relationship between information and disorder, as quantified by entropy, survives (arxiv.org/abs/1107.6029).

One theory to rule them all

As for gravity – the only one of nature’s four fundamental forces not covered by quantum theory – a more speculative body of research suggests it might be little more than entropy in disguise (see “Falling into disorder”). If so, that would also bring Einstein’s general theory of relativity, with which we currently describe gravity, firmly within the purview of thermodynamics.

Take all this together, and we begin to have a hint of what makes thermodynamics so successful. The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe – among other things, to construct theories to further our understanding of it. Thermodynamics is, in Einstein’s term, a “meta-theory”: one constructed from principles over and above the structure of any dynamical laws we devise to describe reality’s workings. In that sense we can argue that it is more fundamental than either quantum physics or general relativity.

If we can accept this and, like Eddington and his ilk, put all our trust in the laws of thermodynamics, I believe it may even afford us a glimpse beyond the current physical order. It seems unlikely that quantum physics and relativity represent the last revolutions in physics. New evidence could at any time foment their overthrow. Thermodynamics might help us discern what any usurping theory would look like.

For example, earlier this year, two of my colleagues in Singapore, Esther Hänggi and Stephanie Wehner, showed that a violation of the quantum uncertainty principle – that idea that you can never fully get rid of probabilities in a quantum context – would imply a violation of the second law of thermodynamics. Beating the uncertainty limit means extracting extra information about the system, which requires the system to do more work than thermodynamics allows it to do in the relevant state of disorder. So if thermodynamics is any guide, whatever any post-quantum world might look like, we are stuck with a degree of uncertainty (arxiv.org/abs/1205.6894).

My colleague at the University of Oxford, the physicist David Deutsch, thinks we should take things much further. Not only should any future physics conform to thermodynamics, but the whole of physics should be constructed in its image. The idea is to generalise the logic of the second law as it was stringently formulated by the mathematician Constantin Carathéodory in 1909: that in the vicinity of any state of a physical system, there are other states that cannot physically be reached if we forbid any exchange of heat with the environment.

James Joule’s 19th-century experiments with beer can be used to illustrate this idea. The English brewer, whose name lives on in the standard unit of energy, sealed beer in a thermally isolated tub containing a paddle wheel that was connected to weights falling under gravity outside. The wheel’s rotation warmed the beer, increasing the disorder of its molecules and therefore its entropy. But hard as we might try, we simply cannot use Joule’s set-up to decrease the beer’s temperature, even by a fraction of a millikelvin. Cooler beer is, in this instance, a state regrettably beyond the reach of physics.
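
The arithmetic behind Joule’s set-up is simple enough to sketch. The figures below are illustrative, not those of Joule’s actual apparatus: a falling weight does work m·g·h, and if all of it ends up as heat in the liquid, the temperature rises by ΔT = m·g·h / (M·c):

```python
# A rough sketch of the paddle-wheel arithmetic (illustrative figures, and
# beer approximated as water): a weight of mass m falling a height h delivers
# m * g * h joules of work, warming M kilograms of liquid with specific heat
# capacity c by  dT = m * g * h / (M * c).

G = 9.81          # m/s^2, gravitational acceleration
C_WATER = 4186.0  # J/(kg*K), specific heat capacity of water (~beer)

def temperature_rise(weight_kg: float, drop_m: float, liquid_kg: float) -> float:
    """Temperature increase if all the falling weight's work becomes heat."""
    return weight_kg * G * drop_m / (liquid_kg * C_WATER)

# A 10 kg weight falling 2 m into 5 kg of beer warms it by under 0.01 K:
print(f"{temperature_rise(10.0, 2.0, 5.0):.5f} K")
```

Running the process forward is easy; the second law’s prohibition is that no rearrangement of the same apparatus will ever make the number come out negative.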

God, the thermodynamicist

The question is whether we can express the whole of physics simply by enumerating possible and impossible processes in a given situation. This is very different from how physics is usually phrased, in both the classical and quantum regimes, in terms of states of systems and equations that describe how those states change in time. The blind alleys down which the standard approach can lead are easiest to understand in classical physics, where the dynamical equations we derive allow a whole host of processes that patently do not occur – the ones we have to conjure up the laws of thermodynamics expressly to forbid, such as dye molecules reclumping spontaneously in water.

By reversing the logic, our observations of the natural world can again take the lead in deriving our theories. We observe the prohibitions that nature puts in place, be it on decreasing entropy, getting energy from nothing, travelling faster than light or whatever. The ultimately “correct” theory of physics – the logically tightest – is the one from which the smallest deviation gives us something that breaks those taboos.

There are other advantages in recasting physics in such terms. Time is a perennially problematic concept in physical theories. In quantum theory, for example, it enters as an extraneous parameter of unclear origin that cannot itself be quantised. In thermodynamics, meanwhile, the passage of time is entropy increase by any other name. A process such as dissolved dye molecules forming themselves into a clump offends our sensibilities because it appears to amount to running time backwards as much as anything else, although the real objection is that it decreases entropy.

Apply this logic more generally, and time ceases to exist as an independent, fundamental entity, but one whose flow is determined purely in terms of allowed and disallowed processes. With it go problems such as the one I alluded to earlier, of why the universe started in a state of low entropy. If states and their dynamical evolution over time cease to be the question, then anything that does not break any transformational rules becomes a valid answer.

Such an approach would probably please Einstein, who once said: “What really interests me is whether God had any choice in the creation of the world.” A thermodynamically inspired formulation of physics might not answer that question directly, but leaves God with no choice but to be a thermodynamicist. That would be a singular accolade for those 19th-century masters of steam: that they stumbled upon the essence of the universe, entirely by accident. The triumph of thermodynamics would then be a revolution by stealth, 200 years in the making.

Read the entire article here.

Time for the Neutrino

Enough of the Higgs boson, already! It’s time to shine the light on its smaller, swifter cousin, the neutrino.

From the NYT:

HAVE you noticed how the Higgs boson has been hogging the limelight lately? For a measly little invisible item, whose significance cannot be explained without appealing to thorny concepts of quantum field theory, it has done pretty well for itself. The struggling starlets of Hollywood could learn a thing or two about the dark art of self-promotion from this boson.

First, its elusiveness “sparked the greatest hunt in science,” as the subtitle of one popular book put it. Then came all the hoopla over its actual discovery. Or should I say discoveries? Because those clever, well-meaning folks at the CERN laboratory outside Geneva proclaimed their finding of the particle not once but twice. First in 2012, on the Fourth of July no less, they told the world that their supergigantic — and awesomely expensive — atom smasher had found tentative evidence of the Higgs. Eight months later, they made a second announcement, this time with more data in hand, to confirm that they had nabbed the beast for real. Just recently, there was yet more fanfare when two of the grandees who had predicted the particle’s existence back in 1964 shared a Nobel Prize for their insight.

In fact, ever since another Nobel-winning genius, Leon Lederman, branded it the “God particle” some 20 years ago, the Higgs boson has captured the public imagination and dominated the media coverage of physics. Some consider Professor Lederman’s moniker a brilliant P.R. move for physics, while others denounce it as a terrible gaffe that confuses people and cheapens a solemn scientific enterprise. Either way, it has been effective. Nobody ever talks about the fascinating lives of other subatomic particles on “Fox and Friends.”

Sure, the story of Higgs is a compelling one. The jaw-dropping $9 billion price tag of the machine built to chase it is enough to command our attention. Plus, there is the serene, wise man at the center of this epic saga: the octogenarian Peter Higgs, finally vindicated after waiting patiently for decades. Professor Higgs was seen to shed a tear of joy at a news conference announcing the discovery, adding tenderness to the triumphant moment and tugging ever so gently at our heartstrings. For reporters looking for a human-interest angle to this complicated scientific brouhaha, that was pure gold.

But I say enough is enough. It is time to give another particle a chance.

And have I got a terrific candidate for you! It moves in mysterious ways, passing right through wood, walls and even our bodies, with nary a bump. It morphs among three forms, like a cosmic chameleon evading capture. It brings us news from the sun’s scorching heart and from the spectacular death throes of monstrous stars. It could tell us why antimatter is so rare in the universe and illuminate the inner workings of our own planet. Someday, it may even help expose rogue nuclear reactors and secret bomb tests, thus promoting world peace. Most important, we might not be here without it.

WHAT is this magical particle, you ask? It is none other than the ghostly neutrino.

O.K., I admit that I am biased, having just written a book about it. But believe me, no other particle comes close to matching the incredibly colorful and quirky personality of the neutrino, or promises to reveal as much about a mind-boggling array of natural phenomena, both subatomic and cosmic. As one researcher told me, “Whenever anything cool happens in the universe, neutrinos are usually involved.” Besides, John Updike considered it worthy of celebrating in a delightful poem in The New Yorker, and on “The Big Bang Theory,” Sheldon Cooper’s idol Professor Proton chose Gino the Neutrino as his beloved puppet sidekick.

Granted, the neutrino does come with some baggage. Remember how it made headlines two years ago for possibly traveling faster than light? Back then, the prospects of time travel and breaking Einstein’s speed limit provided plenty of fodder for rampant speculation and a few bad jokes. In the end, the whole affair turned out to be much ado about a faulty cable. I maintain it is unfair to hold the poor little neutrino responsible for that commotion.

Generally speaking, the neutrino tends to shun the limelight. Actually, it is pathologically shy and hardly ever interacts with other particles. That makes it tough to pin down.

Thankfully, today’s neutrino hunters have a formidable arsenal at their disposal, including newfangled observatories buried deep underground or in the Antarctic ice. Neutrino chasing, once an esoteric sideline, has turned into one of the hottest occupations for the discerning nerd. More eager young ones will surely clamor for entry into the Promised Land now that the magazine Physics World has declared the recent detection of cosmic neutrinos to be the No. 1 physics breakthrough of the year.

Drum roll, please. The neutrino is ready to take center stage. But don’t blink: It zips by at nearly the speed of light.

Read the entire story here.

The Universe of Numbers

There is no doubt that mathematics — some of it very complex — has been able to explain much of what we consider the universe. In reality, and perhaps surprisingly, only a small subset of equations is required to explain everything around us, from atoms and their constituents to the vast cosmos. Why is that? And what is the fundamental relationship between mathematics and our current physical understanding of all things great and small?

From the New Scientist:

When Albert Einstein finally completed his general theory of relativity in 1916, he looked down at the equations and discovered an unexpected message: the universe is expanding.

Einstein didn’t believe the physical universe could shrink or grow, so he ignored what the equations were telling him. Thirteen years later, Edwin Hubble found clear evidence of the universe’s expansion. Einstein had missed the opportunity to make the most dramatic scientific prediction in history.

How did Einstein’s equations “know” that the universe was expanding when he did not? If mathematics is nothing more than a language we use to describe the world, an invention of the human brain, how can it possibly churn out anything beyond what we put in? “It is difficult to avoid the impression that a miracle confronts us here,” wrote physicist Eugene Wigner in his classic 1960 paper “The unreasonable effectiveness of mathematics in the natural sciences” (Communications on Pure and Applied Mathematics, vol 13, p 1).

The prescience of mathematics seems no less miraculous today. At the Large Hadron Collider at CERN, near Geneva, Switzerland, physicists recently observed the fingerprints of a particle that was arguably discovered 48 years ago lurking in the equations of particle physics.

How is it possible that mathematics “knows” about Higgs particles or any other feature of physical reality? “Maybe it’s because math is reality,” says physicist Brian Greene of Columbia University, New York. Perhaps if we dig deep enough, we would find that physical objects like tables and chairs are ultimately not made of particles or strings, but of numbers.

“These are very difficult issues,” says philosopher of science James Ladyman of the University of Bristol, UK, “but it might be less misleading to say that the universe is made of maths than to say it is made of matter.”

Difficult indeed. What does it mean to say that the universe is “made of mathematics”? An obvious starting point is to ask what mathematics is made of. The late physicist John Wheeler said that the “basis of all mathematics is 0 = 0”. All mathematical structures can be derived from something called “the empty set”, the set that contains no elements. Say this set corresponds to zero; you can then define the number 1 as the set that contains only the empty set, 2 as the set containing the sets corresponding to 0 and 1, and so on. Keep nesting the nothingness like invisible Russian dolls and eventually all of mathematics appears. Mathematician Ian Stewart of the University of Warwick, UK, calls this “the dreadful secret of mathematics: it’s all based on nothing” (New Scientist, 19 November 2011, p 44). Reality may come down to mathematics, but mathematics comes down to nothing at all.
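The nesting Stewart describes is the standard von Neumann construction, and it is concrete enough to run. A minimal sketch in Python (frozensets stand in for sets here, since Python's mutable sets cannot contain other sets):

```python
# Von Neumann's construction of the natural numbers from the empty set:
# 0 is the empty set, and each successor n+1 is the set {0, 1, ..., n},
# i.e. n together with n itself as a member. Frozensets are hashable,
# so they can be nested inside one another.

def successor(n: frozenset) -> frozenset:
    """Return n + 1 = n ∪ {n}."""
    return n | frozenset({n})

zero = frozenset()         # 0 = {}
one = successor(zero)      # 1 = {0}
two = successor(one)       # 2 = {0, 1}
three = successor(two)     # 3 = {0, 1, 2}

# The cardinality of each set equals the number it encodes,
# and every smaller number is a member of every larger one.
assert len(three) == 3
assert zero in one and zero in two and one in two
```

Everything is built from `frozenset()`, the empty set: nested nothingness all the way down, exactly as Stewart's "dreadful secret" has it.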

That may be the ultimate clue to existence – after all, a universe made of nothing doesn’t require an explanation. Indeed, mathematical structures don’t seem to require a physical origin at all. “A dodecahedron was never created,” says Max Tegmark of the Massachusetts Institute of Technology. “To be created, something first has to not exist in space or time and then exist.” A dodecahedron doesn’t exist in space or time at all, he says – it exists independently of them. “Space and time themselves are contained within larger mathematical structures,” he adds. These structures just exist; they can’t be created or destroyed.

That raises a big question: why is the universe only made of some of the available mathematics? “There’s a lot of math out there,” Greene says. “Today only a tiny sliver of it has a realisation in the physical world. Pull any math book off the shelf and most of the equations in it don’t correspond to any physical object or physical process.”

It is true that seemingly arcane and unphysical mathematics does, sometimes, turn out to correspond to the real world. Imaginary numbers, for instance, were once considered totally deserving of their name, but are now used to describe the behaviour of elementary particles; non-Euclidean geometry eventually showed up as gravity. Even so, these phenomena represent a tiny slice of all the mathematics out there.

Not so fast, says Tegmark. “I believe that physical existence and mathematical existence are the same, so any structure that exists mathematically is also real,” he says.

So what about the mathematics our universe doesn’t use? “Other mathematical structures correspond to other universes,” Tegmark says. He calls this the “level 4 multiverse”, and it is far stranger than the multiverses that cosmologists often discuss. Their common-or-garden multiverses are governed by the same basic mathematical rules as our universe, but Tegmark’s level 4 multiverse operates with completely different mathematics.

All of this sounds bizarre, but the hypothesis that physical reality is fundamentally mathematical has passed every test. “If physics hits a roadblock at which point it turns out that it’s impossible to proceed, we might find that nature can’t be captured mathematically,” Tegmark says. “But it’s really remarkable that that hasn’t happened. Galileo said that the book of nature was written in the language of mathematics – and that was 400 years ago.”

Read the entire article here.

Bert and Ernie and Friends

The universe is a very strange place, stranger than Washington D.C., stranger than most reality TV shows.

And it keeps getting stranger as astronomers and cosmologists continue to make ever more head-scratching discoveries. The latest: a pair of super-high-energy neutrinos, followed by another 26, bringing the total to 28. It seems that these tiny, almost massless particles are reaching Earth from an unknown source, or sources, of immense power outside our own galaxy.

The neutrinos were spotted by the IceCube detector, which is buried beneath about a mile and a half of solid ice in an Antarctic glacier.

From io9:

By drilling a 1.5 mile hole deep into an Antarctic glacier, physicists working at the IceCube South Pole Observatory have captured 28 extraterrestrial neutrinos — those mysterious and extremely powerful subatomic particles that can pass straight through solid matter. Welcome to an entirely new age of astronomy.

Back in April of this year, the same team of physicists captured the highest energy neutrinos ever detected. Dubbed Bert and Ernie, the elusive subatomic particles likely originated from beyond our solar system, and possibly even our galaxy.

Neutrinos are extremely tiny and prolific subatomic particles that are born in nuclear reactions, including those that occur inside of stars. And because they’re practically massless (together they contain only a tiny fraction of the mass of a single electron), they can pass through normal matter, which is why they’re dubbed ‘ghost particles.’ Neutrinos are able to do this because they don’t carry an electric charge, so they’re immune to electromagnetic forces that influence charged particles like electrons and protons.

A Billion Times More Powerful

But not all neutrinos are the same. The ones discovered by the IceCube team are about a billion times more energetic than the ones coming out of our sun. A pair of them had energies above an entire petaelectronvolt. That’s more than 100 times the energy of the protons smashed together at CERN’s Large Hadron Collider.

So whatever created them must have been extremely powerful. Like, mind-bogglingly powerful — probably the remnants of supernova explosions. Indeed, as a recent study has shown, these cosmic explosions are more powerful than we could have ever imagined — to the point where they’re defying known physics.

Other candidates for neutrino production include black holes, pulsars, galactic nuclei — or even the cataclysmic merger of two black holes.

That’s why the discovery of these 28 new neutrinos, and the construction of the IceCube facility, is so important. It’s still a mystery, but these new findings, and the new detection technique, will help.

Back in April, the IceCube project looked for neutrinos above one petaelectronvolt, which is how Bert and Ernie were detected. But the team went back and searched through their data and found 26 neutrinos with slightly lower energies, though still above 30 teraelectronvolts, that were detected between May 2010 and May 2012. While it’s possible that some of these lower-energy neutrinos could have been produced by cosmic rays in the Earth’s atmosphere, the researchers say that most of them likely came from space. And in fact, the data was analyzed in such a way as to exclude neutrinos that didn’t come from space and other types of particles that may have tripped the detector.

The Dawn of a New Field

“This is a landmark discovery — possibly a Nobel Prize in the making,” said Alexander Kusenko, a UCLA astroparticle physicist who was not involved in the IceCube collaboration. Thanks to the remarkable IceCube facility, where neutrinos are captured in holes drilled 1.5 miles down into the Antarctic glacier, astronomers have a completely new way to scope out the cosmos. It’s both literally and figuratively changing the way we see the universe.

“It really is the dawn of a new field,” said Darren Grant, a University of Alberta physicist, and a member of the IceCube team.

Read the entire article here.

The Large Hadron Collider is So Yesterday

CERN’s Large Hadron Collider (LHC) smashed countless particles into one another to reveal the Higgs Boson. A great achievement for all concerned. Yet what of the remaining “big questions” of physics, and how will we find the answers?

From Wired:

The current era of particle physics is over. When scientists at CERN announced last July that they had found the Higgs boson — which is responsible for giving all other particles their mass — they uncovered the final missing piece in the framework that accounts for the interactions of all known particles and forces, a theory known as the Standard Model.

And that’s a good thing, right? Maybe not.

The prized Higgs particle, physicists assumed, would help steer them toward better theories, ones that fix the problems known to plague the Standard Model. Instead, it has thrown the field into a confusing situation.

“We’re sitting on a puzzle that is difficult to explain,” said particle physicist Maria Spiropulu of Caltech, who works on one of the LHC’s main Higgs-finding experiments, CMS.

It may sound strange, but physicists were hoping, maybe even expecting, that the Higgs would not turn out to be like they predicted it would be. At the very least, scientists hoped the properties of the Higgs would be different enough from those predicted under the Standard Model that they could show researchers how to build new models. But the Higgs’ mass proved stubbornly normal, almost exactly in the place the Standard Model said it would be.

To make matters worse, scientists had hoped to find evidence for other strange particles. These could have pointed in the direction of theories beyond the Standard Model, such as the current favorite, supersymmetry, which posits the existence of a heavy doppelganger for each of the known subatomic bits like electrons, quarks, and photons.

Instead, they were disappointed by being right. So how do we get out of this mess? More data!

Over the next few years, experimentalists will be churning out new results, which may be able to answer questions about dark matter, the properties of neutrinos, the nature of the Higgs, and perhaps what the next era of physics will look like. Here we take a look at the experiments that you should be paying attention to. These are the ones scientists are the most excited about because they might just form the next cracks in modern physics.

ATLAS and CMS
The Large Hadron Collider isn’t smashing protons right now. Instead, engineers are installing upgrades to help it search at even higher energies. The machine may be closed for business until 2015, but the massive amounts of data it has already collected are still wide open. The two main Higgs-searching experiments, ATLAS and CMS, could have plenty of surprises in store.

“We looked for the low-hanging fruit,” said particle physicist David Miller of the University of Chicago, who works on ATLAS. “All that we found was the Higgs, and now we’re going back for the harder stuff.”

What kind of other stuff might be lurking in the data? Nobody knows for sure but the collaborations will spend the next two years combing through the data they collected in 2011 and 2012, when the Higgs was found. Scientists are hoping to see hints of other, more exotic particles, such as those predicted under a theory known as supersymmetry. They will also start to understand the Higgs better.

See, scientists don’t have some sort of red bell that goes “ding” every time their detector finds a Higgs boson. In fact, ATLAS and CMS can’t actually see the Higgs at all. What they look for instead are the different particles that the Higgs decays into. The easiest-to-detect channels include decays to a quark and an antiquark, or to two photons. What scientists are now trying to find out is exactly what percent of the time it decays to the various particle combinations, which will help them further pin down its properties.

It’s also possible that, with careful analysis, physicists would add up the percentages for each of the different decays and notice that they haven’t quite gotten to 100. There might be just a tiny remainder, indicating that the Higgs is decaying to particles that the detectors can’t see.
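The arithmetic behind that remainder is nothing more than a sum. The branching fractions below are invented for illustration (they are not measured Higgs values):

```python
# Hypothetical branching fractions for the visible decay channels of a
# particle -- illustrative numbers only, not real measurements.
visible_channels = {
    "quark-antiquark pair": 0.57,
    "W boson pair":         0.22,
    "tau pair":             0.06,
    "Z boson pair":         0.03,
    "two photons":          0.002,
    "other visible":        0.088,
}

visible_total = sum(visible_channels.values())
invisible_fraction = 1.0 - visible_total   # whatever the detectors miss

print(f"visible total:      {visible_total:.3f}")
print(f"invisible fraction: {invisible_fraction:.3f}")
```

If careful measurement left a shortfall like the 3 percent above, that would be the "invisible decay" signal the text describes.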

“We call that invisible decay,” said particle physicist Maria Spiropulu. The reason that might be exciting is that the Higgs could be turning into something really strange, like a dark matter particle.

We know from cosmological observations that dark matter has mass and, because the Higgs gives rise to mass, it probably has to somehow interact with dark matter. So the LHC data could tell scientists just how strong the connection is between the Higgs and dark matter. If found, these invisible decays could open up a whole new world of exploration.

“It’s fashionable to call it the ‘dark matter portal’ right now,” said Spiropulu.

NOvA and T2K
Neutrinos are oddballs in the Standard Model. They are tiny, nearly massless, and barely interact with the other members of the subatomic zoo. Historically, they have been the subject of many surprising results, and the future will probably reveal them to be even stranger. Physicists are currently trying to figure out some of their properties, which remain open questions.

“A very nice feature of these open questions is we know they all have answers that are accessible in the next round of experiments,” said physicist Maury Goodman of Argonne National Laboratory.

The US-based NOvA experiment will hopefully pin down some neutrino characteristics, in particular their masses. There are three types of neutrinos: electron, muon, and tau. We know that they have a very tiny mass — at least hundreds of thousands of times smaller than an electron’s — but we don’t know exactly what it is, nor which of the three different types is heaviest or lightest.

NOvA will attempt to figure out this mass hierarchy by shooting a beam of neutrinos from Fermilab, near Chicago, to a detector 810 kilometers away in Ash River, Minnesota. A similar experiment in Japan, called T2K, is sending neutrinos across 295 kilometers. As they pass through the Earth, neutrinos oscillate among their three different types. By comparing how the neutrinos look when they are first shot out with how they appear at the distant detector, NOvA and T2K will be able to determine their properties with high precision.
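In the standard two-flavor approximation, the chance that a muon neutrino survives the trip depends on the ratio of baseline to beam energy. A sketch, using illustrative atmospheric-sector parameters (the mixing angle and mass splitting below are round-number assumptions, not the experiments' published fits):

```python
import math

def survival_probability(L_km: float, E_GeV: float,
                         sin2_2theta: float = 0.95,
                         dm2_eV2: float = 2.4e-3) -> float:
    """Two-flavor muon-neutrino survival probability:
    P = 1 - sin^2(2theta) * sin^2(1.27 * dm^2 * L / E),
    with dm^2 in eV^2, L in km, and E in GeV."""
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Baselines from the article, each paired with a beam energy near the
# first oscillation maximum (the energies are rough, for illustration).
print(f"T2K-like  (295 km, 0.6 GeV): {survival_probability(295, 0.6):.2f}")
print(f"NOvA-like (810 km, 2.0 GeV): {survival_probability(810, 2.0):.2f}")
```

Both experiments sit near an oscillation maximum by design: most of the muon neutrinos have changed flavor by the time they reach the far detector, which is what makes the comparison between near and far detectors so sensitive.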

T2K has been running for a couple of years, while NOvA is expected to begin taking data in 2014 and will run for six years. Scientists hope the two experiments will help answer some of the last remaining questions about neutrinos.

Read the entire article here.

Image: A simulation of the decay of a Higgs boson in a linear collider detector. Courtesy of Norman Graf / CERN.

Mr. Higgs

A fascinating profile of Peter Higgs, the theoretical physicist whose name has become associated with the most significant scientific finding of recent times.

From the Guardian:

For scientists of a certain calibre, these early days of October can bring on a bad case of the jitters. The nominations are in. The reports compiled. All that remains is for the Nobel committees to cast their final votes. There are no sure bets on who will win the most prestigious prize in science this year, but there are expectations aplenty. Speak to particle physicists, for example, and one name comes up more than any other. Top of their wishlist of winners – the awards are announced next Tuesday – is the self-deprecating British octogenarian, Peter Higgs.

Higgs, 84, is no household name, but he is closer to being one than any Nobel physics laureate since Richard Feynman, the Manhattan Project scientist, who accepted the award reluctantly in 1965. But while Feynman was a showman who adored attention, Higgs is happy when eclipsed by the particle that bears his name, the elusive boson that scientists at Cern’s Large Hadron Collider triumphantly discovered last year.

“He’s modest and actually almost to a fault,” said Alan Walker, a fellow physicist at Edinburgh University, who sat next to Higgs at Cern when scientists revealed they had found the particle.

“You meet many physicists who will tell you how good they are. Peter doesn’t do that.”

Higgs, now professor emeritus at Edinburgh, made his breakthrough in 1964. It was an era when the tools of the trade were pencil and paper. He outlined what came to be known as the Higgs mechanism, an explanation for how elementary particles, which make up all that is around us, gained their masses in the earliest moments after the big bang. Before 1964, the question of why the simplest particles weighed anything at all was met with an embarrassed but honest shrug.

Higgs plays down his role in developing the idea, but there is no dismissing the importance of the theory itself. “He didn’t produce a great deal, but what he did produce is actually quite profound and is one of the keystones of what we now understand as the fundamental building blocks of nature,” Walker said.

Higgs was born in Newcastle in 1929. His father, a BBC sound engineer, brought the family south to Birmingham and then onwards to Bristol. There, Higgs enrolled at what is now Cotham School. He got off to a bad start. One of the first things he did was tumble into a crater left by a second world war bomb in the playground and fracture his left arm. But he was a brilliant student. He won prizes in a haul of subjects – although not, as it happens, in physics.

To the teenage Higgs, physics lacked excitement. The best teachers were off at war, and that no doubt contributed to his attitude. It changed through a chance encounter. While standing around at the back of morning assembly, Higgs noticed a name that appeared more than once on the school’s honours board. He wondered who PAM Dirac was and read up on the former pupil. He learned that Paul Dirac was a founding father of quantum theory, and the closest Britain had to an Einstein. Through Dirac, Higgs came to relish the arcane world of theoretical physics.

Higgs found that he was not cut out for experiments, a fact driven home by a series of sometimes dramatic mishaps, but at university he proved himself a formidable theorist. He was the first to sit a six-hour theory exam at King’s College London, and for the want of a better idea, his tutors posed him a question that had recently been solved in a leading physics journal.

“Peter sailed ahead, took it seriously, thought about it, and in that six-hour time scale had managed to solve it, had written it up and presented it,” said Michael Fisher, a friend from King’s.

But getting the right answer was only the start. “In the long run it turned out, when it was actually graded, that Peter had done a better paper than the original they took from the literature.”

Higgs’s great discovery came at Edinburgh University, where he was considered an outsider for plugging away at ideas that many physicists had abandoned. But his doggedness paid off.

At the time an argument was raging in the field over a way that particles might gain their masses. The theory in question was clearly wrong, but Higgs saw why and how to fix it. He published a short note in September 1964 and swiftly wrote a more expansive follow-up paper.

To his dismay the article was rejected, ironically by an editor at Cern. Indignant at the decision, Higgs added two paragraphs to the paper and published it in a rival US journal instead. In the penultimate sentence was the first mention of what became known as the Higgs boson.

At first, there was plenty of resistance to Higgs’s theory. Before giving a talk at Harvard in 1966, a senior physicist, the late Sidney Coleman, told his class some idiot was coming to see them. “And you’re going to tear him to shreds.” Higgs stuck to his guns. Eventually he won them over.

Ken Peach, an Oxford physics professor who worked with Higgs in Edinburgh, said the determination was classic Peter: “There is an inner toughness, some steely resolve, which is not quite immediately apparent,” he said.

It was on display again when Stephen Hawking suggested the Higgs boson would never be found. Higgs hit back, saying that Hawking’s celebrity status meant he got away with pronouncements that others would not.

Higgs was at one time deeply involved in the Campaign for Nuclear Disarmament, but left when the organisation extended its protests to nuclear power. He felt CND had confused controlled and uncontrolled release of nuclear energy. He also joined Greenpeace but quit that organisation, too, when he felt its ideologies had started to trump its science.

“The one thing you get from Peter is that he is his own person,” said Walker.

Higgs was not the only scientist to come up with the theory of particle masses in 1964. François Englert and Robert Brout at the Free University in Brussels beat him into print by two weeks, but failed to mention the crucial new particle that scientists would need to prove the theory right. Three others, Gerry Guralnik, Dick Hagen and Tom Kibble, had worked out the theory too, and published a month later.

Higgs is not comfortable taking all the credit for the work, and goes to great pains to list all the others whose work he built on. But in the community he is revered. When Higgs walked into the Cern auditorium last year to hear scientists tell the world about the discovery, he was welcomed with a standing ovation. He nodded off during the talks, but was awake at the end, when the crowd erupted as the significance of the achievement became clear. At that moment, he was caught on camera reaching for a handkerchief and dabbing his eyes. “He was tearful,” said Walker. “He was really deeply moved. I think he was absolutely surprised by the atmosphere of the room.”

Read the entire article here.

Image: Ken Currie, Portrait of Peter Higgs, 2008. Courtesy of Wikipedia.

Fields from Dreams

It’s time to abandon the notion that you, and everything around you, are made up of tiny particles and their subatomic constituents. You are nothing more than perturbations in a field, or fields. Nothing more. Theoretical physicist Sean Carroll explains all.

From Symmetry:

When scientists talk to non-scientists about particle physics, they talk about the smallest building blocks of matter: what you get when you divide cells and molecules into tinier and tinier bits until you can’t divide them any more.

That’s one way of looking at things. But it’s not really the way things are, said Caltech theoretical physicist Sean Carroll in a lecture at Fermilab. And if physicists really want other people to appreciate the discovery of the Higgs boson, he said, it’s time to tell them the rest of the story.

“To understand what is going on, you actually need to give up a little bit on the notion of particles,” Carroll said in the June lecture.

Instead, think in terms of fields.

You’re already familiar with some fields. When you hold two magnets close together, you can feel their attraction or repulsion before they even touch—an interaction between two magnetic fields. Likewise, you know that when you jump in the air, you’re going to come back down. That’s because you live in Earth’s gravitational field.

Carroll’s stunner, at least to many non-scientists, is this: Every particle is actually a field. The universe is full of fields, and what we think of as particles are just excitations of those fields, like waves in an ocean. An electron, for example, is just an excitation of an electron field.

This may seem counterintuitive, but seeing the world in terms of fields actually helps make sense of some otherwise confusing facts of particle physics.

When a radioactive material decays, for example, we think of it as spitting out different kinds of particles. Neutrons decay into protons, electrons and antineutrinos. Those protons, electrons and antineutrinos aren’t hiding inside neutrons, waiting to get out. Yet they appear when neutrons decay.

If we think in terms of fields, this sudden appearance of new kinds of particles starts to make more sense. The energy and excitation of one field transfers to others as they vibrate against each other, making it seem like new types of particles are appearing.

Thinking in fields provides a clearer picture of how scientists are able to make massive particles like Higgs bosons in the Large Hadron Collider. The LHC smashes bunches of energetic protons into one another, and scientists study those collisions.

“There’s an analogy that’s often used here,” Carroll said, “that doing particle physics is like smashing two watches together and trying to figure out how watches work by watching all the pieces fall apart.

“This analogy is terrible for many reasons,” he said. “The primary one is that what’s coming out when you smash particles together is not what was inside the original particles. … [Instead,] it’s like you smash two Timex watches together and a Rolex pops out.”

What’s really happening in LHC collisions is that especially excited excitations of a field—the energetic protons—are vibrating together and transferring their energy to adjacent fields, forming new excitations that we see as new particles—such as Higgs bosons.

Thinking in fields can also better explain how the Higgs works. Higgs bosons themselves do not give other particles mass by, say, sticking to them in clumps. Instead, the Higgs field interacts with other fields, giving them—and, by extension, their particles—mass.

Read the entire article here.

Image: Iron filings tracing magnetic field lines between two bar magnets. Courtesy of Wikimedia.

Everywhere And Nowhere

Most physicists believe that dark matter exists, but they have never seen it, only deduced its existence. This is a rather unsettling state of affairs, since by most estimates dark matter and dark energy together account for 95 percent of the universe. The stuff we are made from, interact with and see on a daily basis — atoms, their constituents and their forces — is a mere 5 percent.

From the Atlantic:

Here’s a little experiment.

Hold up your hand.

Now put it back down.

In that window of time, your hand somehow interacted with dark matter — the mysterious stuff that comprises the vast majority of the universe. “Our best guess,” according to Dan Hooper, an astronomy professor at the University of Chicago and a theoretical astrophysicist at the Fermi National Accelerator Laboratory, “is that a million particles of dark matter passed through your hand just now.”

Dark matter, in other words, is not merely the stuff of black holes and deep space. It is all around us. Somehow. We’re pretty sure.

But if you did the experiment — as the audience at Hooper’s talk on dark matter and other cosmic mysteries did at the Aspen Ideas Festival today — you didn’t feel those million particles. We humans have no sense of their existence, Hooper said, in part because they don’t hew to the forces that regulate our movement in the world — gravity, electromagnetism, the forces we can, in some way, feel. Dark matter, instead, is “this ghostly, elusive stuff that dominates our universe,” Hooper said.

It’s everywhere. And it’s also, as far as human knowledge is concerned, nowhere.

And yet, despite its mysteries, we know it’s out there. “All astronomers are in complete conviction that there is dark matter,” said Richard Massey, the lead author of a recent study mapping the dark matter of the universe, and Hooper’s co-panelist. The evidence for its existence, Hooper agreed, is “overwhelming.” And yet it’s evidence based on deduction: through our examinations of the observable universe, we make assumptions about the unobservable version.

Dark matter, in other words, is aptly named. A full 95 percent of the universe — the dark matter, the stuff that both is and is not — is effectively unknown to us. “All the science that we’ve ever done only ever examines five percent of the universe,” Massey said. Which means that there are still mysteries to be unraveled, and dark truths to be brought to light.

And it also means, Massey pointed out, that for scientists, “the job security is great.”

You might be wondering, though: given how little we know about dark matter, how is it that Hooper knew that a million particles of the stuff passed through your hand as you raised and lowered it?

“I cheated a little,” Hooper admitted. He assumed a particular mass for the individual particles. “We know what the density of dark matter is on Earth from watching how the Milky Way rotates. And we know roughly how fast they’re going. So you take those two bits of information, and all you need to know is how much mass each individual particle has, and then I can get the million number. And I assumed a kind of traditional guess. But it could be 10,000 higher; it could be 10,000 lower.”
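Hooper’s estimate is easy to reproduce. Here is a minimal sketch in which every input is an assumed round number chosen for illustration (local density near 0.3 GeV per cubic centimeter, speed near 230 km/s, a “traditional” WIMP mass guess of 100 GeV, and a rough hand size), not a measured value:

```python
# Back-of-envelope version of the estimate quoted above.
DENSITY_GEV_PER_CM3 = 0.3   # assumed local dark matter density near Earth
PARTICLE_MASS_GEV = 100.0   # a "traditional" WIMP mass guess
SPEED_CM_PER_S = 230e5      # ~230 km/s, rough galactic orbital speed
HAND_AREA_CM2 = 100.0       # ~10 cm x 10 cm hand cross-section
DURATION_S = 1.0            # time to raise and lower a hand

# Number density times speed gives a flux; multiply by area and time.
number_density = DENSITY_GEV_PER_CM3 / PARTICLE_MASS_GEV   # particles / cm^3
crossings = number_density * SPEED_CM_PER_S * HAND_AREA_CM2 * DURATION_S
print(f"roughly {crossings:.0e} particles")  # on the order of millions
```

Because the particle mass divides the density, moving the assumed mass up or down moves the answer inversely, which is exactly the wide range Hooper allows for in the quote.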

Read the entire article here.

Shedding Light on Dark Matter

Scientists are cautiously optimistic that results from a particle experiment circling the Earth onboard the International Space Station (ISS) hint at the existence of dark matter.

From Symmetry:

The space-based Alpha Magnetic Spectrometer experiment could be building toward evidence of dark matter, judging by its first result.

The AMS detector does its work more than 200 miles above Earth, latched to the side of the International Space Station. It detects charged cosmic rays, high-energy particles that for the most part originate outside our solar system.

The experiment’s first result, released today, showed an excess of antimatter particles—over the number expected to come from cosmic-ray collisions—in a certain energy range.

There are two competing explanations for this excess. Extra antimatter particles called positrons could be forming in collisions between unseen dark-matter particles and their antiparticles in space. Or an astronomical object such as a pulsar could be firing them into our solar system.

Luckily, there are a couple of ways to find out which explanation is correct.

If dark-matter particles are the culprits, the excess of positrons should sink suddenly above a certain energy. But if a pulsar is responsible, at higher energies the excess will only gradually disappear.
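The qualitative difference between the two explanations can be sketched with a toy model. The functional forms and the 400 GeV cutoff below are invented purely for illustration; they are not fits to AMS data:

```python
import math

DM_MASS_GEV = 400.0  # hypothetical dark-matter particle mass

def dark_matter_excess(energy_gev):
    """Annihilating dark matter: the excess ends abruptly at the mass."""
    return 1.0 if energy_gev < DM_MASS_GEV else 0.0

def pulsar_excess(energy_gev):
    """A pulsar source: the excess fades smoothly, with no sharp edge."""
    return math.exp(-energy_gev / DM_MASS_GEV)

for energy in (100, 350, 450, 800):
    print(energy, dark_matter_excess(energy), round(pulsar_excess(energy), 3))
```

The first shape has a hard edge just above the particle mass; the second never does, which is why the behavior at the high-energy end of the spectrum is so diagnostic.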

“The way they drop off tells you everything,” said AMS Spokesperson and Nobel laureate Sam Ting, in today’s presentation at CERN, the European center for particle physics.

The AMS result, to be published in Physical Review Letters on April 5, includes data from the energy range between 0.5 and 350 GeV. A graph of the flux of positrons over the flux of electrons and positrons takes the shape of a valley, dipping in the energy range between 0.5 and 10 GeV and then increasing steadily between 10 and 250 GeV. After that point, it begins to dip again—but the graph cuts off just before one can tell whether this is the great drop-off expected in dark matter models or the gradual fade-out expected in pulsar models. This confirms previous results from the PAMELA experiment, with greater precision.

Ting smiled slightly while presenting this cliffhanger, pointing to the empty edge of the graph. “In here, what happens is of great interest,” he said.

“We, of course, have a feeling what is happening,” he said. “But probably it is too early to discuss that.”

Ting kept mum about any data collected so far above that energy, telling curious audience members to wait until the experiment had enough information to present a statistically significant result.

“I’ve been working at CERN for many years. I’ve never made a mistake on an experiment,” he said. “And this is a very difficult experiment.”

A second way to determine the origin of the excess of positrons is to consider where they’re coming from. If positrons are hitting the detector from all directions at random, they could be coming from something as diffuse as dark matter. But if they are arriving from one preferred direction, they might be coming from a pulsar.

So far, the result leans toward the dark-matter explanation, with positrons coming from all directions. But AMS scientists will need to collect more data to say this for certain.

Read the entire article following the jump.

Image: Alpha Magnetic Spectrometer (AMS) detector latched on to the International Space Station. Courtesy of NASA / AMS-02.

What’s Next at the LHC: Parallel Universe?

The Large Hadron Collider (LHC) at CERN made headlines in 2012 with the announcement of a probable discovery of the Higgs Boson. Scientists are collecting and analyzing more data before they declare an outright discovery in 2013. In the meantime, they plan to use the giant machine to examine even more interesting science — at very small and very large scales — in the new year.

[div class=attrib]From the Guardian:[end-div]

When it comes to shutting down the most powerful atom smasher ever built, it’s not simply a question of pressing the off switch.

In the French-Swiss countryside on the far side of Geneva, staff at the Cern particle physics laboratory are taking steps to wind down the Large Hadron Collider. After the latest run of experiments ends next month, the huge superconducting magnets that line the LHC’s 27km-long tunnel must be warmed up, slowly and gently, from -271 Celsius to room temperature. Only then can engineers descend into the tunnel to begin their work.

The machine that last year helped scientists snare the elusive Higgs boson – or a convincing subatomic impostor – faces a two-year shutdown while engineers perform repairs that are needed for the collider to ramp up to its maximum energy in 2015 and beyond. The work will beef up electrical connections in the machine that were identified as weak spots after an incident four years ago that knocked the collider out for more than a year.

The accident happened days after the LHC was first switched on in September 2008, when a short circuit blew a hole in the machine and sprayed six tonnes of helium into the tunnel that houses the collider. Soot was scattered over 700 metres. Since then, the machine has been forced to run at near half its design energy to avoid another disaster.

The particle accelerator, which reveals new physics at work by crashing together the innards of atoms at close to the speed of light, fills a circular, subterranean tunnel a staggering eight kilometres in diameter. Physicists will not sit around idle while the collider is down. There is far more to know about the new Higgs-like particle, and clues to its identity are probably hidden in the piles of raw data the scientists have already gathered, but have had too little time to analyse.

But the LHC was always more than a Higgs hunting machine. There are other mysteries of the universe that it may shed light on. What is the dark matter that clumps invisibly around galaxies? Why are we made of matter, and not antimatter? And why is gravity such a weak force in nature? “We’re only a tiny way into the LHC programme,” says Pippa Wells, a physicist who works on the LHC’s 7,000-tonne Atlas detector. “There’s a long way to go yet.”

The hunt for the Higgs boson, which helps explain the masses of other particles, dominated the publicity around the LHC for the simple reason that it was almost certainly there to be found. The lab fast-tracked the search for the particle, but cannot say for sure whether it has found it, or some more exotic entity.

“The headline discovery was just the start,” says Wells. “We need to make more precise measurements, to refine the particle’s mass and understand better how it is produced, and the ways it decays into other particles.” Scientists at Cern expect to have a more complete identikit of the new particle by March, when repair work on the LHC begins in earnest.

By its very nature, dark matter will be tough to find, even when the LHC switches back on at higher energy. The label “dark” refers to the fact that the substance neither emits nor reflects light. The only way dark matter has revealed itself so far is through the pull it exerts on galaxies.

Studies of spinning galaxies show they rotate with such speed that they would tear themselves apart were there not some invisible form of matter holding them together through gravity. There is so much dark matter, it outweighs by five times the normal matter in the observable universe.

The search for dark matter on Earth has failed to reveal what it is made of, but the LHC may be able to make the substance. If the particles that constitute it are light enough, they could be thrown out from the collisions inside the LHC. While they would zip through the collider’s detectors unseen, they would carry energy and momentum with them. Scientists could then infer their creation by totting up the energy and momentum of all the particles produced in a collision, and looking for signs of the missing energy and momentum.
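The bookkeeping in that last step can be sketched in a few lines. The (px, py) transverse momenta below are made-up numbers in GeV; in a real analysis they would come from particles reconstructed in the detector:

```python
import math

visible_particles = [(55.0, 10.0), (-20.0, 30.0), (-5.0, -15.0)]

# Transverse momentum must balance in each collision, so whatever the
# visible particles fail to cancel is attributed to unseen particles,
# e.g. dark-matter candidates escaping the detector.
sum_px = sum(px for px, _ in visible_particles)
sum_py = sum(py for _, py in visible_particles)
missing_px, missing_py = -sum_px, -sum_py
missing_pt = math.hypot(missing_px, missing_py)
print(f"missing transverse momentum: {missing_pt:.1f} GeV")
```

A large imbalance recurring across many collisions, with no ordinary explanation, is the signature scientists would tot up.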

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The eight torodial magnets can be seen on the huge ATLAS detector with the calorimeter before it is moved into the middle of the detector. This calorimeter will measure the energies of particles produced when protons collide in the centre of the detector. ATLAS will work along side the CMS experiment to search for new physics at the 14 TeV level. Courtesy of CERN.[end-div]

Uncertainty Strikes the Uncertainty Principle

Recent experiments at the University of Toronto have shown, for the first time, an anomaly relative to the measurements predicted by Werner Heisenberg’s fundamental law of quantum mechanics, the Uncertainty Principle.

[div class=attrib]From io9:[end-div]

Heisenberg’s uncertainty principle is an integral component of quantum physics. At the quantum scale, standard physics starts to fall apart, replaced by a fuzzy, nebulous set of phenomena. Among all the weirdness observed at this microscopic scale, Heisenberg famously observed that the position and momentum of a particle cannot be simultaneously measured with any meaningful degree of precision. This led him to posit the uncertainty principle, the declaration that there’s only so much we can know about a quantum system, namely a particle’s momentum and position.

Now, by definition, the uncertainty principle describes a two-pronged process. First, there’s the precision of a measurement that needs to be considered, and second, the degree of uncertainty, or disturbance, that it must create. It’s this second aspect that quantum physicists refer to as the “measurement-disturbance relationship,” and it’s an area that scientists have not sufficiently explored or proven.
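In symbols, the two prongs look roughly like this (the notation is an assumption on my part, since the article states no formulas: σ is the spread in a prepared quantum state, ε the error of a position measurement, and η the disturbance it imparts to momentum):

```latex
% Precision relation: limits on what a quantum state can simultaneously have.
\sigma(x)\,\sigma(p) \ge \frac{\hbar}{2}
% Heisenberg's original measurement-disturbance intuition: measurement error
% times induced disturbance is bounded below.
\epsilon(x)\,\eta(p) \gtrsim \frac{\hbar}{2}
```

It is the second, heuristic relation that the Toronto experiment probed; the first, about the state itself, is untouched by these results.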

Up until this point, quantum physicists have been fairly confident in their ability to both predict and measure the degree of disturbances caused by a measurement. Conventional thinking is that a measurement will always cause a predictable and consistent disturbance — but as the study from Toronto suggests, this is not always the case. Not all measurements, it would seem, will cause the effect predicted by Heisenberg and the tidy equations that have followed his theory. Moreover, the resultant ambiguity is not always caused by the measurement itself.

The researchers, a team led by Lee Rozema and Aephraim Steinberg, experimentally observed a clear-cut violation of Heisenberg’s measurement-disturbance relationship. They did this by applying what they called a “weak measurement” to define a quantum system before and after it interacted with their measurement tools — not enough to disturb it, but enough to get a basic sense of a photon’s orientation.

Then, by establishing measurement deltas, and then applying stronger, more disruptive measurements, the team was able to determine that they were not disturbing the quantum system to the degree that the uncertainty principle predicted. And in fact, the disturbances were half of what would normally be expected.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Heisenberg, Werner Karl Prof. 1901-1976; Physicist, Nobel Prize for Physics 1933, Germany. Courtesy of Wikipedia.[end-div]

Higgs?

 

A week ago, on July 4, 2012 researchers at CERN told the world that they had found evidence of a new fundamental particle — the so-called Higgs boson, or something closely similar. If further particle collisions at CERN’s Large Hadron Collider uphold this finding over the coming years, this will rank as significant a discovery as that of the proton or the electromagnetic force. While practical application of this discovery, in our lifetimes at least, is likely to be scant, it undeniably furthers our quest to understand the underlying mechanism of our existence.

So where might this discovery lead next?

[div class=attrib]From the New Scientist:[end-div]

“As a layman, I would say, I think we have it,” said Rolf-Dieter Heuer, director general of CERN at Wednesday’s seminar announcing the results of the search for the Higgs boson. But when pressed by journalists afterwards on what exactly “it” was, things got more complicated. “We have discovered a boson – now we have to find out what boson it is,” he said cryptically. Eh? What kind of particle could it be if it isn’t the Higgs boson? And why would it show up right where scientists were looking for the Higgs? We asked scientists at CERN to explain.

If we don’t know the new particle is a Higgs, what do we know about it?
We know it is some kind of boson, says Vivek Sharma of CMS, one of the two Large Hadron Collider experiments that presented results on Wednesday. There are only two types of elementary particle in the standard model: fermions, which include electrons, quarks and neutrinos, and bosons, which include photons and the W and Z bosons. The Higgs is a boson – and we know the new particle is too because one of the things it decays into is a pair of high-energy photons, or gamma rays. According to the rules of mathematical symmetry, only a boson could decay into exactly two photons.

Anything else?
Another thing we can say about the new particle is that nothing yet suggests it isn’t a Higgs. The standard model, our leading explanation for the known particles and the forces that act on them, predicts the rate at which a Higgs of a given mass should decay into various particles. The rates of decay reported for the new particle yesterday are not exactly what would be predicted for its mass of about 125 gigaelectronvolts (GeV) – leaving the door open to more exotic stuff. “If there is such a thing as a 125 GeV Higgs, we know what its rate of decay should be,” says Sharma. But the decay rates are close enough for the differences to be statistical anomalies that will disappear once more data is taken. “There are no serious inconsistencies,” says Joe Incandela, head of CMS, who reported the results on Wednesday.

In that case, are the CERN scientists just being too cautious? What would be enough evidence to call it a Higgs boson?
As there could be many different kinds of Higgs bosons, there’s no straight answer. An easier question to answer is: what would make the new particle neatly fulfil the Higgs boson’s duty in the standard model? Number one is to give other particles mass via the Higgs field – an omnipresent entity that “slows” some particles down more than others, resulting in mass. Any particle that makes up this field must be “scalar” (the opposite of a vector), meaning that, unlike a magnetic field or gravity, it doesn’t have any directionality. “Only a scalar boson fixes the problem,” says Oliver Buchmueller, also of CMS.

When will we know whether it’s a scalar boson?
By the end of the year, reckons Buchmueller, when at least one outstanding property of the new particle – its spin – should be determined. Scalars’ lack of directionality means they have spin 0. As the particle is a boson, we already know its spin is a whole number and as it decays into two photons, mathematical symmetry again dictates that the spin can’t be 1. Buchmueller says LHC researchers will be able to determine whether it has a spin of 0 or 2 by examining whether the Higgs’ decay particles shoot into the detector in all directions or with a preferred direction – the former would suggest spin 0. “Most people think it is a scalar, but it still needs to be proven,” says Buchmueller. Sharma is pretty sure it’s a scalar boson – that’s because it is more difficult to make a boson with spin 2. He adds that, although it is expected, confirmation that this is a scalar boson is still very exciting: “The beautiful thing is, if this turns out to be a scalar particle, we are seeing a new kind of particle. We have never seen a fundamental particle that is a scalar.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: A typical candidate event including two high-energy photons whose energy (depicted by dashed yellow lines and red towers) is measured in the CMS electromagnetic calorimeter. The yellow lines are the measured tracks of other particles produced in the collision.[end-div]

CDM: Cosmic Discovery Machine

We think CDM sounds much more fun than LHC, a rather dry acronym for Large Hadron Collider.

Researchers at the LHC are set to announce the latest findings in early July from the record-breaking particle smasher buried below the French and Swiss borders. Rumors point towards the discovery of the so-called Higgs boson, the particle theorized to give mass to all the other fundamental building blocks of matter. So, while this would be another exciting discovery from CERN and yet another confirmation of the fundamental and elegant Standard Model of particle physics, perhaps there is yet more to uncover, such as the exotically named “inflaton”.

[div class=attrib]From Scientific American:[end-div]

Within a sliver of a second after it was born, our universe expanded staggeringly in size, by a factor of at least 10^26. That’s what most cosmologists maintain, although it remains a mystery as to what might have begun and ended this wild expansion. Now scientists are increasingly wondering if the most powerful particle collider in history, the Large Hadron Collider (LHC) in Europe, could shed light on this mysterious growth, called inflation, by catching a glimpse of the particle behind it. It could be that the main target of the collider’s current experiments, the Higgs boson, which is thought to endow all matter with mass, could also be this inflationary agent.

During inflation, spacetime is thought to have swelled in volume at an accelerating rate, from about a quadrillionth the size of an atom to the size of a dime. This rapid expansion would help explain why the cosmos today is as extraordinarily uniform as it is, with only very tiny variations in the distribution of matter and energy. The expansion would also help explain why the universe on a large scale appears geometrically flat, meaning that the fabric of space is not curved in a way that bends the paths of light beams and objects traveling within it.

The particle or field behind inflation, referred to as the “inflaton,” is thought to possess a very unusual property: it generates a repulsive gravitational field. To cause space to inflate as profoundly and temporarily as it did, the field’s energy throughout space must have varied in strength over time, from very high to very low, with inflation ending once the energy sunk low enough, according to theoretical physicists.

Much remains unknown about inflation, and some prominent critics of the idea wonder if it happened at all. Scientists have looked at the cosmic microwave background radiation—the afterglow of the big bang—to rule out some inflationary scenarios. “But it cannot tell us much about the nature of the inflaton itself,” says particle cosmologist Anupam Mazumdar at Lancaster University in England, such as its mass or the specific ways it might interact with other particles.

A number of research teams have suggested competing ideas about how the LHC might discover the inflaton. Skeptics think it highly unlikely that any earthly particle collider could shed light on inflation, because the uppermost energy densities one could imagine with inflation would be about 10^50 times above the LHC’s capabilities. However, because inflation varied with strength over time, scientists have argued the LHC may have at least enough energy to re-create inflation’s final stages.

It could be that the principal particle that ongoing collider runs aim to detect, the Higgs boson, also underlies inflation.

“The idea of the Higgs driving inflation can only take place if the Higgs’s mass lies within a particular interval, the kind which the LHC can see,” says theoretical physicist Mikhail Shaposhnikov at the École Polytechnique Fédérale de Lausanne in Switzerland. Indeed, evidence of the Higgs boson was reported at the LHC in December at a mass of about 125 billion electron volts, roughly the mass of 125 hydrogen atoms.

Also intriguing: the Higgs as well as the inflaton are thought to have varied with strength over time. In fact, the inventor of inflation theory, cosmologist Alan Guth at the Massachusetts Institute of Technology, originally assumed inflation was driven by the Higgs field of a conjectured grand unified theory.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Physics World.[end-div]

Spacetime as an Emergent Phenomenon

A small, but growing, idea in theoretical physics and cosmology is that spacetime may be emergent. That is, spacetime emerges from something much more fundamental, in much the same way that our perception of temperature emerges from the motion and characteristics of underlying particles.

[div class=attrib]More on this new front in our quest to answer the most basic of questions from FQXi:[end-div]

Imagine if nothing around you was real. And, no, not in a science-fiction Matrix sense, but in an actual science-fact way.

Technically, our perceived reality is a gigantic series of approximations: The tables, chairs, people, and cell phones that we interact with every day are actually made up of tiny particles—as all good schoolchildren learn. From the motion and characteristics of those particles emerge the properties that we see and feel, including color and temperature. Though we don’t see those particles, because they are so much smaller than the phenomena our bodies are built to sense, they govern our day-to-day existence.

Now, what if spacetime is emergent too? That’s the question that Joanna Karczmarek, a string theorist at the University of British Columbia, Vancouver, is attempting to answer. As a string theorist, Karczmarek is familiar with imagining invisible constituents of reality. String theorists posit that at a fundamental level, matter is made up of unthinkably tiny vibrating threads of energy that underlie subatomic particles, such as quarks and electrons. Most string theorists, however, assume that such strings dance across a pre-existing and fundamental stage set by spacetime. Karczmarek is pushing things a step further, by suggesting that spacetime itself is not fundamental, but made of more basic constituents.

Having carried out early research in atomic, molecular and optical physics, Karczmarek shifted into string theory because she “was more excited by areas where less was known”—and looking for the building blocks from which spacetime arises certainly fits that criterion. The project, funded by a $40,000 FQXi grant, is “high risk but high payoff,” Karczmarek says.

Although one of only a few string theorists to address the issue, Karczmarek is part of a growing movement in the wider physics community to create a theory that shows spacetime is emergent. (See, for instance, “Breaking the Universe’s Speed Limit.”) The problem really comes into focus for those attempting to combine quantum mechanics with Einstein’s theory of general relativity and thus is traditionally tackled directly by quantum gravity researchers, rather than by string theorists, Karczmarek notes.

That may change though. Nathan Seiberg, a string theorist at the Institute for Advanced Study (IAS) in Princeton, New Jersey, has found good reasons for his stringy colleagues to believe that at least space—if not spacetime—is emergent. “With space we can sort of imagine how it might work,” Seiberg says. To explain how, Seiberg uses an everyday example—the emergence of an apparently smooth surface of water in a bowl. “If you examine the water at the level of particles, there is no smooth surface. It looks like there is, but this is an approximation,” Seiberg says. Similarly, he has found examples in string theory where some spatial dimensions emerge when you take a step back from the picture (arXiv:hep-th/0601234v1). “At shorter distances it doesn’t look like these dimensions are there because they are quantum fluctuations that are very rapid,” Seiberg explains. “In fact, the notion of space ceases to make sense, and eventually if you go to shorter and shorter distances you don’t even need it for the formulation of the theory.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Nature.[end-div]

There’s the Big Bang theory and then there’s The Big Bang Theory

Now in its fifth season on U.S. television, The Big Bang Theory has made serious geekiness fun and science cool. In fact, the show is rising in popularity to such an extent that a Google search for “big bang theory” ranks the show first and above all other more learned scientific entries.

Brad Hooker from Symmetry Breaking asks some deep questions of David Saltzberg, science advisor to The Big Bang Theory.

[div class=attrib]From Symmetry Breaking:[end-div]

For those who live, breathe and laugh physics, one show entangles them all: The Big Bang Theory. Now in its fifth season on CBS, the show follows a group of geeks, including a NASA engineer, an astrophysicist and two particle physicists.

Every episode has at least one particle physics joke. On faster-than-light neutrinos: “Is this observation another Swiss export full of more holes than their cheese?” On Saul Perlmutter clutching the Nobel Prize: “What’s the matter, Saul? You afraid somebody’s going to steal it, like you stole Einstein’s cosmological constant?”

To make these jokes timely and accurate, while sprinkling the sets with authentic scientific plots and posters, the show’s writers depend on one physicist, David Saltzberg. Since the first episode, Saltzberg’s dose of realism has made science chic again, and has even been credited with increasing admissions to physics programs. Symmetry writer Brad Hooker asked the LHC physicist, former Tevatron researcher and University of California, Los Angeles professor to explain how he walks the tightrope between science and sitcom.

Brad: How many of your suggestions are put into the show?

David: In general, when they ask for something, they use it. But it’s never anything that’s funny or moves the story along. It’s the part that you don’t need to understand. They explained to me in the beginning that you can watch an I Love Lucy rerun and not understand Spanish, but understand that Ricky Ricardo is angry. That’s all the level of science understanding needed for the show.

B: These references are current. Astrophysicist Saul Perlmutter of Lawrence Berkeley National Laboratory was mentioned on the show just weeks after winning the Nobel Prize for discovering the accelerating expansion of the universe.

D: Right. And you may wonder why they chose Saul Perlmutter, as opposed to the other two winners. It just comes down to that they liked the sound of his name better. Things like that matter. The writers think of the script in terms of music and the rhythm of the lines. I usually give them multiple choices because I don’t know if they want something short or long or something with odd sounds in it. They really think about that kind of thing.

B: Do the writers ever ask you to explain the science and it goes completely over their heads?

D: We respond by email so I don’t really know. But I don’t think it goes over their heads because you can Wikipedia anything.

One thing was a little difficult for me: they asked for a spoof of the Born-Oppenheimer approximation, which is harder than it sounds. But for the most part it’s just a matter of narrowing it down to a few choices. There are so many ways to go through it and I deliberately chose things that are current.

First of all, these guys live in our universe—they’re talking about the things we physicists are talking about. And also, there isn’t a whole lot of science journalism out there. It’s been cut back a lot. In getting the words out there, whether it’s “dark matter” or “topological insulators,” hopefully some fraction of the audience will Google it.

B: Are you working with any other science advisors? I know one character is a neurobiologist.

D: Luckily the actress who portrays her, Mayim Bialik, is also a neuroscientist. She has a PhD in neuroscience from UCLA. So that worked out really well because I don’t know all of physics, let alone all of science. What I’m able to do with the physics is say, “Well, we don’t really talk like that even though it’s technically correct.” And I can’t do that for biology, but she can.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of The Big Bang Theory, Warner Bros.[end-div]

Everything Comes in Threes

[div class=attrib]From the Guardian:[end-div]

Last week’s results from the Daya Bay neutrino experiment were the first real measurement of the third neutrino mixing angle, θ13 (theta one-three). There have been previous experiments which set limits on the angle, but this is the first time it has been shown to be significantly different from zero.

Since θ13 is a fundamental parameter in the Standard Model of particle physics, this would be an important measurement anyway. But there’s a bit more to it than that.

Neutrinos – whatever else they might be doing – mix up amongst themselves as they travel through space. This is a quantum mechanical effect, and comes from the fact that there are two ways of defining the three types of neutrino.

You can define them by the way they are produced. So a neutrino which is produced (or destroyed) in conjunction with an electron is an “electron neutrino”. If a muon is involved, it’s a “muon neutrino”. The third one is a “tau neutrino”. We call this the “flavour”.

Or you can define them by their masses. Usually we just call this definition neutrinos 1, 2 and 3.

The two definitions don’t line up, and there is a matrix which tells you how much of each “flavour” neutrino overlaps with each “mass” one. This is the neutrino mixing matrix. Inside this matrix in the standard model there are potentially four parameters describing how the neutrinos mix.

You could just have two-way mixing. For example, the flavour states might just mix up neutrino 1 and 2, and neutrino 2 and 3. This would be the case if the angle θ13 were zero. If it is bigger than zero (as Daya Bay have now shown) then neutrino 1 also mixes with neutrino 3. In this case, and only in this case, a fourth parameter is also allowed in the matrix. This fourth parameter (δ) is one we haven’t measured yet, but now we know it is there. And the really important thing is, if it is there, and also not zero, then it introduces an asymmetry between matter and antimatter.
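The structure of that matrix can be sketched directly. Below is a minimal version of the standard three-angle-plus-phase parameterization; the angle values in the demo are illustrative choices, not the measured ones:

```python
import cmath
import math

# Standard parameterization of the 3x3 neutrino mixing (PMNS) matrix:
# three mixing angles plus the extra phase delta. The phase always appears
# multiplied by sin(theta13), which is the article's point: if theta13 were
# zero, delta would be unobservable.
def pmns(theta12, theta23, theta13, delta):
    s12, c12 = math.sin(theta12), math.cos(theta12)
    s23, c23 = math.sin(theta23), math.cos(theta23)
    s13, c13 = math.sin(theta13), math.cos(theta13)
    ep, em = cmath.exp(1j * delta), cmath.exp(-1j * delta)
    return [
        [c12 * c13, s12 * c13, s13 * em],
        [-s12 * c23 - c12 * s23 * s13 * ep,
         c12 * c23 - s12 * s23 * s13 * ep, s23 * c13],
        [s12 * s23 - c12 * c23 * s13 * ep,
         -c12 * s23 - s12 * c23 * s13 * ep, c23 * c13],
    ]

# With theta13 = 0 the matrix is identical for any delta; with theta13 > 0
# the phase enters, opening the door to a matter-antimatter difference.
no_13 = pmns(0.6, 0.8, 0.0, 1.3)
with_13 = pmns(0.6, 0.8, 0.15, 1.3)
print(abs(no_13[0][2]), abs(with_13[0][2]))
```

Because δ only ever rides along with sin θ13, Daya Bay’s demonstration that θ13 is non-zero is what makes the phase measurable at all.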

This is important because currently we don’t know why there is more matter than antimatter around. We also don’t know why there are three copies of neutrinos (and indeed of each class of fundamental particle). But we know that three copies is the minimum number that allows some difference in the way matter and antimatter experience the weak nuclear force. This is the kind of clue which sets off big klaxons in the minds of physicists: New physics hiding somewhere here! It strongly suggests that these two not-understood facts are connected by some bigger, better theory than the one we have.

We’ve already measured a matter-antimatter difference for quarks; a non-zero θ13 means there can be a difference for neutrinos too. More clues.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: The first use of a hydrogen bubble chamber to detect neutrinos, on November 13, 1970. A neutrino hit a proton in a hydrogen atom. The collision occurred at the point where three tracks emanate on the right of the photograph. Courtesy of Wikipedia.[end-div]

A Theory of Everything? Nah

A peer-reviewed journal recently published a 100-page scientific paper describing a theory of everything that unifies quantum theory and relativity (a long sought-after goal) with the origin of life, evolution and cosmology. And, best of all, the paper contains no mathematics.

The paper, written by a faculty member at Case Western Reserve University, raises interesting issues about the peer-review process and the viral spread of information, whether it’s correct or not.

[div class=attrib]From Ars Technica:[end-div]

Physicists have been working for decades on a “theory of everything,” one that unites quantum mechanics and relativity. Apparently, they were being too modest. Yesterday saw publication of a press release claiming a biologist had just published a theory accounting for all of that—and handling the origin of life and the creation of the Moon in the bargain. Better yet, no math!

Where did such a crazy theory originate? In the mind of a biologist at a respected research institution, Case Western Reserve University Medical School. Amazingly, he managed to get his ideas published, then amplified by an official press release. At least two sites with poor editorial control then reposted the press release—verbatim—as a news story.

Gyres all the way down

The theory in question springs from the brain of one Erik Andrulis, a CWRU faculty member who has a number of earlier papers on fairly standard biochemistry. The new paper was accepted by an open access journal called Life, meaning that you can freely download a copy of its 105 pages if you’re so inclined. Apparently, the journal is peer-reviewed, which is a bit of a surprise; even accepting that the paper makes a purely theoretical proposal, it is nothing like science as I’ve ever seen it practiced.

The basic idea is that everything, from subatomic particles to living systems, is based on helical systems the author calls “gyres,” which transform matter, energy, and information. These transformations then determine the properties of various natural systems, living and otherwise. What are these gyres? It’s really hard to say; even Andrulis admits that they’re just “a straightforward and non-mathematical core model” (although he seems to think that’s a good thing). Just about everything can be derived from this core model; the author cites “major phenomena including, but not limited to, quantum gravity, phase transitions of water, why living systems are predominantly CHNOPS (carbon, hydrogen, nitrogen, oxygen, phosphorus, and sulfur), homochirality of sugars and amino acids, homeoviscous adaptation, triplet code, and DNA mutations.”

He’s serious about the “not limited to” part; one of the sections describes how gyres could cause the Moon to form.

Is this a viable theory of everything? The word “boson,” the particle that carries forces, isn’t in the text at all. “Quark” appears once—in the title of one of the 800 references. The only subatomic particle Andrulis describes is the electron; he skips from there straight up to oxygen. Enormous gaps exist everywhere one looks.

[div class=attrib]Read more here.[end-div]

From Nine Dimensions to Three

Over the last 40 years or so physicists and cosmologists have sought to construct a single grand theory of our universe, one that spans everything from the subatomic soup of particles and the forces between them to the vast structures of our galaxies, and all in between and beyond. Yet a major stumbling block has been how to reconcile the quantum theories that have so successfully described, and predicted, the microscopic world with our current understanding of gravity. String theory is one such attempt to develop a unified theory of everything, but it remains jumbled, with many possible solutions, and currently lies beyond experimental verification.

Recently however, theorists in Japan announced a computer simulation which shows how our current 3-dimensional universe may have evolved from a 9-dimensional space hypothesized by string theory.

[div class=attrib]From Interactions:[end-div]

A group of three researchers from KEK, Shizuoka University and Osaka University has for the first time revealed the way our universe was born with 3 spatial dimensions from 10-dimensional superstring theory, in which spacetime has 9 spatial directions and 1 temporal direction. This result was obtained by numerical simulation on a supercomputer.

[Abstract]

According to Big Bang cosmology, the universe originated in an explosion from an invisibly tiny point. This theory is strongly supported by observation of the cosmic microwave background and the relative abundance of elements. However, a situation in which the whole universe is a tiny point exceeds the reach of Einstein’s general theory of relativity, and for that reason it has not been possible to clarify how the universe actually originated.

In superstring theory, which is considered to be the “theory of everything”, all the elementary particles are represented as various oscillation modes of very tiny strings. Among those oscillation modes, there is one that corresponds to a particle that mediates gravity, and thus the general theory of relativity can be naturally extended to the scale of elementary particles. Therefore, it is expected that superstring theory allows the investigation of the birth of the universe. However, actual calculation has been intractable because the interaction between strings is strong, so all investigation thus far has been restricted to discussing various models or scenarios.

Superstring theory predicts a space with 9 dimensions, which poses the big puzzle of how this can be consistent with the 3-dimensional space that we live in.

A group of 3 researchers, Jun Nishimura (associate professor at KEK), Asato Tsuchiya (associate professor at Shizuoka University) and Sang-Woo Kim (project researcher at Osaka University) has succeeded in simulating the birth of the universe, using a supercomputer for calculations based on superstring theory. This showed that the universe had 9 spatial dimensions at the beginning, but only 3 of these underwent expansion at some point in time.

This work will be published soon in Physical Review Letters.

[The content of the research]

In this study, the team established a method for calculating large matrices (in the IKKT matrix model), which represent the interactions of strings, and calculated how the 9-dimensional space changes with time. In the figure, the spatial extents in 9 directions are plotted against time.

If one goes far enough back in time, space is indeed extended in 9 directions, but then at some point only 3 of those directions start to expand rapidly. This result demonstrates, for the first time, that the 3-dimensional space that we are living in indeed emerges from the 9-dimensional space that superstring theory predicts.

This calculation was carried out on the supercomputer Hitachi SR16000 (theoretical performance: 90.3 TFLOPS) at the Yukawa Institute for Theoretical Physics of Kyoto University.

[The significance of the research]

It is almost 40 years since superstring theory was proposed as the theory of everything, extending the general theory of relativity to the scale of elementary particles. However, its validity and its usefulness remained unclear due to the difficulty of performing actual calculations. The newly obtained solution to the space-time dimensionality puzzle strongly supports the validity of the theory.

Furthermore, the establishment of a new method to analyze superstring theory using computers opens up the possibility of applying this theory to various problems. For instance, it should now be possible to provide a theoretical understanding of the inflation that is believed to have taken place in the early universe, and also the accelerating expansion of the universe, whose discovery earned the Nobel Prize in Physics this year. It is expected that superstring theory will develop further and play an important role in solving such puzzles in particle physics as the existence of the dark matter that is suggested by cosmological observations, and the Higgs particle, which is expected to be discovered by LHC experiments.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: A visualization of strings. Courtesy of R. Dijkgraaf / Universe Today.[end-div]

Faster Than Light Travel

The world of particle physics is agog with recent news of an experiment that shows a very unexpected result – sub-atomic particles traveling faster than the speed of light. If verified and independently replicated the results would violate one of the universe’s fundamental properties described by Einstein in the Special Theory of Relativity. The speed of light — 186,282 miles per second (299,792 kilometers per second) — has long been considered an absolute cosmic speed limit.

Stranger still, over the last couple of days news of this anomalous result has even been broadcast on many cable news shows.

The experiment known as OPERA is a collaboration between France’s National Institute for Nuclear and Particle Physics Research and Italy’s Gran Sasso National Laboratory. Over the course of three years scientists fired a neutrino beam 454 miles (730 kilometers) underground from Geneva to a receiver in Italy. Their measurements show that neutrinos arrived an average of 60 nanoseconds sooner than light would have done. This doesn’t seem like a great amount; after all, it is only 60 billionths of a second. However, even this small difference could undermine a hundred years of physics.
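The quoted numbers are easy to sanity-check. Here is a minimal back-of-envelope sketch, assuming only the round figures reported above (a 730 km baseline and a 60 ns early arrival); the function names are illustrative, not part of the OPERA analysis:

```python
# Back-of-envelope check of the OPERA numbers: how big is a 60 ns
# early arrival over a 730 km baseline?

C = 299_792_458.0    # speed of light in vacuum, m/s
BASELINE_M = 730e3   # approximate Geneva-to-Gran-Sasso baseline, m
EARLY_NS = 60.0      # reported average early arrival, ns

def light_travel_time_ns(distance_m: float) -> float:
    """Time for light to cover the baseline, in nanoseconds."""
    return distance_m / C * 1e9

def fractional_speed_excess(distance_m: float, early_ns: float) -> float:
    """Implied (v - c) / c if particles really arrived early_ns sooner."""
    t_light = light_travel_time_ns(distance_m)
    return early_ns / (t_light - early_ns)

# Light needs about 2.44 million ns (~2.4 ms) for the trip; 60 ns early
# corresponds to a speed excess of roughly 2.5 parts in 100,000.
print(f"light travel time: {light_travel_time_ns(BASELINE_M):,.0f} ns")
print(f"implied (v-c)/c:   {fractional_speed_excess(BASELINE_M, EARLY_NS):.2e}")
```

An excess of a few parts in 100,000 may sound tiny, but it is enormous compared with the precision of modern timing, which is why the claim collided so directly with special relativity.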

Understandably, most physicists remain skeptical of the result until further independent experiments either confirm or refute the measurements. However, all seem to agree that if the result is confirmed, it would be a monumental finding that would likely reshape modern physics and our understanding of the universe.

[div class=attrib]More on this intriguing story here courtesy of Ars Technica, which also offers a detailed explanation of several possible sources of error that may have contributed to the faster-than-light measurements.[end-div]

Just Another Week at Fermilab

Another day, another particle, courtesy of scientists at Fermilab. The CDF group, working with data from Fermilab’s Tevatron particle collider, announced the finding of a new, neutron-like particle last week. The particle, known as the neutral Xi-sub-b, is a heavy relative of the neutron and is made up of a strange quark, an up quark and a bottom quark, hence the “s-u-b” moniker.

[div class=attrib]Here’s more from Symmetry Breaking:[end-div]

While its existence was predicted by the Standard Model, the observation of the neutral Xi-sub-b is significant because it strengthens our understanding of how quarks form matter. Fermilab physicist Pat Lukens, a member of the CDF collaboration, presented the discovery at Fermilab on Wednesday, July 20.

The neutral Xi-sub-b is the latest entry in the periodic table of baryons. Baryons are particles formed of three quarks, the most common examples being the proton (two up quarks and a down quark) and the neutron (two down quarks and an up quark). The neutral Xi-sub-b belongs to the family of bottom baryons, which are about six times heavier than the proton and neutron because they all contain a heavy bottom quark. The particles are produced only in high-energy collisions, and are rare and very difficult to observe.
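The quark assignments quoted above can be verified with simple charge arithmetic. A minimal sketch (the helper name is illustrative), using the standard quark charges of +2/3 for up and −1/3 for down, strange and bottom:

```python
from fractions import Fraction

# Electric charges of the relevant quarks, in units of the proton charge.
QUARK_CHARGE = {
    "u": Fraction(2, 3),   # up
    "d": Fraction(-1, 3),  # down
    "s": Fraction(-1, 3),  # strange
    "b": Fraction(-1, 3),  # bottom
}

def baryon_charge(quarks: str) -> Fraction:
    """Total electric charge of a three-quark baryon, e.g. 'uud'."""
    return sum(QUARK_CHARGE[q] for q in quarks)

print(baryon_charge("uud"))  # proton: charge +1
print(baryon_charge("udd"))  # neutron: charge 0
print(baryon_charge("usb"))  # neutral Xi-sub-b: charge 0
```

The u-s-b combination sums to zero, which is why the new baryon, like the neutron, is electrically neutral.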

Although Fermilab’s Tevatron particle collider is not a dedicated bottom quark factory, sophisticated particle detectors and trillions of proton-antiproton collisions have made it a haven for discovering and studying almost all of the known bottom baryons. Experiments at the Tevatron discovered the Sigma-sub-b baryons (Σb and Σb*) in 2006, observed the Xi-b-minus baryon (Ξb−) in 2007, and found the Omega-sub-b (Ωb−) in 2009.

[div class=attrib]Image courtesy of Fermilab/CDF Collaboration.[end-div]

Higgs Particle Collides with Modern Art

Jonathan Jones over at the Guardian puts a creative spin (pun intended) on the latest developments in the world of particle physics. He suggests that we might borrow from the world of modern and contemporary art to help us take the vast imaginative leaps necessary to understand our physical world and its underlying quantum mechanical nature, bound up in uncertainty and paradox.

Jones makes a good point that many leading artists of recent times broke new ground by presenting us with an alternate reality that demanded a fresh perspective of the world and what lies beneath. Think Picasso and Dali and Miro and Twombly.

[div class=attrib]From Jonathan Jones for the Guardian:[end-div]

The experiments currently being performed in the LHC are enigmatic, mind-boggling and imaginative. But are they science – or art? In his renowned television series The Ascent of Man, the polymath Jacob Bronowski called the discovery of the invisible world within the atom the great collective achievement of science in the 20th century. Then he went further. “No – it is a great, collective work of art.”

Niels Bohr, who was at the heart of the new sub-atomic physics in the early 20th century, put the mystery of what he and others were finding into provocative sayings. He was very quotable, and every quote stresses the ambiguity of the new realm he was opening up, the realm of the smallest conceivable things in the universe. “If quantum mechanics hasn’t profoundly shocked you, you haven’t understood it yet,” ran one of his remarks. According to Bronowski, Bohr also said that to think about the paradoxical truths of quantum mechanics is to think in images, because the only way to know anything about the invisible is to create an image of it that is by definition a human construct, a model, a half-truth trying to hint at the real truth.

. . .

We won’t understand what those guys at Cern are up to until our idea of science catches up with the greatest minds of the 20th century who blew apart all previous conventions of thought. One guide offers itself to those of us who are not physicists: modern art. Bohr, explained Bronowski, collected Cubist paintings. Cubism was invented by Pablo Picasso and Georges Braque at the same time modern physics was being created: its crystalline structures and opaque surfaces suggest the astonishment of a reality whose every microcosmic particle is sublimely complex.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / CERN / Creative Commons.[end-div]