Tag Archives: General Relativity theory

Spacetime Without the Time

Since they were first dreamed up, explanations of the very small (quantum mechanics) and the very large (general relativity) have both been highly successful at describing their respective spheres of influence. Yet these two descriptions of our physical universe are not compatible, particularly when it comes to describing gravity. Indeed, physicists and theorists have struggled for decades to unite the two frameworks. Many agree that we need a new theory (of everything).

One new idea, from theorist Erik Verlinde of the University of Amsterdam, proposes that time is an emergent construct (it’s not a fundamental building block) and that dark matter is an illusion.

From Quanta:

Theoretical physicists striving to unify quantum mechanics and general relativity into an all-encompassing theory of quantum gravity face what’s called the “problem of time.”

In quantum mechanics, time is universal and absolute; its steady ticks dictate the evolving entanglements between particles. But in general relativity (Albert Einstein’s theory of gravity), time is relative and dynamical, a dimension that’s inextricably interwoven with directions x, y and z into a four-dimensional “space-time” fabric. The fabric warps under the weight of matter, causing nearby stuff to fall toward it (this is gravity), and slowing the passage of time relative to clocks far away. Or hop in a rocket and use fuel rather than gravity to accelerate through space, and time dilates; you age less than someone who stayed at home.
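To put a number on the rocket example, here is a minimal sketch of the special-relativistic time-dilation factor (my illustration, not Quanta's; the 80-percent-of-light-speed cruise is an arbitrary choice):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_gamma(v: float) -> float:
    """Factor by which a moving clock's time stretches, for speed v in m/s."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

v = 0.8 * C             # illustrative rocket speed
gamma = lorentz_gamma(v)
print(f"gamma = {gamma:.3f}")  # ~1.667
print(f"1 year on the rocket = {gamma:.3f} years back home")
```

At everyday speeds gamma is indistinguishable from 1, which is why the effect went unnoticed until clocks and theory became precise enough to reveal it.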

Unifying quantum mechanics and general relativity requires reconciling their absolute and relative notions of time. Recently, a promising burst of research on quantum gravity has provided an outline of what the reconciliation might look like — as well as insights on the true nature of time.

As I described in an article this week on a new theoretical attempt to explain away dark matter, many leading physicists now consider space-time and gravity to be “emergent” phenomena: Bendy, curvy space-time and the matter within it are a hologram that arises out of a network of entangled qubits (quantum bits of information), much as the three-dimensional environment of a computer game is encoded in the classical bits on a silicon chip. “I think we now understand that space-time really is just a geometrical representation of the entanglement structure of these underlying quantum systems,” said Mark Van Raamsdonk, a theoretical physicist at the University of British Columbia.

Researchers have worked out the math showing how the hologram arises in toy universes that possess a fisheye space-time geometry known as “anti-de Sitter” (AdS) space. In these warped worlds, spatial increments get shorter and shorter as you move out from the center. Eventually, the spatial dimension extending from the center shrinks to nothing, hitting a boundary. The existence of this boundary — which has one fewer spatial dimension than the interior space-time, or “bulk” — aids calculations by providing a rigid stage on which to model the entangled qubits that project the hologram within. “Inside the bulk, time starts bending and curving with the space in dramatic ways,” said Brian Swingle of Harvard and Brandeis universities. “We have an understanding of how to describe that in terms of the ‘sludge’ on the boundary,” he added, referring to the entangled qubits.
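To make the fisheye picture concrete, one common way to write the spatial geometry of such a world is the Poincaré-disk metric (my choice of coordinates; the article does not commit to one):

$$ds^{2} = \frac{4\left(dr^{2} + r^{2}\,d\varphi^{2}\right)}{\left(1 - r^{2}\right)^{2}}$$

Here an infinite, negatively curved interior is packed into the coordinate disk r < 1: a ruler of fixed proper length spans smaller and smaller coordinate increments as it approaches r = 1, and the circle r = 1 itself plays the role of the lower-dimensional boundary on which the entangled qubits are modeled.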

The states of the qubits evolve according to universal time as if executing steps in a computer code, giving rise to warped, relativistic time in the bulk of the AdS space. The only thing is, that’s not quite how it works in our universe.

Here, the space-time fabric has a “de Sitter” geometry, stretching as you look into the distance. The fabric stretches until the universe hits a very different sort of boundary from the one in AdS space: the end of time. At that point, in an event known as “heat death,” space-time will have stretched so much that everything in it will become causally disconnected from everything else, such that no signals can ever again travel between them. The familiar notion of time breaks down. From then on, nothing happens.

On the timeless boundary of our space-time bubble, the entanglements linking together qubits (and encoding the universe’s dynamical interior) would presumably remain intact, since these quantum correlations do not require that signals be sent back and forth. But the state of the qubits must be static and timeless. This line of reasoning suggests that somehow, just as the qubits on the boundary of AdS space give rise to an interior with one extra spatial dimension, qubits on the timeless boundary of de Sitter space must give rise to a universe with time — dynamical time, in particular. Researchers haven’t yet figured out how to do these calculations. “In de Sitter space,” Swingle said, “we don’t have a good idea for how to understand the emergence of time.”

Read the entire article here.

Image: (1 + 1)-dimensional anti-de Sitter space embedded in flat (1 + 2)-dimensional space. The t1- and t2-axes lie in the plane of rotational symmetry, and the x1-axis is normal to that plane. The embedded surface contains closed timelike curves circling the x1 axis, though these can be eliminated by “unrolling” the embedding (more precisely, by taking the universal cover). Courtesy: Krishnavedala. Wikipedia. Creative Commons Attribution-Share Alike 3.0.

A Gravitational Wave Comes Ashore


On February 11, 2016, a historic day for astronomers the world over, scientists announced a monumental discovery that had actually been made on September 14, 2015. Thank you, LIGO: the era of gravitational-wave (G-wave) astronomy has begun.

One hundred years after Einstein’s general theory of relativity predicted them, scientists have their first direct evidence of gravitational waves. These waves are ripples in the fabric of spacetime itself, rather than movements of fields and particles such as electromagnetic radiation. The ripples arise when gravitationally immense bodies warp the structure of space in which they sit, for example through collisions or acceleration.


As you might imagine, for such disturbances to be observed here on Earth over distances of tens to hundreds of millions of light-years requires not only vastly powerful forces at one end but immensely sensitive instruments at the other. The detector credited with the discovery is the Laser Interferometer Gravitational-Wave Observatory, or LIGO. It is so sensitive that it can detect a change in the length of its measurement apparatus (infrared laser beams) 10,000 times smaller than the width of a proton. LIGO is operated by Caltech and MIT and supported by the U.S. National Science Foundation.
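Those superlatives are easy to sanity-check. Here is a minimal back-of-envelope sketch (my arithmetic, not the post's; the 4 km arm length and approximate proton size are standard published figures):

```python
# Rough strain implied by "10,000 times smaller than the width of a proton".
proton_width = 1.7e-15            # metres, approximate proton diameter (assumed)
delta_L = proton_width / 10_000   # ~1.7e-19 m change in length
arm_length = 4_000.0              # metres, LIGO interferometer arm (published figure)

strain = delta_L / arm_length     # dimensionless strain h = dL / L
print(f"delta L ~ {delta_L:.1e} m")
print(f"strain h ~ {strain:.1e}")  # ~4e-23
```

A strain of a few parts in 10^23 is why only cataclysms such as black-hole mergers produce signals LIGO can see.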

Prof Kip Thorne, one of the founders of LIGO, said that until now astronomers had looked at the universe as if on a calm sea. That has now changed. He adds:

“The colliding black holes that produced these gravitational waves created a violent storm in the fabric of space and time, a storm in which time speeded up and slowed down, and speeded up again, a storm in which the shape of space was bent in this way and that way.”

And, as Prof Stephen Hawking remarked:

“Gravitational waves provide a completely new way of looking at the universe. The ability to detect them has the potential to revolutionise astronomy. This discovery is the first detection of a black hole binary system and the first observation of black holes merging.”

Congratulations to the many hundreds of engineers, technicians, researchers and theoreticians who have collaborated on this ground-breaking experiment. Particular congratulations go to LIGO’s three principal instigators: Rainer Weiss, Kip Thorne, and Ronald Drever.

This discovery paves the way for deeper understanding of our cosmos and lays the foundation for a new and rich form of astronomy through gravitational observations.

Galileo’s first telescopes opened our eyes to the visual splendor of our solar system and its immediate neighborhood. More recently, radio-wave, x-ray and gamma-ray astronomy have allowed us to discover wonders further afield: star-forming nebulae, neutron stars, black holes, active galactic nuclei, the Cosmic Microwave Background (CMB). Now, through LIGO and its increasingly sensitive descendants we are likely to make even more breathtaking discoveries, some of which, courtesy of gravitational waves, may let us peer at the very origin of the universe itself — the Big Bang.

How brilliant is that!

Image 1: The historic detection of gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO), shown in a plot presented during a press conference in Washington, D.C. on Feb. 11, 2016. Courtesy: National Science Foundation.

Image 2: LIGO Laboratory operates two detector sites 1,800 miles apart: one near Hanford in eastern Washington, and another near Livingston, Louisiana. This photo shows the Hanford detector. Courtesy of LIGO Caltech.


When 8 Equals 16


I’m sure that most, if not all, mathematicians would tell you that their calling is at the heart of our understanding of the universe. Mathematics describes our world precisely and logically. But mix it with the world of women’s fashion and this rigorous discipline becomes rather squishy, and far from absolute. A case in point: a women’s size 8 today is equivalent to a women’s size 16 from 1958.

This makes me wonder what the fundamental measurements and equations describing our universe would look like if they were controlled by advertisers and marketers. Though Einstein’s work on Special and General Relativity may seem to fit the fashion industry quite well: one of the central tenets of relativity holds that measurements of various quantities (read: dress size) are relative to the velocities (market size) of observers (retailers). In particular, space (dress size) contracts and time (waist size) dilates.

From the Washington Post:

Here are some numbers that illustrate the insanity of women’s clothing sizes: A size 8 dress today is nearly the equivalent of a size 16 dress in 1958. And a size 8 dress of 1958 doesn’t even have a modern-day equivalent — the waist and bust measurements of a Mad Men-era 8 come in smaller than today’s size 00.

These measurements come from official sizing standards once maintained by the National Bureau of Standards (now the National Institute of Standards and Technology) and taken over in recent years by the American Society for Testing and Materials. Data visualizer Max Galka recently unearthed them for a blog post on America’s obesity epidemic.

Centers for Disease Control and Prevention data show that the average American woman today weighs about as much as the average 1960s man. And while the weight story is pretty straightforward — Americans got heavier — the story behind the dress sizes is a little more complicated, as any woman who’s ever shopped for clothes could probably tell you.

As Julia Felsenthal detailed over at Slate, today’s women’s clothing sizes have their roots in a depression-era government project to define the “Average American Woman” by sending a pair of statisticians to survey and measure nearly 15,000 women. They “hoped to determine whether any proportional relationships existed among measurements that could be broadly applied to create a simple, standardized system of sizing,” Felsenthal writes.

Sadly, they failed. Not surprisingly, women’s bodies defied standardization. The project did yield one lasting contribution to women’s clothing: The statisticians were the first to propose the notion of arbitrary numerical sizes that weren’t based on any specific measurement — similar to shoe sizes.

The government didn’t return to the project until the late 1950s, when the National Bureau of Standards published “Body Measurements for the Sizing of Women’s Patterns and Apparel” in 1958. The standard was based on the 15,000 women interviewed previously, with the addition of a group of women who had been in the Army during World War II. The document’s purpose? “To provide the consumer with a means of identifying her body type and size from the wide range of body types covered, and enable her to be fitted properly by the same size regardless of price, type of apparel, or manufacturer of the garment.”

Read the entire article here.

Image: Diagram from “Body Measurements for the Sizing of Women’s Patterns and Apparel”, 1958. Courtesy of the National Bureau of Standards / National Institute of Standards and Technology (NIST).

Universal Amniotic Fluid

Another day, another physics paper describing the origin of the universe. This is no wonder. Since the development of general relativity and quantum mechanics — two mutually incompatible descriptions of our reality — theoreticians have been scurrying to come up with a grand theory, a rapprochement of sorts. This one describes the universe as a quantum fluid, perhaps made up of hypothesized gravitons.

From Nature Asia:

The prevailing model of cosmology, based on Einstein’s theory of general relativity, puts the universe at around 13.8 billion years old and suggests it originated from a “singularity” – an infinitely small and dense point – at the Big Bang.

To understand what happened inside that tiny singularity, physicists must marry general relativity with quantum mechanics – the laws that govern small objects. Applying both of these disciplines has challenged physicists for decades. “The Big Bang singularity is the most serious problem of general relativity, because the laws of physics appear to break down there,” says Ahmed Farag Ali, a physicist at Zewail City of Science and Technology, Egypt.

In an effort to bring together the laws of quantum mechanics and general relativity, and to solve the singularity puzzle, Ali and Saurya Das, a physicist at the University of Lethbridge in Alberta, Canada, employed an equation that predicts the development of singularities in general relativity. That equation had been developed by Das’s former professor, Amal Kumar Raychaudhuri, when Das was an undergraduate student at Presidency University, in Kolkata, India, so Das was particularly familiar with it, and fascinated by it.

When Ali and Das made small quantum corrections to the Raychaudhuri equation, they realised it described a fluid, made up of small particles, that pervades space. Physicists have long believed that a quantum version of gravity would include a hypothetical particle, called the graviton, which generates the force of gravity. In their new model — which will appear in Physics Letters B in February — Ali and Das propose that such gravitons could form this fluid.
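For reference, the classical equation they perturbed can be stated compactly. For a congruence of timelike geodesics with expansion θ, shear σ and rotation ω, the Raychaudhuri equation reads

$$\frac{d\theta}{d\lambda} = -\frac{1}{3}\theta^{2} - \sigma_{ab}\sigma^{ab} + \omega_{ab}\omega^{ab} - R_{ab}u^{a}u^{b}$$

For irrotational flows of ordinary matter the right-hand side is negative, driving θ toward minus infinity: the focusing that produces singularities in classical general relativity. Ali and Das's quantum corrections (not reproduced here) add terms that prevent this focusing from completing.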

To understand the origin of the universe, they used this corrected equation to trace the behaviour of the fluid back through time. Surprisingly, they found that it did not converge into a singularity. Instead, the universe appears to have existed forever. Although it was smaller in the past, it never quite crunched down to nothing, says Das.

“Our theory serves to complement Einstein’s general relativity, which is very successful at describing physics over large distances,” says Ali. “But physicists know that to describe short distances, quantum mechanics must be accommodated, and the quantum Raychaudhuri equation is a big step towards that.”

The model could also help solve two other cosmic mysteries. In the late 1990s, astronomers discovered that the expansion of the universe is accelerating due to the presence of a mysterious dark energy, the origin of which is not known. The model has the potential to explain it, since the fluid creates a minor but constant outward force that expands space. “This is a happy offshoot of our work,” says Das.

Astronomers also now know that most matter in the universe is in an invisible mysterious form called dark matter, only perceptible through its gravitational effect on visible matter such as stars. When Das and a colleague set the mass of the graviton in the model to a small level, they could make the density of their fluid match the universe’s observed density of dark matter, while also providing the right value for dark energy’s push.

Read the entire article here.


Questioning Quantum Orthodoxy

Physics works very well in explaining our world, yet it is also broken: it cannot, at the moment, reconcile our views of the very small (quantum theory) with those of the very large (relativity theory).

So although the probabilistic underpinnings of quantum theory have done wonders in allowing physicists to construct the Standard Model, gaps remain.

Back in the mid-1920s, the probabilistic worldview proposed by Niels Bohr and others gained favor and took hold. A competing theory, the pilot-wave theory proposed by a young Louis de Broglie, was given short shrift. Yet some theorists have maintained that it may do a better job of closing this core gap in our understanding, so it is time to revisit pilot-wave theory and breathe fresh life into it.

From Wired / Quanta:

For nearly a century, “reality” has been a murky concept. The laws of quantum physics seem to suggest that particles spend much of their time in a ghostly state, lacking even basic properties such as a definite location and instead existing everywhere and nowhere at once. Only when a particle is measured does it suddenly materialize, appearing to pick its position as if by a roll of the dice.

This idea that nature is inherently probabilistic — that particles have no hard properties, only likelihoods, until they are observed — is directly implied by the standard equations of quantum mechanics. But now a set of surprising experiments with fluids has revived old skepticism about that worldview. The bizarre results are fueling interest in an almost forgotten version of quantum mechanics, one that never gave up the idea of a single, concrete reality.

The experiments involve an oil droplet that bounces along the surface of a liquid. The droplet gently sloshes the liquid with every bounce. At the same time, ripples from past bounces affect its course. The droplet’s interaction with its own ripples, which form what’s known as a pilot wave, causes it to exhibit behaviors previously thought to be peculiar to elementary particles — including behaviors seen as evidence that these particles are spread through space like waves, without any specific location, until they are measured.

Particles at the quantum scale seem to do things that human-scale objects do not do. They can tunnel through barriers, spontaneously arise or annihilate, and occupy discrete energy levels. This new body of research reveals that oil droplets, when guided by pilot waves, also exhibit these quantum-like features.

To some researchers, the experiments suggest that quantum objects are as definite as droplets, and that they too are guided by pilot waves — in this case, fluid-like undulations in space and time. These arguments have injected new life into a deterministic (as opposed to probabilistic) theory of the microscopic world first proposed, and rejected, at the birth of quantum mechanics.

“This is a classical system that exhibits behavior that people previously thought was exclusive to the quantum realm, and we can say why,” said John Bush, a professor of applied mathematics at the Massachusetts Institute of Technology who has led several recent bouncing-droplet experiments. “The more things we understand and can provide a physical rationale for, the more difficult it will be to defend the ‘quantum mechanics is magic’ perspective.”

Magical Measurements

The orthodox view of quantum mechanics, known as the “Copenhagen interpretation” after the home city of Danish physicist Niels Bohr, one of its architects, holds that particles play out all possible realities simultaneously. Each particle is represented by a “probability wave” weighting these various possibilities, and the wave collapses to a definite state only when the particle is measured. The equations of quantum mechanics do not address how a particle’s properties solidify at the moment of measurement, or how, at such moments, reality picks which form to take. But the calculations work. As Seth Lloyd, a quantum physicist at MIT, put it, “Quantum mechanics is just counterintuitive and we just have to suck it up.”

A classic experiment in quantum mechanics that seems to demonstrate the probabilistic nature of reality involves a beam of particles (such as electrons) propelled one by one toward a pair of slits in a screen. When no one keeps track of each electron’s trajectory, it seems to pass through both slits simultaneously. In time, the electron beam creates a wavelike interference pattern of bright and dark stripes on the other side of the screen. But when a detector is placed in front of one of the slits, its measurement causes the particles to lose their wavelike omnipresence, collapse into definite states, and travel through one slit or the other. The interference pattern vanishes. The great 20th-century physicist Richard Feynman said that this double-slit experiment “has in it the heart of quantum mechanics,” and “is impossible, absolutely impossible, to explain in any classical way.”
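The stripe pattern itself is simple to generate. Below is a minimal sketch of the idealized far-field two-slit intensity (my illustration; the slit spacing, wavelength and angles are arbitrary values chosen to make the fringes visible):

```python
import math

wavelength = 5e-8      # metres, illustrative de Broglie wavelength (assumed)
slit_spacing = 2e-6    # metres, illustrative slit separation (assumed)

def intensity(theta: float) -> float:
    """Ideal two-slit interference intensity (peak normalized to 1) at angle theta."""
    phase = math.pi * slit_spacing * math.sin(theta) / wavelength
    return math.cos(phase) ** 2

# Crude ASCII rendering of the bright and dark stripes across the screen.
for i in range(-20, 21):
    theta = i * 2.5e-3  # radians from the central axis
    print(f"{theta:+.4f} rad  " + "#" * int(40 * intensity(theta)))
```

Block the wave from one slit (the analogue of measuring which slit the particle used) and the cosine-squared term disappears, taking the stripes with it.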

Some physicists now disagree. “Quantum mechanics is very successful; nobody’s claiming that it’s wrong,” said Paul Milewski, a professor of mathematics at the University of Bath in England who has devised computer models of bouncing-droplet dynamics. “What we believe is that there may be, in fact, some more fundamental reason why [quantum mechanics] looks the way it does.”

Riding Waves

The idea that pilot waves might explain the peculiarities of particles dates back to the early days of quantum mechanics. The French physicist Louis de Broglie presented the earliest version of pilot-wave theory at the 1927 Solvay Conference in Brussels, a famous gathering of the founders of the field. As de Broglie explained that day to Bohr, Albert Einstein, Erwin Schrödinger, Werner Heisenberg and two dozen other celebrated physicists, pilot-wave theory made all the same predictions as the probabilistic formulation of quantum mechanics (which wouldn’t be referred to as the “Copenhagen” interpretation until the 1950s), but without the ghostliness or mysterious collapse.

The probabilistic version, championed by Bohr, involves a single equation that represents likely and unlikely locations of particles as peaks and troughs of a wave. Bohr interpreted this probability-wave equation as a complete definition of the particle. But de Broglie urged his colleagues to use two equations: one describing a real, physical wave, and another tying the trajectory of an actual, concrete particle to the variables in that wave equation, as if the particle interacts with and is propelled by the wave rather than being defined by it.

For example, consider the double-slit experiment. In de Broglie’s pilot-wave picture, each electron passes through just one of the two slits, but is influenced by a pilot wave that splits and travels through both slits. Like flotsam in a current, the particle is drawn to the places where the two wavefronts cooperate, and does not go where they cancel out.

De Broglie could not predict the exact place where an individual particle would end up — just like Bohr’s version of events, pilot-wave theory predicts only the statistical distribution of outcomes, or the bright and dark stripes — but the two men interpreted this shortcoming differently. Bohr claimed that particles don’t have definite trajectories; de Broglie argued that they do, but that we can’t measure each particle’s initial position well enough to deduce its exact path.

In principle, however, the pilot-wave theory is deterministic: The future evolves dynamically from the past, so that, if the exact state of all the particles in the universe were known at a given instant, their states at all future times could be calculated.
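That determinism has a compact mathematical core. In de Broglie's formulation for a single spinless particle (a level of detail that goes beyond this excerpt), the wave function is written in polar form as ψ = R e^{iS/ħ}; ψ evolves by the ordinary Schrödinger equation, while the particle's velocity is dictated by the gradient of the phase:

$$\mathbf{v} = \frac{\nabla S}{m}$$

Fix every particle's initial position exactly and the trajectories follow uniquely; probability re-enters only through our practical ignorance of those initial positions.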

At the Solvay conference, Einstein objected to a probabilistic universe, quipping, “God does not play dice,” but he seemed ambivalent about de Broglie’s alternative. Bohr told Einstein to “stop telling God what to do,” and (for reasons that remain in dispute) he won the day. By 1932, when the Hungarian-American mathematician John von Neumann claimed to have proven that the probabilistic wave equation in quantum mechanics could have no “hidden variables” (that is, missing components, such as de Broglie’s particle with its well-defined trajectory), pilot-wave theory was so poorly regarded that most physicists believed von Neumann’s proof without even reading a translation.

More than 30 years would pass before von Neumann’s proof was shown to be false, but by then the damage was done. The physicist David Bohm resurrected pilot-wave theory in a modified form in 1952, with Einstein’s encouragement, and made clear that it did work, but it never caught on. (The theory is also known as de Broglie-Bohm theory, or Bohmian mechanics.)

Later, the Northern Irish physicist John Stewart Bell went on to prove a seminal theorem that many physicists today misinterpret as rendering hidden variables impossible. But Bell supported pilot-wave theory. He was the one who pointed out the flaws in von Neumann’s original proof. And in 1986 he wrote that pilot-wave theory “seems to me so natural and simple, to resolve the wave-particle dilemma in such a clear and ordinary way, that it is a great mystery to me that it was so generally ignored.”

The neglect continues. A century down the line, the standard, probabilistic formulation of quantum mechanics has been combined with Einstein’s theory of special relativity and developed into the Standard Model, an elaborate and precise description of most of the particles and forces in the universe. Acclimating to the weirdness of quantum mechanics has become a physicists’ rite of passage. The old, deterministic alternative is not mentioned in most textbooks; most people in the field haven’t heard of it. Sheldon Goldstein, a professor of mathematics, physics and philosophy at Rutgers University and a supporter of pilot-wave theory, blames the “preposterous” neglect of the theory on “decades of indoctrination.” At this stage, Goldstein and several others noted, researchers risk their careers by questioning quantum orthodoxy.

A Quantum Drop

Now at last, pilot-wave theory may be experiencing a minor comeback — at least, among fluid dynamicists. “I wish that the people who were developing quantum mechanics at the beginning of last century had access to these experiments,” Milewski said. “Because then the whole history of quantum mechanics might be different.”

The experiments began a decade ago, when Yves Couder and colleagues at Paris Diderot University discovered that vibrating a silicone oil bath up and down at a particular frequency can induce a droplet to bounce along the surface. The droplet’s path, they found, was guided by the slanted contours of the liquid’s surface generated from the droplet’s own bounces — a mutual particle-wave interaction analogous to de Broglie’s pilot-wave concept.

Read the entire article here.

Image: Louis de Broglie. Courtesy of Wikipedia.

God Is a Thermodynamicist

Physicists and cosmologists are constantly postulating and testing new ideas to explain the universe and everything within it. Over the last hundred years or so, two such ideas have grown to explain much about our cosmos, and do so very successfully: quantum mechanics, which describes the very small, and relativity, which describes the very large. However, these two views do not reconcile, leaving theoreticians and researchers looking for a more fundamental theory of everything. One possible idea banishes the notions of time and gravity, treating them both as emergent properties of a deeper reality.

From New Scientist:

As revolutions go, its origins were haphazard. It was, according to the ringleader Max Planck, an “act of desperation”. In 1900, he proposed the idea that energy comes in discrete chunks, or quanta, simply because the smooth delineations of classical physics could not explain the spectrum of energy re-radiated by an absorbing body.

Yet rarely was a revolution so absolute. Within a decade or so, the cast-iron laws that had underpinned physics since Newton’s day were swept away. Classical certainty ceded its stewardship of reality to the probabilistic rule of quantum mechanics, even as the parallel revolution of Einstein’s relativity displaced our cherished, absolute notions of space and time. This was complete regime change.

Except for one thing. A single relict of the old order remained, one that neither Planck nor Einstein nor any of their contemporaries had the will or means to remove. The British astrophysicist Arthur Eddington summed up the situation in 1928. “If your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation,” he wrote.

In this essay, I will explore the fascinating question of why, since their origins in the early 19th century, the laws of thermodynamics have proved so formidably robust. The journey traces the deep connections that were discovered in the 20th century between thermodynamics and information theory – connections that allow us to trace intimate links between thermodynamics and not only quantum theory but also, more speculatively, relativity. Ultimately, I will argue, those links show us how thermodynamics in the 21st century can guide us towards a theory that will supersede them both.

In its origins, thermodynamics is a theory about heat: how it flows and what it can be made to do. The French engineer Sadi Carnot formulated the second law in 1824 to characterise the mundane fact that the steam engines then powering the industrial revolution could never be perfectly efficient. Some of the heat you pumped into them always flowed into the cooler environment, rather than staying in the engine to do useful work. That is an expression of a more general rule: unless you do something to stop it, heat will naturally flow from hotter places to cooler places to even up any temperature differences it finds. The same principle explains why keeping the refrigerator in your kitchen cold means pumping energy into it; only that will keep warmth from the surroundings at bay.
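Carnot's bound has a one-line quantitative form: an engine running between a hot source at temperature T_hot and a cold environment at T_cold can convert at most a fraction 1 - T_cold/T_hot of its input heat into work. A minimal sketch with illustrative temperatures (my numbers, not the essay's):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of input heat convertible into work (temperatures in kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# Illustrative steam-engine conditions: boiler at 150 C, surroundings at 20 C.
eta = carnot_efficiency(t_hot_k=423.15, t_cold_k=293.15)
print(f"Maximum possible efficiency: {eta:.1%}")  # ~30.7%; the rest is waste heat
```

No cleverness of engineering can beat that ceiling; only raising the boiler temperature or chilling the environment moves it.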

A few decades after Carnot, the German physicist Rudolf Clausius explained such phenomena in terms of a quantity characterising disorder that he called entropy. In this picture, the universe works on the back of processes that increase entropy – for example dissipating heat from places where it is concentrated, and therefore more ordered, to cooler areas, where it is not.

That predicts a grim fate for the universe itself. Once all heat is maximally dissipated, no useful process can happen in it any more: it dies a “heat death”. A perplexing question is raised at the other end of cosmic history, too. If nature always favours states of high entropy, how and why did the universe start in a state that seems to have been of comparatively low entropy? At present we have no answer, and later I will mention an intriguing alternative view.

Perhaps because of such undesirable consequences, the legitimacy of the second law was for a long time questioned. The charge was formulated with the most striking clarity by the British physicist James Clerk Maxwell in 1867. He was satisfied that inanimate matter presented no difficulty for the second law. In an isolated system, heat always passes from the hotter to the cooler, and a neat clump of dye molecules readily dissolves in water and disperses randomly, never the other way round. Disorder as embodied by entropy does always increase.

Maxwell’s problem was with life. Living things have “intentionality”: they deliberately do things to other things to make life easier for themselves. Conceivably, they might try to reduce the entropy of their surroundings and thereby violate the second law.

Information is power

Such a possibility is highly disturbing to physicists. Either something is a universal law or it is merely a cover for something deeper. Yet it was only in the late 1970s that Maxwell’s entropy-fiddling “demon” was laid to rest. Its slayer was the US physicist Charles Bennett, who built on work by his colleague at IBM, Rolf Landauer, using the theory of information developed a few decades earlier by Claude Shannon. An intelligent being can certainly rearrange things to lower the entropy of its environment. But to do this, it must first fill up its memory, gaining information as to how things are arranged in the first place.

This acquired information must be encoded somewhere, presumably in the demon’s memory. When this memory is finally full, or the being dies or otherwise expires, it must be reset. Dumping all this stored, ordered information back into the environment increases entropy – and this entropy increase, Bennett showed, will ultimately always be at least as large as the entropy reduction the demon originally achieved. Thus the status of the second law was assured, albeit anchored in a mantra of Landauer’s that would have been unintelligible to the 19th-century progenitors of thermodynamics: that “information is physical”.
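Landauer's mantra comes with a number attached: erasing one bit of information must dissipate at least kT ln 2 of heat into an environment at temperature T. A quick sketch of the scale involved (my arithmetic, not the essay's):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # roughly room temperature, K (illustrative)

landauer_limit = k_B * T * math.log(2)   # minimum heat per erased bit
print(f"Minimum heat per erased bit: {landauer_limit:.2e} J")  # ~2.9e-21 J

# For scale: wiping a gigabyte-sized demon memory in one go.
bits = 8e9
print(f"Erasing 8e9 bits: at least {bits * landauer_limit:.2e} J")
```

The numbers are tiny by everyday standards, but they are strictly positive, and that is all the second law needs.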

But how does this explain that thermodynamics survived the quantum revolution? Classical objects behave very differently to quantum ones, so the same is presumably true of classical and quantum information. After all, quantum computers are notoriously more powerful than classical ones (or would be if realised on a large scale).

The reason is subtle, and it lies in a connection between entropy and probability contained in perhaps the most profound and beautiful formula in all of science. Engraved on the tomb of the Austrian physicist Ludwig Boltzmann in Vienna’s central cemetery, it reads simply S = k log W. Here S is entropy – the macroscopic, measurable entropy of a gas, for example – while k is a constant of nature that today bears Boltzmann’s name. Log W is the mathematical logarithm of a microscopic, probabilistic quantity W – in a gas, this would be the number of ways the positions and velocities of its many individual atoms can be arranged.

On a philosophical level, Boltzmann’s formula embodies the spirit of reductionism: the idea that we can, at least in principle, reduce our outward knowledge of a system’s activities to basic, microscopic physical laws. On a practical, physical level, it tells us that all we need to understand disorder and its increase is probabilities. Tot up the number of configurations the atoms of a system can be in and work out their probabilities, and what emerges is nothing other than the entropy that determines its thermodynamical behaviour. The equation asks no further questions about the nature of the underlying laws; we need not care if the dynamical processes that create the probabilities are classical or quantum in origin.
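A toy example makes Boltzmann's formula tangible (my illustration, not the essay's): take 100 coins and compare the macrostate "all heads", which has exactly one microstate, with "half heads", which has an enormous number of them.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(microstates: int) -> float:
    """S = k log W for a macrostate realised by `microstates` arrangements."""
    return k_B * math.log(microstates)

W_all_heads = 1                    # only one way to have every coin heads-up
W_half_heads = math.comb(100, 50)  # ~1.01e29 ways to have exactly 50 heads
print(f"S(all heads)  = {boltzmann_entropy(W_all_heads):.2e} J/K")   # exactly 0
print(f"S(half heads) = {boltzmann_entropy(W_half_heads):.2e} J/K")  # ~9.2e-22 J/K
```

Shake the tray and it drifts toward half heads for no deeper reason than that there are overwhelmingly more ways to be there, which is the entropy increase of the second law in miniature.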

There is an important additional point to be made here. Probabilities are fundamentally different things in classical and quantum physics. In classical physics they are “subjective” quantities that constantly change as our state of knowledge changes. The probability that a coin toss will result in heads or tails, for instance, jumps from ½ to 1 when we observe the outcome. If there were a being who knew all the positions and momenta of all the particles in the universe – known as a “Laplace demon”, after the French mathematician Pierre-Simon Laplace, who first countenanced the possibility – it would be able to determine the course of all subsequent events in a classical universe, and would have no need for probabilities to describe them.

In quantum physics, however, probabilities arise from a genuine uncertainty about how the world works. States of physical systems in quantum theory are represented in what the quantum pioneer Erwin Schrödinger called catalogues of information, but they are catalogues in which adding information on one page blurs or scrubs it out on another. Knowing the position of a particle more precisely means knowing less well how it is moving, for example. Quantum probabilities are “objective”, in the sense that they cannot be entirely removed by gaining more information.

That casts in an intriguing light thermodynamics as originally, classically formulated. There, the second law is little more than impotence written down in the form of an equation. It has no deep physical origin itself, but is an empirical bolt-on to express the otherwise unaccountable fact that we cannot know, predict or bring about everything that might happen, as classical dynamical laws suggest we can. But this changes as soon as you bring quantum physics into the picture, with its attendant notion that uncertainty is seemingly hardwired into the fabric of reality. Rooted in probabilities, entropy and thermodynamics acquire a new, more fundamental physical anchor.

It is worth pointing out, too, that this deep-rooted connection seems to be much more general. Recently, together with my colleagues Markus Müller of the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada, and Oscar Dahlsten at the Centre for Quantum Technologies in Singapore, I have looked at what happens to thermodynamical relations in a generalised class of probabilistic theories that embrace quantum theory and much more besides. There too, the crucial relationship between information and disorder, as quantified by entropy, survives (arxiv.org/abs/1107.6029).

One theory to rule them all

As for gravity – the only one of nature’s four fundamental forces not covered by quantum theory – a more speculative body of research suggests it might be little more than entropy in disguise (see “Falling into disorder”). If so, that would also bring Einstein’s general theory of relativity, with which we currently describe gravity, firmly within the purview of thermodynamics.

Take all this together, and we begin to have a hint of what makes thermodynamics so successful. The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe – among other things, to construct theories to further our understanding of it. Thermodynamics is, in Einstein’s term, a “meta-theory”: one constructed from principles over and above the structure of any dynamical laws we devise to describe reality’s workings. In that sense we can argue that it is more fundamental than either quantum physics or general relativity.

If we can accept this and, like Eddington and his ilk, put all our trust in the laws of thermodynamics, I believe it may even afford us a glimpse beyond the current physical order. It seems unlikely that quantum physics and relativity represent the last revolutions in physics. New evidence could at any time foment their overthrow. Thermodynamics might help us discern what any usurping theory would look like.

For example, earlier this year, two of my colleagues in Singapore, Esther Hänggi and Stephanie Wehner, showed that a violation of the quantum uncertainty principle – the idea that you can never fully get rid of probabilities in a quantum context – would imply a violation of the second law of thermodynamics. Beating the uncertainty limit means extracting extra information about the system, which requires the system to do more work than thermodynamics allows it to do in the relevant state of disorder. So if thermodynamics is any guide, whatever any post-quantum world might look like, we are stuck with a degree of uncertainty (arxiv.org/abs/1205.6894).

My colleague at the University of Oxford, the physicist David Deutsch, thinks we should take things much further. Not only should any future physics conform to thermodynamics, but the whole of physics should be constructed in its image. The idea is to generalise the logic of the second law as it was stringently formulated by the mathematician Constantin Carathéodory in 1909: that in the vicinity of any state of a physical system, there are other states that cannot physically be reached if we forbid any exchange of heat with the environment.

James Joule’s 19th-century experiments with beer can be used to illustrate this idea. The English brewer, whose name lives on in the standard unit of energy, sealed beer in a thermally isolated tub containing a paddle wheel that was connected to weights falling under gravity outside. The wheel’s rotation warmed the beer, increasing the disorder of its molecules and therefore its entropy. But hard as we might try, we simply cannot use Joule’s set-up to decrease the beer’s temperature, even by a fraction of a millikelvin. Cooler beer is, in this instance, a state regrettably beyond the reach of physics.
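The bookkeeping in Joule's set-up is elementary mechanics: the potential energy m g h surrendered by the falling weights ends up as heat in the liquid, raising its temperature by that energy divided by the liquid's heat capacity. A sketch with invented numbers (Joule's actual apparatus used different values):

```python
g = 9.81          # gravitational acceleration, m/s^2
m_weight = 10.0   # kg of falling weight (illustrative)
drop = 2.0        # metres of descent (illustrative)
m_liquid = 5.0    # kg of liquid in the tub (illustrative)
c_liquid = 4186.0 # J/(kg K); water's specific heat as a stand-in for beer

work_in = m_weight * g * drop              # mechanical work stirred into the tub
delta_T = work_in / (m_liquid * c_liquid)  # resulting temperature rise
print(f"Work done on the liquid: {work_in:.0f} J")
print(f"Temperature rise: {delta_T * 1e3:.1f} millikelvin")  # ~9.4 mK
```

Run the numbers in reverse and nothing forbids them arithmetically; it is only the second law, in Carathéodory's formulation, that declares the cooled-beer state unreachable.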

God, the thermodynamicist

The question is whether we can express the whole of physics simply by enumerating possible and impossible processes in a given situation. This is very different from how physics is usually phrased, in both the classical and quantum regimes, in terms of states of systems and equations that describe how those states change in time. The blind alleys down which the standard approach can lead are easiest to understand in classical physics, where the dynamical equations we derive allow a whole host of processes that patently do not occur – the ones we have to conjure up the laws of thermodynamics expressly to forbid, such as dye molecules reclumping spontaneously in water.

By reversing the logic, our observations of the natural world can again take the lead in deriving our theories. We observe the prohibitions that nature puts in place, be it on decreasing entropy, getting energy from nothing, travelling faster than light or whatever. The ultimately “correct” theory of physics – the logically tightest – is the one from which the smallest deviation gives us something that breaks those taboos.

There are other advantages in recasting physics in such terms. Time is a perennially problematic concept in physical theories. In quantum theory, for example, it enters as an extraneous parameter of unclear origin that cannot itself be quantised. In thermodynamics, meanwhile, the passage of time is entropy increase by any other name. A process such as dissolved dye molecules forming themselves into a clump offends our sensibilities because it appears to amount to running time backwards as much as anything else, although the real objection is that it decreases entropy.

Apply this logic more generally, and time ceases to exist as an independent, fundamental entity, becoming instead one whose flow is determined purely in terms of allowed and disallowed processes. With it go problems such as the one I alluded to earlier, of why the universe started in a state of low entropy. If states and their dynamical evolution over time cease to be the question, then anything that does not break any transformational rules becomes a valid answer.

Such an approach would probably please Einstein, who once said: “What really interests me is whether God had any choice in the creation of the world.” A thermodynamically inspired formulation of physics might not answer that question directly, but leaves God with no choice but to be a thermodynamicist. That would be a singular accolade for those 19th-century masters of steam: that they stumbled upon the essence of the universe, entirely by accident. The triumph of thermodynamics would then be a revolution by stealth, 200 years in the making.

Read the entire article here.

General Relativity Lives on For Now

Since Einstein first published his elegant theory of general relativity almost 100 years ago, it has proved to be one of the most powerful and enduring cornerstones of modern science. Yet theorists and researchers the world over know that it cannot possibly remain the sole answer to our cosmological questions. It answers questions about the very, very large: galaxies, stars and planets, and the gravitational relationships between them. But it fails to tackle the science of the very, very small: atoms, their constituents, and the forces that unite and repel them, which is addressed by the equally elegant and complex, but mutually incompatible, quantum theory.

So scientists continue to push their measurements to ever greater levels of precision, across both larger and smaller distances, with one aim in mind: to test the limits of each theory and to see which one breaks down first.

A recent experiment, highly precise and conducted over a very long distance, confirmed that Einstein’s theory still rules the heavens.

From ars technica:

The general theory of relativity is a remarkably successful model for gravity. However, many of the best tests for it don’t push its limits: they measure phenomena where gravity is relatively weak. Some alternative theories predict different behavior in areas subject to very strong gravity, like near the surface of a pulsar—the compact, rapidly rotating remnant of a massive star (also called a neutron star). For that reason, astronomers are very interested in finding a pulsar paired with another high-mass object. One such system has now provided an especially sensitive test of strong gravity.

The system is a binary consisting of a high-mass pulsar and a bright white dwarf locked in mutual orbit with a period of about 2.5 hours. Using optical and radio observations, John Antoniadis and colleagues measured its properties as it spirals toward merger by emitting gravitational radiation. After monitoring the system for a number of orbits, the researchers determined its behavior is in complete agreement with general relativity to a high level of precision.

The binary system was first detected in a survey of pulsars by the Green Bank Telescope (GBT). The pulsar in the system, memorably labeled PSR J0348+0432, emits radio pulses about once every 39 milliseconds (0.039 seconds). Fluctuations in the pulsar’s output indicated that it is in a binary system, though its companion lacked radio emissions. However, the GBT’s measurements were precise enough to pinpoint its location in the sky, which enabled the researchers to find the system in the archives of the Sloan Digital Sky Survey (SDSS). They determined the companion object was a particularly bright white dwarf, the remnant of the core of a star similar to our Sun. It and the pulsar are locked in a mutual orbit about 2.46 hours in length.

Following up with the Very Large Telescope (VLT) in Chile, the astronomers built up enough data to model the system. Pulsars are extremely dense, packing a star’s worth of mass into a sphere roughly 10 kilometers in radius—far too small to see directly. White dwarfs are less extreme, but they still involve stellar masses in a volume roughly equivalent to Earth’s. That means the objects in the PSR J0348+0432 system can orbit much closer to each other than stars could—as little as 0.5 percent of the average Earth-Sun separation, or 1.2 times the Sun’s radius.

The pulsar itself was interesting because of its relatively high mass: about 2.0 times that of the Sun (most observed pulsars are about 1.4 solar masses). Unlike more mundane objects, pulsar size doesn’t grow with mass; according to some models, a higher mass pulsar may actually be smaller than one with lower mass. As a result, the gravity at the surface of PSR J0348+0432 is far more intense than at a lower-mass counterpart, providing a laboratory for testing general relativity (GR). The gravitational intensity near PSR J0348+0432 is about twice that of other pulsars in binary systems, creating a more extreme environment than previously measured.

According to GR, a binary emits gravitational waves that carry energy away from the system, causing the size of the orbit to shrink. For most binaries, the effect is small, but for compact systems like the one containing PSR J0348+0432, it is measurable. The first such system was found by Russell Hulse and Joseph Taylor; its discovery won the two astronomers the Nobel Prize.

The shrinking of the orbit results in a decrease in the orbital period as the two objects revolve around each other more quickly. In this case, the researchers measured the effect by studying the change in the spectrum of light emitted by the white dwarf, as well as fluctuations in the emissions from the pulsar. (This study also helped demonstrate the two objects were in mutual orbit, rather than being coincidentally in the same part of the sky.)
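The GR prediction at stake can be written in closed form. For a circular orbit, the orbital period P_b decays at the quadrupole-formula (Peters 1964) rate; the sketch below plugs in approximate values for this system (the ~2.0-solar-mass pulsar and 2.46-hour period appear above; the ~0.17-solar-mass white dwarf is taken from the published analysis and is an assumption here):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def period_decay(m1: float, m2: float, p_b: float) -> float:
    """Peters-formula dPb/dt (dimensionless) for a circular binary.

    m1, m2: component masses in kg; p_b: orbital period in seconds.
    """
    return (-(192 * math.pi / 5)
            * (2 * math.pi / p_b) ** (5 / 3)
            * G ** (5 / 3) / c ** 5
            * m1 * m2 / (m1 + m2) ** (1 / 3))

p_dot = period_decay(2.01 * M_sun, 0.172 * M_sun, 2.46 * 3600)
print(f"dPb/dt ~ {p_dot:.2e} s/s")                           # ~ -2.6e-13
print(f"       ~ {p_dot * 86400 * 365.25 * 1e6:+.1f} microseconds per year")
```

A shrinkage of roughly 8 microseconds of orbital period per year is the size of the effect the combined radio and optical timing had to resolve.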

To test agreement with GR, physicists established a set of observable quantities. These include the rate of orbit decrease (which is a reflection of the energy loss to gravitational radiation) and something called the Shapiro delay. The latter phenomenon occurs because light emitted from the pulsar must travel through the intense gravitational field of its white dwarf companion when exiting the system. This effect depends on the relative orientation of the pulsar to us, but alternative models also predict different observable results.

In the case of the PSR J0348+0432 system, the change in orbital period and the Shapiro delay agreed with the predictions of GR, placing strong constraints on alternative theories. The researchers were also able to rule out energy loss from other, non-gravitational sources (rotation or electromagnetic phenomena). If the system continues as models predict, the white dwarf and pulsar will merge in about 400 million years—we don’t know what the product of that merger will be, so astronomers are undoubtedly marking their calendars now.

The results are of potential use for the Laser Interferometer Gravitational-wave Observatory (LIGO) and other ground-based gravitational-wave detectors. These instruments are sensitive to the final death spiral of binaries like the one containing PSR J0348+0432. The current detection and observation strategies involve “templates,” or theoretical models of the gravitational wave signal from binaries. All information about the behavior of close pulsar binaries helps gravitational-wave astronomers refine those templates, which should improve the chances of detection.

Of course, no theory can be “proven right” by experiment or observation—data provides evidence in support of or against the predictions of a particular model. However, the PSR J0348+0432 binary results placed stringent constraints on any alternative model to GR in the strong-gravity regime. (Certain other alternative models focus on altering gravity on large scales to explain dark energy and the accelerating expansion of the Universe.) Based on this new data, only theories that agree with GR to high precision are still standing—leaving general relativity the continuing champion theory of gravity.

Read the entire article after the jump.

Image: Artist’s impression of the PSR J0348+0432 system. The compact pulsar (with beams of radio emission) produces a strong distortion of spacetime (illustrated by the green mesh). Courtesy of Science Mag.