Tag Archives: quantum mechanics

Spacetime Without the Time

Since they were first dreamed up, explanations of the very small (quantum mechanics) and the very large (general relativity) have both been highly successful at describing their respective spheres of influence. Yet these two descriptions of our physical universe are not compatible, particularly when it comes to describing gravity. Indeed, physicists and theorists have struggled for decades to unite the two frameworks. Many agree that we need a new theory (of everything).

One new idea, from theorist Erik Verlinde of the University of Amsterdam, proposes that time is an emergent construct (it’s not a fundamental building block) and that dark matter is an illusion.

From Quanta:

Theoretical physicists striving to unify quantum mechanics and general relativity into an all-encompassing theory of quantum gravity face what’s called the “problem of time.”

In quantum mechanics, time is universal and absolute; its steady ticks dictate the evolving entanglements between particles. But in general relativity (Albert Einstein’s theory of gravity), time is relative and dynamical, a dimension that’s inextricably interwoven with directions x, y and z into a four-dimensional “space-time” fabric. The fabric warps under the weight of matter, causing nearby stuff to fall toward it (this is gravity), and slowing the passage of time relative to clocks far away. Or hop in a rocket and use fuel rather than gravity to accelerate through space, and time dilates; you age less than someone who stayed at home.
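
For readers who want the arithmetic behind “you age less,” here is a minimal sketch using the standard special-relativity time-dilation formula (textbook material, not part of the Quanta excerpt):

\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad \Delta\tau_{\text{traveller}} = \frac{\Delta t_{\text{home}}}{\gamma}

At v = 0.8c the factor is \gamma = 1/\sqrt{1 - 0.64} = 5/3, so ten years measured at home correspond to only six years of proper time aboard the rocket.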

Unifying quantum mechanics and general relativity requires reconciling their absolute and relative notions of time. Recently, a promising burst of research on quantum gravity has provided an outline of what the reconciliation might look like — as well as insights on the true nature of time.

As I described in an article this week on a new theoretical attempt to explain away dark matter, many leading physicists now consider space-time and gravity to be “emergent” phenomena: Bendy, curvy space-time and the matter within it are a hologram that arises out of a network of entangled qubits (quantum bits of information), much as the three-dimensional environment of a computer game is encoded in the classical bits on a silicon chip. “I think we now understand that space-time really is just a geometrical representation of the entanglement structure of these underlying quantum systems,” said Mark Van Raamsdonk, a theoretical physicist at the University of British Columbia.

Researchers have worked out the math showing how the hologram arises in toy universes that possess a fisheye space-time geometry known as “anti-de Sitter” (AdS) space. In these warped worlds, spatial increments get shorter and shorter as you move out from the center. Eventually, the spatial dimension extending from the center shrinks to nothing, hitting a boundary. The existence of this boundary — which has one fewer spatial dimension than the interior space-time, or “bulk” — aids calculations by providing a rigid stage on which to model the entangled qubits that project the hologram within. “Inside the bulk, time starts bending and curving with the space in dramatic ways,” said Brian Swingle of Harvard and Brandeis universities. “We have an understanding of how to describe that in terms of the ‘sludge’ on the boundary,” he added, referring to the entangled qubits.
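
For the mathematically inclined, the “fisheye” geometry described above is often written in Poincaré coordinates as (a standard textbook form, not taken from the article):

ds^2 = \frac{L^2}{z^2}\left(-dt^2 + d\vec{x}^{\,2} + dz^2\right)

Here L is the AdS curvature radius, and the rigid boundary on which the entangled qubits are modelled sits at z \to 0, with one fewer spatial dimension than the bulk.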

The states of the qubits evolve according to universal time as if executing steps in a computer code, giving rise to warped, relativistic time in the bulk of the AdS space. The only thing is, that’s not quite how it works in our universe.

Here, the space-time fabric has a “de Sitter” geometry, stretching as you look into the distance. The fabric stretches until the universe hits a very different sort of boundary from the one in AdS space: the end of time. At that point, in an event known as “heat death,” space-time will have stretched so much that everything in it will become causally disconnected from everything else, such that no signals can ever again travel between them. The familiar notion of time breaks down. From then on, nothing happens.

On the timeless boundary of our space-time bubble, the entanglements linking together qubits (and encoding the universe’s dynamical interior) would presumably remain intact, since these quantum correlations do not require that signals be sent back and forth. But the state of the qubits must be static and timeless. This line of reasoning suggests that somehow, just as the qubits on the boundary of AdS space give rise to an interior with one extra spatial dimension, qubits on the timeless boundary of de Sitter space must give rise to a universe with time — dynamical time, in particular. Researchers haven’t yet figured out how to do these calculations. “In de Sitter space,” Swingle said, “we don’t have a good idea for how to understand the emergence of time.”

Read the entire article here.

Image: Image of (1 + 1)-dimensional anti-de Sitter space embedded in flat (1 + 2)-dimensional space. The t1- and t2-axes lie in the plane of rotational symmetry, and the x1-axis is normal to that plane. The embedded surface contains closed timelike curves circling the x1 axis, though these can be eliminated by “unrolling” the embedding (more precisely, by taking the universal cover). Courtesy: Krishnavedala. Wikipedia. Creative Commons Attribution-Share Alike 3.0.

The Collapsing Wave Function

Every once in a while I have to delve into the esoteric world of quantum mechanics. So, you will have to forgive me.

Since it was formalized in the mid-1920s, QM has been extremely successful at describing the behavior of systems at the atomic scale. Two giants of the field — Niels Bohr and Werner Heisenberg — formulated its now-standard interpretation in 1927; known since as the Copenhagen Interpretation, it has been widely and accurately used to predict and describe the workings of elementary particles and the forces between them.

Yet recent theoretical stirrings in the field threaten to turn this widely held and accepted framework on its head. The Copenhagen Interpretation holds that particles do not have definite locations until they are observed. Rather, their positions and movements are defined by a wave function that describes a spectrum of probabilities, but no certainties.

Rather understandably, this probabilistic description of our microscopic world tends to unnerve those who seek a more solid view of what we actually observe. Enter Bohmian mechanics or, more correctly, the de Broglie–Bohm theory of quantum mechanics. An increasing number of present-day researchers and theorists are revisiting this theory, which may yet hold some promise.

From Wired:

Of the many counterintuitive features of quantum mechanics, perhaps the most challenging to our notions of common sense is that particles do not have locations until they are observed. This is exactly what the standard view of quantum mechanics, often called the Copenhagen interpretation, asks us to believe.

But there’s another view—one that’s been around for almost a century—in which particles really do have precise positions at all times. This alternative view, known as pilot-wave theory or Bohmian mechanics, never became as popular as the Copenhagen view, in part because Bohmian mechanics implies that the world must be strange in other ways. In particular, a 1992 study claimed to crystallize certain bizarre consequences of Bohmian mechanics and in doing so deal it a fatal conceptual blow. The authors of that paper concluded that a particle following the laws of Bohmian mechanics would end up taking a trajectory that was so unphysical—even by the warped standards of quantum theory—that they described it as “surreal.”

Nearly a quarter-century later, a group of scientists has carried out an experiment in a Toronto laboratory that aims to test this idea. And if their results, first reported earlier this year, hold up to scrutiny, the Bohmian view of quantum mechanics—less fuzzy but in some ways more strange than the traditional view—may be poised for a comeback.

As with the Copenhagen view, there’s a wave function governed by the Schrödinger equation. In addition, every particle has an actual, definite location, even when it’s not being observed. Changes in the positions of the particles are given by another equation, known as the “pilot wave” equation (or “guiding equation”). The theory is fully deterministic; if you know the initial state of a system, and you’ve got the wave function, you can calculate where each particle will end up.
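
In equations — the standard textbook form of de Broglie–Bohm theory, not something quoted from the Wired article — the wave function \psi evolves by the usual time-dependent Schrödinger equation while the particle positions \mathbf{Q}_k follow the guiding equation:

i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\psi, \qquad \frac{d\mathbf{Q}_k}{dt} = \frac{\hbar}{m_k} \operatorname{Im}\!\left[\frac{\nabla_k \psi}{\psi}\right](\mathbf{Q}_1,\dots,\mathbf{Q}_N, t)

Both equations are deterministic: given the initial wave function and the initial positions, every later position is fixed, which is the precise sense in which the theory is “fully deterministic.”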

That may sound like a throwback to classical mechanics, but there’s a crucial difference. Classical mechanics is purely “local”—stuff can affect other stuff only if it is adjacent to it (or via the influence of some kind of field, like an electric field, which can send impulses no faster than the speed of light). Quantum mechanics, in contrast, is inherently nonlocal. The best-known example of a nonlocal effect—one that Einstein himself considered, back in the 1930s—is when a pair of particles are connected in such a way that a measurement of one particle appears to affect the state of another, distant particle. The idea was ridiculed by Einstein as “spooky action at a distance.” But hundreds of experiments, beginning in the 1980s, have confirmed that this spooky action is a very real characteristic of our universe.

Read the entire article here.

Image: Schrödinger’s time-dependent equation. Courtesy: Wikipedia.

Universal Amniotic Fluid

Another day, another physics paper describing the origin of the universe. This is no wonder. Since the development of general relativity and quantum mechanics — two mutually incompatible descriptions of our reality — theoreticians have been scurrying to come up with a grand theory, a rapprochement of sorts. This one describes the universe as a quantum fluid, perhaps made up of hypothesized gravitons.

From Nature Asia:

The prevailing model of cosmology, based on Einstein’s theory of general relativity, puts the universe at around 13.8 billion years old and suggests it originated from a “singularity” – an infinitely small and dense point – at the Big Bang.

 To understand what happened inside that tiny singularity, physicists must marry general relativity with quantum mechanics – the laws that govern small objects. Applying both of these disciplines has challenged physicists for decades. “The Big Bang singularity is the most serious problem of general relativity, because the laws of physics appear to break down there,” says Ahmed Farag Ali, a physicist at Zewail City of Science and Technology, Egypt.

In an effort to bring together the laws of quantum mechanics and general relativity, and to solve the singularity puzzle, Ali and Saurya Das, a physicist at the University of Lethbridge in Alberta, Canada, employed an equation that predicts the development of singularities in general relativity. That equation had been developed by Das’s former professor, Amal Kumar Raychaudhuri, when Das was an undergraduate student at Presidency University, in Kolkata, India, so Das was particularly familiar with it and fascinated by it.
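
For reference, the classical Raychaudhuri equation for a congruence of timelike geodesics is usually written as follows (this is the standard general-relativity form; the quantum-corrected version Ali and Das work with adds further terms and is not reproduced here):

\frac{d\theta}{d\tau} = -\frac{1}{3}\theta^2 - \sigma_{ab}\sigma^{ab} + \omega_{ab}\omega^{ab} - R_{ab}u^a u^b

Here \theta is the expansion of the congruence, \sigma_{ab} the shear, \omega_{ab} the rotation, R_{ab} the Ricci tensor and u^a the four-velocity; geodesics focusing into a singularity corresponds to \theta running off to -\infty.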

 When Ali and Das made small quantum corrections to the Raychaudhuri equation, they realised it described a fluid, made up of small particles, that pervades space. Physicists have long believed that a quantum version of gravity would include a hypothetical particle, called the graviton, which generates the force of gravity. In their new model — which will appear in Physics Letters B in February — Ali and Das propose that such gravitons could form this fluid.

To understand the origin of the universe, they used this corrected equation to trace the behaviour of the fluid back through time. Surprisingly, they found that it did not converge into a singularity. Instead, the universe appears to have existed forever. Although it was smaller in the past, it never quite crunched down to nothing, says Das.

“Our theory serves to complement Einstein’s general relativity, which is very successful at describing physics over large distances,” says Ali. “But physicists know that to describe short distances, quantum mechanics must be accommodated, and the quantum Raychaudhuri equation is a big step towards that.”

The model could also help solve two other cosmic mysteries. In the late 1990s, astronomers discovered that the expansion of the universe is accelerating due to the presence of a mysterious dark energy, the origin of which is not known. The model has the potential to explain it since the fluid creates a minor but constant outward force that expands space. “This is a happy offshoot of our work,” says Das.

 Astronomers also now know that most matter in the universe is in an invisible mysterious form called dark matter, only perceptible through its gravitational effect on visible matter such as stars. When Das and a colleague set the mass of the graviton in the model to a small level, they could make the density of their fluid match the universe’s observed density of dark matter, while also providing the right value for dark energy’s push.

Read the entire article here.

The Arrow of Time

Einstein’s “spooky action at a distance” and quantum information theory (QIT) may help explain the so-called arrow of time — specifically, why it seems to flow in only one direction. The astronomer Arthur Eddington first described this asymmetry in 1927, and it has stumped theoreticians ever since.

At a macro-level the classic and simple example is that of an egg breaking when it hits your kitchen floor: repeat this over and over, and the egg will always end up a scrambled mess on your clean tiles, but it will never rise up from the floor and spontaneously reassemble in your slippery hand. Yet at the micro-level, physicists know that the underlying laws apply equally in both directions. Enter two new tenets of the quantum world that may help us better understand this perplexing forward flow of time: entanglement and QIT.

From Wired:

Coffee cools, buildings crumble, eggs break and stars fizzle out in a universe that seems destined to degrade into a state of uniform drabness known as thermal equilibrium. The astronomer-philosopher Sir Arthur Eddington in 1927 cited the gradual dispersal of energy as evidence of an irreversible “arrow of time.”

But to the bafflement of generations of physicists, the arrow of time does not seem to follow from the underlying laws of physics, which work the same going forward in time as in reverse. By those laws, it seemed that if someone knew the paths of all the particles in the universe and flipped them around, energy would accumulate rather than disperse: Tepid coffee would spontaneously heat up, buildings would rise from their rubble and sunlight would slink back into the sun.

“In classical physics, we were struggling,” said Sandu Popescu, a professor of physics at the University of Bristol in the United Kingdom. “If I knew more, could I reverse the event, put together all the molecules of the egg that broke? Why am I relevant?”

Surely, he said, time’s arrow is not steered by human ignorance. And yet, since the birth of thermodynamics in the 1850s, the only known approach for calculating the spread of energy was to formulate statistical distributions of the unknown trajectories of particles, and show that, over time, the ignorance smeared things out.

Now, physicists are unmasking a more fundamental source for the arrow of time: Energy disperses and objects equilibrate, they say, because of the way elementary particles become intertwined when they interact — a strange effect called “quantum entanglement.”

“Finally, we can understand why a cup of coffee equilibrates in a room,” said Tony Short, a quantum physicist at Bristol. “Entanglement builds up between the state of the coffee cup and the state of the room.”

Popescu, Short and their colleagues Noah Linden and Andreas Winter reported the discovery in the journal Physical Review E in 2009, arguing that objects reach equilibrium, or a state of uniform energy distribution, within an infinite amount of time by becoming quantum mechanically entangled with their surroundings. Similar results by Peter Reimann of the University of Bielefeld in Germany appeared several months earlier in Physical Review Letters. Short and a collaborator strengthened the argument in 2012 by showing that entanglement causes equilibration within a finite time. And, in work that was posted on the scientific preprint site arXiv.org in February, two separate groups have taken the next step, calculating that most physical systems equilibrate rapidly, on time scales proportional to their size. “To show that it’s relevant to our actual physical world, the processes have to be happening on reasonable time scales,” Short said.

The tendency of coffee — and everything else — to reach equilibrium is “very intuitive,” said Nicolas Brunner, a quantum physicist at the University of Geneva. “But when it comes to explaining why it happens, this is the first time it has been derived on firm grounds by considering a microscopic theory.”

If the new line of research is correct, then the story of time’s arrow begins with the quantum mechanical idea that, deep down, nature is inherently uncertain. An elementary particle lacks definite physical properties and is defined only by probabilities of being in various states. For example, at a particular moment, a particle might have a 50 percent chance of spinning clockwise and a 50 percent chance of spinning counterclockwise. An experimentally tested theorem by the Northern Irish physicist John Bell says there is no “true” state of the particle; the probabilities are the only reality that can be ascribed to it.

Quantum uncertainty then gives rise to entanglement, the putative source of the arrow of time.

When two particles interact, they can no longer even be described by their own, independently evolving probabilities, called “pure states.” Instead, they become entangled components of a more complicated probability distribution that describes both particles together. It might dictate, for example, that the particles spin in opposite directions. The system as a whole is in a pure state, but the state of each individual particle is “mixed” with that of its acquaintance. The two could travel light-years apart, and the spin of each would remain correlated with that of the other, a feature Albert Einstein famously described as “spooky action at a distance.”
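
The “pure whole, mixed parts” point is easy to check numerically. The sketch below (mine, not the article’s) builds a two-qubit Bell state with NumPy, confirms the joint state is pure, and shows that tracing out one qubit leaves the other maximally mixed:

import numpy as np

# Two-qubit Bell state (|01> - |10>)/sqrt(2) in the basis |00>, |01>, |10>, |11>
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())              # density matrix of the joint state

# Purity Tr(rho^2) equals 1 for a pure state
print("joint purity:", np.trace(rho @ rho).real)          # -> 1.0

# Partial trace over qubit B gives the reduced state of qubit A
rho_abab = rho.reshape(2, 2, 2, 2)           # indices (A, B, A', B')
rho_A = np.einsum('ijkj->ik', rho_abab)      # sum over the two B indices
print("reduced state of A:\n", rho_A.real)                # -> 0.5 * identity
print("reduced purity:", np.trace(rho_A @ rho_A).real)    # -> 0.5, maximally mixed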

“Entanglement is in some sense the essence of quantum mechanics,” or the laws governing interactions on the subatomic scale, Brunner said. The phenomenon underlies quantum computing, quantum cryptography and quantum teleportation.

The idea that entanglement might explain the arrow of time first occurred to Seth Lloyd about 30 years ago, when he was a 23-year-old philosophy graduate student at Cambridge University with a Harvard physics degree. Lloyd realized that quantum uncertainty, and the way it spreads as particles become increasingly entangled, could replace human uncertainty in the old classical proofs as the true source of the arrow of time.

Using an obscure approach to quantum mechanics that treated units of information as its basic building blocks, Lloyd spent several years studying the evolution of particles in terms of shuffling 1s and 0s. He found that as the particles became increasingly entangled with one another, the information that originally described them (a “1” for clockwise spin and a “0” for counterclockwise, for example) would shift to describe the system of entangled particles as a whole. It was as though the particles gradually lost their individual autonomy and became pawns of the collective state. Eventually, the correlations contained all the information, and the individual particles contained none. At that point, Lloyd discovered, particles arrived at a state of equilibrium, and their states stopped changing, like coffee that has cooled to room temperature.

“What’s really going on is things are becoming more correlated with each other,” Lloyd recalls realizing. “The arrow of time is an arrow of increasing correlations.”

The idea, presented in his 1988 doctoral thesis, fell on deaf ears. When he submitted it to a journal, he was told that there was “no physics in this paper.” Quantum information theory “was profoundly unpopular” at the time, Lloyd said, and questions about time’s arrow “were for crackpots and Nobel laureates who have gone soft in the head,” he remembers one physicist telling him.

“I was darn close to driving a taxicab,” Lloyd said.

Advances in quantum computing have since turned quantum information theory into one of the most active branches of physics. Lloyd is now a professor at the Massachusetts Institute of Technology, recognized as one of the founders of the discipline, and his overlooked idea has resurfaced in a stronger form in the hands of the Bristol physicists. The newer proofs are more general, researchers say, and hold for virtually any quantum system.

“When Lloyd proposed the idea in his thesis, the world was not ready,” said Renato Renner, head of the Institute for Theoretical Physics at ETH Zurich. “No one understood it. Sometimes you have to have the idea at the right time.”

Read the entire article here.

Image: English astrophysicist Sir Arthur Stanley Eddington (1882–1944). Courtesy: George Grantham Bain Collection (Library of Congress).

God Is a Thermodynamicist

Physicists and cosmologists are constantly postulating and testing new ideas to explain the universe and everything within it. Over the last hundred years or so, two such ideas have grown to explain much about our cosmos, and do so very successfully — quantum mechanics, which describes the very small, and relativity, which describes the very large. However, these two views do not reconcile, leaving theoreticians and researchers looking for a more fundamental theory of everything. One possible idea banishes the notions of time and gravity — treating them both as emergent properties of a deeper reality.

From New Scientist:

As revolutions go, its origins were haphazard. It was, according to the ringleader Max Planck, an “act of desperation”. In 1900, he proposed the idea that energy comes in discrete chunks, or quanta, simply because the smooth delineations of classical physics could not explain the spectrum of energy re-radiated by an absorbing body.

Yet rarely was a revolution so absolute. Within a decade or so, the cast-iron laws that had underpinned physics since Newton’s day were swept away. Classical certainty ceded its stewardship of reality to the probabilistic rule of quantum mechanics, even as the parallel revolution of Einstein’s relativity displaced our cherished, absolute notions of space and time. This was complete regime change.

Except for one thing. A single relict of the old order remained, one that neither Planck nor Einstein nor any of their contemporaries had the will or means to remove. The British astrophysicist Arthur Eddington summed up the situation in 1928. “If your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation,” he wrote.

In this essay, I will explore the fascinating question of why, since their origins in the early 19th century, the laws of thermodynamics have proved so formidably robust. The journey traces the deep connections that were discovered in the 20th century between thermodynamics and information theory – connections that allow us to trace intimate links between thermodynamics and not only quantum theory but also, more speculatively, relativity. Ultimately, I will argue, those links show us how thermodynamics in the 21st century can guide us towards a theory that will supersede them both.

In its origins, thermodynamics is a theory about heat: how it flows and what it can be made to do (see diagram). The French engineer Sadi Carnot formulated the second law in 1824 to characterise the mundane fact that the steam engines then powering the industrial revolution could never be perfectly efficient. Some of the heat you pumped into them always flowed into the cooler environment, rather than staying in the engine to do useful work. That is an expression of a more general rule: unless you do something to stop it, heat will naturally flow from hotter places to cooler places to even up any temperature differences it finds. The same principle explains why keeping the refrigerator in your kitchen cold means pumping energy into it; only that will keep warmth from the surroundings at bay.
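
Carnot’s “never perfectly efficient” conclusion can be put in numbers (standard thermodynamics, not quoted from the essay): an ideal engine running between a hot reservoir at temperature T_h and a cold one at T_c can convert at most a fraction

\eta_{\max} = 1 - \frac{T_c}{T_h}

of the heat it absorbs into useful work. A steam engine fed at 500 K and exhausting to surroundings at 300 K is limited to \eta_{\max} = 1 - 300/500 = 40 per cent, however cleverly it is engineered.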

A few decades after Carnot, the German physicist Rudolf Clausius explained such phenomena in terms of a quantity characterising disorder that he called entropy. In this picture, the universe works on the back of processes that increase entropy – for example dissipating heat from places where it is concentrated, and therefore more ordered, to cooler areas, where it is not.

That predicts a grim fate for the universe itself. Once all heat is maximally dissipated, no useful process can happen in it any more: it dies a “heat death”. A perplexing question is raised at the other end of cosmic history, too. If nature always favours states of high entropy, how and why did the universe start in a state that seems to have been of comparatively low entropy? At present we have no answer, and later I will mention an intriguing alternative view.

Perhaps because of such undesirable consequences, the legitimacy of the second law was for a long time questioned. The charge was formulated with the most striking clarity by the British physicist James Clerk Maxwell in 1867. He was satisfied that inanimate matter presented no difficulty for the second law. In an isolated system, heat always passes from the hotter to the cooler, and a neat clump of dye molecules readily dissolves in water and disperses randomly, never the other way round. Disorder as embodied by entropy does always increase.

Maxwell’s problem was with life. Living things have “intentionality”: they deliberately do things to other things to make life easier for themselves. Conceivably, they might try to reduce the entropy of their surroundings and thereby violate the second law.

Information is power

Such a possibility is highly disturbing to physicists. Either something is a universal law or it is merely a cover for something deeper. Yet it was only in the late 1970s that Maxwell’s entropy-fiddling “demon” was laid to rest. Its slayer was the US physicist Charles Bennett, who built on work by his colleague at IBM, Rolf Landauer, using the theory of information developed a few decades earlier by Claude Shannon. An intelligent being can certainly rearrange things to lower the entropy of its environment. But to do this, it must first fill up its memory, gaining information as to how things are arranged in the first place.

This acquired information must be encoded somewhere, presumably in the demon’s memory. When this memory is finally full, or the being dies or otherwise expires, it must be reset. Dumping all this stored, ordered information back into the environment increases entropy – and this entropy increase, Bennett showed, will ultimately always be at least as large as the entropy reduction the demon originally achieved. Thus the status of the second law was assured, albeit anchored in a mantra of Landauer’s that would have been unintelligible to the 19th-century progenitors of thermodynamics: that “information is physical”.
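
Landauer’s mantra comes with a number attached (again a standard result rather than part of the excerpt): erasing one bit of memory at temperature T must dump at least

E_{\min} = k_B T \ln 2

of heat into the environment. At room temperature, T \approx 300 K, that is roughly 1.38 \times 10^{-23} \times 300 \times 0.693 \approx 2.9 \times 10^{-21} joules per bit — tiny, but never zero, which is what ultimately saves the second law from the demon.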

But how does this explain that thermodynamics survived the quantum revolution? Classical objects behave very differently to quantum ones, so the same is presumably true of classical and quantum information. After all, quantum computers are notoriously more powerful than classical ones (or would be if realised on a large scale).

The reason is subtle, and it lies in a connection between entropy and probability contained in perhaps the most profound and beautiful formula in all of science. Engraved on the tomb of the Austrian physicist Ludwig Boltzmann in Vienna’s central cemetery, it reads simply S = k log W. Here S is entropy – the macroscopic, measurable entropy of a gas, for example – while k is a constant of nature that today bears Boltzmann’s name. Log W is the mathematical logarithm of a microscopic, probabilistic quantity W – in a gas, this would be the number of ways the positions and velocities of its many individual atoms can be arranged.

On a philosophical level, Boltzmann’s formula embodies the spirit of reductionism: the idea that we can, at least in principle, reduce our outward knowledge of a system’s activities to basic, microscopic physical laws. On a practical, physical level, it tells us that all we need to understand disorder and its increase is probabilities. Tot up the number of configurations the atoms of a system can be in and work out their probabilities, and what emerges is nothing other than the entropy that determines its thermodynamical behaviour. The equation asks no further questions about the nature of the underlying laws; we need not care if the dynamical processes that create the probabilities are classical or quantum in origin.
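
A minimal worked example (mine, not the essay’s): for N independent two-state spins, every one of the W = 2^N microscopic arrangements is equally likely, so Boltzmann’s formula gives

S = k \log W = k \log 2^N = N k \log 2,

an entropy that simply counts, in units of k \log 2, how many binary choices the microstate hides from us — which is exactly the bridge between entropy and information the essay has been drawing.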

There is an important additional point to be made here. Probabilities are fundamentally different things in classical and quantum physics. In classical physics they are “subjective” quantities that constantly change as our state of knowledge changes. The probability that a coin toss will result in heads or tails, for instance, jumps from ½ to 1 when we observe the outcome. If there were a being who knew all the positions and momenta of all the particles in the universe – known as a “Laplace demon”, after the French mathematician Pierre-Simon Laplace, who first countenanced the possibility – it would be able to determine the course of all subsequent events in a classical universe, and would have no need for probabilities to describe them.

In quantum physics, however, probabilities arise from a genuine uncertainty about how the world works. States of physical systems in quantum theory are represented in what the quantum pioneer Erwin Schrödinger called catalogues of information, but they are catalogues in which adding information on one page blurs or scrubs it out on another. Knowing the position of a particle more precisely means knowing less well how it is moving, for example. Quantum probabilities are “objective”, in the sense that they cannot be entirely removed by gaining more information.

That casts in an intriguing light thermodynamics as originally, classically formulated. There, the second law is little more than impotence written down in the form of an equation. It has no deep physical origin itself, but is an empirical bolt-on to express the otherwise unaccountable fact that we cannot know, predict or bring about everything that might happen, as classical dynamical laws suggest we can. But this changes as soon as you bring quantum physics into the picture, with its attendant notion that uncertainty is seemingly hardwired into the fabric of reality. Rooted in probabilities, entropy and thermodynamics acquire a new, more fundamental physical anchor.

It is worth pointing out, too, that this deep-rooted connection seems to be much more general. Recently, together with my colleagues Markus Müller of the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada, and Oscar Dahlsten at the Centre for Quantum Technologies in Singapore, I have looked at what happens to thermodynamical relations in a generalised class of probabilistic theories that embrace quantum theory and much more besides. There too, the crucial relationship between information and disorder, as quantified by entropy, survives (arxiv.org/abs/1107.6029).

One theory to rule them all

As for gravity – the only one of nature’s four fundamental forces not covered by quantum theory – a more speculative body of research suggests it might be little more than entropy in disguise (see “Falling into disorder”). If so, that would also bring Einstein’s general theory of relativity, with which we currently describe gravity, firmly within the purview of thermodynamics.

Take all this together, and we begin to have a hint of what makes thermodynamics so successful. The principles of thermodynamics are at their roots all to do with information theory. Information theory is simply an embodiment of how we interact with the universe – among other things, to construct theories to further our understanding of it. Thermodynamics is, in Einstein’s term, a “meta-theory”: one constructed from principles over and above the structure of any dynamical laws we devise to describe reality’s workings. In that sense we can argue that it is more fundamental than either quantum physics or general relativity.

If we can accept this and, like Eddington and his ilk, put all our trust in the laws of thermodynamics, I believe it may even afford us a glimpse beyond the current physical order. It seems unlikely that quantum physics and relativity represent the last revolutions in physics. New evidence could at any time foment their overthrow. Thermodynamics might help us discern what any usurping theory would look like.

For example, earlier this year, two of my colleagues in Singapore, Esther Hänggi and Stephanie Wehner, showed that a violation of the quantum uncertainty principle – that idea that you can never fully get rid of probabilities in a quantum context – would imply a violation of the second law of thermodynamics. Beating the uncertainty limit means extracting extra information about the system, which requires the system to do more work than thermodynamics allows it to do in the relevant state of disorder. So if thermodynamics is any guide, whatever any post-quantum world might look like, we are stuck with a degree of uncertainty (arxiv.org/abs/1205.6894).

My colleague at the University of Oxford, the physicist David Deutsch, thinks we should take things much further. Not only should any future physics conform to thermodynamics, but the whole of physics should be constructed in its image. The idea is to generalise the logic of the second law as it was stringently formulated by the mathematician Constantin Carathéodory in 1909: that in the vicinity of any state of a physical system, there are other states that cannot physically be reached if we forbid any exchange of heat with the environment.

James Joule’s 19th century experiments with beer can be used to illustrate this idea. The English brewer, whose name lives on in the standard unit of energy, sealed beer in a thermally isolated tub containing a paddle wheel that was connected to weights falling under gravity outside. The wheel’s rotation warmed the beer, increasing the disorder of its molecules and therefore its entropy. But hard as we might try, we simply cannot use Joule’s set-up to decrease the beer’s temperature, even by a fraction of a millikelvin. Cooler beer is, in this instance, a state regrettably beyond the reach of physics.

God, the thermodynamicist

The question is whether we can express the whole of physics simply by enumerating possible and impossible processes in a given situation. This is very different from how physics is usually phrased, in both the classical and quantum regimes, in terms of states of systems and equations that describe how those states change in time. The blind alleys down which the standard approach can lead are easiest to understand in classical physics, where the dynamical equations we derive allow a whole host of processes that patently do not occur – the ones we have to conjure up the laws of thermodynamics expressly to forbid, such as dye molecules reclumping spontaneously in water.

By reversing the logic, our observations of the natural world can again take the lead in deriving our theories. We observe the prohibitions that nature puts in place, be it on decreasing entropy, getting energy from nothing, travelling faster than light or whatever. The ultimately “correct” theory of physics – the logically tightest – is the one from which the smallest deviation gives us something that breaks those taboos.

There are other advantages in recasting physics in such terms. Time is a perennially problematic concept in physical theories. In quantum theory, for example, it enters as an extraneous parameter of unclear origin that cannot itself be quantised. In thermodynamics, meanwhile, the passage of time is entropy increase by any other name. A process such as dissolved dye molecules forming themselves into a clump offends our sensibilities because it appears to amount to running time backwards as much as anything else, although the real objection is that it decreases entropy.

Apply this logic more generally, and time ceases to exist as an independent, fundamental entity, but one whose flow is determined purely in terms of allowed and disallowed processes. With it go problems such as that I alluded to earlier, of why the universe started in a state of low entropy. If states and their dynamical evolution over time cease to be the question, then anything that does not break any transformational rules becomes a valid answer.

Such an approach would probably please Einstein, who once said: “What really interests me is whether God had any choice in the creation of the world.” A thermodynamically inspired formulation of physics might not answer that question directly, but leaves God with no choice but to be a thermodynamicist. That would be a singular accolade for those 19th-century masters of steam: that they stumbled upon the essence of the universe, entirely by accident. The triumph of thermodynamics would then be a revolution by stealth, 200 years in the making.

Read the entire article here.

Quantum Computation: Spooky Arithmetic

Quantum computation holds the promise of vastly superior performance over traditional digital systems based on bits that are either “on” or “off”. Yet for all the theory, quantum computation remains very much a research enterprise in its infancy. And, because of the peculiarities of the quantum world — think Schrödinger’s cat, both dead and alive — it’s even difficult to verify that a quantum computer is doing what it claims.

From Wired:

In early May, news reports gushed that a quantum computation device had for the first time outperformed classical computers, solving certain problems thousands of times faster. The media coverage sent ripples of excitement through the technology community. A full-on quantum computer, if ever built, would revolutionize large swaths of computer science, running many algorithms dramatically faster, including one that could crack most encryption protocols in use today.

Over the following weeks, however, a vigorous controversy surfaced among quantum computation researchers. Experts argued over whether the device, created by D-Wave Systems, in Burnaby, British Columbia, really offers the claimed speedups, whether it works the way the company thinks it does, and even whether it is really harnessing the counterintuitive weirdness of quantum physics, which governs the world of elementary particles such as electrons and photons.

Most researchers have no access to D-Wave’s proprietary system, so they can’t simply examine its specifications to verify the company’s claims. But even if they could look under its hood, how would they know it’s the real thing?

Verifying the processes of an ordinary computer is easy, in principle: At each step of a computation, you can examine its internal state — some series of 0s and 1s — to make sure it is carrying out the steps it claims.

A quantum computer’s internal state, however, is made of “qubits” — a mixture (or “superposition”) of 0 and 1 at the same time, like Schrödinger’s fabled quantum mechanical cat, which is simultaneously alive and dead. Writing down the internal state of a large quantum computer would require an impossibly large number of parameters. The state of a system containing 1,000 qubits, for example, could need more parameters than the estimated number of particles in the universe.
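
The parameter-counting claim is easy to verify yourself (a back-of-the-envelope sketch, not from the article): an n-qubit pure state is a superposition of 2^n basis states, each carrying its own complex amplitude.

# Amplitudes needed to write down an n-qubit pure state: 2**n
n = 1000
amplitudes = 2 ** n                  # exact; Python integers are arbitrary precision
particles_in_universe = 10 ** 80     # rough order-of-magnitude estimate

print(len(str(amplitudes)))                   # 302 digits, i.e. about 10**301
print(amplitudes > particles_in_universe)     # True, by a factor of ~10**221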

And there’s an even more fundamental obstacle: Measuring a quantum system “collapses” it into a single classical state instead of a superposition of many states. (When Schrödinger’s cat is measured, it instantly becomes alive or dead.) Likewise, examining the inner workings of a quantum computer would reveal an ordinary collection of classical bits. A quantum system, said Umesh Vazirani of the University of California, Berkeley, is like a person who has an incredibly rich inner life, but who, if you ask him “What’s up?” will just shrug and say, “Nothing much.”

“How do you ever test a quantum system?” Vazirani asked. “Do you have to take it on faith? At first glance, it seems that the obvious answer is yes.”

It turns out, however, that there is a way to probe the rich inner life of a quantum computer using only classical measurements, if the computer has two separate “entangled” components.

In the April 25 issue of the journal Nature, Vazirani, together with Ben Reichardt of the University of Southern California in Los Angeles and Falk Unger of Knight Capital Group Inc. in Santa Clara, showed how to establish the precise inner state of such a computer using a favorite tactic from TV police shows: Interrogate the two components in separate rooms, so to speak, and check whether their stories are consistent. If the two halves of the computer answer a particular series of questions successfully, the interrogator can not only figure out their internal state and the measurements they are doing, but also issue instructions that will force the two halves to jointly carry out any quantum computation she wishes.

“It’s a huge achievement,” said Stefano Pironio, of the Université Libre de Bruxelles in Belgium.

The finding will not shed light on the D-Wave computer, which is constructed along very different principles, and it may be decades before a computer along the lines of the Nature paper — or indeed any fully quantum computer — can be built. But the result is an important proof of principle, said Thomas Vidick, who recently completed his post-doctoral research at the Massachusetts Institute of Technology. “It’s a big conceptual step.”

In the short term, the new interrogation approach offers a potential security boost to quantum cryptography, which has been marketed commercially for more than a decade. In principle, quantum cryptography offers “unconditional” security, guaranteed by the laws of physics. Actual quantum devices, however, are notoriously hard to control, and over the past decade, quantum cryptographic systems have repeatedly been hacked.

The interrogation technique creates a quantum cryptography protocol that, for the first time, would transmit a secret key while simultaneously proving that the quantum devices are preventing any potential information leak. Some version of this protocol could very well be implemented within the next five to 10 years, predicted Vidick and his former adviser at MIT, the theoretical computer scientist Scott Aaronson.

“It’s a new level of security that solves the shortcomings of traditional quantum cryptography,” Pironio said.

Spooky Action

In 1964, the Northern Irish physicist John Stewart Bell came up with a test to try to establish, once and for all, that the bafflingly counterintuitive principles of quantum physics are truly inherent properties of the universe — that the decades-long effort of Albert Einstein and other physicists to develop a more intuitive physics could never bear fruit.

Einstein was deeply disturbed by the randomness at the core of quantum physics — God “is not playing at dice,” he famously wrote to the physicist Max Born in 1926.

In 1935, Einstein, together with his colleagues Boris Podolsky and Nathan Rosen, described a strange consequence of this randomness, now called the EPR paradox (short for Einstein, Podolsky, Rosen). According to the laws of quantum physics, it is possible for two particles to interact briefly in such a way that their states become “entangled” as “EPR pairs.” Even if the particles then travel many light years away from each other, one particle somehow instantly seems to “know” the outcome of a measurement on the other particle: When asked the same question, it will give the same answer, even though quantum physics says that the first particle chose its answer randomly. Since the theory of special relativity forbids information from traveling faster than the speed of light, how does the second particle know the answer?

To Einstein, these “spooky actions at a distance” implied that quantum physics was an incomplete theory. “Quantum mechanics is certainly imposing,” he wrote to Born. “But an inner voice tells me that it is not yet the real thing.”

Over the remaining decades of his life, Einstein searched for a way that the two particles could use classical physics to come up with their answers — hidden variables that could explain the behavior of the particles without a need for randomness or spooky actions.

But in 1964, Bell realized that the EPR paradox could be used to devise an experiment that determines whether quantum physics or a local hidden-variables theory correctly explains the real world. Adapted five years later into a format called the CHSH game (after the researchers John Clauser, Michael Horne, Abner Shimony and Richard Holt), the test asks a system to prove its quantum nature by performing a feat that is impossible using only classical physics.

The CHSH game is a coordination game, in which two collaborating players — Bonnie and Clyde, say — are questioned in separate interrogation rooms. Their joint goal is to give either identical answers or different answers, depending on what questions the “detective” asks them. Neither player knows what question the detective is asking the other player.

If Bonnie and Clyde can use only classical physics, then no matter how many “hidden variables” they share, it turns out that the best they can do is decide on a story before they get separated and then stick to it, no matter what the detective asks them, a strategy that will win the game 75 percent of the time. But if Bonnie and Clyde share an EPR pair of entangled particles — picked up in a bank heist, perhaps — then they can exploit the spooky action at a distance to better coordinate their answers and win the game about 85.4 percent of the time.
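
Those two figures — 75 percent classically, about 85.4 percent with entanglement — are easy to reproduce. The sketch below (illustrative code of mine, not from the article) brute-forces every deterministic classical strategy for the CHSH game and compares the best of them with the quantum (Tsirelson) value cos²(π/8):

import itertools, math

# CHSH game: questions x, y are bits; the players win when (a XOR b) == (x AND y).
def wins(a, b, x, y):
    return (a ^ b) == (x & y)

best_classical = 0.0
for bonnie in itertools.product([0, 1], repeat=2):       # answers for x = 0, 1
    for clyde in itertools.product([0, 1], repeat=2):     # answers for y = 0, 1
        p = sum(wins(bonnie[x], clyde[y], x, y) for x in (0, 1) for y in (0, 1)) / 4
        best_classical = max(best_classical, p)

print("best classical win rate:", best_classical)          # 0.75
print("quantum win rate:", math.cos(math.pi / 8) ** 2)      # ~0.8536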

Bell’s test gave experimentalists a specific way to distinguish between quantum physics and any hidden-variables theory. Over the decades that followed, physicists, most notably Alain Aspect, currently at the École Polytechnique in Palaiseau, France, carried out this test repeatedly, in increasingly controlled settings. Almost every time, the outcome has been consistent with the predictions of quantum physics, not with hidden variables.

Aspect’s work “painted hidden variables into a corner,” Aaronson said. The experiments had a huge role, he said, in convincing people that the counterintuitive weirdness of quantum physics is here to stay.

If Einstein had known about the Bell test, Vazirani said, “he wouldn’t have wasted 30 years of his life looking for an alternative to quantum mechanics.” He simply would have convinced someone to do the experiment.

Read the whole article here.

Impossible Chemistry in Space

Combine the vastness of the universe with the probabilistic behavior of quantum mechanics and you get some rather odd chemical results. This includes the spontaneous creation of some complex organic molecules in interstellar space — previously believed to be far too inhospitable for all but the lowliest forms of matter.

From the New Scientist:

Quantum weirdness can generate a molecule in space that shouldn’t exist by the classic rules of chemistry. If interstellar space is really a kind of quantum chemistry lab, that might also account for a host of other organic molecules glimpsed in space.

Interstellar space should be too cold for most chemical reactions to occur, as the low temperature makes it tough for molecules drifting through space to acquire the energy needed to break their bonds. “There is a standard law that says as you lower the temperature, the rates of reactions should slow down,” says Dwayne Heard of the University of Leeds, UK.
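
The “standard law” Heard alludes to is the Arrhenius relation (textbook chemistry, not spelled out in the article):

k = A\, e^{-E_a / (R T)}

where E_a is the activation energy of the reaction, T the temperature and R the gas constant. As T falls, the exponential factor — the fraction of encounters energetic enough to clear the barrier — collapses towards zero, which is why cold interstellar clouds were long assumed to be chemically sluggish.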

Yet we know there are a host of complex organic molecules in space. Some reactions could occur when different molecules stick to the surface of a cosmic dust grain. This might give them enough time together to acquire the energy needed to react, which doesn’t happen when molecules drift past each other in space.

Not all reactions can be explained in this way, though. Last year astronomers discovered methoxy molecules – containing carbon, hydrogen and oxygen – in the Perseus molecular cloud, around 600 light years from Earth. But researchers couldn’t produce this molecule in the lab by allowing reactants to condense on dust grains, leaving a puzzle as to how it could have formed.

Molecular hang-out

Another route to methoxy is to combine a hydroxyl radical and methanol gas, both present in space. But this reaction requires hurdling a significant energy barrier – and the energy to do that simply isn’t available in the cold expanse of space.

Heard and his colleagues wondered if the answer lay in quantum mechanics: a process called quantum tunnelling might give the hydroxyl radical a small chance to cheat by digging through the barrier instead of going over it, they reasoned.

So, in another attempt to replicate the production of methoxy in space, the team chilled gaseous hydroxyl and methanol to 63 kelvin – and were able to produce methoxy.

The idea is that at low temperatures, the molecules slow down, increasing the likelihood of tunnelling. “At normal temperatures they just collide off each other, but when you go down in temperature they hang out together long enough,” says Heard.
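
A rough way to see why tunnelling takes over at low temperature (a standard rectangular-barrier estimate, not a calculation from the paper): the probability of a particle of mass m and energy E slipping through a barrier of height V_0 and width L falls off as

T \sim e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar}

The rate is exponentially sensitive to the barrier width and to the mass of the tunnelling species (light, hydrogen-bearing radicals such as hydroxyl do best), but it is not switched off by cooling — unlike the classical over-the-barrier route. Slower, colder molecules that linger together simply get more attempts at the barrier.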

Impossible chemistry

The team also found that the reaction occurred 50 times faster via quantum tunnelling than if it occurred normally at room temperature by hurdling the energy barrier. Empty space is much colder than 63 kelvin, but dust clouds near stars can reach this temperature, adds Heard.

“We’re showing there is organic chemistry in space of the type of reactions where it was assumed these just wouldn’t happen,” says Heard.

That means the chemistry of space may be richer than we had imagined. “There is maybe a suite of chemical reactions we hadn’t yet considered occurring in interstellar space,” agrees Helen Fraser of the University of Strathclyde, UK, who was not part of the team.

Read the entire article here.

Image: Amino-1-methoxy-4-methylbenzene, a molecule featuring the methoxy group recently detected in interstellar space. Courtesy of Wikipedia.

Uncertainty Strikes the Uncertainty Principle

Recent experiments at the University of Toronto show, for the first time, measurement disturbances smaller than those predicted by Werner Heisenberg’s fundamental law of quantum mechanics, the Uncertainty Principle.

From io9:

Heisenberg’s uncertainty principle is an integral component of quantum physics. At the quantum scale, standard physics starts to fall apart, replaced by a fuzzy, nebulous set of phenomena. Among all the weirdness observed at this microscopic scale, Heisenberg famously observed that the position and momentum of a particle cannot be simultaneously measured with any meaningful degree of precision. This led him to posit the uncertainty principle, the declaration that there’s only so much we can simultaneously know about a quantum system — namely, a particle’s momentum and position.

Now, by definition, the uncertainty principle describes a two-pronged process. First, there’s the precision of a measurement that needs to be considered, and second, the degree of uncertainty, or disturbance, that it must create. It’s this second aspect that quantum physicists refer to as the “measurement-disturbance relationship,” and it’s an area that scientists have not sufficiently explored or proven.

Up until this point, quantum physicists have been fairly confident in their ability to both predict and measure the degree of disturbances caused by a measurement. Conventional thinking is that a measurement will always cause a predictable and consistent disturbance — but as the study from Toronto suggests, this is not always the case. Not all measurements, it would seem, will cause the effect predicted by Heisenberg and the tidy equations that have followed his theory. Moreover, the resultant ambiguity is not always caused by the measurement itself.

The researchers, a team led by Lee Rozema and Aephraim Steinberg, experimentally observed a clear-cut violation of Heisenberg’s measurement-disturbance relationship. They did this by applying what they called a “weak measurement” to define a quantum system before and after it interacted with their measurement tools — not enough to disturb it, but enough to get a basic sense of a photon’s orientation.

Then, by establishing measurement deltas, and then applying stronger, more disruptive measurements, the team was able to determine that they were not disturbing the quantum system to the degree that the uncertainty principle predicted. And in fact, the disturbances were half of what would normally be expected.
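
For context, and stated only schematically since the excerpt does not spell it out: the familiar textbook inequality \sigma_x \sigma_p \ge \hbar/2 constrains how precisely a state can be prepared, whereas the Toronto experiment probed a measurement-disturbance relation of the kind Heisenberg originally described,

\epsilon(x)\, \eta(p) \gtrsim \frac{\hbar}{2},

where \epsilon(x) is the error of the position-like measurement and \eta(p) the disturbance it inflicts on the conjugate variable. It is this naive product relation that the weak-measurement data violate; a corrected inequality with additional cross terms, proposed by Masanao Ozawa in 2003, remains consistent with the results.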

Read the entire article after the jump.

Image: Werner Karl Heisenberg (1901–1976), German physicist and winner of the 1932 Nobel Prize in Physics. Courtesy of Wikipedia.

Something Out of Nothing

The debate on how the universe came to be rages on. Perhaps, however, we are a little closer to understanding why there is “something”, including us, rather than “nothing”.

From Scientific American:

Why is there something rather than nothing? This is one of those profound questions that is easy to ask but difficult to answer. For millennia humans simply said, “God did it”: a creator existed before the universe and brought it into existence out of nothing. But this just begs the question of what created God—and if God does not need a creator, logic dictates that neither does the universe. Science deals with natural (not supernatural) causes and, as such, has several ways of exploring where the “something” came from.

Multiple universes. There are many multiverse hypotheses predicted from mathematics and physics that show how our universe may have been born from another universe. For example, our universe may be just one of many bubble universes with varying laws of nature. Those universes with laws similar to ours will produce stars, some of which collapse into black holes and singularities that give birth to new universes—in a manner similar to the singularity that physicists believe gave rise to the big bang.

M-theory. In his and Leonard Mlodinow’s 2010 book, The Grand Design, Stephen Hawking embraces “M-theory” (an extension of string theory that includes 11 dimensions) as “the only candidate for a complete theory of the universe. If it is finite—and this has yet to be proved—it will be a model of a universe that creates itself.”

Quantum foam creation. The “nothing” of the vacuum of space actually consists of subatomic spacetime turbulence at extremely small distances measurable at the Planck scale—the length at which the structure of spacetime is dominated by quantum gravity. At this scale, the Heisenberg uncertainty principle allows energy to briefly decay into particles and antiparticles, thereby producing “something” from “nothing.”
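
The scale of this quantum “borrowing” is set by the energy-time uncertainty relation (a standard heuristic estimate, not part of the excerpt):

\Delta E \, \Delta t \gtrsim \frac{\hbar}{2}

A virtual electron-positron pair costs \Delta E \approx 2 \times 0.511 MeV \approx 1.02 MeV, so it can flicker into existence for at most about \Delta t \approx \hbar / (2\Delta E) \approx 3 \times 10^{-22} seconds before the energy books must be balanced.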

Nothing is unstable. In his new book, A Universe from Nothing, cosmologist Lawrence M. Krauss attempts to link quantum physics to Einstein’s general theory of relativity to explain the origin of a universe from nothing: “In quantum gravity, universes can, and indeed always will, spontaneously appear from nothing. Such universes need not be empty, but can have matter and radiation in them, as long as the total energy, including the negative energy associated with gravity [balancing the positive energy of matter], is zero.” Furthermore, “for the closed universes that might be created through such mechanisms to last for longer than infinitesimal times, something like inflation is necessary.” Observations show that the universe is in fact flat (there is just enough matter to slow its expansion but not to halt it), has zero total energy and underwent rapid inflation, or expansion, soon after the big bang, as described by inflationary cosmology. Krauss concludes: “Quantum gravity not only appears to allow universes to be created from nothing—meaning … absence of space and time—it may require them. ‘Nothing’—in this case no space, no time, no anything!—is unstable.”

Read the entire article after the jump.

Image: There’s Nothing Out There. Courtesy of Rolfe Kanefsky / Image Entertainment.

Spooky Action at a Distance Explained

From Scientific American:

Quantum entanglement is such a mainstay of modern physics that it is worth reflecting on how long it took to emerge. What began as a perceptive but vague insight by Albert Einstein languished for decades before becoming a branch of experimental physics and, increasingly, modern technology.

Einstein’s two most memorable phrases perfectly capture the weirdness of quantum mechanics. “I cannot believe that God plays dice with the universe” expressed his disbelief that randomness in quantum physics was genuine and impervious to any causal explanation. “Spooky action at a distance” referred to the fact that quantum physics seems to allow influences to travel faster than the speed of light. This was, of course, disturbing to Einstein, whose theory of relativity prohibited any such superluminal propagation.

These arguments were qualitative. They were targeted at the worldview offered by quantum theory rather than its predictive power. Niels Bohr is commonly seen as the patron saint of quantum physics, defending it against Einstein’s repeated onslaughts. He is usually said to be the ultimate winner in this battle of wits. However, Bohr’s writing was terribly obscure. He was known for saying “never express yourself more clearly than you are able to think,” a motto which he adhered to very closely. His arguments, like Einstein’s, were qualitative, verging on highly philosophical. The Einstein-Bohr dispute, although historically important, could not be settled experimentally—and the experiment is the ultimate judge of validity of any theoretical ideas in physics. For decades, the phenomenon was all but ignored.

All that changed with John Bell. In 1964 he understood how to convert the complaints about “dice-playing” and “spooky action at a distance” into a simple inequality involving measurements on two particles. The inequality is satisfied in a world where God does not play dice and there is no spooky action. The inequality is violated if the fates of the two particles are intertwined, so that if we measure a property of one of them, we immediately know the same property of the other one—no matter how far apart the particles are from each other. This state where particles behave like twin brothers is said to be entangled, a term introduced by Erwin Schrödinger.

Read the whole article here.

Chance as a Subjective or Objective Measure

From Rationally Speaking:

Stop me if you’ve heard this before: suppose I flip a coin, right now. I am not giving you any other information. What odds (or probability, if you prefer) do you assign that it will come up heads?

If you would happily say “Even” or “1 to 1” or “Fifty-fifty” or “probability 50%” — and you’re clear on WHY you would say this — then this post is not aimed at you, although it may pleasantly confirm your preexisting opinions as a Bayesian on probability. Bayesians, broadly, consider probability to be a measure of their state of knowledge about some proposition, so that different people with different knowledge may correctly quote different probabilities for the same proposition.

If you would say something along the lines of “The question is meaningless; probability only has meaning as the many-trials limit of frequency in a random experiment,” or perhaps “50%, but only given that a fair coin and fair flipping procedure is being used,” this post is aimed at you. I intend to try to talk you out of your Frequentist view; the view that probability exists out there and is an objective property of certain physical systems, which we humans, merely fallibly, measure.

My broader aim is therefore to argue that “chance” is always and everywhere subjective — a result of the limitations of minds — rather than objective in the sense of actually existing in the outside world.
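
A toy illustration of the Bayesian stance (my sketch, not the author’s): the probability you quote is a summary of your information, and it updates as evidence arrives — here via a Beta prior on the coin’s bias.

# Start ignorant of the coin's bias: Beta(1, 1), a uniform prior over biases.
alpha, beta = 1.0, 1.0
print("prior P(heads) =", alpha / (alpha + beta))        # 0.5 — "fifty-fifty" before any data

# Observe some flips and update the state of knowledge.
for flip in ["H", "H", "T", "H", "H", "H"]:
    if flip == "H":
        alpha += 1
    else:
        beta += 1

# Posterior mean: the fair odds given what this particular observer now knows.
print("posterior P(heads) =", alpha / (alpha + beta))    # 6/8 = 0.75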

Much more of this article here.

Image courtesy of Wikipedia.