Tag Archives: sustainability

Another Step Closer to Artificial Photosynthesis

Researchers from the University of Illinois at Chicago have constructed an artificial leaf that captures sunlight and uses it to convert carbon dioxide in the atmosphere to usable hydrocarbon fuel. Senior author on the study, Amin Salehi-Khojin, assistant professor of mechanical and industrial engineering, notes that “the new solar cell is not photovoltaic — it’s photosynthetic.” Using a combination of intricately engineered nano-membranes and unique combinations of catalytic molecules, the artificial leaf takes in sunlight and CO2 and produces syngas, or synthesis gas (a mix of hydrogen and carbon monoxide), at the cathode, and free oxygen and hydrogen ions at the anode.
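
For readers who want the chemistry, a simplified sketch of the half-reactions is below. It assumes the conventional CO2-to-CO reduction pathway; the study’s actual catalyst chemistry may differ in detail.

\[ \text{Cathode:}\quad \mathrm{CO_2 + 2H^+ + 2e^- \rightarrow CO + H_2O} \qquad\text{and}\qquad \mathrm{2H^+ + 2e^- \rightarrow H_2} \]
\[ \text{Anode:}\quad \mathrm{2H_2O \rightarrow O_2 + 4H^+ + 4e^-} \]

The CO and H2 leaving the cathode together make up the syngas, while the anode supplies the oxygen and hydrogen ions mentioned above.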

This is a remarkable breakthrough that holds real promise — not only producing energy in a sustainable way from renewable sources but also removing carbon dioxide from the atmosphere.

Read more about this pioneering work here.

Image courtesy of Google Search.

Farm in a Box

If you’ve read my blog for a while you undoubtedly know that I have a rather jaded view of tech startup culture — particularly Silicon Valley’s myopic obsession with discovering the next multi-billion dollar mobile-consumer-facing-peer-to-peer-gig-economy-service-sharing-buzzword-laden-dating-platform-with-integrated-messaging-and-travel-meta-search app.

So, here’s something refreshing and different: a startup focused on reimagining the production and distribution of fresh food. The company is called Freight Farms, and its product is a self-contained farm straight out of a box. Actually, the farm is contained inside a box — a standard, repurposed 40 ft long shipping container. Each Leafy Green Machine, as it is called, comes fully equipped with a vertically oriented growing environment, plant-optimized LED lighting, recirculating hydroponic plumbing and fingertip climate control.

Freight Farms may not (yet) make a significant impact on the converging and accelerating global crises of population growth, climate change, ecological destruction and natural resource depletion. But the company offers a sound solution to tackling the increasing demand for locally grown and sustainably produced food, especially as the world becomes increasingly urbanized.

Please check out Freight Farms and spread the word.

Image: Freight Farms. Courtesy: Freight Farms.

Perovskites

No, these are not a new form of luxury cut glass from Europe, but something much more significant. First discovered in the mid-1800s in Russia’s Ural Mountains, perovskite materials could lay the foundation for a significant improvement in the efficiency of solar power systems.

From Technology Review:

A new solar cell material has properties that might lead to solar cells more than twice as efficient as the best on the market today. An article this week in the journal Nature describes the materials—a modified form of a class of compounds called perovskites, which have a particular crystalline structure.

The researchers haven’t yet demonstrated a high efficiency solar cell with the material. But their work adds to a growing body of evidence suggesting perovskite materials could change the face of solar power. Researchers are making new perovskites using combinations of elements and molecules not seen in nature; many researchers see the materials as the next great hope for making solar power cheap enough to compete with fossil fuels.

Perovskite-based solar cells have been improving at a remarkable pace. It took a decade or more for the major solar cell materials used today—silicon and cadmium telluride—to reach efficiency levels that have been demonstrated with perovskites in just four years. The rapid success of the material has impressed even veteran solar researchers who have learned to be cautious about new materials after seeing many promising ones come to nothing (see “A Material that Could Make Solar Power ‘Dirt Cheap’”).

The perovskite material described in Nature has properties that could lead to solar cells that can convert over half of the energy in sunlight directly into electricity, says Andrew Rappe, co-director of Pennergy, a center for energy innovation at the University of Pennsylvania, and one of the new report’s authors. That’s more than twice as efficient as conventional solar cells. Such high efficiency would cut in half the number of solar cells needed to produce a given amount of power. Besides reducing the cost of solar panels, this would greatly reduce installation costs, which now account for most of the cost of a new solar system.

Unlike conventional solar cell materials, the new material doesn’t require an electric field to produce an electrical current. This reduces the amount of material needed and produces higher voltages, which can help increase power output, Rappe says. While other materials have been shown to produce current without the aid of an electric field, the new material is the first to also respond well to visible light, making it relevant for solar cells, he says.

The researchers also showed that it is relatively easy to modify the material so that it efficiently converts different wavelengths of light into electricity. It could be possible to form a solar cell with different layers, each designed for a specific part of the solar spectrum, something that could greatly improve efficiency compared to conventional solar cells (see “Ultra-Efficient Solar Power” and “Manipulating Light to Double Solar Power Output”).

Other solar cell experts note that while these properties are interesting, Rappe and his colleagues have a long way to go before they can produce viable solar cells. For one thing, the electrical current it produces so far is very low. Ramamoorthy Ramesh, a professor of materials science and engineering at Berkeley, says, “This is nice work, but really early stage. To make a solar cell, a lot of other things are needed.”

Perovskites remain a promising solar material. Michael McGehee, a materials science and engineering professor at Stanford University, recently wrote, “The fact that multiple teams are making such rapid progress suggests that the perovskites have extraordinary potential, and might elevate the solar cell industry to new heights.”

Read the entire article here.

Image: Perovskite mined in Magnet Cove, Arkansas. Courtesy of Wikimedia.

Ethical Meat and Idiotic Media

Lab-grown meat is now possible. But it is not available on an industrial scale to satisfy the human desire for burgers, steak and ribs. So while this does represent a breakthrough, it’s likely to be a while before the last cow or chicken or pig is slaughtered. Of course, the mainstream media picked up this important event and immediately labeled it with captivating headlines featuring the word “frankenburger”. Perhaps a well-intentioned lab will someday come up with an intelligent form of media organization.

From the New York Times (dot earth):

I first explored livestock-free approaches to keeping meat on menus in 2008 in a piece titled “Can People Have Meat and a Planet, Too?”

It’s been increasingly clear since then that there are both environmental and — obviously — ethical advantages to using technology to sustain omnivory on a crowding planet. This presumes humans will not all soon shift to a purely vegetarian lifestyle, even though there are signs of what you might call “peak meat” (consumption, that is) in prosperous societies (Mark Bittman wrote a nice piece on this). Given dietary trends as various cultures rise out of poverty, I would say it’s a safe bet meat will remain a favored food for decades to come.

Now non-farmed meat is back in the headlines, with a patty of in-vitro beef – widely dubbed a “frankenburger” — fried and served in London earlier today.

The beef was grown in a lab by a pioneer in this arena — Mark Post of Maastricht University in the Netherlands. My colleague Henry Fountain has reported the details in a fascinating news article. Here’s an excerpt followed by my thoughts on next steps in what I see as an important area of research and development:

According to the three people who ate it, the burger was dry and a bit lacking in flavor. One taster, Josh Schonwald, a Chicago-based author of a book on the future of food [link], said “the bite feels like a conventional hamburger” but that the meat tasted “like an animal-protein cake.”

But taste and texture were largely beside the point: The event, arranged by a public relations firm and broadcast live on the Web, was meant to make a case that so-called in-vitro, or cultured, meat deserves additional financing and research…

Dr. Post, one of a handful of scientists working in the field, said there was still much research to be done and that it would probably take 10 years or more before cultured meat was commercially viable. Reducing costs is one major issue — he estimated that if production could be scaled up, cultured beef made as this one burger was made would cost more than $30 a pound.

The two-year project to make the one burger, plus extra tissue for testing, cost $325,000. On Monday it was revealed that Sergey Brin, one of the founders of Google, paid for the project. Dr. Post said Mr. Brin got involved because “he basically shares the same concerns about the sustainability of meat production and animal welfare.”

The enormous potential environmental benefits of shifting meat production, where feasible, from farms to factories were estimated in “Environmental Impacts of Cultured Meat Production,” a 2011 study in Environmental Science and Technology.

Read the entire article here.

Image: Professor Mark Post holds the world’s first lab-grown hamburger. Courtesy of Reuters/David Parry / The Atlantic.

A Smarter Smart Grid

If you live somewhere rather toasty you know how painful your electricity bills can be during the summer months. So, wouldn’t it be good to have a system automatically find you the cheapest electricity when you need it most? Welcome to the artificially intelligent smarter smart grid.

From the New Scientist:

An era is coming in which artificially intelligent systems can manage your energy consumption to save you money and make the electricity grid even smarter

If you’re tired of keeping track of how much you’re paying for energy, try letting artificial intelligence do it for you. Several start-up companies aim to help people cut costs, flex their muscles as consumers to promote green energy, and usher in a more efficient energy grid – all by unleashing smart software on everyday electricity usage.

Several states in the US have deregulated energy markets, in which customers can choose between several energy providers competing for their business. But the different tariff plans, limited-time promotional rates and other products on offer can be confusing to the average consumer.

A new company called Lumator aims to cut through the morass and save consumers money in the process. Their software system, designed by researchers at Carnegie Mellon University in Pittsburgh, Pennsylvania, asks new customers to enter their energy preferences – how they want their energy generated, and the prices they are willing to pay. The software also gathers any available metering measurements, in addition to data on how the customer responds to emails about opportunities to switch energy provider.

A machine-learning system digests that information and scans the market for the most suitable electricity supply deal. As it becomes familiar with the customer’s habits it is programmed to automatically switch energy plans as the best deals become available, without interrupting supply.

“This ensures that customers aren’t taken advantage of by low introductory prices that drift upward over time, expecting customer inertia to prevent them from switching again as needed,” says Lumator’s founder and CEO Prashant Reddy.
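
As an illustration of the kind of rule such an agent might apply, here is a minimal Python sketch. The plan data, field names and thresholds are hypothetical stand-ins, not Lumator’s actual system.

# Hypothetical sketch of an automatic plan-switching rule.
from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    cents_per_kwh: float   # expected effective rate after any intro period
    renewable: bool

def pick_plan(plans, current, wants_renewable, min_saving_pct=5.0):
    # Honor the customer's generation preference first.
    candidates = [p for p in plans if p.renewable or not wants_renewable]
    best = min(candidates, key=lambda p: p.cents_per_kwh, default=current)
    saving = 100 * (current.cents_per_kwh - best.cents_per_kwh) / current.cents_per_kwh
    # Only switch when the saving is real, not just a teaser rate.
    return best if saving >= min_saving_pct else current

market = [Plan("GreenCo", 11.8, True), Plan("CheapWatt", 10.9, False)]
today = Plan("Incumbent", 12.6, False)
print(pick_plan(market, today, wants_renewable=True).name)   # -> GreenCo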

The goal is not only to save customers time and money – Lumator claims it can save people between $10 and $30 a month on their bills – but also to help introduce more renewable energy into the grid. Reddy says power companies have little idea whether or not their consumers want to get their energy from renewables. But by keeping customer preferences on file and automatically switching to a new service when those preferences are met, Reddy hopes renewable energy suppliers will see the demand more clearly.

A firm called Nest, based in Palo Alto, California, has another way to save people money. It makes Wi-Fi-enabled thermostats that integrate machine learning to understand users’ habits. Energy companies in southern California and Texas offer deals to customers if they allow Nest to make small adjustments to their thermostats when the supplier needs to reduce customer demand.

“The utility company gives us a call and says they’re going to need help tomorrow as they’re expecting a heavy load,” says Matt Rogers, one of Nest’s founders. “We provide about 5 megawatts of load shift, but each home has a personalised demand response. The entire programme is based on data collected by Nest.”

Rogers says that about 5000 Nest users have opted-in to such load-balancing programmes.

Read the entire article here.

Image courtesy of Treehugger.

Anti-Eco-Friendly Consumption

It should come as no surprise that those who deny the science of climate change and humanity’s impact on the environment would also shy away from purchasing products and services that are friendly to the environment.

A recent study shows how political persuasion sways the purchasing of light bulbs: conservatives are more likely to purchase incandescent bulbs, while moderates and liberals lean toward more eco-friendly bulbs.

Joe Barton, U.S. Representative from Texas, sums up the issue of light bulb choice quite neatly, “… it is about personal freedom”. All the while our children shake their heads in disbelief.

Presumably many climate change skeptics prefer to purchase items that are harmful to the environment, and also to humans, just to make a political statement. This might include continuing to purchase products containing dangerous levels of questionable chemicals hiding behind barely pronounceable acronyms: rBGH (recombinant Bovine Growth Hormone) in milk, BPA (Bisphenol A) in plastic utensils and bottles, KBrO3 (Potassium Bromate) in highly processed flour, the food preservative BHA (Butylated Hydroxyanisole), and Azodicarbonamide in dough.

Freedom truly does come at a cost.

From the Guardian:

Eco-friendly labels on energy-saving bulbs are a turn-off for conservative shoppers, a new study has found.

The findings, published this week in the Proceedings of the National Academy of Sciences, suggest that it could be counterproductive to advertise the environmental benefits of efficient bulbs in the US. This could make it even more difficult for America to adopt energy-saving technologies as a solution to climate change.

Consumers took their ideological beliefs with them when they went shopping, and conservatives switched off when they saw labels reading “protect the environment”, the researchers said.

The study looked at the choices of 210 consumers, about two-thirds of them women. All were briefed on the benefits of compact fluorescent (CFL) bulbs over old-fashioned incandescents.

When both bulbs were priced the same, shoppers across the political spectrum were uniformly inclined to choose CFL bulbs over incandescents, even those with environmental labels, the study found.

But when the fluorescent bulb cost more – $1.50 instead of $0.50 for an incandescent – the conservatives who reached for the CFL bulb chose the one without the eco-friendly label.

“The more moderate and conservative participants preferred to bear a long-term financial cost to avoid purchasing an item associated with valuing environmental protections,” the study said.
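
To put a rough number on that long-term cost, here is a back-of-envelope Python sketch. The bulb prices come from the study; the wattages, lifetimes, hours of use and electricity price are typical assumed values, not figures from the paper.

# Rough lifetime-cost comparison behind the "long-term financial cost" point.
price_kwh = 0.12      # $ per kWh, assumed
hours_per_day = 3.0   # hours of use per day, assumed

def annual_cost(bulb_price, watts, life_hours):
    hours_per_year = hours_per_day * 365
    energy = watts / 1000 * hours_per_year * price_kwh
    replacements = bulb_price * hours_per_year / life_hours
    return energy + replacements

incandescent = annual_cost(0.50, 60, 1000)   # ~$8.40 per year
cfl = annual_cost(1.50, 14, 8000)            # ~$2.00 per year
print(f"Incandescent: ${incandescent:.2f}/yr, CFL: ${cfl:.2f}/yr")

On these assumptions the $1 price premium is recovered within a couple of months, which is what makes the incandescent choice a long-term cost rather than a saving.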

The findings suggest the extreme political polarisation over environment and climate change had now expanded to energy-savings devices – which were once supported by right and left because of their money-saving potential.

“The research demonstrates how promoting the environment can negatively affect adoption of energy efficiency in the United States because of the political polarisation surrounding environmental issues,” the researchers said.

Earlier this year Harvard academic Theda Skocpol produced a paper tracking how climate change and the environment became a defining issue for conservatives, and for Republican-elected officials.

Conservative activists elevated opposition to the science behind climate change, and to action on climate change, to core beliefs, Skocpol wrote.

There was even a special place for incandescent bulbs. Republicans in Congress two years ago fought hard to repeal a law phasing out incandescent bulbs – even over the objections of manufacturers who had already switched their product lines to the new energy-saving technology.

Republicans at the time cast the battle of the bulb as an issue of liberty. “This is about more than just energy consumption. It is about personal freedom,” said Joe Barton, the Texas Republican behind the effort to keep the outdated bulbs burning.

Read the entire article following the jump.

Image courtesy of Housecraft.

Cheap Hydrogen

Researchers at the University of Glasgow, Scotland, have discovered an alternative and possibly more efficient way to make hydrogen at industrial scale. Typically, hydrogen is produced by reacting high-temperature steam with methane from natural gas. A small share, less than five percent of annual production, is also made through electrolysis — passing an electric current through water.

This new method of production appears to be less costly, less dangerous and also more environmentally sound.

From the Independent:

Scientists have harnessed the principles of photosynthesis to develop a new way of producing hydrogen – in a breakthrough that offers a possible solution to global energy problems.

The researchers claim the development could help unlock the potential of hydrogen as a clean, cheap and reliable power source.

Unlike fossil fuels, hydrogen can be burned to produce energy without producing emissions. It is also the most abundant element in the universe.

Hydrogen gas is produced by splitting water into its constituent elements – hydrogen and oxygen. But scientists have been struggling for decades to find a way of extracting these elements at different times, which would make the process more energy-efficient and reduce the risk of dangerous explosions.

In a paper published today in the journal Nature Chemistry, scientists at the University of Glasgow outline how they have managed to replicate the way plants use the sun’s energy to split water molecules into hydrogen and oxygen at separate times and at separate physical locations.

Experts heralded the “important” discovery yesterday, saying it could make hydrogen a more practicable source of green energy.

Professor Xile Hu, director of the Laboratory of Inorganic Synthesis and Catalysis at the Swiss Federal Institute of Technology in Lausanne, said: “This work provides an important demonstration of the principle of separating hydrogen and oxygen production in electrolysis and is very original. Of course, further developments are needed to improve the capacity of the system, energy efficiency, lifetime and so on. But this research already offers potential and promise and can help in making the storage of green energy cheaper.”

Until now, scientists have separated hydrogen and oxygen atoms using electrolysis, which involves running electricity through water. This is energy-intensive and potentially explosive, because the oxygen and hydrogen are removed at the same time.

But in the new variation of electrolysis developed at the University of Glasgow, hydrogen and oxygen are produced from the water at different times, thanks to what researchers call an “electron-coupled proton buffer”. This acts to collect and store hydrogen while the current runs through the water, meaning that in the first instance only oxygen is released. The hydrogen can then be released when convenient.
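
Schematically, and assuming the standard water-splitting half-reactions (the specific buffer chemistry is not spelled out in the article), the decoupling works in two stages, with the buffer (call it B) soaking up the protons and electrons that would otherwise become hydrogen straight away:

\[ \text{Stage 1 (oxygen only):}\quad \mathrm{2H_2O \rightarrow O_2 + 4H^+ + 4e^-}\ \text{(anode)} \qquad \mathrm{B + H^+ + e^- \rightarrow HB}\ \text{(cathode)} \]
\[ \text{Stage 2 (hydrogen on demand):}\quad \mathrm{2\,HB \rightarrow 2\,B + H_2} \]

Because the two gases never appear at the same time, the risk of an explosive mixture is designed out rather than managed.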

Because pure hydrogen does not occur naturally, it takes energy to make it. This new version of electrolysis takes longer, but is safer and uses less energy per minute, making it easier to rely on renewable energy sources for the electricity needed to separate the atoms.

Dr Mark Symes, the report’s co-author, said: “What we have developed is a system for producing hydrogen on an industrial scale much more cheaply and safely than is currently possible. Currently much of the industrial production of hydrogen relies on reformation of fossil fuels, but if the electricity is provided via solar, wind or wave sources we can create an almost totally clean source of power.”

Professor Lee Cronin, the other author of the research, said: “The existing gas infrastructure which brings gas to homes across the country could just as easily carry hydrogen as it currently does methane. If we were to use renewable power to generate hydrogen using the cheaper, more efficient decoupled process we’ve created, the country could switch to hydrogen to generate our electrical power at home. It would also allow us to significantly reduce the country’s carbon footprint.”

Nathan Lewis, a chemistry professor at the California Institute of Technology and a green energy expert, said: “This seems like an interesting scientific demonstration that may possibly address one of the problems involved with water electrolysis, which remains a relatively expensive method of producing hydrogen.”

Read the entire article following the jump.

Farmscrapers

No, the drawing is not a construction from the mind of sci-fi illustrator extraordinaire Michael Whelan. This is reality. Or, to be more precise, an architectural rendering of buildings to come — in China, of course.

From the Independent:

A French architecture firm has unveiled their new ambitious ‘farmscraper’ project – six towering structures which promise to change the way that we think about green living.

Vincent Callebaut Architects’ innovative Asian Cairns was planned specifically for Chinese city Shenzhen in response to the growing population, increasing CO2 emissions and urban development.

The structures will consist of a series of pebble-shaped levels – each connected by a central spinal column – which will contain residential areas, offices, and leisure spaces.

Sustainability is key to the innovative project – wind turbines will cover the roof of each tower, water recycling systems will be in place to recycle waste water, and solar panels will be installed on the buildings, providing renewable energy. The structures will also have gardens on the exterior, further adding to the project’s green credentials.

Vincent Callebaut, the Belgian architect behind the firm, is well-known for his ambitious, eco-friendly projects, winning many awards over the years.

His self-sufficient amphibious city Lilypad – ‘a floating ecopolis for climate refugees’ – is perhaps his most famous design. The model has been proposed as a long-term solution to rising water levels, and successfully meets the four challenges of climate, biodiversity, water, and health, that the OECD laid out in 2008.

Vincent Callebaut Architects said: “It is a prototype to build a green, dense, smart city connected by technology and eco-designed from biotechnologies.”

Read the entire article and see more illustrations after the jump.

Image: “Farmscrapers” take eco-friendly architecture to dizzying heights in China. Courtesy of Vincent Callebaut Architects / Independent.

Light From Gravity

Often the best creative ideas and the most elegant solutions are the simplest. GravityLight is an example of this type of innovation. Here’s the problem: replace damaging and expensive kerosene fuel lamps in Africa with a less harmful and cheaper alternative. And, the solution:

[Video: YouTube ID 1dd9NIlhvlI]

From Ars Technica:

A London design consultancy has developed a cheap, clean, and safer alternative to the kerosene lamp. Kerosene burning lamps are thought to be used by over a billion people in developing nations, often in remote rural parts where electricity is either prohibitively expensive or simply unavailable. Kerosene’s potential replacement, GravityLight, is powered by gravity without the need of a battery—it’s also seen by its creators as a superior alternative to solar-powered lamps.

Kerosene lamps are problematic in three ways: they release pollutants which can contribute to respiratory disease; they pose a fire risk; and, thanks to the ongoing need to buy kerosene fuel, they are expensive to run. Research out of Brown University from July of last year called kerosene lamps a “significant contributor to respiratory diseases, which kill over 1.5 million people every year” in developing countries. The same paper found that kerosene lamps were responsible for 70 percent of fires (which cause 300,000 deaths every year) and 80 percent of burns. The World Bank has compared the indoor use of a kerosene lamp with smoking two packs of cigarettes per day.

The economics of the kerosene lamps are nearly as problematic, with the fuel costing many rural families a significant proportion of their money. The designers of the GravityLight say 10 to 20 percent of household income is typical, and they describe kerosene as a poverty trap, locking people into a “permanent state of subsistence living.” Considering that the median rural price of kerosene in Tanzania, Mali, Ghana, Kenya, and Senegal is $1.30 per liter, and the average rural income in Tanzania is under $9 per month, the designers’ figures seem depressingly plausible.

Approached by the charity Solar Aid to design a solar-powered LED alternative, London design consultancy Therefore shifted the emphasis away from solar, which requires expensive batteries that degrade over time. The company’s answer is both more simple and more radical: an LED lamp driven by a bag of sand, earth, or stones, pulled toward the Earth by gravity.

It takes only seconds to hoist the bag into place, after which the lamp provides up to half an hour of ambient light, or about 18 minutes of brighter task lighting. Though it isn’t clear quite how much light the GravityLight emits, its makers insist it is more than a kerosene lamp. Also unclear are the precise inner workings of the device, though clearly the weighted bag pulls a cord, driving an inner mechanism with a low-powered dynamo, with the aid of some robust plastic gearing. Talking to Ars by telephone, Therefore’s Jim Fullalove was loath to divulge details, but did reveal the gearing took the kinetic energy from a weighted bag descending at a rate of a millimeter per second to power a dynamo spinning at 2000rpm.
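
Those figures allow a quick back-of-envelope check in Python. The descent rate comes from the article; the bag mass and drop height are illustrative assumptions.

# Back-of-envelope power and runtime for a gravity-driven lamp.
g = 9.81          # m/s^2
mass = 12.0       # kg of sand or stones in the bag, assumed
drop = 1.8        # m of available drop, assumed
speed = 0.001     # m/s, the ~1 mm per second quoted in the article

power_w = mass * g * speed            # mechanical power before losses
runtime_min = (drop / speed) / 60.0   # how long one hoist of the bag lasts

print(f"Mechanical power: {power_w:.2f} W")        # ~0.12 W
print(f"Runtime per hoist: {runtime_min:.0f} min") # ~30 minutes

A tenth of a watt is enough to run a small high-efficiency LED at ambient-light levels, and the assumed 1.8 m drop reproduces the half-hour runtime quoted above.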

Read more about GravityLight after the jump.

Video courtesy of GravityLight.

Fusion and the Z Machine

The quest to tap fusion as an energy source here on Earth continues to inch forward with some promising new developments. Of course, we mean nuclear fusion, the kind that powers our sun, not the now debunked “cold fusion” supposedly demonstrated in a test tube in the late 1980s.

From Wired:

In the high-stakes race to realize fusion energy, a smaller lab may be putting the squeeze on the big boys. Worldwide efforts to harness fusion—the power source of the sun and stars—for energy on Earth currently focus on two multibillion dollar facilities: the ITER fusion reactor in France and the National Ignition Facility (NIF) in California. But other, cheaper approaches exist—and one of them may have a chance to be the first to reach “break-even,” a key milestone in which a process produces more energy than needed to trigger the fusion reaction.

Researchers at the Sandia National Laboratory in Albuquerque, New Mexico, will announce in a Physical Review Letters (PRL) paper accepted for publication that their process, known as magnetized liner inertial fusion (MagLIF) and first proposed 2 years ago, has passed the first of three tests, putting it on track for an attempt at the coveted break-even. Tests of the remaining components of the process will continue next year, and the team expects to take its first shot at fusion before the end of 2013.

Fusion reactors heat and squeeze a plasma—an ionized gas—composed of the hydrogen isotopes deuterium and tritium, compressing the isotopes until their nuclei overcome their mutual repulsion and fuse together. Out of this pressure-cooker emerge helium nuclei, neutrons, and a lot of energy. The temperature required for fusion is more than 100 million°C—so you have to put a lot of energy in before you start to get anything out. ITER and NIF are planning to attack this problem in different ways. ITER, which will be finished in 2019 or 2020, will attempt fusion by containing a plasma with enormous magnetic fields and heating it with particle beams and radio waves. NIF, in contrast, takes a tiny capsule filled with hydrogen fuel and crushes it with a powerful laser pulse. NIF has been operating for a few years but has yet to achieve break-even.

Sandia’s MagLIF technique is similar to NIF’s in that it rapidly crushes its fuel—a process known as inertial confinement fusion. But to do it, MagLIF uses a magnetic pulse rather than lasers. The target in MagLIF is a tiny cylinder about 7 millimeters in diameter; it’s made of beryllium and filled with deuterium and tritium. The cylinder, known as a liner, is connected to Sandia’s vast electrical pulse generator (called the Z machine), which can deliver 26 million amps in a pulse lasting milliseconds or less. That much current passing down the walls of the cylinder creates a magnetic field that exerts an inward force on the liner’s walls, instantly crushing it—and compressing and heating the fusion fuel.
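
A rough sense of the crushing force follows from the figures quoted above (26 million amps and a roughly 7 mm diameter liner) and textbook formulas for the field around a current-carrying cylinder and the resulting magnetic pressure. The numbers below are an order-of-magnitude estimate, not Sandia’s own.

import math

mu0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
current = 26e6             # A, from the article
radius = 3.5e-3            # m, half of the ~7 mm liner diameter

b_field = mu0 * current / (2 * math.pi * radius)   # field at the liner surface
pressure = b_field**2 / (2 * mu0)                  # magnetic pressure in Pa

print(f"Surface field: {b_field:.0f} T")              # ~1,500 tesla
print(f"Magnetic pressure: {pressure / 1e9:.0f} GPa") # ~900 GPa, millions of atmospheres

Pressures of that order are what implode the liner and compress the fuel toward fusion conditions.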

Researchers have known about this technique of crushing a liner to heat the fusion fuel for some time. But the MagLIF-Z machine setup on its own didn’t produce quite enough heat; something extra was needed to make the process capable of reaching break-even. Sandia researcher Steve Slutz led a team that investigated various enhancements through computer simulations of the process. In a paper published in Physics of Plasmas in 2010, the team predicted that break-even could be reached with three enhancements.

First, they needed to apply the current pulse much more quickly, in just 100 nanoseconds, to increase the implosion velocity. They would also preheat the hydrogen fuel inside the liner with a laser pulse just before the Z machine kicks in. And finally, they would position two electrical coils around the liner, one at each end. These coils produce a magnetic field that links the two coils, wrapping the liner in a magnetic blanket. The magnetic blanket prevents charged particles, such as electrons and helium nuclei, from escaping and cooling the plasma—so the temperature stays hot.

Sandia plasma physicist Ryan McBride is leading the effort to see if the simulations are correct. The first item on the list is testing the rapid compression of the liner. One critical parameter is the thickness of the liner wall: The thinner the wall, the faster it will be accelerated by the magnetic pulse. But the wall material also starts to evaporate away during the pulse, and if it breaks up too early, it will spoil the compression. On the other hand, if the wall is too thick, it won’t reach a high enough velocity. “There’s a sweet spot in the middle where it stays intact and you still get a pretty good implosion velocity,” McBride says.

To test the predicted sweet spot, McBride and his team set up an elaborate imaging system that involved blasting a sample of manganese with a high-powered laser (actually a NIF prototype moved to Sandia) to produce x-rays. By shining the x-rays through the liner at various stages in its implosion, the researchers could image what was going on. They found that at the sweet-spot thickness, the liner held its shape right through the implosion. “It performed as predicted,” McBride says. The team aims to test the other two enhancements—the laser preheating and the magnetic blanket—in the coming year, and then put it all together to take a shot at break-even before the end of 2013.

Read the entire article after the jump.

Image: The Z Pulsed Power Facility produces tremendous energy when it fires. Courtesy of Sandia National Laboratory.

The 10,000 Year Clock

Aside from the ubiquitous plastic grocery bag, will any human-made artifact last 10,000 years? Before you answer, let’s qualify the question by mandating that the artifact have some long-term value. That would seem to eliminate plastic bags, plastic toys embedded in fast-food meals, and DVDs of reality “stars” ripped from YouTube. What does that leave? Most human-made products consisting of metals or biodegradable components, such as paper and wood, will rust, rot or break down in 20-300 years. Even some plastics left exposed to sun and air will break down within a thousand years. Of course, buried deep in a landfill, plastic containers, styrofoam cups and throwaway diapers may remain with us for tens or hundreds of thousands of years.

Archaeological excavations show us that artifacts made of glass and ceramic would fit the bill — lasting well into the year 12012 and beyond. But in the majority of cases we unearth only fragments of things.

But what if some ingenious humans could build something that would still be around 10,000 years from now? Better still, build something that will still function as designed 10,000 years from now. This would represent an extraordinary feat of contemporary design and engineering. And, more importantly it would provide a powerful story for countless generations beginning with ours.

So, enter Danny Hillis and the Clock of the Long Now (also known as the Millennium Clock or the 10,000 Year Clock). Danny Hillis is an inventor, scientist, and computer designer. He pioneered the concept of massively parallel computers.

In Hillis’ own words:

Ten thousand years – the life span I hope for the clock – is about as long as the history of human technology. We have fragments of pots that old. Geologically, it’s a blink of an eye. When you start thinking about building something that lasts that long, the real problem is not decay and corrosion, or even the power source. The real problem is people. If something becomes unimportant to people, it gets scrapped for parts; if it becomes important, it turns into a symbol and must eventually be destroyed. The only way to survive over the long run is to be made of materials large and worthless, like Stonehenge and the Pyramids, or to become lost. The Dead Sea Scrolls managed to survive by remaining lost for a couple millennia. Now that they’ve been located and preserved in a museum, they’re probably doomed. I give them two centuries – tops. The fate of really old things leads me to think that the clock should be copied and hidden.

Plans call for the 200-foot-tall 10,000 Year Clock to be installed inside a mountain in remote west Texas, with a second location in remote eastern Nevada. Design and engineering work on the clock, and preparation of the Clock’s Texas home, are underway.

For more on the 10,000 Year Clock jump to the Long Now Foundation, here.

More from Rationally Speaking:

I recently read Brian Hayes’ wonderful collection of mathematically oriented essays called Group Theory In The Bedroom, and Other Mathematical Diversions. Not surprisingly, the book contained plenty of philosophical musings too. In one of the essays, called “Clock of Ages,” Hayes describes the intricacies of clock building and he provides some interesting historical fodder.

For instance, we learn that in the sixteenth century Conrad Dasypodius, a Swiss mathematician, could have chosen to restore the old Clock of the Three Kings in Strasbourg Cathedral. Dasypodius, however, preferred to build a new clock of his own rather than maintain an old one. Over two centuries later, Jean-Baptiste Schwilgue was asked to repair the clock built by Dasypodius, but he decided to build a new and better clock which would last for 10,000 years.

Did you know that a large-scale project is underway to build another clock that will be able to run with minimal maintenance and interruption for ten millennia? It’s called The 10,000 Year Clock and its construction is sponsored by The Long Now Foundation. The 10,000 Year Clock is, however, being built for more than just its precision and durability. If the creators’ intentions are realized, then the clock will serve as a symbol to encourage long-term thinking about the needs and claims of future generations. Of course, if all goes to plan, our future descendants will be left to maintain it too. The interesting question is: will they want to?

If history is any indicator, then I think you know the answer. As Hayes puts it: “The fact is, winding and dusting and fixing somebody else’s old clock is boring. Building a brand-new clock of your own is much more fun, especially if you can pretend that it’s going to inspire awe and wonder for the ages to come. So why not have the fun now and let the future generations do the boring bit.” I think Hayes is right, it seems humans are, by nature, builders and not maintainers.

Projects like The 10,000 Year Clock are often undertaken with the noblest of environmental intentions, but the old proverb is relevant here: the road to hell is paved with good intentions. What I find troubling, then, is that much of the environmental do-goodery in the world may actually be making things worse. It’s often nothing more than a form of conspicuous consumption, which is a term coined by the economist and sociologist Thorstein Veblen. When it pertains specifically to “green” purchases, I like to call it being conspicuously environmental. Let’s use cars as an example. Obviously it depends on how the calculations are processed, but in many instances keeping and maintaining an old clunker is more environmentally friendly than is buying a new hybrid. I can’t help but think that the same must be true of building new clocks.

In his book, The Conundrum, David Owen writes: “How appealing would ‘green’ seem if it meant less innovation and fewer cool gadgets — not more?” Not very, although I suppose that was meant to be a rhetorical question. I enjoy cool gadgets as much as the next person, but it’s delusional to believe that conspicuous consumption is somehow a gift to the environment.

Using insights from evolutionary psychology and signaling theory, I think there is also another issue at play here. Buying conspicuously environmental goods, like a Prius, sends a signal to others that one cares about the environment. But if it’s truly the environment (and not signaling) that one is worried about, then surely less consumption must be better than more. The homeless person ironically has a lesser environmental impact than your average yuppie, yet he is rarely recognized as an environmental hero. Using this logic I can’t help but conclude that killing yourself might just be the most environmentally friendly act of all time (if it wasn’t blatantly obvious, this is a joke). The lesson here is that we shouldn’t confuse smug signaling with actually helping.

Read the entire article after the jump.

Image: Prototype of the 10,000 Year Clock. Courtesy of the Long Now Foundation / Science Museum of London.

Skyscrapers A La Mode

Since 2006, Evolo architecture magazine has run a competition for architects to bring life to their most fantastic skyscraper designs. The finalists of the 2012 competition presented some stunning ideas, topped by the winner, Himalaya Water Tower, from Zhi Zheng, Hongchuan Zhao and Dongbai Song of China.

From Evolo:

Housed within 55,000 glaciers in the Himalaya Mountains sits 40 percent of the world’s fresh water. The massive ice sheets are melting at a faster-than-ever pace due to climate change, posing possible dire consequences for the continent of Asia and the entire world, and especially for the villages and cities that sit on the seven rivers that are fed by the Himalayas’ runoff and now face erratic flooding or drought.

The “Himalaya Water Tower” is a skyscraper located high in the mountain range that serves to store water and helps regulate its dispersal to the land below as the mountains’ natural supplies dry up. The skyscraper, which can be replicated en masse, will collect water in the rainy season, purify it, freeze it into ice and store it for future use. The water distribution schedule will evolve with the needs of residents below; while it can be used to help in times of current drought, it’s also meant to store plentiful water for future generations.

Follow the other notable finalists at Evolo magazine after the jump.

Engineering the Ultimate Solar Power Collector: The Leaf

From Cosmic Log:

Researchers have been trying for decades to improve upon Mother Nature’s favorite solar-power trick — photosynthesis — but now they finally think they see the sunlight at the end of the tunnel.

“We now understand photosynthesis much better than we did 20 years ago,” said Richard Cogdell, a botanist at the University of Glasgow who has been doing research on bacterial photosynthesis for more than 30 years. He and three colleagues discussed their efforts to tweak the process that powers the world’s plant life today in Vancouver, Canada, during the annual meeting of the American Association for the Advancement of Science.

The researchers are taking different approaches to the challenge, but what they have in common is their search for ways to get something extra out of the biochemical process that uses sunlight to turn carbon dioxide and water into sugar and oxygen. “You can really view photosynthesis as an assembly line with about 168 steps,” said Steve Long, head of the University of Illinois’ Photosynthesis and Atmospheric Change Laboratory.

Revving up Rubisco
Howard Griffiths, a plant physiologist at the University of Cambridge, just wants to make improvements in one section of that assembly line. His research focuses on ways to get more power out of the part of the process driven by an enzyme called Rubisco. He said he’s trying to do what many auto mechanics have done to make their engines run more efficiently: “You turbocharge it.”

Some plants, such as sugar cane and corn, already have a turbocharged Rubisco engine, thanks to a molecular pathway known as C4. Geneticists believe the C4 pathway started playing a significant role in plant physiology in just the past 10 million years or so. Now Griffiths is looking into strategies to add the C4 turbocharger to rice, which ranks among the world’s most widely planted staple crops.

The new cellular machinery might be packaged in a micro-compartment that operates within the plant cell. That’s the way biochemical turbochargers work in algae and cyanobacteria. Griffiths and his colleagues are looking at ways to create similar micro-compartments for higher plants. The payoff would come in the form of more efficient carbon dioxide conversion, with higher crop productivity as a result. “For a given amount of carbon gain, the plant uses less water,” Griffiths said.

Read the entire article here.

Image courtesy of Kumaravel via Flickr, Creative Commons.

Rechargeable Nanotube-Based Solar Energy Storage

From Ars Technica:

Since the 1970s, chemists have worked on storing solar energy in molecules that change state in response to light. These photoactive molecules could be the ideal solar fuel, as the right material should be transportable, affordable, and rechargeable. Unfortunately, scientists haven’t had much success.

One of the best examples in recent years, tetracarbonyl-diruthenium fulvalene, requires the use of ruthenium, which is rare and expensive. Furthermore, the ruthenium compound has a volumetric energy density (watt-hours per liter) that is several times smaller than that of a standard lithium-ion battery.

Alexie Kolpak and Jeffrey Grossman from the Massachusetts Institute of Technology propose a new type of solar thermal fuel that would be affordable, rechargeable, thermally stable, and more energy-dense than lithium-ion batteries. Their proposed design combines an organic photoactive molecule, azobenzene, with the ever-popular carbon nanotube.

Before we get into the details of their proposal, we’ll quickly go over how photoactive molecules store solar energy. When a photoactive molecule absorbs sunlight, it undergoes a conformational change, moving from the ground energy state into a higher energy state. The higher energy state is metastable (stable for the moment, but highly susceptible to energy loss), so a trigger—voltage, heat, light, etc.—will cause the molecule to fall back to the ground state. The energy difference between the higher energy state and the ground state (termed ΔH) is then discharged. A useful photoactive molecule will be able to go through numerous cycles of charging and discharging.

The challenge in making a solar thermal fuel is finding a material that will have both a large ΔH and large activation energy. The two factors are not always compatible. To have a large ΔH, you want a big energy difference between the ground and higher energy state. But you don’t want the higher energy state to be too energetic, as it would be unstable. Instability means that the fuel will have a small activation energy and be prone to discharging its stored energy too easily.
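
The activation energy matters because it sets how long the charged state survives at room temperature. A small Python sketch using a first-order Arrhenius decay, with an assumed, typical attempt frequency rather than a number from the paper, shows how sharply the storage lifetime depends on it.

import math

kB = 8.617e-5   # Boltzmann constant, eV/K
A = 1e13        # attempt frequency in 1/s, a typical assumed prefactor
T = 300.0       # K, roughly room temperature

def half_life_hours(ea_ev):
    # First-order thermal decay of the charged (metastable) state.
    rate = A * math.exp(-ea_ev / (kB * T))
    return math.log(2) / rate / 3600.0

for ea in (0.9, 1.0, 1.2):   # candidate activation energies in eV, assumed values
    print(f"Ea = {ea} eV -> half-life ~ {half_life_hours(ea):.2g} hours")

Raising the barrier from 0.9 to 1.2 eV stretches the half-life from minutes to months, which is why a useful fuel needs a high barrier as well as a large ΔH.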

Kolpak and Grossman managed to find the right balance between ΔH and activation energy when they examined computational models of azobenzene (azo) bound to carbon nanotubes (CNT) in azo/CNT nanostructures.

More from theSource here.

Green Bootleggers and Baptists

Bjørn Lomborg for Project Syndicate:

In May, the United Nations’ Intergovernmental Panel on Climate Change made media waves with a new report on renewable energy. As in the past, the IPCC first issued a short summary; only later would it reveal all of the data. So it was left up to the IPCC’s spin-doctors to present the take-home message for journalists.

The first line of the IPCC’s press release declared, “Close to 80% of the world’s energy supply could be met by renewables by mid-century if backed by the right enabling public policies.” That story was repeated by media organizations worldwide.

Last month, the IPCC released the full report, together with the data behind this startlingly optimistic claim. Only then did it emerge that it was based solely on the most optimistic of 164 modeling scenarios that researchers investigated. And this single scenario stemmed from a single study that was traced back to a report by the environmental organization Greenpeace. The author of that report – a Greenpeace staff member – was one of the IPCC’s lead authors.

The claim rested on the assumption of a large reduction in global energy use. Given the number of people climbing out of poverty in China and India, that is a deeply implausible scenario.

When the IPCC first made the claim, global-warming activists and renewable-energy companies cheered. “The report clearly demonstrates that renewable technologies could supply the world with more energy than it would ever need,” boasted Steve Sawyer, Secretary-General of the Global Wind Energy Council.

This sort of behavior – with activists and big energy companies uniting to applaud anything that suggests a need for increased subsidies to alternative energy – was famously captured by the so-called “bootleggers and Baptists” theory of politics.

The theory grew out of the experience of the southern United States, where many jurisdictions required stores to close on Sunday, thus preventing the sale of alcohol. The regulation was supported by religious groups for moral reasons, but also by bootleggers, because they had the market to themselves on Sundays. Politicians would adopt the Baptists’ pious rhetoric, while quietly taking campaign contributions from the criminals.

Of course, today’s climate-change “bootleggers” are not engaged in any illegal behavior. But the self-interest of energy companies, biofuel producers, insurance firms, lobbyists, and others in supporting “green” policies is a point that is often missed.

Indeed, the “bootleggers and Baptists” theory helps to account for other developments in global warming policy over the past decade or so. For example, the Kyoto Protocol would have cost trillions of dollars, but would have achieved a practically indiscernible difference in stemming the rise in global temperature. Yet activists claimed that there was a moral obligation to cut carbon-dioxide emissions, and were cheered on by businesses that stood to gain.

More from theSource here.

Jevons Paradox: Energy Efficiency Increases Consumption?

Energy efficiency sounds simple, but it’s rather difficult to measure. Sure, when you purchase a shiny new, more energy-efficient washing machine compared with your previous model, you’re making a personal dent in energy consumption. But what if, in aggregate, overall consumption increases because more people want that energy-efficient model? In a nutshell, that’s Jevons Paradox, named after the 19th-century British economist William Stanley Jevons. He observed that while the steam engine used coal more efficiently, it also stimulated so much economic growth that coal consumption actually increased. Thus, Jevons argued that improvements in fuel efficiency tend to increase, rather than decrease, fuel use.

John Tierney over at the New York Times brings Jevons into the 21st century and discovers that the issues remain the same.

From the New York Times:

For the sake of a cleaner planet, should Americans wear dirtier clothes?

This is not a simple question, but then, nothing about dirty laundry is simple anymore. We’ve come far since the carefree days of 1996, when Consumer Reports tested some midpriced top-loaders and reported that “any washing machine will get clothes clean.”

In this year’s report, no top-loading machine got top marks for cleaning. The best performers were front-loaders costing on average more than $1,000. Even after adjusting for inflation, that’s still $350 more than the top-loaders of 1996.

What happened to yesterday’s top-loaders? To comply with federal energy-efficiency requirements, manufacturers made changes like reducing the quantity of hot water. The result was a bunch of what Consumer Reports called “washday wash-outs,” which left some clothes “nearly as stained after washing as they were when we put them in.”

Now, you might think that dirtier clothes are a small price to pay to save the planet. Energy-efficiency standards have been embraced by politicians of both parties as one of the easiest ways to combat global warming. Making appliances, cars, buildings and factories more efficient is called the “low-hanging fruit” of strategies to cut greenhouse emissions.

But a growing number of economists say that the environmental benefits of energy efficiency have been oversold. Paradoxically, there could even be more emissions as a result of some improvements in energy efficiency, these economists say.

The problem is known as the energy rebound effect. While there’s no doubt that fuel-efficient cars burn less gasoline per mile, the lower cost at the pump tends to encourage extra driving. There’s also an indirect rebound effect as drivers use the money they save on gasoline to buy other things that produce greenhouse emissions, like new electronic gadgets or vacation trips on fuel-burning planes.
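
The direct rebound effect is easy to sketch with a toy model in Python; all the numbers below (annual mileage, gasoline price, elasticity of driving demand) are illustrative assumptions, not figures from the article.

# Toy model of the direct rebound effect for driving.
baseline_miles = 12000.0   # miles per year before the upgrade, assumed
gas_price = 3.50           # $ per gallon, assumed
mpg_old, mpg_new = 25.0, 35.0
elasticity = -0.2          # assumed price elasticity of driving demand

cost_old = gas_price / mpg_old   # $ per mile before
cost_new = gas_price / mpg_new   # $ per mile after
miles_new = baseline_miles * (cost_new / cost_old) ** elasticity

fuel_old = baseline_miles / mpg_old
fuel_naive = baseline_miles / mpg_new   # savings if driving stayed constant
fuel_actual = miles_new / mpg_new

rebound = (fuel_actual - fuel_naive) / (fuel_old - fuel_naive)
print(f"Extra driving: {miles_new - baseline_miles:.0f} miles per year")
print(f"Rebound: {rebound:.0%} of the expected fuel savings is driven away")

With these assumptions roughly a sixth of the expected fuel savings disappears into extra driving, before counting the indirect effects described above.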

Read more here.

Image courtesy of Wikipedia, Popular Science Monthly / Creative Commons.

Building an Interstate Highway System for Energy

From Discover:

President Obama plans to spend billions building it. General Electric is already running slick ads touting the technology behind it. And Greenpeace declares that it is a great idea. But what exactly is a “smart grid”? According to one big-picture description, it is much of what today’s power grid is not, and more of what it must become if the United States is to replace carbon-belching, coal-fired power with renewable energy generated from sun and wind.

Today’s power grids are designed for local delivery, linking customers in a given city or region to power plants relatively nearby. But local grids are ill-suited to distributing energy from the alternative sources of tomorrow. North America’s strongest winds, most intense sunlight, and hottest geothermal springs are largely concentrated in remote regions hundreds or thousands of miles from the big cities that need electricity most. “Half of the population in the United States lives within 100 miles of the coasts, but most of the wind resources lie between North Dakota and West Texas,” says Michael Heyeck, senior vice president for transmission at the utility giant American Electric Power. Worse, those winds constantly ebb and flow, creating a variable supply.

Power engineers are already sketching the outlines of the next-generation electrical grid that will keep our homes and factories humming with clean—but fluctuating—renewable energy. The idea is to expand the grid from the top down by adding thousands of miles of robust new transmission lines, while enhancing communication from the bottom up with electronics enabling millions of homes and businesses to optimize their energy use.

The Grid We Have
When electricity leaves a power plant today, it is shuttled from place to place over high-voltage lines, those cables on steel pylons that cut across landscapes and run virtually contiguously from coast to coast. Before it reaches your home or office, the voltage is reduced incrementally by passing through one or more intermediate points, called substations. The substations process the power until it can flow to outlets in homes and businesses at the safe level of 110 volts.
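
The reason power rides those high-voltage lines in the first place is that resistive loss grows with the square of the current, so delivering the same power at a higher voltage, and hence a lower current, wastes far less along the way. A small illustrative Python calculation, with an assumed line resistance and load:

# Why transmission runs at high voltage: loss scales as I^2 * R.
power_to_deliver = 500e6    # 500 MW, assumed load
line_resistance = 5.0       # ohms for the whole line, assumed

for kv in (115, 345, 765):                      # common transmission voltages
    current = power_to_deliver / (kv * 1e3)     # amps, from P = V * I
    loss_mw = current**2 * line_resistance / 1e6
    print(f"{kv} kV: {current:,.0f} A, loss ~{loss_mw:.1f} MW "
          f"({100 * loss_mw / (power_to_deliver / 1e6):.1f}%)")

The same scaling is why proposals for a continent-spanning grid lean on very-high-voltage lines to move wind and solar power out of the interior.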

The vast network of power lines delivering the juice may be interconnected, but pushing electricity all the way from one coast to the other is unthinkable with the present technology. That is because the network is an agglomeration of local systems patched together to exchange relatively modest quantities of surplus power. In fact, these systems form three distinct grids in the United States: the Eastern, Western, and Texas interconnects. Only a handful of transfer stations can move power between the different grids.

More from theSource here.

The Strange Forests that Drink—and Eat—Fog

From Discover:

On the rugged roadway approaching Fray Jorge National Park in north-central Chile, you are surrounded by desert. This area receives less than six inches of rain a year, and the dry terrain is more suggestive of the badlands of the American Southwest than of the lush landscapes of the Amazon. Yet as the road climbs, there is an improbable shift. Perched atop the coastal mountains here, some 1,500 to 2,000 feet above the level of the nearby Pacific Ocean, are patches of vibrant rain forest covering up to 30 acres apiece. Trees stretch as much as 100 feet into the sky, with ferns, mosses, and bromeliads adorning their canopies. Then comes a second twist: As you leave your car and follow a rising path from the shrub into the forest, it suddenly starts to rain. This is not rain from clouds in the sky above, but fog dripping from the tree canopy. These trees are so efficient at snatching moisture out of the air that the fog provides them with three-quarters of all the water they need.

Understanding these pocket rain forests and how they sustain themselves in the middle of a rugged desert has become the life’s work of a small cadre of scientists who are only now beginning to fully appreciate Fray Jorge’s third and deepest surprise: The trees that grow here do more than just drink the fog. They eat it too.

Fray Jorge lies at the north end of a vast rain forest belt that stretches southward some 600 miles to the tip of Chile. In the more southerly regions of this zone, the forest is wetter, thicker, and more contiguous, but it still depends on fog to survive dry summer conditions. Kathleen C. Weathers, an ecosystem scientist at the Cary Institute of Ecosystem Studies in Millbrook, New York, has been studying the effects of fog on forest ecosystems for 25 years, and she still cannot quite believe how it works. “One step inside a fog forest and it’s clear that you’ve entered a remarkable ecosystem,” she says. “The ways in which trees, leaves, mosses, and bromeliads have adapted to harvest tiny droplets of water that hang in the atmosphere is unparalleled.”

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Juan J. Armesto/Foundation Senda Darwin Archive[end-div]

Manufactured Scarcity

[div class=attrib]From Eurozine:[end-div]

“Manufacturing scarcity” is the new watchword in “Green capitalism”. James Heartfield explains how, for the energy sector, it has become a license to print money. Increasing profits by cutting output was pioneered by Enron in the 1990s; now the model of restricted supply, together with domestic energy generation, is promoted worldwide.

The corporate raiders of the 1980s first worked out that you might be able to make more money by downsizing, or even breaking up, industry than by building it up. It is a perverse result of the profit motive that private gain should grow out of public decay. But even the corporate raiders never dreamt of making deindustrialisation into an avowed policy goal which the rest of us would pay for.

What some of the cannier Green Capitalists realised is that scarcity increases price, and manufacturing scarcity can increase returns. What could be more old hat, they said, than trying to make money by making things cheaper? Entrepreneurs disdained the “fast moving consumer goods” market.
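
The mechanism being described only pays off when demand is relatively inelastic, so that a cut in output raises the price more than proportionally. A minimal numerical sketch, using an invented constant-elasticity demand curve (none of these numbers come from the essay):

```python
# Toy constant-elasticity demand curve: Q = k * P**elasticity.
# With inelastic demand (|elasticity| < 1), cutting output raises price more
# than proportionally, so revenue rises. All numbers are hypothetical.

elasticity = -0.5          # hypothetical, inelastic demand
k = 100.0                  # scale constant

def price_for_quantity(q):
    # invert Q = k * P**elasticity  =>  P = (q / k)**(1 / elasticity)
    return (q / k) ** (1.0 / elasticity)

for q in (100.0, 90.0):    # baseline output vs a 10 percent supply cut
    p = price_for_quantity(q)
    print(f"output {q:5.1f}  price {p:6.3f}  revenue {p * q:8.2f}")
```

With these invented figures, a 10 percent cut in output lifts revenue by roughly 11 percent, which is the sense in which manufactured scarcity "increases returns."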

Of course there is a point to all this. If labour gets too efficient, the chances of wringing more profits from industry diminish. The more productive labour is, the lower, in the end, will be the rate of return on investments. That is because the source of new value is living labour; but greater investment in new technologies tends to replace living labour with machines, which produce no additional value of their own.[2] Over time the rate of return must fall. Business theory calls this the diminishing rate of return.[3] Businessmen know it as the “race to the bottom” – the competitive pressure to make goods cheaper and cheaper, making it that much harder to sell enough to make a profit.

Super-efficient labour would make the capitalistic organisation of industry redundant. Manufacturing scarcity, restricting output and so driving up prices, is one short-term way to secure profits and maybe even the profit-system. Of course that would also mean abandoning the historic justification for capitalism: that it increased output and living standards. Environmentalism might turn out to be the way to save capitalism, just at the point when industrial development had shown it to be redundant.
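
The underlying claim is the classical falling-rate-of-profit argument, which the passage states in words but not symbols. In the conventional notation (mine, not the essay's), with constant capital c (machines and materials), variable capital v (wages for living labour) and surplus value s, the rate of profit is

```latex
r \;=\; \frac{s}{c + v} \;=\; \frac{s/v}{\,c/v + 1\,}
```

Holding the rate of surplus value s/v fixed, mechanization raises the "organic composition" c/v and so pushes r down, which is the tendency the footnoted sentences describe.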

[div class=attrib]More from theSource here.[end-div]

A Solar Grand Plan

[div class=attrib]From Scientific American:[end-div]

By 2050 solar power could end U.S. dependence on foreign oil and slash greenhouse gas emissions.

High prices for gasoline and home heating oil are here to stay. The U.S. is at war in the Middle East at least in part to protect its foreign oil interests. And as China, India and other nations rapidly increase their demand for fossil fuels, future fighting over energy looms large. In the meantime, power plants that burn coal, oil and natural gas, as well as vehicles everywhere, continue to pour millions of tons of pollutants and greenhouse gases into the atmosphere annually, threatening the planet.

Well-meaning scientists, engineers, economists and politicians have proposed various steps that could slightly reduce fossil-fuel use and emissions. These steps are not enough. The U.S. needs a bold plan to free itself from fossil fuels. Our analysis convinces us that a massive switch to solar power is the logical answer.

  • A massive switch from coal, oil, natural gas and nuclear power plants to solar power plants could supply 69 percent of the U.S.’s electricity and 35 percent of its total energy by 2050.
  • A vast area of photovoltaic cells would have to be erected in the Southwest. Excess daytime energy would be stored as compressed air in underground caverns to be tapped during nighttime hours.
  • Large solar concentrator power plants would be built as well.
  • A new direct-current power transmission backbone would deliver solar electricity across the country.
  • But $420 billion in subsidies from 2011 to 2050 would be required to fund the infrastructure and make it cost-competitive (a rough annualization of this figure appears below).
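
To put the headline subsidy figure in scale, here is a back-of-envelope annualization. It is pure arithmetic on the numbers in the summary above; real outlays would of course not be spread evenly across the period.

```python
# Back-of-envelope: spread the proposed $420 billion of subsidies evenly across
# the 2011-2050 window cited above. This says nothing about the plan's actual
# year-by-year schedule; it is only an average.

total_subsidy_usd = 420e9
start_year, end_year = 2011, 2050
years = end_year - start_year + 1          # inclusive window: 40 years

per_year = total_subsidy_usd / years
print(f"{years} years, about ${per_year / 1e9:.1f} billion per year on average")
```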

[div class=attrib]More from theSource here.[end-div]

A Plan to Keep Carbon in Check

[div class=attrib]By Robert H. Socolow and Stephen W. Pacala, From Scientific American:[end-div]

Getting a grip on greenhouse gases is daunting but doable. The technologies already exist. But there is no time to lose.

Retreating glaciers, stronger hurricanes, hotter summers, thinner polar bears: the ominous harbingers of global warming are driving companies and governments to work toward an unprecedented change in the historical pattern of fossil-fuel use. Faster and faster, year after year for two centuries, human beings have been transferring carbon to the atmosphere from below the surface of the earth. Today the world’s coal, oil and natural gas industries dig up and pump out about seven billion tons of carbon a year, and society burns nearly all of it, releasing carbon dioxide (CO2). Ever more people are convinced that prudence dictates a reversal of the present course of rising CO2 emissions.
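
A unit note the article leaves implicit: the seven billion tons are tons of carbon, not tons of carbon dioxide. Converting by molecular weight (44 for CO2 against 12 for carbon) gives roughly

```latex
7\ \text{Gt C} \times \frac{44}{12} \;\approx\; 26\ \text{Gt CO}_2 \ \text{per year}
```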

The boundary separating the truly dangerous consequences of emissions from the merely unwise is probably located near (but below) a doubling of the concentration of CO2 that was in the atmosphere in the 18th century, before the Industrial Revolution began. Every increase in concentration carries new risks, but avoiding that danger zone would reduce the likelihood of triggering major, irreversible climate changes, such as the disappearance of the Greenland ice cap. Two years ago the two of us provided a simple framework to relate future CO2 emissions to this goal.

[div class=attrib]More from theSource here.[end-div]

Plan B for Energy

[div class=attrib]From Scientific American:[end-div]

If efficiency improvements and incremental advances in today’s technologies fail to halt global warming, could revolutionary new carbon-free energy sources save the day? Don’t count on it, but don’t count it out, either.

To keep this world tolerable for life as we like it, humanity must complete a marathon of technological change whose finish line lies far over the horizon. Robert H. Socolow and Stephen W. Pacala of Princeton University have compared the feat to a multigenerational relay race [see their article “A Plan to Keep Carbon in Check”]. They outline a strategy to win the first 50-year leg by reining back carbon dioxide emissions from a century of unbridled acceleration. Existing technologies, applied both wisely and promptly, should carry us to this first milestone without trampling the global economy. That is a sound plan A.

The plan is far from foolproof, however. It depends on societies ramping up an array of carbon-reducing practices to form seven “wedges,” each of which keeps 25 billion tons of carbon in the ground and out of the air. Any slow starts or early plateaus will pull us off track. And some scientists worry that stabilizing greenhouse gas emissions will require up to 18 wedges by 2056, not the seven that Socolow and Pacala forecast in their most widely cited model.
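
The 25-billion-ton figure follows from the geometry of a wedge in Socolow and Pacala's framework: each wedge is an activity whose avoided emissions grow linearly from zero today to one billion tons of carbon per year after 50 years, so the total carbon it keeps out of the air is the area of that triangle:

```latex
\tfrac{1}{2} \times 50\ \text{yr} \times 1\ \tfrac{\text{Gt C}}{\text{yr}} \;=\; 25\ \text{Gt C per wedge}
```
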
[div class=attrib]More from theSource here.[end-div]

A Power Grid for the Hydrogen Economy

[div class=attrib]From Scientific American:[end-div]

On the afternoon of August 14, 2003, electricity failed to arrive in New York City, plunging the eight million inhabitants of the Big Apple, along with 40 million other people throughout the northeastern U.S. and Ontario, into a tense night of darkness. After one power plant in Ohio had shut down, elevated power loads overheated high-voltage lines, which sagged into trees and short-circuited. Like toppling dominoes, the failures cascaded through the electrical grid, knocking 265 power plants offline and darkening 24,000 square kilometers.

That incident (and an even more extensive blackout that affected 56 million people in Italy and Switzerland a month later) called attention to pervasive problems with modern civilization’s vital equivalent of a biological circulatory system, its interconnected electrical networks. In North America the electrical grid has evolved in piecemeal fashion over the past 100 years. Today the more than $1-trillion infrastructure spans the continent with millions of kilometers of wire operating at up to 765,000 volts. Despite its importance, no single organization has control over the operation, maintenance or protection of the grid; the same is true in Europe. Dozens of utilities must cooperate even as they compete to generate and deliver, every second, exactly as much power as customers demand, and no more. The 2003 blackouts raised calls for greater government oversight and spurred the industry to move more quickly, through its IntelliGrid Consortium and the GridWise program of the U.S. Department of Energy, to create self-healing systems for the grid that may prevent some kinds of outages from cascading. But reliability is not the only challenge, and arguably not even the most important one, that the grid faces in the decades ahead.

[div class=attrib]More from theSource here.[end-div]