Tag Archives: alternative energy

Cheap Hydrogen

Researchers at the University of Glasgow, Scotland, have discovered an alternative and possibly more efficient way to make hydrogen at industrial scales. Typically, hydrogen is produced by reacting high-temperature steam with methane, the principal component of natural gas. A small fraction, less than five percent of annual production, is made through electrolysis, passing an electric current through water.

This new method of production appears to be less costly, less dangerous and also more environmentally sound.

[div class=attrib]From the Independent:[end-div]

Scientists have harnessed the principles of photosynthesis to develop a new way of producing hydrogen – in a breakthrough that offers a possible solution to global energy problems.

The researchers claim the development could help unlock the potential of hydrogen as a clean, cheap and reliable power source.

Unlike fossil fuels, hydrogen can be burned to produce energy without emitting carbon dioxide. It is also the most abundant element in the universe.

Hydrogen gas is produced by splitting water into its constituent elements – hydrogen and oxygen. But scientists have been struggling for decades to find a way of extracting these elements at different times, which would make the process more energy-efficient and reduce the risk of dangerous explosions.

In a paper published today in the journal Nature Chemistry, scientists at the University of Glasgow outline how they have managed to replicate the way plants use the sun’s energy to split water molecules into hydrogen and oxygen at separate times and at separate physical locations.

Experts heralded the “important” discovery yesterday, saying it could make hydrogen a more practicable source of green energy.

Professor Xile Hu, director of the Laboratory of Inorganic Synthesis and Catalysis at the Swiss Federal Institute of Technology in Lausanne, said: “This work provides an important demonstration of the principle of separating hydrogen and oxygen production in electrolysis and is very original. Of course, further developments are needed to improve the capacity of the system, energy efficiency, lifetime and so on. But this research already offers potential and promise and can help in making the storage of green energy cheaper.”

Until now, scientists have separated hydrogen and oxygen gases using electrolysis, which involves running electricity through water. This is energy-intensive and potentially explosive, because the oxygen and hydrogen are released at the same time.

But in the new variation of electrolysis developed at the University of Glasgow, hydrogen and oxygen are produced from the water at different times, thanks to what researchers call an “electron-coupled proton buffer”. This acts to collect and store hydrogen while the current runs through the water, meaning that in the first instance only oxygen is released. The hydrogen can then be released when convenient.

Because pure hydrogen does not occur naturally, it takes energy to make it. This new version of electrolysis takes longer, but is safer and uses less energy per minute, making it easier to rely on renewable energy sources for the electricity needed to separate the atoms.
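For a sense of scale, the energy floor on electrolysis can be worked out from textbook thermodynamics. The constants below are standard values, not figures from the article, and real electrolyzers operate well above this theoretical minimum:

```python
# Thermodynamic minimum for water electrolysis at standard conditions.
# E = dG / (n * F): dG ~ 237.1 kJ per mole of H2 for H2O(l) -> H2 + 1/2 O2,
# with n = 2 electrons transferred per H2 molecule and F the Faraday constant.
DELTA_G = 237.1e3      # J per mole of H2 produced (standard Gibbs free energy)
FARADAY = 96485.0      # C per mole of electrons
N_ELECTRONS = 2        # electrons per H2 molecule
H2_MOLAR_MASS = 2.016e-3  # kg per mole

min_voltage = DELTA_G / (N_ELECTRONS * FARADAY)   # volts
energy_per_kg = DELTA_G / H2_MOLAR_MASS           # J per kg of hydrogen

print(f"Minimum cell voltage: {min_voltage:.2f} V")            # ~1.23 V
print(f"Minimum energy: {energy_per_kg / 3.6e6:.1f} kWh/kg")   # ~33 kWh/kg
```

Any scheme that lets the electricity come from intermittent renewables, as the Glasgow buffer does, attacks the cost side of this equation rather than the physics, since the ~1.23 V floor cannot be beaten.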

Dr Mark Symes, the report’s co-author, said: “What we have developed is a system for producing hydrogen on an industrial scale much more cheaply and safely than is currently possible. Currently much of the industrial production of hydrogen relies on reformation of fossil fuels, but if the electricity is provided via solar, wind or wave sources we can create an almost totally clean source of power.”

Professor Lee Cronin, the other author of the research, said: “The existing gas infrastructure which brings gas to homes across the country could just as easily carry hydrogen as it currently does methane. If we were to use renewable power to generate hydrogen using the cheaper, more efficient decoupled process we’ve created, the country could switch to hydrogen to generate our electrical power at home. It would also allow us to significantly reduce the country’s carbon footprint.”

Nathan Lewis, a chemistry professor at the California Institute of Technology and a green energy expert, said: “This seems like an interesting scientific demonstration that may possibly address one of the problems involved with water electrolysis, which remains a relatively expensive method of producing hydrogen.”

[div class=attrib]Read the entire article following the jump.[end-div]

Fusion and the Z Machine

The quest to tap fusion as an energy source here on Earth continues to inch forward with some promising new developments. Of course, we mean nuclear fusion, the kind that powers the sun, not the now-debunked “cold fusion” supposedly demonstrated in a test tube in the late 1980s.

[div class=attrib]From Wired:[end-div]

In the high-stakes race to realize fusion energy, a smaller lab may be putting the squeeze on the big boys. Worldwide efforts to harness fusion—the power source of the sun and stars—for energy on Earth currently focus on two multibillion dollar facilities: the ITER fusion reactor in France and the National Ignition Facility (NIF) in California. But other, cheaper approaches exist—and one of them may have a chance to be the first to reach “break-even,” a key milestone in which a process produces more energy than needed to trigger the fusion reaction.

Researchers at the Sandia National Laboratory in Albuquerque, New Mexico, will announce in a Physical Review Letters (PRL) paper accepted for publication that their process, known as magnetized liner inertial fusion (MagLIF) and first proposed 2 years ago, has passed the first of three tests, putting it on track for an attempt at the coveted break-even. Tests of the remaining components of the process will continue next year, and the team expects to take its first shot at fusion before the end of 2013.

Fusion reactors heat and squeeze a plasma—an ionized gas—composed of the hydrogen isotopes deuterium and tritium, compressing the isotopes until their nuclei overcome their mutual repulsion and fuse together. Out of this pressure-cooker emerge helium nuclei, neutrons, and a lot of energy. The temperature required for fusion is more than 100 million °C—so you have to put a lot of energy in before you start to get anything out. ITER and NIF are planning to attack this problem in different ways. ITER, which will be finished in 2019 or 2020, will attempt fusion by containing a plasma with enormous magnetic fields and heating it with particle beams and radio waves. NIF, in contrast, takes a tiny capsule filled with hydrogen fuel and crushes it with a powerful laser pulse. NIF has been operating for a few years but has yet to achieve break-even.
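To see why so little fuel goes such a long way, note that the canonical deuterium-tritium reaction releases about 17.6 MeV per fusion. That figure is a standard one, not given in the article, so treat the following as a rough sketch with rounded constants:

```python
# Energy yield of D + T -> He-4 + n, using the standard 17.6 MeV per reaction
# (an assumed textbook figure, not a number from the article).
MEV_TO_J = 1.602e-13        # joules per MeV
AVOGADRO = 6.022e23         # particles per mole
E_REACTION_MEV = 17.6       # energy released per D-T fusion

e_per_reaction = E_REACTION_MEV * MEV_TO_J                 # ~2.8e-12 J
fuel_pair_mass = (2.014 + 3.016) / 1000 / AVOGADRO         # kg per D-T pair
e_per_kg = e_per_reaction / fuel_pair_mass                 # J per kg of fuel

print(f"Per reaction: {e_per_reaction:.2e} J")
print(f"Per kg of D-T fuel: {e_per_kg:.2e} J")
```

At roughly 3e14 joules per kilogram, complete burn of a gram of fuel would rival the output of tons of coal, which is why the engineering problem is confinement, not fuel supply.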

Sandia’s MagLIF technique is similar to NIF’s in that it rapidly crushes its fuel—a process known as inertial confinement fusion. But to do it, MagLIF uses a magnetic pulse rather than lasers. The target in MagLIF is a tiny cylinder about 7 millimeters in diameter; it’s made of beryllium and filled with deuterium and tritium. The cylinder, known as a liner, is connected to Sandia’s vast electrical pulse generator (called the Z machine), which can deliver 26 million amps in a pulse lasting milliseconds or less. That much current passing down the walls of the cylinder creates a magnetic field that exerts an inward force on the liner’s walls, instantly crushing it—and compressing and heating the fusion fuel.
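The figures quoted above, 26 million amps through a 7-millimeter cylinder, make for a striking back-of-envelope estimate. Treating the liner as a long straight conductor and applying Ampère's law is a crude simplification, so the result below is an order-of-magnitude sketch only:

```python
import math

# Rough field and magnetic pressure at the surface of a 7 mm diameter liner
# carrying 26 MA, modeled as a long straight conductor (simplified geometry).
MU0 = 4 * math.pi * 1e-7     # vacuum permeability, T*m/A
current = 26e6               # amps, from the article
radius = 3.5e-3              # liner radius in meters (7 mm diameter)

b_field = MU0 * current / (2 * math.pi * radius)   # tesla, Ampere's law
pressure = b_field**2 / (2 * MU0)                  # magnetic pressure, Pa

print(f"Surface field: {b_field:.0f} T")                       # ~1500 T
print(f"Magnetic pressure: {pressure:.1e} Pa "
      f"(~{pressure / 1.013e5:.1e} atm)")
```

A field of roughly 1,500 tesla produces a pressure near a million atmospheres at the liner wall, which is what lets the pulse crush the cylinder in a fraction of a microsecond.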

Researchers have known about this technique of crushing a liner to heat the fusion fuel for some time. But the MagLIF-Z machine setup on its own didn’t produce quite enough heat; something extra was needed to make the process capable of reaching break-even. Sandia researcher Steve Slutz led a team that investigated various enhancements through computer simulations of the process. In a paper published in Physics of Plasmas in 2010, the team predicted that break-even could be reached with three enhancements.

First, they needed to apply the current pulse much more quickly, in just 100 nanoseconds, to increase the implosion velocity. They would also preheat the hydrogen fuel inside the liner with a laser pulse just before the Z machine kicks in. And finally, they would position two electrical coils around the liner, one at each end. These coils produce a magnetic field that links the two coils, wrapping the liner in a magnetic blanket. The magnetic blanket prevents charged particles, such as electrons and helium nuclei, from escaping and cooling the plasma—so the temperature stays hot.

Sandia plasma physicist Ryan McBride is leading the effort to see if the simulations are correct. The first item on the list is testing the rapid compression of the liner. One critical parameter is the thickness of the liner wall: The thinner the wall, the faster it will be accelerated by the magnetic pulse. But the wall material also starts to evaporate away during the pulse, and if it breaks up too early, it will spoil the compression. On the other hand, if the wall is too thick, it won’t reach a high enough velocity. “There’s a sweet spot in the middle where it stays intact and you still get a pretty good implosion velocity,” McBride says.

To test the predicted sweet spot, McBride and his team set up an elaborate imaging system that involved blasting a sample of manganese with a high-powered laser (actually a NIF prototype moved to Sandia) to produce x-rays. By shining the x-rays through the liner at various stages in its implosion, the researchers could image what was going on. They found that at the sweet-spot thickness, the liner held its shape right through the implosion. “It performed as predicted,” McBride says. The team aims to test the other two enhancements—the laser preheating and the magnetic blanket—in the coming year, and then put it all together to take a shot at break-even before the end of 2013.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: The Z Pulsed Power Facility produces tremendous energy when it fires. Courtesy of Sandia National Laboratory.[end-div]

Building an Interstate Highway System for Energy

[div class=attrib]From Discover:[end-div]

President Obama plans to spend billions building it. General Electric is already running slick ads touting the technology behind it. And Greenpeace declares that it is a great idea. But what exactly is a “smart grid”? According to one big-picture description, it is much of what today’s power grid is not, and more of what it must become if the United States is to replace carbon-belching, coal-fired power with renewable energy generated from sun and wind.

Today’s power grids are designed for local delivery, linking customers in a given city or region to power plants relatively nearby. But local grids are ill-suited to distributing energy from the alternative sources of tomorrow. North America’s strongest winds, most intense sunlight, and hottest geothermal springs are largely concentrated in remote regions hundreds or thousands of miles from the big cities that need electricity most. “Half of the population in the United States lives within 100 miles of the coasts, but most of the wind resources lie between North Dakota and West Texas,” says Michael Heyeck, senior vice president for transmission at the utility giant American Electric Power. Worse, those winds constantly ebb and flow, creating a variable supply.

Power engineers are already sketching the outlines of the next-generation electrical grid that will keep our homes and factories humming with clean—but fluctuating—renewable energy. The idea is to expand the grid from the top down by adding thousands of miles of robust new transmission lines, while enhancing communication from the bottom up with electronics enabling millions of homes and businesses to optimize their energy use.

The Grid We Have
When electricity leaves a power plant today, it is shuttled from place to place over high-voltage lines, those cables on steel pylons that cut across landscapes and run virtually contiguously from coast to coast. Before it reaches your home or office, the voltage is reduced incrementally by passing through one or more intermediate points, called substations. The substations process the power until it can flow to outlets in homes and businesses at the safe level of 110 volts.
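The incremental step-down described above can be sketched with the ideal-transformer relation, where the turns ratio equals the voltage ratio. The intermediate voltage levels below are typical US figures assumed for illustration, ending at the 110 volts the article cites:

```python
# Illustrative step-down chain from high-voltage transmission to the outlet.
# Intermediate voltages are typical US levels (assumed, not from the article).
stages = [
    ("transmission", 345_000),
    ("subtransmission", 69_000),
    ("distribution", 13_800),
    ("household", 110),
]

# For an ideal transformer, primary/secondary turns ratio equals voltage ratio.
for (hi_name, v_hi), (lo_name, v_lo) in zip(stages, stages[1:]):
    ratio = v_hi / v_lo
    print(f"{hi_name} ({v_hi} V) -> {lo_name} ({v_lo} V): ~{ratio:.0f}:1")
```

Each substation in the chain hosts one of these step-downs, which is why power must pass through several intermediate points before it reaches an outlet.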

The vast network of power lines delivering the juice may be interconnected, but pushing electricity all the way from one coast to the other is unthinkable with the present technology. That is because the network is an agglomeration of local systems patched together to exchange relatively modest quantities of surplus power. In fact, these systems form three distinct grids in the United States: the Eastern, Western, and Texas interconnects. Only a handful of transfer stations can move power between the different grids.

[div class=attrib]More from the source here.[end-div]