Monday, October 17, 2011

Brilliant 10: Sludge Miner



Scanning the genomes of an entire ecosystem will help scientists understand carbon sequestration
Marsh Dwellers Susannah Tringe studies the genomic fingerprints of ecosystems to understand how microbial species work together John B. Carnett
Susannah Tringe spends a fair bit of her work time, currently for the U.S. Department of Energy Joint Genome Institute, in the fragrant, murky wetlands of California’s Sacramento–San Joaquin Delta. Thriving microbial communities there could be the key to understanding how wetlands mitigate or exacerbate greenhouse-gas levels in our atmosphere. Tringe is cataloging the genetic fingerprints of the entire microbial ecosystem to determine how these wetlands work and if we can tailor them while restoring drained wetlands to absorb more greenhouse gas than they emit.
A biophysicist by training and a crossword-puzzle fiend in her spare time, Tringe is focused most closely on wetland microbes and the soil and plants they live in and on. Studying the ecosystem’s collective biology, she says, will help her figure out whether the wetlands are pulling carbon dioxide out of the atmosphere—or whether, in some cases, they’re net producers of methane and other greenhouse gases. But this practice poses logistical problems: a single cup of swamp sludge can contain many thousands of microbe species, and it’s very difficult to isolate each one and catalog its genes individually.
Instead, Tringe extracts the DNA from all the microbes in an entire sample to determine which genes are present. “If a lot of organisms in an environment have a gene, it’s probably pretty important,” she says. When she finds many genes in a microbe sample that code for processes guiding carbon storage, for example, that is a good indication that microbes, not just vegetation, are important for sequestering carbon dioxide.
The Department of Energy recently awarded Tringe a $2.5-million research grant to continue her study of wetland ecosystems, with the goal of finding the best way to restore them. Tule, a plant commonly used to restore wetlands, harbors methane-producing microbes in its roots, Tringe found. Replacing the tule with a different plant might cut down on greenhouse-gas seepage. If we restored all the drained wetlands in the Sacramento delta, she says, “it would be like converting all the SUVs in the state into hybrids.”

Brilliant 10: The Computational Contortionist



Rendering complex objects realistically requires a whole new kind of geometry
Complex Folding Computer scientist Eitan Grinspun studies how long, thin strands, such as spaghetti and undersea data cables, twist and coil John B. Carnett
When Eitan Grinspun’s adviser at the California Institute of Technology asked him to help develop a better way to model how cans bend when crushed, the young mathematician did not think it would be a major project. “He lured me into something that took years and years,” says Grinspun, now at Columbia University. But the journey to model a crushed Coke can ended with an entirely new field of geometry.
Differential geometry can describe how the curves and surfaces of a given object will bend and crease. The problem, Grinspun says, is “that differential geometry is built for smooth surfaces with infinite detail.” Computers can process only a finite amount of detail. For example, to describe a circle, computers must divide that circle into a series of connecting short sides—the greater the number of sides, the smoother the circle. Describing all of the curves and creases in a crushed can accurately takes a huge amount of processing power, so Grinspun—one of only a couple of mathematicians in the field with a background in computer science—set about translating the theorems into a more elegant set of instructions for the computer, allowing existing processors to break the infinite into discrete units far more efficiently.
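The circle analogy can be made concrete in a few lines of code. This is a generic sketch of discretizing a smooth curve, not Grinspun's actual method: replace the circle with an inscribed regular polygon, and watch the discrete perimeter converge to the smooth circumference as the number of sides grows.

```python
import math

def polygon_perimeter(n: int, radius: float = 1.0) -> float:
    """Perimeter of a regular n-gon inscribed in a circle of the given radius."""
    side = 2 * radius * math.sin(math.pi / n)
    return n * side

# The more sides, the closer the discrete curve gets to the
# smooth circle's circumference, 2 * pi * r.
for n in (6, 24, 96, 384):
    print(n, polygon_perimeter(n))
```

With six sides the perimeter is 6.0 against a true circumference of about 6.283; by 384 sides the error is down in the fourth decimal place. The trick Grinspun's field pursues is to get that kind of accuracy without spending sides uniformly everywhere.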
Grinspun's method works by concentrating on the places where most of the movement will occur—in the case of the Coke can, the areas where it folds as it crumples. “There are a lot of flat regions where not much is happening,” he says. “If a computer spreads its attention equally, it’s not going to the interesting parts, where cracks are forming.”
Once Grinspun and his colleagues established this new approach, which they call discrete differential geometry, the queries from physicists, engineers and animators started arriving. Disney and Weta Digital use his theorems to make fabrics and hair move more convincingly. Physicists at MIT have created origami out of small sheets of plastic and water drops. Engineers can now far more accurately predict how cables will fall to the seafloor. “For me, [this field] is a playground,” he says. “I get to take any interesting physical problem—say, spaghetti movement. Toss it in the air, and it falls on the ground and it twists and coils. Why does it move that way?”

Brilliant 10: The Butterfly Pharmacist



Watching how insects use plants shows that self-medication isn’t just for complex animals
Checkmate Jaap de Roode studies how monarch butterflies use plant-based medicine to thwart parasites. John B. Carnett
“I didn’t start working with monarchs because I liked them,” says evolutionary biologist Jaap de Roode of Emory University. “I came to them because they have a really cool parasite.” That parasite, called Ophryocystis elektroscirrha, normally pokes holes in the butterflies’ skin, causing them to leak bodily fluids. But de Roode noticed that monarchs that ate the tropical milkweed plant did not suffer from parasitic infections as much as monarchs eating swamp milkweed did. This led him to suggest to his colleagues that the monarchs were self-medicating. “One of my reviewers said, ‘That’s completely ridiculous. There’s absolutely no way they could ever do that,’” de Roode recalls. Up until then, self-medication was seen as a complex cultural trait. Only a few animals, such as chimps and elephants, had been observed using medicine.
To test his hypothesis, de Roode first looked to see whether infected larvae prefer to munch on the parasite-killing tropical milkweed species rather than the swamp milkweed. They didn’t, so he concluded that the larvae do not use the tropical milkweed medicinally. But when he compared the behavior of healthy adult females with the behavior of infected adult females, a difference quickly became apparent. Infected females, which transmit the parasite to their offspring when they lay eggs, preferred the tropical milkweed as an egg-laying site, showing that they can preemptively medicate their offspring. “Somehow, the mother knows what’s best,” de Roode says.
His findings challenge the view that only animals with cognitive complexity use medicine. If butterflies, which have a simple nervous system and no social structure, could preferentially use medicine, perhaps self-medication is pervasive in the animal kingdom and scientists just haven’t had the chance to find it yet.

Brilliant 10: The Chemical Catcher



Trapping and preserving biomarkers will help doctors detect cancer sooner
Protein Safari The nanoparticles built by Alessandra Luchini will catch cancer biomarkers the way nets catch fish John B. Carnett
When Alessandra Luchini was a girl growing up in Italy, she visited the Museo Galileo in Florence, where she saw the telescope that Galileo Galilei had built four centuries before, in 1610. She was struck by its simplicity: with just a couple of pieces of curved glass, anyone could see whole new worlds.
In 2005, Luchini, now an engineer at George Mason University, came to the U.S. on a grant from the Italian National Health Service to study ways to detect molecular signs of cancer. Some diseases, early on, release faint hints of their presence into our bodily fluids. These “biomarkers” are ephemeral—our enzymes chew them up within minutes, so they’re undetectable in most lab tests. If doctors had a way to catch and stabilize those biomarkers, they could detect diseases more quickly and begin treatment at a stage when the chances of recovery were much higher.
Luchini’s solution was to build a nanoparticle trap. The concept, like Galileo’s telescope, is simple: “It’s like a net for catching very small fishes,” Luchini says. The spherical nanoparticle, which took two years to perfect, uses hydrogel as its backbone. Inside, a crisscrossing polymer net holds bait, such as acid or dye, which chemically attracts various biomarkers. When lab technicians mix the nanoparticle in with a fresh blood sample, it traps the biomarkers and protects them from enzymes. The sample can then be tested at leisure. So far, Luchini has used nanoparticle traps to produce an early diagnosis of infectious diseases such as Lyme disease and tuberculosis. (The traps can also reveal the presence of human growth hormone in urine, and thus offer a novel way to reveal illegal doping by athletes.) She and her team are also working on nanotraps to find the skin-cancer biomarkers that exist in a person’s sweat.
Luchini’s next step is to modify the nanoparticles so they can trap biomarkers in a body, giving doctors a real-time view of what’s going on inside their patients.

Brilliant 10: The Chemical Mechanic



Attaching fluorines to medicine makes it more effective
Tobias Ritter Courtesy Tobias Ritter
After 1,200 unsuccessful attempts to do something, most people would call it quits. Not Harvard University chemist Tobias Ritter. Chemistry research is 90 percent failure, he says. But success, when it comes, can be big. In Ritter's case, it could mean more-effective drugs. Ritter, a native of Germany, had been studying fluorination, the process by which fluorine atoms bind to carbon, since 2007. Drug manufacturers had long known that fluorine could make their products more stable, potent and penetrating, but the standard methods for attaching fluorines were unreliable and, more often than not, would damage the drugs. Finding a better fluorination technique is one of the more difficult challenges in modern medicine, says Robert Grubbs, a Nobel Prize–winning chemist at Caltech. But Ritter kept at it.
One good way to attach fluorines to an organic compound is by using a catalyst. Ritter and his colleagues began conducting experiments with metal catalysts, adjusting the ingredients and trying again after each reaction, some 1,200 attempts over the course of a year. Finally, using palladium as the catalyst, Ritter’s group had its first success. The yield of fluorinated drug was a meager 1 percent of the reaction’s theoretical output. They continued tweaking the reaction conditions, switched over to a silver oxide catalyst, and improved the reaction yield to 90 percent. “Once we had our foot in the door, it was much easier,” Ritter says.
His fluorination method could make drugs for depression and cancer better at attaching to their targets—he has started a company to test the technique—but it will also help scientists study how these drugs work. Fluorinated molecules are used as tagging compounds in positron emission tomography (PET) scans. Since Ritter’s technique tags fluorines to drugs at a late stage of the drugs’ synthesis, scientists may be able to track these molecules’ paths through our bodies.

Giant Holes in the Ground



An expected nuclear renaissance has failed to materialize as plans for new plants are scrapped or delayed. What happened?

  • BY MATTHEW L. WALD


At the edge of the massive excavation project that is a preliminary step to building America's biggest nuclear power plant, Joshua Elkins stands next to two holes that span 42 acres in the red Georgia clay. Elkins maintains the earth-moving equipment that dug these holes, each as big as 15 football fields, 90 feet down to bedrock and then painstakingly refilled them to about 50 feet with soil tested to maintain stability in an earthquake. In helping to lay the foundation for the two 1,100-megawatt reactors the Southern Company is building here, his machines will contour the earth to specifications meticulously measured by GPS.
The last time anybody in the United States did excavation work for a new nuclear reactor, Elkins, who turned 27 in October, had not been born. Indeed, the groundbreaking for these Westinghouse-designed reactors at the Vogtle nuclear plant, 35 miles south of Augusta, Georgia, represents the first new nuclear construction since the 1970s. (Two existing reactors at the plant began commercial operation in 1987 and 1989.) An unlikely coalition of large utility companies, government policy makers, and environmentalists worried about global warming hoped that it and several other large planned plants in the United States would mark the beginning of a nuclear renaissance, with scores of new reactors being built around the country and worldwide.
And at first glance, circumstances finally seem to favor an expansion of nuclear power. Some $18.5 billion in federal loan guarantees was made available to cover as much as 80 percent of the cost of building a new plant, and the loan program may soon offer tens of billions more. (The new Vogtle reactors received $8.3 billion in loan guarantees from the U.S. Department of Energy in February.) President Obama, members of his administration, and the Republican leadership have all called for increased use of nuclear power as part of a long-term strategy for reducing U.S. reliance on fossil fuels. Also on the bandwagon for nuclear power are such influential technologists as Microsoft founder Bill Gates (see Q&A, September/October 2010) and longtime environmentalist Stewart Brand, who have argued that expanding nuclear capacity is essential to meeting growing worldwide electricity demand with zero-carbon energy sources.
But now the renaissance is stalled--both in the United States and in many other parts of the world. Apart from the Vogtle plant, the only U.S. nuclear project on which site work has started is across the Savannah River, near Jenkinsville, South Carolina, where the South Carolina Electric & Gas Company and the South Carolina Public Service Authority are planning to add two reactors to the existing V. C. Summer plant. Although many other utilities have applied for approval of reactor sites or projects in the last few months, most of the plans, including some of the most high-profile examples, have met obstacles. The Chicago-based utility Exelon, which is the nation's largest nuclear operator, with 17 units, has postponed its decision on whether to build a twin-unit nuclear plant in Victoria County, Texas. Two other large nuclear suppliers, NRG Energy and UniStar Nuclear Energy, have put off building long-planned plants in south Texas and Calvert County, Maryland, respectively.
The problems are not confined to the United States: projects are delayed in many nations with high hopes for nuclear power (see "Nuclear Ambitions"). The first of a new class of reactors designed by the French energy giant Areva is being built on Olkiluoto Island in Finland. It was begun in 2005 and was supposed to be in service by 2009; now the estimate is 2013. A second reactor using the Areva design, which is meant to be ultrareliable and features four redundant safety systems, is being built in Flamanville, France, but it seems to have run into similar problems; its target date has been pushed from 2012 to 2014. In Japan, construction schedules for two advanced boiling water reactors, a recent design from General Electric and Hitachi, have slipped by a year. China is constructing 24 reactors and plans to quadruple capacity by 2020, but it is currently a tiny player, producing only 2 percent of its electricity from nuclear power.
Today there are 104 operating nuclear reactors in the United States, supplying about 20 percent of electricity generated. Many have increased their capacity, by up to 20 percent, and most operate more than 90 percent of the time, which is slightly more than coal- or gas-fired power plants and much more than wind farms or solar plants. But all are aging. Jay Apt, the executive director of the Electricity Industry Center at Carnegie Mellon University, says that as old plants are retired and demand for electricity grows, the role of nuclear power could actually shrink. "I don't think it's a question of whether nuclear plants will be able to shoulder more of the burden," he says. "It's more a question of whether the nuclear plants will be able to continue shouldering the current share. Nuclear is going to have to run very fast to stay in place."
COST CONUNDRUM
Although the debate over nuclear power often focuses on thorny questions about its safety and its usefulness as a zero-carbon source of energy, the stumbling block to building more reactors in the United States has been, simply, cost. The price tag for the Vogtle reactors is expected to be between $12 billion and $14 billion, depending in part on what it costs the owners to borrow money for construction. The $14 billion estimate puts the price of the plant at about $6,000 per kilowatt (enough power to keep a window air conditioner running). That's far higher than the cost for other types of plants, whether they use renewable or fossil fuels. Building wind-turbine capacity costs roughly $2,000 to $2,500 per kilowatt; for gas-fired capacity, the figure is only $950 to $1,175. Advocates argue that despite the higher capital costs of a nuclear plant, those costs can be recovered to make nuclear power cheap over time: after all, a new plant is designed to run for 60 years, its operating and fuel costs are relatively low, and it can operate almost continuously, unlike plants that generate electricity from renewable sources. The problem is that the comparative costs of different fuels--and even the relative costs of building, say, a nuclear plant and a wind farm--can shift radically, throwing these calculations into doubt.
Whether billions of dollars for a new reactor is a smart investment depends on complex and unpredictable factors: the future cost of fossil fuels, for example, and the price, if any, placed on carbon emissions through government policy. In a 2008 analysis, the financial and asset management group Lazard looked at numerous energy technologies and, for each, estimated a "levelized cost of energy," which takes into account the expected lifetime of the generator, the estimated cost of the fuel, and the value of invested money over time. The analysis put the price of electricity generated from nuclear power at $98 to $126 per megawatt-hour; for wind, the estimate was $44 to $91 per megawatt-hour, and for natural gas, it was $73 to $100 per megawatt-hour. The range in each set of numbers hints at the uncertainty. A more recent estimate by the Energy Information Administration, working from different assumptions, gave a more optimistic scenario for nuclear, putting its cost well below that of wind and other renewable sources and making it appear more competitive with fossil fuels (see "Nuclear Ambitions," p. 64). But Lazard's numbers also show that natural-gas-fired plants could produce electricity as cheaply as $59 per megawatt-hour, and coal plants as cheaply as $67 per megawatt-hour, if the prices of those fuels drop further. As it happens, prices of natural gas are currently low, and vast, accessible U.S. reserves have recently been found (see "Natural Gas Changes the Energy Map," November/December 2009).
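The "levelized cost" idea can be sketched in a few lines: discount every year's costs and every year's generation back to present value, then divide. This is a deliberately simplified model with illustrative inputs, not Lazard's or the Energy Information Administration's actual methodology; the capital cost, annual operating cost, and 7 percent discount rate below are assumptions for the sake of the example.

```python
def lcoe(capital: float, annual_cost: float, annual_mwh: float,
         years: int, discount_rate: float) -> float:
    """Simplified levelized cost of energy, in dollars per MWh.

    Discounts each year's costs and generation to present value,
    then divides total cost by total energy.
    """
    pv_costs = capital       # capital is spent up front
    pv_energy = 0.0
    for t in range(1, years + 1):
        factor = (1 + discount_rate) ** t
        pv_costs += annual_cost / factor
        pv_energy += annual_mwh / factor
    return pv_costs / pv_energy

# Illustrative inputs: a 1,100 MW plant at a 90% capacity factor
# generates about 8.67 million MWh a year.
annual_mwh = 1100 * 0.90 * 8760
print(lcoe(14e9, 300e6, annual_mwh, 60, 0.07))
```

Even this toy version shows why the estimates diverge so widely: nudging the discount rate, the assumed lifetime, or the fuel cost moves the answer by tens of dollars per megawatt-hour.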


So over time nuclear power may or may not produce electricity more cheaply than fossil fuels or renewable sources. For power companies, choosing to build a nuclear plant is thus an extremely risky decision, especially in tough financial times.
It is not coincidental that what signs of life the industry shows in the United States are mostly in the South, where so-called "cost-of-service" regulation guarantees some profit. When the plant is finished, accountants calculate the total amount the utility has invested in construction and equipment, minus depreciation. That "rate base," along with fuel, labor, and maintenance expenses, is used to figure the utility's cost of providing service; the rate customers pay is based on that cost plus an authorized rate of return on the capital investments. Thus most of the risk is borne by customers, not investors.
About half the United States uses a radically different pricing model, however. In Texas, for example, most electricity is sold in a daily auction. All generators get the same price for their electricity; that price is usually determined by the cost of natural gas used in the last few plants needed to generate the day's supply. Exelon says that the current low price of natural gas, around $4.50 per million British thermal units (BTUs), made building a new nuclear plant unthinkable. "We don't have the right stimulus now," says Christopher M. Crane, Exelon's president and chief operating officer. To make a new nuclear plant economically viable, he says, the price of natural gas would have to nearly double, to $8 per million BTUs, and a government cap-and-trade system would have to put a price on carbon dioxide emissions amounting to $25 a ton or more.
Carbon pricing alone could immediately make nuclear far more attractive. A typical power plant running on pulverized coal puts out just under two pounds of carbon dioxide per kilowatt-hour, so a carbon tax or a market price of $10 per ton of carbon pollution would cost that plant about a penny per kilowatt-hour--which is a lot considering that the average kilowatt-hour sells for about 10 cents. A 1,100-megawatt reactor operating 90 percent of the hours in a year would gain a cost advantage of about $87 million a year per $10 of carbon price, and some industry analysts project prices of $60 or $80 a ton, meaning a cost advantage of hundreds of millions of dollars a year. But all this is still only theoretical, says Carnegie Mellon's Apt. "We don't have a climate bill, and right now there's a lot of uncertainty whether we ever will," he points out.
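The arithmetic in that paragraph can be checked directly. A quick sketch, assuming the article's "ton" is a 2,000-pound short ton:

```python
# Back-of-the-envelope check of the carbon-price figures above.
LB_CO2_PER_KWH = 2.0      # pulverized-coal emissions, per the article
CARBON_PRICE = 10.0       # dollars per ton of CO2
LB_PER_TON = 2000.0       # assumption: short ton

# Extra cost per coal-fired kilowatt-hour: about a penny.
cost_per_kwh = LB_CO2_PER_KWH / LB_PER_TON * CARBON_PRICE

# Annual advantage for a 1,100-megawatt reactor at a 90% capacity factor.
kwh_per_year = 1100 * 1000 * 0.90 * 8760   # kW times hours in a year
advantage_per_year = kwh_per_year * cost_per_kwh

print(round(cost_per_kwh, 4))             # 0.01 dollars per kWh
print(round(advantage_per_year / 1e6))    # 87 (million dollars)
```

At $60 to $80 a ton, the same calculation scales linearly to a cost advantage of roughly $520 million to $690 million a year.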
Nevertheless, some observers believe that if we want to achieve low-carbon energy, and replace gasoline- and diesel-powered vehicles with electric cars, we will eventually and inevitably need nuclear power. The nuclear industry "has certainly been hit by the financial meltdown and the worldwide recession," says Brian D. Wirth, a professor of computational nuclear engineering at the University of Tennessee. But he predicts that the demand for electricity, and the price of natural gas, will snap back in three to five years, creating a new opening for nuclear.
INVENTING SMALLER
Since much of the high cost--and financial risk--of nuclear power is tied to the expense of building large plants, one obvious prescription is for smaller reactors and modular designs. Though they would show higher costs per kilowatt of capacity, smaller plants could represent far less financial risk, and far more flexibility for utilities that must adapt to shifting electricity demands.
Some designs are already moving toward production. NuScale Power, a company in Corvallis, Oregon, has developed a plan for a modular unit that measures 60 feet by 14 feet and weighs 300 tons--small enough to be moved by rail or barge. An installation might consist of one unit or up to 24, each generating a mere 45 megawatts. The company says that in case of accident or unexpected shutdown, the heat is carried away by natural circulation, so no emergency pumps and valves are required--and that in a worst-case scenario, no individual unit could release enough radiation to necessitate a plan for evacuating the surrounding area.


Babcock & Wilcox, a giant construction and engineering firm based in Charlotte, North Carolina, has another modular design, for a 125-megawatt reactor. It would be built in a factory and shipped to an underground silo, reducing the risk of a successful terrorist attack. The reactor would run four years without requiring refueling, which is about double the longest cycle that is now common.
These plans represent something the industry has not seen in decades: private-sector engineers who think they can make money by being entrepreneurial with new reactor designs. Per F. Peterson, a professor of nuclear engineering at the University of California, Berkeley, says he has high hopes for the smaller reactors, partly because each one represents less risk for investors willing to take a chance on something new. "The first-mover barriers and difficulties are so much smaller for the small modular reactor," he says.
DECAYING PROSPECTS
At the Vogtle site, in a sprawling array of temporary office trailers, David Jones is overseeing construction of the new reactors. A 30-year veteran at the Southern Company who previously served as vice president of engineering for its six existing reactors, Jones started his career in the nuclear industry in the mid-1970s, using a tape measure to make sure that the steel reinforcing bars in the main auxiliary building at the Tennessee Valley Authority's Bellefonte 1 plant were the right thickness and distance apart. His dream was to help the TVA build nuclear plants up and down the Tennessee River. But construction on Bellefonte was stopped in the late 1980s. The TVA had underestimated its costs and overestimated both the demand for electricity and its ability to manage nuclear projects.
These days, Jones is quite conscious that the parade of planned nuclear projects behind him has disappeared. But, he says, "someone has to be first, and we are first." If the Southern Company can "prove that [nuclear] is a viable option" by finishing the job on schedule and on budget, he maintains, others will follow.
Not everyone is so sure. Richard Lester, chairman of the nuclear science and engineering department at MIT, says it is uncertain whether success at the Georgia site and other planned nuclear plants would be enough to encourage a nuclear resurgence in this country. He points out that these early plants will have extensive government help that is unlikely to be available to would-be successors. "The question is whether one can see a path from those first few built under pretty exceptional circumstances--probably not sustainable circumstances when it comes to government support," he says. "If one takes the larger view, what really counts here is whether we can get up to 300 or 400 [plants]." He adds, "Even if one could get three or four or five new nuclear plants built in the U.S., the question was always going to be: well, what then? That question is still very much on the table."
Severin Borenstein, codirector of the Energy Institute at the Haas School of Business at the University of California, Berkeley, blames the standstill on a failure to pass legislation to address the threat of climate change.
"It's hard to be very optimistic about it at this point," Borenstein says of the outlook for the nuclear industry. "The original impetus behind the nuclear renaissance was [that nuclear energy would be] low-cost and low-carbon. It's not turning out to be nearly as low-cost as the proponents claimed, and the electorate is not turning out to care that much about low-carbon. This idea that the age of coal is over is not true."
Matthew L. Wald is a reporter at the New York Times. He frequently covers the nuclear industry.