Science Daily — Future computers may rely on magnetic microprocessors that consume the least amount of energy allowed by the laws of physics, according to an analysis by University of California, Berkeley, electrical engineers.
Monday, July 4, 2011
Ultimate Energy Efficiency: Magnetic Microprocessors Could Use Million Times Less Energy Than Today's Silicon Chips
Nanomagnetic computers use tiny bar magnets to store and process information. The interactions between the polarized, north-south magnetic fields of closely spaced magnets allow logic operations like those in conventional transistors. (Credit: Jeffrey Bokor lab, UC Berkeley)
Such chips would dissipate only 18 millielectron volts of energy per operation at room temperature, the minimum allowed by the second law of thermodynamics and called the Landauer limit. That's 1 million times less energy per operation than consumed by today's computers.

Today's silicon-based microprocessor chips rely on electric currents, or moving electrons, that generate a lot of waste heat. But microprocessors employing nanometer-sized bar magnets -- like tiny refrigerator magnets -- for memory, logic and switching operations theoretically would require no moving electrons.

"Today, computers run on electricity; by moving electrons around a circuit, you can process information," said Brian Lambson, a UC Berkeley graduate student in the Department of Electrical Engineering and Computer Sciences. "A magnetic computer, on the other hand, doesn't involve any moving electrons. You store and process information using magnets, and if you make these magnets really small, you can basically pack them very close together so that they interact with one another. This is how we are able to do computations, have memory and conduct all the functions of a computer."

Lambson is working with Jeffrey Bokor, UC Berkeley professor of electrical engineering and computer sciences, to develop magnetic computers.
"In principle, one could, I think, build real circuits that would operate right at the Landauer limit," said Bokor, who is a codirector of the Center for Energy Efficient Electronics Science (E3S), a Science and Technology Center founded last year with a $25 million grant from the National Science Foundation. "Even if we could get within one order of magnitude, a factor of 10, of the Landauer limit, it would represent a huge reduction in energy consumption for electronics. It would be absolutely revolutionary."
One of the center's goals is to build computers that operate at the Landauer limit.
Lambson, Bokor and UC Berkeley graduate student David Carlton published a paper about their analysis online in the journal Physical Review Letters.
Fifty years ago, Rolf Landauer used newly developed information theory to calculate the minimum energy a logical operation, such as an AND or OR operation, would dissipate given the limitation imposed by the second law of thermodynamics. (In a standard logic gate with two inputs and one output, an AND operation produces an output when it has two positive inputs, while an OR operation produces an output when one or both inputs are positive.) That law states that an irreversible process -- a logical operation or the erasure of a bit of information -- dissipates energy that cannot be recovered. In other words, the entropy of any closed system cannot decrease.
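As a quick sanity check on the figures above, the Landauer limit works out to E = kT ln 2 per bit erased. A minimal sketch (standard physical constants; the numbers are not taken from the paper) reproduces the roughly 18 millielectron volts quoted earlier, and shows how the limit drops with temperature:

    # Sanity check of the Landauer limit, E = k*T*ln(2) per erased bit.
    # Illustrative sketch only; uses standard constants, not values from the paper.
    import math

    K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV per kelvin

    def landauer_limit_mev(temperature_k: float) -> float:
        """Minimum dissipation per irreversible bit operation, in meV."""
        return K_BOLTZMANN_EV * temperature_k * math.log(2) * 1000.0

    print(landauer_limit_mev(300.0))  # ~17.9 meV at room temperature (the ~18 meV above)
    print(landauer_limit_mev(77.0))   # ~4.6 meV at liquid-nitrogen temperature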
In today's transistors and microprocessors, this limit is far below other energy losses that generate heat, primarily through the electrical resistance of moving electrons. However, researchers such as Bokor are trying to develop computers that don't rely on moving electrons, and thus could approach the Landauer limit. Lambson decided to theoretically and experimentally test the limiting energy efficiency of a simple magnetic logic circuit and magnetic memory.
The nanomagnets that Bokor and Lambson use in their lab to build magnetic memory and logic devices are about 100 nanometers wide and about 200 nanometers long. Because they have the same north-south polarity as a bar magnet, the up-or-down orientation of the poles can be used to represent the 0 and 1 of binary computer memory. In addition, when multiple nanomagnets are brought together, their north and south poles interact via dipole-dipole forces to exhibit transistor behavior, allowing simple logic operations.
"The magnets themselves are the built-in memory," Lambson said. "The real challenge is getting the wires and transistors working."
Lambson showed through calculations and computer simulations that a simple memory operation -- erasing a magnetic bit, an operation often called "restore to one" -- can be conducted with an energy dissipation very close to, if not identical to, the Landauer limit.
He subsequently analyzed a simple magnetic logical operation. The first successful demonstration of a logical operation using magnetic nanoparticles was achieved by researchers at the University of Notre Dame in 2006. In that case, they built a three-input majority logic gate using 16 coupled nanomagnets. Lambson calculated that a computation with such a circuit would also dissipate energy at the Landauer limit.
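The reason a majority gate is a useful building block is that holding one of its three inputs fixed turns it into an AND or an OR gate. A minimal truth-table sketch (hypothetical Python, not the Notre Dame nanomagnet implementation) makes this concrete:

    # Sketch of three-input majority logic, the gate type demonstrated with
    # coupled nanomagnets at Notre Dame. Hypothetical code, not their implementation.

    def majority(a: int, b: int, c: int) -> int:
        """Return 1 if at least two of the three binary inputs are 1."""
        return 1 if a + b + c >= 2 else 0

    # Pinning one input selects the logic function, which is what makes the
    # majority gate a useful building block:
    assert all(majority(a, b, 0) == (a & b) for a in (0, 1) for b in (0, 1))  # AND
    assert all(majority(a, b, 1) == (a | b) for a in (0, 1) for b in (0, 1))  # OR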
Because the Landauer limit is proportional to temperature, circuits cooled to low temperatures would be even more efficient.
At the moment, electrical currents are used to generate a magnetic field to erase or flip the polarity of nanomagnets, which dissipates a lot of energy. Ideally, new materials will make electrical currents unnecessary, except perhaps for relaying information from one chip to another.
"Then you can start thinking about operating these circuits at the upper efficiency limits," Lambson said.
"We are working now with collaborators to figure out a way to put that energy in without using a magnetic field, which is very hard to do efficiently," Bokor said. "A multiferroic material, for example, may be able to control magnetism directly with a voltage rather than an external magnetic field."
Other obstacles remain as well. For example, as researchers push the power consumption down, devices become more susceptible to random fluctuations from thermal effects, stray electromagnetic fields and other kinds of noise.
"The magnetic technology we are working on looks very interesting for ultra low power uses," Bokor said. "We are trying to figure out how to make it more competitive in speed, performance and reliability. We need to guarantee that it gets the right answer every single time with a very, very, very high degree of reliability."
The work was supported by NSF and the Defense Advanced Research Projects Agency.
Plastic Found in Nine Percent of 'Garbage Patch' Fishes: Tens of Thousands of Tons of Debris Annually Ingested
Science Daily — The first scientific results from an ambitious voyage led by a group of graduate students from Scripps Institution of Oceanography at UC San Diego offer a stark view of human pollution and its infiltration of an area of the ocean that has been labeled as the "Great Pacific Garbage Patch."
Two graduate students with the Scripps Environmental Accumulation of Plastic Expedition, or SEAPLEX, found evidence of plastic waste in more than nine percent of the stomachs of fish collected during their voyage to the North Pacific Subtropical Gyre. Based on their evidence, authors Peter Davison and Rebecca Asch estimate that fish in the intermediate ocean depths of the North Pacific ingest plastic at a rate of roughly 12,000 to 24,000 tons per year.
Their results were published June 27 in the journal Marine Ecology Progress Series.
During the SEAPLEX voyage in August 2009, a team of Scripps graduate students traveled more than 1,000 miles west of California to the eastern sector of the North Pacific Subtropical Gyre aboard the Scripps research vessel New Horizon. Over 20 days the students, New Horizon crew and expedition volunteers conducted comprehensive and rigorous scientific sampling at numerous locations. They collected fish specimens, water samples and marine debris at depths ranging from the sea surface to thousands of feet.
Of the 141 fishes spanning 27 species dissected in the study, Davison and Asch found that 9.2 percent of the mid-water fishes had plastic debris in their stomachs, primarily broken-down bits smaller than a human fingernail. The researchers say the majority of the stomach plastic pieces were so small their origin could not be determined.
"About nine percent of examined fishes contained plastic in their stomach. That is an underestimate of the true ingestion rate because a fish may regurgitate or pass a plastic item, or even die from eating it. We didn't measure those rates, so our nine percent figure is too low by an unknown amount," said Davison.
The authors say previous studies on fish and plastic ingestion may have included so-called "net-feeding" biases. Net feeding can lead to artificially high cases of plastic ingestion by fishes while they are confined in a net with a high concentration of plastic debris. The Scripps study was designed to avoid such bias. The highest concentrations of plastic were retrieved by a surface collecting device called a "manta net," which sampled for only 15 minutes at a time. The short sampling time minimizes the risk of net feeding by preventing large concentrations of plastic from building up, and also by reducing the amount of time that a captured fish spends in the net. In addition to the manta net, the fishes were also collected with other nets that sample deeper in the water column, where there is less plastic to be ingested through net feeding.
The new study focused on the prevalence of plastic ingestion, but effects such as toxicological impacts on fish and composition of the plastic were outside of the study's goals.
The majority of fish examined in the study were myctophids, commonly called lanternfish because of their luminescent tissue. Lanternfishes are hypothesized to use luminescence for several purposes, including counter-illumination (which thwarts predators attempting to silhouette the lanternfish against sunlight), mate attraction and identification, and illumination of prey. Such fish generally inhabit depths of 200 to 1,000 meters (650 to 3,280 feet) during the day and swim to the surface at night.
"These fish have an important role in the food chain because they connect plankton at the base of the food chain with higher levels. We have estimated the incidence at which plastic is entering the food chain and I think there are potential impacts, but what those impacts are will take more research," said Asch.
Rather than a visible "patch" or "island" of trash, marine debris is highly dispersed across thousands of miles of the North Pacific Subtropical Gyre. The debris area cannot be mapped from air or space, so SEAPLEX researchers collected samples in 132 net tows (130 of which contained plastic) across a distance of more than 2,375 kilometers (1,700 miles) in an attempt to find the boundaries of the patch. The region, a "convergence zone" where floating debris in water congregates, is generally avoided by mariners due to its calm winds and mild currents. The North Pacific Subtropical Gyre has been understudied by scientists, leaving many open questions about marine debris in the area and its long-term effects on the marine environment.
"This study clearly emphasizes the importance of directly sampling in the environment where the impacts may be occurring," said James Leichter, a Scripps associate professor of biological oceanography who participated in the SEAPLEX expedition but was not an author of the new paper. "We are seeing that most of our prior predictions and expectations about potential impacts have been based on speculation rather than evidence and in many cases we have in fact underestimated the magnitude of effects. SEAPLEX also clearly illustrates how relatively small amounts of funding directed for novel field sampling and work in remote places can vastly increase our knowledge and understanding of environmental problems."
SEAPLEX was supported by the UC Ship Funds program, Project Kaisei/Ocean Voyages Institute and the National Science Foundation.
NASA's Spitzer Finds Distant Galaxies Grazed On Gas
This split view shows how a normal spiral galaxy around our local universe (left) might have looked back in the distant universe, when astronomers think galaxies would have been filled with larger populations of hot, bright stars (right). (Credit: NASA/JPL-Caltech/STScI)
Science Daily — Galaxies once thought of as voracious tigers are more like grazing cows, according to a new study using NASA's Spitzer Space Telescope.
"Our study shows the merging of massive galaxies was not the dominant method of galaxy growth in the distant universe," said Ranga-Ram Chary of NASA's Spitzer Science Center at the California Institute of Technology in Pasadena, Calif. "We're finding this type of galactic cannibalism was rare. Instead, we are seeing evidence for a mechanism of galaxy growth in which a typical galaxy fed itself through a steady stream of gas, making stars at a much faster rate than previously thought."Astronomers have discovered that galaxies in the distant, early universe continuously ingested their star-making fuel over long periods of time. This goes against previous theories that the galaxies devoured their fuel in quick bursts after run-ins with other galaxies.
Chary is the principal investigator of the research, appearing in the Aug. 1 issue of the Astrophysical Journal. According to his findings, these grazing galaxies fed steadily over periods of hundreds of millions of years and created an unusually large number of plump stars, up to 100 times the mass of our sun.
"This is the first time that we have identified galaxies that supersized themselves by grazing," said Hyunjin Shim, also of the Spitzer Science Center and lead author of the paper. "They have many more massive stars than our Milky Way galaxy."
Galaxies like our Milky Way are giant collections of stars, gas and dust. They grow in size by feeding off gas and converting it to new stars. A long-standing question in astronomy is: Where did distant galaxies that formed billions of years ago acquire this stellar fuel? The most favored theory was that galaxies grew by merging with other galaxies, feeding off gas stirred up in the collisions.
Chary and his team addressed this question by using Spitzer to survey more than 70 remote galaxies that existed 1 to 2 billion years after the Big Bang (our universe is approximately 13.7 billion years old). To their surprise, these galaxies were blazing with what is called H alpha, which is radiation from hydrogen gas that has been hit with ultraviolet light from stars. High levels of H alpha indicate stars are forming vigorously. Seventy percent of the surveyed galaxies show strong signs of H alpha. By contrast, only 0.1 percent of galaxies in our local universe possess this signature.
Previous studies using ultraviolet-light telescopes found about six times less star formation than Spitzer, which sees infrared light. Scientists think this may be due to large amounts of obscuring dust, through which infrared light can sneak. Spitzer opened a new window onto the galaxies by taking very long-exposure infrared images of a patch of sky called the GOODS fields, for Great Observatories Origins Deep Survey.
Further analyses showed that these galaxies furiously formed stars up to 100 times faster than the current star-formation rate of our Milky Way. What's more, the star formation took place over a long period of time, hundreds of millions of years. This tells astronomers that the galaxies did not grow due to mergers, or collisions, which happen on shorter timescales. While such smash-ups are common in the universe -- for example, our Milky Way will merge with the Andromeda galaxy in about 5 billion years -- the new study shows that large mergers were not the main cause of galaxy growth. Instead, the results show that distant, giant galaxies bulked up by feeding off a steady supply of gas that probably streamed in from filaments of dark matter.
Chary said, "If you could visit a planet in one of these galaxies, the sky would be a crazy place, with tons of bright stars, and fairly frequent supernova explosions."
NASA's Jet Propulsion Laboratory in Pasadena, Calif., manages the Spitzer Space Telescope mission for the agency's Science Mission Directorate in Washington. Science operations are conducted at the Spitzer Science Center at Caltech. Caltech manages JPL for NASA.
For more information about Spitzer, visit http://www.nasa.gov/spitzer and http://spitzer.caltech.edu/
Scientists Use 'Optogenetics' to Control Reward-Seeking Behavior
Science Daily — Using a combination of genetic engineering and laser technology, researchers at the University of North Carolina at Chapel Hill have manipulated brain wiring responsible for reward-seeking behaviors, such as drug addiction. The work, conducted in rodent models, is the first to directly demonstrate the role of these specific connections in controlling behavior.
"For most clinical disorders we knew that one region or another in the brain was important, however until now we didn't have the tools to directly study the connections between those regions," said senior study author Garret D. Stuber, PhD, assistant professor in the departments of cell and molecular physiology, psychiatry and the Neuroscience Center in UNC School of Medicine. "Our ability to perform this level of sophistication in neural circuit manipulation will likely to lead to the discovery of molecular players perturbed during neuropsychiatric illnesses."The UNC study, published online on June 29, 2011, by the journal Nature, uses a cutting-edge technique called "optogenetics" to tweak the microcircuitry of the brain and then assess how those changes impact behavior. The findings suggest that therapeutics targeting the path between two critical brain regions, namely the amygdala and the nucleus accumbens, represent potential treatments for addiction and other neuropsychiatric diseases.Because the brain is composed of diverse regions, cell types and connections in a compact space, pinpointing which entity is responsible for what function can be quite tricky. In the past, researchers have tried to get a glimpse into the inner workings of the brain using electrical stimulation or drugs, but those techniques couldn't quickly and specifically change only one type of cell or one type of connection. But optogenetics, a technique that emerged six years ago, can.In the technique, scientists transfer light-sensitive proteins called "opsins" -- derived from algae or bacteria that need light to grow -- into the mammalian brain cells they wish to study. Then they shine laser beams onto the genetically manipulated brain cells, either exciting or blocking their activity with millisecond precision.In Stuber's initial experiments, the target was the nerve cells connecting two separate brain regions associated with reward, the amygdala and the nucleus accumbens. The researchers used light to activate the connections between these regions, essentially "rewarding" the mice with laser stimulations for performing the mundane task of poking their nose into a hole in their cage. They found that the opsin treated mice quickly learned to "nosepoke" in order to receive stimulation of the neural pathway. In comparison, the genetically untouched control mice never caught onto the task.
Then Stuber and his colleagues wanted to see whether this brain wiring had a role in more natural behavioral processes. So they trained mice to associate a cue -- a light bulb in the cage turning on -- to a reward of sugar water. This time the opsin that the researchers transferred into the brains of their rodent subjects was one that would shut down the activity of neural connections in response to light. As they delivered the simple cue to the control mice, they also blocked the neuronal activity in the genetically altered mice. The control mice quickly began responding to the cue by licking the sugar-producing vessel in anticipation, whereas the treated mice did not give the same response.
The researchers are now exploring how changes to this segment of brain wiring can either make an animal sensitized to or oblivious to rewards. Stuber says their approach presents an incredibly useful tool for studying basic brain function, and could one day provide a powerful alternative to electrical stimulation or pharmacotherapy for neuropsychiatric illnesses like Parkinson's disease.
"For late-stage Parkinson's disease it has become more routine to use deep brain stimulation, where electrodes are chronically implanted into brain tissue, constantly stimulating the tissue to alleviate some of the disease symptoms," said Stuber. "From the technical perspective, implanting our optical fibers is not going to be more difficult than that. But there is quite a bit of work to be done before we get to that point."
The research was funded by NARSAD: The Brain & Behavior Research Fund; ABMRF/ The Foundation for Alcohol Research; The Foundation of Hope; and the National Institute on Drug Abuse, a component of NIH.
Study co-authors from Stuber's laboratory at UNC include Dennis R. Sparta, PhD, postdoctoral fellow, and Alice M. Stamatakis, graduate student.
Quantum 'Graininess' of Space at Smaller Scales? Gamma-Ray Observatory Challenges Physics Beyond Einstein
Gamma-ray burst captured by Integral's IBIS instrument. (Credit: ESA/SPI Team/ECF)
Science Daily — The European Space Agency's Integral gamma-ray observatory has provided results that will dramatically affect the search for physics beyond Einstein. It has shown that any underlying quantum 'graininess' of space must be at much smaller scales than previously predicted.
Einstein's General Theory of Relativity describes the properties of gravity and assumes that space is a smooth, continuous fabric. Yet quantum theory suggests that space should be grainy at the smallest scales, like sand on a beach. One of the great endeavours of modern physics is to marry these two concepts into a single theory of quantum gravity.

Now, Integral has placed stringent new limits on the size of these quantum 'grains' in space, showing them to be much smaller than some quantum gravity ideas would suggest.

According to calculations, the tiny grains would affect the way that gamma rays travel through space. The grains should 'twist' the light rays, changing the direction in which they oscillate, a property called polarisation. High-energy gamma rays should be twisted more than the lower-energy ones, and the difference in the polarisation can be used to estimate the size of the grains.

Philippe Laurent of CEA Saclay and his collaborators used data from Integral's IBIS instrument to search for the difference in polarisation between high- and low-energy gamma rays emitted during one of the most powerful gamma-ray bursts (GRBs) ever seen.

GRBs come from some of the most energetic explosions known in the Universe. Most are thought to occur when very massive stars collapse into neutron stars or black holes during a supernova, leading to a huge pulse of gamma rays lasting just seconds or minutes, but briefly outshining entire galaxies.

GRB 041219A took place on 19 December 2004 and was immediately recognised as being in the top 1% of GRBs for brightness. It was so bright that Integral was able to measure the polarisation of its gamma rays accurately.

Dr Laurent and colleagues searched for differences in the polarisation at different energies, but found none to the accuracy limits of the data.

Some theories suggest that the quantum nature of space should manifest itself at the 'Planck scale': the minuscule scale of 10^-35 of a metre (where a millimetre is 10^-3 m).
However, Integral's observations are about 10,000 times more accurate than any previous measurement and show that any quantum graininess must be at a level of 10^-48 m or smaller.
"This is a very important result in fundamental physics and will rule out some string theories and quantum loop gravity theories," says Dr Laurent.
Integral made a similar observation in 2006, when it detected polarised emission from the Crab Nebula, the remnant of a supernova explosion just 6500 light years from Earth in our own galaxy.
This new observation is much more stringent, however, because GRB 041219A was at a distance estimated to be at least 300 million light years.
In principle, the tiny twisting effect due to the quantum grains should have accumulated over the very large distance into a detectable signal. Because nothing was seen, the grains must be even smaller than previously suspected.
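A common parameterization in the quantum-gravity literature (an assumption for illustration; the article does not give the formula) writes the accumulated rotation as dtheta ≈ xi * E^2 * d / (2 * M_Planck) in natural units, so a null detection over a large distance d translates into an upper bound on the dimensionless parameter xi. A rough sketch of the arithmetic:

    # Rough sketch: turning a null polarisation measurement into a bound on
    # quantum 'graininess'. Assumes the common parameterization
    # dtheta = xi * E^2 * d / (2 * M_Planck) in natural units; the exact model
    # and numbers used by Laurent et al. may differ.
    M_PLANCK_GEV = 1.22e19           # Planck energy in GeV
    GEV_INV_PER_METRE = 5.068e15     # 1 metre expressed in GeV^-1
    METRES_PER_LIGHT_YEAR = 9.461e15

    def rotation_rad(xi: float, energy_gev: float, distance_m: float) -> float:
        """Accumulated polarisation rotation under the assumed dispersion relation."""
        return xi * energy_gev**2 * (distance_m * GEV_INV_PER_METRE) / (2.0 * M_PLANCK_GEV)

    # GRB 041219A: take ~300 keV photons over at least 300 million light years.
    energy_gev = 300e3 * 1e-9                      # 300 keV in GeV
    distance_m = 300e6 * METRES_PER_LIGHT_YEAR

    # If no energy-dependent rotation bigger than ~1 radian is seen, xi must satisfy:
    xi_bound = 1.0 / rotation_rad(1.0, energy_gev, distance_m)
    print(f"xi < {xi_bound:.1e}")                  # of order 1e-14, far below the Planck scale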
"Fundamental physics is a less obvious application for the gamma-ray observatory, Integral," notes Christoph Winkler, ESA's Integral Project Scientist. "Nevertheless, it has allowed us to take a big step forward in investigating the nature of space itself."
Now it's over to the theoreticians, who must re-examine their theories in the light of this new result.
Loudest Animal Is Recorded for the First Time
Science Daily — Scientists have shown for the first time that the loudest animal on earth, relative to its body size, is the tiny water boatman, Micronecta scholtzi. At 99.2 decibels, this represents the equivalent of listening to an orchestra play loudly while sitting in the front row.
The song, used by males to attract mates, is produced by rubbing two body parts together, in a process called stridulation. In water boatmen the area used for stridulation is only about 50 micrometres across, roughly the width of a human hair.

The frequency of the sound (around 10 kHz) is within human hearing range, and Dr. James Windmill of the University of Strathclyde explains one clue as to how loud the animals are: "Remarkably, even though 99% of sound is lost when transferring from water to air, the song is so loud that a person walking along the bank can actually hear these tiny creatures singing from the bottom of the river."

"We really don't know how they make such a loud sound using such a small area," says Dr. Windmill.
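That loss figure reflects the acoustic impedance mismatch between water and air. As a rough illustration (textbook impedance values and the standard normal-incidence formula, not numbers from the study), the transmitted fraction of sound intensity can be estimated as follows:

    # Rough illustration of why nearly all of the song's energy is lost at the
    # water surface: intensity transmission across a water-air boundary at
    # normal incidence, using textbook impedance values (not from the study).
    Z_WATER = 1.48e6  # acoustic impedance of water, Pa*s/m (rayls)
    Z_AIR = 415.0     # acoustic impedance of air, Pa*s/m (rayls)

    def intensity_transmission(z1: float, z2: float) -> float:
        """Plane-wave intensity transmission coefficient at normal incidence."""
        return 4.0 * z1 * z2 / (z1 + z2) ** 2

    t = intensity_transmission(Z_WATER, Z_AIR)
    print(f"transmitted fraction: {t:.2e}")  # ~1e-3; almost all the energy is reflected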
The researchers, who are presenting their work at the Society for Experimental Biology Annual Conference in Glasgow, are now keen to bring together aspects of biology and engineering to clarify how and why such a small animal makes such a loud noise, and to explore the practical applications. Dr. Windmill explains: "Biologically this work could be helpful in conservation as recordings of insect sounds could be used to monitor biodiversity. From the engineering side it could be used to inform our work in acoustics, such as in sonar systems."