Sunday, October 2, 2011

Neuroscientists Record Novel Responses to Faces from Single Neurons in Humans



This figure shows the kind of stimuli used in the study: whole faces (left) and partly revealed faces (middle and right). According to the researchers, the surprising finding was that although neurons respond most strongly to seeing the whole face, they actually respond much less to the middle panel than to the far right panel, even though the middle panel is more similar to the whole face. (Credit: Ralph Adolphs, California Institute of Technology)

Science Daily  — Responding to faces is a critical tool for social interactions between humans. Without the ability to read faces and their expressions, it would be hard to tell friends from strangers upon first glance, let alone a sad person from a happy one. Now, neuroscientists from the California Institute of Technology (Caltech), with the help of collaborators at Huntington Memorial Hospital in Pasadena and Cedars-Sinai Medical Center in Los Angeles, have discovered a novel response to human faces by looking at recordings from brain cells in neurosurgical patients.

"The finding really surprised us," says Ueli Rutishauser, first author on the paper, a former postdoctoral fellow at Caltech, and now a visitor in the Division of Biology. "Here you have neurons that respond well to seeing pictures of whole faces, but when you show them only parts of faces, they actually respond less and less the more of the face you show. That just seems counterintuitive."The finding, described in the journalCurrent Biology, provides the first description of neurons that respond strongly when the patient sees an entire face, but respond much less to a face in which only a very small region has been erased.
The neurons are located in a brain region called the amygdala, which has long been known to be important for the processing of emotions. However, the study results strengthen a growing belief among researchers that the amygdala also has a more general role in processing, and learning about, social stimuli such as faces.
Other researchers have described the amygdala's neuronal response to faces before, but this dramatic selectivity -- which requires the face to be whole in order to elicit a response -- is a new insight.
"Our interpretation of this initially puzzling effect is that the brain cares about representing the entire face, and needs to be highly sensitive to anything wrong with the face, like a part missing," explains Ralph Adolphs, senior author on the study and Bren Professor of Psychology and Neuroscience and professor of biology at Caltech. "This is probably an important mechanism to ensure that we do not mistake one person for another and to help us keep track of many individuals."
The team recorded brain-cell responses in human participants who were awaiting surgery for drug-resistant epileptic seizures. As part of the preparation for surgery, the patients had electrodes implanted in their medial temporal lobes, the area of the brain where the amygdala is located. Using special clinical depth electrodes fitted with very fine wires, the researchers were able to observe the firing of individual neurons as participants looked at images of whole faces and partially revealed faces. These volunteer participants gave the researchers a unique and very rare opportunity to measure responses from single neurons through the implanted depth electrodes, says Rutishauser.
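For readers curious how such recordings are typically quantified, here is a generic sketch, not the authors' actual analysis pipeline: after spike sorting, each neuron is reduced to a list of spike timestamps, and its response to a stimulus is the firing rate in a fixed window after stimulus onset, averaged within each condition. All data values below are hypothetical.

```python
import numpy as np

def firing_rate(spike_times, onset, window=1.0):
    """Spikes/s in the `window` seconds following a stimulus onset (seconds)."""
    spikes = np.asarray(spike_times)
    n = np.count_nonzero((spikes >= onset) & (spikes < onset + window))
    return n / window

# Hypothetical sorted-unit spike timestamps and per-condition trial onsets.
spike_times = [0.15, 0.42, 1.31, 1.37, 1.52, 1.68, 3.9, 5.21, 5.24]
trials = {"whole face": [1.2, 5.0], "partial face": [3.5, 8.0]}

for condition, onsets in trials.items():
    rates = [firing_rate(spike_times, t) for t in onsets]
    print(f"{condition}: mean {np.mean(rates):.1f} spikes/s over {len(onsets)} trials")
```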
"This is really a dream collaboration for basic research scientists," he says. "At Caltech, we are very fortunate to have several nearby hospitals at which the neurosurgeons are interested in such collaborative medical research."
The team plans to continue their studies by looking at how the same neurons respond to emotional stimuli. This future work, combined with the present study results, could be highly valuable for understanding a variety of psychiatric diseases in which this region of the brain is thought to function abnormally, such as mood disorders and autism.

Engineers 'Cook' Promising New Heat-Harvesting Nanomaterials in Microwave Oven


Engineering researchers at Rensselaer Polytechnic Institute have developed new thermoelectric nanomaterials, pictured above, that could lead to techniques for better capturing waste heat and putting it to work. The key ingredients for making marble-sized pellets of the new material are aluminum and a common, everyday microwave oven. (Credit: Rensselaer/Ramanath)
Science Daily  — Waste heat is a byproduct of nearly all electrical devices and industrial processes, from driving a car to flying an aircraft or operating a power plant. Engineering researchers at Rensselaer Polytechnic Institute have developed new nanomaterials that could lead to techniques for better capturing and putting this waste heat to work. The key ingredients for making marble-sized pellets of the new material are aluminum and a common, everyday microwave oven.

Harvesting electricity from waste heat requires a material that is good at conducting electricity but poor at conducting heat. One of the most promising candidates for this job is zinc oxide, a nontoxic, inexpensive material with a high melting point. While nanoengineering techniques exist for boosting the electrical conductivity of zinc oxide, the material's high thermal conductivity is a roadblock to its effectiveness in collecting and converting waste heat. Because thermal and electrical conductivity are related properties, it's very difficult to decrease one without also diminishing the other.
However, a team of researchers led by Ganpati Ramanath, professor in the Department of Materials Science and Engineering at Rensselaer, in collaboration with the University of Wollongong, Australia, has demonstrated a new way to decrease zinc oxide's thermal conductivity without reducing its electrical conductivity. The innovation involves adding minute amounts of aluminum to zinc oxide and processing the material in a microwave oven. The process is adapted from a technique invented at Rensselaer by Ramanath, graduate student Rutvik Mehta, and Theo Borca-Tasciuc, associate professor in the Department of Mechanical, Aerospace, and Nuclear Engineering (MANE). This work could open the door to new technologies for harvesting waste heat and creating highly energy-efficient cars, aircraft, power plants, and other systems.
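The trade-off described here is conventionally captured by the dimensionless thermoelectric figure of merit ZT = S²σT/κ, with Seebeck coefficient S, electrical conductivity σ, absolute temperature T, and thermal conductivity κ. The article does not spell this out, but it is the standard yardstick, and a quick sketch with made-up, order-of-magnitude numbers shows why cutting κ while holding σ pays off:

```python
def figure_of_merit(S, sigma, kappa, T):
    """Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / kappa."""
    return S**2 * sigma * T / kappa

# Made-up, order-of-magnitude numbers for an oxide-like material at 1000 K:
S, sigma, T = 200e-6, 5.0e4, 1000.0   # Seebeck (V/K), elec. conductivity (S/m), temp (K)
for kappa in (20.0, 5.0):             # thermal conductivity (W/m/K): high vs. reduced
    print(f"kappa = {kappa:4.1f} W/m/K -> ZT = {figure_of_merit(S, sigma, kappa, T):.2f}")
```

Quartering the thermal conductivity in this example quadruples ZT, which is exactly why decoupling the two conductivities matters so much.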
"Harvesting waste heat is a very attractive proposition, since we can convert the heat into electricity and use it to power a device -- like in a car or a jet -- that is creating the heat in the first place. This would lead to greater efficiency in nearly everything we do and, ultimately, reduce our dependence on fossil fuels," Ramanath said. "We are the first to demonstrate such favorable thermoelectric properties in bulk-sized high-temperature materials, and we feel that our discovery will pave the way to new power harvesting devices from waste heat."
Results of the study are detailed in a paper published recently by the journal Nano Letters.
To create the new nanomaterial, the researchers added minute quantities of aluminum to shape-controlled zinc oxide nanocrystals and heated them in a $40 microwave oven. Ramanath's team can produce several grams of the nanomaterial in a matter of minutes, enough to make a device a few centimeters long. The process is less expensive and more scalable than conventional methods and is environmentally friendly, Ramanath said. Unlike many nanomaterials that are fabricated directly onto a substrate or surface, the new microwave method produces pellets of nanomaterial that can be applied to different surfaces. These attributes, together with the material's low thermal conductivity and high electrical conductivity, make it highly suitable for heat-harvesting applications.
"Our discovery could be key to overcoming major fundamental challenges related to working with thermoelectric materials," said project collaborator Borca-Tasciuc. "Moreover, our process is amenable to scaling for large-scale production. It's really amazing that a few atoms of aluminum can conspire to give us thermoelectric properties we're interested in."
This work was a collaborative effort between Ramanath and Shi Xue Dou, a professor at the Institute for Superconducting and Electronic Materials at the University of Wollongong, Australia. Wollongong graduate student Priyanka Jood carried out the work together with Rensselaer graduate students Rutvik Mehta and Yanliang Zhang during Jood's one-year visit to Rensselaer. Co-authors of the paper are Richard W. Siegel, the Robert W. Hunt Professor of Materials Science and Engineering at Rensselaer, along with professors Xiaolin Wang and Germanas Peleckis of the University of Wollongong.
This research is funded by support from IBM through the Rensselaer Nanotechnology Center; S3TEC, an Energy Frontier Research Center funded by the U.S. Department of Energy (DoE) Office of Basic Energy Sciences; the Australian Research Council (ARC); and the University of Wollongong.

Scientists Release Most Accurate Simulation of the Universe to Date



The Bolshoi simulation reveals a cosmic web of dark matter that underlies the large-scale structure of the universe and, through its gravitational effects on ordinary matter, drives the formation of galaxies and galaxy clusters. The image is a slice of the entire simulation, 1 billion light-years across and about 30 million light-years thick. (Credit: Stefan Gottlober (AIP))

Science Daily  — The Bolshoi supercomputer simulation, the most accurate and detailed large cosmological simulation run to date, gives physicists and astronomers a powerful new tool for understanding such cosmic mysteries as galaxy formation, dark matter, and dark energy.

"In one sense, you might think the initial results are a little boring, because they basically show that our standard cosmological model works," said Joel Primack, distinguished professor of physics at the University of California, Santa Cruz. "What's exciting is that we now have this highly accurate simulation that will provide the basis for lots of important new studies in the months and years to come."
The simulation traces the evolution of the large-scale structure of the universe, including the evolution and distribution of the dark matter halos in which galaxies coalesced and grew. Initial studies show good agreement between the simulation's predictions and astronomers' observations.
Primack and Anatoly Klypin, professor of astronomy at New Mexico State University, lead the team that produced the Bolshoi simulation. Klypin wrote the computer code for the simulation, which was run on the Pleiades supercomputer at NASA Ames Research Center. "These huge cosmological simulations are essential for interpreting the results of ongoing astronomical observations and for planning the new large surveys of the universe that are expected to help determine the nature of the mysterious dark energy," Klypin said.
Primack, who directs the University of California High-Performance Astrocomputing Center (UC-HIPACC), said the initial release of data from the Bolshoi simulation began in early September. "We've released a lot of the data so that other astrophysicists can start to use it," he said. "So far it's less than one percent of the actual output, because the total output is so huge, but there will be additional releases in the future."
The previous benchmark for large-scale cosmological simulations, known as the Millennium Run, has been the basis for some 400 papers since 2005. But the fundamental parameters used as the input for the Millennium Run are now known to be inaccurate. Produced by the Virgo Consortium of mostly European scientists, the Millennium simulation used cosmological parameters based on the first release of data from NASA's Wilkinson Microwave Anisotropy Probe (WMAP). WMAP provided a detailed map of subtle variations in the cosmic microwave background radiation, the primordial radiation left over from the Big Bang. But the initial WMAP1 parameters have been superseded by subsequent releases: WMAP5 (five-year results released in 2008) and WMAP7 (seven-year results released in 2010).
The Bolshoi simulation is based on WMAP5 parameters, which are consistent with the later WMAP7 results. "The WMAP1 cosmological parameters on which the Millennium simulation is based are now known to be wrong," Primack said. "Moreover, advances in supercomputer technology allow us to do a much better simulation with higher resolution by almost an order of magnitude. So I expect the Bolshoi simulation will have a big impact on the field."
The standard explanation for how the universe evolved after the Big Bang is known as the Lambda Cold Dark Matter model, and it is the theoretical basis for the Bolshoi simulation. According to this model, gravity acted initially on slight density fluctuations present shortly after the Big Bang to pull together the first clumps of dark matter. These grew into larger and larger clumps through the hierarchical merging of smaller progenitors. Although the nature of dark matter remains a mystery, it accounts for about 82 percent of the matter in the universe. As a result, the evolution of structure in the universe has been driven by the gravitational interactions of dark matter. The ordinary matter that forms stars and planets has fallen into the "gravitational wells" created by clumps of dark matter, giving rise to galaxies in the centers of dark matter halos.
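The "about 82 percent" figure follows directly from the density parameters of a WMAP5-era cosmology. The specific values below are assumed for this post (approximately those used for Bolshoi), not quoted in the article: subtract the ordinary (baryonic) matter share from the total matter share.

```python
# Assumed WMAP5-era density parameters (approximately those used for Bolshoi):
omega_m = 0.27    # all matter, as a fraction of the critical density of the universe
omega_b = 0.047   # ordinary (baryonic) matter alone

dark_fraction = (omega_m - omega_b) / omega_m
print(f"dark matter share of all matter: {dark_fraction:.1%}")   # ~82.6%
```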
A principal purpose of the Bolshoi simulation is to compute and model the evolution of dark matter halos. The characteristics of the halos and subhalos in the Bolshoi simulation are presented in a paper that has been accepted for publication in the Astrophysical Journal and is now available online. The authors are Klypin, NMSU graduate student Sebastian Trujillo-Gomez, and Primack.
A second paper, also accepted for publication in the Astrophysical Journal and available online, presents the abundance and properties of galaxies predicted by the Bolshoi simulation of dark matter. The authors are Klypin, Trujillo-Gomez, Primack, and UCSC postdoctoral researcher Aaron Romanowsky. A comparison of the Bolshoi predictions with galaxy observations from the Sloan Digital Sky Survey showed very good agreement, according to Primack.
The Bolshoi simulation focused on a representative section of the universe, computing the evolution of a cubic volume measuring about one billion light-years on a side and following the interactions of 8.6 billion particles of dark matter. It took 6 million CPU-hours to run the full computation on the Pleiades supercomputer, recently ranked as the seventh fastest supercomputer in the world.
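Those two figures also pin down the simulation's mass resolution. Here is a back-of-the-envelope estimate; the constants and the Hubble parameter h ≈ 0.70 below are standard values assumed for this post, not quoted in the article. The idea is simply to spread the mean matter density over the quoted cube and divide by the 8.6 billion particles.

```python
# Rough mass-per-particle estimate from the quoted box size and particle count.
# Assumed constants (not from the article): h ~ 0.70, Omega_m ~ 0.27.
LY_M  = 9.461e15    # metres per light-year
M_SUN = 1.989e30    # kilograms per solar mass
RHO_C = 9.47e-27    # critical density of the universe (kg/m^3) for h ~ 0.70

box_m       = 1.0e9 * LY_M              # one billion light-years, in metres
matter_kg   = 0.27 * RHO_C * box_m**3   # mean matter density times cube volume
n_particles = 8.6e9

print(f"~{matter_kg / n_particles / M_SUN:.1e} solar masses per particle")  # ~1.3e8
```

In other words, each simulation particle stands in for roughly a hundred million Suns' worth of dark matter; individual stars, and even small galaxies, live well below this resolution.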
A variant of the Bolshoi simulation, known as BigBolshoi or MultiDark, was run on the same supercomputer with the same number of particles, but this time in a volume 64 times larger. BigBolshoi was run to predict the properties and distribution of galaxy clusters and other very large structures in the universe, as well as to help with dark energy projects such as the Baryon Oscillation Spectroscopic Survey (BOSS).
Another variant, called MiniBolshoi, is currently being run on the Pleiades supercomputer. MiniBolshoi focuses on a smaller portion of the universe and provides even higher resolution than Bolshoi. The Bolshoi simulation and its two variants will be made publicly available to astrophysical researchers worldwide in phases via the MultiDark Database, hosted by the Astrophysical Institute Potsdam (AIP) in Germany and supported by grants from Spain and Germany.
Primack, Klypin, and their collaborators are continuing to analyze the results of the Bolshoi simulation and submit papers for publication. Among their findings are results showing that the simulation correctly predicts the number of galaxies as bright as the Milky Way that have satellite galaxies as bright as the Milky Way's major satellites, the Large and Small Magellanic Clouds.
"A lot more papers are on the way," Primack said.
This research was funded by grants from NASA and the National Science Foundation.