Friday, August 5, 2011

Engineers Solve Longstanding Problem in Photonic Chip Technology: Findings Help Pave Way for Next Generation of Computer Chips


Caltech engineers have developed a new way to isolate light on a photonic chip, allowing light to travel in only one direction. This finding can lead to the next generation of computer-chip technology: photonic chips that allow for faster computers and less data loss. (Credit: Caltech/Liang Feng)
Science Daily — Stretching for thousands of miles beneath oceans, optical fibers now connect every continent except for Antarctica. With less data loss and higher bandwidth, optical-fiber technology allows information to zip around the world, bringing pictures, video, and other data from every corner of the globe to your computer in a split second. But although optical fibers are increasingly replacing copper wires, carrying information via photons instead of electrons, today's computer technology still relies on electronic chips.

Now, researchers led by engineers at the California Institute of Technology (Caltech) are paving the way for the next generation of computer-chip technology: photonic chips. With integrated circuits that use light instead of electricity, photonic chips will allow for faster computers and less data loss when connected to the global fiber-optic network.
"We want to take everything on an electronic chip and reproduce it on a photonic chip," says Liang Feng, a postdoctoral scholar in electrical engineering and the lead author on a paper to be published in the August 5 issue of the journal Science. Feng is part of Caltech's nanofabrication group, led by Axel Scherer, Bernard A. Neches Professor of Electrical Engineering, Applied Physics, and Physics, and co-director of the Kavli Nanoscience Institute at Caltech. In that paper, the researchers describe a new technique to isolate light signals on a silicon chip, solving a longstanding problem in engineering photonic chips.
An isolated light signal can only travel in one direction. If light weren't isolated, signals sent and received between different components on a photonic circuit could interfere with one another, causing the chip to become unstable. In an electrical circuit, a device called a diode isolates electrical signals by allowing current to travel in one direction but not the other. The goal, then, is to create the photonic analog of a diode, a device called an optical isolator. "This is something scientists have been pursuing for 20 years," Feng says.
Normally, a light beam has exactly the same properties when it moves forward as when it's reflected backward. "If you can see me, then I can see you," he says. In order to isolate light, its properties need to somehow change when going in the opposite direction. An optical isolator can then block light that has these changed properties, which allows light signals to travel only in one direction between devices on a chip.
"We want to build something where you can see me, but I can't see you," Feng explains. "That means there's no signal from your side to me. The device on my side is isolated; it won't be affected by my surroundings, so the functionality of my device will be stable."
To isolate light, Feng and his colleagues designed a new type of optical waveguide, a 0.8-micron-wide silicon device that channels light. The waveguide allows light to go in one direction but changes the mode of the light when it travels in the opposite direction.
A light wave's mode corresponds to the pattern of the electromagnetic field lines that make up the wave. In the researchers' new waveguide, the light travels in a symmetric mode in one direction, but changes to an asymmetric mode in the other. Because different light modes can't interact with one another, the two beams of light thus pass through each other.
Previously, there were two main ways to achieve this kind of optical isolation. The first way -- developed almost a century ago -- is to use a magnetic field. The magnetic field changes the polarization of light -- the orientation of the light's electric-field lines -- when it travels in the opposite direction, so that the light going one way can't interfere with the light going the other way. "The problem is, you can't put a large magnetic field next to a computer," Feng says. "It's not healthy."
The second conventional method requires so-called nonlinear optical materials, which change light's frequency rather than its polarization. This technique was developed about 50 years ago, but is problematic because silicon, the material that's the basis for the integrated circuit, is a linear material. If computers were to use optical isolators made out of nonlinear materials, silicon would have to be replaced, which would require revamping all of computer technology. But with their new silicon waveguides, the researchers have become the first to isolate light with a linear material.
Although this work is just a proof-of-principle experiment, the researchers are already building an optical isolator that can be integrated onto a silicon chip. An optical isolator is essential for building the integrated, nanoscale photonic devices and components that will enable future integrated information systems on a chip. Current state-of-the-art photonic chips operate at 10 gigabits per second (Gbps) -- hundreds of times the data-transfer rates of today's personal computers -- with the next generation expected to soon hit 40 Gbps. But without built-in optical isolators, those chips are much simpler than their electronic counterparts and are not yet ready for the market. Optical isolators like those based on the researchers' designs will therefore be crucial for commercially viable photonic chips.
In addition to Feng and Scherer, the other authors on the Science paper, "Non-reciprocal light propagation in a silicon photonic circuit," are Jingqing Huang, a Caltech graduate student; Maurice Ayache of UC San Diego and Yeshaiahu Fainman, Cymer Professor in Advanced Optical Technologies at UC San Diego; and Ye-Long Xu, Ming-Hui Lu, and Yan-Feng Chen of the Nanjing National Laboratory of Microstructures in China. This research was done as part of the Center for Integrated Access Networks (CIAN), one of the National Science Foundation's Engineering Research Centers. Fainman is also the deputy director of CIAN. Funding was provided by the National Science Foundation and the Defense Advanced Research Projects Agency.

Why don’t diets work? Starved brain cells eat themselves



(Biomechanism.com) — A report in the August issue of the Cell Press journal Cell Metabolism might help to explain why it’s so frustratingly difficult to stick to a diet. When we don’t eat, hunger-inducing neurons in the brain start eating bits of themselves. That act of self-cannibalism turns up a hunger signal to prompt eating.
“A pathway that is really important for every cell to turn over components in a kind of housekeeping process is also required to regulate appetite,” said Rajat Singh of Albert Einstein College of Medicine.
The cellular process uncovered in neurons of the brain’s hypothalamus is known as autophagy (literally, self-eating). Singh says the new findings in mice suggest that treatments aimed at blocking autophagy may prove useful as hunger-fighting weapons in the war against obesity.
The new evidence shows that lipids within the so-called agouti-related peptide (AgRP) neurons are mobilized following autophagy, generating free fatty acids. Those fatty acids in turn boost levels of AgRP, itself a hunger signal.
When autophagy is blocked in AgRP neurons, AgRP levels fail to rise in response to starvation, the researchers show. Meanwhile, levels of another hormone, called melanocyte-stimulating hormone, remain elevated. That change in body chemistry led mice to become lighter and leaner, as they ate less after fasting and burned more energy.
Autophagy is known to have an important role in other parts of the body as a way of providing energy in times of starvation. However, unlike other organs, earlier studies had shown the brain to be relatively resistant to starvation-induced autophagy.
“The present study demonstrates the unique nature of hypothalamic neurons in their ability to upregulate autophagy in response to starvation that is consistent with the roles of these neurons in feeding and energy homeostasis,” the researchers wrote.
Singh said he suspects that fatty acids released into the circulation and taken up by the hypothalamus as fat stores break down between meals may induce autophagy in those AgRP neurons. Singh’s research earlier showed a similar response in the liver.
On the other hand, he says, chronically high levels of fatty acids in the bloodstream, as happens in those on a high-fat diet, might alter hypothalamic lipid metabolism, “setting up a vicious cycle of overfeeding and altered energy balance.” Treatments aimed at the pathway might “make you less hungry and burn more fat,” a good way to maintain energy balance in a world where calories are cheap and plentiful.
The findings might also yield new insight into metabolic changes that come with age given that autophagy declines as we get older. “We already have some preliminary evidence there might be changes with age,” Singh said. “We are excited about that.”

Making sperm from stem cells in a dish



Researchers have found a way to turn mouse embryonic stem cells into sperm. This finding, reported in the journal Cell in a special online release on August 4th, opens up new avenues for infertility research and treatment. A Kyoto University team has coaxed mouse embryonic stem cells into sperm precursors, called primordial germ cells (PGCs), and shown that these cells can give rise to healthy sperm. The researchers say that such in vitro reconstitution of germ cell development represents one of the most fundamental challenges in biology.

When transplanted into mice that were unable to produce sperm normally, the stem cell-derived PGCs produced normal-looking sperm, which were then used to successfully fertilize eggs. These fertilized eggs, when transplanted into a recipient mother, produced healthy offspring that grew into fertile male and female adult mice. The same procedure could produce fertile offspring from induced pluripotent stem cells, which are often derived from adult skin cells.

"Continued investigations aimed at in vitro reconstitution of germ cell development, including the induction of female PGCLCs and their descendants, will be crucial for a more comprehensive understanding of germ cell biology in general, as well as for the advancement of reproductive technology and medicine," the researchers concluded.

Stanford researcher explores whether language is the only way to represent numbers



The Mental Calculation World Cup is a brutal contest, and one that threatens to fry the neurons of the unprepared. Over the course of a competition, contestants might be asked to add a string of 10 different 10-digit numbers, multiply 18,467,941 by 73,465,135, find the square root of 530,179 and determine which day of the week corresponds to Aug. 12, 1721 – all without writing anything down.
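One of these tasks, at least, reduces to a mechanical rule. As an aside (our illustration, not part of the contest or the research), Zeller's congruence computes the day of the week for any Gregorian date; note that for a date as old as 1721, the answer depends on which calendar convention the contest assumes (Britain still used the Julian calendar in 1721):

```python
def day_of_week(year, month, day):
    """Zeller's congruence for the (proleptic) Gregorian calendar."""
    if month < 3:          # Zeller counts January and February as
        month += 12        # months 13 and 14 of the previous year
        year -= 1
    k = year % 100         # year within the century
    j = year // 100        # zero-based century
    h = (day + 13 * (month + 1) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
    return ["Saturday", "Sunday", "Monday", "Tuesday",
            "Wednesday", "Thursday", "Friday"][h]

print(day_of_week(1721, 8, 12))  # → Tuesday (proleptic Gregorian)
```

Mental calculators memorize compressed versions of rules like this one, which is why the date task is fast for them despite looking exotic.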
The speed with which the winners complete these tasks is remarkable. The World Cup record for finding the square roots of 10 six-digit numbers, for instance, is six minutes and 51 seconds. Even more remarkably, the holder of that record is 11 years old.
Priyanshi Somani, the tween-age reigning Mental Calculation World Champion in question, uses a method called "Mental Abacus." It's an increasingly popular teaching tool, particularly in India. And, according to a paper published online last week in the Journal of Experimental Psychology: General, it may represent one of the first known examples of non-language-based mental calculation. Michael Frank, an assistant professor of psychology at Stanford, is co-author of the paper, with David Barner, assistant professor of psychology at the University of California-San Diego.
Although animals and pre-language infants are able to keep track of small numbers and can approximately judge large quantities, they can't conceptualize exact larger numbers. Infants can't distinguish between three and six apple slices.
This inability begins to disappear as infants develop the language skills necessary for counting, which suggests that mental representations of large, exact quantities are often tied to language.
"But," Frank said, "all of that leaves open the question of whether language is really the only way to represent numbers."
Mental Abacus, or MA, suggests the answer is no. It advises practitioners to visualize a 400-year-old style of abacus known as a soroban. Students often flick their fingers when they calculate, miming the movement of abacus beads.
Researchers have suggested that MA makes use of visual, instead of verbal, working memory. But the theory leaves a major question unanswered.
"There are a limited number of things that we explicitly remember," said Frank. People are only able to hold three to four separate items in the visual working memory at any one time. In contrast, an MA calculation might involve the manipulation of fifteen beads. "Given these limitations, we were confused about how a whole abacus could be represented in working memory."
In their paper, Frank and Barner address this mystery. The researchers demonstrated that MA does, in fact, involve visual manipulations of an imagined abacus – but that the visual working memory stores information about each abacus column, rather than each bead.
The researchers examined elementary school students in India's Gujarat Province, where MA is taught in a three-year afterschool program. Children went through a series of timed addition games that adjusted their difficulty to the user's skill level.
Frank and Barner found that the children's impressive calculating abilities dropped off sharply when they were asked to add four-digit numbers.
Each new place value requires a new abacus column – the rightmost column is the ones place, the next is the tens place, and so on. The result suggests that MA users are unable to imagine more than three abacus columns at once.
On the other hand, increasing the number of imaginary beads necessary for a problem without increasing the number of columns had no effect. And when it came to counting how many beads were present on a flashcard, MA users were no faster than untrained adults.
The researchers concluded that the method doesn't increase the students' ability to hold a mental image of an abacus. Instead, it makes use of standard human visual memory.
"Clearly, the mental image doesn't carry all the details of the abacus itself," said Frank. "But we're zeroing in on what the image consists of."
The researchers also directly tested whether verbal or motor memory was in play during mental calculation. Participants were asked to calculate while drumming their fingers on the desk or repeating a book on tape.
In research subjects who had no experience with MA, verbal distractions significantly affected accuracy. Motor distraction had little effect.
MA users, on the other hand, showed only slight effects during both tasks, suggesting that verbal working memory plays at most a minor role.
"The process is similar to what electronic calculators do," Frank explained. "You start by reading out the problem in Arabic numerals or words, but then you convert it to a representation that's really good for calculations." In an electronic calculator, this representation is binary. In MA, it may be an imaginary soroban.
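The soroban analogy can be made concrete. In the sketch below (our illustration, not the study's model), each column encodes one decimal digit as a "heaven" 5-bead plus up to four "earth" 1-beads, so a number needs one column per place value — which is why a three-to-four-column limit in visual working memory would cap calculation at three-to-four-digit numbers:

```python
def to_soroban(n):
    """Encode a non-negative integer as soroban columns.
    Each column is (heaven, earth): heaven is the 5-bead (0 or 1),
    earth is the count of 1-beads (0-4), so digit = 5*heaven + earth."""
    return [(int(ch) // 5, int(ch) % 5) for ch in str(n)]

def from_soroban(columns):
    """Decode a list of (heaven, earth) columns back to an integer."""
    n = 0
    for heaven, earth in columns:
        n = n * 10 + 5 * heaven + earth
    return n

print(to_soroban(1234))  # [(0, 1), (0, 2), (0, 3), (0, 4)] -- four columns
```

A 4-digit number occupies four columns regardless of how many beads it uses, matching the finding that columns, not beads, are the unit of memory load.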
The researchers are now studying whether children who learn MA at a young age experience any benefits to their mathematical or cognitive abilities. They will finish a longitudinal study on the topic this spring – possibly granting Priyanshi's World Cup victory a long-term importance she has yet to appreciate.
Provided by Stanford University
"Stanford researcher explores whether language is the only way to represent numbers." August 4th, 2011. http://www.physorg.com/news/2011-08-stanford-explores-language.html
Posted by
Robert Karl Stonjek

Have we met before? Scientists show why the brain has the answer



Research led by Dr Clea Warburton and Dr Gareth Barker in the University of Bristol's School of Physiology and Pharmacology, and published in the Journal of Neuroscience, has investigated why we can recognise faces much better if we have extra clues as to where, or indeed when, we encountered them in the first place.
The study found that when we need to remember that a particular object, for example a face, occurred in a particular place, or at a particular time, multiple brain regions have to work together - not independently.
It has been known for some time that three brain regions appear to have specific roles in memory processing. The perirhinal cortex seems to be critical for our ability to recognise whether an individual object is novel or familiar, the hippocampus is important for recognising places and for navigation, while the medial prefrontal cortex is associated with higher brain functions.
These most recent studies, however, are the first to look at situations where these brain regions interact all together, rather than considering each one individually.
Dr Warburton said: "We are very excited to discover this important brain circuit. We're now studying how memory information is processed within it, in the hope we can then understand how our own 'internal library' system works."
The researchers investigated the neural basis of our ability to recognise different types of stimuli under different conditions. Of specific interest were two types of recognition memory: 'object-in-place recognition memory' (remembering where we put our keys), and 'temporal order recognition memory' (when we last had them).
Neither 'object-in-place' nor 'temporal order recognition' memories could be formed if communication between the hippocampus and either the perirhinal cortex or the medial prefrontal cortex was broken. In other words, disconnecting the regions prevented the ability to remember both where objects had been and in which order.
Finding that these regions must all act together has important implications for understanding memory and helping treat people with memory disorders such as Alzheimer's disease.
Provided by University of Bristol
"Have we met before? Scientists show why the brain has the answer." August 4th, 2011. http://medicalxpress.com/news/2011-08-met-scientists-brain.html
Posted by
Robert Karl Stonjek

Neuroscientists identify how the brain remembers what happens and when



New York University neuroscientists have identified the parts of the brain we use to remember the timing of events within an episode. The study, which appears in the latest issue of the journal Science, enhances our understanding of how memories are processed and provides a potential roadmap for addressing memory-related afflictions.
Previous research has shown the brain's medial temporal lobe (MTL) has a significant role in declarative memory—that is, memory of facts and events or episodes. Past studies have shown that damage to the MTL causes impairment in memory for the timing of events within an episode; declarative memory is also impaired in patients suffering from Alzheimer's disease. However, little is known about how individual structures within the MTL remember information about "what happened when" within a particular episode, such as the order of the toasts at a wedding reception or what preceded a game-winning hit in a baseball game.
The NYU researchers—Yuji Naya, an associate research scientist, and Professor Wendy Suzuki, both of NYU's Center for Neural Science—focused their study on the MTL.
To conduct the study, the researchers ran animal subjects through a temporal-order memory task in which a sequence of two visual objects was presented and the subjects were required to retrieve that same sequence after a delay. In order to perform the task correctly, the subjects needed to remember both the individual visual items ("what") and their temporal order ("when"). During the experiment, the researchers monitored the activity of individual brain cells in the subjects' MTL.
Their results showed that two main areas of the MTL are involved in integrating "what" and "when": the hippocampus and the perirhinal cortex. The hippocampus, which is known to have an important role in a variety of memory tasks, provides an incremental timing signal between key events, giving information about the passage of time from the last event as well as the estimated time toward the next event. The perirhinal cortex appeared to integrate information about what and when by signaling whether a particular item was shown first or second in the series.
"One of the Holy Grails of neuroscience is understanding exactly how our brains encode and remember episodic memories, including those of weddings, graduations, and other meaningful events in our lives," explained Suzuki. "These are rich memories that contain a lot of items with specific temporal contexts. We already knew the medial temporal lobe was critical for these complex memories, but what our new findings provide is insight into the specific patterns of brain activity that enable us to remember both the key events that make up our lives and the specific order in which they happened."
Provided by New York University
"Neuroscientists identify how the brain remembers what happens and when." August 4th, 2011. http://medicalxpress.com/news/2011-08-neuroscientists-brain.html
Posted by
Robert Karl Stonjek

National survey reveals widespread mistaken beliefs about memory




The survey evaluated the views of a representative sample of the US population. A majority of respondents agreed with six memory myths. Credit: Daniel Simons
A new survey reveals that many people in the U.S. – in some cases a substantial majority – think that memory is more powerful, objective and reliable than it actually is. Their ideas are at odds with decades of scientific research.
The results of the survey and a comparison to expert opinion appear in a paper in the journal PLoS ONE.
"This is the first large-scale, nationally representative survey of the U.S. population to measure intuitive beliefs about how memory works," said University of Illinois psychology professor Daniel Simons, who led the study with Union College psychology professor Christopher Chabris. Simons and Chabris conducted the survey during research for their book, "The Invisible Gorilla," which explores commonly held (and often incorrect) beliefs about memory and perception.
"Our book highlights ways in which our intuitions about the mind are mistaken," Simons said. "And one of the most compelling examples comes from beliefs about memory: People tend to place greater faith in the accuracy, completeness and vividness of their memories than they probably should."
The telephone survey, carried out by the opinion research company SurveyUSA, asked 1,500 respondents whether they agreed or disagreed with a series of statements about memory.
Nearly two-thirds of respondents likened human memory to a video camera that records information precisely for later review. Almost half believed that once experiences are encoded in memory, those memories do not change. Nearly 40 percent felt that the testimony of a single confident eyewitness should be enough evidence to convict someone of a crime.
These and other beliefs about memory diverge from the views of cognitive psychologists with many years of experience studying how memory works, the researchers report. While studies have shown, for example, that confident eyewitnesses are accurate more often than eyewitnesses who lack confidence, Chabris said, "even confident witnesses are wrong about 30 percent of the time."
Many studies have demonstrated the ways in which memory can be unreliable and even manipulated, Simons said.
"We've known since the 1930s that memories can become distorted in systematic ways," he said. "We've known since the 1980s that even memory for vivid, very meaningful personal events can change over time. For example, (Cornell University psychology professor) Ulric Neisser showed that personal memories for the Challenger space shuttle explosion changed over time, and (University of California professor) Elizabeth Loftus and her colleagues have managed to introduce entirely false memories that people believe and trust as if they had really happened."
"The fallibility of memory is well established in the scientific literature, but mistaken intuitions about memory persist," Chabris said. "The extent of these misbeliefs helps explain why so many people assume that politicians who may simply be remembering things wrong must be deliberately lying."
The new findings also have important implications for proceedings in legal cases, the researchers said.
"Our memories can change even if we don't realize they have changed," Simons said. "That means that if a defendant can't remember something, a jury might assume the person is lying. And misremembering one detail can impugn their credibility for other testimony, when it might just reflect the normal fallibility of memory."
Provided by University of Illinois at Urbana-Champaign
"National survey reveals widespread mistaken beliefs about memory." August 4th, 2011. http://medicalxpress.com/news/2011-08-national-survey-reveals-widespread-mistaken.html
Posted by
Robert Karl Stonjek

The brain grows while the body starves



 
When developing babies are growth restricted in the womb, they are typically born with heads that are large relative to their bodies. The growing brain is protected at the expense of other, less critical organs. Now, researchers reporting in the August 5th issue of Cell, a Cell Press publication, unearth new molecular evidence that explains just how the brain is spared.
In studies of rapidly growing fruit fly larvae, they've traced this developmental phenomenon to the activity of a gene called Anaplastic Lymphoma Kinase (ALK).
"ALK breaks the link between dietary nutrients and neural growth," said Alex Gould of the Medical Research Council's National Institute for Medical Research in London.
The first step for Gould's team was to find out whether they could reproduce in the lab the same kind of brain sparing known to occur in humans. They looked at fruit flies in their larval stages because that's when they do most of their growing.
"If you restrict dietary nutrients at the late larval stage, body tissues shut down growth completely yet the neural stem cells in the brain continue growing at close to 100 percent," Gould said. The question is how.
The researchers got their first surprise when they disabled the nutrient sensing pathways that respond to amino acids and insulin, both of which were known to be essential for the growth of many different tissues. Without those pathways in working order, most parts of the fly body did indeed stop growing, but brain neural stem cells "just kept on going."
Further investigation revealed that activation of ALK in the brain allows neural stem cells to grow without the usual need for insulin and amino acid signals. In other words, ALK converts cells from their usual nutrient-sensitive state to a nutrient-insensitive one, Gould explained.
As the name suggests, ALK was first identified for its role in lymphomas and has since been found in many other forms of human cancer. The new findings uncover a previously unknown molecular link between stem cell growth and cancer.
"It's interesting. We think of cancer cells as being able to outgrow normal healthy cells," Gould said. "So it appears that ALK can give cells a growth advantage in contexts as diverse as human cancers and developing fruit flies."
The fruit flies now offer an experimental model for intra-uterine growth restriction (IUGR) in humans, which may lead to a greater understanding of the genes and pathways involved. "I don't want to over-speculate," Gould said, "but, in the future, this genetic model may also shed light on the related issue of why IUGR predisposes individuals to metabolic disease later on in adult life."
Provided by Cell Press
"The brain grows while the body starves." August 4th, 2011. http://medicalxpress.com/news/2011-08-brain-body-starves.html
Posted by
Robert Karl Stonjek

Big brains evolved due to capacity for exercise



The relatively large size of the mammalian brain evolved due to a capacity for endurance exercise, researchers conclude in a recent study.
In the study published in the journal PLoS ONE this month, anthropologists David A. Raichlen of the University of Arizona and Adam D. Gordon of the University at Albany conclude that the brain size in mammals may have evolved in conjunction with increases in exercise capacity, rather than solely in response to natural selection for higher cognitive abilities.
Mammals have larger brains than non-mammalian animals of the same body size; primates (apes, monkeys, humans, lemurs, and lorises) have larger brains than non-primate mammals of the same body size; and humans have larger brains than non-human primates of the same size. Anthropologists have long attempted to discern the reasons humans and other primates have relatively large brains compared to other animal species. The theories offered include the need for greater cognitive power to process visual information, and an increased capacity to manage complex social interactions in large groups.
Gordon said, "Brains are very energetically expensive to maintain, so most previous research has asked why certain species need big, expensive brains. We're asking a slightly different question: how do species grow and maintain expensive big brains in the first place?"
Earlier research in experimental settings had shown that endurance exercise boosts brain growth in some mammals. The researchers examined the correlation between brain size and a metric known as maximum metabolic rate (MMR), a measure of an animal's upper limit for aerobic exercise. After collecting brain sizes and MMRs for a range of mammals and controlling for body mass, they correlated residual brain size with residual MMR, and found a significant correlation between maximum metabolic rate and brain size across a wide range of mammals.
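The residual approach described here can be sketched in a few lines. The sketch below is an illustration only: the species data are made up, and ordinary least squares on log-transformed values stands in for the study's fuller statistical treatment. The idea is to strip the body-mass trend out of both brain size and MMR, then correlate what remains:

```python
import math

def residuals(x, y):
    """Residuals of an ordinary least-squares fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    intercept = my - slope * mx
    return [b - (intercept + slope * a) for a, b in zip(x, y)]

def pearson(u, v):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    return cov / math.sqrt(sum((a - mu) ** 2 for a in u)
                           * sum((b - mv) ** 2 for b in v))

# Hypothetical (made-up) per-species values: log body mass, log brain mass, log MMR
log_mass  = [0.0, 1.0, 2.0, 3.0, 4.0]
log_brain = [0.1, 0.9, 2.2, 2.8, 4.1]
log_mmr   = [0.2, 0.8, 2.3, 2.9, 4.0]

# Remove the body-mass trend from each variable, then correlate the leftovers
r = pearson(residuals(log_mass, log_brain), residuals(log_mass, log_mmr))
```

A positive residual correlation, as the study reports for real data, means that species with more brain than their body size predicts also tend to have more aerobic capacity than their body size predicts.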
"What we discovered," Gordon said, "is that across distantly related mammal species, those with relatively high capacities for endurance exercise have relatively large brains, while those with relatively low capacities for endurance exercise have relatively small brains. This suggests that the phenomenon observed in experimental lab settings may work on an evolutionary time scale as well."
The next step, the researchers say, is to test the brain-to-exercise relationship in primates, including humans, and determine the underlying mechanism for the brain's growth.
More information: Relationship between Exercise Capacity and Brain Size in Mammals. PLoS ONE 6(6): e20601. doi:10.1371/journal.pone.0020601
Provided by University at Albany
"Big brains evolved due to capacity for exercise." August 4th, 2011. http://www.physorg.com/news/2011-08-big-brains-evolved-due-capacity.html
Posted by
Robert Karl Stonjek

Research team finds species share perceptual capabilities that affect how communication evolves



A research team that included Hamilton E. Farris, PhD, Research Assistant Professor of Neuroscience and Otorhinolaryngology at LSU Health Sciences Center New Orleans, reveals that two entirely different species show similar perception of auditory cues that drive basic biological functions; that these perceptions may be universally shared among animals; and that such perception may also limit the evolution of communication signals. The work is published in the August 5, 2011 issue of Science.
Using the labs at the Smithsonian Tropical Research Institute in Panama, the team tested whether psychophysical laws explain how female túngara frogs and frog-eating bats compare male frog calls and whether the rules for perception constrain how communication signals evolve.
Animals, including humans, continuously make decisions by comparing external stimuli from the environment. These decisions are based not on the actual physical magnitude of the stimuli, but on their perceived magnitude. A perceptual rule called Weber's Law proposes that stimuli are compared based on ratios, not absolute differences. For example, distinguishing a 1-lb. object from a 2-lb. object is easier than distinguishing a 50-lb. object from a 51-lb. object. The comparison depends not on the absolute difference (1 lb. in each case), but on the relative difference (100% vs. 2%).
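The weight example can be made concrete in a couple of lines. This is a generic illustration of Weber's Law, not code from the study:

```python
def weber_discriminability(a, b):
    """Relative difference between two stimulus magnitudes (Weber fraction)."""
    return abs(a - b) / min(a, b)

# 1 lb vs. 2 lb: relative difference 100% -> easy to tell apart
print(weber_discriminability(1, 2))    # 1.0
# 50 lb vs. 51 lb: relative difference 2% -> hard to tell apart
print(weber_discriminability(50, 51))  # 0.02
```

The same 1-lb. absolute difference yields Weber fractions that differ by a factor of fifty, which is why the first pair is trivially discriminable and the second is not.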
The researchers tested whether Weber's law or alternative hypotheses explain túngara frog mate choice. Male túngara frogs produce a vowel-like "whine," followed by 0-7 consonant-like "chucks." They placed wild-caught females in a sound chamber and alternately broadcast two call types with varying numbers of chucks from two speakers on opposite sides of the chamber. Choice was quantified as walking to within 10 cm of either speaker.
"By giving females a choice between calls with different numbers of chucks, we found that the female frogs prefer calls with the most chucks, but based on the ratio of the number of chucks.," notes LSUHSC's Dr. Farris. "This means that as males elaborate their signals by adding more chucks, their relative attractiveness decreases due to the perceptual constraint on the part of females."
To more fully understand how females' perception influences the evolution of the males' calls, the research team then tested fringe-lipped bats, a natural predator of túngara frogs that selects its prey based on the calls of male frogs. Taking advantage of this rare case in which two very different species, an amphibian and a mammal, have evolved the same behavioral response to the same communication signal, the team asked whether hunting bats also choose their prey based on chuck-number ratio. In a behavioral test similar to that used with female frogs, the team showed that the bats, too, compare calls using chuck-number ratio.
"It is astounding that two disparate animals use the same perceptual scale, suggesting a generality in how animals compare stimuli," says Dr. Farris.
As males increase their chucks, so do their neighbors. Because neighbors maintain a fixed difference of one chuck, both the risks and benefits of adding chucks diminish as calls become more elaborate: adding one chuck to many chucks changes the perceived ratio less than adding one chuck to few. Adding multiple chucks to outcompete neighbors will not succeed either, because males maintain that fixed difference.
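The diminishing payoff of each added chuck follows directly from ratio-based comparison; a minimal sketch (illustrative only, not from the study):

```python
def chuck_ratio(n):
    """Weber ratio between a call with n+1 chucks and a neighbor's n chucks."""
    return (n + 1) / n

# The perceived contrast from one extra chuck shrinks as calls get longer
for n in [1, 2, 6]:
    print(f"{n} vs {n + 1} chucks: ratio {chuck_ratio(n):.2f}")
```

With one chuck, a rival's second chuck doubles the signal (ratio 2.0); with six chucks, a seventh barely registers (ratio ~1.17), so elaboration beyond a few chucks buys little attractiveness and little added predation risk.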
"Natural selection and bat predation are not limiting male call evolution, This supports our conclusion that it is the females' cognition that is limiting the evolution of chuck number," says Dr Farris. "The results are significant because we show that certain types of perception may be universal. Furthermore, with respect to the evolution of communication signals, we propose that by limiting signal elaboration, ratio-based coding could favor the evolution of signal innovation. That is, Weber's law would favor the evolution of a signal along a completely different perceptual axis."
Provided by Louisiana State University Health Sciences Center
"Research team finds species share perceptual capabilities that affect how communication evolves." August 4th, 2011.http://www.physorg.com/news/2011-08-team-species-perceptual-capabilities-affect.html
Posted by
Robert Karl Stonjek