Wednesday, October 19, 2011

How skin picks out ‘good’ bacteria



CENTENARY INSTITUTE   



There are more bacteria living on our skin and in our gut than cells in our body. We need them. But until now, no one knew how the immune system could tell that these bacteria are harmless.

Centenary Institute researchers in Sydney have discovered a set of peacekeepers—immune cells in the outer layers of our skin that stop us from attacking friendly bacteria.

The work will open the way to new therapeutic options for immune-mediated diseases such as inflammatory bowel disease, of which Australia has some of the world’s highest rates.

In a paper published today in the Proceedings of the National Academy of Sciences (PNAS), Professor Barbara Fazekas de St Groth and her team have shown that the immune cells in the outer layer of the skin constantly act as peacekeepers to stop the immune system from reacting the way it usually would. Known as Langerhans cells, they resisted every attempt by the researchers to get them to generate an immune response. 

The researchers worked with a group of mice in which only the Langerhans cells could stimulate the immune system. They then activated the Langerhans cells and measured the response.

“No matter what we threw at them to get them to activate a long-term immune response, the Langerhans cells always induced immune tolerance,” Prof Fazekas says.

This result seems to go against the prevailing wisdom in immunology about the workings of dendritic cells, the class of immune cell to which Langerhans cells belong.

Dendritic cells engulf bacteria, viruses or other invaders and put a marker from that invader, known as an antigen, on a protein that can bind to other immune cells.

The antigen reprograms passing T cells, the workhorses of the immune system, which then set off a cascade of responses that eventually lead to the destruction of anything displaying that antigen. 

However, the Centenary team (which is affiliated with the University of Sydney and RPA Hospital) found Langerhans cells are very different from other dendritic cells: after turning on the helper T cells, they tell them to self-destruct instead.

“This is the opposite of what you’d usually expect.  In previous studies of immune cells, if there was a flurry of activity, we assumed it was the start of a long-term immune response,” Prof Fazekas says.

However, the immune system is a layered defence: the next layer of skin has different kinds of dendritic cells, which program ongoing responses against bacteria. So if bacteria penetrate deep enough to meet these cells, the immune response will kill them.

In inflammatory bowel disease, which afflicts thousands of Australians, the immune system is activated against the gut bacteria, which are usually left alone. 

This discovery opens up possible ways to figure out why this disorder occurs and to find treatments to a range of diseases of the immune system.

“There is so much we don’t know about the immune system, but sometimes just mimicking what the system does, like we do with vaccines, can work very well,” Prof Fazekas says,

“If we do manage to mimic what Langerhans cells do, then we could develop treatments that would precisely tolerise against specific antigens – just like the skin's immune system does now.”

Centenary Institute executive director Professor Mathew Vadas says this latest paper comes just weeks after Centenary researcher Patrick Bertolino made the front cover of PNAS for his paper on immune response in the liver.

“The Centenary Institute is interested in understanding how the immune system works—these discoveries and others already in the pipeline here are a major step towards that goal,” Prof Vadas says.

Eat less protein: snack more



THE UNIVERSITY OF SYDNEY   



Including enough protein in our diets, rather than simply cutting calories, is the key to curbing appetites and preventing excessive consumption of fats and carbohydrates, a new study from the University of Sydney has found.

A multidisciplinary team of researchers has shown that people on a 10 percent protein diet will consume more snacks between meals and eat significantly more calories overall compared to those on a 15 percent protein diet.

The results, published in the online journal PLoS ONE, represent the first scientifically supported evidence that dietary protein plays an important role in appetite and total food consumption in humans, and are an important step in addressing the global obesity epidemic.

"Humans have a particularly strong appetite for protein, and when the proportion of protein in the diet is low, this appetite can drive excess energy intake," said lead author Dr. Alison Gosby, who conducted the study with Professor Steve Simpson from the School of Biological Sciences.

"Our findings have considerable implications for body weight management in the current nutritional environment, where foods rich in fat and carbohydrates are cheap, palatable, and available to an extent unprecedented in our history."

Protein is the driving force for appetite in many animals, according to Professor Simpson, a world leader in nutrition research. The 'protein-leverage' hypothesis, first put forward by Simpson and co-author David Raubenheimer, holds that animals have a fixed protein target, which they will defend at the expense of other nutrients.

"Our previous work on slime moulds, insects, fish, birds, rodents, mink, cats and monkeys has shown that animals have separate appetites for protein, fat and carbohydrate. Interestingly, if protein in the diet is diluted, even slightly, by extra fat and carbohydrate, the appetite for protein dominates and they will keep eating in an attempt to attain their target level of protein," he says.
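The protein-leverage hypothesis can be expressed as a simple back-of-the-envelope model: if an animal keeps eating until a fixed protein target is met, predicted energy intake is inversely proportional to the protein fraction of the diet. A minimal sketch (the target value and the pure-leverage assumption are illustrative, not figures from the study):

```python
def predicted_intake(protein_target_kj, protein_fraction):
    """Energy intake needed to hit a fixed protein target,
    assuming eating continues until the target is met (pure leverage)."""
    return protein_target_kj / protein_fraction

# Illustrative protein target of 1,500 kJ/day obtained from protein
target = 1500
intake_15 = predicted_intake(target, 0.15)  # 10,000 kJ on a 15% protein diet
intake_10 = predicted_intake(target, 0.10)  # 15,000 kJ on a 10% protein diet

# Pure leverage predicts 50% more energy on the 10% diet;
# the study observed a 12% increase, i.e. partial leverage.
excess = intake_10 / intake_15 - 1  # 0.5
```

The gap between the 50 percent predicted under complete leverage and the 12 percent observed is why the effect is described as protein *influencing*, rather than fully dictating, total intake.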

Although it has previously been suggested that protein content plays an important role in determining overall energy intake in humans, and is therefore linked to obesity, until now experimental verification has been lacking.

In their new study, Dr Gosby and Professor Simpson wanted to test the 'protein-leverage' effect in humans. The researchers created three menus that represented low (10 percent), intermediate (15 percent) and high (25 percent) protein, based on data from the World Health Organization recommending people eat 15 percent protein diets. Except for protein, the three diets were identical in all other factors such as appearance, palatability, variety and availability.

The researchers then took a group of 22 lean people and fed each subject each of the three menus during three separate four-day periods, monitoring energy intake across each period and hunger ratings on day four.

The researchers found subjects who ate a 10 percent protein diet consumed 12 percent more energy over four days than those eating a 15 percent protein diet. Moreover, 70 percent of the increased energy intake on the lower protein diet was attributed to snacking.

When the protein content was further increased to 25 percent, however, the researchers observed no change in behavior relative to the 15 percent protein diet. On the fourth day of the trial, hunger scores rose more sharply in the one to two hours after the 10 percent protein breakfast than after the 25 percent protein breakfast.

Dr Gosby commented: "This result confirms the 'protein-leverage' effect in humans and importantly, shows counting calories is not enough to manage appetite and body weight. In the western world, where food is abundant, if you reduce your calorie intake but fail to reach your protein target you will find it hard to resist hunger pangs."

Professor Simpson says today's western-world diets - where protein is increasingly diluted by fats and carbohydrates - are likely to be causing us to overeat and could be fueling the obesity epidemic.

"Our results indicate low protein diets will cause humans to overeat. Tragically in the modern westernised environment there are many factors encouraging us to eat foods that are high in sugars and fat, including reduced cost and increased availability of these foods. Underpinning all this is our ancestral environment in which fat and simple sugars were highly prized, leaving us with a predilection for these foods."

TOP FIVE QUOTES FOR SUCCESS




Sometimes it takes inspiration to motivate you to make the changes that you want to see in your life. These top 5 quotes make choices, dreams and achievements seem clear and attainable. Take a tip from these intelligent people and jump-start the life you want to live today!
Board of Wisdom highlights…
“Too many people go through life waiting for things to happen instead of making them happen!”
“While most are dreaming of success, winners wake up and work hard to achieve it.”
- Unknown
“There are no shortcuts to any place worth going.”
- Helen Keller
“The elevator to success is out of order. You’ll have to use the stairs… one step at a time.”
- Unknown
“I’ve missed more than 9000 shots in my career. I’ve lost almost 300 games. Twenty-six times, I’ve been trusted to take the game-winning shot and missed. I’ve failed over and over and over again in my life. And that is why I succeed.”
- Michael Jordan

HOW TO FIND YOUR PERSONAL FRANCHISE




Finding the unique small business that can grow into a franchise can be difficult. Read this article to see how to do it.

Find Businesses With Franchising Potential

To discover viable candidates, scour business publications, tap business groups, and get to know online resources like MarketResearch.com and Springwise.com.

My company looks for small businesses that have the potential to become franchises. Where would I find an information source I could use to discover unique small businesses?
—B.D., Austin, Tex.
Like a talent scout or professional sports scout, you’re looking for small-time players who have the potential to hit it big. This is something of a needle-in-a-haystack proposition. There are millions of solid, profitable small businesses, but it takes some special attributes to turn them into franchises—and it isn’t always clear which ones will make the cut.
“It is a judgment call, based on the type of product or service a small company has and its ability to duplicate itself in other locations,” says Daniel Burrus, chief executive officer of Burrus Research Associates in Hartland, Wis. Because many small, privately owned companies are leery of disclosing their financials, it may be tough to figure out which ones could sustain rapid expansion. And even the owner may not be sure whether his or her concept is a candidate for franchising, Burrus says.
It’s likely you have developed a formula that small businesses must meet in order to be potential franchisors. For instance, the product or service theoretically must be marketable around the country and even the world. The operations must be organized enough to be broken down into replicable processes that can be taught to new employers and employees. And, most important, profits must be sufficient to scale the concept up onto a larger playing field, with multiple locations supplied by a central headquarters.

Archaeologists Find Blade 'Production Lines' Existed as Much as 400,000 Years Ago



Large numbers of long, slender cutting tools were discovered at Qesem Cave, located outside of Tel Aviv, Israel. (Credit: Image courtesy of American Friends of Tel Aviv University)

Science Daily — Archaeology has long associated advanced blade production with the Upper Palaeolithic period, about 30,000-40,000 years ago, linked with the emergence of Homo sapiens and cultural features such as cave art. Now researchers at Tel Aviv University have uncovered evidence showing that "modern" blade production was also an element of the Amudian industry of the late Lower Paleolithic period, 200,000-400,000 years ago, part of the Acheulo-Yabrudian cultural complex, a geographically limited group of hominins who lived in what is now Israel, Lebanon, Syria and Jordan.


The blades, which were described recently in the Journal of Human Evolution, are the product of a well planned "production line," says Dr. Barkai. Every element of the blades, from the choice of raw material to the production method itself, points to a sophisticated tool production system to rival the blade technology used hundreds of thousands of years later.


Prof. Avi Gopher, Dr. Ran Barkai and Dr. Ron Shimelmitz of TAU's Department of Archaeology and Ancient Near Eastern Civilizations say that large numbers of long, slender cutting tools were discovered at Qesem Cave, located outside of Tel Aviv, Israel. This discovery challenges the notion that blade production is exclusively linked with recent modern humans.
An innovative product
Though blades have been found in earlier archaeological sites in Africa, Dr. Barkai and Prof. Gopher say that the blades found in Qesem Cave distinguish themselves through the sophistication of the technology used for manufacturing and mass production.
Evidence suggests that the process began with the careful selection of raw materials. The hominins collected raw material from the surface or quarried it from underground, seeking specific pieces of flint that would best fit their blade-making technology, explains Dr. Barkai. With the right blocks of material, they were able to use a systematic and efficient production method, involving powerful and controlled blows that took into account the mechanics of stone fracture. Most of the blades were made with one sharp cutting edge and one naturally dull edge so they could be easily gripped in a human hand.
This is perhaps the first time that such technology was standardized, notes Prof. Gopher, who points out that the blades were produced with relatively small amounts of waste materials. This systematic industry enabled the inhabitants of the cave to produce tools, normally considered costly in raw material and time, with relative ease.
Thousands of these blades have been discovered at the site. "Because they could be produced so efficiently, they were almost used as expendable items," he says.
Prof. Cristina Lemorini from Sapienza University of Rome conducted a closer analysis of markings on the blades under a microscope and conducted a series of experiments determining that the tools were primarily used for butchering.
Modern tools a part of modern behaviors
According to the researchers, this innovative industry and technology is one of a score of new behaviors exhibited by the inhabitants of Qesem Cave. "There is clear evidence of daily and habitual use of fire, which is news to archaeologists," says Dr. Barkai. Previously, it was unknown if the Amudian culture made use of fire, and to what extent. There is also evidence of a division of space within the cave, he notes. The cave inhabitants used each space in a regular manner, conducting specific tasks in predetermined places. Hunted prey, for instance, was taken to an appointed area to be butchered, barbequed and later shared within the group, while the animal hide was processed elsewhere.

Children, Not Chimps, Prefer Collaboration: Humans Like to Work Together in Solving Tasks -- Chimps Don't


Cooperation is child's play: children presented with a task that they could perform on their own or with a partner show a preference to cooperate. (Credit: © MPI for Evolutionary Anthropology)

Science Daily — Recent studies have shown that chimpanzees possess many of the cognitive prerequisites necessary for humanlike collaboration. Cognitive abilities, however, might not be all that differs between chimpanzees and humans when it comes to cooperation. Researchers from the MPI for Evolutionary Anthropology in Leipzig and the MPI for Psycholinguistics in Nijmegen have now discovered that when all else is equal, human children prefer to work together in solving a problem rather than solve it on their own. Chimpanzees, on the other hand, show no such preference, according to a study of 3-year-old German kindergarteners and semi-free-ranging chimpanzees in which the children and chimps could choose between a collaborative and a non-collaborative approach to solving a problem.

The research team presented 3-year-old German children and chimpanzees living in a Congo Republic sanctuary with a task that they could perform on their own or with a partner. Specifically, they could either pull two ends of a rope themselves in order to get a food reward or they could pull one end while a companion pulled the other. The task was carefully controlled to ensure there were no obvious incentives for the children or chimpanzees to choose one strategy over the other. "In such a highly controlled situation, children showed a preference to cooperate; chimpanzees did not," Haun points out.
Human societies are built on collaboration. From a young age, children will recognize the need for help, actively recruit collaborators, make agreements on how to proceed, and recognize the roles of their peers to ensure success. Chimpanzees are cooperative too, working together in border patrols and group hunting, for instance. Still, humans might have greater motivation to cooperate than chimpanzees do. "A preference for doing things together instead of alone differentiates humans from one of our most closely related primate cousins," says Daniel Haun of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and the Max Planck Institute for Psycholinguistics in Nijmegen, The Netherlands. "We expected to find differences between human and chimpanzee cooperation, because humans cooperate in a larger variety of contexts and in more complex forms than chimpanzees."
The children cooperated more than 78 percent of the time compared to about 58 percent for the chimpanzees. These statistics show that the children actively chose to work together, while chimps appeared to choose between their two options randomly. "Our findings suggest that behavioral differences between humans and other species might be rooted in apparently small motivational differences," says Haun.
Future work should compare cooperative motivation across primate species in an effort to reconstruct the evolutionary history of the trait, the researchers say. "Especially interesting would be other cooperative-breeding primates, or our other close relatives, the bonobos, who have both previously been argued to closely match some of the human pro-social motivations," says Yvonne Rekers of the Max Planck Institute for Evolutionary Anthropology and first author of the study.

Severe Drought, Other Changes Can Cause Permanent Ecosystem Disruption


This giant waterbug, once the top insect predator in a stream in Arizona's French Joe Canyon, has now disappeared in some places due to severe drought. (Credit: Photo by Michael Bogan)

Science Daily  — An eight-year study has concluded that increasingly frequent and severe drought, dropping water tables and dried-up springs have pushed some aquatic desert ecosystems into "catastrophic regime change," from which many species will not recover.

"Populations that have persisted for hundreds or thousands of years are now dying out," said David Lytle, an associate professor of zoology at Oregon State University. "Springs that used to be permanent are drying up. Streams that used to be perennial are now intermittent. And species that used to rise and fall in their populations are now disappearing."
The findings, just published in the journal Freshwater Biology, raise concerns that climate change, over-pumping of aquifers for urban water use, and land management may permanently affect which species can survive.
The research, done by Lytle and doctoral candidate Michael Bogan, examined the effect of complete water loss and its subsequent impact on aquatic insect communities in a formerly perennial desert stream in Arizona's French Joe Canyon, before and after severe droughts in the early 2000s.
The stream completely dried up for a period in 2005, and again in 2008 and 2009, leading to what researchers called a rapid "regime shift" in which some species went locally extinct and others took their place. The ecosystem dynamics are now different and show no sign of returning to their former state. Six species were eliminated when the stream dried up, and 40 others became more abundant. Large-bodied "top predators" like the giant waterbug disappeared and were replaced by smaller "mesopredators" such as aquatic beetles.
"Before 2004, this area was like a beautiful oasis, with lots of vegetation, birds and rare species," Lytle said. "The spring has lost a number of key insect species, has a lot less water, and now has very different characteristics."
The phenomenon, the researchers say, does not so much indicate the disappearance of life -- there is about as much abundance as before. It's just not the same.
"Our study focused on a single stream in isolation, but this process of drying and local extinction is happening across the desert Southwest," Bogan said. "Eventually this could lead to the loss of species from the entire region, or the complete extinction of species that rely on these desert oases."
Small streams such as this are of particular interest because they can be more easily observed and studied than larger rivers and streams, and may represent a microcosm of similar effects that are taking place across much of the American West, the researchers said. The speed and suddenness of some changes give species inadequate time to adapt.
"It's like comparing old-growth forests to second-growth forests," Lytle said. "There are still trees, but it's not the same ecosystem it used to be. These desert streams can be a window to help us see forces that are at work all around us, whether it's due to climate change, land management or other factors."
The researchers noted in their report that the last 30 years have been marked by a significant increase in drought severity in the Southwest. The drought that helped dry up French Joe Canyon in 2005 resulted in the lowest flow in Arizona streams in 60 years, and in many cases the lowest on record. At French Joe Canyon, the stream channel was completely dry to bedrock, leaving many aquatic invertebrates dead in the sediments.
That was probably "an unprecedented disturbance," the researchers said in their report. Community composition shifted dramatically, with longer-lived insects dying out and smaller, shorter-lived ones taking their places.
Conceptually similar events have taken place in the past in plant communities in the Florida Everglades, floodplains in Australia, and boreal forests following fire disturbance, other researchers have found. In the Southwest, climate change models predict longer, more frequent and more intense droughts in the coming century, the scientists noted in their study.
The research was supported by the National Science Foundation.

100,000-Year-Old Ochre Toolkit and Workshop Discovered in South Africa


Science Daily — An ochre-rich mixture, possibly used for decoration, painting, and skin protection 100,000 years ago, and stored in two abalone shells, was discovered at Blombos Cave on the southern Cape coast of South Africa.

The findings were published in the journal Science on Oct. 14, 2011.
"Ochre may have been applied with symbolic intent as decoration on bodies and clothing during the Middle Stone Age," says Professor Christopher Henshilwood from the Institute for Human Evolution at the University of the Witwatersrand, Johannesburg, who, together with his international team discovered a processing workshop in 2008 where a liquefied ochre-rich mixture was produced.
The two coeval, spatially associated toolkits were discovered in situ (undisturbed in their original place of deposition), and the kits included ochre, bone, charcoal, grindstones, and hammerstones. The grinding and scraping of ochre to produce a powder for use as a pigment became common practice in Africa and the Near East only after about 100,000 years ago.
"This discovery represents an important benchmark in the evolution of complex human cognition (mental processes) in that it shows that humans had the conceptual ability to source, combine and store substances that were then possibly used to enhance their social practices," explains Henshilwood.
"We believe that the manufacturing process involved the rubbing of pieces of ochre on quartzite slabs to produce a fine red powder. Ochre chips were crushed with quartz, quartzite and silcrete hammerstones/grinders and combined with heated, crushed mammal bone, charcoal, stone chips and a liquid, which was then introduced to the abalone shells and gently stirred. A bone was probably used to stir the mixture and to transfer some of it out of the shell."
The quartz sediments in which the ochre containers were buried were dated to about 100,000 years using Optically Stimulated Luminescence (OSL) dating. This is consistent with the thermoluminescence dating of burnt lithics and the dating of calcium carbonate concretions using uranium-series dating methods.
"The recovery of these toolkits adds evidence for early technological and behavioural developments associated with humans and documents their deliberate planning, production and curation of pigmented compound and the use of containers. It also demonstrates that humans had an elementary knowledge of chemistry and the ability for long-term planning 100,000 years ago," concludes Henshilwood.
The two specimens will be on display at the Iziko Museum in Cape Town from Friday, 14 October 2011.
*Ochre is the colloquial term used by archaeologists to describe an earth or rock containing red or yellow oxides or hydroxides of iron.
*Note: The Blombos Cave is situated on the southern Cape Coast, 300km east of Cape Town, South Africa.

Future Forests May Soak Up More Carbon Dioxide Than Previously Believed



An aerial view of the 38-acre experimental forest in Wisconsin where U-M researchers and their colleagues continuously exposed birch, aspen and maple trees to elevated levels of carbon dioxide and ozone gas from 1997 through 2008. (Credit: David Karnosky, Michigan Technological University)
Science Daily — North American forests appear to have a greater capacity to soak up heat-trapping carbon dioxide gas than researchers had previously anticipated.

The results of a 12-year study at an experimental forest in northeastern Wisconsin challenge several long-held assumptions about how future forests will respond to the rising levels of atmospheric carbon dioxide blamed for human-caused climate change, said University of Michigan microbial ecologist Donald Zak, lead author of a paper published online this week in Ecology Letters. As a result, he and his colleagues conclude, forests could help slow the pace of human-caused climate warming more than most scientists had thought.
"Some of the initial assumptions about ecosystem response are not correct and will have to be revised," said Zak, a professor at the U-M School of Natural Resources and Environment and the Department of Ecology and Evolutionary Biology in the College of Literature, Science, and the Arts.
To simulate atmospheric conditions expected in the latter half of this century, Zak and his colleagues continuously pumped extra carbon dioxide into the canopies of trembling aspen, paper birch and sugar maple trees at a 38-acre experimental forest in Rhinelander, Wis., from 1997 to 2008.
Some of the trees were also bathed in elevated levels of ground-level ozone, the primary constituent in smog, to simulate the increasingly polluted air of the future. Both parts of the federally funded experiment -- the carbon dioxide and the ozone treatments -- produced unexpected results.
In addition to trapping heat, carbon dioxide is known to have a fertilizing effect on trees and other plants, making them grow faster than they normally would. Climate researchers and ecosystem modelers assume that in coming decades, carbon dioxide's fertilizing effect will temporarily boost the growth rate of northern temperate forests.
Previous studies have concluded that this growth spurt would be short-lived, grinding to a halt when the trees can no longer extract the essential nutrient nitrogen from the soil.
But in the Rhinelander study, the trees bathed in elevated carbon dioxide continued to grow at an accelerated rate throughout the 12-year experiment. In the final three years of the study, the CO2-soaked trees grew 26 percent more than those exposed to normal levels of carbon dioxide.
It appears that the extra carbon dioxide allowed trees to grow more small roots and "forage" more successfully for nitrogen in the soil, Zak said. At the same time, the rate at which microorganisms released nitrogen back to the soil, as fallen leaves and branches decayed, increased.
"The greater growth has been sustained by an acceleration, rather than a slowing down, of soil nitrogen cycling," Zak said. "Under elevated carbon dioxide, the trees did a better job of getting nitrogen out of the soil, and there was more of it for plants to use."
Zak stressed that growth-enhancing effects of CO2 in forests will eventually "hit the wall" and come to a halt. The trees' roots will eventually "fully exploit" the soil's nitrogen resources. No one knows how long it will take to reach that limit, he said.
The ozone portion of the 12-year experiment also held surprises.
Ground-level ozone is known to damage plant tissues and interfere with photosynthesis. Conventional wisdom has held that in the future, increasing levels of ozone would constrain the degree to which rising levels of carbon dioxide would promote tree growth, canceling out some of a forest's ability to buffer projected climate warming.
In the first few years of the Rhinelander experiment, that's exactly what was observed. Trees exposed to elevated levels of ozone did not grow as fast as other trees. But by the end of study, ozone had no effect at all on forest productivity.
"What happened is that ozone-tolerant species and genotypes in our experiment more or less took up the slack left behind by those who were negatively affected, and that's called compensatory growth," Zak said. The same thing happened with growth under elevated carbon dioxide, under which some genotypes and species fared better than others.
"The interesting take home point with this is that aspects of biological diversity -- like genetic diversity and plant species compositions -- are important components of an ecosystem's response to climate change," he said. "Biodiversity matters, in this regard."
Co-authors of the Ecology Letters paper were Kurt Pregitzer of the University of Idaho, Mark Kubiske of the U.S. Forest Service and Andrew Burton of Michigan Technological University. The work was funded by grants from the U.S. Department of Energy and the U.S. Forest Service.

Robotic Bug Gets Wings, Sheds Light On Evolution of Flight


Adding wings to a robotic bug improved running performance and stability. However, the boost may not have been good enough for flight. (Credit: Image by Kevin Peterson, UC Berkeley Biomimetic Millisystems Lab, All rights reserved.)

Science Daily  — When engineers at the University of California, Berkeley, outfitted a six-legged robotic bug with wings in an effort to improve its mobility, they unexpectedly shed some light on the evolution of flight.

The research team, led by Ron Fearing, professor of electrical engineering and head of the Biomimetic Millisystems Lab at UC Berkeley, reports its conclusions online on Oct. 18, in the peer-reviewed journal Bioinspiration and Biomimetics.
Even though the wings significantly improved the running performance of the 10-centimeter-long robot -- called DASH, short for Dynamic Autonomous Sprawled Hexapod -- they found that the extra boost would not have generated enough speed to launch the critter from the ground. The wing flapping also enhanced the aerial performance of the robot, consistent with the hypothesis that flight originated in gliding tree-dwellers.
Using robot models could play a useful role in studying the origins of flight, particularly since fossil evidence is so limited, the researchers noted.
First unveiled by Fearing and graduate student Paul Birkmeyer in 2009, DASH is a lightweight, speedy robot made of inexpensive, off-the-shelf materials, including compliant fiber board with legs driven by a battery-powered motor. Its small size makes it a candidate for deployment in areas too cramped or dangerous for humans to enter, such as collapsed buildings.
A robot gets its wings
But compared with its biological inspiration, the cockroach, DASH had certain limitations on where it could scamper. Remaining stable while going over obstacles is fairly tricky for small robots, so the researchers fitted DASH with lateral and tail wings borrowed from a store-bought toy to see if that would help.
"Our overall goal is to give our robots the same all-terrain capabilities that other animals have," said Fearing. "In the real world, there will be situations where flying is a better option than crawling, and other places where flying won't work, such as in confined or crowded spaces. We needed a hybrid running-and-flying robot."
The researchers ran tests on four different configurations of the robotic roach, now called DASH+Wings. The test robots included one with a tail only and another with just the wings' frames, to determine how the wings themselves affected locomotion.
With its motorized flapping wings, DASH+Wings nearly doubled its running speed, going from 0.68 meters per second with legs alone to 1.29 meters per second. The robot could also take on steeper hills, going from an incline angle of 5.6 degrees to 16.9 degrees.
"With wings, we saw improvements in performance almost immediately," said study lead author Kevin Peterson, a Ph.D. student in Fearing's lab. "Not only did the wings make the robot faster and better at steeper inclines, but it could now also keep itself upright when descending. The wingless version of DASH could survive falls from eight stories up, but it would sometimes land upside down, and where it landed was partly guided by luck."
The flapping wings improved the lift-drag ratio, helping DASH+Wings land on its feet instead of just plummeting uncontrolled. Once it hit the ground, the robot was able to continue on its way. Wind tunnel experiments showed that it is aerodynamically capable of gliding at an angle up to 24.7 degrees.
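For steady gliding, the quoted glide angle implies a lift-to-drag ratio. A minimal sketch of that standard relation, assuming the 24.7-degree figure is the descent angle below the horizontal (the resulting L/D value is my own arithmetic, not a number reported in the paper):

```python
import math

def lift_to_drag(glide_angle_deg):
    # For an equilibrium glide, L/D equals the cotangent of the
    # glide angle measured below the horizontal: L/D = 1 / tan(theta).
    return 1.0 / math.tan(math.radians(glide_angle_deg))

ratio = lift_to_drag(24.7)  # glide angle reported for DASH+Wings
print(f"L/D at a 24.7-degree glide: {ratio:.2f}")
```

A shallower glide angle would correspond to a higher lift-to-drag ratio, which is why the improvement over uncontrolled plummeting matters.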
Tree-dwellers vs. ground-runners
The engineering team's work caught the attention of animal flight expert Robert Dudley, a UC Berkeley professor of integrative biology, who noted that the dominant theories of flight evolution have been derived primarily from scant fossil records and theoretical modeling.
He referenced previous computer models suggesting that ground-dwellers, given the right conditions, would need only to triple their running speed to build up enough thrust for takeoff. The fact that DASH+Wings could muster at most a doubling of its running speed suggests that wings alone do not provide enough of a boost to launch an animal from the ground. This finding is consistent with the theory that flight arose in animals that glided down from a height.
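The comparison above is simple arithmetic, and can be checked directly against the article's figures (the threefold takeoff threshold comes from the earlier computer models Dudley cites):

```python
# Figures from the article; the takeoff factor is the modeled
# speed-up a ground-dweller would need to generate enough thrust.
legs_only = 0.68       # m/s, DASH running on legs alone
with_wings = 1.29      # m/s, DASH+Wings with flapping wings
takeoff_factor = 3.0   # threefold speed-up needed for ground takeoff

boost = with_wings / legs_only
print(f"speed-up with wings: {boost:.2f}x")  # ~1.9x, well short of 3x
```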
"The fossil evidence we do have suggests that the precursors to early birds had long feathers on all four limbs, and a long tail similarly endowed with a lot of feathers, which would mechanically be more beneficial for tree-dwelling gliders than for runners on the ground," said Dudley.
Dudley said that the winged version of DASH is not a perfect model for proto-birds -- it has six legs instead of two, and its wings use a sheet of plastic rather than feathers -- and thus cannot provide a slam-dunk answer to the question of how flight evolved.
"What the experiments did do was to demonstrate the feasibility of using robot models to test hypotheses of flight origins," he said. "It's the proof of concept that we can actually learn something useful about biological performance through systematic testing of a physical model."
Among other robotic insects being tested in the Biomimetic Millisystems Lab is a winged, bipedal robot called BOLT (Bipedal Ornithopter for Locomotion Transitioning) that more closely resembles the size and aerodynamics of precursors to flying birds and insects.
"It's still notable that adding wings to DASH resulted in marked improvements in its ability to get around," said Fearing. "It shows that flapping wings may provide some advantages evolutionarily, even if it doesn't enable flight."
The National Science Foundation's Center of Integrated Nanomechanical Systems and the U.S. Army Research Laboratory helped support this research.

Seeing Through Walls: New Radar Technology Provides Real-Time Video of What’s Going On Behind Solid Walls


Lincoln Lab researchers have built a system that can see through walls from some distance away, giving an instantaneous picture of the activity on the other side. (Credit: Lincoln Lab, MIT)
Science Daily  — The ability to see through walls is no longer the stuff of science fiction, thanks to new radar technology developed at MIT's Lincoln Laboratory.
The researchers' device is an unassuming array of antennas arranged in two rows -- eight receiving elements on top, 13 transmitting ones below -- plus some computing equipment, all mounted on a movable cart. But it has powerful implications for military operations, especially "urban combat situations," says Gregory Charvat, a member of the technical staff at Lincoln Lab and leader of the project.


Much as humans and other animals see via waves of visible light that bounce off objects and then strike our eyes' retinas, radar "sees" by sending out radio waves that bounce off targets and return to the radar's receivers. But just as light can't pass through solid objects in quantities large enough for the eye to detect, it's hard to build radar that can penetrate walls well enough to show what's happening behind them. Now, Lincoln Lab researchers have built a system that can see through walls from some distance away, giving an instantaneous picture of the activity on the other side.
Waves through walls
Walls, by definition, are solid, and that's certainly true of the four- and eight-inch-thick concrete walls on which the researchers tested their system.
At first, their radar functions like any other: transmitters emit waves of a certain frequency toward the target. But in this case, each time the waves hit the wall, the concrete blocks more than 99 percent of them from passing through. And that's only half the battle: once the waves bounce off any targets, they must pass back through the wall to reach the radar's receivers -- and again, more than 99 percent don't make it. By the time it reaches the receivers, the signal is reduced to about 0.0025 percent of its original strength.
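The round-trip loss is just the one-way loss applied twice. A quick sketch of that arithmetic, where the 0.5 percent one-way transmission is my own assumption, chosen because it fits both "more than 99 percent" blocked per pass and the quoted round-trip figure:

```python
# Assumed one-way transmission through the wall: 0.5% of the signal
# passes (i.e., "more than 99 percent" is blocked on each pass).
one_way = 0.005

# The signal crosses the wall twice: out to the target and back.
round_trip = one_way * one_way   # 0.005 * 0.005 = 2.5e-5

print(f"round-trip signal: {round_trip * 100:.4f}% of original")
```

That reproduces the article's figure of roughly 0.0025 percent of the original strength.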
But according to Charvat, signal loss from the wall is not even the main challenge. "[Signal] amplifiers are cheap," he says. What has been difficult for through-wall radar systems is achieving the speed, resolution and range necessary to be useful in real time. "If you're in a high-risk combat situation, you don't want one image every 20 minutes, and you don't want to have to stand right next to a potentially dangerous building," Charvat says.
The Lincoln Lab team's system can be used at a range of up to 60 feet from the wall. (Demos were done at 20 feet, which Charvat says is realistic for an urban combat situation.) And it gives a real-time picture of movement behind the wall in the form of video at 10.8 frames per second.
Filtering for frequencies
One consideration for through-wall radar, Charvat says, is what radio wavelength to use. Longer wavelengths are better able to pass through the wall and back, which makes for a stronger signal; however, they also require a correspondingly larger radar apparatus to resolve individual human targets. The researchers settled on S-band waves, which have about the same wavelength as wireless Internet -- that is, fairly short. That means more signal loss -- hence the need for amplifiers -- but the actual radar device can be kept to about eight and a half feet long. "This, we believe, was a sweet spot because we think it would be mounted on a vehicle of some kind," Charvat says.
Even when the signal-strength problem is addressed with amplifiers, the wall -- whether it's concrete, adobe or any other solid substance -- will always show up as the brightest spot by far. To get around this problem, the researchers use an analog crystal filter, which exploits frequency differences between the modulated waves bouncing off the wall and those coming from the target. "So if the wall is 20 feet away, let's say, it shows up as a 20-kilohertz sine wave. If you, behind the wall, are 30 feet away, maybe you'll show up as a 30-kilohertz sine wave," Charvat says. The filter can be set to allow only waves in the range of 30 kilohertz to pass through to the receivers, effectively deleting the wall from the image so that it doesn't overpower the receiver.
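The range-to-frequency mapping Charvat describes can be sketched as a toy filter. The 1-kilohertz-per-foot scale below is my inference from his "20 feet ... 20-kilohertz" example, not a published system parameter:

```python
# Each reflector shows up as a beat tone whose frequency is
# proportional to its range (assumed scale: 1 kHz per foot,
# inferred from Charvat's example figures).
KHZ_PER_FOOT = 1.0

def beat_khz(range_ft):
    return KHZ_PER_FOOT * range_ft

wall = beat_khz(20)     # the wall appears as a 20 kHz tone
person = beat_khz(30)   # a target 10 feet behind it: 30 kHz

# The analog crystal filter acts like a high-pass cutoff between
# the two tones, deleting the bright wall return from the image.
cutoff = 25.0
visible = [f for f in (wall, person) if f >= cutoff]
print(visible)  # only the 30 kHz target survives
```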
"It's a very capable system mainly because of its real-time imaging capability," says Robert Burkholder, a research professor in Ohio State University's Department of Electrical and Computer Engineering who was not involved with this work. "It also gives very good resolution, due to digital processing and advanced algorithms for image processing. It's a little bit large and bulky for someone to take out in the field," he says, but agrees that mounting it on a truck would be appropriate and useful.
Monitoring movement
In a recent demonstration, Charvat and his colleagues, Lincoln Lab assistant staff John Peabody and former Lincoln Lab technical staff Tyler Ralston, showed how the radar was able to image two humans moving behind solid concrete and cinder-block walls, as well as a human swinging a metal pole in free space. The project won best paper at a recent conference, the 2010 Tri-Services Radar Symposium.
Because the processor uses a subtraction method -- comparing each new picture to the last, and seeing what's changed -- the radar can only detect moving targets, not inanimate objects such as furniture. Still, even a human trying to stand still moves slightly, and the system can detect these small movements to display that human's location.
The system digitizes the signals it receives into video. Currently, humans show up as "blobs" that move about the screen in a bird's-eye-view perspective, as if the viewer were standing on the wall and looking down at the scene behind. The researchers are currently working on algorithms that will automatically convert a blob into a clean symbol to make the system more end-user friendly. "To understand the blobs requires a lot of extra training," Charvat says.
With further refinement, the radar could be used domestically by emergency-response teams and others, but the researchers say they developed the technology primarily with military applications in mind. Charvat says, "This is meant for the urban war fighter … those situations where it's very stressful and it'd be great to know what's behind that wall."

Dark Matter Mystery Deepens

Science Daily — Like all galaxies, our Milky Way is home to a strange substance called dark matter. Dark matter is invisible, betraying its presence only through its gravitational pull. Without dark matter holding them together, our galaxy's speedy stars would fly off in all directions. The nature of dark matter is a mystery that a new study has only deepened.
The standard cosmological model describes a universe dominated by dark energy and dark matter. Most astronomers assume that dark matter consists of "cold" (i.e. slow-moving) exotic particles that clump together gravitationally. Over time, these dark matter clumps have grown and attracted normal matter, forming the galaxies we see today.
"After completing this study, we know less about dark matter than we did before," said lead author Matt Walker, a Hubble Fellow at the Harvard-Smithsonian Center for Astrophysics.
Cosmologists use powerful computers to simulate this process. Their simulations show that dark matter should be densely packed in the centers of galaxies. Instead, new measurements of two dwarf galaxies show that they contain a smooth distribution of dark matter. This suggests that the standard cosmological model may be wrong.
"Our measurements contradict a basic prediction about the structure of cold dark matter in dwarf galaxies. Unless or until theorists can modify that prediction, cold dark matter is inconsistent with our observational data," Walker stated.
Dwarf galaxies are composed of up to 99 percent dark matter and only 1 percent normal matter, like stars. This disparity makes dwarf galaxies ideal targets for astronomers seeking to understand dark matter.
Walker and his co-author Jorge Peñarrubia (University of Cambridge, UK) analyzed the dark matter distribution in two Milky Way neighbours: the Fornax and Sculptor dwarf galaxies. These galaxies hold one million to 10 million stars, compared to about 400 billion in our galaxy. The team measured the locations, speeds and basic chemical compositions of 1500 to 2500 stars.
"Stars in a dwarf galaxy swarm like bees in a beehive instead of moving in nice, circular orbits like a spiral galaxy," explained Peñarrubia. "That makes it much more challenging to determine the distribution of dark matter."
Their data showed that dark matter is distributed uniformly in both cases over a relatively large region, several hundred light-years across. This contradicts the prediction that the density of dark matter should increase sharply toward the centers of these galaxies.
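The contrast between the predicted and observed profiles can be illustrated with two standard textbook density laws: a "cuspy" NFW-like profile, which rises sharply toward the center, versus a cored profile, which flattens out as the measurements suggest. These are illustrative forms, not the paper's actual fits:

```python
def cusp_density(r, rho0=1.0, rs=1.0):
    # NFW-like cusp: rho ~ 1 / [(r/rs) * (1 + r/rs)^2],
    # which diverges as r approaches zero.
    x = r / rs
    return rho0 / (x * (1 + x) ** 2)

def cored_density(r, rho0=1.0, rc=1.0):
    # Simple pseudo-isothermal core: nearly constant near r = 0.
    return rho0 / (1 + (r / rc) ** 2)

# Moving inward, the cusp keeps climbing while the core levels off.
for r in (1.0, 0.5, 0.1):
    print(f"r={r}: cusp={cusp_density(r):.2f}, core={cored_density(r):.2f}")
```

The new measurements favor the cored behavior, which is what puts them in tension with the cold dark matter prediction.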
"If a dwarf galaxy were a peach, the standard cosmological model says we should find a dark matter 'pit' at the center. Instead, the first two dwarf galaxies we studied are like pitless peaches," said Peñarrubia.
Some have suggested that interactions between normal and dark matter could spread out the dark matter, but current simulations don't indicate that this happens in dwarf galaxies. The new measurements imply that either normal matter affects dark matter more than expected, or dark matter isn't "cold." The team hopes to determine which is true by studying more dwarf galaxies, particularly galaxies with an even higher percentage of dark matter.
The paper discussing this research was accepted for publication in The Astrophysical Journal.