
Thursday, August 18, 2011

New Computer Chip Modeled on a Living Brain Can Learn and Remember



IBM, with help from DARPA, has built two working prototypes of a "neurosynaptic chip." Based on the neurons and synapses of the brain, these first-generation cognitive computing cores could represent a major leap in power, speed and efficiency.
IBM's "Neurosynaptic" chip prototype IBM Research Zurich
A pair of brain-inspired cognitive computer chips unveiled today could be a new leap forward — or at least a major fork in the road — in the world of computer architecture and artificial intelligence.
About a year ago, we told you about IBM’s project to map the neural circuitry of a macaque, the most complex brain networking project of its kind. Big Blue wasn’t doing it just for the sake of science — the goal was to reverse-engineer neural networks, helping pave the way to cognitive computer systems that can think as efficiently as the brain. Now they’ve made just such a system — two, actually — and they’re calling them neurosynaptic chips.

Built on a 45-nanometer silicon/metal-oxide-semiconductor platform, both chips have 256 neurons. One chip has 262,144 programmable synapses and the other contains 65,536 learning synapses — which can remember and learn from their own actions. IBM researchers have used the compute cores for experiments in navigation, machine vision, pattern recognition, associative memory and classification, the company says. It’s a step toward redefining computers as adaptable, holistic learning systems, rather than yes-or-no calculators.
“This new architecture represents a critical shift away from today’s traditional von Neumann computers to an extremely power-efficient architecture,” Dharmendra Modha, project leader for IBM Research, said in an interview. “It integrates memory with processors, and it is fundamentally massively parallel and distributed as well as event-driven, so it begins to rival the brain’s function, power and space.”
You can read up on von Neumann architecture over here, but essentially it is a design in which program instructions and data share a single memory and a single pathway to the processor. That shared channel creates a bottleneck that fundamentally limits the speed of memory transfer. IBM’s system eliminates that bottleneck by putting the circuits for data computation and storage together, allowing the system to compute information from multiple sources at the same time with greater efficiency. Also like the brain, the chips have synaptic plasticity, meaning certain regions can be reconfigured to perform tasks to which they were not initially assigned.
IBM’s long-term goal is to build a chip system with 10 billion neurons and 100 trillion synapses that consumes just one kilowatt of power and fits inside a shoebox, Modha said.
The project is funded by DARPA’s SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) initiative, and IBM just completed phases 0 and 1. IBM’s project, which involves collaborators from Columbia University, Cornell University, the University of California-Merced and the University of Wisconsin-Madison, just received another $21 million in funding for phase 2, the company said.
Computer scientists have been working for some time on systems that can emulate the brain’s massively parallel, low-power computing prowess, and they’ve made several breakthroughs. Last year, computer engineer Steve Furber described a synaptic computer network that consists of tens of thousands of cellphone chips.
The most notable computer-brain achievements have been in the field of memristors. As the name implies, a memory resistor can “remember” the last resistance it had while current was flowing through it, so when the current is turned back on, the circuit's resistance is the same as before. We won't delve too deeply here, but keeping memory in the circuit element itself can make a system much more efficient.
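For readers who do want to delve a little deeper, the standard linear-drift model of a memristor is easy to simulate. The sketch below is a toy illustration of that model only; every parameter value in it is invented for readability, not taken from a real device.

```python
# Toy simulation of the linear-drift memristor model (Strukov et al.,
# 2008): resistance depends on an internal state w that moves with the
# charge that has flowed through the device. All values are illustrative,
# chosen only to make the drift visible.

R_ON, R_OFF = 100.0, 16000.0  # resistances when fully doped / undoped (ohms)
D = 10e-9                     # device thickness (m)
MU = 1e-14                    # dopant mobility (m^2 / (V*s))
w = 0.1 * D                   # initial width of the doped region
dt = 1e-3                     # time step (s)

def memristance(w):
    x = w / D                 # fraction of the device that is doped
    return R_ON * x + R_OFF * (1.0 - x)

print("resistance before:", round(memristance(w)), "ohms")
for step in range(2000):
    v = 1.0 if step < 1000 else 0.0      # drive for a while, then cut power
    i = v / memristance(w)
    w += MU * (R_ON / D) * i * dt        # linear dopant drift: dw/dt ~ i
    w = min(max(w, 0.0), D)              # state is confined to the device
print("resistance after: ", round(memristance(w)), "ohms")
# With the voltage off, w (and hence the resistance) stays where it was:
# the device "remembers" the last resistance it had while current flowed.
```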
A Map of the Mind: The highways and byways connecting the various regions of a macaque monkey's brain.  PNAS
Hewlett-Packard has been developing memristors since first describing them in 2008, and has also been part of the SyNAPSE project. Last spring, HP engineers described a titanium dioxide memristor that uses low power.
For a brain-based computer system, memristors can function as a computer analogue for a synapse, which also stores information about previous data transfer. IBM's chip doesn't use a memristor architecture, but it does integrate memory with computational power — and it uses digital neurons and axons to do it. The building blocks are simple, but the architecture is unique, said Rajit Manohar, associate dean for research and graduate studies in the engineering school at Cornell.
"When a neuron changes its state, the state it is modifying is its own state, not the state of something else. So you can physically co-locate the circuit to do the computation, and the circuit to store the state. They can be very close to each other, so that cooperation becomes very efficient," he said.
Modha countered that a memristor is just a new way to store memory.
"A bit is a bit is a bit. You could store a bit in a memristor, or a phase-change memory, or a nano-electromechanical switch, or SRAM, or any form of memory that you please. But by itself, that does not a complete architecture make," Modha said. "It has no computational capability."
But this new chip does have that power, he said. It integrates memory with processing capability on a typical SOI-CMOS platform, using traditional transistors in a new design. Along with integrated memory to stand in for synapses, the neurosynaptic “core” uses typical transistors for its input-output elements, the neurons.
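To make that memory-next-to-computation idea concrete, here is a minimal, event-driven integrate-and-fire neuron in plain Python. It is our toy sketch of the general principle, not IBM's core design; the class name, weights and thresholds are all invented.

```python
# A toy, event-driven integrate-and-fire neuron. Each neuron stores its
# own state (its membrane potential) right next to the logic that
# updates it, and work happens only when a spike arrives.

class Neuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0     # the neuron's own state, kept locally
        self.threshold = threshold
        self.leak = leak         # passive decay applied on each event

    def receive(self, weight):
        """Integrate an incoming spike; return True if the neuron fires."""
        self.potential = self.potential * self.leak + weight
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True
        return False

# Wire two neurons with a "synaptic" weight and drive the first one.
pre, post = Neuron(), Neuron()
synapse = 0.6
for step in range(10):
    if pre.receive(0.5):          # external input to the first neuron
        post.receive(synapse)     # event-driven: only spikes cause work
    print(step, round(pre.potential, 3), round(post.potential, 3))
```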
Chip Simulation: A simulation of the cognitive chip. IBM has fabricated two chips based on this design, which the company says recreate the phenomena between spiking neurons and synapses in biological systems.  IBM Research
This new architecture will not replace traditional computers, however. “Both will be with us for a long time to come, and continue to serve humanity,” Modha predicted.
The idea is that future powerful chips based on this brain-network design will be able to ingest and compute information from multiple inputs and make sense of it all — just like the brain does.
A cognitive computer monitoring the oceans could record and compute variables like temperature, wave height and acoustics, and decide whether to issue tsunami or hurricane warnings. Or a grocer stocking shelves could use a special glove that monitors scent, texture and sight to flag contaminated produce, Modha said. Modern computers can’t handle that level of detail from so many inputs, he said. But our brains do it all the time — grab a rotting peach, and your senses of touch, smell and sight work in concert instantaneously to determine that the fruit is bad.
To do this, the brain uses electrical signals between some 150 trillion synapses, all while sipping energy — our brains need about 20 watts to function. Understanding how this works is key to building brain-based computers, which is why IBM has been working with neuroscientists to study monkey and cat brains. That research is progressing, Modha said.
But it will be quite some time before computer chips can truly match the ultra-efficient computational powerhouses that nature gave us.
Reading Handwriting: IBM researchers John Arthur (left) and Paul Merolla demonstrate handwriting recognition by the cognitive computing chip. The computer interprets what Paul has written (far right) as the figure "2" on the far left. Right next to it, it displays its confidence that the figure is a 2 (not a zero, and definitely not a 9).  IBM Research

DARPA Fills Us In On HTV-2's Semi-Successful Flight and Very Successful Crash




HTV-2 in Flight You can't imagine the shutter speed our photographer had to use to capture this shot (kidding). DARPA
Last week, DARPA’s HTV-2 (Hypersonic Technology Vehicle 2) Falcon vehicle launched to near-orbital speeds aboard a Minotaur rocket before beginning what was designed to be a Mach 20 glide back to Earth, demonstrating the kind of hypersonic capability needed to deliver a payload anywhere in the world in an hour. Then, a few minutes into its flight, HTV-2’s data transmitters went silent and so did the DARPA news stream feeding us the play-by-play.
Now we know what happened to HTV-2. Sort of.

HTV-2 separated from its Minotaur carrier rocket successfully and entered the descent phase of its flight with all systems looking nominal. But somewhere “post-perigee”--where it was supposed to start climbing again--HTV-2 encountered a flight anomaly that caused its autonomous systems to initiate a controlled termination of the flight. That is, the computer took over and crashed the hypersonic vehicle into the Pacific just as it was designed to do. Per DARPA director Regina Dugan via press release:
According to a preliminary review of the data collected prior to the anomaly encountered by the HTV-2 during its second test flight, HTV-2 demonstrated stable aerodynamically controlled Mach 20 hypersonic flight for approximately three minutes. It appears that the engineering changes put into place following the vehicle’s first flight test in April 2010 were effective. We do not yet know the cause of the anomaly for Flight 2.
While not ideal news, don’t miss the most important message there. HTV-2 achieved stable Mach 20 flight for about three minutes, gathering data all the way through. That’s 20 times the speed of sound. And though DARPA isn’t sure what the “anomaly” was, HTV-2 also demonstrated that its autonomous systems worked perfectly--at least according to DARPA. You can’t have a rogue hypersonic missile out there roaming the skies out of control. When something went awry, HTV-2 offed itself in the controlled manner prescribed by its engineers.
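For scale, here is a quick back-of-the-envelope check on the "anywhere in the world in an hour" figure. It uses the sea-level speed of sound purely for illustration; the speed of sound falls with altitude, so treat this as a rough sketch rather than DARPA's math.

```python
# Rough sanity check on "anywhere in the world in an hour".
SPEED_OF_SOUND = 343.0              # m/s at sea level, about 20 C
EARTH_CIRCUMFERENCE = 40_075_000.0  # meters

mach20 = 20 * SPEED_OF_SOUND        # ~6,860 m/s
antipode = EARTH_CIRCUMFERENCE / 2  # farthest you can be from the launch site
minutes = antipode / mach20 / 60

print(f"Mach 20 is about {mach20 / 1000:.1f} km/s")
print(f"Time to the far side of the planet: about {minutes:.0f} minutes")
# Prints roughly 49 minutes: comfortably inside the one-hour claim.
```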
So it wasn’t a completely successful flight, but it was a successful crash. No word on where the program goes from here, but it’s unlikely DARPA is simply going to sit on that valuable hypersonic flight data. Expect something equally cool to be in the works in coming months.

Malignant stem cells may explain why some breast cancers develop and recur



“Abnormalities in these cells may indicate the earliest mutations in breast cancers, an OHSU Knight Cancer Institute study found.”
Mutations found in stem cells could be causing some breast cancers to develop and may be the reason the disease recurs. These abnormal cells likely control cell functions in the tumor and, because they are not targeted by chemotherapy and radiation, they enable the disease to recur.
Caption: SuEllen J. Pommier, Ph.D., is an associate research professor in the division of surgical oncology at Oregon Health & Science University's Knight Cancer Institute. Credit: Oregon Health & Science University Knight Cancer Institute.
The mutations were discovered in a study conducted by scientists and physicians at the Oregon Health & Science University Knight Cancer Institute. The study, which examined breast cancer cells removed during surgery, was recently published online in the Annals of Surgical Oncology.
“By studying normal and malignant cells that were collected from breast tissues removed during surgery, we were able to look at what is occurring in the body,” said SuEllen J. Pommier, Ph.D., the lead author of the study and associate research professor in the division of surgical oncology at the OHSU Knight Cancer Institute.
Working with samples taken directly from surgeries made the findings in this study possible, Pommier said, because the biology of breast stem cells could be compared with their malignant counterparts in a way that hadn’t been done before. The cultured cell lines used in most studies can’t provide accurate information about normal breast stem cells.
The study, which was funded primarily by the Avon Foundation for Women, suggests that some current therapies that target mutations in the tumor may not be effective in stamping out the disease for some patients. It also suggests that more research should be done in two areas:
  • Determining the role of PIK3CA/AKT1 signaling mutations, which were found in 73 percent of the tumors in this study of fresh surgical specimens – an occurrence rate that is much higher than previously detected in stored samples.
  • And, exploring the importance of the loss of CD24 expression, which previously was considered a requirement for breast cancer stem cells, but may not be a characteristic of all breast cancer stem cells. 
Understanding the biology of individual tumors is the primary mission of the OHSU Knight Cancer Institute. “This study provided us with new insights into breast cancer stem cells and possibly into the earliest mutations. That information is crucial for developing treatments,” Pommier added.
___________
In addition to support from the Avon Foundation for Women, this study was funded by a Vertex Pharmaceutical/Oregon Health & Science Collaborative Research Grant.

Gene combination increases risk of lung cancer, particularly in light smokers, CAMH study finds



Smokers with variations in two specific genes have a greater risk of smoking more cigarettes, becoming more dependent on nicotine and developing lung cancer, a new study from the Centre for Addiction and Mental Health (CAMH) shows.
The cancer risk from these two genes appears to be even higher in smokers who consume 20 or fewer cigarettes a day, according to the study published in the September issue of the Journal of the National Cancer Institute.

X-ray of the lungs
CAMH Scientist Dr. Rachel Tyndale and her team studied two genes: the nicotine metabolic gene (CYP2A6) and the nicotinic gene cluster (CHRNA5-A3-B4). These genes have been independently linked to smoking behaviours and lung cancer.
The new CAMH study looked at the effect of combined risks of both genes, possible gene interactions and the relative contribution of each gene on the number of cigarettes smoked, nicotine dependence and lung cancer risk in smokers.
“We found that the nicotine metabolic gene appears to have a larger influence on how many cigarettes people smoke each day, while the nicotinic gene cluster has a larger impact on the risk of lung cancer,” said Dr. Tyndale. “The combined effect of having both high-risk gene variations more than doubled the odds of developing lung cancer.”
The CAMH study included 417 lung cancer patients and a comparison group of 443 individuals with no cancer – all current or former smokers. Each individual was classified as having a low, intermediate or high risk of heavier cigarette smoking or developing cancer, depending on which combination of gene variants they had. The genetic profiles of study participants were linked to reports of cigarette smoking, a standard test of nicotine dependence, and lung cancer presence.
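As a refresher on what Dr. Tyndale's "more than doubled the odds" means in a case-control design like this one, here is the odds-ratio arithmetic with a tiny worked example. The counts below are invented purely for illustration; they are not the study's data.

```python
# Odds-ratio arithmetic for a case-control study. These counts are
# hypothetical, made up to illustrate the calculation; they are NOT
# taken from the CAMH study.
cases_high_risk, cases_low_risk = 120, 60        # lung cancer patients
controls_high_risk, controls_low_risk = 80, 80   # cancer-free smokers

odds_in_cases = cases_high_risk / cases_low_risk              # 2.0
odds_in_controls = controls_high_risk / controls_low_risk     # 1.0
odds_ratio = odds_in_cases / odds_in_controls

print(f"odds ratio: {odds_ratio:.1f}")  # 2.0, i.e. "doubled odds"
```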
The genetic risk of lung cancer due to these two genes was highest among lighter smokers with both high-risk gene variations, compared to those with one or none of the high-risk gene variations.
“While heavier smoking increases the overall risk for lung cancer (as well as other health problems), our study looked specifically at the effects of these two genes on cancer risk,” said Dr. Tyndale, who is also head of CAMH’s Pharmacogenetics Lab. “We found the genetic risk from these two genes made a larger contribution among lighter smokers.”
This finding is significant because most people tend to smoke less than they used to – in Ontario, adult smokers use an average of 13 cigarettes per day, according to the 2009 CAMH Monitor. Lighter smokers tend to perceive their risk as being lower, but if they have high-risk variants of these two genes, the genetic risk for lung cancer is high.
The percentage of individuals with these two genetic variations depends on ethnic background. Together, these variations influence an individual’s response to nicotine and nitrosamines – substances in tobacco smoke that are carcinogenic to the lungs. Dr. Tyndale’s previous research on the nicotine metabolic gene has shown that people who are “fast metabolizers” break nicotine down more quickly. They have a harder time quitting and do not respond as well to nicotine replacement therapy as “slow metabolizers.”
“This study indicates a significant risk for lung cancer is related to these two genes, and also suggests ways to intervene to reduce the risks,” said Dr. Tyndale. “For instance, we are conducting a clinical trial to optimize cessation treatments based on genetic profiles, and fast metabolizers could be treated by using approaches that make them slow metabolizers.”
Tobacco use is the primary cause of preventable disease and death in Canada and the USA. Smoking is the main cause of lung cancer, and also increases the risk of cancers of the colon, mouth, throat, pancreas, bladder and cervix.
____________
For more information or to request an interview, contact: Sara Goldvine, CAMH Public Affairs at 647-567-7457 or sara_goldvine@camh.net

Ice Caps Snapping – An Arctic Meltdown



In case you’ve ever wondered what cascading ice looks like, photographers Sisse Brimberg and Cotton Coulson took a peek for you, and the readers of National Geographic. These photos, snapped from Arctic heights by the Copenhagen-based married couple, are exquisite and also daunting: looking at them, it’s like witnessing a crack in Nature’s plans take shape. There should be a word for when awe-inspiring and dreadful, polar (quite!) opposites, combine like this in one single shot.



Nature’s beautiful even while collapsing, but sugar-coating aside, the dam is breaking, and that is as ominous as it sounds. The two photographers are quite involved in raising awareness about global warming and the silent thaw of the polar ice caps, as well as other, more obviously Man-related issues, like how He can’t be bothered to clean up the debris that He leaves in His wake in this precious ecosystem.

Database Finds New Uses for Old Drugs




Double whammy. Researchers have turned up a pill against heartburn that could be used to treat lung cancer and an antiepileptic pill that might treat Crohn's disease.
Credit: Marina Sirota
What if a cheap medicine sold over the counter turned out to be a cure against cancer or another deadly disease? Scientists have devised a new way of predicting such unexpected benefits of existing drugs, and they have confirmed two potential new therapies just to prove the point. "This promises new uses for drugs that have already been tested for their safety and offers a faster and cheaper way to new medicine," says Atul Butte, an expert in bioinformatics at Stanford University School of Medicine in California, who conducted the study.
There are plenty of examples of drugs originally developed to treat one disease that turned out to help another: Acetylsalicylic acid (aspirin) is not just a pain killer but is also used to reduce the risk of heart attack. And when a blood pressure drug called sildenafil was discovered to have an unexpected side effect, it went on to become the erectile dysfunction blockbuster now known as Viagra. Such crossovers can save drug developers a lot of time and money. Developing a single new drug on average takes more than a decade and costs about $800 million. Existing drugs have known safety profiles and are approved for human use, so they can be rapidly evaluated for new indications.
"But most repurposing of drugs is still due to chance observations or educated guesses," Butte says. In today's issue of Science Translational Medicine, he and his colleagues present a more efficient way of finding such new uses for old drugs: by bringing together data on how diseases and drugs affect the activity of the roughly 30,000 genes in a human cell. Researchers have collected information on which genes are activated or silenced in certain diseases and by certain drugs for many years. "Our hypothesis was, if a disease is characterized by certain changes in gene expression and if a drug causes the reverse changes, then that drug could have a therapeutic effect on the disease," he says.
To find such opposing pairs, Butte and colleagues used public databases and compared the data for 100 diseases with that for 164 drug molecules. They found candidate therapeutics for 53 of the diseases. Many matches had already been discovered and turned into therapies, but others were completely unexpected. For example, the analysis predicted that an epilepsy drug called topiramate would be active against inflammatory bowel diseases such as Crohn's disease. And the over-the-counter drug cimetidine, which inhibits acid production in the stomach and is used to treat heartburn, matched a certain type of lung cancer.
To confirm this latter link, the researchers investigated the compound in a mouse model of lung cancer. They showed that it slowed the growth of human lung cancer cells but not kidney cancer cells in these mice. Similarly, giving topiramate to rats with colitis reduced swelling and ulceration in the animals.
John Overington, a computational chemical biologist at the European Bioinformatics Institute in Hinxton, U.K., is not convinced that these two particular drugs will get very far. "Topiramate hits quite a lot of targets and has complex side effects, while the doses needed for functional effects for cimetidine seemed high," he cautions. But he praises the paper's main idea. "This is a really important concept; it is almost like they are looking for an antidote to a disease." Stefan Schreiber, an expert on the genetics of inflammatory bowel diseases at the University of Kiel in Germany, agrees that the idea behind the paper is more important than the two drug candidates. "This is a proof of principle," he says. "The main point is that someone has taken all of this available genomic data and shown what you can do with it."
The opportunities are growing rapidly, Butte says. "When we started the project 5 years ago, we had data for a hundred diseases and 164 compounds," he says. "Today it would be about 1400 diseases and 300 molecules." Butte hopes that scientists and pharmaceutical companies will continue to make data publicly available. The unknown benefits of many more drugs are just waiting to be discovered, he says.

Suicide-Bombing Bacteria Could Fight Infections




Guerrilla tactics. Biologists have created synthetically engineered E. coli (left) that explode and kill pathogenic P. aeruginosa (right).
Credit: CDC
Like any good military unit, infectious bacteria have access to numerous weapons and efficient communication systems. But like soldiers in the field, they're also susceptible to suicide bombers. Researchers have used the tools of synthetic biology to create an Escherichia coli cell that can infiltrate foreign bacteria and explode, killing off the pathogens along with itself.
The project, says bioengineer Chueh Loo Poh of Nanyang Technological University in Singapore, was "inspired by nature," particularly by quorum sensing, the ability of some bacteria to detect the number of microorganisms—either of their own species or others—in their environment. When pathogenic Pseudomonas aeruginosa sense other species encroaching on their space and nutrients, they communicate with members of their own species using chemical signals and collectively start releasing a bacterial toxin called pyocin that kills off the competition. Together, these communication and defense capabilities allow P. aeruginosa to form tightly packed layers called biofilms, which can cause respiratory tract infections in humans and are particularly dangerous to cystic fibrosis patients.
Poh and chemical engineer Matthew Wook Chang, also at Nanyang Technological University, decided to turn P. aeruginosa's weapon system against itself, using E. coli as the carrier. The researchers tweaked the genes that allow P. aeruginosa to detect other members of its species and put this synthetic genetic code into E. coli's genome. They also gave E. coli a gene for making a modified pyocin that is toxic to P. aeruginosa. By linking the pyocin gene to the sensing genes, the researchers ensured that when the E. coli detected P. aeruginosa in the vicinity, it would fill itself with large amounts of pyocin and become a biological time bomb.
The researchers gave E. coli one last synthetic component: a "suicide gene" that is activated once the pyocin has had some time to build up, causing the cells to burst open and release their toxin. When Chang and Poh grew these synthetic E. coli in a dish with P. aeruginosa, the suicide bomber was able to kill 99% of the P. aeruginosa cells, the researchers report today in Molecular Systems Biology.
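The circuit's logic (sense the pathogen, arm, build up toxin, then self-destruct) can be sketched as a simple threshold model. Everything below, from names to rates to thresholds, is invented for illustration; it is not the paper's quantitative model.

```python
# Toy threshold model of the engineered kill circuit: the E. coli senses
# the P. aeruginosa quorum signal, accumulates pyocin for a while, then
# lyses and releases the toxin all at once. All values are made up.

SENSE_THRESHOLD = 0.5   # quorum-signal level that switches the circuit on
LYSIS_DELAY = 5         # steps of pyocin build-up before the suicide gene fires

def run_circuit(quorum_signal_levels):
    pyocin, armed_for = 0.0, 0
    for t, signal in enumerate(quorum_signal_levels):
        if signal >= SENSE_THRESHOLD:      # sensing module detects pathogen
            pyocin += 1.0                  # pyocin gene is expressed
            armed_for += 1
            if armed_for >= LYSIS_DELAY:   # suicide gene triggers lysis
                print(f"t={t}: cell lyses, releasing {pyocin:.0f} units of pyocin")
                return pyocin
        else:
            armed_for = 0                  # signal gone; stand down
    print("no lysis: quorum signal never persisted long enough")
    return 0.0

# Rising quorum signal as P. aeruginosa grows nearby.
run_circuit([0.1, 0.3, 0.6, 0.7, 0.8, 0.9, 0.9, 0.9])
```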
Justin Gallivan, a synthetic biologist at Emory University in Atlanta, says in an e-mail that the study "nicely illustrates" how synthetic bacteria can perform complex tasks. But he worries they may not be able to finish the job, because 1% of the infectious bacteria remained after the treatment—even when the researchers put four times as many E. coli as P. aeruginosa into the mix.
The system would also have to undergo a lot of work before it can be considered for use in humans—including, perhaps, replacing E. coli with another delivery system, says Richard Kitney, a synthetic biologist at Imperial College London. "Exposing people to E. coli is not a good thing," Kitney says, as the bacteria are toxic outside the gut. He adds that the team would also have to show that pyocin is effective at killing P. aeruginosa that have already formed a biofilm.
For their part, Chang and Poh say that they plan to test the suicidal bacteria in mice infected with P. aeruginosa. It's not clear, they say, whether pyocin is harmful to mammals, although some other natural bacterial toxins are currently approved for use as food preservatives. They also hope to tweak the synthetic system so that it can sense and respond to signaling molecules released by other species of pathogenic bacteria, such as those responsible for cholera.
