Packed Chip: The Ion Proton I can sequence much of a human genome for just $1,000. Sequencing will become even cheaper. Courtesy Life Technologies
Scientists needed $3 billion and 13 years to sequence the three billion base pairs encoded in a single human genome—the first time. By 2011, eight years after that first project was completed, the cost of sequencing a human genome had fallen to $5,000, in a process that took just a few weeks. And in January, Jonathan Rothberg, a chemical engineer and the founder of the biotech company Ion Torrent, unveiled an approach that is faster and cheaper still. He says his machine will be able to sequence a human genome, some 3.2 gigabytes’ worth of data, in two hours for just $1,000. Now thousands, and soon enough millions, of patients will have their genetic makeup laid bare, which presents an entirely new problem: How to analyze all that information?
In 2004, Rothberg introduced the first sequencing machine that could perform millions of chemical reactions on a fiber-optic array. But in 2010, he replaced the fiber-optic array with a semiconductor chip. In a powerful application of Moore's law, which states that the number of transistors on a microchip doubles about every two years, the number of wells Rothberg has been able to fit on his chips has grown rapidly. The more wells he can squeeze onto a chip, the greater its performance and the cheaper the cost of sequencing. As if to prove the point, Rothberg sequenced Intel co-founder Gordon Moore's genome last year on a silicon semiconductor chip with 1.2 million microwells. His new machine's first chip, the Ion Proton I, has 165 million microwells. And Rothberg says he will release the Proton II, a chip with four times as many wells, later this year. The Proton II will make two-hour, full-genome sequencing possible.
Despite the falling cost of sequencing, personalized genomic medicine has thus far been used very selectively. A sequence from a single patient often isn’t enough to pinpoint the genes responsible for a disease, so even relatively cheap sequencing can quickly become prohibitively expensive. “If you knew you could find the answer in one patient, $1,000 versus $5,000 might not be a deciding factor,” says Richard Lifton, the chairman of the department of genetics at Yale School of Medicine. “But if you think you might need to study 20 patients with similar diseases, then you’re talking about $20,000 versus $100,000.” Lifton says he will use four of Rothberg’s Proton machines to help locate the genetic abnormalities that cause mysterious diseases in his patients.
Richard Gibbs, who runs the Human Genome Sequencing Center at Baylor College of Medicine, says he will use an Ion Torrent sequencer to investigate the genetic basis of Mendelian disorders, diseases caused by single-gene mutations, which afflict 25 million Americans. Last year, Gibbs was part of a team that sequenced the complete genomes of Noah and Alexis Beery, 14-year-old twins who were diagnosed at age five with a rare movement disorder caused by a defect in how their bodies process dopamine. Alexis had trouble breathing because of spasms in her larynx. By examining the twins' genes and comparing them with those of their older brother, parents and grandparents, the team found that the siblings were also deficient in serotonin, allowing doctors to adjust their medication and normalize Alexis's breathing.
And sequencing will only get cheaper. At IBM, researchers are at work on a $100 sequencer, a chip that could read bases as DNA fragments flow through nanometer-wide holes on its surface. When genome sequencing begins reaching millions of patients, it will help address the most common problems in medicine. St. Jude Children’s Research Hospital in Memphis, Tennessee, is now sequencing the DNA in cancer cells in pediatric patients to identify the gene mutations that lead to childhood cancers. Jay Shendure, a professor of genetics at the University of Washington, says that in 10 years sequencing will be routine. One’s genome could appear alongside other standard medical information, like blood pressure.
The day when a genome is seamlessly incorporated into everyone’s medical information will not arrive as quickly as $1,000 sequencing. After all, medicine isn’t governed by Moore’s law. Soon the price of sequencing will fall below the price of storing the data it generates. Two companies, GenomeQuest and DNANexus, now host genomes on the cloud for scientists and doctors to access. Doctors will need to be trained to apply genomic information to standard medical practice; the National Human Genome Research Institute has awarded more than $80 million for this purpose. “It’s not a system that moves very quickly,” Shendure says, “but it will happen.”
Decoding the Double Helix: The y-axis is the number of incorporated base pairs per well. The x-axis is the "flow" or well number, with corresponding DNA base pairs. Courtesy Life Technologies
DECODING THE DOUBLE HELIX
To determine the order of nucleotide bases in a genome—the As, Gs, Cs and Ts in our DNA—scientists attach single strands of DNA fragments to the surface of micron-wide beads. The beads are centrifuged into microwells on the surface of an Ion Proton chip. Technicians place the chip inside a machine, where it is flooded with one of the four nucleotides at a time. The machine looks for nucleotide matchups, building a complementary strand of the patient’s double helix.
When a matchup occurs, a positively charged hydrogen ion is released. A metal sensor under each well registers the increased charge, and transistors beneath the well convert the charge into a voltage. Software determines which base was incorporated, and from the resulting chart [above] the fragments are reassembled into a whole genome.
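For readers who want the flow-and-detect logic spelled out, here is a minimal Python sketch of the idea. The flow order, the one-ion-per-base signal and the base-calling step are simplifying assumptions chosen for illustration, not Ion Torrent's actual pipeline.

```python
# Toy model of ion-semiconductor sequencing for a single microwell.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}
FLOW_ORDER = "TACG"  # nucleotides flooded over the chip, one at a time, in cycles

def sequence_template(template, n_flows=32):
    """Simulate per-flow hydrogen-ion signals, then call bases from them."""
    pos, flowgram, called = 0, [], []
    for flow in range(n_flows):
        nucleotide = FLOW_ORDER[flow % len(FLOW_ORDER)]
        incorporated = 0
        # Each matching template base incorporates one nucleotide into the
        # complementary strand and releases one hydrogen ion; runs of the
        # same base give proportionally larger signals in a single flow.
        while pos < len(template) and COMPLEMENT[template[pos]] == nucleotide:
            incorporated += 1
            pos += 1
        flowgram.append((nucleotide, incorporated))  # the measured signal
        called.append(nucleotide * incorporated)     # the base call
    return flowgram, "".join(called)

flowgram, read = sequence_template("TTGCAAC")
print(read)  # "AACGTTG", the complement of the template, read off the chip
```

Because every base in a run incorporates during a single flow, the signal amplitude, not a separate detection event, encodes the run length, which is why the chart plots incorporated bases per well against flow number.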
Two years ago, Nokia exec Anssi Vanjoki claimed that phones would be making DSLRs obsolete sooner rather than later. That's kind of crazy, of course, but it seems like Nokia is still actually working on making it happen. Its new 808 phone has a 1/1.2-inch sensor with a total of 41 megapixels on board. Yes, 41 megapixels.
It sounds a bit outlandish. In fact, some people were upset with Nikon for putting 36 megapixels on a full-frame sensor in its new D800. But Nokia's technology is a bit different from the brute-force megapixel attack of days past. The 808 can take photos at up to 38 megapixels, checking in at 7728 x 5354, but it doesn't seem like Nokia intends for you to do that all too often. The intention is to combine those pixels to make better 5- or 8-megapixel images.
According to reports, the PureView technology has been in the works for more than five years, which helps explain why the new phone runs the now-ancient Symbian OS. The phone does what's called "oversampling," which combines pixels on the sensor in order to make clearer images at smaller resolutions. It's not an entirely new concept, but it makes for an impressive headline.
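In its simplest form, oversampling is just block averaging. Below is a hedged sketch of the idea; the 3 x 3 binning factor, grey level and noise figure are made up for illustration, not Nokia's actual processing.

```python
import numpy as np

def bin_pixels(image, factor):
    """Average factor x factor blocks of sensor pixels into one output pixel."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor  # trim so dimensions divide evenly
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Simulated noisy ~38-megapixel frame: uniform grey plus random sensor noise.
sensor = np.random.normal(loc=128, scale=20, size=(5354, 7728))
small = bin_pixels(sensor, 3)  # ~38 MP down to ~4.6 MP
print(small.shape, round(sensor.std(), 1), round(small.std(), 1))
```

Averaging nine noisy pixels into one cuts the random noise by roughly a factor of three, which is the trade being made here: resolution the sensor has in abundance, exchanged for cleaner 5- or 8-megapixel output.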
Part of the driving force behind the development was the fact that fitting optical zoom lenses into a phone is difficult, if not impossible. So the phone uses its megapixel power to let you "zoom" up to 3x without a loss of image quality, or so Nokia claims. That "zoom" power gets pumped up to 4x in video mode, since full HD video only requires 1920 x 1080 resolution. When you zoom, you're just selecting which area of the sensor to use to capture the image.
This is different from typical digital zoom because Nokia has limited the zoom range to prevent upscaling: the camera won't zoom past the point where the final image resolution would exceed that of the cropped sensor region.
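The claimed zoom figures fall straight out of the pixel counts. A quick sketch, assuming a 5-megapixel output frame of 2592 x 1944 (a common 5 MP resolution; the exact figure is our assumption, not Nokia's spec):

```python
def max_lossless_zoom(sensor_width_px, output_width_px):
    """Cropping stays 'lossless' while the cropped sensor region is still
    at least as wide, in pixels, as the output frame."""
    return sensor_width_px / output_width_px

print(f"{max_lossless_zoom(7728, 2592):.1f}x")  # ~3.0x for a 5 MP still
print(f"{max_lossless_zoom(7728, 1920):.1f}x")  # ~4.0x for 1080p video
```

Both numbers line up with Nokia's claimed 3x stills zoom and 4x video zoom.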
The sensor isn't the only piece of fancy photography kit in the 808. The lens is a Carl Zeiss with a maximum aperture of f/2.4 and a full-frame-equivalent focal length of 26mm or 28mm, depending on the aspect ratio you choose. Plus, the flash acts like a real camera flash, rather than the janky flashlight found on some other smartphones.
In the end, it's an interesting experiment. The sensor is still very small considering the resolution. It's more in line with what you'd find in a typical compact, which still makes it far smaller than anything you'll find in a DSLR or even a Micro Four Thirds camera.
If you dig, you can find a couple of sample images online, but we're always wary of trusting those. We'll be interested to get our hands on one of these at some point. Unfortunately, that will likely be a ways off, as it's only being officially released in Europe for the moment, where it will cost 450 euros. But I wouldn't be at all surprised to see it showing up on a Windows Phone here in the States before long.
Study finds variants that influence arsenic metabolism and increase risk for skin lesions
One of the first large-scale genomic studies conducted in a developing country has discovered genetic variants that elevate the risk for skin lesions in people chronically exposed to arsenic. Genetic changes found near the gene for the enzyme that metabolizes the chemical into a less toxic form can significantly increase an individual's risk for developing arsenic-related disease.
The discovery could point the way to new screening and intervention options for people who are exposed to groundwater with high levels of arsenic, according to the investigators at the University of Chicago Medicine, Columbia University’s Mailman School of Public Health, and in Bangladesh. The study is published in PLoS Genetics.
The group's genome-wide association study, or GWAS, was conducted in nearly 3,000 individuals exposed to arsenic for decades in Bangladesh. Since the widespread installation of hand-pumped wells to tap groundwater sources in the 1970s, as many as 77 million people – about half the population of Bangladesh – have been accidentally exposed to dangerous levels of arsenic. The World Health Organization calls the exposure "the largest mass poisoning of a population in history."
For more than a decade, the scientists have studied the epidemiology of arsenic-related disease, such as skin lesions, diabetes, and respiratory illnesses, in this population, as well as the effectiveness of interventions to prevent toxicity. In the new study, the researchers sought genetic answers for why some individuals appear to be at higher risk for developing disease after arsenic exposure.
“These results add clarity to the genetic architecture that is playing a role in arsenic toxicity and its underlying biology,” said senior author Habibul Ahsan, MD, MMedSc, Louis Block Professor of health studies, medicine and human genetics at the University of Chicago Medicine. “It’s a rare type of study for a major problem affecting millions of people around the world, and it opens up opportunities for genetic studies of other major public health problems in developing countries.”
The researchers genotyped thousands of arsenic-exposed individuals from the group’s main studies for single nucleotide polymorphisms (SNPs) throughout the genome, and looked for associations with arsenic metabolite levels and risk of skin lesions.
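At its core, each of those genome-wide tests is a simple association between genotype counts and a phenotype. Here is a minimal sketch with invented numbers; the study's actual analysis, which also covers quantitative arsenic metabolite levels, is considerably more involved.

```python
# Single-SNP association test of the kind a GWAS runs at every marker.
from scipy.stats import chi2_contingency

# Genotype counts (major/major, major/minor, minor/minor) among participants
# with and without skin lesions -- made-up numbers for illustration.
observed = [
    [520, 310, 70],   # no skin lesions
    [380, 340, 110],  # skin lesions
]

chi2, p_value, dof, _ = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, p = {p_value:.2e}")

# Because the test is repeated at hundreds of thousands of SNPs, GWAS use a
# stringent genome-wide significance threshold, conventionally about 5e-8.
verdict = "significant" if p_value < 5e-8 else "not significant"
print(f"genome-wide {verdict} at the conventional 5e-8 threshold")
```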
The genetic findings provide strong evidence that efficient metabolism of arsenic through methylation protects against the toxin. Compounds that boost methylation, such as folic acid, could reduce arsenic toxicity – a strategy currently being tested by co-author Mary Gamble, PhD, associate professor of Environmental Health Sciences at Columbia University’s Mailman School of Public Health.
"If we could somehow find a way to do that in Bangladesh, it would make individuals much better methylators of arsenic, and, as this current study shows, if you're a better methylator, you're at a lower risk for disease," said co-author Joseph Graziano, PhD, professor of Environmental Health Sciences and director of the Superfund Research Program at Columbia University's Mailman School of Public Health.
Beyond the clinical applications, the current study demonstrates that large-scale genomic studies are possible in a largely rural population of a developing country. The study offers a rare example of a GWAS result with clear, immediate potential for translational impact.
_________
The research was supported by the National Institute of Environmental Health Sciences and the National Cancer Institute.
Source: Columbia University’s Mailman School of Public Health
Researchers at EPFL have identified an important mechanism that could lead to the design of more effective cancer vaccines. Their discovery of a newfound role of the lymphatic system in tumour growth shows how tumours use a patient's own immune system to evade detection.
A VEGF-C-expressing (lymphangiogenic) tumour. Credit: EPFL
Tumour cells present antigens, or protein markers, on their surfaces, which make them identifiable to the host immune system.
In the last decade, cancer vaccines have been designed that work by exposing the patient's immune cells to tumour-associated antigens and so priming them to kill cells that present those antigens. These have caused much excitement, not least because by acting so specifically on cancer cells, they could potentially eliminate the unpleasant side effects of chemo- and radiotherapy.
Like soldiers protecting a fort
However, clinical trials of such vaccines have had a very low success rate to date, mainly because tumours have various mechanisms for evading detection by immune cells, even when those immune cells – called T cells – have been primed to seek them out. Those mechanisms are, in general, poorly understood. But in a paper to be published this week in Cell Reports, the laboratories of Melody Swartz at EPFL and Stéphanie Hugues at UNIGE provide a key insight into one of them. They describe for the first time how, like soldiers protecting a fort, lymph vessels surrounding a tumour ward off T cell attack.
Plenty of research has shown that tumours can induce the growth of lymph vessels in their vicinity, and that this growth is correlated with metastasis and poor prognosis. It was assumed that these lymph vessels simply provided an escape route for cancer cells, transporting them to distant sites. In the new study, led by postdoc Amanda Lund, the researchers show that lymph vessels actually suppress the immune response, deleting the attacking T cells or leaving them “functionally exhausted” by the time they reach the tumour.
They studied a type of tumour that expresses large amounts of VEGF-C, a molecule that is naturally expressed in humans and that stimulates lymphatic growth. Having engineered the tumour cells to express a foreign antigen, they compared the efficacy of a vaccine designed to prime T cells to kill cells carrying that antigen, either when VEGF-C was present or when its activity was blocked. With VEGF-C suppressed, the vaccine's efficacy increased and tumour growth slowed fourfold.
A weakness in cancer's defence system exploited
The researchers went on to show that the endothelial cells which line lymph vessels “scavenge” tumour-specific antigens and present them to the tumour-specific T cells in a suppressive manner. This, in turn, promotes the local deletion of those T cells. According to Prof Swartz, that means that first targeting the lymph vessels associated with a tumour could, in theory, significantly increase the efficacy of existing cancer vaccines. “It would be like removing the soldiers from around the fort before sending in your opposing army,” she says. “If you disable the lymph vessels’ suppressive functions, our data suggest that tumour-killing T cells would do their job a lot more effectively.” Future clinical trials are needed to put that theory to the test.
________
Title: VEGF-C Promotes Immune Tolerance in B16 Melanomas and Cross-Presentation of Tumor Antigen by Lymph Node Lymphatics
Authors: Amanda W. Lund, Fernanda V. Duraes, Sachiko Hirosue, Vidya R. Raghavan, Chiara Nembrini, Susan N. Thomas, Amine Issa, Stéphanie Hugues and Melody A. Swartz
Affiliations: Institute of Bioengineering (IBI) and Swiss Institute for Experimental Cancer Research (ISREC), School of Life Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne 1015, Switzerland; Department of Pathology and Immunology, University of Geneva Medical School, Centre Médical Universitaire (CMU), Geneva 1211, Switzerland
From 1942, the Nazi leader Adolf Hitler received daily injections of methamphetamine from his personal physician, Dr Theodor Morell. The Führer was also familiar with cocaine. Hitler's ailments have been attributed to everything from tertiary syphilis to Parkinson's disease, but many of his clinical signs and symptoms may have been caused by his exotic drug regimen.
In Hitler's Wehrmacht, methamphetamine tablets branded as Pervitin were liberally distributed to German fighting troops throughout the War. Amphetamines are "power drugs" that reduce fatigue, heighten aggression, and diminish human warmth and empathy.
How could Hitler continue to exert such a grip on the German people until the last days of the War? Talking to a prison psychologist while awaiting trial, the ex-Governor General of Poland Hans Frank (1900-1946) described Hitler's charismatic effect on him:
"I can hardly understand it myself. There must be some basic evil in me. In all men. Mass hypnosis? Hitler cultivated this evil in man. When I saw him in that movie in court, I was swept along again for a moment, in spite of myself. Funny, one sits in court feeling guilt and shame. Then Hitler appears on the screen and you want to stretch out your hand to him . . . . It's not with horns on his head or with a forked tail that the devil comes to us, you know. He comes with a captivating smile, spouting idealistic sentiments, winning one's loyalty. We cannot say that Adolf Hitler violated the German people. He seduced us."
In a look at how major stressors during childhood can change a person's biological risk for psychiatric disorders, researchers at Butler Hospital have discovered a genetic alteration at the root of the association. The research, published online in PLoS ONE on January 25, 2012, suggests that childhood adversity may lead to epigenetic changes in the human glucocorticoid receptor gene, an important regulator of the biological stress response that may increase risk for psychiatric disorders.
The association between childhood adversity, including parental loss and childhood maltreatment, and risk for psychiatric disorders such as depression and anxiety has been established in multiple studies. However, researchers have yet to define how and why this association exists in humans. "We need to understand the biology of this effect in order to develop better treatment and prevention programs," said Audrey Tyrka, MD, PhD, director of the Laboratory for Clinical and Translational Neuroscience at Butler Hospital and associate professor of Psychiatry and Human Behavior at Brown University. "Our research group turned to the field of epigenetics to determine how environmental conditions in childhood can influence the biological stress response."
Epigenetics is the study of changes to the genome that do not alter the DNA sequence, but influence whether genes will be expressed, or "turned on," versus whether they will be silenced. Knowing that the connection between childhood maltreatment and psychiatric disorders has been linked to the hormone system that coordinates biological stress responses, the researchers sought to identify the root cause at a genetic level.
The glucocorticoid receptor is an important regulator of the stress response, and methylation is a particularly stable type of epigenetic modification. "We knew that epigenetic changes to this gene could be affected by childhood parenting experiences because previous animal research showed that rodents with low levels of maternal care had increased methylation of this gene, and consequently, as adults these animals had greater stress sensitivity and fear in stressful situations," said Tyrka.
The researchers looked at 99 healthy adults, some of whom had a history of parental loss or childhood maltreatment. DNA extracted from a blood sample taken from each participant was analyzed to identify epigenetic changes to the glucocorticoid receptor gene. The researchers then performed a standardized hormone provocation test to measure the stress hormone cortisol.
The researchers found that adults with a history of childhood adversity—maltreatment or parental loss—had increased methylation of the glucocorticoid receptor (GR) gene, which is thought to change the way this gene is expressed on a long-term basis. They also found that greater methylation was linked to blunted cortisol responses to the hormone provocation test. "Our results suggest that exposure to stressful experiences during childhood may actually alter the programming of an individual's genome. This concept may have broad public health implications, as it could be a mechanism for the association of childhood trauma with poor health outcomes, including psychiatric disorders as well as medical conditions such as cardiovascular disease," said Tyrka.
In early studies of animals, researchers have identified drugs that can reverse methylation effects. "More research is needed to better understand the epigenetic mechanism behind this association," said Tyrka, noting a larger scale study currently underway at Butler and a study of this association in children. "This line of research may allow us to better understand who is most at risk and why, and may allow for the development of treatments that could reverse epigenetic effects of childhood adversity."
An inflated cortical surface of the human brain reconstructed from MRI scans and viewed from the front. Areas of the prefrontal cortex where increased grey matter volume correlated with greater metacognitive ability are shown in hot colours. Credit: Dr Steve Fleming.
At New York University, Sir Henry Wellcome Postdoctoral Fellow Dr Steve Fleming is exploring the neural basis of metacognition: how we think about thinking and assess the accuracy of our decisions, judgements and other mental performance.
Metacognition is an important-sounding word for a very everyday process. We are 'metacognitive' when reflecting on our thinking process and knowledge.
We do it on a moment-to-moment basis, according to Dr Steve Fleming at New York University. "We reflect on our thoughts, feelings, judgements and decisions, assessing their accuracy and validity all day long," he says.
This kind of introspection is crucial for making good decisions. Do I really want that bar of chocolate? Do I want to go out tonight? Will I enjoy myself? Am I aiming at the right target? Is my aim accurate? Will I hit it? How sure am I that I'm right? Is that really the correct answer?
If we ask ourselves these questions as a kind of faint, ongoing, almost intuitive commentary in the back of our minds, we will progress smoothly through life.
Although we all do it, we're not equally good at it. An example Steve likes to use is the gameshow 'Who Wants to be a Millionaire?' When asked the killer question, 'Is that your final answer?', contestants with good metacognitive skills will assess how confident they are in their knowledge.
If sure (I know that I know), they'll answer 'yes'. If unsure (I don't know that I know), they'll phone a friend or ask the audience. Contestants who are less metacognitively gifted may have too much confidence in their knowledge and give the wrong answer - or have too little faith and waste their lifelines.
Metacognition is also fundamental to our sense of self: to know who we are. Perhaps we only really know anyone when we understand how and what they think - and the same applies to learning ourselves. How reliable are our thought processes? Are they an accurate reflection of reality? How real is our knowledge of a particular subject?
Last year, Steve won a prestigious Sir Henry Wellcome Postdoctoral Fellowship to explore the neural basis of metacognitive behaviour: what happens in the brain when we think about our thoughts and decisions or assess how well we know something?
Killer questions
One of the challenges for neuroscientists interested in metacognition has been the fact that - unlike in learning or decision-making, where we can measure how much a person improves at a task or how accurate their decision is - there are no outward indicators of introspective thought, so it's hard to quantify.
As part of his PhD at University College London, Steve joined a research team led by Wellcome Trust Senior Fellow Professor Geraint Rees and helped devise an experiment that could objectively measure a person's performance on a task and how accurately they judged their own performance.
Thirty-two volunteers were asked to look at a series of similar black and grey pictures on a screen and say which one contained a brighter patch.
"We adjusted the brightness or contrast of the patches so that everyone was performing at a similar level," says Steve. "And we made it difficult to see which patch was brighter, so no one was entirely sure whether their answer was correct; they were all in a similar zone of uncertainty."
They then asked the 'killer' metacognitive question: How sure are you of your answer, on a scale from one to six?
Comparing people's answers to their actual performance revealed that although all the volunteers performed equally well on the primary task of identifying the brighter patches, there was a lot of variation between individuals regarding how accurately they assessed their own performance - or how well they knew their own minds.
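Fleming and colleagues quantified this with a type-2 ROC analysis. The sketch below is a simplified version of that idea with toy data: how well do a volunteer's one-to-six confidence ratings separate their own correct trials from their incorrect ones?

```python
def type2_auroc(confidences, corrects):
    """Probability that a randomly chosen correct trial received a higher
    confidence rating than a randomly chosen incorrect trial (ties count
    half). 1.0 means perfect introspection; 0.5 means guessing."""
    hits = [c for c, ok in zip(confidences, corrects) if ok]
    misses = [c for c, ok in zip(confidences, corrects) if not ok]
    score = sum(1.0 if h > m else 0.5 if h == m else 0.0
                for h in hits for m in misses)
    return score / (len(hits) * len(misses))

# Toy data: one volunteer's confidence ratings and answer correctness.
conf = [6, 5, 2, 4, 1, 3, 5, 2]
correct = [True, True, True, False, False, False, True, False]
print(type2_auroc(conf, correct))  # 0.84375: good but imperfect self-knowledge
```

Two volunteers can score identically on the patches yet differ sharply on this measure, which is exactly the variation the MRI scans were then used to explain.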
Magnetic resonance imaging (MRI) scans of the volunteers' brains further revealed that those who most accurately assessed their own performance had more grey matter (the tissue containing the cell bodies of our neurons) in a part of the brain located at the very front, called the anterior prefrontal cortex. In addition, a white-matter tract (a pathway enabling brain regions to communicate) connected to the prefrontal cortex showed greater integrity in individuals with better metacognitive accuracy.
The findings, published in Science in September 2010, linked the complex high-level process of metacognition to a small part of the brain. The study was the first to show that physical brain differences between people are linked to their level of self-awareness or metacognition.
Intriguingly, the anterior prefrontal cortex is also one of the few parts of the brain with anatomical properties unique to humans and fundamentally different from our closest relatives, the great apes. It seems introspection might be unique to humans.
"At this stage, we don't know whether this area develops as we get better at reflecting on our thoughts, or whether people are better at introspection if their prefrontal cortex is more developed in the first place," says Steve.
I believe I do
Although this research, and research from other labs, points to candidate brain regions or networks for metacognition located in the prefrontal cortex, the field still needs to explain why they are involved. Steve plans to use his fellowship to address that question by investigating the neural mechanisms that generate metacognitive reports.
He's approaching the question by attempting to separate the different kinds of information (or variables) people use to monitor their mental and physical performance.
He cites playing a tennis shot as an example. "If I ask you whether you just played a good tennis shot, you can introspect whether you aimed correctly and how well you carried out your shot. These two variables might combine to make up your overall confidence in the shot."
To evaluate how confident we are in each variable (aim and shot) we need to weigh up different sets of perceptual information. To assess our aim, we consider the speed and direction of the ball and the position of our opponent across the net. To judge how well we carried out the actual shot, we would think about the position of our feet and hips, how we pivoted, and how we swung and followed through.
There may have been some discrepancy between the shot we wanted to achieve and the shot we actually made. This is a crucial distinction for scientists exploring decision-making. "Psychologists tend to think of beliefs, 'what I should do', as separate from actions," explains Steve.
"When choosing between two chocolate bars, you might decide on a Mars bar - that's what you believe you should have, what you want and value. But when you actually carry out the action of reaching for a bar, you might reach for a Twix instead. There's sometimes a difference between what you should do and what you actually do, and that's a crucial distinction for metacognition. My initial experiments are going to try to tease apart these variables."
Research into decision-making has identified specific brain regions where beliefs about one choice option (one chocolate bar or one tennis shot) being preferable to another are encoded. However, Steve says, "what we don't know is how this type of information [about values and beliefs] relates to metacognition about your decision making. How does the brain allow humans to reflect on its computations?"
He aims to connect the finely detailed picture of decision-making given to us by neuroscience to the vague notion we have of self-reflection or metacognition.
New York, New York
Steve is working with researchers at New York University who are leaders in task design and building models of decision making, "trying to implement in a laboratory setting exactly the kind of question we might ask the tennis player."
They are designing a perceptual task, in which people will have to choose a target to hit based on whether a patch of dots is moving to the left or right. In other words, people need to decide which target they should hit (based on their belief about its direction of motion), and then they have to hit it accurately (action).
"We can use various techniques to manipulate the task's difficulty. If we make the target very small, people will obviously be more uncertain about whether they will be able to hit it. So we can separately manipulate the difficulty of deciding what you should do, and the difficulty of actually doing it."
Once the task is up and running, they will ask the volunteers to make confidence judgements - or even bets - about various aspects of their performance: how likely they thought they chose the right target, or hit it correctly. Comparing their answers with their actual performance will objectively measure the accuracy of their beliefs (metacognition) about their performance.
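That comparison can be as simple as binning trials by stated confidence and checking the actual hit rate in each bin. A hedged sketch with toy numbers (the scoring rules of the NYU task itself aren't described here):

```python
from collections import defaultdict

# (stated probability that the shot will hit, whether it actually hit)
trials = [
    (0.9, True), (0.9, True), (0.9, False),
    (0.6, True), (0.6, False),
    (0.3, False), (0.3, False), (0.3, True),
]

by_confidence = defaultdict(list)
for stated, hit in trials:
    by_confidence[stated].append(hit)

# Well-calibrated volunteers report probabilities close to their real hit
# rates; overconfident ones sit above them, underconfident ones below.
for stated in sorted(by_confidence, reverse=True):
    outcomes = by_confidence[stated]
    print(f"stated {stated:.0%} -> actual {sum(outcomes) / len(outcomes):.0%}")
```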
Drilling down
Such a task will mean Steve and his colleagues can start to decouple the perceptual information that gives people information about what they should do (which target to hit) from the perceptual data that enables them to assess the difficulty of actually carrying out the action (hitting the target).
And that will make it possible to start uncoupling various aspects of metacognition - about beliefs and about actions or responses - from one another. "I want to drill down into the basics, the variables that come together to make up metacognition, and ask the question: how fine-grained is introspection?"
He'll then use various neuroscience techniques, including brain scanning and intervention techniques such as transcranial magnetic stimulation (to briefly switch off metacognitive activity in the brain), to understand how different brain regions encode information relevant for metacognition. "Armed with our new task, we can ask questions such as: is belief- and action-related information encoded separately in the brain? Is the prefrontal cortex integrating metacognitive information? How does this integration occur? Answers to these questions will allow us to start understanding how the system works."
Since metacognition is so fundamental to making successful decisions and knowing ourselves, it's important to understand more about it. Steve's research may also have practical uses in the clinic. Metacognition is linked to the concept of 'insight', which in psychiatry refers to whether someone is aware of having a particular disorder. As many as 50 per cent of patients with schizophrenia have profoundly impaired insight and, unsurprisingly, this is a good indicator of whether they will fail to take their medication.
"If we have a nice task to study metacognition in healthy individuals that can quantify the different components of awareness of beliefs, and awareness of responses and actions, we hope to translate that task into patient populations to understand the deficits of metacognition they might have." With that in mind, Steve plans to collaborate with researchers at the University of Oxford and the Institute of Psychiatry in London when he returns to finish his fellowship in the UK.
The science of metacognition also has implications for concepts of responsibility and self-control. Our society places great weight on self-awareness: think of a time when you excused your behaviour with 'I just wasn't thinking'. Therefore, understanding the boundaries of self-reflection is central to how we ascribe blame and punishment, approach psychiatric disorders, and view human nature.
More information: Fleming SM et al. Relating introspective accuracy to individual differences in brain structure. Science 2010;329(5998):1541-3.
Provided by Wellcome Trust
"Metacognition: I know (or don't know) that I know." February 27th, 2012. http://medicalxpress.com/news/2012-02-metacognition-dont.html