
Wednesday, October 5, 2011

Robot Culture Machine Efficiently Grows Biological Cells Without Human Intervention



Robotic Cell Factory: This robotic cell factory can churn out 500 cell cultures a month. © Fraunhofer IPM
The tedious, carpal-tunnel-inducing pipette work of cell biologists may soon be relegated to robots, thanks to a new cell factory developed in Germany. This could free humans to perform new studies and ask new questions, as automated equipment takes over the time-consuming task of growing, feeding and observing cells in the lab.
Cell cultures are one of the most important tools in biology, used to study a vast range of diseases and cellular functions. But cells are delicate, and for now they must be cultivated by hand, grown in petri dishes and nurtured with a special broth until there are enough cells to transfer to even more petri dishes. The transfer is done via pipette so the cells aren’t harmed.

Many robots aren’t equipped with a gentle enough touch to pull this off, and the humid, warm conditions cells need to grow are not very friendly to electronics. But now, researchers at three different Fraunhofer Institutes have developed a system that can automate this entire process, using several different robots and machines.
One robot moves the first-generation cell cultures, grown in microtiter plates, among the various stations. An automated microscope then checks the cells to assess their growth, adjusting the light and focus as needed, and the images are fed into a computer system. Special software determines how many cell colonies are present on the plates, and if there are enough, another robot is tasked with picking them up. Using a hollow needle, it selects colonies measuring between 100 and 200 micrometers and transfers them to a new container for continued growth.
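
To make that colony-picking step concrete, here is a rough sketch of what the selection logic might look like in software. This is purely illustrative and is not the Fraunhofer code; the thresholding approach, library calls and the 100-to-200-micrometer filter are assumptions based only on the description above.

```python
# Illustrative sketch only -- not the Fraunhofer system's software.
# Assumes a grayscale microscope image and a known physical pixel size.
from skimage import filters, measure

def find_pickable_colonies(image, um_per_pixel, min_um=100, max_um=200):
    """Count colonies and return centroids of those 100-200 micrometers across."""
    mask = image > filters.threshold_otsu(image)   # separate colonies from background
    labels = measure.label(mask)                   # give each colony its own label
    pickable = []
    for region in measure.regionprops(labels):
        diameter_um = region.equivalent_diameter * um_per_pixel
        if min_um <= diameter_um <= max_um:
            pickable.append(region.centroid)       # (row, col) targets for the picking robot
    total_colonies = labels.max()                  # background carries label 0
    return total_colonies, pickable
```

In a setup like this, the caller would compare the colony count against a minimum before dispatching the picking robot, mirroring the "if there are enough" decision described above.
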
The system can produce about 500 cell cultures a month, according to a news release from Fraunhofer. Biologists can even train the system to recognize certain cell types, based on their physical characteristics. The whole thing is big enough to fill a small lab.
Fraunhofer already has a cell factory of a different sort, producing sheets of human skin. That process is also controlled by robots and computers that monitor the cells’ health and growth. But this new one is based on a modular design, so it can be adapted for various uses — for instance, if a lab only wants to automate one part of the cell culture process.
Researchers set up a prototype at the Max Planck Institute, where biologists will use it to determine protein functions, according to Fraunhofer Research News. Let's hope nothing goes wrong and the robots do not use their new skills to create a new legion of multicellular servants.

Underwater Nano-Mirage Effect Enables On-Demand Invisibility



Cloaking via Mirage
Get ready to witness some James Bond-esque, HALO-style active camouflage action. Researchers at the University of Texas at Dallas have cleverly tapped the unique characteristics of carbon nanotubes and the light-bending weirdness of the mirage effect to create a kind of invisibility cloak that can be turned on and off at the flip of a switch.
Though not quite ready to be integrated into an Aston Martin (it works best underwater, actually), it is a pretty neat trick, and it could someday have a range of applications outside the lab. The cloaking capability is rooted in the mirage effect, the same phenomenon that occurs when temperature varies sharply over a short distance. That variation in temperature causes light rays to bend toward the viewer’s eye instead of bouncing off objects and reaching the eye along their usual straight paths.

That’s why people tend to see false pools of water in the desert. They are actually seeing the sky on the desert floor--light from the sky bends as it nears the heated ground and heads directly toward the viewer, therefore appearing as a sheen of blue coming from the ground. From there, the brain does the rest, seeing water rather than sky because that makes a lot more sense.
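
For readers who want the underlying optics in one line: in a medium whose refractive index n varies with position (here, because temperature varies), light rays follow the standard gradient-index ray equation

$$\frac{d}{ds}\!\left(n\,\frac{d\mathbf{r}}{ds}\right)=\nabla n,$$

where r is the position along the ray and s the distance traveled. It says rays curve toward regions of higher refractive index; hot air (or hot water) has a lower n than the cooler medium around it, so light bends away from the heated layer rather than passing straight through it. This equation isn't quoted in the article; it is simply the textbook form of the effect being described.
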
The same effect is happening in the video below. The device you see uses highly conductive carbon nanotubes--one-atom-thick sheets of carbon rolled into cylinders--pressed into a transparent sheet, and it can rapidly heat the fluid around it (in this case, water), causing a mirage effect that conceals the object on the other side. And it does so nearly instantaneously.
Yeah. That’s cool.

Tiny Cilia Inside Corpses' Noses Could Be a More Reliable Indicator of Time of Death



The Body May Expire, But the Nasal Cilia Continue On. Wikimedia
Despite how easy they make it look on TV dramas, determining time of death for a body requires a lot of difficult guesswork (unless someone is there when the person passes, of course). A range of environmental factors and other mitigating circumstances make any declaration of time of death an estimation at best. But a team of Italian scientists think they’ve found a built-in clock in the human nasal cavity that ticks off the minutes after a body expires, and it could make estimating the time of death a more precise exercise.
There are several ways for forensic examiners to roughly gauge time of death--decomposition rate, the state of rigor mortis, body temperature--but the specific circumstances of death can often influence those indicators, introducing variables that are difficult to account for.

But researchers at the University of Bari in Italy theorized that nasal cilia--small finger-like projections in the nose that help direct mucus, bacteria, and dust out of the nose--continue to pulsate after death. To test their hypothesis, the team took samples from 100 cadavers shortly after death to examine the characteristics of the cilia postmortem.
They found that the cilia do indeed continue beating up to 20 hours after death and that the beating slows at a predictable and consistent rate during that time, regardless of environmental factors. That means forensics teams and doctors could use the rate at which a person’s cilia are beating to make determining the time of death less of an art, and more of a science.
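
The report doesn't give the team's actual calibration curve, but the logic of reading such a clock is simple to illustrate. Suppose, purely hypothetically, that the beat rate fell linearly from its value at death, f0, to zero at 20 hours. A measured rate f would then imply a postmortem interval of roughly

$$t \;\approx\; 20\ \text{h}\times\left(1-\frac{f}{f_0}\right).$$

The Bari team's real method would substitute whatever decay curve their 100-sample study established, but the inference runs the same way: measure how much the cilia have slowed, and read the clock backwards.
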

Apple's iPhone 4S: Faster, and a Better Listener, But the Same iPhone You Know and Love



The New iPhone Family. This Is My Next
Today in Cupertino, Apple announced the newest version of its bajillion-selling iPhone, to be named the iPhone 4S. Like the iPhone 3GS, this is a small, mostly internal upgrade over its predecessor--a new dual-core processor here, an improved camera there--though there is a major addition in the form of Siri, a voice-command service Apple bought a while back that allows you to ask your phone questions, or tell it to do things, in natural language. Lots of things.
So, what's new in the iPhone world? The new iPhone 4S is in the same case as last year's iPhone 4, so it's the same size and weight. There's no external change that we know of--all of the new goodies are on the inside. Hardware-wise, it'll be using the A5 dual-core processor that serves the iPad 2 so ably, and it's also getting a brand-new camera sensor. The old iPhone 4 camera was actually quite good, but the new one seems even better: it gets a resolution bump to 8 megapixels, and the new sensor is backside-illuminated, which Apple claims will allow it to gather 73% more light than the iPhone 4's sensor. It'll also have an f/2.4 lens, and will take pictures 33% faster.
Otherwise, not too much going on in the hardware--Apple's implemented a new antenna design that'll hopefully eliminate that whole "grip of death" problem the iPhone 4 has, and the 4S will also be a world phone, meaning it has the antennas to work on just about any international band. Oh, and there'll be a 64GB version, in addition to the expected 16GB and 32GB versions.
Software-wise, it'll be rocking iOS 5, which brings some much-needed improvements in the notifications and multitasking departments, as well as some nice new features. There'll be deeper Twitter integration; you can tweet your location from the Maps app, for example. Apple also introduced an app called Find My Friends that works sort of like Foursquare, locating your, well, friends. There's iCloud, which will sync your contacts, calendars, mail, that kind of thing--a feature Google mastered several versions of Android ago, but still nice.
The big news is the integration of Siri, voice-command software that was previously available as a standalone app but is now deeply embedded within iOS 5. It lets you give commands in normal phrasing--"Find me a Greek restaurant in North Beach," say, or "define mitosis." It'll work with lots of apps, including calendar, email, maps, and services like Wikipedia and Wolfram Alpha.
Oh, right, pricing and availability. Well, the iPhone 4S will be available for pre-order on October 7th, releasing on the 14th. It'll sell for $200, $300, and $400 for the 16GB, 32GB, and 64GB versions, respectively. The iPhone 4 is still around, with an 8GB version selling for $100. Oh, and the old 3GS is still here, with its non-Retina Display and shameful curved body. It'll be free, on contract. Those will all be available on AT&T, Verizon, and now, for the first time, Sprint. Sorry, T-Mobile--not sure why you're left out, but we sympathize.

Preview Your Drive From the Air, With Google Helicopter View

By Rebecca Boyle
Google Helo View: An aerial view from Carmel to Big Sur. Google
Maps can only get you so far in life — sometimes you need to veer off the beaten path, take the scenic route, or figure out how to get there as the crow flies. Now Google will help you do that. Helicopter View: when Street View and River View just aren’t enough.
Google’s helicopter view provides a 3-D view of your journey, so you can envision all the hillsides and neighborhoods you would otherwise miss by driving on boring streets. And it’s a more realistic portrayal of how we see the world, which for the most part is horizontal, not straight down from above.

Google uses California’s scenic Pacific Coast Highway as an example. A flat top-down view, like from a satellite or something, doesn’t really give you an appreciation for this pretty road, Google explains. But helicopter view lets you see the terrain in all its rugged, Big Sur-y glory.
You fill out Google Maps like you would for any other driving directions, but instead of a 2-D flat map, you get a lovely topographic map with Google Earth-style graphics. It turns on like any other feature of Google Maps, except that you need a Google Earth plugin.
You can fly along the route you’ve mapped out, and if you want to check out the surrounding area, you can pause the “flight” and click and drag the map like you would in Google Earth. Click on a different step in the directions list, and your flight will speed toward that step.
The Street View cars, tricycle and riverboat have all brought us real-perspective views on neighborhoods and remote areas, so could a Google Helicopter View chopper be coming next?

NASA Awards the Largest Prize in Aviation History to an All-Electric, Super-Efficient Aircraft



Pipistrel's Taurus G4. NASA HQ Photo
NASA has awarded the single largest prize handed down in aviation history to Team Pipistrel-USA.com for designing and demonstrating its Taurus G4 electric aircraft. Per the rules of the NASA- and Google-sponsored CAFE Green Flight Challenge, Pipistrel’s Taurus G4 covered 200 miles in less than 2 hours and did so on the electricity equivalent of less than one gallon of fuel per passenger, scoring $1.35 million for the effort.
But the cash, substantial though it may be, is only part of the story here. The CAFE (that’s Comparative Aircraft Flight Efficiency) Challenge was created to push aircraft engineers toward new, more efficient airplane designs that would perhaps usher in a new era of ultra-efficient flight, based on either electric engines or extremely efficient fuel-burning engines.

So while you can argue the day belongs to Pipistrel--and we certainly don’t mean to diminish that achievement--the CAFE Foundation and NASA are the real winners here. Consider: The challenge asked teams to average 100 miles per hour over two hours, and to do so on the equivalent of one gallon of gas. Not only did Pipistrel manage this, but so did California-based e-Genius with its electric-powered plane (for which it netted a second place prize of $120,000).
The kicker: both teams did so on just a little more than a half-gallon of fuel equivalent. That means both Pipistrel and e-Genius did twice as well as NASA and CAFE asked them to do (and Pipistrel slightly better than e-Genius, hence the distribution of prizes).
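
A quick back-of-the-envelope check of that claim, using only the figures above: the target of 100 mph for two hours on the equivalent of one gallon per passenger works out to

$$\frac{100\ \text{mph}\times 2\ \text{h}}{1\ \text{gal}} = 200\ \text{passenger-miles per gallon-equivalent},
\qquad
\frac{200\ \text{mi}}{\sim 0.5\ \text{gal}} \approx 400\ \text{passenger-miles per gallon-equivalent},$$

so covering the same 200 miles on a bit more than half a gallon-equivalent comes out to roughly twice the required efficiency, with Pipistrel edging out e-Genius.
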
That’s pretty amazing, considering that just a few years ago engineers were still trying to figure out how to get an all-electric powered plane into the air for any considerable length of time, much less at sustained triple-digit speeds and while using very little energy.
Our jetliners aren’t going green just yet, of course. But the winning teams in the CAFE Green Flight Challenge collectively spent just two years and $4 million on two aircraft that have pushed the electric airplane field forward by a considerable step. Imagine what ten years and some serious investment might do for the electric aircraft space.
More background/details on Team Pipistrel-USA.com’s winning Taurus G4 in the video below.
[NASA]

Turning 'waste' into power



UNIVERSITY OF WOLLONGONG   
Waste heat is a byproduct of nearly all electrical devices and industrial processes.
Image: MichaelUtech/iStockphoto
Thermoelectric power generation is expected to play an increasingly important role in meeting the energy challenges of the future.

And helping to meet that energy challenge is PhD student Priyanka Jood from the Institute for Superconducting and Electronic Materials (ISEM), whose groundbreaking research has just been published in the American Chemical Society journal Nano Letters.

Priyanka, the first author of the paper, is supervised by Dr Germanas Peleckis and Professor Xiaolin Wang and is working on thermoelectric materials that can generate electricity directly from waste heat. Dr Peleckis, Professor Wang, and the Director of the ISEM, Professor Shi Dou, are co-authors of the Nano Letters paper.

The UOW team, along with researchers from Rensselaer Polytechnic Institute (RPI) in New York, has created pellets of thermoelectric nanomaterials the size of large marbles. Priyanka spent about a year working alongside the US team.

The RPI researchers are also co-authors on the paper. Their team was led by Professor Ganpati Ramanath and included Rutvik J. Mehta, Yanliang Zhang, Richard W. Siegel and Theo Borca-Tasciuc.

Waste heat, sometimes referred to as secondary or low-grade heat, is the heat produced by machines, electrical equipment and industrial processes. It is a byproduct of nearly every electrical device and industrial process, from driving a car to flying an aircraft or operating a power plant.

Now the UOW team, based at the Innovation Campus, together with engineering researchers at Rensselaer Polytechnic Institute, has developed new nanomaterials that could lead to better techniques for capturing this waste heat and putting it to work.

The key ingredients for making marble-sized pellets of the new material are aluminium and a common everyday microwave oven.

Harvesting electricity from waste heat requires a material that is good at conducting electricity but poor at conducting heat. One of the most promising candidates for this job is zinc oxide (ZnO) -- a non-toxic, inexpensive material with a high melting point.

While nanoengineering techniques exist for boosting the electrical conductivity of zinc oxide, the material’s high thermal conductivity is a roadblock to its effectiveness in collecting and converting waste heat. Because thermal and electrical conductivity are related properties, it’s very difficult to decrease one without also diminishing the other.
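
For context, the trade-off described here is usually captured by the dimensionless thermoelectric figure of merit (standard in the field, though not quoted in this release):

$$ZT=\frac{S^{2}\,\sigma\,T}{\kappa},$$

where S is the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity and T the absolute temperature. Pushing κ down while keeping S and σ intact, which is what the aluminium doping and microwave processing aim to do, raises ZT and with it the efficiency of converting waste heat into electricity.
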

Now the UOW and US-based teams have demonstrated a new way to decrease zinc oxide’s thermal conductivity without reducing its electrical conductivity. The innovation involves adding minute amounts of aluminium to zinc oxide, and processing the materials in a microwave oven.

The research could lead to new technologies for harvesting waste heat and creating highly energy efficient cars, aircraft, power plants, and other systems.

Researchers say harvesting waste heat is a very attractive proposition, since the heat can be converted into electricity and used to help power the very device--a car, say--that is creating the heat in the first place. This would reduce the world’s dependence on fossil fuels.

Priyanka said that further power-factor enhancements using nanostructured zinc oxide might be possible, making the material highly valuable for industrial thermoelectric applications.

She said that researchers at ISEM are continuing to explore novel methods for producing high-performance thermoelectric materials as part of their research program in energy storage and energy conversion materials.

Results of the Australian Research Council-funded study, “Al-Doped Zinc Oxide Nanocomposites with Enhanced Thermoelectric Properties,” are available online in Nano Letters.
Editor's Note: Original news release can be found here.

When Water and Air Meet: New Light Shed On Mysterious Structure of World's Most Common Liquid Interface


A snapshot in the MD simulation trajectory of the HOD / D2O mixture that shows the water pair at the surface. White, green and red represent H, D and O atoms, respectively. (Credit: Image courtesy of RIKEN)

Science Daily  — Findings by researchers at the RIKEN Advanced Science Institute and their colleagues at Tohoku University and in the Netherlands have resolved a long-standing debate over the structure of water molecules at the water surface. Published in the Journal of the American Chemical Society, the research combines theoretical and experimental techniques to pinpoint, for the first time, the origin of water's unique surface properties in the interaction of water pairs at the air-water interface.

The most abundant compound on Earth's surface, water is essential to life and has shaped the course of human civilization. As perhaps the most common liquid interface, the air-water interface offers insights into the surface properties of water in everything from atmospheric and environmental chemistry, to cellular biology, to regenerative medicine. Yet despite its ubiquity, the structure of this interface has remained shrouded in mystery.

At the heart of this mystery are two broad bands in the vibrational spectrum for surface water resembling those of bulk ice and liquid water. Whether these bands are the result of hydrogen bonds themselves, of intra-molecular coupling between hydrogen bonds within a single water molecule, or of inter-molecular coupling between adjacent water molecules, is a source of heated debate. One popular but controversial hypothesis suggests one of the spectral bands corresponds to water forming an actual tetrahedral "ice-like" structure at the surface, but this interpretation raises issues of its own.

The researchers set out to resolve this debate through a comprehensive study combining theory and experiment. For their experiments, they applied a powerful spectroscopy technique developed at RIKEN to selectively pick out surface molecules and rapidly measure their spectra. To eliminate coupling effects, which are difficult to reproduce in simulations, they used water diluted with D2O (heavy water) and HOD (water with one hydrogen atom, H, replaced by deuterium, D). Doing so eliminates coupling of OH bonds within a single molecule (since there is only one OH bond) and reduces the overall concentration of OH bonds in the solution, suppressing intermolecular coupling.

With other influences removed, the researchers at last pinpointed the source of water's unique surface structure not in an "ice-like" structure, but in the strong hydrogen bonding between water pairs at the outermost surface. The extremely good match between experimental and theoretical results confirms this conclusion, at long last bringing clarity to the debate over the structure of the water surface and setting the groundwork for fundamental advances in a range of scientific fields.

Engineers Build Smart Petri Dish: Device Can Be Used for Medical Diagnostics, Imaging Cell Growth Continuously



The ePetri platform is built from Lego blocks and uses a smartphone as a light source. The imaging chip is seen in detail on the right. (Credit: Image courtesy of Guoan Zheng, California Institute of Technology)

Science Daily  — The cameras in our cell phones have dramatically changed the way we share the special moments in our lives, making photographs instantly available to friends and family. Now, the imaging sensor chips that form the heart of these built-in cameras are helping engineers at the California Institute of Technology (Caltech) transform the way cell cultures are imaged by serving as the platform for a "smart" petri dish.

Dubbed ePetri, the device is described in a paper that appears online this week in the Proceedings of the National Academy of Sciences (PNAS).
Since the late 1800s, biologists have used petri dishes primarily to grow cells. In the medical field, they are used to identify bacterial infections, such as tuberculosis. Conventional use of a petri dish requires that the cells being cultured be placed in an incubator to grow. As the sample grows, it is removed -- often numerous times -- from the incubator to be studied under a microscope.
Not so with the ePetri, whose platform does away with the need for bulky microscopes and significantly reduces human labor time, while improving the way in which the culture growth can be recorded.
"Our ePetri dish is a compact, small, lens-free microscopy imaging platform. We can directly track the cell culture or bacteria culture within the incubator," explains Guoan Zheng, lead author of the study and a graduate student in electrical engineering at Caltech. "The data from the ePetri dish automatically transfers to a computer outside the incubator by a cable connection. Therefore, this technology can significantly streamline and improve cell culture experiments by cutting down on human labor and contamination risks."
The team built the platform prototype using a Google smart phone, a commercially available cell-phone image sensor, and Lego building blocks. The culture is placed on the image-sensor chip, while the phone's LED screen is used as a scanning light source. The device is placed in an incubator with a wire running from the chip to a laptop outside the incubator. As the image sensor takes pictures of the culture, that information is sent out to the laptop, enabling the researchers to acquire and save images of the cells as they are growing in real time. The technology is particularly adept at imaging confluent cells -- those that grow very close to one another and typically cover the entire petri dish.
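
As a rough illustration of the laptop-side workflow described here, a time-lapse acquisition loop might look something like the sketch below. This is not the Caltech team's code; the sensor-readout call, the file format and the five-minute interval are all placeholders.

```python
# Hypothetical laptop-side time-lapse loop for an ePetri-style setup.
# Purely illustrative: read_sensor_frame() stands in for whatever driver
# routine actually returns one image from the chip inside the incubator.
import os
import time
from datetime import datetime

def read_sensor_frame():
    raise NotImplementedError("replace with the real sensor-readout call")

def acquire_timelapse(hours=24, interval_s=300, out_dir="epetri_frames"):
    """Save one timestamped frame every few minutes for the given duration."""
    os.makedirs(out_dir, exist_ok=True)
    end_time = time.time() + hours * 3600
    while time.time() < end_time:
        frame = read_sensor_frame()                      # raw bytes from the chip (assumed)
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        with open(os.path.join(out_dir, stamp + ".raw"), "wb") as f:
            f.write(frame)
        time.sleep(interval_s)                           # wait before the next exposure
```

Because the chip stays inside the incubator and only a cable leaves it, a loop like this is what lets the culture be monitored continuously without ever removing the dish.
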
"Until now, imaging of confluent cell cultures has been a highly labor-intensive process in which the traditional microscope has to serve as an expensive and suboptimal workhorse," says Changhuei Yang, senior author of the study and professor of electrical engineering and bioengineering at Caltech. "What this technology allows us to do is create a system in which you can do wide field-of-view microscopy imaging of confluent cell samples. It capitalizes on the use of readily available image-sensor technology, which is found in all cell-phone cameras."
In addition to simplifying medical diagnostic tests, the ePetri platform may be useful in various other areas, such as drug screening and the detection of toxic compounds. It has also proved to be practical for use in basic research.
Caltech biologist Michael Elowitz, a coauthor on the study, has put the ePetri system to the test, using it to observe embryonic stem cells. Stem cells in different parts of a petri dish often behave differently, changing into various types of other, more specialized cells. Using a conventional microscope with its lens's limitations, a researcher effectively wears blinders and is only able to focus on one region of the petri dish at a time, says Elowitz. But by using the ePetri platform, Elowitz was able to follow the stem-cell changes over the entire surface of the device.
"It radically reconceives the whole idea of what a light microscope is," says Elowitz, a professor of biology and bioengineering at Caltech and a Howard Hughes Medical Institute investigator. "Instead of a large, heavy instrument full of delicate lenses, Yang and his team have invented a compact lightweight microscope with no lens at all, yet one that can still produce high-resolution images of living cells. Not only that, it can do so dynamically, following events over time in live cells, and across a wide range of spatial scales from the subcellular to the macroscopic."
Elowitz says the technology can capture things that would otherwise be difficult or impossible -- even with state-of-the-art light microscopes that are both much more complicated and much more expensive.
"With ePetri, you can survey the entire field at once, but still maintain the ability to 'zoom in' to any cells of interest," he says. "In this regard, perhaps it's a bit like an episode of CSI where they zoom in on what would otherwise be unresolvable details in a photograph."
Yang and his team believe the ePetri system is likely to open up a whole range of new approaches to many other biological systems as well. Since it is a platform technology, it can be applied to other devices. For example, ePetri could provide microscopy-imaging capabilities for other portable diagnostic lab-on-a-chip tools. The team is also working to build a self-contained system that would include its own small incubator. This advance would make the system more useful as a desktop diagnostic tool that could be housed in a doctor's office, reducing the need to send bacteria samples out to a lab for testing.
Funding support was provided by the Coulter Foundation.