Fluorescence microscopy using a cellphone and lenses

UC Berkeley researchers have built a cell phone microscope capable of imaging malaria parasites, tuberculosis bacteria, and other bugs. The CellScope consists of compact microscope lenses attached to the phone's camera. Most impressive is the device's ability to do fluorescence microscopy.
The researchers showed that the TB bacteria could be automatically counted using image analysis software.
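The article doesn't describe the counting software itself, but the usual approach is to threshold the fluorescence image so that stained bacteria stand out from the background, then count the connected bright regions. A minimal sketch of that idea in plain Python (the threshold value and the toy image below are illustrative, not from the study):

```python
from collections import deque

def count_particles(image, threshold):
    """Count connected bright regions (4-connectivity) at or above threshold."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                count += 1  # found a new, unvisited bright blob
                q = deque([(r, c)])
                seen[r][c] = True
                while q:  # flood-fill the whole blob so it isn't counted twice
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count

# Toy 5x6 "fluorescence" frame: two bright blobs on a dark background.
frame = [
    [10,  10, 200, 210, 10, 10],
    [10,  10, 220,  10, 10, 10],
    [10,  10,  10,  10, 10, 10],
    [10, 190,  10,  10, 10, 10],
    [10, 180,  10,  10, 10, 10],
]
print(count_particles(frame, threshold=128))  # → 2
```

Real pipelines typically add noise filtering and size limits so that single hot pixels or debris aren't mistaken for bacteria, but the core count is just connected-component labeling like this.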

"The images can either be analyzed on site or wirelessly transmitted to clinical centers for remote diagnosis," said David Breslauer, co-lead author of the study and a graduate student in the UC San Francisco/UC Berkeley Bioengineering Graduate Group. "The system could be used to help provide early warning of outbreaks by shortening the time needed to screen, diagnose and treat infectious diseases."

The engineers had previously shown that a portable microscope mounted on a mobile phone could be used for bright field microscopy, which uses simple white light – such as from a bulb or sunlight – to illuminate samples. The latest development adds fluorescence microscopy to the repertoire, in which a special dye emits a specific fluorescent wavelength to tag a target – such as a parasite, bacterium, or cell – in the sample.

"Fluorescence microscopy requires more equipment – such as filters and special lighting – than a standard light microscope, which makes it more expensive," said Daniel Fletcher, the UC Berkeley bioengineering professor whose lab developed the device. "In this paper we've shown that the whole fluorescence system can be constructed on a cell phone using the existing camera and relatively inexpensive components."


  1. That is potentially phenomenal news, perhaps too good to be true. If the microscope requires those uber-expensive lenses then it’s almost worthless.

    On the other hand, if it really is cheap, then this means 3rd world doctors could quickly transmit images to more-equipped doctors and rapidly get to the bottom of a local outbreak.

    That would be really something.

  2. I feel for the researchers: a classmate of mine did his dissertation using a mobile phone to do some simple image processing and drive a robot with it (in the end, it followed a yellow ball). While it worked fine in the end, apparently there was no camera in the emulator, and J2ME didn’t have the rights to get pixel data… Basically the whole thing must have been a huge pain in the butt to program. I’d wager they didn’t do that automatic counting on the phone, just due to the pain-in-the-butt-ness of it all. And: ah-ha! “For reasons of simplicity we implemented the automated particle count on a laptop computer onto which we had transferred the images, but phone computational resources are sufficient for such tasks to be performed on-phone.”

    What I’m basically saying is that this kind of stuff is the reason to be excited about Android and the iPhone. When our old handsets trickle into the third world and they can be easily reprogrammed and extended, apps like this can be made in a third of the time.
