sor to shine a laser into the eye of the patient and measure the reflected light with
a wavefront sensor. It can be produced
for less than $2 and clips onto a mobile
phone screen. The subject uses the phone
to signal when patterns projected onto
the screen overlap. After the process is
repeated several times at different angles
for each eye, custom software loaded onto
the phone crunches the data and creates a
prescription, all within a few minutes.
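The article says only that the software “crunches the data,” so as an illustration, here is a minimal Python sketch of how per-meridian power readings could be turned into a sphere/cylinder/axis prescription using the standard sphero-cylinder model P(θ) = S + C·sin²(θ − α). The function name and the equally-spaced-meridian assumption are mine; this is not the device’s actual algorithm.

```python
from math import sin, cos, atan2, radians, degrees, hypot

def fit_prescription(angles_deg, powers):
    """Recover sphere S, cylinder C and axis alpha (degrees) from refractive
    power measured along n equally spaced meridians (0, 180/n, 2*180/n, ...).

    The sphero-cylinder model P(theta) = S + C*sin(theta - alpha)**2
    linearizes to P = a + b*cos(2*theta) + c*sin(2*theta); with equally
    spaced meridians the coefficients reduce to discrete Fourier sums.
    """
    n = len(powers)
    a = sum(powers) / n
    b = 2.0 / n * sum(p * cos(2 * radians(t)) for t, p in zip(angles_deg, powers))
    c = 2.0 / n * sum(p * sin(2 * radians(t)) for t, p in zip(angles_deg, powers))
    C = 2.0 * hypot(b, c)                     # plus-cylinder convention, C >= 0
    alpha = degrees(atan2(-c, -b)) / 2.0 % 180.0
    S = a - C / 2.0
    return S, C, alpha
```

With, say, nine measurements taken every 20 degrees, the fit recovers the three prescription parameters exactly in the noise-free case; with noisy readings the Fourier sums act as a least-squares fit.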
Bioengineers at the University of California, Berkeley, have fitted a smartphone with magnifying optics
to create a real “cell” phone – a diagnostic-quality microscope that can be used in the field.
The researchers, in the bioengineering
and biophysics lab of Daniel Fletcher,
initially envisioned a device so rugged
that it could be used for high-resolution
imaging outside of the lab. But Dr. Eva M.
Schmid, who works in the lab, decided to
evaluate the device, called CellScope, in
a middle-school science classroom after a
chance encounter with a secondary school
science teacher in San Francisco.
Those middle schoolers helped develop
the educational side of the device, using it
for a year to take macroscopic and microscopic pictures of objects in their homes,
gardens and other environments and
then displaying them on the screen and
posting them to social media platforms to
promote discussion. The devices are now
being tested in other classrooms.
Schmid described CellScope’s development at the annual meeting of the American Society for Cell Biology in December
in San Francisco.
The commercial potential of the CellScope attracted a $1 million investment
last summer from Khosla Ventures, the
venture capital firm led by Sun Microsystems Inc. co-founder Vinod Khosla.
“Health data, the key ingredient to useful analysis and diagnosis, is starting to
explode exponentially – and CellScope is
on the cutting edge,” Khosla said.
CellScope’s first consumer offering
will be a smartphone-enabled otoscope
that enables physicians to remotely
diagnose ear infections in children from
pictures taken by parents using the smartphone’s camera. Pediatric ear infections result in 30 million doctor visits annually in the US alone.
[Figure: The CellScope, created in the lab of bioengineering professor Daniel Fletcher, turns the camera of a standard cellphone into a diagnostic-quality microscope with a magnification of 5× to 60×.]
Future CellScope products will address
throat and skin exams and nonclinical applications, including consumer skin care.
Dining out with severe food allergies can be nerve-racking, relying on a busy server or kitchen to make sure
you’re not served something that could
make you sick or, even worse, deathly
ill. Even prepackaged foods can contain
ingredients not listed on the label. Now, a
team led by UCLA associate professor of
electrical engineering and bioengineering Aydogan Ozcan wants to give control
back to those with allergies by allowing
them to test their meals on the spot using their cellphones.
The lightweight (less than 2 oz) device,
called the iTube, uses the phone’s camera, in combination with an application, to test food samples with the same high level of sensitivity as a lab would, Ozcan said.
The device tests for allergens by optically measuring a sample of the food in
question mixed with water and an extraction solvent, then mixing the prepared
solvent with a series of other reagents.
[Figure: Aydogan Ozcan and colleagues at UCLA have developed the iTube platform (left), which attaches to a cellphone and uses colorimetric assays and a digital reader to detect allergens in food samples. (Right) A screen capture of the iTube app.]
The method digitally converts raw images from the cellphone camera into concentration measurements detected in the
food samples. The test goes beyond just
a “yes” or “no” answer to the presence of
allergens by quantifying how much of an
allergen is in a sample, in parts per million.
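The article does not give the conversion math, but a common approach for digitally reading a colorimetric assay is to treat the log ratio of a color channel between a sample tube and an allergen-free control as an absorbance proxy, then read concentration off a measured calibration curve. The sketch below follows that generic recipe; the function name, the green-channel choice, and the interpolation scheme are illustrative assumptions, not iTube’s published pipeline.

```python
from math import log10

def concentration_ppm(sample_green, control_green, calib_signal, calib_ppm):
    """Map a colorimetric tube reading to an allergen concentration (ppm).

    sample_green / control_green: mean green-channel intensity of the sample
    and allergen-free control regions in the phone image. The log ratio acts
    as a Beer-Lambert-style absorbance proxy, which is converted to ppm by
    piecewise-linear interpolation over calibration points (signal -> ppm).
    """
    signal = log10(control_green / sample_green)
    if signal <= calib_signal[0]:
        return calib_ppm[0]
    for (s0, p0), (s1, p1) in zip(zip(calib_signal, calib_ppm),
                                  zip(calib_signal[1:], calib_ppm[1:])):
        if signal <= s1:
            return p0 + (p1 - p0) * (signal - s0) / (s1 - s0)
    return calib_ppm[-1]
```

The calibration points would come from imaging tubes prepared with known allergen concentrations; quantifying the signal this way is what allows a graded readout rather than a bare yes/no.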
The iTube platform can test for a variety of trigger foods, including peanuts, almonds, eggs, gluten and hazelnuts.
“We envision that this cellphone-based
allergen testing platform could be very
valuable, especially for parents, as well as
for schools, restaurants and other public
settings,” Ozcan said. “Once successfully
deployed in these settings, the big amount
of data – as a function of both location
and time – that this platform will continuously generate would indeed be priceless for consumers, food manufacturers, policy makers and researchers, among others.”
The device was introduced in 2012,
and, so far, the UCLA researchers have
successfully tested the iTube using commercially available cookies, analyzing
the samples to determine if they have
any harmful amounts of peanuts. Their
research was recently published online in
Lab on a Chip and will be featured in an
upcoming issue of the journal.
In 2008, Ozcan’s lab introduced the
imaging platform LUCAS (Lensless
Ultrawide-field Cell monitoring Array
platform based on Shadow imaging).
Instead of using a lens to magnify objects,
LUCAS generates holographic images of
microparticles or cells by using an LED
to illuminate the objects and a digital