Inside the Technology That Can Turn Your Smartphone into a Personal Doctor
The fantastic tricorder device that “Bones” used to scan aliens on “Star Trek” is nearly at hand—in your cellphone
Episode one of “Star Trek,” Stardate 1513.1. Chief medical officer Leonard “Bones” McCoy beams onto a desolate planet, M-113, with orders to perform a routine physical on Prof. Robert Crater, an ill-tempered archaeologist who wishes McCoy would just go away.
“Doubtless the good surgeon will enjoy prodding and poking us with his arcane machinery,” Crater snipes.
Think again, Crater: Prodding and poking is so last millennium.
Dr. McCoy packs a medical “tricorder.” Wand the body with this hand-held computer, and seconds later it coughs up the particulars of a patient’s condition.
“The machine is capable of almost anything,” McCoy says. As he sweeps the device across Crater’s chest and back, it purrs like a blissed-out electronic cat. In the 23rd century—as pictured by television writers in the late 1960s—that purr was a sign that a very sophisticated machine was working.
The tricorder-like devices in the UCLA engineering labs of Aydogan Ozcan don’t purr. Nor do they cause the shoulder strain of the cassette recorder-size clunkers of Trekkie lore. But in other respects, they’re the closest thing yet to the real McCoy.
Ozcan’s sleek gizmos, which fit onto the back of a smartphone, count thousands of red and white blood cells in seconds; screen urine for signs of kidney disease; spot viruses like HIV and influenza in a smear of blood; and test water for bacteria, parasites and toxic chemicals. Another phone attachment, the iTube, scanned for microscopic specks of allergy-causing peanut in what one of Ozcan’s journal articles last year described as “3 different kinds of Mrs. Fields Cookies.”
When I visited Ozcan on the UCLA campus, a dozen of the devices were arrayed like museum pieces in an illuminated glass display case in a corner of his laboratory. The ones in the original “Star Trek” series resembled antediluvian Walkmen. Ozcan’s devices are the size of a lipstick case or matchbox.
“This is honestly one of our first hacks,” he told me with a touch of nostalgia, pulling out a six-year-old Nokia phone that he’d somehow retooled into a lens-free digital microscope. He says “hack” because he takes technology already in our pockets—the smartphone, another gadget anticipated by “Star Trek’s” inaugural episode—and cheaply reworks it into lightweight, automated versions of the bulky instruments found in medical laboratories.
At the rate he’s going, Ozcan, who at 35 already holds the title of UCLA chancellor’s professor, may soon hack the whole clinical lab. He wants nothing less than to make it small and cheap enough—and so idiot- and klutz-proof—that we can carry it in our pocket like loose change.
***
I’d visited Ozcan during a week in January when temperatures tipped into the 80s. So when one of his postdocs, Qingshan Wei, a 32-year-old with stylish clip-on shades, asked if I wanted to scope out the waves in Marina del Rey, I raised no objection.
Our “scope” was a Samsung Galaxy with an attachment that turned the phone’s camera into a mercury detection system. The toxic metal can build up in fish, and water tests can serve as an early warning system. “We want to detect mercury in water before it goes into the food chain,” Wei told me.
We splashed barefoot into shin-deep surf, and Wei pipetted seawater into a small plastic box on the back of the phone. Inside were a pair of LEDs that fired red and green beams of light through the water sample and onto the phone’s camera chip. An app scrutinized the subtle shifts in color intensity, and four seconds later, results flashed on the screen.
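For readers curious what an app like that might be doing under the hood, here is a minimal sketch of a ratiometric red-to-green comparison of the kind the article describes. The channel choice, the use of a mercury-free control frame and the flagging threshold are assumptions made for illustration, not the Ozcan lab’s published algorithm.

```python
# A minimal sketch of a ratiometric color analysis (illustrative assumptions,
# not the published Ozcan-lab method).
import numpy as np

def channel_means(rgb_frame):
    """Average red and green intensity over the camera frame."""
    red = rgb_frame[..., 0].astype(float).mean()
    green = rgb_frame[..., 1].astype(float).mean()
    return red, green

def mercury_score(sample_frame, control_frame):
    """Compare red/green transmission of the sample against a mercury-free control.

    Returns a unitless score; mapping it to a concentration would require a
    calibration curve measured with known standards.
    """
    r_s, g_s = channel_means(sample_frame)
    r_c, g_c = channel_means(control_frame)
    # Ratio of ratios: departures from 1.0 indicate a color change in the sample.
    return (r_s / g_s) / (r_c / g_c)

# Hypothetical usage with frames captured through the red/green LED attachment:
# score = mercury_score(sample_frame, control_frame)
# flagged = abs(score - 1.0) > 0.05   # threshold chosen purely for illustration
```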
Two months earlier, mercury levels at this very spot had been worrisome. Today, the phone told us, the water was safe.
Similar tests performed by a full-scale environmental laboratory are very expensive, Wei told me. They also require schlepping the sample to the lab, for a complicated analysis called inductively coupled plasma-mass spectrometry. “For this,” Wei said, nodding at the mercury tester, which cost $37 and was made by a 3-D printer, “we write a smart application. You just sample, click open the application, follow the instructions and click ‘analyze this.’”
The brains of the system are Ozcan’s algorithms, which turn the phone’s humdrum camera into a powerful optical instrument that sees what the eye can’t, then tells us how worried to be. His devices—because they piggyback on GPS-enabled smartphones—no sooner test a sample than they can send time- and location-stamped results to your doctor, an environmental agency or, say, Google Maps. Supply the technology to enough of the world’s three billion mobile subscribers, and you’ve got battalions of citizen scientists beaming up health and environmental data from across the globe in real time.
Ozcan’s software funnels the data into a continually updating map where epidemiologists, public health officials and your uncle Murray could follow the spread of a disease or chemical spill live, the way our smartphones already use our speed and location to crowd-source data for mobile traffic apps. Ozcan’s goal: to chart the world’s invisible threats—the pollutants in water, the allergens in food, the pathogens in air—as panoramically as traffic or weather.
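What might one of those time- and location-stamped reports look like on its way to such a map? A minimal sketch follows; the field names, coordinates and the idea of a JSON payload are mine, supplied for illustration rather than drawn from Ozcan’s actual data format.

```python
# A hypothetical time- and location-stamped test record of the kind the article
# describes being funneled into a shared, continually updating map.
import json
import time

def make_report(test_name, result, latitude, longitude):
    """Bundle one field measurement with when and where it was taken."""
    return {
        "test": test_name,         # e.g. "mercury", "E. coli", "peanut"
        "result": result,          # numeric score or pass/fail flag
        "lat": latitude,           # from the phone's GPS
        "lon": longitude,
        "timestamp": time.time(),  # seconds since epoch, UTC
    }

# Illustrative usage: serialize and hand off to whatever service aggregates
# reports into a live map.
report = make_report("mercury", 0.97, 33.980, -118.452)
payload = json.dumps(report)
```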
And the device’s potential to safeguard health is vast. At the moment “we’re lost in low-dimensional data,” Ozcan says. “It’s like looking at a Picasso picture where only a few pixels are there. With more dots painted by more individuals, you can see more of the big picture.”
***
Ozcan’s work has drawn particular praise for its promise in the developing world, where state-of-the-art diagnostic tools are in short supply. With his phone attachments, minimally trained health workers could quickly test blood for HIV and malaria, and water for E. coli and Giardia.
I asked Ozcan to picture a future in which smartphones spotted threats to our health as faithfully as they identified gridlock on our commute. “I would be more healthy with my choices, more informed about air quality, tap water,” he said. “If you’re camping and you don’t have a lot of things with you”—or frantic after a hurricane, earthquake or other disaster—“there’s another opportunity to sense what to drink, what not to drink, what to eat, what not to eat.”
Should my kids drink from that rusty faucet? Are peanut crumbs lurking in that carrot cake? There’s a hypochondriac-worthy list of health questions we might wish to answer, at least preliminarily, with a screen tap.
This future is arriving at warp speed, says Ozcan. “Less than five years. It’s going to boom.”
***
Aydogan Ozcan (pronounced I-doe-on Euz-john) was born in Istanbul, but had a peripatetic childhood. His father was a low-level clerk in Turkey’s forestry ministry, and his mother a homemaker. His only sibling—a brother, Cumhur, nine years older—struggled in school, and the family hopscotched the country in search of the right educational setting. Aydogan went to five elementary schools. (Cumhur, now a physician in Istanbul, became the first in the family to attend college.)
Aydogan fell in love with the elegant symmetries of mathematics, and that passion wheeled into a fascination with physics. As boyhood TV habits went, he was only a tepid “Star Trek” fan: “cold and too dark,” he says of the atmospherics. He preferred “The Smurfs”—specifically Handy Smurf, the can-do inventor in workman’s overalls with a pencil above his ear.
But Ozcan, a svelte man with the tightly wound intensity of a loaded spring, discouraged me from mining his childhood for clues to his career. There were no medical crises. He never wanted for health care. “No apple emotionally hit me,” he says. As a kid, he dismantled pens and watches, not computers. When I asked him to name the most exciting piece of technology to arrive in the family home as a boy, he said “color TV.” This was the 1980s.
Despite his professional renown as an innovator, his personal life is still something of a throwback. His current cellphone, for instance, is an unfashionable BlackBerry awarded to him by the University of Southern California when one of his smartphone microscopes won the school’s Body Computing Slam. That was four years ago.
His pursuit of pocket-size labs was less a childhood dream than a product of almost Vulcan rationality: Digital microscopy was a wide-open field with the potential to improve lives, particularly in remote parts of the world, and he saw an opportunity. “It’s quite unfair that some people don’t have access to very fundamental things because their government is corrupt, because the aid system is broken. It was timely to produce some more cost-effective and very advanced tools.”
After earning a PhD from Stanford in 2005, Ozcan took a short-term job at the Wellman Center for Photomedicine, at Massachusetts General Hospital. He worked for Harvard professors trying to boost the field of view of dishwasher-size optical microscopes, but soon he had his own ideas. “I was sure that some of the problems of imaging and counting cells could be solved in different ways.”
Ozcan and a former Stanford classmate, Utkan Demirci, went on eBay and bought used surveillance cameras on the cheap. With pliers and screwdrivers, they pried off the lenses and scooped out the imaging chips. Then, almost for kicks, they dribbled a solution of cultured liver cells right onto the chips’ silicon faces to see what kind of a picture they might get.
Something similar had been tried a couple of years earlier, by a NASA collaborator at Stanford named Gregory Kovacs. For an experiment into the effects of zero gravity, Kovacs had rigged a video chip to image the movements of tiny roundworms, Caenorhabditis elegans, as they plummeted to earth from a high-altitude balloon. The camera chip successfully tracked the backlit wigglers by their shadows.
But C. elegans is about a millimeter long—visible to the naked eye. How on earth would Ozcan get a similar chip to pick up the shadows of cells one-hundredth the worms’ size?
To Ozcan’s surprise, the liver cells threw respectable shade. The shadows grew if he put the cells on a slide one-fifth of a millimeter from the camera chip—just as the shadow of your hand grows the nearer your hand is to a light. Before long, Ozcan had a prototype that could count hundreds of thousands of cells in seconds, work done in hospitals by machines called cytometers with the girth of a linebacker and a price tag in the hundreds of thousands of dollars.
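The hand-and-light analogy is plain projection geometry; in back-of-the-envelope form (the symbols are mine, not from Ozcan’s papers):

```latex
% Ray-optics shadow magnification: a point source a distance z_1 above the
% object, the sensor a distance z_2 below it.
M = \frac{z_1 + z_2}{z_1}
```

Shrinking z_1 (moving the hand toward the bulb) makes M grow. For micron-scale cells lifted a fifth of a millimeter off the chip, diffraction spreads the pattern further still, and that spreading is what the holographic trick described below turns to advantage.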
The parts for Ozcan’s gadget—an off-the-shelf image sensor, a few LEDs and two AA batteries—cost less than $10.
But could a lens-free device do more than just count cells? Ozcan wondered. Could it be tweaked to actually see inside them?
Because of the way lenses bend light, traditional microscopes can focus on just one smidge of a sample slide at a time. If you ditched lenses, though, your field of view would be limited only by the physical size of the camera chip. A half-centimeter-square chip, like those in many cellphones, was at least 100 times larger than a conventional scope’s field of view. That meant Ozcan could both count more cells at once and more readily spot so-called “rare cells”—such as markers of early-stage cancer—within a pool of healthy ones.
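As a rough check on that “at least 100 times” figure (the objective’s numbers here are typical values I’m assuming, not ones from the article): a standard 40x objective takes in a circle roughly half a millimeter across, so

```latex
% Back-of-envelope field-of-view comparison; only the chip size comes from the article.
A_{\mathrm{chip}} \approx 5\,\mathrm{mm}\times 5\,\mathrm{mm} = 25\,\mathrm{mm}^2,\qquad
A_{\mathrm{objective}} \approx \pi\,(0.25\,\mathrm{mm})^2 \approx 0.2\,\mathrm{mm}^2,\qquad
\frac{A_{\mathrm{chip}}}{A_{\mathrm{objective}}} \approx 125
```

which lands comfortably past the hundredfold mark.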
But to see nuclei and other internal cell features, Ozcan needed more than shadows. He found that if he trained an LED through a pinhole, the light created a funky hologram as it passed through the insides of a cell. The challenge now was somewhat like deducing the shape of a mid-sea rock from the contours of waves on a faraway beach. “I literally spent a summer deriving tons of equations,” Ozcan told me. The goal was to digitally “time-reverse” those holographic waves until their source—a cell, a parasite—came into focus.
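In digital holography generally, that “time-reversal” is done by numerically propagating the recorded light field backward to the plane of the object, most often with the textbook angular spectrum method. The sketch below shows that generic idea; it is not Ozcan’s actual reconstruction code, which among other things must also recover the phase information the sensor never records. The wavelength, pixel size and distance in the usage comment are assumptions for illustration.

```python
# Bare-bones angular spectrum back-propagation of a recorded hologram.
import numpy as np

def back_propagate(hologram, wavelength, pixel_size, distance):
    """Propagate a 2-D complex field backward by `distance` (all units meters)."""
    ny, nx = hologram.shape
    k = 2 * np.pi / wavelength
    # Spatial frequencies of the sampled field.
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components are zeroed out.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg > 0,
                 np.exp(-1j * k * distance * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    # Filter the hologram's angular spectrum and transform back to real space.
    return np.fft.ifft2(np.fft.fft2(hologram) * H)

# Hypothetical usage: a 470-nm LED, 1.1-micron pixels, cells 0.2 mm above the chip.
# field = back_propagate(hologram.astype(complex), 470e-9, 1.1e-6, 0.2e-3)
# image = np.abs(field) ** 2
```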
Over the next few years, Ozcan refined the physical design and software until his scopes—some with tiny lenses, many without—could see stuff as small as individual flu viruses and adenoviruses. Some of his apps mimic facial recognition software, identifying cells by comparing their size, shape and internal architecture with a library of reference images.
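To make the facial-recognition comparison concrete, here is a toy version of the idea: boil each cell image down to a few numbers and label it by its nearest match in a reference library. The features and library values are invented for illustration and are not the ones Ozcan’s apps use.

```python
# Toy "facial recognition for cells": crude feature vector plus nearest-neighbor match.
import numpy as np

def features(cell_image, threshold=0.5):
    """Crude feature vector: area, edge-to-area ratio, and internal contrast."""
    mask = cell_image > threshold
    area = mask.sum()
    perimeter = (np.abs(np.diff(mask.astype(int), axis=0)).sum() +
                 np.abs(np.diff(mask.astype(int), axis=1)).sum())
    contrast = cell_image[mask].std() if area else 0.0
    return np.array([area, perimeter / max(area, 1), contrast])

def classify(cell_image, library):
    """`library` maps label -> reference feature vector; return the nearest label."""
    f = features(cell_image)
    return min(library, key=lambda label: np.linalg.norm(f - library[label]))

# Hypothetical usage with made-up reference vectors:
# library = {"red cell": np.array([120, 0.35, 0.04]),
#            "white cell": np.array([300, 0.30, 0.12])}
# label = classify(cell_crop, library)
```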
As we hustled back to his office after a tour of his 25-person lab, Ozcan let drop that he was on the verge of another breakthrough: smartphone detection of a single DNA molecule, less than three-billionths of a meter wide. When I reacted with what must have been a look of astonishment, Ozcan, with a note of swagger, straightened his black cashmere sport coat with a snap of the elbows.
What Ozcan didn’t know when he first dreamed of mini-microscopes was the eventual role of smartphones. Without the technological leaps fueled by our lust for the latest models, Ozcan says, a university might have to spend tens of millions of dollars to develop similar gear to image, process and transmit data from his optical devices.
Daniel Fletcher, a UC Berkeley bio-engineer and leader in lens-based microscopes for smartphones, gave a crisper salute to America’s phone mania in a recent Wall Street Journal op-ed:
“Thank you for upgrading.”
***
The road to a real-life tricorder is paved with prize money.
Top honors in the first round of the $2.25 million Nokia Sensing XChallenge went last fall to Nanobiosym, a Cambridge, Massachusetts, firm led by the physicist and physician Anita Goel. Its Gene-Radar detects HIV and other diseases in bodily fluids dripped on a disposable microchip, which slides into an iPad-like device that looks for the DNA and RNA signatures of known pathogens.
The Qualcomm Tricorder XPrize will split $10 million next year among gizmos that read vital signs, diagnose 15 ailments each, and are lightweight and user-friendly enough for the masses. Contest organizers have called health care one of the few industries in which consumer needs have failed to ignite innovation. If you’re sick and need, say, a throat swab, you have few choices beyond “seeing a health care professional at a clinic or hospital, creating an access bottleneck,” the organizers say. A better system would equip ordinary people with mobile technology to “make their own reliable health diagnoses anywhere, anytime.”
Ozcan’s start-up company, Holomic, was a finalist in the Nokia contest. Four teams in the ongoing and more elaborate Qualcomm showdown have asked Holomic about tucking its tech into prototypes, though final agreements are pending.
Whether next year’s XPrize winners will be Starfleet-grade—or, more important, Food and Drug Administration-grade—is an open question. But more modest efforts have already found their way to market. The $199 AliveCor heart monitor, a home EKG device that won FDA clearance in December 2012, gloms onto the back of a smartphone and identifies irregular heartbeats from the pulse of a patient’s fingertips. The Scanadu Scout, now in clinical trials, pledges to read blood oxygen levels, heart and respiratory rhythms, blood pressure and other vitals noninvasively in ten seconds. (The company’s motto: “Sending your smartphone to med school.”)
Among those racing to shrink the lab, Ozcan stands out for his focus on one of its oldest and most indispensable instruments: the microscope. Four centuries after its invention, the Renaissance-era gadget remains a thing of Rubenesque proportions: big and expensive. Silicon Valley has made warehouse-size computers small enough to fit in our pockets—and cheap enough not to empty them. But high-end microscopes remain beasts of the lab, tended by white-robed scientists who get back to us later with the results.
Ozcan’s insight was to do to microscopes what digital audio did to vinyl. He replaced the scope’s heaviest, costliest and most iconic element—its stacks of glass lenses—with something weightless: computer algorithms that make cheap image sensors, like those in your phone camera, sharp enough to glimpse viruses and other minuscule particles.
Ozcan hacks smartphones not because they’re cool—or status symbols—but because they thrum with once unthinkable computing power. There’s no mystique or bling factor for him. Without the staggering growth in phones’ processor speeds and megapixel counts in recent years, he would have looked to other technology.
Yet for all his digital wizardry, Ozcan remains old-fashioned about the relationship between doctor and patient. “Medicine means caring for the person,” he says. “I don’t see the future as being everything—nurse, technician, surgeon—replaced by robots.” He sees his devices as your Siri, M.D., between doctor visits; your guardian angel, watching your back when your fellow humans can’t or won’t. “If I create a tech system that will send an ambulance to your house 24 hours before something happens to you, then I save you.”
A couple of years ago, a pair of doctors writing in the Lancet proposed a name for their colleagues’ sometimes excessive faith in technology: “McCoy’s syndrome.” Often, they said, a thorough physical and patient history reveals far more than any MRI. In truth, even McCoy knew his limits. Mid-physical on that long-ago episode, he puts down his tricorder, picks up a tongue depressor and tells Professor Crater to open his mouth. When Crater looks mystified by the sudden reversion to old-school medicine, McCoy says, “I’ll still put my trust in a healthy set of tonsils.”
***
The promise of smartphone-powered health care in the developing world has drawn millions of dollars from the Bill & Melinda Gates Foundation and spawned groups like the mHealth Alliance, a nonprofit started by the Rockefeller, Vodafone and United Nations foundations.
Patricia Mechael, the alliance’s executive director, told me that digital successes in far-flung parts of Africa, South America and Asia so far are mainly a product of basic “telemedicine”: text messages that remind patients to keep medical appointments and take their pills, and apps that help indigenous health workers track patient records and recognize disease symptoms.
Still absent are tricked-out smartphones, like Ozcan’s, that perform automated, tricorder-like diagnoses. “To me this is one of the potential game changers,” Mechael says.
Point-of-care, or on-the-spot, diagnosis is of distinct benefit to migrant workers and people in isolated villages. By the time health workers learn the results of a lab test, they may no longer know where to find the patient, who then goes without care. “The single greatest advantage [of Ozcan’s devices] is how quickly the information can be shared with experts and decision makers across a wider swath of geography,” says Anurag Mairal, a program leader at PATH, a Seattle nonprofit that fosters tech innovation in global public health.
One of Ozcan’s most promising inventions is a universal reader of rapid diagnostic tests: chemically treated strips, like a home pregnancy test, that reveal a line if a blood, saliva or urine sample is positive for malaria, HIV or, say, heart trouble. People can and do eyeball such tests. But because Ozcan’s reader “sees” the line more sharply than the human eye, it can answer not just “Am I sick?” but also “How sick am I?” From nuances in the shading of the “positive” line on a rapid blood test for prostate cancer risk, for example, his apps can glean a relatively precise measure of the concentration of prostate-specific antigen, or PSA, in the blood.
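The quantification step can be pictured as two moves: measure how dark the test line is relative to the blank strip around it, then translate that signal into a concentration through a calibration curve. The sketch below illustrates that logic; the row positions and calibration points are invented for illustration, and a real reader would be calibrated against laboratory-measured standards.

```python
# Simplified quantitative reading of a rapid-test strip image (illustrative only).
import numpy as np

def line_signal(strip_image, line_rows, background_rows):
    """Darkness of the test line relative to nearby blank strip (0 = no line)."""
    line = strip_image[line_rows[0]:line_rows[1], :].mean()
    background = strip_image[background_rows[0]:background_rows[1], :].mean()
    return max(background - line, 0.0)  # darker line -> larger signal

def signal_to_concentration(signal, cal_signals, cal_concentrations):
    """Interpolate along a monotonic calibration curve measured with standards."""
    return np.interp(signal, cal_signals, cal_concentrations)

# Hypothetical usage for a PSA strip (units ng/mL; values invented):
# cal_signals = [0.0, 5.0, 12.0, 20.0]
# cal_concentrations = [0.0, 1.0, 4.0, 10.0]
# psa = signal_to_concentration(line_signal(img, (40, 50), (10, 20)),
#                               cal_signals, cal_concentrations)
```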
How popular the devices will prove to be in the real world remains to be seen. When one of Ozcan’s students took a lens-free microscope to a health clinic in the Brazilian Amazon in 2011, the technology worked well—but local feelings were mixed. The student, Onur Mudanyali, now the PhD-bearing director of research at Holomic, told me that some clinic workers viewed it as a job threat. But in nearby dorms for visiting researchers, people were more encouraging. “They were delighted that one day they would have a backpack of tools like these [to] visit villages and diagnose in the field.”
The doctor who arranged Mudanyali’s visit was Karin Nielsen, a distinguished UCLA professor of pediatric infectious diseases who frequently works in South America and Africa. When I stopped in her office after seeing Ozcan, she showed me a photograph she’d taken of a ramshackle houseboat on the Solimões River, near the Amazonian capital of Manaus. “Our next step would be to go to areas like this,” she said. Inhabitants of these boats—known as the população ribeirinha—rarely visit clinics, so health workers pull up alongside in “boat hospitals” and do medicine midstream. She says Ozcan’s devices “would likely double if not triple the number of people who get diagnosed.”
While she and Ozcan await funding for more overseas fieldwork, his start-up has set its sights closer to home. The U.S. Army is paying Holomic to investigate how soldiers might use smartphone add-ons to monitor their own health and detect bioterror agents. There’s also a long list of potential civilian uses, from hand-held forensic analysis and animal disease monitoring to anti-counterfeiting (identifying microscopic seals of authenticity) and home fertility testing. One of his devices, a lens-free 3-D video microscope, recently mapped the never-before-seen helical swimming patterns of sperm cells.
FDA approval could come as early as this year for what would be Ozcan’s first commercially available medical device, a smartphone reader of rapid blood tests for hypothyroidism, a common disorder of the thyroid gland. (The test measures levels of thyroid-stimulating hormone.)
Sharon Cunningham, the president of ThyroMetrix, which will market the reader, sees in gadgets like Ozcan’s a revolution in the cost and convenience of routine medical testing. “Walmart? MinuteClinic? Do you think they’ll want to send stuff off to labs?” she says. “No, they’ll be standing there scanning you. And they’ll be using something like this. And you’ll pay for it and be happy about it because you’re not waiting all day for results.”