What Does Google Maps Teach Us About Medical Privacy?

Medical privacy matters. But it’s not the only thing that matters, and people may not value it as much as they say they do.

Here’s an analogy. If I asked you whether you think private companies should monitor your movements, you’d probably say no. You might even feel strongly about it, and understandably so. It’s creepy. If I listened only to what you said, I’d come away thinking privacy was really important to you.

But if I asked you whether you use Google Maps to avoid heavy traffic, you’d probably say yes. You might even love the traffic feature, like I do. It’s so darn useful.

Odds are, though, that you’ve never stopped to ask how Google knows that traffic is heavy on I-94. Google knows, of course, because it’s tracking your cell phone. Yes, it’s creepy. But most people—though not everyone—would agree that the benefits of the tracking outweigh the privacy intrusion.

There’s no real informed consent here. Maybe real-time tracking was part of the deal when you accepted Google Maps’ terms and conditions, but you didn’t read those terms. No one does. There’s probably a setting on your phone that would tell Google to stop tracking you. But I have no idea where that setting is, and you probably don’t either.

If we wanted to, we could force Google to get real informed consent before it tracked you. Instead of being an opt-out, tracking could be an opt-in. To ensure the consent was fresh, we could require periodic renewals. And we could compel Google to ask for consent in prominent and plain-language terms: “Is it OK if we track your every move?”

But here’s the sticking point. Unaware of the benefits of relinquishing privacy, lots of people will refuse to give their consent. Even those who have some dim sense of the collective benefits may prefer to free-ride. Starved of data, traffic monitoring would become less effective or disappear altogether.

In other words, you can be diligent about protecting privacy or you can have real-time traffic information. You probably can’t have both.

I bring this up because of the controversy surrounding proposed changes to the Common Rule, which lays out ethical guidelines for human-subjects research. Rebecca Skloot, the author of The Immortal Life of Henrietta Lacks, has a New York Times op-ed supporting a change that would, for the first time, require informed consent before de-identified biospecimens can be used in research. “Some people may not mind” if their anonymous blood and tissue is used in research, she writes. “But I assure you, many do.”

I’m sure that’s right. But, as Chris Robertson and Jonathan Loe note in a really smart response, what’s the value in asking someone to sign a form they never read? “Americans should be wary of reforms such as this one that would expand regulatory oversight of science but, while well-intentioned, provide little real benefit to human subjects.”

We could always try harder to get bona fide informed consent, and Skloot thinks we should. But the costs of getting that consent could make a lot of life-saving biomedical research prohibitively expensive or even impossible. Carl Schneider’s recent book about the punishing burdens that institutional review boards place on research is an extended meditation on that theme.

Tradeoffs here are inescapable. Relinquishing privacy is necessary to achieve some collective goals, whether that’s research or real-time traffic monitoring. And getting consent is so hard and so expensive that it won’t ever be able to eliminate privacy concerns. Honoring privacy will thus come at the expense of other things we care about. There’s no way around it.

But people don’t have those tradeoffs in mind when you ask them about privacy in the abstract. They may not even know the tradeoffs exist. That’s why it pays to be skeptical of confident assertions, like Skloot’s, about the value that people place on their privacy. They care, sure—but they may care more about all the wonderful things that come from giving it up. If you doubt it, ask Google.