
Users beware: Apps are using a loophole in privacy law to track kids' phones

Washington Post technology columnist Geoffrey Fowler says smartphones and apps are harvesting our personal data — and that of our kids — on a scale that would shock most users.

Transcript

DAVE DAVIES, HOST:

This is FRESH AIR. I'm Dave Davies, in for Terry Gross, who's off this week. Our guest, Geoffrey Fowler, is a technology columnist for The Washington Post who has a habit of revealing things about our computers and smartphones that make us squirm. He writes in a recent column, for example, that apps are spying on our kids on a scale that will shock you. That's from an ongoing series Fowler publishes titled "We The Users," which details the ways our phones and apps harvest our personal data, manipulate information we see and limit our choices. He says changes are needed in our laws and corporate practices governing the internet, and he presents some ideas in the series. He's also written recently about concerns that information stored in smartphones could be used to investigate those seeking abortions, should abortion become illegal in some states. Geoffrey Fowler writes about technology issues from San Francisco. He worked for The Wall Street Journal before coming to The Washington Post in 2017.

Geoffrey Fowler, welcome back to FRESH AIR.

GEOFFREY FOWLER: Thanks for having me.

DAVIES: So, Geoffrey, the title of this series is "We The Users." You want to explain the expression, what you're getting at here?

FOWLER: Well, with apologies to James Madison, who wrote the Constitution, I felt like we needed to have that same kind of spirit to look at what's going on with the technology in our lives today. You know, I've been a technology journalist for a long time, and originally I was a reviewer of technology. That means that my job was to, you know, check out the new gadget and tell you, is it worth it? You know, is the price right? Does it work well enough, or is it easy to use? But the longer I do this, the more I realize that that frame is not what we really need anymore.

We need to really challenge the technology companies and the world that they've built for us in a different frame. It's no longer a question of, is it too hard to use? The question now has become, is it evil? Is it taking away our rights and our choices? You know, living with technology is not just for geeks anymore. It's an intersectional issue. And I realize that we use technology not just as geeks and nerds, but now as everyday consumers, as parents, as patients, as employees in all sorts of different hats. And so I wanted to sort of look at the products and technologies that we use every day with that sort of hat on.

DAVIES: All right. Let's talk about some of the specific issues that you identify. You write that if there were a van outside your children's bedroom window spying on them, you'd call the police. But you say our apps are doing this all the time, on a scale that would shock us. What are they doing, exactly?

FOWLER: Apps that your kids are using are spying on them at just an extraordinary scale. So I worked with some researchers at a company called Pixalate, and they looked at this question on a really broad scale. So they tried to categorize all of the apps that exist that might be appealing to kids, and then they tracked what happened to the personal data that those apps were collecting - things like ways to identify the phone, the location of the phone. And they found that more than two-thirds of them, on iPhones, were sending this information off to the advertising industry. The number was even higher - 79% - on Android phones. And what shocked me about this is that we have a law in America that's supposed to protect the privacy of children. And yet, this is happening.

DAVIES: Right. And so it sends - they can figure out where the kids are, their ages, all of that. What's the harm? How might this be used in ways that should concern us?

FOWLER: Yeah, there's an ongoing discussion about the harms that can come from collecting people's data broadly, and we can talk about some of those for all people, not just children. But specifically with children, we've sort of decided as a society that children are special. They are supposed to be protected because they can't necessarily, you know, make the best choices for themselves. They don't really know what they're giving up. They're not really making a concerted, you know, decision to hand over this information. They just want to play, you know, the game where they get to shoot birds or crush candy or whatever it might be.

But specifically with kids, research has suggested that lots of children aren't able to distinguish ads from content, which means that when a company can track their every interest, they can really target them with messages that could harm them - could not only sell them stuff, but could shape their self-esteem, shape their view of their bodies. And we learned last fall, from the disclosures that were leaked out of Facebook, that things like social media, where kids' own data is used to shape the experience that they have there, can contribute to eating disorders. So we know that kids, their data in particular, deserves special protection.

DAVIES: So if I have an 8- or 9-year-old kid, what might happen that could affect them in a way that should concern us?

FOWLER: So companies out there that you would never know the name of, that really have no relationship, even, with the app that you were trying to use, could be, first of all, tracking your kid's interests, then trying to predict what they might want to buy, or even selling their information on to others. There's an entire, you know, multibillion-dollar ecosystem of collecting this information and then repackaging it and selling it to others and then selling it to others yet again, all of which you don't really have any control over. I mean, one just sort of shocking stat I came across in reporting this is by the time a child reaches 13, online advertising firms hold an average of 72 million data points about each kid. That's according to this company that, you know, helps companies market to children ethically.

DAVIES: That's 72 million about a kid?

FOWLER: Correct. Every single kid.

DAVIES: Wow (laughter). I don't know how there could be 72 million data points about anybody.

FOWLER: It's true. Well, you know, think about every location you go, every IP address, you know, how you use different apps in different ways. I mean, the thing about data is, in the moment, it always seems like, oh, well, that's no big deal. But when you step away from it and you realize that once it's collected, it's out of your control, it could be used in all kinds of ways that we can't even imagine yet. And yet we keep getting reminders that it can come back, and it doesn't ever necessarily go away and can be used to hurt people, that we would look back and say, gosh, we just shouldn't have done that.

DAVIES: So, for example, would you have - might you have games marketed to kids with violent or other disturbing content?

FOWLER: The issue here is less about content. Apps in both the Apple and Google app stores all have to have an age rating on them, which is about, you know, how violent the content is or how adult it might be in nature. So you'll see those ratings in the store. The problem here is that those ratings have nothing to do with whether or not those apps are collecting data about children. And the law that we have in the United States called COPPA, the Children's Online Privacy Protection Act, says pretty clearly that if someone is under 13, companies are not supposed to collect data about them without their parent's explicit permission. But the problem is that this giant industry of app developers - and also Apple and Google, who run these app stores and make billions of dollars off of them - have found some really big loopholes in that law, so they're doing it anyway.

DAVIES: Let's talk about that. There is a law, the 1998 Children's Online Privacy Protection Act, that requires tech companies to protect the privacy of kids. Why doesn't it? What are the loopholes?

FOWLER: The loophole is, in the law, a company has to have actual knowledge that a child is using the app or website in order for the law to kick in. That means that they have to - you know, an app developer has to know, oh, yes, there is a 12-year-old on my app right now, for them to be forced to say, OK, well, I guess I won't collect data in this instance. I guess I won't send their information off to advertisers. So many of them then just claim, oh, we don't know who's using our app. It could be adults. Or they'll say, you know, we're really not marketing this coloring app or this math homework assistance app to children. We're marketing it to adults. And Apple and Google, who run these app stores and are sort of the de facto police for them, let them get away with it.

DAVIES: Right. So - and when you looked at this and others looked at this, what did they learn about apps that presumably are for adults and the extent to which kids use them?

FOWLER: Kids use all kinds of things. I mean, the app stores that they have available to them on their phones are just the same as the app stores that adults have. So they want to play the same games that we want to play oftentimes. That's things like Angry Birds and Candy Crush. They want to use calculator apps. They want to do coloring apps where they, you know, tap on pixels to - for enjoyment. They want to do a lot of the same things we do. And these kinds of apps are all claiming that they are general audience apps, which means that they are made for adults, rather than acknowledging that, you know, actually, kids are going to be interested in this stuff, too, so you ought to treat them differently.

DAVIES: You found some apps that were clearly designed for kids, but the app developer says it's for ages 12 and over. Can you give us an example of something you found?

FOWLER: Yeah, there was one app that's used for coloring. It's called Pixel Art, and you spend your time tapping on pixels to fill in the colors on, you know, pre-made pictures of things like ice cream or unicorns or fancy unicorns. And the developer of that app told me, oh, but we're a coloring app for adults.

Now, look, I know there has been, like, a trend in adults coloring on apps as a stress relief, and I 100% get that. But America's child privacy law, COPPA, says that if the product is supposed to be used in any way by children, any part of the audience is children, then the app developer really needs to comply with COPPA. And so they ought to then be checking to see, OK, are you a kid who's here for the coloring app? And if so, don't collect their data, or then, get their parents' permission.

DAVIES: So what's the fix here?

FOWLER: There's a couple of fixes here. First of all, Apple and Google are the de facto cops for apps in our world. You know, they get to decide what goes in those stores, and they actually go to Washington all the time right now arguing that they alone should be allowed to run these stores, even though they are kind of like monopolies, because they alone can protect our privacy and our security and protect our kids. So if they're going to say that, then they really ought to force apps to tell the truth about whether children are potentially using the apps, and if so, treat them differently.

Right now, they kind of turn the other way when apps say, oh, no, no, we're just marketing this coloring app for adults. And so they should force every app to identify whether it is in any way directed at children. And if the apps lie when they submit themselves to the store, the app stores should punish them. So that's one basic step.

You know, the FTC actually asked Apple and Google to do this a decade ago, but they did not, because they realized that it would put a lot more work on their shoulders - having to kind of make those distinctions, make those decisions. But somebody has got to make these decisions, and the app makers themselves aren't doing it. So I think Apple and Google ought to take on that work.

DAVIES: Right, just on this point. So if there's a way to actually discover whether an app is being used by kids, what should happen? The app-maker then should itself ensure that it harvests no personal information and gives none to the advertising industry?

FOWLER: That's right. This part of the COPPA law is already super clear. It says that, you know, if a kid is using the app, you either have to not collect personal information, which is always an option for apps and websites out there. They could just stop collecting all of our data. Or they have to get explicit parental permission. So they - you know, they ask, OK, can you please ask mom or dad to verify that you're using this? And they might ask mom or dad a question that only a mom or a dad would know the answer to.

DAVIES: OK. And other ideas?

FOWLER: Another idea that came from, of all places, Instagram, which is one of the apps that we've been talking a lot about as causing problems for kids, is to have the phone itself know whether or not a kid is using it. So the idea is when a parent sets up an iPhone or an Android phone for their child, they enter in the age of the child. And so if that age is under 13, then it would send a signal out to apps that would say, hey, there is a kid here. Do not collect data unless you get parental permission.

I think that would be really useful in a lot of different ways. You know, No. 1, apps could build this into their systems so that, you know, it would be automatic, and they wouldn't collect the data. Hey, many app developers tell me they don't want to be collecting kids' data, but it's hard for them to know. And so this would give them a hand, but it would also give them information about who's using their app, so they could understand, actually, we really are child-directed, and we need to think about that as we develop this further.
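For readers who want to picture how that signal might be used, here is a minimal TypeScript sketch. It is entirely hypothetical - no phone today sends an under-13 flag to apps - and the names below are invented for illustration; only COPPA's parental-permission requirement comes from the conversation above.

```typescript
// Entirely hypothetical sketch of the idea described above: no phone OS
// sends an "under 13" signal to apps today. The names below are invented
// for illustration only.
interface DeviceProfile {
  reportedAgeUnder13: boolean;      // a parent would set this at phone setup
  parentalConsentGranted: boolean;  // verifiable parental permission, per COPPA
}

// An app (or its ad SDK) could consult the signal before turning on any
// data collection, defaulting to collecting nothing for under-13 users
// without verified parental consent.
function shouldEnableAdTracking(profile: DeviceProfile): boolean {
  if (profile.reportedAgeUnder13 && !profile.parentalConsentGranted) {
    return false;
  }
  return true;
}
```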

DAVIES: Let me reintroduce you. We need to take a break here. We're speaking with Geoffrey Fowler. He's a technology columnist for The Washington Post based in San Francisco. We'll continue our conversation in just a moment. This is FRESH AIR.

(SOUNDBITE OF TODD SICKAFOOSE'S "TINY RESISTORS")

DAVIES: This is FRESH AIR, and we're speaking with Geoffrey Fowler. He's a technology columnist for The Washington Post. He writes about consumer advice on navigating the tech world. He has an ongoing series about privacy and other information issues with our apps and smartphones. It's titled "We The Users."

You know, a while back, Apple began giving users of iPhones the option of telling them, I don't want this app to track my data. And it was kind of a cool thing. And I think a lot of people felt reassured, and it gave them some real options. In an article - I think this was last fall - you looked into this and found apps don't exactly do that, right? What did you find?

FOWLER: You're exactly right. So we've all now seen this TV commercial, or there's actually several of them for Apple, where, you know, something really creepy starts happening to someone who's using a phone. Maybe they start being followed around by a bunch of people. Or there's another one that's on right now where there's an auction going on. They're auctioning off people's data. And the solution to these problems in these commercials is that you press a button on your iPhone that says ask app not to track, and then magically, poof, all the bad guys go away.

So this is really intriguing. This is the sort of solution that people want. And it fits in the Apple brand of, hey, we're going to protect people's privacy. So - but I was really curious. What does happen when you press the button? Does it stop all of this tracking and data collection that apps are doing kind of behind the scenes? And the short answer is it stops some of it. It stops one form of it. But it certainly does not stop it all.

DAVIES: So what does go on?

FOWLER: So when you press the ask-app-not-to-track button, first of all, it's kind of encoded in the language that Apple uses there. You're putting in a request not to track you, but you're not exactly shutting off the systems that could potentially be used to track you. So what you're doing is you're stopping the app from using one particular kind of ID that exists on your phone. It's called the IDFA. And it was actually made by Apple, built into the iPhone a long time ago. And it's just a code that allows apps to know who you are across different apps. That's really helpful for advertisers, for example, who might want to show you the same ad for fancy underwear that you see in one app and then, you know, show it to you on another website or in another app.

So when you press the ask-app-not-to-track button, it says you can't grab that form of ID anymore. But it doesn't really do anything to stop all of the other kinds of data that can still be used to identify you that apps might want to grab. We did a little test with an app called Subway Surfers, which is one of the most popular games for the iPhone, to see what happens when we press this button. And when we did, we watched it still grab 29 other very specific data points about your phone, including your internet address, the free storage on your phone, the current volume level of your phone and even the battery level on your phone.

What is it doing with all this information? Well, these kinds of details allow the app to make what's called a fingerprint of your phone. Basically, it's an alternative way to identify you. And then that basically lets the app track who you are and then send that information out so it can be used to track you across lots of different apps.
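To make the fingerprinting mechanism a bit more concrete, here is a minimal sketch, not drawn from the Post's testing, of how a tracking SDK could fold device details like these into a single identifier. The field names are hypothetical, and real systems typically lean on the more stable attributes and match volatile readings such as battery or volume statistically rather than hashing everything directly.

```typescript
import { createHash } from "node:crypto";

// Hypothetical device details of the kind described above. The stable
// fields below get hashed into one identifier; volatile readings such as
// battery or volume level would usually be matched statistically instead.
interface DeviceSignals {
  ipAddress: string;
  model: string;
  osVersion: string;
  language: string;
  freeStorageBytes: number;
}

// Combine several weak signals into one "fingerprint". No single field
// identifies the phone, but together they are often distinctive enough
// to recognize the same device across many different apps.
function fingerprint(s: DeviceSignals): string {
  const parts = [s.ipAddress, s.model, s.osVersion, s.language, String(s.freeStorageBytes)];
  return createHash("sha256").update(parts.join("|")).digest("hex");
}
```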

DAVIES: Right. So the app developer itself has to affirmatively decide, I'm not going to market this stuff, I'm not going to collect this stuff and market it. And a lot of them just aren't doing that, right? They're kind of taking a back door and getting information that's valuable and selling it?

FOWLER: That's right. The way this happens is often a little less direct in the intention of the app developer. We've developed this giant economy of apps - you know, billions and billions of dollars - where the easiest way to make money as a developer is to include this software. They're called SDKs, software development kits, and they just plug in to the app and promise to, like, help the developer make money. Many of them have to do with advertising. And the app developer itself may not even know what exactly it's doing. But oftentimes, it's these SDKs that are grabbing all of these details to try to better target ads. And, you know, the more data they grab, the more they can sell those ads for.

DAVIES: So an effective practice to really prevent tracking, I guess, would be they would have to do some of the analytics that you and others did to determine whether apps are taking other indicators of identity and selling them, and then those that violate it - what? - toss them off the app store, fine them? I mean, it's hard to do that for tens of thousands of apps, right?

FOWLER: You're right. Apple and Google - because they run these app stores - would have to take responsibility for it. And they would have to investigate each app. They would have to test each new version of it to make sure that they're not getting up to anything creepy. And if they break those rules, they would have to boot them out of the store. It might sound like a lot of work. And yet Apple already claims it does this, right? It has this entire App Store review process, which it says, you know, is critical to the trust that we have in the iPhone and to keeping it private and secure. So they just ought to complete the job is what I say.

DAVIES: All right. In the meantime, you check that button, don't allow this app to track my stuff?

FOWLER: It's better than nothing. So, yeah, take advantage of everything you can out there because it's, you know, it's a battle.

DAVIES: We need to take another break here. Let me reintroduce you. Geoffrey Fowler is the technology columnist for The Washington Post, based in San Francisco. His ongoing series of articles about issues with our computers and smartphones is titled "We The Users." He'll be back to talk more after this short break. I'm Dave Davies, and this is FRESH AIR.

(SOUNDBITE OF MUSIC)

DAVIES: This is FRESH AIR. I'm Dave Davies, in for Terry Gross, who's off this week. We're speaking with Geoffrey Fowler. He writes about technology for The Washington Post from San Francisco, often giving consumer advice on navigating the tech world. He has an ongoing series about privacy and other information issues with our apps and smartphones titled "We The Users."

You write about privacy policies. I mean, we are all used to, at this point, being confronted with a long block of text saying, these are our terms of service. Do you agree? And because we want to go on to the next step, we just simply click, I agree. You know, I do this all the time. I never read that stuff. In fact, one of the most shocking things about your article about this was that a Pew study showed that 9% of users actually do read these privacy agreements. I would have thought it would've been none.

FOWLER: I think they - maybe they were lying.

DAVIES: (Laughter) Right, right - I read one once. For an example of the scale of this, I mean, you totaled up the length of the privacy policies that one might read just for the apps on your phone. How long were they?

FOWLER: It was a million words. And just to give you a little context, that's about twice the length of "War And Peace." There is no way that any normal-functioning person is going to have time enough to read that even once, much less keep reading it as these companies tweak the language and update them, you know, all the time. It's just nuts. But unfortunately, it is the basis of how our privacy is supposed to be protected in the U.S. Our problem right now is that we're just overwhelmed by data collection. And this model that's built into American law and the economy - that we, the users, are somehow consenting to each and every one of these data uses - is completely broken. In fact, it's really mean to us as consumers. It's not really fair. It puts the onus on us - it puts on our shoulders the idea that if something happens with our data that we didn't like, it's our fault for consenting all along. And I just think that's really, really broken.

DAVIES: You know, it's - one of the interesting things is that Apple has for its apps a privacy label, kind of like the nutrition labels that you see on foods, which are supposed to tell you quickly and clearly what the - you know, the nutritional content is. These labels are supposed to tell you what this particular app's privacy policies are. You took a closer look at this. What did you find?

FOWLER: I found that they were pretty lacking. First of all, I'm a professional - I'm paid to read privacy policies and to understand these things. I look at Apple's labels, and I'm still not entirely certain what I'm learning from them. It's not really giving me the information that I want to know, which is, like, well, who's getting my data? It doesn't say that. It doesn't give me any power, really, to stop them from collecting some of these things, you know, beyond pressing that ask-app-not-to-track button, which, again, is more of kind of a request. And most puzzling of all, the labels aren't necessarily even accurate. When they first came out, I just tested a whole bunch of them and then, you know, looked behind the curtain to see what data the apps were collecting. And I found plenty were, you know, fibbing in these labels. They were saying that there was no data collected when, in fact, there was plenty of data being collected.

And when I confronted Apple about this, they said, oh, well, these are self-reported (laughter). So you know, if we find someone is breaking the rules, we might do something about it. But thanks for pointing out some that might be breaking the rules. And good golly, you know, if I, one journalist sitting in San Francisco, can spot all of this, why can't a multitrillion-dollar corporation take on this responsibility? So there's a theme that's been coming up in our conversation today so far, and that is the responsibility of giant tech corporations. They certainly want to make a lot of money and become, you know, the most powerful and wealthy companies in history. But when it comes to protecting consumers, they really are not getting the job done entirely. And that's one of the things that most galls me as a consumer advocate and as a technology journalist - if you're going to make that much money off of us, you've really got to live up to your promises.

DAVIES: Yeah. One of the most striking things when you describe these labels, these privacy labels that are supposed to tell you quickly what the app does in terms of privacy, there's a little phrase that says, this information has not been verified by Apple. So huh (laughter)?

FOWLER: I mean, would you buy a loaf of bread from the store if the nutrition label at the back said, like, might be accurate, might not be? I mean, we just wouldn't let that go. But again, this is an industry that is, for the most part, completely unregulated in the U.S. and, not only that, also has no competition. You know, Apple and Google have a stranglehold over the app stores that we use for iPhones and Android phones. They are a huge - you know, almost the entire market for app stores. So it's not like I could go and say, you know what? I'm going to choose the app store that's made by Mozilla, which is a nonprofit which is really committed to consumer privacy. I would love a Mozilla-made app store for my iPhone or Android phone. That feels like that would really empower me because I trust them more than, frankly, I trust Apple or Google.

DAVIES: This area of privacy policies is one in which you, again, have some ideas about what we should do. And you write that we should abolish the notion that we should read privacy policies. What's the alternative?

FOWLER: Yeah. Well, at the highest level, we just need a better rule for what can happen with our data. There's been, you know, decades of discussion about a federal privacy law in the U.S. And states like California have tried their own. You know, there's renewed discussion about it as of even the last couple of weeks about how we might get a law that would cover everything. But there's a basic premise here that, I think, really ought to be at the core of everything, and that is: only take as much data as you need to do the thing that I asked you to do - no more, no less. Then we would have far fewer problems.

The thing is, I think that's what most consumers already assume is happening, that if you ask, you know, a website to show you a map, it's collecting your location just for that moment to give you directions, and that's it. The problem is that's not what's happening. These companies are taking it as an opportunity to then collect your data all the time or, you know - and do what they want to with it. And so that's the fundamental disconnect. So we need a new notion of what the base level of privacy is so it's kind of built in by default. Now, there's discussion, again, about making a federal privacy law that might do that. If they don't, you know, the FTC has some new leadership under the Biden administration. And I've spoken with some commissioners there. And they seem like they're ready to be much more aggressive, even with our existing laws, of enforcing some of these kinds of ideas that it is a failure of trust to the consumer to take more data than you really need in that moment. So they haven't done a ton with it yet, but I'm really curious to see where that goes.

DAVIES: And by the way, when you see this little question on the - on your app that says, read our privacy policy - do you accept? - if you don't accept it, does that mean you don't get to use the app?

FOWLER: Yeah.

DAVIES: That's it (laughter). There's...

FOWLER: You don't have much choice. I mean, that's one of the critical elements here - we're given this idea that somehow we are making a choice to accept these. But it's not like you can negotiate with these things. You can't go in and say, OK, I like paragraph 13 but not paragraph 17, so let's strike that one out. It's all or nothing. So, you know, you could say, OK, fine, just don't use these apps. And increasingly, people are making that choice about things like Facebook or Instagram or maybe even Google products or Amazon products.

But it is increasingly hard to be a connected person in our society without, you know, agreeing to these things. You know, if you're a student in school, you've got to use Google's software for schoolwork. And, you know, many people have to use this stuff for work. So again, putting the onus on us, the consumer, to, like, read each of these things and make that choice is just too much.

DAVIES: Let me reintroduce you. We need to take a break here. We're speaking with Geoffrey Fowler. He's a technology columnist for The Washington Post based in San Francisco. We'll continue our conversation in just a moment. This is FRESH AIR.

(SOUNDBITE OF GEORGE FENTON'S "THE NEW VAN")

DAVIES: This is FRESH AIR. And we're speaking with Geoffrey Fowler. He's a technology columnist for The Washington Post. He writes about consumer advice on navigating the tech world. He has an ongoing series about privacy and other information issues with our apps and smartphones. It's titled "We The Users."

Another thing you write in terms of, you know, another way to do things is you say we could make our computers privacy butlers, meaning what?

FOWLER: This is actually one of the most exciting ideas to me because, you know, we're all very aware that technology can be used to invade our privacy. But what if technology could be used to help protect our privacy? And the way this might work is if privacy policies started being written in a standardized format - so maybe something closer to labels, but a standardized way to disclose information. Then, instead of assuming that consumers are going to read all of those, if we tag them, computers could read them, and they could know what kinds of things are happening. They could tell, aha, OK, this website has targeted ads on it. This website is collecting your location and will send it to the following data brokers or other kinds of parties that you don't necessarily want to have your data.

Then your computer, in the form of your phone or in the form of a web browser, could interact with each of those tagged privacy policies for you and know your preferences like a butler, know that, OK, Geoff does not like websites with targeted ads or Geoff, you know, doesn't want his information sent to certain data brokers and then make those choices for you so you didn't have to read them.
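As a rough illustration of the "privacy butler" idea, here is a short TypeScript sketch. The tag format and preference fields are invented - no such machine-readable standard exists yet - and the point is only to show how a browser or phone could compare a tagged policy against saved preferences on the user's behalf.

```typescript
// A hypothetical machine-readable privacy-policy tag format. No such
// standard exists yet; this only illustrates the "privacy butler" idea.
interface PolicyTags {
  targetedAds: boolean;
  collectsLocation: boolean;
  sharesWithDataBrokers: boolean;
}

interface UserPreferences {
  allowTargetedAds: boolean;
  allowLocationCollection: boolean;
  allowDataBrokerSharing: boolean;
}

// The "butler": compare a site's tagged policy against the user's saved
// preferences and decide, before the page loads, whether to object.
function policyAcceptable(tags: PolicyTags, prefs: UserPreferences): boolean {
  if (tags.targetedAds && !prefs.allowTargetedAds) return false;
  if (tags.collectsLocation && !prefs.allowLocationCollection) return false;
  if (tags.sharesWithDataBrokers && !prefs.allowDataBrokerSharing) return false;
  return true;
}
```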

DAVIES: So for that to happen, the computers or smartphone makers would have to decide to do this, and then the app makers would have to cooperate or we could have a law.

FOWLER: Or we could have a law that requires it. There actually was a bill put forward earlier this year called the Too Long; Didn't Read Act - the TL;DR Act. And part of it was that privacy policies would have to be written in a readable format and that they would have to be tagged. And if we could get those kinds of elements put in place and we started tagging them, then we could develop the systems and the technology to interact with them for us. But, you know, there are already hints that this is possible and is going on. So in California, where I'm based, we have a law that says that, you know, you as a consumer can always tell a company, don't sell my data. The problem is that right now, it's really hard to tell all the companies that you interact with not to sell your data.

So there's a technological effort underway to build into browsers something called the Global Privacy Control that would send a signal out to every website you go to that says, hey, I'm a California resident. By law, you're not allowed to collect and sell my data if I tell you not to, so don't do it. And that's just getting off the ground. And it, you know, will require some enforcement by the state of California. But that's the kind of idea that I find really intriguing.
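Global Privacy Control already has a concrete technical shape: participating browsers and extensions send a Sec-GPC: 1 request header and expose navigator.globalPrivacyControl to page scripts. Here is a minimal Node.js/TypeScript sketch of a site checking for that signal; how the site responds to it is an illustrative assumption, not part of the spec.

```typescript
import { createServer } from "node:http";

// Minimal sketch of a site honoring the Global Privacy Control signal.
// GPC-enabled browsers send a "Sec-GPC: 1" request header; what the
// server does in response here is only illustrative.
const server = createServer((req, res) => {
  const optedOut = req.headers["sec-gpc"] === "1";

  if (optedOut) {
    // Treat the signal as a "do not sell or share my data" request -
    // for example, skip loading ad-tech and data-broker integrations.
  }

  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(optedOut ? "GPC received: data-sale opt-out applied\n" : "No GPC signal\n");
});

server.listen(8080);
```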

DAVIES: You know, you recently wrote about the form that you often sign when you go into a doctor's office. You know, we were talking about privacy policies. You know, you're confronted with this long block of text and then you say, yes, I agree. You looked into what you might be authorizing in terms of use of your medical data. What did you find?

FOWLER: You might be authorizing a company that you've never heard of to take your medical information and use it to market to you very specific pharmaceuticals.

DAVIES: Wow. I mean, like, how does that work?

FOWLER: I heard about this happening actually from a number of Washington Post readers who wrote to me about seeing something kind of curious when they were filling out the forms that they got asked to sign when they went to the doctor's office. And they pointed me to this company called Phreesia - that's Phreesia with a P-H - which was making the software that they were using to check in at their doctor's office.

So you might have gotten this from a doctor that asks you to e-check in maybe the day before. Or when you arrive at the doctor's office, they hand you a tablet, and they say, OK, you know, fill in all this information here. And it's - a lot of it's the typical kinds of stuff. It's like, give me your name and your address and demographic information and what medications you're on and your previous history of ailments and all that kind of stuff. But there was one more form in the stack that said at the bottom, I agree. And the wall of text in it said, I agree to allow Phreesia to use the information that I just entered to show me very targeted marketing information based on my health care status.

DAVIES: So let's understand this. I mean, is this the case of the medical practice deciding it's going to ask you to essentially waive the HIPAA right to privacy and allow the medical practice to sell the data? Or is it the party that does this? I mean, how does this work?

FOWLER: Yeah. So the way HIPAA works is it applies to what are known as covered entities. So in this instance, your doctor is a covered entity, so your doctor is not allowed to share your health information without your permission. And there's all sorts of rules about how they must protect it and all kinds of stuff. But your doctor's office has contracted out to this company called Phreesia to make the software that is used for patient intake. So they - you know, they pay a fee to this Phreesia company to help them run the front office of their clinic, which I completely get.

You know, doctors' offices are totally slammed. They have to collect all this information. This seems like a good deal. The thing is, Phreesia has this side business in marketing and targeted ads, and the folks who made the decision for your doctor's office or clinic to go with them may or may not even totally realize that that's happening. And under HIPAA, Phreesia is a business associate of the covered entity, which means that it is automatically allowed to have access to all of the data that you are sharing with your doctor for your care. But if it gets you to click a button that says, I agree, then it is also allowed to use that data in other kinds of ways. And so we've developed all of these secondary uses of data.

Some of them are arguably pretty good for our society. We get to know, for example, the COVID case rates, you know, that kind of information. It becomes much more accessible much more quickly. But businesses now have come along and decided that they want to come up with some secondary uses of the data. And this is one of them - Phreesia taking your health information and basically mining it, trying to figure out, OK, do we know who you are? Do you have some ailments that our advertisers want to address? Or might you, you know, have some ailments you didn't even know about that we could make you aware of, so that you'll go into the doctor's office in just a moment and ask your doctor about them?

DAVIES: So you contacted this company that does this, Phreesia. What did they tell you?

FOWLER: Phreesia says they do not sell our health information. This is actually really similar to an argument I often hear from Google and Facebook, which also do not technically sell your data to other companies. Instead, they mine your data and use it to target you with ads in a place where they've got your attention.

In the case of Phreesia, the place they've got your attention is you're at the doctor's office. You just filled out information on a tablet given to you by the doctor, and you're waiting to see the doctor. And so they've got that screen in front of you, and that's where they're going to show you these ads, at that moment. In the case of Facebook and Google, it's when you're using their services.

So it's a subtle, legal distinction, but practically, you know, for consumers, it doesn't make any difference. The pharmaceutical company may not actually have your medical record, but it's still able to use it to target you with messages about its products at the moment when they know you're most susceptible to them.

DAVIES: And when you say you get the ads on the screen, you don't mean the tablet screen that has the text and the permission for them. You mean your smartphone if you're in the doctor's office?

FOWLER: I mean literally on the tablet screen - yeah - that they handed you.

DAVIES: Oh, really? So if I tell them I have high blood pressure, I'm going to get an ad for high blood pressure medication?

FOWLER: You might. You might very well. Or they might try to use some kind of algorithm to decide, all right, Dave, you look like - you know, demographically and other ways, you look like someone who might have hidden high blood pressure. So let's plant that idea in your brain so you go in and you tell your doctor, hey, might I have high blood pressure? Do I need a medication for that? So that's the marketing game that they're playing.

DAVIES: Let me reintroduce you again. We're going to take another break. We are speaking with Geoffrey Fowler. He's a technology columnist for The Washington Post. We'll continue our conversation after this short break. This is FRESH AIR.

(SOUNDBITE OF JOSHUA REDMAN'S "HIT THE ROAD JACK")

DAVIES: This is FRESH AIR, and we're speaking with Geoffrey Fowler. He's a technology columnist for The Washington Post based in San Francisco. He has an ongoing series about privacy and other information issues with our apps and smartphones titled "We The Users," which points out issues and offers some solutions.

You know, our phones store so much information about us. And one thing you recently wrote about is the prospect that if the Supreme Court overturns the Roe v. Wade decision and abortion becomes a crime in some states, that law enforcement or others might get phone data to investigate who has sought or gotten or provided an illegal abortion. What could happen here?

FOWLER: Your phone knows a lot about you, and so it would know if you were searching for information about where to get an abortion. It might know if you were at a clinic. It might know the history of your fertility cycle because a lot of people use cycle tracking apps. All of this data could be used against you if you happen to be in a state where seeking an abortion becomes against the law.

There is some precedent for this already - search histories and other information have been used to try to show that women were guilty of not caring for their fetuses or of causing the death of a baby. And the thing that I think people forget is any time that a company collects information about you, the government can get access to that information either by issuing a court order or, increasingly, just by buying it. Right? We're talking about a giant economy of selling people's data. So increasingly, we are seeing government go and do that to gather evidence and to try to prosecute crimes.

DAVIES: And could private actors do this, too? I mean, the Texas law, for example, offers essentially a financial incentive for a private citizen to find someone who's violated the abortion statutes.

FOWLER: Indeed, a private actor could do this. And, in fact, the day after the leaked Roe v. Wade decision became news, some technology journalists went and bought a bunch of data from a location data broker where they were able to identify people who were at abortion clinics and where they were before they were at the clinic and then where they went afterwards. So even journalists are able to do this stuff. So, you know, folks who really want to make use of this kind of data certainly could.

Another example of how data collected by companies can be used in unexpected ways is that recently a Catholic news organization got access to some data that indicated that a well-known priest was gay. And they did this by buying location data linked to the gay dating app Grindr, and they were able to match one user's movements to the known address of this priest and track his whereabouts to gay clubs and other sorts of places. And this priest ended up having to resign because of this. So, again, this is all done through private industry.

DAVIES: So the journalists did a story that effectively outed this priest?

FOWLER: Yeah. You know, what these examples remind me of is that digital rights are civil rights, you know, and these things are increasingly linked together. And that's why I think it's so important that we talk about what's being collected and what kind of control and power we have over it. Because, you know, you can't tease them apart anymore.

DAVIES: You know, on the issue of information about abortion, I was stunned to read something in one of your stories that you linked to another article showing that an anti-abortion group had gotten smartphone data to then target anti-abortion messages to the phones of patients in the waiting rooms of an abortion provider. This is pretty remarkable. How did this happen?

FOWLER: Yeah, this happened, I believe, in Massachusetts, where as part of this targeted ad industry, you can target people in specific places. And so they did that with messages to people that they thought were in abortion clinics. I think Massachusetts has since made that illegal to do. But still, they showed that it is possible. So, you know, when we think that, oh, there's no real harm in, you know, allowing apps to collect our data or maybe even in seeing, you know, being targeted with ads in these kind of very ultra-specific kinds of ways, it can be hard to see in the future how this stuff might be used against us. But we're starting to get examples like this.

DAVIES: You know, we began the conversation talking about a comparison to the writing of the U.S. Constitution - a time in the late 18th century when the country was governed by the Articles of Confederation, which were outdated and didn't work. We're now in a technology world where things need to change. What are the prospects for good legislation getting through Congress, do you think?

FOWLER: They're better than they've ever been. So that's the positive sign. Right now, you know, there's new discussion about a privacy law in Washington. There's also discussion about several bills that would introduce more competition into the digital marketplaces, which I think would really make a big impact because that way, you know, companies would be forced to compete on privacy, forced to compete on having a well-run app store, forced to compete on, you know, the quality of search results and other sorts of things. So, you know, some of this is going to be hashed out this summer. And the bad news is - the thinking is that if some of this stuff doesn't pass in the next couple of months, it's going to be a long time, given the distractions of the midterm elections and the potential that the Democrats are going to lose control of Congress. So there's a lot of focus on this right now.

DAVIES: OK. I'll just close by noting that Geoffrey Fowler's work is available online. It includes a lot of interesting links and videos that he does to explain a lot of this in simple terms. So, Geoffrey Fowler, thanks so much for speaking with us again.

FOWLER: Thanks for having me.

DAVIES: Geoffrey Fowler is a technology columnist for The Washington Post based in San Francisco. His ongoing series of articles about issues with our computers and smartphones is titled "We The Users." If you'd like to catch up on interviews you've missed, like our conversation with NBC correspondent Katy Tur about her new memoir, or with Linda Villarosa about how racism affects American health care, check out our podcast. You'll find lots of FRESH AIR interviews.

(SOUNDBITE OF MUSIC)

DAVIES: FRESH AIR's executive producer is Danny Miller. Our technical director and engineer is Audrey Bentham. Our interviews and reviews are produced and edited by Amy Salit, Phyllis Myers, Roberta Shorrock, Sam Briger, Lauren Krenzel, Heidi Saman, Therese Madden, Ann Marie Baldonado, Seth Kelley and Joel Wolfram. Our digital media producer is Molly Seavy-Nesper. Thea Chaloner directed today's show. For Terry Gross, I'm Dave Davies. Transcript provided by NPR, Copyright NPR.
